# [Official] AMD R9 390/390X Owners Club



## Agent Smith1984




----------



## hyp36rmax

Great thread! Looking forward to it. Hopefully all the R9 290(X) tweaks roll over to the refreshed series.


----------



## DividebyZERO

I may be joining soon myself. Just waiting and waiting and waiting ... Damn you AMD


----------



## Agent Smith1984

Here's my thoughts for anyone thinking about these cards...

If you buy one, I think you made a great choice....

If you don't buy one, I completely understand









At the end of the day, I am thrilled to be the first on here to own one of these cards.
I know these are now going to be considered mainstream GPUs, but I feel like these cards are still pretty beastly, and great performers for the money.

Just stay tuned..... This will be a work in progress.

Teaser info....

This 390 is about 14% faster out of the box than my 290 Tri-x OC.

At +100mV I have hit 1180/1600 with full stability.

I will finish finding the max VRAM clock this evening, and then continue to push voltage from there for more core frequency.


----------



## Amhro

I'm joining you in (hopefully) 12 hours! Can't wait


----------



## sTOrM41

Quote:


> Originally Posted by *Agent Smith1984*
> 
> This 390 is about 14% faster out of the box than my 290 Tri-x OC.


huh?








hard to believe.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sTOrM41*
> 
> huh?
> 
> 
> 
> 
> 
> 
> 
> 
> hard to believe.


Stock 290 Tri-X pulled around 10,800 graphics score in FireStrike

Stock MSI Gaming 390 is pulling 12,500!!!








http://www.3dmark.com/fs/5184408

Notice Futuremark has mislabeled the card as a 380... They'd better get this fixed quickly!

An overclocked run so you can see that the clocks aren't fudged on the first run: http://www.3dmark.com/fs/5184788
FireStrike definitely reports the clocks correctly every time with these cards (had issues with this on the 200 series)!
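For reference, the uplift implied by those two graphics scores is a one-line calculation; a quick sketch, using the stock scores quoted above (game averages will differ from a single synthetic run):

```python
def percent_faster(new_score: float, old_score: float) -> float:
    """Relative uplift of new_score over old_score, in percent."""
    return (new_score / old_score - 1) * 100

# Stock graphics scores quoted above: 290 Tri-X ~10,800 vs. MSI 390 ~12,500
uplift = percent_faster(12500, 10800)
print(f"{uplift:.1f}% faster")  # 15.7% faster in this one benchmark run
```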


----------



## sTOrM41

Can't tell why, but my 290 flashed with the 390 BIOS is also faster in FireStrike than with my 290 BIOS;
on the other hand, FPS in games are pretty much the same.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sTOrM41*
> 
> Can't tell why, but my 290 flashed with the 390 BIOS is also faster in FireStrike than with my 290 BIOS;
> on the other hand, FPS in games are pretty much the same.


Part of the performance boost lies in the 15.15 driver.









My minimum FPS are up about 8-10% from my 290, and the max FPS are up as high as 15% in some cases.

Throw in the extra clock speed I can hit on this card, and it definitely makes for a better experience.

I'm not saying anyone with a 290/290X needs to go out and get one of these cards, but anyone with a 7000-series or sub-280 card would really be getting a nice upgrade here!

This refresh/driver combination is a winner in my opinion!


----------



## DividebyZERO

just throwing these in here

some benches from here


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> just throwing these in here
> 
> some benches from here


Great to see the 390X performing right alongside the 980!


----------



## Amhro

Count me in









MSI 390


----------



## Agent Smith1984

Quote:


> Originally Posted by *Amhro*
> 
> Count me in
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI 390


Good man!

I am getting the chart together today.

So far it's just you and me.

I have a feeling by the time the Fury reviews get around, you are going to see a lot more activity in here.

I'd also bet that many who do go 300 series, will get the MSI gaming, unless it goes against a color scheme or something?


----------



## Agent Smith1984

@Amhro Added to spreadsheet!

I don't expect it to be _too_ busy in here with Fury launching, but after some of the stuff I'm seeing with that card, this thread may get a little busier than I had originally anticipated...









I'll be working on gathering more information on the 300 series as it comes along.

I will also be making a list of known issues I have experienced with the card. One major issue is some strange tree flickering in Dirt Rally (appears to be obvious driver trouble), and also CCC not working at all with the 15.15 driver set, even after a clean reinstall.... very strange, but I'll keep trying to get it sorted and report back.


----------



## Amhro

I am wondering how long it will take until someone unlocks the 390 to a 390X


----------



## Agent Smith1984

Quote:


> Originally Posted by *Amhro*
> 
> I am wondering how long will it take until someone unlocks 390 to 390X


Probably not even possible, but I'll know if mine is tonight when I try flashing the 390X BIOS over.

If anything, I'll get higher default clocks out of it, and maybe a heftier power tune profile.


----------



## DividebyZERO

Yeah i was wondering this as well, +rep for testing uncharted water!


----------



## Agent Smith1984

@Amhro

Can you tell me your MSI 390 Default clocks?

Mine are not as advertised.

I am reporting 1040/1500 instead of 1060/1525

Also, have you had any driver/CCC/game issues?

Any overclocking yet?

Thanks


----------



## BackwoodsNC

What's the max volts you can put through the card?


----------



## Agent Smith1984

Quote:


> Originally Posted by *BackwoodsNC*
> 
> What's the max volts you can put through the card?


Only tested 100+mv on AB so far, but am going to attempt more with Trixx this evening.

What was nice to see was that even though the core hits around 80c under heavy load, the VRM never breaks 64c, so there is plenty of headroom for more voltage.

Seeing 1185MHz stable on a +100mv bump was very encouraging.....

I will monitor the reported voltage later and see if the stock voltage and overclocked voltage are reporting higher (roughly) than what my 290 did.
I am inclined to believe so..... but if that is the case, then it means a few things are true...

1. The Frozr V cooler on this card is doing a great job on the core AND the VRMs
2. If the stock voltage is higher than the 290's (by 50-100mv), this means we have the potential to get in the realm of a 250-300mv+ voltage increase (over the 290) with Trixx, without ever having to touch a new BIOS
3. With that said, even if the silicon was not improved, and all these clocks are coming by way of voltage, we are still being given more potential voltage, so overclocks on this series should be higher either way


----------



## Amhro

Quote:


> Originally Posted by *Agent Smith1984*
> 
> @Amhro
> 
> Can you tell me your MSI 390 Default clocks?
> 
> Mine are not as advertised.
> 
> I am reporting 1040/1500 instead of 1060/1525
> 
> Also, have you had any driver/CCC/game issues.
> 
> Any overclocking yet?
> 
> Thanks


Same here, 1040/1500.
On the official site it says those are the Gaming mode clocks: http://www.msi.com/product/vga/R9-390-GAMING-8G.html#hero-overview
Games are fine so far; the only issue I had with CCC was changing some settings - turned them on, then back off, and the screen went black, had to restart.
Haven't tried overclocking yet, will get to that later









Tried running 4K on a 55" TV and there were a few issues, like a grey line on the right side that I couldn't figure out.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Amhro*
> 
> Same here, 1040/1500.
> On official site it says it's gaming mode http://www.msi.com/product/vga/R9-390-GAMING-8G.html#hero-overview
> Games are fine so far, only issue I had with CCC was changing some settings - turned them on, then back off and screen went black, had to restart.
> Haven't tried overclocking yet, will get to that later
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tried running 4K on 55" TV and there were few issues, like grey line on right side and couldn't figure it out.


DERP!!!

See what happens when you rush things?? lol

I didn't even install the MSI gaming app, or included version of AfterBurner from the CD.

No optical drive in my rig, so I'll DL that app and test later.

May also look for an AB update, to see if there is some missing support in the current version....


----------



## jwcw

The 390 looks great for me at 1080p from my research; I'm just unsure if I should go for that or a 290X at this stage, with them being basically the same price in the UK. If only the 390X was a little less, I could justify getting it.


----------



## razerprecicion

I had to return my 390 because I killed my motherboard and didn't have the money to replace it


----------



## DividebyZERO

Quote:


> Originally Posted by *jwcw*
> 
> 390 looks great for me at 1080p from doing my research just unsure if i should go for that or a 290x at this stage with them been basically the same price in the uk, if only the 390x was a little less i could justify getting it.


290X with 8GB vs 390 with 8GB? They would be pretty much equal. The 390X is faster than the 290X though; even if it's only clock speed, its stock clock is like 10% or more faster.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jwcw*
> 
> 390 looks great for me at 1080p from doing my research just unsure if i should go for that or a 290x at this stage with them been basically the same price in the uk, if only the 390x was a little less i could justify getting it.


The 390 is backed by the latest driver revision, and no one knows if the 290 series will get the benefits of that revision or not.

Also, the 390 is beating the 290x in most benchmarks, and has twice the VRAM.

Not saying it's better, just saying it's newer, faster, and costs the same....

Wait...

Maybe I am saying it's better









For the best look at the 390.... look at this Fury X review...
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/11.html

It includes the 290x, the 390, and the 390x... and these are with reference samples.

It sheds a nice light on both the 390 and the 390x....
But as with the last series, you gotta determine if that $ difference to go from 390 to 390x is worth the performance improvement.

I will say that, with the tess improvements in the 15.15 drivers, that 256 shader increase does a lot more for the 390x than it ever did for the 290x.

In many of those charts I could see a highly clocked 390x performing very closely with a Fury X (especially at 1080/1440), which is probably a bigger testament to Fury's poor release drivers, but it's also a huge statement for how well the "Grenada" is performing TODAY.


----------



## jwcw

Quote:


> Originally Posted by *DividebyZERO*
> 
> 290x with 8gb vs 390 with 8gb? They would be pretty much equal. 390x is faster than 290x though even if its only clock speed its stock clock is like 10% or more faster.


No, I was thinking 290X 4GB or 390 8GB, but I think the 390 is the right choice for me now!

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 390 is backed by the latest driver revisions and no one knows if 290 series will get the benefits of that revision or not.
> 
> Also, the 390 is beating the 290x in most benchmarks, and has twice the VRAM.
> 
> Not saying it's better, just saying it's newer, faster, and costs the same....
> 
> Wait...
> 
> Maybe I am saying it's better
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For the best look at the 390.... look at this Fury X review...
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X/11.html
> 
> It includes the 290x, the 390, and the 390x... and these are with reference samples.
> 
> It sheds a nice light on both the 390 and the 390x....
> But as with the last series, you gotta determine if that $ difference to go from 390 to 390x is worth the performance improvement.
> 
> I will say that, with the tess improvements in the 15.15 drivers, that 256 shader increase does a lot more for the 390x than it ever did for the 290x.
> 
> In many of those charts I could see a highly clocked 390x performing very closely with a Fury X (especially at 1080/1440), which is probably a bigger testament to Fury's poor release drivers, but it's also a huge statement for how well the "Grenada" is performing TODAY.


Thanks, very helpful. I did have a look at the 970 also instead of a 390, but the RAM issue does bother me slightly, and I think the 390 is the correct choice for me. Can't really justify the extra for the 390X, and don't want to wait longer for the Nano.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jwcw*
> 
> No i was thinking 290x 4gb or 390 8gb but think the 390 is the right choice for me now!
> Thanks very helpful i did have a look at the 970 also instead of a 390 but the ram issue does bother me slightly and i think the 390 is the correct choice for me cant really justify the extra for the 390x and dont want to wait longer for the nano


I can't see anyone regretting getting the 390.... It's proving to be a great value!


----------



## Pyrokills

Picked up a 390x. No overclocking as of yet.


----------



## Agent Smith1984

Not sure why, as I have not seen any issues in any other title, but my beloved Dirt Rally is flickering trees from normal to white, over and over, regardless of clock speeds...


----------



## DividebyZERO

Weird, do you have any CCC override settings or changes? I wonder if it's specific to you or a game bug.

Is that at stock?


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> Weird do you have any CCC override settings or changes? I wonder if its specific to you or a game bug
> 
> Is that at stock?


No ccc changes at all.

The white seems to flicker on the trees only....
Completely stock clocks.

Had to be driver right?

Cry3, bf4, fc4, even Skrotum, err, skyrim, all look fine

Update: forgot I found a new setting called "advanced blending" that I turned on yesterday...

Turned this off, and all is perfect!

Phew....


----------



## DividebyZERO

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No ccc changes at all.
> 
> The white seems to flicker on the trees only....
> Completely stock clocks.
> 
> Had to be driver right?
> 
> Cry3, bf4, fc4, even Skrotum, err, skyrim, all look fine
> 
> Update: forgot i found a new setting called advanced blending" that i turned on yesterday...
> 
> Turned this off, and all is perfect!
> 
> Phew....


Advanced blending? was that in the Blendtec section, "Does it blend?"



sorry, its what came to my mind. Good you sorted it out!


----------



## aDyerSituation

How well do these things overclock? Considering they already are

It's either a pair of these, a Fury X, or a 980 Ti and these look a little more promising


----------



## Amhro

Quote:


> Originally Posted by *jwcw*
> 
> No i was thinking 290x 4gb or 390 8gb but think the 390 is the right choice for me now!
> Thanks very helpful i did have a look at the 970 also instead of a 390 but the ram issue does bother me slightly and i think the 390 is the correct choice for me cant really justify the extra for the 390x and dont want to wait longer for the nano


Yep, just yesterday I noticed 3.5GB+ VRAM usage several times; good thing I did not go with the 970







Quote:


> Originally Posted by *Agent Smith1984*
> 
> Probably not even possible, but I'll know if mine is tonight when I try flashing the 390X BIOS over.
> 
> If anything, I'll get higher default clocks out of it, and maybe a heftier power tune profile.


Have you tried it? Results?


----------



## Agent Smith1984

Quote:


> Originally Posted by *aDyerSituation*
> 
> How well do this things overclock? Considering they already are
> 
> It's either a pair of these, a Fury X, or a 980 Ti and these look a little more promising


So far, I seem to be topping out at around 1180/1620

I was expecting the core clock to be about what I hit, but I was hoping for more on the VRAM...

It's very strange. The VRAM will run as high as 1750 with no issues through the Firestrike demo, but then black screen at the end, and I have to reboot to recover from the error.
It will do this until I drop below 1625.

Has anyone ever played with AUX voltage on these cards?

My living room was roasting at 79F (101 outside) yesterday, and I noticed that the cooler seems to do a great job of cooling the card, but it seems to have exposed my airflow limitations, because as soon as the card runs at 82-83c for a few minutes, the case becomes saturated with heat inside, and then temps begin rising again.

I am replacing my stock exhaust fans with some high-flow Corsair units, and also mounting a fan on the bottom portion of my front-mount radiator to get more air on the card.

I am convinced this is a good cooler, as reviews have put temps around 72c (open bench) load, and I am seeing around 78c if I remove my door.


----------



## Ha-Nocri

Downloading some benchmark tools, 3DMark being one of them. The FireStrike test you're running, is it the free (basic) version? Could you run Unigine Heaven, as it is not CPU dependent? These settings:


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Downloading some benchmark tools, 3d Mark being one of them. The FireStrike test u r running, is it free (basic) version? Could you run Unigine Heaven as it is not CPU dependent? These settings:


Will run later.

Very curious to see how Heaven does myself...

I have not had a chance to try 390x BIOS yet. Been slammed.

I'll try and give it a go by the weekend. Tomorrow is my wife's birthday, so I'll probably be away from the PC most of the day.

Adding new members to spreadsheet today.


----------



## joeh4384

It looks like MSI did a good job with the updated cooler. I had their 290X Gaming, and that cooler was mediocre at best: temps on both core and VRM in the low 80s at stock. I ended up snagging a Corsair HG10 and put that on, and it does OK, but it could do a better job on the VRMs.


----------



## Agent Smith1984

Quote:


> Originally Posted by *joeh4384*
> 
> It looks like MSI did a good job with the updated cooler. I had their 290x gaming and that cooler was mediocre at best temps on both core and vrm in low 80s at stock. I ended up snagging a corsair hg10 and put that on and it does ok but could do a better job on the vrms.


Yes, this cooler seems to be doing much better on the VRM temps.

Normally in the 70c range with the overvolt. Assuming the reading is correct, anyway.... GPU-Z only reads VRM1 on this card; no VRM2 temps shown.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Stock 290 Tri-X pulled around 10,800 graphics score in FireStrike
> 
> Stock MSI Gaming 390 is pulling 12,500!!!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/5184408
> 
> Notice Futuremark has mislabeled the card as a 380... They'd better get this fixed quickly!
> 
> An overclocked run so you can see that the clocks aren't fudged on the first run: http://www.3dmark.com/fs/5184788
> FireStrike definitely reports the clocks correctly every time with these cards (had issues with this on the 200 series)!


FireStrike 1200/1600

There is some difference, probably memory timings and maybe some CPU


----------



## Gumbi

Quote:


> Originally Posted by *Ha-Nocri*
> 
> FireStrike 1200/1600
> 
> There is some difference, probably memory timings and maybe CPU some


Pretty nice. Here is my Vapor X 290 (1150/1550) for comparison. I think I can hit 1600, but haven't properly tested it yet.

http://www.3dmark.com/3dm/7498953


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Pretty nice. Here is my Vapor X 290 (1150/1550) for comparison. I think I can hit 1600, but haven't tested right yet.
> 
> http://www.3dmark.com/3dm/7498953


I find it pretty insane that the MSI 390 at stock 1060/1525 clocks is scoring what your card does at 1150/1550.

They really did a nice job tweaking these cards (and their driver).

Thanks for the submissions; it's good to have a few benchmarks from the last gen for the 300-series owners to compare against.


----------



## aDyerSituation

I hope there are water blocks for these cards


----------



## Agent Smith1984

Quote:


> Originally Posted by *aDyerSituation*
> 
> I hope there are water blocks for these cards


I'm researching now, as to whether or not the 290 series blocks will work on these.

Also trying to find out which vendors are using reference design, and which aren't.

These are just so new, and with all eyes on Fury, there isn't as much publicity.

Though if you consider that the Fury X is only 11% faster than the 390X at 1080p and costs 34% more, you have to wonder WHY????

I think the misconception right now is that 390=290
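That price/performance gap is easy to put a number on. A minimal sketch of the perf-per-dollar math, assuming the ~11% and ~34% figures above hold:

```python
def value_ratio(perf_gain_pct: float, price_gain_pct: float) -> float:
    """Perf-per-dollar of the pricier card relative to the cheaper one."""
    return (1 + perf_gain_pct / 100) / (1 + price_gain_pct / 100)

# Fury X vs. 390X at 1080p: ~11% faster for ~34% more money
ratio = value_ratio(11, 34)
print(f"{ratio:.3f}")  # 0.828 -> roughly 17% less performance per dollar
```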


----------



## Ha-Nocri

1200/1600


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> 1200/1600


Thanks

I'll run Heaven at stock, and at max OC later to see the differences against this.
Heaven is a good test since the CPU has very little to do with the score.


----------



## Gumbi

Here's what my Vapor X does, 1150/1550 (not max overclock but very easily gameable). I think + 81mv. I can do a bit more on the core but I haven't bothered perfecting the overclock. The Vapor X cooler still holds its own







VRM 1 max 55, VRM 2 max 51 (beastly VRM cooling on this card) and max on the core 72. 60% fan I believe.

My score is within spitting distance of yours (within margin of error, almost); I think my CPU is helping my minimums or something (I have a golden 4790K, 4.9GHz at 1.32v on air).


----------



## Blameless

Really not seeing this supposed edge the 300 series is supposed to have, clock for clock.

However, I do see several CPU limited 290/290X benchmarks, benchmarks not done with the 300 series drivers, and probable unstable/throttling OCs.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> 390 is backed by the latest driver revisions and no one knows if 290 series will get the benefits of that revision or not.


Been using the 15.15 drivers on my 290 and 290Xes for over a week.

They benefit, just as one would expect, as any Hawaii part.

Improvement was pretty variable, depending on test. Firestrike gained about 300 points. Heaven sees somewhat more significant of a boost, probably because of the heavier tessellation.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> FireStrike 1200/1600
> 
> There is some difference, probably memory timings and maybe CPU some


You're CPU limited. I'm getting a very similar graphics score at 1100/1500 on my 290X with the 15.15 driver, all default driver settings.









Quote:


> Originally Posted by *Ha-Nocri*
> 
> 1200/1600


290X,1100/1500, 15.15 drivers, no CCC tweaks, same Heaven settings as above posts...everything as it is in my current sig:










I did notice an oddity with Heaven and the new drivers. There seems to be more aggressive culling/popup...perhaps measures to limit VRAM asset allocation for Fury?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gumbi*
> 
> 
> 
> Here's what my Vapor X does, 1150/1550 (not max overclock but very easily gameable). I think + 81mv. I can do a bit more on the core but I haven't bothered perfecting the overclock. The Vapor X cooler still holds its own
> 
> 
> 
> 
> 
> 
> 
> VRM 1 max 55, VRM 2 max 51 (beastly VRM cooling on this card) and max on the core 72. 60% fan I believe.
> 
> My score is within spitting distance of yours (withing margin of error almost), I think my CPU is helping my minimums or something (I have a golden 4790k, 4.9ghz at 1.32v air).


Yeah, must be CPU. I think the results are too close for the clock difference. I'm at +100 mv. I might be able to do 1210-1220 maybe. And maybe even higher memory.

My GPU temp is 75c, but VRM1 is also around 75c.

*EDIT: can run 1220/1620, gained 1.0 FPS


----------



## Ha-Nocri

Quote:


> Originally Posted by *Blameless*
> 
> Really not seeing this supposed edge the 300 series is supposed to have, clock for clock.
> 
> However, I do see several CPU limited 290/290X benchmarks, benchmarks not done with the 300 series drivers, and probable unstable/throttling OCs.
> Been using the 15.15 drivers on my 290 and 290Xes for over a week.
> 
> They benefit, just as one would expect, as any Hawaii part.
> 
> Improvement was pretty variable, depending on test. Firestrike gained about 300 points. Heaven sees somewhat more significant of a boost, probably because of the heavier tessellation.
> You're CPU limited. I'm getting a very similar graphics score at 1100/1500 on my 290X with the 15.15 driver, all default driver settings.


FireStrike is definitely less CPU-bound. It has a separate graphics score, after all.

He is getting 13800 with his 380. You would have to really OC that 290X to match it. We need more people with 390s to test, though.


----------



## Blameless

Quote:


> Originally Posted by *Ha-Nocri*
> 
> FireStrike is definitely less CPU-bound.


It does seem that way.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> He is getting 13800 with his 380. You would have to rly OC that 290x to match it.


It's a 390, the benchmark is just reading it wrong.

Anyway, if Firestrike's GPU score scales near linearly with GPU clocks (and it seems to), I'd get over 14100 at 1180/1600. That doesn't seem far off the 390 vs. 390X discrepancy, certainly not by enough to hint at a GPU architectural change between the 290 series and 390 series.

This 290X cannot run that GPU and memory speed simultaneously, unless I put more voltage into it than the +100mV AB can give.
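The near-linear scaling assumption above turns into a quick projection; a sketch, where the 13,150 baseline graphics score is a made-up example value, not a number reported in this thread:

```python
def projected_score(score: float, core_mhz: float, new_core_mhz: float) -> float:
    """Project a graphics score assuming it scales linearly with core clock."""
    return score * new_core_mhz / core_mhz

# Hypothetical baseline: a 13,150 graphics score at 1100 MHz, projected to 1180 MHz
print(round(projected_score(13150, 1100, 1180)))  # 14106 -> "over 14100"
```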


----------



## Ha-Nocri

Quote:


> Originally Posted by *Blameless*
> 
> It does seem that way.
> It's a 390, the benchmark is just reading it wrong.
> 
> Anyway, if Firestrike's GPU score scales near linearly with GPU clocks (and it seems to), I'd get over 14100 at 1180/1600. That doesn't seem far off the 390 vs. 390X discrepancy, certainly not by enough to hint at a GPU architectural change between the 290 series and 390 series.
> 
> This 290X cannot run that GPU and memory speed simultaneously, unless I put more voltage into it than the +100mV AB can give.


Yeah, meant 390.

Don't assume















Run it. With TriXX you can go to +200


----------



## Blameless

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Run it. With TriXX you can go to +200


Not going to put +200mV through my GPU and I'm certainly not going to stress test it with +200mV to the degree I'd need to rule out instabilities affecting scores.

It's easier to do clock for clock comparisons at clocks that all cards can reach without issue.


----------



## rdr09

Quote:


> Originally Posted by *Blameless*
> 
> Not going to put +200mV through my GPU and I'm certainly not going to stress test it with +200mV to the degree I'd need to rule out instabilities affecting scores.
> 
> It's easier to do clock for clock comparisons at clocks that all cards can reach without issue.


why not?

http://www.3dmark.com/3dm/4253714?

Win 10


----------



## Blameless

Quote:


> Originally Posted by *rdr09*
> 
> http://www.3dmark.com/3dm/4253714?
> 
> Win 10


What card and clocks?


----------



## rdr09

Quote:


> Originally Posted by *Blameless*
> 
> What card and clocks?


290 @ 1305MHz.

At lower clocks, my benches are always lower by 200 points compared to Ivy and Haswell. I think it's my aging Sandy and slow RAM.

I've seen 290Xs here score upwards of 15K graphics.


----------



## Particle

Particle
Radeon 390
Sapphire Nitro Tri-X


----------



## Gumbi

Nice, the nitro seems sweet. Read a review or two and it performed well. Is it a redesigned TriX type card? Or is it more on the vapor x level in terms of quality?


----------



## DividebyZERO

Quote:


> Originally Posted by *Ha-Nocri*
> 
> FireStrike is definitely less CPU-bound. It has separate graphics score after all.
> 
> He is getting 13800 with his 380. You would have to rly OC that 290x to match it. We need more ppl with 390's to test tho.


The way things are shaping up with Fury X at and above 4K resolutions, it's looking like I will be getting 390Xs. If I do, I'll be glad to post benchmarks here, but my platform is older and may not help much, aside from the fact that I could test the 390X/290X side by side, perhaps. Just waiting a little longer to get more information on the Fury X.


----------



## Cannon19932006

I'd like to join the club

Validation
http://www.techpowerup.com/gpuz/details.php?id=vd8f9

Card
Gigabyte R9 390X G1

Cooling
Stock


----------



## Agent Smith1984

Awesome!
I'll get new members added tomorrow!


----------



## aDyerSituation

Quote:


> Originally Posted by *Cannon19932006*
> 
> I'd like to join the club
> 
> Validation
> http://www.techpowerup.com/gpuz/details.php?id=vd8f9
> 
> Card
> Gigabyte R9 390X G1
> 
> Cooling
> Stock


Finally someone with this card. Is the Windforce logo RGB like on the 980 Ti?


----------



## Cannon19932006

Quote:


> Originally Posted by *aDyerSituation*
> 
> Finally someone with this card. Is the Windforce logo RGB like on the 980 Ti?


I don't know if it's RGB; I'll have to download their software and see if I can change it. It's blue right now.


----------



## Particle

Quote:


> Originally Posted by *Gumbi*
> 
> Nice, the nitro seems sweet. Read a review or two and it performed well. Is it a redesigned TriX type card? Or is it more on the vapor x level in terms of quality?


I don't have the old one to compare against in order to say from personal experience, but my understanding is that the cooler has been improved over that of the 200 series Tri-X models. I was impressed by the cooling performance in PC Perspective's review which is why I went for this 390 in particular. I really wanted to be sure that I could keep clock speeds high instead of throttling after the beating the reference 290s took because of that. It does appear to be pretty solid and well-made. I'd recommend it to a friend.


----------



## Cannon19932006

Quote:


> Originally Posted by *aDyerSituation*
> 
> Finally someone with this card. Is the Windforce logo RGB like on the 980 Ti?


I'm trying to get it to change, but the Gigabyte software doesn't seem to be able to make any changes to it right now.

To anything on the 390X, I mean. It doesn't seem to be supported.


----------



## aDyerSituation

Quote:


> Originally Posted by *Cannon19932006*
> 
> I'm trying to get it to change, but the Gigabyte software doesn't seem to be able to make any changes to it right now.
> 
> To anything on 390x I mean. doesn't seem to be supported.


That'll be a real bummer if you can't change it


----------



## Cannon19932006

Quote:


> Originally Posted by *aDyerSituation*
> 
> That'll be a real bummer if you can't change it


Yeah, Gigabyte's software doesn't seem to work at all on the 390X. On a side note, I am currently unable to change my voltage in Afterburner. I've toyed with the Afterburner settings and haven't been able to get the voltage slider to unlock at all.


----------



## Gumbi

Quote:


> Originally Posted by *Particle*
> 
> I don't have the old one to compare against in order to say from personal experience, but my understanding is that the cooler has been improved over that of the 200 series Tri-X models. I was impressed by the cooling performance in PC Perspective's review which is why I went for this 390 in particular. I really wanted to be sure that I could keep clock speeds high instead of throttling after the beating the reference 290s took because of that. It does appear to be pretty solid and well-made. I'd recommend it to a friend.


Nice, the nitro seems sweet.

God, don't be tempting :/ I have a perfectly functional 6-month-old Vapor X 290 and a new job, so I'm afraid I'll get trigger happy


----------



## Performer81

Anyone with a PowerColor 390X PCS+ who'd like to offer me his BIOS?


----------



## fyzzz

My 290 is not too far from 390 performance: http://www.3dmark.com/compare/fs/5216832/fs/5184408 I clocked my 290 to the same clocks and scored 191 points lower, but I think drivers can improve it.


----------



## Ha-Nocri

Quote:


> Originally Posted by *fyzzz*
> 
> My 290 is not to far from a 390 performance http://www.3dmark.com/compare/fs/5216832/fs/5184408 i clocked my 290 to the same and got 191 points lower, but i think drivers can improve it.


Maybe it's win8.1 that gives a boost


----------



## fyzzz

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Maybe it's win8.1 that gives a boost


I run Windows 10, and it seems like my 290 is scoring higher than others'. I can almost hit a 14,000 GPU score in Firestrike at 1220/1600.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I run Windows 10, and it seems like my 290 is scoring higher than others'. I can almost hit a 14,000 GPU score in Firestrike at 1220/1600.


Windows 10 probably helps a bit.


----------



## By-Tor

Powercolor fan here and using a 290x LCS w/factory EK waterblock and going to hold out for the 390x LCS. I talked with Powercolor 2 days ago and was told they will be releasing an LCS version...

Will have to wait and see...

They do have a 390x Devil with a built-in water cooler coming out. It's very ugly, and I want to add the card to my loop, which that cooler won't allow..


----------



## Agent Smith1984

New members update!!

Keep em coming guys!!

One thing I wanted to say...

I appreciate all of the 290/290x benchmarks, but I would like to see as many or more 300 series benchmarks being posted too








One thing I don't want to happen is for this thread to turn into "post your 290 firestrike score that beats a 390", because I am well aware that through clock speeds, one can beat the other... but I'd like this thread to be more about how the cards are different than how they are the same.

With that said....

It is good to see some of the higher scoring 290/x benchmarks on the 15.15 driver.
In my opinion, Hawaii is still a very capable GPU. Obviously it has a new big brother now, but the initial Fury X results don't show it in a totally dominating light... more of a standard 15-20% improvement over Hawaii, typical of any next-gen GPU offering, rather than anything groundbreaking (especially at 1080p and 1440p resolutions, where the 300 series is dominant from a value standpoint).

Also wanted to say my apologies for the slowness at which information is being added to the main page.
My wife's birthday is today, and I'm trying to get all that sorted.

Once the weekend comes, there should be a lot of good bits being added.

Any suggestions for information you would like to see added are MORE THAN WELCOME!!!


----------



## Gumbi

Memory overclocks seem great on the 390s. Lowest I've seen so far is 1600, which is excellent.


----------



## jwcw

Just about to go and get one from a local shop. Just wondering how bad the bottleneck with my i7 920 will be; I'm sure it will bottleneck it to a certain extent?


----------



## Agent Smith1984

Quote:


> Originally Posted by *jwcw*
> 
> Just about to go and get one from local shop, just wondering how bad the bottle neck with my i7 920 will be im sure it will bottleneck it too a certain extent?


It will likely pose somewhat of a bottleneck, but depending on the title, it may not actually reveal itself as a gameplay issue. You may only "find" the bottleneck through utilization monitoring.









I would just suggest hitting the maximum OC possible on your CPU.


----------



## jwcw

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It will likely pose somewhat of a bottleneck, but depending on the title, it may not actually reveal itself as a gameplay issue. You may only "find" the bottleneck through utilization monitoring.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I would just suggest hitting the maximum OC possible on your CPU.


Going to just boost it to 3.2 since it's on water anyway. I will be upgrading this very old CPU setup in the near future, but I'm waiting for DDR4.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jwcw*
> 
> Going to just boost it to 3.2 since it's on water anyway. I will be upgrading this very old CPU setup in the near future, but I'm waiting for DDR4.


You are aware DDR4 is already widely available on the X99 platform, right??


----------



## jwcw

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are aware DDR4 is already widely available on the X99 platform, right??


I'm really not in the know anymore. I just asked for some CPU/mobo advice and got told to wait for the new platform, but I'm fed up of not being able to max the games I play, so that's why I want the 390


----------



## Agent Smith1984

Quote:


> Originally Posted by *jwcw*
> 
> I'm really not in the know anymore. I just asked for some CPU/mobo advice and got told to wait for the new platform, but I'm fed up of not being able to max the games I play, so that's why I want the 390


Well, you have a major upgrade in store for yourself, my friend!


----------



## sabag123




----------



## Agent Smith1984

Quote:


> Originally Posted by *sabag123*


Thanks for joining sabag123!

You have been added to the owners sheet.

Please update us with any overclocking/voltage/benchmark results you may have!!


----------



## sabag123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks for joining sabag123!
> 
> You have been added to the owners sheet.
> 
> Please update us with any overclocking/voltage/benchmark results you may have!!


thank you


----------



## Agent Smith1984

Quote:


> Originally Posted by *sabag123*
> 
> How can I change the fan speed? I've tried with Afterburner and it always goes back to normal,
> and I've tried with AMD OverDrive too


sabag123

You will need to click the small settings "gear" button in the fan speed section on MSI afterburner.

It will turn that section red, meaning that the user mode fan setting has been enabled.

Then select the larger setting gear button below the fan section, go to the fan tab, and create your own profile.

That should get you going.

PS: Make sure you are using the latest version of Afterburner!!


----------



## DividebyZERO

Quote:


> Originally Posted by *jwcw*
> 
> I'm really not in the know anymore. I just asked for some CPU/mobo advice and got told to wait for the new platform, but I'm fed up of not being able to max the games I play, so that's why I want the 390


If your sig rig is correct, look into Xeons for your board. There are many threads dedicated to using overclocked Xeons to go from 4 cores to 6, and 4GHz is common. You may need to update your BIOS. The Xeons sell for as little as $70, so it's very cheap as well.

http://www.overclock.net/t/1461359/official-xeon-x5660-x58-review-discussion-and-xeon-l5639-benchmarks-inside

There is also an x58 owners club both have plenty of people to help you.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> If your sig rig is correct, look into Xeons for your board. There are many threads dedicated to using overclocked Xeons to go from 4 cores to 6, and 4GHz is common. You may need to update your BIOS. The Xeons sell for as little as $70, so it's very cheap as well.
> 
> http://www.overclock.net/t/1461359/official-xeon-x5660-x58-review-discussion-and-xeon-l5639-benchmarks-inside
> 
> There is also an x58 owners club both have plenty of people to help you.


That's a great suggestion.

At one point I was hunting a cheap x58/XEON setup myself to upgrade from my thuban, but I went a different route.

@jwcw A 6 core x58 XEON at 4GHz would be a nice upgrade, and give you plenty of CPU power for years to come.


----------



## sabag123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> sabag123
> 
> You will need to click the small settings "gear" button in the fan speed section on MSI afterburner.
> 
> It will turn that section red, meaning that the user mode fan setting has been enabled.
> 
> Then select the larger setting gear button below the fan section, go to the fan tab, and create your own profile.
> 
> That should get you going.
> 
> PS: Make sure you are using the latest version of Afterburner!!


omg ... I had the MSI Gaming App active in the system tray ... that's why I couldn't change my fan speed.
37C idle


----------



## jwcw

Just got the MSI 390









Have overclocked my 920 a little and doing some stress testing, will post proof to join the club shortly.


----------



## sidfirex

Hi Guys

A new owner of R9 390 here - Gigabyte Gaming G1.

Coming from a SLI 780's lol still a bit shy ;-P but watching this topic with interest.









What I am loving about this card is that I am running Eyefinity at 5760 x 1080, and it seems to be better than my GTX 780 SLIs in terms of raw performance. I'm also liking the overclocking ability, albeit this Gigabyte G1 model I have has its voltage locked.

edit:

my GPU-Z validation link: http://www.techpowerup.com/gpuz/details.php?id=e7rp7


----------



## sabag123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's a great suggestion.
> 
> At one point I was hunting a cheap x58/XEON setup myself to upgrade from my thuban, but I went a different route.
> 
> @jwcw A 6 core x58 XEON at 4GHz would be a nice upgrade, and give you plenty of CPU power for years to come.


Can you share your msi afterburner fan settings ?
I don't know how to set it for idle and load


----------



## By-Tor

Are the 390/390x's using reference PCB's?


----------



## Agent Smith1984

Quote:


> Originally Posted by *sabag123*
> 
> Can you share your msi afterburner fan settings ?
> I don't know how to set it for idle and load


I use a custom profile that curves with the temps...

I use 35% as a minimum up to 50c, then I start to climb much faster...

I peak at 100% at 90c, but the card never breaks 83c at around 80% fan, and that is with overvoltage....

The VRM does great though. Only 70c usually!!!

Here is a pic I found on Google of a custom profile similar to mine:
http://tinyurl.com/970FanCurve
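The kind of profile described above is just linear interpolation between (temperature, fan %) breakpoints. A minimal sketch (the `fan_percent` helper and its breakpoints are illustrative, not the exact Afterburner profile):

```python
# Sketch of an Afterburner-style fan curve: a 35% floor up to 50C,
# then a steeper climb that reaches 100% at 90C. Breakpoints are
# illustrative values, not anyone's exact profile.
def fan_percent(temp_c, curve=((0, 35), (50, 35), (70, 60), (90, 100))):
    """Linearly interpolate fan duty (%) between (temp C, %) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # past the last breakpoint: pin the fan at 100%

print(fan_percent(40))  # 35.0 (idle floor)
print(fan_percent(80))  # 80.0 (climbing fast)
print(fan_percent(95))  # 100
```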


----------



## Gumbi

Quote:


> Originally Posted by *sidfirex*
> 
> Hi Guys
> 
> A new owner of R9 390 here - Gigabyte Gaming G1.
> 
> Coming from a SLI 780's lol still a bit shy ;-P but watching this topic with interest.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What I am loving about this card is, I am running EYEFINITY at 5760 x 1080 and it seems to be better than my GTX 780 SLi's in terms of raw performance and also liking the overclocking ability albeit this Gigabyte G1 model I have has its voltage locked.
> 
> edit:
> 
> my GPU-Z validation link: http://www.techpowerup.com/gpuz/details.php?id=e7rp7


Download the latest Afterburner and make sure you tick the unlock voltage box. You might need a new BIOS if that doesn't unlock it for you.

Decent oc though for not having to touch voltage.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Are the 390/390x's using reference PCB's?


Still researching this....

I will be creating a section on Post #1 listing what cards use reference PCB, if not all....

Stay tuned.


----------



## jwcw

Here is the proof! Thanks for the CPU advice i will scout out ebay for one of those chips


----------



## Agent Smith1984

Quote:


> Originally Posted by *jwcw*
> 
> Just got the MSI 390
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have overclocked my 920 a little and doing some stress testing, will post proof to join the club shortly.


Pending proof, welcome!!
Quote:


> Originally Posted by *sidfirex*
> 
> Hi Guys
> 
> A new owner of R9 390 here - Gigabyte Gaming G1.
> 
> Coming from a SLI 780's lol still a bit shy ;-P but watching this topic with interest.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What I am loving about this card is, I am running EYEFINITY at 5760 x 1080 and it seems to be better than my GTX 780 SLi's in terms of raw performance and also liking the overclocking ability albeit this Gigabyte G1 model I have has its voltage locked.
> 
> edit:
> 
> my GPU-Z validation link: http://www.techpowerup.com/gpuz/details.php?id=e7rp7


Added and welcome aboard!!

Please update us on overclocking when and if you get your voltage unlocked.

Benchmarks are welcome also!


----------



## sabag123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I use a custom profile that curves with the temps...
> 
> I use 35% as a minimum up to 50c, then I start to climb much faster...
> 
> I peak at 100% at 90c, but the card never breaks 83c at around 80% fan, and that is with overvoltage....
> 
> The VRM does great though. Only 70c usually!!!
> 
> Here is a pic I found on Google of a custom profile similar to mine:
> http://tinyurl.com/970FanCurve


Something like that ? http://i.imgur.com/RdQ8wWF.jpg


----------



## Agent Smith1984

Quote:


> Originally Posted by *sabag123*
> 
> Something like that ? http://i.imgur.com/RdQ8wWF.jpg


Yep!

That should keep you nice and cool.


----------



## Blameless

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Maybe it's win8.1 that gives a boost


Windows 8.1 does use a newer WDDM revision and probably makes at least some degree of difference.

I'll be on 7/Server 2008 R2 at least until Windows 10 shows up, however.
Quote:


> Originally Posted by *Gumbi*
> 
> Memory overclocks seem great on the 390s. Lowest I've seen so far is 1600, which is excellent.


Memory definitely seems better on average.

Anyone able to provide the exact memory IC model from their 390 or 390X?
Quote:


> Originally Posted by *By-Tor*
> 
> Are the 390/390x's using reference PCB's?


There are no reference 300 series cards, to the best of my knowledge.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Blameless*
> 
> There are no reference 300 series cards, to the best of my knowledge.


I am researching this...

So far, the XFX has been identified as a reference PCB.

I am beginning to wonder if they are all reference.

Still gathering info though.

I am doing a noctua TIM on my card this weekend, and will get pics of the PCB and verify for the MSI also.


----------



## Clockster

I was so tempted into buying 2x 390X today


----------



## Agent Smith1984

Quote:


> Originally Posted by *Clockster*
> 
> I was so tempted into buying 2x 390X today


I think 2x 390 is a better value....

Would handily beat out a Fury X or 980 ti for the same amount of money...

Of course, that's in titles that utilize CF, which is almost everything now (though sometimes drivers need a few tweaks after launch).

390x in CF would be really potent though, since you'd gain a total of 512 shaders.


----------



## russik

reserved


----------



## By-Tor

I'm looking at a pair of Powercolor 390's, but that depends on the release of water blocks...


----------



## Minotaurtoo

Quote:


> Originally Posted by *Blameless*
> 
> Really not seeing this supposed edge the 300 series is supposed to have, clock for clock.
> 
> However, I do see several CPU limited 290/290X benchmarks, benchmarks not done with the 300 series drivers, and probable unstable/throttling OCs.
> Been using the 15.15 drivers on my 290 and 290Xes for over a week.
> 
> They benefit, just as one would expect, as any Hawaii part.
> 
> Improvement was pretty variable, depending on test. Firestrike gained about 300 points. Heaven sees somewhat more significant of a boost, probably because of the heavier tessellation.
> You're CPU limited. I'm getting a very similar graphics score at 1100/1500 on my 290X with the 15.15 driver, all default driver settings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 290X,1100/1500, 15.15 drivers, no CCC tweaks, same Heaven settings as above posts...everything as it is in my current sig:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did notice an oddity with Heaven and the new drivers. There seems to be more aggressive culling/popup...perhaps measures to limit VRAM asset allocation for Fury?


A little late here... but I just had to post a result from my old twins... they've aged well lol. This is why if I upgrade it'll be to the Fury... only because I can't even begin to justify a horizontal move (actually backing up a little). But the fact that a single card is even almost keeping up with my cards is impressive.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> I'm looking at a pair of Powercolor 390's, but that depends on the release of water blocks...


If my theory that ALL 390 cards currently being sold are reference holds, then we have all the blocks we need, fellas!!

I will be updating in the cooling section as we go.

Any information anyone would like to submit would be helpful as well.
I will get it added to the main post, with your credentials listed!


----------



## Minotaurtoo

I'll ask this here as I have a genuine interest in this.... can anyone here beat this graphics score with a single 390/390x card? (not interested in composite) http://www.3dmark.com/3dm/7376026?


----------



## Blameless

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If my theory that ALL 390 cards currently being sold are reference holds, then we have all the blocks we need, fellas!!


I just looked up three random 390(X)s and all had readily apparent differences in PCB layout or shape, sometimes even size.

Take a look at the Sapphire Tri-X, The MSI Gaming, and the XFX Double DD...all different PCBs.

AMD doesn't have a reference 300 series PCB or cooler this time around, and I haven't been able to find any two 390(X)s from different brands that have identical PCBs, even though some are very similar.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Blameless*
> 
> I just looked up three random 390(X)s and all had readily apparent differences in PCB layout or shape, sometimes even size.
> 
> Take a look at the Sapphire Tri-X, The MSI Gaming, and the XFX Double DD...all different PCBs.
> 
> AMD doesn't have a reference 300 series PCB or cooler this time around, and I haven't been able to find any two 390(X)s from different brands that have identical PCBs, even though some are very similar.


So strange, because at E3 they showed the reference design cards, but you can't find one anywhere?

I guess it was a paper/OEM launch on the reference design only....


----------



## aDyerSituation

Oh lord help us all in finding waterblocks if they are all different


----------



## jwcw

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I'll ask this here as I have a genuine interest in this.... can anyone here beat this graphics score with a single 390/390x card? (not interested in composite) http://www.3dmark.com/3dm/7376026?


Can I just ask a really stupid question in regards to that? I've just run 3DMark Vantage and got a 9260, but I think I'm quite CPU limited. I've got Corsair DDR3 like you; why does our RAM show 667MHz? Do we times that number by three? Sorry if this sounds stupid.


----------



## aDyerSituation

Quote:


> Originally Posted by *jwcw*
> 
> Can i just ask a really stupid question in regards to that, ive just ran my 3D vantage score got a 9260 but think im quite cpu limited, ive got DDR3 corsair like you, why does our ram show 667mhz? Do we times that number by three, sorry if this sounds stupid.


Should be by 2


----------



## Blameless

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So strange, because at E3 they showed the reference design cards, but you can't find one anywhere?
> 
> I guess it was a paper/OEM launch on the reference design only....


Could have been samples on the 290 reference PCB.

When I look at 390/390X reviews, I don't see any cards with the AMD logo that is almost always stenciled on reference PCBs, nor can I find perfectly identical PCBs from two different brands.

It's likely that not all manufacturers are using their own PCBs, but at the very least there are several different designs being used. Go to Newegg and look at the images of the backs of the 390s, some of the PCBs aren't even the same height/shape as others.

MSI and ASUS are definitely using taller custom PCBs, and not the same one either.


----------



## snow cakes

Quote:


> Originally Posted by *aDyerSituation*
> 
> Oh lord help us all in finding waterblocks if they are all different


----------



## Particle

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So strange, because at E3 they showed the reference design cards, but you can't find one anywhere?
> 
> I guess it was a paper/OEM launch on the reference design only....


AMD very likely does have a reference layout, and I'm sure that's the one they used for testing and development. I think the best explanation would be to look at this launch from the perspective of AMD's board partners. If AMD was going to let its partners ship custom cooling solutions from the beginning, as was the case with the 300 series launch, there would be very little reason for partners to ship models with the inferior reference design when PCBs and corresponding coolers from each partner's physically identical 200 series products already existed.


----------



## Cannon19932006

Quote:


> Originally Posted by *sidfirex*
> 
> Hi Guys
> 
> A new owner of R9 390 here - Gigabyte Gaming G1.
> 
> Coming from a SLI 780's lol still a bit shy ;-P but watching this topic with interest.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What I am loving about this card is, I am running EYEFINITY at 5760 x 1080 and it seems to be better than my GTX 780 SLi's in terms of raw performance and also liking the overclocking ability albeit this Gigabyte G1 model I have has its voltage locked.
> 
> edit:
> 
> my GPU-Z validation link: http://www.techpowerup.com/gpuz/details.php?id=e7rp7


Let me know if you get that voltage unlocked, I'd be very interested in that.


----------



## Minotaurtoo

I got my volts unlocked on my old giga cards only by bios modding... you could try that... but you could get a brick for christmas.... lol


----------



## Cannon19932006

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I got my volts unlocked on my old giga cards only by bios modding... you could try that... but you could get a brick for christmas.... lol


What did you use to mod the bios?


----------



## Particle

Quote:


> Originally Posted by *jwcw*
> 
> Can i just ask a really stupid question in regards to that, ive just ran my 3D vantage score got a 9260 but think im quite cpu limited, ive got DDR3 corsair like you, why does our ram show 667mhz? Do we times that number by three, sorry if this sounds stupid.


SDRAM runs the IO bus at the same speed as the underlying memory and uses single edge signalling. This is the classic way to clock memory.
DDR runs the IO bus at the same frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of two times that of SDRAM.
DDR2 runs the IO bus at two times the frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of four times that of SDRAM.
DDR3 runs the IO bus at four times the frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of eight times that of SDRAM.
DDR4 runs the IO bus at eight times the frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of sixteen times that of SDRAM.

Single edge signalling (retronymed as SDR) means that signals (commands, data) are latched on either the rising or falling edge of the clock signal but not both. Double edge signalling (DDR) means that signals are latched on both the rising and falling edges of a clock signal. Quadruple edge signalling also exists where signals are latched on the rising and falling edges of two clock signals that are out of phase with respect to each other, but this technique isn't used for PC memory. It was most notably deployed in PCs as part of Intel's front side bus used with processors spanning from the introduction of the Pentium 4 until it was phased out in favor of QPI with the launch of Nehalem.

In memory products where the IO bus runs at a higher speed than the underlying memory array itself, the internal memory array must be correspondingly wider. That is to say that a memory chip with an x-bit wide IO bus running the IO bus at y times the frequency of the memory will internally have a memory array that is x*y bits wide and the memory chip controller will clock out data from across that wide internal interface over multiple clock cycles of the IO bus even though the memory itself isn't changing states any faster than plain DDR memory at that frequency. (This part is technically speculation, but it stands to reason.)

Unfortunately, the way memory speeds are reported are almost universally incorrect and inconsistent in the way that they are incorrect due to all of the possible points where "effective" rates may be calculated.
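To put a number on the "667 MHz" case being asked about: the reading most tools show is the IO bus clock, and double edge signalling doubles it into the effective transfer rate the modules are marketed under. A minimal sketch (the `ddr_effective_mts` helper name is my own, purely illustrative):

```python
# DDR* latches data on both clock edges, so the effective transfer
# rate (megatransfers/s) is twice the IO bus clock that monitoring
# tools typically report.
def ddr_effective_mts(io_clock_mhz):
    """Effective MT/s from the reported IO bus clock (MHz)."""
    return io_clock_mhz * 2

print(ddr_effective_mts(667))  # 1334 -> marketed as DDR3-1333
print(ddr_effective_mts(800))  # 1600 -> DDR3-1600
```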


----------



## jwcw

Quote:


> Originally Posted by *Particle*
> 
> SDRAM runs the IO bus at the same speed as the underlying memory and uses single edge signalling. This is the classic way to clock memory.
> DDR runs the IO bus at the same frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of two times that of SDRAM.
> DDR2 runs the IO bus at two times the frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of four times that of SDRAM.
> DDR3 runs the IO bus at four times the frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of eight times that of SDRAM.
> DDR4 runs the IO bus at eight times the frequency of the underlying memory and uses double edge signalling. This yields an effective data rate of sixteen times that of SDRAM.
> 
> Single edge signalling (retronymed as SDR) means that signals (commands, data) are latched on either the rising or falling edge of the clock signal but not both. Double edge signalling (DDR) means that signals are latched on both the rising and falling edges of a clock signal. Quadruple edge signalling also exists where signals are latched on the rising and falling edges of two clock signals that are out of phase with respect to each other, but this technique isn't used for PC memory. It was most notably deployed in PCs as part of Intel's front side bus used with processors spanning from the introduction of the Pentium 4 until it was phased out in favor of QPI with the launch of Nehalem.
> 
> In memory products where the IO bus runs at a higher speed than the underlying memory array itself, the internal memory array must be correspondingly wider. That is to say that a memory chip with an x-bit wide IO bus running the IO bus at y times the frequency of the memory will internally have a memory array that is x*y bits wide and the memory chip controller will clock out data from across that wide internal interface over multiple clock cycles of the IO bus even though the memory itself isn't changing states any faster than plain DDR memory at that frequency. (This part is technically speculation, but it stands to reason.)
> 
> Unfortunately, the way memory speeds are reported are almost universally incorrect and inconsistent in the way that they are incorrect due to all of the possible points where "effective" rates may be calculated.


Appreciated. I changed the frequency in my BIOS, as it was on auto before, which held it at 667 rather than 800.

Back to the 390: I have played GTA V on it all night with everything maxed and it's a dream to play! Very pleased with what I have seen so far.


----------



## aDyerSituation

Quote:


> Originally Posted by *jwcw*
> 
> Appreciated, changed the frequency in my bios as it was on auto before which held it at 667 rather than the 800
> 
> Back to the 390 have played GTA V on it all night with everything maxed and its a dream to play! Very pleased with what I have seen so far.


My 970 cannot handle ultra settings even with only 2x msaa.

What is your fps in grassy areas and in general?? My 970 craps itself when I go into the mountains/desert with grass on very high.


----------



## Particle

Are you both playing at the same resolution?


----------



## Minotaurtoo

Quote:


> Originally Posted by *Cannon19932006*
> 
> What did you use to mod the bios?


ATI Winflash is what I used to pull the BIOS off the card and to program it back on after modding. The program I used for the modding itself is found here: http://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/

I have no idea if it will work with all cards or not, but it worked on the R9 280 BIOS I downloaded and the stock BIOS on my cards...


----------



## aDyerSituation

Quote:


> Originally Posted by *Particle*
> 
> Are you both playing at the same resolution?


1080


----------



## Arizonian

Congrats to the 390 and 390X owners as you get your cards. Staryoshi our GPU editor has authorized the *[Official]* status. Great job on the OP Agent Smith1984 thank you for your taking on the club!

With this said, I'd like to remind members once again this is now an owners thread. A place for owners to be able to discuss their GPU's freely.


----------



## Cannon19932006

Quote:


> Originally Posted by *Minotaurtoo*
> 
> ati winflash is what i used to pull the bios off the card and to program it back on after modding. The program I used for modding it is found here http://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/
> 
> I have no idea if it will work with all cards or not, but it worked on the R9 280 bios I downloaded and the stock bios on my cards...


It wouldn't open the bios file.


----------



## Cannon19932006

Here's my firestrike run with a small overclock (as high as it will stably go without being able to control voltage.)
http://www.3dmark.com/fs/5224963

Ended up at 1110MHz core, and 1700MHz memory.


----------



## Particle

While my games were working at 1111 MHz, I ultimately had to drop my clock a bit to avoid artifacts in Firestrike Extreme. I tried for 1081 MHz and that seemed to work without problems. After that, I raised the memory clock from 1500 MHz to 1600 MHz and it didn't seem to artifact either. I'll try 1081/1625 in games and FE later on.

Ultimately, I doubt it's worth fighting very hard for overclocks. We're probably talking 5% performance at the cost of reduced stability.

The Sapphire card seems to have issues driving multiple displays, at least when using an active DP adapter of mine. Depending on the DP port I select it's sometimes flickery all the time or just every once in a while. I believe I first noticed this pre-overclock, not after.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Cannon19932006*
> 
> Here's my firestrike run with a small overclock (as high as it will stably go without being able to control voltage.)
> http://www.3dmark.com/fs/5224963
> 
> Ended up at 1110MHz core, and 1700MHz memory.


That's a bit better than my 290 @ 1200/1600. Win7 too?


----------



## Amhro

Quote:


> Originally Posted by *jwcw*
> 
> Appreciated, changed the frequency in my bios as it was on auto before which held it at 667 rather than the 800
> 
> Back to the 390 have played GTA V on it all night with everything maxed and its a dream to play! Very pleased with what I have seen so far.


Yup, same here, GTA V is now so beautiful when everything is maxed








With my old 6870 it was really...sad








I also tried playing on a 4K TV; I had to change some settings, but it was still beautiful.
I'm really glad I went with 8GB; it took 6-7GB at 4K, and it usually takes around 3.9GB at 1080p


----------



## sidfirex

Here is my Gigabyte G1 R9 390 (non-x) Firestrike results: http://www.3dmark.com/3dm/7520821

for some reason it shows up as an R9 380 lol

with the core clocked to 1097MHz and memory at 1600MHz

still unable to unlock voltage; I've tried everything (MSI AB mods, TriXX, etc.), so I guess it is hardware locked.


----------



## Minotaurtoo

hmm... disappointing... I was really hoping that someones graphics score on firestrike would give me a good reason to abandon my old cards in favor of one beast of a card.... so far only Fury, Nvidias Titan X and 980 ti give me anything comparable.... but, I'd still consider it an upgrade since I could get away from crossfire and back to a single card... My max oc is a bit higher than this, but this is my daily clocks for my cards...again I'm only interested in graphics score, not composite as it will vary with cpu to cpu. http://www.3dmark.com/3dm/7375646? as mentioned before these are actually 7950's I flashed over to R9 280's... I want to see a graphics score of 16000 or more on a single card even if its a slight OC to get it... After I get back from vacation I do believe I'm ordering a Fury X , and will be selling my old cards on ebay if anyone is interested send me a note.

Oh, I've heard the old perf-per-dollar bit... but by that argument one could seriously favor the old 290s now, lol... even my cards would look good in that light. But I have my reasons for wanting one single beast to take the place of my two cards: power consumption being one, but mostly games that will not use CrossFire... ugh.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Minotaurtoo*
> 
> hmm... disappointing... I was really hoping that someones graphics score on firestrike would give me a good reason to abandon my old cards in favor of one beast of a card.... so far only Fury, Nvidias Titan X and 980 ti give me anything comparable.... but, I'd still consider it an upgrade since I could get away from crossfire and back to a single card... My max oc is a bit higher than this, but this is my daily clocks for my cards...again I'm only interested in graphics score, not composite as it will vary with cpu to cpu. http://www.3dmark.com/3dm/7375646? as mentioned before these are actually 7950's I flashed over to R9 280's... I want to see a graphics score of 16000 or more on a single card even if its a slight OC to get it... After I get back from vacation I do believe I'm ordering a Fury X , and will be selling my old cards on ebay if anyone is interested send me a note.
> 
> oh, i've heard the old perf per dollar bit.... but by that argument one could seriously favor the old 290's now lol... even my cards would look good in that light.. but I have my reasons for wanting one single beast take the place of my two cards... power consumption being one, but mostly games that will not use crossfire...ugh.


Yeah, not sure why people are buying the 390 over the 290:


Hope AMD can make CF work with FreeSync before I get my 1440p monitor... and that I can find a used 290 for 150 euros.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Yeah, not sure why ppl r buying 390 over 290:
> 
> 
> Hope AMD can make CF work with freeSync before I get my 1400p monitor... and to find an used 290 for 150 euros


If it weren't for the fact that CrossFire is causing trouble for me in a couple of games, and in others just flat out not working, I wouldn't even come close to thinking of upgrading. But I need every ounce of performance these cards have to maintain acceptable FPS in my favorite games, so I can't back off power-wise. Here is what I'm looking at:

http://www.3dmark.com/compare/fs/5223480/fs/5119132/fs/5119434

You'll notice that at near max clocks mine ties the Fury X in almost all GPU-dependent tests. I don't use max clocks (1200MHz) except for benching, and can't seem to find a link at the moment from when I last used them, but 1150 is about the highest at which I can maintain respectable temps anyway. I figure that if I get a Fury X and mod it the same way I modded these, I'll be able to get better performance at lower power draw, with a lot fewer CrossFire issues.

I was really hoping the 290X would top my two, but even after a long run of attempts in another thread, no one could top my score... dang it... Then I came over here hoping that maybe the 390s would do it, but alas, it looks like the 980 Ti and Fury X are the only single cards that can. And I really want away from CrossFire... it's annoying. At least I should be able to get some money back from selling these two cards and some other old PC parts on eBay, so it won't cost me the whole $650 for the Fury. But then there's the adapter situation, ugh... I'll have to get three DisplayPort-to-VGA adapters, because even though my monitors are not old, I wasn't paying attention and all they have is VGA, DVI, and HDMI. Crap. And I discovered through trial and error that VGA adapters just seem to work more easily.


----------



## rdr09

Quote:


> Originally Posted by *Minotaurtoo*
> 
> if it weren't for the fact that crossfire is causing trouble for me in a couple games and in others just flat out not working; I wouldn't even come close to thinking of upgrading... but I need ever ounce of performance that these cards have to maintain acceptable fps in my fav games so I can't back off power wise... here is what I'm looking at:
> 
> http://www.3dmark.com/compare/fs/5223480/fs/5119132/fs/5119434
> 
> you'll notice that at near max clocks mine ties the fury x in almost all gpu dependent tests... I don't use max clocks (1200mhz) except for benching and can't seem to find a link atm of when I last used them... but the 1150 is about the highest I can maintain any respectable temps anyway. I figure that if I get a fury x and mod it the same way I modded these, then I will be able to get better performance at less power draw and with a lot less crossfire issues...
> 
> I was really hoping the 290x would top my two, but even after a long run of attempts in another thread no one could top my score..dang it... then I came over here hoping that maybe the 390's would do it.... but alas it looks like 980ti or fury x is the only single cards that can do it... and I really want away from crossfire... its annoying... at least I should be able to get some money back from selling these two cards and some other old pc parts on ebay so it won't cost me the whole 650$ for the fury... but then the adapter situation ugh.. will have to get 3 display port to vga adapters... because even though my monitors are not old I wasn't looking and all they have is vga, dvi, and hdmi... crap... and I discovered through trial and error that vga adapters just seem to work easier.


The 290X is about 30% faster than the 7950, and two 7950s are about 30% faster than a single 290X. Wait for the Fury (non-X).

I used to own 7900 series cards and crossfired them to max out games @ 1080p. The 290 at stock does the same in games like BF4, C3, etc. Don't base it on a single benchmark.

Your best bet would be the Fury (non-X).
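Chaining those rough ratios gives a quick sense of the gap (a back-of-the-envelope sketch; the 30% figures are the estimates above, not measured data):

```python
# Rough relative-performance chain, normalized to a single 7950 = 1.0.
# The 30% multipliers are ballpark forum estimates, not benchmark results.
r9_290x = 1.30              # 290X ~30% faster than a 7950
dual_7950 = 1.30 * r9_290x  # two 7950s ~30% faster than a single 290X
cf_scaling = dual_7950 / 2  # implied per-card CrossFire scaling

print(round(dual_7950, 2))   # ~1.69x a single 7950
print(round(cf_scaling, 2))  # ~0.85, i.e. ~85% CF scaling
```

By that chain, a single card would need roughly a 1.7x lead over a 7950 to match the CrossFire pair, which is why only the top single-GPU cards get close.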


----------



## By-Tor

My 290x is still doing well after seeing some of the scores posted. Going to just add a second card for now...


----------



## DividebyZERO

So the 3xx cards are not voltage adjustable (same as Fury), and yet everyone rushes to compare overclocks. When voltage is unlocked, it would be a fair comparison, would it not?
The results may not change much, but right now people keep comparing apples to oranges.


----------



## Minotaurtoo

Quote:


> Originally Posted by *rdr09*
> 
> the 290X is about 30% faster than the 7950. 2 7950s are about 30% faster than a single 290X. Wait for the Fury (nonX).
> 
> i use to own a 7900 series cards and have crossfired them to max games @ 1080. the 290 at stock does the same in games like BF4, C3, etc. Don't base it on a single benchmark.
> 
> Your best bet would be the Fury (nonX),


Is there an exact release date for it yet? I can wait... but







I have to admit I've kinda got my heart set on the x version.


----------



## rdr09

Quote:


> Originally Posted by *Minotaurtoo*
> 
> is there an exact date of release on it yet? I can wait... but
> 
> 
> 
> 
> 
> 
> 
> I have to admit I've kinda got my heart set on the x version.


Next month, I read. It should be a good replacement for the 7950s, and should be around $550.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> So the 3xx are not voltage adjustable, same as fury and everyone rushes to compare overclocks. When voltage is unlocked then it would be a fair comparison would it not?
> The results may not change much but right now people keep comparing apples to oranges.


I have full voltage control on my MSI 390.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have full voltage control on my msi 390


I'm sure voltage control is going to be more widely available soon... but how did you do it? Was it just available by default, or did you have to mod something?


----------



## CelticGamer

Am I understanding correctly that there is no R9 390/390X reference card being sold?


----------



## Ha-Nocri

Quote:


> Originally Posted by *CelticGamer*
> 
> Am I understanding correctly that there is no R9 390/390X reference card being sold?


yep, no reference


----------



## Agent Smith1984

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I'm sure that voltage control is going to be more available soon.. but how did you do it? was it just available by default or did you have to mod something?


Had it by default, carried over from my previous 290 Afterburner settings.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Had it by default, from my previous 290 ab settings.


interesting.. and nice...


----------



## Agent Smith1984

Quote:


> Originally Posted by *Minotaurtoo*
> 
> interesting.. and nice...


Trixx also works, all the way up to +200mV.


----------



## Performer81

The 390 has the same PCB as the 290, and those are all voltage adjustable.


----------



## By-Tor

Since the 390/390X series are Hawaii GPUs, can they be crossfired with the last-gen 290/290X cards?


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have full voltage control on my msi 390


Here's an interesting question: I have no voltage control on my Gigabyte 390X. Would there even be a chance of the MSI 390X BIOS working on this card?


----------



## Gumbi

Quote:


> Originally Posted by *Cannon19932006*
> 
> Here's an interesting question, I have no control over voltage on my Gigabyte 390X, would there even be a chance of the msi 390x bios working on this card?


Quite possibly. I know the Gigabyte 290(X)s also had locked voltage, but I'm fairly sure a BIOS change unlocked it for them.


----------



## vavyn

Quote:


> Originally Posted by *Cannon19932006*
> 
> Here's an interesting question, I have no control over voltage on my Gigabyte 390X, would there even be a chance of the msi 390x bios working on this card?


Could you please test the temperature on the VRM and core while benchmarking? I have a choice between the PowerColor PCS+, Sapphire Nitro, and Gigabyte; it's just that I could not find a review for the Gigabyte.


----------



## Agent Smith1984

A couple of things...

I don't believe it's a good idea to try another brand's BIOS on any card, since there don't appear to be any reference designs...

Also... I am very curious about the CF ability of the 390 and 290...

The only way I can see it being possible is with the hacked 15.15 driver.

I would like to add, though, that some users are having luck flashing the 390 BIOS onto their respective brands' 290s.


----------



## Cannon19932006

Quote:


> Originally Posted by *vavyn*
> 
> Could you please test what's the temperature on the vrm and core while benchmarking. I have a choice between the Powercolor PCS+, Sapphire Nitro and gigabyte just that I could not find a review for Gigabyte.


I don't seem to have VRM monitoring, as far as I can tell anyway.

The temps I do have are core and PCB temp.

At 100% load with my custom fan profile, the core reaches 75°C and the PCB reaches 64°C.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Couple of things....
> 
> I don't believe it's a good idea to try another brand of BIOS on any card, since we don't appear to any reference designs...
> 
> Also... I am very curious about the cfing ability of 390 and 290...
> 
> The only way i can see it being possible, is with the hacked 15.15 driver.
> 
> I would like to add though, that some users, are having luck flashing 390 bios on their respective brand 290's.


Agreed, it's only worth doing if you know what you're doing, or if you have a dual-BIOS setup.


----------



## CelticGamer

Well, for the last week I've been torn between the GTX 970, the R9 390 8GB, and the R9 290X. I've decided to go with the R9 390 8GB. From all the benchmarking I've seen, it seems to match or even edge out the 970 and the R9 290X in certain games. Do I need the 8GB? Probably not, but for same-priced cards, I'd rather have it and not need it than possibly need it for titles two years down the road and not have it with the GTX 970 or 290X.

How would two R9 390s in CrossFire stack up against two R9 290Xs or two GTX 970s? Would the 390's 8GB of RAM be any advantage in CrossFire-supported games?


----------



## Ha-Nocri

Quote:


> Originally Posted by *CelticGamer*
> 
> Well, for the last week I"ve been torn between the GTX 970, the R9 390 8GB, and the R9 290X. I've decided to go with the R9 390 GB. From all the benchmarking I"ve seen it seems to match, or even edge the 970 and the R9 290X in certain games. Do I need the 8GB'S? Probably not, but for the same priced cards, I'd rather have it and not need it, than possibly need it for titles 2 years down the road and not have it with the GTX 970 or 290X.
> 
> How would 2 R 390's crossfired stack up against 2 R9 290x's or 2 GTX 970's? Would the 8GB of Ram the R9 390 has be any advantage in crossfire supported games?


It wouldn't be an advantage at the moment, but down the road it might be, at 4K or even 1440p.


----------



## Kalistoval

I'm in with a Kraken X41.


----------



## Agent Smith1984

Awesome!
I'll get the new members added by Monday!


----------



## diggiddi

Quote:


> Originally Posted by *By-Tor*
> 
> Since the 390/390x series are Hawaii GPU's can they be CF with the last gen 290/290x cards?


That's the million dollar question right there


----------



## By-Tor

If someone has a 390/390X and still has a 290/290X, could you please try to CF them to see if it is possible?

Thank you


----------



## Cannon19932006

I think I'm going to send mine back; the whole no-voltage-control thing is no bueno. I'll most likely ask for a refund and pick up an MSI or Sapphire 390X, or maybe wait a couple of weeks for the air-cooled Fury.


----------



## DividebyZERO

Quote:


> Originally Posted by *Cannon19932006*
> 
> I think I'm going to send mine back, the whole no voltage control thing is no bueno. I'll most likely ask for a refund and pick up an msi or sapphire 390x, or maybe wait a couple weeks for Air cooled fury.


+rep for sharing that the Gigabyte R9 390X G1 isn't voltage unlocked. I'd love to see someone else confirm this for good measure.


----------



## Cannon19932006

Quote:


> Originally Posted by *DividebyZERO*
> 
> +rep for sharing that Gigabtye R9 390X G1 isn't voltage unlocked. I'd love to see someone else confirm this for good measure.


I asked Gigabyte support about it, and they confirmed that voltage is locked on the Windforce G1 390X.


----------



## Particle

I ran into artifacts in Firestrike Extreme with the core frequency at 1110 MHz, so I ended up stepping down to 1081 MHz instead. Memory however seems happy at 1650 MHz, so that's what I'm running.

1081/1650 might be my final clocks on stock voltage. I could pursue higher memory frequencies, but I see little reason to. Even going to 1650 is mostly academic.


----------



## DividebyZERO

Quote:


> Originally Posted by *Particle*
> 
> I ran into artifacts in Firestrike Extreme with the core frequency at 1110 MHz, so I ended up stepping down to 1081 MHz instead. Memory however seems happy at 1650 MHz, so that's what I'm running.
> 
> 1081/1650 might be my final clocks on stock voltage. I could pursue higher memory frequencies, but I see little reason to. Even going to 1650 is mostly academic.


Can you give an update on your multi-display blanking? I currently have that issue when I go past +100mV/+150mV on my 290X and 290s. I have never solved it aside from using DVI only. Curious about this because if I get a 390X, I was looking at the Sapphire model, since the Gigabyte is voltage locked and I really want 3 DP ports.


----------



## Particle

Quote:


> Originally Posted by *DividebyZERO*
> 
> Can you give an update on your multi-display blanking. I currently have that issue when i got past +100mv/150MV+ on my 290x and 290's. I have never solved it aside from using DVI only. Curious about this as if i get 390x, i was looking at the sapphire model since the gigabyte is voltage locked and i really want 3 DP


No luck as yet. It only seems to occur for me when using DP. DVI by itself doesn't have a problem, but DP by itself does. If both are in use, both will have the problem. Frequency of the blanking/garbage seems to vary depending on which DP port I use as well as where I power my DP->DL-DVI adapter from. I've ordered a different adapter to see if it works better with this card. It's a newer style that can manage DL-DVI without external power.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Cannon19932006*
> 
> I asked Gigabyte support about it, they confirm that it is voltage locked on the windforce g1 390x.


In my experience with Gigabyte, they are all volt locked. I'm sure there are some out there that are not, but every Gigabyte card I've had has been volt locked. I get around this by modding the BIOS and selecting the voltage I want, but I have to say that usually the boost voltage is the max voltage allowed even in the BIOS. So unless you want to OC without hitting boost volts, like I did, it's pretty much useless there. I don't have a 390X to play with or I'd look into it for everyone, but the 79x0s and 280s of the Windforce variety are limited to the boost voltage as a max... then again, it's usually pretty high.


----------



## The Stilt

Quote:


> Originally Posted by *Cannon19932006*
> 
> I asked Gigabyte support about it, they confirm that it is voltage locked on the windforce g1 390x.


If the card follows AMD design guidelines, it cannot be voltage locked.
Hawaii cards use the SVI2 standard, and all compatible controllers support voltage control.

Gigabyte could add their own password protection, but that can be reversed in two seconds.


----------



## Agent Smith1984

I have to admit, the reports of the Gigabyte voltage lock fit the pattern: Windforce cards from several series (7, 2, and 3) have all been volt locked...

I can without a doubt confirm that I have the ability to go to +200mV in Trixx on the MSI card.

Will post screenies soon!

I don't want to shoot down Gigabyte, since they make reputable products, but at this time I would advise against their 390s (if you want high clocks) until someone finds a way to unlock voltage.

Edit:

Dat voltage tho!!!











My Tri-X OC 290 didn't take voltage anywhere near as well as this card does...

This was the first try with Trixx; I will test how high the clocks will go later...

1200 was just a starting point. I'm going to see if I can get into the 1225-1250 range or better at +200mV... I probably won't leave it there, but it'll be awesome for benching...

Look at that voltage of 1.43+... that's insane! It holds, too, with no forced voltage and without disabling ULPS... Very impressed with the potential of the MSI card.
This thing might actually be worth finding a block for...

Will definitely be doing some research on what water cooling options we will have!


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have to admit, the reports on the giga voltage lock is just one of many windforce cards, from several series (7,2, and 3) that have all been volt locked.....
> 
> I can without a doubt confirm i have the ability to go to 200mv in trixx on msi card.
> 
> Will post screenies soon!
> 
> I don't want to shoot down gigabyte, since they make reputable products, but at this time, i would advise against their 390's (if you Want high clocks) until someone finds a way to unlock voltage.
> 
> Edit:
> 
> Dat voltage tho!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Tri-X OC 290 didn't take voltage anywhere near as well as this card does...
> 
> This was the first try at Trixx, and will test how high the clocks will go later...
> 
> 1200 was just a starting point. I'm going to see if I can get in the 1225-1250 range or better at 200mv... probably won't leave it there, but it'll be awesome for benching...
> 
> Look at that voltage of 1.43++... that's insane! It holds too, with no force voltage or ULPS disabled.... Very impressed with the potential on the MSI card.
> This thing might actually be worth finding a block for.....
> 
> Will definitely be doing some research on what water cooling options we will have!


Can you send the BIOS? I want it for experimental use. I know people haven't had good experiences with the 390 BIOS on a 290, but mine works quite well with it.


----------



## Blameless

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I don't believe it's a good idea to try another brand of BIOS on any card, since we don't appear to any reference designs...


If you have the same voltage control IC, there is a good chance a flash will work.

If you have the same voltage control IC and the same GDDR5 ICs, there is a better chance a flash will work.

How many of these 390s have switchable dual BIOSes? That makes things a lot easier, though it's really hard to outright brick a GPU with a BIOS flash, even with only a single ROM chip.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> My Tri-X OC 290 didn't take voltage anywhere near as well as this card does...
> 
> This was the first try at Trixx, and will test how high the clocks will go later...


Can you check GPU-Z for the reported ASIC quality?


----------



## Minotaurtoo

If it weren't for my card's dual BIOS, I'd have bricked it 3 times, lol... mostly while learning how to edit the BIOS features....

Edit: I do realize, though, that I still could have used a separate GPU to un-brick it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Blameless*
> 
> If you have the same voltage control IC, there is a good chance a flash will work.
> 
> If you have the same voltage control IC and the same GDDR5 ICs, there is a better chance a flash will work.
> 
> How many of these 390s have switchable dual BIOSes? That makes things a lot easier, though it's really hard to outright brick a GPU with a BIOS flash, even with only a single ROM chip.
> Can you check GPU-Z for the reported ASIC quality?


ASIC is 71.5%

I'll load the BIOS later.

Wife is doing project on my box right now (hence why project wifey is in the works)









The highest reported voltage I could ever get out of my 290 Tri-X was 1.26V peak, and it never held constant.
This card will peg at 1.438V and hold around 1.425V constant.

Unbelievable difference....

I am looking forward to doing some high clock (1200+++) testing this evening when my wife is done with her project.


----------



## Blameless

Firmware probably has a different vdroop slope.
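For anyone unfamiliar with the term: a load-line (vdroop) slope means the delivered voltage sags linearly with load current, and different firmware can program a different slope, which would explain the reporting gap. A toy model with made-up numbers (the 1.40V setpoint, slopes, and 100A load are all hypothetical):

```python
def v_under_load(v_set, loadline_mohm, current_a):
    """Toy load-line model: delivered voltage sags linearly with current."""
    return v_set - (loadline_mohm / 1000.0) * current_a

# Same 1.40V setpoint, two hypothetical firmware slopes, 100A load:
shallow = v_under_load(1.40, 0.5, 100.0)  # 0.5 mOhm slope -> ~1.35V delivered
steep = v_under_load(1.40, 1.5, 100.0)    # 1.5 mOhm slope -> ~1.25V delivered
print(shallow, steep)
```

A shallower slope would let one card report and hold a noticeably higher voltage under the same load.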


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> ASIC is 71.5%
> 
> I'll load the BIOS later.
> 
> Wife is doing project on my box right now (hence why project wifey is in the works)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Highest reported voltage I could ever get out of my 290 Tri-X was 1.26v peak, and it never held constant.
> This card will peg at 1.438 and hold around 1.425 constant.
> 
> Unbelievable difference....
> 
> I am looking forward to doing some high clock (1200+++) testing this evening when my wife is done with her project.


What's the max wattage reported for power in?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> What's the max wattage reported in power in?


Didn't check, but I'll get you an answer later this evening.

I'll post stock, 100mv and 200mv wattage.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Didn't check, but I'll get you an answer later this evening.
> 
> I'll post stock, 100mv and 200mv wattage.


Thanks. Can you give me the max, but also the number the power tends to stick at for a given voltage? Of course the power may spike to max even at stock or +100mV, and that would confuse the results.


----------



## premonition08

Good day guys. I recently bought an ASUS DirectCU II R9 390X. I can't seem to find the VSR option checkbox, and it seems that my GTX 970 Strix runs better. I hope you guys can help. This is my first AMD card, and I was hoping it would run better than my GTX 970. I've seen reviews where it is neck and neck with the 980 but much cheaper, so I took the chance.


----------



## sidfirex

Quote:


> Originally Posted by *Cannon19932006*
> 
> I think I'm going to send mine back, the whole no voltage control thing is no bueno. I'll most likely ask for a refund and pick up an msi or sapphire 390x, or maybe wait a couple weeks for Air cooled fury.


Hi Cannon,

On an unrelated note, do you think it would be a good idea for me to try your G1 390X BIOS on my G1 390 non-X? I can try it and see if it is any different; hit me up with a download link, please. Cheers.


----------



## Cannon19932006

Quote:


> Originally Posted by *sidfirex*
> 
> Hi Cannon
> 
> On a unrelated note, do you think it would be a good idea for me to try out your G1 390x bios on my g1 390 non-x ? I can and see if it is any different  hit me up with a link to download please. cheers


I'm not sure if it will work correctly, but I know people had some luck unlocking 290s to 290Xs doing something like this.

I can upload the BIOS file for you, but I am not responsible if something goes wrong. Good luck!









Stock.zip 98k .zip file


----------



## diggiddi

Quote:


> Originally Posted by *premonition08*
> 
> Good day guys, I recently bought an asus direct cu 2 r9 390x. I cant seem to find the vsr option checkbox. And it seems that my gtx 970 strix is running better. I hope u guys can help. This is my first amd card and I was hoping it would run better than my gtx 970. Ive seen reviews where it is neck to neck with the 980 but much cheaper so I took the chance.


How is the cooling on that card? The DCU II 290s were problematic.


----------



## premonition08

So far, when I run the Valley benchmark it goes to 73°C. My GTX 970 runs about the same but overclocked by a lot, and the benchmark results are just the same. I wonder what I missed.
Quote:


> Originally Posted by *diggiddi*
> 
> How is the cooling on that card, the DCU II 290's were problematic


----------



## diggiddi

Quote:


> Originally Posted by *premonition08*
> 
> So far I when I run valley benchmark it goes 73c, my gtx 97 runs about the same but overclocked by a lot, and the results of benchmarking is just the same, I wonder what I missed.


Maybe a fresh driver install? Use DDU to uninstall the old driver and then reinstall. BTW, does it have Hynix memory? Throw up a GPU-Z screenshot.


----------



## premonition08

Quote:


> Originally Posted by *diggiddi*
> 
> Maybe a fresh driver install? Use DDU to uninstall old driver and then reinstall, btw does it have Hynix memory? Throw up a GPUz


Yes, it has Hynix memory, and I used DDU to uninstall my old Nvidia driver. I will post some GPU-Z shots as soon as I get home.


----------



## Blameless

I've been playing around with the 15.15 drivers some more, and OpenCL performance is almost 15% higher on my 290X.

I was comparing against this review's LuxMark results for the 390X: http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Radeon-R9-Fury-X-Test-1162693/2/

With the 15.6 drivers I was falling about 8-10% short of the 390X, despite a 50MHz higher core clock.

With the 15.15 drivers I'm beating the 390X by 2-4%. So, about the same performance per clock cycle.

Doing some other repeatable "canned" benchmarks now. The drivers are all over the place; I actually lost a small amount of performance in the FFXIV Heavensward test. It seems the closer to real world the tests get, the worse they look. The OpenCL gains seem real, though.


----------



## sTOrM41

if somebody gets his hand on a Club 3D Radeon R9 390 royalQueen, please upload the bios


----------



## Ha-Nocri

Quote:


> Originally Posted by *premonition08*
> 
> So far I when I run valley benchmark it goes 73c, my gtx 97 runs about the same but overclocked by a lot, and the results of benchmarking is just the same, I wonder what I missed.


Don't think you missed anything. When both cards are OC'ed, I doubt the 390X will be any faster than the 970. I hope you didn't buy it to replace your 970.


----------



## Hasmir

Count me in ;D (390 MSI gaming edition 8G)



My card performs really well in games, but really badly in benchmarks... my previous 770 SLI setup got a way higher score in 3DMark, but nowhere near as high FPS in Witcher 3... Might be a driver issue or something.







.. Anyone else had this problem?


----------



## Agent Smith1984

@Kalistoval

Added!

Also... a few questions for everyone:

*Has anyone played with aux voltage?*

I seem to be able to push the VRAM past 1620 now with increased aux voltage.
Obviously, with the 512-bit memory bus, memory clocks have less impact than core, but I can see an improvement at high resolutions or high AA settings.

*What's the best VRAM clock so far?*

At +50mv aux voltage, I was at 1700 and climbing before I went to bed.

I ran Heaven on max settings at 1080p and scored 1550, which appears to be a pretty good score, but it was driven more by the core; increasing VRAM did little as I went up.

*What's the highest core clock anyone has hit yet??*

I nailed Firestrike at 1185 last night with +100mV and no artifacts. 1190 brought a few small squares towards the end of the demo (which really loads the card down more than any of the individual tests, to be honest).

I was also able to make it through Heaven artifact-free.
I should note that the voltage in GPU-Z under that load is nowhere near as high as it was during the GPU-Z render test.
Max vcore tops out around 1.34V using +100mV, but tends to hang around 1.3V.

A few observations...

Stock, power in for the card seems to be around 235W, with some peaks at 250W.
With 1100 core and a 50% power limit increase on stock voltage, the card starts edging toward 275-280W.

At +100mV and 1100 core with a 50% power limit, the card was at 315W most of the time, with a few small 325W spikes.

At +100mV and 1180 core with a 50% power limit, the card ran around 330W most of the time, spiking to 350W on a pretty consistent basis.

Increasing to +200mV only improved the clock ceiling to around 1210, but temps were still tolerable at 83°C core with 85°C on VRM1.
It also seems that the +200mV was not consistently effective, as the fluctuations became much more apparent.
It seems the board was running out of power; in my opinion, +100mV is about all you need on this card.

At +200mV, 1200 core, and a 50% power limit, I saw wattage at a consistent 350W; it would not break 357W, and when it tried to, the artifacts came.

I would like to note that my PSU is, for one, not at all of the highest quality, and two, uses separate 12V rails (one rail powering the card; I have not tried combining power from two yet).
I saw the 12V reading at 11V in GPU-Z under heavy load, so I could very well be power limited and nowhere near the actual ceiling of this card.
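The power numbers above track the usual dynamic-power rule of thumb (P scales roughly with frequency times voltage squared). A quick sanity check, with an assumed stock baseline since the exact stock vcore isn't listed here:

```python
def scaled_power(p_base, f_base, f_new, v_base, v_new):
    """Rule-of-thumb dynamic-power estimate: P scales with f * V^2."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Assumed baseline: ~250W at 1040MHz and 1.20V stock; overclocked to
# 1180MHz at ~1.30V under load. Both voltage figures are assumptions.
est = scaled_power(250.0, 1040.0, 1180.0, 1.20, 1.30)
print(round(est))  # lands in the same ballpark as the observed ~330-350W
```

It's only a sketch, but it suggests the observed wattage is about what the clock and voltage bumps should cost.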

Here is a nice Firestrike run I did @ 1180/1620; it was fully stable and artifact-free at these clocks.
http://www.3dmark.com/fs/5246984

Again, these cards continue to be listed as R9 380s in Firestrike. No idea why.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Hasmir*
> 
> Count me in ;D (390 MSI gaming edition 8G)
> 
> 
> 
> My card preformes really good in games, but really bad in benchmarks... my previous 770 SLI setup got a way higher score, in 3Dmark, but not near as high FPS in witcher 3... Might be a driver issue or something
> 
> 
> 
> 
> 
> 
> 
> .. Anyone else hvad this problem?


Added and welcome!

It's important to know that two 770s in SLI will certainly beat a 390 in benchmarks, and in some cases may even win out in games.

The 770 is a decent performer, roughly on par with an R9 380, and two of those would also beat out a single 390.

If you are comfortable running multiple GPUs, you may want to start saving up for a second 390 to run with your current card.

Even for now, you won't find many titles this card can't handle at 1080-1440p with high-max settings, and once you add a second, it will fully handle 4K gaming (that's assuming you even need that kind of power to begin with).

Edit: When you say "really bad" in benchmarks, do you mind sharing the links with us?

Thanks


----------



## premonition08

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Don't think you missed anything. When both cards r OC'ed I doubt 390x will be any faster than 970. I hope you didn't buy it to replace your 970.


So here are some screenshots. As you can see, the 970 Strix OC is just ever so slightly on par with the stock R9 390X DirectCU II. I was opting to get a second GTX 970 for SLI, then I saw benchmarks between the GTX 980 and R9 390X: almost similar, but AMD is much cheaper, so I decided to get the 390X. I was planning to get another 390X to CrossFire, but when I tested it, it was just as fast as my OC'ed GTX 970. I was planning to give the 970 to my brother to upgrade his old GTX 580 on an obsolete Intel 1156 socket with an i5 661 CPU. I don't know if I made the right purchase or my settings are just off. BTW, this is my first all-AMD build.


----------



## premonition08

sorry guys screen capture is too small


----------



## premonition08

Sorry if I'm spamming. One more thing: there is no option for virtual super resolution?


----------



## Hasmir

Sure







Here are the results from "The Valley"



SLI isn't really my thing tbh - and I wanted to get an AMD GPU ^^ With +100 mV and core clock set to 1150 I can max Witcher 3 on ultra (no AA) at 1920x1080 with 55+ fps... way better than my SLI setup


----------



## Ha-Nocri

Well, like I said, when both are OC'ed they are on par. You'll probably get a few more FPS from the 390X, but it's a side-grade, not an upgrade. I do believe the Fury (non-X) will be a beast, though. Maybe it would be better if you waited for it.

Not sure why you don't have VSR. Mine is there:

Quote:


> Originally Posted by *Hasmir*
> 
> Sure
> 
> 
> 
> 
> 
> 
> 
> here is the results from "The valley"
> 
> 
> 
> SLI isn't really my thing tbh - and I wanted to get an AMD GPU ^^ With +100 mV and core clock set to 1150 I can max Witcher 3 on ultra (no AA) at 1920x1080 with 55+ fps... way better than my SLI setup


Kepler performance in recent games is really bad. The 290 is beating the 780 Ti in the new Batman and probably in Witcher 3 too. And the 280X is waaaaay better than the 770.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Don't think you missed anything. When both cards r OC'ed I doubt 390x will be any faster than 970. I hope you didn't buy it to replace your 970.


Agreed....

@premonition08

The 390 will knock out the 970 stock vs stock in most tests, but when it comes to overclocking, the Maxwell core has a good bit more headroom than Hawaii.

Where the AMD card shines is its memory bus and frame buffer size (256-bit/3.5GB on the NVIDIA card vs 512-bit/8GB on the AMD card).

If you are playing at 1080p, the cards are about a draw.... At 1440p and up is where the 390/390x cards start giving even the GTX 980 a really hard time.
And even with that said, overclocking the 980 can swing the momentum back in the other direction.

Of course, that leaves me with my best bit of advice for you.... if you already had a 970 and were looking for 980 performance, you would be best served by OCing your 970 within an inch of its life, because the 970 will beat out the 980 when doing so. The 980 was a bum deal to begin with (NOT a bad card at all, just a BAD VALUE).

You are at the top of the high-end mainstream segment with either a 970/980, or a 390/390x....
From there, there is currently nowhere to go but flagship series GPU's beginning with either the GTX 980ti or the Radeon Fury X.
The vanilla Fury will likely be in the $550 or so range, and should be a heavy hitter for an otherwise empty competition segment.

At the end of the day though, you can trust that your new 390x graphics card has a full 8GB of VRAM, not 7 or 7.5GB
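And if you want to see what that bus difference means in raw numbers, here's a quick peak-bandwidth sketch. The clocks are the published reference figures, and GDDR5 moves four bits per pin per clock:

```python
# Peak GDDR5 bandwidth = (bus width in bytes) * (effective data rate).
# GDDR5 is quad-pumped, so a 1500MHz command clock = 6Gbps effective per pin.

def gddr5_bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s."""
    effective_gbps = mem_clock_mhz * 4 / 1000
    return bus_width_bits / 8 * effective_gbps

print(gddr5_bandwidth_gbs(512, 1500))  # R9 390:  384.0 GB/s
print(gddr5_bandwidth_gbs(256, 1753))  # GTX 970: ~224 GB/s (fast 3.5GB partition)
```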


----------



## premonition08

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Agreed....
> 
> @premonition08
> 
> The 390 will knockout the 970 stock vs stock in most tests, but when it comes to overclocking, the maxwell core has a good bit more headroom than the Hawaii.
> 
> When the AMD card shines is it's memory bus and frame buffer size (256 bit 3.5GB on the NVIDIA VS 512 bit 8GB on the AMD card)
> 
> If you are playing 1080P, the cards are about a draw.... At 1440P and up is where the 390/390x cards start giving the even the GTX 980 a really hard time.
> And even with that said, overclocking the 980 can swing the momentum back in the other direction.
> 
> Of course, that leaves me with my best bit of advise for you.... if you already had a 970, and were looking 980 performance, you would be best served by OCing your 970 within every inch of it's life because the 970 will beat out the 980 when doing so. The 980 was a bum deal to begin with (NOT a bad card at all, just a BAD VALUE).
> 
> You are at the top of the high-end mainstream segment with either a 970/980, or a 390/390x....
> From there, there is currently nowhere to go but flagship series GPU's beginning with either the GTX 980ti or the Radeon Fury X.
> The vanilla Fury will likely be in the $550 or so range, and should be a heavy hitter for an otherwise empty competition segment.
> 
> At the end of the day though, you can trust that your new 390x graphics card has a full 8GB of VRAM, not 7 or 7.5GB


Thanks so much, I feel a little better now LOL. I can still overclock the R9 390X, right? Another issue is I can't seem to find info on virtual super resolution; why is there no option for this card? Thanks for the reply, by the way. I wanted to test VSR until I upgrade my monitor to a higher resolution.


----------



## whiteturbo

Hiya All,
Just had to say my MSI R9 390X is being delivered tomorrow (fingers crossed). It will replace my HD 7870, which has been a good card I must say, and is one of the reasons I stayed with AMD even though I was tempted by the GTX 970. But I have heard that some people are having issues getting the drivers installed. Do you know of this, and is there any advice you can give me? I have the drivers on my desktop ready to go, and I have also got DDU as I thought I might need it. I am so excited!! The last time I felt like this was, err, waiting for my HD 7870 lol


----------



## Agent Smith1984

Well, I can tell you, the 390x is going to be one heck of an upgrade of that 7870....

Make sure you do this (in this order) before installing any software (card is fine to pop in)

Go to applications and actually uninstall CCC first
Then launch DDU, opt to restart in safe mode, finish running DDU, and restart.

Then install the 15.15 driver, then restart as prompted.

I found if you don't do this, it will not upgrade CCC, and only change out the driver.
When that happens, CCC will either not work, or not have the new features such as Framerate target, etc....

Just a tip.


----------



## magicase

Does anyone know if you can CF a 295x2 with a 390/390X?


----------



## Ha-Nocri

Quote:


> Originally Posted by *magicase*
> 
> Does anyone know if you can CF a 295x2 with a 390/390X?


Maybe, but you know you would lose 4GB on 390(x)?


----------



## premonition08

i still cant find any info on why there isn't a check box for virtual super resolution??


----------



## Amhro

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I can tell you, the 390x is going to be one heck of an upgrade of that 7870....
> 
> Make sure you do this (in this order) before installing any software (card is fine to pop in)
> 
> Go to applications and actually uninstall CCC first
> Then launch DDU, opt to restart in safemode, finish running DDU and restart.
> 
> Then install the 15.15 driver, then restart as prompted.
> 
> I found if you don't do this, it will not upgrade CCC, and only change out the driver.
> When that happens, CCC will either not work, or not have the new features such as Framerate target, etc....
> 
> Just a tip.










I just uninstalled CCC with everything included, restarted and installed new CCC with driver, works without any prob


----------



## Ha-Nocri

Quote:


> Originally Posted by *premonition08*
> 
> i still cant find any info on why there isn't a check box for virtual super resolution??


How is your monitor connected? Via DP? If you can you should try DVI and/or HDMI


----------



## premonition08

Quote:


> Originally Posted by *Ha-Nocri*
> 
> How is your monitor connected? Via DP? If you can you should try DVI and/or HDMI


I am currently using DP. I've tried both HDMI and DVI; it does not work...


----------



## Agent Smith1984

Quote:


> Originally Posted by *magicase*
> 
> Does anyone know if you can CF a 295x2 with a 390/390X?


Still waiting on confirmation of this...

My guess is that it won't be possible without the hacked 15.15 driver, or waiting for the next instance of CCC driver which may or may not offer support for both cards.

Bottom line is, GPU-Z still reports 390 cards as a Hawaii, and not a Grenada, so at the end of the day, the two should be compatible.

The problem is, right now the two cards rely on separate driver platforms.


----------



## DividebyZERO

Quote:


> Originally Posted by *premonition08*
> 
> i am currently using DP. I've tried both HDMI and DVI. does not work...


Whats the resolution and refresh rate of your monitor?


----------



## premonition08

Quote:


> Originally Posted by *DividebyZERO*
> 
> Whats the resolution and refresh rate of your monitor?


its an AOC 29 inch [email protected]


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> @Kalistoval
> 
> Added!
> 
> Also... a few questions for everyone:
> 
> *Has anyone played with aux voltage?*
> 
> I seem to be able to now push the VRAM past 1620 with increased aux voltage.
> Obviously with the 512 bit memory bus, memory clocks seem to have lesser impact than core, but I can see that there is an improvement when used high-res, or high AA settings.
> 
> *What's the best VRAM clock so far?*
> 
> At +50mv aux voltage, I was at 1700 and climbing before I went to bed.
> 
> I ran Heaven on max settings 1080P and scored 1550, which appears to be a pretty good score, but it was more driven by the core, as increasing VRAM did little as I went up.
> 
> *What's the highest core clock anyone has hit yet??*
> 
> I nailed Firestrike at 1185 last night with 100mv+ and no artifacts. 1190 brought a few small squares towards the end of the demo (which really loads the card down more than any of the individual test to be honest).
> 
> I was also able to make it through Heaven artifact free.
> I need to note that the voltage in GPU-Z under that load, is not anywhere near as high as what it was using the GPU-Z render test.
> Max vcore tops around 1.34v using 100mv+ but tends to hang around 1.3v.
> 
> A few observations...
> 
> Stock, the power in for the card seems to be around 235 watts with some peaks at 250
> With 1100 core and 50% power limit increase on stock voltage to the card starts edging the 275-280w power.
> 
> At 100mv+ and 1100 core with 50% power limit, the card was in the 315w most of the time with a few small 325w spikes
> 
> At 100mv+ and 1180 core with 50% power limit, the card was running around 330w most of the time, and spiking to 350w on a pretty consistent basis.
> 
> Increasing to 200mv+ only improved the clock ceiling to around 1210, but temps were still tolerable at 83c. with 85c VRM1
> It also seems that the 200mv+ was not consistently effective, as the fluctuations became much more apparent.
> It seems that the board was running out of power, and in my opinion the 100mv+ is about all you need on this card.
> 
> I saw wattage at 200mv+, 1200 core, and a 50% power limit holding a consistent 350w; it would not break 357w, and when it did, the artifacts came.
> 
> I would like to note that my PSU is, for one, not at all of the highest quality, and two, it uses separate 12v rails (1 rail powering the card; I have not tried combining power from two yet)
> I saw the 12v reading at 11v in GPU-Z under heavy load, so I could very well be power limited and nowhere near the actual ceiling on this card.
> 
> Here is a nice firestrike run I did @ 1180/1620 and it was fully stable and artifact free at these clocks.
> http://www.3dmark.com/fs/5246984
> 
> Again, these cards continue to be listed as R9 380's in Firestrike. No idea why.


Thanks! Will compare them to my own power draws on my vaporx later. My card does 1110mhz core 1450mhz mem on stock voltage (25mv) which I generally prefer to sticking 100 plus mv through my card for ~1200/1600


----------



## edo101

Hope everything is going well here. no CFX issues? How are the 300 cards working out for you guys?


----------



## Agent Smith1984

So far, it seems as if the 390 series is just as much luck of the draw for overclocking as 290 series.

I was expecting an improved manufacturing process that would have increased clock ceiling potential, but so far it just appears that the power usage at given clock is down.

An improvement nonetheless.... and I am seeing almost every sample hit around 1100MHz on stock voltage, which was not quite as common on the 290's.

Also appears that every sample so far is using Hynix VRAM rated at 1500MHz, so no Elpida VS Hynix confusion going on with this series.

I will say though, that I am seeing most of the VRAM clocks still capping at 1600-1650. Not a total disappointment, but not in line with the review results I was seeing that were putting some chips in the 1700-1800 range.

I believe increasing the aux voltage had some bearing on this though. I will continue to test the +50mv aux voltage (temps rise with this being increased, so IT IS DOING SOMETHING)

The last test run was with VRAM at 1720 last night, with no black screen (prior testing gave black screens within a few minutes at anything over 1630 in Firestrike).


----------



## BlaXey

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I can tell you, the 390x is going to be one heck of an upgrade of that 7870....
> 
> Make sure you do this (in this order) before installing any software (card is fine to pop in)
> 
> Go to applications and actually uninstall CCC first
> Then launch DDU, opt to restart in safemode, finish running DDU and restart.
> 
> Then install the 15.15 driver, then restart as prompted.
> 
> I found if you don't do this, it will not upgrade CCC, and only change out the driver.
> When that happens, CCC will either not work, or not have the new features such as Framerate target, etc....
> 
> Just a tip.


What's CCC?


----------



## Amhro

^Catalyst Control Center


----------



## Particle

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So far, it seems as if the 390 series is just as much luck of the draw for overclocking as 290 series.
> 
> I was expecting an improved manufacturing process that would have increased clock ceiling potential, but so far it just appears that the power usage at given clock is down.
> 
> An improvement none the less.... and I am seeing almost every sample hit around 1100MHz on stock voltage, which was not quite as common on the 290's
> 
> Also appears that every sample so far is using Hynix VRAM rated at 1500MHz, so no Elpida VS Hynix confusion going on with this series.
> 
> I will say though, that I am seeing most of the VRAM clocks still capping at 1600-1650. Not a total disappointment, but not in line with the review results I was seeing that were putting some chips in the 1700-1800 range.
> 
> I believe increasing the aux voltage had some bearing on this though. I will continue to test the +50mv aux voltage (temps rise with this being increased, so IT IS DOING SOMETHING)
> 
> Last test ran was with VRAM at 1720 last night, and no black screen (prior testing was giving black screens within a few minutes at anything over 1630 in Firestrike).


My personal results with stock voltage were such that I had to reduce my core overclock a few percent down to 7% resulting in 1081 MHz. RAM seems to run fine at 1650 MHz, but I haven't tried higher. No artifacts in Firestrike. As an overclocker, I'm really only interested in default clock results for my own purposes, so this is where I'll probably stay. It would probably be safe to update my clocks on the chart if you have a moment.
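For reference, a percent offset maps back to an absolute clock like this; a quick sketch assuming my Nitro's 1010 MHz factory boost clock (swap in your own card's stock clock):

```python
# Convert an overclock expressed as a percent offset into an absolute clock.
# 1010MHz is assumed here as the Sapphire Nitro 390's factory boost clock.

def offset_to_mhz(stock_mhz, offset_pct):
    return round(stock_mhz * (1 + offset_pct / 100))

print(offset_to_mhz(1010, 7))  # -> 1081
```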


----------



## BlaXey

When DirectX 12 shows up, if the results in games are good I will buy an R9 390X, but I have a question: which option is the best choice? Sapphire, MSI, XFX, or Asus? Prices in Spain are MSI: 380€, Sapphire: 360€, XFX: 345€. I like to overclock (L)


----------



## Ha-Nocri

Quote:


> Originally Posted by *BlaXey*
> 
> When directx 12 show up if the results on games are good i will buy a r9 390x, but i have a question, what option is the best choice? Sapphire, Msi, XFX, or Asus? Prices in Spain are Msi: 380€, Sapphire: 360€, XfX, 345€, i like overclock (L)


Sapphire has the best cooler. Tri-X is the quietest and cools the best. I would prefer if it was quieter @idle, but it's not that bad. Not sure if they fixed it for 390(x)


----------



## BlaXey

The question is what's the best for overclocking.


----------



## Ha-Nocri

Quote:


> Originally Posted by *BlaXey*
> 
> The question is what's the best for overclocking.


If they all have the reference AMD board, then Sapphire again.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Particle*
> 
> My personal results with stock voltage were such that I had to reduce my core overclock a few percent down to 7% resulting in 1081 MHz. RAM seems to run fine at 1650 MHz, but I haven't tried higher. No artifacts in Firestrike. As an overclocker, I'm really only interested in default clock results for my own purposes, so this is where I'll probably stay. It would probably be safe to update my clocks on the chart if you have a moment.


Done!

Anyone else wanting to post final overclock/voltage results just let me know!

Thanks


----------



## FastEddieNYC

Quote:


> Originally Posted by *BlaXey*
> 
> The question is what's the best for overclocking.


Both the MSI Lightning and the Sapphire Tri-X use custom PCB's. The Sapphire uses two 8-pin PCI-E connectors, so you will have additional juice when needed. It still comes down to getting lucky with a good chip that will OC.


----------



## BlaXey

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Done!
> 
> Anyone else wanting to post final overclock/voltage results just let me know!
> 
> Thanks


I'd like to see benchmarks in games with and without OC, and what voltage and MHz you used for this.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BlaXey*
> 
> I'd like to see benchmarks in games with and without OC, and what voltage and MHz you used for this.


I will log some FRAPs as the week goes on. Spent the majority of my time so far just figuring out the temp/fan curves, where the voltage is happy, where my PSU is crapping out on me, and just how much I can squeeze out of it.

It's always easy to find a benching clock that is stable, but finding that nice summer-safe daily clock that has a nice blend of power/temps/performance is what's tricky.

So far I believe I will settle on 1160/1600 with 50mv+ and no AUX voltage as a daily clock.
I just see no reason to give up any of the clock speed I've gained, even with it needing additional voltage, because the temps are well under control.

Benching clock is 1185/1700 @ 100mv+ and 50mv+ aux voltage. (I'll be pushing the VRAM further, and pushing my CPU further, to break a 14k graphics score on a single card.)
It's so stable I could honestly run daily that way, but I'm not a fan of running core and VRM at 83-85c under max load all the time.

I have to admit.... Crysis 3 has picked up the biggest gains from this new card/driver combo over the 290.
It simply destroys the FPS I had on the 290, by 15-20%. That's min and max... not just avg/max.

FPS gain is way up in FC4 and Dirt Rally also.
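If you want to turn your own FRAPS logs into that kind of percentage, the math is just this (the FPS numbers below are purely illustrative, not my actual logs):

```python
# Percent FPS uplift from one card to another. Sample numbers are
# illustrative; plug in your own FRAPS min/avg/max figures.

def pct_gain(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps * 100

print(round(pct_gain(50, 59)))  # -> 18, i.e. inside the 15-20% band
```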

I am sadly not an owner of GTA V yet (been dodging that god-awful 60GB+ DL), but I will be getting it in the next week or two, and the Witcher 3 is also on my list to get.


----------



## BlaXey

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I will log some FRAPs as the week goes on. Spent the majority of my time so far just figuring out the temp/fan curves, where the voltage is happy, where my PSU is crapping out on me, and just how much I can squeeze out of it.
> 
> It's always easy to find a benching clock that is stable, but finding that nice summer-safe daily clock that has a nice blend of power/temps/performance is what's tricky.
> 
> So far I believe I will settle on 1160/1600 with 50mv+ and no AUX voltage as a daily clock.
> I just see no reason to give up any of the clock speed I've gained, even with it needing additional voltage, because the temps are well under control.
> 
> Benching clock is 1185/1700 @ 100mv+ and 50mv+ aux voltage. (I'll be pushing the VRAM further, and pushing my CPU further, to break a 14k graphics score on a single card.)
> It's so stable I could honestly run daily that way, but I'm not a fan of running core and VRM at 83-85c under max load all the time.
> 
> I have to admit.... Crysis 3 has picked up the biggest gains from this new card/driver combo over the 290.
> It simply destroys the FPS I had on the 290 by 15-20%. That's min and max... not just avg/max.....
> 
> FPS gain is way up in FC4 and Dirt Rally also.
> 
> I am sadly not an owner of GTA V yet (been dodging that god-awful 60GB+ DL), but I will be getting it in the next week or two, and the Witcher 3 is also on my list to get.


Do you think that i can get the same results with a Sapphire tri-X?


----------



## Agent Smith1984

Here is a list of Tri-X reviews that include overclocking.

http://www.tweaktown.com/reviews/7205/sapphire-tri-radeon-r9-390x-8gb-video-card-review/index10.html

http://hexus.net/tech/reviews/graphics/84194-sapphire-radeon-r9-390x-tri-x/?page=13

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69646-amd-r9-390x-8gb-performance-review-17.html

http://www.legitreviews.com/sapphire-nitro-r9-390-8gb-nitro-r9-380-4gb-video-card-review_166123/4
(This one shocked me, and I am a bit skeptical that the card is holding that clock speed. I'd like to see the core clock readout on a full Firestrike run for sure... it could still be legit though, and it further proves my notes below.)

Please pay attention to which ones increase voltage and which use stock voltage, as that will always be a big factor in the overclock. Some of these reviews never mention it either way, so assume the usual 1130-or-less results are on stock voltage when not stated.

Typical results on stock voltage will be 1075-1140 core and 1600-1700 memory in my opinion.
Overvoltage results can be so inconsistent.... you are literally talking 1145MHz up into the low-mid 1200's with stock cooling.

Very seldom will you see a card NOT on water, that breaks 1220MHz...

Also remember that max voltage won't always mean max OC.

Sometimes these boards have a sweet spot with temps/power draw that could land you at a slightly higher MHz with less voltage.
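One way to spot that sweet spot is to rank your tested operating points by MHz per watt; a quick sketch, with made-up sample points:

```python
# Rank tested operating points by clock-per-watt to find the efficiency
# sweet spot. The (core MHz, +mV, board watts) tuples are examples only.

tests = [
    (1100,   0, 280),
    (1160,  50, 315),
    (1185, 100, 340),
    (1210, 200, 357),
]
best = max(tests, key=lambda t: t[0] / t[2])
print(best)  # the most efficient sampled point (here the stock-voltage one)
```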


----------



## cosita88

Quote:


> Originally Posted by *Particle*
> 
> 
> 
> Particle
> Radeon 390
> Sapphire Nitro Tri-X


I would be grateful if you share your BIOS.

Thank you .

I use Google Translate, sorry.


----------



## Amhro

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Done!
> 
> Anyone else wanting to post final overclock/voltage results just let me know!
> 
> Thanks


I'm at 1100/[email protected] voltage for now


----------



## Face76

Any plug and play water cooling solutions for the 390 yet? My MSI 390 will be here tomorrow.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Amhro*
> 
> I'm at 1100/[email protected] voltage for now


What's stock voltage for the card?


----------



## russik

Do all 300-series GPUs support HDMI 2.0, or only the high end like the 390/390X? And has anyone tried it? Does it work well without problems?


----------



## Particle

Russik, it's my understanding that HDMI 2.0 support didn't come with this generation. Fiji certainly doesn't have it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *russik*
> 
> Does all 3series gpus support hdmi 2.0 or only high end like 390/390x ? And does anyone have tried it and works well without problems?


Particle is correct...

There is no HDMI 2.0 on any of AMD's GPU offerings.

A letdown indeed, but there should be some DP -> HDMI 2.0 adapters coming soon that allow 4K @ 60Hz (if that's why you are asking).


----------



## russik

Yes, that's what I asked. But that's sad; why does AMD do this to us?


----------



## Motley01

We need to see some pics of these cards in your rigs, guys!!! I looked through the entire thread and didn't see any.

I just ordered the MSI 390x and should have it by Thursday.


----------



## Superjit94

Just recently jumped ship from Team Green to Team Red. Coming from someone who has also bought Nvidia products, I'm very pleased with this card. Even though there isn't much overclockability right now with this card, it's still performing more than enough for me. Going to definitely crossfire these in the near future.

I have an XFX R9 390.

I have it overclocked to 1090MHz on the GPU and 1735MHz on the memory. Haven't seen any visual artifacts yet.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Motley01*
> 
> We need to see some pics of these cards in your rigs guys!!! I looked through the entire thread and didn't see any?
> 
> I just ordered the MSI 390x and should have it by Thursday.


I agree!!!

I will get some shots of mine later.... LOOKS AWESOME!!

Thanks for the nudge, and look forward to adding you to the club.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Superjit94*
> 
> 
> 
> Just recently jumped ship from Team Green to Team Red. Coming from someone who is also buying Nvidia products im very pleased with this card. Even though there isn't much Overclockability right now with this card, it's still performing more than enough for me. Going to definitely crossfire these in the near future.
> 
> I have an XFX R9 390
> 
> i have it OverClocked to 1090MHz on the GPU and 1735MHz on the Memory. Haven't seen any visual artifacts yet.


Added!

Can you specify your voltage settings please?

Thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *Face76*
> 
> Any plug and play water cooling solutions for the 390 yet? My MSI 390 will be here tomorrow.


I would think that the NZXT G10 and Kraken setup would work just fine.

I believe @Kalistoval is already using a kraken cooler with his card.

You may want to speak with him.


----------



## Amhro

Quote:


> Originally Posted by *Ha-Nocri*
> 
> What's stock voltage for the card?


No idea as I have not touched it








Quote:


> Originally Posted by *Motley01*
> 
> We need to see some pics of these cards in your rigs guys!!! I looked through the entire thread and didn't see any?
> 
> I just ordered the MSI 390x and should have it by Thursday.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Amhro*
> 
> No idea as I have not touched it


As a note, you can use the MSI Gaming App to control the LED's response to GPU load... pretty neat, though haven't used it myself.

I think the MSI card is still a very appealing offer right now...

You get a nice cooler, great (though extremely common) color scheme, an LED, and a great looking back plate.
The card is also not overly lengthy, and it has a very aggressive OC in OC mode compared to other offerings.
It also seems to clock well with voltage.

I also like the Sapphire option a lot, as my last cards were Sapphire Tri-X, and served me very well.


----------



## By-Tor

I was using EK's Cooling Configurator and found that a couple of the 290X full-cover blocks they already produce fit three companies' 390/390X cards. PowerColor, XFX, and Asus cards already have full-cover blocks that will fit, if I'm using it right.

http://configurator.ekwb.com/


----------



## Motley01

Finally some pics, very nice! Ya I like the LEDs on the MSI. No other offerings have that.

So are you guys connecting the two 8-pin power connectors? Same as the MSI 290X Lightning.


----------



## Hasmir

Here is a pic of my card in my setup ^^



Not the best quality tho


----------



## Superjit94

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Added!
> 
> Can you specify your voltage settings please?
> 
> Thanks


I'm not sure what the exact numbers are, but in the MSI Afterburner overclocking tool I did +15 on the voltage control and +15% on the power limit slider.

Did a 1-hour Kombustor stress test, and temps never went over 70°C; the RV02 is an awesome air-cooling case.


----------



## Agent Smith1984

Hadn't dusted lately, excuse the mess









Bottom power cables are already run from my last CF setup, ready for my second 390.


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> Hadn't dusted lately, excuse the mess
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bottom power cables already ran from my last CF setup, and ready to get for my second 390.


The vrm cooling is real!


----------



## Amhro

Quote:


> Originally Posted by *Motley01*
> 
> Finally some pics, very nice! Ya I like the LEDs on the MSI. No other offerings have that.
> 
> So are you guys connecting the two 8 pin power connectors? Same as the MSI 290x Lighting.


MSI 390 needs 8+6 pin connectors.


----------



## Performer81

Anybody who has the 390X PCS+?


----------



## maynard14

Hi guys, I currently have a reference XFX 290X card, which I will sell tomorrow. I'm now deciding which card to buy. I really don't want to use an AIO cooler on my 290X with an NZXT G10; I want an air-cooled card. So I'm deciding between the 390X Tri-X, an NVIDIA 980 with the reference cooler, or the cheapest 980 card. Can you help me choose which card runs coolest and destroys FPS at 1080p ultra settings in the latest games, especially Witcher 3?

thanks in advance guys


----------



## Ha-Nocri

Quote:


> Originally Posted by *maynard14*
> 
> hi guys, currently i have ref xfx 290x card, i will sell it tomorrow, i am thinking now if which card to buy, i really dont want to use a aio cooler on my 290x with nzxt g10, i want an air cooled card,, so im deciding between 390x tri x or nvidia 980 ref cooler or the cheapest 980 card, can you help me choose which card is the coolest and destroys fps on 1080p ultra settings on the latest games specially withcer 3
> 
> thanks in advance guys


UpSidegrade from 290x to 390x/980? Seems a waste of money to me. Is reference cooler loud @idle btw?


----------



## maynard14

Quote:


> Originally Posted by *Ha-Nocri*
> 
> UpSidegrade from 290x to 390x/980? Seems a waste of money to me. Is reference cooler loud @idle btw?


I know, sir, but I can't take the heat of my 290X, and I've been through two 290X's already; they always give me black screens when I use an NZXT G10 and AIO cooler. So with this new reference card I don't want to bother taking it apart to fit the NZXT G10. The reference cooler is really loud, like a blow dryer, at 100 percent fan speed: 83c GPU core, VRM1 at 61c, and VRM2 at 72c. That's why I want to get rid of it ASAP, although performance-wise I can OC it to 1105 on the core at default voltage.


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> i know sir but i cant take the heat of my 290x and i been through 2 290x already, always gives me black screen when i use a nzxt g10 and aio cooler, so this new ref card i have i dont want to bother taking it apart and put the nzxt g10,.. the ref cooler is really loud like a blow dryer at 100 percent fan speed and 83c gpu core and vrm1 61 c and vrm2 72c,, so thats why i want to get rid of it asap although performance i can oc it to 1105 gpu core with default voltage,,


Go for the 8GB card. I would if I were choosing between those two. But I'm biased.









Edit: if you find a cheaper 390, then I recommend that. It is a bit faster than the 290X.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> The vrm cooling is real!


LMAO

Yeah, they get hot on this cheap board, and to get this CPU to a more competitive clock speed, I had to beef up the cooling.
I also have a 120mm low-pro fan on the rear of the CPU socket cooling that too.
Quote:


> Originally Posted by *maynard14*
> 
> I know, sir, but I can't take the heat of my 290X, and I've already been through two of them; they always give me black screens when I use an NZXT G10 with an AIO cooler. I don't want to bother taking this new reference card apart to fit the G10. The reference cooler is really loud, like a blow dryer at 100 percent fan speed, with the GPU core at 83C, VRM1 at 61C, and VRM2 at 72C. That's why I want to get rid of it ASAP, even though it overclocks to 1105 on the core at default voltage.


I agree that you're basically making somewhat of a side-grade (though the 390X is about 10% faster overall with its higher clock speeds and improved driver).
If you are choosing between the 390X and the 980, but have the money for the 980, you may want to hang tight just a few weeks and see how the base Fury and Fury Nano stack up.

Either way, if I had to choose between the 390X at $430 and the GTX 980 at $500, the choice is simple: the 390X.


----------



## maynard14

Quote:


> Originally Posted by *Agent Smith1984*
> 
> LMAO
> 
> Yeah, they get hot on this cheap board, and to get this CPU to a more competitive clock speed, I had to beef up the cooling.
> I also have a 120mm low-pro fan on the rear of the CPU socket cooling that too.
> I agree that you're basically making somewhat of a side-grade (though the 390X is about 10% faster overall with its higher clock speeds and improved driver).
> If you are choosing between the 390X and the 980, but have the money for the 980, you may want to hang tight just a few weeks and see how the base Fury and Fury Nano stack up.
> 
> Either way, if I had to choose between the 390X at $430 and the GTX 980 at $500, the choice is simple: the 390X.


Thanks, bro, for the advice and comparison. Here in the Philippines the 390X Tri-X is priced at $465, while a second-hand 980 also costs at least $465, but a brand-new 980 costs $500-plus here.

Which one is quieter and cooler? And what about HairWorks and the other stuff on the NVIDIA side?


----------



## By-Tor

Quote:


> Originally Posted by *maynard14*
> 
> Hi guys, I currently have a reference XFX 290X that I'm selling tomorrow, and I'm trying to decide which card to buy next. I really don't want to put an AIO cooler on a 290X with an NZXT G10 again; I want an air-cooled card. So I'm deciding between the 390X Tri-X, an NVIDIA 980 with the reference cooler, or the cheapest 980 I can find. Can you help me choose the card that runs coolest and destroys FPS at 1080p ultra settings in the latest games, especially Witcher 3?
> 
> Thanks in advance, guys.


If I were replacing my current 290X (which at this point I'm not), I would go with a pair of PowerColor 390s. That pair would dominate any single card on the market at a much better price point.

Just my 2 cents


----------



## maynard14

Quote:


> Originally Posted by *By-Tor*
> 
> If I were replacing my current 290X (which at this point I'm not), I would go with a pair of PowerColor 390s. That pair would dominate any single card on the market at a much better price point.
> 
> Just my 2 cents


Thanks, but I can't afford to buy two cards.


----------



## By-Tor

Quote:


> Originally Posted by *maynard14*
> 
> Thanks, but I can't afford to buy two cards.


You could always pick up the first one now and a second later...

Myself, I'm looking for a second PowerColor R9 290X LCS to go with the one I have now.

But it's very hard to find right now.

Very nice card...
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543

PS: If anyone is selling one of these cards to go with a 390/390X please PM me.


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> Thanks, but I can't afford to buy two cards.


You don't need two to match the other cards among your choices; just OC a bit. I believe the figures in the charts are graphics scores . . .



290 at 1130 MHz . . .

http://www.3dmark.com/3dm/2896682

@ 1260 . . .

http://www.3dmark.com/3dm/1895777

The 390 is faster and needs a smaller OC. Get the 8GB if you're spending that much. And HairWorks? Even NVIDIA users are turning that off.


----------



## maynard14

Quote:


> Originally Posted by *By-Tor*
> 
> You could always pick up the first one now and a second later...
> 
> Myself, I'm looking for a second PowerColor R9 290X LCS to go with the one I have now.
> 
> Very nice card...
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543


Thanks, thanks... but I'll stick to just 1080p at a 120Hz refresh rate. Also, the 390 is still not available here in the Philippines.


----------



## Particle

Maynard14, the 290/390 and 290X/390X are the same chip. The differentiating factors are the doubling of the frame buffer from 4 to 8 GB, an increase in stock frequencies, virtually all custom cooler designs from partners, and current market prices. If heat is your main problem and you are price sensitive, you might be best served by seeking out a custom-cooled 290/290X second-hand. Make sure to check reviews though, as some of them had reputations for not being very effective. You compared prices against a used 980 which suggests you're open to the idea of used cards.


----------



## maynard14

Quote:


> Originally Posted by *Particle*
> 
> Maynard14, the 290/390 and 290X/390X are the same chip. The differentiating factors are the doubling of the frame buffer from 4 to 8 GB, an increase in stock frequencies, virtually all custom cooler designs from partners, and current market prices. If heat is your main problem and you are price sensitive, you might be best served by seeking out a custom-cooled 290/290X second-hand. Make sure to check reviews though, as some of them had reputations for not being very effective. You compared prices against a used 980 which suggests you're open to the idea of used cards.


Yes, I'm going to buy a second-hand NVIDIA or AMD card. A new 390X is still a good price here and I can afford it. Yes, the 390X/290X are the same chip; I think the driver optimization, GPU clock, and memory size are the factors in why the two cards don't perform the same. I tried my reference 290X OC'd to 1105 at stock voltage and it can match the 390X and 980. Unfortunately, I can't stand the AIO cooler and NZXT G10 (it tends to break on me) and the reference cooler is so horrible, which is why I'm selling it. I'm also considering a second-hand 290X Tri-X, but I'm really eyeing the 980, simply because I want to change to NVIDIA for the extra features, temperatures, and OC ability. But given the price difference between a second-hand 980 and a brand-new 390X, well, I'll get the 390X, IF it runs cooler than the 290X and overclocks well.


----------



## Agent Smith1984

So far, I have noticed the biggest areas of improvement for the 390 to be the following (still going to be logging some FRAPS soon):

FireStrike Graphics score

Crysis 3

Far Cry 4

BF4 seems to just be so happy on these cards that the only improvement is through clock speed, but the Cry stuff is drastically faster....

Crysis 3 has shown a solid 15-20% improvement, and that's not just max and average; I am seeing the minimums way up.....

Whatever AMD did on the 15.15, it worked for more than just Witcher 3 and GTA V.....

Crysis 3 is an old title now, and it loves this driver.

I'd like to add that I have not read any reports of black screens on these cards as of yet.

I think people need to be aware that when the 290 series dropped, its direct competition was the 780/780 Ti....

Now 2 years later, it competes again with the 970/980....

I'd say coming out twice with the same architecture and competing with two totally different NVIDIA architectures is pretty impressive, no?


----------



## maynard14

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So far, I have noticed the biggest areas of improvement for the 390 to be the following (still going to be logging some FRAPS soon):
> 
> FireStrike Graphics score
> 
> Crysis 3
> 
> Far Cry 4
> 
> BF4 seems to just be so happy on these cards that the only improvement is through clock speed, but the Cry stuff is drastically faster....
> 
> Crysis 3 has shown a solid 15-20% improvement, and that's not just max and average; I am seeing the minimums way up.....
> 
> Whatever AMD did on the 15.15, it worked for more than just Witcher 3 and GTA V.....
> 
> Crysis 3 is an old title now, and it loves this driver.
> 
> I'd like to add that I have not read any reports of black screens on these cards as of yet.
> 
> I think people need to be aware that when the 290 series dropped, its direct competition was the 780/780 Ti....
> 
> Now 2 years later, it competes again with the 970/980....
> 
> I'd say coming out twice with the same architecture and competing with two totally different NVIDIA architectures is pretty impressive, no?


Hmm, I agree... the 780 Ti back then, and now the 970; the 200 series and 300 series are still able to compete with NVIDIA's offerings.







Hmm, now I'm really confused about which card to buy to replace my 290X volcano, hehe.


----------



## Agent Smith1984

Quote:


> Originally Posted by *maynard14*
> 
> Hmm, I agree... the 780 Ti back then, and now the 970; the 200 series and 300 series are still able to compete with NVIDIA's offerings.
> 
> 
> 
> 
> 
> 
> 
> Hmm, now I'm really confused about which card to buy to replace my 290X volcano, hehe.


If it's just a temp issue for you, any third-party 290 or 390 will have you covered.

In my opinion, the temperatures associated with the 290 reference cooler are old news at this point; AMD's partners remedied this on the 290 series a while back.

I mean, my Tri-X OC 290 never broke 72C with an overclock/overvolt on it when running by itself (custom fan profile with the overvolt).

Honestly, at stock speeds, with the stock fan profile, the card was very quiet, and I usually have games cranked up anyways.

I still don't understand the load-noise complaints associated with graphics cards.... I often wonder what the hell people are doing with no sound on that puts the card under load (besides mining, in which case you're never in the room with the cards anyway....).

Anyone gaming is not listening to their box.

I keep my CPU rad at full tilt (2600rpm) to keep that booger cool, and also have an aggressive GPU fan profile to keep it cool when overvolted, and I never hear any of it over the games I'm playing.
I guess some people have their reasons though....


----------



## maynard14

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If it's just a temp issue for you, any third-party 290 or 390 will have you covered.
> 
> In my opinion, the temperatures associated with the 290 reference cooler are old news at this point; AMD's partners remedied this on the 290 series a while back.
> 
> I mean, my Tri-X OC 290 never broke 72C with an overclock/overvolt on it when running by itself (custom fan profile with the overvolt).
> 
> Honestly, at stock speeds, with the stock fan profile, the card was very quiet, and I usually have games cranked up anyways.
> 
> I still don't understand the load-noise complaints associated with graphics cards.... I often wonder what the hell people are doing with no sound on that puts the card under load (besides mining, in which case you're never in the room with the cards anyway....).
> 
> Anyone gaming is not listening to their box.
> 
> I keep my CPU rad at full tilt (2600rpm) to keep that booger cool, and also have an aggressive GPU fan profile to keep it cool when overvolted, and I never hear any of it over the games I'm playing.
> I guess some people have their reasons though....


It's hot here in the Philippines. The NZXT G10 and AIO cooler work great, but the card tends to black-screen after I change from the reference cooler, hehe. The reference cooler is not very good at all; my idle is 56C, and ramping the fan to 100 percent is very loud and annoying. I want max FPS at 1080p with good cooling, and something as quiet as possible.


----------



## whiteturbo

Got my MSI 390X yesterday and installed drivers, Afterburner, etc. The first thing I noticed was how hot it ran: up in the 90s running Heaven, and that's with no overclock and the fans going full tilt. It was certainly noisier than I'm used to. I played about with overclocking up to 1175 core / 1600 memory, but it was blasting the heat out (high 90s), so I gave that up as a bad idea and went back to stock.
To be fair, it was a warm day yesterday and today is even hotter (28C). So after thinking about it, I decided to say f*** the warranty and, for my own peace of mind, I have redone the thermal paste with MX4. There was, in my opinion (what do I know, lol), an excess of thermal paste, though I must admit I was expecting there not to be enough. Does this card just naturally run hot? At this moment I have it hovering around 50 degrees, but I have underclocked the card back to 1000 core / 1400 memory and set the fans to stay off until 50C, and they are flicking on and off at the moment. Also, because of the new card, my computer is not in its normal place, so I'm probably hearing more fan noise due to its proximity. I will give it a session of Grim Dawn (my game of the moment) and see how it gets on. What temperature should I expect from the 390X in heavy gaming, e.g. Witcher 3?


----------



## Agent Smith1984

Quote:


> Originally Posted by *whiteturbo*
> 
> Got my MSI 390X yesterday and installed drivers, Afterburner, etc. The first thing I noticed was how hot it ran: up in the 90s running Heaven, and that's with no overclock and the fans going full tilt. It was certainly noisier than I'm used to. I played about with overclocking up to 1175 core / 1600 memory, but it was blasting the heat out (high 90s), so I gave that up as a bad idea and went back to stock.
> To be fair, it was a warm day yesterday and today is even hotter (28C). So after thinking about it, I decided to say f*** the warranty and, for my own peace of mind, I have redone the thermal paste with MX4. There was, in my opinion (what do I know, lol), an excess of thermal paste, though I must admit I was expecting there not to be enough. Does this card just naturally run hot? At this moment I have it hovering around 50 degrees, but I have underclocked the card back to 1000 core / 1400 memory and set the fans to stay off until 50C, and they are flicking on and off at the moment. Also, because of the new card, my computer is not in its normal place, so I'm probably hearing more fan noise due to its proximity. I will give it a session of Grim Dawn (my game of the moment) and see how it gets on. What temperature should I expect from the 390X in heavy gaming, e.g. Witcher 3?


Very curious about your case's airflow, and if the MX4 helped or not, and what your fan profile looks like...

With +100mV I see the core get to around 82-84C in Heaven....

What are your ambient room and case temps?

90C is not dangerous, but it's certainly hot.


----------



## Ha-Nocri

Quote:


> Originally Posted by *whiteturbo*
> 
> Got my MSI 390X yesterday and installed drivers, Afterburner, etc. The first thing I noticed was how hot it ran: up in the 90s running Heaven, and that's with no overclock and the fans going full tilt. It was certainly noisier than I'm used to. I played about with overclocking up to 1175 core / 1600 memory, but it was blasting the heat out (high 90s), so I gave that up as a bad idea and went back to stock.
> To be fair, it was a warm day yesterday and today is even hotter (28C). So after thinking about it, I decided to say f*** the warranty and, for my own peace of mind, I have redone the thermal paste with MX4. There was, in my opinion (what do I know, lol), an excess of thermal paste, though I must admit I was expecting there not to be enough. Does this card just naturally run hot? At this moment I have it hovering around 50 degrees, but I have underclocked the card back to 1000 core / 1400 memory and set the fans to stay off until 50C, and they are flicking on and off at the moment. Also, because of the new card, my computer is not in its normal place, so I'm probably hearing more fan noise due to its proximity. I will give it a session of Grim Dawn (my game of the moment) and see how it gets on. What temperature should I expect from the 390X in heavy gaming, e.g. Witcher 3?


It shouldn't be that hot. It's 26C here and the max temp on my card is ~75C. It has 3 fans, though, but even so they're only running at 55% at 75C. Maybe your case airflow is bad?


----------



## Thoth420

Hey all. I bought an XFX Fury X for a brand-new build I plan on putting together this weekend (still waiting on parts). Assuming it doesn't have any sound issues etc., I plan on keeping it, as I will be using the ASUS MG279Q at 1440p @ 120Hz.

Assuming I have to RMA the Fury X, my backup plan was to pick up a 390X to use in the meantime. I was laser-focused on the Fury X and haven't taken any time to look at 390X performance and temperatures at 1440p. I'm just wondering what to expect running 1440p @ 120Hz with a single 390X. Long run, I plan on using a Fury X, but I have zero patience to run on integrated graphics and no other GPU lying around as a backup, so if I did purchase a 390X I would have to sell it in a month or two, which is fine... I don't mind losing some money.

Also what brand and model should I go after? I don't want factory OC FYI.

Cheers and I hope you are all enjoying the cards


----------



## Agent Smith1984

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all. I bought an XFX Fury X for a brand-new build I plan on putting together this weekend (still waiting on parts). Assuming it doesn't have any sound issues etc., I plan on keeping it, as I will be using the ASUS MG279Q at 1440p @ 120Hz.
> 
> Assuming I have to RMA the Fury X, my backup plan was to pick up a 390X to use in the meantime. I was laser-focused on the Fury X and haven't taken any time to look at 390X performance and temperatures at 1440p. I'm just wondering what to expect running 1440p @ 120Hz with a single 390X. Long run, I plan on using a Fury X, but I have zero patience to run on integrated graphics and no other GPU lying around as a backup, so if I did purchase a 390X I would have to sell it in a month or two, which is fine... I don't mind losing some money.
> 
> Also what brand and model should I go after? I don't want factory OC FYI.
> 
> Cheers and I hope you are all enjoying the cards


I've not tested at 1440p, but here is a good Fury X review that shows right where the 390X would also fall into the equation...
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/7.html

TBH, it's not very far behind the Fury X at that resolution, especially with the price difference factored in...
If you were to OC the 390X, it would perform within 5-10% of the Fury in most cases!

I would go for the Tri-X if the factory OC is a non-factor, simply because it's the coolest- and quietest-running card under load.


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've not tested at 1440p, but here is a good Fury X review that shows right where the 390X would also fall into the equation...
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X/7.html
> 
> TBH, it's not very far behind the Fury X at that resolution, especially with the price difference factored in...
> If you were to OC the 390X, it would perform within 5-10% of the Fury in most cases!
> 
> I would go for the Tri-X if the factory OC is a non-factor, simply because it's the coolest- and quietest-running card under load.


Thank you sir!









I really like the small form factor and the AIO cooler, but if it turns out to be a noisy mess then I may just go with a single 390X and stick with it. The performance is definitely acceptable for what I'm looking to get.


----------



## Particle

Considering the performance I've seen out of the regular 390 at that resolution, it's hard to imagine not being satisfied with a 390X.


----------



## Thoth420

Quote:


> Originally Posted by *Particle*
> 
> Considering the performance I've seen out of the regular 390 at that resolution, it's hard to imagine not being satisfied with a 390X.


Yeah I think the cool SFF and the AIO cooler gave me tunnel vision. I should probably consider...IDK saving money for once...


----------



## whiteturbo

Hi,
Quote:


> Originally Posted by *Ha-Nocri*
> 
> It shouldn't be that hot. It's 26C here and the max temp on my card is ~75C. It has 3 fans, though, but even so they're only running at 55% at 75C. Maybe your case airflow is bad?


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Very curious about your case's airflow, and if the MX4 helped or not, and what your fan profile looks like...
> 
> With +100mV I see the core get to around 82-84C in Heaven....
> 
> What are your ambient room and case temps?
> 
> 90C is not dangerous, but it's certainly hot.


Well, first: the MX4 really sorted it for me. It still gets to the low 90s, but that's with the max overclock my 390X can take (1180 core, 1600 mem, +100mV and +50 power); it hit 93C. The main point is that it was back to 50C within a minute or so. Before I redid the thermal paste, it was taking an hour to return to base temps (not even down to 50C).
I can only guess the ambient room temp at 25-30C (Lollipop crashed my moby AGAIN). According to my BIOS the motherboard is at 38C, but that was before I ran Heaven. My case is a Cooler Master HAF XB EVO; it's like a big biscuit tin with mesh and vents everywhere, with three 120mm fans, or seven if you count the GPU and CPU cooler fans, so I can assure you there are no airflow problems. I have my fan profile at zero until 55C, then a sharpish curve, almost a straight line, reaching 100% at about 80C. I will drop the core back to 1150 and the mem to 1550 for an everyday setup; I might even drop the mem back to stock, as it doesn't seem to make much difference.
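For anyone who wants to copy that curve: it's essentially a straight line between the two points I mentioned. A rough sketch (assuming a plain linear ramp, which is approximately how I have it drawn in Afterburner):

```python
def fan_speed(temp_c: float) -> int:
    """Approximate fan duty (%) for a profile that is off below 55 C
    and ramps roughly linearly to 100% at 80 C."""
    if temp_c < 55:
        return 0
    if temp_c >= 80:
        return 100
    # Linear interpolation between (55 C, 0%) and (80 C, 100%).
    return round((temp_c - 55) / (80 - 55) * 100)

print(fan_speed(67.5))  # midpoint of the ramp -> 50
```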

Ray


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> LMAO
> 
> Yeah, they get hot on this cheap board, and to get this CPU to a more competitive clock speed, I had to beef up the cooling.
> I also have a 120mm low-pro fan on the rear of the CPU socket cooling that too.
> I agree that you are basically making somewhat of a side-grade (though the 390x is about 10% faster overall with it's clock speeds and improved driver).
> If you are choosing between 390x and 980, but have the money for the 980, you may want to hang tight just a few weeks and see how the Fury base and Fury Nano stack up.
> 
> Either way, if I had to choose between the 390x at $430, and the GTX980 at $500, the choice is simple.... the 390x


Yeah, I had to do the same thing to get competitive clocks without throttling on my Gigabyte 990FXA-UD3 Rev 4. The things we do for a few hundred MHz am I right?


----------



## Motley01

OK, I'm really gonna lose it right now, I'm so pissed. I found out that only XFX 300 series cards come with the free Dirt Rally game.

What the heck?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150727&cm_re=xfx_390x-_-14-150-727-_-Product

No MSI, no Asus, no EVGA models have the free game deal.


----------



## DividebyZERO

Quote:


> Originally Posted by *Motley01*
> 
> OK, I'm really gonna lose it right now, I'm so pissed. I found out that only XFX 300 series cards come with the free Dirt Rally game.
> 
> What the heck?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150727&cm_re=xfx_390x-_-14-150-727-_-Product
> 
> No MSI, no Asus, no EVGA models have the free game deal.


I was wondering about this, but all the ones I viewed had no games. I guess XFX changed it? But yeah, I wanted it because all the 2xx series cards seem to be getting free games. Why not the 300 series?


----------



## Motley01

Quote:


> Originally Posted by *whiteturbo*
> 
> Hi,
> 
> Well, first: the MX4 really sorted it for me. It still gets to the low 90s, but that's with the max overclock my 390X can take (1180 core, 1600 mem, +100mV and +50 power); it hit 93C. The main point is that it was back to 50C within a minute or so. Before I redid the thermal paste, it was taking an hour to return to base temps (not even down to 50C).
> I can only guess the ambient room temp at 25-30C (Lollipop crashed my moby AGAIN). According to my BIOS the motherboard is at 38C, but that was before I ran Heaven. My case is a Cooler Master HAF XB EVO; it's like a big biscuit tin with mesh and vents everywhere, with three 120mm fans, or seven if you count the GPU and CPU cooler fans, so I can assure you there are no airflow problems. I have my fan profile at zero until 55C, then a sharpish curve, almost a straight line, reaching 100% at about 80C. I will drop the core back to 1150 and the mem to 1550 for an everyday setup; I might even drop the mem back to stock, as it doesn't seem to make much difference.
> 
> Ray


Hmm, interesting. Thanks for the heads-up on the heatsink paste. I should have my MSI 390X delivered tomorrow. I think I will redo the paste; I got some really good Gelid thermal paste that I used on my 5820K recently, and that stuff is the best for cooling.


----------



## DividebyZERO

I was hoping to see someone put one of these under water, just curious if it helps any. Looks like my old 290X blocks aren't going to fit. However, I may just get water blocks for the GPU cores... if I get 390Xs, that is. I am still waiting for more information on the mysterious, never-in-stock Fury X.


----------



## Duke976

Count me in







XFX 390


----------



## Motley01

Quote:


> Originally Posted by *Duke976*
> 
> Count me in
> 
> 
> 
> 
> 
> 
> 
> XFX 390


Lucky son of a ...., you got a free game with it too! Very cool.


----------



## Duke976

Quote:


> Originally Posted by *Motley01*
> 
> Lucky son of a ...., you got a free game with it too! Very cool.


Yeah, but I haven't claimed it yet. I'm planning to get the Fury X once it's back in stock at Amazon, so I'm refraining from claiming the game, as it would be unfair and dishonest if I ever return this XFX.


----------



## simo292a

Hi guys, I bought the R9 390X and got it yesterday. Other than some driver problems (I finally got it working by installing the driver manually from Device Manager), most games run perfectly, like The Witcher 3, which I am very happy with. But there is one game, GTA 5, that stutters like crazy when playing; it's almost unplayable. Do you guys experience that too? By the way, I am running Windows 10.


----------



## kizwan

What is the performance percentage between the 390 and the 390X? Is the 390 around 93 to 94% of a 390X?
Quote:


> Originally Posted by *Duke976*
> 
> Count me in
> 
> 
> 
> 
> 
> 
> 
> XFX 390


Clocks?
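A quick back-of-the-envelope on my own question, using spec-sheet numbers (assumed, not measured: 390 = 2560 stream processors @ 1000 MHz, 390X = 2816 @ 1050 MHz):

```python
# Theoretical shader throughput of a stock 390 relative to a stock 390X.
# Spec-sheet values (assumed, not benchmarked):
sp_390, mhz_390 = 2560, 1000
sp_390x, mhz_390x = 2816, 1050

ratio = (sp_390 * mhz_390) / (sp_390x * mhz_390x)
print(f"390 is ~{ratio:.0%} of a 390X on paper")
```

Games aren't purely shader-bound (same memory bandwidth on both cards), so the real-world gap should be smaller than this paper figure suggests.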


----------



## Ha-Nocri

Quote:


> Originally Posted by *simo292a*
> 
> Hi guys, I bought the R9 390X and got it yesterday. Other than some driver problems (I finally got it working by installing the driver manually from Device Manager), most games run perfectly, like The Witcher 3, which I am very happy with. But there is one game, GTA 5, that stutters like crazy when playing; it's almost unplayable. Do you guys experience that too? By the way, I am running Windows 10.


Can't help you with Win10, but there is a beta driver made for GTA 5, 15.3 or 15.4, I'm not sure which. How much RAM do you have?


----------



## simo292a

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Can't help you with Win10, but there is a beta driver made for GTA 5, 15.3 or 15.4, I'm not sure which. How much RAM do you have?


I've got 8GB. Do the GTA 5 drivers work with the R9 390X? I thought the only driver for it was the one made for the new series.


----------



## Ha-Nocri

Quote:


> Originally Posted by *simo292a*
> 
> I've got 8GB. Do the GTA 5 drivers work with the R9 390X? I thought the only driver for it was the one made for the new series.


Not sure if the earlier 2xx series drivers work with the 390(X). 8GB is enough if you close other programs, like the browser. But I'm also using modded Win10 drivers on Win7, and GTA is smooth.


----------



## simo292a

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Not sure if the earlier 2xx series drivers work with the 390(X). 8GB is enough if you close other programs, like the browser. But I'm also using modded Win10 drivers on Win7, and GTA is smooth.


I may be interested in the modded drivers for Win10. I may as well try it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> I was hoping to see someone put one of these under water, just curious if it helps any. Looks like my old 290X blocks aren't going to fit. However, I may just get water blocks for the GPU cores... if I get 390Xs, that is. I am still waiting for more information on the mysterious, never-in-stock Fury X.


What 290x cards are you running?

I believe if you go through EK's block search, there are some 290x blocks that work.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Count me in
> 
> 
> 
> 
> 
> 
> 
> XFX 390


Hey Duke, welcome!
In order to add you, I need a GPU-Z shot and something open in the background with your user name (can even be a window showing your profile logged in on OCN).
Or you can just snap a picture of the box/card with your name written on something.

I'll get you added as soon as you get it.

Thanks!

PS: Can you also post your clocks and voltage?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Not sure if the earlier 2xx series drivers work with the 390(X). 8GB is enough if you close other programs, like the browser. But I'm also using modded Win10 drivers on Win7, and GTA is smooth.


To my knowledge, only the 15.20 works on Windows 10, and only the 15.15 driver works on Windows 8.1 and prior.

The 15.15 is one hell of a driver though.

It's been the best running, most reliable driver I've gotten from AMD in a long time...


----------



## simo292a

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To my knowledge, only the 15.20 works on Windows 10, and only the 15.15 driver works on Windows 8.1 and prior.
> 
> The 15.15 is one hell of a driver though.
> 
> It's been the best running, most reliable driver I've gotten from AMD in a long time...


Can I find the 15.20 driver somewhere? I kinda need it, since I am running Windows 10.


----------



## Ha-Nocri

Quote:


> Originally Posted by *simo292a*
> 
> Can I find the 15.20 driver somewhere? I kinda need it, since I am running Windows 10.


Windows should install them automatically. Check in CCC:



I use modded 15.20 on win7.


----------



## Agent Smith1984

Quote:


> Originally Posted by *simo292a*
> 
> Can I find the 15.20 driver somewhere? I kinda need it, since I am running Windows 10.


I can't seem to find it online, though it did come on the included software disc. I don't use an optical drive so I never tried it.

If you can get 15.15 to work, that's the way to go though; it's much faster than 15.20 from my reading, and 15.20 seems to be breaking games anyway.


----------



## Ha-Nocri

You mean it breaks games on the 390(X)? Because for me both Witcher 3 and GTA 5 work great.


----------



## Hasmir

According to GPU-Z, my MSI R9 390 is running 20 MHz lower than it should;



According to newegg;



Anyone else have that error, or am I just reading something differently?


----------



## simo292a

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Windows should install them automatically. Check in CCC:
> 
> 
> 
> I use modded 15.20 on win7.


Funny enough, when I set Windows to automatically find drivers, it installs a driver for the R9 200 series, and then the screen goes black; if I restart, it shows for a short time and then goes black again.


----------



## whiteturbo

Count me in MSI R9 390X


----------



## Amhro

Quote:


> Originally Posted by *Hasmir*
> 
> According to GPU-Z, my MSI R9 390 is running 20 MHz lower than it should;
> 
> 
> 
> According to newegg;
> 
> 
> 
> Anyone else have that error, or am I just reading something differently?


Gaming mode is the default (1040 MHz); OC mode is 1060 MHz. You can switch between them in the MSI Gaming App.


----------



## simo292a

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I can't seem to find it online, though it did come on the included software disc. I don't use an optical drive so I never tried it.
> 
> If you can get 15.15 to work, that's the way to go though; it's much faster than the 15.20 from my reading, and 15.20 seems to be breaking games anyway.


I could only install the driver from the disc, so I suppose I have 15.20? I will check when I get home from work. The 15.15 driver would not install on Win10.
Is there a modded 15.15 driver that works with Win10?


----------



## Particle

Quote:


> Originally Posted by *simo292a*
> 
> Funny enough, when I set Windows to find drivers automatically it installs an R9 200 series driver; then the screen goes black, and if I restart, the picture shows for a moment before going black again.


Restart in safe mode and run DDU to get rid of whichever drivers your system has cached from previous installs or Windows Update. You can force a safe mode boot by using the repair menu on a Windows install disc if necessary. If absent from there too (sometimes happens), you can use a command prompt on an install disc (press Shift+F10 once the install menu loads) and use bcdedit to set the safe mode boot flag.

bcdedit /set {default} safeboot network

Once you're done in safe mode, remove the flag again with: bcdedit /deletevalue {default} safeboot


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hey Duke, welcome!
> In order to add you, I need a GPU-Z shot and something open in the background with your user name (can even be a window showing your profile logged in on OCN).
> Or you can just snap a picture of the box/card with your name written on something.
> 
> I'll get you added as soon as you get it.
> 
> Thanks!
> 
> PS: Can you also post your clocks and voltage?


I'll be happy to oblige.


----------



## simo292a

Quote:


> Originally Posted by *Particle*
> 
> Restart in safe mode and run DDU to get rid of whichever drivers your system has cached from previous installs or Windows Update. You can force a safe mode boot by using the repair menu on a Windows install disc if necessary. If absent from there too (sometimes happens), you can use a command prompt on an install disc (press Shift+F10 once the install menu loads) and use bcdedit to set the safe mode boot flag.
> 
> bcdedit /set {default} safeboot network


Yeah, I tried that, and the only driver that worked was the one from the disc, which I believe is the 15.20 driver. I have problems with that driver though: GTA 5 has huge stutters, but something like Witcher 3 runs butter smooth.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> I'll be happy to oblige.


Added!

Nice heaven score, looks like the high VRAM clock helped.

I am going to try some more aux voltage tonight to try and get 1750+ on the memory.

Looking to break 14k graphics in FireStrike, and 1600 in Heaven with those settings.

AMD's decision to maintain 64 ROP's on the Fury has kept the 300 series very competitive with it at 1080 and 1440P resolutions when using a nice overclock.


----------



## whiteturbo

Quote:


> Originally Posted by *whiteturbo*
> 
> Count me in MSI R9 390X


Is that not right?


----------



## Agent Smith1984

Quote:


> Originally Posted by *whiteturbo*
> 
> Is that not right?


I'll get you added ASAP bud, not at pc right now!

Welcome!

How's the card doing?


----------



## Agent Smith1984

Matter of fact..... How is everyone's cards doing?

I take it, no news, it's good news!

So far no black screen reports, and no artifact reports, unless wrong driver was loaded.

I want to see some more aggressive overclocks and benchmarks gentlemen


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Matter of fact..... How is everyone's cards doing?
> 
> I take it, no news, it's good news!
> 
> So far no black screen reports, and no artifact reports, unless wrong driver was loaded.
> 
> I want to see some more aggressive overclocks and benchmarks gentlemen


I saw a ton of reviews of the Gigabyte 390 on Newegg about restart and shutdown issues with the card installed; it would just restart about a second after powering down. Aside from that, reviews seemed positive. The biggest complaint across the board was the size of the card, namely the Sapphire. I don't know what people expected.... GPUs are big these days.

A few people nitpicked about performance, and one guy clearly had a driver issue causing extremely bad performance.


----------



## Agent Smith1984

I put zero stock in most of the uneducated Newegg reviews, but the restart thing on the Gigabyte board may be something to look out for.

Seems like Giga is very power-conscious, locked voltage, etc...

Maybe they went too low on vcore


----------



## Bigm

Total noob when it comes to overclocking AMD. Can someone explain to me why even though my card is set in Afterburner to 1150 core when I run Firestrike the core jumps all over the place? ie 775mhz to 1150mhz?


----------



## DividebyZERO

Quote:


> Originally Posted by *Bigm*
> 
> Total noob when it comes to overclocking AMD. Can someone explain to me why even though my card is set in Afterburner to 1150 core when I run Firestrike the core jumps all over the place? ie 775mhz to 1150mhz?


You can try going to AB settings and disabling ULPS; right next to it there should be a drop-down where you can select "without PowerPlay support". See if that helps.

You'll probably have to reboot, then reapply the OC settings and run Fire Strike.


----------



## Bigm

Quote:


> Originally Posted by *DividebyZERO*
> 
> You can try going to AB settings and disabling ULPS; right next to it there should be a drop-down where you can select "without PowerPlay support". See if that helps


I fixed it right after I posted.

All I had to do was check Enable Overdrive in the AMD control panel and it's working now.

Edit: Actually, it seems as if Afterburner isn't saving the increased power limit. I have to set it in CCC for the clock to stabilize.


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I put zero stock in most of the uneducated Newegg reviews, but the restart thing on the Gigabyte board may be something to look out for.
> 
> Seems like Giga is very power-conscious, locked voltage, etc...
> 
> Maybe they went too low on vcore


Yeah I take them with a grain of salt as well.


----------



## Bigm

http://www.3dmark.com/3dm/7598825


----------



## Motley01

I got mine delivered today! Fricken badass man, really like it.

Besides having problems getting the old AMD drivers and CCC uninstalled; whoops, should have done that first before installing this new card (my old one was the 290x).

Finally got that all figured out. Geeeez. Anyways, got the new drivers installed (15.15) and it's running smooth as butter.

Ran Unigine Valley, got some good scores.

My MSI motherboard, case, etc. has a black and yellow scheme, so the red accents on this one don't match. I'm gonna pull the fans and paint the red accents yellow; should match once it's done.

I'll post up some more pics after I paint it.

But here's the benchies....


----------



## whiteturbo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Matter of fact..... How is everyone's cards doing?
> 
> I take it, no news, it's good news!
> 
> So far no black screen reports, and no artifact reports, unless wrong driver was loaded.
> 
> I want to see some more aggressive overclocks and benchmarks gentlemen


You only have to ask!!



I don't think I won the silicon lottery; the above is the absolute limit. In fact I spotted a tiny bit of sparkly blue artifacting that shouldn't have been there, but I saved the run anyway.


----------



## jon666

Will post GPU-Z later; been playing with this since I got it today. On par with my crossfired 7870s, with a couple of exceptions: slightly higher benchmark scores, and everything seems smoother framerate-wise. Once I put my universal waterblock on this card, overclocking will continue well beyond the current BSOD/black-screen point. Got Hynix memory on this Sapphire card this round. Middle of summer, so temps will be an issue unless I'm up at 4 in the morning on a non-work day.

http://www.3dmark.com/3dm/7599974?


----------



## Particle

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Matter of fact..... How is everyone's cards doing?
> 
> I take it, no news, it's good news!
> 
> So far no black screen reports, and no artifact reports, unless wrong driver was loaded.
> 
> I want to see some more aggressive overclocks and benchmarks gentlemen


I've had some issues, but they're so far fairly minor:
- The adapter's DP ports seem to be picky with respect to which active DP->DVI adapters will work.
- Displays can randomly start to artifact when using DP. DVI seems immune to this. Could be related to the problem above.
- The display driver sometimes crashes when UVD initializes at the start of playback in a HW-accelerated media player.

I haven't had any issues with 3D/gaming itself though.
Quote:


> Originally Posted by *Bigm*
> 
> Total noob when it comes to overclocking AMD. Can someone explain to me why even though my card is set in Afterburner to 1150 core when I run Firestrike the core jumps all over the place? ie 775mhz to 1150mhz?


There seems to be a three-way curve among GPU load (demand), GPU temperature, and estimated GPU power with the lesser of these limits determining the clock the GPU runs at. When the load is low such as at your desktop, you'll probably see the lowest clocks (300 MHz core / 150 MHz mem). In situations where the GPU load is non-zero but fairly light such as game menus where the engine doesn't run at full tilt or easily hits an engine framerate cap such as the Source engine's 300 fps limit, you might see the GPU clock fluctuate between 300 MHz and some point up to the maximum clock. I see 500s through 700s a lot in game menus and loading screens. Only when there is sufficient demand that the card can't keep up at lower frequencies will it attempt to run at its maximum frequency. GPU die temperature and GPU estimated power will also serve as a dampener to that target if either one of them exceeds the programmed limits.
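That three-way arbitration can be pictured with a toy sketch. The limiter structure follows the description above, but the function, numbers, and throttle steps here are all made up for illustration; this is not AMD's actual PowerTune algorithm:

```python
# Toy model of a three-way clock limiter: demand, temperature, and
# estimated power each propose a cap, and the GPU runs at the lowest.
# All names and numbers here are hypothetical, not AMD's real firmware.

def target_clock(demand, temp_c, est_power_w,
                 max_clock=1040, min_clock=300,
                 temp_limit=94, power_limit=275):
    """Return the clock (MHz) the lowest of the three limiters allows."""
    # Demand limiter: light loads (menus, fps caps) only ask for part of max.
    demand_cap = min_clock + demand * (max_clock - min_clock)
    # Temperature limiter: step the clock down past the die temp limit.
    temp_cap = max_clock if temp_c < temp_limit else max_clock - 200
    # Power limiter: step down when estimated power exceeds the power cap.
    power_cap = max_clock if est_power_w < power_limit else max_clock - 100
    return min(demand_cap, temp_cap, power_cap)

print(target_clock(0.5, 70, 180))  # light load (menus): mid clocks
print(target_clock(1.0, 70, 180))  # heavy load, cool card: full clock
print(target_clock(1.0, 95, 180))  # past the temp limit: throttled
```

With demand around half, this lands in the 600s, which matches the mid clocks people see in menus and loading screens.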


----------



## Agent Smith1984

Quote:


> Originally Posted by *Motley01*
> 
> I got mine delivered today! Fricken badass man, really like it.
> 
> Besides having problems getting the old AMD drivers and CCC uninstalled; whoops, should have done that first before installing this new card (my old one was the 290x).
> 
> Finally got that all figured out. Geeeez. Anyways, got the new drivers installed (15.15) and it's running smooth as butter.
> 
> Ran Unigine Valley, got some good scores.
> 
> My MSI motherboard, case, etc. has a black and yellow scheme, so the red accents on this one don't match. I'm gonna pull the fans and paint the red accents yellow; should match once it's done.
> 
> I'll post up some more pics after I paint it.
> 
> But here's the benchies....


Try a custom fan profile that ramps to 100% at 80C...
Also, add some aux voltage and you'll get crazy high memory clocks! I'm at 1750 with +50.

Thanks for the feedback guys! I'll be adding to the OP on Monday
Quote:


> Originally Posted by *whiteturbo*
> 
> You only have to ask!!
> 
> 
> 
> I don't think I won the silicon lottery; the above is the absolute limit. In fact I spotted a tiny bit of sparkly blue artifacting that shouldn't have been there, but I saved the run anyway.


You may want to update MSI AB just to make sure you have full support for the card.

Again, jacking the fan up at load, and adding a tad bit of aux voltage, will give you some memory headroom.

Anyone tried 200mv on trixx?
She gets too hot for me


----------



## jon666

http://www.techpowerup.com/gpuz/details.php?id=bbrr8


----------



## Agent Smith1984

About to post a badass fs run at 1190/1750... Should be over 14k graphics score!

UPDATE:
http://www.3dmark.com/3dm/7609217

Dat graphics score tho!!


----------



## jon666

What are your settings for voltage? I tried to go that high but weird things would pop up when trying to bench.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jon666*
> 
> What are your settings for voltage? I tried to go that high but weird things would pop up when trying to bench.


Make sure you have 100mv+, 50mv+ aux voltage, a high speed fan profile, and a 50% power tune.

That was a 100% artifact free run. I did have fan at 100% and my case door off though.

I have got to get some better case fans and get my air flowing.

The card never broke 76c at 100mv with my door off....

I am going to try some crazy stuff, like 1220 200mv next


----------



## Agent Smith1984

Soooo.....

I tried VSR for the first time..

Running at 1440p on a 1080p TV, it really looks great. I even upped my Crysis 3 settings to max with 1440p 2x AA and am maintaining a consistent 55+ FPS. Very nice, because this TV is 60 Hz, so there is not much tearing.

I am picking up a 4K 120 Hz TV next week and getting another 390 on 7/27 for my upcoming birthday.

I can deal with the 30FPS cap on 4k until I get an adapter.

I can't wait for CF and 4k...

The image quality even on 1440 is amazing.
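For anyone wondering why VSR costs frames: the card really does shade every pixel of the virtual resolution before downscaling, so the cost grows roughly with pixel count. Quick arithmetic (nothing card-specific here, just pixel counts):

```python
# Rough pixel-count comparison for VSR: the GPU shades every pixel of the
# virtual resolution before downscaling, so cost grows ~linearly with pixels.
native = 1920 * 1080    # 1080p panel
vsr_1440 = 2560 * 1440  # VSR 1440p
vsr_1800 = 3200 * 1800  # max VSR option on these cards
uhd = 3840 * 2160       # true 4K

print(f"1440p VSR shades {vsr_1440 / native:.2f}x the pixels of native 1080p")
print(f"3200x1800 VSR shades {vsr_1800 / native:.2f}x")
print(f"true 4K shades {uhd / native:.2f}x")
```

So 1440p VSR is a bit less than double the shading work of native 1080p, which lines up with still holding 55+ FPS in Crysis 3.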


----------



## DividebyZERO

Quote:


> Originally Posted by *Agent Smith1984*
> 
> About to post a badass fs run at 1190/1750... Should be over 14k graphics score!
> 
> UPDATE:
> http://www.3dmark.com/3dm/7609217
> 
> Dat graphics score tho!!


Nice


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Make sure you have 100mv+, 50mv+ aux voltage, a high speed fan profile, and a 50% power tune.
> 
> That was a 100% artifact free run. I did have fan at 100% and my case door off though.
> 
> I have got to get some better case fans and get my air flowing.
> 
> The card never broke 76c at 100mv with my door off....
> 
> I am going to try some crazy stuff, like 1220 200mv next


Do you find the aux voltage helps with memory overclocks? I never experimented much tbh with it.


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> About to post a badass fs run at 1190/1750... Should be over 14k graphics score!
> 
> UPDATE:
> http://www.3dmark.com/3dm/7609217
> 
> Dat graphics score tho!!


Nice score...

Just to compare your 390 and my 290x...


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Do you find the aux voltage helps with memory overclocks? I never experimented much tbh with it.


Yes, aux voltage directly impacts memory clocks on this card!
Quote:


> Originally Posted by *By-Tor*
> 
> Nice score...
> 
> Just to compare your 390 and my 290x...


Wow, that graphics difference is crazy. Guess the combination of higher memory clocks and this 15.15 driver is really putting the power to the ground.

I also hit 1600 in Heaven earlier with max 1080p settings. Really happy with this 390 so far.

Love seeing the benches fellas! Keep clocking!


----------



## By-Tor

Glad to see they did make an improvement over the 290 series. But in 1080p game play I'm sure I wouldn't see any difference and I still plan on adding a second card whenever I find one...


----------



## Motley01

OK I finished painting my MSI 390x. Now it matches my Yellow & Black scheme. I removed the fan shroud and painted all the red pieces yellow.


----------



## Ramzinho

Quote:


> Originally Posted by *Motley01*
> 
> OK I finished painting my MSI 390x. Now it matches my Yellow & Black scheme. I removed the fan shroud and painted all the red pieces yellow.
> 
> 
> Spoiler: Warning: Spoiler!


I've to say this is looking great man.. very well done on the painting job..


----------



## Agent Smith1984

Quote:


> Originally Posted by *Motley01*
> 
> OK I finished painting my MSI 390x. Now it matches my Yellow & Black scheme. I removed the fan shroud and painted all the red pieces yellow.


Just awesome! + rep


----------



## Agent Smith1984

Funny thing is, I'm about to do some painting of my own.... Project Wifey is a pink case, and guts will be lime green and black.


----------



## Duke976

Tried running some Fire Strike and another Unigine pass.

Fire Strike:



With Fire Strike my clocks were 1150/1680.

Heaven:



While in Heaven my clocks were 1200/1600.


----------



## Motley01

Quote:


> Originally Posted by *Ramzinho*
> 
> I've to say this is looking great man.. very well done on the painting job..


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just awesome! + rep


Thanks guys I appreciate the great feedback.

Painting the fan shrouds was easy. Now when I painted my case, holy crap that was a ton of work. So much tedious masking. But I'm glad I did it, looks so much better now.


----------



## eurostyle360

So what are the thoughts on the MSI 390x vs. the gigabyte windforce g1 390x? The MSI heatsink looks a bit thicker; does it cool better as well? I think i recall reading a few pages back that these gigabyte cards are voltage locked. Is that really the case? This doesn't matter so much for me now because I don't intend on overclocking until the future when I need to for some game. Will the voltage ever become unlocked or controllable via afterburner in the future?

Anyway the reason I ask this is because IMHO the gigabyte card looks much better to me aesthetically speaking and will match my system better.


----------



## specopsFI

Quote:


> Originally Posted by *By-Tor*
> 
> Nice score...
> 
> Just to compare your 390 and my 290x...


That is actually quite a bad result to use as reference. Here's my run with 1150/1500MHz on my 290:

http://www.3dmark.com/fs/5176679

That's with 15.5 betas. The 15.15's from what I've gathered would give perhaps 300-400 points more in graphics score.

IMHO, it has been proven beyond any reasonable doubt that the only actual difference between 290(X) and 390(X) is the memory chips. For the 8GB 290X's, not even that. AMD really needs to stop playing coy with their drivers.

In general: I totally respect that this is the 390/390X owners club and am in no way saying you guys shouldn't be happy with your cards. It's just that if there keeps on coming up claims or innuendos of hardware level superiority over 290/290X, you make your own thread vulnerable to unnecessary bickering. For the record, I consider this post by The Stilt (who you should know by reputation if you're into AMD GPUs) on another forum to be the most thorough, non-biased, factually correct statement yet on the whole Hawaii/Grenada matter:

http://forums.guru3d.com/showpost.php?p=5109102&postcount=106

That is all I'm going to say on your thread (that is, unless I end up owning a 390/390X at some point, which wouldn't surprise me). I use this opportunity to subscribe, because I am actually really interested in your OCs and driver experiences. Keep it going, guys!


----------



## Cannon19932006

Quote:


> Originally Posted by *eurostyle360*
> 
> So what are the thoughts on the MSI 390x vs. the gigabyte windforce g1 390x? The MSI heatsink looks a bit thicker; does it cool better as well? I think i recall reading a few pages back that these gigabyte cards are voltage locked. Is that really the case? This doesn't matter so much for me now because I don't intend on overclocking until the future when I need to for some game. Will the voltage ever become unlocked or controllable via afterburner in the future?
> 
> Anyway the reason I ask this is because IMHO the gigabyte card looks much better to me aesthetically speaking and will match my system better.


The gigabyte card is completely voltage locked, and as of right now there are no programs to alter the bios of the card.

Which is why I returned my Gigabyte G1 390X and just placed an order for an MSI 390x.


----------



## MojoW

Guys what is the max VSR resolution on the 390/x?
Or is it still 3200x1800 as the 290/x?


----------



## Klocek001

Quote:


> Originally Posted by *Motley01*


aw yiss


----------



## Agent Smith1984

Quote:


> Originally Posted by *MojoW*
> 
> Guys what is the max VSR resolution on the 390/x?
> Or is it still 3200x1800 as the 290/x?


Still 3200 x 1800 according to my display setting options; however, when I try 3200 x 1800 I get a garbled mess of a screen.

The 1440 option works perfectly though.
Quote:


> Originally Posted by *specopsFI*
> 
> That is actually quite a bad result to use as reference. Here's my run with 1150/1500MHz on my 290:
> 
> http://www.3dmark.com/fs/5176679
> 
> That's with 15.5 betas. The 15.15's from what I've gathered would give perhaps 300-400 points more in graphics score.
> 
> IMHO, it has been proven beyond any reasonable doubt that the only actual difference between 290(X) and 390(X) is the memory chips. For the 8GB 290X's, not even that. AMD really needs to stop playing coy with their drivers.
> 
> In general: I totally respect that this is the 390/390X owners club and am in no way saying you guys shouldn't be happy with your cards. It's just that if there keeps on coming up claims or innuendos of hardware level superiority over 290/290X, you make your own thread vulnerable to unnecessary bickering. For the record, I consider this post by The Stilt (who you should know by reputation if you're into AMD GPUs) on another forum to be the most thorough, non-biased, factually correct statement yet on the whole Hawaii/Grenada matter:
> 
> http://forums.guru3d.com/showpost.php?p=5109102&postcount=106
> 
> That is all I'm going to say on your thread (that is, unless I end up owning a 390/390X at some point, which wouldn't surprise me). I use this opportunity to subscribe, because I am actually really interested in your OCs and driver experiences. Keep it going, guys!


Thanks for the input and subscribing to the thread.

I have tried to share with everyone that the only differences are those stated by stilt in his post.

It is important to know that memory chips themselves are also different. They are all Hynix IC's rated at 1500MHz, and with a touch of voltage will do well over 1600MHz. Though not totally unseen on the 290 series of cards, it is very uncommon to see cards reaching that kind of memory clock.

As far as the driver is concerned.... One thing I've maintained throughout, is that part of the entire performance boost associated with the 300 series lay in the driver itself.
I am very curious as to whether or not AMD will ever implement the tesselation performance iimprovements into all of their driver releases ot not.

If they do not, it is somewhat understandable in helping sustain the sales of the newer generation, but I can totally understand 290 owners being upset with a gimped driver....
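For the curious, the reason those memory clocks matter so much on these cards is the wide bus. Standard GDDR5 arithmetic, assuming the 512-bit bus these cards use, shows the jump:

```python
# Peak GDDR5 bandwidth = memory clock (MHz) x 4 (quad data rate) x bus width / 8.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_bits=512):
    """Theoretical bandwidth in GB/s, using the memory clock GPU-Z reports."""
    return mem_clock_mhz * 1e6 * 4 * bus_bits / 8 / 1e9

print(gddr5_bandwidth_gbs(1250))  # stock 290/290X: 320.0 GB/s
print(gddr5_bandwidth_gbs(1500))  # stock 390/390X: 384.0 GB/s
print(gddr5_bandwidth_gbs(1700))  # a strong OC:    435.2 GB/s
```

A 1700 MHz overclock is roughly a third more theoretical bandwidth than a stock 290, which is consistent with the graphics-score gains people are posting.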


----------



## ManofGod1000

Well, I now have a 4K display and I am very pleased with the performance of my single R9 290 reference edition on it. However, I am now tempted to buy 2 x XFX R9 390 just for the heck of it. I realize $719 plus tax is a lot of money, but I cannot go with crossfire reference R9 290 cards since they run too hot for that. (I tried it and it did not work; the case itself became very hot.) What do you guys think? I do not want a 980 Ti, so skip that, even though it is a nice card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ManofGod1000*
> 
> Well, I now have a 4k display and I am very pleased with the performance of my single R9 290 reference edition on it. However, I am now tempted to buy 2 x XFX R9 390 just for the heck of it. I realize $719 plus tax is a lot of money but I cannot go with crossfire reference R9 290 cards since they are too hot for that. (I tried it and it did not work, the case itself became very hot.) What do you guys think? I do not want an 980Ti so skip that even though it is a nice card.


I too am planning crossfire 390s for a 4K TV. Just trying to figure out a workaround for the 30 FPS limitation of HDMI 1.4.
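For anyone wondering where that 30 FPS cap comes from: it's a link-bandwidth limit. HDMI 1.4 carries roughly 10.2 Gbit/s TMDS, which is about 8.16 Gbit/s of actual video data after 8b/10b encoding. A back-of-envelope check (this ignores blanking intervals, so real requirements are a bit higher):

```python
# Back-of-envelope check of why HDMI 1.4 caps 4K at 30 Hz.
# Ignores blanking intervals, so real requirements are a bit higher.
HDMI_14_DATA_GBPS = 8.16  # ~10.2 Gbit/s TMDS minus 8b/10b encoding overhead

def video_gbps(width, height, refresh_hz, bpp=24):
    """Raw video payload in Gbit/s for a given display mode."""
    return width * height * refresh_hz * bpp / 1e9

print(video_gbps(3840, 2160, 60))  # ~11.9 Gbit/s: does not fit in HDMI 1.4
print(video_gbps(3840, 2160, 30))  # ~6.0 Gbit/s: fits
print(video_gbps(2560, 1440, 60))  # ~5.3 Gbit/s: why 1440p60 is fine
```

DisplayPort 1.2 (about 17.28 Gbit/s of data) or HDMI 2.0 clears the 4K60 bar, hence the interest in an active DP adapter.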


----------



## eurostyle360

Quote:


> Originally Posted by *Cannon19932006*
> 
> The gigabyte card is completely voltage locked, and as of right now there are no programs to alter the bios of the card.
> 
> Which is why I returned my Gigabyte G1 390X and just placed an order for an MSI 390x.


Ah that's a damn shame. I think the Gigabyte card is beautiful compared to the MSI card. The fact that it's voltage locked and comes with only a 10 Mhz bump over stock sounds dubious, though.


----------



## BlaXey

Hello, what's the best choice for an overclocker, the XFX 390 or the MSI? I ruled out the Sapphire because it doesn't fit in my Zalman Z11



and cutting out that section is a problem, because it would mean dismantling all the components... and I don't like that. So I'm thinking of getting an MSI or an XFX; I like both, but if the MSI has higher-quality components, there's no doubt.


----------



## DizzlePro

Asus 390(x) STRIX with the DirectCU III cooler









too bad they aren't available anywhere atm
https://www.asus.com/uk/Graphics_Cards/STRIXR9390DC3OC8GD5GAMING/


----------



## eurostyle360

Quote:


> Originally Posted by *DizzlePro*
> 
> Asus 390(x) STRIX with the DirectCU III cooler
> 
> 
> 
> 
> 
> 
> 
> 
> too bad they aren't available anywhere atm
> https://www.asus.com/uk/Graphics_Cards/STRIXR9390DC3OC8GD5GAMING/


Wish I knew what this guy was saying


----------



## Agent Smith1984

I bought my msi because it was the cheapest card with the most features.

$329 and you get a great looking cooler, and a back plate.
Plus the led and msi gaming app is nice.

It is a shame the giga parts are voltage locked.

The strixx looks nice but it's still using a 6+8 pin power configuration so i don't expect much better clocking unless the chips are higher binned.


----------



## RED_LINE

Coming from a Sapphire Radeon 7870 XT w/BoOst to this


----------



## Agent Smith1984

Quote:


> Originally Posted by *RED_LINE*
> 
> Coming from a Sapphire Radeon 7870 XT w/BoOst to this


Nice upgrade and welcome!

I'll have new members in Monday morning when i get back my work station.


----------



## RED_LINE

Thanks man! Good to be a part of the club


----------



## Ha-Nocri

Quote:


> Originally Posted by *RED_LINE*
> 
> Thanks man! good to be a part of club


How is the cooler? Do the fans stop spinning @ idle and low temps?


----------



## Cannon19932006

Quote:


> Originally Posted by *eurostyle360*
> 
> Ah that's a damn shame. I think the Gigabyte card is beautiful compared to the MSI card. The fact that it's voltage locked and comes with only a 10 Mhz bump over stock sounds dubious, though.


Yeah, and mine didn't overclock worth a crap; 1070 MHz was as high as I could get it stable, and any more than that brought artifacts. The memory overclocked very well though, all the way up to 1700 MHz.


----------



## RED_LINE

Quote:


> Originally Posted by *Ha-Nocri*
> 
> How is the cooler? Do the fans stop spinning @ idle and low temps?


The cooler is impressive; it's larger than any of the other third-party options out there, and it seems to perform the best because of the heatsink design and three fans compared to the dual fans on other cards. Only downside I guess may be the noise @ 100%?

The temps stay in the 70s under load when playing Shadow of Mordor with my overclock.


----------



## Bigm

http://www.3dmark.com/3dm/7625250

My best run so far.


----------



## Motley01

Been playing with my overclock. I agree, the memory is impressive on these!

I'm at 1176/1684 now, running solid as a rock. Temps are great. 73 during full load.


----------



## Bigm

Quote:


> Originally Posted by *Motley01*
> 
> Been playing with my overclock. I agree, the memory is impressive on these!
> 
> I'm at 1176/1684 now, running solid as a rock. Temps are great. 73 during full load.


Nice core clock. I seem to top out at 1150.

Upped my CPU clock a bit and hit 11,845

http://www.3dmark.com/fs/5305689.

At this point, the card is outdoing my 970 I had before it.


----------



## Motley01

Quote:


> Originally Posted by *Bigm*
> 
> Nice core clock. I seem to top out at 1150.
> 
> Upped my CPU clock a bit and hit 11,845
> 
> http://www.3dmark.com/fs/5305689.
> 
> At this point, the card is outdoing my 970 I had before it.


That's pretty solid. Good results. And you have 4.5 GB more VRAM now too. LOL

Nice OC on the 4790.


----------



## Bigm

Quote:


> Originally Posted by *Motley01*
> 
> Thats pretty solid. Good results. And you have 4.5GB more VRAM now too. LOL
> 
> Nice OC on the 4790.


Ha yeah, the vram is the main reason I switched since I'm at 1440p and hoping to pick up a 2nd 390 and maybe a 4k screen within the next 2-3 months.

Not to mention I got it practically free between the money selling my 970 and the Best Buy credit I had.


----------



## Motley01

Quote:


> Originally Posted by *Bigm*
> 
> Ha yeah, the vram is the main reason I switched since I'm at 1440p and hoping to pick up a 2nd 390 and maybe a 4k screen within the next 2-3 months.
> 
> Not to mention I got it practically free between the money selling my 970 and the Best Buy credit I had.


Sweet! Ya I just sold my 290x for $260. So I only paid $180 for this.

What 4k monitor are you looking at? Make sure you get minimum 32", because of the GUI scaling in windows.


----------



## Bigm

Quote:


> Originally Posted by *Motley01*
> 
> Sweet! Ya I just sold my 290x for $260. So I only paid $180 for this.
> 
> What 4k monitor are you looking at? Make sure you get minimum 32", because of the GUI scaling in windows.


Haven't even started looking but I will keep that in mind, thanks!


----------



## Motley01

Quote:


> Originally Posted by *Bigm*
> 
> Haven't even started looking but I will keep that in mind, thanks!


Get this 32" Acer, been hearing some really good things about it over on [H]. Only $799 if you buy it direct from Acer.

http://us.acer.com/ac/en/US/content/professional-model/UM.JB6AA.002


----------



## ManofGod1000

Quote:


> Originally Posted by *Motley01*
> 
> Sweet! Ya I just sold my 290x for $260. So I only paid $180 for this.
> 
> What 4k monitor are you looking at? Make sure you get minimum 32", because of the GUI scaling in windows.


I do not agree with the minimum size of 32 inches for a 4K monitor. However, that is because I am running Windows 10 and the scaling is automatic and works great. I picked up the Samsung U28D590D for $494 plus tax which is a 10 bit TN panel and it works great. Also, if you need a VESA mount for it, there are ones that connect over the spot where the regular mount normally goes. However, either way, enjoy.


----------



## hawker-gb

Guys, I have one question.

I can buy a used Sapphire R9 290 (reference cooler). I think the card was used for mining; it's still under warranty (6 more months).

Or I can pay 100 dollars more for a new Sapphire R9 390 8GB NITRO.

What do you guys think?


----------



## Ha-Nocri

Quote:


> Originally Posted by *hawker-gb*
> 
> Guys,i have one question.
> 
> I can buy a used Sapphire R9 290 (reference cooler). I think the card was used for mining; it's still under warranty (6 more months).
> 
> Or I can pay 100 dollars more for a new Sapphire R9 390 8GB NITRO.
> 
> What do you guys think?


Can't you find an used non-reference 290? It would be the best choice.


----------



## hawker-gb

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Can't you find an used non-reference 290? It would be the best choice.


Not for the money I can get this R9 290 for, no; over here they're overpriced.

So you think a used non-reference 290 (if I can find one) is a better choice than a new 390 (circa 100 dollars more)?

In that case I will wait for an opportunity.

P.S. These cards are overpriced here. For example, the full price of this 390 NITRO is 450 dollars; I can get it for 340 with a discount I have.

That used R9 290 is 240 dollars.


----------



## Ha-Nocri

Quote:


> Originally Posted by *hawker-gb*
> 
> For that money which i can buy this R9 290 no.
> Over here it overpriced.
> 
> So,you think non referent used 290(if i can find one) is better choice then new 390 (cca 100 dollars more)?
> 
> In that case i will wait for opportunity.
> 
> P.S. These cards are overpriced here. For example,full price of this 390 NITRO is 450 dollars here. I can get it for 340 dollars.(discount i have)
> 
> That used R9 290 is 240 dollars.


That reference cooler is really bad: loud, and it cools poorly. Idle noise doesn't seem to be too bad, though. If you game with headphones it might not be a bad choice.

But the price of that 390 is great for Europe, just realized that.


----------



## simo292a

Hey, I want to join the club.







I'm coming from a GTX 770 and I can see a huge performance boost in a game like The Witcher 3. What would be a stable overclock I could try to start off with?


----------



## eurostyle360

Will a Corsair HX750 run two 390x's in crossfire?


----------



## eurostyle360

Quote:


> Originally Posted by *ManofGod1000*
> 
> I do not agree with the minimum size of 32 inches for a 4K monitor. However, that is because I am running Windows 10 and the scaling is automatic and works great. I picked up the Samsung U28D590D for $494 plus tax which is a 10 bit TN panel and it works great. Also, if you need a VESA mount for it, there are ones that connect over the spot where the regular mount normally goes. However, either way, enjoy.


I agree with this. I have the same monitor and windows 8 and 10 handle scaling nicely (still not as well as a Mac). But 4k has so many pixels you might as well stretch them over a larger screen if you can afford it.


----------



## BackwoodsNC

Quote:


> Originally Posted by *eurostyle360*
> 
> Will a Corsair HX750 run two 390x's in crossfire?


According to this, no: Link


----------



## Duke976

Quote:


> Originally Posted by *simo292a*
> 
> Hey i want to join the club
> 
> 
> 
> 
> 
> 
> 
> im coming from a GTX 770 and i can see a huge performance boost in a game like the witcher 3. *What would be a stable overclock i could try to start off?*


Welcome to the club. Try starting with 1100 MHz core and 1650 MHz memory, with +50% PowerTune and +50 mV core voltage, all using Afterburner. Hope that helps.


----------



## simo292a

Quote:


> Originally Posted by *Duke976*
> 
> Welcome to the club, try to start with 1100 core and 1650 mem using 50% power tune and 50% core voltage mv all by using afterburner. Hope that helps.


OK, it seems to be stable at those settings, but for some reason my 3DMark score seems pretty low compared to others in this thread. I am running Win10.


----------



## CrossoXoHair

Hey guys, I'm not seeing many answers to my query elsewhere, so I thought I'd try here.

http://www.gigabyte.us/products/product-page.aspx?pid=5500#ov ($499AUD)

http://www.msi.com/product/vga/R9-390-GAMING-8G.html#hero-overview ($549AUD)

http://www.sapphiretech.com/productdetial.asp?pid=C436E37C-8A09-48B6-9F2B-F4AF86E377B6&lang=eng ($559AUD)

Which one should I get? I don't want to spend the extra $50-$60 "just 'cause".

1. I have never overclocked before, may not ever, and have no idea how to.
2. Does the extra little bit of clock on the MSI make any difference?
3. Is Gigabyte a good brand? I know the voltage is locked, but I hear that the safe form of overclocking is still available?
4. I was all ready to get the Gigabyte till I read about the voltage lock; I don't like that it sounds nerfed.

Thanks all, I await many replies


----------



## simo292a

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Hey guys not seeing many answers to my Query Elsewhere so Thought I'd try here
> 
> http://www.gigabyte.us/products/product-page.aspx?pid=5500#ov ($499AUD)
> 
> http://www.msi.com/product/vga/R9-390-GAMING-8G.html#hero-overview ($549AUD)
> 
> http://www.sapphiretech.com/productdetial.asp?pid=C436E37C-8A09-48B6-9F2B-F4AF86E377B6&lang=eng ($559AUD)
> 
> Which one should I get, Don't want to spend the extra $50-$60 "Just Cause"
> 
> 1. I have never overclocked before, and may not and have no idea how too
> 2. Does the extra little bit of clock via MSI make any difference?
> 3. Is Gigabyte a good brand? I know the voltage is locked, but I hear that the safe form of overclocking is still available?
> 4. I was all ready to get the Gigabyte till I read about the voltage lock, I don't like that it sounds nerfed.
> 
> Thanks all, I await many replies


Well, if you want to overclock your card at some point, the MSI and Sapphire are both good cards with good cooling. I myself have the MSI 390X; it has some overclocking headroom, but not much on the core, more on the memory side. Sapphire has always made good AMD cards, so I'm sure that would be a good option as well.


----------



## JourneymanMike

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Hey guys not seeing many answers to my Query Elsewhere so Thought I'd try here
> 
> http://www.gigabyte.us/products/product-page.aspx?pid=5500#ov ($499AUD)
> 
> http://www.msi.com/product/vga/R9-390-GAMING-8G.html#hero-overview ($549AUD)
> 
> http://www.sapphiretech.com/productdetial.asp?pid=C436E37C-8A09-48B6-9F2B-F4AF86E377B6&lang=eng ($559AUD)
> 
> Which one should I get, Don't want to spend the extra $50-$60 "Just Cause"
> 
> 1. I have never overclocked before, and may not and have no idea how too
> 2. Does the extra little bit of clock via MSI make any difference?
> 3. Is Gigabyte a good brand? I know the voltage is locked, but I hear that the safe form of overclocking is still available?
> 4. I was all ready to get the Gigabyte till I read about the voltage lock, I don't like that it sounds nerfed.
> 
> Thanks all, I await many replies


First of all, Welcome to OCN!

I think that the Sapphire would be a good choice...


----------



## Zanpakuto

Waiting for the Nitro 390 to be restocked. http://www.newegg.com/Product/Product.aspx?Item=N82E16814202148 @ ~$339.99; if only Amazon would lower their price.


----------



## DividebyZERO

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Hey guys not seeing many answers to my Query Elsewhere so Thought I'd try here
> 
> http://www.gigabyte.us/products/product-page.aspx?pid=5500#ov ($499AUD)
> 
> http://www.msi.com/product/vga/R9-390-GAMING-8G.html#hero-overview ($549AUD)
> 
> http://www.sapphiretech.com/productdetial.asp?pid=C436E37C-8A09-48B6-9F2B-F4AF86E377B6&lang=eng ($559AUD)
> 
> Which one should I get, Don't want to spend the extra $50-$60 "Just Cause"
> 
> 1. I have never overclocked before, and may not and have no idea how too
> 2. Does the extra little bit of clock via MSI make any difference?
> 3. Is Gigabyte a good brand? I know the voltage is locked, but I hear that the safe form of overclocking is still available?
> 4. I was all ready to get the Gigabyte till I read about the voltage lock, I don't like that it sounds nerfed.
> 
> Thanks all, I await many replies


The MSI will give the best performance at stock settings (OC mode); if you don't care about overclocking, it's your fastest card. If money is your concern and you definitely don't plan to OC, go with the Gigabyte.


----------



## Amhro

Quote:


> Originally Posted by *hawker-gb*
> 
> For that money which i can buy this R9 290 no.
> Over here it overpriced.
> 
> So,you think non referent used 290(if i can find one) is better choice then new 390 (cca 100 dollars more)?
> 
> In that case i will wait for opportunity.
> 
> P.S. These cards are overpriced here. For example,full price of this 390 NITRO is 450 dollars here. I can get it for 340 dollars.(discount i have)
> 
> That used R9 290 is 240 dollars.


I'd go with the new 390; $340 is damn cheap here in Europe. I bought mine for 370€.


----------



## Gumbi

Quote:


> Originally Posted by *Amhro*
> 
> I'd go with new 390, 340$ is so damn cheap here in europe, I bought mine for 370€.


If you have a discount on 390 cards, go for it!


----------



## Agent Smith1984

NEW MEMBERS ADDED!

If I've missed anyone just let me know.

Hope everyone's weekend went well.

I'm going to be working on some stuff for the OP today.

Mainly some OC information and tips, and whatever else I can come up with.


----------



## CrossoXoHair

Quote:


> Originally Posted by *simo292a*
> 
> Well if you want to overclock your card at some point the msi and sapphire are booth good cards with good cooling. I myself have the msi 390x it got some overclocking in it but not much on the core more on the memory side. sapphire has always made good amd cards so im sure that would be a good option aswell.


Quote:


> Originally Posted by *JourneymanMike*
> 
> First of all, Welcome to OCN!
> 
> I think that the Sapphire would be a good choice...


Quote:


> Originally Posted by *DividebyZERO*
> 
> The MSI one will give the best performance using stock settings (oc mode). If you dont care about overclocking then its your fastest card. If money is your concern then Gigabyte and you def dont plan to OC.


So I rang the store getting the Gigabytes and they said they have no idea when stock will be in now (they originally said this week) and for me to ring back in 9 more days.

Rang another business that had told me they would be receiving MSI and Sapphire cards last week. They never received those, but informed me the Sapphire is available to order from their supplier interstate, will arrive on Friday, and is in fact $30 cheaper than I was originally quoted ($530 AUD).

SO NEW QUESTION: ANY reason I shouldn't grab the Sapphire 390?
http://www.sapphiretech.com/productdetial.asp?pid=C436E37C-8A09-48B6-9F2B-F4AF86E377B6&lang=eng
I have no idea when other models will be available. Seems Australia got fluff all stock.


----------



## Particle

Sapphire's card has performed fine for me, CrossoXoHair. The only problems I've had are related to displays being garbled when using DP-DVI adapters and a few UVD driver crashes, but I would suspect those are more related to the driver software than the hardware. Can't say for sure though.


----------



## Gumbi

Quote:


> Originally Posted by *CrossoXoHair*
> 
> So I rang the store getting the Gigabytes and they said they have no idea when stock will be in now (they originally said this week) and for me to ring back in 9 more days.
> 
> Rang another business that had told me they would be receiving MSI and Sapphire cards last week. They never received those, but informed me the Sapphire is available to order from their supplier interstate and will arrive on friday and is in fact $30 cheaper than I was originally quoted. (It's $530 AUD)
> 
> SO NEW QUESTION. ANY reason I Shouldn't grab the Sapphire 390?
> http://www.sapphiretech.com/productdetial.asp?pid=C436E37C-8A09-48B6-9F2B-F4AF86E377B6&lang=eng
> I have no idea when other models will be available. Seems Australia got fluff all stock.


It's a great card! Better cooling than the Gigabyte, and you can overvolt it







It'll do 1150/1600 easy, possibly 1200/1700 too!


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gumbi*
> 
> It's a great card! Better cooling than the Gigabyte, and you can over volt with them
> 
> 
> 
> 
> 
> 
> 
> It'll do 1150/1600 easy, possibly 1200/1700 too!


Is that card silent at idle (do the fans turn off)?


----------



## Gumbi

Quote:


> Originally Posted by *Ha-Nocri*
> 
> is that card noiseless @idle (fans turn off)?


Doesn't every AMD card go into ZeroCore power and shut their fans down after extended idle period?

Fans at 20% are pretty damn quiet and wouldn't be heard over the system in general anyway. There is also a possibility Sapphire implemented the IFC (Intelligent Fan Control) system they have on their Vapor-X cards, whereby under 60°C (I think) only one fan is active.


----------



## CrossoXoHair

Quote:


> Originally Posted by *Ha-Nocri*
> 
> is that card noiseless @idle (fans turn off)?


Yeah, it is.

"The SAPPHIRE NITRO R9 390 features the latest version of our award winning Tri-X cooler now with dual ball bearings in each of the three fans for higher reliability and enhanced Intelligent Fan Control (IFC-II) which turns off the fans for silent operation under light load"

So everyone, you're saying ignore the higher clock of the MSI and go for it?









Will my Corsair TX 650 W PSU be enough, AND is my case (Cooler Master RC-692) long enough?


----------



## Gumbi

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Yeh it is
> 
> "The SAPPHIRE NITRO R9 390 features the latest version of our award winning Tri-X cooler now with dual ball bearings in each of the three fans for higher reliability and enhanced Intelligent Fan Control (IFC-II) which turns off the fans for silent operation under light load"
> 
> So everyone your saying ignore the higher clock of the MSI n go for it?



They'll both overclock to the same level anyways







Go for it







Better cooler too


----------



## Zanpakuto

Waiting for the Nitro 390 to get back in stock.


----------



## Ha-Nocri

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Yeh it is
> 
> "The SAPPHIRE NITRO R9 390 features the latest version of our award winning Tri-X cooler now with dual ball bearings in each of the three fans for higher reliability and enhanced Intelligent Fan Control (IFC-II) which turns off the fans for silent operation under light load"
> 
> So everyone your saying ignore the higher clock of the MSI n go for it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Will my CORSAIR TX 650 WATT PSU by enough AND my case (COOLERMASTER RC-692) long enough ?


Damn, mine is not. I wonder if I could flash to the 390 Nitro's BIOS and have the fans turn off.









Dunno bout the case, but PSU is fine


----------



## Agent Smith1984

So far, the OC's on these cards' cores are about on par with some of the better 290 cards, but still not drastically better.
I can do 1180 on 100mv, 1200 on 150mv, and bench at 1215 with 200mv. That would obviously improve with reduced temps, but I don't think water on my GPU's is in the plans for me with this series.

The memory however seems to consistently clock 1600MHz and over, and I'd bet anyone stopping at 1600 just needs a touch of aux voltage to get past that (and on up into the 1700++ range).


----------



## Ha-Nocri

Well, I can bench 1230/1680 @+100mV


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Well, I can bench 1230/1680 @+100mV


That's a nice sample then.

You tried hacked 15.15 driver?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's a nice sample then.
> 
> You tried hacked 15.15 driver?


I did, but just for GTA5 and Witcher 3. Didn't see any difference. Using 15.20 atm


----------



## CrossoXoHair

Turns out my case is 4 mm too small for the Sapphire... (the case takes 304 mm max).

This is someone's 290X in the same case as mine (the card is 305 mm); it barely gets in there.

I'm pissed off.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Turns out my case is 4mm to small to use the Sapphire... (Case is 304 max)
> 
> This is sumones 290x in the same case as mine (305mm) it barely gets in there.
> 
> I'm pissed off


That was one thing I preferred about the MSI card: its length.

It was around 3/4" shorter than my previous 290 Tri-X cards.
I would imagine the Nitro and Tri-X coolers are similar in length.

I believe they are actually the same coolers with a different shroud and better fans, but I'm not certain.


----------



## Particle

That is one thing to be careful of, but if your case can fit it, a long cooler is certainly an advantage.


----------



## Bigm

Hit 1165 core and 1765 memory which seems to be my absolute max.

Is anyone benching lower with the Windows 10 15.20 driver than the Windows 8.1 driver from the AMD site? I seem to get about 400 points less on Firestrike with the same clocks.

http://www.3dmark.com/compare/fs/5305689/fs/5321246


----------



## Agent Smith1984

Well,

I've lollygagged long enough.

I am logging some FRAPS tonight on Crysis 3 using 1080P and 1440P (using VSR).

I will test the stock 1060/1525 OC clocks, my daily 1150/1700 clocks, and also probably try my benching clocks of 1180/1750.

The only thing that limits those benching clocks from being my daily clocks, is me having to remove my case door, which won't be an issue once I replace my exhaust fans I'm guessing....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Bigm*
> 
> Hit 1165 core and 1765 memory which seems to be my absolute max.
> 
> Is anyone benching lower with the Windows 10 15.20 driver than the Windows 8.1 driver from the AMD site? I seem to get about 400 points less on Firestrike with the same clocks.
> 
> http://www.3dmark.com/compare/fs/5305689/fs/5321246


Yes, sadly, the 15.20 driver is slower than the 15.15 driver, at least as far as benchmarks are concerned. Not sure about in game performance yet....


----------



## Bigm

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yes, sadly, the 15.20 driver is slower than the 15.15 driver, at least as far as benchmarks are concerned. Not sure about in game performance yet....


Darn, was really really hoping to hit that glorious 12k in Firestrike but I don't want to go back to Windows 8.1. Decisions, decisions.


----------



## BackwoodsNC

Anyway to move the power limit past +50% bios mod?


----------



## Agent Smith1984

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Anyway to move the power limit past +50% bios mod?


I don't think it will do any good.
I can't get the card to draw over 357 W of power with the 6+8 configuration it uses. I've tested all the way up to +200 mV.
Maybe an 8+8 would do better, but none of the 390s I've seen use it.
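For context on the 6+8 vs. 8+8 point: the nominal PCI-SIG limits are 75 W from the x16 slot, 75 W per 6-pin, and 150 W per 8-pin, so a 6+8 card has a 300 W spec ceiling while an 8+8 card gets 375 W (boards can and do exceed these in practice, as the 357 W reading shows). A rough sketch of the arithmetic (the function name is my own):

```python
# Nominal PCI-SIG power limits in watts, per the PCIe CEM spec.
SLOT_WATTS = 75                       # x16 slot delivery
AUX_WATTS = {"6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Spec power ceiling for a card: slot power plus its auxiliary connectors."""
    return SLOT_WATTS + sum(AUX_WATTS[c] for c in connectors)

print(board_power_budget(["6-pin", "8-pin"]))  # 300 -- the MSI 390's layout
print(board_power_budget(["8-pin", "8-pin"]))  # 375 -- e.g. the Sapphire's 8+8
```

Which is why an 8+8 board has roughly 75 W more spec headroom before it leans on out-of-spec draw.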


----------



## BackwoodsNC

according to this review it matters

Link


----------



## slayersic

Been reading the posts in here and I'm just wondering: will a SilverStone SFX 600 W PSU be enough to power a 390/390X?


----------



## RomulusVolk

Just ordered my MSI R9 390 from Newegg! Super keen. My system is new, so I haven't had any AMD drivers installed before, only NVIDIA. Do I turn off automatic driver updates before installing? Also, I don't have a CD drive, so where can I download the appropriate driver? Thanks!

Also, are people happy with their MSI R9 390 so far?

Cheers.


----------



## Agent Smith1984

Crysis 3 @ 1180/1700

These settings:



Skyline 16 player map.
Two-minute benchmarks run three times; I took the lowest of the three for the session.

Frames Time (ms) Min Max Avg
9204 120000 55 112 76.7

Then at the stock MSI "OC Mode" at 1060/1525 clocks

Frames Time (ms) Min Max Avg
8871 120000 52 97 73.925

Very little difference in minimum and average FPS, only the max is impacted by the OC.

I am going to test same settings and map tomorrow at 1440p.
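Those FRAPS averages are just total frames divided by elapsed seconds; a quick sanity check on the two runs above (function name is my own):

```python
def avg_fps(frames, time_ms):
    """Average FPS over a FRAPS benchmark window: total frames / elapsed seconds."""
    return frames / (time_ms / 1000.0)

print(avg_fps(9204, 120000))  # 76.7    (overclocked run)
print(avg_fps(8871, 120000))  # 73.925  (stock OC-mode run)
```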


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Crysis 3 @ 1180/1700
> 
> These settings:
> 
> 
> 
> Skyline 16 player map.
> 2 minute benchmarks ran three times, and took the lowest of the three for session.
> 
> Frames Time (ms) Min Max Avg
> 9204 120000 55 112 76.7
> 
> Then at the stock MSI "OC Mode" at 1060/1525 clocks
> 
> Frames Time (ms) Min Max Avg
> 8871 120000 52 97 73.925
> 
> Very little difference in minimum and average FPS, only the max is impacted by the OC.
> 
> I am going to test same settings and map tomorrow at 1440p.


What's the gpu usage look like through the benchmark? The results may be skewed slightly by the minimum frames being cpu bound.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Crysis 3 @ 1180/1700
> 
> These settings:
> 
> 
> 
> Skyline 16 player map.
> 2 minute benchmarks ran three times, and took the lowest of the three for session.
> 
> Frames Time (ms) Min Max Avg
> 9204 120000 55 112 76.7
> 
> Then at the stock MSI "OC Mode" at 1060/1525 clocks
> 
> Frames Time (ms) Min Max Avg
> 8871 120000 52 97 73.925
> 
> Very little difference in minimum and average FPS, only the max is impacted by the OC.
> 
> I am going to test same settings and map tomorrow at 1440p.


Crysis can be quite CPU-limited in certain cases.


----------



## CrossoXoHair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I don't think it will do any good.
> I can't get that card to draw over 357w of power with the 6+8 configuration it uses. I've tested all the way up to 200mv+.
> Maybe an 8+8 would do better, but none of the 390's I've seen use it.


The sapphire r9 390 does. (8 + 8)


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> What's the gpu usage look like through the benchmark? The results may be skewed slightly by the minimum frames being cpu bound.


The GPU stays pegged the entire time to 99% except during menus.

I ran 290's in crossfire with this CPU and also had 99% utilization on both GPU's.

I really wanted to see others results in the game at the same settings.....

Anyone have Crysis that they can log some FRAPS on at those settings in that map?

I thought 73+ FPS average was pretty good for maxing that game out with 4x AA......

My first 290 by itself only averaged around 59FPS stock (1000/1300), and 64 with an overclock (1150/1500).

10FPS is a pretty good boost considering it's the same architecture.

I went back and tested at 1150/1500 to see how it did compared to my 290, and it averaged 74.5FPS, so it's still a 10 FPS bump at the same clock speeds.
This driver really makes a huge improvement. Same story on Far Cry 4 too.....

Quote:


> Originally Posted by *CrossoXoHair*
> 
> The sapphire r9 390 does. (8 + 8)


Yeah, just looked that up. They seem to have adopted the 8+8 towards the end of the 290's run with the "NEW EDITION" they were selling on Newegg.
Those cards may take voltage a little better....

I do not find my card hitting TDP at +50 mV the way they found in the HardOCP review, though. Mine seems to hit a max of 350 W at +100 mV, and then crawls just over that to around 357 W with additional voltage in TriXX. It is definitely power limited.

I would love to see people post some high-voltage STABLE overclocks of their cards using TriXX, especially the Sapphire. It may be the best overclocker with its TDP headroom and superior cooler.


----------



## Gumbi

I'll do some runs later when I get the chance. I have a VaporX with dual 8 pins.


----------



## CrossoXoHair

Well I will order one tomorrow if someone can assure me I can remove the top hard drive cage (riveted) from the coolermaster rc-690ii and that the bottom part will stay in place (with my 3HDD in it) and keep structural integrity.

Also if the card will hit anything im not thinking of with that removed?

http://community.coolermaster.com/index.php/topic/10181-how-to-de-rivet-your-case-and-paint-the-inside/


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Crysis can be quite CPU limiting in certain cases.


Not sure what map was tested in this review, but I am well ahead of the results in these graphs....
http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/14.html

Considering I was achieving max utilization on two 290s prior to this card, I can't see the CPU being a bottleneck at all.

I get it's an AMD, and that's a whole other debate I won't get going in here, but this CPU seems to handle everything I throw at it just fine.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Well I will order one tomorrow if someone can assure me I can remove the top hard drive cage (riveted) from the coolermaster rc-690ii and that the bottom part will stay in place (with my 3HDD in it) and keep structural integrity.
> 
> Also if the card will hit anything im not thinking of with that removed?
> 
> http://community.coolermaster.com/index.php/topic/10181-how-to-de-rivet-your-case-and-paint-the-inside/


I stuffed an XFX DD 7950 in my son's micro-ATX case.... just took a drill and some tin snips








Not pretty, but he doesn't use a window anyways, lol.

He's like, "dad, why are you cutting my case up?" and I'm like, "son, we have to stuff this massive video card in your little box so you can kill bad guys on Far Cry 4", and he simply said, "okay, cool"


----------



## CrossoXoHair

Doesn't really answer my question, lol.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CrossoXoHair*
> 
> Doesn't really answer my question lol.,


Well, I'm assuring you that you can remove it.... you can remove anything... it just doesn't always turn out as nice as we'd hope, is what I'm getting at.
I wouldn't assume that you will keep structural integrity if you go removing the cage though. It could impact the case strength somewhat.


----------



## CrossoXoHair

It looks like it's just held by 4 rivets on the side and doesn't even touch the 3.5" bays above.

From those pics above


----------



## Zanpakuto

I think I'm going to purchase the Nitro 390. Can you guys tell Amazon the lowest price on the page (scroll down to "tell us lowest price")? http://www.newegg.com/Product/Product.aspx?Item=N82E16814202148 has it for $339.99.


----------



## Bigm

My life is now complete.

http://www.3dmark.com/fs/5331156


----------



## Agent Smith1984

Quote:


> Originally Posted by *Bigm*
> 
> My life is now complete.
> 
> http://www.3dmark.com/fs/5331156


Good job!!

Stable clocks or just benching clocks?


----------



## Bigm

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good job!!
> 
> Stable clocks or just benching clocks?


The CPU is about 95% stable; it's 100% stable at 4.8 GHz with 1.3 V compared to 1.35 V for 4.9, so I backed down to 4.8 unless I'm benching.

The GPU is about 99% stable as well, I get the slightest bit of flickering rarely. I have more room to go up on the Aux voltage, which I'll probably do later and see if I can get the memory higher or stabilize the flickering. Right now, I'm just running at 1125/1725 because I'm not doing anything that would even need the GPU at stock clocks.


----------



## Zanpakuto

OH NOOOOOO! The Nitro 390 sold out on Newegg, but I purchased the MSI 390. Thursday is when I start building my computer w/ the 4K monitor.


----------



## Zanpakuto

Quote:


> Originally Posted by *Zanpakuto*
> 
> OH NOOOOOO! Nitro 390 sold out on newegg, but I purchased the MSI 390 . Thursday is when I start building my Computer w/ the 4k Monitor


I lied. I cancelled the order to use my stacked AGCs and got the Nitro 390, so I will be owning the Nitro 390 on Thursday.


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not sure what map was tested in this review, but I am well ahead of the results in these graphs....
> http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/14.html
> 
> Considering I was achieving max utilization on 2) 290's prior to this card, I can't the CPU being a bottleneck at all.
> 
> I get it's an AMD, and that's a whole other debate I won't get going in here, but this CPU seems to handle everything I throw at it just fine.


I hope that's not the impression you got from me; I know Crysis 3 is well threaded. I just wanted to know if the GPU usage was at max the whole time, to rule out the minimum FPS being caused by the CPU. I have nothing against AMD CPUs; my other rig has an 8320 and a 990FXA-UD3.









Also I need my gpu in the club updated, have an msi 390x now.

http://www.techpowerup.com/gpuz/details.php?id=ewcwz


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> I hope that's not the impression you got from me, I know Crysis 3 is well threaded, I just wanted to know if the gpu usage was at max the whole time. Just wanted to rule out the minimum fps being caused by the cpu. I have nothing against AMD cpu's, my other rig has an 8320 and a 990fxa-ud3.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also I need my gpu in the club updated, have an msi 390x now.
> 
> http://www.techpowerup.com/gpuz/details.php?id=ewcwz


No problem, I'll get you updated tomorrow.

Didn't take the wrong impression, just used to having to defend my CPU to certain people


----------



## raist679

All right guys prepare for the ******ation. I bought a Gigabyte 390X thinking that the LED color would be adjustable. I'm just not happy with the blue. I can still return it so I thought I might do that and try the XFX. A nice neutral black card. However all that is in stock anywhere is the stock model. Is there any point in waiting for the black edition to come back in stock? Do they actually bin the chips for the black edition cards? Thanks in advance for any input.


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No problem, I'll get you updated tomorrow.
> 
> Didn't take the wrong impression, just used to having to defend my CPU to certain people


Ended up with this as far as the overclock goes.
http://www.techpowerup.com/gpuz/details.php?id=2ereu

1150MHz core, 1750MHz memory
+50mv core, +75mv Aux
+50% power limit

Seems 100% stable at the moment. 1175 gives artifacting and 1200 crashes. 1750 mem seems good as well, 1800 causes issues.

3Dmark score: 11731
http://www.3dmark.com/fs/5333934

Heaven


----------



## sidfirex

Hi there again, guys.

I got myself a Sapphire R9 390X, lol, and guess what: CrossFired it with my Gigabyte R9 390 (non-X).

I put everything at stock clocks and here's my benchmark; haven't touched the sliders yet!

3D Mark score : 15427
http://www.3dmark.com/fs/5335775

CrossfireX - Sapphire R9 390x + Gigabyte Gaming g1 R9 390 (non-x) pic:



Loving the ability to enable/disable CrossFire per game profile in CCC; it feels so on-the-fly, especially in Eyefinity.

This was not possible in Surround; you cannot disable SLI in Surround mode.

Coming from SLI GTX 780s, which were such a PITA!


----------



## Agent Smith1984

Quote:


> Originally Posted by *sidfirex*
> 
> hi there again guys
> 
> I got myself a Sapphire R9 390x lol and guess what Crossfired with my Gigabyte R9 390 (non x)
> 
> I put everything to stock clocks and here's my benchmark - havent touch the sliders yet !!
> 
> 3D Mark score : 15427
> http://www.3dmark.com/fs/5335775
> 
> CrossfireX - Sapphire R9 390x + Gigabyte Gaming g1 R9 390 (non-x) pic:
> 
> 
> 
> loving the ability to enable/disable crossfire per game profile in CCC feels so much out of the fly in EYEFINITY
> 
> This was not possible in Surround, you cannot disable SLI in Surround mode
> 
> Coming from a SLi GTX780 which was such a PITA !!


So when you crossfired the two it reported as 290x (2) on Futuremark?

The score seems low. I was getting around 22k graphics with my two 290s at 1040/1350 (my daily summer clocks for crossfire)....

I wonder if anyone else is seeing their cards as 290x when crossfiring them?

Are you planning on keeping these two cards together in crossfire?
I have some tips if so....


----------



## sidfirex

Yes please, any tips appreciated! I am planning to keep this R9 390 in CF until after Windows 10 to see how it performs (with all the DX12, Mantle, multi-GPU gains, yadda yadda...).

The Sapphire 390X is voltage-adjustable; here's the Afterburner screen for it. Running it at stock voltage with an OC to 1100 core and 1600 MHz memory; have not run 3DMark yet with these clocks.

And yes the two are reporting as 290x x2 in Futuremark



Temps never go over 70°C for the Sapphire, whereas the Gigabyte G1 easily reaches over 73°C. I play a lot in Eyefinity at 100% GPU usage at 5760x1080. My game is iRacing; this game comes first. I have to disable CrossFire for it in its profile.

and this is the Giga G1 390 voltage locked:



and Thank you Agent Smith,

lemme know if there is anything else you would want me to try


----------



## NotReadyYet

Think it's worth selling my XFX DD 290x that's only 7 months old, for a 390x? Specifically, the MSI 390x?

Also, this uses less power than the 290x? Right? Think I'd be good with this power wise in my sig rig?


----------



## sidfirex

OK, I clocked both cards to 1100/1600 on stock voltage

and ran 3DMark,

and got a score of 16958 this time (24k graphics).

http://www.3dmark.com/3dm/7670063?

edit:

OK, things I have done so far:

Disabled ULPS

Raised the 390X core voltage to +50 and the power limit to +25... and clocks to 1100/1600. So far so good.
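For anyone doing the ULPS tweak by hand rather than through Afterburner's settings checkbox, it's the `EnableUlps` DWORD under each display adapter's registry key. A sketch of the change; the 0000/0001 instance keys below are just examples, so check which keys under that class GUID actually hold your Radeon entries before importing:

```reg
Windows Registry Editor Version 5.00

; Example instance keys only -- search for "EnableUlps" under this class
; GUID and zero every Radeon entry you find.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

Reboot afterwards; a driver reinstall usually sets the keys back to 1.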


----------



## Zanpakuto

Quote:


> Originally Posted by *NotReadyYet*
> 
> Think it's worth selling my XFX DD 290x that's only 7 months old, for a 390x? Specifically, the MSI 390x?
> 
> Also, this uses less power than the 290x? Right? Think I'd be good with this power wise in my sig rig?


I would wait for a node shrink next year and keep the 290X. Here's an article you should read


----------



## Agent Smith1984

Quote:


> Originally Posted by *sidfirex*
> 
> ok I clocked both cards to 1100/1600 on stock volt
> 
> and ran 3D Mark
> 
> got a score of : 16958 this time (24k graphics)
> 
> http://www.3dmark.com/3dm/7670063?
> 
> edit:
> 
> ok things I have done so far..
> 
> Disabled ULPS
> 
> raised 390x core voltage to +50 and power limit to +25 ... and clocks 1100/1600 So far so good.


Oh yeah, that's nice!

4K POWA


----------



## Zanpakuto

Need more benches for latest drivers


----------



## Cannon19932006

http://www.3dmark.com/fs/5341379

Re run with same clocks and newest drivers.

Marginal increase in graphics score
15.15 =13799
15.7=13817


----------



## Bigm

Got a bump from 1160 to 1170 with the new drivers. That plus going from 4.9 to 5 on my CPU gave me this

http://www.3dmark.com/fs/5341674


----------



## Cannon19932006

Quote:


> Originally Posted by *Bigm*
> 
> Got a bump from 1160 to 1170 with the new drivers. That plus going from 4.9 to 5 on my CPU gave me this
> 
> http://www.3dmark.com/fs/5341674


What do you get in Heaven? I'm surprised how much higher your graphics score is with a 390 compared to my 390X at close to the same clocks.

Edit: A comparison of our best runs.

http://www.3dmark.com/compare/fs/5341674/fs/5341379


----------



## Bigm

Quote:


> Originally Posted by *Cannon19932006*
> 
> What do you get in heaven? I'm surprised how much higher your graphics score is compared to mine with a 390 at close to the same clocks as my 390x.
> 
> Edit: A comparison of our best runs.
> 
> http://www.3dmark.com/compare/fs/5341674/fs/5341379


Haven't run heaven. Will try and get around to it tonight.


----------



## Cannon19932006

Quote:


> Originally Posted by *Bigm*
> 
> Haven't run heaven. Will try and get around to it tonight.


Cool. I wonder how much of the difference comes from PCIe 2.0 vs. 3.0 and how much from the 4790K vs. the 2700K.


----------



## Bigm

Quote:


> Originally Posted by *Cannon19932006*
> 
> Cool, I wonder how much of the difference can be from PCIE 2.0 to 3.0 and how much is coming from the 4790k vs 2700k.


To be honest, half my runs score similarly to yours. Some runs score significantly better than others despite everything being exactly the same.

For example,

http://www.3dmark.com/compare/fs/5341880/fs/5341674


----------



## Cannon19932006

Quote:


> Originally Posted by *Bigm*
> 
> To be honest, half my runs score similar to yours. Some runs I score significantly better than others despite everything being the exact same.
> 
> For example,
> 
> http://www.3dmark.com/compare/fs/5341880/fs/5341674


That's odd, all of my runs have been right around 11700. Pretty much a +/- 50 variance from 11700.


----------



## Bigm

Quote:


> Originally Posted by *Cannon19932006*
> 
> That's odd, all of my runs have been right around 11700. Pretty much a +/- 50 variance from 11700.


Yeah. It's very weird. I might switch over to my Win10 build and see what I'm at.


----------



## Gumbi

Quote:


> Originally Posted by *Bigm*
> 
> Got a bump from 1160 to 1170 with the new drivers. That plus going from 4.9 to 5 on my CPU gave me this
> 
> http://www.3dmark.com/fs/5341674


Very nice. My 290 at 1175/1600 is scoring 13.2k I think (just updated drivers too).


----------



## Agent Smith1984

It's important to remember that 20MHz on a Hawaii core makes a solid 200-400 point difference when benching.
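Back-of-the-envelope, assuming the graphics score scales roughly linearly with core clock (it doesn't exactly, but it's close at these clocks):

```python
def score_delta(score, clk_mhz, bump_mhz=20):
    """Rough Fire Strike graphics-score gain from a small core bump,
    assuming approximately linear scaling with core clock."""
    return score * bump_mhz / clk_mhz

# e.g. a 12500 graphics score at a 1040MHz core, bumped by 20MHz
print(round(score_delta(12500, 1040)))
```

That lands around 240 points, right in the middle of the 200-400 range above.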


----------



## Bigm

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's important to remember that 20mhz on Hawaii core makes a solid 200-400 point difference when benching.


Did not know that. I've never had a Hawaii card before. My last AMD card was a 7950.


----------



## diggiddi

What is the jump like from the 7950?


----------



## Bigm

Quote:


> Originally Posted by *diggiddi*
> 
> What is the jump like from the 7950?


Couldn't say. It was several cards ago.


----------



## Bigm

Quote:


> Originally Posted by *Cannon19932006*
> 
> What do you get in heaven? I'm surprised how much higher your graphics score is compared to mine with a 390 at close to the same clocks as my 390x.
> 
> Edit: A comparison of our best runs.
> 
> http://www.3dmark.com/compare/fs/5341674/fs/5341379




My Heaven run. No idea why my min FPS went that low.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Bigm*
> 
> 
> 
> My Heaven run. No idea why my min FPS went that low.


Run the entire course first, and then hit bench; that should keep the minimum around 26-30 FPS. It doesn't help the score much, but it always looks better in screenies.


----------



## Cannon19932006

Quote:


> Originally Posted by *Bigm*
> 
> 
> 
> My Heaven run. No idea why my min FPS went that low.


Mine was

FPS: 63.9
Score: 1611
Min: 27.0
Max: 130.6

That minimum FPS is odd indeed.


----------



## Cannon19932006

I was able to bump it to 1170 to bench; there was a little bit of artifacting, but it puts our cards at the exact same clocks.

http://www.3dmark.com/3dm/7676801


----------



## Bigm

1185/1750


----------



## sidfirex

Quote:


> Originally Posted by *Cannon19932006*
> 
> I was able to bump it to 1170 to bench, there was a little bit of artifacting but it puts our cards at the exact same clocks.
> 
> http://www.3dmark.com/3dm/7676801


Hey Cannon

On that core/mem clock, may I ask what voltage you are using (+/-), and are you increasing the AUX voltage as well? And the power limit?

I will try it on a single Sapphire 390x tri-oc this time when I get back home and post result  cheers


----------



## Cannon19932006

Quote:


> Originally Posted by *sidfirex*
> 
> Hey Cannon
> 
> On that clock/mem, may I ask what volt are you using (+/-) and are you increasing the aux volt as well ? and power limit?
> 
> I will try it on a single Sapphire 390x tri-oc this time when I get back home and post result  cheers


I'm using:

1150MHz core @ +50mv
1750MHz memory @ +75mv AUX
Power limit +50%
And a custom fan profile that hits 100% at 75°C; this MSI seems to be dead silent in my case regardless of fan RPM, so I may as well keep it cool.
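A fan profile like that is just linear interpolation between (temperature, duty) points. A rough sketch with made-up breakpoints (not MSI's defaults), pinned to 100% from 75°C up:

```python
def fan_duty(temp_c, curve=((40, 20), (60, 45), (75, 100))):
    """Linearly interpolate fan duty (%) from (temp C, duty %) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]                      # floor below the first point
    if temp_c >= curve[-1][0]:
        return curve[-1][1]                     # pegged at 100% from 75C up
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(50))   # halfway between the 40C and 60C points -> 32.5
print(fan_duty(80))   # past the last point -> 100
```

Afterburner and TriXX both do essentially this between the points you drag on their curve editors.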


----------



## Cannon19932006

Interesting note: if you go to Sapphire's R9 390X page and go to downloads, you will find Sapphire TriXX 5.0.0 at this link.

http://asia.dl.sapphiretech.com/archive/gm/drivers/TRIXX_installer_5.0.0_5878785.zip

Not sure which of you are using TriXX to OC, but maybe it works better than Afterburner. 5.0.0 is supposedly for Fury and 300-series cards.


----------



## Bigm

Got up to 1190/1750 but forgot to take a screenie.


----------



## simo292a

AMD 15.7 DRIVER HYPE!!! Have you guys tested it? I am going to when I get home from work.


----------



## Kalistoval

So I'm playing GTA V on my 42" 1080p TV, and I'm wondering which settings, if any, I should enable in CCC. I have VSR going and it looks nice in game, but everything in CCC is on auto. In game I set everything to max except MSAA. If anyone has any pointers, I'm all ears.


----------



## Talented

Tested on my 390X on Fire Strike at 1080P with my Core i5 4690k @ 4.3GHz.

15.5 Driver 1060/1500 (Core/Mem)
Low: 10402
High: 10503

15.7 Driver 1060/1500 (Core/Mem)
Low: 10621
High: 10628


Gained a bit; it was roughly equivalent to a third of what I was able to achieve by overclocking this card. I'm using the Gigabyte 390X, which reached 1100/1600 rather easily with an OC.


----------



## sidfirex

Hi Cannon

Just did the same: http://www.3dmark.com/compare/fs/5347629/fs/5343118

Got a score of 11823,

with a graphics score of 14164. The CPU doesn't have enough oomph 

But like you said, a little artifacting here and there, not much though.


----------



## Bigm

http://www.3dmark.com/compare/fs/5347750/fs/5341674

Win 10 on the left, 8.1 on the right.

Edit: Ran the API overhead test for laughs.

http://www.3dmark.com/3dm/7682787


----------



## simo292a

Quote:


> Originally Posted by *Bigm*
> 
> http://www.3dmark.com/compare/fs/5347750/fs/5341674
> 
> Win 10 on the left, 8.1 on the right.
> 
> Edit: Ran the API overhead test for laughs.
> 
> http://www.3dmark.com/3dm/7682787


That is pretty close between win8.1 and win10.
But i am pretty hyped to see what DX12 can do especially for AMD. AMD also said they were ready for win10 and it seems like they delivered.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Bigm*
> 
> http://www.3dmark.com/compare/fs/5347750/fs/5341674
> 
> Win 10 on the left, 8.1 on the right.
> 
> Edit: Ran the API overhead test for laughs.
> 
> http://www.3dmark.com/3dm/7682787


have a feeling win7 would be slower....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Talented*
> 
> Tested on my 390X on Fire Strike at 1080P with my Core i5 4690k @ 4.3GHz.
> 
> 15.5 Driver 1060/1500 (Core/Mem)
> Low: 10402
> High: 10503
> 
> 15.7 Driver 1060/1500 (Core/Mem)
> Low: 10621
> High: 10628
> 
> 
> Gained a bit, it was about equivalent to 1/3rd of what I was able to achieve with an overclock on this card, I'm using the Gigabyte 390X which was able to reach 1100/1600 rather easily with an OC.


If you can get your username on a GPU-Z or a screenshot with your name and GPU-Z open, I can get you added to the list!

Thanks


----------



## Agent Smith1984

BTW.... I also upgraded drivers last night.
Very impressed with AMD for getting their entire line on one driver again.

If they can keep driver support at this level, they will have a lot of happy customers.

Even 7900-series owners are cheering about this driver, and that card is almost four years old.
I am going to upgrade my son's box tonight and see how his 7950 does on it.

Far Cry 4 should see a big boost (he loves that game).


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> BTW.... I also upgraded drivers last night.
> Very impressed with AMD for getting their entire line on one driver again.
> 
> If they can keep driver support at this level, they will have a lot of happy customers.
> 
> Even 7900 owners are cheering about this driver, and that card is 4 years old almost.
> I am going to upgrade my son's box tonight and see how his 7950 does on it.
> 
> Far Cry 4 should see a big boost (he loves that game).


Great news. I sold my 7950 a few months ago when upgrading to a 290 and still miss it. Great card.

It was a crazy good clocker: it did 1200MHz at 1.1v (90.2% ASIC, so ridiculously low voltage for those clocks, just 1037mv for 1100MHz). It was the only one of that model (Dual-X 950MHz limited edition) with bum Elpida memory, though.


----------



## Agent Smith1984

I just run my son's at 1000/1400 with stock voltage because his PSU can't handle any more than that.

I have a modded BIOS for it with 1.4v available but the cooler and PSU won't hold it.

My 280X DCU2, however, was a monster: that core would do 1260MHz on 1.25v and 1800MHz on the memory.
That was a tough little card, even though it suffered from the weird artifacts in some games (common on the Asus and Sapphire Toxic cards).

I have to say.... I am having the best gaming experience I have had on this 390 with these current drivers and a nice OC.


----------



## MOSER91

Does the XFX r9 390 Black Edition have a reference pcb, or is it custom? I have a waterblock that I can use, but it's for reference pcb models. Thinking about buying this card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MOSER91*
> 
> Does the XFX r9 390 Black Edition have a reference pcb, or is it custom? I have a waterblock that I can use, but it's for reference pcb models. Thinking about buying this card.


Please check the EK website linked in the OP in the cooling section.

There you will be able to go to their block finder and check to see if your block's part number matches any of the blocks (if any) compatible with the XFX 390.

As of yet, none of the 390 cards are officially reference, but some of the manufacturers are assumed to have used a reference "design."


----------



## Agent Smith1984

***OP updated with current driver information***

***OP Overclocking Section also updated with new information based on user submissions and personal experience***


----------



## Talented

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you can get your username on a GPU-Z or a screenshot with your name and GPU-Z open, I can get you added to the list!
> 
> Thanks


Here is the default-clock GPU-Z validation. As for overclocking, touching only the power limit (+50%) I was able to achieve 1100/1700; any higher on the core clock causes artifacts to appear on my screen. Here is the Fire Strike run for my 1100/1700, and here is the GPU-Z validation for the OC.

I am using the GIGABYTE GV-R939X G1 GAMING. At stock values it has a total system power draw of 444W measured with my UPS, and during my overclock it hit a peak of 483W in Fire Strike. That includes 30W for my XL2411Z, however, which places my system load at around 453W, which I find is perfect for my 650W power supply.
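The wall-power arithmetic works out comfortably. A quick sketch; the ~90% PSU efficiency figure is an assumption on my part, not something I measured:

```python
def psu_load(wall_watts, monitor_watts=30, efficiency=0.90):
    """Estimate the DC load on the PSU from a UPS/wall reading that
    includes the monitor. Efficiency is an assumed figure."""
    system_wall = wall_watts - monitor_watts    # wall draw of the PC alone
    return system_wall * efficiency             # approximate DC watts delivered

load = psu_load(483)                            # 483W peak reading at the UPS
print(round(load), "W DC,", round(load / 650 * 100), "% of a 650W PSU")
```

That's roughly 408W DC, around 63% of the 650W unit, so plenty of headroom.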

PS: Redline's default values say 10000.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Talented*
> 
> Here is the default clock GPUz Validation, as for overclocking with only touching the Power Limit % (+50) I was able to achieve 1100/1700, any higher on the Core Clock causes artifacts to appear on my screen. Here is the Firestrike run for my 1100/1700, here is the GPUz validation for the OC.
> 
> I am using the GIGABYTE GV-R939X G1 GAMING, at stock values it has a total system power draw of 444w measured with my UPS and during my overclock hit a peak of 483w during firestrike, this is including 30w for my XL2411Z however which places my system load around 453w which I find is perfect for my 650w power supply.
> 
> PS: Redline's default values say 10000.


I'll get you added and sheet fixed as soon as our systems are back up (at work, everything is down).

Welcome!

Has anyone ran 3dmark11 yet?

I like that bench so much better than firestrike....


----------



## Talented

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Has anyone ran 3dmark11 yet?
> 
> I like that bench so much better than firestrike....


I only own the 3DMark on steam which offers Firestrike, Sky Diver, Cloud Gate and Ice Storm (and all extreme / ultra variants), I'm in the process of downloading the 3DMark 11 Demo on Steam to see what it will let me do.


----------



## Pyrokills

So after trying to push this card a little I have noticed I get really weird artifacting in certain games, even at stock speeds/voltages. For instance in certain games I get what looks like screen tearing, but is more large chunks going black/distorting in a line shape. In other games I get little discolored spots flashing up at certain times. I've tried the whole clean reinstall of drivers etc etc etc, but was wondering if anyone else has had any issues. I'm wondering if its a driver problem on their end or my cards just faulty.


----------



## Talented

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Has anyone ran 3dmark11 yet?


3DMark 11 score with my 24/7 OC of 1100/1600 was 13978, using only the basic edition (the demo edition on Steam), which only allows Performance mode. I don't know if it matters much to 3DMark, but I'm running 5 monitors, so that might affect my scores?


----------



## DividebyZERO

Quote:


> Originally Posted by *Talented*
> 
> 3DMark 11 score with my 24/7 OC of 1100 / 1600 was 13978 using only the basic edition (demo edition on steam) which only allows performance mode. I don't know if it matters much to 3D Mark but I'm running 5 monitors so that might effect my scores?


Yes it will; to get a true score you have to hook up only one monitor. It's one reason I hate 3DMark while using Eyefinity. It's such a hassle to disconnect monitors just to get a better score. I think for me it's a good 500 points or more lost most of the time.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Talented*
> 
> 3DMark 11 score with my 24/7 OC of 1100 / 1600 was 13978 using only the basic edition (demo edition on steam) which only allows performance mode. I don't know if it matters much to 3D Mark but I'm running 5 monitors so that might effect my scores?


Looks good!

Here is my score with my daily 1150/1700 clocks:

http://www.3dmark.com/3dm11/10029945


----------



## DividebyZERO

I just realized it's 3DMark 11 and not Fire Strike. I would assume it affects the score as it does in Fire Strike, but maybe less. I get them confused.


----------



## Agent Smith1984

http://videocardz.com/57160/amd-catalyst-15-7-adds-crossfire-support-between-radeon-r9-390x-and-290x

Finally confirmed!


----------



## DividebyZERO

Quote:


> Originally Posted by *Agent Smith1984*
> 
> http://videocardz.com/57160/amd-catalyst-15-7-adds-crossfire-support-between-radeon-r9-390x-and-290x
> 
> Finally confirmed!


Hmmm, I'm just waiting for Fury Pro numbers, but this gives me hope, as I was debating swapping my 4GB 290Xs one at a time for 8GB 390Xs if this came true... now it's looking like this is most likely my course... just a little longer, because I am also curious whether VSR on the 390X will ever do 4K.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> Hmmm im just waiting for fury pro numbers but this gives me hope as i was debating swapping 290x 4gb one at a time to 390x 8gb if this came true... now its looking like this is most likely my course ... just a little longer because i am also curious about vsr on 390x if it will ever do 4k


I dunno.... I am expecting Fury Pro, with its 56 ROPs and 3584 SPs, to be a somewhat mild jump from the 290X/390X.... (10-12% at 4K)

It may even perform in line with a nicely overclocked 390X in some cases due to the lower ROP count.....

Of course, that's all depending on how well the Fury Pro may (OR MAY NOT) overclock.....

You may be better off just skipping a generation altogether, my friend.

I'm not sure if the 390 will get 4K VSR or not.... it's doubtful, to be honest.....


----------



## sidfirex

Quote:


> Originally Posted by *DividebyZERO*
> 
> Yes it will, to get a true score you have to hook up only one monitor. Its one reason i hate 3dmark while using eyefinity. It such a hassle to disconnect monitors just to get a bwtter score. I think for me most of the time itz a good 500 points or more lost.


Really? Wow, up until now I had just been disabling Eyefinity (3x) in the control panel so the test runs at 1080p on the center monitor. Do you reckon it still affects the score?

And Agent Smith, could you please update the table with my new card, the Sapphire 390X? Thanks, mate.


----------



## Cannon19932006

That review that showed the power limit being hit when at over +50mv on the 390x appears to be wrong.

I did some benching of my own and made a different discovery all together, and it seems to be true in whatever benchmark I run.

at +100mv my voltage ranges in 3dmark between 1.25v and 1.3v, but never below 1.25v. At low load (not power saving) the voltage is around 1.359v.

at +50mv my voltage ranges in 3dmark between 1.21v and 1.24v, but never over 1.24v. At low load, the voltage is around 1.310.

These results, despite the minor voltage differences, are easily replicated. Regardless of what the absolute voltage is, at +100mv the voltage is on average about 50mv higher at the same frame than at +50mv.

In Furmark the voltage is much more stable: at +100mv it's between 1.278v and 1.281v. I believe this is due to Furmark providing a much more uniform load than 3DMark does.
At +50mv we see 1.227v-1.231v in Furmark.

I believe this drop in voltage is not because of a power limitation, but more likely because of some sort of vdroop. This leads me to believe that the article review overclocking the 390x is indeed wrong. They say *"Simply pushing the Core Voltage setting to its maximum will not give you the best overclock on this particular card as that power level pushes the card past its TDP limit and then causes the card to throttle. If you watch in-game clocks while you are playing a game, this is obvious, but is something that might not be realized while running a canned benchmark. We spent a lot of time finding the fine balance between raising the voltage, getting the highest overclock, and the most performance increase before we hit the TDP limit causing performance-limiting throttling. We found that +50 on PowerTune power setting was perfect for our card with and without voltage manipulations."* What I have found leads me to believe this is incorrect, and that they were not hitting performance limiting throttle, but actually just having vdroop kick in under load.

The article in question
http://www.hardocp.com/article/2015/07/06/msi_radeon_r9_390x_gaming_8g_overclocking_review

We all know that it can be difficult to get a stable overclock when voltage droops, your minimum voltage must be high enough to sustain your overclock. I believe that hardocp mistakenly labeled vdroop as power limitation and I plan to look into this further, but the results so far have been promising.
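A quick back-of-the-envelope on my numbers above (the helper is just for illustration):

```python
def vdroop(idle_v, load_samples):
    """Average droop (V) between light-load voltage and logged load voltages."""
    avg_load = sum(load_samples) / len(load_samples)
    return idle_v - avg_load

# +100mv: ~1.359V at light load, 1.25-1.30V logged in 3DMark
plus100 = vdroop(1.359, [1.25, 1.27, 1.30])
# +50mv: ~1.310V at light load, 1.21-1.24V logged in 3DMark
plus50 = vdroop(1.310, [1.21, 1.22, 1.24])
print(round(plus100, 3), round(plus50, 3))
```

Both offsets land within about a millivolt of each other (~86-87mv of droop), which is what you would expect from vdroop rather than from a power limit kicking in at one setting but not the other.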


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> That review that showed the power limit being hit when at over +50mv on the 390x appears to be wrong.
> 
> I did some benching of my own and made a different discovery all together, and it seems to be true in whatever benchmark I run.
> 
> at +100mv my voltage ranges in 3dmark between 1.25v and 1.3v, but never below 1.25v. At low load (not power saving) the voltage is around 1.359v.
> 
> at +50mv my voltage ranges in 3dmark between 1.21v and 1.24v, but never over 1.24v. At low load, the voltage is around 1.310.
> 
> This results, although with minor voltage differences is able to be replicated easily, regardless of what the voltage actually is, it seems at +100mv the voltage is on average about +50mv higher at the same frame than, well +50mv.
> 
> In Furmark the voltage is much more stable, at +100mv it's between 1.278v and 1.281v I believe this is due to a much more uniform load being provided from Furmark, vs the load 3dmark provides.
> At +50mv we see 1.227v-1.231v in Furmark.
> 
> I believe this drop in voltage is not because of a power limitation, but more likely because of some sort of vdroop. This leads me to believe that the article review overclocking the 390x is indeed wrong. They say *"Simply pushing the Core Voltage setting to its maximum will not give you the best overclock on this particular card as that power level pushes the card past its TDP limit and then causes the card to throttle. If you watch in-game clocks while you are playing a game, this is obvious, but is something that might not be realized while running a canned benchmark. We spent a lot of time finding the fine balance between raising the voltage, getting the highest overclock, and the most performance increase before we hit the TDP limit causing performance-limiting throttling. We found that +50 on PowerTune power setting was perfect for our card with and without voltage manipulations."* What I have found leads me to believe this is incorrect, and that they were not hitting performance limiting throttle, but actually just having vdroop kick in under load.
> 
> The article in question
> http://www.hardocp.com/article/2015/07/06/msi_radeon_r9_390x_gaming_8g_overclocking_review
> 
> We all know that it can be difficult to get a stable overclock when voltage droops, your minimum voltage must be high enough to sustain your overclock. I believe that hardocp mistakenly labeled vdroop as power limitation and I plan to look into this further, but the results so far have been promising.


Same results on my end, and I too questioned those findings. I stated my experience in my OP update.

+ rep


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Same results on my end, and i too questioned those findings. I stated my experience in my op update.
> 
> + rep


I may have to look at results trying +150mv with trixx.


----------



## Motley01

Well I just installed the new 15.7 drivers. And these are great! Very impressed with AMD's drivers lately. Now what are all the Nvidia fanboys gonna complain about? LOL

Valley benchmark increased from 2135 to 2230 just by installing these drivers. Same OC and everything else stayed the same.


----------



## brooklands

Hi there,

I just recently made the jump from a 7970GHz with an MK-26 to a brand spanking new MSI Gaming 390X 8G.
(GPU-Z validation link: http://www.techpowerup.com/gpuz/details.php?id=9m4sg)

I'm very pleased with the performance of this card at stock speeds as well as overclocked; I got about 1177MHz on the GPU and 1625MHz on the memory without raising voltage.

*BUT*

I'm not _that_ happy with the acoustics... yes, there are definitely louder cards on the market, and at least the fan noise is not as annoying as the noise from the PowerColor 390X. But there are worlds between my old 7970GHz with its custom cooler and the MSI Gaming 390X 8G. So I went the same way as whiteturbo and replaced the thermal compound.

And man... there were _copious_ amounts of thermal paste (I wish I had taken some photos before cleaning). Enough for 3-4 GPUs. So I made a first cleaning pass with plenty of isopropyl alcohol, then finished with ArctiClean thermal material remover & surface purifier, applied some Prolimatech PK-3, and put everything back together.

Temps didn't improve that much, maybe 2-3 degrees, but fan speeds changed quite drastically, about 20% for the better.

I'm really liking the card, especially since I presumably got quite lucky in the silicon lottery, and with an Alpenfoehn Peter 2 (thank god MSI is not using those 5-pin fan headers) there's still hope for me to get this thing (more) silent.


----------



## Gumbi

Quote:


> Originally Posted by *brooklands*
> 
> Hi there,
> 
> I just recently made the jump from a 7970GHz wih Mk-26 to a brand spanking new MSI Gaming 390x 8G.
> (GPU-Z validation Link: http://www.techpowerup.com/gpuz/details.php?id=9m4sg)
> 
> I'm very pleased with the performance of this card at stock speeds as well as on OC. -I got about 1177MHz GPU and 1625 on Memory without raising voltage.
> 
> *BUT*
> 
> I'm not _that_ happy with the acoustics... yes, there are definitely louder cards on the market and at least the fan noise is not so annoying like the noise from the the powercoler 390x. But between my my old 7970GHz with custom-cooler and the MSI Gaming 390x 8G are worlds. So I went the same way as whiteturbo and replaced the thermal compound.
> 
> And man.. there where _copious_ amounts of thermal paste (I wish I had taken some photos before cleaning). Enough for 3-4 GPUs. So I made a first cleaning run with plenty iso-propyl alcohol, then a finish with arcticlean thermal material remover & surface purifier and applied some prolimatech pk-3 and put everything back together.
> 
> Temps didn't improve that much, about 2-3 degrees maybe but fan speeds changed quite drastically, about 20% for the better.
> 
> I'm really liking the card, especially the fact that I presumably got quite lucky with the silicon lottery and with an Alpenfoehn Peter 2 (thank god, MSI is not using those 5 pin fan-headers) there's still hope for me to get this thing (more) silent.


How is the cooling? Core/VRM temps? Surely a 20% saving on fan speeds kept them fairly quiet under load, no? Maybe relax them even more and allow temps to hit 80ish; no harm in that.


----------



## brooklands

Quote:


> Originally Posted by *Gumbi*
> 
> How is the cooling? Core/VRM temps?


Oh, cooling is quite good, I'd say, considering the TDP, the 28 degrees Celsius in my office, and a case temp of about 30°C.

Temps after 30 minutes valley extreme hd:

GPU max = 79c
VRM 1 max = 76c
VRM 2 max = 49c
GPU fan max = 78%


----------



## Agent Smith1984

Quote:


> Originally Posted by *sidfirex*
> 
> Really? Wow, up until now I had been just disabling eyefinity (3x) in the control panel and the test runs in 1080p in the center monitor.. Do you reckon it still affects the score ??
> 
> And Agent Smith, could you pls update the table to my new card 390x Sapphire ,thnx mate


Updating table for new owners now.

Is yours a "Nitro"?

Is there a specific OC/voltage you want posted, or are you still testing?


----------



## Agent Smith1984

Quote:


> Originally Posted by *brooklands*
> 
> Oh, cooling is quite good I'd say. Considering the TDP and the 28 degrees celcius in my office and a case temp of about 30c.
> 
> Temps after 30 minutes valley extreme hd:
> 
> GPU max = 79c
> VRM 1 max = 76c
> VRM 2 max = 49c
> GPU fan max = 78%


Could you post a firestrike run at your OC speeds?

Very shocked at how high your core is clocking on stock volts.


----------



## jon666

http://puu.sh/iU91C/eabf284a43.png

Haven't been overclocking very much, the heat would kill me long before the card did.


----------



## sidfirex

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Updating table for new owners now.
> 
> Is yours a "Nitro"?
> 
> Is tjhere a specific OC/voltage you want posted, or are you still testing?


It's a Sapphire R9 390x Tri-X OC

My daily clock currently is : core 1150 mhz , mem 1750mhz , core volt is +50 and aux volt is +50 too

thank you


----------



## brooklands

OK, here are some quick results while overclocked.

Fire Strike 1.1:

http://www.3dmark.com/fs/5361431
3DMark Score = 12133
Graphics Score = 14374

Fire Strike Extreme 1.1:

http://www.3dmark.com/fs/5361479
3DMark Score = 6049
Graphics Score = 6401

Fire Strike Ultra 1.1:

http://www.3dmark.com/fs/5361524
3DMark Score = 3254
Graphics Score = 3218


----------



## Agent Smith1984

Quote:


> Originally Posted by *brooklands*
> 
> OK, here are some quick results while overclocked.
> 
> Fire Strike 1.1:
> 
> http://www.3dmark.com/fs/5361431
> 3DMark Score = 12133
> Graphics Score = 14374
> 
> Fire Strike Extreme 1.1:
> 
> http://www.3dmark.com/fs/5361479
> 3DMark Score = 6049
> Graphics Score = 6401
> 
> Fire Strike Ultra 1.1:
> 
> http://www.3dmark.com/fs/5361524
> 3DMark Score = 3254
> Graphics Score = 3218


Very nice scores!!!









I've not tested on the 15.7 driver package yet, but will be playing with it tonight.

Thanks for the post!


----------



## Duke976

After seeing the reviews of the Sapphire and Asus Fury, I decided it was not worth the price of admission. I am better off with MSI 390s in CrossFire. I returned the XFX card, went to Microcenter, and bought these two bad boys.

Courtesy of Microcenter.


Love the look inside my Phanteks Primo


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> After seeing the reviews of Sapphire and Asus fury, I decided that it was not worth the price of their admission. I am better of with MSI 390 crossfire. Returned the XFX and went to Microcenter and bought these two bad boys.
> 
> Courtesy of Microcenter.
> 
> 
> Love the look inside my Phantek Primo


OH MY!

You got me super excited to add my second 390 in crossfire my friend!

Updating the sheet now!

Let us know clocks and benchies when you got 'em....

System looks great!

Edit: can you also give me a run down on your temps... OC's/voltage ETC..
VERY INTERESTED.... I'll be running the same setup in a few weeks.


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> OH MY!
> 
> You got me super excited to add my second 390 in crossfire my friend!
> 
> Updating the sheet now!
> 
> Let us know clocks and benchies when you got 'em....
> 
> System looks great!
> 
> Edit: can you also give me a run down on your temps... OC's/voltage ETC..
> VERY INTERESTED.... I'll be running the same setup in a few weeks.


Thank you. Will update you soon.







I also ordered the Asus MG279Q to go along with these cards.


----------



## Ha-Nocri

@Duke976: looks great









How are the temps in CF?

*EDIT: nvm, I see it's already been asked


----------



## Agent Smith1984

AMD was really smart to just let their partners handle this line of cards...

Knowing these are rebrands, no one had any faith in reference cards/coolers... It would be nice to have more clarity on water blocks, though I understand the doubt about there even being a water-cooling market for a mainstream card.

The 290x/390x-to-Fury gap is smaller so far than the 7970/280x-to-290 gap was... I could see the 390 series taking off even further because of this...


----------



## DizzlePro

is it worth upgrading from a R9 280X to a 390 for 1080p?

can sell my 280x for £120

however the cheapest 390s are around £260+


----------



## th3illusiveman

Quote:


> Originally Posted by *DizzlePro*
> 
> is it worth upgrading from a R9 280X to a 390 for 1080p?
> 
> can sell my 280x for £120
> 
> however the cheapest 390s are around £260+


it's a decent jump (i went from 7970 to 290X) but not mind blowing. How much would a Fury (non-X) be for you? That's the only jump i'd make if i was in your shoes.


----------



## Ha-Nocri

I think the 280X is fine for 1080p. I would wait for next gen, which will be a die shrink to 14/16nm, so potentially a huge performance boost. That is, if you don't have an upgrade itch.


----------



## Duke976

It seems that I might have to wait for my RIVE board, since the board I am using is not working for me with this crossfire set-up. I measured the temps, and the top card was reaching 95c while the bottom card hovered between the high 50s and low 60s.

I tried putting the bottom card in the 3rd PCI-E slot, even though it will only run at 8x, but unfortunately the cover shroud of the PSU hits the fan lol. So my recommendation to anyone looking to crossfire their 390 or 390x: make sure the cards are not too close to each other, to avoid overheating.

Just to make sure the top card wasn't defective or anything, I ran it by itself and it gave me 50c in Skyrim @4k. Will revisit again once I get my RIVE back from RMA.


----------



## DizzlePro

Quote:


> Originally Posted by *th3illusiveman*
> 
> it's a decent jump (i went from 7970 to 290X) but not mind blowing. How much would a Fury (non-X) be for you? That's the only jump i'd make if i was in your shoes.


Furys aren't out yet in the UK, but they should sit between the GTX 980 (£400) and the Fury X/980 Ti (£540)

too much for me, plus I'm not looking to upgrade my monitors anytime soon

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I think the 280X is fine for 1080p. I would wait for next gen, which will be a die shrink to 14/16nm, so potentially a huge performance boost. That is, if you don't have an upgrade itch.


my 280x is voltage locked so i'm using that as an excuse to upgrade









might wait until the nano is released & see if the prices shift

also, do the 390(X)s come with dual BIOS?


----------



## sidfirex

Quote:


> Originally Posted by *DizzlePro*
> 
> also do the 390(x)'s come with dual bios?


my Sapphire Tri-X 390x does come with dual BIOS (the default BIOS is legacy, the other is UEFI)


----------



## Shatun-Bear

Guys can I ask a question.

I was debating getting the 390 after watching 



 (I don't normally watch this guy).

Basically, he recommends the 390 over the 970 for a number of reasons, one of which is overclocking potential: he claims the 390s seem to overclock better than the 290s. He gets a 1200 overclock on the core (actually 1250, but scaled it back). But reading this thread and the table in the OP, 1180 seems to be the highest. Did the JayzTwoCents YouTube guy get a golden sample or something? Does the MSI Gaming 390 really overclock better than non-ref 290s?

EDIT: Just read this in the OP:
Quote:


> After gathering more overclocking information from new members, and based on my personal experience, it is apparent that these cards, while capable of reaching respectable clock speeds, are not capable of much higher overclocks than were achieved on the 290/290X cards.
> 
> At this point, most members are able to reach speeds of 1090-1125 on stock voltage, which is not quite as common on the 290 series, but also not unheard of.


----------



## Snailgun

Rolling in with MSI R9 390



Upgraded from an HD6930 - the difference is huge of course, but the card has pretty strong coil whine in 3D, which I haven't experienced before. Can't hear it in headphones though.

Anyway here's my results on stock clocks
http://www.3dmark.com/fs/5370795
1150 on GPU core
http://www.3dmark.com/fs/5370965

Gonna play more with clocks later.


----------



## rdr09

Quote:


> Originally Posted by *Snailgun*
> 
> Rolling in with MSI R9 390
> 
> 
> 
> Upgraded from HD6930 - difference is huge of course, but card got pretty strong coil noise in 3d, which I haven't experienced before. Cannot hear it in headphones though.
> 
> Anyway here's my results on stock clocks
> http://www.3dmark.com/fs/5370795
> 1150 on GPU core
> http://www.3dmark.com/fs/5370965
> 
> Gonna play more with clocks later.


I have to OC my 290 to 1200 to match your 390 @ 1150. With 8GB... well worth it, methinks.

Congrats!


----------



## Agent Smith1984

Anyone playing Dirt Rally?
I find this game to be so great with a wheel....

Ultra settings, 8xAA, 60fps vsync on...

GPU never breaks 40%, CPU at 10-15%
Runs flawlessly at a constant 60fps and looks amazing... Wish every game ran this well!


----------



## BlaXey

I have Dirt Rally; I played one hour and uninstalled it. For me it's unplayable. I used a Logitech Driving Force GT, so I don't know how it is with a better wheel.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BlaXey*
> 
> I have Dirt Rally; I played one hour and uninstalled it. For me it's unplayable. I used a Logitech Driving Force GT, so I don't know how it is with a better wheel.


I use the DFGT also.... Works great for me. Very addictive, but also a very difficult game.... More simulation than any of the other Dirt games.


----------



## Agent Smith1984

This 15.7 driver has made crysis 3 so smooth...

(1150/1600) Max settings at 1080p with 4xAA; min fps is 60.... That's way up from the 45s I used to see on a single 290 (1150/1500) with the 15.4 driver
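For what it's worth, the gain in those minimum-fps figures works out like this (a trivial sketch; since vsync caps the new reading at 60, the true uplift is a floor, not an exact number):

```python
# Min-fps uplift going from the 15.4 to the 15.7 driver, using the
# figures above. Vsync caps the new reading at 60, so this is "at least".
old_min, new_min = 45, 60
uplift = new_min / old_min - 1
print(f"min-fps uplift: {uplift:.0%} or more")  # -> min-fps uplift: 33% or more
```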


----------



## Motley01

I should be playing Dirt Rally, but MSI 300 series cards don't come with the free game code for this.

Anyone who has a Dirt Rally code that doesn't want it, let me know!


----------



## Motley01

Quote:


> Originally Posted by *BlaXey*
> 
> I have Dirt Rally; I played one hour and uninstalled it. For me it's unplayable. I used a Logitech Driving Force GT, so I don't know how it is with a better wheel.


I'd be interested in your Dirt Rally code if you don't want it?


----------



## Duke976

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Guys can I ask a question.
> 
> I was debating getting the 390 after watching
> 
> 
> 
> (I don't normally watch this guy).
> 
> Basically, he recommends the 390 over the 970 for a number of reasons. One of which is the overclocking potential, where he claims the 390s seem to overclock better than the 290s. He gets 1200 (and actually 1250, but scales it back) overclock on the core. But reading this thread, and the table from the OP, 1180 seems to be the highest. Did JayzTwoCents YouTube guy get a golden sample or something? Does the MSI Gaming 390 really overclock better than non-ref 290s?
> 
> EDIT: Just read this in the OP:


My 1st XFX 390 card did 1200 core and 1600 mem; now that I have an MSI 390, I was also able to get 1200 core, but with 1700 mem @ +75mv. So it seems MSI cards can attain 1200; it's just a matter of applying the right voltage that you are comfortable using.

Here's a screenshot of my MSI 390 running heaven benchmark, hope that helps.



http://imgur.com/TN84EFF


----------



## BlaXey

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I use the DFGT also.... Works great for me. Very addictive, but also a very difficult game.... More simulation than any of the other dirt games.


Play Assetto Corsa, it is the best simulation game for me. Dirt is arcade to me; I have played 192 hours of Assetto Corsa


----------



## BradleyW

Can I join, since the 290X is the same as the 390X now that the driver improvements have been carried over for previous gen?


----------



## Agent Smith1984

Quote:


> Originally Posted by *BradleyW*
> 
> Can I join, since the 290X is the same as the 390X now that the driver improvements have been carried over for previous gen?


Lol, sorry bud


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> My 1st XFX 390 card did 1200 core and 1600 mem; now that I have an MSI 390, I was also able to get 1200 core, but with 1700 mem @ +75mv. So it seems MSI cards can attain 1200; it's just a matter of applying the right voltage that you are comfortable using.
> 
> Here's a screenshot of my MSI 390 running heaven benchmark, hope that helps.
> 
> 
> 
> http://imgur.com/TN84EFF


You running both cards yet? I've tried lots of voltage... 1200 won't happen for me without 150mv, and it wasn't worth the temps


----------



## Motley01

Quote:


> Originally Posted by *BlaXey*
> 
> How can I do that? I get the CD key and send it to you? Because I bought it a few months ago.


Yes, as long as you uninstall it from your Steam library. Then just give me your activation code.

Edit: Well, not so. Apparently you cannot transfer a game after it has been registered to an account, so I'm outta luck. It has to be an unregistered game code.


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You running both cards yet? I've tried lots of voltage... 1200 won't happen for me without 150mv, and it wasn't worth the temps


Not after the top card hit 95c. I am just waiting for my RIVE mobo to come back from RMA so I can retest these cards. My Sabertooth X79's PCI-E slot positions are not helping at all with the crossfire set-up.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Not after the top card hit 95c. I am just waiting for my RIVE mobo to come back from RMA so I can retest these cards. My Sabertooth X79's PCI-E slot positions are not helping at all with the crossfire set-up.


Have you tried running them at stock voltage with custom fan profiles?

That's what I had to do with my 290's


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Have you tried running them stock voltage with custom fan profiles?
> 
> That's what i Had to do with my 290's


Yup, and it still reaches 95c


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Yup, and it still reach 95c


Even with 100% fan? Wow.
Tried running your case with the door off?

You may want to try new TIM. What's your ambient temp?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Even with 100% fan? Wow
> Tried running your case with the door off?
> 
> You May want to try new tim, what's your ambient temp?


One thing I'll say is that my board does have good slot spacing... See my sig rig pic (not updated yet)


----------



## jon666

Stock clocks. Not sure what is going on with temps.


----------



## tbob22

Looking to upgrade my 7950. The Powercolor PCS+ seems to be pretty good for the $ _(decent cooling)_. Any idea if it will work with my board?

I remember people having issues with 290's on X58, and I think some of the cards only supported UEFI.


----------



## precision351

I have a x58 board and my MSI 390x runs fine.


----------



## DividebyZERO

Yes, I was one with major issues on 290/290x on X58. I am not sure, but I think it may have been a driver issue, because my original workaround at the time was to use the ASUS PT1 bios. I am now using stock/Stilt's bios and they work fine. To be honest I am not sure when they fixed it, but it's apparently gone now. So I could only imagine the 390/X working as well, like the user above stated.


----------



## MrSharkington

So I'm considering an R9 390 to upgrade from my 660. I was just wondering if my 550w psu (model is in my sig) would be able to handle it even with an overclock?


----------



## Duke976

Quote:


> Originally Posted by *MrSharkington*
> 
> So I'm considering an R9 390 to upgrade from my 660. I was just wondering if my 550w psu (model is in my sig) would be able to handle it even with an overclock?


Unfortunately not, PSU recommendation is at least 750w for a single 390.


----------



## neurotix

I'll stick with my beautiful dual Vapor-X coolers with lovely backplates and VRM LEDs.

Shame that the new Tri-X still doesn't have a fricken backplate. All cards above $300 should have backplates, it's 2015 for Christ's sake. The MSI gaming seems to have one and it looks great.

No compelling reason to upgrade, I can't understand why you'd be so excited about a rebrand and sell your old card to get these new ones when they're literally the same thing.

I'd love to see someone try and beat this on air with a R9 390.


----------



## Ha-Nocri

Well, 550W could be enough, but I wouldn't increase voltage too much. I see reviewers testing 290s in CF reporting from 600 to 750W usage.
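A rough sanity check on the 550W question (a back-of-envelope sketch; the ~275W board power, 100W CPU, and 50W for the rest are assumed ballpark figures, not measurements):

```python
# Back-of-envelope PSU budget for a single R9 390 on a 550W unit.
# All wattages are assumptions for illustration, not measured draws.
gpu_w = 275    # ballpark stock board power for an R9 390; OC/voltage adds more
cpu_w = 100    # ballpark mainstream CPU under gaming load
rest_w = 50    # board, RAM, drives, fans (rough)
psu_w = 550

usable = psu_w * 0.8                       # derate to ~80% for sustained load
headroom = usable - (gpu_w + cpu_w + rest_w)
print(f"headroom at 80% derate: {headroom:.0f} W")
```

Under those assumptions there is only about 15W of slack, so a stock or mildly overclocked card is fine, but adding voltage eats that margin fast, which matches the advice above.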


----------



## MrSharkington

Quote:


> Originally Posted by *Ha-Nocri*
> 
> well, 550W could be enough, but I wouldn't increase voltage too much. I see reviewers testing 290 in CF reporting from 600 to 750W usage.


Yeah, I don't plan to add voltage, just a basic overclock, so I'll just go with it


----------



## Shatun-Bear

Quote:


> Originally Posted by *Duke976*
> 
> My 1st XFX 390 card did 1200 core and 1600 mem; now that I have an MSI 390, I was also able to get 1200 core, but with 1700 mem @ +75mv. So it seems MSI cards can attain 1200; it's just a matter of applying the right voltage that you are comfortable using.
> 
> Here's a screenshot of my MSI 390 running heaven benchmark, hope that helps.
> 
> 
> 
> http://imgur.com/TN84EFF


Thanks for the info.

Hmmm, seems they can overclock better than the 290s by quite a bit if you are lucky.

1200 on the core and 1700 memory would increase the performance significantly and I'd be very happy indeed if I could achieve that with one.

A general question to you guys here - I like the look of the *Gigabyte Radeon R9 390 Gaming G1*, but I notice most people rocking the MSI Gaming. What's the general consensus on the Gigabyte card compared to the others? I can't find a review of it anywhere.


----------



## diggiddi

It's voltage locked, per what I've read on here


----------



## tbob22

Quote:


> Originally Posted by *precision351*
> 
> I have a x58 board and my MSI 390x runs fine.


Quote:


> Originally Posted by *DividebyZERO*
> 
> Yes, i was one with major issues on 290/290x on x58. I am not sure but i think it may have been a driver issue because my original work around at that time was to use ASUS PT1 bios. I am now using stock/Stilt's bios and they work fine. To be honest i am not sure when they fixed it, but its apparently gone now. So i could only imagine 390/x working as well like user above stated.


Good to know. Think I'll run into any power issues? I'm running pretty high clocks on my CPU; I wouldn't be surprised if it pulled 200W+ at load.


----------



## Shatun-Bear

Quote:


> Originally Posted by *diggiddi*
> 
> Its voltage locked per what I've read on here


Yeah I know, but I'm interested in the Gigabyte cooler on there and its temps.

The thing is, the MSI Gaming would be a great 390 to buy but I just prefer the colour scheme of the Gigabyte and I'm a sucker for the "Windforce" LED writing on the side of the card! The Gigabyte also has two additional LED indicators either side of "Windforce": "Silent" and "Stop".


----------



## Agent Smith1984

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Yeah I know, but I'm interested in the Gigabyte cooler on there and it's temps.
> 
> The thing is, the MSI Gaming would be a great 390 to buy but I just prefer the colour scheme of the Gigabyte and I'm a sucker for the "Windforce" LED writing on the side of the card! The Gigabyte also has two additional LED indicators either side of "Windforce": "Silent" and "Stop".


From an oc standpoint the G1 will be inferior. Many users have returned theirs for sapphire or msi cards.

From a cooling standpoint the card is good but not great.


----------



## Agent Smith1984

Quote:


> Originally Posted by *neurotix*
> 
> I'll stick with my beautiful dual Vapor-X coolers with lovely backplates and VRM LEDs.
> 
> Shame that the new Tri-X still doesn't have a fricken backplate. All cards above $300 should have backplates, it's 2015 for Christ's sake. The MSI gaming seems to have one and it looks great.
> 
> No compelling reason to upgrade, I can't understand why you'd be so excited about a rebrand and sell your old card to get these new ones when they're literally the same thing.
> 
> I'd love to see someone try and beat this on air with a R9 390.


Beat what?


----------



## Hazardz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Beat what?


Looks like P17000 on 3DMark11.


----------



## Shatun-Bear

Quote:


> Originally Posted by *Agent Smith1984*
> 
> From an oc standpoint the G1 will be inferior. Many users have returned theirs for sapphire or msi cards.
> 
> *From a cooling standpoint the card is good but not great.*


Hmmm thanks for your thoughts.

Are you basing its cooling capability on how Windforce 2x cooling works on other Gigabyte cards?


----------



## Agent Smith1984

Quote:


> Originally Posted by *neurotix*
> 
> I'll stick with my beautiful dual Vapor-X coolers with lovely backplates and VRM LEDs.
> 
> Shame that the new Tri-X still doesn't have a fricken backplate. All cards above $300 should have backplates, it's 2015 for Christ's sake. The MSI gaming seems to have one and it looks great.
> 
> No compelling reason to upgrade, I can't understand why you'd be so excited about a rebrand and sell your old card to get these new ones when they're literally the same thing.
> 
> I'd love to see someone try and beat this on air with a R9 390.


I could get your graphics score, but not the overall score because of my cpu


----------



## Superjit94

So I've been playing Witcher 3 lately, and as I'm playing, the game will freeze, the screen goes all red, then my PC restarts. Anybody else experiencing this?


----------



## Agent Smith1984

Uhhhh
Anybody ever seen Heaven lose textures and just render wireframes?


----------



## Horsemama1956

Yeah, when wireframes are turned on.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Horsemama1956*
> 
> Yeah, when wireframes are turned on.


----------



## Snailgun

Quote:


> Originally Posted by *Superjit94*
> 
> So i've been playing Witcher 3 lately and as im playing the game, it'll freeze, the screen will go all red, then my PC will restart. Anybody experiencing this?


Check your CPU overclock.


----------



## Superjit94

Quote:


> Originally Posted by *Snailgun*
> 
> Check your CPU overclock.


CPU is not overclocked; the only thing overclocked is the R9 390, and it's mild at best compared to what people are running in this thread


----------



## sidfirex

Looks like there are some freezing issues with really high overclocks.

I was at 1150/1650 on the Sapphire Tri-X R9 390x (power limit +50 and voltage +50)

I ran this tool's benchmark to stress test the GPU - http://www.ocbase.com/index.php/download - at 1440p with the fps limit off, and was monitoring the temps in HWiNFO.

What I found is, no matter what speed the Tri-X fans run at, the VRM temps go to 100c after a few minutes, woah!! And then there was a freeze and a "display driver not responding" error!!!

note: the GPU temp stays at around 70c but the VRM temp goes over 100!!

Can anyone advise if it is normal for the VRM temp to hit that high when benchmark testing???


----------



## specopsFI

Quote:


> Originally Posted by *sidfirex*
> 
> looks like there are some freezing issues with really high overclock.
> 
> I was on 1150 / 1650 on the Sapphire Tri-x R9 390x (powerlimt to 50 and voltage to 50)
> 
> I ran this tool's benchmark to stress test the GPU - http://www.ocbase.com/index.php/download @ 1440p , fps limit off and was monitoring the temps in HWINFO,
> 
> What I found is, no matter what speed the tri x fans are , the VRM temps are going to 100*c after a few minutes, woah !! and then there was this freeze and display driver not responding error !!!
> 
> note: The gpu temp stays at around 70'c but the VRM temp over 100 !!
> 
> Can anyone advise if this is normal when benchmark testing for the VRM temp to hit that high???


Perfectly normal for what you were doing. OCCT is at least as hard on the card as Furmark. I would strongly advise never turning it on again.

Still, the card crashing would suggest an unstable overclock, which isn't too surprising. +50mV seems like a low OV for those clocks; +75mV would be more likely to be absolutely stable.


----------



## sidfirex

Quote:


> Originally Posted by *specopsFI*
> 
> Perfectly normal for what you were doing. OCCT is at least as hard on the card as Furmark. I would strongly advise never turning it on again.
> 
> Still, the card crashing would suggest an unstable overclock, which isn't too surprising. +50mV seems like a low OV for those clocks; +75mV would be more likely to be absolutely stable.


Ok thanks mate, really appreciate your response  will not use that program again.

I have reverted back to 1100 core and 1550 memory for now, just for playing games; no problems at +50 power limit.

Also, during gaming in Crysis 3 and Battlefield 4, the VRM temp is at 65c to 70c, which I hope is good.


----------



## Agent Smith1984

I can confirm that there are three programs that will absolutely cook your GPU, and I see no need to ever use them...

OCCT
FurMark
Kombuster

I've seen those things bring VRM's to 105c (this was back when I had my first 280x and thought this was a good way to test it)

I learned my lesson.....

Heaven on a few loops has been my absolute go to stability test for overclocking.

The FireStrike demo will generally let you know if artifacts are coming too, right around the time the music picks back up and the rock beast gets blown up by the chick in the cloaked suit...

I did some extensive overclocking yesterday with my door off and a box fan blowing on everything....

I found that the card IS NOT POWER LIMITED....

HardOCP put out an article saying they were hitting the TDP limits at 50mv+.... this is NOT true at all (as confirmed by Cannon also)

I found that my voltage at 100mv+ would stay around 1.266 under full load....

I found at 150mv+ in trixx, that my voltage would stay around 1.313 under load

At 200mv+ in trixx it would stay around 1.349

So while the voltage scaling is not 100%, it is working, and the card was not hitting the TDP

The problem is.... the clock scaling itself... Hawaii just doesn't scale well with voltage.

I was benching at 1200 (150mv+) yesterday, and all ran well.

Going to 200mv only netted me 15mhz more.....

I can do 1180MHz at 100mv which is the sweet spot for my card

Daily, I just ran at 1150 (50mv+) and it performs great.

These are all classic characteristics seen in Hawaii cores. Maybe a tad better results overall, but nothing stellar.
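The scaling point above can be made concrete from those measured load voltages (a small sketch using only the readings quoted in this post; the stock load voltage isn't stated, so only step-to-step deltas are compared):

```python
# Compare the extra Trixx offset requested at each step against the extra
# load voltage actually measured, using the readings above.
steps = [(100, 1.266), (150, 1.313), (200, 1.349)]  # (+mV offset, measured V)

for (off_a, v_a), (off_b, v_b) in zip(steps, steps[1:]):
    requested = (off_b - off_a) / 1000.0   # extra offset requested, in volts
    delivered = v_b - v_a                  # extra load voltage measured
    print(f"+{off_a}mV -> +{off_b}mV: {delivered / requested:.0%} delivered")
```

It prints roughly 94% delivered for the +100 to +150 step and 72% for +150 to +200: the regulator keeps delivering most of what's requested (no hard TDP wall), and the diminishing returns past +150mV show up in the voltage itself as well as in the clocks.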


----------



## diggiddi

So what do you use to Overclock your GPU's?


----------



## DizzlePro

MSI Afterburner, Sapphire Trixx, Asus GPU Tweak


----------



## diggiddi

Quote:


> Originally Posted by *DizzlePro*
> 
> Msi AfterBurner, Sapphire TrixX, Asus Gpu tweak


Yeah, but how do you test for stability?


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> Yeah but how do you test for stability


About 3 loops of Heaven is all I need to verify the GPU OC, then about an hour or more of Crysis 3, and if it's still going, I call it stable....


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> About 3 loops of Heaven is all I need to verify the GPU OC, then about an hour or more of Crysis 3 and if it's going, I call it stable....


I see. I used the guide below; it has TechPowerUp's GPU tool, which uses FurMark.

http://www.overclock.net/t/633816/how-to-overclock-your-amd-ati-gpu


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> I see. I used the guide below; it has TechPowerUp's GPU tool, which uses FurMark.
> 
> http://www.overclock.net/t/633816/how-to-overclock-your-amd-ati-gpu


That definitely works, I just find it to be overkill.

It puts the card under a stress condition that can't be achieved by any games.


----------



## jon666

I like to run through Valley, Heaven, and the Metro 2033/Last Light benchmarks for stability. If it can pass three runs of the Metro benchmarks, then I can potentially play The Witcher 3 for a few hours with everything maxed out before black screening.


----------



## jon666

Is anyone running into issues with temperature readings? Most benchmarks tell me the GPU is running in the thousands in Celsius, and GPU-Z has given me a high of 146C. The card doesn't seem to be running that warm. Highest VRM temps are 62C...


----------



## Gumbi

Quote:


> Originally Posted by *jon666*
> 
> Is anyone running into issues with temperature readings? Most benchmarks tell me GPU is running in the thousands in Celsius, and GPU-Z has given me a high of 146 C. Card doesn't seem to be running that warm. Highest VRM temps are 62 C...


Ignore the 146-degree spike in GPU-Z and the crazy numbers in Valley; I get the same on my 290 Vapor-X.


----------



## Motley01

Did you guys see that HardOCP just posted a review of the MSI 390X vs the Asus Strix Fury?

Wow, I'm glad I got the 390X instead of the Fury. They seem to perform the same, yet the Fury costs $120 more.

http://www.hardocp.com/article/2015/07/13/msi_r9_390x_gaming_vs_asus_strix_fury_review#.VaRHzvlVhBe


----------



## Clockster

Quote:


> Originally Posted by *Motley01*
> 
> Did you guys see that HardOCP just posted a review of the MSI 390X vs the Asus Strix Fury?
> 
> Wow, I'm glad I got the 390X instead of the Fury. They seem to perform the same, yet the Fury costs $120 more.
> 
> http://www.hardocp.com/article/2015/07/13/msi_r9_390x_gaming_vs_asus_strix_fury_review#.VaRHzvlVhBe


lol, yeah, in 1 game the R9 390X keeps up with the Fury... most of the time the Fury is faster, bud.
Also, no 4K benchmarks? I wonder why.


----------



## th3illusiveman

Quote:


> Originally Posted by *Motley01*
> 
> Did you guys see that HardOCP just posted a review of the MSI 390X vs the Asus Strix Fury?
> 
> Wow, I'm glad I got the 390X instead of the Fury. They seem to perform the same, yet the Fury costs $120 more.
> 
> http://www.hardocp.com/article/2015/07/13/msi_r9_390x_gaming_vs_asus_strix_fury_review#.VaRHzvlVhBe


Those guys have no idea what they are doing. They are the only website where a 390X is equal to a Fury (non-X). I don't trust their reviews; they seem very incompetent (they couldn't even overclock the 290X cards).

I still think all of AMD's cards need price drops (the Fury non-X should be $450 if they really want to sell them), but that review is garbage.


----------



## neurotix

I would second Agent Smith's recommendation *NOT* to use Furmark.

Furmark puts your card(s) under ridiculously unrealistic load, and it's not uncommon for people to fry them, though this happened more often back in the day when the cards were much poorer quality and had bad power delivery designs.

I agree, running Heaven or Valley for a few loops and watching for temps (above 80C is generally bad for Hawaii and most AMD cards) as well as artifacts is the fastest way to test stability. Additionally, OCCT GPU test with max shader complexity for a few minutes is a great test, as it will literally tell you if it finds errors, and if it does, the OC is unstable. This test seems like it may be based on Furmark but I'm unsure, in actual practice my cards have never gotten anywhere near as hot with OCCT as they do with Furmark.

Of course, the recommendation to play a demanding game for an hour or two is the best one. This is the only way to know for sure that you're stable, putting the card under a load it will realistically be doing most of the time....using it for what you were planning to use it for. It's even better if you play more than one game, as a card may seem stable in one but artifact or crash in another.

I would also recommend trying out Folding@home for a couple of work units (should take about a day); this is a most excellent test of your overclock, because the work units WILL fail and give you an error if your overclock is bad. They even have a GPU memory error tester you can use to test your memory OC. (This may be hard to find now, though.)

Personally, I use Sapphire Trixx to overclock. In a multi-card setup, Trixx actually allows you to set separate clocks for individual cards through a drop-down menu. I have not figured out how to achieve the same thing using Afterburner or anything else.


----------



## diggiddi

CCC also allows you to select speeds for individual cards in the overdrive menu


----------



## neurotix

But, it doesn't have voltage control, does it?


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> I would second Agent Smith's recommendation *NOT* to use Furmark.
> 
> Furmark puts your card(s) under ridiculously unrealistic load, and it's not uncommon for people to fry them, though this happened more often back in the day when the cards were much poorer quality and had bad power delivery designs.
> 
> I agree, running Heaven or Valley for a few loops and watching for temps (above 80C is generally bad for Hawaii and most AMD cards) as well as artifacts is the fastest way to test stability. Additionally, OCCT GPU test with max shader complexity for a few minutes is a great test, as it will literally tell you if it finds errors, and if it does, the OC is unstable. This test seems like it may be based on Furmark but I'm unsure, in actual practice my cards have never gotten anywhere near as hot with OCCT as they do with Furmark.
> 
> Of course, the recommendation to play a demanding game for an hour or two is the best one. This is the only way to know for sure that you're stable, putting the card under a load it will realistically be doing most of the time....using it for what you were planning to use it for. It's even better if you play more than one game, as a card may seem stable in one but artifact or crash in another.
> 
> I would also recommend trying out Folding@home for a couple of work units (should take about a day); this is a most excellent test of your overclock, because the work units WILL fail and give you an error if your overclock is bad. They even have a GPU memory error tester you can use to test your memory OC. (This may be hard to find now, though.)
> 
> Personally, I use Sapphire Trixx to overclock. In a multi-card setup, Trixx actually allows you to set separate clocks for individual cards through a drop-down menu. I have not figured out how to achieve the same thing using Afterburner or anything else.


See post #9 of this thread for opinions on OCCT
http://www.overclock.net/t/1171172/so-how-reliable-is-occt-gpu-stress-test

Quote:


> Originally Posted by *neurotix*
> 
> But, it doesn't have voltage control, does it?


Yes, you are right, no voltage control


----------



## jon666

I thought in xfire, both cards ran at the clocks of the lowest-clocked card anyway? Did that change when AMD did away with bridges?


----------



## neurotix

Quote:


> Originally Posted by *diggiddi*
> 
> See this thread post #9 for opinions of OCCT
> http://www.overclock.net/t/1171172/so-how-reliable-is-occt-gpu-stress-test
> Yes you are right no voltage control


I don't know how anyone could kill a card with OCCT; it's similar to Furmark but different, and none of the cards I've used it with have gone over about 55C with 100% fan while testing them. I doubt it could kill a card. Additionally, I *did* recommend heavy gaming too.
Quote:


> Originally Posted by *jon666*
> 
> I thought in xfire, both cards ran at the lowest clocked card anyways? Did that change when AMD did away with bridges?


I don't really know because I've only had Crossfire with 290s.

However, from what I recall, when @Devildog83 had his 270X/7870 Crossfire (which uses a bridge) I believe he was able to run different clocks on each card.

What you *might* be thinking about is that, from what I know, if you Crossfire two GPUs of the same type together, but one is weaker than the other (example: 7870 and 7850), the higher-specced card will have some of its CUs disabled when in Crossfire. I think. This means it will have fewer shaders available.


----------



## sidfirex

Quote:


> Originally Posted by *neurotix*
> 
> I don't know how anyone could kill a card with OCCT, it's similar to Furmark but different, and none of the cards I've used it with have gone over about 55C with 100% fan while testing them. I doubt it could kill a card. Additionally, I *did* recommend heavy gaming too.
> I don't really know because I've only had Crossfire with 290s.
> 
> However, from what I recall, when @Devildog83 had his 270X/7870 Crossfire (which uses a bridge) I believe he was able to run different clocks on each card.
> 
> What you *might* be thinking about, is that from what I know, if you Crossfire two GPUs of the same type together, but one is weaker than the other (example: 7870 and 7850), the higher specced card will have some of it's CUs disabled when in Crossfire. I think. This means it will have less shaders available.


Hi, have you tried running OCCT and monitoring your VRM temp (not the GPU temp in general)?

My Sapphire 390X goes well past 100°C in just a few minutes (100% fans on the Tri-X cooler)


----------



## neurotix

Quote:


> Originally Posted by *sidfirex*
> 
> Hi Have you tried running OCCT and monitoring your VRM temp? (not the gpu temp in general)
> 
> My 390x saphhire goes well upto 100*C and more in just a few minutes - (100% fans on the tri x cooler)


Yep!

Here's the result of a 5 min OCCT run with 100% fan on my 290 Vapor-X:



I run Eyefinity. I also run OCCT in windowed mode.

(It's a large image so right click it, then "open in new tab", then finally click on it to zoom in.)

You can see the relevant stats in my AIDA64 sidebar gadget, such as GPU 57C and VRM1 46C.







(The GPU1 clock is 300MHz because the clocks jump around during the test, but for the most part they were 1100MHz, per my overclock.)

The results with my 7970 Vapor-X and 270X Vapor-X have been similar. VRM1 has never gone above 50C on any of them.

In comparison, Furmark makes my card and my VRMs get WAY WAY hotter a lot faster.

It's very strange that with the Tri-X, you are getting such high temps.

Maybe you're "doing it wrong" or you have poor case airflow?

Perhaps other people with bad temps just have bad temps in general because they bought bad cards from bad vendors? I dunno.


----------



## Ha-Nocri

Quote:


> Originally Posted by *th3illusiveman*
> 
> Those guys have no idea what they are doing. They are the only website where a 390X is equal to a Fury (non-X). I don't trust their reviews, they seem very incompetent (they couldn't even overclock the 290X cards).
> 
> I still think all AMDs cards need price drops (Fury non X should be $450 if they really wanna sell them) but that review is garbage.


I never read their reviews b/c my eyes hurt from reading white letters on a black background. Who thought it was a good design, rly?! Now I have another reason not to read them.


----------



## sidfirex

Quote:


> Originally Posted by *neurotix*
> 
> Yep!
> 
> Here's the result of a 5 min OCCT run with 100% fan on my 290 Vapor-X:
> 
> 
> 
> I run Eyefinity. I also run OCCT in Window mode.
> 
> (It's a large image so right click it, then "open in new tab", then finally click on it to zoom in.)
> 
> You can see the relevant stats in my AIDA64 sidebar gadget, such as GPU 57C and VRM1 46C.
> 
> 
> 
> 
> 
> 
> 
> (The GPU1 clock is 300mhz because the clocks jump around during the test, but for the most part they were 1100mhz, per my overclock.)
> 
> The results with my 7970 Vapor-X, and 270X Vapor-X have been similar. VRM1 has never gone above 50C on any of them.
> 
> In comparison, Furmark makes my card and my VRMs get WAY WAY hotter a lot faster.
> 
> It's very strange that with the Tri-X, you are getting such high temps.
> 
> Maybe you're "doing it wrong" or you have poor case airflow?
> 
> Perhaps other people with bad temps just have bad temps in general because they bought bad cards from bad vendors? I dunno.


This is something of a concern. I have good airflow, so maybe these VRM1 temps are specific to 390X cards???

Edit: I run Eyefinity too 

Just checked your screenshot, BTW: you have your FPS limit set to 60. Try entering 0 there (unlimited FPS).

At unlimited FPS with 1150/1650 clocks, I was getting 275 FPS; the GPU temp stays at 65 degrees, but the VRM1 temp shoots up to 100 degrees.

And I ran my test at 1440p


----------



## specopsFI

Quote:


> Originally Posted by *neurotix*
> 
> Yep!
> 
> Here's the result of a 5 min OCCT run with 100% fan on my 290 Vapor-X:
> 
> 
> 
> I run Eyefinity. I also run OCCT in Window mode.
> 
> (It's a large image so right click it, then "open in new tab", then finally click on it to zoom in.)
> 
> You can see the relevant stats in my AIDA64 sidebar gadget, such as GPU 57C and VRM1 46C.
> 
> 
> 
> 
> 
> 
> 
> (The GPU1 clock is 300mhz because the clocks jump around during the test, but for the most part they were 1100mhz, per my overclock.)
> 
> The results with my 7970 Vapor-X, and 270X Vapor-X have been similar. VRM1 has never gone above 50C on any of them.
> 
> In comparison, Furmark makes my card and my VRMs get WAY WAY hotter a lot faster.
> 
> It's very strange that with the Tri-X, you are getting such high temps.
> 
> Maybe you're "doing it wrong" or you have poor case airflow?
> 
> Perhaps other people with bad temps just have bad temps in general because they bought bad cards from bad vendors? I dunno.


You ran it wrong then (not that there is a right way to run OCCT; it shouldn't be run at all, IMHO). No wonder you're not getting the temps up if you only run it locked at 60 FPS. Also, if my memory serves me right, shader complexity goes up to 9, whereas you have it at 7?

OCCT is just as bad as Furmark. Both are also completely irrelevant for gaming stability.


----------



## Agent Smith1984

No disrespect to anyone over at HardOCP, but from time to time I do find myself skimming their reviews for the apples-to-apples benches, and that's it.

Here's what I will say....

I like what they were trying to show with the review. And maybe the focus was to show people looking for mainstream resolutions that their money is still best spent with a mainstream card offering. Seems to be a very "casual comparison" between the two cards.

I would have also liked to see some overclocking come into play in that review, to see how the performance scales. I take their OC results with a grain of salt though, as the same reviewers were convinced a few weeks ago that the 390's board was power limited at +50mV when it clearly is not....

I think it's good to see the direct comparison made in that review though. HardOCP thoroughly tests the game-play itself, so I see no need to dispute those results.

Bottom line is; what's happening in that review is very simple....

At 1440p (the exact resolution targeted by the 390X), the 390X is a much better value and performs very admirably compared to the Fury... the reason (my guess, anyway)?

The 390X's 64 ROPs versus the 56 of the Fury.
At 4K, the advantage will shift back to the shaders, and you will see the Fury pull ahead. The HBM will also begin to make some noise at 4K.

Truth is, if you play at 1080 or 1440, the Fury is not a great value. If you play at 4K, a single Fury won't be a turnkey solution by itself anyway, so you may still be better served by pairing some 390s...

Speaking of which, if HardOCP wanted to make a value argument in that review, the ideal thing would have been to test the 390 against the Fury, then OC them both, test again, and see how close things really are. I'd bet it's not a $230 difference.... I bet it's not even a $130 difference......
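As a hypothetical perf-per-dollar sketch (using the rough launch MSRPs of $329 for the 390 and $549 for the Fury; the ~10% performance gap at 1440p is an assumption for illustration, not a measurement):

```python
# Hypothetical value comparison. Prices are approximate launch MSRPs;
# the normalized performance numbers are ASSUMED for illustration only.
price_390, price_fury = 329, 549
perf_390, perf_fury = 100, 110  # normalized 1440p performance (assumed)

ppd_390 = perf_390 / price_390
ppd_fury = perf_fury / price_fury
print(f"390:  {ppd_390:.3f} perf/$")   # ~0.304
print(f"Fury: {ppd_fury:.3f} perf/$")  # ~0.200
```

Even if the Fury were 10% faster at 1440p, the 390 would still deliver roughly 50% more performance per dollar at those prices.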


----------



## th3illusiveman

Fury is not a good value at all, and neither is the GTX 980. They are almost $100 more than they are worth. Also, it has 64 ROPs. Everything in that card except the ROPs is significantly higher than what you will find in a 290X/390X, which is why their review is crap.

speaking of 4K ....







i need to go out more


----------



## Agent Smith1984

Quote:


> Originally Posted by *th3illusiveman*
> 
> Fury is not a good value at all, neither is the GTX 980. They are almost $100 more then they are worth. Also it has 64 ROPS. everything in that card except the ROPs is significantly higher then what you will find in a 290X/390X which is why their review is crap.
> 
> speaking of 4K ....
> 
> 
> 
> 
> 
> 
> 
> i need to go out more


The Fury "pro" uses 56 ROPs and 3584 shaders

Disregard, both cards use 64 ROPs

With a Fury at stock core clocks, and a 390X Gaming at 1100MHz core, none of the results in that review should surprise anybody.

At lower res, the performance is VERY close, and other reviews at 1080P/1440P show similar results to this article. There is nothing about the Fury pro that is dramatically faster than the 390X until you get into 4K territory.


----------



## Ha-Nocri

Fury has 64 ROPs too


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Fury has 64 ROP's too


Corrected, sorry

Still, the ROP count being the same is the obvious limitation. The shaders go to work at higher resolutions, and that's where Fiji pulls away. The problem is, looking at how close Fiji Pro is to the XT, you can again see the bottleneck caused by using 64 ROPs.


----------



## th3illusiveman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The Fury "pro" uses 56 ROP's and 3584 shaders
> 
> Disregard, both cards use 64 ROP's
> 
> With a Fury at stock core clocks, and a 390X Gaming at 1100MHz core, none of the results in that review should surprise anybody.
> 
> At lower res, the performance is VERY close, and other reviews at 1080P/1440P show similar results to this article. There is nothing about the Fury pro that is dramatically faster than the 390X until you get into 4K territory.


It takes about 50 MHz for a 290 to match a 290X in performance; the 290X has 10% more cores and 10% more TMUs than the 290.

The Fury has 27% more shaders, 27% more TMUs, and much higher bandwidth than the 390X, so there should be no scenario at any resolution where they are equal in performance, even with the 50 MHz core clock advantage.
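For what it's worth, those percentages do check out against the public reference spec sheets (illustrative arithmetic only, not a performance model):

```python
# Public reference shader/TMU counts for the cards being compared.
specs = {
    "R9 290":  {"shaders": 2560, "tmus": 160},
    "R9 290X": {"shaders": 2816, "tmus": 176},
    "R9 390X": {"shaders": 2816, "tmus": 176},  # same Hawaii/Grenada config as 290X
    "R9 Fury": {"shaders": 3584, "tmus": 224},
}

def pct_more(a, b, key):
    """Percent advantage of card a over card b for a given unit count."""
    return round((specs[a][key] / specs[b][key] - 1) * 100, 1)

print(pct_more("R9 290X", "R9 290", "shaders"))   # 10.0
print(pct_more("R9 Fury", "R9 390X", "shaders"))  # 27.3
print(pct_more("R9 Fury", "R9 390X", "tmus"))     # 27.3
```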


----------



## Agent Smith1984

Random benchies

My best FireStrike
http://www.3dmark.com/fs/5292646

My best 3DMark 11
http://www.3dmark.com/3dm11/10045030

Overall scores are kinda low due to CPU limitations, but the graphics scores are pretty nice....

Those are ZERO artifact runs.


----------



## Agent Smith1984

Quote:


> Originally Posted by *th3illusiveman*
> 
> It takes about 50 Mhz for a 290 to match a 290X in performance. the 290x has 10% more cores and 10% more TMUs then the 290.
> 
> the Fury has 27% more shaders, 27% more TMUs and much faster bandwidth then the 390X so there should be no scenario at any resolution where they are equal in performance even with the 50 Mhz core clock advantage.


On paper you are correct....

But look how close they are in 1080 and in 1440....
http://www.tweaktown.com/reviews/7241/sapphire-tri-radeon-r9-fury-video-card-review-hbm-water-cooler/index6.html

Every Fury review shows this.... I don't quite understand it either. So I can only chalk it up to the limitations of the ROP count.
It would explain why the NVIDIA cards with 96 ROPs absolutely dominate the lower resolutions....

Edit:

Even more..
http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/15.html

The TPU review has EXTENSIVE game coverage, and it clearly shows the Fury is a pubic hair faster at 1080 and 1440.
The 390X is literally the next-fastest GPU solution after the Fury.

Based on that review, I'd say the 390 series is the best value in PC gaming right now, period (when buying a new card, since the 290 is no longer available).....
Call me biased?
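To put rough numbers on the ROP argument: theoretical peak pixel fillrate is just ROPs × core clock, so the 64-ROP Fiji and Grenada chips top out in nearly the same place at similar clocks, while a 96-ROP part pulls well ahead. A quick sketch (the clocks below are approximate, for illustration, not measured boost clocks):

```python
# Theoretical peak pixel fillrate = ROPs * core clock.
# Fury (Fiji) and the 390X (Grenada) both have 64 ROPs, so at
# comparable clocks their peak fillrates nearly match; a 96-ROP
# NVIDIA part has far more headroom at ROP-bound (low) resolutions.
cards = {
    "R9 390X @ 1100 MHz": (64, 1100),
    "R9 Fury @ 1000 MHz": (64, 1000),
    "GTX 980 Ti @ 1000 MHz": (96, 1000),
}

for name, (rops, mhz) in cards.items():
    gpix_per_s = rops * mhz / 1000.0  # Gpixels/s
    print(f"{name}: {gpix_per_s:.1f} Gpix/s")
```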


----------



## Duke976

Any news on the waterblock for the MSI 390? I've checked EK, and only the Asus 390 has a full waterblock, since it is compatible with the 290. I am getting worried about the lack of waterblocks for the MSI 390s.


----------



## gatygun

Would it not be better to fuse this and the 290 topic together?
Quote:


> Originally Posted by *Agent Smith1984*
> 
> On paper you are correct....
> 
> But look how close they are in 1080 and in 1440....
> http://www.tweaktown.com/reviews/7241/sapphire-tri-radeon-r9-fury-video-card-review-hbm-water-cooler/index6.html
> 
> Every Fury review shows this.... I don't quite understand it either. So I can only chock it up to the limitations of the ROP count.
> It would explain why the NVIDIA cards with 96 ROPS absolutely dominate the lower resolutions....
> 
> Edit:
> 
> Even more..
> http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/15.html
> 
> TPU review is EXTENSIVE game coverage, and it clearly shows the Fury is a pubic hair faster at 1080 and 1440.
> 390X is literally the next fastest GPU solution to the Fury.
> 
> Based on that review, I'd say the 390 series are the best value in PC gaming period right now (when buying a new card, and 290 is no longer available).....
> Call me bias?


That's really bad for the Fury. God damn, that 290 I bought is going to age well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Any news on the waterblock for MSI 390? I've check EK and only the Asus 390 has the full waterblock since it is compatible with the 290. I am getting worried with this situation with the lack of waterblocks for the MSI 390's.


I don't think they are going to produce full cover blocks.

EK normally produces blocks for top level cards, and with the 390 series being a mainstream card, I doubt they will make any.
Anyone wanting to go full cover water probably needs to purchase a 390 that is compatible with one of the 290 blocks that they manufacture.

I don't believe AMD's partners made any assumptions that block manufacturers would even pursue making blocks for these cards, nor did they figure for any current 290 owners migrating to these cards, so they saw no need to make the boards compatible with old blocks. At least that's my theory...

It would be nice to see some blocks for these though, since currently, unless you have one of the 300 series using the same PCB layout as the manufacturer used on their respective 200 series card, you don't really have any option but to use a GPU core only water solution...

Then again, maybe popularity will drive manufacturers to produce some blocks.
Remember, we only have 20 +/- members in this club, so these obviously aren't selling in huge quantities.
If that picks up, then maybe EK or someone else will see that there is enough of a market base to justify designing different blocks.








Quote:


> Originally Posted by *gatygun*
> 
> Would it not be better to fuse this and the 290 topic together
> That's really bad on the fury, god dam that 290 i bought is going to age well.


I would say no, since the cards do have their differences.
We also have separate threads for the 7-series cards and for the 2** cards....
The cards use the same architecture, but there are enough differences to warrant another thread.

For example, when discussing the overclocking of memory on our 300 series cards, we will have little to nothing in common with 290 owners.
Also, cooling availability will be a much different topic as well (as seen above).
Power usage and needs will be different in many cases also.

You are right about your 290 though. They are great cards, and should age very well


----------



## By-Tor

I think EK has full cover blocks for Powercolor, XFX and Asus 390/390x cards when I was looking to go for a 390, but ended up just buying a second Powercolor 290x LCS card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> I think EK has full cover blocks for Powercolor, XFX and Asus 390/390x cards when I was looking to go for a 390, but ended up just buying a second Powercolor 290x LCS card.


Great info!!!

Thanks By-Tor

BTW... SHOUT OUT TO MY CACCILLAC PEEEPS!!!!


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Great info!!!
> 
> Thanks By-Tor
> 
> BTW... SHOUT OUT TO MY CACCILLAC PEEEPS!!!!


You're not that far from me... maybe 3 hours... just north of Fayettenam..

I was stationed at Pope back in the mid '80s, but I was always flying someplace else in the world and didn't spend much time there.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Your not that far from me...Maybe 3 hours.. Just north of Fayettenam..
> 
> I was stationed at Pope back in the mid 80's, but was always flying someplace else in the world all the time and didn't spend much time there.


Yeah, I figured you were military judging by the avatar.... or in your case, the avia-tar









I'm about 30 minutes north of Fayettnam.

Thank you for your service BTW


----------



## By-Tor

If you ever come down to this part of the coast drop me a PM and maybe setup a meet..


----------



## Agent Smith1984

UPDATE FROM EK:

This is referring to questions asked about a full cover block for the MSI 390/390X Gaming
Quote:


> Originally Posted by *EK_tiborrr*
> 
> There are no plans for such water block, please see the link above.


If you want to use full cover water cooling, you will not want to purchase an MSI card.

I will post this in the cooling section of the OP.

Thanks


----------



## xhitekredneckx

1. PROOF - http://www.techpowerup.com/gpuz/details.php?id=d7baw
2. XFX - AMD Radeon R9 390 8GB
3. Cooling - Stock

Have not started OCing this yet, but getting ready to. Reading everyone's posts to see what has been achieved already.


----------



## Gumbi

Quote:


> Originally Posted by *xhitekredneckx*
> 
> 1. PROOF - http://www.techpowerup.com/gpuz/details.php?id=d7baw
> 2. XFX - AMD Radeon R9 390 8GB
> 3. Cooling - Stock
> 
> Have not started OC this yet, but getting ready to. Reading everyone's posts to see what has been achieved already.


Let us know how the VRM cooling is on them







It was passable at best on the XFX 290s.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xhitekredneckx*
> 
> 1. PROOF - http://www.techpowerup.com/gpuz/details.php?id=d7baw
> 2. XFX - AMD Radeon R9 390 8GB
> 3. Cooling - Stock
> 
> Have not started OC this yet, but getting ready to. Reading everyone's posts to see what has been achieved already.


Added, and welcome


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I don't think they are going to produce full cover blocks.
> 
> EK normally produces blocks for top level cards, and with the 390 series being a mainstream card, I doubt they will make any.
> Anyone wanting to go full cover water probably needs to purchase a 390 that is compatible with one of the 290 blocks that they manufacture.
> 
> I don't believe AMD's partners made any assumptions that block manufacturers would even pursue making blocks for these cards, nor did they figure for any current 290 owners migrating to these cards, so they saw no need to make the boards compatible with old blocks. At least that's my theory...
> 
> It would be nice to see some blocks for these though, since currently, unless you have one of the 300 series using the same PCB layout as the manufacturer used on their respective 200 series card, you don't really have any option but to use a GPU core only water solution...
> 
> Then again, maybe popularity will drive manufacturers to producing some blocks.
> Remember, we only have 20+/- members in this club, so these obviously aren't selling in huge quantities.
> If that picks up, then maybe EK or someone else will see that there is enough of a market base to constitute designing different blocks.
> 
> 
> 
> 
> 
> 
> 
> 
> I would say no, since the cards do have their differences.
> We also have two threads for 7 series cards, and for the 2** cards....
> The cards use the same architecture, but there are differences enough to constitute another thread.
> 
> For example, when discussing the overclocking of memory on our 300 series cards, we will have little to nothing in common with 290 owners.
> Also, cooling availibility will be a much different topic as well (as seen above).
> Power usage and needs will be different in many cases also.
> 
> You are right about your 290 though. They are great cards, and should age very well


Ah that's true tho.


----------



## DividebyZERO

I am a little late on the waterclock stuff, as I did ask a rep here about the 290x/290 full cover clocks vs the 390x and what I saw on the website.
I was hoping to re-use my 290x full copper blocks, but it appears for most of the 390/390x it won't work. I think the closest matching PCB was the XFX DD 390x. On the EK website it says it's possible the 290x FC2.0 might fit, but it was a bit confusing for me. Anyways, hope the below helps.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> I am a little late on the waterclock stuff as i did ask a rep here about the 290x/290 full cover clocks vs 390x and what i saw on the webstie.
> I was hoping to re-use my 290x full copper blocks, but it appears for most of the 390/390x it won't work. I think the closest matching PCB was the XFX DD 390x. On the EK website it says its possible the 290x FC2.0 might fit but it was a bit confusing for me. anways hope the below helps.


Thanks, also confirmed this earlier myself in another thread regarding the MSI 390's...

EK tiborr did in fact clarify that NO NEW BLOCKS WILL BE MANUFACTURED

I have updated the OP with this....

So far XFX, PCS+, and Asus DCU are compatible....

What 290X's are you running?


----------



## jon666

What makes them incompatible? Can this be fixed by pulling the pre-applied thermal pads, and applying new to the correct spots?


----------



## Agent Smith1984

Quote:


> Originally Posted by *jon666*
> 
> What makes them incompatible? Can this be fixed by pulling the pre-applied thermal pads, and applying new to the correct spots?


The mounting hole alignment will also vary on some boards...

Remember though: "Where there is a will, there is a way"

If you REALLY wanted to mount a full cover block on any of these cards, it can be done, but the madness it may bring may not be worth it....
People like plug and play these days, and when dealing with water passing over live electronic components, that is especially the case!!


----------



## DividebyZERO

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks, also confirmed this earlier myself in another thread regarding the MSI 390's...
> 
> EK tiborr did in fact clarify that NO NEW BLOCKS WILL BE MANUFACTURED
> 
> I have updated the OP with this....
> 
> So far XFX, PCS+, and Asus DCU are compatible....
> 
> What 290X's are you running?


LOL, I just realized I spelled waterblocks wrong in my post


----------



## Gumbi

There are PowerColor 390s available on Newegg now; gonna look for some reviews of them. They had a great 290 card, and some reviews noted power savings in their 290s over other 290s (maybe it was down to low stock voltage, but I never looked far enough into it). They could also do 1150-1200MHz core.

Anyone have a Powercolor 390?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> There are Powercolor 390s available on Newegg now, gonna look for some reviews now of them. They had a great 290 card, and some reviews noted power savings in their 290s over other 290s (maybe ot was doen to low stock voltage but I never looked far enoigh into it). They also could do 1150-1200mhz core.
> 
> Anyone have a Powercolor 390?


Nope, no powercolor owners reported yet.


----------



## Duke976

Since there will be no full waterblock and I wanted to put these cards under water, I will have to exchange these two and pay more for the Asus. I like the build quality of MSI, but since the PCI-e layout of my board is atrocious for x-fire air cooling, there is really no choice but to get the Asus cards and water-cool both.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Since their will be no full waterblock and I wanted to put these cards in water, I will have to exchange these two and pay more for the asus. I like the build quality of MSI but since the the PCI-e layout of my board is atrocious for x-fire aircooling, their is really no choice but to get an asus and wc both.


If you have any issues returning them, let me know, I am interested in buying a second card









Also, I would suggest getting the cheapest card you can find between the XFX, the Asus and the Powercolor.

As far as those three, I feel like the PowerColor cards were some of the best 290s made. That acclaim is primarily for their stock cooling, which won't apply to you when water cooling, but also for their track record as well-performing, decent-clocking cards.

The XFX cards seem to be really good too.

Asus has left a bad impression with their last 290's, but that's mainly because they thought it would be cute to slap a 780 cooler on a 290, and expect it to cool, which it did not.
Their newest Strix 390x is pretty sexy, but not sure if the board is compatible with the 290 blocks the same way the DCU board is.


----------



## By-Tor

For me Powercolor is the only brand I would buy. I have had a Powercolor 7750, 7850, 2x 7950's and now a pair of 290x LCS water cooled cards and have always been happy with them. Had to RMA one of my 7950's and was very pleased with the fast turn around service and updates I was given.

My first 7850 was an Asus and when I found out it was voltage locked I sold it for a powercolor card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> For me Powercolor is the only brand I would buy. I have had a Powercolor 7750, 7850, 2x 7950's and now a pair of 290x LCS water cooled cards and have always been happy with them. Had to RMA one of my 7950's and was very pleased with the fast turn around service and updates I was given.
> 
> My first 7850 was an Asus and when I found out it was voltage locked I sold it for a powercolor card.


I hear a lot of good things about PowerColor these days....

That blows my mind, because back in the early days, PowerColor had the cheapest cards and were often touted as junk.
Not sure what changed within the company, but they are making cards of as high a quality as, if not better than, anyone out there right now.

Very good to see that ALL of AMD's partners have a pretty good reputation (though Asus has declined through the years)....

However, I must say, that I have never dealt with a company as professional, as helpful, and as fast as EVGA.

During the 7900GT disaster back in 05 or 06, EVGA was overnighting cards across the country with a return label for the bad card.

I went through 4 before getting a good one.

Think about that.....

I bought 1 card for around $350....

They spent most likely $30-50 per shipment (depending on their UPS discounts) to get me a new card the next day, without asking me for anything. And also prepaid the return freight.
And they did it 4 times, just to make things right, and keep their reputation.

I don't know of any partner on the red or green side who would do that....


----------



## Gumbi

That kind of service would have me as a customer for life. I've been pretty lucky with components, TBH. I've only ever had an ASRock mobo go on me.

And I've been building/buying for over 5 years.


----------



## By-Tor

I have heard a lot of good things about EVGA, but since they only make Nvidia cards I will never have an EVGA video card.

Call me a fan boy, I'm good with that...

Gumbi,

I built an ITX rig for my wife and used an ASRock MB for the first time and was very impressed with it.


----------



## jon666

Ah yes, The Superclocked GTX 460. That thing dumped out heat. Now I own an R9 390 which dumps out heat.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> I have heard a lot of good things about EVGA, but since they only make Nvidia cards I will never have an EVGA video card.
> 
> Call me a fan boy, I'm good with that...
> 
> Gumbi,
> 
> I built a ITX rig for my wife and used an Asrock MB for the first time and was very impressed with it.


I'm with you man, there is a reason why my last EVGA card was in 05.... I switched to AMD and never looked back.....
The idea was that I wouldn't need that level of customer service, because I wouldn't have any problems


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> *If you have any issues returning them, let me know, I am interested in buying a second card
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Also, I would suggest getting the cheapest card you can find between the XFX, the Asus and the Powercolor.
> 
> As far as those three, I feel like the Powercolor cards were one of the best 290s they made. That acclaim is primarily for their stock cooling, which won't apply to you when water cooling, but also for their track record as well performing/decent clocking card.
> 
> The XFX cards seem to be really good too.
> 
> Asus has left a bad impression with their last 290's, but that's mainly because they thought it would be cute to slap a 780 cooler on a 290, and expect it to cool, which it did not.
> Their newest Strix 390x is pretty sexy, but not sure if the board is compatible with the 290 blocks the same way the DCU board is.


Will definitely let you know. MC is good with returns, and since I only purchased these cards 2 weeks ago tops, I am sure they will exchange them for their Asus 390 cards.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I hear a lot of good things about PowerColor these days....
> 
> That blows my mind, because back in the early days, PowerColor had the cheapest cards, and were often touted as junk.
> Not sure what changed within the company, but they are making as high of quality, if not better, than anyone out there right now.
> 
> Very good to see that ALL of AMD's partners have a pretty good reputation (though Asus has declined through the years)....
> 
> *However, I must say, that I have never dealt with a company as professional, as helpful, and as fast as EVGA.*
> 
> During the 7900GT disaster back in 05 or 06, EVGA was overnighting cards across the country with a return label for the bad card.
> 
> I went through 4 before getting a good one.
> 
> Think about that.....
> 
> I bought 1 card for around $350....
> 
> They spent most likely $30-50 per shipment (depending on their UPS discounts) to get me a new card the next day, without asking me for anything. And also prepaid the return freight.
> And they did it 4 times, just to make things right, and keep their reputation.
> 
> I don't know of any partner on the red or green side who would do that....


I couldn't agree more, prior to me having these 390 cards I used to have EVGA 670 FTW in SLI. I encountered problems with those cards last month, I was literally 2 weeks past my 3 year warranty and yet EVGA gave me a 1 time courtesy and replace both my cards with the upgraded GTX 770 Dual w/ EVGA ACX Cooler. They have a customer for life in me with their excellent after sales customer service.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Will definitely let you know if ever, MC is good with return and since I have only purchased these cards 2 weeks tops I am sure they will exchange it with their asus 390 cards.
> I couldn't agree more, prior to me having these 390 cards I used to have EVGA 670 FTW in SLI. I encountered problems with those cards last month, I was literally 2 weeks past my 3 year warranty and yet EVGA gave me a 1 time courtesy and replace both my cards with the upgraded GTX 770 Dual w/ EVGA ACX Cooler. They have a customer for life in me with their excellent after sales customer service.


That's awesome.

If I do decide to try anything out from the green team, it would undoubtedly be an EVGA.

Which leads me to my next questions for EVERYONE....

How have your experiences been with AMD's partners when needing to RMA a product?

This Gaming 390 is my first MSI GPU (though I used to exclusively use MSI motherboards back in the mid 2000's)

How have the vendors treated you in regards to cards with TIM replaced, overclocked, etc...?


----------



## gatygun

I never send stuff directly to the maker of the card; it's always the shop I buy from, and they handle the warranty.

Some shops here give a 5-year warranty for a 17 euro premium, for example. If your hardware breaks in year 4, they either give you a replacement product in the same league as your old model or just flat out pay back the full purchase price. I've never had issues with warranties, and it doesn't matter whose product I buy through a shop like that.


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's awesome.
> 
> If I do decide to try anything out from the green team, it undoubtedly be an EVGA.
> 
> Which leads me to my next questions for EVERYONE....
> *
> How has your experiences been with AMD's partners when needing to RMA a product?*
> 
> This Gaming 390 is my first MSI GPU (though I used to exclusively use MSI motherboards back in the mid 2000's)
> 
> How have the vendors treated you in regards to cards with TIM replaced, overclocked, etc...?


My experience with Sapphire has been good; I once had to RMA my X1950 AGP card back then. Sapphire told me to contact Newegg since I purchased the unit from them, and Newegg replaced my card. If my recollection is correct, Sapphire lets the originating store replace defective items. I do not know if that is still true today.


----------



## Agent Smith1984

Oh no....

Someone has tempted me with this for $500 cash locally....
http://www.newegg.com/Product/Product.aspx?Item=N82E16814500378&cm_re=gtx_980ti_zotac-_-14-500-378-_-Product










Wife would have my head on a stick....


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh no....
> 
> Someone has tempted me with this for $500 cash locally....
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500378&cm_re=gtx_980ti_zotac-_-14-500-378-_-Product
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wife would have my head on a stick....


Just give it to her as a present. She will be like what is this? it's 500 dollars pure love. Then she would be happy with it, throw it in a corner and you can pick it up to use it.

Always works.

Always i tell you, always.


Spoiler: Warning: Spoiler!



i have no wife


----------



## Zanpakuto

Didn't win the silicon lottery


----------



## Duke976

Quote:


> Originally Posted by *Zanpakuto*
> 
> Didn't win the silicon lottery


May I ask what brand that is? Thanks


----------



## Zanpakuto

Nitro 390


----------



## Agent Smith1984

Quote:


> Originally Posted by *Zanpakuto*
> 
> Nitro 390


That can't be right....

What is your max and average core voltage under load?
Any throttling?
Did you remember to set the power limit to +50%?

1120 would be the absolute worst OC I've seen on these cards at 100mv.

Did you try 200mv?

Your temps are great... almost too good.... something's not right









Also.... don't use furmark.... it's just a torturous card killer...

Use Heaven


----------



## murakume

Just got my new MSI Gaming 390x delivered today







Moved from an EVGA 780 Classified. No overclocking yet, bit hot right now for it, but I'll throw something at the card to see what it can do later on tonight.

Firestrike comparisons: http://www.3dmark.com/fs/5423681 390x stock clocks
http://www.3dmark.com/fs/5376202 The most I was personally ever able to pull out of my Classified, no artifact gaming stable clocks

Firestrike Ultra (4k): http://www.3dmark.com/fs/5423739 390x
http://www.3dmark.com/fs/5416023 stock 780 Classified

I do have a Firestrike run somewhere with the 780 clocked at 1306, but it performed around 50-60 points higher than the one I had linked and was not a clean run.

And my proof.


----------



## murakume

Double post, I know, but there's an almost 4 hour gap so I don't feel too bad about it. This is what I've come up with for the night. Firestrike saw temps in the low 70's and Heaven put the card around 82 peak with 70% fan. Not bad for around an hour and a half of messing around considering I've tried and fought for hours to get 11000 on the 780 and never achieved it.

The last AMD card I've owned was a 4890 so I have a question: Around what temperature do you start wanting to back off? I know the 290x cards ran hot out of the box and these are pretty much the "same" card. Gut is telling me mid 80's is the turn back point, is this about right?


----------



## brooklands

Quote:


> Originally Posted by *murakume*
> 
> Around what temperature do you start wanting to back off?


Well, stock 290x cards run quite happily at 95 degrees C for days (we use a few of them for computational purposes at work), and as long as the VRMs stay cool enough I don't see a problem with a GPU temperature that high. Heck, I've seen passively cooled 8600 GTs that hit 110c on a regular basis, and they still outlived their usefulness.

For my own GPUs: temps are OK as long as the fans don't have to spin at 100%, and with a (correctly installed) aftermarket cooler temps are nothing to worry about anyway, as the chip will probably give up before temps go anywhere near 90c.


----------



## Gumbi

Quote:


> Originally Posted by *brooklands*
> 
> Well, stock 290x run quite happily at 95 degrees c for days (we use a few of them for computational purposes at work) and as long as vrms stay cool enough I don't see a problem with a gpu temperature that high. Heck, i've seen passively cooled 8600 Gts that hit 110c on a regular basis, that outlived their usefulness.
> 
> For my own gpus: Temps are ok as long as fans don't have to spin at 100% and with an (correctly installed) aftermarket cooler temps are no thing to worry about anyways, as the chip will probably give up before temps go anywhere near 90c.


If you felt like cooling them even more, thermal paste can drastically help those temps, as the stock paste is very poorly applied.


----------



## Agent Smith1984

Quote:


> Originally Posted by *murakume*
> 
> Double post, I know, but there's an almost 4 hour gap so I don't feel too bad about it. This is what I've come up with for the night. Firestrike saw temps in the low 70's and Heaven put the card around 82 peak with 70% fan. Not bad for around an hour and a half of messing around considering I've tried and fought for hours to get 11000 on the 780 and never achieved it.
> 
> The last AMD card I've owned was a 4890 so I have a question: Around what temperature do you start wanting to back off? I know the 290x cards ran hot out of the box and these are pretty much the "same" card. Gut is telling me mid 80's is the turn back point, is this about right?


Nice scores man!

Looks like a real nice card if that's stable....

Those temps are fine. It's generally good to keep 'er under 80, but anything under 85 is plenty safe for these cards.


----------



## murakume

Not sure if it is stable or not yet. I kept watching the 3DMark score climb toward 12000 and wanted to bust it. I can say that it and Heaven were clean runs, though I didn't do any looped or game testing as I needed to be up at five this morning







Once I get off of work I will verify the stability and see if I can squeeze a bit more out of it. May take a peek at the thermal paste since I believe I have some PK-3 around somewhere, but that little void-warranty sticker sucks.


----------



## Agent Smith1984

Quote:


> Originally Posted by *murakume*
> 
> Not known if it is stable or not yet. Kept watching the 3DMark score climb to 12000 and wanted to bust it. I can say that it and Heaven were clean runs, though I didn't do any looped or game testing as I needed to be up at five this morning
> 
> 
> 
> 
> 
> 
> 
> Once I get off of work I will verify the stability and if I can squeeze a bit more out of it. May take a peak at the thermal paste since I believe I have some PK-3 around somewhere, but that little void warranty sticker sucks.


I wouldn't void warranty over a few degrees if the temps are already safe...

What you may find though, is not only will the temps drop a little, but the fan will maintain the lower temp at a slower speed. That only really matters if you are big on sound.

I personally use a custom profile to ramp the fan up gradually until it hits 100% at 85c.

I never see it get past 82c at 100mv with 50mv AUX though.
VRM1 never breaks 75c.

I will probably put some noctua on mine this weekend and try some more 200mv testing.
I have some Fractal Design fans coming for my case that should almost double my airflow and improve temps a lot.

I'm shooting for 1200/1700+ stable.
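For anyone curious how a gradual ramp like that behaves between points, here's a quick Python sketch. The breakpoints below are just an illustration of that kind of profile (not a dump of my exact settings), and the linear interpolation is an assumption about how Afterburner blends between curve points:

```python
# Sketch of a custom fan curve: ramps gradually, pinned at 100% from 85C up.
# Breakpoints are illustrative; Afterburner's curve editor may blend differently.

CURVE = [(40, 25), (60, 40), (75, 70), (85, 100)]  # (temp C, fan duty %)

def fan_speed(temp_c):
    """Return fan duty cycle (%) for a given core temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # floor: minimum fan speed below the curve
    if temp_c >= CURVE[-1][0]:
        return 100                  # pinned at 100% from 85C upward
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between adjacent breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

for t in (35, 60, 82, 90):
    print(t, round(fan_speed(t)))
```

The point of the gradual ramp is avoiding the on/off fan surging you get from a steep single step; the card settles at whatever duty cycle holds its equilibrium temperature.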


----------



## Superjit94

About to grab my second R9 390 in the next week, but I'm assuming that my 750 watt power supply is not enough. Just looking for any recommendations on a bigger power supply, preferably 1000 watt.


----------



## diggiddi

EVGA SuperNOVA G2 1000W, good price on Newegg


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> EVGA SuperNOVA G2 1000W, good price on Newegg


This unit also gets my vote.

I myself am purchasing the EVGA 1000 G2 as my next unit.


----------



## Derek129

Just found this; what does everyone think of this post?


__
https://www.reddit.com/r/3dfkls/if_anyones_interested_accelero_xtreme_iv_fits/


----------



## IcarusLSC

I bought a MSI 390 today







Need to get it in and tested to replace this dreadful NVidia driver experience I've had this round with my 970s (among many other issues!)


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> I bought a MSI 390 today
> 
> 
> 
> 
> 
> 
> 
> Need to get it in and tested to replace this dreadful NVidia driver experience I've had this round with my 970s (among many other issues!)


I'll get all the new members added by Monday!

Just got a 55" 4k TV

Running BF4 on ultra with no AA @ 1100/1600 stock voltage with 43 FPS min, 73 MAX, and 58 FPS AVG


----------



## russik

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'll get all the new members added by Monday!
> 
> Just got a 55" 4k TV
> 
> Running BF4 on ultra with no AA @ 1100/1600 stock voltage with 43 FPS min, 73 MAX, and 58 FPS AVG


But you don't see 60fps, only 30fps, because AMD doesn't support HDMI 2.0


----------



## Agent Smith1984

Quote:


> Originally Posted by *russik*
> 
> But you don't see 60fps, only 30fps, because AMD doesn't support HDMI 2.0


Correct, but it's a good showing for the card... I'm getting a DVI-D adapter today for 2160p60


----------



## russik

So with that do you get HDMI sound too?


----------



## Agent Smith1984

Quote:


> Originally Posted by *russik*
> 
> So with that do you get HDMI sound too?


No, but I don't use it anyway. The ALC1150 is much better


----------



## Jerseyseven

Hi guys. I just got myself an MSI R9 390 Gaming, and encountered a really odd thing while testing my new system: I could not get the DisplayPort output to work while HDMI worked. I tried all night, only to bring it back to the shop, where the dude used HDMI and it worked. Does DisplayPort require any drivers to work, or should it work out of the box?

My parts while testing :
MSI Z97 Gaming 5
MSI R9 390 Gaming
I5 4690k
Kingston HyperX 8GB fury

Note: Have not installed windows. Only tried booting into BIOS. Does not work with DisplayPort on R9 390. But works with HDMI

Lots of thanks!


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'll get all the new members added by Monday!
> 
> Just got a 55" 4k TV
> 
> Running BF4 on ultra with no AA @ 1100/1600 stock voltage with 43 FPS min, 73 MAX, and 58 FPS AVG


Cool, what model? Let us know how good it is for gaming, especially with regard to input lag


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> Cool, what model? Let us know how good it is for gaming, especially with regard to input lag


LG 55UB8500

Has 4k for daddy, and 3d for the kids.
Picture looks great!

Games are unbelievable, and this single 390 is making a better showing than I thought it would. Can't wait for two...


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> LG 55UB8500
> 
> Has 4k for daddy, and 3d for the kids.
> Picture looks great!
> 
> Games are unbelievable, and this single 390 is making a better showing then i thought it would. Can't wait for two...


Rock on my man


----------



## IcarusLSC

Liking the card so far, other than it's a furnace like my 480s were :/ AMD CCC doesn't seem to control the fan in any way either; guess I need to use Afterburner there, eh?

Thanks


----------



## IcarusLSC

I dunno what's up with this card. It's probably the drivers, but after installing the MSI Gaming App it sticks the GPU frequency at 1040 MHz always, even on the desktop, and it idles at almost 60*. Same with my CPU; it's maxed at 4400 MHz.
Any ideas?


----------



## Duke976

Try uninstalling the gaming app and see what happens. I also own 2 MSI 390s, and I am only using Afterburner and do not have that problem.


----------



## IcarusLSC

I did, still idling at 60*. On the desktop it idles at 500MHz core, and 1500 Memory according to Afterburner.

It has also buggered up Afterburner and makes it give a task scheduler error when I shutdown the computer...

I'm off to a great start. I really don't want to screw with this crap just to get it to work normally; I went through this hassle with NVidia and a dozen 9xx cards, not doing it again...


----------



## IcarusLSC

I uninstalled everything and did it over, and it's back to normal _so far_. I didn't bother with the Gaming App or the AMD App this time. The drivers seem flaky to me. The CCC panel doesn't remember position or size, and there's not much to do in it. Seems very 1999... It still fluctuates a ton between 150-1500 memory clocks at idle on the desktop with just this website window open, I dunno...

Now to see what this toaster can do








Should I be doing all the OC in Afterburner and not touch the CCC panel?

Thanks


----------



## jon666

Afterburner is probably the best bet since you have an MSI GPU. Either way, the clock changes should show in both programs if you change them in one. If I remember correctly, Afterburner has more options, so if it were me I would stick with that.


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> I uninstalled everything and did it over, and its back to normal _so far_. I didn't bother with the Gaming App or the AMD App this time. Drivers seem flaky to me. CCC panel doesn't remember position or size, not much to do in it. Seems very 1999... Still fluctuates a ton between 150-1500 memory clocks at idle on desktop and just this website window open, I dunno...
> 
> Now to see what this toaster can do
> 
> 
> 
> 
> 
> 
> 
> 
> Should I be doing all the OC in Afterburner and not touch the CCC panel?
> 
> Thanks


Please see the cooling section in the OP.

Your card is idling at 60C because pretty much all of these are using a zero state fan setup now.
If you set a manual fan profile, it will not idle that high at all.

Also, the 390 reports 1040 core clock in "Gaming Mode" and increases to 1060 in "OC mode"....

The memory clock bouncing around is pretty common from what I've seen, but causes no issues.


----------



## IcarusLSC

It's working right now _without_ the Gaming App, idling about ~40*. It's still a toaster when gaming though! Not seen temps over 75* yet (stock fan profile) even after it warmed my room up to 33* after playing games most the night, lol!!









I didn't like the Gaming App, I really wanted to dim the LED but I can't, and it made the clocks stay maxed (At 1040 or 1060 depending which I selected) even after I uninstalled it and on just the desktop! I had to start over to get it working right without the Gaming App...









No more stuttering in games like the 970's had, which is really nice (though one game is still doing it some; it's not nearly as bad now, so it must be the game, I think). SoM I can run with Ultra textures now too! Overall frame rates seem to be a bit higher so far in everything I play


----------



## IcarusLSC

Anyone run into this?
Quote:


> Originally Posted by *Jerseyseven*
> 
> Hi guys. I just got myself an MSI R9 390 Gaming. Encountered a really odd thing just now while testing my new system. I could not get the DisplayPort to work while the HDMI worked. I tried all night only to bring it back to the shop and dude used HDMI and it worked. Does it require any drivers to work or should it work out of the box?
> 
> My parts while testing :
> MSI Z97 Gaming 5
> MSI R9 390 Gaming
> I5 4690k
> Kingston HyperX 8GB fury
> 
> Note: Have not installed windows. Only tried booting into BIOS. Does not work with DisplayPort on R9 390. But works with HDMI
> 
> Lots of thanks!


I've not tried Display Port cable yet, looks like I should test it as I've got a similar setup.
I did notice with my DVI cable if I put it in the one upper port it will boot and show post screen and bios, but once Windows 7 loads it goes black and I have to shut it down as nothing will make it show. Works fine in the other lower DVI port though...


----------



## IcarusLSC

You guys that are overclocking these cards, what kind of differences are you getting in games (i.e. real-life application as opposed to synth benchmarks, which don't really mean much)? Is it worthwhile to OC them or is it just good for benchmarks?


----------



## Particle

You will receive different opinions on this, Icarus, but here is mine. Overclocking doesn't yield much of a tangible real-world benefit. If the card wasn't capable of playing a game well before overclocking it, a 10% improvement isn't going to be enough to make the difference between a poor and great experience. The best you'll probably see is from poor to mediocre, and you'll be better served by turning down AA or something like that. As such, overclocking is more for sport than anything.


----------



## IcarusLSC

Aiy, I figure I'll get some interesting opinions.







It seems the headroom for overclocking, in terms of actual real-world benefit for both CPUs and GPUs, is shrinking with newer hardware (which makes sense, but is it still useful all around?)...


----------



## rv8000

Quote:


> Originally Posted by *Particle*
> 
> You will receive different opinions on this, Icarus, but here is mine. Overclocking doesn't yield much of a tangible real-world benefit. If the card wasn't capable of playing a game well before overclocking it, a 10% improvement isn't going to be enough to make the difference between a poor and great experience. The best you'll probably see is from poor to mediocre, and you'll be better served by turning down AA or something like that. As such, overclocking is more for sport than anything.


It all depends on cooling if you ask me. Generally a 15-20% OC on most cards puts it at the stock performance of the card a tier above it, and an OC like that can sometimes be a difference of 5-10 fps, which has the most impact on min FPS and makes low dips more tolerable and the game overall more fluid. If temps and stability are in check it is always worth it, unless you're pulling 144fps or higher.
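To put rough numbers on that, here's a back-of-the-envelope sketch. The 1040 MHz figure is the 390's out-of-the-box clock mentioned earlier in the thread; the 1200 MHz target, the 45 fps baseline, and the linear-scaling assumption are all illustrative (real gains usually come in a bit lower once memory bandwidth caps things):

```python
# Rough estimate of fps gain from a core overclock, assuming near-linear
# scaling with core clock (optimistic; memory bandwidth often caps real gains).

stock_mhz, oc_mhz = 1040, 1200   # out-of-the-box 390 clock vs a typical max OC
baseline_fps = 45                # e.g. a 4k average on a single 390

scale = oc_mhz / stock_mhz       # ~1.154, i.e. a ~15% overclock
estimated_fps = baseline_fps * scale
print(f"{(scale - 1) * 100:.1f}% OC -> ~{estimated_fps:.0f} fps from {baseline_fps}")
# prints: 15.4% OC -> ~52 fps from 45
```

Even under that optimistic assumption, the gain is a handful of frames, which is why it matters most when your minimums are hovering just under a playable threshold.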


----------



## murakume

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice scores man!
> 
> Looks like a real nice card if that's stable....
> 
> Those temps are fine. It's generally good to keep 'er under 80, but anything under 85 is plenty safe for these cards.


You ever notice that people seem to need help doing things like moving right when you really want to do something yourself?? Some testing last night revealed a need for an additional +10 on voltage, bringing me to an offset of 85, to pass a Firestrike Ultra test at 1200 GPU and 1600 memory.

I ran three Heaven benches then Firestrike Ultra back-to-back and hit 83 Celsius on the core, and VRM1 peaked at 80, which was during Heaven. Seems Firestrike is easier on temps? Either way, no artifacts or driver crashes, so I would call it benchmark stable. Still haven't had time to game at these settings, though I would probably worry about temps after an extended gaming session.


----------



## IcarusLSC

I noticed in the AMD CCC, under Image Scaling Preferences, that if I select anything and hit Apply, it doesn't take effect in a program (like running Heaven at 1920x1080 so I can compare it to everyone's tests the same). I have to alt-tab out of it and then go back in for it to take effect. Anyone else run into this?

Edit: it's different for games; some scale, others don't no matter what I do. Weird, driver issue?...

Also, is it normal for CCC to modify the power options upon install? I had a custom plan that kept everything maxed and now it's just at high performance and shuts stuff off. I put it back on my custom one.


----------



## DDSZ

XFX DD 390(x) and Asus DCU II 390(x) owners, can you please post (or PM me) your GPU-Z bios dumps?


----------



## jackalopeater

Here's mine








AMD sent it to me and so far I'm really enjoying it
link to my imgur album for it


http://imgur.com/a




just realized this was post 666 and I don't have a Devil card


----------



## Gumbi

Quote:


> Originally Posted by *jackalopeater*
> 
> Here's mine
> 
> 
> 
> 
> 
> 
> 
> 
> AMD sent it to me and so far I'm really enjoying it
> link to my imgur album for it
> 
> 
> http://imgur.com/a
> 
> 
> 
> 
> just realized this was post 666 and I don't have a Devil card


How's the cooling on that? I wonder if Asus fixed the cooling issues they had on 290/290xs. How is the core/VRM cooling under load?


----------



## jackalopeater

Quote:


> Originally Posted by *DDSZ*
> 
> XFX DD 390(x) and Asus DCU II 390(x) owners, can you please post (or PM me) your GPU-Z bios dumps?


Here is mine from the DCU II 390x, it's UEFI and just a note, the 390x DCU II does not have a bios switch 

Hawaii.txt 128k .txt file


----------



## jackalopeater

Quote:


> Originally Posted by *Gumbi*
> 
> How's the cooling on that? I wonder if Asus fixed the cooling issues they had on 290/290xs. How is the core/VRM cooling under load?


Here's a snip from running Valley for a half hour with everything at stock, bumping the fans to 50% drops the VRMs down by a lot!


----------



## Gumbi

Quote:


> Originally Posted by *jackalopeater*
> 
> Here's a snip from running Valley for a half hour with everything at stock, bumping the fans to 50% drops the VRMs down by a lot!


Damn that VRM cooling is pretty poor :/ They obviously just have sinks slapped on them and no direct main heatsink contact.

How much cooler are they on 50% fan speed? 96 degrees is pretty hot, and might drastically affect over clocking viability.

Seems like Asus didn't bother revamping the cooler much. Your VRMs will be even hotter again if you put any extra voltage through them.


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> You guys that are overclocking these cards, what kind of differences are you getting in games (i.e. real-life application as opposed to synth benchmarks, which don't really mean much)? Is it worthwhile to OC them or is it just good for benchmarks?


I definitely see a difference when overclocking my card.
The thing to remember is that some of these cards already have an OC on them, especially the MSI cards....

If you factor in the fact that whatever clock speed you get to will be a good ways past the reference level of performance, then it definitely makes a difference.

At 1100/1600 on stock voltage, I find my temps, power usage, and performance to be nice for daily usage.

I can run everything at 1150/1700 with 50mv/50mv AUX+ and get a few more frames also, which I find to be most helpful at 4K.
It's a bit too hot this time of year to keep the 1180/1750 at 100mv/50mv AUX thing going right now, though that seems to pick up another frame or two also. Again, all helpful at 4k.... anything dipping in the high 20's can be brought over that 30FPS mark through overclocking, so it's very helpful at that resolution. (of course settings have to be tweaked accordingly, depending on the title)

At 1080/1440 the differences may be MORE measurable but LESS noticeable, if that makes sense?

Also, can you post up your screen shot or GPU-Z so I can get you added to the club?

Thanks


----------



## Particle

I tried the new driver release this weekend, and they don't appear to address the issues with multiple monitors and DP->DVI adapters. Anyone have any active DL-DVI adapters that they're using without issues?


----------



## brooklands

Quote:


> Originally Posted by *IcarusLSC*
> 
> You guys that are overclocking these cards, what kind of differences are you getting in games (i.e. real-life application as opposed to synth benchmarks, which don't really mean much)? Is it worthwhile to OC them or is it just good for benchmarks?


To be honest: No, I don't notice any differences. *Subjectively*, that is. Currently I spend much (too much) of my free time in GTA V, and even if I stare at the fps counter I cannot *subjectively* spot a difference between 1080/1500 stock (yes, 1080 is stock, not the advertised 1100/1525, MSI) and 1200/1600. Neither in 1080p (around 100fps) nor in 4k (around 45ish fps).

But the *measured* gains are certainly there and are about equal to those gains in synthetic benchmarks.


----------



## ZXMustang

I just built my system with an MSI 390x. Love the card and the price was oh so nice. Here are my specs.

4790k
390x
16gb ddr3 1600
EVO 850 250gb SSD
750psu

It runs like a raped ape. I couldn't be happier with the performance of the 390x. The card is stable overclocked at the moment to 1125/1525. Runs GTAV at full settings 1080p around 50-60fps. Can't ask for much more with a whore piece of **** game like that. GTAV is crashing all the time though, and I'm not sure why. I reloaded Windows and got the newest AMD drivers from their site last night, and still the same thing; there is no rhyme or reason to it. Anyway, I love the card. I'll be grabbing another one in the next month to run crossfire with. Can't wait to see how that improves performance.


----------



## Duke976

Quote:


> Originally Posted by *ZXMustang*
> 
> I just built my system with an MSI 390x. Love the card and the price was oh so nice. Here are my specs.
> 
> 4790k
> 390x
> 16gb ddr3 1600
> EVO 850 250gb SSD
> 750psu
> 
> It runs like a raped ape. I couldnt be happier with the performance of the 390x. The card is stable overclocked at the moment to 1125/1525. Runs GTAV at full settings 1080p around 50-60fps. Cant ask for much more with a whore piece of **** game like that. GTAV is crashing all the time though. Not sure why. I reloaded windows and got the newest AMD drivers from their site last night and still the same thing. There is no Rhyme or reason to it. Anyway, I love the card. Ill be grabbing another one in the next month to run crossfire with. Cant wait to see how that improves performance.


Congrats and welcome to the forum.


----------



## jackalopeater

Quote:


> Originally Posted by *Gumbi*
> 
> Damn that VRM cooling is pretty poor :/ They obviously just have sinks slapped on them and no direct main heatsink contact.
> 
> How much cooler are they on 50% fan speed? 96 degrees is pretty hot, and might drastically affect over clocking viability.
> 
> Seems like Asus didn't bother revamping the cooler much. Your VRMs will be even hotter again if you put any extra voltage through them.


I'll test it when I get home. I know when I'm actually playing a game it stays under 70c on the core, usually around 68c with the stock fan profile; the VRMs stay around 80c at the same time


----------



## diggiddi

Hey guys theres a battle going on, inna dis ya thread, represent your club!!

http://www.overclock.net/t/1565660/lets-settle-this-290-x-vs-390-x-debate-once-and-for-all

OP incoming PM

Do the Powercolor and Sapphire cards have Bios switches?


----------



## jackalopeater

Quote:


> Originally Posted by *Gumbi*
> 
> Damn that VRM cooling is pretty poor :/ They obviously just have sinks slapped on them and no direct main heatsink contact.
> 
> How much cooler are they on 50% fan speed? 96 degrees is pretty hot, and might drastically affect over clocking viability.
> 
> Seems like Asus didn't bother revamping the cooler much. Your VRMs will be even hotter again if you put any extra voltage through them.


Here it is at 50% fan speed, nearly a 20c reduction in VRM1 temps, makes a bit of noise, but not bad


----------



## Jared2608

A local supplier has the Powercolor R9-390 PCS+ on special at the moment. If it's still around on pay day I'm going to grab one.


----------



## Duke976

Just got my RIVE board back from RMA today and was able to test the MSI 390s in crossfire. I had the door open and my AC on, so bear with me on my findings. Prior to the RIVE, I was using a Sabertooth X79, and as you know the cards don't have any breathing room in crossfire, as shown here:


The temp for the top card with this setup was 95c, and that was with the windows open and using manual fan control. That was just running Dota 2 for 10 minutes. The same can be said for the Heaven benchmark and 3DMark. So I decided it was not worth it to crossfire these cards on the Sabertooth X79 because of the PCI-E layout.

Now that I am using the RIVE, as shown here:

As you can see, I have enough clearance between both cards. It's *supposed* to not reach 95c, right? Wrong; the same temp shows after 2 runs of Heaven. Granted, on the Sabertooth it only took 1 run to reach 95c. So what else is there for me to do? I've switched both cards and the results were still the same: the top card reaches 95c and then starts to throttle the core clock. It seems the only way to run these cards in crossfire is to watercool them. I was expecting a better outcome from having tested 2 different boards with 2 different PCI-E layouts, but sadly it seems pretty grim to crossfire, as these 2 cards pump out so much heat. I couldn't imagine the temps if the door was closed during testing.


----------



## rv8000

Quote:


> Originally Posted by *Duke976*
> 
> Just got my RIVE board back from RMA today and was able to test the MSI 390s in crossfire. I left the door open and my AC is on, so bear with me on my findings. Before the RIVE, I was using a Sabertooth X79, and as you know the cards don't have any breathing room in crossfire, as shown here:
> 
> 
> The top card with this setup hit 95C, and that was with the windows open and using manual fan control, after just running Dota 2 for 10 minutes. Same story with Heaven and 3DMark. So I decided it wasn't worth crossfiring those cards on the Sabertooth X79 because of the PCI-E layout.
> 
> Now that I am using the RIVE, as shown here:
> 
> As you can see, I have enough clearance between both cards. It's *supposed* to stay under 95C, right? Wrong; it hit the same temp after two runs of Heaven. Granted, on the Sabertooth it only took one run to reach 95C. So what else is there for me to do? I've swapped both cards and the results were the same: the top card reaches 95C and then starts throttling the core clock. It seems the only way to run these cards in crossfire is to watercool them. I was expecting a better outcome after testing two different boards with two different PCI-E layouts, but sadly crossfire looks pretty grim with these two cards pumping out so much heat. I can't imagine the temps if the door had been closed during testing.


Several problems I'm seeing right off the bat. The fans on your 360mm rad are dumping all the heat from your loop into the case. You have no fan on the H140X, creating a hotspot in the loop and the case; that heat will rise directly into your GPUs. Get a fan on that H120/140X (whichever it is; it's hard to tell from the pictures). Swap the fan direction on your 360 rad to exhaust air, and make sure you have some good intake fans on the front of the case to provide ample fresh air. This should greatly improve temps, but the top card is still likely to run a good deal hotter than the bottom one.

* Looking back, it does seem there is a fan on the H120/140X, so ignore that part if so. I'd tidy up the GPU power cabling with some zip ties; having it spaghettied all over the place in front of the cards like that won't help airflow. You can also try raising the case on small pedestals and adding some intake fans at the bottom, as it looks like you have room for them.


----------



## Duke976

Quote:


> Originally Posted by *rv8000*
> 
> Several problems I'm seeing right off the bat. The fans on your 360mm rad are dumping all the heat from your loop into the case. You have no fan on the H140X, creating a hotspot in the loop and the case; that heat will rise directly into your GPUs. Get a fan on that H120/140X (whichever it is; it's hard to tell from the pictures). Swap the fan direction on your 360 rad to exhaust air, and make sure you have some good intake fans on the front of the case to provide ample fresh air. This should greatly improve temps, but the top card is still likely to run a good deal hotter than the bottom one.
> 
> * Looking back, it does seem there is a fan on the H120/140X, so ignore that part if so. I'd tidy up the GPU power cabling with some zip ties; having it spaghettied all over the place in front of the cards like that won't help airflow. You can also try raising the case on small pedestals and adding some intake fans at the bottom, as it looks like you have room for them.


I can see your point, but I can assure you I have tried both push and pull on the 480's fans just to rule out an airflow issue. There was no difference regardless of orientation. I've also tested with the door open and the AC on and close to the PC. Sure, I can tidy up the cables, but that would gain me 1 to 2C at most. For the top card to reach temps that high is really bad. With just one card, the temps I got were 77 to 80C. That tells me that if I venture into crossfire, I'm bound to have temp problems.


----------



## kizwan

Quote:


> Originally Posted by *Duke976*
> 
> Just got my RIVE board back from RMA today and was able to test the MSI 390s in crossfire. I left the door open and my AC is on, so bear with me on my findings. Before the RIVE, I was using a Sabertooth X79, and as you know the cards don't have any breathing room in crossfire, as shown here:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> The top card with this setup hit 95C, and that was with the windows open and using manual fan control, after just running Dota 2 for 10 minutes. Same story with Heaven and 3DMark. So I decided it wasn't worth crossfiring those cards on the Sabertooth X79 because of the PCI-E layout.
> 
> Now that I am using the RIVE, as shown here:
> 
> As you can see, I have enough clearance between both cards. It's *supposed* to stay under 95C, right? Wrong; it hit the same temp after two runs of Heaven. Granted, on the Sabertooth it only took one run to reach 95C. So what else is there for me to do? I've swapped both cards and the results were the same: the top card reaches 95C and then starts throttling the core clock. It seems the only way to run these cards in crossfire is to watercool them. I was expecting a better outcome after testing two different boards with two different PCI-E layouts, but sadly crossfire looks pretty grim with these two cards pumping out so much heat. I can't imagine the temps if the door had been closed during testing.


You need to dial up the fan speed on both cards to get away from the 95C throttle. If I can do it with the 290 reference cooler, I don't see any reason why yours can't.

Pic before water cooling.


----------



## Duke976

Quote:


> Originally Posted by *kizwan*
> 
> You need to dial up the fan speed on both cards to get away from the 95C throttle. If I can do it with the 290 reference cooler, I don't see any reason why yours can't.
> 
> Pic before water cooling.


I was using manual control; once the temp reached 80C, the fans went to 100%. I underestimated the heat these cards produce.


----------



## kizwan

A custom cooler should do better. Maybe a bad TIM application. What mode are you running the cards at?

1060 MHz (OC Mode)
1040 MHz (Gaming Mode)
1000 MHz (Silent Mode)


----------



## tbob22

Got my Powercolor 390 today. All seems to work well. The cooler on this thing is a beast; it just barely fits in my case. I don't think the Nitro would fit, as it's about a half inch longer, though not quite as thick. I had to switch my SATA cables for angled ones because the heatsink hit them.

I had to raise the power limit to +20% to keep the clocks from jumping around while running Furmark; I had to do the same thing with my 7950. Temps max out at 67C on the core with the fan at 74% (pretty noisy). VRM1 gets a bit hotter than I would have liked, maxing out at 77C; VRM2 maxes out at 66C. I think the temps are pretty good, but it does get pretty loud. Under normal load it is quite quiet.

At high fan speeds (60%+) the shroud rattles a bit. If I push on it between the second and third fan it stops. A bit annoying, but I'll probably just wrap a zip tie around it; not worth sending it in for something like that.

Firestrike 7950 (stock)
Firestrike 390 (stock)

I know Firestrike is a best case increase, but I'm still pretty happy with it. I tested a few games and everything is a lot smoother at high settings.

A few photos:
(290, 7950, 470(dead-ish), 4850x2(dead-ish),4870(dead), 8800gt)








http://imgur.com/14dkp


----------



## diggiddi

Quote:


> Originally Posted by *tbob22*
> 
> Got my Powercolor 390 today. All seems to work well. The cooler on this thing is a beast, it just barely fits in my case, I don't think the Nitro would fit as it is about a half inch longer, but not quite as thick. Had to switch my SATA cables for angled ones as the heatsink hit.
> 
> Had to raise the max power to 20% to keep the clocks from jumping around while running Furmark, I had to do the same thing with my 7950. Temps max out at 67c core with the fan speed at 74% (pretty noisy). VRM1 gets a bit hotter than I would have liked, maxes out at 77c, VRM2 maxes out at 66c. I think the temps are pretty good, but it does get pretty loud. Under normal load it is quite quiet.
> 
> At high fan speeds (60%+) the shroud is rattling a bit. If i push on it between second and third fan it stops.. A bit annoying, but I'll probably just wrap a ziptie around it to stop it, not worth sending it in for something like that.
> 
> Firestrike 7950 (stock)
> Firestrike 390 (stock)
> 
> I know Firestrike is a best case increase, but I'm still pretty happy with it. I tested a few games and everything is a lot smoother at high settings.
> 
> A few photos:


Does it have a BIOS switch? And we'd love to see your results in this thread, thanks


----------



## tbob22

Quote:


> Originally Posted by *diggiddi*
> 
> Does it have a Bios switch? and we'd love to see your results in this thread thanks


For dual BIOS, you mean? Not that I'm aware of.


----------



## Duke976

Quote:


> Originally Posted by *kizwan*
> 
> A custom cooler should do better. Maybe a bad TIM application. What mode are you running the cards at?
> 
> 1060 MHz (OC Mode)
> 1040 MHz (Gaming Mode)
> 1000 MHz (Silent Mode)


I'm running these at 1100 core and 1600 mem.


----------



## diggiddi

Quote:


> Originally Posted by *tbob22*
> 
> Got my Powercolor 390 today. All seems to work well. The cooler on this thing is a beast, it just barely fits in my case, I don't think the Nitro would fit as it is about a half inch longer, but not quite as thick. Had to switch my SATA cables for angled ones as the heatsink hit.
> 
> Had to raise the max power to 20% to keep the clocks from jumping around while running Furmark, I had to do the same thing with my 7950. Temps max out at 67c core with the fan speed at 74% (pretty noisy). VRM1 gets a bit hotter than I would have liked, maxes out at 77c, VRM2 maxes out at 66c. I think the temps are pretty good, but it does get pretty loud. Under normal load it is quite quiet.
> 
> At high fan speeds (60%+) the shroud is rattling a bit. If i push on it between second and third fan it stops.. A bit annoying, but I'll probably just wrap a ziptie around it to stop it, not worth sending it in for something like that.
> 
> Firestrike 7950 (stock)
> Firestrike 390 (stock)
> 
> I know Firestrike is a best case increase, but I'm still pretty happy with it. I tested a few games and everything is a lot smoother at high settings.
> 
> A few photos:


Does it have a BIOS switch? And we'd love to see your results in this thread
Quote:


> Originally Posted by *tbob22*
> 
> For dual bios you mean? Not that I'm aware of.


So none of these have dual BIOS then? +rep


----------



## tbob22

Quote:


> Originally Posted by *diggiddi*
> 
> Does it have a Bios switch? and we'd love to see your results in this thread
> So none of these have dual bios then? +rep


Oh, scratch that. I must have been thinking of the 290 version. It has a switch; I guess only one BIOS can be overwritten.

Edit: Or not. I guess the 290 has the switch as well. Must have been some other brand.


----------



## kizwan

Quote:


> Originally Posted by *Duke976*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Custom cooler should do better. Bad TIM application maybe. At what mode you're running the cards?
> 
> 1060 MHz (OC Mode)
> 1040 MHz (Gaming Mode)
> 1000 MHz (Silent Mode)
> 
> 
> 
> Im running these at 1100core and 1600 mem

Ah! Overclocked. That changes things. I see your ambient is high too. These cards do dump a lot of heat.

On an unrelated note, how many rads do you have?


----------



## Duke976

Quote:


> Originally Posted by *kizwan*
> 
> Ah! Overclocked. That changes things. I see your ambient is high too. These cards do dump a lot of heat.
> 
> On an unrelated note, how many rads do you have?


Ambient temp is 72F. These suckers pump out more heat than my stove.

Oh, I am currently using 2 rads: 1x480 and 1x140. I will be replacing the thermal paste since I have spare Gelid in my stash. Will update tomorrow; hopefully temps will be better.


----------



## mrbull3tproof

Hi everyone, first-time poster.
In case anyone cares, the Arctic Accelero Xtreme IV fits the R9 390, or at least it fits my Gigabyte G1 Gaming model.
Original temps were quite a bit higher than expected, and the fan noise was simply unbearable above 50% fan speed.
Anyway, I now get ~33-34C at idle (~61C with the original cooler) with no fans spinning, and under load (Metro 2033, Crysis 3) 71C at 63% fan speed (custom curve).
The card is very quiet now; at 70% fan speed it's nothing more than a whisper.






HAF 912+ case, 4 Demciflex-filtered intake fans (top, front), one exhaust at the rear.
Yes, I know my cables are a mess.


----------



## Tabinhu

I just bought an R9 390 Nitro! But there's an issue!
I read in some reviews that the fans turn off at light load; the thing is, my PC is at 0% load, my GPU clocks are at 300MHz, and the fans are at 1200rpm!
Even in MSI Afterburner I can't turn them off.

EDIT: Looks like they're off; I just took the side panel off. It seems my Hyper 212 Evo fan is ramping up more than usual!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Just got my RIVE board back from RMA today and was able to test the MSI 390s in crossfire. I left the door open and my AC is on, so bear with me on my findings. Before the RIVE, I was using a Sabertooth X79, and as you know the cards don't have any breathing room in crossfire, as shown here:
> 
> 
> The top card with this setup hit 95C, and that was with the windows open and using manual fan control, after just running Dota 2 for 10 minutes. Same story with Heaven and 3DMark. So I decided it wasn't worth crossfiring those cards on the Sabertooth X79 because of the PCI-E layout.
> 
> Now that I am using the RIVE, as shown here:
> 
> As you can see, I have enough clearance between both cards. It's *supposed* to stay under 95C, right? Wrong; it hit the same temp after two runs of Heaven. Granted, on the Sabertooth it only took one run to reach 95C. So what else is there for me to do? I've swapped both cards and the results were the same: the top card reaches 95C and then starts throttling the core clock. It seems the only way to run these cards in crossfire is to watercool them. I was expecting a better outcome after testing two different boards with two different PCI-E layouts, but sadly crossfire looks pretty grim with these two cards pumping out so much heat. I can't imagine the temps if the door had been closed during testing.


Those are strange results....

I ran two Tri-X 290s in crossfire with similar spacing, and the top card never got over 88C.

I know the Tri-X has slightly better cooling, but I would not expect 95C; that's really high!

Have you tried getting a makeshift fan on them?

How do the cards do cooling wise when by themselves?

Have you tried running 100% fan?

Edit:

I think a TIM change, plus a 140mm fan blowing on them from either the door or from in front of the cards (route your cables differently first; see the pic of my sig rig; this can be done pretty tastefully), would give you some good results.


----------



## tbob22

Does anyone know how to disable the idle fan-off feature? I really don't like the idea of the card sitting at 60C while doing standard desktop work.


----------



## mrbull3tproof

If you have MSI Afterburner: under Settings -> Fan, check "Enable user defined software automatic fan control". Once that's checked you'll see the fan speed curve, where you can set a minimum fan speed for your card. Have Afterburner start with Windows so your fans spin up as soon as the system starts.
Or just replace the cooler as I did and get 33C at idle without the fans spinning ^^
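For anyone wondering what a "custom curve" actually does behind the slider UI, it's just interpolation between the temp/speed points you set, clamped at both ends. A rough sketch of the idea (the points below are made up for illustration, not Afterburner's internals or defaults):

```python
# Sketch of how a custom fan curve maps GPU temp (C) -> fan speed (%).
# The curve points are illustrative only, not Afterburner's actual values.
CURVE = [(30, 25), (50, 35), (70, 63), (85, 100)]  # (temp C, fan %)

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan % between curve points, clamping at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the first point: minimum fan speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the last point: pegged at max
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:      # interpolate within this segment
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(25))   # 25 (minimum fan speed)
print(fan_speed(60))   # 49.0, halfway between the 50C and 70C points
print(fan_speed(90))   # 100 (pegged at max)
```

The minimum-fan-speed setting mentioned above is effectively just the first point of the curve: anything cooler than it still gets that floor value.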


----------



## tbob22

Quote:


> Originally Posted by *mrbull3tproof*
> 
> If you have MSI afterburner under settings ->fan ->check enable automatic fan control by user. After checking that you will see fan speed curve, where you can set minimal fan speed for your card. Make afterburner start with windows so your fans wil be spinning as soon as system starts.
> Or just replace cooler as I did and get 33 deg in idle without spinning fans ^^


Yeah, I'd rather mod the BIOS; I don't really like having software control the fan speeds. I'll look into that.

One of the primary reasons I got this card was the cooler; it's nearly the same size as my old Accelero Xtreme III. It idles at about 35C if I'm not doing anything and only have one screen hooked up, but with dual screens and lots of stuff going on (Photoshop, browsers, etc.) it sits at about 55-60C, and the VRMs sit at 60-65C.

Edit: Well, I went ahead and set the minimum fan speed to 35%. It's still silent, but the core is now at 40-45C and the VRMs are at 40-50C.


----------



## Agent Smith1984

At 25% fan on my 390 I generally see 35C idle.

I seem to have found a new level of load for these cards though....

Prior to going 4K, my max load temp was usually around 77-82C depending on the game and ambient temps; I now see 88C! It's an airflow problem though, because when I remove my side panel it drops to 75C...

I'm gearing up for crossfire, so more airflow is a must this time of year.

I have two high-flow (95CFM) 120mm exhaust fans coming to remedy the problem...


----------



## rtwhalen

Just upgraded my old HD6950 to a Nitro 390. The memory clocks great up to 1720, but the max I can get on the core without artifacts is only 1110, and that's with +100mV added. Temps are very good; max temp is 64C during Firestrike. My normal 24/7 clocks right now are 1050/1680. Here is a screenshot:


----------



## Duke976

Quote:


> Originally Posted by *rtwhalen*
> 
> Just upgraded my old HD6950 to a Nitro 390. The memory clocks great up to 1720, but the max I can get on core without artifacts is only 1110 and that is adding 100mv. Temps are very good. Max temp is 64C during Firestrike. My normal 24/7 clocks right now are 1050/1680. Here is a screenshot:


Congrats and welcome to the forum.

_________________________________________________

A little update on the temp problems I've been having. I replaced the TIM on the upper card of my MSI 390s. Upon inspecting the core, the TIM wasn't applied very well; it was overlapping the core and all over the place. After cleaning it up, I applied GC-Extreme the old-fashioned way, spreading it evenly. I used Heaven to check my temps after the TIM change, but this time I went full 4K.

Here's the result.



http://imgur.com/a3xRYi2


I was surprised to see that not only can I loop this test, but the TIM change dropped temps by 6C in the same testing environment, and that's with a much higher resolution in use.


----------



## kizwan

What score do you get if you run with 8x AA?

BTW, what monitor do you have there? Unigine doesn't push the card hard enough; I'm pretty sure you can run higher than 1110.


----------



## Duke976

Quote:


> Originally Posted by *kizwan*
> 
> What score you get if you run with 8x AA?
> 
> BTW, what monitor you have there? Unigine doesn't push the card hard enough, I'm pretty sure you can run higher than 1110.


Here's the result with 8x AA.



http://imgur.com/OvKZ8CD


The monitor I'm using for 4K is the PB287Q. As for OCing the cards, one can do 1200 core / 1700 mem at +75mV, while the other does 1150 core / 1650 mem. I have to find a common clock at which they work harmoniously. Other than that, 1110/1600 is stable with only +25mV.


----------



## tbob22

Small overclock at stock volts http://www.3dmark.com/3dm/7848237


----------



## rtwhalen

I'm stuck at low core speeds as well. With no additional voltage, the max speed I can hit without artifacts is 1050. I can run Firestrike clean with +100mV, but I have to run my fans at 100%. Temps never go above 64C. Perhaps the core isn't fully covered with TIM and I have a local hot spot? My goal was to run 1100/1650 clocks 24/7, but I didn't want to run much more than +50mV on the core. I just bought a new case, a Fractal Design R5, and added an additional 140mm front fan, so my case has good airflow...

I don't know much about these chips, but if this overclock is abnormally low, I may replace the TIM. I just don't want to do it unless I need to, since my temps seem good and I don't want to void my warranty.


----------



## Ha-Nocri

It looks to me like the 390(X) isn't OCing any better than the 290(X). I can play games at 1200/1600 +100mV, although I don't find it worth it over 1125/1400 +0mV.

@rtwhalen I get the exact same FireStrike score at 1230/1600


----------



## Agent Smith1984

Guys, I am trying to get new members added today.

I'm getting slammed at work right now, so bear with me.

Anyone who is not on the list and HAS POSTED A SCREENIE, please say aye and I will find your post and get you on the list today.

Thanks for the patience.


----------



## tbob22

Quote:


> Originally Posted by *rtwhalen*
> 
> I'm stuck at low core speeds as well. With no additional voltage, the max speed I can hit without artifacts is 1050. I can run Firestrike clean with +100mV, but I have to run my fans at 100%. Temps never go above 64C. Perhaps the core isn't fully covered with TIM and I have a local hot spot? My goal was to run 1100/1650 clocks 24/7, but I didn't want to run much more than +50mV on the core. I just bought a new case, a Fractal Design R5, and added an additional 140mm front fan, so my case has good airflow...
> 
> I don't know much about these chips, but if this overclock is abnormally low, I may replace the TIM. I just don't want to do it unless I need to, since my temps seem good and I don't want to void my warranty.


Yeah, about the same here. At stock volts it seems to be limited to around 1065MHz before I get artifacts. Memory gets to around 1650MHz before artifacts start appearing.

Not quite like overclocking my old 7950, where I could bump it +150MHz core and +300MHz memory on stock volts without artifacts.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> Here's the result with 8x AA.
> 
> 
> 
> http://imgur.com/OvKZ8CD
> 
> 
> The monitor that I am using for the 4k is PB287Q. As far as OC the cards, 1 card can do 1200 with 1700 @75mv. While the other one is 1150core with 1650. I have to find a higher clock for them to work harmoniously. Other than that 1110 and 1600 is stable with only +25mv.


Any luck getting CF going again with the new TIM? (Obviously it's working from that screenie, but how are the temps?)

6C is a nice bonus, and if it translates to a real 6C reduction with both cards under load, you're in good shape....
89C is perfectly fine for these cards to run at with no throttling.

My 290 crossfire setup usually saw 87-89C on the top card and around 75-77C on the bottom. Just comes with the territory, I guess.


----------



## Kalistoval

Could we add ASIC quality to the list?


----------



## IcarusLSC

ASIC quality doesn't mean a thing.


----------



## IcarusLSC

Nothing on any of the questions or issues I've posted? Would be interesting to see whether they're driver bugs or something else...


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kalistoval*
> 
> Could we add asic quality to the list?


I suppose we could, but I've found it to have no bearing on the overclockability of any Hawaii GPUs....

I believe I even read somewhere that the ASIC quality readings on these aren't accurate.

I'll do some research later


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Any luck getting the CF going again with the new TIM (obviously it's working from that screenie, but how are temps)?
> 
> 6C is a nice bonus, and even if it translates to a real 6C reduction with both cards under load, you are in good shape....
> 89C is perfectly fine for these cards to run at with no throttling.
> 
> My 290 crossfire usually saw 87-89C on the top card, and around 75-77c on the bottom. Just comes with the territory I guess.


That's the result after the tim replacement on the top card. Both are running pretty good in crossfire setup


----------



## Agent Smith1984

Quote:


> Originally Posted by *Duke976*
> 
> That's the result after the tim replacement on the top card. Both are running pretty good in crossfire setup


GOOD NEWS!!









Any 4K gaming numbers yet?


----------



## Apihl1000

Yes, I'm new here, so I'll post my Sapphire R9 390 Nitro. Sorry for the kinda bad pictures, but I didn't want to take it out of the case again. And sorry about the blue RAM sticks totally not matching the interior; I have no plans to change them.


----------



## IcarusLSC

I contacted MSI about the DVI port and the other issues, and they're saying the card is defective and want me to RMA it.
Can anyone test to see if they have the same issue, please?
On the upper DVI port, the card shows output until Windows loads, then goes black. The other port works fine all the time. This sounds like a driver issue to me, but they say otherwise...
Thanks


----------



## kizwan

A black screen never sounds like a driver problem. RMA it.


----------



## IcarusLSC

Quote:


> Originally Posted by *kizwan*
> 
> Black screen never sound like driver problem. RMA it.


Lots of people were having black screen issues with the early 3xx drivers; read some of the reviews floating around, etc...


----------



## kizwan

Quote:


> Originally Posted by *IcarusLSC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Black screen never sound like driver problem. RMA it.
> 
> 
> 
> Lots of people where having black screen issues with early 3xx drivers, read some of the reviews all over etc...

Keyword: "early drivers". For the sake of argument, why is only one DVI port affected, not both?


----------



## IcarusLSC

I don't know; I'm just going by what I've read, and that's why I asked if anyone else has the same issue. I don't want to RMA for nothing, like these places love to make you do...


----------



## Agent Smith1984

The only time I've ever gotten a black screen with this card is when I set my memory over 1600MHz at startup.

To get the memory over 1600MHz on my card, I have to add a little AUX voltage (I usually use +50mV for 1700MHz).
When I set that as a startup profile, the clock applies before the voltage does (I'm guessing), and the screen goes blank.

If I reduce it to 1600MHz, the problem completely stops. I can also apply the AUX voltage and the memory clock manually at the same time and it won't do this....

I think the black screen issues on the 290 series were all memory related, but I've not seen much of that going around with the 390 series.

If one DVI port is not working and another is, I'd immediately assume the card has an issue with that port.


----------



## IcarusLSC

I'm not OC'd, so do you guys think I should RMA or exchange it? The store doesn't have any in stock, so who knows how long I'll have to wait...


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> I'm not oc'd so do you guys think I should rma or exchange it then? The store doesn't have any so who knows how long I'll have to wait...


Have you tried booting up on the lower port, then switching to the top one while Windows is open? (and vice versa)

Also, try different resolutions and refresh rates while doing the same thing back and forth.

It makes very little sense that it would show the Windows boot screen and then black out, but anything is possible...

Also, have you thumbed through CCC to see if there are any settings at all that pertain to the display outputs?

I have to admit, the manufacturer's CSR is usually the LAST person who will tell you to RMA the card, so if they're saying it's bad, it may very well be.


----------



## IcarusLSC

Yes, it stays black. I can switch to the working port anytime, though.

You'd think a CSR would avoid saying that? When I had all my issues with my 970/980s, they all blamed the cards and said to RMA them, right up until NVIDIA blamed the drivers; then they (MSI, Asus, and Gigabyte all did this!) said yes, it must be the drivers. So I got nowhere with any of them, lol...


----------



## LongRod

Spoiler: HOTHOTHOT






My ASUS 290 died on me, Microcenter replaced it with this.

Cooler on this card absolutely rocks!


----------



## Duke976

Quote:


> Originally Posted by *LongRod*
> 
> 
> 
> Spoiler: HOTHOTHOT
> 
> 
> 
> 
> 
> 
> My ASUS 290 died on me, Microcenter replaced it with this.
> 
> Cooler on this card absolutely rocks!


Grats. Can you update us on the temps when benching? Kindly include the VRM temps as well if you can. Thanks


----------



## IcarusLSC

Sweet, nice deal LongRod









I swapped this 390 into my old system without installing any drivers, and both DVI ports work fine now! That makes me think it's a driver issue, no?!


----------



## Duke976

Quote:


> Originally Posted by *IcarusLSC*
> 
> Sweet, nice deal LongRod
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I swapped this 390 into my old system and didn't install any drivers and both DVI ports work fine now! This makes me think it's a driver issue, no?!


There's your answer


----------



## IcarusLSC

Argh, I figured as much. So should I put in a ticket with AMD then?


----------



## Duke976

If it works on the other machine, then I would advise against an RMA, since you now know it was a driver issue once you used it on another computer.


----------



## Agent Smith1984

Damn, a new MSI 390 for a dead Asus 290... can't beat that!!

That Asus probably died because the cooler on it is horrible!









Let us know how overclocking fares for you.

Added!


----------



## Agent Smith1984

I'd really like to see more members pushing their cards to max overclocks, even if it's just temporary









If we can get more info, I can analyze the data and see how these cards are doing overall.

I REALLY thought I would be seeing more 1200MHz samples, but I guess not....

So far, the Gigabyte seems to be the worst clocker with its locked voltage, followed by the Sapphire card, with two confirmed members only hitting the low 1100s even with additional voltage.
I wonder if the default voltage is set lower in the Sapphire BIOS? I would think not, since a few samples have gone a bit higher.

So far, the MSI cards have been the best clockers, but they also seem to be the most popular, which skews the results a little.

Honestly, most people who joined the club have disappeared and not posted their OC results....


----------



## sugarhell

1200? Some people push over 1350 on their 290X










that's what I call a max overclock.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> 1200? Some people push over 1350 on their 290x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thats what i call a max overclock.


Well, given that there's little to no water cooling market for the 390 series, you have to keep the comparisons in perspective.
290s on air generally OC in the 1120-1220 range, with the latter being pretty rare samples.

The number of people getting much past 1200MHz on a 290 is very slim, even on water.


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, given that there is little-to-no water cooling market for the 390 series, you have to keep the comparisons in perspective.
> 290's on air, generally OC in the 1120-1220 range, with the latter being pretty rare samples.
> 
> The amount of people getting much past 1200MHz on the 290's is very slim, even with water.


Because that small group knows how to overclock, simple as that.


----------



## Agent Smith1984

Also, has anyone besides Duke been able to test any games at either 3200x1800 via VSR or at actual 4K?

I have been pretty impressed with how well 1 card handles 4k....

BF4 on ultra with no AA at 4K is averaging 58FPS

Crysis 3 on custom settings (with most things turned to either high or very high, and no AA) is getting around 45FPS.

I game just fine at these settings...

I know there are some who are hell bent on being 60FPS at all times (and now you have a new group who are 120FPS+ only) but my experience has been great!
I wouldn't go back to 65FPS with 4X MSAA all max settings on Crysis 3 at 1080P for anything.... give me 4K @ 45FPS with custom settings!!

Again, this is just the beginning for me, as I'm adding another card in a few weeks (to handle witcher and GTA V)

2 of these cards at $660 is a very capable 4K solution in my opinion.

@Duke976 How is the 4K CF thing working out for you?


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> Because this amount knows how to overclock simple as that.


That's not true...

There are plenty of people who know how to overclock who find their clock ceilings much earlier than planned....

I have been overclocking GPUs since the Ti4000 on passive cooling.

I've done extensive BIOS modding and hard modding on the entire 7800 and 7900 GeForce series... you name it!!

If the card won't do better, it just won't do better (within the constraints of cooling, voltage, and power).


----------



## Jared2608

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Also, has anyone besides Duke been able to test any games at either 3200x1800 in VSR, or actual 4K?
> 
> I have been pretty impressed with how well 1 card handles 4k....
> 
> BF4 on ultra with no AA at 4K is averaging 58FPS
> 
> Crysis 3 on custom settings (with most things turned to either high or very high, and no AA) is getting around 45FPS.
> 
> I game just fine at these settings...
> 
> I know there are some who are hell bent on being 60FPS at all times (and now you have a new group who are 120FPS+ only) but my experience has been great!
> I wouldn't go back to 65FPS with 4X MSAA all max settings on Crysis 3 at 1080P for anything.... give me 4K @ 45FPS with custom settings!!
> 
> Again, this is just the beginning for me, as I'm adding another card in a few weeks (to handle witcher and GTA V)
> 
> 2 of these cards at $660 is a very capable 4K solution in my opinion.
> 
> @Duke976 How is the 4K CF thing working out for you?


I guess what makes these so great for 4K in a Crossfire setup, especially the R9 390, is that they have 8GB of RAM, so you're not limited at all. And they come at a great price.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jared2608*
> 
> I guess what makes these so great for 4K in a crossfire setup, especially the R9-390, is that they have 8GB of RAM so you're not limited at all. An they come at a great price.


And that is exactly why I was one of the few who shifted from 290 to 390.....

That and the fact that this 390 looks so much better than my tri-x cards....

I will admit, my 290 CF setup was just SO overkill for 1080p...

I know the 4GB won't be limiting in many things at all right now, but at 4K, we could see the 4GB become a limitation sooner than some think......


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's not true...
> 
> There are plenty of people who know how to overclock, who find their clock ceilings much earlier than planned....
> 
> I have been overclocking GPU's since the Ti4000 on passive cooling.
> 
> I've done extensive BIOS modding, hard modding on the entire 7800 and 7900 GeForce series... you name it!!
> 
> If the card won't do better, it just won't do better (within the constraints of cooling, voltage, and power)


Then these guys probably don't know:









http://puu.sh/e5hFx/f07928186d.jpg

Here's a score from a friend. Just water. Kinda old.

I have done BIOS modding on the 7970 and 290X, but that means nothing. You talk about max overclock; "max overclock" without a definition means nothing. You mean max overclock in the casual sense. 1200 is an easy OC if you know what to do with AMD GPUs.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> Then probably these guys doesnt know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://puu.sh/e5hFx/f07928186d.jpg
> 
> Here a score from a friend. Just water.Kinda old.
> 
> I have done bios modding to 7970 and 290x but that means nothing. You talk about max overclock. Max overclock without definition means nothing. You mean max overclock on a casual meaning. 1200 is an easy oc if you know what to do with amd gpus.


I mean max overclock for each user's card, so I can put it in the table and compare how these are overclocking in general.

1200 is not an easy OC, BTW.....

1200MHz on air takes me 150mV with my door off and a box fan blowing on the card.
1190MHz, however, falls in my lap at 100mV.... I can bench at 1220 with 200mV.

Again, it's all relative to the cooling, the power, and the silicon. Saying 1200MHz is easy is not exactly accurate.


----------



## sugarhell

You brute-force the clocks on this card. For 150mV you need VRM temps below 65C.

I don't agree; I've never had an AMD GCN card that couldn't do at least 1300 with proper cooling. But I respect your opinion and will leave it there.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> You force brute the clocks on this card. For 150mv you need below 65C vrms temps.
> 
> I dont agree i never had an amd gcn card that couldnt do at least 1300 with the proper cooling. But i respect your opinion and i will leave it there.


Could you share some of your results and methods at 1300MHz core clocks?

We'd all be very interested I'm sure


----------



## Duke976

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Also, has anyone besides Duke been able to test any games at either 3200x1800 in VSR, or actual 4K?
> 
> I have been pretty impressed with how well 1 card handles 4k....
> 
> BF4 on ultra with no AA at 4K is averaging 58FPS
> 
> Crysis 3 on custom settings (with most things turned to either high or very high, and no AA) is getting around 45FPS.
> 
> I game just fine at these settings...
> 
> I know there are some who are hell bent on being 60FPS at all times (and now you have a new group who are 120FPS+ only) but my experience has been great!
> I wouldn't go back to 65FPS with 4X MSAA all max settings on Crysis 3 at 1080P for anything.... give me 4K @ 45FPS with custom settings!!
> 
> Again, this is just the beginning for me, as I'm adding another card in a few weeks (to handle witcher and GTA V)
> 
> 2 of these cards at $660 is a very capable 4K solution in my opinion.
> 
> @Duke976 How is the 4K CF thing working out for you?


I wish I could give you a better answer







The only demanding game this PC has is GTA V, as I have uninstalled Unity and Dying Light. The other games on this machine are twitch games, mainly Dota 2, League of Legends, Heroes of the Storm, and Hearthstone.

Once my son is done playing games, he just tells me to remove them lol.

But on a serious note, even though the temps improved when I replaced the TIM, I cannot recommend going CF without adequate air circulation, short of going water. Remember, when I did the benchmark it was with my door open and the AC on. All hell would break loose if I benched with the door closed.









I am still thinking of getting the Asus 390 just so I can water cool it, as these MSI cards are too hot to handle in CF.


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Could you share some of your results and methods at 1300MHz core clocks?
> 
> We'd all be very interested I'm sure


That would require a guide, and I simply don't have the time to spare.

If you want, search for tsm comments; some of them are about AMD GPU overclocking.

Simplified guide:

It needs a lot of volts: 1400 requires 1.5+, 1300 requires 1.45+, and a 7970 at 1400MHz requires 1.4+ volts. But you can't do it without a hard mod, and I will not tell you how with software.
Keep VRMs under 50C or you will kill the GPU.
Disable PowerTune, simple as that.


----------



## Jared2608

Quote:


> Originally Posted by *Agent Smith1984*
> 
> And that is exactly why I was one of the few who shifted from 290 to 390.....
> 
> That and the fact that this 390 looks so much better than my tri-x cards....
> 
> I will admit, my 290 CF setup was just SO overkill for 1080p...
> 
> I know the 4GB won't be limiting in many things at all right now, but at 4K, we could see the 4GB become a limitation sooner than some think......


I agree; even if 4GB is overkill now, it's always nice to have a little cushion, especially because parts are so expensive.

On a different note, I'm stuck deciding between two cards at the moment. The supplier I buy from has the Powercolor R9-390 PCS+ on sale for R 4 799.00 which is about R 700.00 reduced from normal. Then they have the MSI R9-390 Gaming for R 5 699.00. The MSI card has a 3 year warranty, the Powercolor a 2 year warranty. Really stuck trying to decide if it's worth the extra R 900.00.


----------



## Apihl1000

Does anybody know if the old EKWB R9 290 fullcover waterblocks fits the Sapphire R9 390 Nitro?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jared2608*
> 
> I agree, even if 4GB is over kill now, it's always nice to have a little cushion especially because parts are so expensive.
> 
> On a different note, I'm stuck deciding between two cards at the moment. The supplier I buy from has the Powercolor R9-390 PCS+ on sale for R 4 799.00 which is about R 700.00 reduced from normal. Then they have the MSI R9-390 Gaming for R 5 699.00. The MSI card has a 3 year warranty, the Powercolor a 2 year warranty. Really stuck trying to decide if it's worth the extra R 900.00.


Hmmm, I would say they are both good cards....

I can personally attest to the MSI card being a well-built unit, and it seems to OC well, but the PCS+ 290 series was highly touted for its cooling (I assume that carries over to the 390 series), so it is a tough decision.

If the OC matters to you, I would note that the MSI cards seemed to be binned a little higher than the powercolor cards.

Most of the 390/390x reviews show the PCS cards hitting in the 1125-1160MHz range, even with voltage.

Most all of the MSI reviews are over 1150MHz, especially with additional voltage.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Apihl1000*
> 
> Does anybody know if the old EKWB R9 290 fullcover waterblocks fits the Sapphire R9 390 Nitro?


According to EK, there is no compatible full cover block for the Nitro at all.

This is the case with most 390's...

Only the XFX, Asus, and PowerColor PCS+ cards have transferable blocks (from my understanding).


----------



## Agent Smith1984

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmmm, I would say they are both good cards....
> 
> I can personally attest the MSI card being a well built unit, and it seems to OC well, but the PCS+ 290 series was highly touted for it's cooling ( I assume that translates to the 390 series) , so it is a tough decision.
> 
> If the OC matters to you, I would note that the MSI cards seemed to be binned a little higher than the powercolor cards.
> 
> Most of the 390/390x reviews show the PCS cards hitting in the 1125-1160MHz range, even with voltage.
> 
> Most all of the MSI reviews are over 1150MHz, especially with additional voltage.


Now, going back to cooling: if you want to water cool the card, you will certainly want to get the PowerColor, as no full-cover blocks will be produced for the MSI 390.


----------



## Jared2608

I probably won't OC it, I usually OC the CPU and just leave the card stock. The PCS+ is a triple slot cooler, so it's pretty huge but it does look good. I might end up flipping a coin lol


----------



## Apihl1000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> According to EK, there is no compatible full cover block for the Nitro at all.
> 
> This is the case with most 390's...
> 
> Only the XFX, Asus, and PowerColor PCS+ cards have transferable blocks (from my understanding).


I knew I should have bought the XFX card, but I like Sapphire more than XFX.


----------



## Duke976

Quote:


> Originally Posted by *Apihl1000*
> 
> Does anybody know if the old EKWB R9 290 fullcover waterblocks fits the Sapphire R9 390 Nitro?


Unfortunately not, as there are no EK waterblocks for the Nitro and the MSI 390.


----------



## Apihl1000

Quote:


> Originally Posted by *Duke976*
> 
> Unfortunately not as there are no EK waterblock for the the nitro and MSI 390


So, does the R9 390X Tri-X have a supporting full-cover waterblock? Crap, but yeah, as I said, I should have gone with the XFX.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Apihl1000*
> 
> Soo, the r9 390x tri-x does have a supporting fullcover waterblock? crap, but yea as I said I should have gone with the XFX.


According to EK, there is no block for the tri-x 390x either.

I would say XFX is the way to go if you want to go water.

Their cards have been known to clock higher than the Asus and PCS cards so far; mind you, that is with the stock cooler.

Water can change everything.... but reviews still give you a good indication of what a chip can do on air.


----------



## By-Tor

For EK blocks for your cards, just go here, type your card into the search, select it from the drop-down, and see what pops up.

http://configurator.ekwb.com/

XFX, ASUS and POWERCOLOR are the only ones with EK blocks


----------



## Duke976

Quote:


> Originally Posted by *By-Tor*
> 
> EK blocks for your cards just go here and type what card in the search, select it from the drop down and see what pops up.
> 
> http://configurator.ekwb.com/
> 
> MSI, ASUS and POWERCOLOR are the only ones with EK blocks


You mean XFX, Asus and Powercolor


----------



## By-Tor

Quote:


> Originally Posted by *Duke976*
> 
> You mean XFX, Asus and Powercolor


Yeah sorry had MSI on the brain I guess... Fixed

EK does have a block for the MSI FuryX


----------



## LongRod

Quote:


> Originally Posted by *Duke976*
> 
> Grats, can you update us on the temp when benching, kindly include the vrm as well if you can. Thanks


Forgot to take a screenshot (will do next benchmark; letting my room cool down a bit as it's summer in Texas and it gets quite hot), but with the core at 1100 and memory untouched, the core topped out at 72C and the VRMs were around 68C.
Quote:


> Originally Posted by *IcarusLSC*
> 
> Sweet, nice deal LongRod
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I swapped this 390 into my old system and didn't install any drivers and both DVI ports work fine now! This makes me think it's a driver issue, no?!


Definitely a sweet deal, didn't think they'd do a straight trade like that.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Damn, gotta new MSI 390 for a dead Asus 290... can't beat that!!
> 
> That Asus probably died cause the cooler is horrible on it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let us know how overclocking fairs for you.
> 
> Added!


I never had much of a cooling issue on the ASUS 290, but then again, I have a fairly aggressive fan profile. I haven't even touched the fan profile on this one yet, and it runs cooler than the 290.

Aside from a slight bump to 1100 on the core, I haven't done much as it's too hot in here right now, but come nighttime...


----------



## Agent Smith1984

What do you guys make of this????

http://www.tweaktown.com/reviews/7241/sapphire-tri-radeon-r9-fury-video-card-review-hbm-water-cooler/index8.html


----------



## Jared2608

All I get from that is that the 980 Ti is a monstrous beast... It stomps the hell out of everything, my gosh!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jared2608*
> 
> All I get from that, is that the 980Ti is a monstrous beast...It stomps the hell out of everything my gosh!


lol

I was referring to the 390x performance compared to the 980 and the fury... lol


----------



## Jared2608

lol yeah I noticed that too, but then I saw the green beast rampaging around and got distracted......


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jared2608*
> 
> lol yeah I noticed that too, but then I saw the green beast rampaging around and got distracted......


It certainly is a fast card...

At $300-400 I will usually put my money on AMD, but if I were one of the folks who splurge on GPUs ($500-600), then I'd be placing my money on the 980 Ti.


----------



## pengs

Something is going on with the drivers for the Fury in that review, most definitely. It has the 290x winning against the Fury 4-5 times.
---
Nevermind, I realized that is the 8GB 290x


----------



## Agent Smith1984

Quote:


> Originally Posted by *pengs*
> 
> Something is going on with the drivers for the Fury in that review, most definitely. It has the 290x winning against the Fury 4-5 times.


Yeah, gotta be, those results are crazy.

Although, every review I've seen, even for the Fury Pro on the new drivers, show just 2 or 3 FPS difference between the 390X and the Fury..... kind of sad....

http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/15.html


----------



## pengs

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, gotta be, those results are crazy.
> 
> Although, every review I've seen, even for the Fury Pro on the new drivers, show just 2 or 3 FPS difference between the 390X and the Fury..... kind of sad....
> 
> http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/15.html


Nah, didn't realize that they were using the 8GB 290x at 4K which makes sense against a 4GB Fury.


----------



## Agent Smith1984

Quote:


> Originally Posted by *pengs*
> 
> Nah, didn't realize that they were using the 8GB 290x at 4K which makes sense against a 4GB Fury.


So why the hell is everybody downplaying the relevance of the 390 series ("rebrand," "not worth it," blah blah blah) and then turning around and touting the Fury as this breakthrough graphics card?









It is clearly not that much faster....

I mean, would you spend $250 MORE to get a Fury instead of a 390X, for 2-3 FPS???

That just seems ludicrous to me...


----------



## pengs

Yeah, the 4GB wall is going to sting for anyone who wants absolute solidity above 1080p or 1440p, but those 700 extra SPs show their strength when the card isn't memory-constrained. The Fury would be beautiful with 8GB.

If I had to pick between a Fury and a 390X, I'd go 390X because of the video memory and the thought that I could add another for CFX in a year or two without running into any VRAM limits.


----------



## drazt3ch




----------



## BackwoodsNC

I've got an Asus 390 coming to play with. It should be here tomorrow; I should note this is going to be just a backup card.


----------



## Marc-Olivier Beaudoin

Hey guys, I've just installed an ASUS 390 in my rig, and when I restarted, the BIOS splash screen was all screwed up (random colors everywhere, etc.), but then it gets to the OS boot sequence and everything returns to normal.

Why would it do that? It seems to run perfectly once it's in Windows, but the splash screen really freaks me out...


----------



## BlaXey

What is the best choice for 1080p for the next 3 years: the MSI GTX 970 fully overclocked, or the XFX R9 390 fully overclocked?

Sent from my MX4 via Tapatalk


----------



## IcarusLSC

Anyone running into the DVI issue I am having?
Thanks


----------



## Ha-Nocri

Quote:


> Originally Posted by *BlaXey*
> 
> What is the best choice for 1080p for the next 3 years: the MSI GTX 970 fully overclocked, or the XFX R9 390 fully overclocked?
> 
> Sent from my MX4 via Tapatalk


390 only because of 970's VRAM issue.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Marc-Olivier Beaudoin*
> 
> Eyh guys I've just installed an ASUS 390 in my rig and when I restarted the bios splash screen was all screwed up random colors everywhere etc, but then it gets to the OS boot up screen/sequence and everything returns to normal.
> 
> Why would it do that? It seems to run perfectly when it's in windows but the plash screen really freaks me out...


I've seen that happen on several different cards over the years.... especially my old ATI's....

Scared me at first, but it worked fine so I let it be.

While not extremely common, it does happen to some people. I think it's a motherboard/GPU issue at POST which gets sorted once the driver loads.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I got a asus 390 coming to play with. Should be here tomorrow; I should note this is going to be just a backup card.


A $330 backup?









What are you backing up with it?


----------



## By-Tor

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I got a asus 390 coming to play with. Should be here tomorrow; I should note this is going to be just a backup card.


I'd say pull the 280x out and use it as a back up and rock the 390.

Could sell the 280x and pickup a second 390 for crossfire..


----------



## BackwoodsNC

Quote:


> Originally Posted by *By-Tor*
> 
> I'd say pull the 280x out and use it as a back up and rock the 390.
> 
> Could sell the 280x and pickup a second 390 for crossfire..


I need to update that sig. I currently have a 780 Classified in the chill box. I'm waiting on voltage control for the Fury; if it doesn't come in the next month, I'm just going to get a 980 Ti Classified.

Oh, the UPS guy came real early and dropped off the 390. I wonder what it will do cold haha







will find out this weekend hopefully.


----------



## IcarusLSC

No one using DVI?


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> No one using DVI?


I will be using a DVI-D to HDMI adapter VERY SOON for 2160p at 60Hz, so I will let you know how it goes.

For now, can't comment on it.


----------



## BackwoodsNC

Quote:


> Originally Posted by *Agent Smith1984*
> 
> A $330 backup?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What are you backing up with it?


I play iRacing, so I can't be without a good backup. I don't turn settings down.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I play iracing so i can't be without a good backup, i dont turn settings down


You sound like my co-worker! lol

That guy has a GTX 780, plus a spare 780... instead of running both in SLI, he keeps one as a backup, along with an additional PSU (apparently he has blown both in the past and "missed a race"







)

You'll honestly probably find the 390 to be a pretty good card.

Not sure how iRacing does on NVIDIA vs AMD, but the 390 with some good clocking should take out a 780 pretty handily (unless you're running extreme overclocks, a modded BIOS, etc.)


----------



## jaydude

Quote:


> Originally Posted by *IcarusLSC*
> 
> No one using DVI?


I am. Haven't noticed any issues with it so far.


----------



## IcarusLSC

Quote:


> Originally Posted by *jaydude*
> 
> I am, Have not noticed any issues with it so far


Have you tried both ports? What OS are you running?


----------



## Slay

Can you tell me if the third expansion slot is blocked by the MSI 390 or the Sapphire 390?


----------



## BackwoodsNC

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You sound like my co-worker! lol
> 
> Not sure how iracing does on NVIDIA vs AMD, but the 390 with some good clocking should take out a 780 pretty handily (unless you are running extreme overclocks/modded BIOS, etc)


I was running the 780 at 1280MHz daily; I do everything extreme.

I have a habit of buying cards just to see what they clock at. I keep them for a little bit, then sell them and buy something else. I lose about 25% each time, but I always get new toys to play with.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I was running the 780 at 1280mhz daily, i do everything extreme.
> 
> I have issues with buying cards just to see what they clock at. I then keep them for a little bit and then sell it and buy something else. I lose about 25% each time but i alwaya get new toys to play with.


You sound like me! Where at in NC are you? lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *Slay*
> 
> Can you tell me if the third expansion slot is blocked by Msi 390 or Sapphire 390?


The MSI 390 definitely takes 2 1/2 slots, and so does the PCS+. Not sure on the Nitro; I believe they kept that one at two. The XFX is definitely a two-slot card....


----------



## BackwoodsNC

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You sound like me! Where at in NC are you? lol


Just north of Winston-Salem.

A side note about my 780: its DisplayPort is dead. So when I RMA it, I will probably get a 970 back, which is a nonstarter for me. I'm just going to sell that when it gets back.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Just north of Winston-Salem.
> 
> A side fact about my 780, it's display port is dead. So when i rma it i will probably get a 970 back which is a nonstater for me. Just going to sell that when it gets back.


So funny to me how many 290/290X owners sold off their perfectly working *4GB* cards to go buy the 970.... only to find out that the 970 effectively has only 3.5GB, and that above 1080p it's not even as fast.... and with the new 15.7 drivers, the 290s really pull ahead of the 970s (except in GTA V and Witcher 3 at 1080p).


----------



## Slay

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 390 definitely goes 2 1/2 slots, and so does PCS, not sure on the Nitro, I believe they kept that one at two. The XFX is definitely a two slot....


I've seen the XFX, but the lack of a 0dB fan mode turned me off.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Slay*
> 
> i've seen XFX but the lack of 0dB fan mode turnde me off.


I'd rather take an idling fan (probably around 15-20%, and practically silent) with a 35-40C idle temp (leading to cooler internal ambient temps) than make a silent fan a must-have and deal with a 60C idle temp....

If you skim through this thread, several people have either asked about how to get the idle temps lower, or stated that they immediately set a custom fan profile in order to get idle temps lower.

I personally run 25% all the time for a 33-35C idle temp. I am not a supporter of zero-fan mode and its 55C+ idle temps..... that's just me though.

If the temps don't bother you (remember that this may or may not affect CPU temps also if the GPU fills the case with hot air while not under load), and the noise DOES bother you, then by all means go with zero state...

If a 2-slot card is a must, the XFX gets my vote. If 2 slots and silence are both musts, then the Nitro is your best bet.

All indications point to the XFX being the better clocker of those two, as well as having water block support (which you seem to not be interested in since you are looking for specific slot spacing for the air cooler).

Hope this helps


----------



## brooklands

Quote:


> Originally Posted by *IcarusLSC*
> 
> Anyone running into the DVI issue I am having?
> Thanks


I'm using both DVI connectors, the HDMI port, and DisplayPort to DVI. No problems, even with my 5-meter-long cables. Did you check your DVI cable for bent pins (on both ends) or those little hooks in the port (both GPU and monitor)?

I once had a problem with a cheap DVI cable with flimsy pins, two of which got stuck in one connector hole on a DP adapter; it even worked for about 3 months before the adapter eventually gave out.


----------



## Jared2608

I'm now 99% sure I'm going to get the PCS+. With the R 900.00 I save, I can grab myself something nice, like a good headset. Or even add a bit and get an SSD.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Also, has anyone besides Duke been able to test any games at either 3200x1800 in VSR, or actual 4K?
> 
> I have been pretty impressed with how well 1 card handles 4k....
> 
> BF4 on ultra with no AA at 4K is averaging 58FPS
> 
> Crysis 3 on custom settings (with most things turned to either high or very high, and no AA) is getting around 45FPS.
> 
> I game just fine at these settings...
> 
> I know there are some who are hell bent on being 60FPS at all times (and now you have a new group who are 120FPS+ only) but my experience has been great!
> I wouldn't go back to 65FPS with 4X MSAA all max settings on Crysis 3 at 1080P for anything.... give me 4K @ 45FPS with custom settings!!
> 
> Again, this is just the beginning for me, as I'm adding another card in a few weeks (to handle witcher and GTA V)
> 
> 2 of these cards at $660 is a very capable 4K solution in my opinion.
> 
> @Duke976 How is the 4K CF thing working out for you?


What is the VRAM usage at 4K for those games? Can you try out BF3 too, if you have it? Thx


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> What is the Vram usage at 4k for those games? can u try out BF3 if you have it too? ThX


BF4 never broke 2.75GB, I don't think, and I believe Crysis was at 3-3.5GB, give or take.

That's with no AA, though. Once I add a second card and turn AA on, I expect that to go up a bit.
I do not have BF3, sorry.


----------



## aDyerSituation

Any news on full blocks? Really leaning towards a Fury when I can upgrade, so I can be guaranteed blocks.

I know the XFX card has a full block (or it matches the 290X PCB).


----------



## diggiddi

So with AA, Crysis should be at about 4GB of VRAM then?

That lines up with this:
http://www.digitalstorm.com/unlocked/video-memory-usage-at-4k-uhd-resolutions-idnum146/


----------



## IcarusLSC

I've tried two cables (one is brand new). I checked the pins and they seem fine. Both ports work until drivers are installed (15.7 on Win 7 64-bit); then the upper port stops working when Windows loads...


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> I've tried two cables (one is brand new.) I checked pins and they seem fine. Both ports work until drivers are installed (15.7 on Win 7 64bit) then the upper port stops working when Windows loads...


Did you try rolling back to 15.15 and seeing if that works?

Remember to do a complete uninstall, then run DDU in safemode, then restart and install 15.15...


----------



## IcarusLSC

No, these are the only drivers this system has ever seen. Though it does the same thing on my old computer as well: fine on both ports till I install drivers...

MSI is still saying it most likely needs an RMA, sigh...


----------



## jaydude

Quote:


> Originally Posted by *IcarusLSC*
> 
> Have you tried both ports? What OS are you running?


No I am only using DVI, Windows 7 64bit


----------



## jaydude

Might as well join the club while I am at it


----------



## Slay

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd take the idle fan (probably around 15-20%, and practically silent) with a 35C-40C idle temp (leading to cooler internal ambient temps) than make the silent fan a must have thing, and deal with a 60C idle temp....
> 
> If you skim through this thread, several people have either asked about how to get the idle temps lower, or stated that they immediately set a custom fan profile in order to get idle temps lower.
> 
> I personally run 25% all the time for a 33-35C idle temp. I am not a supporter of the zero fan and it's 55C+ idle temps..... that's just me though.
> 
> If the temps don't bother you (remember that this may or may not affect CPU temps also if the GPU fills the case with hot air while not under load), and the noise DOES bother you, then by all means go with zero state...
> 
> If 2 slot card is a must, the XFX gets my vote. If 2 slots, and silence is a must, then the Nitro is your best bet.
> 
> All indications point to the XFX being the better clocker of those two, as well as having water block support (which you seem to not be interested in since you are looking for specific slot spacing for the air cooler).
> 
> Hope this helps


I've got an Aerocool DS Cube case and I want to put a wifi card in it; that's why I asked. As for the temps, if it doesn't throttle while overclocked to the limit, it's fine. Again, if the fans are silent then sure, 10-20% is fine.

On the other hand, the Nitro has a 3x fan cooler, so wouldn't it technically be the better overclocker? Also, do both the XFX 390 and Sapphire 390 support manual voltage regulation?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Slay*
> 
> I've got Aerocool DS cube case and I want to put a wifi card in it and that's why I asked. as for the temps, if it doesn't throttle while overclocked to the limit, it's fine. Again, if fans are silent then sure, 10-20% is fine.
> 
> On the other hand, nitro has a 3x fan cooler, wouldn't it technically be the better overclocker? Also, do both the XFX 390 and Sapphire 390 support manual voltage regulation?


Sapphire and XFX both have manual voltage control.

The Nitro has the best cooler for sure, but so far I have seen some dud clockers... mind you, I have no solid OC information on the XFX cards since there aren't any solid OCing reviews for it, and I still have a very small sample of OC results for Sapphire. If I had to guess, the MSI is outselling every other 390 4-to-1 or better right now.

Of the two cards (XFX and Sapphire) though, I would recommend the Sapphire for those planning to stay on air cooling, and the XFX for anyone wanting to go water cooling.

Edit: Keep in mind that the Nitro 390 requires two 8-pin connectors for power. Some people care, some don't.....

Also, the XFX card is a little shorter, if space is an issue...

AND the XFX includes a Ruby mousepad, a free XFX Hat, and a game code right now on newegg.

That's a pretty nice package....









Edit2: Das' wut she said


----------



## Slay

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sapphire and XFX both have manual voltage control.
> 
> The Nitro has the best cooler for sure, but so far I have seen some dud clockers... mind you, I have no solid OC information on the XFX cards since there aren't any solid OCing reviews for it., and still have a very small sample of OC results for Sapphire. If I had to guess, the MSI is outselling every other 390 4 to 1 or better right now.
> 
> Of the two cards (XFX and Sapphire) though, I would recommend the Sapphire for those planning to stay on air cooling, and the XFX for anyone wanting to go water cooling.
> 
> Edit: Keep in mind that the Nitro 390 requires 2) 8 pin connectors for power. Some people care, some don't.....
> 
> Also, the XFX card is a little shorter, if space is an issue...
> 
> AND the XFX includes a Ruby mousepad, a free XFX Hat, and a game code right now on newegg.
> 
> That's a pretty nice package....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit2: Das' wut she said


Well, Newegg doesn't really concern me as I'm from Poland, and space is only an issue if the card is longer than 32cm, which it isn't. Dual 8-pin = MOAR POWAH, MOAR FPS. As for water cooling, not in my budget right now, and even if it were, it'd be for something like a 590X. Of course I'd get the 390 from MSI if not for the 2.5-slot design. Also, the Sapphire looks better, but if anything I could just spray paint the XFX white. I actually had plans regarding the White Frozr, but again, too thick.


----------



## IcarusLSC

Quote:


> Originally Posted by *jaydude*
> 
> No I am only using DVI, Windows 7 64bit


Sorry, the mobile version doesn't let me see what card you have. I've got an MSI, which has two DVI ports...


----------



## Agent Smith1984

Quote:


> Originally Posted by *Slay*
> 
> Well, newegg doesn't really concern me as I'm from Poland, the space is only an issue if the card is longer that 32cm, which it isnt. Dual 8-pin= MOAR POWAH, MOAR FPS. About water cooling, not my budget right now, and even if I did, It'd be something like a 590x. Of course I'd get the 390 from MSI if not for the 2,5 slot design. Also, the Sapphire looks better, but if anything I could just spray paint the XFX white. I actually had plans regarding WhiteFrozr but again, too thick.


Don't get me wrong, I really like the Nitro, and I myself had two Tri-X 290 OCs, which is basically the predecessor to the Nitro. Those cards did great, but I feel like Sapphire should have put a back plate on the Nitro.
XFX has one, PowerColor has one, MSI has one.... Sapphire should have matched! It looks SO much better having a back plate.


----------



## Jared2608

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Don't get me wrong, I really like the Nitro, and I myself had 2) Tri-X 290 OC's which is basically the predecessor to the Nitro. Those cards did great, but I feel like Sapphire should have put a back plate on the Nitro.
> XFX has one, PowerColor has one, MSI has one.... Sapphire should have matched! It looks SO much better having a back plate.


Definitely looks better, and I feel better having a backing plate on a card packing such big heatsinks and fans.


----------



## jaydude

Quote:


> Originally Posted by *IcarusLSC*
> 
> Sorry, mobile version doesn't let me see what card you have. I've got a MSI which has two DVI ports...


Ah, my 390 only has one DVI port


----------



## BackwoodsNC

Any way to get more voltage control, like the reference 290X's (200mV+)?


----------



## Duke976

*Sapphire Nitro R9 390 Review*

http://lanoc.org/review/video-cards/7105-sapphire-nitro-r9-390
Quote:


> Price wise the Sapphire Nitro R9 390 falls into an interesting spot. Officially its direct competitor is the GTX 970 and it out performs it by a good margin in every performance test I run.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BackwoodsNC*
> 
> 
> 
> 
> Anyway to get more voltage control like the 290x's? reference(200mv+).


It should do 200 in Trixx, but that's all you're gonna get out of these things.... The Nitro with 8+8 should hold it easily


----------



## Agent Smith1984

I need to admit something.....

My name is Brandon
And I'm a 4K-holic..

Seriously!


----------



## tbob22

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Don't get me wrong, I really like the Nitro, and I myself had 2) Tri-X 290 OC's which is basically the predecessor to the Nitro. Those cards did great, but I feel like Sapphire should have put a back plate on the Nitro.
> XFX has one, PowerColor has one, MSI has one.... Sapphire should have matched! It looks SO much better having a back plate.


Yep, If Sapphire would have put a back plate on the Nitro I probably would have gotten it over the Powercolor.

On another note, I removed Afterburner because it conflicts with some apps _(Kodi crashes, for starters)_ and I'm back to the default profile. I'm not sure if the thermal paste/pads just needed some breaking in or what, but it's sitting here under the same circumstances as before and the core is at 45C with the fan off; the VRMs are at a comfortable 45-50C.


----------



## Slay

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Don't get me wrong, I really like the Nitro, and I myself had 2) Tri-X 290 OC's which is basically the predecessor to the Nitro. Those cards did great, but I feel like Sapphire should have put a back plate on the Nitro.
> XFX has one, PowerColor has one, MSI has one.... Sapphire should have matched! It looks SO much better having a back plate.


Thankfully not a problem for me: horizontal mobo orientation, cooler facing the window. Less strain on the components (like that would matter).

On the other hand, I've heard that the Nitro actually doesn't have a 0dB fan mode (might've been a different GPU tho) and the fans are REALLY loud.


----------



## Apihl1000

Quote:


> Originally Posted by *Slay*
> 
> Thankfully not a pfoblem for me, horizontal mobo orientation, cooler facing the window. Less strain on the components (like that would matter).
> 
> On the other hand i've heard that nitro actually doesn't have 0dB fan mode (might've been a different GPU tho) and the fans are REALLY loud.


It does have a 0dB fan mode, but some people get confused because it reports 1200 RPM at 0% fan speed. It does run in 0dB mode; it just reports some weird RPM.


----------



## Apihl1000

Quote:


> Originally Posted by *aDyerSituation*
> 
> Any news on full blocks? Really leaning towards Fury when I can upgrade so I can be guaranteed blocks.
> 
> I know the xfx card has a full block/or it matches the 290x pcb


The Fury has a full-cover waterblock. For the R9 390(X), the XFX, PowerColor, and Asus cards have full-cover waterblocks.


----------



## aDyerSituation

Quote:


> Originally Posted by *Apihl1000*
> 
> The Fury got a fullcover waterblock. R9 390(x) XFX, Powercolor and Asus got a fullcover waterblock.


Links? Don't see them anywhere


----------



## BigMepp

So my MSI R9 390 arrived today! I made the choice over a GTX 970 largely thanks to reading through this thread, many thanks! Now which drivers do I install? I've heard there have been problems with some, so I would appreciate a little advice.


----------



## Duke976

Quote:


> Originally Posted by *BigMepp*
> 
> So my MSI R9 390 arrived today! I made the choice over a GTX 970 largely thanks to reading through this thread, many thanks! Now which drivers do I install? I've heard there have been problems with some, so I would appreciate a little advice.


Congrats, you have acquired one fine card. The most up-to-date driver, 15.7, is linked on the first page of this thread.


----------



## By-Tor

Quote:


> Originally Posted by *aDyerSituation*
> 
> Links? Don't see them anywhere


Type your card into the first search box

http://configurator.ekwb.com/


----------



## BigMepp

Quote:


> Originally Posted by *Duke976*
> 
> Congrats, you have acquired one fine card. The most updated driver is in the 1st page of this thread which is 15.7.


Thank you very much! I look forward to getting it up and running.


----------



## Agent Smith1984

So, as I said, I have become addicted to 4K....

You can literally sit 3 feet away from this 55" 4K TV, play any game with NO AA, and see no stepping at all.... it's amazing!

Anyways.....

I figured I would post up the following three screen shots of MSI Afterburner from a 15 minute session of Skyline in Crysis 3....

Note a few things....

VRAM usage is ONLY 1.7GB!!!!!

This is done with max settings, except system spec set to high, instead of very high, and no AA

FPS is averaging 57 which is right below the 60Hz limit at 4k, so it's perfect. Keep in mind this is a single R9 390 @ 1160/1700 with 50+mv core, and 50+mv AUX (my daily settings), and it never breaks 80C with my fan profile.

That is f'n impressive for a $330 card guys!!!!


----------



## gatygun

Quote:


> Originally Posted by *BigMepp*
> 
> So my MSI R9 390 arrived today! I made the choice over a GTX 970 largely thanks to reading through this thread, many thanks! Now which drivers do I install? I've heard there have been problems with some, so I would appreciate a little advice.


Good choice, 8GB vs 3.5GB. No more worries about VRAM for you








Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, as I said, I have become addicted to 4K....
> 
> You can literally sit 3 feet away from this 55" 4K TV, play any game with NO AA, and see no stepping at all.... it's amazing!
> 
> Anyways.....
> 
> I figured I would post up the following three screen shots of MSI Afterburner from a 15 minute session of Skyline in Crysis 3....
> 
> Note a few things....
> 
> VRAM usage is ONLY 1.7GB!!!!!
> 
> This is done with max settings, except system spec set to high, instead of very high, and no AA
> 
> FPS is averaging 57 which is right below the 60Hz limit at 4k, so it's perfect. Keep in mind this is a single R9 390 @ 1160/1700 with 50+mv core, and 50+mv AUX (my daily settings), and it never breaks 80C with my fan profile.
> 
> That is f'n impressive for a $330 card guys!!!!


It always boggled my mind, though, how the 290/390 hardly takes any hit at higher resolutions, while my older card would die out instantly.


----------



## sinholueiro

What brands do you recommend for a 390? Which brand is best at cooling the VRMs? I was thinking of the Sapphire Nitro or the XFX. What do you think?


----------



## jackalopeater

Quote:


> Originally Posted by *sinholueiro*
> 
> What brands do you recommend to get a 390? What's the best brand colling the VRMs? I was thinking in the Shapphire Nitro or the XFX. What do you think?


Sapphire/XFX/MSI should be good in that department. I love my Asus DCUII 390X, but on the stock fan profile VRM1 pushes 100C after a 2-hour BF4 stint; pushing the fan to 50% drops it down to the mid 80s, but with the noise to match


----------



## ManofGod1000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, as I said, I have become addicted to 4K....
> 
> You can literally sit 3 feet away from this 55" 4K TV, play any game with NO AA, and see no stepping at all.... it's amazing!
> 
> Anyways.....
> 
> I figured I would post up the following three screen shots of MSI Afterburner from a 15 minute session of Skyline in Crysis 3....
> 
> Note a few things....
> 
> VRAM usage is ONLY 1.7GB!!!!!
> 
> This is done with max settings, except system spec set to high, instead of very high, and no AA
> 
> FPS is averaging 57 which is right below the 60Hz limit at 4k, so it's perfect. Keep in mind this is a single R9 390 @ 1160/1700 with 50+mv core, and 50+mv AUX (my daily settings), and it never breaks 80C with my fan profile.
> 
> That is f'n impressive for a $330 card guys!!!!


Yep, I am jealous.







On my R9 290 stock clocks and FX 8350 at 4.4 Ghz, the best I get is around 32fps or so in Crysis 3 at medium settings. (4k resolution on Windows 10 Pro.) However, it still plays smooth and looks great anyways, just not as good as your setup. Am I missing something maybe or is the R9 390 just that much better?


----------



## diggiddi

Quote:


> Originally Posted by *ManofGod1000*
> 
> Yep, I am jealous.
> 
> 
> 
> 
> 
> 
> 
> On my R9 290 stock clocks and FX 8350 at 4.4 Ghz, the best I get is around 32fps or so in Crysis 3 at medium settings. (4k resolution on Windows 10 Pro.) However, it still plays smooth and looks great anyways, just not as good as your setup. *Am I missing something maybe or is the R9 390 just that much better*?


And that is exactly why I started this thread
http://www.overclock.net/t/1565660/lets-settle-this-290-x-vs-390-x-debate-once-and-for-all

BTW still waitin on you Agent Smith


----------



## IcarusLSC

Anyone with the MSI 390 able to check if they have same issue as I with the DVI ports please? MSI wants me to RMA it, but I think it's drivers...


----------



## IcarusLSC

I don't know what to think, and I'm really not happy that yet again I am screwing around with drivers and cards; if I wanted to do this I would have kept my 970s...

On my old system I uninstalled the new 15.7 drivers as MSI asked me to try old drivers (the ones that came on disc, 15.2), and now neither DVI port works right away... unless I unplug and replug the monitor, then both ports work!

This is getting old fast. No wonder consoles are taking over...


----------



## tbob22

Anyone getting micro stutter? I get a stutter about once every 2 seconds or so. It happens in every game no matter how I change the settings, it's driving me a bit crazy.

Already disabled ULPS, and tried the "Without PowerPlay support" and "Disabled" overclocking modes. Tried using RadeonPro to keep the clocks at max. Tried all kinds of different settings, forcing triple buffering, etc. It doesn't seem to happen without Vsync, but that is not an option. I cannot stand tearing.

My 7950 did not have this issue, it was as smooth as butter as long as the FPS was at 60.


----------



## IcarusLSC

Not very helpful to ya, but I've not run into stutter with this new 390 (one of the main reasons I ditched my 970's was stuttering all the time!)


----------



## Jared2608

Quote:


> Originally Posted by *tbob22*
> 
> Anyone getting micro stutter? I get a stutter about once every 2 seconds or so. It happens in every game no matter how I change the settings, it's driving me a bit crazy.
> 
> Already disabled ULPS, tried overclocking modes Without PowerPlay support and Disabled. Tried using Radeonpro to keep the clocks at max. Tried all kinds of different settings, forcing triple buffering, etc. It doesn't seem to happen without Vsync, but that is not an option. I cannot stand tearing.
> 
> My 7950 did not have this issue, it was as smooth as butter as long as the FPS was at 60.


I'm about to buy this exact card in a few days...This is not good news...


----------



## tbob22

Quote:


> Originally Posted by *Jared2608*
> 
> I'm about to buy this exact card in a few days...This is not good news...


It is very subtle.. But if you are sensitive to that kind of thing it will be very annoying.
It could just be something with my setup. I will update if I find a fix.


----------



## Jared2608

Yeah maybe it's just a case of something not playing nicely, I wonder if any other PCS+ owners have had this issue?

Have you tried totally removing the drivers, maybe using Driver Sweeper, then reinstalling again from scratch after a restart?


----------



## tbob22

Quote:


> Originally Posted by *Jared2608*
> 
> Yeah maybe it's just a case of something not playing nicely, I wonder if any other PCS+ owners have had this issue?
> 
> Have you tried totally removing the drivers, maybe using drive sweeper then reinstalling again from scratch after a restart?


Not yet. I'll try that next.

I wonder if it's a frame time thing. I can't imagine the frame time being worse on the 390 than on the 7950. This is basically what I'm seeing, but only once every few seconds; it isn't nearly as bad as the video shown there.
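For what it's worth, a spike like that shows up clearly if you log frame times (Afterburner and FRAPS can both dump them). Here's a quick sketch of how I'd flag the hitches, assuming a plain list of milliseconds-per-frame; the 2x-median threshold is just a guess, not anything official:

```python
# Sketch: flag micro-stutter spikes in a frame-time log (milliseconds).
# Assumes one frame time per entry; the 2x-median threshold is arbitrary.
from statistics import median

def find_stutters(frame_times_ms, factor=2.0):
    """Return (index, ms) for frames that took far longer than typical."""
    typical = median(frame_times_ms)
    return [(i, t) for i, t in enumerate(frame_times_ms)
            if t > factor * typical]

# 60 FPS is ~16.7 ms/frame; a single 50 ms frame reads as a visible hitch.
log = [16.7] * 120
log[60] = 50.0
print(find_stutters(log))  # -> [(60, 50.0)]
```

If the flagged frames land on a regular interval (like every ~2 seconds), that usually points at something periodic (polling software, clock state changes) rather than the game itself.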


----------



## Jared2608

I have a funny feeling the drivers will solve this. I know it doesn't make sense that you should have to do this, but I think this might just sort you out.


----------



## tbob22

Quote:


> Originally Posted by *Jared2608*
> 
> I have a funny feeling the drivers will solve this. I know it doesn't make sense that you should have to do this, but I think this might just sort you out.


Well I cleaned the drivers out and reinstalled. Still having the same issue. :/

Next up, grab an old drive and install a fresh version of Windows and see if it has the same issue.


----------



## Jared2608

Quote:


> Originally Posted by *tbob22*
> 
> Well I cleaned the drivers out and reinstalled. Still having the same issue. :/
> 
> Next up, grab an old drive and install a fresh version of Windows and see if it has the same issue.


Will be interested to see what happens


----------



## herho

Hi,

How are your 390's VRMs doing?

During a modest overclock (1130/1600), VRM1 is reaching like 90C; VRM2 about 20C less. (MSI 390 Gaming)

Running stock clocks now, slightly undervolted: 70C/46C during FireStrike.

Is it worth doing some modifications on the original cooler? Maybe replacing the pads or what not?


----------



## Agent Smith1984

Quote:


> Originally Posted by *tbob22*
> 
> Well I cleaned the drivers out and reinstalled. Still having the same issue. :/
> 
> Next up, grab an old drive and install a fresh version of Windows and see if it has the same issue.


Are you seeing the stutter in everything you play?

Have you tried using the frame target in CCC instead of the VSYNC within the game?

Also, have you tried disabling the pagefile?

I have been running with zero pagefile lately, and everything runs so much better. You've got plenty of RAM so it should work nicely.


----------



## Agent Smith1984

Quote:


> Originally Posted by *herho*
> 
> Hi,
> 
> How is your 390´s VRM´s doing?
> 
> During modest overklock (1130/1600) VRM 1 is reaching like 90c. VRM 2 about 20c less. (MSI 390 Gaming)
> 
> Running stock Clocks now, slightly undervolted: 70c/46c during Firestrike.
> 
> Is it worth doing som modifications on the original cooler? Maybe replacing the pads or what not?


They can get hot; I recommend cranking the fans up! I run my fans anywhere between 75-95% depending on how hot the card gets (a custom profile that scales with the temp).
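A temp-scaled profile like that is really just linear interpolation between a few (temp, fan%) points. Here's a rough sketch; only the 75-95% range comes from my profile, the actual curve points below are made up for illustration:

```python
# Sketch: a custom fan curve scaling fan % with core temp.
# Curve points are illustrative, not my exact Afterburner settings.
CURVE = [(0, 25), (60, 75), (85, 95), (100, 100)]  # (temp C, fan %)

def fan_speed(temp_c):
    """Linearly interpolate fan % between the curve points."""
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_speed(60))    # -> 75.0
print(fan_speed(72.5))  # halfway between 75 and 95 -> 85.0
```

Afterburner's fan curve editor is doing the same thing under the hood: you drag the points, it interpolates between them.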


----------



## herho

I'm running the default fan profile, so the fans are at 100%.

I'm a bit curious whether everyone else has different temperatures for VRM1 and VRM2.

If not, I really need to check the VRM pads.


----------



## sonicpete

I'm a bit late to the party. This is my first AMD card in years. I got a new FreeSync monitor, the MG279Q, as a gift and was going to get a Fury, but the 390X was at a great price.

My early overclocking fun with this card has been pretty good, but I can't seem to get it past the memory clock wall of 1680


----------



## Agent Smith1984

Quote:


> Originally Posted by *sonicpete*
> 
> 
> 
> Im a bit late to the party. This is my first AMD card for years. I got a new freesync monitor the MG279q as a gift and was going to get a fury but the 390x was at a great price.
> 
> My early overclocking fun with this card has been pretty good. i cant seem to get it past the memory clock wall of 1680


Give it about 50mv of aux voltage and the memory will keep climbing









Btw, welcome to the club, I'll get you on the roster sheet tomorrow morning.


----------



## Overcloquero

Hi people.

My new acquisition is the Sapphire Nitro R9 390. At stock voltage it does 1075/1675; at 1100/1700 it artifacts on stock voltage. I haven't tried more voltage yet.

So, my basic overclock is:

*Overcloquero - R9 390 Sapphire Nitro - Stock 1010/1500 - Overclock 1075/1675 - Voltage - Stock default*

Regards; it's a good graphics card for the price.


----------



## sonicpete

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Give it about 50mv of aux voltage and the memory will keep climbing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw, welcome to the club, I'll get you on the roster sheet tomorrow morning.


Thanks for the advice. It's a fun card to overclock. With your advice I have 1700 stable now and will keep pushing. Temps don't really seem to be a problem on my card at all, which has surprised me


----------



## IcarusLSC

I took my card to the computer shop and asked them to test it, and on the one system they had it ran on both DVI ports without drivers, but when they installed them they got black screens on both ports!
So we don't know what to think. They ordered another MSI card for me to see if it still does it, but they aren't sure how long it will take to get here.


----------



## Jared2608

I'm interested to see what tbob comes up with when he does a fresh install of Windows, to see if the stuttering still happens.


----------



## jaydude

Quote:


> Originally Posted by *tbob22*
> 
> Anyone getting micro stutter? I get a stutter about once every 2 seconds or so. It happens in every game no matter how I change the settings, it's driving me a bit crazy.
> 
> Already disabled ULPS, tried overclocking modes Without PowerPlay support and Disabled. Tried using Radeonpro to keep the clocks at max. Tried all kinds of different settings, forcing triple buffering, etc. It doesn't seem to happen without Vsync, but that is not an option. I cannot stand tearing.
> 
> My 7950 did not have this issue, it was as smooth as butter as long as the FPS was at 60.


I found that using anti-aliasing in The Witcher 3 caused micro-stutter. I haven't tried much else, but you might want to try disabling any AA you may be using; I'm thinking it could be a driver issue.


----------



## abcanw

I bought an XFX 390, but unfortunately it has really bad coil whine, so I'm returning it tomorrow. Since there is really not much information about the XFX here, I thought I would share some.
It has a BIOS switch, so that makes it a dual-BIOS card.




10 min in Kombustor stock speeds


after one heaven benchmark stock speeds


after around 30 min in heaven at stock speeds

Since I will return it tomorrow, I don't know what to get instead. I would love to get the MSI, but I will be using an ITX case (Cooler Master Elite 130) which can only take 2-slot cards, and the Sapphire and Asus cost $20 more than the XFX on Amazon. Should I get another XFX?

If you want more info about the XFX I will be glad to provide as long as I have one


----------



## tbob22

Quote:


> Originally Posted by *Jared2608*
> 
> I'm interested to see what tbob comes up with when he does a fresh install of Windows, to see if the stuttering still happens.


Well, the stuttering is now mostly gone. I don't really know what changed. I did another driver clean/reinstall and the stuttering is gone. Portal Stories: Mel still stutters a bit, but it is barely noticeable. Trackmania Valley was stuttering really bad, but it is now completely gone. The Talos Principle does not stutter anymore. Other games were stuttering a bit but I don't notice anything now.

On another note, VSR works really well. 2560x1600 is quite nice for certain applications (Photoshop/Lightroom), just need to make sure to set "Let me choose one scaling level for all displays" and set it to 100%. The text is not perfectly sharp and is a bit small sometimes, but the desktop real estate is much better.


----------



## vavyn

Woohoo..New to the club. Sapphire R9 390 Nitro @ stock


----------



## IcarusLSC

I'm surprised I am the only one with an MSI card that's using DVI...


----------



## DMC911

Hello guys, I have a question and I hope you guys could help me.








In your opinion, what is the highest temperature a stock MSI R9 390 could reach under heavy gaming in a *30C* room (both GPU and VRM)? I plan on getting an MSI R9 390 next week, but I'm worried about the temperature since I live in a tropical country and the average temp here is around 30C.
Thanks guys.


----------



## Duke976

Quote:


> Originally Posted by *abcanw*
> 
> I bought an XFX 390 but unfortunately it has a really bad coil whine so im returning it tomorrow unfortunately. Since there is really not much information about the XFX here i thought i would share some.
> It has a bios switch so that makes it a dual bios card.
> 
> 
> 
> 
> 10 min in Kombustor stock speeds
> 
> 
> after one heaven benchmark stock speeds
> 
> 
> after around 30 min in heaven at stock speeds
> 
> since I will return it tomorrow I don't know what to get instead, I would love to get the MSI but I will use an ITX case (Coolermaster elite 130) which can only take 2 slots card, and the Sapphire and Asus cost 20$ more than XFX on Amazon. should I get another XFX?
> 
> If you want more info about the XFX I will be glad to provide as long as I have one


My first XFX 390 actually was a very good overclocker, as it only needed 50mV to do 1150 core and 1750 mem. If you can get one from Best Buy, all the better. If my memory is correct, I got 81% ASIC on that XFX of mine from Best Buy.


----------



## Duke976

Quote:


> Originally Posted by *DMC911*
> 
> Hello guys, I have a question and I hope you guys could help me.
> 
> 
> 
> 
> 
> 
> 
> 
> In your opinion, what is the highest temperature a stock MSI R9 390 could reach under heavy gaming in a *30c* room(both GPU and VRM)? I plan on getting a MSI R9 390 next week but I'm worried about the temperature since I live in a tropical country and the average temps here is around 30c.
> Thanks guys.


That will be difficult to answer since we have different room temps, but if you want the MSI you can just use a custom fan profile to compensate for your room temperature.


----------



## tbob22

Quote:


> Originally Posted by *DMC911*
> 
> Hello guys, I have a question and I hope you guys could help me.
> 
> 
> 
> 
> 
> 
> 
> 
> In your opinion, what is the highest temperature a stock MSI R9 390 could reach under heavy gaming in a *30c* room(both GPU and VRM)? I plan on getting a MSI R9 390 next week but I'm worried about the temperature since I live in a tropical country and the average temps here is around 30c.
> Thanks guys.


It will get very hot unless you run very high fan speeds. You'd be better off with a Powercolor PCS+ if you want a backplate or a Sapphire Nitro if you don't care about that. Or just get a custom cooler like the Accelero Xtreme IV.

MSI 390x:
Guru3d: Noise, Temps
Techpowerup: Noise, Temps

Powercolor PCS+ 390:
Guru3d: Noise, Temps
Techpowerup: Noise, Temps

I know that is a 390x vs a 390, but the temps should be very similar. Maybe 2-3c difference max. Maybe slightly more because of the clocks on the MSI.
390 vs 390x temps

Going by that, the Powercolor should be a solid 8-9c cooler at the same or similar noise level.

On my Powercolor the core maxes at 66c, VRM1 maxes at 76c and VRM2 maxes at 65c running Furmark (20min) at 22c ambient. I'm not sure how the MSI does in that regard.

At 1060mhz/1600mhz it maxes around 68c on the core, 78c on VRM1. and 68c on VRM2.
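Rough math for that 30C room: at a fixed fan speed, load temps roughly track ambient 1:1, so you can project my 22C-ambient readings upward. A back-of-the-envelope sketch (the constant delta-over-ambient is just a rule of thumb, not a measurement):

```python
# Sketch: project load temps to a hotter room, assuming the
# delta-over-ambient stays constant at a fixed fan speed (rough rule).
def estimate(temp_at_ref_c, ref_ambient_c, target_ambient_c):
    """Keep the measured rise over ambient, shift the baseline."""
    delta = temp_at_ref_c - ref_ambient_c
    return target_ambient_c + delta

# My PCS+ Furmark readings at 22C ambient, projected to a 30C room:
for name, t in [("core", 66), ("VRM1", 76), ("VRM2", 65)]:
    print(name, estimate(t, 22, 30))  # core 74, VRM1 84, VRM2 73
```

So even the cooler cards would be pushing the mid 80s on VRM1 in a 30C room, which is why I'd lean away from the hotter-running MSI there.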


----------



## DMC911

Thank you Duke976 and tbob22 for your help







Unfortunately, the MSI is the only one available here.
Another question, if you don't mind: would it be possible for the MSI R9 390 to keep the temp below 80C with an acceptable fan speed and noise?
Thanks again, guys.


----------



## tbob22

Quote:


> Originally Posted by *DMC911*
> 
> Thank you Duke976 and tbob22 for your help
> 
> 
> 
> 
> 
> 
> 
> . Unfortunately,the MSI is the only one available here.
> Another question if you don't mind: would it be possible for the MSI R9 390 to keep the temp below 80c with an acceptable fan speed and noise ?
> Thanks again guys .


Depends on what you consider acceptable noise... At full speed it should keep it under 80C at those ambient temps, but it will be quite loud. It also depends on the load.


----------



## Jared2608

Quote:


> Originally Posted by *tbob22*
> 
> Well, the stuttering is now mostly gone. I don't really know what changed. I did another driver clean/reinstall and the stuttering is gone. Portal Stories: Mel still stutters a bit, but it is barely noticeable. Trackmania Valley was stuttering really bad, but it is now completely gone. The Talos Principle does not stutter anymore. Other games were stuttering a bit but I don't notice anything now.
> 
> On another note, VSR works really well. 2560x1600 is quite nice for certain applications (Photoshop/Lightroom), just need to make sure to set "Let me choose one scaling level for all displays" and set it to 100%. The text is not perfectly sharp and is a bit small sometimes, but the desktop real estate is much better.


Great news, glad you got your card to work. I guess there might be a few driver issues still but I think AMD will get it right.

The PCS+ only comes in stock tomorrow, and I get paid on the 31st so hopefully there's still stock or I'll have to save a bit longer for the MSI part.


----------



## Amhro

Quote:


> Originally Posted by *IcarusLSC*
> 
> I'm surprised I am the only one with a MSI card that's using DVI...


I have MSI too and I'm using DVI port. Top port works like a charm, no idea about second port since I have never used it.


----------



## ljh08

Has anyone tried the Asus Strix 390X? I was gonna get the Gigabyte one, but now I'm skeptical about the locked voltage.


----------



## henboy7777

Would like to join.
ASUS Direct CU II Stock Cooler


----------



## Duke976

Quote:


> Originally Posted by *henboy7777*
> 
> Would like to join.
> ASUS Direct CU II Stock Cooler


Congrats and welcome to the forum. Kindly give us an update on your temp and overclock when you benchmark the card. Thanks


----------



## IcarusLSC

Quote:


> Originally Posted by *Amhro*
> 
> I have MSI too and I'm using DVI port. Top port works like a charm, no idea about second port since I have never used it.


Thanks for letting me know at least, doesn't seem like anyone is willing to try, so I'll return or exchange the card like MSI suggested...


----------



## Duke976

Quote:


> Originally Posted by *IcarusLSC*
> 
> Thanks for letting me know at least, doesn't seem like anyone is willing to try, so I'll return or exchange the card like MSI suggested...


I would have helped you, but unfortunately my monitor (PB287Q) doesn't have DVI.


----------



## IcarusLSC

Thanks anyways.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Overcloquero*
> 
> Hi people.
> 
> My new acquisition is the Sapphire Nitro R9 390. At stock voltage it runs 1075/1675; at 1100/1700 I get artifacts at stock voltage. I haven't tried adding more voltage yet.
> 
> So, my basic overclock is:
> 
> *Overcloquero - R9 390 Sapphire Nitro - Stock 1010/1500 - Overclock 1075/1675 - Voltage - Stock default*
> 
> Regards and good graphic card for the price.


Please see post 1 for requirements to join roster. Just a simple screen shot of GPU-Z with a note document open showing your OCN name is the simplest form of proof in my opinion.

Thanks


----------



## sonicpete

I have been having a lot of fun testing this card. Thanks to Agent Smith1984's advice, I managed to break the 1700 barrier on my memory clock with +50mv on the aux.

I can get 1725 stable, so I will keep pushing for 1750. I can get 1165 on the core clock with +65mv, with temperatures in the 70s at full load.

I'm glad I didn't spend the extra money on the Fury. The 390X is very capable at 2560x1440.


----------



## henboy7777

Hi, I haven't overclocked, as my ASRock Z97 Anniversary is a cheap Z97 board and my PSU is only 650W, so there's not much headroom for OCing.

Temps in the Furmark Extreme Burn-In Test reached about 71 degrees Celsius before I stopped, as I didn't think it was an accurate representation of temps during actual gaming.
The DirectCU II cooler is surprisingly cool and quiet. During games it reached about 65 degrees, and it has never passed the 70 degree mark.

At 100% load the fan was at only 30%.


----------



## sinholueiro

I think a 650W PSU is powerful enough to do a good overclock on a single 390.


----------



## henboy7777

I think I could get away with a little overclock, but my PSU is said to be low quality (Tier 4 on the PSU hierarchy put together on Tom's Hardware). I also don't think my motherboard would be that good for OCing, as it only has a 4-phase power design. It was the cheapest Z97 board.


----------



## BigMepp

Apologies for the newb question, but I would like to benchmark my R9 390. What is the best free benchmarking software, and where can I find some scores for the R9 390?


----------



## Slowpoke66

Quote:


> Originally Posted by *BigMepp*
> 
> Apologies for the newb question but I would like to benchmark my r9 390 and I was wondering what is the best free benchmarking software and where can I find some scores for the r9 390?


Unigine's Valley/Heaven and 3DMark's Firestrike (free demo, but worth buying).


----------



## henboy7777

Yeah, Unigine Valley/Heaven and 3DMark 11/Firestrike are good. I did benchmark my card, and I'm not too sure if performance is up to par with other 390s.

Unigine Valley - 1080p, Ultra preset, 8x MSAA and Extreme tessellation - 54.8-55.8 FPS (i5 4670K). Is this good?

3DMark 11 - P12584 - graphics score 17068

Furmark - 1080p preset - 71 FPS avg - 4304 points

Do these results look right?


----------



## Overcloquero

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Please see post 1 for requirements to join roster. Just a simple screen shot of GPU-Z with a note document open showing your OCN name is the simplest form of proof in my opinion.
> 
> Thanks


Good morning Agent Smith1984. Sorry, I didn't know about the GPU-Z screenshot for validating my overclock. Sorry again.

This is the screen shot GPU-Z and my name:

http://www.subirimagenes.net/i/150728091246418303.png

Thanks a lot and regards


----------



## 2M1VFO

My mild overclock, still getting over the newness of the card, so not quite ready to start stuffing more voltage in yet. 1100 / 1600 MHz for core / memory clock.


----------



## LtAldoRaine

Hi, and welcome all. This is my first post in the forum thread. I'm from Warsaw, Poland.
I own an MSI 390X. Using the MSI Gaming App, I'm now at a 1100MHz GPU clock and 1525MHz memory. I may change the OC settings later.

Thanks all, and sorry, my English is poor.


----------



## BigMepp

Thanks for the advice on the benchmarking software lads! Not clocked yet


----------



## Gasbah

Hello from Finland!
My second post in here. I have a 4690K and two R9 390's running in Crossfire. I already posted the picture in the Corsair Air 540 owners club, but I'm gonna share it here too. Love what you guys are doing here!
Running the GPUs at 4K gets pretty hot, so I'm looking forward to watercooling solutions in the future. I have never done custom water cooling, so I'm a noob there.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gasbah*
> 
> Hello from Finland!
> My second post in here. I have a 4690k and two R9 390's running in crossfire. I already posted the picture in Corsair air 540 owners club but Im gonna share it here too. Love what you guys are doing here!
> Running the gpu's at 4k gets pretty hot so Im looking forward to watercooling solutions in the future. I have never done custom water cooling so Im a noob there.


Perhaps some more info on your cards, and a screenshot of GPU-Z... we could get you added to the club....

Whether you can go custom watercooling or not (on the entire GPU) will be dependent on which cards you have....


----------



## Gasbah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Perhaps some more info on your cards, and a screenshot of GPU-Z... we could get you added to the club....
> 
> Whether you can go custom watercooling or not (on the entire GPU) will be dependent on which cards you have....


MSI R9 390. I read that 290 blocks are not compatible with MSI.
Here's my shot..


----------



## IcarusLSC

No one running DVI yet that can see if they have the same issue as me?

How about VRM temps: do both read on your MSI, or is VRM2 stuck at one temp (disabled) no matter what?


----------



## Jared2608

Quote:


> Originally Posted by *IcarusLSC*
> 
> No one running DVI yet that can see if they have the same issue as me?
> 
> How about VRM temps: do both read on your MSI, or is VRM2 stuck at one temp (disabled) no matter what?


Why not have a look at the Crossover monitor thread, those monitors only have dual link dvi and surely someone there has an R9-390.

http://www.overclock.net/t/1555354/official-crossover-2795qhd-pwm-flicker-free-overclock-able-matte-1440p-ah-ips

Maybe someone there has feedback for you?


----------



## IcarusLSC

Most of the ppl here have the MSI, and this is the "official" thread for it, so thought I'd have the best luck here.


----------



## LongRod

Quote:


> Originally Posted by *IcarusLSC*
> 
> Most of the ppl here have the MSI, and this is the "official" thread for it, so thought I'd have the best luck here.


I run DVI on my card, both ports are working perfectly with drivers from 15.15, 15.20, and 15.7.


----------



## Jared2608

I'm sure you will, but maybe someone there has used that DVI monitor on that card? Still, someone in this thread must have done the same?


----------



## Agent Smith1984

Wish I could help man....

I still have the red plastic covers on my DVI ports....


----------



## IcarusLSC

Thanks LongRod, hopefully more will chime in and check it.

Has anyone else noticed the VRM 2 issue? I'm sure all you guys messing with these would notice that...


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> Thanks LongRod, hopefully more will chime in and check it.
> 
> Has anyone else noticed the VRM 2 issue? I'm sure all you guys messing with these would notice that...


Could you restate the issue with VRM 2 you are having? That I can verify for you this evening (depending on what it is).


----------



## LongRod

Quote:


> Originally Posted by *IcarusLSC*
> 
> Thanks LongRod, hopefully more will chime in and check it.
> 
> Has anyone else noticed the VRM 2 issue? I'm sure all you guys messing with these would notice that...


Yup, that bug I do have. VRM 2 stays at the same temp no matter what; on my friend's card it's VRM 1 that does that.

It's the weirdest thing. I assume the temps are fine, and it's just the software/sensor reading wrong.


----------



## IcarusLSC

VRM 2 is a non-working sensor. It reads about 48°C no matter what. I talked with W1zzard and he thinks the sensor isn't hooked up.
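If anyone wants to sanity-check their own card, a stuck sensor is easy to spot in a logged session: its column barely moves while everything else swings with load. Here's a rough sketch of the idea in Python (the sample readings are made up, and real GPU-Z log column names vary by version):

```python
# Flag sensors that barely change over a logged session. A reading
# pinned near one value (like VRM2 stuck around 48C) is most likely
# a disconnected or non-working sensor. Sample numbers are invented.

samples = {
    "GPU Temp":  [45, 62, 78, 81, 74, 55],
    "VRM Temp1": [50, 70, 88, 92, 85, 60],
    "VRM Temp2": [48, 48, 48, 49, 48, 48],  # suspiciously flat
}

def stuck_sensors(log, max_spread=2):
    """Return names of sensors whose readings span <= max_spread degrees."""
    return [name for name, vals in log.items()
            if max(vals) - min(vals) <= max_spread]

print(stuck_sensors(samples))  # only the flat VRM2 column gets flagged
```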


----------



## Agent Smith1984

Quote:


> Originally Posted by *IcarusLSC*
> 
> VRM 2 is a non working sensor. It reads about 48* no matter what. I talked with W1zzard and he thinks the sensor isn't hooked up.


Last GPU-Z checks I ran, I was getting normal readings on 1 and 2, but I don't know if I've checked it since going to 15.7. I'll take a look this evening.
That issue should not be driver related, but who knows with this stuff anymore.

I would say, though... have you updated GPU-Z itself to check them?

Have you tried HWiNFO to check there?


----------



## IcarusLSC

This is the thread about the VRMS I had with W1zzard:
http://www.techpowerup.com/forums/threads/voltage-and-vrm-temp-questions-0-8-4-with-r9-390.214432/


----------



## sonicpete

Well, my 390X didn't last very long. During a stock-speed gaming session yesterday my card started reaching 90-ish degrees, and then the screen started flashing like crazy. Two minutes later it popped and died. I'm now waiting for the replacement!


----------



## Jared2608

This is off topic, but I once had a Pentium Dual Core CPU that would say it was idling at 44°C no matter how cold it was. Even in the BIOS it would report 44, but another machine with the exact same CPU would say it was in the mid 30's. I changed thermal paste so many times before I googled it and found out that my temp sensors were probably stuck. The max temps worked, but that idling at 44 in the BIOS messed me around big time lol


----------



## Noirgheos

Can anyone post temps with the default fan curve, OC'ed or not? MSI one specifically.

Also, how is this performing at 1080p?

Games staying above 60FPS maxed out?


----------



## IcarusLSC

Quote:


> Originally Posted by *sonicpete*
> 
> Well my 390x didnt last very long. During a stock speed gaming session yesterday my card started reaching 90 ish degrees and then the screen started flashing like crazy. Two minutes later and it popped and died. Im now waiting for the replacment!


Ugh, that's no good sonicpete. Any idea what popped? Hopefully you get it replaced fine.


----------



## Noirgheos

Quote:


> Originally Posted by *IcarusLSC*
> 
> Ugh, that's no good sonicpete. Any idea what popped? Hopefully you get it replaced fine.


Wow that has me scared.


----------



## Jared2608

Quote:


> Originally Posted by *sonicpete*
> 
> Well my 390x didnt last very long. During a stock speed gaming session yesterday my card started reaching 90 ish degrees and then the screen started flashing like crazy. Two minutes later and it popped and died. Im now waiting for the replacment!


Man that really sucks! What model did you have? Hopefully it gets swapped out quickly.


----------



## Wroy

This is probably off-topic; if so, I am really sorry.
But since you are all owners of the card I have, I have one question.

Does anyone here have the MSI R9 390 (non-X) built into a mITX case? If so, which one do you have, and is it working for you?

Thanks in advance!


----------



## IcarusLSC

Jared2608, he has an MSI 390x according to his sig.

Wroy, my case is massive (Lian Li X2000F), so I can't help much, but the MSI 390 card is 11" long and 4.75" wide, plus you need room on the side for plugs...


----------



## Agent Smith1984

Well, I am pretty excited....

I have a set of 120mm Cooler Master JetFlo 120's going in my little case tonight for exhaust.

My current fans are 45CFM; these are 95CFM at full tilt, and reviews have shown they can ramp up a tad further, to around 100CFM.

That should give me enough airflow to run daily @ 1175/1750 (75mv/75mv) without the need to remove the door. I currently run at 1160/1700 (50mv/50mv).

I know it seems like a stretch for a few MHz on the core and 50MHz on the VRAM, but I game at 4K only now, and every little bit helps at this resolution, especially on the memory, I've noticed.

This should also give me enough temp headroom for Crossfire (which I have already managed successfully in this case with my stock fans, keeping temps under 90C).


----------



## maclem8223

Proud new owner here. I recently got a great deal on a BenQ XL2730Z and wanted to take advantage of FreeSync, so I just replaced my 970 with a Sapphire 390X. I had some BIOS issues to begin with and couldn't get the card running properly, but everything is since fixed and running silky smooth on this monitor at 1440p. Loving FreeSync! Oh, and the 390X's new home :thumb:


----------



## Jared2608

Nice. How have you been with the much talked about AMD drivers?


----------



## maclem8223

Quote:


> Originally Posted by *Jared2608*
> 
> Nice. How have you been with the much talked about AMD drivers?


I literally just got everything up and running last night. Ran a little Alien Isolation and GTA V; smooth as silk. Can't really comment on the drivers at this point as I haven't even had a full day with the card yet. No issues on install though, at least with the drivers.


----------



## Jared2608

Awesome. What card did you get? MSI?

EDIT: I'm stupid, didn't see you said Sapphire....


----------



## AverdanOriginal

Hi Guys,

I am new to this Forum. Here the Screenshot of my GPU-Z MSI R9 390.



As you might notice, I have not overclocked it yet, since I have some heat problems and my motherboard + CPU are still waiting for an upgrade.
Here are my system specs:

Motherboard: ASUS M4A87TD / USB 3.0
CPU: AMD Phenom II X6 1055T @ 3.7 GHz
*Graphics card: MSI R9 390 (normal gaming GPU clock 1040 MHz, memory 1500 MHz)*
RAM: 2x4 GB DDR3 Kingston ValueRAM
PSU: Be Quiet! Straight Power 10 / 600W CM
Hard drives: system drive Samsung SSD 850 EVO 256 GB, backup Western Digital 500 GB

There are multiple reasons why I have not tried to overclock yet, one of course being the bottleneck of my motherboard (only PCI-Express 2.0 x16) and CPU (a 5-year-old CPU, but 6 cores bumped from 2.8 to 3.7 GHz). *But the major reason is that I seem to experience extreme heat with this card*, and I was not sure if this is due to my case (dual-chamber cube), the old motherboard (bad connections, and one row lower than on an LGA 1150 mobo), or whether I just picked up a faulty graphics card.

My case is an Anidees AI-7 BW cube. Airflow shouldn't be too bad: I have 4x 120mm fans at 7V (@ 12V each moves around 67 m³/h), and I would say one of these 120mm fans blows directly at the card from the front. During benchmarking and heavy gaming I turn on the 12V switch. At the same time I have one 120mm behind the CPU blowing out 95 m³/h @ 12V, and one Noctua 140mm connected via a Y-cable to the CPU PWM fan header on my mobo, which when hot should move around 100 m³/h out. This case is also a dual chamber, where I have another Noctua 92mm blowing out roughly 60 m³/h.

Now to the temps... In the Furmark stress test I reached around 94°C (mind you, ambient was 35°C that day). I turned it off since I thought this might not be too good for the card, and I have now read a couple of times that Furmark is not the best way to test a card, as it goes to high extremes. So I thought, OK, let's see how it does in games.
First up, Watch Dogs. Everything on Ultra, 8x MSAA and so on... fluent gameplay, but 86°C (ambient also 36°C). I thought maybe it was the hot weather and it might go down on a cool day. At 29.6°C ambient, the max in-game temp went down to 78°C. OK, good... but that wouldn't leave much room for overclocking. Still, those were temps where I thought, with a TDP of 275W or so... no wonder.
Then I installed The Witcher 2 (not 3) and got temps of 84°C (ambient 25°C). I read about that V-Sync trick and it dropped to 77°C. OOOOOkay... so far so good. But today it was back up at 87°C at 25°C ambient...

I keep reading about people having temps around 70°C under load, or 75°C max... but mine keeps running around 85°C.
Here is what I think:
1. If I get a new mobo, which of course has fresher connections, the graphics card would move up one PCI slot (currently there's around 6cm of space between the floor and the card), so roughly 2cm more air. The front fan would then also blow fresh air exactly below the card.
2. Place one more 120mm fan below the card at only 800 rpm (extra thin, 15mm), which would only push around 33 m³/h into the card.

Do you guys think this would help lower temps by maybe 2-5°C?
Oh, I just tried Unigine Valley and the max temp was 81°C, fan @ 87% (ambient 26°C).

Am I worrying too much, or are these normal temps considering the setup?

Oh, and by the way... this card rocks. Forget the temps: I am able to play Watch Dogs and The Witcher 2 (from what I have heard, the most intensive games on the system) all on Ultra @ 1080p (Grid 2 downsampled from 1800p).

Can't wait for your opinions on my situation.

and one pic of the card:
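The airflow numbers in the post above can be sanity-checked with quick arithmetic. A rough sketch (figures taken from the post at 12 V; which fans count as intake versus exhaust is my assumption from the description):

```python
# Back-of-envelope airflow balance for the Anidees AI-7 setup described
# above. Intake/exhaust assignments are assumptions, not measurements.

intake_m3h = [67, 67, 67, 67]    # 4x 120mm fans @ 12 V
exhaust_m3h = [95, 100, 60]      # rear 120mm, Noctua 140mm, 92mm chamber fan

total_in = sum(intake_m3h)
total_out = sum(exhaust_m3h)
print(total_in, total_out, total_in - total_out)  # 268 255 13
```

Intake slightly exceeds exhaust, so the case as a whole shouldn't be starved for air; that would point at the extra fan feeding the card directly (option 2) as the more promising change.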


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi Guys,
> 
> I am new to this Forum. Here the Screenshot of my GPU-Z MSI R9 390.
> 
> ....
> 
> Am I worrying too much or are these normal Temps considering the setup?
> 
> ....


I'll get you added, welcome to the club...

In the meantime, PLEASE don't use Furmark on these graphics cards...

It's like taking someone who already has shaky hands and asking them to perform brain surgery...

It's hard to do anything with shaky hands, but it's ridiculous to perform brain surgery with shaky hands, right?

Point being, these cards are already hot monsters, and asking them to run Furmark makes matters worse!

Furmark is an unrealistic level of card load, created for the sole purpose of overheating your GPU.

Instead use Unigine Heaven 4.0, which is still a heavy load, but not so heavy that it gives you an unrealistic level of temps (especially on the VRMs).

The next thing to do is build a custom fan profile, so that you aren't listening to fans running full tilt all the time, but can ramp them up as needed.

I use a profile that keeps my fan at 25% up until 45C, and from that point it ramps up to 100% at 90C. This does pretty well for me, and keeps me in the low 80's during 4K gaming.

It does help to experiment with the fans, but sometimes the results are marginal.

Another big-impact option would be to change out the TIM.

Hope this helps some.
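For what it's worth, that kind of two-point profile is just a linear ramp between the idle point and the maximum. A rough sketch of the math (the 25%/45C and 100%/90C numbers are from the profile above; the function itself is illustrative, not anything Afterburner exposes programmatically):

```python
def fan_speed(temp_c, idle_pct=25, idle_max=45, full_pct=100, full_temp=90):
    """Flat at idle_pct below idle_max, linear ramp to full_pct at full_temp."""
    if temp_c <= idle_max:
        return idle_pct
    if temp_c >= full_temp:
        return full_pct
    # interpolate between (idle_max, idle_pct) and (full_temp, full_pct)
    frac = (temp_c - idle_max) / (full_temp - idle_max)
    return idle_pct + frac * (full_pct - idle_pct)

# the midpoint of the ramp lands halfway between the two speeds
for t in (40, 67.5, 90):
    print(t, fan_speed(t))  # 25 at idle, 62.5 mid-ramp, 100 at the top
```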


----------



## AverdanOriginal

Hi Agent,
Thx for the quick response. Yeah, I discarded Furmark pretty quickly after reading about it a couple of times. It was just strange reading through some reviews where they used Furmark and got 80°C.

I adjusted the fans: currently I have it at 0% until 50°C, then it jumps to 40% at 60°C and 80% at 80°C and 90°C. I also adjusted the temperature hysteresis (not sure if that's the same name in English) to 4°C to keep it lower.

Once I get my new mobo, CPU and low-profile fan, I will attempt my overclocks.

Cheers.
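For anyone wondering what that hysteresis setting buys you: the controller ignores small temperature wiggles around a curve breakpoint instead of bouncing the fan speed up and down. A toy sketch of the behaviour (the 4-degree band matches the setting above; the threshold curve is just an example, not how the MSI/Afterburner control loop is actually implemented):

```python
class HystFan:
    """Only re-evaluate the fan curve when temperature has drifted more
    than `hysteresis` degrees since the last speed change."""

    def __init__(self, curve, hysteresis=4):
        self.curve = curve            # callable: temp -> fan %
        self.hyst = hysteresis
        self.last_temp = None
        self.speed = 0

    def update(self, temp):
        if self.last_temp is None or abs(temp - self.last_temp) > self.hyst:
            self.last_temp = temp
            self.speed = self.curve(temp)
        return self.speed

# a simple step curve: 40% from 60C upward, 0% below
fan = HystFan(lambda t: 40 if t >= 60 else 0)
# temps wobbling around the 60C breakpoint don't flip the fan on and off
print([fan.update(t) for t in (58, 59, 61, 62, 66)])  # [0, 0, 0, 0, 40]
```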


----------



## Twau

Hi!

My last AMD card was a PowerColor Radeon 9800 PRO; now I am on the red team again!
I was planning to buy the Sapphire R9 390 Nitro since it was cheaper in my country, but it did not fit my case (max 29cm graphics card length), so I bought the MSI R9 390.

So far I am very happy with my MSI R9 390. My first impression from holding it was very high build quality, and it felt very sturdy with its backplate.
I will play with some OC later on. I just tried a 1100MHz core clock without any extra voltage in the Heaven benchmark, but I think I will run my card at stock for now and overclock down the road when I need it.

A few questions:

1. Are there any special settings you should change/think about in CCC after installing the drivers? (Currently I've only enabled VSR; still trying to get used to CCC over the nvidia control panel.)
2. The driver installation asked if I wanted to install several applications (ACP Application/AMD Gaming Evolved); do you use any of these at all?

Some MSI card related questions:
3. MSI Gaming App, do you use it? Is it just a simple overclock program, so I could just set those clock speeds in MSI Afterburner instead, which I am familiar with?
4. MSI Live Update 6, do you use it for your card? Will it update your graphics card BIOS if a new one comes out?


----------



## RiasHaise

My MSI R9 390's fans don't spin. I know they don't spin at idle to keep quiet, but I ran the Heaven benchmark and the GPU was at 98% load and they still wouldn't spin, and once the GPU hit 100 degrees C the PC turned off. I've made sure the fans haven't been disabled by looking in Afterburner, and they aren't. Please help me, I don't know what to do ;(


----------



## Jared2608

Quote:


> Originally Posted by *RiasHaise*
> 
> My Msi r9 390's fans dont spin. I know they dont spin at idle to keep quiet but I ran heaven benchmark and the gpu was at 98% load and they still wouldnt spin and then once the gpu hit 100 degrees c the pc turned off. Ive made sure the fans havent been disabled by looking in afterburner and they arent. Please help me I dont know what to do ;(


You sure you haven't accidentally made a custom fan profile that has them turned off?


----------



## RiasHaise

Yes, I'm sure; the fans wouldn't spin before I resorted to looking in Afterburner.


----------



## Jared2608

Any chance there's a loose connector on the card? Maybe the fan connector came unplugged.


----------



## RiasHaise

Fan connector? I've only plugged in a 8 and 4 pin power connector. What is this other connector you are on about?


----------



## Jared2608

I mean if you take a close look at the card, you might find a small plug that goes into a socket on the actual PCB of the card. I don't know how the MSI card works, but on the PowerColor version there is a small plug that connects the fans' power cable to the graphics card's PCB. It'll be a small plug, probably under the plastic shroud.


----------



## diggiddi

I believe he means on the card itself


----------



## RiasHaise

I will take a look tomorrow when there's more light, etc. Thanks so much for the help so far; I will report back with an update.


----------



## Jared2608

Quote:


> Originally Posted by *diggiddi*
> 
> I believe he means on the card itself


Yeah, I can't see anything else that would stop the fans from working. Either they're not getting power because a connector is loose, or they're all faulty, which would be horrible on a brand new card, but if they're faulty at least you can get an RMA.


----------



## RiasHaise

I've already had to send my new motherboard back, as it was faulty (no power was getting to it), but that's fixed with the new motherboard. My new PC has been nothing but problems so far...


----------



## Jared2608

Quote:


> Originally Posted by *RiasHaise*
> 
> Ive had to send my new motherboard back already as that was faulty no power was entering it but thats fixed with my new motherboard. My new pc has been nothing but problems so far..


That sucks man, whereabouts are you from? I have problems with electronics too... I usually celebrate if a new gadget I buy works properly the first time lol.


----------



## RiasHaise

I am from Wales and yes all my electronics are like that too.


----------



## BackwoodsNC

Did anyone upgrade to Windows 10 today? If so, did you lose Catalyst Control Center when you right-click? No idea how to get it back.


----------



## tbob22

Quote:


> Originally Posted by *Agent Smith1984*
> 
> ....
> Furmark is an unrealistic level of card load, created for the sole purpose of overheating your GPU.
> ....


Maybe it's just the MSI, but the Sapphire and PowerColor coolers seem to handle Furmark just fine. Mine maxes out at 67°C after 20 min, VRM1 77°C, VRM2 65°C. Granted, the fans jump up to 75% and are not exactly quiet, and I have very good airflow in my case. I'm not going to run it much like that, but it's nice to know what the max temp you'll ever see is going to be.
Quote:


> Originally Posted by *BackwoodsNC*
> 
> Did anyone upgrade to windows 10 today? If so did you lose Catalyst control center when you right mouse? No idea how to get it back.


Using the Win10 drivers?


----------



## Ju_nin_mai

Hey guys, I just got my R9 390 from MSI, and during 5 minutes of Kombustor I'm hitting around 85 degrees C.

It's been very hot recently; ambient is around 25-26 degrees C.

My OC is 1111 core and 1600 memory.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ju_nin_mai*
> 
> hey guys I just got my R9 390 from MSi and during KOMBUSTOR for 5 min Im hitting like 85 C degrees.
> 
> its been very hot recently ambiant is around 25-26 C degrees.
> 
> My OC is 1111 core and 1600 memory.


Kombustor is practically like running Furmark. 85C in Kombustor isn't bad at all.


----------



## Kalistoval

What are the best max settings for GTA V on my 390? My TV is 1080p 60Hz. I'm trying a few different settings; VSR looks good. I'd like to hear setting recommendations, please. Thanks.


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Kombuster is practically like running furmark. 85C in kombuster isn't bad at all


I guess it does not support crossfire... Only one of my cards is doing the work..


----------



## RiasHaise

OK, the fan connector is securely in place and the fans still will not run, even at temperatures above 80 degrees C. I'm not sure what to do now; should I return my graphics card and get a new one?


----------



## diggiddi

Quote:


> Originally Posted by *RiasHaise*
> 
> Ok fan connector is securely in place and still the fans will not run even at temperatures above 80 degrees c. Im not sure what to do now, should I return my graphics card and get a new one?


Post haste


----------



## Agent Smith1984

Quote:


> Originally Posted by *RiasHaise*
> 
> Ok fan connector is securely in place and still the fans will not run even at temperatures above 80 degrees c. Im not sure what to do now, should I return my graphics card and get a new one?


Try launching CCC and use the performance tab to set a constant fan speed.
Make sure afterburner, or any other third party app pertaining to the GPU is closed.

If the fans still don't respond, then you likely need to RMA the card.

Edit: don't forget to post your proof if you want to be added to the club roster.
Thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ju_nin_mai*
> 
> hey guys I just got my R9 390 from MSi and during KOMBUSTOR for 5 min Im hitting like 85 C degrees.
> 
> its been very hot recently ambiant is around 25-26 C degrees.
> 
> My OC is 1111 core and 1600 memory.


These things certainly run hot.... get a screen shot or picture with name and I will get you on the roster!


----------



## Jared2608

Quote:


> Originally Posted by *RiasHaise*
> 
> Ok fan connector is securely in place and still the fans will not run even at temperatures above 80 degrees c. Im not sure what to do now, should I return my graphics card and get a new one?


At this point I'd take it back and ask for a new one. One cool thing over here is that if we get a product that's defective or we're not happy with it, we can exchange it for a new one (not a repair) within the first seven days.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Ju_nin_mai*
> 
> hey guys I just got my R9 390 from MSi and during KOMBUSTOR for 5 min Im hitting like 85 C degrees.
> 
> its been very hot recently ambiant is around 25-26 C degrees.
> 
> My OC is 1111 core and 1600 memory.


Hi Ju_nin_mai,

What settings did you use for Kombustor?
I tried 1080p, 8x MSAA, 3D test = Compute Shader/Plasma (1M particles),
and I only got around 65°C after 5 minutes. That can't be the same test you did, or?

Sorry for asking, but I got 87°C again today while playing DA: Inquisition with my adjusted fan curve (roughly 91% fan speed at 87°C).


----------



## Sesameopen

Hey guys, quick question. I just got my XFX DD Core Edition, and looking it up on the EK website, they say it uses the R9 290 reference waterblock. Does this mean I can put my previous R9 290 waterblock onto the R9 390?

Thanks,
Sesameopen


----------



## CerealKillah

So, here I am in another video card thread. I took my EVGA GTX 970 SSC back to Best Buy last night and picked up the XFX 390 for $319.00 (with $10 BB rewards cert).

Skimming through this thread it seems like maybe the XFX is a good 390?

I am getting ready to tear down my old system (watercooled 2500K at 4.6GHz and HD 7950 at stock speeds under water) and install my new parts (4690K, MSI Gaming 5 mobo, 16GB of DDR3-2400 memory, and the R9 390) hopefully tonight or tomorrow.

I am concerned that my trusty ol' Rosewill HIVE-650 won't have enough horsepower for both an overclocked CPU AND the R9 390.

Will I be OK with my HIVE-650?


----------



## BlaXey

I think it will be okay. I have an R9 270X and an FX-8320 overclocked to 4.4GHz on a cheap 750W PSU, and it handles my PC without problems; if your PSU is a good brand, you shouldn't have any issues. I think the XFX R9 390 is a good purchase: it runs about 6°C hotter than the MSI, but it's cheaper and the build quality is nice.


----------



## BlaXey

What should I buy? I'm deciding between the MSI R9 390 at 391€ and the XFX at 345€. Sapphire is not an option because it won't fit in my case.


----------



## Flash Gordon

Hey guys picked up two used 390's from Micro Center this morning









Sadly I wasn't able to test the cards, seeing that I ran into some CPU problems while switching from my 970 to my 390.

But here are some pics... I couldn't pass them up at this price, and surprisingly they came in their retail boxes with all the contents. I couldn't be happier with my purchase; I just wish I could have tested them before I had to go to work.







Can someone help me with my CPU annoyance... I don't know what went wrong between taking my 970 out and popping in my 390... here's a little video I made.

http://s1071.photobucket.com/user/HashTester/media/20150730_133347_zpshnzga9oc.mp4.html


----------



## Jared2608

Quote:


> Originally Posted by *Flash Gordon*
> 
> Hey guys picked up two used 390's from Micro Center this morning
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly I wasn't able to test the cards seeing that I ran into some CPU problems while switching from my 970 to my 390.
> 
> But here are some pics...I couldn't pass them up at this price and surprisingly they came in their retail box with all its contents I couldn't be more happier with my purchase I wish I could have tested them before I had to go to work.
> 
> 
> 
> 
> 
> 
> 
> Can someone help me with my CPU annoyance...idk what went wrong from when I took my 970 out and popped in my 390...here's a little video I made.
> 
> http://s1071.photobucket.com/user/HashTester/media/20150730_133347_zpshnzga9oc.mp4.html


Nice find on the cards, these things are begging to be used in Crossfire, with the 8GB of RAM.

As for the CPU-Z thing, have you tried resetting your BIOS to defaults? You'll lose your OC, obviously, but it'll put everything back to how it should be. Boot into Windows on the reset BIOS and see if CPU-Z still does that. I'm not sure if it's serious or not; I probably wouldn't worry if it were my rig and it was still stable, but I don't want to tell you not to worry and then... your CPU dies lol.


----------



## 2M1VFO

Quote:


> Originally Posted by *Jared2608*
> 
> these things are begging to be used in Crossfire, with the 8GB of RAM.


Exactly.
One of the reasons why I went for the 390 and sent my GTX 970 back. My board only supports Crossfire too, but the second slot is 8x. Has anyone done any real-world testing on PCI Express lane speed differences? Do we have a definitive conclusion on the performance hit when the second slot runs below 16x?

I'm only able to afford 1x 390 for a while though. Will have to starve for a few weeks, all for a good cause.


----------



## 2M1VFO

Quote:


> Originally Posted by *CerealKillah*
> 
> So, here I am in another video card thread. I took my EVGA GTX 970 SSC back to Best Buy last night and picked up the XFX 390 for $319.00 (with $10 BB rewards cert).
> 
> Skimming through this thread it seems like maybe the XFX is a good 390?
> 
> I am getting ready to tear down my old system (watercooled 2500K at 4.6 and HD 7950 at stock speeds under water) and install my new parts (4690K, MSI Gaming 5 Mobo, 16 Gigs of DDR 3 2400 memory and R9 390) hopefully tonight or tomorrow.
> 
> I am concerned that my trusty ol' Rosewill HIVE-650 won't be enough horsepower to power both an overclocked CPU AND the R9 390.
> 
> Will I be OK with my HIVE-650?


Like you, I did the same: sent my MSI GTX 970 back for an exchange (it was a bit of a squeeze though; they weren't happy) for the MSI R9 390.

To answer your PSU question, I wouldn't focus on watts, as the label wattage is often not a true reflection of a PSU's capacity. I concentrate on the current instead. My PSU was fairly cheap, an EVGA supply that is only 600 watts but has a current rating of 49 amps on the 12V rail. It copes just fine. I haven't had it very long, just a few weeks, but it stays nice and cool.

This fairly cheap supply is running an AMD FX-8320 (125W) overclocked to 4.3GHz and my MSI R9 390, slightly overclocked to 1100MHz core / 1600MHz memory. Only a single hard drive and a few fans are being driven from the supply too; no SSD.


----------



## Flash Gordon

Quote:


> Originally Posted by *Jared2608*
> 
> Nice find on the cards, these things are begging to be used in Crossfire, with the 8GB of RAM.
> 
> As for the CPU - Z thing, have you tried resetting your BIOS to defaults? You'll lose your OC obviously but it'll put everything back to how it should be. Boot into Windows on the reset BIOS and see if CPU-Z still does that. I'm not sure if it's serious or not, I probably wouldn't worry personally if it was my rig and it was still stable, but I don't want to tell you not to worry and then...your CPU dies lol.


I'm afraid my CPU will die lol. It was hitting 99°C without me doing anything, just staring at CPU-Z.

I cleared the CMOS; that's about all I had time to do, since I was rushing to work.

If I have to get a new motherboard I will, but hopefully it won't come to that.


----------



## Jared2608

Oh snap, yeah that's crazy hot. Are you sure you didn't bump the CPU cooler or something while changing cards? Or maybe dislodge a cable, so the cooler isn't running?


----------



## Flash Gordon

I'm disconnecting everything when I get home to make sure it's all properly connected.

Thanks for your input; that might be it. That's about the only explanation. Maybe the voltages are jumping like that as a safety feature, the CPU trying to keep itself cool or whatnot.


----------



## Jared2608

Yeah, it may be thermal throttling; at those temps it's definitely going to come into effect. Hope you get it sorted so you can enjoy those two cards


----------



## CerealKillah

Quote:


> Originally Posted by *CerealKillah*
> 
> So, here I am in another video card thread. I took my EVGA GTX 970 SSC back to Best Buy last night and picked up the XFX 390 for $319.00 (with $10 BB rewards cert).
> 
> Skimming through this thread it seems like maybe the XFX is a good 390?
> 
> I am getting ready to tear down my old system (watercooled 2500K at 4.6 and HD 7950 at stock speeds under water) and install my new parts (4690K, MSI Gaming 5 Mobo, 16 Gigs of DDR 3 2400 memory and R9 390) hopefully tonight or tomorrow.
> 
> I am concerned that my trusty ol' Rosewill HIVE-650 won't be enough horsepower to power both an overclocked CPU AND the R9 390.
> 
> Will I be OK with my HIVE-650?


Quote:


> Originally Posted by *2M1VFO*
> 
> Like you, I also done the same - sent my MSI GTX 970 back for exchange (it was bit of a squeeze though, they weren't happy) for the MSI R9 390.
> 
> To answer your PSU question, I wouldn't bother with focusing on watts when it comes to PSU's, as they are often not a true reflection of the capacity of the PSU. I try to just concentrate on the current. My PSU was fairly cheap - an EVGA supply, which is only 600 watt, but has a current rating of 49 Amps on the 12v rail. It copes just fine, although I haven't had it very long, just a few weeks, but it stays nice and cool.
> 
> This fairly cheap supply is running AMD FX-8320 (125w) overclocked to 4.3GHz, my MSI R9 390 - slightly overclocked to 1100 MHz / 1600 MHz core and mem clock. Only a single hard drive and a few fans are being driven from the supply too - no SSD.


According to Rosewill:

[email protected]

Seems good if 49A is fine...
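2M1VFO's rule of thumb above (judge a PSU by its 12V current rating rather than the label wattage) is just P = I × V. A quick sketch of the arithmetic, using the 49A figure from his EVGA example (the Rosewill's own 12V amperage got mangled above, so it isn't shown here):

```python
def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Continuous power available on a single rail: P = I * V."""
    return amps * volts

# 2M1VFO's 600W EVGA unit: 49A on the 12V rail
print(rail_watts(49))  # 588.0 -- nearly the entire label wattage is on 12V
```

Since the CPU and GPU draw almost exclusively from 12V, a unit that puts most of its label wattage on that rail is what matters for an overclocked CPU plus an R9 390.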


----------



## DannyDK

I have a question for all you 390/390X owners in here: would it be a good move to get a 390X instead of my 980?
I game on a 1080p flatscreen, but I do use DSR on the 980, just not at 4K, since that's way too much for the card to handle if I want a steady 60FPS. I think it would be better with a 390X using VSR, though, since it has 8GB of RAM. Am I wrong to think the 390X would be better?


----------



## Ha-Nocri

980 is faster


----------



## Ju_nin_mai

Quote:


> Originally Posted by *Agent Smith1984*
> 
> These things certainly run hot.... get a screen shot or picture with name and I will get you on the roster!


----------



## Flash Gordon

This is the first group I am a part of


----------



## SystemTech

Finally installed GPU-z and took a screenshot








Acquired her at the beginning of July and she's still running stock.
Not for long though


----------



## gatygun

Quote:


> Originally Posted by *DannyDK*
> 
> I have a question for all you 390/390x owners in here, would it be a good thing to get a 390x insted of my 980?
> I game on a 1080p flatscreen but i do use DSR on the 980, just not at 4k though since thats way to much for the card to handle if i want 60 steady fps, but i think it would be better with a 390x using VSR since it has 8gb of ram. Am i wrong to think that the 390x would be better?


Performance is king; keep your 980, it's faster in every way. The 8GB only makes sense when you run Crossfire and push toward 4K resolutions with MSAA, which will basically kill performance even in Crossfire.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gatygun*
> 
> Performance is king, keep your 980 it's faster in every way. The 8gb only makes sense when you push crossfire forwards and go towards 4k resolutions with msaa which basically even on crossfire will kill it's performance.


The 390X does beat the 980 in several benchmarks now, but the problem is, once you start comparing the cards' overclocking potential, the 980 creeps past it by a good bit.

The 980 is a great card at 1080p; it's not until you go above 1440p that the pendulum swings toward the 390X and its much larger frame buffer.

There would definitely be no reason to switch cards if you already have a 980, though.


----------



## mstrmind5

Could any MSI R9 390 owners give some stats for me.

I'm looking for GPU temp, VRM temps, and the corresponding fan rpm and fan % if it's not too much trouble (hopefully a couple of you might be so kind to get a more accurate reading).

I've just got a 750 EVGA G2, I want a quiet GPU and deciding between a MSI GTX 970 and the MSI R9 390.

Thanks.


----------



## Agent Smith1984

Guys, I have some really useful information to post. I want to show the results of improving airflow by changing case fans, and how drastically it can improve temps.......

Anyone experiencing high temps needs to take a look at this.....

These results were all obtained under the same testing conditions:

2 runs of Heaven 4.0 with an ambient temp of 23°C

This first screen shows the card at 1160/1600 with 50mV/0mV offsets.
These aren't bad temps at all, but nothing stellar either....



This shot was taken @ 1180/1700 with 75mV/50mV.
Now we see temps start to climb as we add voltage and clock speed..... not much thermal headroom left for overclocking... Can this be improved? Hmmmm



This shot was taken with the same settings, but now running two Cooler Master JetFlo 120s for my rear and top exhaust.
HOLY COW!!! Look at the drop in temps on the core and on VRM1.... not to mention, I was under the impression VRM2 was stuck at 53°C, but now it consistently reports 48°C.....



This is a HUGE improvement: a 12°C drop on both the core and VRM1!!!









Now I think I'm in good shape to start pushing towards 1200MHz+ on this baby.... testing begins now!! Wish me luck!!


----------



## 2M1VFO

Quote:


> Originally Posted by *mstrmind5*
> 
> Could any MSI R9 390 owners give some stats for me.
> 
> I'm looking for GPU temp, VRM temps, and the corresponding fan rpm and fan % if it's not too much trouble (hopefully a couple of you might be so kind to get a more accurate reading).
> 
> I've just got a 750 EVGA G2, I want a quiet GPU and deciding between a MSI GTX 970 and the MSI R9 390.
> 
> Thanks.


Hi mate. I have used both of these very cards in my system (MSI GTX 970 & MSI R9 390), so maybe I can answer your question on which to buy. In short: the R9 390. I bought the MSI GTX 970 even though the 390 was in stock. When the benchmarks started to give a true picture of the new 390 (not just "a rebrand of the R9 290" as many were claiming), I started regretting the GTX 970 purchase. I returned the GTX 970, got the MSI R9 390, and I am so, so glad I did. That is not to say the GTX 970 isn't a really good card, because it is. But given the choice, I would always go 390. I liked the 970, but this 390 has just blown me away.

Another advantage of the 390 is obviously the 8GB of VRAM, which I suspect future games will utilise, especially when I see Far Cry 4 using just over 4GB in places at a mere 1080p if you enable 8x AA. Although 8GB is maybe overkill for the performance of these cards, Crossfire would take care of that well.


----------



## Gumbi

Very impressive, especially given the very low fan speed on the actual GPU cooler now, just 35%!

Did you replace existing case fans, or just add them? How loud are the fans?

I have a Nanoxia DS1 case. It's great for silence etc., though I wouldn't mind a slightly better exhaust setup. I'm mostly using the stock Nanoxia fans, which are nice and quiet (a nice voltage controller gives me two sets of three fans, 5V to 12V I believe), but perhaps slightly lacking in airflow.

My Vapor-X's core gets a bit hot under load. It might have to do with the fact that I repasted it recently; it might not be mounted correctly. The VRM/mem cooling is superb, but the core gets a tad hotter.

It might have more to do with airflow than anything else though... 48°C on VRM2 is insane; mine gets to the mid 50s and I thought I was rockin'







VRM stays mid 50s too though.

The core cooling is great too; at the moment I get to 72°C in Crysis on a stock voltage offset of 25mV at 1100/1450. I bet better cooling would help a lot with that.

Apologies if I'm being vague/inconsistent with the numbers; I'm at work so I can't confirm









I'll do more extensive benching later. I'm very impressed by the gains from excellent airflow, and I'm convinced my setup can be improved drastically.

I'm also looking for any excuse to upgrade random stuff, because I have the money lying around and don't really spend it on anything else


----------



## mstrmind5

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys, I have some really nice information to post. I want to show the results of airflow improvement through changing fans, and how drastically in can improve temps.......
> 
> Anyone experiencing high temps needs to take a look at this.....
> 
> These results were all obtained under the same testings conditions.
> 
> 2 runs of Heaven 4.0 with ambient temp of 23c
> 
> This first screen shows the card at 1160/1600 with 50mv/0mv offsets
> These aren't bad temps at all, but nothing stellar either....
> 
> 
> 
> This shot was taken @ 1180/1700 with 75mv/50mv
> Now we see temps start to climb as we add some voltage and clock speeds..... not much room left thermally for overclocking... Can this be improved? hmmmmmm
> 
> 
> 
> This shot was taken, with the same settings, but now running (2) Coooler Master Jetflo 120's for my rear and top exhaust.
> HOLY COW!!! Look at the drop in temps on the core and on VRM1.... not mention, I was under the impression the VRM2 was stuck at 53C, however now, it reports 48C constantly.....
> 
> 
> 
> This is a HUGE improvement on temps. 12C temp drop on the core and the VRM1!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I think I'm in good shape to start pushing towards 1200MHz+ on this baby.... testing begins now!! Wish me luck!!


On your screenshots, if I'm seeing them correctly, you haven't selected max fan RPM, on the first two pics at least.


----------



## mstrmind5

Quote:


> Originally Posted by *2M1VFO*
> 
> Hi mate. I have used these very cards in my system (MSI GTX 970 & MSI R9 390) so I can maybe answer your question on which to buy. In short - the R9 390. I bought the MSI GTX 970, even though the 390 was in stock. When the benchmarks started to give a true picture of the new 390, not just "a rebrand of an R9 290 as many were claiming), I started regretting the GTX 970 purchase. I returned the GTX 970 and got the MSI R9 390 and I am so, so glad I did. That is not to say the GTX 970 isn't a really good card, because it is. But given the choice, I would always go 390. I liked the 970, but this 390 has just blown me away.
> 
> Another advantage to the 390 is obviously the 8GB VRAM, which I suspect future games will utilise, especially when I see Far Cry 4 using just over 4Gb in places at a mere 1080p resolution, if you enable 8x AA. Although the 8Gb is maybe overkill for the performance of these cards, crossfire would take care of that well.


Any stats on the GPU and VRM temps, and the fan RPM and fan % on the 390?


----------



## Ha-Nocri

That is why I like my case. It has a 200mm intake plus 200mm and 120mm fans for exhaust. Hardly any newer case I buy will match the airflow...


----------



## gatygun

Quote:


> Originally Posted by *2M1VFO*
> 
> Hi mate. I have used these very cards in my system (MSI GTX 970 & MSI R9 390) so I can maybe answer your question on which to buy. In short - the R9 390. I bought the MSI GTX 970, even though the 390 was in stock. When the benchmarks started to give a true picture of the new 390, not just "a rebrand of an R9 290 as many were claiming), I started regretting the GTX 970 purchase. I returned the GTX 970 and got the MSI R9 390 and I am so, so glad I did. That is not to say the GTX 970 isn't a really good card, because it is. But given the choice, I would always go 390. I liked the 970, but this 390 has just blown me away.
> 
> Another advantage to the 390 is obviously the 8GB VRAM, which I suspect future games will utilise, especially when I see Far Cry 4 using just over 4Gb in places at a mere 1080p resolution, if you enable 8x AA. Although the 8Gb is maybe overkill for the performance of these cards, crossfire would take care of that well.


It's typical NVIDIA, skimping on VRAM in order to create demand for new GPUs a year later.

3.5GB on the 970 is a joke.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mstrmind5*
> 
> On your screenshots, if I see them correclty, you haven't selected max fan rpm on the first two pics at least.


Forgot those, but the fan speed always matches the GPU temp with my fan profile, so 71°C gets 71% fan speed. Meaning I could probably get into the 60s at 100%...

My stock fans were 45CFM at 26dB; these fans are 95CFM at 34dB. VERY WELL WORTH THE NOISE
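For the curious, the profile described above (fan % tracking GPU temp 1:1) is trivial to express; a minimal sketch, where the clamping to 0-100% is my assumption rather than anything stated in the post:

```python
def fan_percent(gpu_temp_c: float) -> int:
    """1:1 fan profile: fan % tracks GPU temperature, clamped to 0-100."""
    return max(0, min(100, round(gpu_temp_c)))

print(fan_percent(71))   # 71 -- matches "71C gets 71% fan speed"
print(fan_percent(105))  # 100 -- clamped at full speed
```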


----------



## Gumbi

Very nice! I'm going to evaluate my cooling later. I have a very nice voltage controller with my case (2 controllers, 3 fans per controller), which means I can easily rein in the noise if it's too loud.

No problem with loudish noise while gaming, but I do appreciate low noise at idle and when browsing etc.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Forgot those, but the fan speed always matches the gpu temp with my fan profile. So 71c has 71% fan speed. Meaning i could probably get into the 60's with 100%...
> 
> My stock fans were 45cfm, 26db, these fans are 95cfm @ 34db, VERY WELL WORTH THE NOISE


Do you use any fan controller? No need for the fans to run full speed all the time


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Do you use any fan controller? No need for the fans to run full speed all the time


No fan controller, because I don't have any drive bays to mount one in, but I am looking at some rear-mounted external ones.


----------



## Agent Smith1984

So the highest I could get as a daily stable clock was 1185/1775 @ 90mV/75mV.

Core temp never breaks 72°C, and VRM1 never breaks 73°C.... I couldn't get 1200MHz without some slight artifacts after 4 runs in Heaven; 1190 did the same, but 1185 is rock solid.

60FPS average in Crysis 3 now @ 4K (true 4096x2160), high system settings, and very high textures! The minimum never drops below 40!

Not bad for a single card...


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So the highest i could get as a daily stable clock, was 1185/1775 @ 90mv/75mv
> 
> Core temp never breaks 72C, and VRM1 never breaks 73C.... I couldn't get 1200mhz without some slight artifacts after 4 runs in heaven, 1190 did the same, but 1185 is rock solid.
> 
> 60Fps average in Crysis 3 now @ 4k (true 4096x2160), high system settings, and very high textures! Minimum never goes below 40!
> 
> Not bad for a single card...


Memory overclock is sweet as







I imagine it helps more at 4K?

Pity you have to work for the last few MHz on the core, but the temps are excellent. I wonder, would you be in the high 50s across the board at the lowest voltage possible for 1100MHz?


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So the highest i could get as a daily stable clock, was 1185/1775 @ 90mv/75mv
> 
> Core temp never breaks 72C, and VRM1 never breaks 73C.... I couldn't get 1200mhz without some slight artifacts after 4 runs in heaven, 1190 did the same, but 1185 is rock solid.
> 
> 60Fps average in Crysis 3 now @ 4k (true 4096x2160), high system settings, and very high textures! Minimum never goes below 40!
> 
> Not bad for a single card...


Nice temps, especially with that overclock, although those fans would be a tad too loud for me. I went the other way, like I said in my original post, and bought one very silent, thin 120mm fan (only 15mm thick) and placed it below the MSI R9 390, blowing a little fresh air up from the floor, spinning at approx. 900 RPM (so 100 RPM faster than it should







)

Anyway, that gave me an extra 4-5°C drop across all the different games and benchmarks. So without overclocking I now have the same temps you did in your first pic (around 77°C), but ambient temp is currently 26°C


Now I feel confident to try a little overclocking









Oh and by the way --> 60FPS @ true 2160p... AWESOME


----------



## Sesameopen

Hey guys quick question. I just got my XFX DD Core Edition and looking it up on the EK website they say that it uses the r9 290 reference waterblock. Does this mean I can put my previous r9 290 waterblock onto the r9 390?


----------



## pengs

Quote:


> Originally Posted by *Ju_nin_mai*
> 
> hey guys I just got my R9 390 from MSi and during KOMBUSTOR for 5 min Im hitting like 85 C degrees.
> 
> its been very hot recently ambiant is around 25-26 C degrees.
> 
> My OC is 1111 core and 1600 memory.


I assume that HWMonitor shot is from gaming? If so, that's a 21°C difference.

That's FurMark for you, in this case Kombustor. It really only exists to show you a beyond-worst-case scenario; it's a bit unrealistic and has been dubbed a "power virus" for those reasons. It's also a really boring game







With that said, 85°C running FurMark isn't bad, but do your 390 a favor and don't run it


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sesameopen*
> 
> Hey guys quick question. I just got my XFX DD Core Edition and looking it up on the EK website they say that it uses the r9 290 reference waterblock. Does this mean I can put my previous r9 290 waterblock onto the r9 390?


Yes, yes it does


----------



## ljh08

I have the Gigabyte WindForce 390X in hand now. Finally got it installed last night. Tried GTA 5 and Witcher 3 and both run well, but in Witcher 3 it sounds like a jet engine is inside my case.... and heat is rolling out. Obviously running at high/maxed settings with or without HairWorks is still a lot of work, but I'm just wondering if anyone else is getting quiet performance? It honestly makes me think about returning it for a Fury X.


----------



## ljh08

Quote:


> Originally Posted by *ljh08*
> 
> I have the gigabyte wind force 390x in hand now. Finally got it installed last night. tried GTA 5 and witched 3 and both run well. But on witched 3 it sounds like a jet engine is inside my case.... and heat is rolling out. Obviously running at high / maxed settings with or without hair works is still alot of work, but just wondering if anyone else is getting extremely quiet performance? It honestly makes me think about returning it for a Fury X.


*Note* I'm coming from a 7850 and it seems significantly louder.


----------



## Sesameopen

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yes, yes it does


Oh my god. And R9 200s run Crossfire in tandem with R9 390s, right? 290 + 290 + 390 trifire, LOL


----------



## Flash Gordon

Quote:


> Originally Posted by *ljh08*
> 
> I have the gigabyte wind force 390x in hand now. Finally got it installed last night. tried GTA 5 and witched 3 and both run well. But on witched 3 it sounds like a jet engine is inside my case.... and heat is rolling out. Obviously running at high / maxed settings with or without hair works is still alot of work, but just wondering if anyone else is getting extremely quiet performance? It honestly makes me think about returning it for a Fury X.


So if someone else is getting quiet performance with your card, you'll return yours? Lol, idk if that makes much sense, but...

I ran Witcher 3 last night maxed out, everything except HairWorks, and the resolution was 3K I believe... for some reason my CCC's max VSR is something like 3600x1800, so not 4K.

My top card gets super hot, but my bottom card never passes 73°C.

Keep in mind I'm at stock settings and my top GPU's fans are spinning to keep it cool.

The top GPU's temps get close to 94 or 95°C in some passes.

I need to find a better cooling solution; the backplate gets super hot, so the bottom card isn't doing the top card any favors.

A single card is relatively simple to keep cool if you have the right case airflow and the right fan curve.

If noise bothers you that much, buy some headphones or move your PC a couple of feet away lol. My PC literally sits next to me and I'm just rocking out with my headphones.


----------



## ljh08

Quote:


> Originally Posted by *Flash Gordon*
> 
> So if someone else is getting quiet performance with your card you will return yours? Lol idk if that makes much sense but...
> 
> I ran Witcher 3 last night maxed out everything except hair works and the resolution was 3k I believe...for some reason my CCC's Maxx VSR is something like 3600x1800 so not 4k.
> 
> My top card gets super hot and my bottom hard never passes 73 C.
> 
> Keep in mind I'm at stock settings and my top GPU's fans are spinning to keep it cool.
> 
> Top GPU's Temps get close to 94 or 95 in some passes.
> 
> I need to find a better cooling solution it's just the backplate gets super hot so the bottom card isn't doing any favors for the top card.
> 
> A single card is relatively simple to keep cool if you have the right situation when it comes to air cooling your case and enabling the right fan curve.
> 
> If noise bothers you that much buy some headphones or move your PC away from you a couple of feet lol my PC literally sits next to me and I'm just rocking out with my headphones.


Sadly, headphones and I don't get along. I have them but don't use them. I just wondered if everyone else's card was running as hot and loud as mine seems to be. Noctua and Cooler Master case fans in my Blackhawk case put out pretty good airflow, but the graphics card is louder than everything else combined. And I was really just wondering about the different 390X models. Obviously the Gigabyte one can stay cool, but doing so requires running the fans extensively.


----------



## Flash Gordon

Quote:


> Originally Posted by *ljh08*
> 
> Sadly headphones and me don't get along. Have them but don't use. And I just wondered if everyone else's card was running as hot and loud as mine seems to be. Noctua and cooler master case fans on my blackhawk case puts pretty good airflow but the graphic card is louder than everything else combined. And I really just wondered about different 390x models. Obviously the gigabyte one can stay cool but doing so requires using the fans extensively.


To answer your question directly, my GPUs are the loudest thing in my rig atm.

I will be changing my front fans, since they do not push cool air directly at the GPUs.

I game in an extremely cool room to boot.

I literally keep my AC unit on every time I'm spending extended time in my room; that helps slightly.


----------



## mstrmind5

Quote:


> Originally Posted by *Flash Gordon*
> 
> To answer your question directly my GPU's are the loudest thing in my rig atm.
> 
> I will be changing my front facing fans since they do not push cool air directly to the GPU's.
> 
> I game in an extremely cool room to boot.
> 
> I literally stay with my AC unit on everytime I'm spending extended time in my room that helps me slightly.


Do you have the MSI R9 390?

If so, could you post some stats on your GPU, namely fan RPM and the corresponding fan speed % (usually GPU-Z)? So I can get an idea of how quiet it might be compared to the MSI GTX 970.


----------



## Flash Gordon

Quote:


> Originally Posted by *mstrmind5*
> 
> Do you have the msi r9 390?
> 
> If so, could you post some stats on your gpu - namely fan rpm and the corresponding fan speed % (usually gpuz). So I can get an idea how quiet it might be compared to the msi gtx 970.


I actually have two of them and ran them through their paces separately before configuring my Crossfire.

Unfortunately those benchmarks are on my computer and I am at work, but I will be more than happy to post them here for you when I get the chance.

I can tell you that at stock one of my cards never passed 75°C and the other never passed 73°C. Both stock, and I have my fan curve set up so that it hits 100% at 90°C; I step back 10% for every 10°C below that.

It may not be the most efficient noise- or temp-wise, but it's a good starting point for me.

I will need to adjust those fan curves since my top card is always going to be hotter than the bottom one.
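That curve (100% at 90°C, backing off 10 points per 10°C band below) can be sketched in a few lines; one plausible reading of the description, where the treatment of in-between temps and the floor at 0% are my assumptions:

```python
import math

def fan_percent(gpu_temp_c: float) -> int:
    """Hit 100% at 90C; step back 10 points for each full or partial
    10C band below 90C, never going under 0%."""
    if gpu_temp_c >= 90:
        return 100
    bands_below = math.ceil((90 - gpu_temp_c) / 10)
    return max(0, 100 - 10 * bands_below)

print(fan_percent(90))  # 100 -- full speed at the 90C ceiling
print(fan_percent(75))  # 80  -- two bands below 90C
```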


----------



## mstrmind5

Quote:


> Originally Posted by *Flash Gordon*
> 
> I actually have two of them and ran them through their paces separately before configuring my Crossfire.
> 
> Unfortunately I have those benchmarks in my computer and I am at work but I will be more than happy to post them here for you when I get the chance.
> 
> I can tell you that one of my cards at stock never passed 75 C and the other never passed 73 C. Both stock and I have my fan curve set up that it hits 100% at 90 C. I step back 10% for every 10 C.
> 
> It may not be the most efficient noise or temp wise but it's a good starting point for me.
> 
> I will need to adjust those fan curves since my top card is going to always be hotter than the bottom.


Thanks.


----------



## Agent Smith1984

So, just to keep things nice and cool, and in the hope of preserving the card....

I settled on 1175/1700 @ 75mV/50mV as my daily clocks.

Played a few rounds of BF4 @ 4K ultra settings (with AA off) this morning.

Neither the core nor the VRM broke 70°C (70% GPU fan). Things are kinda loud under load (due to the case fans), but I play my games cranked up anyway.

These fans have been a fabulous addition to this rig, which for all intents and purposes is practically the size of a micro ATX...

60FPS+ in Battlefield @ 4K on ultra is still a sight to behold, even in 2015!

Looking forward to Crossfire on these cards, and temps are no concern now.


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, just to keep things nice and cool, and in the hopes of preserving the card....
> 
> I settled on 1175/1700 @ 75mv/50mv as my daily clocks.
> 
> Played a few rounds of BF4 @ 4k ultra settings (with AA off) this morning.
> 
> The core nor the vrm broke 70c (70% gpu fan). Things are kinda loud under load (due to the case fans), but i play my games cranked up anyways.
> 
> These fans have been a fabulous addition to this rig, which for all intents and purposes is practically the size of a micro atx...
> 
> 60Fps+ in battlefield @4k on ultra is still a sight to behold, even in 2015!
> 
> Looking forward to crossfire on these cards, and temps are no concern now.


Would just like to ask, how much of an OC is that? And based on what you got, do you think the MSI R9 390 OCed can reach MSI R9 390X levels? The 390X is $140 more here, which is why I'm asking.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Would just like to ask, how much of an OC is that? And based on what you got, do you think the MSI R9 390 OCed can reach MSI R9 390X levels? The 390Z is $140 more here, which is why I'm asking.


Stock (oc mode) clock on msi 390 is 1060/1525

I am basically running a 10% oc on both the core and memory. It has netted me a 6-8% performance gain depending on title.

My card will exceed the performance of a stock 390X at these speeds, but of course, the 390X has just as much headroom, and will generally be around 3% faster at the same clocks. Not quite worth $100+ in my opinion, but that's just me....
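Sanity-checking the "10% OC" claim against the clocks quoted in this exchange (stock OC-mode 1060/1525 vs the daily 1175/1700):

```python
stock_core, stock_mem = 1060, 1525   # MSI 390 "OC mode" clocks from the post
oc_core, oc_mem = 1175, 1700         # daily overclock from the post

core_gain = (oc_core / stock_core - 1) * 100
mem_gain = (oc_mem / stock_mem - 1) * 100
print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")  # core +10.8%, memory +11.5%
```

So "basically 10% on both" checks out, and the 6-8% in-game gain being smaller than the clock gain is typical, since games rarely scale linearly with clocks.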


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, just to keep things nice and cool, and in the hopes of preserving the card....
> 
> I settled on 1175/1700 @ 75mv/50mv as my daily clocks.
> 
> Played a few rounds of BF4 @ 4k ultra settings (with AA off) this morning.
> 
> The core nor the vrm broke 70c (70% gpu fan). Things are kinda loud under load (due to the case fans), but i play my games cranked up anyways.
> 
> These fans have been a fabulous addition to this rig, which for all intents and purposes is practically the size of a micro atx...
> 
> 60Fps+ in battlefield @4k on ultra is still a sight to behold, even in 2015!
> 
> Looking forward to crossfire on these cards, and temps are no concern now.


What fans did you add Agent?


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Stock (oc mode) clock on msi 390 is 1060/1525
> 
> I am basically running a 10% oc on both the core and memory. It has netted me a 6-8% performance gain depending on title.
> 
> My card will exceed the performance if stock 390x at these speeds, but of course, the 390x has just as much headroom, and will generally be around 3% faster at same clocks. Not quite worth a $100+ in my opinion, but that's just me....


Ok, one last thing, when I get my MSI R9 390, do you think I can add you on Steam to guide me through overclocking? I'd like to have someone to ask quick questions to. Is this ok? If it is, I'll PM you my Steam.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> What fans did you add Agent?


Cooler Master Jetflow 120's in white LED....

95CFM each









Somehow got them for $8.99 a piece on amazon, but they are usually $16.99.... Got two for one, but these things are worth it, even at the regular price.

Quote:


> Originally Posted by *Noirgheos*
> 
> Ok, one last thing, when I get my MSI R9 390, do you think I can add you on Steam to guide me through overclocking? I'd like to have someone to ask quick questions to. Is this ok? If it is, I'll PM you my Steam.


It wouldn't do you too much good my friend, I am hardly ever on steam at all.... you are welcome to PM on OCN though, that goes to my phone email.


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Cooler Master Jetflow 120's in white LED....
> 
> 95CFM each
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Somehow got them for $8.99 a piece on amazon, but they are usually $16.99.... Got two for one, but these things are worth it, even at the regular price.
> It wouldn't do you too much good my friend, I am hardly ever on steam at all.... you are welcome to PM on OCN though, that goes to my phone email.


Will do.


----------



## DannyDK

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, just to keep things nice and cool, and in the hopes of preserving the card....
> 
> I settled on 1175/1700 @ 75mv/50mv as my daily clocks.
> 
> Played a few rounds of BF4 @ 4k ultra settings (with AA off) this morning.
> 
> The core nor the vrm broke 70c (70% gpu fan). Things are kinda loud under load (due to the case fans), but i play my games cranked up anyways.
> 
> These fans have been a fabulous addition to this rig, which for all intents and purposes is practically the size of a micro atx...
> 
> 60Fps+ in battlefield @4k on ultra is still a sight to behold, even in 2015!
> 
> Looking forward to crossfire on these cards, and temps are no concern now.


How much VRAM was used for that setting?


----------



## Agent Smith1984

Quote:


> Originally Posted by *DannyDK*
> 
> How much VRAM was used for that setting?


2.9GB


----------



## Flash Gordon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, just to keep things nice and cool, and in the hopes of preserving the card....
> 
> I settled on 1175/1700 @ 75mv/50mv as my daily clocks.
> 
> Played a few rounds of BF4 @ 4k ultra settings (with AA off) this morning.
> 
> The core nor the vrm broke 70c (70% gpu fan). Things are kinda loud under load (due to the case fans), but i play my games cranked up anyways.
> 
> These fans have been a fabulous addition to this rig, which for all intents and purposes is practically the size of a micro atx...
> 
> 60Fps+ in battlefield @4k on ultra is still a sight to behold, even in 2015!
> 
> Looking forward to crossfire on these cards, and temps are no concern now.


Hey good luck CF'ing lol

The back plate gets so hot it will auto add 20 C to the top card off the rip.

My cards by themselves can handle Unigine pretty well, maybe 72 C - 75 C. When I put them together only the bottom card acts normal.

If you figure out a way to dissipate the heat between card 1 and card 2, let me know...

I'm thinking of building a little bracket, like the one that holds the fans on the side of a Noctua NH-D14, to blow air directly out of the middle of both cards. These temps have got me petrified to overclock lol


----------



## DannyDK

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 2.9GB


That's impressive how little was used. Is that with V-sync and AA enabled and everything on the absolute highest? I can't hit 60fps with my 980 @ 4k with the settings I just wrote :-(


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 2.9GB


Which one performs better? The MSI 390 or the Gigabyte G1 390?


----------



## RiasHaise

Ok, my R9 390 has been returned and I thought I would try an MSI GTX 980 after seeing amazing performance at 1440p from it. The person achieved a stable 60fps with an occasional drop to 55fps at true 1440p with 2x MSAA, everything set to very high, and extended distance scaling and extended shadow distance cranked up fully. The GTX 980 was overclocked to 1500MHz with the GPU staying at 70 degrees C. My GTX 980 is on order, and I was very pleased to find out it would arrive Monday-Tuesday with a free pre-order code for Metal Gear Solid V: The Phantom Pain.


----------



## Flash Gordon

My FireStrike scores with MSI 390s in CF.

Stock btw, haven't touched a thing except fan profiles.



Edit: According to the graph it says I can't game at 4k, but I have been easily crushing 3k gaming lol. I'm sure if I turn off AA or lower other settings like Hairworks in Witcher 3 I could easily run through 4k at 60+ FPS.

Idk how accurate that chart is


----------



## RiasHaise

Quote:


> Originally Posted by *Flash Gordon*
> 
> My FireStrike scores with MSI 390s in CF.
> 
> Stock btw haven't trouched a thing except fan profiles.
> 
> 
> 
> Edit: According to the graph it says I can't game at 4k but I have been easily crushing 3k gaming lol im sure if I turn off AA or lower other settings like Hairworks in Witcher 3 I could easily run through 4k 60+ FPS.
> 
> Idk how accurate that chart is


Very nice







It's amazing how much performance you can get out of crossfire. Do you get any stuttering at all or any other negatives because of crossfire?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Flash Gordon*
> 
> My FireStrike scores with MSI 390s in CF.
> 
> Stock btw haven't trouched a thing except fan profiles.
> 
> 
> 
> Edit: According to the graph it says I can't game at 4k but I have been easily crushing 3k gaming lol im sure if I turn off AA or lower other settings like Hairworks in Witcher 3 I could easily run through 4k 60+ FPS.
> 
> Idk how accurate that chart is


A few things....

I've already run 290s in crossfire, and they did get hot (88c top, 77c bottom), but that was with stock case fans; I should now do much better with these Jetflos....

Also, I never run AA at 4k, because you don't need it. That's the point of running such a high resolution.

As far as your crossfire score goes, the graphics score is fine; your overall score is low because you are running your CPU at stock. Get that physics score in the 9k range with overclocking, and you'll see the overall score shoot way up.
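The reason a stock CPU drags the whole result down is that 3DMark combines the sub-scores as a weighted harmonic mean, which punishes any weak component. The weights below are illustrative assumptions for the sketch, not Futuremark's published values (those are in their technical guide); the shape of the formula is what matters.

```python
def overall(graphics: float, physics: float, combined: float,
            w=(0.75, 0.15, 0.10)) -> float:
    """Weighted harmonic mean of sub-scores: a weak sub-score drags the total down.
    Weights are placeholders for illustration, not Futuremark's actual values."""
    wg, wp, wc = w
    return 1.0 / (wg / graphics + wp / physics + wc / combined)

# Same graphics/combined scores, physics 6000 (stock CPU) vs 9000 (overclocked):
print(overall(12000, 6000, 5000))  # noticeably lower overall
print(overall(12000, 9000, 5000))  # overall jumps up
```

With a harmonic mean, raising the physics score from 6k to 9k lifts the overall score far more than the same absolute bump to an already-high graphics score would.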


----------



## tbob22

Upgraded to Win10. The stuttering is back.. Only much worse now, and my FPS took a huge drop. In the same games I was playing before, my FPS went from a solid 60 to 30-50fps.

Completely uninstalled the drivers, used Driver Sweeper, and reinstalled. No difference.
GPU usage pegged at 100%, clocks are 1010/1500.. Really strange.. I'm about to drop back to 8.1, this is crazy.

Edit: So it's not 100% all of the time. It gets stuck at 100%, then it goes directly to 0% for a while.. Not much in-between.. I even dropped the settings to the lowest and it does the same thing; as soon as it jumps up to 100%, the fps drop out of nowhere. It seems to be worse in certain games.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tbob22*
> 
> Upgraded to Win10. The stuttering is back.. Only much worse now, and my FPS took a huge drop. In the same games that I was playing before my FPS went from a solid 60 to 30-50fps.
> 
> Completely uninstalled the drivers, used Driver Sweeper, and reinstalled. No difference.
> GPU usage pegged at 100%, clocks are 1010/1500.. Really strange.. I'm about to drop back to 8.1, this is crazy.
> 
> Edit: So it's not 100% all of the time. It gets stuck at 100%, then it go directly to 0% for a while.. Not much in-between.. I even dropped the settings to the lowest and it does the same thing, as soon as it jumps up to 100%, the fps drop out of nowhere. It seems to be worse in certain games.


Just a shot in the dark, but try turning off your pagefile, turning off ULPS in afterburner.

Are the cards running at full utilization?


----------



## tbob22

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just a shot in the dark, but try turning off your pagefile, turning off ULPS in afterburner.
> 
> Are the cards running at full utilization?


It shouldn't be. The game I'm testing right now is Trackmania Valley, it used to use only about 40% of the GPU on average, now it's at 100% sometimes and then it drops to 0% and vice versa. The Talos Principle does the same thing. Portal seems to work smoothly, instead of jumping from 0% to 100% it jumps from 0% to 80% or 90%.

On Win 8 it was all well under 100% and it never dropped to 0% like that.

Edit: Furmark does the same thing. Something is up.
Edit2: Disabling ULPS and bumping the power limit to 25% fixed Furmark.
3DMark looks fine, same score as before. These games must need an update or something...


----------



## Jared2608

Quote:


> Originally Posted by *tbob22*
> 
> Upgraded to Win10. The stuttering is back.. Only much worse now, and my FPS took a huge drop. In the same games that I was playing before my FPS went from a solid 60 to 30-50fps.
> 
> Completely uninstalled the drivers, used Driver Sweeper, and reinstalled. No difference.
> GPU usage pegged at 100%, clocks are 1010/1500.. Really strange.. I'm about to drop back to 8.1, this is crazy.
> 
> Edit: So it's not 100% all of the time. It gets stuck at 100%, then it go directly to 0% for a while.. Not much in-between.. I even dropped the settings to the lowest and it does the same thing, as soon as it jumps up to 100%, the fps drop out of nowhere. It seems to be worse in certain games.


Just a thought: have you confirmed that the drivers you installed are actually the ones in use? Windows 10 apparently tries to install the Microsoft-signed drivers if you don't tell it not to, per piece of hardware.


----------



## tbob22

Quote:


> Originally Posted by *Jared2608*
> 
> Just a thought, have you confirmed that the drivers you installed are actually the ones in the system. Windows 10 apparently tried to install the Microsoft Signed drives if you don't tell it not to per piece of hardware.


Yeah, the correct ones are installed (15.200.1062.1002).

On another note, I was updating The Talos Principle and then opened MSI Afterburner, then my PC froze solid. Restarted and of course it is now missing and it's installing from scratch. Black Mesa is now missing as well.

Also, my search in the start menu is only picking up some of my desktop apps so I now have to find them in Program Files to open them, I even added some extra shortcuts in Programs and it still won't add them.








Time to switch to launchy again I guess. Win8 had no problem picking up shortcuts.


----------



## Jared2608

Did you do a fresh install or just the upgrade to Windows 10? I have never been a fan of in-place upgrades; although they mostly work without issue, I always prefer the old school approach: back up everything and fresh install from scratch.


----------



## tbob22

Quote:


> Originally Posted by *Jared2608*
> 
> Did you do a fresh install or just the upgrade to Windows 10? I have never been a fan of in place upgrades, although mostly they work without issue I always prefer the old school story, backup everything and fresh install from scratch.


I upgraded. I have way too much stuff in place to even consider a fresh install. It would take a month to get all of my applications set up again as this is my work machine as well. Vista>7>8>8.1 were all fairly painless and I never had issues like this.

Edit: After the update, The Talos Principle seems to be back to normal.


----------



## DannyDK

Since Windows 10 has been mentioned, I want to ask if any of you guys with AMD cards have issues when playing a movie or watching Netflix? I'm having issues with the sound getting distorted and frame stuttering (at the same time as the sound distorts); in fact it's also present when playing games. Is it only an Nvidia thing, or just me alone?


----------



## tbob22

Quote:


> Originally Posted by *DannyDK*
> 
> Since Widows 10 has beed mentioned, i want to ask if any of you guys wuth AMD cards have issues when playing a movie or watching Netflix? Im having issues with the sound getting distorted and framestuttering (same time as the sound distorts), in fact its also pressent when playing games, is it only an Nvidia thing or just me alone?


Haven't seen anything like that. Sounds more like a sound card driver issue to me.


----------



## DannyDK

Quote:


> Originally Posted by *tbob22*
> 
> Haven't seen anything like that. Sounds more like a sound card driver issue to me.


But the stuttering of the framerate in both games and movies is also an issue and comes at the exact same time as the distortion.


----------



## tbob22

Quote:


> Originally Posted by *DannyDK*
> 
> But the stuttering of the framerate in both games and movies is also an issue an comes at the ecxact same time as the distortion


It could be a number of things causing the issue. Check out LatencyMon to see if you notice any drivers going crazy with pagefaults.


----------



## Agent Smith1984

Behold, the perfect hour of 4K BF4.......

Now THAT is how an AB readout should look!!!


----------



## jackalopeater

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Behold, the perfect hour of 4K BF4.......
> 
> Now THAT is how an AB readout should look!!!


I'm SO stoked to see this since I've got a 4k Monoprice monitor coming this week for review









Just moved to win10 and results are pretty good with my max overclock atm, Asus DCUII 390x 1150/1600
http://www.3dmark.com/fs/5585789


----------



## Noirgheos

Which one is faster? The G1 Gaming or the Twin Frozr V?


----------



## jackalopeater

Quote:


> Originally Posted by *Noirgheos*
> 
> Which one is faster? The G1 Gaming ir the Twin Frozr V?


From what I've been seeing, the MSI Twin Frozr V is going to be the way to go.


----------



## gendumz

Hi, nice to see there's a thread about r9 390.
I'm planning to buy one, but at where I live all sapphire and XFX have been sold out, so there's only powercolor. Is it good? Especially the fan, quietness, and oc potential.
Thx


----------



## Flash Gordon

Quote:


> Originally Posted by *RiasHaise*
> 
> Very nice
> 
> 
> 
> 
> 
> 
> 
> It's amazing how much performance you can get out of crossfire. Do you get any stuttering at all or any other negatives because of crossfire?


Yeah, in Lords of the Fallen I had some stuttering; I think I had to adjust some settings in CCC... I forgot, but I Googled the issue and took care of it.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> A few things....
> 
> I've already ran 290's in crossfire, and they did get hot (88c top, 77c bottom), but that was with stock case fans, i should now do much better with these jetflos....
> 
> Also, i never run aa with 4k, because you don't need it. That's the point of running such a high resolution.
> 
> As far as your crossfire score, the graphics score is fine, your overall score is low because you are running your cpu at stock. Get that physics score in the 9k range with overclocking, and you'll see the overall score shoot way up.


Yeah, my CPU is like a baby kitten compared to my two cheetah-like GPUs, but I honestly didn't know that about 4k gaming and AA.

While I do not have a 4k monitor, it's nice that a system can max out any game.

I think I'll pick up some Jetflos to see if they keep my top card a little cooler.

Thanks for the response I am not experienced with overclocking mostly due to paranoia of breaking something lol


----------



## kizwan

Quote:


> Originally Posted by *DannyDK*
> 
> Since Widows 10 has beed mentioned, i want to ask if any of you guys wuth AMD cards have issues when playing a movie or watching Netflix? Im having issues with the sound getting distorted and framestuttering (same time as the sound distorts), in fact its also pressent when playing games, is it only an Nvidia thing or just me alone?


You need to install codec, e.g. K-Lite Codec.


----------



## BlaXey

Quote:


> Originally Posted by *Flash Gordon*
> 
> Yeah in Lord of the Fallen I had some stuttering I think I had to adjust some settings in CCC...I forgot but I Googled the issue and took care of it.
> Yeah my CPU is like a baby kitten compared to my two Cheetah-like GPUs but I honestly didn't know that about 4k gaming and AA.
> 
> While I do not have a 4k monitor its nice that a system can max out any game.
> 
> I think I'll pick up some Jetflos to see if they keep my top card a little cooler.
> 
> Thanks for the response I am not experienced with overclocking mostly due to paranoia of breaking something lol


The AeroCool Sharks are good coolers. I have a lot of them in my Zalman Z11; they are noisy at full power, but I use headphones for music and games so I don't have any problems, and they move more air than the Jetflo fans and are cheaper.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jackalopeater*
> 
> I'm SO stoked to see this since I've got a 4k Monoprice monitor coming this week for review
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just moved to win10 and results are pretty good with my max overclock atm, Asus DCUII 390x 1150/1600
> http://www.3dmark.com/fs/5585789


Dat physics score tho!
What's that CPU running at? I'm guessing 5.1+? Sorry if it's in your sig, I'm on mobile.


----------



## jackalopeater

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Dat physics score tho!
> What's that CPU running at? I'm guessing 5.1+? Sorry if it's in your sig, I'm on mobile.


Lol, yeah, I was really pushing to try and hit 4k on the combined score. CPU was at 5.3ghz and ram at 2400cl10, so I was def giving it hell


----------



## tbob22

Quote:


> Originally Posted by *gendumz*
> 
> Hi, nice to see there's a thread about r9 390.
> I'm planning to buy one, but at where I live all sapphire and XFX have been sold out, so there's only powercolor. Is it good? Especially the fan, quietness, and oc potential.
> Thx


They are all moderately loud under load. Someone on Newegg claimed that they got 1300/1650 on the PowerColor; unless they are going crazy with the volts, that seems really high. Mine maxes out at about 1200/1650 at +100mv.


----------



## tbob22

Bumped up the clocks a bit and bumped my CPU to 4.5ghz. This is at +75mv.


http://www.3dmark.com/fs/5596340


----------



## CamsX

Subscribed! Read every single post on this thread until now.

Can't wait to get an R9 390 to replace my Sapphire 7970. Not expecting any ground breaking performance improvements, but the main reason to upgrade is because I want to get a Freesync Monitor.

I decided that I will go with the Sapphire Nitro model, making it the 5th straight card I get from them (after a 6870, 2 7970s, and a used 7770 for my wife's PC). Other than the lack of proper Crossfire support back when I was rocking it (mantle BF4), all of them have performed as expected or better, and the 6870/7770 are still working great with their new owners.

Already moved the G10-cooled card to my wife's computer to make room. I have never really experimented much with GPU overclocking, but since there is so much useful info here I might give it a shot. Not expecting much, as the ambient temp here is 30°C~ all year round. The Nitro barely fits in my Phantom 630, hehe, and I've already ordered a couple of NZXT 140mm 83cfm fans to replace the 120mm at the bottom of my case.









I'll update once everything is ready to go.


----------



## Jerseyseven

I have the MSI R9 390 Gaming and it seems like I'm experiencing some overheating. My MSI R9 390 goes up to 92 Degrees when I play Sleeping Dogs on 1440p ultra. Any ideas why this is happening?


----------



## tbob22

Quote:


> Originally Posted by *Jerseyseven*
> 
> I have the MSI R9 390 Gaming and it seems like I'm experiencing some overheating. My MSI R9 390 goes up to 92 Degrees when I play Sleeping Dogs on 1440p ultra. Any ideas why this is happening?


What speeds are the fans running at? That is way too hot. Assuming ambient is somewhat normal, the fans are running normally (they should be running at 100% at those temps), and you have good airflow, then you'll probably want to contact MSI.


----------



## Jerseyseven

Quote:


> Originally Posted by *tbob22*
> 
> What speeds are the fans running at? That is way too hot. Assuming ambient is somewhat normal, the fans are running normally (they should be running at 100% at those temps), and you have good airflow, then you'll probably want to contact MSI.


I'm running a Define R5. 1 stock exhaust fan at the back, 1 stock intake fan, and 1 Fractal Venturi fan intake too. Temps just seem too high. It could be the weather here in Singapore. But it shouldn't hit 92-93 right?


----------



## LongRod

Does anyone have a download for the 15.7 drivers for Windows 10? It turns out that the GTA V bug affecting some 300 series cards on 15.7.1 is hitting me hard; GTA V won't run for more than 5 minutes now without crashing.


----------



## BlueFunk

Hello!

I picked up a PCS+ 390 on Friday and have been mucking about with it since. I got it from PLE in Aus for about $A60 less than the MSI version, which seemed a good deal. The 390x version of the PCS+ was another $A140 and the Fury/ Fury X were about $A545 more expensive, which made even less sense (being more than 100% more expensive, without anywhere near that increase in performance).

Thanks to all who posted as that gave me some reference points.

I'll add my data points to the group.

ASIC 70.8%

Below are stable OCs at various voltages, tested in UNIGINE, anything above these on the core clock at the respective voltage resulted in instability. This is using 15.7 drivers (haven't updated to 15.7.1 yet).
core/memory

stock: 1060/1600
+50mV: 1100/1700
+75mV: 1125/1600
+100mV: 1140/1600
I haven't finished testing mem clocks (notice I bumped the mem clock to 1700MHz at the +50mV OC; this is without touching aux voltage so far). At the moment I am settling in at the +50mV level for daily use.

Also, at all voltages listed above, my GPU didn't exceed 72 degrees and the VRM maxed at 70 degrees. Max fan speed for all this was around 55% too (achieved on the +100mV) on a custom curve (which might not be optimised yet). All in all the cooler appears to be very good on this card, even if my OC's were below average. Noise has been pretty good so far, but it does get noticeable above 50% fan speed.

membership proof.


Cheers.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jerseyseven*
> 
> I'm running a Define R5. 1 stock exhaust fan at the back, 1 stock intake fan, and 1 Fractal Venturi fan intake too. Temps just seem too high. It could be the weather here in Singapore. But it shouldn't hit 92-93 right?


You need more exhaust. Two intakes and one exhaust fan do you no good, because you aren't able to expel the heat from the card.

Have you tried running a custom fan profile on AfterBurner?


----------



## Agent Smith1984

Quote:


> Originally Posted by *BlueFunk*
> 
> Hello!
> 
> I picked up a PCS+ 390 on Friday and have been mucking about with it since. I got it from PLE in Aus for about $A60 less than the MSI version, which seemed a good deal. The 390x version of the PCS+ was another $A140 and the Fury/ Fury X were about $A545 more expensive, which made even less sense (being more than 100% more expensive, without anywhere near that increase in performance).
> 
> Thanks to all who posted as that gave me some reference points.
> 
> I'll add my data points to the group.
> 
> ASIC 70.8%
> 
> Below are stable OCs at various voltages, tested in UNIGINE, anything above these on the core clock at the respective voltage resulted in instability. This is using 15.7 drivers (haven't updated to 15.7.1 yet).
> core/memory
> 
> stock: 1060/1600
> +50mV: 1100/1700
> +75mV: 1125/1600
> +100mV: 1140/1600
> I haven't finished testing mem clocks (notice i bumped the mem clock to 1700MHz at the +50mV OC - this is without touching aux voltage so far). At the moment I am settling in at +50mV level for daily use.
> 
> Also, at all voltages listed above, my GPU didn't exceed 72 degrees and the VRM maxed at 70 degrees. Max fan speed for all this was around 55% too (achieved on the +100mV) on a custom curve (which might not be optimised yet). All in all the cooler appears to be very good on this card, even if my OC's were below average. Noise has been pretty good so far, but it does get noticeable above 50% fan speed.
> 
> membership proof.
> 
> 
> Cheers.


Added...

Have I missed anyone? Been slammed with work and some personal issues.... if you have submitted proof and have not been added, just let me know!
Thanks


----------



## rv8000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You need more exhaust. Two intakes, and one exhaust fan does you no good, because you aren't able to expel the heat from the card.
> 
> Have you tried running a custom fan profile on AfterBurner?


If he still has the HDD bays in, it is likely he actually has REALLY poor airflow.

Jersey, I'd remove any HDD bays that block the intakes if you can / are not using them. It's going to take high RPM fans to offset how little air you can draw in from the front of that case as opposed to other cases. Even just opening the door on the front of the case can reduce temps by a couple of degrees.

I have a Define S, and even with 3 140mm intakes and a 140mm exhaust, they don't do much of anything until I have them @ 1000+ rpm through my fan controller, and at that point they're too noisy for me







. Going from ~830rpm to 1050rpm on the fans was dropping my idle temps by 3-4c; I didn't check load temps. Long story short, most Fractal cases are designed with silence in mind, and it's my personal belief that in order to have good airflow/temps for a high-end system you end up sacrificing that, due to the need for faster/louder fans.


----------



## tbob22

Quote:


> Originally Posted by *Jerseyseven*
> 
> I'm running a Define R5. 1 stock exhaust fan at the back, 1 stock intake fan, and 1 Fractal Venturi fan intake too. Temps just seem too high. It could be the weather here in Singapore. But it shouldn't hit 92-93 right?


What's ambient? What fan speed is the GPU running at?

Take your sidepanel off and put a box fan pulling the air out as a test. If it still goes up to those temps then the heatsink probably isn't making proper contact.
Quote:


> Originally Posted by *BlueFunk*
> 
> ....
> core/memory
> 
> stock: 1060/1600
> +50mV: 1100/1700
> +75mV: 1125/1600
> +100mV: 1140/1600
> I haven't finished testing mem clocks (notice i bumped the mem clock to 1700MHz at the +50mV OC - this is without touching aux voltage so far). At the moment I am settling in at +50mV level for daily use.
> 
> Also, at all voltages listed above, my GPU didn't exceed 72 degrees and the VRM maxed at 70 degrees. Max fan speed for all this was around 55% too (achieved on the +100mV) on a custom curve (which might not be optimised yet). All in all the cooler appears to be very good on this card, even if my OC's were below average. Noise has been pretty good so far, but it does get noticeable above 50% fan speed.
> ...


Good to know







. Those results are similar to mine. At +100mv I was able to get up to about 1200/1650, but that's not really usable as it artifacts a bit. I haven't done a lot of testing yet, so I'm not sure what max stable is. The noise does get a little crazy with Furmark, but it stays nice and cool, and I'd rather have a bit of noise and get a few more years out of the GPU in a second PC later down the road.


----------



## jackalopeater

Quote:


> Originally Posted by *LongRod*
> 
> Does anyone have a download for the 15.7 drivers for Windows 10? It turns out that the GTAV problem with some 300 series cards bug on 15.7.1 is hitting me hard, GTAV won't run for more than 5 minutes now without it crashing.


I'm using 15.7.1 with a 390x, haven't had that problem tho, but I haven't been playing GTAV very much since moving to Win10


----------



## LongRod

Quote:


> Originally Posted by *jackalopeater*
> 
> I'm using 15.7.1 with a 390x, haven't had that problem tho, but I haven't been playing GTAV very much since moving to Win10


Yeah, 15.7.1 was working fine for me on Windows 7 until I updated to 10, but it is a known bug so I'll just chalk it up to me being unlucky.

Maybe I'll try reinstalling 15.7.1 again and seeing if it works fine.


----------



## BlueFunk

Cooling performance is definitely what I am most impressed by. My max core temp was in the 70s, but at stock it runs in the mid 60s. Ambient where I am would have been around 20 degrees, as it is winter here, so we are talking a delta over ambient of around 40 to 50.

It will be interesting to see how it performs in summer when it gets up to 40 degrees ambient. My old reference 7970 would hit 90 at stock, and be very loud of course. The WindForce would get up into the high 80s (I got a Gigabyte 7970 as well from an RMA).
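That delta-over-ambient reasoning can be made concrete: if the cooler holds roughly the same delta regardless of season, the summer core temp is just summer ambient plus the winter delta. A back-of-the-envelope sketch using the figures above (treating the delta as constant is itself an approximation):

```python
def projected_core_temp(ambient_c, delta_c):
    """Project a core temp assuming the cooler holds a constant delta over ambient."""
    return ambient_c + delta_c

# Winter figures from the post: ~20 C ambient, mid-60s core at stock.
stock_delta = 65 - 20          # roughly a 45 C delta at stock clocks
print(projected_core_temp(40, stock_delta))  # 40 C summer ambient -> 85
```

So a 40-degree summer would put stock-clock load temps in the mid 80s, toasty but within spec for Hawaii/Grenada silicon.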


----------



## jon666

I think I will be able to throw on one of my universal Swiftech blocks after looking at the PCB on a few sites. I'll probably use a case fan to cool the rest of the GPU. Just gotta wait for my CPU block to get back in stock... should have everything back under water next month if I am lucky. Hopefully ambient temps drop to good OC'ing weather; it's been hitting the mid-nineties, so I dropped everything back to stock. Otherwise I sweat until I can't sweat anymore while gaming. 2x360 rads for CPU and GPU, so I should be able to keep everything reasonable. In the meantime, keep posting those overclock settings; I will be using them for reference. Hopefully I don't have a dud.


----------



## Flash Gordon

I took your advice @Agent Smith1984 and upped the clocks on my CPU, and decided to bump my RAM as well.



Just shy of the 9k you were talking about.

I wonder if my scores would go up if I upgraded to the i7?







Though I imagine the real-world impact (games) wouldn't be all that different.


----------



## BlaXey

What programs are you using to test the gpu overclock?

Sent from my MX4 using Tapatalk


----------



## Ha-Nocri

Unigine Heaven, Fire Strike...


----------



## BlaXey

Is the Valley benchmark really bad for this?

Sent from my MX4 using Tapatalk


----------



## Ha-Nocri

I don't use it much since it's more CPU-bound than Heaven. Also, Heaven is good for showing graphical artifacts.


----------



## SystemTech

Quote:


> Originally Posted by *SystemTech*
> 
> Finally installed GPU-z and took a screenshot
> 
> 
> 
> 
> 
> 
> 
> 
> Acquired her beginning of July and still running stock.
> Not for long though


Mr Smith,
Please do not forget me


----------



## Noirgheos

Just wondering: since the MSI 390/390X cards are the popular ones, it would make sense that we'd hear from more people with issues on them, right?

What games have you guys tested the 390 on? Have they all been running well? No stuttering or spikes in GPU usage?


----------



## BlueFunk

In the past few days I've tested/played the following games at different clock speeds without problems (except when I exceeded stable overclocks): Tomb Raider, Shadow of Mordor, Elite: Dangerous, War Thunder, BF4, Max Payne 3, and BioShock Infinite. For all of them, GPU loads and core/memory clocks were stable (i.e., GPU at 100%, clocks at whatever I set them).

For example, for Tomb Raider at 1440p, maxed out (past the ultra preset), no AA, I got the following in the canned benchmark (with a 3570K OC'ed to 4GHz, 16GB RAM) without any hitches:

Stock: 57.5 fps
1060/1500: 60.1
1100/1700 (+50mV): 62.4
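A useful sanity check on numbers like these is how much of the clock increase actually shows up as FPS. A quick calculation using the two overclocked runs above (percentages computed against the 1060/1500 run):

```python
# FPS results from the Tomb Raider benchmark quoted above.
runs = {"1060/1500": 60.1, "1100/1700": 62.4}
core = {"1060/1500": 1060, "1100/1700": 1100}

base = "1060/1500"
for name in runs:
    clock_gain = core[name] / core[base] - 1
    fps_gain = runs[name] / runs[base] - 1
    print(f"{name}: +{clock_gain:.1%} core -> +{fps_gain:.1%} FPS")
```

Here the +3.8% core clock buys almost exactly +3.8% FPS (the memory bump helps too), which suggests the benchmark is fully GPU-bound at these settings.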


----------



## Noirgheos

Quote:


> Originally Posted by *BlueFunk*
> 
> In the past few days I've tested/played the following games at different clock speeds without problems (except when I exceeded stable overclocks): Tomb Raider, Shadow of Mordor, Elite: Dangerous, War Thunder, BF4, Max Payne 3, and BioShock Infinite. For all of them, GPU loads and core/memory clocks were stable (i.e., GPU at 100%, clocks at whatever I set them).
> 
> For example, for Tomb Raider at 1440p, maxed out (past the ultra preset), no AA, I got the following in the canned benchmark (with a 3570K OC'ed to 4GHz, 16GB RAM) without any hitches:
> 
> Stock: 57.5 fps
> 1060/1500: 60.1
> 1100/1700 (+50mV): 62.4


All games performed well. No stuttering?


----------



## BlueFunk

Quote:


> Originally Posted by *Noirgheos*
> 
> All games performed well. No stuttering?


I didn't notice anything, but I must admit I haven't had time for long gaming sessions and have mostly been testing, so I was concentrating on temps, sounds, clock speeds and the like, checking there weren't any obvious faults. I did play Max Payne 3 (it's been on my 'to do' list for ages) for a couple of hours without any dramas, and Elite for a couple of hours as well. So yeah, everything has been smooth.

Have you been having dramas? I hear GTA V with 15.7.1 drivers has stutters for some.


----------



## Noirgheos

Quote:


> Originally Posted by *BlueFunk*
> 
> I didn't notice anything, but I must admit I haven't had time for long gaming sessions and have mostly been testing, so I was concentrating on temps, sounds, clock speeds and the like, checking there weren't any obvious faults. I did play Max Payne 3 (it's been on my 'to do' list for ages) for a couple of hours without any dramas, and Elite for a couple of hours as well. So yeah, everything has been smooth.
> 
> Have you been having dramas? I hear GTA V with 15.7.1 drivers has stutters for some.


I'm just worried about the card...

I don't even have it yet. Got a 390X but returned it for a 390 and an extra $140.


----------



## Netix

Hi guys,

I'm from Canada and I got a VisionTek Radeon R9 390X for $445 CAD. Should I keep it? What do you guys think of this company? Every other brand is going for $530 to $570.

Any input on this model? https://www.visiontek.com/graphics-cards/visiontek-radeon-r9-390x-detail.html

I'm running GTA V at Ultra, but my temps are around 83-84 C. Is that high?


----------



## BlaXey

Hello, add me to the GPU list. I have an XFX R9 390, currently at 1180MHz core and 1700MHz memory with +50mV. Max temps in Assassin's Creed Unity are 70°C after 20 minutes, very nice, and the same goes for the Heaven benchmark.


----------



## Piccolo55

I'm looking at one of these two cards for my new PC build. Is there a huge difference between the 390 and 390X? Also, I saw the tuning guide, but I was wondering what people think the best tune is?


----------



## BlueFunk

Quote:


> Originally Posted by *BlaXey*
> 
> Hello, add me to the GPU list. I have an XFX R9 390, currently at 1180MHz core and 1700MHz memory with +50mV. Max temps in Assassin's Creed Unity are 70°C after 20 minutes, very nice, and the same goes for the Heaven benchmark.


Nice, I could only get 1100 on the core at that voltage increase. Even going to 1105 resulted in artifacting.

Thermals definitely haven't been the limiting factor for my overclocking.


----------



## Gumbi

Quote:


> Originally Posted by *BlaXey*
> 
> Hello, add me to the GPU list. I have an XFX R9 390, currently at 1180MHz core and 1700MHz memory with +50mV. Max temps in Assassin's Creed Unity are 70°C after 20 minutes, very nice, and the same goes for the Heaven benchmark.


That's an excellent overclock... how hot are the VRMs getting at that voltage?


----------



## DannyDK

So I have just now taken my 980 out of my case and am only using the iGPU on my i7, and now the sound and everything is great, so what do you think is going on?
I have tried different drivers and BIOSes, but it didn't change anything. Is my card broken somehow? I didn't have any issues with Win 8.1, but with Win 10 it's bad. I have even done a clean install of Win 10 and it didn't help either :-(


----------



## flopper

Quote:


> Originally Posted by *Gumbi*
> 
> That's an excellent overclock.. how hot are the VRMs getting at that voltage?


Should be low; XFX redesigned the cooler and added heatsink cooling for the VRM.

http://hardforum.com/showpost.php?p=1041767187&postcount=5


----------



## Gumbi

That's good to hear!


----------



## CerealKillah

Well, I have completed the new build and should begin testing/overclocking tonight.

I will likely start with the CPU (4690K) and then move to the XFX 390. I am still slightly nervous about the 650 watt power supply, but every online calculator indicates I should be fine, even overclocking.

I am hoping to get at least 4.5 GHz out of the CPU (it is under water) and a nice OC on that XFX 390.

I don't have any exhaust fans on my Enthoo Pro, but I do have my big 200 mm front fan blowing right on the R9 390 (hard drive cages are removed) from the front of the case.

Why am I sharing all this info? I don't know... LOL
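For anyone doing the same 650 W sanity check, the arithmetic behind those online calculators is roughly this (the wattage figures below are ballpark assumptions on my part, not measurements):

```python
# Rough steady-state power budget for a build like the one described above.
parts_watts = {
    "R9 390 (overclocked)": 330,      # stock board power ~275 W, plus OC headroom
    "i5-4690K @ 4.5 GHz": 130,        # 88 W TDP at stock, raised for the OC
    "motherboard/RAM/SSD/fans": 70,   # everything else, generously rounded up
}

total = sum(parts_watts.values())
psu_watts = 650
print(f"Estimated load: {total} W ({total / psu_watts:.0%} of a {psu_watts} W PSU)")
```

Around 530 W on a 650 W unit leaves roughly 20% headroom, which matches the calculators saying it should be fine; transient spikes are why you still want a quality unit rather than a bargain-bin one.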


----------



## THUMPer1

Guys, I'm torn between the 390, 390X, and Fury Pro. I will be playing games on a BenQ XL2730Z. Currently I play BF4, CS:GO and Dirty Bomb, so nothing that amazing. I know the Fury isn't that great at 1440p, so that's why I'm considering the 390 and 390X.

I realize CS:GO could be played on an iPhone, so out of those 3 games BF4 is the most taxing.

What say you?


----------



## Agent Smith1984

Quote:


> Originally Posted by *THUMPer1*
> 
> Guys, I'm torn between the 390, 390X, and Fury Pro. I will be playing games on a BenQ XL2730Z. Currently I play BF4, CS:GO and Dirty Bomb, so nothing that amazing. I know the Fury isn't that great at 1440p, so that's why I'm considering the 390 and 390X.
> 
> I realize CS:GO could be played on an iPhone, so out of those 3 games BF4 is the most taxing.
> 
> What say you?


390 will be the best value.

I currently play BF4 at 4K ultra settings with 50+ FPS on a single 390 (overclocked 1170/1700 daily), so 1440 won't be an issue at all.


----------



## Agent Smith1984

Guys, sorry for the delay in getting new members added.

I am dealing with some personal issues right now, and have not had much time to work on this thread, let alone participate in the conversation.

I will get newbies in ASAP.

Thanks for your patience.


----------



## flopper

Quote:


> Originally Posted by *THUMPer1*
> 
> Guys, I'm torn between the 390, 390X, and Fury Pro. I will be playing games on a BenQ XL2730Z. Currently I play BF4, CS:GO and Dirty Bomb, so nothing that amazing. I know the Fury isn't that great at 1440p, so that's why I'm considering the 390 and 390X.
> 
> I realize CS:GO could be played on an iPhone, so out of those 3 games BF4 is the most taxing.
> 
> What say you?


Get the 390, OC it a bit, and you're set.
The main difference with a Fury Pro would be maybe 20fps, but you pay twice the price for it too.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> 390 will be the best value.
> 
> I currently play BF4 at 4K ultra settings with 50+ FPS on a single 390 (overclocked 1170/1700 daily), so 1440 won't be an issue at all.


good to hear.


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> Get the 390, OC it a bit, and you're set.
> The main difference with a Fury Pro would be maybe 20fps, but you pay twice the price for it too.
> good to hear.


390X vs. Fury Pro is normally a 2-8 FPS difference at 1440p, depending on the title, but in BF4 at that resolution it is definitely 3 FPS or less.

You can clock a 390 to perform well past a stock 390x, and even make up the ground gained by the Fury....

Obviously, you can OC all of them for more performance, but just stating the 390 comparison from a value standpoint.


----------



## THUMPer1

I think I'll try the Sapphire 390 Nitro and see how well it will OC. If it's not going well, I will get something else. Thanks!


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 390X VS Fury Pro is normally 2-8 FPS difference at 1440P, depending on the title, but it is definitely only 3FPS or less difference in BF4 at that resolution.
> 
> You can clock a 390 to perform well past a stock 390x, and even make up the ground gained by the Fury....
> 
> Obviously, you can OC all of them for more performance, but just stating the 390 comparison from a value standpoint.


Between a 290X and a Fury, Anand has around a 14fps gap at 2560p, but the value of the card is without doubt impressive.


----------



## koxy

Hey Guys,

I'm thinking about getting a 390, since I returned my MSI GTX 970 due to bad coil whine and the VRAM issue, so I've been without a graphics card since February... Anyway, I'm considering the XFX 390 or MSI 390 (even though I've had a bad experience with that company). Which one is quietest compared to the GTX 970? PowerColor and Sapphire are a no-go because of their length, but if they are quieter than MSI or XFX I can consider them as well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> Between a 290X and a Fury, Anand has around a 14fps gap at 2560p, but the value of the card is without doubt impressive.


290X results from AnandTech are not always reliable, because they were not done on 15.7... they just used their old results in most cases.
Not to mention the 390X is just faster than a 290X in general, when comparing stock results.

Take a look here:

http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/11.html

Mind you, when I look at reviews, I always consider my 390 to be as fast as (or usually faster than) the 390X they are showing results for, since I am overclocked a good bit.


----------



## CamsX

Quote:


> Originally Posted by *THUMPer1*
> 
> I think I'll try the Sapphire 390 Nitro and see how well it will OC. If it's not going well, I will get something else. Thanks!


Same one I ordered earlier this week. I hope all goes well and I join the club officially next week.


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 390 will be the best value.
> 
> I currently play BF4 at 4K ultra settings with 50+ FPS on a single 390 (overclocked 1170/1700 daily), so 1440 won't be an issue at all.


I wonder how you are achieving such high frames in BF4 compared to review sites. Not trying to discredit what you have managed or anything like that; on the contrary, it is really impressive.

I hope I get close to this once I go 1440p.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> I wonder how you are achieving such high frames in BF4 compared to review sites. Not trying to discredit what you have managed or anything like that; on the contrary, it is really impressive.
> 
> I hope I get close to this once I go 1440p.


I'm assuming it's through a lot of overclocking, but I'm not really sure.

The review says the 390X gets about 30FPS with "no AA" at 4K, but gives no exact details.

I will run FRAPS later and check the actual results. I know I get some dips into the 30s, but my average was over 50 last I checked.

I will post screenies, etc....

You are correct though: my personal experience with games normally gives me better results than most benchmarks online.


----------



## BlaXey

Is it recommended to have ULPS disabled?

Sent from my MX4 using Tapatalk


----------



## BlaXey

Re-editing my settings: I was seeing artifacts and had some crashes while gaming, so I started doing more tests, and now I'm using +112mV with the same clocks (1180/1700). Max temps are 75°C with the GPU fans at 75-80%.


----------



## Zack Foo

Hi guys! I'm new here and also new to AMD cards. I used to run an ASUS GTX 970, and someone decided to trade his 1-week-old R9 390 DC3 for my 4-month-old 970, so I'm giving the red team a go this time.

Can anyone tell me a bit more about the R9 390 DC3?

Performance-wise, and also its cooling and acoustics. Reviews of the ASUS Strix R9 390 are nowhere to be found; every single one is of the R9 390X.


----------



## CamsX

Quote:


> Originally Posted by *Zack Foo*
> 
> Hi guys! I'm new here and also new to AMD cards. I used to run an ASUS GTX 970, and someone decided to trade his 1-week-old R9 390 DC3 for my 4-month-old 970, so I'm giving the red team a go this time.
> 
> Can anyone tell me a bit more about the R9 390 DC3?
> 
> Performance-wise, and also its cooling and acoustics. Reviews of the ASUS Strix R9 390 are nowhere to be found; every single one is of the R9 390X.


I'd say you got really lucky on that deal, if it is indeed a DC3 (the 3-fan model). I gather ASUS noticed that having only 2 fans might not be enough to cool the 390 series.

If you don't overclock, I doubt you'll have any noise-related issues. You can simply set a nice, quiet, balanced fan profile and forget about it.

If you plan to overclock it, then you will be the one providing the information around here, as not many people have the now-older DC2 model, much less the DC3.

I think whatever review you find of the 390X DC3 applies just about perfectly to your 390, minus 5 to 10 frames per second in the related benchmarks. It should be faster than the 970 in all games, excluding Nvidia-optimized titles.

Good luck!


----------



## Zack Foo

Quote:


> Originally Posted by *CamsX*
> 
> I'd say you got really lucky on that deal, if it is indeed a DC3 (the 3-fan model). I gather ASUS noticed that having only 2 fans might not be enough to cool the 390 series.
> 
> If you don't overclock, I doubt you'll have any noise-related issues. You can simply set a nice, quiet, balanced fan profile and forget about it.
> 
> If you plan to overclock it, then you will be the one providing the information around here, as not many people have the now-older DC2 model, much less the DC3.
> 
> I think whatever review you find of the 390X DC3 applies just about perfectly to your 390, minus 5 to 10 frames per second in the related benchmarks. It should be faster than the 970 in all games, excluding Nvidia-optimized titles.
> 
> Good luck!


Thanks for the very informative reply! It gives me more confidence about the swap too.
Of course, if I can help out with data gathering and whatnot, I will be very willing to help this community!


----------



## BlueFunk

Quote:


> Originally Posted by *Zack Foo*
> 
> Of course, if I can help out with data gathering and whatnot, I will be very willing to help this community!


I am always interested to see overclock performance, both in terms of stable clock speeds achieved and how that translates into measurable performance gains (in-game FPS). So please add your data points if you can.


----------



## Flash Gordon

Hey guys, I'm back... I honestly have never OC'd GPUs before, seeing as I don't like breaking things, but since everybody else is having success OC'ing these cards, and I have another 3 weeks to return them if anything goes wrong (i.e., I burn my house down accidentally), I decided to OC them.

I took it slow at first, only raising the core clocks until I saw artifacts on screen, and I couldn't get higher than 1120 stable (without artifacts) on the core clock. I honestly didn't spend as much time on the memory, and I am sure I can get it higher than I did, but 1600 is just what I set it to.

One limitation is my PSU. I currently own an EVGA 750 G2, which is simply not enough, as I learned the hard way running 3DMark at 1120 core and 1600 memory.









Safe to say my next upgrade will be a beefier PSU.









Anyway, before I post pics, I just want to give some preliminary info on my cards, to be as transparent as possible:

Here are the ASIC scores for both my cards, so if you have something comparable, maybe you can achieve similar results.

I didn't really know much about these kinds of scores until I saw a video about it, but here y'all go:




That almost-80% ASIC score looks mighty sexy.









Next, some benchmark scores. I saved several throughout the OC'ing process as reference points:

*Stock CPU and Stock GPUs (Crossfire) - Obviously Stable*



*CPU 4.4 GHz @ 1.25v, GPU Core Clock 1120, Mem 1500 (CF) - Stable*



*CPU 4.4 GHz @ 1.25v, GPU Stock Core Clock, Mem @ 1600 (CF) - Stable*



These next three runs were not stable, but I will post them since they were nearly stable; I didn't want to overdo the voltage, seeing as I have a 750 watt PSU and I like having a computer that doesn't turn into a bomb.









Anyway, enjoy (some of these almost passed, ugh):

*CPU 4.4 GHz @ 1.25v, GPU Core Clock 1120, Mem @ 1600 (CF) - Not Stable (artifacts near the end screwed me)*



I then upped the voltage a bit to try and remedy the artifacts and got this:

*CPU 4.4 GHz @ 1.25v, GPU Core Clock 1120, Mem @ 1600 (CF) - Not Stable (blurry textures, pop-in, no artifacts). 3 freakin' points shy of 3K. I just saw a 5960X with a Titan X get a score of 2873, so this unstable result broke my heart.*



I upped the voltage yet again trying to remedy the blurry textures and pop ins and got this:

*CPU 4.4 GHz @ 1.25v, GPU Core Clock 1120, Mem @ 1600 (CF) - Not Stable (no artifacts, no blurry textures or pop-in, but I did see a MASSIVE slowdown near the end of the benchmark, hence the minimum FPS of 6, smh). Oh so close, but yet so far.*



I know it's a lot of info to digest, but I just wanted to share my experiences with the cards in case anybody decides they want to CF these amazing, amazing cards.


----------



## Gumbi

Quote:


> Originally Posted by *CamsX*
> 
> I'd say you got really lucky on that deal, if it is indeed a DC3 (the 3-fan model). I gather ASUS noticed that having only 2 fans might not be enough to cool the 390 series.
> 
> If you don't overclock, I doubt you'll have any noise-related issues. You can simply set a nice, quiet, balanced fan profile and forget about it.
> 
> If you plan to overclock it, then you will be the one providing the information around here, as not many people have the now-older DC2 model, much less the DC3.
> 
> I think whatever review you find of the 390X DC3 applies just about perfectly to your 390, minus 5 to 10 frames per second in the related benchmarks. It should be faster than the 970 in all games, excluding Nvidia-optimized titles.
> 
> Good luck!


2 fans weren't the problem. They literally lifted the 780 cooler design and placed it on the 290(X). It was shoddy work: it didn't cool the VRMs properly, and 2 of the heat pipes made no contact with the GPU at all, IIRC.

I'm glad they redesigned it in any case.


----------



## Darkeylel

Hey there, just got this beautiful card. I needed an upgrade after my faithful 7950 parted ways with this world. Sad days.

But I do have a question. When I initially installed this card, my FPS was amazing in all my games. I then upgraded to Win 10 and had horrible FPS dips with the 15.7.1 drivers, so I formatted and started anew back on Win 7. Now I still have FPS spikes in CS:GO and Black Ops 2; monitoring my GPU usage, it just spikes up and down and doesn't use the card consistently. Any help would be amazing.


----------



## jaydude

Quote:


> Originally Posted by *Darkeylel*
> 
> Hey there, just got this beautiful card. I needed an upgrade after my faithful 7950 parted ways with this world. Sad days.
> 
> But I do have a question. When I initially installed this card, my FPS was amazing in all my games. I then upgraded to Win 10 and had horrible FPS dips with the 15.7.1 drivers, so I formatted and started anew back on Win 7. Now I still have FPS spikes in CS:GO and Black Ops 2; monitoring my GPU usage, it just spikes up and down and doesn't use the card consistently. Any help would be amazing.


What card did you get, and what are your PC's specs? I have been noticing something similar since I upgraded to Windows 10, though only in some games, not all.

I also upgraded from a 7950, so I am not sure if these GPU usage spikes and dips and irregular clocks under load are normal. I do use vsync, though, and it all works as it should when vsync is off: clocks remain steady and GPU usage is 100% when benchmarking and gaming. But when vsync is on, I notice the clocks always fluctuate between around 900 and 1025.


----------



## Darkeylel

Quote:


> Originally Posted by *jaydude*
> 
> What card did you get, and what are your PC's specs? I have been noticing something similar since I upgraded to Windows 10, though only in some games, not all.


Gigabyte version

Pc specs:

I7 3820 stock clock
16gb RAM
Asus p9x79 pro mobo


----------



## Zack Foo

Quote:


> Originally Posted by *BlueFunk*
> 
> I am always interested to see overclock performance, both in terms of stable clock speeds achieved and how that translates into measurable performance gains (in-game FPS). So please add your data points if you can.


Will do as soon as I have the time! I'm running rigid tubing to cool my CPU now, hopefully done by tonight, and when the card arrives tomorrow I can overclock and provide plenty of information.


----------



## Zack Foo

Quote:


> Originally Posted by *Gumbi*
> 
> 2 fans weren't the problem. They literally lifted the 780 cooler design and placed it on the 290(X). It was shoddy work: it didn't cool the VRMs properly, and 2 of the heat pipes made no contact with the GPU at all, IIRC.
> 
> I'm glad they redesigned it in any case.


Now I'm worried, haha, because the DC3 looks like they just took the 980 Ti design and slapped it on.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Flash Gordon*
> 
> Hey guys, I'm back... *(full post with benchmark results snipped for length; see above)*


Thanks for posting results....

What are your temps looking like with both cards running together at full load for a bit?


----------



## CerealKillah

OK, I spent last night getting my 4690K going.

Is my Firestrike in line with a stock 390?

http://www.3dmark.com/3dm/8075933?


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> OK, I spent last night getting my 4690K going.
> 
> Is my Firestrike in line with a stock 390?
> 
> http://www.3dmark.com/3dm/8075933?


My 390 pulled a 12,500 graphics score out of the box at 1040/1500 clock speeds.

Try running at that and see what you get... in most cases the 390 is consistently a tad faster than a 290X, even at the same clocks, despite lacking 256 shaders....

Most of it is BIOS tweaks, memory timings, etc., since the newer drivers have leveled the playing field some for the 290s.


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My 390 pulled a 12,500 graphics score out of the box at 1040/1500 clock speeds.
> 
> Try running at that and see what you get... in most cases the 390 is consistently a tad faster than a 290X, even at the same clocks, despite lacking 256 shaders....
> 
> Most of it is BIOS tweaks, memory timings, etc., since the newer drivers have leveled the playing field some for the 290s.


I will start there tonight when I get off of work.

Kind of disappointed I could only get to 4.5 GHz @ 1.3v on my 4690K. I was hoping for at least 4.6.


----------



## CamsX

Quote:


> Originally Posted by *Zack Foo*
> 
> Now I'm worried haha cause DC3 looks like they just take 980ti design and slap it on.


Haha, this wouldn't surprise me.


----------



## Horsemama1956

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My 390 pulled a 12,500 graphics score out of the box at 1040/1500 clock speeds.
> 
> Try running at that and see what you get... in most cases the 390 is consistently a tad faster than a 290X, even at the same clocks, despite lacking 256 shaders....
> 
> Most of it is BIOS tweaks, memory timings, etc., since the newer drivers have leveled the playing field some for the 290s.


I would imagine there are specific benchmark tweaks in the drivers for the rebrands as well to make them look better.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Horsemama1956*
> 
> I would imagine there are specific benchmark tweaks in the drivers for the rebrands as well to make them look better.


Benchmarks, maybe, but games themselves show improved numbers on these cards also. I highly doubt AMD is gimping the driver support for every title just to make a card look better, especially considering everyone is well aware the 390 is a rebrand.


----------



## Agent Smith1984

I knew I wasn't crazy....

My framerates are a few frames better than the 390X results below (mine are 38/63) when using the same settings, due to my overclock:

A single 390 is still one hell of a value, guys.....

Two 390s in CrossFire for $660 is about as much power as you can pack into a rig for that kind of money......


----------



## Flash Gordon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks for posting results....
> 
> What are your temps looking like with both cards running together at full load for a bit?


My idle temps at the moment are 29 C and 28 C.

(I have these amazing Sunon 12V fans blowing directly on the cards at 50%. These fans are so good they keep the backplates cold to the touch, lol.)

The top card sits around 75-85 C (at my maximum overclock and voltage, which I found out was unnecessary; I could easily keep this temp under 80).

Bottom card: 68-73 C.

One problem I had before was the cards sitting right on top of each other; since then I have moved the bottom card to the last PCIe slot. It's an x4 slot, but the temps were getting out of hand otherwise.

Another thing: I had MSI Afterburner locking the fan speeds at 25% because I disabled User Defined Fan Control, so I was running benchmarks with the fans never passing 25% speed.

You live and you learn, right?


----------



## koxy

How loud is the MSI R9 390 compared to the MSI GTX 970?


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I knew I wasn't crazy....
> 
> My framerates are a few frames better than the 390X results (mine are 38/63) below when using the same settings (due to my overclock):
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> A single 390 card is still one hell of a value guys.....
> 
> Two 390's in CrossFire for $660 is about as much power as you can pack into a rig for that kind of money......


Thanks for being living proof of what reviewers are saying. The effort on this thread is greatly appreciated.

Can't wait for my card to arrive.


----------



## flopper

Quote:


> Originally Posted by *Horsemama1956*
> 
> I would imagine there are specific benchmark tweaks in the drivers for the rebrands as well to make them look better.


They optimize for those programs since they don't change much over time.
The 390 offers 290X-plus performance; can't say I complain about that.
My 290 died, so I'm looking for a replacement.
It was going to be a Fury, but I bought a new screen instead, which was a good idea since my old one was too old.
Getting a 390 at the end of the month.


----------



## BlueFunk

Quote:


> Originally Posted by *Flash Gordon*
> 
> Top card sits around 75 - 85 C (at my maximum overclock and voltage, which I found out was unnecessary; I could easily get this temp under 80 C)
> 
> Bottom card - 68 - 73 C


Approximately what would your ambient temp be?


----------



## Flash Gordon

Quote:


> Originally Posted by *BlueFunk*
> 
> Approximately what would your ambient temp be?


I've never measured that but I keep my AC unit turned on when I'm in my game room...even in the winter time.

I would say 65-70 F or a little lower... I don't know offhand what that translates to in C, but it's fairly cold in my room.

It's never hot, and I'm running obnoxiously loud Sunon fans whose max RPM is a ridiculous 4500 or some silly number like that lol. That's why I run them at 50%, which still keeps the backplates on the cards cold to the touch until I start gaming or benching.


----------



## BlueFunk

Quote:


> Originally Posted by *Flash Gordon*
> 
> I would say 65-70 F or a little lower...idk what that translates to in C but it's fairly cold in my room.


Thanks! I am curious what my temps will be when summer rolls by. My ambient is about the same as yours at the moment (~20 celsius) but it gets up to 35+ (over 95F) in summer for extended periods. Given the good cooling I am seeing now (sub 70 degrees after hours of 100% load on stock clocks) I am thinking it will be fine and even potentially suited for CF if I go that route. Two 390s for less than a Fury or Fury X is tempting - but I really think my next upgrade should be an ultrawide freesync monitor (I've been waiting for the 144Hz, IPS, ultrawide, 1440p freesync set of features to come together in one product).


----------



## diggiddi

Quote:


> Originally Posted by *Flash Gordon*
> 
> I've never measured that but I keep my AC unit turned on when I'm in my game room...even in the winter time.
> 
> I would say 65-70 F or a little lower... I don't know offhand what that translates to in C, but it's fairly cold in my room.
> 
> It's never hot, and I'm running obnoxiously loud Sunon fans whose max RPM is a ridiculous 4500 or some silly number like that lol. That's why I run them at 50%, which still keeps the backplates on the cards cold to the touch until I start gaming or benching.


70 F ≈ 21 C
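For anyone else doing these ambient-temp conversions, a quick Python helper covers it (this is just the standard formula, nothing card-specific):

```python
# Quick Fahrenheit-to-Celsius helper for ambient temperature talk
def f_to_c(f):
    return (f - 32) * 5.0 / 9.0

for f in (65, 70, 95):
    print(f"{f} F = {f_to_c(f):.1f} C")  # 70 F comes out to about 21.1 C
```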


----------



## cbarros82

Has anyone changed their thermal paste yet? If so, how much better were your temps on the 390/X?


----------



## CamsX

Quote:


> Originally Posted by *cbarros82*
> 
> Has anyone changed their thermal paste yet? If so, how much better were your temps on the 390/X?


Some MSI card owners have reported improvement after doing this. Worth a shot if you consider your temps to be too high.


----------



## CamsX

Quote:


> Originally Posted by *BlueFunk*
> 
> ... but I really think my next upgrade should be an ultrawide freesync monitor (I've been waiting for the 144Hz, IPS, ultrawide, 1440p freesync set of features to come together in one product).


The Acer XR341CK might be what you want, but @ $1100 USD it is way out of my league. The monitor market is starting to get very interesting with the wide adoption of FreeSync. Hopefully this helps AMD improve in the market, at least until Nvidia decides to cut their proprietary crap.


----------



## BlueFunk

Yeah, I've seen that monitor (much excite) and I agree that it is great to see the innovation happening, although I shudder to think how much that specific monitor will cost in Aus. Now I continue the waiting game for it to drop in price a bit (or to save the cash).


----------



## x1TheDoctor1x

Got a MSI 390 coming later today, reading all of these posts is just getting me even more excited! Had a 970 beforehand so I'll be so thankful for the 390's 8GB.


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My 390 pulled 12,500 graphics score out of the box @ 1040/1500 clock speeds.
> 
> Try running at that and see what you get..... in most cases the 390 is consistently a tad faster than the 290X, even at the same clocks, despite having 256 fewer shaders....
> 
> Most of it is BIOS tweaks, memory timings, etc, since the newer drivers have leveled the playing field some for the 290's.


Mine scored 12,500 at MSI's stock speeds (1060/1500).

I am concerned that I am not getting enough cool air for the video card right now. I might have to just bite the bullet and order a waterblock for it today.


----------



## cbarros82

There is no full-cover block for the MSI because its PCB differs from the 290 version. Also, your VRAMs are gonna run really hot with just a GPU-die block instead of a full block. Best bet is to change the TIM and use a more aggressive fan curve.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> Mine scored 12,500 at the MSIs stock speeds (1060/1500).
> 
> I am concerned that I am not getting enough cool air for the video card right now. I might have to just bite the bullet and order a waterblock for it today.


There are no full cover blocks for these cards, only GPU only blocks.

My card stays perfectly cool with a 24 C ambient and a custom fan profile, and I haven't even changed the TIM yet.

It's all about case flow.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> The Acer XR341CK might be what you want, but @ $1100 USD it iss way out of my league. The monitor market is starting to get very interesting with the wide adoption of Freesync. Hope this helps AMD improve on the market, at least until Nvidia decides to cut their proprietary crap.


Been meaning to ask you something.....

Is that your 240?

Whutcha got in it?


----------



## CerealKillah

Quote:


> Originally Posted by *cbarros82*
> 
> There is no full-cover block for the MSI because its PCB differs from the 290 version. Also, your VRAMs are gonna run really hot with just a GPU-die block instead of a full block. Best bet is to change the TIM and use a more aggressive fan curve.


I have the XFX 390 from Best Buy. There are several waterblocks available on the market today.

I was just trying to communicate that the XFX at the same speeds as the MSI yields the same results based on Agent Smith's comment a few pages back(sorry if I did a bad job of communicating this).


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> I have the XFX 390 from Best Buy. There are several waterblocks available on the market today.
> 
> I was just trying to communicate that the XFX at the same speeds as the MSI yields the same results based on Agent Smith's comment a few pages back(sorry if I did a bad job of communicating this).


Oh gotcha!









Yes, there are definitely some blocks for your card.

And according to many, the XFX suffers from less-than-stellar VRM cooling..... however, from what I have seen, their actual chips are binned very well; maybe not quite at the level of MSI, but they appear to be better than Asus/PowerColor/Sapphire's binnings.

Water would probably open some doors on that card.

What kind of temps are you seeing now, and what kind of clocks are you able to achieve?

I ask because I have never done a custom loop on a GPU, and my sentiment is this: if I had a nice card that does 1160+ on fairly low voltage (50-75mv), then I'd maybe look into water to keep it cooler and push it further, but if I had a dud clocker that needed 100mv+ to hit 1120MHz, I'd just let the damn thing run hot and go on about my business, since it's obviously not going to have that great of an OC ceiling regardless of how cool it is.


----------



## cbarros82

Quote:


> Originally Posted by *CerealKillah*
> 
> I have the XFX 390 from Best Buy. There are several waterblocks available on the market today.
> 
> I was just trying to communicate that the XFX at the same speeds as the MSI yields the same results based on Agent Smith's comment a few pages back(sorry if I did a bad job of communicating this).


ok....yeah, the XFX will benefit massively from a water block.
Sorry, I just misread your post.


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Been meaning to ask you something.....
> 
> Is that your 240?
> 
> Whutcha got in it?


Yup, it's a '93 200SX S13 (EDM market) with a CA18DET in it. Currently rebuilding the engine after a spun bearing failure.







Lots of upgrades, but nothing crazy. Aiming for 260+ whp.

I love cars as much as I love computers and games. My daily driver is an '05 Honda S2000. All of this accounts for thousands of hours of sim racing in GPL, LFS, RBR, and recently Assetto Corsa and Dirt Rally.

Are you also into cars?


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Yup, its a '93 200SX S13 (EDM market) with a CA18DET in it. Currently rebuilding the engine, after a spun bearing failure.
> 
> 
> 
> 
> 
> 
> 
> Lots of upgrades, but nothing crazy. Aiming for 260+ whp.
> 
> I love Cars as much as I love computers & Games. My daily driver is a '05 Honda S2000. All of this account for thousands of hours Sim Racing in GPL, LFS, RBR and recently Assetto Corsa & Dirt Rally.
> 
> Are you also into cars?


Used to be a Honda nut.....

I've had several, including DA and DC Teggies, an EG hatch, and a DC5 RSX, and my little brother had some DA's and a B16-swapped REX.

Both my DA's were fully built.

Had a '91 Teggy with an LS/VTEC B18: decked block, milled CTR head with CTR cams, ITR pistons, shot-peened rods, Skunk2 intake, titanium flat-faced valves, 2-layer head gasket, 310cc injectors, Mugen header, Tanabe exhaust, Phearable basemap P28 ECU later tuned by a guy named J Mills, and a B16 tranny with LS 5th gear. Damn car made 218WHP out of 1.8L with 12.4:1 compression!!! Had full Tokico Illumina suspension, sway bars, all the goodies; it was a total track car in murder-black primer, lol!

Also had another DA that was a beautiful, fresh Milano Red; it had a B18C Type R block and head, rebuilt with CTR pistons and a nice tune, with a YS1 B17 tranny.
That thing was a friggin monster! 13:1 on pump gas (93 octane here), and it ran a 13.5-second 1/4 mile.


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh gotcha!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, there are definitely some blocks for your card.
> 
> And according to many, the XFX suffers from less than stellar VRM cooling..... however, from what I have seen their actual chips are binned very well, maybe not quite at the level of MSI, but they appear to be better than Asus/PowerColor/Sapphire's binnings.
> 
> Water would probably open some doors on that card.
> 
> What kind of temps are you seeing now, and what kind of clocks are you able to achieve?
> 
> I ask because, I have never done a custom loop on a GPU, and my sentiment is this; if I have a nice card that does 1160+ on fairly low voltage (50-75mv), then I'd maybe look into water to keep it cooler, and push it further, but if I had a dud clocker that needed 100mv+ to hit 1120MHz, I'd just let the damn thing run hot and go on about my business, since it's obviously not going to have that great of an OC ceiling, regardless of how cool it is.


I am thinking of water cooling because the CPU is already under water and the XSPC Photon/D5 combo is mounted (by necessity) where the exhaust fan would go at the back of the case. In order to exhaust hot air from the case, the 200 mm front fan is actually blowing out.

The air inside the case is very hot (heat-up-the-panels-of-the-case kind of hot). I can lower the case temps by putting the 390 under water and keeping the air cooler from flooding the case with more hot air.

$100 buys either part of a new case or a waterblock for the 390. I already have all the tubing, fittings, and radiator to do it.

Last night while benching it, the fans were going nuts and the case panels felt warm to the touch.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> I am thinking of water cooling because the CPU is already under water and the XSPC Photon/D5 combo is mounted (by necessity) where the exhaust fan would go at the back of the case. In order to exhaust hot air from the case, the 200 mm front fan is actually blowing out.
> 
> The air inside the case is very hot (heat-up-the-panels-of-the-case kind of hot). I can lower the case temps by putting the 390 under water and keeping the air cooler from flooding the case with more hot air.
> 
> $100 buys either part of a new case or a waterblock for the 390. I already have all the tubing, fittings, and radiator to do it.
> 
> Last night while benching it, the fans were going nuts and the case panels felt warm to the touch.


If I could justify the cost I'd do the same....

I took the cheap route and fixed my issues with some 95CFM 120mm fans....









At 1175+/1750 with +75mv core/+75mv AUX, my core never breaks 72C, and the VRMs sit at 73C with 72% fan speed (custom fan curve that directly matches fan percentage to temperature).

Got a 5C drop in my CPU RAD temp, my VRM temps went down 8C, and my SSD and HDD both went down around 3-5C.

The noise is a bit much under full throttle, but I always have the game volume cranked in my living room while playing anyways.
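For anyone curious what that "fan percentage matches temperature" curve looks like, here's a rough Python sketch of the idea; the 25% floor and 100% ceiling are my own illustrative clamps, not Agent Smith's exact Afterburner profile:

```python
def fan_speed(temp_c, floor_pct=25, ceiling_pct=100):
    """Fan duty cycle that tracks core temp 1:1 (72 C -> 72% fan),
    clamped between an idle floor and 100%. Clamp values are illustrative."""
    return max(floor_pct, min(ceiling_pct, temp_c))

for t in (30, 55, 72, 95):
    print(f"{t} C -> {fan_speed(t)}% fan")
```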


----------



## kizwan

Quote:


> Originally Posted by *CerealKillah*
> 
> I am thinking of water cooling because the CPU is already under water and the XSPC Photon/D5 combo is mounted (by necessity) where the exhaust fan would go at the back of the case. In order to exhaust hot air from the case, the 200 mm front fan is actually blowing out.
> 
> The air inside the case is very hot (like heat up the panels of the case kind of hot). I can lower the case temps by put putting the 390 under water and keeping the air cooler from flooding the case with more hot air.
> 
> $100 bucks buys either a part of a new case or a waterblock for the 390. I already have all the tubing fittings and radiator to do it.
> 
> Last night while benching it, the fans were going nuts and the case panels felt warm to the touch.


Water block it. DOOO EEETTTT!


----------



## CerealKillah

Quote:


> Originally Posted by *kizwan*
> 
> Water block it. DOOO EEETTTT!.


http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block-acetal.html

Ordered.


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Used to be a Honda nut.....
> 
> I've had several, inlcuding DA and DC teggies, EG hatch, DC5 RSX, and my little brother had some DA's and a B16 swapped REX.
> 
> Both my DA's were fully built.
> 
> Had a 91 Teggy with LS/V B18, decked block, milled CTR head with CTR cams, ITR pistons, shot peened rods, Skunk 2 intake, titatium flat faced valves, 2 layer head gaket, 310CC injectors, Mugen header, tinabe exhaust, Phearable basemap p28 ECU later tuned by a guy named J Mills, B16 tranny with LS 5th gear. Damn car made 218WHP out of 1.8L with 12.4:1 compression!!! Has full tokico illumina suspension, sway bars, all the goodies, was a total track car in murder black primer, lol!
> 
> Also had another DA that was a beautiful and fresh melano red, had B18C type R block and head, rebuilt with CTR pistons and a nice tune, with YS1 B17 tranny.
> That thing was a friggin monster! 13:1 on pump gas (93 octane here) and it ran a 13.5 1/4 mile


Haha, nice. Seems the need for mods and power carried over to your PC side. The S2K is my first Honda, so I'm not that familiar with them, but there are some crazy naturally aspirated projects down here with EK and EG chassis.


----------



## CamsX

I wish NZXT or someone else would bring out another Asetek GPU cooling bracket like the Kraken G10. It made a huge difference in temperatures on the top card when I had my 7970 CrossFire going. A Corsair H55 plus the bracket was a very cost-effective setup. I was also very lucky that the Sapphire card model I have (now my wife's) had factory-installed heatsinks on the memory and VRMs.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> I wish NZXT or someone else would bring out another Asetek GPU cooling bracket like the Kraken G10. It made a huge difference in temperatures of the top card when I had my 7970 crossfire going. Corsair H55 + the bracket was a very cost effective setup. I was also very lucky that the Sapphire card model I have (now my wife's) had factory installed heatsinks on the Memory and VRMs.


Corsair HG10









It also cools the VRMs.


----------



## Piccolo55

Anyone have an ASUS Strix R9 390 or 390X? I'm interested in that model and was wondering how it is.


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Corsair HG10
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It also cools VRM


Interesting option for future experiments; I wasn't aware it existed. The biggest issues would probably be finding the blower fan for it or making sure it's compatible with the Sapphire Nitro I ordered.


----------



## MalsBrownCoat

Hi guys,

So, I just jumped ship from a brand new pair of ASUS DC2OC-4GD5 R290X's, to a freshly pulled trigger for a pair of ASUS STRIX-R9390X-DC3OC-8GD5's.

You can see the whole debacle here - http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/38910#post_24268178

But to summarize, the reason for this was because the 3 new monitors that I just bought, ASUS MG279Q's, only have Displayport and HDMI inputs.
The 290X's have 1 Displayport, 1 HDMI and 2 DVI-D (Dual-Link) outputs.
My intention was to run 1440p, across all three monitors, at 144hz.

Only to realize, after an "oh crap" moment, that this was clearly not going to be accomplished with the 290X's, even running in Crossfire.

So, I started looking into cards that have more displayport outputs on them (HDMI didn't matter because it wouldn't pass the resolution and frequency that I was after anyway, nor did DVI, since the monitors do not have those inputs).

Then I found the STRIX DC3 version of the 390X, which has 3 displayports, 1 HDMI, and 1 DVI-D (Dual-Link).
And I bought two of them.
They'll be watercooled with EK blocks (thankfully the ones from my 290X's are compatible).

Did I do the right thing?

With this new set up, will I be able to achieve all three monitors, at 1440p, at 144hz?

And what about power? Will my AX1200i be sufficient?

Thank you.


----------



## diggiddi

You should be good to go with that 1200 W, but you can run the numbers with the eXtreme Outer Vision PSU calculator.


----------



## CamsX

I'd say you are pretty safe on the power and GPU side. Depending on which games you plan to run on the Eyefinity setup, you might not be able to achieve 120+ fps in every case. Though I think it is a great opportunity to run FreeSync on all 3 and cap them at 90 fps.

Let me know your overall experience with the monitors. IQ is expected to be heavenly, but there are lots of comments about light bleed and dead pixel issues.

Good luck!


----------



## MalsBrownCoat

I'm pretty sure all of this is filed under "serious overkill", since the majority of my gaming is done with SWTOR, BF4 and maybe a few other titles in my steam library.

But I figure if I'm going to spend the money, _spend the money_, and do it right.

When I initially started this build a year ago (my first PC build in more than a decade), the 290X was the best option to go with. Of course, I also started with the intention of a single card and an 860i PSU.

And now look where I am (build log updated in sig).

For quite a while, I was under the impression that Freesync was not compatible with Crossfire, until I was informed of the contrary just the other day. So yes, there will need to be some experimentation with capping the framerate for Freesync in Crossfire vs. running natively (without Freesync) and letting the refresh rate scale up to 144 Hz.

And I did a lot of research and saw the postings/reviews about the inherent problems with these monitors. I'm ok with playing the panel lottery a few times (for each of the three monitors), since I won't be charged for returns or shipping (within 30 days of purchase). That should be plenty of time for me to know if I've received a good or a bad panel.

Then I'll be looking to the experts here to help me get all of the performance that I can out of these cards.

Thanks guys!


----------



## semiroundboss

OvrclockProof.jpg 192k .jpg file


Love this card. It's as sexy as it looks, but those temps got me scared. Should I get Gigabyte or XFX Black for my other two cards? I'll be running at the same clock speeds. And what PSU should I get if I have three of those torches OCed and an AMD FX 9370?


----------



## diggiddi

I'd say 1500 W for 3 cards and an OC'd 9370.
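A rough back-of-envelope check on that 1500 W figure; every wattage below is my own ballpark assumption, not a measurement:

```python
# Rough PSU sizing for 3x OC'd 390s plus an overclocked FX-9370.
# All figures are assumed ballparks (390 stock board power is ~275 W,
# FX-9370 stock TDP is 220 W; overclocking pushes both higher).
gpu_w = 300        # assumed per overclocked 390
cpu_w = 250        # assumed for an overclocked FX-9370
rest_w = 100       # board, drives, fans
headroom = 1.2     # ~20% margin so the PSU isn't running flat out

total = 3 * gpu_w + cpu_w + rest_w
print(f"Estimated draw: {total} W, suggested PSU: {round(total * headroom)} W")
```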


----------



## maximsilentfoot

Hi Everyone!

Recently installed an Asus 390X Direct CU II in my rig (see sig).

First off, some advice for anyone considering picking up this version of the 390x : DON'T.

ASUS did not do a good job with the cooler on this card, which I'm sure has been mentioned on this forum. I haven't messed with any of the fan profiles, but out of the box I'm getting low 80s C when benchmarking in Valley or 3DMark, while in gaming it fluctuates between the mid 70s and low 80s C. And while the fan is silent when not gaming, it is LOUD under load. I keep my rig a few feet away from where I sit, and unless you are wearing a headset or have your speakers turned up quite a bit, you can definitely hear the mini vacuum that is the DCU II cooler.

I was using an MSI 270x Gaming (Twin Frozr cooler) before this and the difference is night and day when it comes to temps and noise levels. If you are considering a 390x, get the Twin Frozr version which I assume will do a waaay better job than the DCU II.

As for performance, this card is a beast! Thanks to the newer AMD drivers, I've been setting the resolution to 1440p in most games, and it can usually maintain 60 fps with the other options turned to high/ultra. Too bad I'm still on a 1080p/60Hz monitor though... I'm seriously considering one of those 1440p/144Hz/FreeSync monitors 

I was wondering if you guys could help me with an issue I've been noticing since I got the card, though... for some reason, and I don't think I'm imagining this... I don't feel like AA/AF is working very well. I've tried different drivers, and it hasn't made a difference in most games. Speed is definitely not an issue though; the games are buttery smooth.

For example, I recently installed COD: Ghosts (catching up on the series; bought on sale). Here is an album I uploaded to imgur with some examples of jaggies (which I hate). It becomes really obvious when moving around because it jumps out at you; you see random lines on edges, especially in the space station at the entrances. I've tried forcing the settings from CCC; it doesn't seem to make a difference. Here's the album; it helps if you zoom in, and I'm sorry if it doesn't show well:



http://imgur.com/1wn3x


Is this normal? Like I said, I see these jaggies around objects, including characters, in every game. There seem to be jaggies everywhere; for example in Dead Space 3, when Isaac is standing still and just doing his normal breathing movements, the jaggies around his form really jump out at you.

Am I just expecting too much from the card or is something off? Thanks for any input, happy to be here and looking forward to participating in the forums!


----------



## maximsilentfoot

Sorry, I forgot the 3DMark scores for my card. CCC settings were at default.


----------



## CamsX

A bit jealous on triplehead monitor setup. Still trying to convince my wife to let me buy 1 of those, haha.

Regarding the AA issue, which driver are you using? Did you try uninstalling the drivers and reinstalling them from scratch? What are the exact AA settings for those games? The few issues reported so far include stuttering, DVI output failure, and general cooling problems, but nothing about IQ.


----------



## CamsX

@all new owners: remember to post your screenshot with your name in Notepad, the GPU-Z info, max overclock achieved, and temps, so that Agent Smith can include you in the list. This confirms ownership per forum guidelines, and it's easier to track. Check the initial 10 pages of the thread for examples.


----------



## BlaXey

Hello, is there a way to up the memory voltage?

Sent from my MX4 via Tapatalk


----------



## maximsilentfoot

Hi CamsX!

Sorry about not posting the proof earlier, here it is:



Regarding the jaggies, I've tried uninstalling (also using Driver Cleaner Pro) and reinstalling both up-to-date and older drivers; there hasn't been a difference. Another thing I'm beginning to notice is that sometimes the gamma is weird... I'm not sure if you've played Dead Space 3, but when you go into sub-menus from the main menu screen, the animation involves a frozen item in the middle of the screen being moved around. Usually that item is brighter (clear and visible), but since I've installed the 390X it's dark and not really visible unless the gamma is turned up (the rest of the game is bright enough and normal, so when gamma is turned up everything seems off). The gamma/brightness in Heroes of the Storm is also weird in the main menu, though in-game seems fine. I will upload pics showing the difference when I can.

In the imgur album I linked, CCC was set to override application settings. I've also tried setting the AA in application settings to every option (MSAA, FXAA); it doesn't really seem to make a difference.


----------



## LongRod

Spoiler: Warning: Spoiler!






Welp, finally got some time to try some basic OC'ing, and this is what I got out of it!

The core needed +100mV to do 1200 (not horrible considering my ASIC is 63.3%). Going to have to fix the airflow in my case before attempting more with TRIXX. On the bright side, the memory loved being at 1700 once I bumped the AUX voltage to +50.

Can't wait to see if I can get any more out of this thing!


----------



## THUMPer1

Well, it looks like the MSI ones are clocking the best out of the 390X samples. I was going to go with Sapphire, but I want at least 1150 core or more. The XFX is available locally but has awful VRM cooling.









I don't think the back plate on the MSI does anything except look cool. And they have a warranty sticker on one of the screws for the cooler, so you can't take it off without them knowing, assuming you have to RMA it one day. But Sapphire "doesn't care" in that aspect. Decisions....


----------



## CamsX

Quote:


> Originally Posted by *maximsilentfoot*
> 
> Hi CamsX!
> 
> Sorry about not posting the proof earlier, here it is:
> 
> 
> 
> Regarding the jaggies, I've tried uninstalling (also using drivercleaner pro) and reinstalling both up to date and older drivers, there hasn't been a difference. Another thing I'm beginning to notice is sometimes the gamma is weird...I'm not sure if you've played Dead Space 3, but when you go into sub menus in the main menu screen, the animation involves a piece of frozen item in the middle of the screen to be moved around. Usually that item is brighter (clear and can be seen), but since I've installed the 390x its dark and not really visible unless gamma is turned higher up (rest of game is bright enough and normal, so when Gamma is turned up everything seems off) The gamma/brightness in Heroes of the Storm is also weird in the main menu, though in-game seems fine. I will upload pics showing the difference when I can.
> 
> In the imgur album I linked, the CCC was set to override application settings (I've tried setting the AA in application settings to every option (MSAA, FXAA), doesn't really seem to make a difference.


This guy used DDU in safe mode for the uninstall process.

https://community.amd.com/thread/185024


----------



## Notarnicola

1 day old!
Quote:


> Originally Posted by *THUMPer1*
> 
> Well it looks like the MSI ones are clocking the best out of the 390x samples. I was going to go with Sapphire, but I want at least 1150 core or more. The XFX is available locally but has awful VRM cooling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think the back plate on the MSI does anything except look cool. And they have a warranty sticker on one of the screws for the cooler so you can't take it off without them knowing. Assuming you have to RMA it one day. But Sapphire "doesnt care" in that aspect. Decisions....


The back plate looks cool and hits 45°C, so...


----------



## undyingbread




----------



## phenom2955

Hello,

I just upgraded from a GTX 750 Ti to an XFX R9 390 from Best Buy on July 2nd. It's the 1000 MHz default core clock version.

I just started toying with some slight overclocking a few days ago, and I am still a bit iffy on whether I should stick with it. Currently it seems to run stable at 1100 MHz core and 1600 MHz memory with +15 voltage in MSI Afterburner. However, I have tried pushing it to 1150 MHz core with +25 to +30 voltage in Afterburner, and I keep getting display driver crashes in Crysis 3, even though MSI Kombustor seems to run stable for hours with those settings.

Can anybody recommend good stable settings in MSI afterburner for this card?

Also, the card is getting around 80C under heavy load while gaming, and if I raise the offset any higher than +15 in Afterburner it starts getting upwards of 85-90C with fans at 100%. How much higher would be safe if I plan to keep this card working for the long haul?


----------



## gatygun

Quote:


> Originally Posted by *phenom2955*
> 
> Hello,
> 
> I just upgraded my GTX 750Ti to an XFX R9 390 from bestbuy on July 2nd. Its the 1000mhz default core clock speed version.
> 
> I just started toying with some slight overclocking with a few days ago and I am still a bit iffy on if I should stick with it or not. Currently it seems to be running stable at 1100Mhz core and 1600Mhz memory with +15 voltage in MSI afterburner. However I have tried pushing it up to 1150Mhz core and +25-+30 Voltage in Afterburner and I keep getting display driver crashes in Crysis 3, however MSI Kombuster seems to run for hours stable with those settings.
> 
> Can anybody recommend good stable settings in MSI afterburner for this card?
> 
> Also the card is getting around 80c under heavy load while gaming and if I raise any higher than +15 offset in Afterburner it starts getting upwards of 85-90c with fans at 100%, how much higher would be safe if I plan to keep this card working for a long haul?


Download TRIXX.

Put the fans at 100%.
Push +200 mV.
Put it anywhere near 1200 core.


----------



## phenom2955

Is that anywhere near safe, especially with stock air cooling?

Also, just to clarify: the voltage offset in Afterburner/Trixx is a straightforward +mV, right? It's not some percentage? So if I go +15, that is +15 mV?


----------



## LongRod

Quote:


> Originally Posted by *THUMPer1*
> 
> Well it looks like the MSI ones are clocking the best out of the 390x samples. I was going to go with Sapphire, but I want at least 1150 core or more. The XFX is available locally but has awful VRM cooling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think the back plate on the MSI does anything except look cool. And they have a warranty sticker on one of the screws for the cooler so you can't take it off without them knowing. Assuming you have to RMA it one day. But Sapphire "doesnt care" in that aspect. Decisions....


Everything I've read states that MSI doesn't care as long as you don't damage the card while removing the cooler, and you reapply the cooler if submitting an RMA (at least from what I've seen on Google and MSI's official forums).
Quote:


> Originally Posted by *phenom2955*
> 
> Hello,
> 
> I just upgraded from my GTX 750Ti to an XFX R9 390 from Best Buy on July 2nd. It's the 1000MHz default core clock version.
> 
> I just started toying with some slight overclocking a few days ago and I'm still a bit iffy on whether I should stick with it. Currently it seems to run stable at 1100MHz core and 1600MHz memory with +15 voltage in MSI Afterburner. However, when I push it to 1150MHz core with +25 to +30 voltage, I keep getting display driver crashes in Crysis 3, even though MSI Kombustor runs stable for hours at those settings.
> 
> Can anybody recommend good stable settings in MSI Afterburner for this card?
> 
> Also, the card is hitting around 80C under heavy gaming load, and if I raise the offset any higher than +15 in Afterburner it climbs to 85-90C with fans at 100%. How much higher would be safe if I plan to keep this card working for the long haul?


Sounds like you might need a bit more core voltage to get 1150 stable. The memory might not like being at 1600 with no voltage bump, but that's just my own personal experience.
Quote:


> Originally Posted by *phenom2955*
> 
> Is that anywhere near safe? Especially with stock air cooling?
> 
> Also, just to clarify: the voltage offset in Afterburner/Trixx is a straightforward +mV, right? It's not some percentage? So if I go +15, that's +15mV?


As long as you keep the temps under control, definitely. And yeah, it's a straightforward mV increase, not a percentage.


----------



## phenom2955

instant "No signal detected"


----------



## CerealKillah

Quote:


> Originally Posted by *THUMPer1*
> 
> Well it looks like the MSI ones are clocking the best out of the 390x samples. I was going to go with Sapphire, but I want at least 1150 core or more. The XFX is available locally but has awful VRM cooling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think the back plate on the MSI does anything except look cool. And they have a warranty sticker on one of the cooler screws, so you can't take the cooler off without them knowing, in case you have to RMA it one day. But Sapphire "doesn't care" in that respect. Decisions....


Hey, what makes you say that the XFX has terrible VRM cooling? Are you aware that the 390 and 390X have VRM cooling, unlike the XFX 290 models? The reviews I've seen have actually been complimentary of the XFX 390 series cards, FWIW.


----------



## battleaxe

This has probably already been asked, but can I crossfire a 390x with a 290x without modding anything?


----------



## LongRod

Does anyone know a way I can mod afterburner to give me more than +100mV on the core?

With TRIXX, I was able to do 1250 on the core with +160mV, but since I can't modify the memory voltage, I can't get my memory past 1550. Would love to see what this thing benches with both memory and core overclocked.


----------



## maximsilentfoot

Quote:


> Originally Posted by *CamsX*
> 
> This guy used DDU in safe mode for the uninstall process.
> 
> https://community.amd.com/thread/185024


Thanks will try this now and report back


----------



## diggiddi

Quote:


> Originally Posted by *battleaxe*
> 
> This has probably already been asked, but can I crossfire a 390x with a 290x without modding anything?


Yes, it's been done; no need to mod.


----------



## Darkeylel

So I have an extremely noob question, but how likely is it that my CPU is bottlenecking my GPU? I'm no longer able to hold a stable framerate above 90 in Black Ops 2, where I was able to hold 125 with ease. Same in CS:GO: I could hold 200 stable, but now I have to run it uncapped because it will just dip like crazy if I enforce a framerate lock.


----------



## CamsX

Quote:


> Originally Posted by *Darkeylel*
> 
> So I have an extremely noob question, but how likely is it that my CPU is bottlenecking my GPU? I'm no longer able to hold a stable framerate above 90 in Black Ops 2, where I was able to hold 125 with ease. Same in CS:GO: I could hold 200 stable, but now I have to run it uncapped because it will just dip like crazy if I enforce a framerate lock.


I doubt you are anywhere close to a CPU or GPU bottleneck in either of those two games. I'd say it's a driver problem, or you are running out of juice on your PSU.


----------



## Darkeylel

Quote:


> Originally Posted by *CamsX*
> 
> I doubt you are anywhere close to a CPU or GPU bottleneck in either of those two games. I'd say it's a driver problem, or you are running out of juice on your PSU.


Pretty sure it's not my PSU

http://www.3dmark.com/3dm11/10108228

That's my 3dmark score if that helps anyone at all


----------



## jon666

Anybody else getting dropped out of games without any errors? So far it has only happened in Battlefield 4, and WoT. Reinstalled drivers a few times, not sure what else could be at fault. To my knowledge nothing is overheating.

I take that back, GPU-Z is reading 107 C as highest which is insane since I am not overclocked.


----------



## flopper

Quote:


> Originally Posted by *CerealKillah*
> 
> Hey, what makes you say that the XFX has terrible VRM cooling? Are you aware that the 390 and 390x have VRM cooling, unlike the XFX 290 models? The reviews I have seen have actually been complimentary of the XFX 390 series cards FWIW.


Quote:


> Originally Posted by *THUMPer1*
> 
> . The XFX is available locally but has awful VRM cooling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> .


XFX has changed the VRM cooler for the 390 series, as seen here.
A drop of around 40C in VRM temps can be expected.



http://hardforum.com/showpost.php?p=1041767187&postcount=5


----------



## Gumbi

Asus dropped the ball with the cooler on their 290(x) by simply lifting the design from the 780 and plonking it down on the 290(x).
Quote:


> Originally Posted by *jon666*
> 
> Anybody else getting dropped out of games without any errors? So far it has only happened in Battlefield 4, and WoT. Reinstalled drivers a few times, not sure what else could be at fault. To my knowledge nothing is overheating.
> 
> I take that back, GPU-Z is reading 107 C as highest which is insane since I am not overclocked.


Check to make sure it's not a random spike (my Vapor-X randomly spikes to 350 degrees or something in GPU-Z). If it's not, then remount your cooler. Your VRMs are cool as anything, so the cooler clearly isn't bad.


----------



## battleaxe

Quote:


> Originally Posted by *diggiddi*
> 
> Yes its been done no need to mod


Sweet!

Thank you!


----------



## THUMPer1

Quote:


> Originally Posted by *LongRod*
> 
> Everything I've read states that MSi doesn't care as long as you don't damage the card while removing the cooler, and you reapply the cooler if submitting an RMA (at least what I've seen from googling and MSi's official forums).


There is a warranty sticker on the bottom of the card, one of those "Warranty void if removed" ones. See the little white sticker here:
http://assets.hardwarezone.com/img/2015/07/msi-390x-backplate.jpg


----------



## pengs

Quote:


> Originally Posted by *jon666*
> 
> Anybody else getting dropped out of games without any errors? So far it has only happened in Battlefield 4, and WoT. Reinstalled drivers a few times, not sure what else could be at fault. To my knowledge nothing is overheating.
> 
> I take that back, GPU-Z is reading 107 C as highest which is insane since I am not overclocked.


Uh... yeah, that's not right: your usage is 100% but the scale shows 0. The throttle point is also 95C, so you wouldn't be boosting to full frequency at 107C anyhow. Try Afterburner and take readings from that. It's a reading error; either the software is wrong or the heatsink is not seated onto the GPU correctly.


----------



## kizwan

Quote:


> Originally Posted by *pengs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jon666*
> 
> Anybody else getting dropped out of games without any errors? So far it has only happened in Battlefield 4, and WoT. Reinstalled drivers a few times, not sure what else could be at fault. To my knowledge nothing is overheating.
> 
> I take that back, GPU-Z is reading 107 C as highest which is insane since I am not overclocked.
> 
> 
> 
> 
> 
> Uh... yeah. That's not right, your usage is 100% but the scale shows 0. Throttling is also 95C so you wouldn't be boosting to full frequency at 107C anyhow. Try Afterburner and take readings from that. It's a reading error, software or the heatsink is not seated onto the GPU correctly.
Click to expand...

A reading error, yes, but not by the software. You read it wrong: those are max values, not current readings.









Granted, the 107C temp is definitely a reading error.


----------



## pengs

Quote:


> Originally Posted by *kizwan*
> 
> A reading error, yes, but not by the software. You read it wrong: those are max values, not current readings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Granted, the 107C temp is definitely a reading error.


Oh yeah, true, didn't see the 'max'. Brain error


----------



## CerealKillah

Quote:


> Originally Posted by *flopper*
> 
> xfx has changed the vrm cooler for the 390 series as seen here.
> a 40c drop of vrm can be expected
> 
> 
> 
> http://hardforum.com/showpost.php?p=1041767187&postcount=5


I have an EK waterblock coming in this week. I bet it is going to be a pain to get those heatsinks off the VRMs.


----------



## abcanw

Quote:


> Originally Posted by *abcanw*
> 
> I bought an XFX 390, but unfortunately it has a really bad coil whine, so I'm returning it tomorrow. Since there is really not much information about the XFX here, I thought I would share some.
> It has a BIOS switch, so that makes it a dual-BIOS card.
> 
> 
> 
> 
> 10 min in Kombustor stock speeds
> 
> 
> after one heaven benchmark stock speeds
> 
> 
> after around 30 min in heaven at stock speeds
> 
> Since I will return it tomorrow, I don't know what to get instead. I would love to get the MSI, but I will use an ITX case (Cooler Master Elite 130) which can only take 2-slot cards, and the Sapphire and Asus cost $20 more than the XFX on Amazon. Should I get another XFX?
> 
> If you want more info about the XFX, I will be glad to provide it as long as I have one.


OK, so I returned the XFX 390 and got a Sapphire 390 instead, and it's quiet with no coil whine.

proof : 

Those overclocks are with stock core voltage and +35mV AUX voltage.


----------



## Zack Foo

Hi guys! I've spent the whole night overclocking my card to get these results. Kindly comment, because I don't know exactly what a normal score is, as the Asus Strix 390 is still nowhere to be found.

First of all, the 3DMark score: 10097 combined and 11803 for graphics. I saw a few MSI cards doing better even at stock speeds, which makes me kind of sad, I would say.

My overclock is 1175MHz core at +169mV and 1575MHz memory, using Trixx.

Furmark and 3DMark both show no artifacts and run smoothly, but anything more causes a crash: the GPU clock can't reach 1180, and the memory cannot pass 163-. I put it back to 1575 since there's not much difference between them in performance; on the GPU clock I just try to get the max I can.

Pardon me for not posting pictures about it; I am just wayyy too tired.

3DMark link is here: http://www.3dmark.com/3dm/8128570?


----------



## kizwan

Quote:


> Originally Posted by *Zack Foo*
> 
> Hi guys! I've spent the whole night overclocking my card to get these results. Kindly comment, because I don't know exactly what a normal score is, as the Asus Strix 390 is still nowhere to be found.
> 
> First of all, the 3DMark score: 10097 combined and 11803 for graphics. I saw a few MSI cards doing better even at stock speeds, which makes me kind of sad, I would say.
> 
> My overclock is 1175MHz core at +169mV and 1575MHz memory, using Trixx.
> 
> Furmark and 3DMark both show no artifacts and run smoothly, but anything more causes a crash: the GPU clock can't reach 1180, and the memory cannot pass 163-. I put it back to 1575 since there's not much difference between them in performance; on the GPU clock I just try to get the max I can.
> 
> Pardon me for not posting pictures about it; I am just wayyy too tired.
> 
> 3DMark link is here: http://www.3dmark.com/3dm/8128570?


That graphics score doesn't look right. Did you run your CPU at stock?


----------



## CamsX

Quote:


> Originally Posted by *kizwan*
> 
> That graphics score doesn't look right. Did you run your CPU at stock?


I have to agree with kizwan. That score seems a bit too low. The graphics score is not that low and the combined looks normal, but the physics score is dragging you down by a lot.


----------



## Zack Foo

Quote:


> Originally Posted by *CamsX*
> 
> I have to agree with kizwan. That score seems a bit too low. The graphics score is not that low and the combined looks normal, but the physics score is dragging you down by a lot.


Wait, is running the CPU at stock a must?


----------



## Zack Foo

Quote:


> Originally Posted by *CamsX*
> 
> I have to agree with kizwan. That score seems a bit too low. The graphics score is not that low and the combined looks normal, but the physics score is dragging you down by a lot.


Should I?


----------



## kizwan

Quote:


> Originally Posted by *CamsX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> That graphics score doesn't look right. Did you run your CPU at stock?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have to agree with kizwan. That score seems a bit too low. The graphics score is not that low and the combined looks normal, but the physics score is dragging you down by a lot.
Click to expand...

With 1175/1575, the graphics score should be in the high 12K or low 13K range, don't you think?
Quote:


> Originally Posted by *Zack Foo*
> 
> wait. running cpu at stock is a must do?


Your 3DMark result doesn't show the correct CPU frequency. That's why I asked whether you ran the CPU at stock. Overclocked is better, of course.

Set windows Power Options to "High Performance".


----------



## Zack Foo

Quote:


> Originally Posted by *kizwan*
> 
> That graphics score doesn't look right. Did you run your CPU at stock?


Quote:


> Originally Posted by *CamsX*
> 
> I have to agree with kizwan. That score seems a bit too low. The graphics score is not that low and the combined looks normal, but the physics score is dragging you down by a lot.


I just realised that when I run it at stock speeds the score is way higher.
http://www.3dmark.com/3dm/8129434?

So what now?


----------



## Zack Foo

The 3DMark score is getting lower and lower as I bump up my memory clock, too. 1620 was stable, but now my score is 9K-something.


----------



## phenom2955

What is up with all of you people claiming you can run this card at 1150-1250 with 100+mV overclocks? I am suddenly getting massive artifacting in Firestrike just running 1100 core and default memory. Any higher than +100mV and the monitor loses display, the card crashes out, and I have to hard-reset the computer.

I am using a Corsair CX 750W PSU with the card, so it's not inadequate power; either this is all false information or my **** is defective?


----------



## Zack Foo

Quote:


> Originally Posted by *phenom2955*
> 
> What is up with all of you people claiming you can run this card at 1150-1250 with 100+mV overclocks? I am suddenly getting massive artifacting in Firestrike just running 1100 core and default memory. Any higher than +100mV and the monitor loses display, the card crashes out, and I have to hard-reset the computer.
> 
> I am using a Corsair CX 750W PSU with the card, so it's not inadequate power; either this is all false information or my **** is defective?


Mine is around 1175; 1180 will show some artifacts, but nothing serious.
But I did it with +170mV, not +100. At +100 I believe mine would have massive artifacts too.


----------



## kizwan

Quote:


> Originally Posted by *Zack Foo*
> 
> The 3dmark score is getting even lower and lower as i bump up my memory clock too. 1620 was stable but now my score is 9k something.


When the score gets lower as you overclock the memory, that means your memory overclock is not stable. Did you try increasing the AUX voltage to +50mV? That seems to work quite well for many people here.
Quote:


> Originally Posted by *phenom2955*
> 
> What is up with all of you people claiming you can run this card at 1150-1250 with 100+mV overclocks? I am suddenly getting massive artifacting in Firestrike just running 1100 core and default memory. Any higher than +100mV and the monitor loses display, the card crashes out, and I have to hard-reset the computer.
> 
> I am using a Corsair CX 750W PSU with the card, so it's not inadequate power; either this is all false information or my **** is defective?


Silicon lottery. Just like CPUs, GPUs are not created equal. Your card probably needs more voltage to achieve higher clocks.
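The "score drops as memory clock rises" symptom described above can be spotted programmatically: sweep the memory clock, record a benchmark score at each step, and stop at the point where scores stop improving. A minimal sketch of that idea; the clock/score pairs are made-up illustrative numbers, not real measurements:

```python
# Sketch: find the memory clock beyond which benchmark scores regress,
# which on these cards usually means GDDR5 error handling is silently
# costing performance. All numbers below are illustrative only.

def best_stable_memory_clock(results):
    """results: list of (memory_clock_mhz, benchmark_score), in ascending
    clock order. Returns the highest clock before the score regresses."""
    best_clock, best_score = results[0]
    for clock, score in results[1:]:
        if score <= best_score:      # score stopped improving: back off
            break
        best_clock, best_score = clock, score
    return best_clock

sweep = [(1500, 12100), (1550, 12260), (1600, 12390), (1650, 12180)]
print(best_stable_memory_clock(sweep))   # 1600: the 1650 run scored lower
```

Scores rather than "it didn't crash" are the useful signal here, because an unstable memory overclock often still completes a run, just slower.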


----------



## Zack Foo

Quote:


> Originally Posted by *kizwan*
> 
> When the score lower when overclocking the memory, that means your memory overclock is not stable. Did you try increasing AUX voltage to +50mV? Seems that work quite well for many people here.
> Silicon lottery. Just like CPUs, GPUs are not created equal. Your card probably need more voltage to achieve higher clocks.


What is aux voltage?


----------



## kizwan

Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> When the score lower when overclocking the memory, that means your memory overclock is not stable. Did you try increasing AUX voltage to +50mV? Seems that work quite well for many people here.
> Silicon lottery. Just like CPUs, GPUs are not created equal. Your card probably need more voltage to achieve higher clocks.
> 
> 
> 
> What is aux voltage?
Click to expand...

It helps with memory overclocking. However, you can only change the AUX voltage using MSI Afterburner.


----------



## Zack Foo

Quote:


> Originally Posted by *kizwan*
> 
> It helps memory overclock. However, you can only change AUX voltage using MSI Afterburner.


Which means I can't go above +100mV if I want to increase the AUX voltage?


----------



## kizwan

Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> It helps memory overclock. However, you can only change AUX voltage using MSI Afterburner.
> 
> 
> 
> which means i cant go above 100mV if i wanna increase aux voltage?
Click to expand...

No. With MSI AB you can control both core voltage & AUX voltage.


----------



## Zack Foo

Quote:


> Originally Posted by *kizwan*
> 
> No. With MSI AB you can control both core voltage & AUX voltage.


But it can't go above +100; only Trixx can.


----------



## Zack Foo

Quote:


> Originally Posted by *kizwan*
> 
> It helps memory overclock. However, you can only change AUX voltage using MSI Afterburner.


http://www.3dmark.com/3dm/8129956? This is my final overclock, I guess: 1160 at +100mV, 1600MHz memory.


----------



## kizwan

Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No. With MSI AB you can control both core voltage & AUX voltage.
> 
> 
> 
> But can't go above 100 only trixx can.
Click to expand...

Sorry, I misunderstood your statement. Yes, by default you can't go above +100mV with AB, but there is a workaround to get more than +100mV with MSI AB. However, on an air cooler I don't recommend more than +100mV for 24/7 use.
Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> It helps memory overclock. However, you can only change AUX voltage using MSI Afterburner.
> 
> 
> 
> http://www.3dmark.com/3dm/8129956? this is my final overclock i guess. 1160 on 100 mV. 1600mhz memory.
Click to expand...

Your graphics score in this one is pretty good.


----------



## Zack Foo

Quote:


> Originally Posted by *kizwan*
> 
> Sorry, I misunderstand your statement. Yes by default with AB you can't go above +100mV but there's workaround to get more than +100mV with MSI AB. However, on air cooler I don't recommend more than +100mV for 24/7.
> Your graphics score in this one is pretty good.


Yeah, this will be my everyday overclock profile, I think. I need to game more to test the stability though.


----------



## phenom2955

I don't get it; I get artifacts at 1120MHz core @ +100mV, and if I even try to go over +100mV it completely loses display and freezes my computer.

Not that it would matter anyway, since it hits 94 degrees Celsius with that overvoltage (even at stock clocks) and gets throttled massively.

What would I have to do just to get a stable 1100MHz on the core? It gives a decent FPS boost, not by a huge margin, but keeps it up about 5fps consistently. So in scenes where I would get 35fps I am maintaining over 40, which is a huge help to me.


----------



## battleaxe

Quote:


> Originally Posted by *phenom2955*
> 
> I don't get it; I get artifacts at 1120MHz core @ +100mV, and if I even try to go over +100mV it completely loses display and freezes my computer.
> 
> Not that it would matter anyway, since it hits 94 degrees Celsius with that overvoltage (even at stock clocks) and gets throttled massively.
> 
> What would I have to do just to get a stable 1100MHz on the core? It gives a decent FPS boost, not by a huge margin, but keeps it up about 5fps consistently. So in scenes where I would get 35fps I am maintaining over 40, which is a huge help to me.


You've got to get that thing cooler; 94C is too hot to hit high clocks, in my experience with these. Cards can take differing amounts of voltage; it's just the way it is. Mine can do +193mV, and anything more gives me the same thing. Some can take +300mV; it just depends on what you got from the factory. There's no way to know until you get it, hence the silicon lottery comments.


----------



## kizwan

@phenom2955, also, there have been a couple of cases where some 290s were unable to take more voltage without black screens and crashes, so your problem is not really strange. Unfortunately, you may have a dud card. If possible, exchange or RMA it.


----------



## Agent Smith1984

I think people need to understand that overclocking these cards is not as simple as setting voltage to max and seeing how high it will clock....

There are a lot of factors involved with these, and some respond differently to voltages and temps than others do.

Some people will find their max clocks in the +50mV to +75mV range; some can find them past +100mV. There is no simple formula or generic set of settings to carry from one card to another.

Also, people always need to turn the power limit up to +50% when overclocking these, even at stock voltage.

The board will reach its power cap and throttle if you don't.
VRM and core temp are both really important factors also.

VRM temps need to be below 80C to really be able to dial in a max OC. So if, for instance, you have landed at 1160MHz with +100mV and your VRMs are at 85C, you may find some temp relief at +50mV and still get the same clock speed.

Not saying that will always be the case, but I see it a lot.

It's also important to know that these cores tend to have a "happy place" at which they will run at a pretty good clock speed with a moderate voltage increase, and only clock a tad higher with a large voltage increase.

I get 1170MHz on +50mV, but 1175 on +75mV, and 1180 at +100mV.

Pretty obvious to see why I run at 1170 for my daily clocks right?
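That "happy place" is just diminishing returns: each extra voltage step buys fewer MHz. A rough sketch of picking a daily clock by marginal efficiency, using the clocks reported above as data points; the 0mV stock clock of 1100MHz and the 0.5 MHz-per-mV cutoff are assumed values for illustration only:

```python
# Sketch: choose a daily clock by marginal efficiency (MHz gained per mV).
# The (0, 1100) stock point and the 0.5 threshold are assumptions.

def daily_clock(points, min_mhz_per_mv=0.5):
    """points: list of (mv_offset, max_stable_mhz), ascending by offset.
    Returns the clock at the last offset whose marginal gain is still
    worth it (at least min_mhz_per_mv MHz per added mV)."""
    chosen = points[0][1]
    for (mv0, mhz0), (mv1, mhz1) in zip(points, points[1:]):
        gain = (mhz1 - mhz0) / (mv1 - mv0)
        if gain < min_mhz_per_mv:    # past the knee of the curve
            break
        chosen = mhz1
    return chosen

curve = [(0, 1100), (50, 1170), (75, 1175), (100, 1180)]
print(daily_clock(curve))   # 1170: past +50mV, each mV buys only 0.2MHz
```

With these numbers the knee sits at +50mV, which is exactly why 1170 makes a sensible daily clock here.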


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I think people need to understand that overclocking these cards is not as simple as setting voltage to max and seeing how high it will clock....
> 
> There are a lot of factors involved with these, and some respond differently to voltages and temps than others do.
> 
> Some people will find their max clocks in the 50mv-75mv range. Some people can find it past 100mv. There is no simple formula or generic set of settings to be used from one card to another.
> 
> Also, people always need to turn the power limit up to +50% when overclocking these, even at stock voltage.
> 
> The board will reach its power cap and throttle if you don't.
> VRM and core temp are both really important factors also.
> 
> VRM temps need to be below 80C to really be able to dial in a max OC. So if, for instance, you have landed at 1160MHz with +100mV and your VRMs are at 85C, you may find some temp relief at +50mV and still get the same clock speed.
> 
> Not saying that will always be the case, but I see it a lot.
> 
> It's also important to know that these cores tend to have a "happy place" at which they will run at a pretty good clock speed with a moderate voltage increase, and only clock a tad higher with a large voltage increase.
> 
> I get 1170MHz on +50mV, but 1175 on +75mV, and 1180 at +100mV.
> 
> Pretty obvious to see why I run at 1170 for my daily clocks right?


Explained well and accurate IMO. +1


----------



## Gboss

Anyone on here running an FX-9590 in combination with an R9 390 or 390X?

I'm interested in what PSU you're running, and general gaming feedback, please.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gboss*
> 
> Anyone on here running an FX-9590 in combination with an R9 390 or 390X?
> 
> I'm interested in what PSU you're running, and general gaming feedback, please.


Not running a 9590, but am running an 8300 @ 4.8GHz with an MSI 390 @ 1170/1700 (50mv/50mv), daily clocks

Games run great.... I have a semi-cheap-o Raidmax 850W PSU and everything runs fine.

I also had a 290 crossfire setup (1050/1400 on both cards at stock voltage) with CPU at 4.9GHz during the winter/spring and it handled that just fine also.


----------



## Gboss

I have an FX-9590, but my motherboard, an ASRock Fatal1ty 990X, just died on me. And this happened 2 days after ordering a PowerColor R9 390 online.
My replacement board will be here tomorrow or Wednesday, and I have been getting a bit anxious that my PSU won't be up to the task.

I have an Antec High Current Gamer 750W 80+ Bronze. It handled my 9590, a 280L water cooler, a Radeon 6870, drives, etc. without a hitch, but the R9 390 has a high TDP and so does my CPU... getting worried I might need to replace it :'(
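A quick back-of-the-envelope check for this kind of worry: add up the big rated TDPs, allow a margin for the rest of the system, and compare against the PSU. The TDP figures below are approximate public specs (FX-9590 around 220W, R9 390 around 275W typical board power), and the 100W rest-of-system allowance and 80% loading target are assumptions, not hard rules:

```python
# Back-of-the-envelope PSU headroom check. All wattages are approximate,
# and the 0.8 load target is a rule-of-thumb assumption.

def psu_ok(psu_watts, component_watts, rest_of_system=100, load_target=0.8):
    """True if the estimated system draw stays under load_target
    of the PSU's rated capacity."""
    draw = sum(component_watts) + rest_of_system
    return draw <= psu_watts * load_target

# FX-9590 (~220W TDP) + R9 390 (~275W typical) on a 750W unit
print(psu_ok(750, [220, 275]))   # True: ~595W vs a 600W comfort ceiling
```

By this rough estimate a quality 750W unit has headroom for the stock combo, though heavy overclocking on both chips would eat into it quickly.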


----------



## Zack Foo

Quote:


> Originally Posted by *Gboss*
> 
> I have an FX-9590, but my motherboard, an ASRock Fatal1ty 990X, just died on me. And this happened 2 days after ordering a PowerColor R9 390 online.
> My replacement board will be here tomorrow or Wednesday, and I have been getting a bit anxious that my PSU won't be up to the task.
> 
> I have an Antec High Current Gamer 750W 80+ Bronze. It handled my 9590, a 280L water cooler, a Radeon 6870, drives, etc. without a hitch, but the R9 390 has a high TDP and so does my CPU... getting worried I might need to replace it :'(


Pretty sure it should be enough, though. If you don't overclock either of them, that should help even more.


----------



## Gboss

I don't plan on overclocking the CPU, as I have read they don't OC well.
I do plan on doing a little OC on the GPU later. If the PSU doesn't hold up to the OC, I'll replace it with a much bigger one, as my future plan is to crossfire a pair of R9 390s.


----------



## Twau

I can't get my card stable beyond 1105/1550 +50mV








It's not certain that these are the max clocks yet, since I have not played around with it that much, but 1110 on the core with +50mV is an instant black screen/crash for me during Firestrike, so I guess I did not win the lottery.









I tried an HDMI cable today. Usually I am hooked up via DisplayPort, but with HDMI I could raise my core to 1120 with +75mV without any crash...?
This is impossible for me when using DisplayPort. Has anyone heard of anything like this before? Changing the display connector shouldn't affect the core clock, right? I bet it's just random to my hardware setup.

http://www.3dmark.com/fs/5674486


----------



## diggiddi

Quote:


> Originally Posted by *Zack Foo*
> 
> But it can't go above +100; only Trixx can.


It must be your card, because my 290X Lightning has a +200mV option.

Quote:


> Originally Posted by *Gboss*
> 
> I have an FX-9590, but my motherboard, an ASRock Fatal1ty 990X, just died on me. And this happened 2 days after ordering a PowerColor R9 390 online.
> My replacement board will be here tomorrow or Wednesday, and I have been getting a bit anxious that my PSU won't be up to the task.
> 
> I have an Antec High Current Gamer 750W 80+ Bronze. It handled my 9590, a 280L water cooler, a Radeon 6870, drives, etc. without a hitch, but the R9 390 has a high TDP and so does my CPU... getting worried I might need to replace it :'(


I have the same PSU, and it can handle my 8350 @ 4.6GHz (0.1GHz lower than a stock 9590), my primary card at 1230/1620, and my 7950 (second monitor) fine.


----------



## CamsX

Talking about PSUs: would you say that my PSU is faulty? Yesterday I spent the entire day trying to OC my 8350 past 4.4GHz (multiplier only). I know that when a worker fails in Prime it's because I'm somewhat short on Vcore, but most of the time my computer reboots all of a sudden. I reached 4.532GHz on a 232 FSB, but it still reboots after 30 minutes or so. I don't think it's thermal, because it's not going over 55°C on the core.

My card should arrive tomorrow and I wanted to get the most out of it.

OCZ Z 850W 80+gold PSU
Mobo GB GA-990FXA-UD3 v1.1 (old CMOS style BIOS, F10e)


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Talking about PSUs: would you say that my PSU is faulty? Yesterday I spent the entire day trying to OC my 8350 past 4.4GHz (multiplier only). I know that when a worker fails in Prime it's because I'm somewhat short on Vcore, but most of the time my computer reboots all of a sudden. I reached 4.532GHz on a 232 FSB, but it still reboots after 30 minutes or so. I don't think it's thermal, because it's not going over 55°C on the core.
> 
> My card should arrive tomorrow and I wanted to get the most out of it.
> 
> OCZ Z 850W 80+gold PSU
> Mobo GB GA-990FXA-UD3 v1.1 (old CMOS style BIOS, F10e)


Did you add more voltage to the core?

How old is your 8350 (do you know the batch number?)

Restarts can be PSU, but can also just mean that the vcore is too low (though I normally get hard locks that require I manually reboot).

Your PSU should be fine from an output standpoint, but it's not to say something couldn't be wrong with it from an operational standpoint (though I'd rule that out until checking a few other things first).


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Did you add more voltage to the core?
> 
> How old is your 8350 (do you know the batch number?)
> 
> Restarts can be PSU, but can also just mean that the vcore is too low (though I normally get hard locks that require I manually reboot).
> 
> Your PSU should be fine from an output standpoint, but it's not to say something couldn't be wrong with it from an operational standpoint (though I'd rule that out until checking a few other things first).


FD8350FRHKBOX, bought in March 2013. Not sure where to get the actual S/N.

Voltages:
Vcore 1.362
CPU/NB 1.275
NB 1.175
DRAM 1.6

Hope it isn't the PSU.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> FD8350FRHKBOX, bought in March 2013. Not sure where to get the actual S/N.
> 
> Voltages:
> Vcore 1.362
> CPU/NB 1.275
> NB 1.175
> DRAM 1.6
> 
> Hope it isn't the PSU.


You are probably going to need between 1.40V and 1.45V to get 4.6GHz on that thing.

Post-1429 chips (year '14, week 29) tend to do 4.6 on 1.35V or less, but the older ones need a little more juice.


----------



## cbarros82

For 24/7 gaming I'm a big fan of overclocking with no extra voltage, because the gains are sometimes not worth the stress or the heat on the card. I used to overclock as high as possible no matter what voltage it needed, only to gain 5+ FPS for much hotter temps. When I ran my 290 with an EK water block I ran it at 1050/1300 all day with the core at 40C while gaming (yeah, I could overclock it to 1175/1600 at +50% power / +100 mV, core at 55-60C), but the gains weren't worth it to me. For benchmarking, though, I'm going to give it everything I can.


----------



## CerealKillah

Quote:


> Originally Posted by *cbarros82*
> 
> For me i'm a big fan of overclock with no voltage for gaming 24/7 . because gains are sometimes not worth the stress or the heat on the video cards. i used to be that way overclock high no matter what voltage it needs but only gain 5+ fps for hotter temps . when i ran my 290 with ek water block i ran it at 1050/1300 all day with my core @40c gaming ( yeah i could overclock it to 1175/1600 +50 power/ +100 mv core @ 55-60c ) the gains to me weren't worth it . But just bench-marking im gonna give it everything i can.


My EK waterblock just showed up today. I am encouraged to hear your numbers (both in terms of heat and oc values).

I was cheap and didn't order the backplate.....


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are probably going to need between 1.4 and 1.45v to get 4.6 on that thing.
> 
> Post 1429's (year 14', week 29) tend to do 4.6 on 1.35v or less, but the older ones need a little more juice.












I'll give it a go tonight, but I'm a bit afraid of the heat, as ambient temps around here are always above 27°C, and on the few occasions I ran 1.41 and 1.44 I was over 63°C on the core.


----------



## BlaXey

Is there a problem with having +100 mV on the core and aux? It's the only way to get a solid overclock on my XFX 390: 1180 on the core and 1580 on the memory, with a maximum of 75°C on the core and 76°C on the VRMs. My question is whether this voltage could dangerously reduce the life of my card. If so, I'd rather run a smaller overclock.


----------



## diggiddi

Quote:


> Originally Posted by *BlaXey*
> 
> There is a problem in have 100mv in core and auxiliar? Is the only way to get a solid overclock on my xfx 390 1180 and 1580 on memories actually with a temp of 75º maxium on core and 76 on vrms, my question is if with this voltage could reduce dangerously the life of my card? If the answer is yes, I prefer use less overclock.


Your temps are good


----------



## cbarros82

This is a pretty good gain in Valley, Extreme HD preset.
Gigabyte reference R9 290 with EK water block.
Top score: 947/1250 stock reference clocks, core 40C, VRAM #1 48C, VRAM #2 40C.
Bottom score: max OC 1160/1600, +50% power and +100 mV, core temp 60C, VRAM #1 65C, VRAM #2 50C.
Faster than most GTX 970's at 1500/8000.

I hope my MSI R9 390 does better, as they OC a little better than the R9 290's did... I've seen 1200/1700 on air, which is pretty damn good. Still wish EKWB would make an MSI R9 390 water block.


----------



## BlaXey

And should the voltage setting be left on "maintain voltage level"?

Sent from my MX4 via Tapatalk


----------



## sinholueiro

Guys, on the XFX model, once you remove the cooler, the VRM cooling is passive, as we saw a few posts ago. Can the VRMs be cooled ONLY passively? I intend to watercool the core only.


----------



## cbarros82

Quote:


> Originally Posted by *sinholueiro*
> 
> Guys, if the XFX model, once you put away the cooler, has a passive VRM cooling, as we can see a few post ago, can the VRMs be cooling ONLY passively? I pretend to watercool the core only.


An EK full-cover water block will cool the core, memory, and VRMs, since the card still uses the reference PCB.


----------



## 12Cores

Hello, has anyone upgraded from a 280X/7970 crossfire setup to a crossfire 390/390X build? I just want to know if you are seeing any gains. I game at 1440p with massive overclocks across the board in my current rig, but I am considering picking up two 390's to have something to play with this fall.


----------



## CerealKillah

So strange thing has come up today.

It looks like EK removed the XFX from the R9 390 compatibility list. Wish me luck....


----------



## CamsX

Back to square one on the CPU overclocking. Only stable @ 200x22 = 4.4GHz at -0.025 on Vcore. Currently testing 4.466GHz via 203x22, with all voltages (including PLL) at default x 1.015 and Vcore at default. LLC is at Extreme, but I don't really understand how this affects the overclock.

55~°C holding so far. Ambient 27°C ish.



















----------



## phenom2955

Hello,

So I replaced my XFX R9 390 today. I was going to trade it in for an EVGA 970, but they didn't have one that matched the price and I couldn't cover the spread. However, with Newegg selling the same model EVGA 970 for $339.99 USD and Best Buy's price-match guarantee, I might end up trading it in for the EVGA GTX 970 SSC version later this week.

Looking at this EVGA 970 card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814487088
Here is the one listed on Best Buy's website: http://www.bestbuy.com/site/evga-geforce-gtx-970-4gb-gddr5-pci-express-3-0-graphics-card-black/2496109.p?id=1219549325520&skuId=2496109

Comes with 1190 stock core and 1342 boost.

Anyway, with the new R9 390 that I just got today I actually got a better Firestrike score at stock speeds. Here are the scores below: the first at stock speeds and the second at 1100MHz.

Stock Speeds:
http://www.3dmark.com/3dm/8141338

Core 1100Mhz stock voltage:
http://www.3dmark.com/3dm/8141722

No artifacts so far with that 1100MHz OC, but I guess we will see. Hopefully this one stays stable @ 1100MHz core with absolutely no artifacting, and then maybe I can try bumping it up a bit more with a small voltage boost.

I am still not sure, though; I kind of want to trade it in for that EVGA GTX 970 SSC if Best Buy can match the Newegg price. The only reason I went with the 390 was the 8GB of VRAM, as opposed to the supposed 3.5GB on the GTX 970, but honestly no game has gone past 3GB of VRAM usage since I have had this card, and the R9 390 gets way too hot in my ratty old mid-tower case. I think the NVIDIA card, using less power, would stay a bit cooler?

Anybody with more knowledge of graphics cards than me willing to give an opinion?


----------



## Flash Gordon

A 750W PSU will not be able to handle an OC'd CPU and an OC'd pair of 390's; your system will shut off on you (it will shut off when temps get close to the 90s on either card, btw).

I am talking from experience: I have a brand new 750 G2 from EVGA and it couldn't handle that setup. I doubt any other comparable PSU will be able to.

A 750W PSU can handle an OC'd CPU and a stock pair of 390's just fine, though.
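The back-of-envelope wattage sums behind claims like this can be sketched as follows. Every figure in the snippet is a rough assumption (ballpark draws for an overclocked FX-class CPU and an R9 390), not a measurement from this thread, and the helper name is made up; a real PSU calculator should still be the final word.

```python
def total_draw(cpu_w: int, gpu_w: int, gpus: int = 1, rest_w: int = 75) -> int:
    """Rough system draw in watts: CPU + GPUs + ~75 W for board,
    drives, and fans. Figures are illustrative assumptions only."""
    return cpu_w + gpu_w * gpus + rest_w

# One OC'd 390 (~275 W) plus an OC'd FX CPU (~200 W):
single = total_draw(cpu_w=200, gpu_w=275)           # 550 W -> fine on 750 W
# An OC'd *pair* of 390's (~300 W each when pushed):
pair = total_draw(cpu_w=200, gpu_w=300, gpus=2)     # 875 W -> over a 750 W rating
print(single, pair)  # 550 875
```

Which lines up with the experience above: one overclocked card leaves headroom on a quality 600-750 W unit, while an overclocked pair does not.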


----------



## Gboss

I highly doubt my 750w PSU will handle 2x390s and my fx-9590....


----------



## Flash Gordon

Well, an AMD CPU is a whole 'nother beast lol

I got a 4690K, so I have no issues









Why are AMD CPUs so inefficient... that is such a turn-off -_-


----------



## Gboss

Lol, you're preaching to the converted.
I thought I'd challenge myself and do a complete AMD build.
If I'd known it was going to take a 280mm watercooler to keep my CPU cool, I would have gone with an i7...

But I must say this has been the most rewarding PC build I've done. My replacement motherboard arrives today, and I'll finally get to test drive the R9 390 that's been sitting in its box for a week.


----------



## diggiddi

Quote:


> Originally Posted by *Gboss*
> 
> Lol you preaching to the converted.
> I thought I'd challenge myself and do a complete AMD build.
> If I'd known it was gonna take a 280mm watercooler to keep my CPU cool I would have gone with an i7...
> 
> But I must say this has been my most rewarding PC build I've done. My replacement main board arrives today and I finally will get to test drive my R9 390 that's been sitting the box for a week.


What were you using before to keep it cool?

BTW OP look at the PSU in this thread to get an idea
http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club


----------



## Gboss

When I bought the CPU I ordered a Deepcool heatsink with 8 copper heatpipes and 2x120mm fans. The heatsink was split down the centre, with one of the 120mm fans in the middle of the split, so one fan pushed and the other pulled.
This setup managed to keep my CPU idling at around 45 to 55 degrees Celsius. I stay in Durban, South Africa, and we have high humidity as well as high ambient temperatures.

After much frustration and lots of money spent on all types of different thermal paste, I decided that the only way to go was watercooling.

After some research and forum crawling I settled on the Cooler Master Nepton 280L AIO watercooler. This is a huge cooler and required a new case to support it.

Now my CPU idles at 25-35 degrees Celsius. I haven't seen the temps go above 55 under load.


----------



## flopper

Quote:


> Originally Posted by *12Cores*
> 
> Hello, has anyone upgraded from a 280x/7970 crossfire setup to crossfire 390/390x build, I just want to know if you are seeing any gains. I game at 1440p with massive overclocks across the board in my current rig,but I am considering picking up two 390's to have something to play with this fall.


Of course you see gains. One 390 can almost be compared with crossfired 7970's.
Quote:


> Originally Posted by *CerealKillah*
> 
> So strange thing has come up today.
> 
> It looks like EK removed the XFX from the R9 390 compatibility list. Wish me luck....


ouch, send an email and ask why

Quote:


> Originally Posted by *phenom2955*
> 
> Hello,
> 
> I am still not sure though, I kind of want to trade it in for that EVGA GTX 970 SSC if BestBuy can match the newegg price. The only reason I went with the 390 was the 8GB of VRAM as opposed to the supposed 3.5gb on the GTX 970, but honestly no game has gone past 3gb VRAM useage since I have had this card and the R9 390 gets way too hot in my ratty old mid tower case. I think the NVIDIA card using less power would stay a bit cooler?
> 
> Anybody with more knowledge in graphics cards than me willing to give any opinons?


No doubt it would be cooler with a 970.
You can also slightly underclock the 390, or optimize the airflow in your case, to get the temps you want.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Flash Gordon*
> 
> A 750W PSU will not be able to handle and OC CPU and a OC pair of 390's...your system will shut off on you (It will shut off when temps get close to the 90's on either card btw).
> 
> I am talking from experience...I have a brand new 750G2 from EVGA and it couldn't handle that setup I doubt any other comparable PSU will be able to.
> 
> A 750W PSU can handle an OC CPU and stock pair of 390's just fine though


That depends on the quality of the PSU. I have a 600W be quiet! Straight Power and no issues with an OC'd CPU and an R9 390 (though I admit I haven't pushed the graphics card to the limit).
Just use any PSU calculator on the internet and it will give you around 500-560 watts needed if both are OC'd.
Quote:


> Originally Posted by *flopper*
> 
> No doubt it be cooler with a 970.
> You can under clock the 390 slightly or optimize airflow in the case to ensure values you want also.


I keep 3 profiles in MSI Afterburner (similar to the ones in the Gaming App, which by the way I think is bad for my card, since at idle the memory clock is always pegged at its limit). When I turn on Silent mode while playing Watch Dogs on ultra, it runs smoothly (~45-60 FPS) and does not get hotter than 73 Celsius @ 61% fan (at 1060MHz it goes up to 80 Celsius)... mind you, that was with *ambient temps at 29.5 Celsius* and across different scenes in Watch Dogs. So, figuring roughly 1C ambient = 1-1.1C computer temp, in winter this should go down to 60-65 Celsius.

The problem I have is that my old AMD Phenom II X6 1055T @ 3.7GHz heats up too much... I need something more efficient, like the one Flash Gordon has.









Edit: OK, just saw you meant a pair of 390s. Sorry.


----------



## Gboss

May I ask what CPU your using?


----------



## kizwan

Quote:


> Originally Posted by *sinholueiro*
> 
> Guys, if the XFX model, once you put away the cooler, has a passive VRM cooling, as we can see a few post ago, can the VRMs be cooling ONLY passively? I pretend to watercool the core only.


If you're thinking of using a universal GPU block to watercool the GPU core only, it's best to attach a fan for VRM cooling.
Quote:


> Originally Posted by *CerealKillah*
> 
> So strange thing has come up today.
> 
> It looks like EK removed the XFX from the R9 390 compatibility list. Wish me luck....


Good luck.
Quote:


> Originally Posted by *phenom2955*
> 
> Hello,
> 
> So I replaced my XFX R9 390 today, was going to trade it in for an EVGA 970 but they didn't have one that matched price and I couldn't cover the spread. However with Newegg selling the same model EVGA 970 for 339.99$ USD and best buys price match guarantee I might end up trading in for the EVGA GTX 970 SSC version later this week.
> 
> Looking at this EVGA 970 card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814487088
> Here is the one listed on BestBuys website: http://www.bestbuy.com/site/evga-geforce-gtx-970-4gb-gddr5-pci-express-3-0-graphics-card-black/2496109.p?id=1219549325520&skuId=2496109
> 
> Comes with 1190 stock core and 1342 boost.
> 
> Anyways with the New R9 390 that I just got today I actually got a better Firestrike score with stock speeds. Here are the scores below, The first one with stock speeds and the second with 1100mhz.
> 
> Stock Speeds:
> http://www.3dmark.com/3dm/8141338
> 
> Core 1100Mhz stock voltage:
> http://www.3dmark.com/3dm/8141722
> 
> No artifacts so far with that 1100 OC but I guess we will see, hopefully this one will stay stable @ 1100Mhz core with absolutely no artifacting and then maybe I can consider trying to bump it up a bit more with a bit of Voltage boost.
> 
> I am still not sure though, I kind of want to trade it in for that EVGA GTX 970 SSC if BestBuy can match the newegg price. The only reason I went with the 390 was the 8GB of VRAM as opposed to the supposed 3.5gb on the GTX 970, but honestly no game has gone past 3gb VRAM useage since I have had this card and the R9 390 gets way too hot in my ratty old mid tower case. I think the NVIDIA card using less power would stay a bit cooler?
> 
> Anybody with more knowledge in graphics cards than me willing to give any opinons?


Your new 390 looks pretty good. If heat is a concern and you don't plan to crossfire later, the 970 is a good alternative for you.
Quote:


> Originally Posted by *AverdanOriginal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Flash Gordon*
> 
> A 750W PSU will not be able to handle and OC CPU and a OC pair of 390's...your system will shut off on you (It will shut off when temps get close to the 90's on either card btw).
> 
> I am talking from experience...I have a brand new 750G2 from EVGA and it couldn't handle that setup I doubt any other comparable PSU will be able to.
> 
> A 750W PSU can handle an OC CPU and stock pair of 390's just fine though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That depends on the quality of the PSU. I have a 600 W Bequiet Straight Power and no issues with OC CPU and R9 390 (even though I have to admit the graphic card I haven't pushed to the limit).
> Just use any PSU calculator there is on the internet and they will give you around 500-560 Watts needed if both are OCed.
Click to expand...

He said *a pair* of 390's. 750W is not enough if you plan to OC both the CPU and two GPUs.


----------



## AverdanOriginal

Quote:


> Originally Posted by *kizwan*
> 
> He said *a pair* 390. 750W is not enough if you plan to OC both CPU & two GPUs.


Oops, my bad. I was thinking Flash would know better.


----------



## CerealKillah

Before I tear everything apart, I posted a question in the EK forum. Hopefully I get some answers.

Waterblock is still in the package at home


----------



## Agent Smith1984

Quote:


> Originally Posted by *phenom2955*
> 
> Hello,
> 
> So I replaced my XFX R9 390 today, was going to trade it in for an EVGA 970 but they didn't have one that matched price and I couldn't cover the spread. However with Newegg selling the same model EVGA 970 for 339.99$ USD and best buys price match guarantee I might end up trading in for the EVGA GTX 970 SSC version later this week.
> 
> Looking at this EVGA 970 card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814487088
> Here is the one listed on BestBuys website: http://www.bestbuy.com/site/evga-geforce-gtx-970-4gb-gddr5-pci-express-3-0-graphics-card-black/2496109.p?id=1219549325520&skuId=2496109
> 
> Comes with 1190 stock core and 1342 boost.
> 
> Anyways with the New R9 390 that I just got today I actually got a better Firestrike score with stock speeds. Here are the scores below, The first one with stock speeds and the second with 1100mhz.
> 
> Stock Speeds:
> http://www.3dmark.com/3dm/8141338
> 
> Core 1100Mhz stock voltage:
> http://www.3dmark.com/3dm/8141722
> 
> No artifacts so far with that 1100 OC but I guess we will see, hopefully this one will stay stable @ 1100Mhz core with absolutely no artifacting and then maybe I can consider trying to bump it up a bit more with a bit of Voltage boost.
> 
> I am still not sure though, I kind of want to trade it in for that EVGA GTX 970 SSC if BestBuy can match the newegg price. The only reason I went with the 390 was the 8GB of VRAM as opposed to the supposed 3.5gb on the GTX 970, but honestly no game has gone past 3gb VRAM useage since I have had this card and the R9 390 gets way too hot in my ratty old mid tower case. I think the NVIDIA card using less power would stay a bit cooler?
> 
> Anybody with more knowledge in graphics cards than me willing to give any opinons?


What was your reasoning for considering the 970 over the 390?

I ask because there are quite a few people on our roster who dumped their 970's like a cheating girlfriend and made the somewhat-sidegrade shift to the 390.

The 390 is a better card in my opinion.

It's got more raw horsepower, and more VRAM (the 970 doesn't even have a truly usable 4GB to begin with).
The 970 uses less power, and produces less heat, but if those things don't concern you as much as actual FPS, then the 390 is a clear winner.....

Based on your scores, if there is one thing your rig needs more than anything, it's a good CPU overclock.

If you can get that FX6300 in the 4.6-5GHz range, it will help you quite a bit!!


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Back to square 1 on the CPU overclocking. Only stable @ 200x22=4.4Ghz on -0.025 on Vcore. Currently testing 4.466Ghz via 203x22 and all voltages including PLL at default x 1.015 and Vcore at default. LLC at Extreme, but don't really understand how this affects the overclock.
> 
> 55~°C holding so far. Ambient 27°C ish.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit:


I don't recommend extreme LLC for such a mild overclock. Just use high or medium.....

You just aren't going to get much more than 4.3-4.4 with stock voltage on an older 8350. It is going to need 1.4v or more, but as long as you can keep your CPU under 69C you will be fine....
Also, you may need a fan on your VRM's to keep them cool, as that is a key factor in overclocking these chips. I actually use a fan on my VRM's and on the rear of my socket!!!

Here are my results on a newer Vishera (low-leakage, post-1429 chip):

With no LLC at all (not available on my board) at 4.9GHz (1.464v under load) my FX 8300 hits 68C in the winter time with 20C ambient...

At my summer settings of 4.8GHz (1.4v under load) I hit 64C with 24C ambient

That's 10 passes of very high IBT AVX.....

Again, on the old Visheras, you need 1.4v-1.5v to get clock speeds in the 4.6-5GHz range.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What was your reasoning for considering the 970 over the 390?
> 
> I ask because there are quite a few people on our roster who dumped their 970's like a cheating girlfriend, and made the somewhat-side grade shift to the 390.
> 
> The 390 is a better card in my opinion.
> 
> It's got more raw horsepower, and more VRAM (the 970 doesn't even have a truly usable 4GB to begin with).
> The 970 uses less power, and produces less heat, but if those things don't concern you as much as actual FPS, then the 390 is a clear winner.....
> 
> Based on your scores, if there is one thing your rig needs more than anything, it's a good CPU overclock.
> 
> If you can get that FX6300 in the 4.6-5GHz range, it will help you quite a bit!!


I have a 970 and a very good one at that. I then bought another 290x to go with my 290 when they went on crazy sale a few weeks ago. Now I have my eye on the 390x. There's no way I would buy a 970 right now. I would get another 290x in a heartbeat over a 970 all day. The 390 for sure too before a 970 for me anyway. I'm loving my Xfire setup right about now.


----------



## Agent Smith1984

OP updated with information for each individual card as well as some pictures!!

Enjoy!

I will also be working on some other updates throughout the week......


----------



## Piccolo55

Quote:


> Originally Posted by *Agent Smith1984*
> 
> OP updated with information for each individual card as well as some pictures!!
> 
> Enjoy!
> 
> I will also be working on some other updates throughout the week......


Thanks very much, as a future buyer of one of these cards that info would be great


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I don't recommend extreme LLC for such a mild overclock. Just use high or medium.....
> 
> You just aren't going to get much more than 4.3-4.4 with stock voltage on an older 8350. It is going to need 1.4v or more, but as long as you can keep your CPU under 69C you will be fine....
> Also, you may need a fan on your VRM's to keep them cool, as that is a key factor in overclocking these chips. I actually use a fan on my VRM's and on the rear of my socket!!!
> 
> Here are my results on a newer visehera (low leakage post 1429 chip)
> 
> With no LLC at all (not available on my board) at 4.9GHz (1.464v under load) my FX 8300 hits 68C in the winter time with 20C ambient...
> 
> At my summer settings of 4.8GHz (1.4v under load) I hit 64C with 24C ambient
> 
> That's 10 passes of very high IBT AVX.....
> 
> Again, on the old visheras, you need 1.4v-1.5v to get clock speeds in the 4.6-5GHz range.


Thanks for the comments, I will lower LLC to High. My 390 is delayed by one more day, so I'll set everything back to stock and work today on finding, one by one, the limits of my setup, especially the FSB, HT frequency, and NB frequency.

Any pointers on testing my memory? I have Corsair Vengeance 1600MHz, 9-9-9-24 latencies, 1.5v.


----------



## diggiddi

HCI Memtest: run several instances at once, dividing your available RAM by the number of processor threads. Run to at least 100% coverage; 400% is better.
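The split described above is simple arithmetic; here is a hedged sketch of it (the helper name and the example figures are made up for illustration, HCI Memtest itself just asks for a size in MB per instance):

```python
def memtest_plan(free_ram_mb: int, threads: int, coverage_pct: int = 400) -> dict:
    """Split free RAM evenly across one HCI Memtest instance per CPU
    thread, as suggested above, with a rough coverage target."""
    return {
        "instances": threads,
        "mb_each": free_ram_mb // threads,   # size to type into each instance
        "target_coverage_pct": coverage_pct, # let every instance reach this %
    }

# e.g. ~12 GB free on an 8-thread FX-8350 -> 8 instances of 1536 MB each
print(memtest_plan(12288, 8))
```

Each instance then runs until its coverage counter passes the target percentage.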


----------



## Geoclock

Hi guys, I just wanted to ask WHICH R9 390 is worth buying. (I have a 5-year-old Corsair HX750 PSU.)
First I bought an XFX 290 and had nothing but headaches, BSODs, and black screens, so I returned it for a refund.
Now I'm worried my PSU won't handle an R9 390; do you think that's a reason to switch to a GTX 970?
Do the AMD cards run twice as hot as a GTX 970?
I'm having trouble deciding.
MSI got good reviews but costs more. Sapphire? ASUS?
PowerColor has only a few reviews, but that's all I can find. I have $20 off a PowerColor, and it expires in 3 days.
I don't see whether PowerColor cools the VRM chips; see the image.

Do you think the MSI is still worth it over the others?
Thanks.


----------



## cbarros82

stock msi r9 390 valley score
1040/1500
max temp 61c ( my fan profile)
max temp 72c (stock profile)



stock r9 290 water cooled
947/1250
max temp 48c



oc r9 290 water cooled
1160/1600 50% / +100mv
max temp 60c


----------



## 12Cores


Thanks for posting, what model was your 290 and do you remember the brand/model of the block you had on the card?


----------



## 12Cores

Quote:


> Originally Posted by *battleaxe*
> 
> I have a 970 and a very good one at that. I then bought another 290x to go with my 290 when they went on crazy sale a few weeks ago. Now I have my eye on the 390x. There's no way I would buy a 970 right now. I would get another 290x in a heartbeat over a 970 all day. The 390 for sure too before a 970 for me anyway. I'm loving my Xfire setup right about now.


Are your 290's overclocked and if so would you mind sharing your valley score at 1080p extreme HD?


----------



## cbarros82

Quote:


> Originally Posted by *12Cores*
> 
> Thanks for posting, what model was your 290 and do you remember the brand/model of the block you had on the card?

Gigabyte 290 reference with ek water block rev 1.0


----------



## cbarros82

This is as far as I wanted to take my OC in Valley.
MSI Gaming R9 390
Core 1200MHz / mem 1650MHz
+50% power limit / +75 mV
Max core 72C
Max VRM 79C



There is NO GTX 970 on the VALLEY 1.0 forum that could beat this score or my 290 score


----------



## flopper

Quote:


> Originally Posted by *Geoclock*
> 
> Hi guys, i just wanted to ask WHICH R9 390 is worth to buy.( Have Corsair HX750 PSU 5 year old)
> First i bought XFX 290 and had nothig but headaches, BSOD and Black Screens, Returned for refund.
> 
> Thanks.


The 290 series had issues.
On the 390 it all seems fixed and improved, with 8GB of RAM added along the way.
The cards OC around the same, so they're about equal overall.


----------



## Gumbi

Quote:


> Originally Posted by *flopper*
> 
> 290 series had issues.
> 390 it all seems fixed and improved adding 8gb ram along the way.
> cards OC around the same so in all equal overall.


Some of the models did, sure, but if you did your homework there were plenty of excellent 290(X) models available.

The 390 series is definitely more polished and does have most of the kinks worked out. I'm not 100% convinced ASUS has fixed their VRM issues yet, however.


----------



## AverdanOriginal

MSI Afterburner and Gaming App problem:

Hey guys, I wanted to try out the Gaming App from MSI in order to let the dragon breathe. Sadly, while playing around with the different modes (Silent/OC/Gaming), I realized that the GPU clock and memory clock seem to get stuck at the highest setting, and the card starts to behave as if it's under constant full load. It won't drop from 1040MHz core to 300 or 350 or so when idle.

*So I stopped using the Gaming App,* but *using custom profiles in MSI Afterburner* I now have a similar problem, only with the memory clock: *it gets stuck at 1500MHz and keeps the card at 60-65 Celsius at idle.*








Is that normal?
I know you shouldn't use the Gaming App and MSI Afterburner at the same time, as they seem to constantly overwrite each other, but could there be another program, maybe CCC, that also conflicts with Afterburner running at the same time?

Do you guys have a similar problem? Any hints, tips, or boxes I need to tick in Afterburner to make it stop running the memory at full clock?

Thanks for any help.


----------



## battleaxe

Quote:


> Originally Posted by *12Cores*
> 
> Are your 290's overclocked and if so would you mind sharing your valley score at 1080p extreme HD?


I've had no time for OC'ing lately. But you can see some of the others posting scores here. Mine were around the same, slightly lower maybe if memory serves.


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> MSI Afterburner and Gaming App Rpoblem:
> 
> Hey guys. wanted to try out that Gaming App from MSI in order to let the dragon breathe. Sadly when I played around the different modes like silent/OC/Gaming I realized that the GPU Clock and Memory Clock seem to get stuck on the highest setting and the card starts to behave as if there is a constant full load on it. So it wont go down from 1040 MHz Clock to 300 or 350 or so when in idle?
> 
> *So I stopped using Gmaing App,* but when *using custom profiles in the MSI Afterburner* I have now a similar problem but only concerning the Mem Clock. *it gets stuck on 1500 MHz and keeps the card on 60-65 Celsius in Idle.*
> 
> 
> 
> 
> 
> 
> 
> 
> Is that normal?
> I know you shouldn't use Gaming App and MSI Afterburner at the same time, as they seem to overwrite eache other constantly, but could there be another program, maybe CCC that also makes problems with Afterburner running at the same time?
> 
> Do you guys have a similar Problem? Any hints tips or boxes I need to tag in Afterburner to make it stop using full mem clock?
> 
> Thx for any help.


When using AfterBurner, I disable ULPS, so I do see my vcore bounce around a little as I'm working on the desktop... I also see the memory clock speed bounce between 150 and my 1700MHz OC speed. I don't really pay it any mind though, since it's not hurting anything....

If you want your idle temp lower, then you need to turn your fan on. I use a custom profile that holds 25% fan speed up to 40C, after which it scales linearly all the way up to 90% at 90C.

I get a 32C idle at 25% and max temps are 71C core at 70% fan, and 70C VRM1. That's with 50mv/50mv overvolting.
Mind you, that is with some very high air flow in my case.
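A fan profile like the one described here (flat floor below a threshold, then a straight line up to a ceiling) is just a piecewise-linear curve. A minimal sketch, using the 25%/40C and 90%/90C points from this post (the function name is made up; Afterburner lets you draw the same shape in its curve editor):

```python
def fan_speed(temp_c: float) -> float:
    """Fan duty (%) for the curve above: flat 25% up to 40C,
    then linear to 90% at 90C, clamped beyond that."""
    if temp_c <= 40:
        return 25.0
    if temp_c >= 90:
        return 90.0
    # linear interpolation between (40C, 25%) and (90C, 90%)
    return 25.0 + (temp_c - 40) * (90.0 - 25.0) / (90 - 40)

for t in (32, 40, 65, 90):
    print(t, round(fan_speed(t), 1))  # 25.0, 25.0, 57.5, 90.0
```

So at the 71C load temp mentioned above, this curve would be asking for roughly 65% fan.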


----------



## CerealKillah

Just a quick update.

My XFX R9 390 DD had the right inductor on the PCB and the EK waterblock fit without issue.

Maybe just the 390X version has the beefed up inductor? I am not sure why mine checked out, but I am glad it did.









The potential problem area with the PCB was outlined in yellow by EK support.

A picture of the reference PCB:



A picture of the XFX 390X PCB:



A picture of MY XFX 390 potential problem area that is not a problem:


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> Just a quick update.
> 
> My XFX R9 390 DD had the right inductor on the PCB and the EK waterblock fit without issue.
> 
> Maybe just the 390X version has the beefed up inductor? I am not sure why mine checked out, but I am glad it did.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The potential problem area with the PCB was outlined in yellow by EK support.
> 
> A picture of the reference PCB:
> 
> 
> 
> A picture of the XFX 390X PCB:
> 
> 
> 
> A picture of MY XFX 390 potential problem area that is not a problem:


Good news









Let us know how it does under water.

Would love to see some 1225+/1750+ bench numbers out of that thing!!


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good news
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let us know how it does under water.
> 
> Would love to see some 1225+/1750+ bench numbers out of that thing!!


You and I both









I have been leak testing since about 10 PM Central last night. I will get it fired up tonight and see how far I can push it under water.

I have a thin 360 rad (NexXxoS ST30 360mm), so I am hoping I don't run into limitations there. I used to run my loop with an XSPC ES240 radiator too, but gave it to the guy that bought my HD 7950 with waterblock so he could get wet too.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> When using AfterBurner, I disable ULPS, so I do see my vcore bounce around a little as I'm working on the desktop... I also see the memory clock speed bounce between 150 and my 1700MHz OC speed. I don't really pay it any mind though, since it's not hurting anything....
> 
> If you want your idle temp lower, then you need to turn your fan on. I use a custom profile that uses 25% fan speed up to 40C in which case it scales linearly all the way up to 90% at 90C.
> 
> I get a 32C idle at 25% and max temps are 71C core at 70% fan, and 70C VRM1. That's with 50mv/50mv overvolting.
> Mind you, that is with some very high air flow in my case.


That's what I thought; I must have needed to disable something. So it was the ULPS tick box. I didn't know what that was for, LOL.








My idle temps on auto settings (except fans) running Afterburner are low: currently around 45-55°C with no fans, with ambient around 30-32°C (it's currently summertime). When the card hits 55°C the fans start at about 20%, with 4 hysteresis temp steps (or so).
Did you disable ULPS via Afterburner or via regedit, as some people have explained in other forums? Simply disabling it in Afterburner will probably have no effect on the MSI Gaming App, I'd guess?

I did notice that my idle temps after playing would not come down that quickly (they actually stayed around 60°C with 29°C ambient) when I was using an underclocking profile on my card. During gaming (Dragon Age: Inquisition, everything on ultra) I saw 79°C, but after about 10 minutes, with the memory clock stuck at 1500MHz, the card was still at 65°C. Without a profile it would drop down to 45-50°C in about 5 minutes.

I know you installed 2 JetFlos in your case... I still think that must be crazy loud







but I love the raw power you aim for









I have 4x120mm fans in front pushing in a combined 260 m³/h, one below the card moving only 33 m³/h, and 3 fans exhausting around 300 m³/h.
So in theory the air inside my case is refreshed about 1.8 times per second... That is enough; it is not quiet or silent, but at least not loud (I can also switch down to approx. 200 m³/h, where it gets quiet).

Anyway, I think temps are a matter of airflow, game settings (unnecessary framerates...) and computer settings. Just by adjusting the latter two I was able to cool a game down by roughly 7°C and keep 60 FPS with everything on Ultra.
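Quick sanity check on that refresh-rate figure, for anyone curious. I'm assuming an internal case volume of roughly 0.05 m³ (a typical mid-tower; an estimate, not a measurement):

```python
# Rough air-exchange estimate based on the exhaust airflow quoted above.
exhaust_m3_per_h = 300     # combined exhaust airflow of the 3 out fans
case_volume_m3 = 0.05      # assumed mid-tower internal volume (estimate)

exchanges_per_second = (exhaust_m3_per_h / 3600) / case_volume_m3
print(f"{exchanges_per_second:.1f} case volumes per second")  # ~1.7 with these numbers
```

So the "about 1.8 times per second" claim is in the right ballpark for a case this size.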


----------



## Agent Smith1984

I sure hope Newegg gets the MSI 390 back in stock before I'm ready to order my second card....







Quote:


> Originally Posted by *AverdanOriginal*
> 
> That's what I thought; I must have needed to disable something. So it was the ULPS tick box. I didn't know what that was for, LOL.
> 
> 
> 
> 
> 
> 
> 
> 
> My idle temps on auto settings (except fans) running Afterburner are low: currently around 45-55°C with no fans, with ambient around 30-32°C (it's currently summertime). When the card hits 55°C the fans start at about 20%, with 4 hysteresis temp steps (or so).
> Did you disable ULPS via Afterburner or via regedit, as some people have explained in other forums? Simply disabling it in Afterburner will probably have no effect on the MSI Gaming App, I'd guess?
> 
> I did notice that my idle temps after playing would not come down that quickly (they actually stayed around 60°C with 29°C ambient) when I was using an underclocking profile on my card. During gaming (Dragon Age: Inquisition, everything on ultra) I saw 79°C, but after about 10 minutes, with the memory clock stuck at 1500MHz, the card was still at 65°C. Without a profile it would drop down to 45-50°C in about 5 minutes.
> 
> I know you installed 2 JetFlos in your case... I still think that must be crazy loud
> 
> 
> 
> 
> 
> 
> 
> but I love the raw power you aim for
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have 4x120mm fans in front pushing in a combined 260 m³/h, one below the card moving only 33 m³/h, and 3 fans exhausting around 300 m³/h.
> So in theory the air inside my case is refreshed about 1.8 times per second... That is enough; it is not quiet or silent, but at least not loud (I can also switch down to approx. 200 m³/h, where it gets quiet).
> 
> Anyway, I think temps are a matter of airflow, game settings (unnecessary framerates...) and computer settings. Just by adjusting the latter two I was able to cool a game down by roughly 7°C and keep 60 FPS with everything on Ultra.


To use the fan profile in Afterburner, just turn on auto fan and click user mode to the left of the slider.
I wouldn't use the MSI Gaming App with it, unless you really want to control the LED dragon on the card.

My JetFlos are a bit noisy at full force (2000+ RPM), but at 1600 RPM they are much quieter.
I keep them pegged when gaming though, since I normally have the volume cranked anyway. I've never really been bothered by fan noise, at least not nearly as much as by high temps.

One thing for sure, is that gaming in 4K pushes these cards to a different level.....

5 loops of 1080P Unigine Heaven at 8X AA, ultra settings, yield around 69-70C on the core.

30 minutes of 4K Crysis 3 can put the core to 73C with the same fan profile.
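For reference, the custom fan profile I mentioned earlier (25% up to 40°C, then scaling linearly to 90% at 90°C) works out like this. The interpolation below is just my restatement of what the Afterburner curve editor draws:

```python
def fan_percent(temp_c: float) -> float:
    """Fan duty for the custom curve: flat 25% up to 40°C,
    then linear from (40°C, 25%) up to (90°C, 90%)."""
    if temp_c <= 40:
        return 25.0
    if temp_c >= 90:
        return 90.0
    # linear interpolation between (40, 25) and (90, 90)
    return 25.0 + (temp_c - 40) * (90.0 - 25.0) / (90 - 40)

print(fan_percent(32))  # at idle temps: 25.0
print(fan_percent(71))  # around my max core temp: 65.3
```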


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I sure hope newegg gets MSI 390 back in stock before I'm ready to order my second card....
> 
> 
> 
> 
> 
> 
> 
> 
> To use the fan profile in afterburner just turn on auto fan, and click user mode to the left of the slider.
> I wouldn't use the MSI Gaming app with it, unless you really want to control the LED dragon on the card.
> 
> My JetFlos are a bit noisy at full force (2000+ RPM), but at 1600 RPM they are much quieter.
> I keep them pegged when gaming though, since I normally have the volume cranked anyways. I've never really been bothered by fan noise, at least not nearly as much as I am by high temps.
> 
> One thing for sure, is that gaming in 4K pushes these cards to a different level.....
> 
> 5 loops of 1080P Unigine Heaven at 8X AA, ultra settings, yield around 69-70C on the core.
> 
> 30 minutes of 4K Crysis 3 can put the core to 73C with the same fan profile.


Thanks for your answers, Agent. The fan profile I already have; it works fine.

One more thing: what's with the overvoltage increase you always mention being 50mv/50mv? Isn't there only one slider for voltage adjustment in MSI Afterburner? Or am I missing something again?

Once the heat here in central Europe drops back to a normal 20-25°C, I will start to overclock my card and see what I can reach.
Thanks again for your help.


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Thanks for your answers Agent. Fan profile I already have. Works fine.
> 
> One more thing: what's with the overvoltage increase you always mention being 50mv/50mv? Isn't there only one slider for voltage adjustment in MSI Afterburner? Or am I missing something again?
> 
> Once heat here in central europe decreases to normal 20-25 C° I will start to overclock my card and see what I can reach.
> Thanks again for your help.


The second "50mv" is AUX voltage.

If you push the little down arrow beside the voltage slider, you can adjust the AUX voltage also.

I put mine at 50mv, which lets me run at 1700MHz memory.

I can push to 75mv and get 1750 stable also.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The second "50mv" is AUX voltage.
> 
> If you push the little down arrow beside the voltage slider, you can adjust the AUX voltage also.
> 
> I put mine at 50mv, which lets me run at 1700MHz memory.
> 
> I can push to 75mv and get 1750 stable also.


Ah ok, check. So I totally overlooked that little arrow. Thanks again.


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Ah ok, check. So I totally overlooked that little arrow. Thanks again.


No problem,

Don't forget to post a GPU-Z and notepad (with member name) screenie up so I can get you on the roster.

Nevermind, I see you were added a while back


----------



## Agent Smith1984

When upgrading to Windows 10, did anyone have issues with any games or apps working afterwards?

I am hoping to upgrade, and fire up my games without having to reinstall (I do not use optical drives, and normally download stuff, so this would be a nightmare)


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> When upgrading to Windows 10, did anyone have issues with any games or apps working afterwards?
> 
> I am hoping to upgrade, and fire up my games without having to reinstall (I do not use optical drives, and normally download stuff, so this would be a nightmare)


I had to update PB and do an auto fix on the BF3 and BF4 games. (right clicking on the game in Origin) Everything else runs fine so far it seems. All my other programs including Adobe Lightroom and CS6 work fine as before too.

I really like Win10. Glad I updated it. Seems more well thought out to me than 7, 8, or 8.1 was.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I had to update PB and do an auto fix on the BF3 and BF4 games. (right clicking on the game in Origin) Everything else runs fine so far it seems. All my other programs including Adobe Lightroom and CS6 work fine as before too.
> 
> I really like Win10. Glad I updated it. Seems more well thought out to me than 7, 8, or 8.1 was.


Good to know, thanks for the info, not so scared to shift 'er over now!


----------



## Sonic B0000M

Crossfire VTX R9 390s, AMD FX-8350 CPU.

I want to overclock the GPU when I use only one card for certain games. Any ideas on the best overclock settings in MSI Afterburner?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sonic B0000M*
> 
> 
> 
> Crossfire VTX R9 390s, AMD FX-8350 CPU.
> 
> I want to overclock the GPU when I use only one card for certain games. Any ideas on the best overclock settings in MSI Afterburner?


VTX?

I advise reading through the OP for OC suggestions.

It really varies for every card. And temps will also be a factor.


----------



## thegamehhh

Hi guys, I just bought my MSI R9 390 and immediately ran Assassin's Creed: Unity to check out the performance. After a while I checked temperatures: it went to 94°C(!) with the fan at 100% speed.
Do you have any ideas what's wrong with this card? I suppose it has a defect, but I am not sure. Please help.


----------



## Sonic B0000M

VTX yeah?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sonic B0000M*
> 
> 
> 
> VTX yeah?


Wow, I have never seen that brand before!

Let us know how the temps are, if you don't mind.....

Quote:


> Originally Posted by *thegamehhh*
> 
> Hi guys, I just bought my MSI R9 390 and immediately ran Assassin's Creed: Unity to check out the performance. After a while I checked temperatures: it went to 94°C(!) with the fan at 100% speed.
> Do you have any ideas what's wrong with this card? I suppose it has a defect, but I am not sure. Please help.


My first guess is airflow in your case...... 94°C for a single card is way out of line with what most single MSI 390s run at....

I was seeing mid-80s with my previous case fans and an overvoltage in place, but after swapping my case fans for some high-flow units, my card never breaks 72°C with my custom profile.


----------



## thegamehhh

I have got this one:
Zalman Z3 Plus
Should I remove the cooler from the back?


----------



## Sonic B0000M

I bought them from Amazon UK for £260 each, although I got one free.

Having huge issues with flickering and stuff in Battlefield Hardline though.


----------



## Sonic B0000M

Also, make sure you add me to the owners list ☺


----------



## CerealKillah

Quote:


> Originally Posted by *thegamehhh*
> 
> hi guys, i just bought my msi r9 390 and immadiately i ran assassins creed: unity to check out the performance. after a while i checked temperatures- she went to 94! C with a 100% speed fan.
> have u got any ideas whats wrong with this card? i suppose she has a defect but i am not sure, please help


One more vote for airflow.

Try running the computer with the side panel off and check temps. If they are still hot, it might be a bad card. If temps drop quite a bit, it is most likely airflow. Depending on your case, you may or may not have good options to overcome it.


----------



## dislikeyou

Hello, I was waiting for the R9 Nano to be released, but now I'm considering buying a Sapphire Radeon R9 390 Nitro 8GB tomorrow for $380 instead.

Not sure if it is worth waiting for the Nano; it's going to cost ~$100 more and may not offer that much more performance for the price.

I game at 4K, but I don't play any demanding 3D FPS games: mostly MOBA, strategy and MMO fantasy games, and CS:S.


----------



## MazrimCF

Please add me to the club



MSI 390, stock 1040/1500, OC 1120/1575 at +50mv


----------



## battleaxe

I have an overwhelming urge to buy a 390X, but I want to wait for the Fury X to go 8GB instead. Decisions... decisions, decisions...


----------



## cbarros82

Really wish someone would make an MSI r9 390/x water block


----------



## THUMPer1

Just got my MSI 390X today along with my BenQ monitor. Will get a GPU-Z screenshot later. So far running 1150/1700. Will try for 1200 one day.

EDIT:


----------



## Agent Smith1984

Ohhh, new peoples!!

Will get everyone added tomorrow morning!








Quote:


> Originally Posted by *battleaxe*
> 
> I have an overwhelming urge to buy a 390X, but I want to wait for the Fury X to go 8GB instead. Decisions... decisions, decisions...


I don't expect an 8GB Fiji for a while..... they can't even get the 4GB cards shipped... lol

I'd advise enjoying your 290X/290 crossfire though.... it's plenty for now, my friend! Had I known there was going to be a 15.7 driver, and 290X BIOS mods popping up that bring them SO close to 390 capabilities, I'd have sat tight (I had a pair of 290 Tri-X cards @ 1100/1400).
But my real plan was 4K, and I could see the 4GB cards' time drawing near. Of course, now that I have the 4K TV and see that even BF4 and Crysis 3 don't hit 3GB, and only SoM and GTA V can hit 4GB+, I guess it's still not that bad. That's with no AA, of course, but you need a lot of horsepower to run 4K and AA together anyway, and it honestly isn't even needed.


----------



## Agent Smith1984

Quote:


> Originally Posted by *cbarros82*
> 
> Really wish someone would make an MSI r9 390/x water block


You and me both, man.

I'm actually considering a water loop for the first time, but nobody makes a block for my card to get started with.

I ONLY wanted the MSI for its looks to begin with, but now I only want to run MSIs (getting a second for crossfire) because, based on results, they are clearly the highest-binned cards available.


----------



## Geoclock

New Downclocked MSI R9 390 is out

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127875&cm_re=r9_390-_-14-127-875-_-Product


----------



## CerealKillah

Having fun with the waterblock.

I am currently at 1100/1600 with stock voltages.

http://www.3dmark.com/3dm/8167513?

Why does my 390 keep showing up as a 290x?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Geoclock*
> 
> New Downclocked MSI R9 390 is out
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127875&cm_re=r9_390-_-14-127-875-_-Product


This is bad news, considering the prior card is now out of stock and this one is the same price... They should have made it $299... I was just about to order another one.


----------



## Geoclock

Quote:


> Originally Posted by *Agent Smith1984*
> 
> This is bad news, considering the prior card is now out of stock and this one is the same price... They should have made it $299... I was just about to order another one.


Me too. PowerColor's $20 promotion expires in 2 days, putting that card near $315. Do you think it's worth waiting for the OC'd MSI 390 at the higher $340 price?
If not, I'll probably stick with the MSI GTX 970 for $325; it runs cooler and overclocks like hell.


----------



## 12Cores

Quote:


> Originally Posted by *CerealKillah*
> 
> Having fun with the waterblock.
> 
> I am currently at 1100/1600 with stock voltages.
> 
> http://www.3dmark.com/3dm/8167513?
> 
> Why does my 390 keep showing up as a 290x?


Cereal what is the make/model of the waterblock for your GPU?


----------



## CerealKillah

This is the link to my waterblock:

http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block-acetal.html


----------



## CamsX

So I'm finally able to join the owners club. Not too happy with the out-of-the-box scores, as I was hoping to be closer to a 10k score in Firestrike, but I think my CPU is holding me back by a lot. I'm unable to get a stable CPU overclock and my combined scores are just bad. Not sure what's going on there.

http://www.3dmark.com/3dm/8166433

Haven't even played a single game yet, but my graphics scores are similar to what I've seen around here. I doubt I'll attempt a GPU overclock until I figure out the CPU issues.


----------



## cbarros82

Quote:


> Originally Posted by *Geoclock*
> 
> New Downclocked MSI R9 390 is out
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127875&cm_re=r9_390-_-14-127-875-_-Product


Same price I paid for the regular R9 390 MSI Gaming 8G... so not really a deal, plus it will most likely be lower binned.


----------



## cbarros82

PowerColor would be a good idea because it uses the reference PCB, so you can use a 290/290X water block.


----------






## PrinceLUDA21

Ok, so I just spent the better part of my day reading pretty much this entire thread, on top of doing a bunch of research last week.

I too picked up the MSI 390 just recently (like Tuesday). I did a bunch of research on all the manufacturers before deciding to stick with MSI as I'd intended.... Anyway, long story short: I see everyone keeps saying there are no water blocks for this card, which has me confused because of this....

https://shop.ekwb.com/ek-fc-r9-290x-nickel-rev-2-0

Can anyone tell me if this is true and whether there are now full blocks? Because it seems by popular demand this has come true. I need to know before I start this custom loop and buy a second card.

Also, I'll post benchmarks and all that later so I can be added to the group.


----------



## thegamehhh

Quote:


> Originally Posted by *CerealKillah*
> 
> One more vote for airflow.
> 
> Try running the computer with the side off and check temps. If they are still hot, it might be a bad card. If temps drop quite a bit, it is most likely airflow. Depending on your case, you may or may not have good options to overcome.


Well, with the side panels off I am running at about 85°C, which is still way too high,

and the fan was running at 100% speed..

Never mind guys, there was something wrong with my case; now I am getting ~75°C under stress, so it's not that bad.
Thanks for your help and responses!


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You and me both man.
> 
> I'm actually considering water loop for the first time, but nobody makes one for me to get started.
> 
> I ONLY wanted the MSI for its looks to begin with, but now I only want to run MSIs (getting a second for crossfire) because, based on results, they are clearly the highest-binned cards available.


That's also my impression from reading benchmarks, overclocks and reviews.
The Sapphire Nitro 390 seems to run the coolest and about as quiet as the MSI R9 390, while the MSI seems to be the best for overclocking (highest binned). Some people even have problems going over a 1060MHz core clock on the Nitro without adjusting voltage.
Interestingly, the Asus R9 390 seems to be a mixture of both (3 fans and well binned), though I read somewhere it doesn't have good VRAM cooling.

And I second the point about the looks. Love my little dragon giving my blue-lit case a little touch of red/white light.


----------



## AverdanOriginal

Quote:


> Originally Posted by *CerealKillah*
> 
> Having fun with the waterblock.
> 
> I am currently at 1100/1600 with stock voltages.
> 
> http://www.3dmark.com/3dm/8167513?
> 
> Why does my 390 keep showing up as a 290x?


Had the same problem. Update 3DMark to the latest version; then you will see the correct graphics card, and you might even get a better result.


----------



## AverdanOriginal

Quote:


> Originally Posted by *CamsX*
> 
> 
> 
> So I'm finally able to join the owners club. Not too happy with the out-of-the-box scores, as I was hoping to be closer to a 10k score in Firestrike, but I think my CPU is holding me back by a lot. I'm unable to get a stable CPU overclock and my combined scores are just bad. Not sure what's going on there.
> 
> http://www.3dmark.com/3dm/8166433
> 
> Haven't even played a single game yet, but graphic scores are similar to what I've seen around here. Doubt I'll attempt a GPU overclock until I figure the CPU issues.


Update your 3DMark to get the correct card reading. Look at my score below: with updated 3DMark it shows the correct card. My CPU is about 6 years old, overclocked from 2.8GHz to 3.7GHz, and I am getting close to 10k even though my CPU is much worse than yours (and I only have PCIe 2.0 x16 on my board).


----------



## phenom2955

I am just curious whether anybody can give me an explanation here.

I went through 2 XFX R9 390s. Took the first one back because it was overheating, and then took the second one back to exchange for a GTX 970. Over the month that I used the R9 390 I played at least a little bit of every game I could throw at it.

GTAV, Shadow of Mordor with HD textures, Assassin's Creed Unity, Watch_Dogs (played awfully on the AMD card, plays great maxed out now), The Witcher 3, The Witcher 2, Batman AK, all 3 Crysis games (played them all the way through on the 390), and a few other games I can't think of right now.

Anyway, during that time I could not get a single game to break the 3.2-3.5GB VRAM usage range (according to the MSI AB OSD), no matter what settings or resolution I tried. I tried all of them with maxed-out settings and even tried 1440p and 3K resolutions, and simply could not get the card to use more VRAM than that.

Now with my new EVGA GTX 970 I am hitting 3.8GB VRAM usage in Watch_Dogs, and in Shadow of Mordor maxed out with the HD texture pack it reads 5.5GB VRAM as I am playing right now. SoM plays at 60+ FPS, around 75-80 average, with maxed settings at 1080p. Watch_Dogs varies from 35-75 FPS, averaging mid-40s to low 50s, same thing maxed out at 1080p. Other games like The Witcher still barely break 1.5-2GB VRAM usage.

I am just a bit confused by this: why is the card with 8GB of VRAM using less of it, while the card that supposedly only has 4GB is utilizing it better and even exceeding it (is that shared with system memory or something)?


----------



## AverdanOriginal

Quote:


> Originally Posted by *phenom2955*
> 
> I am just curious here if anybody can give me an explanation here.
> 
> I went through 2 XFX R9 390's. Took the first one back because it was overheating, and then took the second one back to get exchange for a GTX 970. Over the month that I used the R9 390 I played every game I could throw at it at least a little bit.
> 
> GTAV, Shadow of Mordor with HD Texture, Assassins Creed Unity, Watch Dogs(played awful on the AMD card, played great maxed out now), The Witcher 3, The Witcher 2, Batman AK, All 3 Crysis games(played them all the way through on the 390) and even tested a few other games as well I cannot think of right now.
> 
> Anyways during that time I could not get a single game to break the 3.2GB-3.5GB VRAM usage range(according to MSI AB OSD) no matter what settings I tried or resolution. I tried all of them with Maxed out settings and even tried 1440p and 3k resolutions and simply could not get the card to utilize more than that in VRAM.
> 
> Now with my new EVGA GTX 970 I am hitting 3.8GB VRAM usage in Watch_Dogs, and in Shadow of Mordor maxed out with the HD texture pack it reads 5.5GB VRAM as I am playing right now. SoM plays at 60+ FPS, around 75-80 average, with maxed settings at 1080p. Watch_Dogs varies from 35-75 FPS, averaging mid-40s to low 50s, same thing maxed out at 1080p. Other games like The Witcher still barely break 1.5-2GB VRAM usage.
> 
> I am just a bit confused by this though, why is the card with 8GB Vram using less of it, while the card that supposedly only has 4GB is utilizing it better and exceeding it(is that shared memory with system memory or something?).


Interesting point and a difficult question. I actually have a test on this topic in the latest PC Games Hardware magazine lying at home. They checked 4-5 games on 4-5 different cards (sadly, neither the GTX 970 nor the R9 390 is included in the test) for their usage of VRAM, RAM and virtual memory. They tested only at 1080p, but with everything else on max/ultra. The only games that broke the 4GB barrier were Batman: Arkham Knight and GTAV (and only on the cards that have more than 4GB).
Interestingly, and that's where I get back to your question, each card seemed to distribute the needed memory differently between RAM, VRAM and virtual memory.

If you guys want, I can get more details later on.

Now, why the GTX 970 shows 5.5GB I don't know; I would imagine that is a software reading fault. Maybe try another tool to read the VRAM usage?
And why does the GTX 970 need more VRAM than the R9 390? It's just a guess, but since the R9 390 has double the memory interface width of the GTX 970 (512 bits vs 256 bits), it can swap data in and out more quickly and so may need to keep less stored in between. It's just a wild guess and I could be totally wrong though.
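To put rough numbers on that bus-width point: peak bandwidth is bus width times the effective per-pin data rate. Using the advertised rates (6 Gbps for the 390's GDDR5, 7 Gbps for the 970's), a quick sketch:

```python
def bandwidth_gb_per_s(bus_width_bits: int, effective_gbps: float) -> float:
    """Peak memory bandwidth: bus width (bits) * per-pin data rate (Gbit/s) / 8 bits per byte."""
    return bus_width_bits * effective_gbps / 8

print(bandwidth_gb_per_s(512, 6.0))  # R9 390: 384.0 GB/s
print(bandwidth_gb_per_s(256, 7.0))  # GTX 970: 224.0 GB/s (full-speed partition)
```

So the 390 has quite a bit more raw bandwidth even at a lower memory clock; whether that explains the different VRAM usage readings is still a guess.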


----------



## flopper

Quote:


> Originally Posted by *phenom2955*
> 
> I am just curious here if anybody can give me an explanation here.
> .


Some games take notice of how much VRAM you have and lower settings / load textures accordingly.
Software readings can be deceiving because of how games load textures (some are loaded but not used directly), and because of how the driver works with the game depending on the card's technology.
It's really difficult at times to compare across different brands and technologies.
4GB of HBM allows roughly the same as 6GB of GDDR5.
Previously the solution was to add more GB of VRAM because the bandwidth optimization needed it.
HBM's wider bus means AMD can change how textures are loaded (versus GDDR5) and swap them back and forth faster.

Exactly why it shows up like that for you, I am sure someone knows precisely.


----------



## Agent Smith1984

Quote:


> Originally Posted by *phenom2955*
> 
> I am just curious here if anybody can give me an explanation here.
> 
> I went through 2 XFX R9 390's. Took the first one back because it was overheating, and then took the second one back to get exchange for a GTX 970. Over the month that I used the R9 390 I played every game I could throw at it at least a little bit.
> 
> GTAV, Shadow of Mordor with HD Texture, Assassins Creed Unity, Watch Dogs(played awful on the AMD card, played great maxed out now), The Witcher 3, The Witcher 2, Batman AK, All 3 Crysis games(played them all the way through on the 390) and even tested a few other games as well I cannot think of right now.
> 
> Anyways during that time I could not get a single game to break the 3.2GB-3.5GB VRAM usage range(according to MSI AB OSD) no matter what settings I tried or resolution. I tried all of them with Maxed out settings and even tried 1440p and 3k resolutions and simply could not get the card to utilize more than that in VRAM.
> 
> Now with my new EVGA GTX 970 I am hitting 3.8GB VRAM usage in Watch_Dogs, and in Shadow of Mordor maxed out with the HD texture pack it reads 5.5GB VRAM as I am playing right now. SoM plays at 60+ FPS, around 75-80 average, with maxed settings at 1080p. Watch_Dogs varies from 35-75 FPS, averaging mid-40s to low 50s, same thing maxed out at 1080p. Other games like The Witcher still barely break 1.5-2GB VRAM usage.
> 
> I am just a bit confused by this though, why is the card with 8GB Vram using less of it, while the card that supposedly only has 4GB is utilizing it better and exceeding it(is that shared memory with system memory or something?).


The 390 has a MUCH FASTER frame buffer. It could be that the buffer refreshes data at a rate where usage never exceeds a certain mark, while the slower buffer on the 970 tends to store a bit more to compensate for its lower bandwidth. This is basically the same reason 4GB of HBM can do the work of 6GB of GDDR5.....

Also, if you are exceeding 3.5GB of VRAM on the 970, you are killing your minimum framerates.... those cards only have 3.5GB of full-speed VRAM, plus a second 512MB partition that is majorly gimped.

This is why I was shocked to hear you went from a 390 to a 970, with so many going the other way around...
No offense to the 970 though. At launch, that card was the best bang for the buck we had ever seen. It's disappointing that NVIDIA misrepresented the VRAM so badly, though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *PrinceLUDA21*
> 
> Ok so I just spent the better part of my day reading pretty much this entire thread. On top of doing a bunch of research last week.
> 
> I too (just recently, like Tuesday) picked up the MSI 390. I did a bunch of research on all manufactures before I decided to stick with MSI as I intended on.... Anyway long story short. I see everyone keeps saying there are no water blocks for this card. Which has me confused because of this....
> 
> https://shop.ekwb.com/ek-fc-r9-290x-nickel-rev-2-0
> 
> Can anyone tell me if this is true and that there are now Full Blocks? Because it seems by popular demand this has come true. I need to know before I start this custom loop and buy a second card.
> 
> 
> 
> Also I post benchmarks and all that later so that I can be added to the group.


Read carefully and you will see that it works with the MSI 290, and with all other "reference" 290 and 390 boards.

The MSI 390 board is non-reference, and according to the head guy at EK, there will NOT be an MSI 390 water block made.
However, the popularity of these cards is increasing quickly, and it would not hurt for ALL OF US to let them know our interest in water blocks. It could prompt them to produce one....


----------



## MazrimCF

My stock clocks are 1040/1500 as reported by GPU-Z, Afterburner, and 3DMark. I am not sure why, as the listing on TigerDirect said 1060/1525.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MazrimCF*
> 
> My stock clocks are 1040/1500 as reported by GPU-Z, Afterburner, and 3DMark. I am not sure why, as the listing on TigerDirect said 1060/1525.


This is normal.
Install the MSI Gaming App and set it to "OC Mode"; it will set your default clocks to 1060/1525.

I chose to ignore the Gaming App and just OC with Afterburner, since my goal was to go well past the stock clocks anyway.

Sadly, there is not much info on this in the included packaging, so many people are giving these cards bad reviews on Newegg saying the claimed clock speeds were a lie, when in reality the card has three "modes", of which the default Gaming mode clocks are 1040/1500 and OC mode clocks are 1060/1525.

Note: I use the advertised OC mode clocks as the stock clocks for all MSI cards on our list.


----------



## MazrimCF

Thanks for the information. I was ready to ask for an RMA myself lol.


----------



## Agent Smith1984

I'm pretty anxious to see a review of this MSI 390 LE card.....

It's got a 1010MHz core like many of the other 390s, but it's the same price as the regular one. I am guessing the original will go up to $339.99 now, but it may be worth the $10 if the binning is better on the original. MSI's binning seems to be a step above everybody else's in the 390 series, but I am guessing they are running into a large number of chips that won't break 1090 on stock voltage, and those are being used for the LE model.

If Newegg doesn't get the original back in stock, I can order one from Tiger for $340. I'll pay the $10 to break 1160 on the core (with 1200MHz fairly common, and my current card able to do 1190).


----------



## CerealKillah

Well, I think my aging Rosewill HIVE 650 is causing problems. It is 4 years old now and is getting long in the tooth.

I am getting pretty good overclocks on the XFX 390 at stock voltage. If I increase the voltage at all to the 390, both the CPU and GPU become unstable.

I have ordered a Sentey 850 Watt Modular power supply off of Amazon. Let's see if that improves the picture at all for me.

Long story short, no amazing overclocks until this weekend


----------



## battleaxe

Quote:


> Originally Posted by *CerealKillah*
> 
> Well, I think my aging Rosewill HIVE 650 is causing problems. It is 4 years old now and is getting long in the tooth.
> 
> I am getting pretty good overclocks on the XFX 390 at stock voltage. If I increase the voltage at all to the 390, both the CPU and GPU become unstable.
> 
> I have ordered a Sentey 850 Watt Modular power supply off of Amazon. Let's see if that improves the picture at all for me.
> 
> Long story short, no amazing overclocks until this weekend


Deleted. My post made no sense.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> Well, I think my aging Rosewill HIVE 650 is causing problems. It is 4 years old now and is getting long in the tooth.
> 
> I am getting pretty good overclocks on the XFX 390 at stock voltage. If I increase the voltage at all to the 390, both the CPU and GPU become unstable.
> 
> I have ordered a Sentey 850 Watt Modular power supply off of Amazon. Let's see if that improves the picture at all for me.
> 
> Long story short, no amazing overclocks until this weekend


That's strange....

The XFX should clock pretty well under water with some extra voltage. Let us know how it turns out....

How far did you make it on stock voltage? I know you were at 1100/1600 last I read.... That's a pretty typical stock-voltage clock for these, so I'm guessing the water won't come in handy until the voltage is bumped way up.


----------



## phenom2955

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The 390 has a MUCH FASTER frame buffer. It could be that the buffer is refreshing information at a rate in which the usage never exceeds a certain mark, while the slower buffer on the 970 tends to store a bit more information to compensate for it's slower bandwidth. This is basically the same reason why 4GB HBM can do the work of 6GB GDDR5.....
> 
> Also, if you are exceeding 3.5GB VRAM on 970, you are killing your minimum framerates.... those cards only have 3.5GB actual VRAM, and then a second 512MB VRAM partition that is majorly GIMPED.
> 
> This is why I was shocked to hear you went from a 390 to a 970 with so many going the other way around...
> No offense to the 970 though. That card at launch, was the best bang for the buck we had ever seen. Disappointing that NVIDIA misrepresented the VRAM so badly though.


Alright, well, to tell the truth, so far with the 970 I have actually had better framerates in all of the games I am currently playing. Not saying that won't change on a case-by-case basis or when newer games come out, etc., but right now it seems to be performing a little better.

The main reason I decided to switch, though, was the cooler running of the 970. I have pretty much always used AMD cards since I started PC gaming, and they always run very HOT. Right now I am using an old recycled Antec mid-tower case that frankly sucks, and the 390 was toasty warm inside it with the fans cranked to 100% all the time. The 970 stays very cool compared to the 390 and is running great. I really don't think I could go wrong with either card, though; the 390 and 970 are by far the best cards I have ever had. Can't really complain about either of them.

BTW, maybe it was a software error, but I noticed in GPU-Z my max memory usage was 444002991 or some such ridiculous number, and just now in The Witcher 3 the MSI Afterburner OSD was reading something like that. Weird.


----------



## Agent Smith1984

Quote:


> Originally Posted by *phenom2955*
> 
> Alright well to tell the truth, so far with the 970 I have actually had better framerates in all of the games I am currently playing. Not saying that won't change in a case by case basis or when newer games come out ect... But right now it seems to be performing a little better.
> 
> The main reason I decided to switch though was for the cooler functioning of the 970, in my experience I have typically always used only AMD cards since I have used a PC for gaming and they always run very HOT. In my case right now I am using an old recycled Mid Tower Antec case that frankly sucks, and the 390 was toasty warm inside it with fans cranked to 100% all the time. The 970 stays very cool compared to the 390 and is running great, I really don't think I could go wrong with either card though, the 390 and 970 are by the best cards I have ever had. Can't really complain about either of them.
> 
> BTW maybe it was software error, I noticed in GPU-Z my max Memory usage was 444002991 or some such ridiculous number, and just now in The WItcher 3 it was reading something like that from MSI afterbuner OSD. Wierd.


What titles are you seeing the 970 perform faster in, and what settings/resolution are you using?


----------



## phenom2955

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What titles are you seeing the 970 perform faster in, and what settings/resolution are you using?


Shadow of Mordor, The Witcher 3, Crysis 3 (in particular one scene towards the beginning where you look through the windows and there's a lot of rain outside; it used to dip to the low 20's with the 390, now it stays mid 30's to low 40's on that part...), and Batman AK (only played this a little bit).

However, the major improvement comes in Watch_Dogs. With the 390 with everything on Ultra I was dropping to the low 20's constantly; eventually I found out that turning the "Level of Detail" setting to low allowed the 390 to stay above 30 with everything else maxed. Now I can play with everything maxed out, even the "Level of Detail" setting, at 40+ with some dips around 35.

Also, in Crysis 3 with the 390 I had to play on High settings to maintain 40+ FPS with some dips into the 30's; with the 970 I can play all maxed out and stay above 40, with some dips, of course, into the mid 30's for a few seconds in some scenes.

All games I am playing with maxed-out settings @ 1920x1080. I don't ever plan to go above that resolution anytime soon (not even within the next couple of years), so I guess the 970 suits me well, and I don't plan to SLI or Crossfire anytime soon either (unless I come across a card cheap enough that I can justify getting it).

Don't get me wrong though, I loved the 390, and I have only had the 970 two days now, but so far I think the 970 is suiting me a little better (especially in regards to temps).

I was, however, getting a higher 3DMark graphics score with the 2nd 390 that I owned (the one I exchanged the original for). The 3DMark graphics score with my 390 @ 1100 core on stock voltage was 13k. The 970 is getting 12.7k.

Current 3DMark score: http://www.3dmark.com/3dm/8163744

Also, the last night I had the 390 I decided to start OC'ing my CPU and noticed that the 390 benefited a lot more from the CPU overclock than the 970 does. It went from an 11.5k-ish graphics score in 3DMark to 13k.


----------



## CamsX

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Update your 3D Mark to get the correct card reading. Look at my score below. updated 3D Mark, and it shows the correct card. My CPU is like 6 years old and overclocked it from 2.8 GHz to 3.7 GHz and I am getting closer to 10k event though my cpu is much worse then yours *(actually I only have PCI-e 2.0x16 on my board*


I couldn't agree more. It is so frustrating. Not sure if it is the CPU, mobo, RAM, PSU, drivers, or a combination of those. I'll turn on all standard BIOS features and bring everything back to true stock settings just to compare.

I know it is not the card, because the graphics scores are on par with other stock-clocked scores around here.

Fitment is great. I thought it was going to be much bigger, but it fits great in my NZXT case. It also runs very cool, at 66°C in benchmarks and games with fans under 60% speed. They only start to get a bit noisy after 70%, and even then they are quieter than the fans on the 7970. My PC is not dead silent, but neither is it annoyingly loud.

I think in that respect the Nitro is a good balance between neutral looks, heat, noise, and performance, on stock clocks that is.


----------



## Agent Smith1984

Quote:


> Originally Posted by *phenom2955*
> 
> Shadow of Mordor, The Witcher 3, Crysis3(in particular one scene towards the begining where you look through the windows and theres a lot of rain outside, used to dip low 20's with the 390, now it stays mid 30's low 40's on that part...), Batman AK(only played this a little bit).
> 
> However the major improvement comes in Watch_Dogs. With the 390 with everything on Ultra I was dropping low 20's constantly, eventually I found out that turning "Level of Detail" setting to low allowed the 390 to stay above 30 with everything else maxed. However now I can play with everything maxed out even "Level of Detail" setting with 40+, some dips around 35.
> 
> Also in Crysis 3 with the 390 I had to play on High Settings to maintain 40+ FPS with some dips into 30's, with the 970 I can play all Maxed out and stay above 40, with some dips of course into mid 30s for a few seconds in some scenes.
> 
> All games I am playing with maxed out settings @ 1920x1080p. I don't ever plan to go above that resolution anytime soon(not even within the next couple years) so I guess the 970 suits me good and I don't plan to Sli or Crossfire anytime soon either(unless i can come accross a cheap that I card that I can justify getting).
> 
> Don't get me wrong though, I loved the 390, and I have only had the 970 2days now, but so far I think the 970 is suiting a little better(especially in regards to temps.
> 
> I was however getting a Higher 3dmark Graphics score with the 2nd 390 that I owned(the one I exchanged the original for). 3DMark graphics score with my 390 @ core 1100 with stock voltage was 13k. The 970 is getting 12.7k.
> 
> Current 3DMark score: http://www.3dmark.com/3dm/8163744
> 
> Also the last night I had the 390 I decided to start OC'ing my CPU and noticed that it Benefited a lot more from the CPU overclock than the 970. It went from 11.5k'ish Graphics Score in 3DMark to 13k.


Those are terrible in-game results, buddy... something was wrong for you to have been getting performance that bad from a 390.... That card dominates 1080p, with max settings, in almost any title you can name!

See below:







That's three games, from three different sources, all showing the 390's destroying the 970.

I wish you would have come to us for help before selling off a faster card in exchange for a slower one...


----------



## Geoclock

Should I buy a Bitcoin-mined Sapphire 290X for $200?


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> I couldn't agree more. It is so frustrating. Not sure if it is the CPU, Mobo, RAM, PSU, drivers or a combination of those. I'll turn on all standard bios features and bring everything back to true stock settings just to compare.
> 
> I know it is not the card, because the Graphic Scores are on par with other stock clocked scores around here.
> 
> Fitment is great. I thought it was going to be much bigger, but fits great in my nzxt case. Runs also very cool, at 66°C on benchmarks and games with fans under 60% speed. They only start to get a bit noisy after 70%, and even then they are more quiet that the fans on the 7970. My PC is not dead silent but neither is annoyingly loud.
> 
> I think on that aspect, the Nitro is a good balance between neutral looks, heat, noise and performance, on stock clocks that is.


Looks good my friend!!

Very clean build, and not too busy, with plenty of room and flow for a second one later









I like how the angles on the card complement the "Z" on your power supply


----------



## Agent Smith1984

Quote:


> Originally Posted by *Geoclock*
> 
> Should i buy Bitcoin Mined Sapphire 290x for $200 ?


Are they reference?

I'd try to get them for $180 if they are reference.

If they are Tri-X or Vapor-X models, then that's a good price


----------



## phenom2955

Hmm, I don't know. Shadow of Mordor played great on the 390, always 60+ and averaging around the 70's-80's just like above, but the 970 does that as well. Some games, such as Crysis 3 and especially Watch_Dogs, just played terribly for me on the 390 compared to the 970. All in all, though, they seem pretty close to me.

I am still a bit worried about the memory difference and 256-bit vs 512-bit (not 100% sure what it means from a technical standpoint other than that the memory runs faster), but like I said, I think either of the two cards will keep me happy. Best cards I have ever had; I never owned a higher-end card before and usually bought low-to-mid-range AMD cards. Up until now I had always used ATI/AMD cards because they used to be so much cheaper than NVIDIA while performing just as well. Since they are in about the same price range now, I guess it just comes down to whichever one works best for you. I have actually owned only two NVIDIA cards: the 750 Ti that I upgraded from to the 390, and now this 970.

And of course, if all else fails, I still have plenty of time to return this card and exchange it back for a 390 via Best Buy's 15-day return policy.
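On the 256-bit vs 512-bit question above: that number is just the width of the memory bus, and peak theoretical bandwidth is bus width times the effective memory data rate. A rough sketch of the arithmetic (the effective clocks below are the commonly quoted reference figures and are assumptions here, not readings from anyone's card):

```python
# Rough memory-bandwidth comparison: R9 390 (512-bit) vs GTX 970 (256-bit).
# Clocks are the commonly quoted effective GDDR5 data rates, assumed for
# illustration only.

def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Peak theoretical bandwidth in GB/s = (bus width in bytes) * data rate."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

r9_390 = bandwidth_gb_s(512, 6000)   # 512-bit bus, 1500 MHz GDDR5 (6 GHz effective)
gtx_970 = bandwidth_gb_s(256, 7000)  # 256-bit bus, 1750 MHz GDDR5 (7 GHz effective)

print(f"R9 390:  {r9_390:.0f} GB/s")   # 384 GB/s
print(f"GTX 970: {gtx_970:.0f} GB/s")  # 224 GB/s
```

Note this ignores the 970's delta color compression, which narrows the real-world gap, and the gimped last 512MB partition mentioned earlier in the thread.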


----------



## battleaxe

I've got a G1 970 that does 1600+ core and 8000 mem that I'm willing to trade for a Sapphire Tri-X 390. Betting I don't get any takers either.









Edit: yes, of course that's a joke. I know no one would do that...


----------



## Alfshizzle

Due to being hacked off with NVIDIA's crap drivers having tons of issues that they are unwilling to fix, I'm contemplating moving over to AMD.

Question is, does the 390X do chroma subsampling?

Can it do 4K 60Hz over HDMI with reduced colour?

Anyone running Crossfire 390X's? Any issues?

Looking for a swift, 100% accurate reply; hopefully someone here knows before I spend £350 on a card just to find out if it can do it.

Thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I've got a G1 970 that does 1600+ and 8000mem I'm willing to trade for a Saphire TriX 390. Betting I don't get any takers either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: yes, of course that's a joke. I know no-one would do that...


Honestly, a 1600/8000 970 is a nice ass card bro!!

Nobody would trade I don't think, but I'm sure someone on the market place would pay $280-300 for it.









That thing will scream at 1080P!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alfshizzle*
> 
> Due to being hacked off with the crap nvidia drivers having tons of issues that they are unwilling to fix, im contemplating moving over to amd.
> 
> Question is does the 390x do chroma subsampling? *No clue, try google*
> 
> Can it do 4k 60hz over hdmi with reduced colour? *No, you will need to use DP, or DP to HDMI 2.0 adapter for 60Hz*
> 
> Anyone running crossfire 390x? Any issues? *The only known issues are with temps on the top card being high, but as far as driver/gameplay stuff, everything seems great!*
> 
> Looking for a swift 100% accurate reply, hopefully someone here knows before i spend £350 quid on a card just to find out if it can do it.
> 
> Thanks


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's strange....
> 
> The XFX should clock pretty good under water with some extra voltage. Let us know how it turns out....
> 
> How far did you make it on stock voltage? I know you were at 1100/1600 last I read.... That's a pretty typical stock voltage clock for these, so I'm guessing the water won't come in handy until the voltage is bumped way up.


I haven't pushed the memory over 1600 yet, but I was benching at 1125/1600 on stock voltage before I quit messing with it and played a few games.

If I apply any extra voltage to the GPU or increase the power limits, both the GPU and CPU become unstable. That power supply is 4 years old and only 650 watts, so I am not TOO upset about upgrading to a new 850-watt unit.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> I haven't pushed the memory over 1600 yet, but was benching at 1125/1600 core at stock voltages before I quite messing with it and played a few games.
> 
> If I apply any extra voltage to the GPU or increase the power limits, both the GPU and CPU become unstable. That power supply is 4 years old and only 650 Watts. I am not TOO upset that I am upgrading to a new 850Watt.


Yeah, definitely sounds like the power supply to me too.

850W should do you just fine. I even ran two 290's on my semi-cheap 850W PSU with no issues.

1125 is pretty nice on stock voltage. I'd be looking to see what +200mV in TriXX yields you.

Most of the 290 owners are able to get into the 1220-1300 range with that much voltage on water, provided the VRM temps are sub-50°C.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, a 1600/8000 970 is a nice ass card bro!!
> 
> Nobody would trade I don't think, but I'm sure someone on the market place would pay $280-300 for it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That thing will scream at 1080P!


True enough. But my 290x beats it handily. Costs me less too. And my 290 is only about 3% behind even though the 970 clocks like a beast. Go figure...


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> True enough. But my 290x beats it handily.


Even overclocked that much?

Guess that newest driver really helped the 290 along. I know at release the 970 in stock form was slightly edging out the 290's at 1080p, and in some cases at 1440p, so I figured an OC that high would tip the scale. Good to know


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Even overclocked that much?
> 
> Guess that newest driver really helped the 290 along. I know at release, the 970 in stock form was slightly edging the 290's at 1080 and in some cases at 1440, I figured an OC that high would tip the scale. Good to know


Yup.


----------



## Alfshizzle

Quote:


> Originally Posted by *Agent Smith1984*


Thanks, I have tried Google and there is no info. However, if it can do subsampling, it will output 4K 60Hz over HDMI.

Looks like I'm going to have to spend money to find out.


----------



## battleaxe

Quote:


> Originally Posted by *Alfshizzle*
> 
> Thanks, have tried google and there is no info. However if it can do subsampling it will output 4k 60hz over hdmi.
> 
> Looks like im going to have to spend money to find out.


Any reason you can't use DP? Not on the monitor, I guess?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alfshizzle*
> 
> Thanks, have tried google and there is no info. However if it can do subsampling it will output 4k 60hz over hdmi.
> 
> Looks like im going to have to spend money to find out.


You can do 3200x1800 at 60Hz over HDMI, I believe









Not sure at all on the subsampling thing.


----------



## battleaxe

Quote:


> Originally Posted by *Alfshizzle*
> 
> Thanks, have tried google and there is no info. However if it can do subsampling it will output 4k 60hz over hdmi.
> 
> Looks like im going to have to spend money to find out.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> You can do 3200x1800 at 60hz over HDMI I believe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure at all on the sub sampling thing.


Yeah, I seem to remember something about this. I don't think AMD has the same level of HDMI output as NVIDIA; this is an area where AMD has gone more toward DP, if memory serves. I read about this a bit recently, but I'm not sure where. So you may not be able to get 4K 60Hz over HDMI...


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, I seem to remember something about this. I don't think AMD has the same level of HDMI out as Nvidia. This is an area where AMD has gone more toward DP if memory serves. I read about this a bit recently, but I'm not sure where it was. So, you may not be able to get 4k 60hz on HDMI...


AMD should have done something funny like, require (2) cables be used for 4k 120hz using DP......









If you don't get that joke, I am sorry....

If you do get that joke, and find it offensive, I am NOT sorry.


----------



## Alfshizzle

Quote:


> Originally Posted by *battleaxe*
> 
> Any reason you can't use DP? Not on monitor I guess?


Nope, a TV that says it has HDMI 2.0, but in fact it can only do 4K 60Hz with YCbCr 4:2:0 chroma subsampling..

So looks like NVIDIA only..
Quote:


> Originally Posted by *Agent Smith1984*
> 
> You can do 3200x1800 at 60hz over HDMI I believe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure at all on the sub sampling thing.


Thanks


----------



## gatygun

Quote:


> Originally Posted by *phenom2955*
> 
> I am just curious here if anybody can give me an explanation here.
> 
> I went through 2 XFX R9 390's. Took the first one back because it was overheating, and then took the second one back to get exchange for a GTX 970. Over the month that I used the R9 390 I played every game I could throw at it at least a little bit.
> 
> GTAV, Shadow of Mordor with HD Texture, Assassins Creed Unity, Watch Dogs(played awful on the AMD card, played great maxed out now), The Witcher 3, The Witcher 2, Batman AK, All 3 Crysis games(played them all the way through on the 390) and even tested a few other games as well I cannot think of right now.
> 
> Anyways during that time I could not get a single game to break the 3.2GB-3.5GB VRAM usage range(according to MSI AB OSD) no matter what settings I tried or resolution. I tried all of them with Maxed out settings and even tried 1440p and 3k resolutions and simply could not get the card to utilize more than that in VRAM.
> 
> Now with my new EVGA GTX 970 I am hitting 3.8GB VRAM usage in Watch_Dogs and in Shadow of Mordor maxed out with HD Texture Pack its reading 5.5GB VRAM as I am playing right now. SoM is playing 60+ FPS, around 75-80 Average wtih maxed settings and 1080p. Watch_Dogs varies from 35-75FPS. Averages in mid 40s to lower 50s though, same thing maxed out 1080p. Other games like The Witcher still barely break 1.5gb-2gb VRAM usage though.
> 
> I am just a bit confused by this though, why is the card with 8GB Vram using less of it, while the card that supposedly only has 4GB is utilizing it better and exceeding it(is that shared memory with system memory or something?).


The reported memory usage is a gamble entirely, though; you are better off just playing a game and seeing whether VRAM becomes an issue than trusting the numbers in MSI Afterburner, for example.

I believe the 970 also uses some form of texture compression technology, which makes its lower-speed bus still fast enough.
Quote:


> Originally Posted by *phenom2955*
> 
> Shadow of Mordor, The Witcher 3, Crysis3(in particular one scene towards the begining where you look through the windows and theres a lot of rain outside, used to dip low 20's with the 390, now it stays mid 30's low 40's on that part...), Batman AK(only played this a little bit).
> 
> However the major improvement comes in Watch_Dogs. With the 390 with everything on Ultra I was dropping low 20's constantly, eventually I found out that turning "Level of Detail" setting to low allowed the 390 to stay above 30 with everything else maxed. However now I can play with everything maxed out even "Level of Detail" setting with 40+, some dips around 35.
> 
> Also in Crysis 3 with the 390 I had to play on High Settings to maintain 40+ FPS with some dips into 30's, with the 970 I can play all Maxed out and stay above 40, with some dips of course into mid 30s for a few seconds in some scenes.
> 
> All games I am playing with maxed out settings @ 1920x1080p. I don't ever plan to go above that resolution anytime soon(not even within the next couple years) so I guess the 970 suits me good and I don't plan to Sli or Crossfire anytime soon either(unless i can come accross a cheap that I card that I can justify getting).
> 
> Don't get me wrong though, I loved the 390, and I have only had the 970 2days now, but so far I think the 970 is suiting a little better(especially in regards to temps.
> 
> I was however getting a Higher 3dmark Graphics score with the 2nd 390 that I owned(the one I exchanged the original for). 3DMark graphics score with my 390 @ core 1100 with stock voltage was 13k. The 970 is getting 12.7k.
> 
> Current 3DMark score: http://www.3dmark.com/3dm/8163744
> 
> Also the last night I had the 390 I decided to start OC'ing my CPU and noticed that it Benefited a lot more from the CPU overclock than the 970. It went from 11.5k'ish Graphics Score in 3DMark to 13k.


This could easily be your CPU, as NVIDIA cards have lower API overhead than AMD cards in DX11 for sure.

If you have a fast enough CPU the difference will be minimal to nothing, but with a slow CPU, NVIDIA GPUs will edge ahead for sure, especially at 1080p.


----------



## gerpogi

Hello guys! I just bought an R9 390X, and for some reason it hits 80-85°C when I play FFXIV. Are these normal temps? I play at 1440p, btw.


----------



## Dorland203

Hi. I would like to join the club.

Does anyone use Mirillis Action to record gameplay? It crashes when I set "use hardware acceleration for video encoding" to AMD APP. Does this only happen on AMD cards?

Edit: ASIC


----------



## PrinceLUDA21

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Read carefully and you will see that it works with the MSI 290, and with all other "reference" 290 and 390 boards.
> 
> The MSI 390 board is non reference, and according to HNIC at EK, there will NOT be an MSI 390 waterblock made.
> However, I must say, the popularity of these cards is increasing quickly, and it would not hurt for ALL OF US to let them know our interest in water blocks. It could prompt them to produce one....


That is a bunch of bull. I'm going to write them a letter every two weeks asking for a block. I get it if they don't want to make a new block, but come on. As you said, they are getting more and more popular.

Guess I'll have to search for someone to make a custom water block for me.


----------



## gerpogi

Quote:


> Originally Posted by *gerpogi*
> 
> hello guys! i just bought a r9 390x and for some reason it hits 80-85 celsius when i play ffxiv . is this normal temps? i play at 1440p btw.


Just to add: I'm new to AMD cards, so I don't know if it's normal, cuz I'm about to return it if it's not. My last card was a reference GTX 970 and it had lower temps than this...


----------



## pengs

Quote:


> Originally Posted by *gerpogi*
> 
> Just to add: im New to amd cards so I don't know if it's normal, cuz im about to return it if its not. My. Last card was a reference gtx 970 and it had lower temps than this...


What? Because it runs at 80°C? You guys really wouldn't have liked the reference 290X or the full Fermi lineup







It's completely fine; the GPU was designed to run up to 95°C, and you're also looking at almost double the power consumption (not quite), which needs to be dissipated.


----------



## gerpogi

Quote:


> Originally Posted by *pengs*
> 
> What? Because it's runs at 80C? You guys really wouldn't had liked the reference 290x or the full Fermi line up
> 
> 
> 
> 
> 
> 
> 
> It's completely fine, the GPU was designed to run up to 95C and your also looking at almost double the power consumption (not quite) which needs to be dissipated.


If it's normal to hit that temp on this particular model, then that makes me feel at ease. I'm just not used to hitting the 80°C line; I've been using NVIDIA cards for the longest time. I just bought this because it fits my needs: small-ish size and a very strong card.


----------



## cfcboy

Hi all, great thread... I'm new to the 390 club. So far I'm loving it, but I need a bit of advice if possible.

I'm playing on a triple 27" screen setup and getting decent FPS in games.

BF4 - 60
FC4 - 45-55
W3 - 40

These are based on high settings with a couple of things toned down (like hair and fur, which I don't care for).

I'm using a 600W Corsair PSU and a mid-tower (a bit on the small side) ATX case.

I'm not overly bothered about the FPS; if I want to double or triple it, I just go to the single 1080p IPS screen. But I believe anything beyond 60 FPS is almost a waste, as screen tearing kicks in. I've tried the 120Hz monitor route, which was good, but it wasn't worth sacrificing the IPS colours. 60Hz is great for me; it runs smooth without tearing.

When I have the triple monitors going with something like Witcher 3 on high settings, my MSI 390 hits max temps of 85°C.
I'm also thinking of trying to OC the card soon to squeeze out some extra FPS and get nearer the 60 FPS mark, but I'm a little worried the temps will start creeping up to the 90°C mark, and from what I've heard that's not great over a period of time.

Is it worth the following:

Upgrading both the PSU and case?
Sticking with what I have, doing the OC, and watching temps (assuming 85-90°C is acceptable)?
Just leaving well alone?









Any advice would be appreciated


----------



## AverdanOriginal

Quote:


> Originally Posted by *cfcboy*
> 
> When i have the triple monitor going with something like Witcher 3 on high settings my MSI 390 is hitting max temps of 85C.
> Im also thinking of trying to OC the card soon to try and squeeze out some extra FPS to get nearer the 60hz mark but im a little worried the temps are going to start creeping up to 90c mark and from what ive heard thats not great for a period of time.
> 
> Is it worth the following:
> 
> Uprgrading both PSU and Case?
> Stick with what i have, do the OC and watch temps (assuming 85-90c is acceptable)
> Just leave well alone
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any advice would be appreciated


Hi and welcome.
Love those FPS you are reaching on triple monitors, especially with Witcher 3, where everyone is reporting it is not so good on AMD cards (I can't comment, as I have not played it on my card).

I had a similar issue in the beginning, changing from NVIDIA back to AMD (I also have the MSI R9 390). Here is a checklist put together with the help of AgentSmith and others in this community (feel free to add suggestions







):

Check your *Ambient Temps*. If your room is around 21°C and you see 85°C on your card, then you need to adjust your system. If you have 36°C ambient or more and you get 85°C on your card in stressful games like Watch Dogs or The Witcher, then that is pretty normal. (I had this problem due to the hot summer.)
Check the *air flow in your system*. Do you have enough air getting pushed in AND out? Here you always get two types of suggestions: either more air in (overpressure) or more air out (underpressure). My personal suggestion is a similar amount of air in and out, and at least 100m³/h each way. So a minimum of 2x120mm fans in and 2x120mm fans out.
*Adjust the fan curve via MSI Afterburner*. Most people set it so the fan hits 100% at 90°C, starting maybe at 25% at 40°C. Mine is 0% until 50°C and then ramps up to 100% at 90°C.
Changing the PSU or motherboard if it is too old could also shave off a degree or two, but it's not worth the cost for the small advantage.
One major point for me is also to see *what each game needs* (some games just run crazy hot and can be worse than some benchmarks, especially when they just hit the market). The Witcher 2, for example, ran hot on all my different cards unless I turned on V-Sync. I have mine capped at 60 FPS (dynamic frame rate control) anyway, as from my point of view it makes no sense to render more FPS than necessary. Even at 60 FPS, The Witcher 2 would still heat up to 90°C until I turned on V-Sync... now it runs at around 78°C with ambient at 30°C.
And last but not least (and I might actually consider this myself soon), see if the *thermal paste has been applied correctly* by MSI. I have heard that some people were able to drop temps by an additional 4°C just by applying their own thermal paste.
Hope that helps. Anyone else got any additional suggestions?
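The fan-curve suggestion above amounts to a simple linear ramp between two points. Here is a minimal sketch of that idea; the 0%-below-50°C and 100%-at-90°C points come from the post above, while the function name and defaults are purely illustrative:

```python
def fan_duty(temp_c, idle_below=50.0, max_at=90.0):
    """Linear fan curve: 0% below idle_below, 100% at or above max_at,
    and a straight-line ramp in between (illustration only)."""
    if temp_c <= idle_below:
        return 0.0
    if temp_c >= max_at:
        return 100.0
    return 100.0 * (temp_c - idle_below) / (max_at - idle_below)

# e.g. fan_duty(70) -> 50.0 (halfway between 50°C and 90°C)
```

MSI Afterburner's fan curve editor does this interpolation for you between the points you drag; the sketch just makes explicit what the curve computes.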


----------



## cfcboy

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi and welcome.
> Love those FPS you are reaching on triple monitors. Especially with Witcher 3, where everyone is reporting it is not so good on AMD cards (I can't report, as I have not played it on my card).
> 
> I had a similar issue in the beginning changing from nvidia back to AMD (I also have an MSI R9 390). Here is a checklist put together with help from AgentSmith and others in this community (feel free to add suggestions):
> 
> Check your *Ambient Temps*. If your room is around 21°C and you see 85°C on your card, then you need to adjust your system. If you have 36°C ambient or more and you get 85°C on your card in stressful games like Watch Dogs or The Witcher, then that is pretty normal. (I had this problem due to the hot summer.)
> Check the *air flow in your system*. Do you have enough air getting pushed in AND out? Here you always get two types of suggestions: either more air in (overpressure) or more air out (underpressure). My personal suggestion is a similar amount of air in and out, and at least 100m³/h each way. So a minimum of 2x120mm fans in and 2x120mm fans out.
> *Adjust the fan curve via MSI Afterburner*. Most people set it so the fan hits 100% at 90°C, starting maybe at 25% at 40°C. Mine is 0% until 50°C and then ramps up to 100% at 90°C.
> Changing the PSU or motherboard if it is too old could also shave off a degree or two, but it's not worth the cost for the small advantage.
> One major point for me is also to see *what each game needs* (some games just run crazy hot and can be worse than some benchmarks, especially when they just hit the market). The Witcher 2, for example, ran hot on all my different cards unless I turned on V-Sync. I have mine capped at 60 FPS (dynamic frame rate control) anyway, as from my point of view it makes no sense to render more FPS than necessary. Even at 60 FPS, The Witcher 2 would still heat up to 90°C until I turned on V-Sync... now it runs at around 78°C with ambient at 30°C.
> And last but not least (and I might actually consider this myself soon), see if the *thermal paste has been applied correctly* by MSI. I have heard that some people were able to drop temps by an additional 4°C just by applying their own thermal paste.
> Hope that helps. Anyone else got any additional suggestions?


Thank you for a quick and detailed reply AverdanOriginal









The room where I have my PC is pretty hot; I have LED wall temp readouts and they are currently hitting 24-25°C. It's a studio room with the kitchen in it, so things like the oven heat it up.

My build is less than a week old, so it's probably not a dust issue. Getting the GPU into the case was a bit of a tight fit, shall we say, lol.
The case is a CIT Vantage Type-R Midi Mesh gaming case, black interior, 4 fans. It's actually quite "cute" and I'm not really wanting to change it if I don't have to, I must admit, but I will do so for the sake of the GPU.

I have it set up so the bottom fan is an intake, then you have the GPU, then two case-mounted 120mm fans that sit across from the GPU and the Noctua NH-U9S CPU cooler. Just below the 650W Corsair PSU is another 120mm fan. Cables aren't wrapped, but I've tried to tuck them up in the space to the right of the PSU.

I guess if I do hit 90°C and it's not a full-time thing, only in certain games, then I could just carry on? My main worry is blowing the card, not keeping it within the limits of what it can handle.

Thanks again


----------



## AverdanOriginal

Quote:


> Originally Posted by *cfcboy*
> 
> Thank you for a quick and detailed reply AverdanOriginal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The room where I have my PC is pretty hot; I have LED wall temp readouts and they are currently hitting 24-25°C. It's a studio room with the kitchen in it, so things like the oven heat it up.
> 
> My build is less than a week old, so it's probably not a dust issue. Getting the GPU into the case was a bit of a tight fit, shall we say, lol.
> The case is a CIT Vantage Type-R Midi Mesh gaming case, black interior, 4 fans. It's actually quite "cute" and I'm not really wanting to change it if I don't have to, I must admit, but I will do so for the sake of the GPU.
> 
> I have it set up so the bottom fan is an intake, then you have the GPU, then two case-mounted 120mm fans that sit across from the GPU and the Noctua NH-U9S CPU cooler. Just below the 650W Corsair PSU is another 120mm fan. Cables aren't wrapped, but I've tried to tuck them up in the space to the right of the PSU.
> 
> I guess if I do hit 90°C and it's not a full-time thing, only in certain games, then I could just carry on? My main worry is blowing the card, not keeping it within the limits of what it can handle.
> 
> Thanks again


Glad I could calm the heat worries.

AND awesome to see someone using the same CPU cooler as I am... small but practical Noctua (cools my overclocked old AMD CPU just enough).

I had a quick look at your case. Am I correct in assuming that both the fans on the side of the case (opposite the graphics card) are blowing air in? That would mean you have 3 fans intaking and one 120mm fan pulling air out. If that is the case, hot air might be piling up inside the case, unable to exit fast enough. Even though you are pushing in air at 24°C (which is not a bad room temp), you are not giving the system a chance to efficiently push out the hot air. Also, I see that the case gives you no option to exhaust hot air at the top, except through your PSU, IF the PSU fan is facing down, of course.








If that is the case, I would suggest turning the top fan on the side around to blow air out instead of in.
But like I said, I'm not 100% sure how your setup looks.
Perhaps a photo might help?


----------



## cfcboy

So you have the bottom fan, then the two on the side panel which can be switched on and off from the side. Then below the fan is the back fan (see pic below; fan not attached, but you get what I'm saying):


----------



## AverdanOriginal

Quote:


> Originally Posted by *cfcboy*
> 
> So you have the bottom fan, then the two on the side panel which can be switched on and off from the side. Then below the fan is the back fan (see pic below; fan not attached, but you get what I'm saying):


Yep those are the images I have seen online







*I meant an actual photo of your case set-up with everything installed*, so that we can see whether the upper fan actually blows air out or in, and how high your card and CPU sit in the case. Side fans are great for CrossFire/SLI, but for overall airflow they can be a bit disruptive.


----------



## cfcboy

ah ok!







lol !

I will do when i get home, thanks for your help


----------



## Agent Smith1984

Congrats to everyone on their new cards.

Remember to read through the OP, and also post proof if you want to be added to the roster!!!









I am working on some updates to the cooling section today that will go into more detail on just how important case airflow and ambient temperature are. It will also explain the normal temperature boundaries for these cards, so anyone switching from the green team isn't terrified to see they now have a card that can run at 80°C+.

We are seeing more and more new owners either facing higher temps than they expected, or higher temps than they should really be having in general.
There are solutions that do not require much effort at all, and I will try to get some more information up ASAP.


----------



## flopper

Quote:


> Originally Posted by *cfcboy*
> 
> I guess if i do hit 90c and its not full time thing, only on certain games then i could just carry on? My main worry is blowing the card, not getting it within its limits of what it can handle
> 
> Thanks again


90°C with these cards is fine.
The 290 series was designed to run at up to 95°C. However, you do want to make sure heat doesn't get out of hand inside the case.
Good fans with as few restrictions to airflow as possible do a lot.


----------



## Cannon19932006

If I wanted to get more voltage out of afterburner how would I do so?


----------



## battleaxe

Quote:


> Originally Posted by *Cannon19932006*
> 
> If I wanted to get more voltage out of afterburner how would I do so?


Click "Unlock voltage control" in the settings.


----------



## Cannon19932006

Quote:


> Originally Posted by *battleaxe*
> 
> click unlock voltage control in settings


More than +100mv I mean.


----------



## fyzzz

Quote:


> Originally Posted by *Cannon19932006*
> 
> More than +100mv I mean.


Or just use Sapphire Trixx


----------



## battleaxe

Quote:


> Originally Posted by *Cannon19932006*
> 
> More than +100mv I mean.


Over on the 290 forum, at the very beginning, there is a section on increasing the voltage in Afterburner. It's basically a .bat file. It's possible to go up to 300 millivolts in Afterburner.


----------



## Cannon19932006

Quote:


> Originally Posted by *fyzzz*
> 
> Or just use Sapphire Trixx


I want Aux control as well, I'm running my memory at 1750 and it requires a bit of that.


----------



## Cannon19932006

Quote:


> Originally Posted by *battleaxe*
> 
> over on the 290 forum at the very beginning there is a section on increasing the voltage in afterburner. It's basically a bat file. It's possible to go up to 300 millivolts in afterburner


I can't seem to get it to work; Afterburner just doesn't open with the /wi6,30,8d,10 or /wi4,30,8d,10 strings added.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> I want Aux control as well, I'm running my memory at 1750 and it requires a bit of that.


I am looking to do this also.

Tied up at work today, so if you come across a definite way, please post results.

I need the AUX for memory also, which is why I tend to shy away from Trixx, though I have discovered that you can apply the AUX in AB and it sticks, even when using Trixx afterwards.


----------



## fyzzz

Quote:


> Originally Posted by *Cannon19932006*
> 
> I can't seem to get it to work, afterburner just doesn't open with the /wi6,30,8d,10 or /wi4,30,8d,10 strings added.


Make a shortcut of MSI AB on your desktop and put "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20 (20 should give +200mV, I think) in the Target field. Launching it will not open MSI AB, but it will apply the voltage. You should then be able to open your regular MSI AB shortcut and adjust the voltage higher.


----------



## Geoclock

The XFX Black Edition R9 390 is $300 after MIR and discount.


----------



## Dorland203

Create *2* shortcuts of MSI Afterburner on the desktop.
Name them shortcut 1 and 2.
Right-click on shortcut 1 and choose Properties. In the "Target" box, add /wi6,30,8d,20. It should look like this: .exe" /wi6,30,8d,20. Notice there is a space between " and /.
Click OK to close shortcut 1's Properties.
Double click shortcut *1*. This will give you 200 mV core voltage but will *not* open MSI Afterburner.
Use shortcut *2* to open MSI Afterburner.
You can change the voltage to 150 mV by replacing 20 with 14, or 300 mV by replacing 20 with 30. Just remember to double click shortcut 1 before opening shortcut 2.
Hope this helps
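For offsets other than the ones listed, note that the last field of the /wi switch appears to be a hex register value. Assuming the voltage controller steps the offset in 6.25 mV increments (an assumption, not a verified spec, but it is consistent with the 10 → +100 mV, 20 → +200 mV and 30 → +300 mV values used in this thread), a small helper can compute the value. The function name here is made up for illustration:

```python
def offset_arg(millivolts, step_mv=6.25):
    """Convert a desired core-voltage offset in mV into the hex value
    used as the last field of Afterburner's /wi i2c-write switch.

    Assumes 6.25 mV per register step (matches 0x10 -> 100 mV,
    0x20 -> 200 mV, 0x30 -> 300 mV); treat this as a sketch, not
    a verified datasheet value for the voltage controller.
    """
    steps = round(millivolts / step_mv)
    return format(steps, "x")

# e.g. offset_arg(200) -> "20", so the Target switch would be /wi6,30,8d,20
```

As always with register-level voltage tweaks: double-check against known-good values before trying anything exotic, since a wrong write goes straight to the VRM controller.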


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dorland203*
> 
> Create *2* shortcuts of msi afterburner on the desktop.
> Name them shortcut 1 and 2.
> Right-click on shortcut 1 and choose Properties. In the "Target" box, add /wi6,30,8d,20. It should look like this: .exe" /wi6,30,8d,20. Notice there is a space between " and /.
> Click ok to close shortcut 1's Properties.
> Double click shortcut *1*.This will give you 200 mV core voltage but will *not* open msi afterburner.
> Use shortcut *2* to open msi afterburner.
> You can change the voltage to 150 mV by replacing 20 with 14,or 300 mV by replacing 20 with 30.Just remember to double click shortcut 1 before open shortcut 2.
> Hope this help


I take it these voltages do not show up in the app's slider?

Thanks a lot for the write up!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Geoclock*
> 
> The XFX Black Edition R9 390 is $300 after MIR and discount.


Thanks for the heads up!

Anyone want a well-binned 390 with access to a full-cover waterblock?
$299.99 after rebate, plus a $10 gift card!!!

HERE YOU GO!!
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150728&cm_re=xfx_390-_-14-150-728-_-Product


----------



## BlaXey

Is LEPA a good PSU brand? I'm thinking of buying a new one.


----------



## Dorland203

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I take it these voltages do not show up in the app's slider?
> 
> Thanks a lot for the write up!


If you follow the steps, when you open shortcut 2 the voltage will already be set to 200 mV (or 150/300). From this point, don't touch the slider. If you do, the voltage will drop to less than 100 and will not go back above 100 even if you pull it all the way to the right. You have to close MSI Afterburner and reopen it to set the voltage to 200 again.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BlaXey*
> 
> Lepa is a good PSU brand? I'm thinking in buy a new one.


My brother had a 1600W LEPA.... that thing was a beast....

He wasn't pushing it very hard at all, though. The most we ever had running on it was an OC'd 3820 and two highly overclocked Tahiti cards.

He only bought it because it was $80 on Craigslist, and at the time it was a $280+ PSU, I believe.

It was a nice unit, but I am personally opting to get away from the multi-rail thing on my next PSU purchase.


----------



## sinholueiro

Is the XFX 390 Black Edition better binned than the normal one, or is it just the same card with higher clocks?


----------



## Agent Smith1984

Quote:


> Originally Posted by *sinholueiro*
> 
> The XFX 390 Black Edition is better binned than the normal one or is only the same card with better clocks?


It's likely a slightly better bin, but then again, all of these cards will reach the BE's stock clocks, so they may not be binned at all.

I just know the few XFX samples we have seen so far seem to clock well. Maybe not on par with MSI, but >= Sapphire, and definitely better than most of the PowerColor cards.


----------



## Agent Smith1984

Hexus was able to get the new Devil 390X to 1190MHz on stock voltage.

If it were, say, $449.99, it would be a hell of a card, but I think $500 is a tad much....
Then again, that's still cheaper than a GTX 980, and it's water cooled. It's also much cheaper than the going rate for the Fury pro right now, and it performs within 5-8% or so of it in most cases...

I wish they'd offered a 390 Devil for $399.99. I could see them selling more of those than anything else.

Interesting card nonetheless...
http://hexus.net/tech/reviews/graphics/85436-powercolor-radeon-r9-390x-devil/?page=12


----------



## Gumbi

Wow, I'd love to see their stock voltage! 1190 on stock is _*ridiculous*_. I wonder if their VRM cooling is up to scratch too; if it is, we might see 1250+ clocks out of these bad boys on a regular basis!


----------



## Geoclock

Is there a "REASON" why R9 390 prices are dropping like flies while GTX 970 prices stay the same?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Geoclock*
> 
> Is there a "REASON" why R9 390 prices are dropping like flies while GTX 970 prices stay the same?


Have they gone down?

All I saw was the XFX with the MIR.

I'd like to see $300 on the regular.... I'm ordering my second MSI 390 within the next week, unless I am strangely compelled to go a completely different direction before then.


----------



## battleaxe

Quote:


> Originally Posted by *Geoclock*
> 
> Is there a "REASON" why R9 390 prices are dropping like flies while GTX 970 prices stay the same?


This is a great time to be on the red team I guess.

Edited: to keep from starting a war.


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I just know the few XFX samples we have seen so far, seem to clock well. Maybe not on par with MSI, but >= Sapphire, and definitely better than most of the PowerColor cards.


I resent that remark.


----------



## sinholueiro

What's the average clock without raising voltage? 1100?

Also, that PowerColor has to have the best-binned chips, for sure.


----------



## battleaxe

Quote:


> Originally Posted by *sinholueiro*
> 
> What's the average clock without raising voltage? 1100?
> 
> Also, that PowerColor has to be the best binned chips for sure


I doubt it's that high. 1075 maybe, max? It seems quite rare to get one that does 1150 on stock volts, so I doubt 1100 is the norm. Somewhere around 1075... I could be wrong though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I doubt its that high. 1075 maybe max? Seems quite rare to get one that does 1150 with stock volts. So I doubt 1100 is the norm. Somewhere 1075 ish... I could be wrong though.


I've seen between 1075 and 1120 on stock volts for the varying brands.

So far MSI is king, but that may just be because we have the most samples to look at for them.

Of course, going water changes everything.... That may be why that devil is looking so..... Devilish?


----------



## CamsX

Btw, has anyone tried the new TriXX version 5.0.0 for overclocking?

It is large, and it is missing a proper minimize button (at least I haven't found it).

It now monitors VRMs and a couple of other things that I think weren't available in version 4.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've seen between 1075 and 1120 on stock volts for the varying brands.
> 
> So far MSI is king, but that may just be because we have the most samples to look at for them.
> 
> Of course, going water changes everything.... That may be why that devil is looking so..... Devilish?


Well... then I guess that means 1100 is about the average minimum. I stand corrected.
Quote:


> Originally Posted by *CamsX*
> 
> Btw, has anyone tried the new TriXX version 5.0.0 for overclocking?
> 
> It is large, and it is missing a proper minimize button (at least I haven't found it).
> 
> It now monitors VRMs and couple of other things that I think weren't available on version 4.


Yes, I've been using it. The only thing I don't like is that you can't seem to type in a value you want; you have to use the sliders or the arrow keys to move the slider. Kinda kludgy, but it does look nice.

*Question:*

Once DX12 is fully utilized by developers, could an 8GB 390X be crossfired with a 4GB 290X and use all 8GB, or would it be 12GB?... or only 4GB, still limited by the 290X?


----------



## battleaxe

delete: double post


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Well... then I guess that means 1100 is about the average minimum. I stand corrected.
> Yes, I've been using it. The only thing I don't like is that you can't seem to type in a value you want. You have to use the sliders or use the arrow keys to move the slider. Kinda cludgy, but it does look nice.
> 
> *Question:*
> 
> Once DX12 is being utilized by the coders fully, could an 8GB 390x be Xfired with a 4Gb 290x and use all 8Gb or would it be 12Gb?... or only 4Gb still limited by the 290x?


It'll either be 4GB or 12GB, no in between. Of course, that's only if the devs code the game to use multiple frame buffers, and I suspect many won't.... Sadly, most of the games we want to play are just console ports that get rushed through development for release dates. Well, except for GTA V... props to them for making us wait!


----------



## CamsX

Quote:


> Originally Posted by *battleaxe*
> 
> Yes, I've been using it. The only thing I don't like is that you can't seem to type in a value you want. You have to use the sliders or use the arrow keys to move the slider. Kinda cludgy, but it does look nice.


Yeah, the interface does seem clunky. Why is it always at +13mV?

Finally reached 9k in my Firestrike score, after disabling tessellation in CCC.

http://www.3dmark.com/3dm/8195276

Still feel that my combined score is way too low. It's exactly the same as it was with my 7970.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Yeah, the interface does seem clunky. Why is it always @ +13mV?
> 
> Finally reached 9k in my firestrike scores, after disabling Tessellation in CCC.
> 
> http://www.3dmark.com/3dm/8195276
> 
> Still feel that my combined score is way too low. Exactly the same as it was with my 7970.


Firestrike gimps AMD CPUs to no end, my friend.....

Just focus on the graphics score!

If you want a good benchmark, use 3DMark 11.

Please post results, I'll show you mine if you show me yours


----------



## CamsX

When I turned tessellation off, 3DMark showed the result as invalid. Is this normal?


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> When I turned tesselation off 3dmarks showed as invalid. Is this normal?


Yep, FS uses tessellation, so turning it off is essentially cheating the results.

Are you going to OC the GPU? It helps a lot with the score...

Also, WE HAVE to find a way to get your CPU to 4.7+.. It's essential for owning an FX-8!

I can give you tons of help with that. PM me if you want.


----------



## cbarros82

Quote:


> Originally Posted by *sinholueiro*
> 
> What's the average clock without raising voltage? 1100?
> 
> Also, that PowerColor has to be the best binned chips for sure


My MSI 390 gets to 1175MHz with no extra voltage, but it took +75mV to do 1200MHz on the core.


----------



## CerealKillah

So I went ahead and updated to Windows 10 last night, and 3DMark is still reporting my video card as a 290X, and now 3DMark doesn't like my driver.

Why did I update???


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, FS uses tess, so turning it off is essentially cheating the results.
> 
> Are you going to OC the GPU? it helps a lot with the score...
> 
> Also, WE HAVE to find a way to get your CPU to 4.7+.. It's essential for owning an FX-8!
> 
> I can give you tons of help with that. PM if you want.


Ran a couple of tests with the GPU on MSI stock OC settings, 1060/1525. Set tessellation back to AMD defaults.

Cpu @4.4Ghz
http://www.3dmark.com/3dm/8197538
graphics 12586
overall 9023

CPU @4.76Ghz
http://www.3dmark.com/3dm/8197702
graphics 12601
overall 9429

Kinda brute-forced the CPU via the multiplier, all voltages on Auto, so I don't think it is stable at all. I don't plan to stress test it, as that always ends in random instant reboots. I'll wait for it to crash in games and dial it down.

The GPU is behaving extremely well in the cooling department. Max fan 44% (not intended) during Valley and Firestrike yields a max of 70°C on the core and 72°C on the VRMs.

*Stupid TriXX isn't saving the fan or clock settings after a reboot, grrr.*


----------



## AverdanOriginal

Quote:


> Originally Posted by *CerealKillah*
> 
> So I went ahead and updated to Windows 10 last night and 3dMark still is reporting my video card is the 290X and now 3dMark doesnt like my driver.
> 
> Why did I update???


Here is a screenshot of the 3DMark version I am using.


V1.5.915 gave me the correct reading of my card. Before that, it was also showing as a 290 and not a 390, plus it was running a bit unstable.

Updating Windows won't make 3DMark realize you have a 390(X); you need to update to the latest version of 3DMark.

Here's my latest benchmark. Once it gets cooler here and I have more time, I will try to overclock my card; until then I am more eager to cool it.








http://www.3dmark.com/3dm/8198592


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Ran a couple of Tests with the GPU on MSI stock OC settings 1060/1525. Set tessellation back to AMD defaults.
> 
> Cpu @4.4Ghz
> http://www.3dmark.com/3dm/8197538
> graphics 12586
> overall 9023
> 
> CPU @4.76Ghz
> http://www.3dmark.com/3dm/8197702
> graphics 12601
> overall 9429
> 
> Kinda brute forced the CPU via multiplier, all voltages on Auto, so I don't think it is stable at all. Don't plan to stress test it, as this always ends in random instant reboots. I'll wait for it to crash in games and dial it down.
> 
> GPU is behaving extremely well on the cooling department. Max fan 44% (not intended) during valley and firestrike yields max 70°C core and 72°C vrms.
> 
> *Stupid TriXX isn't saving the fan or clock settings after a reboot, grrr.*


Here is my Heaven at 1200/1725











Clean run too, no artifacts at all. This card seems to want to run at 1200 now.... not sure if it just needed "breaking in" or what, but this thing is beasting. The core temp never breaking 73°C at +100mV/+50mV probably helps some also....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Here is my Heaven at 1200/1725
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Clean run too, no artifacts at all. This card is seeming to want to run at 1200 now.... not sure if it just needed "breaking in" or what, but this thing is beasting. The core temp never breaking 73C at 100mv+/50mv+ probably helps some also....


I see you are on Win 10 now. How are you liking it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I see you are on Win 10 now. How are you liking it?


Loving it so far! It's a nice middle ground between 7 and 8.1.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Here is my Heaven at 1200/1725
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Clean run too, no artifacts at all. This card is seeming to want to run at 1200 now.... not sure if it just needed "breaking in" or what, but this thing is beasting. The core temp never breaking 73C at 100mv+/50mv+ probably helps some also....


Here's my 290X. This was at 1203MHz on the core and 1575 on the RAM.

I have Samsung RAM, so I'm guessing the timings play a role?

I've got to get this thing under water. The VRMs were over 100°C on this run.


----------



## CerealKillah

Quote:


> Originally Posted by *AverdanOriginal*
> 
> here is a screenshot of the 3DMark I am using.
> 
> 
> V 1.5.915 gave me the correct reading of my card. before that it was also showing as 290 and not as 390. Plus it was running a bit unstable.
> 
> Updating windows won't make 3DMark realize you have a 390(X), you need to update to the latest version of 3DMark.
> 
> Here my latest benchmark. Once it gets cooler here and I have more time, I will try to overclock my card, until then I am more eager to cool it
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/8198592


Here is my 3DMark. Maybe the Steam version is messing up? It appears to be the same version, other than mine says Steam.



AgentSmith - What drivers are you currently using? I downloaded the newest from AMD, and 3DMark says they are invalid.


----------



## battleaxe

Quote:


> Originally Posted by *CerealKillah*
> 
> Here is my 3dMark. Maybe it is the Steam version messing up? Appears to be the same version other than mine says Steam.
> 
> 
> 
> AgentSmith - What drivers are you currently using? I downloaded the newest from AMD and 3DMark says they are invalid.


I've also been having fits with the Steam version. It won't run at all on my PC. I'm going to try this; hopefully it works for me too.

Edit: no dice. Still won't load.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> Here is my 3dMark. Maybe it is the Steam version messing up? Appears to be the same version other than mine says Steam.
> 
> 
> 
> AgentSmith - What drivers are you currently using? I downloaded the newest from AMD and 3DMark says they are invalid.


I'm still on 15.7, but I'm going to try 15.7.1 later.


----------



## sinholueiro

Due to the new capacitors on the XFX version, I can't buy a version with a full-cover waterblock in my country. If I want to watercool the core, I have to leave the VRM cooling to passive heatsinks. Will some heatsinks be enough to cool the VRMs?


----------



## Agent Smith1984

Thought I might share a few pics, please excuse the dust! LOL


----------



## Agent Smith1984

Okay, tweaked a little more, here's what I got:


That's as good as it's gonna get for this thing...
Not bad though, because a stock Fury X gets around 1625 @ 64.6 FPS


----------



## CerealKillah

Quote:


> Originally Posted by *battleaxe*
> 
> I've also been having fits with the Steam version. Won't run at all on my PC. I'm going to try this. Hopefully it works for me too.
> 
> Edit: no dice. Still won't load.


Updated to the regular version of 3DMark, and my card is STILL showing up as a 290X.

The score seems a little lower than my W7 scores.

http://www.3dmark.com/3dm/8205042

I uninstalled and reinstalled Afterburner, and the score is looking MUCH better. Still at stock voltage until I can get my new PSU installed.

http://www.3dmark.com/3dm/8205369


----------



## CerealKillah

Quote:


> Originally Posted by *sinholueiro*
> 
> Due to new capacitors in the XFX version, I can't buy a version with a full waterblock in my country. If I want to watercool the core, I have to leave the VRM cooling with passive heatsinks. Will it be enought some heatsinks to cool the VRMs?


Did you take the stock cooler off and verify that you actually have the different inductor?

I have my XFX card under water


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, tweaked a little more, here's what I got:
> 
> 
> That's as good as it's gonna get for this thing...
> Not bad though, because a stock Fury X gets around 1625 @ 64.6 FPS


Yeah, my previous run is all I can get too. I need to get the heat away from the VRMs in order to go higher on the core. The core is plenty happy, but the thing just gets too hot, so I can't push any further.
Quote:


> Originally Posted by *CerealKillah*
> 
> Updated to the regular version of 3dMark and my card is STILL showing up as a 290X.
> 
> Score seems a little lower than my W7 scores.
> 
> http://www.3dmark.com/3dm/8205042
> 
> I uninstalled and reinstalled Afterburner and score is looking MUCH better. Still at stock voltage until I can get my in PSU installed.
> 
> http://www.3dmark.com/3dm/8205369


I'm gonna try to uninstall and reinstall. Mine still won't even load; I click on the 3DMark button and the app crashes immediately. So weird.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> Updated to the regular version of 3dMark and my card is STILL showing up as a 290X.
> 
> Score seems a little lower than my W7 scores.
> 
> http://www.3dmark.com/3dm/8205042
> 
> I uninstalled and reinstalled Afterburner and score is looking MUCH better. Still at stock voltage until I can get my in PSU installed.
> 
> http://www.3dmark.com/3dm/8205369


Looking good for stock voltage.

Can't wait to see what it does under water with some more juice.

What block are you running? I'll update the roster with it...


----------



## Dorland203

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Here is my Heaven at 1200/1725
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Clean run too, no artifacts at all. This card is seeming to want to run at 1200 now.... not sure if it just needed "breaking in" or what, but this thing is beasting. The core temp never breaking 73C at 100mv+/50mv+ probably helps some also....


Wow. Can you get it to run Crysis 3 at max settings for 1 hour at that clock/mem?


----------



## sinholueiro

Quote:


> Originally Posted by *CerealKillah*
> 
> Did you take the stock cooler off and verify that you actually have the different inductor?
> 
> I have my XFX card under water


I didn't buy anything yet. I am still thinking about what components to get.

Has anyone tried passively cooling the VRMs?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dorland203*
> 
> Wow.Can you get it to run Crysis 3 max settings for 1 hour with that clock/mem ?


Well, I had to step out for a minute, but I pulled off 30 minutes straight with no problems... I only caught the tail end of the AB readout when I got back. I'll grab another later.
Keep in mind this is at 4K very high settings, with no AA, and it's averaging around 55 FPS. That is amazing performance @ $330.... Core hits 76C, the VRM does around 75C


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looking good for stock voltage.
> 
> Can't wait to see what it does under water with some more juice.
> 
> What block you running? I'll update the roster with it...


I am using the EK Waterblock EK-FC-R9-290X-V2-CA.


----------



## ManofGod1000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, had to step out for a minute, but I pulled off 30 minutes straight with no problems... just did catch the back end of the AB readout when I got back. I'll get another later.
> Keep in mind this is at 4K very high settings, with no AA, and it's averaging around 55 FPS. That is amazing performance @ $330.... Core hits 76C, the VRM does around 75C


I don't understand: is there a benchmark program in Crysis 3? Or did you just start the game and log a stretch of play?


----------



## gerpogi

Hi guys! I'm planning to buy a 390 and I'm picking between two: the Asus DirectCU II ($280) or the Gigabyte ($325). Which one would be better? I was gonna go for the Asus, but I read bad stuff about it, so I don't know...


----------



## semiroundboss

Quote:


> Originally Posted by *gerpogi*
> 
> Hi guys! I'm , planning to buy 390 and im picking between the two : asus direct cu ii($280) or a gigabyte($325). Which one would be better? I was gonna go for the asus but I read bad stuff about it so I dont know...


Out of those two, Gigabyte for sure. But I recommend spending a little more for a Sapphire Nitro. I had to send back my Asus Strix R9 390X since it was so awful. Now I have this wonderful Sapphire Tri-X R9 390X overclocked (It's higher than every other card on here with stock voltages, but I'm still testing it for stability) and it stays very cool. Someone correct me if I am wrong, but the Nitro has the exact same cooler as the Tri-X 390X, just different colors and it is a 390 instead of a 390X.

Edit: I'll send proof that I have it and the overclock along with benchmarks later tonight.


----------



## gerpogi

Quote:


> Originally Posted by *semiroundboss*
> 
> Out of those two, Gigabyte for sure. But I recommend spending a little more for a Sapphire Nitro. I had to send back my Asus Strix R9 390X since it was so awful. Now I have this wonderful Sapphire Tri-X R9 390X overclocked (It's higher than every other card on here with stock voltages, but I'm still testing it for stability) and it stays very cool. Someone correct me if I am wrong, but the Nitro has the exact same cooler as the Tri-X 390X, just different colors and it is a 390 instead of a 390X.
> 
> Edit: I'll send proof that I have it and the overclock along with benchmarks later tonight.


Thanks for that suggestion. I was originally going for that, but it won't fit my case. Plus I need a card with a backplate, since part of my AIO cooler will be sitting on top of my GPU and I don't want static problems or whatnot. I'm definitely leaning towards the Gigabyte model; I was just attracted by the Asus' price, since I can get it for $280.


----------



## semiroundboss

Quote:


> Originally Posted by *gerpogi*
> 
> thanks for that suggestion , i was originally going for that but it wont fit my case. plus i need a card with a back plate since part of my AIO cooler will be sitting on top of my GPU, i dont want static problems or what not. im definitely leaning towards the gigabyte model , i was just attracted by the asus' price, which i can buy at $280.


Ah. Well in that case I recommend the XFX for the backplate and the same dual-slot size that Gigabyte and Asus have. That's what I'm going to pick up as my second card later this year, since I can't fit a second Tri-X on my motherboard.


----------



## MalsBrownCoat

FYI for anyone who is considering water cooling the ASUS STRIX R9 390X DC3OC with an EK block, their configurator says that the EK-FC R9-290X DCII is fully compatible (and is the only one they offer for this card).

However, it does NOT fit. The capacitors are too tall and the block doesn't even reach the die. The board may be the same as the 290X version, but the components are not.

Everything marked in red is too tall.







There is no way that anyone at EK even _remotely_ looked/tested that block with this card. I'll be working with them to get a resolution and will update accordingly. In the meantime, don't do it!


----------



## gerpogi

Quote:


> Originally Posted by *semiroundboss*
> 
> Ah. Well in that case I recommend the XFX for the backplate and the same dual slot size that Gigabyte and Asus have. That's what I am going to pick up as my second card later this year, since I can't fit a second Tri-X in my motherboard.


Very cool. Is the XFX model better than the Gigabyte in terms of cooling? I actually thought about that too, since XFX still has that lifetime warranty thing if I were to buy it at Best Buy.


----------



## Ju_nin_mai

OK, on Windows 10 Pro.

MSI R9 390. I'm running:

1180 clock
1600 memory
+50mV


----------



## CamsX

Quote:


> Originally Posted by *gerpogi*
> 
> thanks for that suggestion , i was originally going for that but it wont fit my case. plus i need a card with a back plate since part of my AIO cooler will be sitting on top of my GPU, i dont want static problems or what not. im definitely leaning towards the gigabyte model , i was just attracted by the asus' price, which i can buy at $280.


Yeah, if the length of the card and the lack of a backplate are an issue, then the Nitro is not your option, tho I must say it is performing really well for me right now.

IMHO, if you can, get an MSI. It's arguably the best performer around here. A new LE version is out with stock clocks from the factory, if you don't mind that. Regular length, good cooler, and it has a backplate. Red tho, haha.

2nd option: the XFX, also popular and fairly decent overall.

3rd option: Gigabyte, and this is only if you don't plan to overclock it, because it comes with locked voltages from the factory.

I would avoid the Asus DCU2, unless you plan to watercool it via full waterblock (check page 1), or you plan to use it totally stock and never mess with it until you buy a new graphics card in the future. I believe they are no longer making this model and switched to a 3-fan cooler on the Strix version. *Edit: forget the waterblock option, see above*

Hope to see you join the club sometime soon.


----------



## gerpogi

Quote:


> Originally Posted by *CamsX*
> 
> Yeah, if the lenght of the card and the lack of backplate is an issue, then the Nitro is not your option, tho it is performing really well for me right now I must say.
> 
> IMHO, if you can, get an MSI. Its arguably the best performer around here. A new LE version is out, with stock clocks from factory, if you don't mind. Regular length, good cooler and has a back plate. Red tho. haha.
> 
> 2nd option the XFX, also popular and fairly descent overall
> 
> 3rd option Gigabyte, and this is only if you don't plan to overclock it, because it comes with locked voltages from factory.
> 
> I would avoid the Asus DCU2, unless you plan to watercool it via full waterblock (check page 1), or you plan to use it totally stock and never mess with it until you buy a new Graphics Card in the future. I believe they are no longer making this model and switched to a 3 fan cooler on the Strix version. *Edit, forget the waterblock option, see above*
> 
> Hope to see you join the club sometime soon.


Thanks for that suggestion! My only reason for the Gigabyte really is the looks; the blue LED would look very nice in my case. But if the XFX cools better than the Gigabyte by a couple of degrees, then I might lean towards that more.

heres my rig with my old 970











I had to replace that 970 because it was a VERY tight fit; it was pretty much pressing on my front rad and bending the backplate a bit, and I don't want that, so meh.


----------



## CamsX

Quote:


> Originally Posted by *gerpogi*
> 
> thanks for that suggestion ! my only reason for the gigabyte really is for the looks ( blue led) would look very nice in my case. but if the xfx will cool better than gigabyte by a couple of degrees then i might lean towards that more.
> 
> i had to replace that 970 because it is a VERY tight fit, it is pretty much pressing on my front rad bending the backplate by abit, and i dont want that so meh.


It's still your decision, but if the looks are important (and you are sure that the Windforce fits properly), then I would go for that option. Again, the only real drawback is the overclocking capability.

Your case seems pretty cramped (looks really good btw!), so I wouldn't dare overclock any 390(X) in there. You are aware that the 390 runs hotter compared to the 970, right?


----------



## Agent Smith1984

Quote:


> Originally Posted by *ManofGod1000*
> 
> +
> 
> I do not understand, is there a benchmark program in Crysis 3? Did you just start the program and let it run at a single moment in the game?


That was 30 minutes of 16-player Skyline.


----------



## gerpogi

Quote:


> Originally Posted by *CamsX*
> 
> It still your decision, but if the looks is important (and you are sure that the Windforce fits properly), then I would go for that option. Again, the only real drawback is the overclocking capability.
> 
> Your case seems pretty cramped (looks really good btw!), so I wouldn't dare overclocking any 390(x) in there. You are aware that the 390 runs hotter compared to the 970, right?


Yeah, I'm sure it fits. I'm trying out a 390X atm and I'm kinda not comfortable with its temps... that's why I'm gonna go for a 390 this time and see if I can keep the temps below 80.

Also, I'm aware that it runs hotter, but supposedly not by much, right?


----------



## Ju_nin_mai

Guys, I went to a 1750 memory clock on my MSI 390 and the screen went black... and by mistake I had freaking "apply at startup" on.

So I had to go into safe mode and uninstall Afterburner to get the computer to work.

Did I damage my card? Is that possible?


----------



## semiroundboss

Ha. That's happened to me three times tonight. No, you're fine. Just boot up in safe mode and uninstall Afterburner. Then boot into normal Windows and reinstall. That's what I did the one time I left apply-at-startup on. Just turn your voltage up a little bit to get it stable.


----------



## Ju_nin_mai

Quote:


> Originally Posted by *semiroundboss*
> 
> Ha. That's happened to me three times tonight. No. You're fine. Just boot up in safe mode and uninstall Afterburner. Then just boot into normal Windows and reinstall. That's what I did the one time I left the start up on. Just turn your voltage up a little bit to get it stable.


I'm scared. I heard something pop, but people were setting off firecrackers outside, so I dunno if I'm just paranoid. How can I make sure it's OK? I ran a game and it seems fine. Should I just run Furmark?


----------



## semiroundboss

Furmark is pointless. Your card is fine. Haha. I doubt you did anything harmful. Your display driver will crash before it fries itself. But try running Unigine Heaven at stock speeds to make sure everything is fine, I guess.


----------



## Ju_nin_mai

Quote:


> Originally Posted by *semiroundboss*
> 
> Furmark is pointless. Your card is fine. Haha. I doubt you did anything harmful. Your display driver will crash before it fries itself. But try running Unigine Heavan at stock speeds to make sure everything is fine, I guess.


ok thanks buddy. will do.


----------



## Zack Foo

Quote:


> Originally Posted by *semiroundboss*
> 
> Out of those two, Gigabyte for sure. But I recommend spending a little more for a Sapphire Nitro. I had to send back my Asus Strix R9 390X since it was so awful. Now I have this wonderful Sapphire Tri-X R9 390X overclocked (It's higher than every other card on here with stock voltages, but I'm still testing it for stability) and it stays very cool. Someone correct me if I am wrong, but the Nitro has the exact same cooler as the Tri-X 390X, just different colors and it is a 390 instead of a 390X.
> 
> Edit: I'll send proof that I have it and the overclock along with benchmarks later tonight.


My Asus Strix 390 is working fine overclocked to 1150/1600 tho. The Sapphire Nitro runs super cool, but without a backplate.


----------



## semiroundboss

Well see, my Tri-X overclocked ridiculously high runs at around the same temps as your Strix at the clocks you have. And the Sapphire cards don't need backplates because the cooler is attached to the PCB more securely than on your average card. A backplate would have looked nicer, but I don't have to worry about sag either way.


----------



## CamsX

Quote:


> Originally Posted by *gerpogi*
> 
> Yeah im sure it fits, im trying out a 390x atm and im kinda not comfortable with its temps.. that's why im gonna go for a 390 this time and see if I can keep the temps below 80.
> 
> Also I am aware that it runs hotter but supposedly not by much though right?


Have you set a custom fan profile on the 390X? What happens if you manually set the fans to 100%? What's the idle temperature with the fans ON?

All these cards have a factory fan-off mode for quiet operation up to 60°C. I've disabled this on my Nitro.

As I said, your case doesn't seem to have much room, and blowing the hot air from your CPU rad onto the video card is not helping at all. The card won't throttle down until it reaches 90+°C, so there is nothing wrong if it's running between 75-85°C normally.

Some Newegg reviews of the Windforce 390 state that it runs on the hot side.
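For anyone setting one up: a custom fan profile like the ones discussed here is just linear interpolation between (temperature, duty) points. Here's a minimal sketch in Python; the curve points are made up for illustration, not anyone's actual profile:

```python
def fan_duty(temp_c, curve):
    """Linearly interpolate fan duty (%) from (temp, duty) points sorted
    by temperature. Clamps below the first point and above the last."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    # Find the segment this temperature falls in and interpolate.
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# Hypothetical curve: quiet below 40C, ramping to 100% at 80C.
CURVE = [(40, 20), (60, 45), (75, 80), (80, 100)]
```

So `fan_duty(50, CURVE)` lands halfway between the 40C and 60C points. The same idea works whether Afterburner or Trixx is applying the curve; only the point values differ.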


----------



## Ju_nin_mai

Quote:


> Originally Posted by *semiroundboss*
> 
> Furmark is pointless. Your card is fine. Haha. I doubt you did anything harmful. Your display driver will crash before it fries itself. But try running Unigine Heavan at stock speeds to make sure everything is fine, I guess.


Ran it at stock settings and ran the benchmark.

Max frames was like 160,
min was 8.

Max temp was 82C in a room with a 25C ambient.

Tested it on Ultra Extreme; the average was 71 fps.

The test results said my 390 has 4GB of RAM, not 8GB. Is that just an inaccurate reading by Unigine?


----------



## semiroundboss

Quote:


> Originally Posted by *Ju_nin_mai*
> 
> ran it at stock settings and ran the benchmark
> 
> max frames was like 160
> min was 8
> 
> max temp was 82C in a room thats ambient 25C
> 
> tested it on ultra extreme. average was 71 fps.
> 
> The test results said my 390 has 4gb of ram not 8gb. is that just an inaccurate reading by unigine????


It's just inaccurate. Unigine says I have less than 4GB. (4095MB actually). And wow. You're faster than I am. Maybe I should stop worrying about memory and focus on the core.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, tweaked a little more, here's what I got:
> 
> 
> That's as good as it's gonna get for this thing...
> Not bad though, *because a stock Fury X gets around 1625 @ 64.6 FPS*


Interesting, this is my score 290x at 1230/1620




----------



## gerpogi

Quote:


> Originally Posted by *CamsX*
> 
> Have you set a custom fan profile with the 390x? What happens if you manually set fans to 100%? Whats the idle temperature with the fans ON?
> 
> All these cards have a factory fan off mode for quiet operation up to 60°C. I've disabled this on my Nitro.
> 
> As I said, your case doesn't seem to have much room, and blowing the hot air from your cpu rad on to the video card is not helping at all. Card won't throttle down until it reaches 90+°C, so there is nothing wrong if its running between 75-85 normally.
> 
> Some newegg reviews of the windforce 390 state that it runs on the hot side.


Idle temps usually hit 40-50C; heck, sometimes the fans stop turning if I'm just surfing the net. Yes, it does push air to the side, but there are vents at the back that help exhaust air as well. It goes 85C max when I play FFXIV, but on a hot day (like yesterday, 102F outside, holy crap) it went up to 88.

I always use a custom profile because I don't think my card can handle it without one; I usually set it to hit 100% at around 80C. Regarding my CPU, I think I got a bit lucky with it: I'm only running 4.2 GHz at 1.080 volts. It rarely hits 55C, so I think I'm good there, though it does bring in warmer air. I also tried a reference GTX 970 before this R9 390X and it had lower temps than the 390X, hence why I'm going to replace it with a 390 and hope that brings the temps down a bit.


----------



## Ju_nin_mai

Quote:


> Originally Posted by *semiroundboss*
> 
> It's just inaccurate. Unigine says I have less than 4GB. (4095MB actually). And wow. You're faster than I am. Maybe I should stop worrying about memory and focus on the core.


Oh OK, good.

Yeah, everything seems good. Just ran it with my old OC settings and the temp went down a bit, lol. I have a house vent right in front of my desktop; I take an empty Kleenex box, flip it upside down, and lay it on top of the vent.

Then I cut a hole in the side that points at my intake fan, so I basically have a cold-air intake on my computer, lol. With the overclock I got lower temps 'cuz I turned on the AC; max temp was 79C.

+56 mV
1160 core
1535 memory

My memory sucks. Anything over 1540 and it's artifacts, dunno why. The only game I tested on was Shadow of Mordor, though.


----------



## Ju_nin_mai

thats my oc


----------



## diggiddi

Quote:


> Originally Posted by *Ju_nin_mai*
> 
> 
> 
> thats my oc


Run your test at 1920 x 1080 8xAA


----------



## semiroundboss

Quote:


> Originally Posted by *Ju_nin_mai*
> 
> oh ok good.
> 
> Yeah everything seems good. just ran it with my old oc settings. temp went down a bit lol. I have house vent right infront of my desktop and i use a Kleenex box thats empty and I flip it upside down and lay it on top of the vent.
> 
> then i cut out a hold on the side that points to my intake fan. So I have COLD AIR intake basically on my computer. lol. with the over clock I got lower temps cuz i turned on the AC. max temp was 79C.
> 
> +56 mV
> 1160 core
> 1535 memory
> 
> my memory sucks. anything over 1540 and its artifacts. dunno why. but only game i tested on was shadow of mordor.


I'm not entirely sure it is stable, but my 390X is at 1800 MHz memory. It's not artifacting in 3DMark Firestrike at least. I'm trying to get my core to 1200 or 1250 MHz, but 3DMark is trying to call me a cheater for some reason. Once I get everything stable, I'm gonna play Shadow of Mordor and see how that is. I bet my magic memory can go further.


----------



## Ju_nin_mai

Quote:


> Originally Posted by *semiroundboss*
> 
> I'm not entirely sure it is stable, but my 390X is at 1800 MHz memory. It's not artifacting in 3DMark Firestrike at least. I'm trying to get my core to 1200 or 1250 MHz, but 3DMark is trying to call me a cheater for some reason. Once I get everything stable, I'm gonna play Shadow of Mordor and see how that is. I bet my magic memory can go further.


omg your memory is magic.


----------



## Ju_nin_mai

Quote:


> Originally Posted by *diggiddi*
> 
> Run your test at 1920 x 1080 8xAA


oh geez i missed that


----------



## Ju_nin_mai

Here it is at 1920x1080 with 8xAA.


----------



## diggiddi

Try fullscreen instead of windowed mode, plus it says 1063 instead of 1080???


----------



## gerpogi

2 more questions: how much better is the XFX cooler than the Gigabyte's? And does Sapphire make a separate backplate like EVGA does?


----------



## semiroundboss

Quote:


> Originally Posted by *gerpogi*
> 
> 2 more question. how Much better is the xfx cooler than the gigabyte? and does sapphire make a separate backplate like evga does?


Better enough to warrant buying it over the Gigabyte. XFX is voltage unlocked, so if you ever want to overclock a bit in the future, you can. It comes with XFX's famous lifetime warranty as well. Sapphire doesn't make a backplate; the card doesn't need one since the cooler is attached to it better than on most cards, so you won't see sagging over time.


----------



## gerpogi

Quote:


> Originally Posted by *semiroundboss*
> 
> Better to warrant you buying it over the Gigabyte. XFX is voltage unlocked so if you ever want to overclock a bit in the future, you can. It comes with XFX's famous lifetime warranty, as well. Sapphire doesn't make a backplate. The card doesn't need one since the cooler is attached better to it than most cards, so you won't see sagging over time.


In terms of temps, does XFX do a better job?


----------



## tbob22

Quote:


> Originally Posted by *gerpogi*
> 
> in terms of temps, does xfx do a better job ?


Why not the PCS+? It's about 1 inch shorter than the Nitro, has a backplate, and will have similar temps.

Temps are great; it maxes at 67C in Furmark. Mine overclocks alright, about 1150/1600 without artifacts at +100mv. I can push it further, but artifacts appear occasionally. I haven't gone past +100mv.

The lifetime warranty could be a draw for the XFX, though, but expect 80C+ at load, and I'm not sure about the VRM cooling. I know the PCS+ and Nitro have good VRM cooling and should stay well under 80C even with Furmark.


----------



## dislikeyou

I bought a Sapphire 390 yesterday. Happy with it so far; it runs a bit cooler than my old MSI GTX 970.



I have been reading somewhere that the 8 GB is just a marketing trick and that the chip really can't handle that much memory. Not sure what is true.


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> Interesting, this is my score 290x at 1230/1620
> 
> 
> Spoiler: Warning: Spoiler!


Yeah, core speed is king on these cards. I'm going to test voltages over +100mv today and see if I can get into the same territory.

I'm thrilled to even get the 1200 at this point. Not many get that stable on air, and I played Crysis 3 at 4K for over an hour with no artifacts or crashing at all. I also noticed the memory clock is much more helpful at that res than it is at 1080p.

It seems like 1080p just wants core clock for any sort of major performance boost, but once you get to 4K, the memory and core clocks both begin to help out a great deal.

As of now, GPU-Z reports my memory bandwidth at 448 GB/s. Prior to HBM, that would have been an unbelievably fast frame buffer.

I really, REALLY love this 390. I hope my next one does as well as this one.
Two of these cards with this kind of overclock (or even 1150/1700 if I gotta settle for less) is really going to slay some 4K.
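For anyone wondering where a number like 448 GB/s comes from: GDDR5 moves four bits per pin per memory clock, so theoretical bandwidth is just memory clock x 4 x bus width in bytes. This isn't how GPU-Z computes it internally, just the standard arithmetic:

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Theoretical GDDR5 bandwidth in GB/s: clock (Hz) x 4 transfers
    per clock x bus width in bytes, scaled down to GB/s."""
    return mem_clock_mhz * 1e6 * 4 * (bus_width_bits / 8) / 1e9

# R9 390 on its 512-bit bus, at the 1750 MHz overclock vs. the 1500 MHz stock clock:
print(gddr5_bandwidth_gbs(1750, 512))  # 448.0
print(gddr5_bandwidth_gbs(1500, 512))  # 384.0
```

Stock 1500 MHz gives the 384 GB/s you see in the spec sheets, so the 1750 overclock is a 64 GB/s bump.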


----------



## Dorland203

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, Core speed is king on these cards. I'm going to test voltages over 100mv+ today and see if i can get into the same territory.
> 
> I'm thrilled to even get the 1200 at this point. Not many get that stable on air, and I played Crysis 3 at 4K for over an hour with no artifacts or crashing at all. Also, I noticed the memory clock is much more helpful at that res than it is at 1080P.
> 
> Seems like 1080P just wants core clock for any sort of major performance boost, but once you get to 4k, the memory and core clocks both begin to help out a great deal.
> 
> As of now, my memory bandwidth reports in GPU-Z at 448GB. Prior to HBM, that would of been an unbelievably fast frame buffer.
> 
> I really REALLY, love this 390. I hope my next one does as well as this one.
> Two of these cards with this kind of overclock (or even 1150/1700 if I gotta settle for less) is really going to slay some 4K.


Glad to see you got that clock stable with +100 mV. I guess you could get 1250/1800 with 150 mV and even approach 1300/1900 with 200 mV. IMO, Heaven, Valley, and 3DMark are not good tests to check whether an overclock is stable or not: I can overclock and run one of those tests overnight without a problem, but crash as soon as I play Crysis 3. Another good test is Metro Last Light with very high quality, 16x AF, SSAA on, and tessellation on very high.


----------



## Agent Smith1984

I doubt anything near 1250 is approachable on this card with air cooling, but I will see how it goes.

So, what happens when you do get a nice round 1200/1750 clock stable on a lil' ol' 390?

Well let's find out...

*TEST SETUP:*

FX-8300 @ 4.8GHz
Trident X 4x4GB @ 2133MHz (9-11-10-30-40 1T)
Samsung 850 EVO 250GB (RAPID MODE ENABLED)
MSI Gaming R9 390 @ 1200/1750
LG 4k TV

*Battlefield 4* - Operation Locker - 64 Player map (FULL) 2 Minute FRAPS benchmark

*1080P Ultra Preset - MANTLE:*

Min 69
Max 153
Avg *122.317*

*4k @ Ultra preset minus AA - MANTLE:*

Min 24
Max 76
Avg *51.575*

*Crysis 3* - Skyline- 16 players CELL (15/16) 2 Minute FRAPS benchmark

*1080P Very High Textures - Default Very High System settings -*

Min 63
Max 145
Avg *102.983*

*4k (True 4096x2160) Very High Textures - Custom system spec (HIGH - AA)*

Min 44
Max 71
Avg *58.175*

MORE TESTING TO COME....

Wife has some homework to do...

I am going to give Far Cry 4 and DiRT Rally a go later
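Min/max/avg numbers like the ones above fall out of a frame-timestamp log. Here's a rough sketch of the arithmetic; the per-second bucketing is an approximation of how FRAPS-style summaries behave, not its exact method, and the log format here is assumed:

```python
from collections import Counter

def fps_stats(timestamps_ms):
    """Min/max/avg FPS from a sorted list of per-frame timestamps in ms.

    Average is frames rendered over elapsed time; min/max are taken over
    per-second frame counts, roughly how benchmark summaries report them.
    """
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg = (len(timestamps_ms) - 1) / elapsed_s
    # Bucket each frame into the whole second it landed in.
    buckets = Counter(int((t - timestamps_ms[0]) // 1000) for t in timestamps_ms[1:])
    per_second = [buckets.get(i, 0) for i in range(int(elapsed_s))]
    return min(per_second), max(per_second), avg

# A steady 60 FPS run for 10 seconds:
frames = [i * (1000 / 60) for i in range(601)]
mn, mx, avg = fps_stats(frames)
```

The min is the interesting one for playability: a high average with a low min (like the 51.5 avg / 24 min BF4 run above) means stutter you'll feel even though the average looks fine.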


----------



## flopper

Quote:


> Originally Posted by *dislikeyou*
> 
> I bought a Sapphire 390 yesterday, happy with it so far, runs a bit cooler then my old MSI GTX 970.
> 
> I have been reading somehwere that the 8 GB is just marketing trick and that the chip relly can't handle so much memory, not sure what is true.


It's useful when you go over 4GB of textures, which is rarely the case in games right now.
Today, 4GB is the sweet spot for gaming.
You're likely to run out of core-speed FPS before you're RAM limited.


----------



## gerpogi

Quote:


> Originally Posted by *tbob22*
> 
> Why not the PCS+? It's about 1 inch shorter than the Nitro, has a backplate and will have similar temps.
> 
> Temps are great, maxes at 67c in Furmark. Mine overclocks alright, about 1150/1600 without artifacts at +100mv. I can push it further but artifacts appear occasionally.. I haven't went past +100mv.
> 
> The lifetime warranty could be a draw for the XFX though, but expect 80c+ at load and I'm not sure about the VRM cooling. I know the PCS+ and Nitro have good VRM cooling, and should stay well under 80c even with Furmark.


If you saw my previous post, I used to have a Gigabyte GTX 970 and it was a very tight fit in my case, pretty much pressing on my rad, so I doubt the PCS+ will fit since it's about the same length. If it is at least 10mm shorter than the Gigabyte GTX 970, then I might consider the PCS+.


----------



## Liranan

Quote:


> Originally Posted by *gerpogi*
> 
> If you saw my previous post , I used to have a gigabyte gtx 970 and it was a very tight fit in my case, pretty much pressing on my rad so I doubt the pcs will fit since its about the same length. If it is atleast 10mm shorter than the gigabyte gtx 970, then I might consider the pcs


The Nano is for you then.


----------



## gerpogi

Quote:


> Originally Posted by *Liranan*
> 
> The Nano is for you then.


That isn't out yet though. I don't know much about the Nano, but from what I heard it's based on the Fury? If it is, then I don't think it'll fit my budget...


----------



## semiroundboss

Quote:


> Originally Posted by *gerpogi*
> 
> That isn't out yet though. I don't know much about the nano but from what what I heard its based from the fury model? If it is , then I don't think itll fit my budget..


The Nano is an undervolted and underclocked Fury X on a single-fan cooler for $500. Personally, I see two overclocked R9 390's as being the better option for the money, unless you were to buy a second Nano when the 400 or 500 series comes out.


----------



## semiroundboss

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I doubt anything near 1250 is approachable on this card with air cooling, but I will see how it goes.
> 
> So, what happens when you do get a nice round 1200/1750 clock stable on a lil' ol' 390?


Is that the highest you've gotten the 390 to go? I'm at 1180 MHz core and 1850 MHz memory, but still testing for stability. Not sure I'll find it even with the Tri-X fans at 100%. Do you use the power limit in Afterburner, or do you just leave it at +0? 'Cause I'm having mixed results with it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *semiroundboss*
> 
> Is that the highest you've gotten the 390 to go? I'm on 1180 MHz Core and 1850 MHz memory, but still testing for stability. Not sure I'll find it with Tri-X fans at 100%. Do you use the Power Limit on Afterburner or do you just leave it at +0?


The memory will go a tad higher but the performance diminishes after 1750.

I always use 50% power limiter. Otherwise the core will power throttle with additional voltage.

I will be testing core clock speeds over 1200 with 150mv-200mv+ this evening.


----------



## semiroundboss

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The memory will go a tad higher but the performance diminishes after 1750.
> 
> I always use 50% power limiter. Otherwise the core will power throttle with additional voltage.
> 
> I will be testing core clock speeds over 1200 with 150mv-200mv+ this evening.


I'm gonna be running my 390X for around 4 to 4.5 years. What do you think is a safe long-term voltage? I play at 1080p/144Hz. Two Grenadas will make me future-proof. Should I focus on getting a higher core or a higher memory clock for them?


----------



## Agent Smith1984

Quote:


> Originally Posted by *semiroundboss*
> 
> I'm gonna be running my 390X for around 4 to 4 and a half years. What do you think a safe voltage long term is? I play 1080p 144hz. Two Grenadas will make me future proof. Should I focus on getting a higher core or a higher memory clock for them?


For 1080p, you're going to want to push the core as high as you can. Any amount of voltage should be pretty safe if your temps are good.


----------



## tbob22

Quote:


> Originally Posted by *gerpogi*
> 
> If you saw my previous post , I used to have a gigabyte gtx 970 and it was a very tight fit in my case, pretty much pressing on my rad so I doubt the pcs will fit since its about the same length. If it is atleast 10mm shorter than the gigabyte gtx 970, then I might consider the pcs


Measurements seem to be all over the place. The Gigabyte is said to be 312mm, but I think they are measuring it with the I/O mount overhang. The PCS+ is about 295mm if you measure end to end without the I/O mount overhang, which some of these specs seem to include.

From what I've gathered:
G1 Gaming 970: 312mm (~302 without overhang)
Nitro 390: 318mm (~308mm without overhang)
PCS+: 305mm (~295mm without overhang)

Assuming you have 300mm of space _(my case has about 305mm)_, the PCS+ should fit without issue.


----------



## Geoclock

Quote:


> Originally Posted by *semiroundboss*
> 
> Better to warrant you buying it over the Gigabyte. XFX is voltage unlocked so if you ever want to overclock a bit in the future, you can. It comes with XFX's famous lifetime warranty, as well. Sapphire doesn't make a backplate. The card doesn't need one since the cooler is attached better to it than most cards, so you won't see sagging over time.


-

You can say a last goodbye to XFX's lifetime warranty; it ended with the R9 290 cards. For the 390 series, the lifetime warranty is only offered through Best Buy on special Best Buy edition cards (like MSI does), and don't forget Best Buy charges a ton for everything, plus sales tax. Is it worth it? I'd say no. That said, Newegg has the cheapest prices right now: $290 with MIR on a 390.
Just stick with MSI, ASUS, PowerColor, or Sapphire.
At least MSI gives a 3-year warranty.


----------



## Agent Smith1984

Anyone benched BF4 or Crysis 3 on their cards yet? Would love to have some comparisons against the numbers I posted earlier. I just find it fascinating that with some driver improvements, some overclocking, and custom (yet very good-looking) settings, a single card can average almost 60fps in those titles @ 4K.

There are a lot of people on the roster now, and I'm sure most everyone is just enjoying their cards, but general feedback from users would be great.


----------



## semiroundboss

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anyone benched BF4 or Crysis 3 on their cards yet? Would love to have some comparisons against the numbers I posted earlier. I just find it fascinating that with some driver improvements, some overclocking, and custom (yet very good-looking) settings, a single card can average almost 60fps in those titles @ 4K.
> 
> There are a lot of people on the roster now, and I'm sure most everyone is just enjoying their cards, but general feedback from users would be great.


With school starting up for me tomorrow, I'm pretty busy. I plan to do some benchmarks on those games and more once that settles down. And hell, I'll even have Crossfire benchmarks for everyone in November/December when I pick up another 390X on Black Friday. The only 4K benchmark I can do is 3DMark Fire Strike Ultra. Is there a third-party resolution upscaler? Catalyst lets me go to 3200x1800 at most.


----------



## CamsX

Yeah, I don't have a 4K TV/monitor available either. The most I've done is a 1440p VSR upscale, but I'm too lazy to run benchmarks.

Did some on DiRT Rally, for consistency: minimum 56 FPS, average 69, everything maxed, no AA.

Other games played include Crysis 3, CS:GO, BF4 (VSR is broken in it, as usual), and Assetto Corsa.


----------



## battleaxe

So where are the VRMs located on the 390X? Anyone know?


----------



## gerpogi

Quote:


> Originally Posted by *tbob22*
> 
> Measurements seem to be all over the place. The Gigabyte is said to be 312mm, but I think they are measuring it with the I/O mount overhang. The PCS+ is about 295mm if you measure end to end without I/O mount overhang, which I expect some of these specs seem to include.
> 
> From what I've gathered:
> G1 Gaming 970: 312mm (~302 without overhang)
> Nitro 390: 318mm (~308mm without overhang)
> PCS+: 305mm (~295mm without overhang)
> 
> Assuming you have 300mm of space _(my case has about 305mm)_, the PCS+ should fit without issue.


I'm not quite sure, but Guru3D has a picture of them measuring the PCS+ 390 and it looked like a bit less than 290mm. I might be wrong, though; you guys can check it out.


----------



## tbob22

Quote:


> Originally Posted by *gerpogi*
> 
> I'm not quite sure, but Guru3D has a picture of them measuring the PCS+ 390 and it looked like a bit less than 290mm. I might be wrong, though; you guys can check it out.


Yeah, I was going by the specs on PowerColor's site and just subtracting 10mm; they probably overshoot a bit to be safe in case of manufacturing differences. I just measured mine and it takes up almost exactly 285mm between the I/O slots and the HDD cage.

Either way, if you were able to fit the Gigabyte 970 in there, the PCS+ should fit no problem.


----------



## gerpogi

Quote:


> Originally Posted by *tbob22*
> 
> Yeah, I was going by the specs on PowerColor's site and just subtracting 10mm; they probably overshoot a bit to be safe in case of manufacturing differences. I just measured mine and it takes up almost exactly 285mm between the I/O slots and the HDD cage.
> 
> Either way, if you were able to fit the Gigabyte 970 in there, the PCS+ should fit no problem.


Probably. I just wish I could test one without having to keep it permanently... The only place I know that sells a PCS+ 390 is Newegg, and they have a no-refund policy, so ehhh...


----------



## Cannon19932006

Here's my heaven run at my normal OC 1150/1750 on 390x


----------



## CerealKillah

New powersupply fixed my issues. HOORAY!

Not pushing too hard. Last benchmark:

http://www.3dmark.com/3dm/8225262?

Broke 11K score.


----------



## kizwan

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> FYI for anyone who is considering water cooling the ASUS STRIX R9 390X DC3OC with an EK block, their configurator says that the EK-FC R9-290X DCII is fully compatible (and is the only one they offer for this card).
> 
> However, it does N O T fit. The capacitors are too tall and the block doesn't even reach the die. The board may be the same as the 290X version, but the components are Not.
> 
> Everything marked in red is too tall.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There is no way that anyone at EK even _remotely_ looked/tested that block with this card. I'll be working with them to get a resolution and will update accordingly. In the meantime, don't do it!


That sucks! If you add shims, you'll also need longer screws.
Quote:


> Originally Posted by *Ju_nin_mai*
> 
> here it is with 1920x1080 8xAA


1920x1080, 8xAA, fullscreen, Preset: Custom, Quality: Ultra & Tessellation: Extreme. You should run these settings. Press F12 to take a screenshot.
Quote:


> Originally Posted by *battleaxe*
> 
> So where are the VRM's located on the 390x? Anyone know?


Pretty much in the same place. See the ASUS card above as an example.


----------



## jackalopeater

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anyone benched bf4 or crysis 3 on their cards yet? Would love to have some comparisons against the numbers i posted earlier. I just find it fascinating that with some driver improvements, some overclocking, and custom (yet very good looking) settings, that a single card can average almost 60fps in those titles @ 4k.
> 
> There are a lot of people on the roster now, and I'm sure most everyone is just enjoying their cards, but general feedback from users would be great.


I haven't gotten a chance to test out with Crysis 3, but I have run BF4 on a 64 man server with my stock clocked Asus 390x. I didn't benchmark it, but I did take a string of screenshots with the FPS captured if that helps.

BF4 High preset at UHD 4k


http://imgur.com/4fC7I


I'm getting my new 4K monitor tomorrow and later this week will be able to get some more results. Any suggested titles for single-390X 4K gaming testing? I'm shooting for a guide on what in-game settings get you a good single-390X 4K gaming experience.


----------



## Darkeylel

Quote:


> Originally Posted by *jackalopeater*
> 
> I haven't gotten a chance to test out with Crysis 3, but I have run BF4 on a 64 man server with my stock clocked Asus 390x. I didn't benchmark it, but I did take a string of screenshots with the FPS captured if that helps.
> 
> BF4 High preset at UHD 4k
> 
> 
> http://imgur.com/4fC7I
> 
> 
> I'm getting my new 4k monitor in tomorrow and later this week will be able to get some more results, any suggested lists of titles for single 390x 4k gaming testing? I'm shooting for a guide on what settings in games get you a good single 390x 4k gaming experience


Shadow of Mordor, please









Very surprised at the FPS you're getting at 4K; I'm only getting around 150 on average with everything on low apart from textures, which are on ultra.


----------



## AverdanOriginal

Hi guys.

After still having problems with high ambient temps here in summer (around 30-36°C throughout the day), I decided to go the other way and try to *undervolt my card*. I know most here are interested in overclocking, and so am I, but I was hitting 87°C during gaming with my crazy ambient temps at home, and I just wanted to find a way to get the same performance with less heat.

So here are my steps for undervolting my MSI R9 390:


The first grey line is the normal gaming mode of the MSI R9 390, as a baseline to see what I can get out of decreasing the voltage.
The blue lines are where I experienced a bluescreen due to too high a memory clock.

In the end I am undecided between the light green and green options.
Both gave me a *card 6°C cooler*, while the latter of course scored a tiny bit better in the Heaven benchmark. Both passed Heaven and 3DMark (no artifacts). But with both I experienced a crash in Kombustor's Furmark.
I haven't had time yet to test in extensive gaming sessions, but I am pretty sure Kombustor's Furmark is not the best test for undervolting (let alone overclocking).

Does anyone have experience with undervolting and can suggest a good stress test? It just seems strange that I can decrease my core voltage by -100mV and even increase my memory to 1720 without issues.
Sometimes I think Heaven is simply a benchmark, not a real stability test. Oh, and Watch Dogs on ultra settings ran smooth, but I only had 20 minutes to test.

Edit: I did not test all the steps with Kombustor's Furmark, only the ones where I wanted to test long-term stability. Also, VRAM temps of 66 and 62 are pretty much constant.


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi guys.
> 
> After still having problems with high ambient temps here in summer (around 30-36°C throughout the day), I decided to go the other way and try to *undervolt my card*. I know most here are interested in overclocking, and so am I, but I was hitting 87°C during gaming with my crazy ambient temps at home, and I just wanted to find a way to get the same performance with less heat.
> 
> So here are my steps for undervolting my MSI R9 390:
> 
> 
> The first grey line is the normal gaming mode of the MSI R9 390, as a baseline to see what I can get out of decreasing the voltage.
> The blue lines are where I experienced a bluescreen due to too high a memory clock.
> 
> In the end I am undecided between the light green and green options.
> Both gave me a *card 6°C cooler*, while the latter of course scored a tiny bit better in the Heaven benchmark. Both passed Heaven and 3DMark (no artifacts). But with both I experienced a crash in Kombustor's Furmark.
> I haven't had time yet to test in extensive gaming sessions, but I am pretty sure Kombustor's Furmark is not the best test for undervolting (let alone overclocking).
> 
> Does anyone have experience with undervolting and can suggest a good stress test? It just seems strange that I can decrease my core voltage by -100mV and even increase my memory to 1720 without issues.
> Sometimes I think Heaven is simply a benchmark, not a real stability test. Oh, and Watch Dogs on ultra settings ran smooth, but I only had 20 minutes to test.
> 
> Edit: I did not test all the steps with Kombustor's Furmark, only the ones where I wanted to test long-term stability. Also, VRAM temps of 66 and 62 are pretty much constant.


Nice undervolting results!

If it's game stable, I wouldn't worry about Kombustor; that thing is just a torture device! lol


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice undervolting results!
> 
> If it's game stable, I wouldn't worry about Kombuster, that thing is just a torture device! lol


Thanks. I saw this in a magazine with an MSI R9 390X where the end result was similar. They ended up at 1050/1650 @ -75mV and also achieved a 6°C drop at the same FPS, plus about 80W less power draw (405W instead of 485W).

I was thinking that for hot summer nights, for people living in hotter regions, or as a generally quieter setting for everyday use, this seems like a nice, cheap, and easy win. Overclocking results will follow once I have my new PSU sorted.









I even got a better result in 3DMark with 1040/1720 @ -100mV than with the card's normal OC mode of 1060/1525.
I just don't trust it yet. Does it make sense to just let Heaven run (basically looping, not the benchmark) for 1-2 hours to test stability? For the CPU I use Prime95 or LinX, but for the GPU I only know of Furmark or Kombustor (which includes Furmark as well), and I think they are overkill.


----------



## Gumbi

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Thanks. I saw this in a magazine with an MSI R9 390X where the end result was similar. They ended up at 1050/1650 @ -75mV and also achieved a 6°C drop at the same FPS, plus about 80W less power draw (405W instead of 485W).
> 
> I was thinking that for hot summer nights, for people living in hotter regions, or as a generally quieter setting for everyday use, this seems like a nice, cheap, and easy win. Overclocking results will follow once I have my new PSU sorted.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I even got a better result in 3DMark with 1040/1720 @ -100mV than with the card's normal OC mode of 1060/1525.
> I just don't trust it yet. Does it make sense to just let Heaven run (basically looping, not the benchmark) for 1-2 hours to test stability? For the CPU I use Prime95 or LinX, but for the GPU I only know of Furmark or Kombustor (which includes Furmark as well), and I think they are overkill.


Quote:


> Originally Posted by *Geoclock*
> 
> Is there a "REASON" why R9 390 prices are dropping like flies while GTX 970 prices stay the same?


Yep, Furmark is complete overkill; running Heaven for a few hours is fine as a stability test. Of course some games are slightly more stressful (Crysis 3 would be an example), but if you make it 2 hours into Heaven you're pretty damn good. And if you crash in Crysis, no biggie, just bump the voltage a small bit and you're good


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Yep, Furmark is complete overkill; running Heaven for a few hours is fine as a stability test. Of course some games are slightly more stressful (Crysis 3 would be an example), but if you make it 2 hours into Heaven you're pretty damn good. And if you crash in Crysis, no biggie, just bump the voltage a small bit and you're good


I would go straight to Crysis 3 testing so you are putting the entire system under load.


----------



## BlaXey

Hello, are Lepa PSUs any good? I'm thinking of buying a new one for a more stable PC.


----------



## pengs

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Thanks. Saw this in a magazine with a MSI R9 390X where the end result was similar. They ended up with:
> 1050/1650 @-75mV and they achieved also a drop of 6C° at same FPS and about 80W less power draw (405W instead of 485W)
> 
> Was thinking for hot summer nights, or for people living in hotter regions, and general quieter setting for everyday use this seems nice/cheap and easy to achieve. Overclocking results will follow once I have my new PSU fixed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even got a better result in 3DMark with the 1040/1720 @-100mV then with nornmal OC Mode of 1060/1525 of this Card.
> I just don't trust it yet. Does it make sense to just let Heaven run (basically not do the benchmark) for 1-2 hours to test for stability? For CPU I use Prime 95 or Linx, but for GPU I only know of Furmark or Kombuster (which has furmark aswell), but I also think they are an overkill.


Looks like you've got some quality silicon to be able to undervolt to that level. Shaving 80W from a 390 puts it on par with, or slightly under, a GTX 970 at full load if you look at the peak power consumption here, and your 100mV is probably saving you around 100W... that, sir, is impressive









I used the default clocks to undervolt mine, and at first ran -30 to -37mV on almost every game and stability test, but was unable to run BeamNG.drive for some odd reason (maybe all the HDR/bloom and post-processing) and ended up taking it back to -19mV for stability, recouping about -20W/-2°C, which ties in closely with the article's -75mV/-80W/-6°C. I'd honestly just run a lot of games and play them for a while; if you CTD, check the Event Viewer under System for display driver crashes.

Each game or benchmark puts a different type of load on the GPU; just when you think it's stable...


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I would go straight to Crysis 3 testing so you are putting the entire system under load.


Not a bad idea tbh! I find Crysis sniffs out CPU instability fairly well too.


----------



## Gumbi

Quote:


> Originally Posted by *pengs*
> 
> Looks like you've got some quality silicon to be able to undervolt to that level. I used the default clocks to undervolt mine, and at first ran -30 to -37mV on almost every game and stability test, but was unable to run BeamNG.drive for some odd reason (maybe all the HDR/bloom and post-processing) and ended up taking it back to -19mV for stability. I'd honestly just run a lot of games and play them for a while; if you CTD, check the Event Viewer under System for display driver crashes.
> 
> Each game or benchmark puts a different type of load on the GPU; just when you think it's stable...


I might be looking into undervolting my GPU soon. One of my fan connectors blew on my Vapor-X 290, so only the two outer fans function, resulting in decreased cooling capability.

I can run 1120/1400 handily enough at stock voltage (plus 25mV, which means ~1.15V), but I might see how much of an undervolt I can do at 1000MHz core or so (maybe 1050).


----------



## pengs

Quote:


> Originally Posted by *Gumbi*
> 
> I might be looking into undervolting my GPU soon. One of my fan connectors blew on my Vapor-X 290, so only the two outer fans function, resulting in decreased cooling capability.
> 
> I can run 1120/1400 handily enough at stock voltage (plus 25mV, which means ~1.15V), but I might see how much of an undervolt I can do at 1000MHz core or so (maybe 1050).


Yeah, I was thinking the same thing. 20mV seems to be worth about 30MHz for me, from 1040 to 1070; another 20mV gets me to 1100 without the memory bump, another 30MHz. By that logic, 1000/1300 should be doable at around -45mV, and that's probably a 50W shave, which is really substantial.

The 100W Averdan removed lets the 390 consume about 30W less than a GTX 970... all praise voltage control
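As a rough sanity check on those wattage figures: dynamic power scales roughly with frequency times voltage squared (P ≈ f·V²), so a small undervolt at near-stock clocks yields an outsized power drop. A minimal back-of-the-envelope sketch, where the ~1.2V stock core voltage and ~275W GPU-only board power are my own illustrative assumptions, not measured specs:

```python
# Back-of-the-envelope estimate: dynamic power scales roughly as f * V^2.
# Static/leakage power is ignored, so this is a lower bound on the savings.
STOCK_V = 1.200   # assumed stock core voltage (illustrative, not measured)
STOCK_F = 1040.0  # MHz, the clock used in the posts above
STOCK_P = 275.0   # W, assumed GPU-only board power (illustrative)

def est_power(freq_mhz: float, volts: float) -> float:
    """Scale the assumed stock power by the f * V^2 ratio."""
    return STOCK_P * (freq_mhz / STOCK_F) * (volts / STOCK_V) ** 2

# A -100mV undervolt at the same clock:
undervolted = est_power(1040.0, STOCK_V - 0.100)
print(round(STOCK_P - undervolted, 1))  # -> 43.9 (watts saved by this model)
```

The model undershoots the ~80-100W quoted above because those are wall measurements for the whole system, which also fold in PSU losses and VRM efficiency; the V² trend is the point, not the exact number.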


----------



## dislikeyou

Does the 390 and 390x support directx 12_1 or only 12_0?


----------



## Agent Smith1984

Quote:


> Originally Posted by *dislikeyou*
> 
> Does the 390 and 390x support directx 12_1 or only 12_0?


12_0.

On a side note...

Free 2133MHz 8GB (2x4) DDR3 RAM kit with purchase of Sapphire Nitro!!!

DEAL OF THE DAY









http://www.newegg.com/Product/Product.aspx?Item=N82E16814202148&cm_re=r9_390-_-14-202-148-_-Product


----------



## pengs

Quote:


> Originally Posted by *dislikeyou*
> 
> Does the 390 and 390x support directx 12_1 or only 12_0?


12_0, but the two key features from FL12_1 are algorithms that can be run on the existing architecture, so at the end of the day AMD should at least be able to emulate them, which I believe is how Maxwell does it currently anyhow. I don't think Maxwell 2.0 (the 900 series) has any dedicated hardware for 12_1; if that's true, AMD can easily do the same (which is probably why AMD _meh_'ed 12_1 support with GCN 1.2 (Fury/Fury X) and the refined Hawaii (390/390X))


----------



## dislikeyou

I googled it and it shows the Nvidia 900 series as 12_1, but it could be some emulation like you say; not sure.

AMD also did not implement HDMI 2.0 even though it does not cost anything extra over 1.4, so they must have a reason for leaving out features that the Nvidia 900 series has.
Quote:


> Will current Adopters be required to pay an additional Annual Fee if they choose to adopt the HDMI 2.0 specification?
> No. There is no additional Annual Fee for HDMI Adopters who choose to license the HDMI 2.0 specification.


http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx#138


----------



## sinholueiro

Quote:


> Originally Posted by *dislikeyou*
> 
> Does the 390 and 390x support directx 12_1 or only 12_0?


Nvidia's DX11 architectures only have 11_0 while AMD has 11_1, and no one complained. Is there any game a 780 Ti can't run?
You can read more here: http://www.extremetech.com/extreme/207598-demystifying-directx-12-support-what-amd-intel-and-nvidia-do-and-dont-deliver


----------



## pengs

Quote:


> Originally Posted by *dislikeyou*
> 
> I googled and it shows that Nvidia 900 series is 12_1 but it could be some emulation like you say, not sure.


Correct, because they coined the standard. It would be foolish for any company to spend its resources establishing a standard and then not use it for marketing.
Quote:


> AMD also did not implement HDMI 2.0 even thought it does not cost anything extra over 1.4, they sure must have a reason for leaving out features that Nvidia 900 series have.
> http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx#138


There are some oddities and marketing tricks in that scene on which I can't comment, because I'm a DVI guy


----------



## THUMPer1

1200/1700 on my 390x. Not sure why it says 390


----------



## Agent Smith1984

Quote:


> Originally Posted by *THUMPer1*
> 
> 1200/1700 on my 390x. Not sure why it says 390


That's beasting right there brother!!!

What voltage/temps?

Stable or just benching?


----------



## THUMPer1

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's beasting right there brother!!!
> 
> What voltage/temps?
> 
> Stable or just benching?


GPU temp got to 72C, with +70mV GPU voltage and +75mV aux voltage, fan at 100%.
Stable in the bench, as in Heaven didn't crash. I will try BF4 and see how stable it is. I may try for more; I bet with +75mV GPU voltage I could get 1220... maybe, hah


----------



## LongRod

Quote:


> Originally Posted by *THUMPer1*
> 
> GPU temp got to 72c. +70 GPU voltage, and +75 Aux voltage. Fan at 100%
> Stable in the bench. As in Heaven didn't crash. I will try BF4 and see how stable it is. I may try for more. I bet with +75 GPU Voltage, i could get 1220...Maybe. hah


God, you're lucky; I need +100mV on the core to hit 1200, anything below and it artifacts.

Good thing TRIXX lets you hit +200mV! I wonder if the MSI cooler can keep the core under control with that much voltage and that high of a clock speed (aiming for 1250+)...


----------



## Agent Smith1984

Quote:


> Originally Posted by *LongRod*
> 
> God you're lucky, I need +100mV on the core to hit 1200, anything below and it artifacts.
> 
> Good thing TRIXX lets you hit +200mV! I wonder if the MSI cooler can keep the core under control with that much voltage and that high of a clock speed (aiming for 1250+)...


1250 won't happen on a 390 without custom water cooling (and maybe not even then), and here's why:

1) Hawaii rarely exhibits the type of clock scaling with voltage necessary to run at that clock speed with any true stability, at least without drastically reducing temperatures, and drastically increasing voltage.

2) 390 boards (all but the Sapphire @ 375W) only support a max power capability of 350W on paper. Increasing the core by 100mV+ usually nets me around 357W max board power, and beyond that, the actual voltage going to the core no longer tracks the voltage setting you are using (after 100mV). You can crank the slider to 200mV all you want, but it will not really increase load power much, and in some cases not at all. Even with a 50% power limiter, these boards show signs of power limitations... This was reported by HardOCP when testing the MSI 390X, and while I disagree with the severity of their findings, I do believe the limit is lower than we expected.

3) Temps... okay, with a very well ventilated case, I usually get 75C on the core under load at +100mV core/+50mV aux, and around 73-76C on the VRMs depending on the game (4K seems to stress the card more than anything else).
Those are definitely decent temps on air with that kind of voltage, but the problem is, Hawaii doesn't show much love for 5C and 10C temp drops; it needs 15C+ reductions to show major clocking improvements. Again, the clock ceilings on these cards are reached long before the power/heat limits in most cases. Consider that these cards can run at 90C+ with clocks in the 1000-1150 range, and as long as they don't hit 94C, they are fine. That pretty much proves the point.

4) It has already been widely mentioned that PowerPlay is more aggressive on the 390 cards. Though the BIOS/memory refinements have brought some nice improvements to the "Grenada" cards, the reality is that most people experimenting with a 390 BIOS on 290 cards are finding that while their performance greatly improves, their actual overclocking headroom is reduced. I don't know the exact details of why that is, but I have seen it reported on several occasions in some of the BIOS mod threads.

The good news? Well, while you will be VERY hard pressed to find a 390 that will break a 1210MHz core clock, you will see tons of 390s running at 1150-1200 and scoring as well as some 290s running in the 1200-1250MHz range.

Sadly, we just don't have enough people with water-cooled 390s to see how hard these can be pushed. There is some theory and assumption in my statements, but most of the logic IS fact from what many, including myself, have experienced with the 290 series.

At the end of the day, a 1200+ core clock is an outlier with this series. That is a fact based on all the results we have here and elsewhere on other sites and reviews.
I don't know if that's because the average user doesn't want to push the cards to their limits, because it's summer in many places and just not a good time for overclocking due to the heat, or because these cards are just as limited as the 290 series in their upper-end clock ceiling (in terms of standard voltage increases), but one thing is for sure: most 390 owners are getting to 1150-1190MHz much more easily than 290 owners did in the past.

This reminds me a lot of what AMD achieved over time with the FX-8300 series...

The older FX-8 chips were higher leakage, and with extreme cooling and high voltages they could achieve 5GHz+.

The newer "low leakage" FX-8 chips reach the 4.6-4.8GHz range with MUCH lower voltages than the older ones needed, but most users find that once they are done, they are done, and no amount of juice can get them much past 5GHz.

I believe the newer Hawaii cores share this same characteristic: they are lower-leakage chips that clock higher than the previous silicon at lower voltages, up to a certain point, where they hit a ceiling and are done, while the high-leakage silicon will keep taking voltage (as it bleeds it off) and clocking higher.


----------



## JohnnyMoore

Hello all, I would like to be added to the member list, please. I own two R9 390Xs manufactured by MSI.


----------



## JohnnyMoore

Does anyone know where I can buy a waterblock for the 390X? I don't need to go to my kitchen to fry a steak... When I start a full-load stress test, the upper card very quickly hits its throttling temperature; the second one is okay at about 75 degrees (I own a HAF X case). To work around it, I connected the second card to the third PCI-E 3.0 slot, which only has x4 lanes, and I think my config has a bottleneck because of this: x8/x4 instead of x8/x8 (CPU: i7 4770K @ 4200MHz, motherboard: MSI Z87 MPower Max).


----------



## diggiddi

Quote:


> Originally Posted by *jackalopeater*
> 
> I haven't gotten a chance to test out with Crysis 3, but I have run BF4 on a 64 man server with my stock clocked Asus 390x. I didn't benchmark it, but I did take a string of screenshots with the FPS captured if that helps.
> 
> BF4 High preset at UHD 4k
> 
> 
> http://imgur.com/4fC7I
> 
> 
> I'm getting my new 4k monitor in tomorrow and later this week will be able to get some more results, any suggested lists of titles for single 390x 4k gaming testing? I'm shooting for a guide on what settings in games get you a good single 390x 4k gaming experience


What server was that?


----------



## JohnnyMoore




----------



## jackalopeater

Quote:


> Originally Posted by *diggiddi*
> 
> What server was that?


it was on this server
http://battlelog.battlefield.com/bf4/servers/show/pc/0279eb49-aef1-4c92-bc06-60863919bede/eGO-DICE-Conquest-Popular-Maps-Edge-Gamers-com-EA/


----------



## Geoclock

Hi .
I wanted to buy a used PowerColor 390, but as I found out, PowerColor does NOT offer the second owner ANY warranty.
I know eBay is problematic with stuff like this, especially if the owner does not offer returns.
What to do if I get a black screen or BSOD?
Any recommendations?


----------



## JohnnyMoore

Quote:


> Originally Posted by *Geoclock*
> 
> Hi .
> I wanted to buy a used PowerColor 390, but as I found out, PowerColor does NOT offer the second owner ANY warranty.
> I know eBay is problematic with stuff like this, especially if the owner does not offer returns.
> What to do if I get a black screen or BSOD?
> Any recommendations?


If the card is defective, you mean? I sometimes get black screens too; it's a driver issue, AMD has admitted it and is working on it. I think PowerColor isn't the only one that doesn't support a second owner; it applies to all products and brands. The only solution is to arrange something with the seller, but I don't think he would. Or buy it at your own risk.


----------



## Geoclock

The biggest problem is PowerColor doesn't offer a second-owner warranty, so if something happens in 2-3 months I'll have no support.
Looks like we're all going to have to steer clear of PowerColor.
Any chance to get around PowerColor's policy somehow?


----------



## diggiddi

Just don't buy it
Quote:


> Originally Posted by *jackalopeater*
> 
> it was on this server
> http://battlelog.battlefield.com/bf4/servers/show/pc/0279eb49-aef1-4c92-bc06-60863919bede/eGO-DICE-Conquest-Popular-Maps-Edge-Gamers-com-EA/


thx

Quote:


> Originally Posted by *Geoclock*
> 
> The biggest problem is PowerColor doesn't offer a second-owner warranty, so if something happens in 2-3 months I'll have no support.
> Looks like we're all going to have to steer clear of PowerColor.
> Any chance to get around PowerColor's policy somehow?


Just don't risk it if you have any concerns


----------



## Geoclock

I wanted to buy the XFX 390 from Newegg for $290 after MIR, and today they pushed the price up to $355, higher than the MSI 390. Go figure.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Geoclock*
> 
> I wanted to buy the XFX 390 from Newegg for $290 after MIR, and today they pushed the price up to $355, higher than the MSI 390. Go figure.


In France the R9 390 Tri-X costs approximately 340-350 euros...


----------



## CerealKillah

I can't get 1200 core stable







That makes me one sad dude. Even +200 in TriXX isn't good enough.

Also VRM1 was getting up to 90C...everything else was in the 60's.

I keep throwing voltage at it and it just won't make it through Firestrike.


----------



## JohnnyMoore

Seriously? Running 1150MHz with +31mV here... Don't worry, I get 75.4 FPS at 1150 and 76.1 at 1200, but 1200 needs much more voltage; it doesn't justify the 0.7 FPS difference.
Quote:


> Originally Posted by *CerealKillah*
> 
> I can't get 1200 core stable
> 
> 
> 
> 
> 
> 
> 
> That makes me one sad dude. Even +200 in Trix isn't good enough.
> 
> Also VRM1 was getting up to 90C...everything else was in the 60's.
> 
> I keep throwing voltage at it and it just won't make it through Firestrike.


----------



## cbarros82

My MSI 390 does 1175 with no extra voltage and 1200 with +75mV.


----------



## CerealKillah

I would bet the extreme VRM temps are not helping me at all. I have ordered some thicker pads for VRM1. I will pull the loop apart again (sigh) this weekend and install the new pads.

The core and VRM2 are staying very cool (sub-65C) while VRM1 soars above 90.

http://www.amazon.com/gp/product/B00ZSJPZQ2?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00


----------



## IcarusLSC

VRM2 is a non working sensor on the 390's from all I've found out. It will read one temp and stay there.

I miss my 390 already







This 970 is crap... sigh...


----------



## diggiddi

Quote:


> Originally Posted by *IcarusLSC*
> 
> VRM2 is a non working sensor on the 390's from all I've found out. It will read one temp and stay there.
> 
> I miss my 390 already
> 
> 
> 
> 
> 
> 
> 
> This 970 is crap... sigh...


Why did you switch, and what's it doing to make you miss the old one?


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice undervolting results!
> 
> If it's game stable, I wouldn't worry about Kombuster, that thing is just a torture device! lol


Thx. That's what I thought too... I guess Furmark really hammers voltage/current, which is why it's probably the "hottest" stress test.
Quote:


> Originally Posted by *Gumbi*
> 
> Yep, Furmark is complete overkill; running Heaven for a few hours is fine as a stability test. Of course some games are slightly more stressful (Crysis 3 would be an example), but if you make it 2 hours into Heaven you're pretty damn good. And if you crash in Crysis, no biggie, just bump the voltage a small bit and you're good


Thx. Did 1 hour of Heaven at 1040/1700 @ -100mV.... Lol, you'll see the result below!
Quote:


> Originally Posted by *pengs*
> 
> Looks like you've got some quality silicon to be able to undervolt it to that level. Shaving 80w from a 390 is the equivalent or slightly better than running a GTX970 at full load if you look at the peak power consumption here - and your 100mV is probably redeeming you around 100w... that, sir, is impressive
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used the default clocks to undervolt mine, and at first ran -30 to -37mV on almost every game and stability test, but was unable to run BeamNG Drive for some odd reason (maybe all the HDR/bloom and post-processing) and ended up taking it back to -19mV for stability, recouping about -20W/-2°C, which ties in closely with the article's -75mV/-80W/-6°C. I'd honestly just run a lot of games and play them for a while; if you CTD, check the Event Viewer and System log for display driver crashes.
> 
> Each game or benchmark will put a different type of load on the GPU; just when you think it's stable...


Yeah, while Watch Dogs ran smooth... Dragon Age Inquisition slapped me in the face









Ok guys, thx for all the responses. Seems I hit a small nerve there. After following your suggestions, I ran Heaven for 50 mins or so and everything was fine... THEN, since I don't have Crysis 3, I decided to test on my currently most stressful game for CPU and GPU, everything on ULTRA: Dragon Age Inquisition.
So loading game.... in the main menu... start game... loading.... loading.... BANG. Stuck, no CTD or bluescreen... ok, what to do... reboot.

Seems that undervolting + overclocking just doesn't work with my baby









So I dropped memory back to 1700 (1735 already gave me a bluescreen) and ended up with 1020/1700 @ -81mV. Heaven passed, Dragon Age Inquisition ran smooth for 1 hour, Firestrike passed.

Then, after a night of restless sleep, I got up really early... I could not settle for being 0.2 FPS shy of the official Gaming setting on the card,
so I tested 1030/1700 @ -87mV and BAM... running smooth.









Here the final Excel sheet:


And here the final bench run through Heaven:


Now I am 0.2 FPS... that's right 0.2 FPS better and about 5.5 Celsius cooler...









Here also Firestrike: http://www.3dmark.com/3dm/8238838
Don't know why it shows 1040 MHz instead of 1030. Keep in mind, my board is about 6 years old with PCIe 2.0 x16 rather than 3.0 x16, plus my actual PSU (Be Quiet! Straight Power 10 600W) is in for repair. So I am pretty sure the PSU, mobo and CPU are holding me back a bit.

Maybe you also noticed the decreasing ambient temps... which means this cooling setup is done (DA Inquisition and Heaven no more than 71°C at 25-26°C ambient)... now, once I have time again, I'll start to see what this silicon baby can overclock to.

Thanks again for the tips, guys.
Cheers.
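The trial-and-error hunt above (drop the offset, test, back off once it fails) can be systematized. A minimal sketch in Python; `run_stability_test` is a hypothetical stand-in for hours of Heaven/game runs, and the step sizes are arbitrary:

```python
# Sketch of a step-down undervolt search: keep lowering the offset while a
# stability test passes, and stop at the first failure. In reality each
# "test" is hours of benchmarks and gaming, not a function call.

def find_stable_undervolt(run_stability_test, start_mv=0, floor_mv=-100, step_mv=6):
    """Return the most negative offset (in mV) known to pass the test."""
    best = None
    offset = start_mv
    while offset >= floor_mv:
        if run_stability_test(offset):
            best = offset          # deepest undervolt known to be stable
            offset -= step_mv      # try shaving a bit more voltage
        else:
            break                  # first failure: the previous offset was the limit
    return best

# Toy stand-in: pretend the card is stable down to -87mV, like the post above.
print(find_stable_undervolt(lambda mv: mv >= -87))  # -> -84
```

With a 6mV step the search lands on -84 rather than the true -87 limit; a finer step near the failure point trades test time for precision.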


----------



## thegamehhh

Am I the only one who's getting BSOD 116 while playing games like AC: Unity? I have tried changing drivers, reinstalling Windows, etc. and nothing has changed.


----------



## AverdanOriginal

Quote:


> Originally Posted by *thegamehhh*
> 
> Am I the only one who's getting BSOD 116 while playing games like AC: Unity? I have tried changing drivers, reinstalling Windows, etc. and nothing has changed.


Sadly I have yet to get around to playing AC Unity. The game is supposedly badly coded, but a BSOD 116 isn't normal. Do you still have the heat issue on your card? (I remember you posted you got 94°C or something while playing AC Unity, correct?)

I found a thread where someone had a similar problem: http://www.tomshardware.co.uk/forum/381257-33-bsod-error

Apart from that, I would check the PSU and run some bench and stress tests. Did you uninstall all previous graphics card drivers with DDU? And do you have all the current drivers for your mobo, Windows and so on? Does this only happen in games?
Perhaps you should monitor your card (temps, FPS...) while gaming. Maybe something faulty can be seen?

My guess, though it's far-fetched, is that it's a hardware problem: either a faulty card or old drivers.
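The "monitor temps/FPS while gaming" suggestion boils down to polling a sensor and flagging spikes. A bare-bones sketch; `read_core_temp` is a hypothetical stand-in for whatever tool (GPU-Z, HWiNFO, etc.) actually exposes the reading:

```python
# Poll a temperature source a fixed number of times and collect any readings
# that exceed a limit -- the kind of logging dedicated tools do for you.
import time

def watch_temps(read_core_temp, limit_c=90, samples=5, interval_s=0.0):
    """Collect `samples` readings and return the ones above `limit_c`."""
    spikes = []
    for _ in range(samples):
        t = read_core_temp()
        if t > limit_c:
            spikes.append(t)
        time.sleep(interval_s)   # pace the polling; 0 here for the demo
    return spikes

# Demo with canned readings instead of a real sensor.
readings = iter([71, 74, 93, 76, 95])
print(watch_temps(lambda: next(readings)))  # -> [93, 95]
```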


----------



## JohnnyMoore

Quote:


> Originally Posted by *cbarros82*
> 
> My MSI 390 does 1175 with no added voltage and 1200 with +75mV.


You should probably add some voltage; with insufficient voltage it can perform worse than stock even without showing artifacts.


----------



## Dorland203

Does anyone use Mirillis Action? I observe odd behavior on my MSI 390X when using Mirillis Action to capture video. It records fine in DX9/DX10 games but crashes in DX11 games.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Dorland203*
> 
> Does anyone use Mirillis Action? I observe odd behavior on my MSI 390X when using Mirillis Action to capture video. It records fine in DX9/DX10 games but crashes in DX11 games.


Known issue; wait for AMD's new driver. My games also crash when using Gaming Evolved video recording.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> I can't get 1200 core stable
> 
> 
> 
> 
> 
> 
> 
> That makes me one sad dude. Even +200 in Trix isn't good enough.
> 
> Also VRM1 was getting up to 90C...everything else was in the 60's.
> 
> I keep throwing voltage at it and it just won't make it through Firestrike.


Sadly, 1200 core will probably never happen until your VRMs are sub-80C.

I found that I could not get past 1180 with any stability, at any voltage, until I got VRM1 into the low 70s, and then BAM, there it was!

MSI's Frozr V isn't necessarily the best cooler out there overall, but if you can get high case airflow, it cools the VRMs better than anything I've seen.
Consider that I'm running +100mV core/+50mV AUX, and my VRM1 is around 74C and VRM2 is 48C..... that is pretty impressive for an air cooler.

Mind you it comes with some noise from the fans, but when I'm gaming, all I hear is shots fired!


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sadly, 1200 core will probably never happen until your VRMs are sub-80C.
> 
> I found that I could not get past 1180 with any stability, at any voltage, until I got VRM1 into the low 70s, and then BAM, there it was!
> 
> MSI's Frozr V isn't necessarily the best cooler out there overall, but if you can get high case airflow, it cools the VRMs better than anything I've seen.
> Consider that I'm running +100mV core/+50mV AUX, and my VRM1 is around 74C and VRM2 is 48C..... that is pretty impressive for an air cooler.
> 
> Mind you it comes with some noise from the fans, but when I'm gaming, all I hear is shots fired!


At high temps, resistance values change and leakage current appears; the best results come at the lowest temps. You have maybe a 1% chance of reaching a good overclock at 90C+.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sadly, 1200 core will probably never happen until your VRMs are sub-80C.
> 
> I found that I could not get past 1180 with any stability, at any voltage, until I got VRM1 into the low 70s, and then BAM, there it was!
> 
> MSI's Frozr V isn't necessarily the best cooler out there overall, but if you can get high case airflow, it cools the VRMs better than anything I've seen.
> Consider that I'm running +100mV core/+50mV AUX, and my VRM1 is around 74C and VRM2 is 48C..... that is pretty impressive for an air cooler.
> 
> Mind you it comes with some noise from the fans, but when I'm gaming, all I hear is shots fired!


Did you add me to the list? :)


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Did you add me to the list? :)


Yep, added! Welcome to the club!

Were you able to cool your top card down, and stop throttling?

You are going to need tons of air flow, and a very high speed fan profile for the top card, especially if the cards are close together.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, added! Welcome to the club!
> 
> Were you able to cool your top card down, and stop throttling?
> 
> You are going to need tons of air flow, and a very high speed fan profile for the top card, especially if the cards are close together.


Ty! Yes, I dropped the core voltage to +0mV and it runs pretty well at 1150MHz; it reaches 90 degrees after 30 min of benchmarking. In games it never reaches that temp, with one exception: GTA5 at 3200x1800, all ultra, MSAA x4, standing in grass. I have two 240mm intake fans, one on the front and one on the side aimed at the cards, two 220mm on top, one in back, and three interior fans. My fan profile: if the card reaches 75 degrees, the fans run at 100% =) with 5 degrees of hysteresis.
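A 75C/100% fan rule with 5 degrees of hysteresis is a classic bang-bang controller: the fan latches to full speed at the trip point and only releases once the temp falls back below the trip point minus the hysteresis, which avoids rapid on/off cycling. A minimal sketch (the thresholds and the 40% idle duty are illustrative, not values from any vendor tool):

```python
# Bang-bang fan control with hysteresis: latch to 100% at trip_c, release
# only when the temperature drops below trip_c - hysteresis_c.

class HysteresisFan:
    def __init__(self, trip_c=75, hysteresis_c=5, idle_duty=40):
        self.trip_c = trip_c
        self.release_c = trip_c - hysteresis_c   # 70C with the defaults
        self.idle_duty = idle_duty
        self.full_speed = False                  # latched state

    def duty(self, temp_c):
        """Return fan duty (%) for the current core temperature."""
        if temp_c >= self.trip_c:
            self.full_speed = True
        elif temp_c < self.release_c:
            self.full_speed = False
        return 100 if self.full_speed else self.idle_duty

fan = HysteresisFan()
# Note the fan stays pinned at 100% through 73C and 71C (inside the
# hysteresis band) and only drops back once the core falls below 70C.
print([fan.duty(t) for t in (60, 76, 73, 71, 69)])  # -> [40, 100, 100, 100, 40]
```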


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, added! Welcome to the club!
> 
> Were you able to cool your top card down, and stop throttling?
> 
> You are going to need tons of air flow, and a very high speed fan profile for the top card, especially if the cards are close together.


But there's no way to keep them close together; I sacrificed the x4 lane because of temps. Looking for a waterblock but can't find one.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Ty! Yes, I dropped the core voltage to +0mV and it runs pretty well at 1150MHz; it reaches 90 degrees after 30 min of benchmarking. In games it never reaches that temp, with one exception: GTA5 at 3200x1800, all ultra, MSAA x4, standing in grass. I have two 240mm intake fans, one on the front and one on the side aimed at the cards, two 220mm on top, one in back, and three interior fans. My fan profile: if the card reaches 75 degrees, the fans run at 100% =) with 5 degrees of hysteresis.


Two 390Xs in Crossfire at 1150 core clocks is a pretty powerful setup, bud!

If you want to push the memory further, you can add some AUX voltage in afterburner (around 25mv-50mv seems to work) and push it to 1700-1750mhz.

It will affect VRM temps a little though.

Have you run any other benches? Firestrike? Curious to see the graphics score on two overclocked 390Xs








Quote:


> Originally Posted by *JohnnyMoore*
> 
> But no way to put them close together ,sacrifie x4 line because of temp.Looking for watterblock but cant find it


They will not be making full-cover blocks for the MSI 390s as of now. That likely will not change unless some serious petitioning/interest occurs.
I wish they would, because MSI has obviously binned their cores better than any of the other AMD partners. There is nothing "special" about the power circuitry and cooling on their cards, so it's obvious they are working with well-binned chips. Another supporting factor is their recent release of the MSI Gaming 390 "LE", which is just a lower-clocked 390, probably built from their less-than-stellar Hawaii cores. I'd like to see the LE at $299.99 (instead of $329.99) and a 390X LE at $399.99 (instead of $429.99), but considering the standard 390 has been selling out regularly, I guess they have no need to drop the price; it will likely sell plenty whenever stock of the higher-clocked parts runs out.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Two 390Xs in Crossfire at 1150 core clocks is a pretty powerful setup, bud!
> 
> If you want to push the memory further, you can add some AUX voltage in Afterburner (around 25-50mV seems to work) and push it to 1700-1750MHz.
> 
> It will affect VRM temps a little though.
> 
> Have you run any other benches? Firestrike? Curious to see the graphics score on two overclocked 390Xs


I have a feeling that memory frequency matters more here than at lower resolutions, and that PCIe 3.0 x4 affects performance.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Two 390Xs in Crossfire at 1150 core clocks is a pretty powerful setup, bud!
> 
> If you want to push the memory further, you can add some AUX voltage in Afterburner (around 25-50mV seems to work) and push it to 1700-1750MHz.
> 
> It will affect VRM temps a little though.
> 
> Have you run any other benches? Firestrike? Curious to see the graphics score on two overclocked 390Xs
> 
> 
> 
> 
> 
> 
> 
> 
> .


----------



## JohnnyMoore

Not a stellar score... going to try again with x8/x8 lanes...


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Not a stellar score... going to try again with x8/x8 lanes...


That's an excellent graphics score for (2) Hawaii cards man! You just need to OC your CPU some more.....

Very nice performance scaling!


----------



## IcarusLSC

Quote:


> Originally Posted by *diggiddi*
> 
> Why did you switch an whats it doing to make you miss the old one?


I already had two 970s and had/have hitching/stuttering with them (if you read back many pages), so I swapped them for a 390. The 390 cleared that up, but according to MSI it had an issue with a non-working DVI port, so I returned it and went back to a crappy 970...


----------



## JohnnyMoore

I see a 300-point graphics score difference between x8 and x4.


----------



## JohnnyMoore

But in Firestrike Ultra I can take 2nd place; I think my rig works well now.


----------



## JohnnyMoore

Unigine Heaven: at scene 12 the 1st GPU hits 95 degrees...

Well, going to wait for the end of 2016 and 8GB of HBM on 16nm lithography


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Unigine Heaven: at scene 12 the 1st GPU hits 95 degrees...
> 
> Well, going to wait for the end of 2016 and 8GB of HBM on 16nm lithography


Yeah, it looks like you are going to need to keep the card in the x4 slot.

I would suggest using higher speed/flow exhaust fans.

I dropped my card from 88C peak core with 100mv+ voltage offset, down to 75C peak with 100mv, simply by swapping out my two factory 120mm exhaust fans with Cooler Master Jetflo's....
It also dropped the VRM temp substantially!


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, you are going to need to keep the card in the 4x slot it looks like.
> 
> I would suggest using higher speed/flow exhaust fans.
> 
> I dropped my card from 88C peak core with 100mv+ voltage offset, down to 75C peak with 100mv, simply by swapping out my two factory 120mm exhaust fans with Cooler Master Jetflo's....
> It also dropped the VRM temp substantially!


maybe change thermal paste to liquid metal?


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> maybe change thermal paste to liquid metal?


It wouldn't hurt to change the TIM to some commonly used stuff, but I highly advise against liquid metal solutions on GPUs....


----------



## Agent Smith1984

Who has switched to windows 10 while already owning one of these cards, and noticed some improvements to both performance numbers, and overall "feel?"


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sadly, 1200 core will probably never happen until your VRMs are sub-80C.
> 
> I found that I could not get past 1180 with any stability, at any voltage, until I got VRM1 into the low 70s, and then BAM, there it was!
> 
> MSI's Frozr V isn't necessarily the best cooler out there overall, but if you can get high case airflow, it cools the VRMs better than anything I've seen.
> Consider that I'm running +100mV core/+50mV AUX, and my VRM1 is around 74C and VRM2 is 48C..... that is pretty impressive for an air cooler.
> 
> Mind you it comes with some noise from the fans, but when I'm gaming, all I hear is shots fired!


The thicker pads I ordered SHOULD fix the issue (there is a thread here on OCN where other 290x users experienced the same issue and resolved the problem with 1mm thick thermal pads).

I have the cross hairs set on 1200....


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> The thicker pads I ordered SHOULD fix the issue (there is a thread here on OCN where other 290x users experienced the same issue and resolved the problem with 1mm thick thermal pads).
> 
> I have the cross hairs set on 1200....


As you should my friend


----------



## JohnnyMoore

Jesus, after overclocking the 4770K to 4400MHz the physics score increased +30%, and the total by +2000 pts


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> 
> Jesus, after overclocking the 4770K to 4400MHz the physics score increased +30%, and the total by +2000 pts


Yep.

Get into the 4.8 range and you'll be sitting on close to 20k points, if not more


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It wouldn't hurt to change the TIM to some commonly used stuff, but I highly advise against liquid metal solutions on GPU's....


I'd second that, unless you have done it before on a card worth 20 bucks, or have the cash to buy a new one in case something goes wrong.
I found a page somewhere on the web where they tested different thermal pastes, pads and liquid metal solutions and their actual temp differences on one card... I'll post it once I find it.

Thinking about doing it too, since the MSI R9 390X apparently ships with way too much thermal paste on it (according to some colleagues in the early posts of this thread).


----------



## AverdanOriginal

Quote:


> Originally Posted by *JohnnyMoore*
> 
> 
> Jesus, after overclocking the 4770K to 4400MHz the physics score increased +30%, and the total by +2000 pts


Your scores scare me








i am gonna be having nightmares now.. thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Your scores scare me
> 
> 
> 
> 
> 
> 
> 
> 
> i am gonna be having nightmares now.. thanks


@JohnnyMoore

Are you not validating these online for a reason?

That graphics number looks high, even for 390's with an overclock.... are you running with tess off in CCC?









Not accusing, just very curious. 27k is a great graphics score for (2) Hawaiis fresh out of the box


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> @JohnnyMoore
> 
> Are you not validating these online for a reason?
> 
> That graphics number looks high, even for 390's with an overclock.... are you running with tess off in CCC?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not accusing, just very curious. 27k is a great graphics score for 2) Hawaii's fresh out of the box


Tess is on; I can't validate with a pirated version =)


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Tess is on; I can't validate with a pirated version =)


I'll pretend I didn't read that









Great scores man......


----------



## JohnnyMoore

ty!


----------



## JohnnyMoore

Now overclocking my 4770K... I think 4.4GHz is the maximum for my sample.


----------



## Agent Smith1984

Okay, here it is again...

*THE DEAL OF THE DAY*

*MSI Gaming R9 390X with free Sistorm Gaming Mousepad $399.99 after MIR*

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127872&cm_re=r9_390-_-14-127-872-_-Product


----------



## THUMPer1

great deal!!


----------



## Darkeylel

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Who has switched to windows 10 while already owning one of these cards, and noticed some improvements to both performance numbers, and overall "feel?"


Had huge FPS problems; my card would constantly drop frames every other second. I still have FPS drops in some games on Win7, and it refuses to stay at full clock speeds. Don't know what is going on.......


----------



## SzassTamOC

Hi guys, I just got an MSI R9 390. My BIOS screen on first boot is garbled purple lines, but everything after that is fine: the text during the hardware check phase, OS operation, and gaming all work properly. Does my BIOS need an update or something to work with this card?

Also, I thought this was supposed to be clocked at 1060. Mine is 1040.

Thanks guys.


----------



## cbarros82

Quote:


> Originally Posted by *SzassTamOC*
> 
> Hi guys, I just got an MSI R9 390. My BIOS screen on first boot is garbled purple lines, but everything after that is fine: the text during the hardware check phase, OS operation, and gaming all work properly. Does my BIOS need an update or something to work with this card?
> 
> Also, I thought this was supposed to be clocked at 1060. Mine is 1040.
> 
> Thanks guys.


You need to install the MSI software from the included disc to unlock MSI's factory OC mode, or just use Afterburner.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Darkeylel*
> 
> Had huge FPS problems; my card would constantly drop frames every other second. I still have FPS drops in some games on Win7, and it refuses to stay at full clock speeds. Don't know what is going on.......


What are your temps? Driver? Psu?


----------



## Darkeylel

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are your temps? Driver? Psu?


Hottest I have hit is 80C, latest Win7 driver, and a Thermaltake 1300W PSU.


----------



## leonmorland0

Greetings.
My R9 390X was bought about a month back, and it has been nothing but trouble: random BSODs even while the card is relatively idle (you know, just doing 2D stuff in Windows). A BSOD is almost guaranteed in games that make extensive use of its 8GB of memory, such as MMOs.
I can run everything on highest settings and never go below 60 FPS @ 1080p, but it BSODs at complete random. My motherboard is old; could it be that? My PSU should be able to handle it, as it's a 750W that is basically a Seasonic PSU with the XFX name on it.


----------



## CerealKillah

Quote:


> Originally Posted by *leonmorland0*
> 
> Greetings.
> My R9 390X was bought about a month back, and it has been nothing but trouble: random BSODs even while the card is relatively idle (you know, just doing 2D stuff in Windows). A BSOD is almost guaranteed in games that make extensive use of its 8GB of memory, such as MMOs.
> I can run everything on highest settings and never go below 60 FPS @ 1080p, but it BSODs at complete random. My motherboard is old; could it be that? My PSU should be able to handle it, as it's a 750W that is basically a Seasonic PSU with the XFX name on it.


How old is the PSU? Driver and Windows version?


----------



## Mysticking32

Owner of a Sapphire R9 390 Nitro. Just reporting my results from overclocking.

I had previously been unaware that upping the voltage in a program like TriXX could get rid of artifacts. Up until today I had been using AMD OverDrive, with my max clocks at 1090/1700 and a +50 power limit.

Today I decided to try something different and voila, I got a higher overclock. I upped the core clock until I started getting artifacts (memory stayed at 1700), and each time I got artifacts I upped the voltage by +10mV until they went away.

Long story short, I ended up at 1140/1700 with a voltage increase of +125mV. I had to keep the fan at 100 percent, otherwise temps would go beyond 84C, which for me is unacceptable.

I obviously won't be using this for daily settings; the small performance gain just isn't worth that much of an increase in temperature. (I used The Witcher 3 to test these settings; artifacts show up best in that game for some reason. Not sure why, but it helped me for sure.)

Playing at 1440p.
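The procedure above (raise the clock until artifacts appear, add +10mV until they clear, stop when you hit a voltage cap) can be sketched as a loop. `shows_artifacts` is purely hypothetical: in reality it means eyeballing a game, and the toy "oracle" in the demo is made up to mimic the 1090-at-+0 / 1140-at-+125 numbers from the post:

```python
# Step the core clock up; whenever the tester reports artifacts, add voltage
# in volt_step_mv increments until they clear, and stop once the voltage cap
# (or a sanity clock cap) is reached.

def step_overclock(shows_artifacts, start_mhz=1040, step_mhz=10,
                   max_mhz=1300, max_offset_mv=125, volt_step_mv=10):
    clock, volts = start_mhz, 0
    while clock + step_mhz <= max_mhz:
        candidate = clock + step_mhz
        v = volts
        while shows_artifacts(candidate, v):
            v += volt_step_mv
            if v > max_offset_mv:
                return clock, volts    # out of voltage headroom: stop here
        clock, volts = candidate, v    # candidate clock is clean, keep it
    return clock, volts

# Toy oracle: artifacts whenever the offset is below 2.5mV per MHz over 1090.
print(step_overclock(lambda c, mv: mv < max(0, (c - 1090) * 2.5)))  # -> (1130, 100)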


----------



## kizwan

Quote:


> Originally Posted by *leonmorland0*
> 
> Greetings.
> My R9 390X was bought about a month back, and it has been nothing but trouble. Random BSODs even while the card is relatively idle. (You know, just doing 2D stuff in Windows.) It is almost guaranteed a BSOD for games that make extensive use of its 8gb memory such as MMOs.
> I can run everything on highest settings and never go below 60 fps @ 1080p, but it BSODs at complete random. My motherboard is old, could it be that? My PSU should be able to handle it, as it's a 750w that is basically a Seasonic PSU with the XFX name on it.


You should write down the BSOD bugcode. This way you can check what happened.


----------



## jackalopeater

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Who has switched to windows 10 while already owning one of these cards, and noticed some improvements to both performance numbers, and overall "feel?"


I've gone to Win10 and haven't looked back. Things are a bit better there, smoother, especially on my 295X2, but we're talking about the 390X here.

This is my 3dmark from win7 to win10
http://www.3dmark.com/compare/fs/5583862/fs/5585137

I also did a video comparing performance, and found variances from 0% to 9.25% improvement with the exact same CPU/GPU/memory clocks.


----------



## thegamehhh

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Sadly I have yet to get around to playing AC Unity. The game is supposedly badly coded, but a BSOD 116 isn't normal. Do you still have the heat issue on your card? (I remember you posted you got 94°C or something while playing AC Unity, correct?)
> 
> I found a thread where someone had a similar problem: http://www.tomshardware.co.uk/forum/381257-33-bsod-error
> 
> Apart from that, I would check the PSU and run some bench and stress tests. Did you uninstall all previous graphics card drivers with DDU? And do you have all the current drivers for your mobo, Windows and so on? Does this only happen in games?
> Perhaps you should monitor your card (temps, FPS...) while gaming. Maybe something faulty can be seen?
> 
> My guess, though it's far-fetched, is that it's a hardware problem: either a faulty card or old drivers.


There was something wrong with my case; I fixed that and temps are normal now, like 70-75C under stress.
About the BSOD: I did EVERYTHING I could find searching Google except updating the BIOS, and it seems that was the root of the problem. Been playing Assassin's Creed for about four more hours and nothing has happened.
Also, Unity is kinda buggy. Thanks again for the help.


----------



## Sgt Bilko

Sign me up!













haven't worked out my daily clocks, probably 1100/1500 tbh but once i get a custom fan curve set up and going we'll see


----------



## gatygun

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sign me up!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> haven't worked out my daily clocks, probably 1100/1500 tbh but once i get a custom fan curve set up and going we'll see


Gz, beast of a card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sign me up!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> haven't worked out my daily clocks, probably 1100/1500 tbh but once i get a custom fan curve set up and going we'll see


Oh sarge you didn't!!

Wait a minute.... YOU DID!!









Welcome to the club my friend!


----------



## Sgt Bilko

Quote:


> Originally Posted by *gatygun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Sign me up!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> haven't worked out my daily clocks, probably 1100/1500 tbh but once i get a custom fan curve set up and going we'll see
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gz, beast of a card.

Hopefully yeah









I'll start playing around with voltage tomorrow and see how that goes, but I can confirm the VRM temps are much, much better than the 290X DD's were


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Sign me up!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> haven't worked out my daily clocks, probably 1100/1500 tbh but once i get a custom fan curve set up and going we'll see
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh sarge you didn't!!
> 
> Wait a minute.... YOU DID!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Welcome to the club my friend!

Haha, yeah









Thanks mate, very solid card from what I've seen so far, and 1150/1700 on stock voltage isn't bad


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Haha, yeah
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks mate, very solid card this from what i've seen so far and 1150/1700 for stock voltage isn't bad


Indeed. I love mine man.

I will say though, from all my testing and all I have seen, these iterations of the Hawaii core exhibit the same type of characteristics that post-1429 Visheras do (apples to oranges comparing GPU and CPU, I know, but the analogy really just refers to the "low leakage" qualities these "Grenadas" seem to have)....

They seem to exhibit the following:

They all clock higher on much lower voltage than the previous-gen Hawaii cards (up to a certain point)....
Then once they reach that point (around 1180-1200 on well-binned cards) they aren't budging any further, regardless of how cool they run and how much voltage you send 'em. Some of that may be related to board power, but I believe it's more silicon related.
They also seem to run pretty hot, considering every one of these is on a third-party cooler and people still see temps in the mid-to-high 80s.
I personally keep mine in the mid 70s with some beefed-up case flow, but most people won't go as far as adding (2) 95CFM 120mm fans to their cases.









Let us know how the OC goes for you.
1150 is really nice on stock voltage, and I have said from day one that XFX and MSI 390's are the best binned of the series.
I think they may be the only two brands with cards that will regularly hit 1200MHz with some tweaking. Don't expect much more (if anything at all) past that though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Haha, yeah
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks mate, very solid card this from what i've seen so far and 1150/1700 for stock voltage isn't bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Indeed. I love mine man.
> 
> I will say though, that from all my testing, and all I have seen..... these iterations of the Hawaii cores exhibit the same type of characteristics that post 1429 Visheras do (apples to oranges comparing GPU and CPU I know, but the analogy is really just referring to the "low leakage" qualities these "Grenadas" seem to have)....
> 
> They seem to exhibit the following:
> 
> They all clock higher on much lower voltage than the previous gen Hawaii cards (up to a certain point)....
> Then once they reach that point (around 1180-1200 on well binned cards) they aren't budging any further, regardless of how cool and how much voltage you send 'em. Some of that may be related to board power, but I believe it's more silicon related.
> They also seem to run pretty hot, if you consider that every one of these is on a third party cooler and people still see temps in the mid to high 80s.
> I personally keep mine in the mid 70s with some beefed up case flow, but most people won't go as far as adding (2) 95CFM 120mm fans to their cases.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let us know how the OC goes for you.
> 1150 is really nice on stock voltage, and I have said from day one that XFX and MSI 390's are the best binned of the series.
> I think they may be the only two brands with cards that will regularly hit 1200MHz with some tweaking. Don't expect much more (if anything at all) past that though.
Click to expand...

Good info, and thank you. I'm aiming to pass my 290X's best clocks (1287/1375..... the memory on it was a bit meh), so I'll see how I go. I will say that XFX promised to beef up the VRM cooling on these cards and they delivered: VRM1 on this card has always stayed cooler than the core temp, and that's awesome considering that the 290/X cards' VRM temps would often hit 100C while the core was at 75C :/


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good info, and thank you. I'm aiming to pass my 290X's best clocks (1287/1375..... the memory on it was a bit meh), so I'll see how I go. I will say that XFX promised to beef up the VRM cooling on these cards and they delivered: VRM1 on this card has always stayed cooler than the core temp, and that's awesome considering that the 290/X cards' VRM temps would often hit 100C while the core was at 75C :/


That's great news, because if you want any chance of getting over 1200, you will need to keep VRM under 80C on these.

MSI did a great job with VRM cooling on my card too. The VRM1 temp is always 1C lower than the core, and that's running 100mv+ (stays between 72 and 74C depending on the title and resolution. 4k gives it the most hell)


----------



## AverdanOriginal

Quote:


> Originally Posted by *thegamehhh*
> 
> There was something wrong with my case - I fixed that, and temps are normal now, like 70-75C under stress.
> About the BSOD - I did EVERYTHING I could find while searching on Google except updating the BIOS - seems like that was the root of the problem. Been playing Assassin's Creed for about four more hours and nothing has happened.
> Also, Unity is kinda buggy. Thanks again for the help


Ahh the good old BIOS update trick







should have thought of that. I had the same problem with my old GTX 560. Funnily enough, with my new R9 390 and my 6-7 year old mobo, which stopped being supported in 2010, everything worked fine.


----------



## AverdanOriginal

Quote:


> Originally Posted by *leonmorland0*
> 
> Greetings.
> My R9 390X was bought about a month back, and it has been nothing but trouble. Random BSODs even while the card is relatively idle. (You know, just doing 2D stuff in Windows.) It is almost guaranteed a BSOD for games that make extensive use of its 8gb memory such as MMOs.
> I can run everything on highest settings and never go below 60 fps @ 1080p, but it BSODs at complete random. My motherboard is old, could it be that? My PSU should be able to handle it, as it's a 750w that is basically a Seasonic PSU with the XFX name on it.


Quote:


> Originally Posted by *CerealKillah*
> 
> How old is the PSU? Driver and Windows version?


I would second what CerealKillah asked. It sounds like an unstable PSU or hardware instability. Also, if your mobo is old, check like thegamehhh did whether your BIOS is up to date.
Also check that you really removed all the older graphics drivers.
When I got my new R9 390, I had a GTX 560 in there before. So I booted into safe mode, uninstalled the Nvidia stuff, swapped the cards, and installed the newest driver... it was working fine... then I got some problems like BSODs, long screen freezes, and so on. Eventually I rebooted into safe mode, checked, and saw that Nvidia PhysX was still installed... so I got DDU, wiped my hard drive of all graphics drivers, and reinstalled everything... since then it has run perfectly.


----------



## Derek129

Well, I finally received my Gigabyte R9 390 today, upgrading from my ASUS R9 270. Super excited to finally be able to play some games at decent quality! Does anyone have a good fan curve that they use with their card yet? Also, what is your favorite overclock utility - Afterburner, or...?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Derek129*
> 
> Well I finally received my gigabyte r9 390 today, upgrading from my asus r9 270. Super excited to be able to finally play some games at a decent quality! Does anyone have a good fan curve that they use with their card yet and also what is your favorite overclock utility such as afterburner or?


Added and welcome









As far as fan profile, you are just going to want to cater it to your specific situation.

Create something that scales gradually with core temp, that still gives you a desired mix of acceptable temps and noise.

I like to idle at 25% all the way up to 40C, and from there I graduate to 100% at 90C.

As for overclocking, I suggest Afterburner, but you may not get far with that card. They are volt locked, as stated in the OP. Let us know how it does for you.
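For anyone who wants to prototype a curve before dialing it into Afterburner's graphical editor, here's a minimal sketch of the profile described above (hold 25% up to 40C, then ramp to 100% at 90C). The function and the linear ramp are just my illustration of the shape, not anything Afterburner exposes.

```python
# Sketch of the curve described above: 25% fan up to 40C, then a linear
# ramp to 100% at 90C. Afterburner's editor does this graphically; this
# function is just an illustration of the same shape.
def fan_speed(core_temp_c):
    """Return the fan duty cycle (%) for a given core temperature (C)."""
    idle_speed, idle_limit = 25, 40   # hold 25% up to 40C
    max_speed, max_temp = 100, 90     # reach 100% at 90C
    if core_temp_c <= idle_limit:
        return idle_speed
    if core_temp_c >= max_temp:
        return max_speed
    # linear ramp between the two anchor points
    frac = (core_temp_c - idle_limit) / (max_temp - idle_limit)
    return round(idle_speed + frac * (max_speed - idle_speed))

print(fan_speed(35))  # 25 (idle zone)
print(fan_speed(70))  # 70 (60% of the way up the ramp)
print(fan_speed(95))  # 100 (capped)
```

Shift the anchor points to taste - a steeper ramp trades noise for temps.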


----------



## Derek129

I was aware it was voltage locked, but I was only asking about which program is potentially the best piece of software. Thank you for the reply about the fan curve - that makes a lot of sense. With my 270 I was using Afterburner's default manual fan curve and didn't have any problems with heat. Another question that came to mind this morning: what settings do you guys prefer to turn off within games, such as GTA V?


----------



## THUMPer1

I have a local Fry's with Gigabyte 390X's. I almost picked one up before reading the OP. Glad I didn't. But at least with Fry's you can return things. haha


----------



## leonmorland0

Quote:


> Originally Posted by *CerealKillah*
> 
> How old is the PSU? Driver and Windows version?


Probably around 4 years. Windows 10 Home 64-bit, AMD 15.7.1 drivers.
Quote:


> Originally Posted by *kizwan*
> 
> You should write down the BSOD bugcode. This way you can check what happened.


You are right. I Googled the bugcheck code for a bit but ultimately found inconclusive information. Something about kernel-space invasion. (or something like that)
Quote:


> Originally Posted by *AverdanOriginal*
> 
> I would second what CerealKillah asked. It sounds like an unstable PSU or hardware instability. Also, if your mobo is old, check like thegamehhh did whether your BIOS is up to date.
> Also check that you really removed all the older graphics drivers.
> When I got my new R9 390, I had a GTX 560 in there before. So I booted into safe mode, uninstalled the Nvidia stuff, swapped the cards, and installed the newest driver... it was working fine... then I got some problems like BSODs, long screen freezes, and so on. Eventually I rebooted into safe mode, checked, and saw that Nvidia PhysX was still installed... so I got DDU, wiped my hard drive of all graphics drivers, and reinstalled everything... since then it has run perfectly.


My mobo is quite old for this GPU: it's an Asus M5A97 EVO R2.0.
I could try to reinstall Windows from scratch, I guess. I did the free update but it failed the first time, so I told it to "delete everything" the second time around which pretty much gave me a fresh install.


----------



## Agent Smith1984

Quote:


> Originally Posted by *leonmorland0*
> 
> Probably around 4 years. Windows 10 Home 64-bit, AMD 15.7.1 drivers.
> You are right. I Googled the bugcheck code for a bit but ultimately found inconclusive information. Something about kernel-space invasion. (or something like that)
> My mobo is quite old for this GPU: it's an Asus M5A97 EVO R2.0.
> I could try to reinstall Windows from scratch, I guess. I did the free update but it failed the first time, so I told it to "delete everything" the second time around which pretty much gave me a fresh install.


Are you overclocking anything?

Have you verified stability for the CPU/RAM itself?

Have you reinstalled the games that are crashing?


----------



## leonmorland0

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Are you overclocking anything?
> 
> Have you verified stability for the CPU/RAM itself?
> 
> Have you reinstalled the games that are crashing?


The GPU has been stock since the day I bought it. The next thing I can think of is the CPU: an FX-6300 OC'd to 4.1GHz. Would that cause a bugcheck?
I "reinstalled" them because they were lost in the Windows 10 upgrade - I had to tell the upgrade to "keep nothing."


----------



## JohnnyMoore

Quote:


> Originally Posted by *Derek129*
> 
> I was aware it was voltage locked, but I was only asking about which program is potentially the best piece of software. Thank you for the reply about the fan curve - that makes a lot of sense. With my 270 I was using Afterburner's default manual fan curve and didn't have any problems with heat. Another question that came to mind this morning: what settings do you guys prefer to turn off within games, such as GTA V?


In GTA 5 I turn off AMD shadows and prefer "Very Soft" - it demands fewer resources with no worse quality. Viewing/detail distance at half, since it has too huge an impact on fps with ultra shadows. And grass quality on High, because it uses a lot of resources too; really the only difference from Ultra is shadows on the grass and slightly lower grass resolution, which is hard to see and not important. Other options are on Ultra with Crossfire, playing at 3200x1800 with MSAA x2 and FXAA on.
Look at this guide to see difference and resource demand http://www.geforce.com/whats-new/guides/grand-theft-auto-v-pc-graphics-and-performance-guide


----------



## JohnnyMoore

Quote:


> Originally Posted by *leonmorland0*
> 
> GPU is stock since the day I bought it. Next thing I can think of is CPU: FX6300 OC'd to 4.1Ghz. Would that cause a bugcheck?
> I "reinstalled" them because they were lost on the Windows 10 upgrade. I had to indicate the upgrade to "keep nothing."


I also sometimes had BSODs with the MSI version. I think it's a driver problem with some software - I read about this on the AMD forum and I was not alone. But it only happened to me the first 2 days; after that I reinstalled the driver twice without AMD Gaming Evolved and the problem has gone.


----------



## AverdanOriginal

Quote:


> Originally Posted by *leonmorland0*
> 
> GPU is stock since the day I bought it. Next thing I can think of is CPU: FX6300 OC'd to 4.1Ghz. Would that cause a bugcheck?
> I "reinstalled" them because they were lost on the Windows 10 upgrade. I had to indicate the upgrade to "keep nothing."


Hmmm, could be. But I just checked, and ASUS brought out a new BIOS update for that mainboard on the 27th of March 2015, so that board is way younger than mine







It even has UEFI BIOS








I have an ASUS M4A87TD / USB 3.0 last BIOS update in 2011 and no UEFI









Here's the link (sadly it only goes to the German ASUS page): https://www.asus.com/us/Motherboards/M5A97_EVO_R20/HelpDesk_Download/
The newest BIOS version is 2603. Verify that number against your current BIOS version and, if need be, update it.


----------



## leonmorland0

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hmmm, could be. But I just checked, and ASUS brought out a new BIOS update for that mainboard on the 27th of March 2015, so that board is way younger than mine
> 
> 
> 
> 
> 
> 
> 
> It even has UEFI BIOS
> 
> 
> 
> 
> 
> 
> 
> 
> I have an ASUS M4A87TD / USB 3.0 last BIOS update in 2011 and no UEFI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here the link (sadly only to the German ASUS page): https://www.asus.com/us/Motherboards/M5A97_EVO_R20/HelpDesk_Download/
> The newest BIOS version is 2603. verify that number with your current bios version and if need be update it.


I'm not at my PC right now, but I think I kept version 2201 because it was documented as "stability improvements" while the newest versions bring only USB improvements, which I don't really need. Also, I was told that flashing the BIOS is risky, so I didn't flash the newest versions since I deemed them unnecessary for now.


----------



## AverdanOriginal

Quote:


> Originally Posted by *leonmorland0*
> 
> I'm not at my PC right now, but I think I kept version 2201 because it was documented as "stability improvements" while the newest versions bring only USB improvements, which I don't really need. Also, I was told that flashing the BIOS is risky, so I didn't flash the newest versions since I deemed them unnecessary for now.


Yeah, you are right - only USB improvements. I thought it might also make the board Windows 10 ready or so. There are some hotfixes, but only for APRP, which has nothing to do with the graphics card I guess. Then it could be the 4 year old PSU or, like Agent Smith suggested, the overclocking. What card did you have in your system before?


----------



## leonmorland0

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Yeah, you are right - only USB improvements. I thought it might also make the board Windows 10 ready or so. There are some hotfixes, but only for APRP, which has nothing to do with the graphics card I guess. Then it could be the 4 year old PSU or, like Agent Smith suggested, the overclocking. What card did you have in your system before?


I used to have a Gigabyte HD 7870 OC Edition before the R9 390X. It worked great with my CPU at 4.0 GHz.
Thinking back, the HD 7870 also gave me BSODs at first, but driver updates fixed them. I guess my last resort is doing a full wipe and reinstall, or waiting until newer driver versions are released.


----------



## AverdanOriginal

Hey guys, I tried to overclock a little bit today. I put the core mV back to 0 (after undervolting) and set the power limit to +50. I started by trying to find the sweet spot of the memory clock without adjusting volts...
I also had to stop between 1700-1730 MHz, same as when undervolting. So core mV really only affects the core clock and doesn't affect the memory clock at all? That means even with an undervolt of -87 core mV, the memory would stay the same regardless of whether core mV is -87... 0... or even higher?
*So really only Aux mV helps the memory clock?* Just thought I'd ask, since I figured core mV could also help the memory a bit. On the other hand, it does seem logical considering the naming









One more question: if your PSU has two 6+8 pin VGA cables (so, enough for Crossfire), would that mean they run over two different rails, and hence, if you used both cables for one graphics card, could this improve stability at low as well as high voltage? Just an idea - or do you think it wouldn't make a difference?

Edit: disregard the first paragraph, I think I pretty much answered it myself


----------



## kizwan

Quote:


> Originally Posted by *AverdanOriginal*
> 
> One more question: if your PSU has two 6+8 pin VGA cables (so, enough for Crossfire), would that mean they run over two different rails, and hence, if you used both cables for one graphics card, could this improve stability at low as well as high voltage? Just an idea - or do you think it wouldn't make a difference?


I think you mean a *6+2* PCIe power connector. Two PCIe cables don't necessarily mean they run on separate rails; it depends on whether the PSU has a single +12V rail or multiple. On a multi-rail PSU there are usually labels on the connectors at the PSU end showing which rail each one is connected to. Whether you have single or multiple rails, it's a good idea to use a different cable from the PSU for each PCIe power plug on the GPU. It's a rule of thumb more toward safety than stability, though.


----------



## duox

How are Crossfire drivers, if I'm the kind of guy that buys games after they're about half off 90% of the time? Two 390s would be way cooler to look at than a Fury or 980 Ti lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *duox*
> 
> How are Crossfire drivers, if I'm the kind of guy that buys games after they're about half off 90% of the time? Two 390s would be way cooler to look at than a Fury or 980 Ti lol


I buy games with that same method









Crossfire is perfect for someone like us!


----------



## Derek129

Quote:


> Originally Posted by *kizwan*
> 
> I think you mean a *6+2* PCIe power connector. Two PCIe cables don't necessarily mean they run on separate rails; it depends on whether the PSU has a single +12V rail or multiple. On a multi-rail PSU there are usually labels on the connectors at the PSU end showing which rail each one is connected to. Whether you have single or multiple rails, it's a good idea to use a different cable from the PSU for each PCIe power plug on the GPU. It's a rule of thumb more toward safety than stability, though.


So how important is this? I have two separate cables coming from my PSU, each with two 6+2 connectors branching off. Last night while installing my new GPU I just used one cable to plug in both power inputs on the GPU. Should I change this and use two separate cables coming from the PSU to power the card?


----------



## diggiddi

My system would not power the gpu unless I used 2 separate power connectors on each card


----------



## Agent Smith1984

Quote:


> Originally Posted by *Derek129*
> 
> So how important is this? I have two separate cables coming from my PSU, each with two 6+2 connectors branching off. Last night while installing my new GPU I just used one cable to plug in both power inputs on the GPU. Should I change this and use two separate cables coming from the PSU to power the card?


If you have multiple 12V rails, and each of those rails is rated at a lower total amperage than your card needs, then you would be better off using two separate cables....
If your total amperage is enough on each rail, then it won't matter.
If you only have one 12v rail it also won't matter.

It's completely dependent on the PSU itself, and what your actual power needs are.
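That rule of thumb is easy to sanity-check on paper. A rough sketch (the ~275W card draw is an assumed ballpark for a 390 under load, not an official spec, and the helper names are my own):

```python
# Hypothetical helper illustrating the rail rule of thumb above. Rail
# amperage comes from the PSU label; the 275 W card draw is an assumed
# ballpark for a 390 under load, not an official spec.
def rail_watts(amps, volts=12.0):
    """Maximum wattage a +12V rail can deliver."""
    return amps * volts

def needs_two_cables(card_watts, rail_amps):
    """True when a single rail's rating is below the card's draw,
    so the load should be split across cables on different rails."""
    return rail_watts(rail_amps) < card_watts

print(needs_two_cables(275, 18))  # 18 A rail (216 W) -> True, split the load
print(needs_two_cables(275, 62))  # 62 A rail (744 W) -> False, one cable is fine
```

In practice the PCIe slot also supplies up to 75W, so this is conservative - which is the right direction for a safety check.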


----------



## Derek129

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you have multiple 12v rails, and each of those rails is rated at a lower total amperage than your card needs, then you would be better to use two separate cables....
> If your total amperage is enough on each rail, then it won't matter.
> If you only have one 12v rail it also won't matter.
> 
> It's completely dependent on the PSU itself, and what your actual power needs are.


I have the XFX PRO 750W PSU:

750W of continuous power at 50°C
Tight voltage regulation (±5%)
High efficiency operation up to 85% (80 Plus Bronze)
Single, high power +12V rail (up to 62A/744W)


----------



## Agent Smith1984

Quote:


> Originally Posted by *Derek129*
> 
> I have the xfx pro750 watt psu,
> 
> 750W of continuous power at 50°C
> Tight voltage regulation (±5%)
> High efficiency operation up to 85% (80 Plus Bronze)
> Single, high power +12V rail (up to 62A/744W)


Then your cable choice shouldn't matter.


----------



## SzassTamOC

Does anyone know if the MSI R9 390 has a uefi/legacy bios switch?


----------



## Agent Smith1984

Quote:


> Originally Posted by *SzassTamOC*
> 
> Does anyone know if the MSI R9 390 has a uefi/legacy bios switch?


It does not.


----------



## JohnnyMoore

what's your ASIC and better stable overclock?


----------



## Derek129

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Then your cable choice shouldn't matter.


I just noticed in my case that I had ordered the XFX PRO edition but received the TS750, and I didn't realize that until now - it was months ago that I bought it. I just read online, though, that the TS replaced the PRO edition, so they should be similar in their power output.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> what's your ASIC and better stable overclock?


I've read tons of stuff about the ASIC quality on these cards, and almost everything I've seen says it's completely inaccurate and has no bearing on Hawaii's potential.

At least that was the case with the 290 series anyways. I can't see how that verdict would be any different for these though...

As far as stable overclock, I can run 1200/1750 daily at 100mv/50mv AUX.

It won't even do 2MHz past that on the core though with that voltage (not with full stability anyways), and the memory will clock higher, but actual performance diminishes at 1760+.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've read tons of stuff about the ASIC quality on these cards, and almost everything I've seen says it's completely inaccurate and has no bearing on Hawaii's potential.
> 
> At least that was the case with the 290 series anyways. I can't see how that verdict would be any different for these though...
> 
> As far as stable overclock, I can run 1200/1750 daily at 100mv/50mv AUX.
> 
> It won't even do 2MHz past that on the core though with that voltage (not with full stability anyways), and the memory will clock higher, but actual performance diminishes at 1760+.


Ty for info , mate.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've read tons of stuff about the ASIC quality on these cards, and almost everything I've seen says it's completely inaccurate and has no bearing on Hawaii's potential.
> 
> At least that was the case with the 290 series anyways. I can't see how that verdict would be any different for these though...
> 
> As far as stable overclock, I can run 1200/1750 daily at 100mv/50mv AUX.
> 
> It won't even do 2MHz past that on the core though with that voltage (not with full stability anyways), and the memory will clock higher, but actual performance diminishes at 1760+.


Do you know why the power limit slider disappeared in MSI Afterburner? And why does it run fine in the Heaven bench but can show artifacts in 3DMark at the same frequency?


----------



## diggiddi

Quote:


> Originally Posted by *JohnnyMoore*
> 
> 
> Do you know why the power limit slider disappeared in MSI Afterburner? And why does it run fine in the Heaven bench but can show artifacts in 3DMark at the same frequency?


3DMark is more demanding than Heaven. I was able to pass Heaven and Valley but "artifacted" in FS, so I consider FS to be the standard and only run Heaven/Valley after I pass FS.


----------



## MazrimCF

What are everyone's idle temps? Afterburner is showing 42C at the desktop with zero load and the fans at 41%, using a custom fan profile in which I have the fan at 30% until 30C. My house is kept around 28C. I have the HAF 932 with 3 new BitFenix Spectre Pro 230mm fans and a Cooler Master 140mm. The door and front fans blow in; the top and back fan (140mm) blow out. My CPU temps are 27C (via CoreTemp), and the case temp is around the same using my IR temp gun. Is this normal?


----------



## JohnnyMoore

Quote:


> Originally Posted by *MazrimCF*
> 
> What are everyone's idle temps? Afterburner is showing 42C at the desktop with zero load and the fans at 41%, using a custom fan profile in which I have the fan at 30% until 30C. My house is kept around 28C. I have the HAF 932 with 3 new BitFenix Spectre Pro 230mm fans and a Cooler Master 140mm. The door and front fans blow in; the top and back fan (140mm) blow out. My CPU temps are 27C (via CoreTemp), and the case temp is around the same using my IR temp gun. Is this normal?


If it helps you =) Crossfire in a HAF X, door is open, 27C ambient temp


----------



## AverdanOriginal

Quote:


> Originally Posted by *kizwan*
> 
> I think you mean *6+2* PCIe power connector. Two PCIe cables doesn't mean they run on separate rails. It depends on the PSU whether it's single or multiple +12V rails. On multiple rails PSU, there usually labels on the connectors on the PSU which rails it connected to. Whether you have single or multiple rails, it's good idea to use different cables from the PSU for each PCIe power plug on the GPU. It's rule of thumb more toward safety than stability though.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you have multiple 12v rails, and each of those rails is rated at a lower total amperage than your card needs, then you would be better to use two separate cables....
> If your total amperage is enough on each rail, then it won't matter.
> If you only have one 12v rail it also won't matter.
> 
> It's completely dependent on the PSU itself, and what your actual power needs are.


Thanks for the info, guys. I have a be quiet! Straight Power 10 600W modular with multiple rails and the following setup:
+12V1 @ 18A
+12V2 @ 18A
+12V3 @ 24A
+12V4 @ 24A

So from what you are saying, with my multi-rail PSU it would make sense? For safety reasons


----------



## JohnnyMoore

Just reached new score after some hours of optimisation







19012 vs 18749


----------



## JohnnyMoore

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Thanks for the info, guys. I have a be quiet! Straight Power 10 600W modular with multiple rails and the following setup:
> +12V1 @ 18A
> +12V2 @ 18A
> +12V3 @ 24A
> +12V4 @ 24A
> 
> So from what you are saying, with my multi-rail PSU it would make sense? For safety reasons


and for stable and smooth current


----------



## AverdanOriginal

Quote:


> Originally Posted by *JohnnyMoore*
> 
> 
> 
> Just reached new score after some hours of optimisation
> 
> 
> 
> 
> 
> 
> 
> 19012 vs 18749


Omg. Nice. Now time to play some GTA V or Batman: Arkham Knight (heavy memory abusers) in 4K and turn on 8x MSAA just for the fun of it


----------



## JohnnyMoore

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Omg. Nice. now time to Play some GTAV or Batman Arkham Knight (heavy Memory abusers) in 4K and turn on 8xMSAA just for the fun of it










yeah, GTA 5 is playable at 3200x1800 with 4x, but 8x gets a little bit hard =)


----------



## Derek129

quick question guys, what do you set your power limit to? even on stock clocks


----------



## JohnnyMoore

Quote:


> Originally Posted by *Derek129*
> 
> quick question guys, what do you set your power limit to? even on stock clocks


I set 25%


----------



## Derek129

Also, since I'm new here and hope to have all the questions that pop into my head answered (so I can stop making multiple posts lol): what are the common idle and load temps?


----------



## LongRod

Quote:


> Originally Posted by *Derek129*
> 
> Also since I'm new here and hope to have all my questions answered that pop up in my head so I can stop making multiple posts lol, what are the common idle and load temps?


Most 390's don't spin their fans at idle, so common idle temps are 55-62C (62 being where the fans spin up on my MSI card and cool it back to 60); load temps seem to be 72-75C (stock and overclocked) on the default fan profile.


----------



## stalker27

Why does my MSI R9 390 GAMING with driver 15.17.1 on Windows 10 x64 Pro

show a max Feature Level of 11.1 in DxDiag?

Is that normal?

The R9 390 has Feature Level 12, doesn't it?

Can anyone post their dxdiag so I can compare?


----------



## stalker27

Does driver 15.17.1 support feature level 12_0?


----------



## Noirgheos

What are your idle and load temps guys? MSI model specifically.


----------



## stalker27

Quote:


> Originally Posted by *Noirgheos*
> 
> What are your idle and load temps guys? MSI model specifically.


55 idle, 70 load... What feature level does dxdiag show you? Do you have a 390?


----------



## Noirgheos

No I was just asking, don't have it yet.


----------



## Sgt Bilko

Quote:


> Originally Posted by *stalker27*
> 
> Why does my MSI R9 390 GAMING with driver 15.17.1 on Windows 10 x64 Pro
> 
> show a max Feature Level of 11.1 in DxDiag?
> 
> Is that normal?
> 
> The R9 390 has Feature Level 12, doesn't it?
> 
> Can anyone post their dxdiag so I can compare?


*ALL* GCN GPU's (HD 7000, 200 & 300 Series) support DX12.....you are fine


----------



## blank964

Quote:


> Originally Posted by *diggiddi*
> 
> 3DMark is more demanding than Heaven. I was able to pass Heaven and Valley but "artifacted" in FS, so I consider FS to be the standard and only run Heaven/Valley after I pass FS.


If you can get Crysis 3, I've found that to be the most artifact inducing (real) program I've seen yet.


----------



## SystemTech

Hi guys,

Does anyone know if the Alphacool 290/290X DCUII waterblock will fit on my 390 DCU II?
I know the EK blocks should work as per the front page, but I'm loving the look of the Alphacool blocks:
http://www.alphacool.com/product_info.php/info/p1455_-Alphacool-NexXxoS-GPX---ATI-R9-290X-und-290-M03---mit-Backplate---Schwarz.html

Thanks

PS: Is anyone here using a full-size waterblock to cool their 390/390X, and if so, which brand (card and WB)? Maybe we can compile a list.


----------



## diggiddi

Quote:


> Originally Posted by *blank964*
> 
> If you can get Crysis 3, I've found that to be the most artifact inducing (real) program I've seen yet.


I will test it out at my highest OC and then some


----------



## JohnnyMoore

Quote:


> Originally Posted by *Noirgheos*
> 
> What are your idle and load temps guys? MSI model specifically.


38/31 idle , 89/75 load


----------



## Sgt Bilko

http://www.3dmark.com/fs/5766796

Best I've managed out of this card so far. I forgot to take temp readings, but I know the core never went over 80C and VRM1 never went over 70C with +125mV through Trixx. XFX really did a damn good job with this cooler, but the fans at 100% are a hell of a lot louder than the 290X DD's were


----------



## JohnnyMoore

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> http://www.3dmark.com/fs/5766796
> 
> Best I've managed out of this card so far. I forgot to take temp readings, but I know the core never went over 80C and VRM1 never went over 70C with +125mV through Trixx. XFX really did a damn good job with this cooler, but the fans at 100% are a hell of a lot louder than the 290X DD's were


----------



## stalker27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> *ALL* GCN GPU's (HD 7000, 200 & 300 Series) support DX12.....you are fine


What I mean is, my video card should show feature level 12_0.

Nvidia video cards like the "970" show up as feature level 12_1.

I want to clear up the doubt - can someone check their dxdiag and show me a screenshot, please?

Can anyone please take a picture of their dxdiag screen to compare with mine?





R9 390 is GCN 1.1


----------



## Sgt Bilko

Quote:


> Originally Posted by *stalker27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> *ALL* GCN GPU's (HD 7000, 200 & 300 Series) support DX12.....you are fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What I mean is, my video card should show feature level 12_0.
> 
> Nvidia video cards like the "970" show up as feature level 12_1.
> 
> I want to clear up the doubt - can someone check their dxdiag and show me a screenshot, please?
> 
> Can anyone please take a picture of their dxdiag screen to compare with mine?
Click to expand...

Feature levels are only relevant for developers; they aren't for game rendering purposes. For us, a card either supports DX12 or it doesn't.

And again, ALL GCN GPUs support DX12.....
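To make that concrete, here's a small sketch of how the feature-level talk maps to DX12 support, based only on the claims in this thread (the lookup table and helper are illustrative, not an authoritative spec table):

```python
# Illustrative lookup based on the claims in this thread, not a spec table.
# Feature level strings follow the D3D "major_minor" notation.
MAX_FEATURE_LEVEL = {
    "Hawaii/Grenada (R9 290/390)": "12_0",  # GCN 1.1, described above as Tier 3
    "Maxwell 2 (GTX 970)": "12_1",          # described above as Tier 2
}

def supports_dx12(feature_level):
    """D3D12's minimum hardware requirement is feature level 11_0,
    so any level of 11_0 or above can run the DX12 runtime."""
    major = int(feature_level.split("_")[0])
    return major >= 11

# Both cards in the discussion clear the bar:
for gpu, level in MAX_FEATURE_LEVEL.items():
    print(gpu, level, supports_dx12(level))
```

So a DxDiag readout of "11.1" vs "12_0" changes which optional features a developer can target, not whether the card runs DX12 titles.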


----------



## JohnnyMoore

Quote:


> Originally Posted by *stalker27*
> 
> I should say , 12_0 feature level in my video card
> 
> Nvidia video cards "970" show up as 12_1 feature level.
> 
> I want someone to take the doubt and see your dxdiag and show me some screenshot please.
> 
> anyone can take a picture please dxdiag screen to compare with mine .
> 
> 
> 
> 
> 
> R9 390 is GCN 1.1


Nvidia is 12_1 but 12_0 Tier 2; AMD is 12_0 Tier 3


----------



## stalker27

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Nvidia 12.1 but 12.0 tier 2 , amd 12.0 tier 3


Tell me what appears in your dxdiag, please.

11.1 or 12.0?


----------



## JohnnyMoore

Quote:


> Originally Posted by *stalker27*
> 
> Tell me what appears in your dxdiag, please.
> 
> 11.1 or 12.0?




im on win 8.1


----------



## stalker27

Quote:


> Originally Posted by *JohnnyMoore*
> 
> 
> 
> im on win 8.1


Thanks. I have Windows 10 Pro with the latest 15.17.1 drivers and it shows me feature level 11.1 =(


----------



## Sgt Bilko

Quote:


> Originally Posted by *stalker27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JohnnyMoore*
> 
> 
> 
> im on win 8.1
> 
> 
> 
> Thanks. I have Windows 10 Pro with the latest 15.17.1 drivers and it shows me feature level 11.1 =(

DXDiag hasn't been updated yet to show them as DX12.

I'll put it this way, if you didn't have DX12 you wouldn't be able to run 3DMark's API Test and no-one would be able to run Ashes of the Singularity in DX12


----------



## stalker27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> DXDiag hasn't been updated yet to show them as DX12.
> 
> I'll put it this way, if you didn't have DX12 you wouldn't be able to run 3DMark's API Test and no-one would be able to run Ashes of the Singularity in DX12


Feature level 11.1 is DX12, I know.

But I'd like my video card to be supported at the full DirectX feature level, "12.0".


----------



## Sgt Bilko

Quote:


> Originally Posted by *stalker27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> DXDiag hasn't been updated yet to show them as DX12.
> 
> I'll put it this way, if you didn't have DX12 you wouldn't be able to run 3DMark's API Test and no-one would be able to run Ashes of the Singularity in DX12
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Feature level 11.1 is DX12, I know.
> 
> But I'd like my video card to be supported at the full DirectX feature level, "12.0".

What are you going to use the feature levels for?

Honest question....


----------



## stalker27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What are you going to use the feature levels for?
> 
> Honest question....


See this Nvidia 970 dxdiag: it shows all features supported at 12.1.

Either AMD's drivers don't support all the feature levels yet, or it's a bug in my Windows, or AMD is lying about the supported features.


----------



## Sgt Bilko

Quote:


> Originally Posted by *stalker27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> What are you going to use the feature levels for?
> 
> Honest question....
> 
> 
> 
> See this Nvidia 970 dxdiag: it shows all features supported at 12.1.
> 
> Either AMD's drivers don't support all the feature levels yet, or it's a bug in my Windows, or AMD is lying about the supported features.

Haha, AMD has never said that they support all DX12 feature levels. They said they support DX12, and they do......

You still didn't answer my question: what are you going to use the feature levels for?

They don't affect us end users. If a feature is used that isn't supported, an alternate code path is created to emulate it: same result, but different action.

All that really matters is whether the card supports DX12 or not, and as I've said.....all GCN GPUs do support DX12


----------



## jackalopeater

Quote:


> Originally Posted by *stalker27*
> 
> Feature level 11.1 is DX12, I know.
> 
> But I'd like my video card to be supported at the full DirectX feature level, "12.0".


I think you'd be better in this thread for the answers you're looking for since it is the topic of what you're wanting to know
http://www.overclock.net/t/1567968/directx-12-direct3d-12-feature-levels-and-resources/0_50


----------



## stalker27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Haha, AMD has never said that they support all DX12 feature levels, they said they support DX12 and they do......
> 
> You still didn't answer my question, what are you going to use the Feature Levels for?
> 
> they don't affect us end users, if a feature is used that isn't supported then an alternate code path is created to emulate it, same result but different action.
> 
> All that really matters is if the card supports DX12 or not and as I've said.....all GCN GPU's do support DX12


I think DX12 games in the future will run faster with full support, or with feature level 12.0, than with 11.1.

AMD confirmed 12.0 as the maximum supported feature level, not 12.1.

http://wccftech.com/amd-confirms-gcn-cards-feature-full-directx-12-support-feature-level-111-gcn-10-feature-level-120-gcn-1112/
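For what it's worth, the feature-level matrix being argued over here fits in a tiny lookup table. This is just an illustrative sketch in Python, with the levels taken from this thread and the linked article; the architecture names and the helper function are mine, not any official API:

```python
# Maximum Direct3D 12 feature level per GPU architecture, as reported
# in this thread and the linked article. Illustrative table only.
FEATURE_LEVELS = {
    "GCN 1.0": "11_1",    # HD 7000 series
    "GCN 1.1": "12_0",    # Hawaii/Grenada: R9 290/290X and 390/390X
    "GCN 1.2": "12_0",    # Tonga, Fiji
    "Maxwell 2": "12_1",  # Nvidia GTX 900 series
}

def max_feature_level(arch: str) -> str:
    """Look up the maximum D3D12 feature level for an architecture."""
    return FEATURE_LEVELS[arch]

# Every architecture in the table supports DX12 itself; the feature
# level only describes which optional features a developer can target.
print(max_feature_level("GCN 1.1"))  # prints "12_0"
```

So a 390/390X reports feature level 12_0: full DX12 support with a lower optional-feature ceiling than Maxwell 2's 12_1, but (as noted above) with higher resource binding tiers.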


----------



## Sgt Bilko

Quote:


> Originally Posted by *stalker27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Haha, AMD has never said that they support all DX12 feature levels, they said they support DX12 and they do......
> 
> You still didn't answer my question, what are you going to use the Feature Levels for?
> 
> they don't affect us end users, if a feature is used that isn't supported then an alternate code path is created to emulate it, same result but different action.
> 
> All that really matters is if the card supports DX12 or not and as I've said.....all GCN GPU's do support DX12
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think DX12 games in the future will run faster with full support, or with feature level 12.0, than with 11.1.
> 
> AMD confirmed 12.0 as the maximum supported feature level, not 12.1.
> 
> http://wccftech.com/amd-confirms-gcn-cards-feature-full-directx-12-support-feature-level-111-gcn-10-feature-level-120-gcn-1112/

Listen to this guy....
Quote:


> Originally Posted by *jackalopeater*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stalker27*
> 
> Feature level 11.1 is DX12, I know.
> 
> But I'd like my video card to be supported at the full DirectX feature level, "12.0".
> 
> 
> 
> I think you'd be better in this thread for the answers you're looking for since it is the topic of what you're wanting to know
> http://www.overclock.net/t/1567968/directx-12-direct3d-12-feature-levels-and-resources/0_50

AMD supports more tiers than Nvidia does (which actually does affect us), while the 900 series supports feature level 12_1 (which most games won't use) and AMD doesn't; as I said, AMD would just emulate that feature anyway........

Seriously mate, just drop it. You have no argument here. AMD didn't lie, they didn't mislead anyone, they support DX12, and for all intents and purposes that's all you need









Not replying to you anymore about this now


----------



## stalker27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Listen to this guy....
> AMD supports more tiers than Nvidia does (which actually does affect us), while the 900 series supports feature level 12_1 (which most games won't use) and AMD doesn't; as I said, AMD would just emulate that feature anyway........
> 
> Seriously mate, just drop it. You have no argument here. AMD didn't lie, they didn't mislead anyone, they support DX12, and for all intents and purposes that's all you need
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not replying to you anymore about this now


Do you have a 390 or a 390x?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> http://www.3dmark.com/fs/5766796
> 
> Best I've managed out of this card so far, i forgot to take temp readings but i know the core never went over 80c and VRM1 never went over 70c with +125mV through Trixx, XFX really did a damn good job with this cooler but the fans at 100% are a hell of alot louder than the 290x DD's were


Looking really good sarge.

You can see now what I was saying about hitting 1200 and then they are done, huh?

Really nice to get 14,500 out of a single Hawaii card right out of the box, though..... That was a 1300/1500 run with tess off for most people on a 290X 1-2 years ago.

My 390 @ 1200/1750 pulls around 14,200-something with the lower shader count. It destroys either of my previous Tri-X 290's though, so I am more than happy with this card so far. I know some of that is driver improvement, but AMD did a nice job drawing that last bit of performance out of Hawaii from a hardware standpoint too.

What's really going to be silly is when I add my second one in a few weeks. I'll probably try to run something like 1150/1700 at 30mV/30mV so temps are good, since I seriously
doubt I can keep the top card cool enough to stay at 1200 with 100mV. Fun for now on one card though.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looking really good sarge.
> 
> You can see now what I was saying about hitting 1200 and then they are done, huh?
> 
> Really nice to get 14,500 out of a single hawaii card, right out of the box though..... That was a 1300/1500 run with tess off for most people on 290x 1-2 years ago.
> 
> My 390 @ 1200/1750 pulls around 14,200 something with the lower shader count. It destroys either of my previous tri-x 290's though, so I am more than happy with this card so far. I know some of that is driver improvement, but AMD did a nice job drawing that last bit of performance out of Hawaii from a hardware standpoint too.
> 
> What's really going to be silly, is when I add my second one in a few weeks. I'll probably try and run something like 1150/1700 on 30mv/30mv so temps are good, since I seriously
> doubt I can keep the top card cool enough to stay at 1200 100mv. Fun for now on 1 card though.


even Jesus Christ can't help me


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 
> http://www.3dmark.com/fs/5766796
> 
> Best I've managed out of this card so far, i forgot to take temp readings but i know the core never went over 80c and VRM1 never went over 70c with +125mV through Trixx, XFX really did a damn good job with this cooler but the fans at 100% are a hell of alot louder than the 290x DD's were
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking really good sarge.
> 
> You can see now what I was saying about hitting 1200 and then they are done, huh?
> 
> Really nice to get 14,500 out of a single hawaii card, right out of the box though..... That was a 1300/1500 run with tess off for most people on 290x 1-2 years ago.
> 
> My 390 @ 1200/1750 pulls around 14,200 something with the lower shader count. It destroys either of my previous tri-x 290's though, so I am more than happy with this card so far. I know some of that is driver improvement, but AMD did a nice job drawing that last bit of performance out of Hawaii from a hardware standpoint too.
> 
> What's really going to be silly, is when I add my second one in a few weeks. I'll probably try and run something like 1150/1700 on 30mv/30mv so temps are good, since I seriously
> doubt I can keep the top card cool enough to stay at 1200 100mv. Fun for now on 1 card though.

Yep, you were bang on with the FX analogy









Here is a 1200/1375 run from my 290x DD for comparison: http://www.3dmark.com/fs/5068571 (Graphics Score: 13635)

Seems like the 390/390X memory speed combined with the tightened BIOS really helps there.

I've yet to actually sit down and play any games on it though; had to swap out my PSU......maybe on the weekend
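The gap between those two runs is easy to put a number on. A quick sketch (the 13635 and ~14,200 graphics scores are the ones quoted above; the helper itself is just for illustration):

```python
def percent_gain(new_score: float, old_score: float) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score - old_score) / old_score * 100.0

# 290X DD at 1200/1375: 13635 graphics score.
# 390 at 1200/1750: "around 14,200-something".
print(f"{percent_gain(14200, 13635):.1f}% faster")  # about 4.1%
```

So the 390's faster memory and tightened BIOS are worth roughly 4% at matched core clocks, before counting the driver improvements since the 290X runs.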


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> even Jesus Christ can't help me


Can you get me another pic a little further back that shows your fan placement and layout.

You have the ideal card placement, but I am guessing you have front fans being blocked by cages and wires, and not enough air flow on the exhaust fans.

Trust me, the problem can be dealt with by improving airflow.

Here is what my 290's looked like in crossfire.



Top card never broke 88C and that was with my stock case fans.

I guarantee I could keep it at 80C with my Jetflo's.

You also need to try running at 1100mhz or so on the core with stock volts, so you aren't throttling the top card.

It does no good to overvolt the core, clock it up real high, and then watch it throttle down to 900MHz....


----------



## stalker27

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looking really good sarge.
> 
> You can see now what I was saying about hitting 1200 and then they are done, huh?
> 
> Really nice to get 14,500 out of a single hawaii card, right out of the box though..... That was a 1300/1500 run with tess off for most people on 290x 1-2 years ago.
> 
> My 390 @ 1200/1750 pulls around 14,200 something with the lower shader count. It destroys either of my previous tri-x 290's though, so I am more than happy with this card so far. I know some of that is driver improvement, but AMD did a nice job drawing that last bit of performance out of Hawaii from a hardware standpoint too.
> 
> What's really going to be silly, is when I add my second one in a few weeks. I'll probably try and run something like 1150/1700 on 30mv/30mv so temps are good, since I seriously
> doubt I can keep the top card cool enough to stay at 1200 100mv. Fun for now on 1 card though.


Quote:


> Originally Posted by *JohnnyMoore*
> 
> even Jesus Christ can't help me


45 fps with 2 cards?? Is this game GTA V? What settings are you playing at?

You need to upgrade to Windows 10 to gain more fps..

I have only one 390 and play GTA V at 50-80 fps with everything on ultra


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Can you get me another pic a little further back that shows your fan placement and layout.
> 
> You have the ideal card placement, but I am guessing you have front fans being blocked by cages and wires, and not enough air flow on the exhaust fans.
> 
> Trust me, the problem can be dealt with by improving airflow.
> 
> Here is what my 290's looked like in crossfire.
> 
> 
> 
> Top card never broke 88C and that was with my stock case fans.
> 
> I guarantee I could keep it at 80C with my Jetflo's.







top 200mm @ 110 CFM, front 240mm @ 130 CFM


----------



## JohnnyMoore

Quote:


> Originally Posted by *stalker27*
> 
> 45 fps with 2 cards?? Is this game GTA V? What settings are you playing at?
> 
> You need to upgrade to Windows 10 to gain more fps..
> 
> I have only one 390 and play GTA V at 50-80 fps with everything on ultra


45 fps with throttling... all ultra at 3200x1800 with MSAA x4. Try it with one card and upload an fps screenshot =)


----------



## Sgt Bilko

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Can you get me another pic a little further back that shows your fan placement and layout.
> 
> You have the ideal card placement, but I am guessing you have front fans being blocked by cages and wires, and not enough air flow on the exhaust fans.
> 
> Trust me, the problem can be dealt with by improving airflow.
> 
> Here is what my 290's looked like in crossfire.
> 
> 
> 
> Top card never broke 88C and that was with my stock case fans.
> 
> I guarantee I could keep it at 80C with my Jetflo's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> top 200mm 110cfm , front 240 mm 130cfm

Take a fan and place it at the rear of your drive cages in front of the cards, temps will drop


----------



## stalker27

Quote:


> Originally Posted by *JohnnyMoore*
> 
> all ultra at 3200x1800 with MSAA x4


That's perfect then; I thought you played at 1920x1080.

Good fps


----------



## JohnnyMoore

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Take a fan and place it at the rear of your drive cages in front of the cards, temps will drop


i yet tried this solution...


----------



## JohnnyMoore

Quote:


> Originally Posted by *stalker27*
> 
> That's perfect then; I thought you played at 1920x1080.
> 
> Good fps


Can't justify CrossFire just to play at 1080p...


----------



## Sgt Bilko

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Take a fan and place it at the rear of your drive cages in front of the cards, temps will drop
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i yet tried this solution...

Sorry, did you mean to say that you have tried it or that you haven't tried it?


----------



## CerealKillah

Quote:


> Originally Posted by *JohnnyMoore*
> 
> even Jesus Christ can't help me


Maybe this has been asked already, but what does your case fan configuration look like?


----------



## Agent Smith1984

@JohnnyMoore


----------



## JohnnyMoore

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sorry, Did you mean to say you have tried it or you haven't tried it?


Yes, I have tried it. Sorry for my English


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> @JohnnyMoore


If I were you (and I am NOT), I would look at removing the HDD cage altogether and mounting the hard drives elsewhere (SSDs can easily be mounted on the back side of the motherboard tray with double-sided tape). Those hard drive brackets could be used to mount the hard drives directly to the bottom of the case. Mind you, doing this will require you to drill out rivets (and as an added precaution I would remove all components first), but the airflow from the front case fan would be significantly improved. Right now, that hard drive cage is blocking MOST of the air trying to get to your video cards.

Do you have 2 exhaust fans at the top? It looks like maybe you do.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Sorry, Did you mean to say you have tried it or you haven't tried it?
> 
> 
> 
> Yes, I have tried it. Sorry for my English

It's ok man, I appreciate you trying.....I can only speak English and bad English









Sorry about that......what about a fan in the side of the case then?

The other guys have some great ideas going there too


----------



## JohnnyMoore

Quote:


> Originally Posted by *CerealKillah*
> 
> If I were you (and I am NOT), I would look at removing the HDD cage altogether and mounting the hard drives elsewhere (SSDs can easily be mounted on the back side of the motherboard tray with double-sided tape). Those hard drive brackets could be used to mount the hard drives directly to the bottom of the case. Mind you, doing this will require you to drill out rivets (and as an added precaution I would remove all components first), but the airflow from the front case fan would be significantly improved. Right now, that hard drive cage is blocking MOST of the air trying to get to your video cards.
> 
> Do you have 2 exhaust fans at the top? It looks like maybe you do.


Yes, 2x200mm at the top. Gonna sort out the cable management and relocate the HDD and SSD


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> @JohnnyMoore


Thanks, going to follow your advice


----------



## JohnnyMoore

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's ok man, i appreciate you trying.....i can only speak english and bad english
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry about that......what about a fan in the side of the case then?
> 
> The other guys have some great ideas going there too


I have this door with a 200mm fan, but I feel like it doesn't help; rather the opposite. The hot airflow from the VGA cards goes toward this door, and the 200mm fan blows against it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> 
> I have this door with a 200mm fan, but I feel like it doesn't help; rather the opposite. The hot airflow from the VGA cards goes toward this door, and the 200mm fan blows against it.


Try spinning that door fan around and getting the hot air OUT of the case.

You'd think blowing more cool air on the cards would help more, but I swear I have gotten better results so far by having a higher negative pressure airflow.

It's worth a try anyways....


----------



## stalker27

Is anyone having crash issues with the R9 390 or 390X on the latest 15.17.1 drivers?

In GTA V, Far Cry 4, or Arma 3? When running other programs like Skype or Steam chat etc., the games crash! The driver stops working and recovers..

Or an "ERR gfxd3d init" error in GTA V.


----------



## THUMPer1

That case is a bit cramped.


----------



## LongRod

Quote:


> Originally Posted by *stalker27*
> 
> Is anyone having crash issues with the R9 390 or 390X on the latest 15.17.1 drivers?
> 
> In GTA V, Far Cry 4, or Arma 3? When running other programs like Skype or Steam chat etc., the games crash! The driver stops working and recovers..
> 
> Or an "ERR gfxd3d init" error in GTA V.


In the release notes for the 15.7.1 driver, the third known issue is "[423976] GTA V may crash on extended gameplay on some AMD Radeon™ R9 300 series products".

No fix has been issued as far as I know, and I haven't tried the 15.201.1102 drivers on Guru3D yet.


----------



## stalker27

Quote:


> Originally Posted by *LongRod*
> 
> In the release notes for the 15.7.1 driver, the third known issue is "[423976] GTA V may crash on extended gameplay on some AMD Radeon™ R9 300 series products".
> 
> No fix has been issued as far as I know, and I haven't tried the 15.201.1102 drivers on Guru3D yet.


I get so many crashes when I have other programs running. I hope AMD releases new drivers and fixes this problem


----------



## LongRod

Quote:


> Originally Posted by *stalker27*
> 
> I get so many crashes when I have other programs running. I hope AMD releases new drivers and fixes this problem


Hmm, I've only experienced crashing in GTAV.

I think I'll install these drivers and see if GTAV stops crashing.


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Try spinning that door fan around and getting the hot air OUT of the case.
> 
> You'd think blowing more cool air on the cards would help more, but I swear I have gotten better results so far by having a higher negative pressure airflow.
> 
> It's worth a try anyways....


The problem with negative pressure is the case will suck in all the dust and plug things up. I don't think that HAF X has any sort of real filters on it.

If he exhausts the side panel (that might work well and draw cool air to the area) I would switch the front top fan from exhaust to push. Neutral pressure is best (IMHO) but positive is better than negative for long term.

This is just my opinion. Please feel free to ignore it.


----------



## kizwan

Quote:


> Originally Posted by *Derek129*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I think you mean *6+2* PCIe power connector. Two PCIe cables don't mean they run on separate rails. It depends on whether the PSU has a single +12V rail or multiple. On a multiple-rail PSU, there are usually labels on the PSU's connectors showing which rail each is connected to. Whether you have single or multiple rails, it's a good idea to use different cables from the PSU for each PCIe power plug on the GPU. It's a rule of thumb aimed more at safety than stability though.
> 
> 
> 
> So how important is this? I have two separate cables coming from my PSU, each with two 6+2 connectors branching off, and last night while installing my new GPU I just used one cable to plug in both power inputs on the GPU. Should I change this and use 2 separate cables coming from the PSU to power the card?

It is a rule of thumb; you can get away with not following it. The idea is that, especially when overclocking & overvolting, the card would otherwise draw all the power for its PCIe power connectors through the same wires. Again, like I said, you can get away with using the same cable for both PCIe power connectors. But if you have a spare PCIe power cable, why not use separate cables for each PCIe power connector and divide the load.
Quote:


> Originally Posted by *AverdanOriginal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I think you mean *6+2* PCIe power connector. Two PCIe cables don't mean they run on separate rails. It depends on whether the PSU has a single +12V rail or multiple. On a multiple-rail PSU, there are usually labels on the PSU's connectors showing which rail each is connected to. Whether you have single or multiple rails, it's a good idea to use different cables from the PSU for each PCIe power plug on the GPU. It's a rule of thumb aimed more at safety than stability though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> If you have multiple 12v rails, and each of those rails is rated at a lower total amperage than your card needs, then you would be better to use two separate cables....
> If your total amperage is enough on each rail, then it won't matter.
> If you only have one 12v rail it also won't matter.
> 
> It's completely dependent on the PSU itself, and what your actual power needs are.
> 
> 
> 
> 
> 
> 
> Thanks for the info guys. I have a Be Quiet Straight Power 10 600W Modular with multiple rail and the following Setup:
> +12V1(A) @18
> +12V2(A) @18
> +12V3(A) @24
> +12V4(A) @24
> 
> *So from what you are saying, with my multiple-rail PSU it would make sense, for safety reasons?*

You're misunderstanding me. It's not about single or multiple rails. As long as the PSU is good quality, it doesn't matter whether it's single or multiple rails. I'm talking about the PCIe power cables. Read my reply above.

Anyway, with that PSU, you should use one PCIe power cable from +12V3 & another one from +12V4 for your card (single card, right?). The 18 Amp ones seem too low IMO. I don't remember how much power a single Hawaii card can draw, especially when overclocking. You can find this info in the 290/290X club for your reference. You'll probably need to search, since it was buried deep in the thread.
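The arithmetic behind this advice is just watts = volts × amps per rail. A rough sketch using the rail ratings quoted above; the ~300 W figure for a heavily overclocked Hawaii card is my assumption, not a measured number:

```python
def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Wattage budget of a +12V rail rated for the given amperage."""
    return amps * volts

# Rail ratings quoted above for the Straight Power 10 600W.
rails = {"+12V1": 18, "+12V2": 18, "+12V3": 24, "+12V4": 24}
card_draw = 300.0          # W, rough overclocked Hawaii estimate (assumption)
per_cable = card_draw / 2  # load when split across two separate PCIe cables

for name, amps in rails.items():
    budget = rail_watts(amps)
    print(f"{name}: {budget:.0f} W budget, {budget - per_cable:.0f} W headroom")
```

Split across two cables, each connector only has to carry ~150 W, which even an 18 A (216 W) rail can cover; it's when one cable carries the whole card, or a rail also feeds other loads, that the smaller rails get tight.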


----------



## Agent Smith1984

Quote:


> Originally Posted by *CerealKillah*
> 
> The problem with negative pressure is the case will suck in all the dust and plug things up. I don't think that HAF X has any sort of real filters on it.
> 
> If he exhausts the side panel (that might work well and draw cool air to the area) I would switch the front top fan from exhaust to push. Neutral pressure is best (IMHO) but positive is better than negative for long term.
> 
> This is just my opinion. Please feel free to ignore it.


Yes, negative pressure definitely can create a good bit of dust....

I don't really experience true negative pressure per se; I just have the fans' CFM/airflow/directions set up in such a way that _would_ create negative pressure if I didn't have an opening behind my CPU socket that is drilled out with an additional 120mm intake fan









With the case not being sealed, it can't really create true negative pressure.

Now if it were not for the access behind the socket, I think this little S340 seals up well enough to actually create some true negative pressure.
I know this because when I added the (2) 95 CFM exhaust fans to the rear and top of the case, it increased my front radiator fan speeds by around 100RPM each.....
Basically letting me know that those jokers are helping to suck additional air through the largest intake portion of the case.
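The positive-vs-negative pressure question comes down to comparing total intake CFM against total exhaust CFM. A minimal sketch; the fan numbers are hypothetical, loosely based on the ones mentioned in this thread:

```python
def pressure_balance(intake_cfm, exhaust_cfm):
    """Net case airflow in CFM: > 0 means positive pressure (more intake),
    < 0 means negative pressure (more exhaust)."""
    return sum(intake_cfm) - sum(exhaust_cfm)

# Hypothetical setup: one 130 CFM front intake vs. a 110 CFM top fan
# plus two 95 CFM exhausts.
net = pressure_balance(intake_cfm=[130], exhaust_cfm=[110, 95, 95])
print(f"net {net} CFM ->", "negative" if net < 0 else "positive/neutral", "pressure")
```

In practice rated CFM is optimistic (filters and restrictions cut real airflow), so this only tells you which way the balance leans, not the actual pressure.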


----------



## jon4501

Question for you guys:

I've just purchased an R9 390x to upgrade from a lacking 6970. Do you foresee any bottleneck issues with my current FX 6100 CPU? Should I look into upgrading to an 8350?


----------



## Agent Smith1984

Quote:


> Originally Posted by *jon4501*
> 
> Question for you guys:
> 
> I've just purchased an R9 390x to upgrade from a lacking 6970. Do you foresee any bottleneck issues with my current FX 6100 CPU? Should I look into upgrading to an 8350?


You may very well see some bottlenecking with that CPU.

What motherboard and CPU cooler are you currently using, though?

Those two factors will greatly impact whether or not it's worth it to get an FX-8 (of which I suggest just getting the $115 FX-8300 from Tiger Direct, if you are getting one).


----------



## jon4501

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You may very well see some bottlenecking with that CPU.
> 
> What motherboard and CPU cooler are you currently using, though?
> 
> Those two factors will greatly impact whether or not it's worth it to get an FX-8 (of which I suggest just getting the $115 FX-8300 from tiger direct, if you are getting one).


I'm running an Asus M5A99X EVO R2.0 Motherboard and an FX-6100 CPU with Coolermaster Hyper N520 cooler. What do you think?

The rest of my system is: 32GB 1866 MHz DDR3 (Overkill, I know), 500GB Samsung 850 SSD, 2TB WD Black HDD, Asus PCE-AC68 Wifi card, Antec 750 Earthwatts PSU


----------



## Agent Smith1984

Quote:


> Originally Posted by *jon4501*
> 
> I'm running an Asus M5A99X EVO R2.0 Motherboard and an FX-6100 CPU with Coolermaster Hyper N520 cooler. What do you think?
> 
> The rest of my system is: 32GB 1866 MHz DDR3 (Overkill, I know), 500GB Samsung 850 SSD, 2TB WD Black HDD, Asus PCE-AC68 Wifi card, Antec 750 Earthwatts PSU


Well depending on your budget, there are few things you could do....

I won't get too deep into it since it's a bit off topic in this thread to get on a CPU (AMD vs Intel) BS tangent...

If you have the budget, get an FX-8300 and a decent 240mm AIO cooler to keep it cool and OC some (maybe 4.6-4.8GHz on that board if lucky)

If you are dead set on an AMD CPU, then get an 8300 and a better board and cooler, if budget permits.

If your budget stretches even further, then you could look at going with an Intel CPU and motherboard.

Again, I am not going to go in depth about it, but my vote would be to keep it cheap, add an FX-8 and a decent water cooler, and call it a day


----------



## jon4501

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well depending on your budget, there are few things you could do....
> 
> I won't get too deep into it since it's a bit off topic in this thread to get on a CPU (AMD vs Intel) BS tangent...
> 
> If you have the budget, get an FX-8300 and a decent 240mm AIO cooler to keep it cooler and OC some (maybe 4.6-4.8 on that board if lucky)
> 
> If you are dead set on AMD CPU, then get an 8300 and a better board, and cooler, if budget permits.
> 
> If your budget stretches even further, then you could look at going with an Intel CPU and motherboard.
> 
> Again, I am not going to go in depth about it, but my vote would be to keep it cheap, add an FX-8 and a decent water cooler, and call it a day


Thank you sir! I appreciate the help!


----------



## battleaxe

Okay, I've got some money burning in the pocket from my last birthday. I want to get something cool for my PC. Question is: What?

*My thought right now are:*

290x to go tri-Xfire in my largest PC
390X also to tri-Xfire in the largest PC
Fury just for fun to put in a different PC
Something else?
I don't really need anything, just want to have some fun for the birthday.

What should I do? What would you do?

Edit: surely someone has an opinion or an idea?

I'm leaning toward a 390X tri-fired with my existing 290X and 290. My 290 is pretty strong, so it's not really holding the 290X back right now at all. Plays very smooth in games. It's been a great experience so far with Xfire.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, I've got some money burning in the pocket from my last birthday. I want to get something cool for my PC. Question is: What?
> 
> *My thought right now are:*
> 
> 290x to go tri-Xfire in my largest PC
> 390X also to tri-Xfire in the largest PC
> Fury just for fun to put in a different PC
> Something else?
> I don't really need anything, just want to have some fun for the birthday.
> 
> What should I do? What would you do?
> 
> Edit: surely someone has an opinion or an idea?
> 
> I'm leaning toward a 390x tri-fired with my existing 290x and 290. My 290 is pretty strong, so its not really holding the 290x back right now at all. Plays very smooth in games. Its been a great experience so far with Xfire.


It's hard for me to recommend the 390x to somebody with so many 290's since crossfiring them takes away a little from the real purpose of the 390...

How about a good deal on another 290x?

I'm mobile so i can't see your system spec, but faster cpu, better mobo, or better cooling is always fun too...

Fury seems like a mediocre option to me right now with all the coil whine going around, and lack of voltage control.


----------



## Noirgheos

To those with the XFX 390X model... idle and load temps?


----------



## Gumbi

Quote:


> Originally Posted by *Noirgheos*
> 
> To those with the XFX 390X model... idle and load temps?


Can't give you exact numbers, but AFAIK XFX really upped the ante for this series of cards, fixing the VRM issues that plagued the 290(x) XFX DD series.

From what I've seen I would put them on par with the Nitro and MSI series (I don't rate the Asus one much, and Gigabyte lock the voltage sooo...).


----------



## Noirgheos

Quote:


> Originally Posted by *Gumbi*
> 
> Can't give you exact numbers, but AFAIK XFX really upped the ante for this series of cards, fixing the VRM issues that plagued the 290(x) XFX DD series.
> 
> From what I've seen I would put them on par with the Nitro and MSI series (I don't rate the Asus one much, and Gigabyte lock the voltage sooo...).


Would you take the MSI or XFX 390X at the same price?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Would you take the MSI or XFX 390X at the same price?


It's a total draw really. Whichever card you think looks best.

Though there is a full cover block for the XFX 390 and not for the MSI.
I did hear something about the improved VRM layout on the XFX not allowing the old XFX waterblock to work though.

I probably need to do some more research on that and get the OP updated.

I would say if the MSI works in your build/color scheme, go with it, if the XFX works better, get it.

As far as clocking, XFX and MSI are clearly the best for this series right now, and their cooling capabilities are almost identical for core and for VRM.


----------



## CerealKillah

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's a total draw really. Whichever card you think looks best.
> 
> Though there is a full cover block for the XFX 390 and not for the MSI.
> I did hear something about the improved VRM layout on the XFX not allowing the old XFX waterblock to work though.
> 
> I probably need to do some more research on that and get the OP updated.
> 
> I would say if the MSI works in your build/color scheme, go with it, if the XFX works better, get it.
> 
> As far as clocking, XFX and MSI are clearly the best for this series right now, and their cooling capabilities are almost identical for core and for VRM.


There is a revision of the XFX 390 cards that are not compatible with EK waterblocks due to a new inductor. I thankfully have the reference card with the reference inductor and the waterblock fits great (outside of the VRM1 pads...).

If you pull the stock XFX cooler off, you can tell pretty quickly if the full waterblocks are going to work or not.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's hard for me to recommend the 390x to somebody with so many 290's since crossfiring them takes away a little from the real purpose of the 390...
> 
> How about a good deal on another 290x?
> 
> I'm mobile so i can't see your system spec, but faster cpu, better mobo, or better cooling is always fun too...
> 
> Fury seems like a mediocre option to me right now with all the coil whine going around, and lack of voltage control.


Maybe a dumb question. But in your OP, what's the real purpose of the 390 series over 290 series?

Agreed about Fury, pretty meh at this point compared to either a 290x or 390x.


----------



## flopper

Quote:


> Originally Posted by *battleaxe*
> 
> Maybe a dumb question. But in your OP, what's the real purpose of the 390 series over 290 series?
> 
> Agreed about Fury, pretty meh at this point compared to either a 290x or 390x.


I would say 390 is a fixed 290 and upgraded.
the old series had some issues which seems fixed.


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> I would say 390 is a fixed 290 and upgraded.
> the old series had some issues which seems fixed.


Agreed, and it fills a market segment that AMD needs.


----------



## JohnnyMoore

well... after 6 hours of cable management and airflow improvement (2 additional fans), I have 85/75 degrees








at stock core voltage and a 3% performance decrease. I'll upload photos tomorrow


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> well... after 6 hours of cable management and airflow improvement (2 additional fans), I have 85/75 degrees
> 
> 
> 
> 
> 
> 
> 
> at stock core voltage and a 3% performance decrease. I'll upload photos tomorrow


Hell yeah!

Now find max oc on stock voltage and game on my friend!


----------



## Sgt Bilko

Quote:


> Originally Posted by *JohnnyMoore*
> 
> well... after 6 hours of cable management and airflow improvement (2 additional fans), I have 85/75 degrees
> 
> 
> 
> 
> 
> 
> 
> at stock core voltage and a 3% performance decrease. I'll upload photos tomorrow


Good stuff man!


----------



## Cannon19932006

http://www.3dmark.com/3dm/8277673

I made my card suffer through a run of 1200/1750 with +200mv core and +50aux. It was hot, low 90s on core and vrm with side off and fan blowing air on card, but it completed with no artifacts and gave me a score.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> http://www.3dmark.com/3dm/8277673
> 
> I made my card suffer through a run of 1200/1750 with +200mv core and +50aux. It was hot, low 90s on core and vrm with side off and fan blowing air on card, but it completed with no artifacts and gave me a score.


Good score, but it looks like the card might not be happy there; I've pulled 14,200 gfx on my 390 at 1200/1700. I'll have to dig up the link.

Sarge pulled 14,550, I think, on his at 1200/1700.

I bet if you back the voltage down to 150 or less (125mv is my max "sweet spot") then you could land a solid 1180-1200 and score higher.

These things hate 200mv (whereas the 290s didn't care as long as they were cool enough).

Still nice to see you push it though bro!
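The kind of sweet-spot hunting described above (pick a voltage offset, step the core clock up, retest, repeat) can be sketched as a simple loop. This is purely illustrative: `is_stable` is a hypothetical stand-in for actually running FireStrike or Heaven at each setting, and its numbers only loosely mimic the behavior reported in this thread (diminishing returns past +125mV, a wall near 1200MHz):

```python
# Illustrative sketch of a manual sweet-spot search: for each voltage
# offset, find the highest core clock that passes a stability test.
# `is_stable` is a made-up placeholder for real benchmarking; do not
# read its numbers as measurements.
def is_stable(core_mhz, offset_mv):
    # Toy model: each mV of offset buys a little headroom over a
    # 1100MHz base, with no further gains past +125mV (the "wall"
    # behavior several posters describe).
    cap = 1100 + min(offset_mv, 125) * 0.8
    return core_mhz <= cap

def max_stable_clock(offset_mv, start=1100, step=10, limit=1300):
    """Step the clock up until the (placeholder) stability test fails."""
    best = None
    for clock in range(start, limit + 1, step):
        if is_stable(clock, offset_mv):
            best = clock
        else:
            break
    return best

for mv in (0, 50, 100, 125, 200):
    print(mv, max_stable_clock(mv))
```

The point of the sketch is just the search structure: in the toy model, +200mV lands on the same max clock as +125mV, which is the "these things hate 200mv" pattern in prose form.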


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good score, but it looks like the card might not be happy there, i have pulled 14,200 gfx on my 390 with 1200/1700, I'll have to dig up the link.
> 
> Sarge pulled 14,550 i think, on his at 1200/1700.
> 
> I bet if you back the voltage down to 150 or less (125mv is my max "sweet spot") then you could land a solid 1180-1200 and score higher.
> 
> These things hate 200mv (whereas the 290s didn't care as long as they were cool enough).
> 
> Still nice to see you push it though bro!


It wouldn't finish the run at 175 or 150; I tried those first, and the driver crashed both times.

And the score was only about 50 points higher than my previous best graphics score at 1170, so I figured it didn't care for the voltage much.


----------



## SystemTech

Quote:


> Originally Posted by *SystemTech*
> 
> Hi guys,
> 
> Does anyone know if the Alphacool 290/290X DCuII waterblock will fit on my 390 DCu II.
> I know the EK blocks should work as per the font page but im loving the look of the Alphacool blocks :
> http://www.alphacool.com/product_info.php/info/p1455_-Alphacool-NexXxoS-GPX---ATI-R9-290X-und-290-M03---mit-Backplate---Schwarz.html
> 
> Thanks
> 
> PS is anyone here using a full size WB to cool their 390/390X and if so, which brand (card and WB). Maybe we can compile a list


----------



## robmcrock

Hi guys.

Msi r9 390 1170/1575

Would someone mind posting a pic of a custom fan curve in after burner if you have one.

I've never really done a custom fan curve before and I'm finding because my 390 overclocked idles just above 60°C, the fan keeps stopping and starting when just using the desktop.

Cheers


----------



## JohnnyMoore

Quote:


> Originally Posted by *robmcrock*
> 
> Hi guys.
> 
> Msi r9 390 1170/1575
> 
> Would someone mind posting a pic of a custom fan curve in after burner if you have one.
> 
> I've never really done a custom fan curve before and I'm finding because my 390 overclocked idles just above 60°C, the fan keeps stopping and starting when just using the desktop.
> 
> Cheers



My curve: idle at 40 degrees, 30% fan speed (no noticeable noise); less demanding games like LoL and Dota around 60% fan speed; AAA titles and the rest, 100%








But it's personal; it depends on what fan speed/temp ratio you want. You can stay at 60 idle if the noise bothers you. Personally I set 30% +/-5%; that fan speed is not significant, and I love my stuff. Better cooled is good!
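For anyone who prefers to reason about a curve numerically rather than eyeballing Afterburner's graph, a custom fan curve is just a piecewise-linear map from temperature to fan duty. A minimal sketch, with made-up illustrative points (not anyone's actual profile and not tied to Afterburner's internals):

```python
# Sketch of a piecewise-linear fan curve: maps GPU temperature (degrees C)
# to fan duty cycle (%), the same shape a curve editor draws.
# The points below are illustrative values only, not a recommended profile.
CURVE = [(40, 30), (60, 60), (75, 80), (85, 100)]  # (temp_c, fan_pct)

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan % between curve points; clamp at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(40))   # lowest point: 30% at idle temps
print(fan_speed(50))   # interpolated between the 40C and 60C points
print(fan_speed(90))   # clamped at 100% above the last point
```

Raising the first point's fan % (or lowering its temperature) is what stops the stop/start behavior at a 60C idle that was asked about above: the fan never drops below the floor you set.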


----------



## JohnnyMoore

Quote:


> Originally Posted by *Cannon19932006*
> 
> http://www.3dmark.com/3dm/8277673
> 
> I made my card suffer through a run of 1200/1750 with +200mv core and +50aux. It was hot, low 90s on core and vrm with side off and fan blowing air on card, but it completed with no artifacts and gave me a score.


If you don't notice artifacts with the naked eye, it doesn't mean they aren't there. The highest voltage (with high temps) doesn't automatically mean a better result, etc. Try with lower voltage, and maybe lower frequency, and upload your results!


----------



## CerealKillah

Quote:


> Originally Posted by *SystemTech*


I don't think you will really know until you pull the stock cooler off the 390 and compare PCBs. The EK configurator should show you what the PCB looks like. You will, of course, want to verify that your actual PCB matches the one shown.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's hard for me to recommend the 390x to somebody with so many 290's since crossfiring them takes away a little from the real purpose of the 390...
> 
> How about a good deal on another 290x?
> 
> I'm mobile so i can't see your system spec, but faster cpu, better mobo, or better cooling is always fun too...
> 
> Fury seems like a mediocre option to me right now with all the coil whine going around, and lack of voltage control.


just can't decide... gonna wait a little bit and see what happens to the 290x prices as you said. The last 290x I bought is quite nice; it scores just as high as most 390x cards on air, from what I can tell. But in order to get any more out of the card I would need a full water block; the VRMs just can't go much above 150mv without getting too hot. And I don't think any blocks are going to be available for the new version of the Tri-X 290x either, so that's likely not going to happen.

Also, it may be pointless to have TriXfire on 4k, I can play most games now already maxed out that I play. So, maybe a dumb idea to begin with...

For now, I'll keep working on cleaning up the inside appearance of the case, might get some gentle typhoon fans to lower inside noise too to hold me off just a bit longer...









Still tempted to get a 390x just to have one. And to have an 8GB card just for the heck of it.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hell yeah!
> 
> Now find max oc on stock voltage and game on my friend!


Final pict


----------



## LongRod

I'm curious, does my graphics score for Firestrike look normal for 1100/1525?

Just wanting to compare to you guys


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *JohnnyMoore*
> 
> Final pict






Looks very nice... I need to clean mine up the same like this. Good job.

Have your temps improved? I noticed you added a fan at the end of the cards, and also looks like some are pushing air between the cards too?


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Final pict


Dude, you did a great job cleaning that thing up, and getting some better airflow. Glad it paid off.....


----------



## Agent Smith1984

Quote:


> Originally Posted by *LongRod*
> 
> I'm curious, does my graphics score for Firestrike look normal for 1100/1525?
> 
> Just wanting to compare to you guys


Yes the graphics score itself looks great for those clock speeds


----------



## JohnnyMoore

Quote:


> Originally Posted by *LongRod*
> 
> I'm curious, does my graphics score for Firestrike look normal for 1100/1525?
> 
> Just wanting to compare to you guys


Good graphics score! With 2 GPUs I get 27,500 (13,700 each, logically) at 1160/1650


----------



## JohnnyMoore

Quote:


> Originally Posted by *battleaxe*
> 
> 
> Looks very nice... I need to clean mine up the same like this. Good job.
> 
> Have your temps improved? I noticed you added a fan at the end of the cards, and also looks like some are pushing air between the cards too?


Ty, yes I added one 80x15mm fan at the end between the cards, aimed at the GPU main fan, and another 120mm at the front of the cards, inside a box to direct fresh outside airflow between them. Before: 95 (throttling)/82 after 5 min of a stress test (with 90% mem load, harder than normal benches) at +63mV core, +13mV mem. Now: 86/75 after 30 min of the same test. In demanding games, 81/71 +/-3 degrees


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Dude, you did a great job cleaning that thing up, and getting some better airflow. Glad it paid off.....


Ty mate ! You also participated in this


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Ty mate ! You also participated in this


Maybe I asked this already, but are you using 4K yet?


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Maybe I asked this already, but are you using 4K yet?


Presently using a 27-inch 1080p; going to switch to the Asus MG279Q (27-inch 1440p 144Hz FreeSync) in 10 days. I think 4K is too demanding for the GPUs currently on the market, and there are no 120/144Hz panels for it. Personally I would prefer 21:9 144Hz with no backlight bleed, but that doesn't exist


----------



## Agent Smith1984

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Presently using a 27-inch 1080p; going to switch to the Asus MG279Q (27-inch 1440p 144Hz FreeSync) in 10 days. I think 4K is too demanding for the GPUs currently on the market, and there are no 120/144Hz panels for it. Personally I would prefer 21:9 144Hz with no backlight bleed, but that doesn't exist


You are going to dominate 1440P with that setup









Let us know how you enjoy freesync....

I personally use my system in my living room as the "end-all, be-all" entertainment box for the family (except my 10 year old son who has his own).

It handles our movies, our television programming (Netflix, Hulu, Youtube, etc), sports, surfing, research, work, school work, and all the gaming too, so I have found that using a 55" 4K TV, even with the 60Hz limitation, has been amazing.

We moved our 46" 1080P TV to the bedroom and that will be used with the wife's system that we are slowly building for her (project wifey).
I will probably score a used 290 on the cheap for that rig when it's done.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JohnnyMoore*
> 
> Presently using a 27-inch 1080p; going to switch to the Asus MG279Q (27-inch 1440p 144Hz FreeSync) in 10 days. I think 4K is too demanding for the GPUs currently on the market, and there are no 120/144Hz panels for it. Personally I would prefer 21:9 144Hz with no backlight bleed, but that doesn't exist
> 
> 
> 
> You are going to dominate 1440P with that setup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let us know how you enjoy freesync....
> 
> I personally use my system in my living room as the "end-all, be-all" entertainment box for the family (except my 10 year old son who has his own).
> 
> It handles our movies, our television programming (Netflix, Hulu, Youtube, etc), sports, surfing, research, work, school work, and all the gaming too, so I have found that using a 55" 4K TV, even with the 60Hz limitation, has been amazing.
> 
> We moved our 46" 1080P TV to the bedroom and that will be used with the wife's system that we are slowly building for her (project wifey).
> I will probably score a used 290 on the cheap for that rig when it's done.
Click to expand...

my 295x2 pulls around 100-120fps constant at 1440p, so you'll have no issues with those cards










I will say though that you are going to have to cap the fps with that monitor, as its FreeSync range is 35-90Hz; anything over that and you lose FreeSync functionality (still a damn nice monitor though)

I'm aiming for the 27MU67 from LG atm, 4k, 27" IPS Freesync monitor.......just perfect for me


----------



## JohnnyMoore

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are going to dominate 1440P with that setup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let us know how you enjoy freesync....
> 
> I personally use my system in my living room as the "end-all, be-all" entertainment box for the family (except my 10 year old son who has his own).
> 
> It handles our movies, our television programming (Netflix, Hulu, Youtube, etc), sports, surfing, research, work, school work, and all the gaming too, so I have found that using a 55" 4K TV, even with the 60Hz limitation, has been amazing.
> 
> We moved our 46" 1080P TV to the bedroom and that will be used with the wife's system that we are slowly building for her (project wifey).
> I will probably score a used 290 on the cheap for that rig when it's done.


Sure, I'm going to post about my personal FreeSync experience. As for the 55" 4K, I can imagine it's amazing for a family and for different tasks; the resolution is just incomparable with 1080p. But I have my own little office, and 55" is too large. Also, monitors have much better input/output lag, and I'm an old hardcore gamer, so my first criteria are response time and refresh rate =)


----------



## JohnnyMoore

To me, if fps stays above 90 (on 144Hz), you don't really need vsync, FreeSync, or any other cap. I'm going to use FreeSync in demanding games that won't go above 70 fps. I'm buying this monitor primarily for the fast IPS panel; FreeSync is just a bonus =)


----------



## Gumbi

Busted fan on my Vapor X is BADLY affecting my temps. Didn't realise how much these cards liked airflow (@Agent1984







) Did a quick test: stock volts (25mv), 1120/1400 (my overclock) and 40% fan speed (2 outer fans only, remember): 88C after 3 or 4 mins of Heaven. DAMN. VRMs were hitting 80C too! The VRMs are usually WAY cooler. I upped the fans to 75% and within a minute my temps dropped to the mid-70s across the board.

I talked to Sapphire and I may end up joining the 390 club with a Nitro... which isn't a bad thing I guess


----------



## Gumbi

On that note, does anyone care to show me their VRM temps on Sapphire 390 cards, say, at 40-50% fan (general operating speeds) while under load? Thanks







Core temps too would be nice.

I'd love to hear numbers for Powercolor too!


----------



## Cannon19932006

I did 2 more runs at +135mv 1180/1750 and +100mv 1150/1750, and my card doesn't seem to like voltage; my best run of all time is at +100mv 1150/1750. Finally broke 12k FireStrike.









http://www.3dmark.com/3dm/8287295 (oddly the new 3dmark update shows it as a 290x...)


----------



## Darkeylel

So I had my first full night of gaming since I got the card, and it performed well in BF4, holding an average of 120fps. But in other games like LoL and Counter-Strike it just refuses to go into the next memory clock state; during League it was sitting at 700 the whole time, and in Guild Wars 2 it would occasionally go up to 1050, then just drop back down


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darkeylel*
> 
> So I had my first full night of gaming since I got the card, and it performed well in BF4, holding an average of 120fps. But in other games like LoL and Counter-Strike it just refuses to go into the next memory clock state; during League it was sitting at 700 the whole time, and in Guild Wars 2 it would occasionally go up to 1050, then just drop back down


That's how AMD's Powerplay works, if it doesn't need to use the full clock speed then it won't.

I'm currently playing Assassin's Creed II again and my core clock hasn't gone over 600Mhz yet


----------



## Darkeylel

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's how AMD's Powerplay works, if it doesn't need to use the full clock speed then it won't.
> 
> I'm currently playing Assassin's Creed II again and my core clock hasn't gone over 600Mhz yet


Is there a way to turn it off? Because in games like CS it's causing unstable fps, which just makes me rage at my screen.....


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darkeylel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That's how AMD's Powerplay works, if it doesn't need to use the full clock speed then it won't.
> 
> I'm currently playing Assassin's Creed II again and my core clock hasn't gone over 600Mhz yet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there a way to turn it off ? Because in games likes Cs it's causing me to have unstable fps just causing me to rage at my screen.....
Click to expand...

You can up the power limit to 25% or 50%

but if you want to run at 3D clocks all the time then go here in Afterburner:



Use it without PowerPlay support and it will lock your card to 3D Clocks all the time (even on Desktop).

That should fix the issue for you


----------



## AverdanOriginal

Alright, here is my final overclock without increasing voltage. Tested in Heaven, FireStrike, and Dragon Age: Inquisition for an hour.

1130 Core / 1690 Memory @0mv

I also started trying for a higher overclock with voltage, but dropped that idea, since I am currently using my old PSU, which seems a bit unstable; plus my CPU will just throw all the nice overclocking results down the bin... *note to myself* "a 6-year-old CPU just isn't up-to-date anymore"











here the link to my firestrike 1.1
http://www.3dmark.com/3dm/8290301

And here are the steps I went through to find my overclocks (if anyone has suggestions or tips... feel free to comment







)


@AgentSmith: please add these results. Oh, and by the way, have you also tried changing your TIM? I remember Duke or someone achieved a difference of up to 6°C. I was thinking of trying that too if more people report that MSI is using way too much TIM on their cards.


----------



## Mr.Pie

here is my submission! Might justify a 2nd 390 once I invest into a 1440p monitor....


----------



## jackalopeater

All these great overclocks.......*kicks Asus 390x DCUII* lol brb with overclocking results


----------



## jackalopeater

So here it is... pretty much the most I can get out of my poopoo-clocking 390x, lol. 1075/1650 at +25mV, OR I can drop the mem to stock and get the core to 1125. Past that, nothing; even adding more voltage gets me very little: +125mV to get to 1175 core


----------



## battleaxe

Okay, I'm seeing some of your 390's here... am I doing something wrong? Or is there a reason my scores seem better than a lot of these? My card is a new version of the 290x; it's the new version of the Tri-X. I don't know... maybe I have something set wrong that's making my scores higher? Or is this new version the 390x but badged as a 290x? Mine has Samsung RAM too. Seems strange.


----------



## Sgt Bilko

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, I'm seeing some of your 390's here... am I doing something wrong? Or is there a reason my scores seem better than a lot of these? My card is a new version of the 290x, its the new version of the TriX. I don't know... maybe I have something set wrong that's making my scores higher? Or is this new version the 390x but badged as a 290x? Mine has Samsung RAM too. Seems strange.


Look at your clock speed mate









1230/1665

For comparison: my XFX R9 390x DD 1150/1700:



Can you run Valley?

Would be a better overall test imo


----------



## battleaxe

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Look at your clock speed mate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1230/1665
> 
> For comparison: my XFX R9 390x DD 1150/1700:
> 
> 
> 
> Can you run Valley?
> 
> Would be a better overall test imo


Yes, I can run Valley; I just can't run FS or any Futuremark product for some odd reason. I still haven't figured out why... I'm running in windowed mode so only the primary GPU is enabled. Is that okay?

Edit: it seems to me that Heaven is a tougher test though? Valley from what I can remember I can clock even higher. I'll try it though.

Here's the Valley run, same sets. I have no idea what this score is like, good or bad.


----------



## Sgt Bilko

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Look at your clock speed mate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1230/1665
> 
> For comparison: my XFX R9 390x DD 1150/1700:
> 
> 
> 
> Can you run Valley?
> 
> Would be a better overall test imo
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, I can run valley, I just can't run FS or any Futuremark product for some odd reason. I still haven't' figured out why... I'm running in windowed mode so only the primary GPU is enabled. Is that okay?
Click to expand...

Yeah i think that would work just fine, we all know Crossfire only works in windowed mode for DX anyways







(Hmmm, since crossfire works in windowed mode for Mantle does that mean it will work with DX12 too?)

I'll do a stock run (1050/1150) and an overclocked one (1150/1700)


----------



## Sgt Bilko

Here we are

Stock (1050/1500)


and overclocked (1150/1700)


and this is with normal usage (Chrome, Steam, Origin, Teamspeak and even Assassins Creed II running in the background







)


----------



## Sgt Bilko

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Look at your clock speed mate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1230/1665
> 
> For comparison: my XFX R9 390x DD 1150/1700:
> 
> 
> 
> Can you run Valley?
> 
> Would be a better overall test imo
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, I can run valley, I just can't run FS or any Futuremark product for some odd reason. I still haven't' figured out why... I'm running in windowed mode so only the primary GPU is enabled. Is that okay?
> 
> Edit: it seems to me that Heaven is a tougher test though? Valley from what I can remember I can clock even higher. I'll try it though.
> 
> Here's the Valley run, same sets. I have no idea what this score is like, good or bad.
Click to expand...

Heaven relies more on the CPU than Valley does (Valley still likes a faster CPU though), and your score looks good tbh; the newer 290x's seem to clock easier than the older ones, but the 390x's seem to go higher on stock voltage.

For example, my 290x will do 1287/1375 in Firestrike with +200mV, but my 390x needs +125mV to do 1200/1700, while I can do 1150/1700 on the stock voltage and remain very cool while doing so.

So to take what @Agent Smith1984 said earlier, it seems as though the 390 series cards are lower leakage, so they don't need a lot of voltage to hit a certain point, but they hit a wall around a 1200MHz core clock and it takes quite a bit of voltage to get them to go any further.

Sorry for the triple post guys


----------



## battleaxe

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Heaven relies more on CPU than what Valley does (Valley still likes a faster CPU though) and your score looks good tbh, the newer 290x's seem to clock easier than the older ones but the 390x's seem to go higher on stock voltage
> 
> for example, my 290x will do 1287/1375 Firestrike with +200mV but my 390x needs +125mV to do 1200/1700 while i can do 1150/1700 on the stock voltage and remain very cool while doing so.
> 
> so to take what @Agent Smith1984 said earlier it seems as though the 390 series cards are lower leakage so they don't need alot of voltage to hit a certain point but they hit a wall around 1200Mhz core clock and it takes quite a bit of voltage to get them to go any further.
> 
> Sorry for the triple post guys


Well, that makes a lot of sense. What's interesting is that mine won't go much above 1230 either, though. So I am wondering if mine is more like a 390x than a 290x. It doesn't respond to voltage like my 290 does at all.

My results seem okay then. I can't imagine having one of these that clocks at 1300MHz like I've seen some of them do. There were a couple of guys on the 290 thread who had pairs that would do 1300MHz plus. Crazy.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Here we are
> 
> Stock (1050/1500)
> 
> 
> and overclocked (1150/1700)
> 
> 
> and this is with normal usage (Chrome, Steam, Origin, Teamspeak and even Assassins Creed II running in the background
> 
> 
> 
> 
> 
> 
> 
> )


Just for fun. [email protected]/1600







(This is single card run. Crossfire disabled.)



BTW, is 1.25V the max voltage when running Valley @1150/1700?


----------



## battleaxe

When you have more than one monitor attached, does the RAM always go up to maximum clocks? Mine does this when my second monitor is on, and I was wondering why it can't just idle at lower clocks.


----------



## Gumbi

Quote:


> Originally Posted by *battleaxe*
> 
> When you have more than one monitor attached does the RAM always go up to maximum clocks? Mine is doing this when my second monitor is on and I was wondering why it can't just idle at lower clocks?


AFAIK, yes, and this is unavoidable. It's the same for nVidia, I think. I have a 144Hz monitor, but I run it at 120Hz when not gaming to avoid the memory running at 1400MHz; at 120Hz it idles at 150MHz memory clocks. This is for sure the same for nVidia, as I confirmed it with my friend.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> AFAIK, yes, and this is unavoidable. It's the same for nVidia I think. I have a 144hz monitor, but run it at 120hz when not gaming to avoid running at 1400mhz, and run at 150mhz clocks for memory. This for sure is the same for nVidia as I confirmed it with my friend.


Okay, annoying, but what ya gonna do?


----------



## MK-Professor

Sapphire Nitro 390


I can get 1160MHz with +100mv, but only when I am under 60C; above that it gives me artifacts, and above 70C I get even more artifacts. Weird?
With a lower OC around 1100 I notice the same behavior, but at a much higher temp (around 80C), which is OK because it doesn't even get near that.
At stock this never happened, even with temps around 90C+


----------



## AverdanOriginal

Quote:


> Originally Posted by *Mr.Pie*
> 
> 
> 
> here is my submission! Might justify a 2nd 390 once I invest into a 1440p monitor....


Nice clocking, but if you really are considering Crossfiring them, I would also try to get the 89°C down a bit (unless it's your ambient temp that pushes them so high). Most people report that the top card gets about 10°C more than the bottom card on these, which would lead here to maybe 95-100°C??? Just a thought


----------



## Mr.Pie

I was playing around with the fan curve and wanted to see how hot I could cook the card while fiddling with how low I could run my case fans!









Nay worries

Normal load is roughly 72°C-ish, which I could drop if I cable-managed a bit more and swapped out the front fans for something with higher static pressure!


----------



## AverdanOriginal

Quote:


> Originally Posted by *jackalopeater*
> 
> So here it is....pretty much the most I can get out of my poopoo clocking 390x, lol. 1075/1650 +25mV OR I can drop the mem to stock and can get the core to 1125. Past that, nothing even adding more voltage gets me very little. +125mV to get to 1175 core


Hi, not sure why the clock won't increase; a temp of 77°C is good considering the +25mV. The 89°C on VRM1 would worry me though. Have you increased the aux voltage too? And 60% fans at 77°C is nice, though you could set the fans to run at maybe 70-75% at 77°C; then the VRM temp might also come down a bit. I'm not sure, but I read somewhere here in the forum that anything above 80°C on the VRMs is not good??? Please correct me if I'm wrong, since I know these cards can handle 105°C; I'm not sure if it's the same for VRM temps. You are also only getting 18 VDDC current in at +25mV... Maybe I'm wrong again here, but could it be that your PSU is throttling the overclock?


----------



## jackalopeater

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi. not sure why the clock won't increase,temps of 77C° is good considering the +25mV. The 89C° on VRM 1 Temp would worry me though. have you increased the aux voltage too? and 60% fans on 77C° is nice, though you could customize the fans to run maybe at 70-75% at 77C° then also Vram Temp might go down a bit. I am not sure, but I read somewhere here in the forum that anything above 80C° on Vram is not good??? please correct me if I am wrong, since I know that these cards can handle 105C° not sure if it is the same for VRM Temp. You are also only getting 18 VDDC Current in on +25mV... Maybe again I am wrong here but could it be that your PSU is throttling the overclocking?


Yeah, I went back and ran the fans at 75% and got much better VRM temps, but still no dice on increasing clocks on this GPU. My daily driver is a 295X2 and I OC it to 1150/1500 with no problem using this Cooler Master V1200 Platinum PSU, so I don't think it's that... although that is a nice observation. I wonder if each PCI-E power plug being on its own modular cable may cause some delivery issues.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Here we are
> 
> Stock (1050/1500)
> 
> 
> and overclocked (1150/1700)
> 
> 
> and this is with normal usage (Chrome, Steam, Origin, Teamspeak and even Assassins Creed II running in the background
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> 
> Just for fun. [email protected]/1600
> 
> 
> 
> 
> 
> 
> 
> (This is single card run. Crossfire disabled.)
> 
> 
> 
> BTW, 1.25V is the max volt when running Valley @1150/1700?
Click to expand...

Under load it goes to roughly 1.14V to 1.18V with +50% power limit and 1150/1700 clocks.

On the desktop it sits at 1.258V


----------



## Flash Gordon

Can I get added to the club? I thought this would have been done already...thanks.


----------



## Darkeylel

So, touch wood, I might have found the source of my headache when it comes to FPS drops. It seems turning ambient occlusion off in those games has made my FPS stable. Hmmmm, weird


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darkeylel*
> 
> So touch wood I might have found the source of my headache when it comes to FPS drops. It seems turning ambient occlusion off in those games have made my FPS stable hmmmmm weird


Yeah, AO does tend to cause some issues in some games......what CPU you using?


----------



## Darkeylel

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, AO does tend to cause some issues in some games......what CPU you using?


i7 3820. Didn't have problems with AO on my old 7950 though, just weird, but oh well, stable FPS is glorious haha


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darkeylel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Yeah, AO does tend to cause some issues in some games......what CPU you using?
> 
> 
> 
> I7 3820 , didn't have problems with AO on my old 7950 though just weird but oh well stable fps is glorious haha
Click to expand...

Well that's good you got it sorted









I have a couple of issues in some games with AO but there isn't that much of a difference with it off


----------



## Agent Smith1984

Quote:


> Originally Posted by *Flash Gordon*
> 
> Can I get added to the club? I thought this would have been done already...thanks.


If you've submitted proof, I'll add you in the morning. I have some other new folks to add also, but I'm away from the office (I keep the database on my workstation).

Thanks


----------



## Agent Smith1984

Seems like this silicon wants 1.25V tops under load. You can push a little more to it, but you either get diminishing returns, little-to-no increases in clock speeds, or hit board power limitations. Temps can become a factor after that also, but I think we are seeing efficiency limitations more than anything.

One thing is for certain: VRM temps over 80°C will impact your OC, and it can be very difficult to stay under that with greatly increased voltage.

All in all, it's good to see almost all 390 owners hitting 1150 with 100mV (or much less in most cases), versus a handful of people getting 1200+ on 150mV+ while the rest struggle to hit 1100 at manageable temps.


----------



## Gumbi

I might be joining the 390 master race soon, up from a 290 Vapor-X (loved that chip, did 1175/1600 at +75mV), due to an RMA; hopefully my 390 will do the same or better.

I'm going to take the opportunity to invest in some better case cooling by grabbing 4 of these http://www.noctua.at/main.php?show=industrialppc and hooking them up to my fan controller. They will be replacing an 800RPM 140mm side intake, an 1100RPM 140mm rear exhaust, and 2 120mm intakes.

Do you think I'll see good temp gains with this setup Agent Smith?


----------



## Tarifas

Hello everyone, quick question to owners of the asus dc2 version of the 390/390x how does it behave in regards to temperatures and noise? Is there some feedback or numbers to be shared?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Tarifas*
> 
> Hello everyone, quick question to owners of the asus dc2 version of the 390/390x how does it behave in regards to temperatures and noise? Is there some feedback or numbers to be shared?


Due to poor cooling on the original DC2 290 cards, I think most people have avoided the 390 version.


----------



## AverdanOriginal

Quote:


> Originally Posted by *jackalopeater*
> 
> yeah I went back and ran the fans at 75% and got much better VRM temps, but still no dice on increasing clocks on this gpu. My daily driver gpu is a 295x2 and oc it to 1150/1500 with no problem using this Cooler Master V1200platinum psu so I don't think it's that....although that is a nice observation, I wonder if each pci-e power plug being on it's on modular cable may cause some delivery issues.


Not sure, but it's easy to test by just trying one plug versus two plugs. I will try as well once the post manages to deliver my exchanged PSU.


----------



## THUMPer1

Why are people running Heaven Bench 1.0? 4.0 should be used.


----------



## AverdanOriginal

Quote:


> Originally Posted by *THUMPer1*
> 
> Why are people running Heaven Bench 1.0? 4.0 should be used.


They are running Heaven 4.0. The 1.0 scores you see are from Valley, not Heaven.


----------



## Derek129

Would someone be able to give me a rundown of the Catalyst Control Center digital flat panel properties, such as GPU upscaling, VSR, DVI settings and the like?


----------



## THUMPer1

Quote:


> Originally Posted by *AverdanOriginal*
> 
> They are running Heaven 4.0. the 1.0 Scores you see are Valley not Heaven.


Thanks. I can't read.


----------



## Flash Gordon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you've submitted proof, I'll add you in the morning, have some other new folks to add also but am away from the office ( keep the database on my work station).
> 
> Thanks


What's the point of asking for proof in the OP if you aren't going to check it yourself?

I mean, it seems kind of redundant. Let's say I didn't post proof, but I say I did... will you still add me to the list? (BTW, I posted proof weeks ago.)

I don't know how often you are near your workstation, but some consistency with updating the list would be nice instead of letting it get out of control. I think this is the second time I'm asking about the list, and I'm not the only person. I'm not pointing fingers; it would just be nice to be on the list before people who bought the card weeks after me









I know life gets in the way but you did make the thread so it's your responsibility.


----------



## Motley01

Quote:


> Originally Posted by *Flash Gordon*
> 
> What's the point of asking for proof in the OP if you aren't going to check it yourself?
> 
> I mean seems kind of redundant. Let's say I didn't post proof...but I say I did...will you still add me to the list? (btw I posted proof weeks ago).
> 
> I don't know how often you are near your workstation but some consistency with updating the list would be nice instead of letting it get out of control. I think this is the second time I'm asking about the list and I'm not the only person. I'm not pointing fingers it would just be nice to be on a list before other people who bought the card weeks after me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know life gets in the way but you did make the thread so it's your responsibility.


Geeezus man, just chill out. It's not like everyone is checking the list to see if you are really on there. I couldn't care less.


----------



## Flash Gordon

Quote:


> Originally Posted by *Motley01*
> 
> Geeezus man, just chill out. Its not like everyone is checking the list to see if you are really on there. I could care less.


Are you speaking for everybody here? I didn't know this was a dictatorship.

Sorry for being blasphemous.

All I asked were some questions about the legitimacy of the list, if all it takes is someone's word to get on there.

This is a forum about PCs, their parts, and the like... if I post something that I am interested in or want some clarification about, and you don't find it mighty important, I suggest you control your impulse and ignore it.

Do you post such things about other topics that aren't important to you? Very suspect behavior.


----------



## battleaxe

Quote:


> Originally Posted by *Flash Gordon*
> 
> Are you speaking for everybody here? I didn't know this was a dictatorship.
> 
> Sorry for being blasphemous.
> 
> All I asked were some questions about the legitimacy of the list if all it takes is someone's word to get on there.
> 
> This is a forum about PCs, their parts, and the like...if I post someone that I am interested in or interested in getting some clarification about and you don't find it mighty important I suggest you control your impulse and ignore it.
> 
> Do you post such things about other things that aren't important to you? Very suspect behavior.


The guy updating the list is a guy like you or me. He's not being paid, and if he's missed you it's because he's human. Give him a break. We all have oversights from time to time, including you. Maybe you should find another place to rant about someone you don't even know.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Flash Gordon*
> 
> Are you speaking for everybody here? I didn't know this was a dictatorship.
> 
> Sorry for being blasphemous.
> 
> All I asked were some questions about the legitimacy of the list if all it takes is someone's word to get on there.
> 
> This is a forum about PCs, their parts, and the like...if I post someone that I am interested in or interested in getting some clarification about and you don't find it mighty important I suggest you control your impulse and ignore it.
> 
> Do you post such things about other things that aren't important to you? Very suspect behavior.


I verify all proof and must have mistakenly missed yours at some point. A simple PM to remind me would have sufficed.

I'll add you to the list, but your attitude is a little off-putting and won't make you many friends in this club anyway. I do a lot of work in and for this thread, and the lack of respect you are showing me negates the point of even joining it.

Anyway, I hope you've found this thread helpful, despite your name accidentally not making the list.

Have a nice day.


----------



## Flash Gordon

No worries... all I asked was a question; I didn't know I was committing treason or something.

Are people really that thin-skinned that they take everything so personally? Did I really come off like such a jerk? Why does it feel like people have to walk on eggshells around here to avoid hurting people's fragile feelings? Lol

Sorry if you feel I disrespected you; that wasn't my intention. Where I'm from, people don't feel like a victim over something so minuscule.

Thanks for adding me to the list appreciate it.

Btw I sold my cards last week I just wanted to be a part of the club. It's kind of like being a newly converted Atheist but still wanting to be a Priest at your local church...there is something naughty about it that gets me aroused.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Flash Gordon*
> 
> No worries...all I asked was a question I didn't know I was committing treason or something.
> 
> Are people really that thin-skinned that they take everything so personally? Did I really come off like such a jerk? Why does it feel like people have to walk around egg shells around here to avoid hurting people's fragile feelings? Lol
> 
> Sorry if you feel I disrespected you that wasn't my intention...where I'm from people don't feel like a victim over something so minuscule.
> 
> Thanks for adding me to the list appreciate it.
> 
> Btw I sold my cards last week I just wanted to be a part of the club. It's kind of like being a newly converted Atheist but still wanting to be a Priest at your local church...there is something naughty about it that gets me aroused.


----------



## Motley01

Quote:


> Originally Posted by *Flash Gordon*
> 
> No worries...all I asked was a question I didn't know I was committing treason or something.
> 
> Are people really that thin-skinned that they take everything so personally? *Did I really come off like such a jerk?* Why does it feel like people have to walk around egg shells around here to avoid hurting people's fragile feelings? Lol
> 
> Sorry if you feel I disrespected you that wasn't my intention...where I'm from people don't feel like a victim over something so minuscule.
> 
> Thanks for adding me to the list appreciate it.
> 
> Btw I sold my cards last week I just wanted to be a part of the club. It's kind of like being a newly converted Atheist but still wanting to be a Priest at your local church...there is something naughty about it that gets me aroused.


Yes you are being a jerk. Go somewhere else please, we don't want people like this here. I'm reporting you to the mods. Have a nice day.


----------



## Motley01

Quote:


> Originally Posted by *Agent Smith1984*


Wow! So first he disrespects you because you "missed putting him on the list", and now it turns out he doesn't even own the video cards any more?

Some people......


----------



## Sgt Bilko

Quote:


> Originally Posted by *Flash Gordon*
> 
> No worries...all I asked was a question I didn't know I was committing treason or something.
> 
> Are people really that thin-skinned that they take everything so personally? Did I really come off like such a jerk? Why does it feel like people have to walk around egg shells around here to avoid hurting people's fragile feelings? Lol
> 
> Sorry if you feel I disrespected you that wasn't my intention...where I'm from people don't feel like a victim over something so minuscule.
> 
> Thanks for adding me to the list appreciate it.
> 
> Btw I sold my cards last week I just wanted to be a part of the club. It's kind of like being a newly converted Atheist but still wanting to be a Priest at your local church...there is something naughty about it that gets me aroused.


Sold the cards and still asking to join.........

Yeah... that's not really how it works


----------



## battleaxe

moving on now...


----------



## iscariot

Quick question: if I was to get one of these cards and use it in a PCIe 2.0 slot, how much performance would I be losing?

Cheers.


----------



## Sgt Bilko

Quote:


> Originally Posted by *iscariot*
> 
> Quick question. If I was to get one of these cards and use in a PCE2.0 slot how much performance would I be loosing?
> 
> Cheers.


So long as it's PCIe 2.0 x8 or x16, you won't notice the difference









Tbh you could get away with an x4 slot and still be fine
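To put rough numbers on that claim, usable link bandwidth follows from the published per-lane signalling rates and encoding overheads (none of these figures come from this thread, they are the standard PCIe spec values):

```python
def pcie_bandwidth_gbs(gen, lanes):
    """Approximate usable PCIe bandwidth in GB/s, encoding overhead included."""
    # (GT/s per lane, encoding efficiency) per generation:
    # gen 2 uses 8b/10b encoding, gen 3 uses 128b/130b
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt_s, eff = specs[gen]
    return gt_s * eff * lanes / 8  # divide by 8: bits -> bytes

print(pcie_bandwidth_gbs(2, 16))  # 8.0 GB/s  (a PCIe 2.0 x16 slot)
print(pcie_bandwidth_gbs(2, 4))   # 2.0 GB/s  (the x4 case mentioned above)
print(pcie_bandwidth_gbs(3, 16))  # ~15.75 GB/s, for comparison
```

Games rarely saturate even the x4 figure with a single card, since the GPU works mostly out of its own 8GB of VRAM rather than streaming over the bus, which is why the difference tends to show up only in synthetic benchmarks.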


----------



## iscariot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iscariot*
> 
> Quick question. If I was to get one of these cards and use in a PCE2.0 slot how much performance would I be loosing?
> 
> Cheers.
> 
> 
> 
> So long as its PCI 2.0 x8 or x16 you won't notice the difference
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tbh you could get away with a x4 slot and still be fine
Click to expand...

Thanks, it will be in an x16 slot. I think it's time to give the 7950 a rest.


----------



## AverdanOriginal

Quote:


> Originally Posted by *iscariot*
> 
> Thanks it will be in x16 slot. I think its time to give the 7590 I rest.


Like Sgt Bilko stated, in games you won't notice the difference. Mine is currently stuck in a PCI-e 2.0 x16 slot and all games run smoothly on Ultra.
You might get a slightly lower score in benchmarks and perhaps 55 instead of 56 FPS in some games








Whether it has an influence on overclocking potential, I don't know, but I might be able to tell you in about 2-3 months


----------



## iscariot

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iscariot*
> 
> Thanks it will be in x16 slot. I think its time to give the 7590 I rest.
> 
> 
> 
> Like Sgt Bilko stated in games you won't notice the difference. Mine is currently stuck in a PCI-e 2.0 x16 and All games run on Ultra smooth.
> You might get a slightly lower score on benchmarks and perhaps have 55 instead of 56 FPS in some games
> 
> 
> 
> 
> 
> 
> 
> 
> If it has an influence on Overclocking potential? i don't know, but I might be able to tell you in about 2-3 months
Click to expand...

Good to know. I'm not really interested in overclocking it for now. I have noticed that my card is now the minimum spec for most games coming out, and it's my birthday, so it's a good time for an upgrade. Just a matter of which card to get.


----------



## Sgt Bilko

Quote:


> Originally Posted by *iscariot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AverdanOriginal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iscariot*
> 
> Thanks it will be in x16 slot. I think its time to give the 7590 I rest.
> 
> 
> 
> Like Sgt Bilko stated in games you won't notice the difference. Mine is currently stuck in a PCI-e 2.0 x16 and All games run on Ultra smooth.
> You might get a slightly lower score on benchmarks and perhaps have 55 instead of 56 FPS in some games
> 
> 
> 
> 
> 
> 
> 
> 
> If it has an influence on Overclocking potential? i don't know, but I might be able to tell you in about 2-3 months
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Good to know. Im not really interested in overclocking it for now. I have noticed that my card is now the minimum spec for most games coming out and its my birthday so its a good time for an upgrade. Just a matter of which card to get.
Click to expand...

If I might make a recommendation: either the XFX DD or MSI Gaming cards seem to be the better ones of the bunch from what I've seen, but the Sapphire Nitro cards are pretty good too


----------



## robmcrock

could I get an add please?... Thanks!

MSI r9 390


----------



## Agent Smith1984

Quote:


> Originally Posted by *MK-Professor*
> 
> Sapphire Nitro 390
> 
> 
> I can get 1160mhz with +100v but only when I am under 60C, above that it gives me artifacts and if I am above 70C I get even more artifacts, weird?
> With a lower OC around 1100 I notice the same behavior but at much higher temp(around 80C) which is ok because it doesn't even get near to that.
> At stock this never happened even with temps around 90C+


Hey bud, can you get a validation in GPU Z with your name, or a screen shot with notepad open showing your user name.

Part of the proof submission









I'll get you added ASAP.

Thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *robmcrock*
> 
> could I get an add please?... Thanks!
> 
> MSI r9 390


Added!!

Welcome to the club!


----------



## sonicmat

Sorry if I've missed this, but has anybody managed to unlock a 390 to a 390X, or is it not possible with this series?


----------



## Agent Smith1984

Quote:


> Originally Posted by *sonicmat*
> 
> Sorry if ive missed this but has anybody managed to unlock a 390 to a 390x or is it not possible with this series?


I've seen some 390 submissions in the "unlock thread", and it looks like it's impossible to unlock ANY of these cards at all.


----------



## sonicmat

Ah, that's a bummer. Didn't a good percentage of 290s unlock to 290X and 390X?


----------



## sonicmat

Anyway, I'm just having a play with some overclocking on the core at stock voltages and not having much luck so far; it won't go past 1050.

In Catalyst Control Centre, do I want the power limit setting put up to the full +50% to get the best chance of clocking?

I don't want to play with voltages just yet


----------



## superkeest

If anyone has a HIS 390X IceQ X2, would you mind PM'ing me? I'd like to get a copy of the BIOS.
Thanks.


----------



## bazookatooths

Hi all, I just posted my build on another site and was told I may need to upgrade the PSU now that I have upgraded to the R9 390!!!






















What do you guys think

1920x1080 settings.
AMD FX-8120 3.1GHz 8-Core Processor @ 4.3GHz @ 1.376V
Super Talent 8GB (2 x 4GB) DDR3-1333 @ 1600 9-9-9-27
Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler
Gigabyte GA-970A-UD3 ATX AM3+ Motherboard
Seagate Barracuda 1TB 3.5" 7200RPM Internal Hard Drive
XFX Radeon R9 390 8GB Double Dissipation Video Card
Thermaltake SMART 650W 80+ Bronze Certified ATX Power Supply

The 12V rail only drops down to 11.9V, but my knowledge of PSUs is limited.
GPU is running stock 1000MHz/1500MHz

Love this GPU; FPS never drops below triple digits at max settings


----------



## bazookatooths

Verification with picture.


----------



## iscariot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iscariot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AverdanOriginal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iscariot*
> 
> Thanks it will be in x16 slot. I think its time to give the 7590 I rest.
> 
> 
> 
> Like Sgt Bilko stated in games you won't notice the difference. Mine is currently stuck in a PCI-e 2.0 x16 and All games run on Ultra smooth.
> You might get a slightly lower score on benchmarks and perhaps have 55 instead of 56 FPS in some games
> 
> 
> 
> 
> 
> 
> 
> 
> If it has an influence on Overclocking potential? i don't know, but I might be able to tell you in about 2-3 months
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Good to know. Im not really interested in overclocking it for now. I have noticed that my card is now the minimum spec for most games coming out and its my birthday so its a good time for an upgrade. Just a matter of which card to get.
> 
> Click to expand...
> 
> If i might make a recommendation either the XFX DD or MSI Gaming cards seem to be the better ones out of the bunch from what i've seen but the Sapphire Nitro cards are pretty good too
Click to expand...


I was looking at the MSI card. Any and all suggestions welcome. Thanks.


----------



## AverdanOriginal

Quote:


> Originally Posted by *iscariot*
> 
> I was looking at the MSI card. Any and all suggestions welcome. Thanks.


AgentSmith made a good round-up of the different models on the first page of this thread. I would only add my observation from this thread, and I'd second Sgt Bilko that MSI and XFX seem to be binned better in general compared to the other manufacturers. Furthermore, three fans like on the Sapphire Nitro seem to really make a difference in temperature. So it will depend more on your preferences: do you want a slightly lower-powered card and hence lower temps, or higher temps (probably a bit louder) but also slightly more power?


----------



## iscariot

I'd prob go for a slightly more powerful card. This is the one I was thinking of:

http://www.mwave.com.au/product/msi-radeon-r9-390x-gaming-8gb-video-card-ab64047


----------



## bazookatooths



Plastic is still on the fan side. Looks awesome: two different tones of black, flat and glossy.


----------



## MK-Professor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hey bud, can you get a validation in GPU Z with your name, or a screen shot with notepad open showing your user name.
> 
> Part of the proof submission
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll get you added ASAP.
> 
> Thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *bazookatooths*
> 
> 
> Verification with picture.


Added and welcome!
Quote:


> Originally Posted by *MK-Professor*


Added and welcome!


----------



## Stickeelion

Hey guys, just a HD 79x0 user here dirtying up your club







I've been looking at the specs of the 390X compared to the Fury X, and I noticed that the Fury X has only 4GB of VRAM compared to the 390X's 8GB. What's going on here? Who decided to give the "better" GPU less VRAM? I did notice that it uses High Bandwidth Memory, but I don't fully understand if that makes any difference.

Just curious


----------



## tangelo

I've decided that my next GPU is gonna be an R9 390. Long story short, I'm gonna do some OCing with the stock cooler. Should I go with MSI, or would the 50-euros-cheaper Sapphire Nitro be better bang for the buck? How is the VRM cooling done on the Sapphire? I'm leaning towards the MSI R9 390 (~410 euros) due to people's experience with them OCing better and the active cooling on the VRMs, but I could get the Sapphire for 360 euros. Should I go for it? Is the cooling done properly? And is the price difference enough to bother taking a chance on a potentially worse-binned card?

What do you people think?

What do you people think?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stickeelion*
> 
> Hey guys, just a HD 79x0 user here dirtying up your club
> 
> 
> 
> 
> 
> 
> 
> I've been looking at the specs of the 390X compared to the Fury X, and I noticed that the Fury X has only 4GB of VRAM compared to the 390X's 8GB, what's going on here, who decided to give the "better" GPU less VRAM? I did however notice that it uses High Bandwidth Memory but I don't fully understand if it makes any difference.
> 
> Just curious


The Fury is limited to 4GB of VRAM because of the technological constraints of first-gen HBM. That will change with the next revision of HBM (likely to be called HBM2).

HBM itself performs much faster, and as of this post, 4GB is all that is needed for gaming up to 4K resolution.

On the other hand, if one is considering a multi-GPU setup at resolutions OVER 4K, then the 390 series is a very cost-effective option.

Consider that three 390s at a total cost of $990 will perform admirably better than two Furys at $1300, and will have double the VRAM in a 5K-8K resolution setup.

Also, two 390s at a total of $660 will cost around the same as one Fury X, and give it a good pounding in every scenario.

Not to say that Hawaii cards are the better solution for everyone. When you consider their power draw, heat output, and size, you can see the allure the Fury brings to the table...

Then again, you have to be able to buy a Fury to enjoy one, and how's that workin' out for folks?


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The Fury is limited to 4GB of VRAM because of the technological constraints with first gen HBM. That will change with their next revision of HBM (possibly called HBM 2).
> 
> HBM itself performs much faster, and as of this post, 4GB is all that is needed for gaming up to 4K resolution.
> 
> On the other hand, if one is considering a multi GPU setup with resolutions OVER 4K, then the 390 series is a very cost effective option.
> 
> Consider that (3) 390's at a total cost of $990 will perform admirably better than (2) Fury's @ $1300, and will have double the VRAM in a 5k-8k resolution setup.
> 
> Also, (2) 390's at a total of $660 will cost around the same as one Fury X, and give it a good pounding in every scenario.
> 
> Not to say that Hawai cards are the better solution for everyone. When you consider their power draw, heat production, and size, you can see the allure that the Fury brings to the table...
> 
> Then again, you have to be able to buy a Fury to enjoy one, and how's that workin' out for folks?


I don't know about "much faster", Agent Smith; the 390 series cards' memory clocks up into the 1700MHz range, and with a 512-bit memory bus that puts them in the mid-400s GB/s, compared to Fury's 512GB/s. Then again, the power requirements of HBM with an interposer are much, much lower than GDDR5's, and it does best us stock and overclocked, just not by a whole lot.


----------



## CamsX

Hello all:

Haven't been able to keep up with the thread (about 25 pages behind), mostly due to work and some home projects.

I have a CPU-related question: I got someone to buy my oldish 8350 and Gigabyte 990FXA motherboard.

What would you say is a good DDR3 CPU and mobo combo for under about $350-400 USD?

I was looking at the 8370E, 8370, or 9370 and a newer UEFI mobo combo, but I don't mind going Intel if I can get better performance within my budget. My last Intel CPU was a Pentium II 400MHz btw. LOL.


----------



## Cannon19932006

Quote:


> Originally Posted by *CamsX*
> 
> Hello all:
> 
> Haven't been able to keep up with the thread (about 25 pages behind), mostly due to work and some home projects.
> 
> I have CPU related question. Got someone to buy my oldish 8350 and Gigabyte 990FXA motherboard.
> 
> What would you say is a good DDR3 CPU and mobo combo for under about $350-400 USD??
> 
> I was looking at the 8370e, 8370 or 9370 and a newer uefi mobo combo, but I don't mind going Intel if I can get better performance within my budget. My last intel CPU was Pentium II 400mhz btw. LOL.


You can get a sandy/ivy i7 combo for around that much. I'm actually listing my sig cpu, mem, and mobo on the marketplace later today for right around 350, if you're interested in that sorta thing keep an eye out for it.


----------



## JohnnyMoore

Quote:


> Originally Posted by *CamsX*
> 
> Hello all:
> 
> Haven't been able to keep up with the thread (about 25 pages behind), mostly due to work and some home projects.
> 
> I have CPU related question. Got someone to buy my oldish 8350 and Gigabyte 990FXA motherboard.
> 
> What would you say is a good DDR3 CPU and mobo combo for under about $350-400 USD??
> 
> I was looking at the 8370e, 8370 or 9370 and a newer uefi mobo combo, but I don't mind going Intel if I can get better performance within my budget. My last intel CPU was Pentium II 400mhz btw. LOL.


CPU: FX-9370 for ~$210, 8GB of RAM for ~$60, and ~$120 for a mobo. But I think going from an 8350 to a 9370 is absurd... not a big gap


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> I don't know about "much faster" Agent Smith, the 390 series cards memory clocks up into the 1700 range, with a 512bit memory bus that puts it at mid 400s GB/s compared to fury's 512GB/s. Then again the power requirements on HBM with an interposer are much much lower than GDDR5, and it does best us stock and overclocked, just not by a whole lot.


You are absolutely correct, I guess I should have elaborated....

My point was that it is much faster on a per clock basis, and if we know anything about memory technology, it's that their introductory memory clocks are peanuts compared to what they are capable of with maturation.

Early GDDR5 cards had 600-1000MHz Memory clocks.

When we see HBM get to 1GHz, we will see 1TB/s of memory bandwidth, while GDDR5 over a 512-bit bus would take several thousand more MHz to reach that level of bandwidth.

As it stands now though, NO, the Fury's memory bandwidth is not _that_ much faster than the 390 series' memory potential.
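Both the mid-400s GB/s figure quoted above and the 1TB/s claim fall out of the same back-of-the-envelope formula (a rough sketch; the per-pin data rates below are my assumptions derived from the clocks quoted in this thread):

```python
# Rough memory-bandwidth math: GB/s = bus width in bytes x per-pin data rate (Gbps).
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

# R9 390 at a 1700MHz memory clock (GDDR5 is quad-pumped, so ~6.8 Gbps/pin)
print(bandwidth_gbs(512, 6.8))    # ~435 GB/s

# Fury HBM1: 4096-bit bus at 500MHz DDR (1 Gbps/pin)
print(bandwidth_gbs(4096, 1.0))   # 512 GB/s

# Hypothetical HBM at 1GHz DDR (2 Gbps/pin)
print(bandwidth_gbs(4096, 2.0))   # 1024 GB/s, i.e. ~1TB/s
```

The wide-and-slow HBM bus is why a modest clock bump moves bandwidth so much more than it would on a 512-bit GDDR5 bus.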


----------



## CamsX

Quote:


> Originally Posted by *JohnnyMoore*
> 
> CPU: FX-9370 - $210, 8GB RAM - $60, and $120 for the mobo. But I think going from an 8350 to a 9370 is absurd... not a big gap.


I'm keeping my 16GB 1600Mhz CAS9 RAM for now. Memory investment not considered in the budget.


----------



## Cannon19932006

Quote:


> Originally Posted by *CamsX*
> 
> I'm keeping my 16GB 1600Mhz CAS9 RAM for now. Memory investment not considered in the budget.


You worded your OP kind of oddly; I think Johnny and I both thought you needed memory.


----------



## bazookatooths

Bestbuy XFX lifetime warranty

Does this cover burning it up from oc lol


----------



## JohnnyMoore

Quote:


> Originally Posted by *CamsX*
> 
> I'm keeping my 16GB 1600Mhz CAS9 RAM for now. Memory investment not considered in the budget.


Also don't forget about TDP: the FX-9370 is 220 watts vs. 95 for the i7-4790, for example. Within a year you can easily come out ahead, and it's a better CPU for gaming.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Hello all:
> 
> Haven't been able to keep up with the thread (about 25 pages behind), mostly due to work and some home projects.
> 
> I have a CPU-related question. I've got someone lined up to buy my oldish 8350 and Gigabyte 990FXA motherboard.
> 
> What would you say is a good DDR3 CPU and mobo combo for under about $350-400 USD?
> 
> I was looking at the 8370e, 8370, or 9370 and a newer UEFI mobo combo, but I don't mind going Intel if I can get better performance within my budget. My last Intel CPU was a Pentium II 400MHz, btw. LOL.


I'd have kept the board, and just got a new version of the FX-8....

The 8300 does 4.8-5GHz easily, and costs $115 right now, with a Dirt Rally code included!!

Either that, or just go ahead and sink some cash into Intel..... I advise Skylake 1151, or if you can swing it (which based on budget you can NOT- nor can I) go X99 (the 5820K is such great performance per dollar compared to other Intel CPU's).


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd have kept the board, and just got a new version of the FX-8....
> 
> The 8300 does 4.8-5GHz easily, and costs $115 right now, with a Dirt Rally code included!!
> 
> Either that, or just go ahead and sink some cash into Intel..... I advise Skylake 1151, or if you can swing it (which based on budget you can NOT- nor can I) go X99 (the 5820K is such great performance per dollar compared to other Intel CPU's).


That's what I just did: bought a 5820K to replace my 2700K. It should be here any minute.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> That's what I just did, bought a 5820k to replace my 2700k, should be here any minute.


If I were a peanut butter sandwich....

I'd be dripping with jelly....


----------



## CamsX

TDP is not much of a concern. My electric bill is only about $30 USD monthly, and that's with the 8350 OCed and 7970 crossfire for almost 2 years.

It was just easier to sell the CPU and mobo together.

I just hate how expensive Intel CPUs are compared to AMD, especially after looking at performance per dollar.


----------



## JohnnyMoore

Quote:


> Originally Posted by *CamsX*
> 
> TDP is not much of a concern. My electric bill is only about $30 USD monthly, and that's with the 8350 OCed and 7970 crossfire for almost 2 years.
> 
> It was just easier to sell the CPU and mobo together.
> 
> I just hate how expensive Intel CPUs are compared to AMD, especially after looking at performance per dollar.


Price is justified to me


----------



## CamsX

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Price is justified to me


It probably is. I know Intel is better, I'm not arguing that. But a $400 Intel CPU is not 2x faster than a $200 FX CPU. The biggest issue is still budget.


----------



## CamsX

What about this combo?

http://www.newegg.com/Product/ComboDealDetails.aspx?ItemList=Combo.2461940

Intel Core i5-4690K Devil's Canyon
ASUS Z97-A/USB 3.1
$375


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> TDP is not much concern. Electrical bill is only $30 USD monthly, and thats with 8350 OCed and 7970 crossfire for almost 2 years.
> 
> It was just easier to sell the CPU with the mobo together.
> 
> I just hate how expensive Intels are compared to AMD, specially after looking at performance per dollar.


The problem is, you were already using an AMD FX-8 system, so unless you take a leap up, there's not much point in building anything, since getting another FX (even if clocking it a few hundred MHz higher) would just be a marginal gain in the grand scheme.

A lot of guys in the Vishera thread scoop up new chips all the time though, just to "play with"..... if that's the case, then get a good board and good cooling, and get the cheapest FX-8 you can find (again, the 8300), because it's not the stepping that is getting people to 5GHz anymore, it's nice boards and high-end cooling. Now if you want MORE than 5GHz, I would look at the 8370, 9370, or 9590, but you have to ask yourself: is getting one of those for $200, dropping $170 on a board, and then spending who knows what on cooling to OC the dog piss out of it really going to be worth it, when you could get a $120-150 Z97 board, a $200 i5 K-series, and a $50 air cooler, and probably have comparable if not better performance in some cases?

And trust me, I'm a die-hard AMD fan, but only because I did not blow a fortune on my rig.

I spent:

$99 on CPU
$75 on motherboard after $10 MIR
$80 on AIO Cooler

4.8GHz for that amount of money on an FX-8 is unbelievable price/performance....

Spending:

$180 on AMD CPU
$170 on motherboard
$120 on cooler built for high OC's...

To still only get 5Ghz, or maybe a hair more, makes it not such a great value anymore....
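The totals above pencil out like this (a quick sketch; the ~4.8GHz and ~5.0GHz endpoints are the overclocks quoted in this post):

```python
# Rough price-per-GHz comparison of the two FX-8 build paths listed above.
budget_build = {"cpu": 99, "mobo": 75, "cooler": 80}       # ~4.8GHz result
premium_build = {"cpu": 180, "mobo": 170, "cooler": 120}   # ~5.0GHz result

budget_total = sum(budget_build.values())    # 254
premium_total = sum(premium_build.values())  # 470

print(budget_total / 4.8)    # ~$53 per GHz
print(premium_total / 5.0)   # $94 per GHz
```

Nearly double the cost per GHz for a couple hundred extra MHz, which is the whole value argument in a nutshell.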

Just my thoughts though....


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The problem is, you were already using an AMD FX-8 system, so unless you take a leap up, there's not much point in building anything, since getting another FX (even if clocking it a few hundred MHz higher) would just be a marginal gain in the grand scheme.
> 
> A lot of guys in the Vishera thread scoop up new chips all the time though, just to "play with"..... if that's the case, then get a good board and good cooling, and get the cheapest FX-8 you can find (again, the 8300), because it's not the stepping that is getting people to 5GHz anymore, it's nice boards and high-end cooling. Now if you want MORE than 5GHz, I would look at the 8370, 9370, or 9590, but you have to ask yourself: is getting one of those for $200, dropping $170 on a board, and then spending who knows what on cooling to OC the dog piss out of it really going to be worth it, when you could get a $120-150 Z97 board, a $200 i5 K-series, and a $50 air cooler, and probably have comparable if not better performance in some cases?
> 
> And trust me, I'm a die-hard AMD fan, but only because I did not blow a fortune on my rig.
> 
> I spent:
> 
> $99 on CPU
> $75 on motherboard after $10 MIR
> $80 on AIO Cooler
> 
> 4.8GHz for that amount of money on an FX-8 is unbelievable price/performance....
> 
> Spending:
> 
> $180 on AMD CPU
> $170 on motherboard
> $120 on cooler built for high OC's...
> 
> To still only get 5Ghz, or maybe a hair more, makes it not such a great value anymore....
> 
> Just my thoughts though....


Point taken.

About to jump on the 4690k combo I posted above.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Point taken.
> 
> About to jump on the 4690k combo I posted above.


Go for it man!

It's got good crossfire spacing too

Just remember that 1150 is "dead", and I dunno if it'd even be worth adding a 4790K to it later on down the road.

The Skylake 6600K CPU is only $10 more than the 4790K....

Not sure what 1151 boards and DDR3L/DDR4 are fetching though (sure it's the usual "small fortune")

I'm so over the CPU upgrade itch until I see what Zen is bringing to the table.
I am only focused on getting another 390 at this point, for some more 4K Horse POWA!

Speaking of which....


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Go for it man!
> 
> It's got good crossfire spacing too
> 
> Just remember that 1150 is "dead", and I dunno if it'd even be worth adding a 4790K to it later on down the road.
> 
> The Skylake 6600K CPU is only $10 more than the 4790K....
> 
> Not sure what 1151 boards and DDR3L/DDR4 are fetching though (sure it's the usual "small fortune")
> 
> I'm so over the CPU upgrade itch until I see what Zen is bringing to the table.
> I am only focused on getting another 390 at this point, for some more 4K Horse POWA!
> 
> Speaking of which....


You made me ROFL! Thanks for all the comments.

I'm looking into cheaper boards right now, still crossfire ready.


----------



## kizwan

Quote:


> Originally Posted by *Cannon19932006*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I'd have kept the board, and just got a new version of the FX-8....
> 
> The 8300 does 4.8-5GHz easily, and costs $115 right now, with a Dirt Rally code included!!
> 
> Either that, or just go ahead and sink some cash into Intel..... I advise Skylake 1151, or if you can swing it (which based on budget you can NOT- nor can I) go X99 (the 5820K is such great performance per dollar compared to other Intel CPU's).
> 
> 
> 
> 
> 
> 
> 
> That's what I just did, bought a 5820k to replace my 2700k, should be here any minute.
Click to expand...

Which motherboard are you going to pair with that 5820K?


----------



## JohnnyMoore

Quote:


> Originally Posted by *CamsX*
> 
> It probably is. I know Intel is better, I'm not arguing that. But a $400 Intel CPU is not 2x faster than a $200 FX CPU. The biggest issue is still budget.


$100 more expensive, about 30% better in games, cooler, and half the power consumption; why not? Here in France 1kWh is about €0.18, and in Germany €0.30, so we pay attention to power consumption.
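A rough running-cost sketch (the 4 hours/day of load is my assumption; the 220W vs. 95W TDPs and the €0.18/€0.30 per-kWh rates are the figures quoted in this thread):

```python
# Estimated yearly electricity cost difference between two CPUs under load.
def yearly_cost_eur(watts_delta, hours_per_day, eur_per_kwh):
    kwh_per_year = watts_delta / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# FX-9370 (220W) vs. i7-4790 (95W), 4 hours of load per day
print(yearly_cost_eur(220 - 95, 4, 0.18))  # ~33 EUR/year at French rates
print(yearly_cost_eur(220 - 95, 4, 0.30))  # ~55 EUR/year at German rates
```

Over a 2-year upgrade cycle that covers a meaningful chunk of the ~$100 price gap, which is the argument being made here.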


----------



## bazookatooths

I would have never sold the AMD mobo/CPU; visually you will see no difference in gaming, but maybe that's just me watching videos of them side by side. The only difference is the numbers.

*So it sounds like you just want to burn $$$* and make a pretty rig: go all out, get an Intel Skylake i7-6700K, and do a color-themed build with nice cables and liquid loops.


----------



## JohnnyMoore

Quote:


> Originally Posted by *bazookatooths*
> 
> I would have never sold the AMD mobo/CPU; visually you will see no difference in gaming, but maybe that's just me watching videos of them side by side. The only difference is the numbers.
> 
> *So it sounds like you just want to burn $$$* and make a pretty rig: go all out, get an Intel Skylake i7-6700K, and do a color-themed build with nice cables and liquid loops.


http://cpu.userbenchmark.com/Compare/Intel-Core-i7-4790K-vs-AMD-FX-9370/2384vs2005
No difference? Being locked at 60 FPS in some non-AAA titles doesn't mean you won't see a difference with crossfire at 4K resolution, capped at 144Hz. For gaming at 60 FPS I agree it's not a big difference (but it exists and is not negligible). In 2 years I will easily burn more $$$ on electricity.


----------



## bazookatooths

Quote:


> Originally Posted by *JohnnyMoore*
> 
> http://cpu.userbenchmark.com/Compare/Intel-Core-i7-4790K-vs-AMD-FX-9370/2384vs2005
> No difference? Being locked at 60 FPS in some non-AAA titles doesn't mean you won't see a difference with crossfire at 4K resolution, capped at 144Hz. For gaming at 60 FPS I agree it's not a big difference (but it exists and is not negligible). In 2 years I will easily burn more $$$ on electricity.


Yes I agree; I thought he was running 1080p, but at 4K I understand. As far as the power bill goes though, wouldn't the higher initial cost outweigh the power savings, versus the cheaper initial cost of an AMD and a higher power bill, if you're replacing every 2 years? They would seem to cancel each other out, but I personally don't run my aggressive overclocks 24/7 - maybe a mild one.

EDIT: When my current CPU dies, I will be getting the above-mentioned Skylake CPU


----------



## Cannon19932006

Quote:


> Originally Posted by *kizwan*
> 
> Which motherboard you're going to go with that 5820k?


Gigabyte GA-X99 SLI

Got it all together, Windows installed, and booted up
http://valid.x86.fr/fnkyhi


----------



## CamsX

I think I'll go Skylake right away. The 6600K suggestion is looking very good.

Should I go for DDR4? Or just stay with a DDR3 mobo and deal with it?


----------



## Stickeelion

Quote:


> Originally Posted by *Agent Smith1984*
> 
> And trust me, I'm a die hard AMD fan, but only because I did not blow a fortune on my rig.


I did blow a fortune on my AMD rig, but I'm still a fan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The Fury is limited to 4GB of VRAM because of the technological constraints with first gen HBM. That will change with their next revision of HBM (possibly called HBM 2).
> 
> HBM itself performs much faster, and as of this post, 4GB is all that is needed for gaming up to 4K resolution.
> 
> On the other hand, if one is considering a multi GPU setup with resolutions OVER 4K, then the 390 series is a very cost effective option.
> 
> Consider that (3) 390's at a total cost of $990 will perform admirably better than (2) Fury's @ $1300, and will have double the VRAM in a 5k-8k resolution setup.
> 
> Also, (2) 390's at a total of $660 will cost around the same as one Fury X, and give it a good pounding in every scenario.
> 
> Not to say that Hawai cards are the better solution for everyone. When you consider their power draw, heat production, and size, you can see the allure that the Fury brings to the table...
> 
> Then again, you have to be able to buy a Fury to enjoy one, and how's that workin' out for folks?


Thanks for the information. Now, I personally don't think the Fury X is ever worth the price considering what the 390X has to offer. My 7950s have had their 3GB of memory maxed out a couple of times, and that's only at 1080p, which makes me think that at 4K some games might want more than 4GB.

However, if I lived in the US I would buy 390's for days. You have got to be kidding me: a 390 here costs $510, the 390X costs $650, and a Fury X is a wallet-breaking $1100; put any of those 3 cards in crossfire and it will bring your wallet to its knees. That is assuming they are not sold out; vendors only seem to import a very limited number, and the state of the GPU market here is just so bad. I paid $450 for each of my 7950's in August 2013. And don't even try looking at the Titan X at a whopping $1700 per card; it's ridiculous, nearly twice as much here. I see them on Newegg and Amazon for $1000 and get oh so jealous of your prices.


----------



## kizwan

Quote:


> Originally Posted by *Cannon19932006*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Which motherboard you're going to go with that 5820k?
> 
> 
> 
> Gigabyte GA-X99 SLI
> 
> got it all together and win installed and booted up
> http://valid.x86.fr/fnkyhi
Click to expand...

Gigabyte did a bad job with X79 motherboards, and I heard the same thing about X99, but I'm not sure about that particular motherboard. I mean in terms of overclocking. Hopefully that mobo is OK.

For the rest...


----------



## Cannon19932006

Quote:


> Originally Posted by *kizwan*
> 
> Gigabyte did a bad job with X79 motherboards, and I heard the same thing about X99, but I'm not sure about that particular motherboard. I mean in terms of overclocking. Hopefully that mobo is OK.
> 
> For the rest...


The UD3 and UD4 didn't show great reviews on Newegg, but the SLI had 5 out of 5 eggs, and decent reviews from a few places.


----------



## tangelo

So no opinions on Sapphire vs. MSI?


----------



## Stickeelion

Quote:


> Originally Posted by *tangelo*
> 
> So no opinions on Sapphire vs. MSI?


I don't really have much to say, except the only problem I had was that Sapphire's RMA process is very convoluted and slow (takes month(s)) but otherwise fantastic (this was back with a 7950 mind you)


----------



## AverdanOriginal

Quote:


> Originally Posted by *Cannon19932006*
> 
> The UD3 and UD4 didn't show great reviews on Newegg, but the SLI had 5 out of 5 eggs, and decent reviews from a few places.


Read the first page; Agent Smith made a good roundup of all the different models. The Sapphire runs cooler (due to 3 fans) and is thus maybe a bit quieter, but it seems not to have as much overclocking potential and is very long (300+mm). The MSI runs a bit hotter (maybe 2-3°C more), but is better binned and hence gets better overclocking results so far, according to the results in this forum. Then there is also the question of looks: red vs. black/grey colour, the LED dragon, and the backplate.


----------



## tangelo

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Read the first page; Agent Smith made a good roundup of all the different models. The Sapphire runs cooler (due to 3 fans) and is thus maybe a bit quieter, but it seems not to have as much overclocking potential and is very long (300+mm). The MSI runs a bit hotter (maybe 2-3°C more), but is better binned and hence gets better overclocking results so far, according to the results in this forum. Then there is also the question of looks: red vs. black/grey colour, the LED dragon, and the backplate.


I've read the whole thread but was looking for more info, especially about the VRM cooling. I've tried googling the matter, but none of the reviews and teardowns address the issue.

Thanks anyway. I think I'll just grab the MSI and be done with it.


----------



## CamsX

Quote:


> Originally Posted by *tangelo*
> 
> I've read the whole thread but was looking for more info, especially about the VRM cooling. I've tried googling the matter, but none of the reviews and teardowns address the issue.
> 
> Thanks anyway. I think I'll just grab the MSI and be done with it.


On both cards I think it's very good, but different at the same time. Both have custom PCBs, so waterblocks weren't available last time I checked. The MSI VRMs are in direct contact with the heatsink and backplate assembly. The Sapphire VRMs use tiny heatsinks, and there is a small gap on top relative to the main heatsink; also no backplate.

Depends on what you prefer. Neither will give spectacular cooling unless you slap additional fans on top of the cards like Agent Smith did.


----------



## CamsX

*MSI*



https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/4.html


----------



## CamsX

*Sapphire*



German site:
http://www.pc-max.de/artikel/grafikkarten/test-sapphire-radeon-r9-390-nitro-8gb/21292


----------



## tangelo

Thanks! That helped


----------



## Agent Smith1984

I can tell you directly from experience that the VRM cooling on the MSI card is great.


----------



## tangelo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I can tell you directly from experience that the VRM cooling on the MSI card is great.


My biggest question mark about the matter was: if Sapphire has not-as-good cooling on its VRMs, is it worth getting for the 50€ lower price? But like I said, I'm going to get the MSI. I prefer the higher OC potential and the backplate.

Thanks all for your input.


----------



## Dorland203

Hi. I've just played Dying Light with the latest patch and driver, max settings, 50% view distance. In the very first mission, when I'm in the building, I always get 70 FPS minimum, even when I play the tutorial mission in the "gym". But when I went out of the building to Dr. Zere, the frames plummeted to 35, got back up to 60, but never passed 70. What kind of performance are you guys getting in this game?


----------



## Techyrod

Glad to be here!!


----------



## JohnnyMoore

Quote:


> Originally Posted by *Dorland203*
> 
> Hi. I've just played Dying Light with the latest patch and driver, max settings, 50% view distance. In the very first mission, when I'm in the building, I always get 70 FPS minimum, even when I play the tutorial mission in the "gym". But when I went out of the building to Dr. Zere, the frames plummeted to 35, got back up to 60, but never passed 70. What kind of performance are you guys getting in this game?


Good performance, but I can't play with crossfire because of blood-splatter flickering on the screen. A solo R9 390X does better than 35 FPS though; I don't know what is wrong with your PC.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Techyrod*
> 
> 
> 
> Glad to be here!!


Congrats! But I think the ASUS Strix is the worst series of 390, and one of the most expensive.


----------



## Techyrod

It was a gift and no issues so far. I haven't overclocked yet, but that's next. Temps are good and no issues pushing my 1440p monitor. But if I were to buy a 390, it would have been the Sapphire card. I'm happy nonetheless.


----------



## JohnnyMoore

Quote:


> Originally Posted by *Techyrod*
> 
> It was a gift and no issues so far. I haven't overclocked yet, but that's next. Temps are good and no issues pushing my 1440p monitor. But if I were to buy a 390, it would have been the Sapphire card. I'm happy nonetheless.


Srsly, it's a good card in all ways; ±50MHz doesn't matter. Enjoy it.


----------



## Zack Foo

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Srsly, it's a good card in all ways; ±50MHz doesn't matter. Enjoy it.


So MSI is the go-to card for the 390 now?


----------



## Slay

+19mV AUX voltage
+100mV core voltage
+50% power limit

http://www.3dmark.com/fs/5828065


----------



## Agent Smith1984

Quote:


> Originally Posted by *Techyrod*
> 
> 
> 
> Glad to be here!!


Hey there!

Get us a screen shot with your name, or a pic of the card with your name and I'll get you added to the club.

Thanks


----------



## bazookatooths

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Srsly, it's a good card in all ways; ±50MHz doesn't matter. Enjoy it.


I agree; my 2011 HD 6670 was 800MHz, and that's almost 5 years old. I don't see the big deal about the clock speed.


----------



## Noirgheos

Has anyone managed to get an MSI 390 to 390X levels through OC'ing?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Has anyone managed to get an MSI 390 to 390X levels through OC'ing?


I am performing well past a 390X with my 390 @ 1200/1750; of course, the 390X will clock to around the same, and then it's 3-5% faster again.
It only takes about 40 or 50MHz on a 390 to even out with the 390X.


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am performing well past a 390X with my 390 @ 1200/1750, of course the 390X will clock to around the same, and then it's 3-5% faster again.
> It only takes about 40 or 50MHz on a 390 to even out with the 390X


What were temps like with the OC? Idle/load? Default fan settings or not, please mention it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noirgheos*
> 
> Has anyone managed to get an MSI 390 to 390X levels through OC'ing?
> 
> 
> 
> I am performing well past a 390X with my 390 @ 1200/1750, of course the 390X will clock to around the same, and then it's 3-5% faster again.
> It only takes about 40 or 50MHz on a 390 to even out with the 390X
Click to expand...

If it's the same as with 290s vs. 290Xs, then 1150MHz core on the 290 = 1100 or so on the 290X, and 1250MHz core on the 290 = 1150MHz on the 290X.

I don't have a 390 to try out, but I might get one at some point.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> What were temps like with the OC? Idle/load? Default fan settings or not, please mention it.


Well, before improving my case flow, it wasn't even possible as I was topping out at 1180/1700 @ 100/50mv offset, but now I can do 1200/1750 at 100/50mv offset.

The temps before were hitting 88C core and around 90C VRM with fan speed set to match the core temp (88%)

After improving my case flow, I am seeing around 72-74C core (depending on title as I game at 4K and settings impact temps at that res), and around 73-75C on the VRM

Just make sure the case flow is somewhere between good and great, and you should be fine on temps.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If it's the same with 290's vs 290x's then at 1150mhz Core speed for the 290 = 1100 or so for the 290x and 1250Mhz core for the 290 = 1150Mhz for the 290x
> 
> i don't have a 390 to try out but i might get one at some point


Can you run a FireStrike with your 390X at 1150/1500 and your CPU at 4.8GHz, and I will run some tests later on my 390 to see what it takes on the core to match it?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> If it's the same with 290's vs 290x's then at 1150mhz Core speed for the 290 = 1100 or so for the 290x and 1250Mhz core for the 290 = 1150Mhz for the 290x
> 
> i don't have a 390 to try out but i might get one at some point
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you run a FireStrike with your 390X at 1150/1500 and your CPU at 4.8GHz, and I will run some tests later on my 390 to see what it takes on the core to match it?
Click to expand...

I would, but..... I don't have the 390X in my rig atm

But I promise you I will do that at some point in the coming days


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, before improving my case flow, it wasn't even possible as I was topping out at 1180/1700 @ 100/50mv offset, but now I can do 1200/1750 at 100/50mv offset.
> 
> The temps before were hitting 88C core and around 90C VRM with fan speed set to match the core temp (88%)
> 
> After improving my case flow, I am seeing around 72-74C core (depending on title as I game at 4K and settings impact temps at that res), and around 73-75C on the VRM
> 
> Just make sure the case flow is somewhere between good and great, and you should be fine on temps.


Well, I have an H440, and a Corsair H80i for my CPU. Do you think I'll be able to OC to your levels while hitting a max of 76C on the core?


----------



## CamsX

Quote:


> Originally Posted by *Noirgheos*
> 
> Well, I have a H440, and a Corsair H80i for my CPU. Do you think I'll be able to OC to your levels while hitting a max of 76C core?


Agent uses very aggressive/loud case fans. If noise is a concern for you, you might not be able to stay as low on temperature, which ultimately is the key to his stability, especially the VRM temperature. Just speaking from what I've seen throughout the thread.

Test around with your rig and see what you can achieve. The H440 should have a fairly decent airflow balance.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Agent uses very aggressive/loud case fans. If noise is a concern for you, you might not be able to stay as low on temperature, which ultimately is the key to his stability, especially the VRM temperature. Just speaking from what I've seen throughout the thread.
> 
> Test around with your rig and see what you can achieve. The H440 should have a fairly decent airflow balance.


Yep, you pretty much summed it up.


----------



## Techyrod

Agreed that air/case flow is important. I have the Define R5; gaming I hit 72-73C, and benching Valley or FireStrike I hit 81C.


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, you pretty much summed it up.


Alright, what is the max safe temp? I plan to game on this thing for 2 years minimum. I'd like to know a max temp that will ensure maximum performance without degradation for that long.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Alright, what is the max safe temp.? I plan to game on this thing for 2 years minimum. I'd like to know a max temp. that will assure maximum performance without degradation for that long.


I mean, these cards will run up to 94C before they start to throttle, but you will likely see temps in the low 70's to low 80's at stock, depending on your fan profile and case flow.

Adding voltage and overclocking will increase those numbers, but you won't really have a true indication of what it will be until you test it out.

The key to overclocking for me has been keeping the VRM under 80C.


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I mean, these cards will run up to 94C before they start to throttle, but you will likely see temps in the low 70's to low 80's at stock, depending on your fan profile and case flow.
> 
> Adding voltage and overclocking will increase those numbers, but you won't really have a true indication of what it will be until you test it out.
> 
> The key to overclocking for me has been keeping the VRM under 80C.


Are VRM temps shown on MSI Afterburner?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Are VRM temps shown on MSI Afterburner?


No, you'll have to use GPU-Z, though I do believe the next version of Trixx will have (or already has) VRM temps.


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No, you'll have to use GPU-Z, though I do believe the next version of Trixx will/does have VRM temps.


Thank you. I'll come back here if I need anything.


----------



## kizwan

Quote:


> Originally Posted by *Noirgheos*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I mean, these cards will run up to 94C before they start to throttle, but you will likely see temps in the low 70's to low 80's at stock, depending on your fan profile and case flow.
> 
> Adding voltage and overclocking will increase those numbers, but you won't really have a true indication of what it will be until you test it out.
> 
> The key to overclocking for me has been keeping the VRM under 80C.
> 
> 
> 
> 
> 
> 
> 
> Are VRM temps shown on MSI Afterburner?
Click to expand...

You can monitor VRM temps using GPU-Z, Trixx v5.0 or HWiNFO, to name a few.


----------



## Cannon19932006

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I mean, these cards will run up to 94C before they start to throttle, but you will likely see temps in the low 70's to low 80's at stock, depending on your fan profile, and case flow.
> 
> Adding voltage and overclocking with increase those numbers, but you won't really have a true indication of what it will be until you test it out.
> 
> The to overclocking for me has been keeping the VRM under 80C.


Same here; as soon as the VRMs and the core get over 82, any sort of overclock stability goes right out the window.


----------



## JohnnyMoore

Pls dont tease me guys , i wanna overclock it too







Main GPU reached 1180MHz with +10mV and the second with +60mV ...


----------



## Gumbi

Quote:


> Originally Posted by *JohnnyMoore*
> 
> Pls dont tease me guys , i wanna overclock it too
> 
> 
> 
> 
> 
> 
> 
> main gpu reach 1180mhz with 10mv and second with 60mv ...


WAT. Those results are insane, especially the first one. Heck, the second one is superb too...


----------



## blank964

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If it's the same with 290's vs 290x's then at 1150mhz Core speed for the 290 = 1100 or so for the 290x and 1250Mhz core for the 290 = 1150Mhz for the 290x
> 
> i don't have a 390 to try out but i might get one at some point


It sounds like the equivalence curve looks something like:

290 ~ 290x
1450 ~ 1200
1250 ~ 1150
1150 ~ 1100
1075 ~ 1050
1013 ~ 1000

i.e. after 1000 on 290x, for each 50mhz 290x increase, you need double the mhz on 290. Is that the rule?


----------



## Agent Smith1984

Quote:


> Originally Posted by *blank964*
> 
> It sounds like the equivalence curve looks something like:
> 
> 290 ~ 290x
> 1450 ~ 1200
> 1250 ~ 1150
> 1150 ~ 1100
> 1075 ~ 1050
> 1013 ~ 1000
> 
> i.e. after 1000 on 290x, for each 50mhz 290x increase, you need double the mhz on 290. Is that the rule?


Na, it doesn't scale that much.

All you gotta do is multiply the clock by the shaders on the 390, and divide that result by 2816 to get the equivalent 390x clock. (Theoretically, and the actual performance of the X doesn't even live up to the math)

1100 x 2560 = 2,816,000; 2,816,000 / 2816 = 1000

Now account for the memory counting for half of the performance (theoretically), and that being equal between the two cards, and you can cut the 100mhz difference in half.

If you figure most actual results show that a Hawaii Pro at 1050 performs about the same as a Hawaii XT at 1000, then my little formula isn't too far off...

So a 390x at 1200mhz would compete with a 390 at around 1260mhz, which is pretty accurate if you do some result searching.
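That rule of thumb can be sketched in a few lines of Python. This is just an illustration of the arithmetic in the post above (the function name is made up; the constants are the cards' shader counts):

```python
# Sketch of the shader-ratio rule of thumb: scale by shader count,
# then halve the gap because memory is assumed to count for ~half of
# performance and is identical on both cards.
SHADERS_390 = 2560   # Hawaii Pro
SHADERS_390X = 2816  # Hawaii XT

def equivalent_390_clock(clock_390x_mhz):
    """Estimate the 390 core clock needed to match a 390X at the given clock."""
    raw = clock_390x_mhz * SHADERS_390X / SHADERS_390  # pure shader scaling
    return clock_390x_mhz + (raw - clock_390x_mhz) / 2  # halve the gap for memory

print(equivalent_390_clock(1200))  # 1260.0, matching the example in the post
print(equivalent_390_clock(1000))  # 1050.0
```

Per the post itself, real scaling is even flatter than this, so treat it as an upper bound on the clock difference.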


----------



## Gumbi

From what I've seen (there were various threads about it back around the time of the 290/290x release), it didn't scale linearly. The difference was at best 5% and often times less than that clock for clock.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> From what I've seen (there were various threads about it back around the time of the 290/290x release), it didn't scale linearly. The difference was at best 5% and often times less than that clock for clock.


Exactly, and the difference between 1200 and 1260 is about 5%


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Na, it doesn't scale that much.
> 
> All you gotta do is multiply the clock by the shaders on the 390, and divide that result by 2816 to get the equivalent 390x clock. (Theoretically, and the actual performance of the X doesn't even surmount to the math)
> 
> 1100x2560 = 2,816,000/ 2816= 1000
> 
> Now account for the memory counting for half of the performance (theoretically), and those being equal amongst the two cards, and you can cut the 100mhz difference in half.
> 
> If you figure most actual results show that a Hawaii pro at 1050 performs about the same as Hawaii xt at 1000, then my little formula isn't too far off...
> 
> So a 390x at 1200mhz would compete with a390 at around 1260mhz, which is pretty accurate if you do some result searching.


Hey, if I'm getting a 390X now, Sapphire or MSI? Does it really come down to cooling vs overclocking, or is there some other factor that can decide it for me?


----------



## rankdropper84

Can someone help me...what all needs to be in my profile thing to include my sapphire 390 in the owners list?


----------



## rankdropper84

Also what programs do you guys use to monitor temps?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Na, it doesn't scale that much.
> 
> All you gotta do is multiply the clock by the shaders on the 390, and divide that result by 2816 to get the equivalent 390x clock. (Theoretically, and the actual performance of the X doesn't even surmount to the math)
> 
> 1100x2560 = 2,816,000/ 2816= 1000
> 
> Now account for the memory counting for half of the performance (theoretically), and those being equal amongst the two cards, and you can cut the 100mhz difference in half.
> 
> If you figure most actual results show that a Hawaii pro at 1050 performs about the same as Hawaii xt at 1000, then my little formula isn't too far off...
> 
> So a 390x at 1200mhz would compete with a390 at around 1260mhz, which is pretty accurate if you do some result searching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hey if i getting 390x now, sapphire or msi? is it really cooling vs overclocking? or some other factor can decide that for me too?
Click to expand...

Why you no like XFX?


----------



## Zack Foo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Why you no like XFX?


Cause xfx hates my country....


----------



## Sgt Bilko

Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Why you no like XFX?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cause xfx hates my country....
Click to expand...

and what country would that be?


----------



## atlas3686

Just installed my MSI R9 390 and upgraded to Windows 10. I now get a 2-3 min long blank black screen before POST, then it starts up as normal and doesn't seem to have any other issues. Anyone have any ideas? Running an Asus Maximus Extreme IV (P67) with a 4.5GHz 2600K. PSU is a Corsair HX850.


----------



## Zack Foo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> and what country would that be?


Malaysia. We literally don't have XFX's new stuff here. The last piece was like some old card with 4 letters.
So which one should I take?


----------



## By-Tor

Quote:


> Originally Posted by *rankdropper84*
> 
> Also what programs do you guys use to monitor temps?


HWinfo64 is what I use and the HW monitor gadget.


----------



## flopper

Quote:


> Originally Posted by *atlas3686*
> 
> Just installed my MSI R9 390 and upgraded to windows 10. I now get a 2 -3 min long blank black screen before post, then it starts up as normal and doesn't seem to have any other issues. Anyone have any ideas? Running a Asus maximus extreme IV (P67) with a 4.5Ghz 2600K. PSU is a corsair HX850.


Latest motherboard BIOS?
That's an old chipset these days.


----------



## cfcboy

I love this card so i decided to CF it with another









But BE WARNED! Plugging the second card into a PCIe slot that's not Gen3 x8 or above will crush the performance you were hoping for. In fact I had a 50% decrease in FPS in most games due to the bandwidth issues. Gen2 x4 simply cannot handle CF. My board was an MSI Mate and the specs say it can handle CF. It can, it just halves your performance! lol

So if you do decide to CF, make sure it's Gen3 x16/x16 or x8/x8


----------



## Sgt Bilko

Quote:


> Originally Posted by *cfcboy*
> 
> I love this card so i decided to CF it with another
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But BE WARNED! Plugging the second card into a PCIe slot that's not Gen3 x8 or above will crush the performance you were hoping for. In fact I had a 50% decrease in FPS in most games due to the bandwidth issues. Gen2 x4 simply cannot handle CF. My board was an MSI Mate and the specs say it can handle CF. It can, it just halves your performance! lol
> 
> So if you do decide to CF, make sure it's Gen3 x16/x16 or x8/x8


Bollocks.....I've done PCIe 2.0 x8/x8 (= to PCIe 3.0 x4) and i got the same fps as i did with PCIe 2.0 x16/x16 (= to PCIe 3.0 x8)
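Those equivalences check out if you run the rough numbers. A quick sketch (function name made up; constants are each generation's per-lane transfer rate and line-code efficiency):

```python
# Approximate per-direction PCIe bandwidth: GT/s * encoding efficiency / 8 bits per byte.
GT_PER_S = {2: 5.0, 3: 8.0}             # transfer rate per lane, GT/s
EFFICIENCY = {2: 8 / 10, 3: 128 / 130}  # 8b/10b (Gen2) vs 128b/130b (Gen3) coding

def pcie_gbytes_per_s(gen, lanes):
    """Usable bandwidth in GB/s for a given PCIe generation and lane count."""
    return GT_PER_S[gen] * EFFICIENCY[gen] / 8 * lanes

print(pcie_gbytes_per_s(2, 8))  # 4.0  GB/s -- PCIe 2.0 x8
print(pcie_gbytes_per_s(3, 4))  # ~3.94 GB/s -- PCIe 3.0 x4, effectively the same
print(pcie_gbytes_per_s(2, 4))  # 2.0  GB/s -- the Gen2 x4 slot cfcboy warned about
```

So 2.0 x8 and 3.0 x4 really are near-identical, and the Gen2 x4 slot has half that, which matters more for XDMA CrossFire since the frames travel over the bus instead of a bridge cable.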


----------



## diggiddi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Bollocks.....I've done PCIe 2.0 x8/x8 (= to PCIe 3.0 x4) and i got the same fps as i did with PCIe 2.0 x16/x16 (= to PCIe 3.0 x8)


What about x16/x8 PCIe 2.0? Did you try that, and how were the results?


----------



## Sgt Bilko

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Bollocks.....I've done PCIe 2.0 x8/x8 (= to PCIe 3.0 x4) and i got the same fps as i did with PCIe 2.0 x16/x16 (= to PCIe 3.0 x8)
> 
> 
> 
> What about X16 X8 pcie 2.0 did you try that, and how were the results
Click to expand...

They would be the same in theory, but quite a few websites have already done PCIe 2.0 vs 3.0 testing (albeit with a single GPU) and they showed basically the same fps between them


----------



## atlas3686

Quote:


> Originally Posted by *flopper*
> 
> latest mboard bios?
> thats an old chipset today.


No not the latest, that was my first thought too, going to upgrade as soon as I get home and see if it sorts it out.


----------



## cfcboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Bollocks.....I've done PCIe 2.0 x8/x8 (= to PCIe 3.0 x4) and i got the same fps as i did with PCIe 2.0 x16/x16 (= to PCIe 3.0 x8)


Yeah, that's Gen3 x4. I actually said Gen2 x4. And that was CrossFire, not single cards in one slot or the other. If you put one CF card in a Gen3 x16 slot and the other in Gen2 x4, it will seriously hamper the performance, especially with the 390, which uses the lanes more since there are no CrossFire cables.


----------



## Zack Foo

Hey guys I'm back! I've made my choice and gone with an MSI R9 390X!

I used to have the Asus Strix 390 and that card was a bad overclocker with bad temperatures too.

Anyway I would like to get into the club and be a member!

And help out if anyone needs it!


----------



## Sgt Bilko

Quote:


> Originally Posted by *cfcboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Bollocks.....I've done PCIe 2.0 x8/x8 (= to PCIe 3.0 x4) and i got the same fps as i did with PCIe 2.0 x16/x16 (= to PCIe 3.0 x8)
> 
> 
> 
> yeah that's a gen 3x 4. I actually said gen2 x4. And that was crossfire not single cards in one or the other. You put a cf with 1x gen 3 16x and then another in gen 2x4 it will seriously hamper the performance, especially with the 390 which uses the lanes more and no crossfire cables.
Click to expand...

Well that's a little different. I didn't understand your post very well.

i don't have a board/CPU capable of PCIe 3.0, but i do know that mixing x8 and x16 on PCIe 2.0 doesn't affect your fps all that much. Between PCIe 2.0 and 3.0 i could see it, and yes, i know that all GCN 1.1 and 1.2 cards do Crossfire over the PCIe slots instead of using the bridges now


----------



## cfcboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well that's a little different. I didn't understand your post very well.
> 
> i don't have a board/CPU capable of PCIe 3.0 but i do know that mixing between x8 and x16 on PCIe 2.0 doesn't affect your fps all that much, between PCIe 2.0 and 3.0 i could see it and yes i know that all GCN 1.1 and 1.2 do Crossfire over the PCIe slots instead of using the Bridges now


Sorry, my fault, I have a habit of explaining things in a half cocked way lol


----------



## atlas3686

Solved the issue: it turns out it had nothing to do with the GPU, it was the Asus Xonar DX sound card..... no Win 10 drivers, or Win 8 for that matter. Asus really are the worst.


----------



## Zack Foo

Quote:


> Originally Posted by *atlas3686*
> 
> Solved this issue: It turns out it had nothing to do with the GPU, it was the asus xonar dx sound card..... no win 10 drivers or win 8 for that matter, asus are really the worst.


Totally agree, Asus are getting worse and worse, on the Nvidia side too, not only AMD.

I feel bad for the guy that bought my Asus Strix 390 though, but hey, he got it at the same price as a new Sapphire Nitro 390.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rankdropper84*
> 
> Can someone help me...what all needs to be in my profile thing to include my sapphire 390 in the owners list?


Everything eligible as proof is listed in the OP


----------



## jackalopeater

Quote:


> Originally Posted by *Tarifas*
> 
> Hello everyone, quick question to owners of the asus dc2 version of the 390/390x how does it behave in regards to temperatures and noise? Is there some feedback or numbers to be shared?


I have it; under stock settings the core gets to around 74C and VRM1 gets up to 95C, the fan gets to around 41% and stays pretty quiet. Setting up a custom fan profile letting the fan reach 50% drastically reduces VRM temps and lowers the core to around 70C, but it's a bit noisy.


----------



## rankdropper84

How do you check your vrm temps tho? Also doubt I'll add myself to the owners list since it seems like a lot of hassle. If anyone believes me I have the sapphire nitro 390(big honking card). Whats everyones max OC that they have been safely able to get out of the 390? Also what program is everyone using to add voltage and clock? CCC does the OCing for this card in a weird way...


----------



## rankdropper84

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Everything eligible as proof is listed in the OP


Is there any advantage of being listed in the owners directory? I'm a noob with this site and don't really care to jump through 50 hoops to prove I'm real. Check out my G+ page if you want. Todd Gless is my name. Also what 390 model do you have? Are you OCing? Whats the ambient temps? Highest I have seen from my sapphire is 69C on an 80+F day with no AC and Windows open playing the witcher. Playing gta5 I have never seen temps eclipse 61C as hard as that may be to believe


----------



## Agent Smith1984

Quote:


> Originally Posted by *rankdropper84*
> 
> Is there any advantage of being listed in the owners directory? I'm a noob with this site and don't really care to jump through 50 hoops to prove I'm real. Check out my G+ page if you want. Todd Gless is my name. Also what 390 model do you have? Are you OCing? Whats the ambient temps? Highest I have seen from my sapphire is 69C on an 80+F day with no AC and Windows open playing the witcher. Playing gta5 I have never seen temps eclipse 61C as hard as that may be to believe


It's not really 50 hoops, just a screen shot of GPU-Z with your name in notepad, or even a GPU-Z validation with your name and post the link.

If you don't want to be on the roster that is fine too, you are still welcome to join the conversation.
The member list is partially so guys can be part of the club, but it's biggest benefit is the ability to see OC results for different brands and models of these cards.

Anyways, I have the MSI 390 clocked at 1200/1750 100mv/50mv+ and am getting around 74C core, 75C VRM with custom fan profile, and very high powered case fans.


----------



## rankdropper84

Ehh sounds like I'd have to upload to imgur or some crap. Either believe me or don't...donno why I'd lie. What case do you have? What do you mean you have high powered case fans? What case? Can you help paint me a picture plz because I'm going to try and replicate your results on my sapphire. How do you monitor vrm temps btw?


----------



## Agent Smith1984

Quote:


> Originally Posted by *rankdropper84*
> 
> Ehh sounds like I'd have to upload to imgur or some crap. Either believe me or don't...donno why I'd lie. What case do you have? What do you mean you have high powered case fans? What case? Can you help paint me a picture plz because I'm going to try and replicate your results on my sapphire. How do you monitor vrm temps btw?


You don't need to host the image, just take a screen shot, save it in paint, and use the upload image button right above the post box... It's really simple, and people load screenies on here all the time, not even for proof of anything, but just to give examples or show off their settings.

Anyways, I use the NZXT S340 with some Cooler Master Jetflo 120's (95CFM) in the rear and top.

Odds are, the Sapphire card won't clock as well as the MSI, but you can probably land 1160-1180 with it, which is still a good performing card at that clock speed.


----------



## rankdropper84

Nvm, I just saw your setup. My HAF 932 should suffice as being in the high-powered fan area


----------



## rankdropper84

https://plus.google.com/+ToddGless

Check out my G+ page. Don't have many pictures of my 390 but if you scroll you will see I do own one


----------



## rankdropper84




----------



## rankdropper84




----------



## rankdropper84

Plasmas don't like having their pictures taken haha


----------



## rankdropper84




----------



## Agent Smith1984

Quote:


> Originally Posted by *rankdropper84*


Awesome, I'll get you added!


----------



## rankdropper84

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Awesome, I'll get you added!


Thanks. How do you monitor vrm temps tho?


----------



## CamsX

... no comments

Btw, my Skylake kit shipped yesterday. I hope I can finally break 10k overall in Firestrike!









Honestly, I'm optimistic I can get over 11.5k with 1150/1650 on the GPU and 4.4-4.6 on the CPU. One can only dream.


----------



## CamsX

Quote:


> Originally Posted by *rankdropper84*
> 
> Thanks. How do you monitor vrm temps tho?


In GPU-Z, on the Sensors tab. Set the VRM readings to Max to display what you are reaching with your card. The max recommended temp is 80°C


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> ... no comments
> 
> Btw, my Skylake kit shipped yesterday. I hope I can finally break 10k overall in Firestrike!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Honestly, I'm optimistic I can get over 11.5k with 1150/1650 on the GPU and 4.4-4.6 on the CPU. One can only dream.


You'll get that pretty easy bud.

Not to mention that CPU should do about 4.7-4.8 with your cooling.


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Here's my thoughts for anyone thinking about these cards...
> 
> If you buy one, I think you made a great choice....
> 
> If you don't buy one, I completely understand
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At the end of the day, I am thrilled to be the first on here to own one of these cards.
> I know these are now going to be considered mainstream GPU's, but I feel like these cards are still pretty beastly, and are great performers for the money.
> 
> Just stay tuned..... This will be a work in progress.
> 
> Teaser info....
> 
> This 390 is about 14% faster out of the box than my 290 Tri-x OC.
> 
> At 100+MV I have hit 1180/1600 with full stability.
> 
> I will finish finding the max VRAM clock this evening, and then continue to push voltage from there for more core frequency.


What program is used to validate the overclock? Only 3DMark? Mine is the demo version and waiting for everything to finish takes wayyyy too long. Any way to find out whether the clocks are good and only check the score later?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Zack Foo*
> 
> what program is used to validate the overclock? only 3dmark? as mine is the demo version and waiting everything to finish is wayyyy too long. Anyway to finds out whether the clocks are good and only check the score later?


I run the demo bench every time, it's the best way to check your overclock in my opinion.

If it gets through the entire first video scene, then you probably have a good overclock.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Here's my thoughts for anyone thinking about these cards...
> 
> If you buy one, I think you made a great choice....
> 
> If you don't buy one, I completely understand
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At the end of the day, I am thrilled to be the first on here to own one of these cards.
> I know these are now going to be considered mainstream GPU's, but I feel like these cards are still pretty beastly, and are great performers for the money.
> 
> Just stay tuned..... This will be a work in progress.
> 
> Teaser info....
> 
> This 390 is about 14% faster out of the box than my 290 Tri-x OC.
> 
> At 100+MV I have hit 1180/1600 with full stability.
> 
> I will finish finding the max VRAM clock this evening, and then continue to push voltage from there for more core frequency.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what program is used to validate the overclock? only 3dmark? as mine is the demo version and waiting everything to finish is wayyyy too long. Anyway to finds out whether the clocks are good and only check the score later?
Click to expand...

GPU-Z > Validate > Type Username into box > Hit Validate > Post the link into this thread








Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zack Foo*
> 
> what program is used to validate the overclock? only 3dmark? as mine is the demo version and waiting everything to finish is wayyyy too long. Anyway to finds out whether the clocks are good and only check the score later?
> 
> 
> 
> I run the demo bench everytime, it's the best way to check your overclock in my opinion.
> 
> If it gets through the entire first video scene, then you probably have a good overclock.
Click to expand...

I completely misread that post









I use Unigine Valley, it's free, leave it running for 20mins or so and your OC will usually be stable


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I run the demo bench everytime, it's the best way to check your overclock in my opinion.
> 
> If it gets through the entire first video scene, then you probably have a good overclock.


but i do have to stare at it for artifacts and stuff right?


----------



## Zack Foo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> GPU-Z > Validate > Type Username into box > Hit Validate > Post the link into this thread
> 
> 
> 
> 
> 
> 
> 
> 
> I completely misread that post
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I use Unigine Valley, it's free, leave it running for 20mins or so and your OC will usually be stable


I used that too. So no artifacts while running Unigine means it's fine? And only after I get the max overclock without artifacts do I let it run for an hour or so to check stability?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Zack Foo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I run the demo bench everytime, it's the best way to check your overclock in my opinion.
> 
> If it gets through the entire first video scene, then you probably have a good overclock.
> 
> 
> 
> but i do have to stare at it for artifacts and stuff right?
Click to expand...

Yep, either little blue + white squares for core speeds or your screen will flash red or white for memory iirc


----------



## Zack Foo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep, either little blue + white squares for core speeds or your screen will flash red or white for memory iirc


Flashing red or white I've never seen, but blue + white I know.

Hope I don't take the red and white as something normal and think it's perfectly fine.

Is 390X overclocking the same as the 390?

Is 1200MHz/1750 equally good for both cards? Not performance-wise, just: 1200MHz is very good for a 390, is it just as good for a 390X?


----------



## Agent Smith1984

I would watch the FS demo at normal clocks first, and then test the OC, because my core will artifact in places near the end of FS demo that could be perceived as part of the video if you didn't know any better....

I can run Valley and Heaven at 1230MHz artifact free with my card, but anything over 1203MHz will show what appears to be floating fireballs at the end of the FS demo.

That's why I always check FS one time after I've run Heaven for about 30 minutes.


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I would watch the FS demo at normal clocks first, and then test the OC, because my core will artifact in places near the end of FS demo that could be perceived as part of the video if you didn't know any better....
> 
> I can run Valley and Heaven at 1230MHz artifact free with my card, but anything over 1203MHz will show what appears to be floating fireballs at the end of the FS demo.
> 
> That's why I always check FS one time after I've ran Heaven for about 30 minutes.


So: artifact-free in Valley at the desired clock speed, then let it run for 30 minutes, then run FS for a final inspection? Got it.

And when I turn the memory clock up to 1750 my screen just goes mental on me, right away, the moment I set it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Zack Foo*
> 
> so artifacts free on valley at desire clock speed. then let it run for 30 minute. and run fs for final inspection? Got it.
> 
> And when i turn the memory clock speed to 1750 my screen just go mental on me. right away right at the moment i set it.


Firstly, don't just go straight to 1750.... that's asking for a black screen and a reboot needed....

Move in increments of 20-50Mhz and test each time.

Then once you hit instability, you need to add about 25mv of AUX voltage and push until you lose stability again.
Then add 25mv more and repeat.

Overclocking is a VERY time consuming process, but it does become enjoyable, and should not be looked at like a chore, but as a hobby.
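That step-test-bump-voltage loop can be sketched in code. Everything here is hypothetical: a real "stress test" is a manual Valley/Heaven/FS session, and the fake `passes_stress_test` below just models a card whose memory limit rises with AUX voltage, purely for illustration:

```python
# Illustration of the memory-clock stepping routine described above.
# passes_stress_test stands in for an actual benchmark run watched for artifacts.

def passes_stress_test(clock_mhz, aux_mv):
    """Fake card model: 1600MHz base limit, +40MHz of headroom per 25mV of AUX."""
    return clock_mhz <= 1600 + aux_mv * 40 / 25

def find_max_memory_clock(start_mhz=1500, step_mhz=25, max_aux_mv=75):
    clock, aux = start_mhz, 0
    while True:
        if passes_stress_test(clock, aux):
            clock += step_mhz              # stable: push another small step
        elif aux + 25 <= max_aux_mv:
            aux += 25                      # unstable: add 25mV AUX and retry
        else:
            return clock - step_mhz, aux   # out of headroom: back off one step

print(find_max_memory_clock())  # (1700, 75) for this fake card
```

The point is the shape of the loop, not the numbers: small increments, a test after every change, and voltage added only once you hit instability.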


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zack Foo*
> 
> so artifacts free on valley at desire clock speed. then let it run for 30 minute. and run fs for final inspection? Got it.
> 
> And when i turn the memory clock speed to 1750 my screen just go mental on me. right away right at the moment i set it.
> 
> 
> 
> Firstly, don't just go straight to 1750.... that's asking for a black screen and a reboot needed....
> 
> Move in increments of 20-50Mhz and test each time.
> 
> Then once you hit instability, you need to add about 25mv of AUX voltage and push until you lose stability again.
> Then add 25mv more and repeat.
> 
> Overclocking is a VERY time consuming process, but it does become enjoyable, and should not be looked at like a chore, but as a hobby.
Click to expand...

I really need to see if i can go past 1700Mhz on my memory at some point.....and i forgot about AUX voltage


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I really need to see if i can go past 1700Mhz on my memory at some point.....and i forgot about AUX voltage


It didn't help much at 1080P, but it really helped my mins at 4K....


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I really need to see if i can go past 1700Mhz on my memory at some point.....and i forgot about AUX voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It didn't help much at 1080P, but it really helped my mins at 4K....
Click to expand...

I overclock GPU's for fun and benches, when i game i'm usually at stock settings for the most part.....except The Witcher 3, i clocked one core on the 295x2 to 1150/1500 so i could stay above 60fps at 1440p until i got Crossfire support


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Firstly, don't just go straight to 1750.... that's asking for a black screen and a reboot needed....
> 
> Move in increments of 20-50Mhz and test each time.
> 
> Then once you hit instability, you need to add about 25mv of AUX voltage and push until you lose stability again.
> Then add 25mv more and repeat.
> 
> Overclocking is a VERY time consuming process, but it does become enjoyable, and should not be looked at like a chore, but as a hobby.


So artifacts are related to the core clock, not the memory? In FS I mean, cause in Heaven I was getting 1220/1715 but in FS now even 1900/1715 has artifacts.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I overclock GPU's for fun and benches, when i game i'm usually at stock settings for the most part.....except The Witcher 3, i clocked one core on the 295x2 to 1150/1500 so i could stay above 60fps at 1440p until i got Crossfire support


I run 1175/1750 daily at 50mv to be able to game at 45-60FPS at 4K in most titles.

I normally bench and test games at 1200 for fun... the 25mhz helps with a frame or two, but for the most part 1175 @ 50mv makes the most sense for me.

It certainly runs much better than stock with the overclock... especially at the higher resolution.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Zack Foo*
> 
> So artifacts is related to core clock not the memory? in fs i mean cause in heaven i was getting 1220/1715 but at fs now even 1900/1715 have artifacts.


Exactly, and that's why I always test FS









And to answer, the memory can cause artifacts too, but will normally flash a little before it does.

It's always better to find the max core by itself, before ever pushing the memory, since the core affects performance on these cards more than anything.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zack Foo*
> 
> So artifacts is related to core clock not the memory? in fs i mean cause in heaven i was getting 1220/1715 but at fs now even 1900/1715 have artifacts.
> 
> 
> 
> Exactly, and that's why I always test FS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And to answer, the memory can cause artifacts too, but will normally flash a little before it does.
> 
> It's always better to find the max core by itself, before ever pushing the memory, since the core affects performance on these cards more than anything.
Click to expand...

^ That, with the 512-bit bus the memory is plenty fast at stock speeds for the vast majority of things, Core clocks are king with Hawaii


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Exactly, and that's why I always test FS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And to answer, the memory can cause artifacts too, but will normally flash a little before it does.
> 
> It's always better to find the max core by itself, before ever pushing the memory, since the core affects performance on these cards more than anything.


The memory doesn't seem to overclock very well. +50 AUX at 1700 still has artifacts.


----------



## Mr.Pie

Quote:


> Originally Posted by *Zack Foo*
> 
> The memory seems not very good to overclock. 50 aux at 1700 still has artifacts.


FYI I'm at +50 AUX and only at 1600 memory. I could probably tweak it a bit higher, somewhere between 1600-1649, as 1650 is immediately unstable for me, but I don't have the patience nowadays


----------



## fenixfox

I have an intel i5 3570k overclocked to 4.2ghz

16gb ddr3 1600mhz ram

and of course the wonderful MSI R9 390


----------



## cfcboy

What kind of temps you guys getting with CF on these?



The first card always creeps up to 94C and sits there under 100% load, whereas the bottom GPU happily sits around 70-ish under load.
When idle they both sit at around 40C. Most load testing is playing Witcher 3 @ 4K high.

I've tried different combos such as swapping them around, but the top card still hits 94C. My rig is as follows:

I've tried it without the case side door on and it's exactly the same, so I'm wondering if it's an airflow issue or, more obviously, the heat not being properly dispersed from the top card.

Any advice would be greatly appreciated


----------



## Mr.Pie

Quote:


> Originally Posted by *cfcboy*
> 
> What kind of temps are you guys getting with CF on these?
> 
> The first card always creeps up to 94C and sits there under 100% load, whereas the bottom GPU happily sits around 70ish under load.
> When idle they both sit at around 40C. Most load testing is playing Witcher 3 @ 4K high.
> 
> I've tried different combos such as swapping them around, but the top card still hits 94C. My rig is as follows:
> 
> I've tried it without the case side door on and it's exactly the same, so I'm wondering if it's an airflow issue or, more obviously, the heat not being dispersed properly from the top card.
> 
> Any advice would be greatly appreciated


Not enough airflow to those cards, both exhaust and intake.
You could make some sort of shroud or strap a fan on. That's probably the only thing you can do, short of being able to space the cards further apart.


----------



## cfcboy

I've got 4 120mm fans lying around and 2 extra case fans that I could utilise. Where would be the best place to put them?


----------



## BradleyW

Sorry to bother you all. Just hoping a 390X owner could bench heaven 4.0 - stock GPU speeds, 1080p - Ultra - Tess MAX - 0AA please? I'm trying to find out if AMD are cutting back Tess performance via drivers for the 290X owners.


----------



## jackalopeater

Quote:


> Originally Posted by *BradleyW*
> 
> Sorry to bother you all. Just hoping a 390X owner could bench heaven 4.0 - stock GPU speeds, 1080p - Ultra - Tess MAX - 0AA please? I'm trying to find out if AMD are cutting back Tess performance via drivers for the 290X owners.


Here ya go

Asus DCUII 390x
FX 8350 stock
8gb 2400mhz ram

http://www.overclock.net/content/type/61/id/2566828/



http://imgur.com/mYN6O4s


----------



## BradleyW

Quote:


> Originally Posted by *jackalopeater*
> 
> Here ya go
> 
> Asus DCUII 390x
> FX 8350 stock
> 8gb 2400mhz ram
> 
> http://www.overclock.net/content/type/61/id/2566828/
> 
> 
> 
> http://imgur.com/mYN6O4s


Thank you.

My clocks are 1000/1250 (CCC 15.7.1.) and we got the same score. Interesting.


----------



## MTDEW

Quote:


> Originally Posted by *atlas3686*
> 
> Solved this issue: it turns out it had nothing to do with the GPU, it was the Asus Xonar DX sound card... no Win 10 drivers, or Win 8 for that matter. Asus really are the worst.


Go HERE and scroll down to the section labeled *Driver Downloads (for Windows only):*, and grab the first driver listed (it's labeled *UNi-Xonar-1822-v1.75a-r2.exe*).
They're the best Xonar drivers you'll get for Win10.


----------



## jackalopeater

Quote:


> Originally Posted by *BradleyW*
> 
> Thank you.
> 
> My clocks are 1000/1250 (CCC 15.7.1.) and we got the same score. Interesting.


If you're running it on the rig in your sig, it typically scores a little higher with Intel CPUs, so that seems in line. Guess there's no gimping. I'm using the same drivers in Win10 as you.


----------



## CamsX

@MTDEW

Thx for this tip. I want to continue using my Asus card when my new rig arrives and I install Win10.


----------



## Mr.Pie

Quote:


> Originally Posted by *cfcboy*
> 
> I've got 4 120mm fans lying around and 2 extra case fans that I could utilise. Where would be the best place to put them?


You could try strapping one to the end of the cards to see if blowing air in between helps.


----------



## JohnnyMoore

Quote:


> Originally Posted by *cfcboy*
> 
> What kind of temps are you guys getting with CF on these?
> 
> The first card always creeps up to 94C and sits there under 100% load, whereas the bottom GPU happily sits around 70ish under load.
> When idle they both sit at around 40C. Most load testing is playing Witcher 3 @ 4K high.
> 
> I've tried different combos such as swapping them around, but the top card still hits 94C. My rig is as follows:
> 
> I've tried it without the case side door on and it's exactly the same, so I'm wondering if it's an airflow issue or, more obviously, the heat not being dispersed properly from the top card.
> 
> Any advice would be greatly appreciated







85/75 max temp


----------



## Zack Foo

Quote:


> Originally Posted by *JohnnyMoore*
> 
> 
> 
> 
> 
> 
> 85/75 max temp


You, sir, are rocking a dream I hope will one day come true for me. So beautiful.

And yeah, maybe a fan blowing in the middle, like from the front? It should help air move between the cards so the top card gets enough airflow. It also looks like your case lacks airflow overall; a few more fans or a bigger case should do the trick.

More exhaust than intake could help too, since you'd clear out hot air faster, but dust build-up will be a problem.


----------



## Duke976

Quote:


> Originally Posted by *cfcboy*
> 
> What kind of temps are you guys getting with CF on these?
> 
> The first card always creeps up to 94C and sits there under 100% load, whereas the bottom GPU happily sits around 70ish under load.
> When idle they both sit at around 40C. Most load testing is playing Witcher 3 @ 4K high.
> 
> I've tried different combos such as swapping them around, but the top card still hits 94C. My rig is as follows:
> 
> I've tried it without the case side door on and it's exactly the same, so I'm wondering if it's an airflow issue or, more obviously, the heat not being dispersed properly from the top card.
> 
> Any advice would be greatly appreciated


With very limited space between them it's really hard for the cards to breathe, which gives you the 94C temps, same as I had before.



http://imgur.com/4BOWoPk


I was forced to change my mobo from a Sabertooth X79 to a RIVE so that I'd have enough clearance for the cards to breathe.



http://imgur.com/zikaROQ


Here's the actual temp that I got after replacing the tim on the top card.



http://imgur.com/a3xRYi2


Also, changing the TIM on the top card helped a bit and lowered the temp by a couple of degrees. You can also adjust the fan curve to help lower your temps.
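The fan-curve suggestion above boils down to interpolating between (temperature, fan %) points, which is what tools like MSI Afterburner do under the hood. A minimal sketch; the curve points below are made-up examples, not anyone's actual settings:

```python
# Example fan curve: (temp in C, fan speed in %) points.
# These values are illustrative, not a recommended curve.
CURVE = [(40, 30), (70, 55), (85, 85), (94, 100)]

def fan_speed(temp, curve=CURVE):
    """Linearly interpolate fan speed for a given GPU temperature."""
    if temp <= curve[0][0]:
        return curve[0][1]
    if temp >= curve[-1][0]:
        return curve[-1][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp <= t1:
            return s0 + (s1 - s0) * (temp - t0) / (t1 - t0)

print(fan_speed(77.5))  # midway between 70C and 85C -> 70.0
```

Making the curve steeper in the 80-94C region is the usual trade: more noise under load in exchange for keeping the top card off its 94C throttle point.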


----------



## Agent Smith1984

I'll get noobs added Monday, on vaca with the fam in south myrtle!


----------



## Zack Foo

Quote:


> Originally Posted by *Duke976*
> 
> With very limited space between them it's really hard for the cards to breathe, which gives you the 94C temps, same as I had before.
> 
> 
> 
> http://imgur.com/4BOWoPk
> 
> 
> I was forced to change my mobo from a Sabertooth X79 to a RIVE so that I'd have enough clearance for the cards to breathe.
> 
> 
> 
> http://imgur.com/zikaROQ
> 
> 
> Here's the actual temp that I got after replacing the tim on the top card.
> 
> 
> 
> http://imgur.com/a3xRYi2
> 
> 
> Also, changing the TIM on the top card helped a bit and lowered the temp by a couple of degrees. You can also adjust the fan curve to help lower your temps.


Wow, that's a lot of space to give up just to let the cards breathe. The MSI card is a little too thick, I guess.


----------



## Cannon19932006

Quote:


> Originally Posted by *Zack Foo*
> 
> Wow, that's a lot of space to give up just to let the cards breathe. The MSI card is a little too thick, I guess.


They are Tri slot cards.


----------



## Zack Foo

Quote:


> Originally Posted by *Cannon19932006*
> 
> They are Tri slot cards.


Right after my reply to you I looked at my motherboard (Z97 Pro Gamer) and I already have a real bad feeling about this. I'm using the lower PCIe 3.0 x16 slot, and there's only one more slot's worth of clearance (3 screws, the ones on the case that hold the GPU) above the card.


----------



## amdf4n

I have a very strange problem with my MSI R9 390.









After reading the card's temperatures I noticed that my second VRM sensor doesn't show the correct temperature. The reading never changes whether I game or just idle on the desktop; it always stays at 56°C.

*Here are some screenshots:*




Now my question is whether those readings are normal and whether you guys have the same problem.

Big thanks in advance!


----------



## Cannon19932006

Quote:


> Originally Posted by *amdf4n*
> 
> I have a very strange problem with my MSI R9 390.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After reading the card's temperatures I noticed that my second VRM sensor doesn't show the correct temperature. The reading never changes whether I game or just idle on the desktop; it always stays at 56°C.
> 
> *Here some screenshots:*
> 
> 
> 
> 
> Now my question is, if those readings are normal and if you guys have the same problem.
> 
> Big thanks in advance!


Can confirm VRM 2 doesn't do jack.
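One way to double-check that a reading like this is a dead sensor rather than a genuinely stable temperature is to log it across an idle-to-full-load session (GPU-Z and HWiNFO can both export logs) and test whether it ever moves. A minimal sketch; the helper name and the sample readings are mine, purely illustrative:

```python
def looks_stuck(readings, tolerance=0.5):
    """A sensor that never moves more than `tolerance` C across an
    idle-to-full-load session is almost certainly not reporting real data."""
    return max(readings) - min(readings) <= tolerance

# Illustrative log samples taken from idle through a gaming session
vrm1 = [48.0, 62.0, 81.0, 88.0, 79.0, 55.0]  # tracks load -> real sensor
vrm2 = [56.0, 56.0, 56.0, 56.0, 56.0, 56.0]  # flat at 56C -> stuck

print(looks_stuck(vrm1), looks_stuck(vrm2))  # False True
```

A real sensor can sit flat at idle, so the log only proves anything if it covers a heavy load period too.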


----------



## bazookatooths

Why don't you just move that fan down to the video cards, or add one? Also add another internal front fan.


----------



## Sgt Bilko

@Agent Smith1984



That's 1150/1500 with the CPU at 5.0. I forgot to change it down to 4.8, but we can work off the graphics score if need be.


----------



## Dorland203

This is my score with the CPU at 4.5 Ghz and GPU at 1150/1625.

I set Tessellation to 8X in CCC in all applications.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dorland203*
> 
> This is my score with the CPU at 4.5 Ghz and GPU at 1150/1625.
> 
> I set Tessellation to 8X in CCC in all applications.


Check combined score









and mine is with Tess on default, it's easier when doing a straight up comparison


----------



## Dorland203

Ran it again with Tess set to default.

Same graphics score as yours.








Anyway, I prefer to set Tess to 8X to get better performance with the same image quality.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dorland203*
> 
> Run again with tess set to default
> 
> Same graphic score as yours.
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, I prefer to set Tess to 8X to get better performance with the same image quality.


Yep, I run my games with Tess turned down, but for obvious reasons when I'm benching and doing gaming tests I leave everything on default.

Besides, the fewer settings changed, the easier it is to compare scores, and I did that because we are trying to work out the performance difference between the 390s and 390Xs at different clock speeds.


----------



## cfcboy

Quote:


> Originally Posted by *Zack Foo*
> 
> Wow, that's a lot of space to give up just to let the cards breathe. The MSI card is a little too thick, I guess.


Yes, I have to agree there... I'm not sure there's a mobo out there that gives them enough clearance to fully breathe.

I managed to partly solve the issue. I placed another fan at the front to increase intake, then I actually placed two fans in a pull configuration on top of the first GPU to pull some heat off the card and move it up. I placed a fan at the top of the case as well and cleaned up some cabling.

No matter what I do, Witcher 3 hits the 90C range on that top card, but now only after a much longer period. Since it's not there for very long, I can live with it. All other games seem fine and idle is 35-40C.


----------



## bazookatooths

Quote:


> Originally Posted by *cfcboy*
> 
> Yes, I have to agree there... I'm not sure there's a mobo out there that gives them enough clearance to fully breathe.
> 
> I managed to partly solve the issue. I placed another fan at the front to increase intake, then I actually placed two fans in a pull configuration on top of the first GPU to pull some heat off the card and move it up. I placed a fan at the top of the case as well and cleaned up some cabling.
> 
> No matter what I do, Witcher 3 hits the 90C range on that top card, but now only after a much longer period. Since it's not there for very long, I can live with it. All other games seem fine and idle is 35-40C.


Here's the way I would set up fans in that case.
First I would turn the CPU fan front-to-rear, so it's not pulling that hot air from the GPU into it; get rid of that heat pocket over the top card.

Then I would add a side intake and another intake on the bottom in front of the PSU, so you are constantly feeding the two beasts with fresh cool air.

INTAKE: 2x Front, 1x Bottom, 1x Side
EXHAUST: 1x Rear, 2x Top

For the rear fan, I would put it behind the CPU or GPU (if you don't have open slots to create a positive-pressure escape for the air), or both; I would test all possible combinations of the rear fan.

I'm sure this setup will net you 7-10C cooler on the card.

Keep in mind I have a 970 chipset and an FX-8120 OC'd, so my case runs extremely HOT and this setup works great. I doubt you even need as much cooling as I have. The key would be getting rid of that heat pocket at all costs!
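As a rough sanity check of a layout like the one above, you can tally intake against exhaust to see which way the case pressure leans. The per-fan CFM figure below is a made-up ballpark for a generic 120mm fan, not a measured spec:

```python
# Fan counts taken from the suggested layout; CFM_PER_FAN is a rough
# placeholder for a generic 120mm fan, not a measured value.
CFM_PER_FAN = 55
intake  = {"front": 2, "bottom": 1, "side": 1}
exhaust = {"rear": 1, "top": 2}

net = (sum(intake.values()) - sum(exhaust.values())) * CFM_PER_FAN
pressure = "positive" if net > 0 else "negative" if net < 0 else "neutral"
print(f"net airflow {net:+d} CFM -> {pressure} pressure")
```

With one more intake than exhaust the case leans slightly positive, which also helps with dust since air escapes through the open slot vents instead of being sucked in through every crack.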


----------



## Zack Foo

Quote:


> Originally Posted by *cfcboy*
> 
> Yes, I have to agree there... I'm not sure there's a mobo out there that gives them enough clearance to fully breathe.
> 
> I managed to partly solve the issue. I placed another fan at the front to increase intake, then I actually placed two fans in a pull configuration on top of the first GPU to pull some heat off the card and move it up. I placed a fan at the top of the case as well and cleaned up some cabling.
> 
> No matter what I do, Witcher 3 hits the 90C range on that top card, but now only after a much longer period. Since it's not there for very long, I can live with it. All other games seem fine and idle is 35-40C.


Let's hope I don't need to change my motherboard then; that would suck so bad.


----------



## sinholueiro

So, seeing that I can only get a full waterblock for an Asus, Club3D, PowerColor or VTX3D card, I was looking at one of those. I'll have to spend some months on the stock cooler, so the Asus is out. The Club3D isn't available in my country, which leaves the PowerColor or the VTX. The PowerColor seems cool but loud; on the other hand, I couldn't find any information about the VTX. Which brand do you recommend?


----------



## By-Tor

PowerColor would be my first choice over any of the other brands that released a 390/390X, but I've also owned the past 4 gens of PowerColor cards and have nothing bad to say about any of them.

My 2 cents


----------



## Derek129

Does anyone else on here have DiRT Rally, and what graphics settings are you using? My game plays like absolute **** no matter what: high or ultra settings, no AA to max AA, v-sync on or off. I can't seem to find anything that helps; it plays like hell.


----------



## CamsX

Quote:


> Originally Posted by *Derek129*
> 
> Does anyone else on here have DiRT Rally, and what graphics settings are you using? My game plays like absolute **** no matter what: high or ultra settings, no AA to max AA, v-sync on or off. I can't seem to find anything that helps; it plays like hell.


On DiRT Rally, set Shaders to low or the 2nd lowest. That's the setting that kills performance in that game. If you set it to lowest you can max out everything else, even AA @1440p.


----------



## Derek129

Quote:


> Originally Posted by *CamsX*
> 
> On Dirt Rally set Shaders to low or 2nd lowest. That's the setting that kills performance in that game. If you set it to lowest you can max out everything else, even AA @1440p


Thank you so much for the reply, I will give this a try the next time I run the game! Could it also be that my FX-8320 bottlenecks my GPU at all? I don't have too many other problems with games, maybe the occasional FPS drop here and there; GTA V plays beautifully, but I can't understand why DiRT Rally would play like such hell.


----------



## ninjamaster

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131671

http://www.techpowerup.com/gpuz/details.php?id=5nuzx


----------



## CamsX

Quote:


> Originally Posted by *Derek129*
> 
> Thank you so much for the reply, I will give this a try the next time I run the game! Could it also be that my FX-8320 bottlenecks my GPU at all? I don't have too many other problems with games, maybe the occasional FPS drop here and there; GTA V plays beautifully, but I can't understand why DiRT Rally would play like such hell.


No, I don't think that's the case. I run my 390 on an 8350 and I can average 90+ fps in DiRT Rally without any problems. Trust me on the shaders bit; I tested a lot of different settings in that game. I'm glad it has a fairly consistent benchmark mode.


----------



## cfcboy

Ok
Quote:


> Originally Posted by *bazookatooths*
> 
> Here's the way I would set up fans in that case.
> First I would turn the CPU fan front-to-rear, so it's not pulling that hot air from the GPU into it; get rid of that heat pocket over the top card.
> 
> Then I would add a side intake and another intake on the bottom in front of the PSU, so you are constantly feeding the two beasts with fresh cool air.
> 
> INTAKE: 2x Front, 1x Bottom, 1x Side
> EXHAUST: 1x Rear, 2x Top
> 
> For the rear fan, I would put it behind the CPU or GPU (if you don't have open slots to create a positive-pressure escape for the air), or both; I would test all possible combinations of the rear fan.
> 
> I'm sure this setup will net you 7-10C cooler on the card.
> 
> Keep in mind I have a 970 chipset and an FX-8120 OC'd, so my case runs extremely HOT and this setup works great. I doubt you even need as much cooling as I have. The key would be getting rid of that heat pocket at all costs!


OK, cool (no pun intended). So when you say switch the CPU fan round, do you mean so it's facing the rear case fan, or pointing towards the top of the case? Thanks again.


----------



## tangelo

Quote:


> Originally Posted by *cfcboy*
> 
> Ok
> Ok cool (no pun intended ?). So when you mean switch the CPU fan round do you mean so it's facing the rear casing fan or pointing towards top of the case? Thanks again


Rotate the heatsink with the fan 90 degrees counter clockwise.


----------



## bazookatooths

Quote:


> Originally Posted by *cfcboy*
> 
> Ok
> Ok cool (no pun intended ?). So when you mean switch the CPU fan round do you mean so it's facing the rear casing fan or pointing towards top of the case? Thanks again


So the heatsink sits vertically and the fan pushes air through it towards the rear. You can look at my rig for an example. This will promote a flow from front-bottom to top-rear and get rid of that hot air just swirling around the top of the card.


----------



## Mr.Pie

15.8 beta drivers are out

anyone tried em out?

edit: Link - http://www.overclock.net/t/1571920/amd-catalyst-15-8-beta-available-for-download/0_40#post_24363057


----------



## Mysticking32

New beta drivers released! 15.8, testing now. http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## MrOldboy

Can anyone tell me what the difference, besides the factory OC, is between these two gigabyte cards?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125805&cm_re=r9_390-_-14-125-805-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125792&cm_re=r9_390-_-14-125-792-_-Product

The G1 card is shorter, so my main question is cooling. The Gigabyte site lists both as 234mm, so the Newegg listing for the GN version is wrong. Is the cooler the same between the two?

Also, has a fix for the PC restart issue been found?

I've been looking at the R9 390 over the 970 recently, and I'm hoping someone here owns the non-G1 Gigabyte card and can give input. I can't find a review of that card either.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MrOldboy*
> 
> Can anyone tell me what the difference, besides the factory OC, is between these two gigabyte cards?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125805&cm_re=r9_390-_-14-125-805-_-Product
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125792&cm_re=r9_390-_-14-125-792-_-Product
> 
> The G1 card is shorter so my main question is cooling. Is it the same between the two?
> 
> And also if a fix for the pc restart issue has been found?
> 
> Been looking at the R9 390 over the 970 recently and hoping that someone here maybe owns the non G1 gigabyte card and can give input. I can't find a review for the card either.


Not sure on the questions you have, but I don't know why anyone would buy the G1 right now unless they don't intend to overclock past 1080 or so... these cards are volt-locked!

Not to mention the cooling is average, the binning is average...

Not to be negative at all, and I know availability/pricing is an issue outside of the States, but I will continue to recommend XFX, MSI, and even Sapphire when looking for one of these cards.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Mysticking32*
> 
> New Beta Drivers Released! 15.8 Testing now.http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


Please advise on drivers ASAP, I'm recovering from the Myrtle Beach trip.


----------



## MrOldboy

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not sure on the questions you have, but I don't know why anyone would buy the G1 right now unless they don't intend to overclock past 1080 or so... these cards are volt-locked!
> 
> Not to mention the cooling is average, the binning is average...
> 
> Not to be negative at all, and I know availability/pricing is an issue outside of the States, but I will continue to recommend XFX, MSI, and even Sapphire when looking for one of these cards.


It's simply down to price: the Gigabyte GN can be had for $275, which is at least $30 less than the other cards you mentioned.

I have never OCed a video card to be honest so that isn't a big deal to me.

Are the XFX, MSI and other cards worth the extra money if you don't plan to OC? Even keeping in mind reselling the card down the line, would the Gigabyte hold less value?

How are the temps of the cards you recommend compared to the gigabyte? That is a concern of mine and if the other cards run cooler I'd be willing to spend the extra money.


----------



## Mysticking32

New drivers seem stable after testing The Witcher 3. FPS seems more stable and consistent now (playing at 1440p on High settings), staying in the 50-60 range rather than the mid-40s to 60s (frame rate control set to 60). I'm not sure why the drivers improved this game, since the notes say the only performance optimizations are for Batman and Ashes of the Singularity.

I'll test GTA 5 next.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MrOldboy*
> 
> It's simple due to price. The Gigabyte GN can be had for $275 which is at least $30 less than the other cards you mentioned.
> 
> I have never OCed a video card to be honest so that isn't a big deal to me.
> 
> Are the XFX, MSI and other cards worth the extra money if you don't plan to OC? Even keeping in mind reselling the card down the line. Would the gigabyte hold less value?
> 
> How are the temps of the cards you recommend compared to the gigabyte? That is a concern of mine and if the other cards run cooler I'd be willing to spend the extra money.


Well, if overclocking doesn't matter, then the temp difference really won't stack up either. Get the cheapest card and roll with it...

Resale value will be lower though, because people will know it has very limited OC ability, and when buying a used card people want overclocking headroom more than ever... Though if you plan on keeping it more than 2 years, the resale value will probably only be $10-20 lower than other "old" 390s...


----------



## MrOldboy

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, if overclocking doesn't matter, then the temp difference really won't stack up either. Get the cheapest card and roll with it...
> 
> Resale value will be lower though, because people will know it has very limited OC ability, and when buying a used card people want overclocking headroom more than ever... Though if you plan on keeping it more than 2 years, the resale value will probably only be $10-20 lower than other "old" 390s...


I plan on keeping this card for a while probably more than 3 years. I've had my GTX 480 for 3.5 years now. I'd be willing to spend more on say the MSI if there is any difference in the temps compared to the gigabyte. Living with 94c for years has made me want an upgrade badly. I initially looked at the GTX 960 as people said it was very efficient and ran cool, but slowly I was convinced to spend more and get a card that would last longer.

I'm looking at the MSI here http://www.newegg.com/Product/Product.aspx?Item=N82E16814127874&cm_re=r9_390-_-14-127-874-_-Product

Is the lower factory clock the only difference with the LE? They're the same price.

So the MSI for $305 or the Gigabyte for $275 are my two options right now. I'll have to see if the MSI brings enough benefit to me personally to spend an extra 10%.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MrOldboy*
> 
> I plan on keeping this card for a while probably more than 3 years. I've had my GTX 480 for 3.5 years now. I'd be willing to spend more on say the MSI if there is any difference in the temps compared to the gigabyte. Living with 94c for years has made me want an upgrade badly. I initially looked at the GTX 960 as people said it was very efficient and ran cool, but slowly I was convinced to spend more and get a card that would last longer.
> 
> I'm looking at the MSI here http://www.newegg.com/Product/Product.aspx?Item=N82E16814127874&cm_re=r9_390-_-14-127-874-_-Product
> 
> Is the lower factory clock the only difference with the LE? They're the same price.
> 
> So the MSI for $305 or the Gigabyte for $275 are my two options right now. I'll have to see if the MSI brings enough benefit to me personally to spend an extra 10%.


Wait, are we talking the same currency here? The MSI is $330 USD... if OC doesn't matter to you, then just get the cheapest card..... the gigabyte at $299 USD


----------



## MrOldboy

Newegg has $25 off with Visa Checkout that expires today. Sorry, I should have mentioned that.

But like I said, I can be convinced to buy the MSI if the cooling is better or quieter compared to the Gigabyte. There's just very little info and few reviews out there on the Gigabyte, which is why I came to ask here.

Do you know about the Gigabyte restarting-PC thing? Has that been resolved?


----------



## Agent Smith1984

Quote:


> Originally Posted by *MrOldboy*
> 
> Newegg has $25 off with visa checkout that expires today. Sorry, should have mentioned.
> 
> But, like I said I can be convinced to buy the MSI if the cooling is better or quieter compared to the gigabyte. Just very little info/reviews out there in terms the gigabyte which is why I came to ask here.
> 
> Do you know about the gigabyte restarting pc thing? Has that been resolved?


I know nothing of a restart issue, but very few people on this particular site buy the Giga card because of the voltage lock. This is a site that delves a lot into overclocking...

If you truly don't want to overclock, then a few degrees C isn't going to matter, man. Get the cheapest card possible!

$275 is an awesome price for a 390 in my book.

Then again, $305 is even better for a card that does 1170+ on the core pretty consistently... but again, that's just an old overclocker's perspective...


----------



## MrOldboy

Does an overclock around 1150 on this card show meaningful results in games? I've never really overclocked a GPU before since the benefits didn't seem that great, although my GTX 480 already ran so hot I don't even know if it could be overclocked.

I'll think it over, but as of now I am leaning a bit more towards the MSI since the gigabyte is a lesser known quantity right now. Reading through the thread a bit and reading your comments it seems like the MSI is a very good card. The gigabyte might be as good w/o overclocking, but I can't tell yet since info is scarce.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> @Agent Smith1984
> 
> 
> 
> That's 1150/1500 with the CPU at 5.0. I forgot to change it down to 4.8, but we can work off the graphics score if need be.


Not the slightest clue why the combined score is even worse than usual (such a joke the way it uses AMD CPUs anyway), but the graphics score is close enough, and the CPU (my usual 4.9 "winter clocks") is edging yours by a smidge...

http://www.3dmark.com/3dm/8412037

Looks like 1200 on a 390 is right in line with a 390X @ 1150...

Now let's see what memory clock it takes to get a 390 level with a 390X at the same core speed of 1150MHz.
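For anyone following these comparisons, the relative-performance math is just a ratio of graphics scores. A minimal sketch (the helper name is mine) using the stock FireStrike graphics scores quoted earlier in the thread, roughly 10,800 for a stock 290 Tri-X and 12,500 for the stock MSI 390:

```python
def pct_faster(score_a, score_b):
    """Percent by which score_a beats score_b."""
    return (score_a / score_b - 1.0) * 100.0

# Stock FireStrike graphics scores quoted earlier in the thread
stock_290_trix = 10800
stock_390_msi  = 12500

print(f"390 vs 290 Tri-X: {pct_faster(stock_390_msi, stock_290_trix):.1f}% faster")
```

The same helper works for the 390-vs-390X clock-for-clock comparisons being run here; just swap in the two graphics scores being compared.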


----------



## MrOldboy

Is there any difference between the MSI Gaming and LE models? The Gaming is OOS on Newegg now, but the LE is still in stock. Is it only the 50MHz factory clock that's different?

Also I've seen the XFX model mentioned a lot in this thread. How would you rank it to the MSI?


----------



## Derek129

Quote:


> Originally Posted by *CamsX*
> 
> No, I don't think thats the case. I run my 390 on a 8350 and I can avg 90+ fps in Dirt Rally without any problems. Trust me on the shaders bit, I tested a lot of different settings on that game. I'm glad it has a fairly consistent benchmark mode.


Dude I absolutely cannot thank you enough, you are the man! I turned my shaders to low and with everything else at max/ultra it plays absolutely amazing!! Thank you!


----------



## Evan20x

I have a Sapphire Nitro R9 390, upgraded from a 7850. This new card feels like it isn't performing the way I'd expect, but I'm not sure.


Is it supposed to work like this on newer AMD cards? Constantly spiking from 0 to near 100% while in game? I've even seen my VDDC go above 1.2v.

I'm not sure anymore; I'm feeling a bit uneasy about it. I feel like the performance is lacking. Should I RMA it, or just return it altogether?


----------



## Gumbi

Quote:


> Originally Posted by *Evan20x*
> 
> I have a Sapphire Nitro R9 390, upgraded from a 7850. This new card feels like it isn't performing the way I'd expect, but I'm not sure.
> 
> Is it supposed to work like this on newer AMD cards? Constantly spiking from 0 to near 100% while in game? I've even seen my VDDC go above 1.2v.
> 
> I'm not sure anymore; I'm feeling a bit uneasy about it. I feel like the performance is lacking. Should I RMA it, or just return it altogether?


If the game is CPU-bound, at some points your GPU will have less work to do; those drops are those points.

What game is this from? Can you do a run of the Heaven benchmark and report your FPS result? Do a 1080p run at max settings.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> If the game is CPU-bound, at some points your GPU will have less work to do; those drops are those points.
> 
> What game is this from? Can you do a run of the Heaven benchmark and report your FPS result? Do a 1080p run at max settings.


Looks like a bottleneck is going on somewhere... this does look odd.

Interesting thought... I have a new 290X Sapphire Tri-X, and guess what? My VRM2 is hotter than VRM1, just like this one. I was already wondering if the new 290X is really just the 390X; meaning they made a few changes to the board, such as the VRMs, of which there are fewer on VRM2 than before, so now VRM2 runs hotter than VRM1 did on the older 290X version of Sapphire's Tri-X. His graph above looks like what happens on mine, and no matter what I do I cannot get VRM2 lower than VRM1. Not that it really matters, but it does make me think they updated the board for the 390X and went ahead and called the "new 290X" revised... when it's really the exact same thing with a different BIOS and RAM? Maybe, I don't know; it seems possible though.


----------



## Gumbi

Quote:


> Originally Posted by *battleaxe*
> 
> Looks like a bottleneck is going on somewhere... this does look odd.
> 
> Interesting thought... I have a new 290X Sapphire Tri-X, and guess what? My VRM2 is hotter than VRM1, just like this one. I was already wondering if the new 290X is really just the 390X; meaning they made a few changes to the board, such as the VRMs, of which there are fewer on VRM2 than before, so now VRM2 runs hotter than VRM1 did on the older 290X version of Sapphire's Tri-X. His graph above looks like what happens on mine, and no matter what I do I cannot get VRM2 lower than VRM1. Not that it really matters, but it does make me think they updated the board for the 390X and went ahead and called the "new 290X" revised... when it's really the exact same thing with a different BIOS and RAM? Maybe, I don't know; it seems possible though.


They might have redesigned their PCB, yeah. How do you find the new Tri-X? On paper it seems like a Tri-X (great cooling etc.) with some of the old issues ironed out (hot VRMs, rattling fans etc.).

How do you find the core/VRM cooling? Does it overclock well for you?


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> They might have redesigned their PCB, yeah. How do you find the new Tri-X? On paper it seems like a Tri-X (great cooling, etc.) with some of the old issues ironed out (hot VRMs, rattling fans, etc.).
> 
> How do you find the core/VRM cooling? Does it overclock well for you?


This is the new Tri-X; I got it when it went on sale for $260 with rebate. http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145

From what I've seen on this forum, mine overclocks very similarly to a 390X, on the core at least. Meaning, it cannot take as much voltage as before, and once it's done, no amount of voltage will help, but it pretty easily goes above 1200MHz on the core.

Mine has Samsung memory too; I'm not sure what's in the 390X's. So while my memory won't go quite as high as some 390Xs, I have yet to see a 390X beat my Heaven score of 1657. (I'm sure some are higher, I just haven't seen it here yet.) So I'm really happy with my card, as it seems to overclock pretty well (1235 for benching)... just some things I noticed are different from the old 290Xs: VRM1 always being hotter than VRM2 on these, and the voltage ceiling, of course.


----------



## Evan20x

Quote:


> Originally Posted by *battleaxe*
> 
> Looks like a bottleneck is going on somewhere... this does look odd.


I have an i5-3470 so there shouldn't be a bottleneck, unless my motherboard is doing that which I doubt it is. It's running at PCI-E 3.0 x16 speeds. ASRock H77M is the mobo.


----------



## Gumbi

Quote:


> Originally Posted by *Evan20x*
> 
> I have an i5-3470 so there shouldn't be a bottleneck, unless my motherboard is doing that which I doubt it is. It's running at PCI-E 3.0 x16 speeds. ASRock H77M is the mobo.


Depends on the game; some games are very CPU dependent.

Also, would you please provide numbers for the Heaven benchmark so we can determine whether your GPU is underperforming or not...


----------



## Gumbi

Quote:


> Originally Posted by *battleaxe*
> 
> This is the new Tri-X; I got it when it went on sale for $260 with rebate. http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145
> 
> From what I've seen on this forum, mine overclocks very similarly to a 390X, on the core at least. Meaning, it cannot take as much voltage as before, and once it's done, no amount of voltage will help, but it pretty easily goes above 1200MHz on the core.
> 
> Mine has Samsung memory too; I'm not sure what's in the 390X's. So while my memory won't go quite as high as some 390Xs, I have yet to see a 390X beat my Heaven score of 1657. (I'm sure some are higher, I just haven't seen it here yet.) So I'm really happy with my card, as it seems to overclock pretty well (1235 for benching)... just some things I noticed are different from the old 290Xs: VRM1 always being hotter than VRM2 on these, and the voltage ceiling, of course.


1235 for benching is superb tbh.

How much voltage are you putting through it for that








I recently got a new fan setup and can run my 1175/1600mhz overclock at 75mv with ease now.

1200/1600 at 100mv crashes the display driver within 5 mins of Heaven though :/ I have yet to pump more voltage through it via TriXX. With fans at 100%, my VRMs didn't break 64 degrees (VRM2 in the 50s), and my core was 76 degrees. It should be noted that my central fan wasn't even spinning, as it is currently not functional, so once I have that fixed I'll have no problem putting 200mv through it and trying to break 1200MHz!

Gotta love dem 290s


----------



## Evan20x

Quote:


> Originally Posted by *Gumbi*
> 
> Depends on the game; some games are very CPU dependent.
> 
> Also, would you please provide numbers for the Heaven benchmark so we can determine whether your GPU is underperforming or not...


That pic is of me playing ArcheAge so I'm not sure.


----------



## Gumbi

Quote:


> Originally Posted by *Evan20x*
> 
> That pic is of me playing ArcheAge so I'm not sure.


How can we address performance issues if you don't provide us with proper data?

Would you please provide the FPS you get during a bench run of maxed Heaven. (google and download "heaven benchmark").


----------



## Evan20x

Quote:


> Originally Posted by *Gumbi*
> 
> How can we address performance issues if you don't provide us with proper data?
> 
> Would you please provide the FPS you get during a bench run of maxed Heaven. (google and download "heaven benchmark").


By maxed heaven you mean the extreme preset?

In the mean time here is a shot of gpu-z on BF4


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> 1235 for benching is superb tbh.
> 
> How much voltage are you putting through it for that
> 
> 
> 
> 
> 
> 
> 
> 
> I recently got a new fan setup and can run my 1175/1600mhz overclock at 75mv with ease now.
> 
> 1200/1600 at 100mv crashes the display driver within 5 mins of Heaven though :/ I have yet to pump more voltage through it via TriXX. With fans at 100%, my VRMs didn't break 64 degrees (VRM2 in the 50s), and my core was 76 degrees. It should be noted that my central fan wasn't even spinning, as it is currently not functional, so once I have that fixed I'll have no problem putting 200mv through it and trying to break 1200MHz!
> 
> Gotta love dem 290s


1235mhz core
1665mhz Mem
100mv
+25mv Aux

Set like this:

core is 63°C max
VRM1 72°C max
VRM2 76°C max (and I meant to say VRM2 is always, always hotter than VRM1, no matter what I do to the settings)

Heaven set to max everything at the 1080p setting.


----------



## kalidae

Can I please join the club? Sapphire Nitro 390. I love the card, and in my Define R4 I never hear the fans spinning, not at all. I was sad to open the box and find no backplate, though.


Quote:


> Originally Posted by *battleaxe*
> 
> This is the new Tri-X; I got it when it went on sale for $260 with rebate. http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145
> 
> From what I've seen on this forum, mine overclocks very similarly to a 390X, on the core at least. Meaning, it cannot take as much voltage as before, and once it's done, no amount of voltage will help, but it pretty easily goes above 1200MHz on the core.
> 
> Mine has Samsung memory too; I'm not sure what's in the 390X's. So while my memory won't go quite as high as some 390Xs, I have yet to see a 390X beat my Heaven score of 1657. (I'm sure some are higher, I just haven't seen it here yet.) So I'm really happy with my card, as it seems to overclock pretty well (1235 for benching)... just some things I noticed are different from the old 290Xs: VRM1 always being hotter than VRM2 on these, and the voltage ceiling, of course.


What resolution are you running the Heaven benchmark at? Sapphire makes a really good card.







I'm about to pull the whole system apart, move it into a Noctis 450 case, and change to hard line tubing... wish I could watercool the 390. Oh well, it's silent the way it is; I suppose I only want to watercool it for aesthetic reasons.


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kalidae*
> 
> Can I please join the club? Sapphire Nitro 390. I love the card, and in my Define R4 I never hear the fans spinning, not at all. I was sad to open the box and find no backplate, though.
> 
> 
> What resolution are you running the Heaven benchmark at? Sapphire makes a really good card.
> 
> 
> 
> 
> 
> 
> 
> I'm about to pull the whole system apart, move it into a Noctis 450 case, and change to hard line tubing... wish I could watercool the 390. Oh well, it's silent the way it is; I suppose I only want to watercool it for aesthetic reasons.






Turn tessellation back on. You can see in my score that tessellation is set to "Extreme". I'm very curious to see if the 390X is faster, or if it's just the same card I have, 'cause that's what it looks like...









You can see my sets on my screeny.


----------



## Evan20x

Results are in for my Sapphire Nitro R9 390 on the Heaven benchmark.

Everything on the GPU is stock. Have not touched the volts or clocks.


----------



## kalidae

Quote:


> Originally Posted by *battleaxe*
> 
> 
> Turn tessellation back on. You can see in my score that tessellation is set to "Extreme". I'm very curious to see if the 390X is faster, or if it's just the same card I have, 'cause that's what it looks like...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can see my sets on my screeny.


Mine's a 390, not the X, and it's still running stock volts, so probably not the best comparison, but I think the 290 8GB Tri-X would be equivalent to a 390 Nitro. Here's my score at 1126/1600.


----------



## CamsX

Quote:


> Originally Posted by *Evan20x*
> 
> Results are in for my Sapphire Nitro R9 390 on the Heaven benchmark.
> 
> Everything on the GPU is stock. Have not touched the volts or clocks.


I see that your monitor isn't Full HD. That alone will produce different scores, probably higher, than those using 1080p. When running Firestrike and Heaven, use AMD defaults with tessellation ON. Those settings in Heaven are the proper ones (other than resolution) for benchmarking.

I think you can turn VSR ON and use 1080p for the Heaven run in order to compare.

ALSO, in GPU-Z, always show "MAX" clocks and temps! Clicking on a value cycles it until you're able to select MAX. That's what we need to see.

I have a nitro too, and I avg 120+ fps in BF with a combination of Ultra and Medium settings. 144hz monitor here.
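Side note: GPU-Z can also log all of these sensors to a comma-separated text file, so instead of eyeballing the window you can pull the MAX of every column out of a run with a few lines of script. This is just a sketch; the column headers below are made up for illustration and will vary by card and GPU-Z version:

```python
import csv
from io import StringIO

# Hypothetical excerpt of a GPU-Z sensor log (comma-separated text);
# real headers differ by card and GPU-Z version.
LOG = """Date , GPU Core Clock [MHz] , GPU Temperature [C] , VRM Temperature 1 [C]
2015-09-01 20:01:00 , 1010.0 , 71.0 , 74.0
2015-09-01 20:01:01 , 1010.0 , 73.0 , 78.0
2015-09-01 20:01:02 , 300.0 , 68.0 , 70.0
"""

def max_per_column(log_text):
    """Return the max value seen for every numeric column in the log."""
    reader = csv.reader(StringIO(log_text))
    headers = [h.strip() for h in next(reader)]
    maxima = {}
    for row in reader:
        for name, value in zip(headers, row):
            try:
                v = float(value)
            except ValueError:
                continue  # skip the Date column and any blanks
            maxima[name] = max(v, maxima.get(name, v))
    return maxima

peaks = max_per_column(LOG)
print(peaks)  # peak clock/temps over the whole run, per column
```

Point it at the real sensor log file instead of the inline string and you get the same MAX numbers the GPU-Z window shows, without having to catch them live.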


----------



## battleaxe

Quote:


> Originally Posted by *kalidae*
> 
> Mine's a 390, not the X, and it's still running stock volts, so probably not the best comparison, but I think the 290 8GB Tri-X would be equivalent to a 390 Nitro. Here's my score at 1126/1600.


Ah, okay.









Not bad at all.


----------



## Evan20x

Quote:


> Originally Posted by *CamsX*
> 
> I see that your monitor isn't Full HD. That alone will produce different scores, probably higher, than those using 1080p. When running Firestrike and Heaven, use AMD defaults with tessellation ON. Those settings in Heaven are the proper ones (other than resolution) for benchmarking.
> 
> I think you can turn VSR ON and use 1080p for the Heaven run in order to compare.
> 
> ALSO, in GPU-Z, always show "MAX" clocks and temps! Clicking on a value cycles it until you're able to select MAX. That's what we need to see.
> 
> I have a nitro too, and I avg 120+ fps in BF with a combination of Ultra and Medium settings. 144hz monitor here.


Yes, my monitor is 1680x1050. I do have tessellation set to Extreme in Heaven.

The BF4 GPU-Z shot was taken while I was mid game, with BF4 set to the Ultra preset at 115% resolution scale, I think.

I will have to do more testing. My main question is: is it normal for GPU-Z loads to look like that? GPU load, memory, and power spiking up and down? See, I come from a 7850 where my GPU load would stay very steady, and mostly everything else would too.


----------



## CamsX

Quote:


> Originally Posted by *Derek129*
> 
> Dude I absolutely cannot thank you enough, you are the man! I turned my shaders to low and with everything else at max/ultra it plays absolutely amazing!! Thank you!










I'm glad it helped you. The game is really good and looks beautiful, even with shaders on low, but some parts are still badly optimized. It's not bad for a beta, I guess.

For some reason it's crashing for me after the latest patch.


----------



## kalidae

Quote:


> Originally Posted by *battleaxe*
> 
> Ah, okay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not bad at all.


It would be very interesting to see what this card would score with some more volts and a clock of 1200, if that's doable. It's a really great card, and the Tri-X cooler is silent, which really surprised me. Though I just realised the fans only ramp up to 41% during a benchmark, with max temps of 67°C, so no wonder I can't hear it.


----------



## kizwan

Quote:


> Originally Posted by *Evan20x*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CamsX*
> 
> I see that your monitor isn't FullHD. Even this will provide different scores, probably higher, to others using 1080p. When running Firestrike and Heaven use AMD defaults with Tessellation ON. Those settings in HEAVEN are the proper ones (other than resolution) for benchmarking.
> 
> I think you can turn VRS ON and use 1080p for the Heaven run in order to compare.
> 
> ALSO, in GPU-Z, always show "MAX" clocks and temps! By clicking on the value it will cycle until you are able to select MAX. Thats what we need to see.
> 
> I have a nitro too, and I avg 120+ fps in BF with a combination of Ultra and Medium settings. 144hz monitor here.
> 
> 
> 
> Yes my monitor is 1680x1050. I do have tessellation set to Extreme on Heaven.
> 
> The BF4 GPU-Z was taken while I was mid game. That was taken with BF4 set to Ultra preset with 115% resolution scale I think.
> 
> I will have to do more testing. My main question is, is it normal for GPU-Z loads to look like that? GPU load, memory and power loads spiking up and down? See I come from a 7850 where my GPU load would just stay very steady and mostly everything else would.

Yeah, the GPU load normally looks like that, but once you go above 1080p, e.g. 1440p or 4K, the spiking will lessen. If you're experiencing severe FPS drops during gaming, then it's something to be concerned about.


----------



## battleaxe

Quote:


> Originally Posted by *kalidae*
> 
> It would be very interesting to see what this card would score with some more volts and a clock of 1200, if that's doable. It's a really great card, and the Tri-X cooler is silent, which really surprised me. Though I just realised the fans only ramp up to 41% during a benchmark, with max temps of 67°C, so no wonder I can't hear it.


That's really cool for a stock air cooler. Nice card there.


----------



## kalidae

This is what my GPU-Z looks like for BF4 on ultra settings.


----------



## Evan20x

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, the GPU load normally looks like that, but once you go above 1080p, e.g. 1440p or 4K, the spiking will lessen. If you're experiencing severe FPS drops during gaming, then it's something to be concerned about.


Time to VSR to a higher res to make this puppy work then?









TBH I think ArcheAge and my GPU don't play well together.

My buddy has 760 SLI and he rips the game apart while I'm getting like 40 FPS in the middle of a populated city.


----------



## Gumbi

Quote:


> Originally Posted by *battleaxe*
> 
> 1235mhz core
> 1665mhz Mem
> 100mv
> +25mv Aux
> 
> Set like this:
> 
> core is 63°C max
> VRM1 72°C max
> VRM2 76°C max (and I meant to say VRM2 is always, always hotter than VRM1, no matter what I do to the settings)
> 
> Heaven set to max everything at the 1080p setting.


Very nice... very nice cooling too. I presume you are running 100% GPU fan and have excellent case cooling? For that voltage, your core and VRM temps are excellent.


----------



## MrOldboy

OK. I ended up grabbing the Gigabyte 390 for $275+tax (CA). I don't typically overclock much, since my GTX 480 already ran hot with the fans at 100%, but after reading this thread, maybe I'll OC the card a little.

If anyone has the Gigabyte 390, could you share how you overclocked it even though the voltage is locked?

For now, though, I'll probably keep it stock and see how it performs in games I've been meaning to go back to, like Mordor or Metro.


----------



## koxy

Add me to the owners club please.


Great card, and I'm really surprised how quiet it is. It's still not at the same level as the MSI GTX 970, but a huge improvement. I'm messing with OC a little bit right now; 1100/1600 stable. I had some issues with black screens and artifacts on screen as well, but I reinstalled the drivers about 3 times and it seems fine now.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Very nice... very nice cooling too. I presume you are running 100% GPU fan and have excellent case cooling? For that voltage, your core and VRM temps are excellent.


I'm using an H80 bolted to the board (Red Mod), with individual sinks on the VRMs and RAM. The RAM was really happy when I did this: I went from a max of about 1575MHz to 1665MHz. The VRM temps dropped pretty considerably from the stock cooler; I was running over 100°C on VRM2 while benching Heaven. I've also done a few case mods and some cleanup since installing the cards, and with each modification I've seen small bumps down in temp. Not near as good as a full block, but it works better than stock, especially since I had 4 H80s laying around and it cost me zero to do it all. Plus, the air cooler gets its life saved for resale down the road.


----------



## HyeVltg3

Massive long shot, but I'm planning to get the Sapphire card. My room has no AC, so I would like the one with the best cooling, because I can't see myself going with a WC loop.

Does anyone have the Sapphire 390 AND a case rated at only 310mm clearance on its website or specs?
The card measures 308mm.
I have a Deepcool Tesseract case, and the site rates it as having 31cm (310mm).
I just took a ruler to it, and it's hard to measure with my 280X in the way, but I'm getting about 308.5mm (I took my current 280X's length from the manufacturer's spec and added the gap between its edge and my irremovable SSD cage).

So I'm really annoyed and would like some first-hand experience with this. I don't want to get it just to return it.

I'd aim for the MSI 390, but I've heard it runs "pretty damn hot"... and plenty of people are having non-gaming-related driver crashes on 15.7 with their MSIs.
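For anyone else doing the same back-of-the-envelope fit check, it's just subtraction, but it's easy to forget that the rated clearance may include the bracket area. A throwaway sketch (numbers from this post; the 5mm comfort threshold is my own arbitrary pick):

```python
def fit_margin(clearance_mm, card_mm):
    """Millimetres to spare between the case's rated GPU clearance and the card length."""
    return clearance_mm - card_mm

# Deepcool Tesseract rated clearance vs. the Sapphire 390's spec length
margin = fit_margin(310.0, 308.0)
print(f"margin: {margin:.1f} mm")
if margin < 5.0:  # arbitrary comfort threshold; the rated figure may count the bracket area
    print("tight fit - measure the actual bay before ordering")
```

With a measured bay of 308.5mm the margin drops to half a millimetre, which is why measuring the real case beats trusting the spec sheet.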


----------



## robmcrock

What sort of temps are people getting? On Mad Max last night I was at times getting 100% GPU usage with fans at 91-100%, and the card was hitting 85°C. I only played for around 20 minutes before having to head out, so it could have climbed higher. Would this be considered normal? Most other games haven't gone over 75°C (The Witcher 3, for example).


----------



## Mysticking32

Quote:


> Originally Posted by *robmcrock*
> 
> What sort of temps are people getting? On Mad Max last night I was at times getting 100% GPU usage with fans at 91-100%, and the card was hitting 85°C. I only played for around 20 minutes before having to head out, so it could have climbed higher. Would this be considered normal? Most other games haven't gone over 75°C (The Witcher 3, for example).


I'm usually in the mid to high 70s. The Witcher 3 does get to 80°C at times, though, on the R9 390X. Usually the fan speed is around 80% when it hits 80°C in The Witcher 3. For some reason that game makes my GPU hot.

My previous R9 390X hit 90°C in The Witcher 3, so I had to RMA it. (This is all on stock settings.)


----------



## HyeVltg3

Well, I feel like I did a dum-dum.
Went with the Club 3D 390.

Is there any reason it's not on the list in the first post?
I'd note that maybe they're Canadian-only?
Just ordered mine for $437 CAD shipped with $7 fully covered insurance; it was on sale at $409.99 ($309.60 USD) + tax + shipping.
That's... $330 USD total.

Did I just buy a no-name brand? I heard they won some world championship GPU (GPU cooling?) award or something... really liking the name of their R9 390, "Royal Queen"... giggity. (The 290/X was called "royalKing" and "royalAce", with different coolers.)

EDIT: found a few reviews on the card versus others, time for some reading... even though it's already purchased, I'm an idiot.
Oh, Club 3D are AMD-only; they don't do Nvidia cards. Not sure if that's a bonus or a *****.


----------



## bazookatooths

Quote:


> Originally Posted by *robmcrock*
> 
> What sort of temps are people getting? On Mad Max last night I was at times getting 100% GPU usage with fans at 91-100%, and the card was hitting 85°C. I only played for around 20 minutes before having to head out, so it could have climbed higher. Would this be considered normal? Most other games haven't gone over 75°C (The Witcher 3, for example).


Witcher 3, ultra settings, no HairWorks: 66°C.
Edit: 100% load, 1080p, FPS capped at 60 in settings.


----------



## CamsX

Quote:


> Originally Posted by *HyeVltg3*
> 
> Well, I feel like I did a dum-dum.
> Went with the Club 3D 390.
> 
> Is there any reason it's not on the list in the first post?
> I'd note that maybe they're Canadian-only?
> Just ordered mine for $437 CAD shipped with $7 fully covered insurance; it was on sale at $409.99 ($309.60 USD) + tax + shipping.
> That's... $330 USD total.
> 
> Did I just buy a no-name brand? I heard they won some world championship GPU (GPU cooling?) award or something... really liking the name of their R9 390, "Royal Queen"... giggity. (The 290/X was called "royalKing" and "royalAce", with different coolers.)
> 
> EDIT: found a few reviews on the card versus others, time for some reading... even though it's already purchased, I'm an idiot.
> Oh, Club 3D are AMD-only; they don't do Nvidia cards. Not sure if that's a bonus or a *****.


Seems you were in a rush to buy. $330 shipped is not a bad deal at all. XFX and Sapphire are AMD-only brands, like EVGA is Nvidia-only, so nothing to worry about on that front. It just is.

In any case, I think you should be fine as long as you don't plan to overclock. If you do, then consider yourself the guinea pig.









Three-fan config and a large heatsink, very similar to the Powercolor model I would say. They use the same backplate, so probably just sibling brands. The only drawback for me is the 2.5 (3) slot size.

My Sapphire Nitro is not as large as I thought it would be. I didn't really measure it after all, but it fit nicely in my NZXT 630 case. It stays nice and cool, under 70°C in 1080p gaming.


----------



## CamsX

Anyone tested the 15.8 drivers yet? I will be installing Win10 when my Skylake arrives, and I'm not sure which driver to install. Currently on 15.7(.0) on Win8.1.


----------



## bazookatooths

Quote:


> Originally Posted by *CamsX*
> 
> Anyone tested 15.8 drivers yet? I will be installing Win10 when my Skylake arrives, and I'm not sure which driver to install. Currently on 15.7(.0) in win8.1


Yes, nice FPS gains for me personally. IDK if it's just me, or does Win10 like more cores?


----------



## CamsX

Quote:


> Originally Posted by *bazookatooths*
> 
> Yes, nice FPS gains for me personally.


Great, thx for the feedback. I'll jump right on it.


----------



## Noirgheos

Quote:


> Originally Posted by *HyeVltg3*
> 
> Massive long shot, but I'm planning to get the Sapphire card. My room has no AC, so I would like the one with the best cooling, because I can't see myself going with a WC loop.
> 
> Does anyone have the Sapphire 390 AND a case rated at only 310mm clearance on its website or specs?
> The card measures 308mm.
> I have a Deepcool Tesseract case, and the site rates it as having 31cm (310mm).
> I just took a ruler to it, and it's hard to measure with my 280X in the way, but I'm getting about 308.5mm (I took my current 280X's length from the manufacturer's spec and added the gap between its edge and my irremovable SSD cage).
> 
> So I'm really annoyed and would like some first-hand experience with this. I don't want to get it just to return it.
> 
> I'd aim for the MSI 390, but I've heard it runs "pretty damn hot"... and plenty of people are having non-gaming-related driver crashes on 15.7 with their MSIs.


That doesn't make sense. MSI did almost nothing to the card but add their own cooler, so if issues are taking place, they are not isolated to MSI. Also, a friend on Steam reports 65-72°C in Shadow of Mordor maxed out at 1080p. It idles at 42°C (fans on) and 56°C (fans off).


----------



## HyeVltg3

Quote:


> Originally Posted by *CamsX*
> 
> Seems you were in a rush to buy. $330 shipped is not a bad deal at all. *Was in a rush because they were low on stock and the sale ended at 12am; it was like 10pm when I saw the deal, and it took a bit to look up anything about Club 3D and whether it was a poor-quality "get it sold" type of sale, or just a really good deal that went low-stock because I was the last one to know about it, haha*
> XFX and Sapphire are AMD-only brands, like EVGA is Nvidia-only, so nothing to worry about on that front. It just is. *I had completely forgotten Sapphire was the same. I did not know about XFX, though; could've sworn I've regularly seen XFX Nvidias. Guess not, maybe it was the old BFGs. Thanks for the info anyway*
> 
> In any case, I think you should be fine as long as you don't plan to overclock. If you do, then consider yourself the guinea pig.
> 
> 
> 
> 
> 
> 
> 
> *Don't mind being it; I'll know when it gets here, should take 1-3 days. I don't really OC, but I don't mind setting a record to beat =D Will reset to factory; too poor to afford any "mistakes" OC'ing atm*
> 
> Three-fan config and a large heatsink, very similar to the Powercolor model I would say. They use the same backplate, so probably just sibling brands. The only drawback for me is the 2.5 (3) slot size.
> *Feels like a good impulse buy; some PCBs in the same price range had no plate, which was sad*
> My Sapphire Nitro is not as large as I thought it would be. I didn't really measure it after all, but it fit nicely in my NZXT 630 case. It stays nice and cool, under 70°C in 1080p gaming.
> *The Sapphire or the MSI were my first choices, but they were at $439 when I saw the Club 3D $409 deal. Spent a lot of time figuring out if the 308mm Sapphire would fit in my 308.5mm clearance. The case's site says 310mm, but that counts the little bit where the metal bracket goes too.*


Quote:


> Originally Posted by *Noirgheos*
> 
> That doesn't make sense. MSI did almost nothing to the card but add their own cooler, so if issues are taking place, they are not isolated to MSI. Also, a friend on Steam reports 65-72°C in Shadow of Mordor maxed out at 1080p. It idles at 42°C (fans on) and 56°C (fans off).


Oh, I never meant to disrespect MSI; I was just stating what I've been reading. It's true that MSIs run hot, though, and the driver issue is pretty universal; the first batch of Gigabytes weren't allowing some people to POST.

Anyone on Sandy or Ivy Bridge having crashes running 390/X cards? I keep seeing Haswells having problems. I'm on Z77 Ivy Bridge.


----------



## Noirgheos

Quote:


> Originally Posted by *HyeVltg3*
> 
> Oh, I never meant to disrespect MSI; I was just stating what I've been reading. It's true that MSIs run hot, though, and the driver issue is pretty universal; the first batch of Gigabytes weren't allowing some people to POST.
> 
> Anyone on Sandy or Ivy Bridge having crashes running 390/X cards? I keep seeing Haswells having problems. I'm on Z77 Ivy Bridge.


Where do you see this? TechPowerUp reports cool temperatures, and so do other sites. And like I said, my friend's temperatures are fine.

He and I are also not having any driver issues. I saw one guy with a video on YouTube, but he seems to be one of the few.

Can you back up all your claims?


----------



## rusty190791

Hi guys, I just bought the MSI R9 390, and it is good...

Load temp just hit 70°C...

but I have a problem with a black screen that comes randomly, usually after Windows startup...
Anyone else get this problem?

I'm using the latest driver from the AMD website...
Thanks!!


----------



## koxy

My MSI is idling right now at 34°C; the max temp I saw was 75°C after a little OC to 1100/1600.
@rusty try DDU in safe mode and clean out all the drivers, then install them again. I had exactly the same issue. Make sure you run DDU in safe mode.


----------



## rusty190791

Thanks for the reply koxy, so it's not a hardware issue, right?
I tried DDU in safe mode, of course, but still the same problem...
maybe I'll try again with different drivers...

Thanks!!


----------



## koxy

Quote:


> Originally Posted by *rusty190791*
> 
> Thanks for the reply koxy, so it's not a hardware issue, right?
> I tried DDU in safe mode, of course, but still the same problem...
> maybe I'll try again with different drivers...
> 
> Thanks!!


Using the DVI port, right? If yes, I had exactly the same problem; DDU in safe mode helped. I'm using current driver version 15.7.1 and it now works. However, the first time I still had a black screen; then I DDU'd it, installed the beta, and had the same problem, and after DDU'ing the beta drivers I installed 15.7.1. It's been two days with no black screen, so I guess it's OK.


----------



## bazookatooths

A lot of PC crashes or black screens when gaming can be bad memory or an unstable overclock.


----------



## iscariot

Picked up the MSI 390x, well got it for a birthday present. Will provide proof once I get around to installing it.


----------



## HyeVltg3

Quote:


> Originally Posted by *Noirgheos*
> 
> Where do you see this? Techpowerup reports cool temperatures, so do other sites. And like I said my friend's temperatures are fine.
> 
> He and I are also not having any driver issues. I saw one guy with a video on youtube, but he seems to be one of few.
> 
> Can you back up all your claims?


Hmm, I can't find the threads where I read it, but Newegg has a lot of reviews on the MSI and Gigabyte. Reviews are still reviews; you know these people aren't paid to say anything, but also that some are complete newbs. Click the "Reviews" tab and choose the 1-3 egg ratings, if you've never Newegg-ed before.
Gigabyte:
http://www.newegg.ca/Product/Product.aspx?Item=N82E16814125792&cm_re=390-_-14-125-792-_-Product

MSI:
http://www.newegg.ca/Product/Product.aspx?Item=N82E16814127874

I know it says .CA, but both .COM (US) and .CA (Canada) share the same reviews.
Weirdly, on the .COM (US) site Newegg removed their old MSI listing and replaced it with a Model #: R9 390 GAMING 8G *LE*, whereas on .CA the non-LE is "out of stock", but you can see the reviews (above) left by its buyers, and the LE version is blank because it's a "new" catalog entry like it is on the .COM US site... fishy.


----------



## Noirgheos

Quote:


> Originally Posted by *HyeVltg3*
> 
> Hmm, I can't find the threads where I read it, but Newegg has a lot of reviews on the MSI and Gigabyte. Reviews are still reviews; you know these people aren't paid to say anything, but also that some are complete newbs. Click the "Reviews" tab and choose the 1-3 egg ratings, if you've never Newegg-ed before.
> Gigabyte:
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16814125792&cm_re=390-_-14-125-792-_-Product
> 
> MSI:
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16814127874
> 
> I know it says .CA, but both .COM (US) and .CA (Canada) share the same reviews.
> Weirdly, on the .COM (US) site Newegg removed their old MSI listing and replaced it with a Model #: R9 390 GAMING 8G *LE*, whereas on .CA the non-LE is "out of stock", but you can see the reviews (above) left by its buyers, and the LE version is blank because it's a "new" catalog entry like it is on the .COM US site... fishy.


I never trust on-site reviews. There are just way too many variables. You shouldn't formulate opinions based on them.


----------



## desetnik

I want to share my story of new purchase.
I decided to replace my XFX 7870 Black Edition 2 months ago with new MSI R9 390. But there were some issues for some time.

The psu I owned was some cheap 500W LC Power Pro Line, I was scared of playing any games on it with my new card, but still I decided to install drivers and see if everything is fine.
Drivers installed without problems. Didn't use DDU just the amd's uninstaller. Fired up the Witcher 3 just for a minute and boom. Guess what? PSU DIED! Just kidding actually it worked but I freaked out because the fans on GPU very extremely loud compared to my 7870 which was dead silent in games, and after quitting the psu fan was audible more than usually before. Temperature was at 72C and fps was over 50 on ultra. I was more than happy with card and decided to exit the game and order new xfx 650w black edition semi modular psu.
After waiting for a week it arrived, but its fan was extremely loud. Even when pc was running on idle. I couldn't stand it, even wearing headphones didn't help how loud that psu was. I returned it and they told me I will have to order another one. So I ordered the same 650w but the core edition. Same bloody thing, the PSU fan is so loud it sounds like its going to explode any second. I returned this one too and vendor was like wth "no way I have this one and it works, but you have 390 650W is not enough for that card". They checked it and this one was broken too like WTH? How do you buy 2 psus and both are broken? What are the odds of same issue with different models? First I had to wait 1 week for psu to arrive, 5 days for them to tell me its broken, 5 days again for the new order to arrive. 5 DAYS AGAIN for this one to be checked and finally another weak for another antec's hcg-750M. Nevermind all this waiting, I had the black screen problem with this card after reinstalling windows 8. I thought I actually killed the card since I was running it on older psu while waiting for new ones. Tried the modded drivers that reported card as 290 and it worked. Installed new official drivers and everything was fine.

Now to the real stuff.

The card itself is loud, though not as loud as the first day I ran it. Right now it's OK, but still very loud compared to the 7870, which was quiet as a monk.
Temperatures reach a high of 83C in The Witcher 3 at 1080p on stock clocks. Battlefield 4 mostly runs at 70-75C, while Far Cry 4 got up to 80C as well.
I tried removing the side panel; the temperature is the same. I even added an old low-CFM fan to the side, but still no help: intake or exhaust, zero change in temperature. But my room is pretty hot, especially when gaming, when it gets to 31C. How in the hell MSI and other people get this card to run at 72C under load, I have no idea. I did some cable management, nothing fancy, but it's still much better than it was before.


----------



## Mysticking32

So about two months ago my R9 280X started artifacting, so it was time for a replacement. I sent the card in for an RMA and bought the Sapphire R9 390 Nitro. I was very happy with this card, but I wanted something more, so about a month after I bought the Nitro I returned it for a full refund and purchased the R9 390X (MSI version). The first one was hitting 90 degrees Celsius in The Witcher 3, so I got it replaced.

MSI recently contacted me about my R9 280X. I'm not sure what happened, but I guess they ran out? Anyway, they offered me a free upgrade to the R9 390X!!!

So boom, I'll be picking up a new power supply soon and running these babies in CrossFire. <3 MSI


----------



## jaydude

Quote:


> Originally Posted by *MrOldboy*
> 
> OK. I ended up grabbing the Gigabyte 390 for $275+tax (CA). I don't typically overclock that much since my GTX 480 already ran hot with the fans at 100%, but reading this thread, maybe I'll OC the card a little bit.
> 
> If anyone has the Gigabyte 390 could you give any impression on how you overclocked it even though the voltage is locked?
> 
> For now though I'll probably keep it stock and see how it performs in games that I've meant to go back to like Mordor or Metro.


1075/1600 is stable for me with locked voltage; 1080 on the core shows slight artifacting in 3DMark 11. I'm uncertain how high the memory will clock: when I pushed it up to 1625 my PC freaked out and I had to reset, though I'm not sure whether it was the overclock or my fiddling with the PowerPlay and ULPS settings. Either way, I left it at 1075/1600 for benching but game at defaults. Even at 1440p I'm getting a mostly solid 60 FPS in almost everything, so I don't really see the point in overclocking outside of benchmarking.









Edit: also, my temps max out at 75C on the default fan profile and 68-70C on a custom "fan speed same as temp" profile, averaging around 65C in most cases.
I use msi afterburner for fan profiles.

Edit 2: Here is a bench I did at 1085/1575. Strangely enough, after I updated the drivers I started getting artifacts above 1080 on the core.
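For anyone wanting to replicate the "fan speed same as temp" profile, it's just a 1:1 mapping from core temperature to fan duty, clamped at the fan's minimum and maximum. A minimal sketch (the 30% floor and 100% ceiling are assumptions; adjust for your card):

```python
def fan_duty(temp_c, floor=30, ceiling=100):
    """Fan duty cycle (%) equal to the GPU core temp in C,
    clamped between a minimum and maximum duty cycle."""
    return max(floor, min(ceiling, temp_c))

# At 65 C the fan runs at 65%; below the floor it idles at 30%.
print(fan_duty(65))   # 65
print(fan_duty(25))   # 30
```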


----------



## maxlimits

Just got my new Club 3D R9 390!

Idle temps are 33-35 degrees Celsius at 35% fan speed.
Load temps are around 42-46 at 40% fan speed, reached after playing 40 minutes of AC IV: Black Flag.



If the picture doesn't show up, here is my proof. Add me to the list, Mr. Smith.

http://i.imgur.com/uAofHjo.jpg


----------



## kizwan

Quote:


> Originally Posted by *jaydude*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrOldboy*
> 
> OK. I ended up grabbing the Gigabyte 390 for $275+tax (CA). I don't typically overclock that much since my GTX 480 already ran hot with the fans at 100%, but reading this thread, maybe I'll OC the card a little bit.
> 
> If anyone has the Gigabyte 390 could you give any impression on how you overclocked it even though the voltage is locked?
> 
> For now though I'll probably keep it stock and see how it performs in games that I've meant to go back to like Mordor or Metro.
> 
> 
> 
> 1075/1600 is stable for me with locked voltage; 1080 on the core shows slight artifacting in 3DMark 11. I'm uncertain how high the memory will clock: when I pushed it up to 1625 my PC freaked out and I had to reset, though I'm not sure whether it was the overclock or my fiddling with the PowerPlay and ULPS settings. Either way, I left it at 1075/1600 for benching but game at defaults. Even at 1440p I'm getting a mostly solid 60 FPS in almost everything, so I don't really see the point in overclocking outside of benchmarking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: also my temps are max 75c on default fan profile, 68-70c on custom "fan speed same as temp" averaging around 65c in most cases.
> I use msi afterburner for fan profiles.
> 
> Edit 2: Here is a bench I did at 1085/1575, strangely enough after I updated the drivers I started getting artifacts over 1080 core
Click to expand...

Use Firestrike (not Firestrike Extreme). It's widely used around here & easier for anyone to check & compare.


----------



## MrOldboy

Quote:


> Originally Posted by *jaydude*
> 
> 1075/1600 is stable for me with locked voltage; 1080 on the core shows slight artifacting in 3DMark 11. I'm uncertain how high the memory will clock: when I pushed it up to 1625 my PC freaked out and I had to reset, though I'm not sure whether it was the overclock or my fiddling with the PowerPlay and ULPS settings. Either way, I left it at 1075/1600 for benching but game at defaults. Even at 1440p I'm getting a mostly solid 60 FPS in almost everything, so I don't really see the point in overclocking outside of benchmarking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: also my temps are max 75c on default fan profile, 68-70c on custom "fan speed same as temp" averaging around 65c in most cases.
> I use msi afterburner for fan profiles.
> 
> Edit 2: Here is a bench I did at 1085/1575, strangely enough after I updated the drivers I started getting artifacts over 1080 core


Thanks for the info. Can you share any bench numbers or an opinion on the performance increase from stock to 1075/1600? I also have a 3570K. I'm likely going to keep it at stock with a custom fan profile, since I think that will be plenty for me gaming-wise while also giving me the lowest possible temps.

I'm assuming you have the G1 version, as I haven't seen anyone on here with the basic GN version. I'm curious how my (GN) compares to the (G1) in terms of overclocking.


----------



## Gumbi

Quote:


> Originally Posted by *maxlimits*
> 
> Just got my new Club 3D R9 390!
> 
> Idle temps are 33-35 degrees Celsius at 35% fan speed.
> Load temps are around 42-46 at 40% fan speed, reached after playing 40 minutes of AC IV: Black Flag.
> 
> 
> 
> If the picture doesn't show up here is my proof. Add me to the list Mr. Smith.
> 
> http://i.imgur.com/uAofHjo.jpg


Those load temps are extremely low. Either they're wrong, or the game isn't stressing your GPU (could be CPU bound for example).


----------



## navjack27

Just got an early birthday present; so happy with this. Upgraded from SLI GTX 660 3GB cards to this one monster!

http://www.techpowerup.com/gpuz/details.php?id=kd4r

Load gets to around the upper 70s, and as you can see I'm KINDA idle right now. I'm using the weird MSI Gaming App thing that sets the boost clocks... I've used Afterburner before; I'm just giving the other program a try for a bit before I go back to Afterburner.
I don't have the roomiest case nor tons of fans either :-\ but I'm hoping I can get a little more OC out of this.


----------



## MrOldboy

Quote:


> Originally Posted by *Gumbi*
> 
> Those load temps are extremely low. Either they're wrong, or the game isn't stressing your GPU (could be CPU bound for example).


I was curious as well and found a review from bit-tech claiming load temps in the low 40s: http://www.bit-tech.net/hardware/graphics/2015/07/16/sapphire-r9-300-series-review-roundup/11
But then I realized they are reported as delta T, not absolute T.


----------



## Pohernori

1.  http://i.imgur.com/vlei7Ck.jpg
2. PowerColor r9 390x PCS+
3. Air/Stock cooling

Hi guys, just acquired a 390X today. Thanks to the members of this thread who provided information about their own cards; I was able to make a decision on which card to go with.








Managed to get the 390X PCS+ stable at 1180/1600, which seems pretty alright. Hope I can be added to the list and provide more information; hopefully it'll help someone else.


----------



## Gumbi

Quote:


> Originally Posted by *Pohernori*
> 
> 1.
> 2. PowerColor r9 390x PCS+
> 3. Air/Stock cooling
> 
> Hi guys, just acquired a 390X today. Thanks to the members of this thread who provided information about their own cards; I was able to make a decision on which card to go with.
> 
> 
> 
> 
> 
> 
> 
> 
> Managed to get the 390X PCS+ stable at 1180/1600, which seems pretty alright. Hope I can be added to the list and provide more information; hopefully it'll help someone else.


Sounds superb


----------



## Gumbi

Quote:


> Originally Posted by *MrOldboy*
> 
> Was curious as well and found a review from bit-tech claiming load temps in the low 40s http://www.bit-tech.net/hardware/graphics/2015/07/16/sapphire-r9-300-series-review-roundup/11
> But then realized they are reported as delta T, not T.


Yep, they're delta temps, add 20-25 to those for ambient temps...

Do a loop of the Heaven benchmark, and I expect your load temps will be a lot more like 70 degrees or a tad more. Same with VRMs.
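Since delta-T trips people up: a review's delta-T figure is the rise above room temperature, so the absolute core temperature is the delta plus ambient. A quick sketch (the 22C ambient is an assumption for illustration):

```python
def absolute_temp(delta_t, ambient_c=22.0):
    """Convert a delta-T reading (rise over ambient) to an absolute temperature."""
    return delta_t + ambient_c

# A "low 40s" delta-T load figure in a 22 C room:
print(absolute_temp(43))  # 65.0
```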


----------



## jaydude

Quote:


> Originally Posted by *MrOldboy*
> 
> Thanks for the info. Can you give any bench or opinion on what the performance increase was from stock to 1075/1600. I also have a 3570k. I'm likely going to keep it at stock with a custom fan profile since I think that will be plenty for me gaming wise while also giving me the lowest possible temps.
> 
> I'm assuming you have the G1 version, as I haven't seen anyone on here with the basic GN version. I'm curious how my (GN) compares to the (G1) in terms of overclocking.


Yes, I have the G1 Gaming version. As for overclocking it, I haven't noticed any difference in-game compared to stock settings; maybe you'd notice a slight improvement with an uncapped framerate, but I use v-sync for almost all my games, so I've only noticed the improvement while benchmarking.


----------



## kizwan

Quote:


> Originally Posted by *MrOldboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jaydude*
> 
> 1075/1600 is stable for me with locked voltage; 1080 on the core shows slight artifacting in 3DMark 11. I'm uncertain how high the memory will clock: when I pushed it up to 1625 my PC freaked out and I had to reset, though I'm not sure whether it was the overclock or my fiddling with the PowerPlay and ULPS settings. Either way, I left it at 1075/1600 for benching but game at defaults. Even at 1440p I'm getting a mostly solid 60 FPS in almost everything, so I don't really see the point in overclocking outside of benchmarking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: also my temps are max 75c on default fan profile, 68-70c on custom "fan speed same as temp" averaging around 65c in most cases.
> I use msi afterburner for fan profiles.
> 
> Edit 2: Here is a bench I did at 1085/1575, strangely enough after I updated the drivers I started getting artifacts over 1080 core
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the info. Can you give any bench or opinion on what the performance increase was from stock to 1075/1600. I also have a 3570k. I'm likely going to keep it at stock with a custom fan profile since I think that will be plenty for me gaming wise while also giving me the lowest possible temps.
> 
> I'm assuming you have the G1 version, as I haven't seen anyone on here with the basic GN version. I'm curious how my (GN) compares to the (G1) in terms of overclocking.
Click to expand...

Can anyone add more voltage using modded BIOS?


----------



## jaydude

Quote:


> Originally Posted by *kizwan*
> 
> Can anyone add more voltage using modded BIOS?


The lack of a dual-BIOS switch means we may never know, unless someone doesn't mind bricking their card if it fails.


----------



## Gumbi

You can add 200mV in Trixx on all 390s with unlocked voltage (all of them bar the Gigabyte model, I believe).


----------



## koxy

Anyone know how to force MSI Afterburner to apply OC settings at Windows 10 startup? I have "apply OC settings at Windows startup" and "start with Windows" checked, but every time I reboot I get stock settings and the stock fan profile.


----------



## BradleyW

Quote:


> Originally Posted by *koxy*
> 
> Anyone know how to force MSI Afterburner to apply OC settings at Windows 10 startup? I have "apply OC settings at Windows startup" and "start with Windows" checked, but every time I reboot I get stock settings and the stock fan profile.


Did you save the OC'ed settings as a profile?


----------



## maxlimits

The fans are running at 40% and my room is pretty cool; the case has three fans. Temps look fine to me. The Witcher 3 got temps to 60 degrees Celsius with the fans running at 60%.


----------



## koxy

Quote:


> Originally Posted by *BradleyW*
> 
> Did you save the OC'ed settings as a profile?


Yes Profile 1


----------



## CamsX

Quote:


> Originally Posted by *koxy*
> 
> Yes Profile 1


Same thing happened to me on Win 8.1. I installed Win10 yesterday, but I guess it would do the same thing.

I'm not that familiar with AB though.


----------



## BradleyW

Quote:


> Originally Posted by *koxy*
> 
> Yes Profile 1


Try a later/earlier version of the software.


----------



## Noirgheos

Quote:


> Originally Posted by *maxlimits*
> 
> The fans are running at 40% and my room is pretty cool; the case has three fans. Temps look fine to me. The Witcher 3 got temps to 60 degrees Celsius with the fans running at 60%.


You know, if only the Club3D cooler weren't so butt-ugly I would grab it in a heartbeat. From what you say it sounds amazing, though. I like the Dutch.


----------



## navjack27

How can I get a link to that doc with all of us in it? I'm sure I could just view the page source, look at the frame with the doc in it, and get the URL that way... I'm lazy.














I'm just curious what clocks and volts people have been using, without reading every post here.


----------



## maxlimits

PowerColor and Club 3D make the same card, so if you like their colour scheme better, go for it.


----------



## Noirgheos

Quote:


> Originally Posted by *maxlimits*
> 
> Powercolor and club 3D make the same gpu, so if you like their colour scheme better, then go for it.


Yeah, but I've also heard that PowerColor isn't too good. A friend of mine was adamant about me getting Sapphire or MSI.


----------



## Gumbi

Quote:


> Originally Posted by *Noirgheos*
> 
> Yeah, but I've also heard that PowerColor isn't too good. A friend of mine was adamant about me getting Sapphire or MSI.


PowerColor has one of the best 390 coolers, if not the best.


----------



## MTDEW

Quote:


> Originally Posted by *koxy*
> 
> Anyone know how to force MSI Afterburner to apply OC settings at Windows 10 startup? I have "apply OC settings at Windows startup" and "start with Windows" checked, but every time I reboot I get stock settings and the stock fan profile.


If you have the AMD CCC loading at startup also, it may be loading after AB and overriding your settings.
Disable AMD Overdrive in CCC or completely disable CCC loading at windows startup and see if that works.


----------



## maxlimits

Quote:


> Originally Posted by *Gumbi*
> 
> PowerColor has one of the best 390 coolers, if not the best.


Yep, that's what I've heard as well.


----------



## diggiddi

Quote:


> Originally Posted by *cfcboy*
> 
> OK, cool (no pun intended). So when you say switch the CPU fan round, do you mean so it's facing the rear case fan, or pointing towards the top of the case? Thanks again


Quote:


> Originally Posted by *MTDEW*
> 
> If you have the AMD CCC loading at startup also, it may be loading after AB and overriding your settings.
> Disable AMD Overdrive in CCC or completely disable CCC loading at windows startup and see if that works.


Could Overdrive be causing Afterburner to crash my system upon opening?


----------



## CamsX

Quote:


> Originally Posted by *Gumbi*
> 
> PowerColor has one of the best 390 coolers, if not the best.


Sapphire has arguably the best cooler of all the 390 cards (I'm biased, of course). Under 70C on the core and VRMs with less than 60% fan speed on a mild OC is great testament to this. Too bad the card isn't as good an overclocker as the MSI ones.


----------



## Renner

Howdy, folks.

I stumbled upon this thread accidentally and decided to join after a while. Got a MSI R9 390 not long ago:










and this is the story of it. I jumped from a Gigabyte HD6850 OC, which was quite a shocking leap, and I had waited quite a while since my original plan to get a 970, which luckily I skipped. I saved some money and actually wanted to grab a 390X, but changed my mind at the last moment after figuring out that paying over 100€ more for a mere handful of extra frames doesn't really pay off. I wanted to get a Sapphire model, but it was just too long for my case, so I went for the MSI, which is by no means a bad product; it's a very complete one, with great cooling, a backplate, better clocks, and the default Hawaii power delivery of 6+8-pin. Turns out the MSI was too long as well, since my old 6850 looks like a toy compared to this beast, so I had to buy a bigger case, a Cooler Master Silencio 452. I really loved my old TrendSonic case, but its HDD cage was non-removable and couldn't fit the card, sniff. So, those 100€ I saved by skipping the 390X went into the case...

Apparently, the one front and one back fan that came with the case were barely enough to tame this card. It worked fine most of the time, temperature-wise, but I disliked the default 60 degrees at idle (since the fans turn off at those temps), so I activated a custom fan profile in MSI AB, and idle went down to some 40-45 degrees. Then the problems started. First, a cable from my PSU was blocking the front fan, so the case was cooking inside and the system shut down due to high temps. After figuring that out and fixing it, playing Crysis 3 pushed my card over 85 degrees, which was the first time I'd seen it that high; it usually never goes beyond 75. Then my CPU, with its crappy stock cooler, was overheating from all that heat dissipating around it, and there were more shutdowns. Also, the frame limiter in the drivers doesn't actually limit every game I play. Sitting in the main menu of Space Hulk: Ascension, for instance, pushes my card over 80 degrees at hundreds of frames per second. Activating v-sync at 60 FPS cools things down quickly.

I think the problem was the custom fan profile, which I hadn't tweaked at all. Setting things back to default seemed to improve my temps in Crysis, but just in case I got an aftermarket CPU cooler, a Cooler Master Hyper 212 EVO, and it works like a charm (not counting an almost 7-hour fight with a stubborn stock CPU fan holder screw on the board...). Adding another fan on the front, with the additional intake, lowered my idle temps by about 10 degrees on everything; the card sometimes goes as low as 50 degrees, but that depends on my room temps. Now everything is just nice and dandy.

Btw, has anyone seen this:

http://www.techpowerup.com/215755/powercolor-launches-radeon-r9-390-x2-devil13-dual-gpu-graphics-card.html


----------



## Geoclock

Quote:


> Originally Posted by *Renner*
> 
> Btw, has anyone seen this:
> 
> http://www.techpowerup.com/215755/powercolor-launches-radeon-r9-390-x2-devil13-dual-gpu-graphics-card.html


Hm, I guess you don't need it anymore, because the whole web is busy with the news: "R9 290X goes toe-to-toe with GTX 980 Ti on DirectX 12".
So now all the 290X cards are sold out and prices are rising, higher than the R9 390 I guess.

We're gonna see a BIG BIG MESS....

http://arstechnica.com/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/


----------



## Renner

Yeah, but it's still an impressive product; I didn't see that coming, for sure.

The gains with GCN cards in that AoS test were quite impressive; I'm just disappointed seeing AMD FX CPUs (one of which I have) barely keeping up with i3s... But apparently the game is only using 2 cores atm, while the devs claim they're giving i7s a hard time in their internal builds. But eh.

Gotta ask something while I'm here. I have a weird issue lately (I'm on Win7 64-bit, still using Catalyst 15.7): sometimes my GPU load spikes hard while I'm on the desktop doing nothing. It just jumps up and down like crazy, with the fans turning on and off depending on load and temps. I have to restart to make things normal again. For some reason it happens after midnight, and I sometimes see that the Windows Desktop Manager has crashed. Anyone have an idea?


----------



## Dundundata

This thread is massive!

Hello all, I just joined. For a long time I was running my old computer, a Gateway I bought waaaay back in 2001: a 1.4 Thunderbird with a Radeon 7000. It still works, but I finally got with the times and built a new rig with a 4790K... and I've been itching to get a graphics card ever since. Luckily I stumbled across this site.

After much back and forth it seems like a 390 is the card for me, and I've whittled my choices down (a little) to the MSI, XFX, Sapphire, or PowerColor. From what I've gathered, the MSI is the OC champ and seems like a solid choice, though maybe not the *best* cooling. The XFX looks to be another good OC card with OK cooling. The Sapphire sounds like a card that will run nice and cool, with an OK OC. And on the PowerColor I haven't found much, other than that it has excellent cooling as well.

I probably don't need to be that concerned with cooling, as my room isn't very hot. I did spend a fair amount of time setting up my case/fans, and my CPU runs at about 30s idle / 60s load. I have very little OC experience/knowledge, but you never know; I do like to mess around with things.

So any help deciding is appreciated, but I really just want to say hi!

P.S. My PSU is a Corsair CX600, which I know isn't the best. Will it be enough to run one of these cards? It claims to be [email protected]
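As a rough sanity check for PSU sizing questions like this, you can sum ballpark board-power figures for the big components and compare them against the 12V rail capacity. All the numbers in this sketch are assumptions for illustration (roughly 275W for an R9 390, about 90W for a 4790K, and a guess for the rest of the system), not measured values for any particular PSU:

```python
def psu_headroom(rail_watts, loads):
    """Remaining capacity (W) on the 12 V rail after the listed loads."""
    return rail_watts - sum(loads.values())

# Hypothetical 12 V rail capacity and estimated component draws:
loads = {"gpu": 275, "cpu": 90, "board_fans_drives": 60}
print(psu_headroom(550, loads))  # 125
```

A positive result with a comfortable margin is what you want before overclocking, since overvolting pushes the GPU figure well past its stock board power.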


----------



## Noirgheos

Quote:


> Originally Posted by *maxlimits*
> 
> Powercolor and club 3D make the same gpu, so if you like their colour scheme better, then go for it.


By any chance can you post a picture of the Club3D in your case? I'd like to see how it looks.


----------



## battleaxe

Quote:


> Originally Posted by *Geoclock*
> 
> Hm, I guess you don't need it anymore, because the whole web is busy with the news: "R9 290X goes toe-to-toe with GTX 980 Ti on DirectX 12".
> So now all the 290X cards are sold out and prices are rising, higher than the R9 390 I guess.
> 
> We're gonna see a BIG BIG MESS....
> 
> http://arstechnica.com/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/


Phhsssshhhhh...... BAhahahah.... LOL.....









Seriously... I don't know if all this is going to happen or not. Most likely this whole thing is totally blown out of proportion. But, but... you have to admit that with the stunts Nvidia has been pulling lately (3.5GB < 4GB, etc.) and other nonsense... it's pretty funny.

I'm not implying AMD is innocent. No way, no how. But this is rich, great, hilarious even, that Nvidia would have to eat crow after the way they have been jerking the consumer around for the last 3-5 years ($1000+ Titans, etc). It serves 'em right in a way, you know? And I would be saying the same thing if AMD were pulling the same nonsense, which they have in the past. Companies need to learn to protect their relationship with the customer. Nvidia has NOT been doing that lately. Their attitude has been that they are too big and too powerful to even care what we think about their antics.

Hopefully this will bring back some humility on Nvidia's part so they can pull themselves back together and quit playing games with us, the consumers. Just my 2 cents, which I realize is worth nothing at all.


----------



## maxlimits

Quote:


> Originally Posted by *Gumbi*
> 
> PowerColor has one of the best 390 coolers, if not the best.


I'll post it tomorrow.


----------



## Noirgheos

Quote:


> Originally Posted by *maxlimits*
> 
> i'll post it tomorrow.


Alright looking forward to it.


----------



## Gboss

A very happy PCS+ 390 user right here


----------



## battleaxe

Quote:


> Originally Posted by *Gboss*
> 
> A very happy PCS+ 390 user right here


Do you know what the VRM cooling is like on your card? I'm curious; I'm trying to decide which 390X to get right now. I'm looking for one with VRM cooling that is not part of the GPU fin package, similar to the XFX 390X, if you know what that looks like.


----------



## Gboss

Quote:


> Originally Posted by *battleaxe*
> 
> Do you know what the VRM cooling is like on your card? I'm curious; I'm trying to decide which 390X to get right now. I'm looking for one with VRM cooling that is not part of the GPU fin package, similar to the XFX 390X, if you know what that looks like.


The PCS+ has one of the best coolers, IMHO; that one and the Sapphire Nitro too.
I play Shadow of Mordor maxed out for hours on end and never see any frame rate drops or stutters.

I've just gotten home from night shift; when I wake up I can fire up a game and get MSI Afterburner to record some temps.


----------



## Mysticking32

Is anyone running a CrossFire setup with R9 390Xs?

Could you run some benchmarks and provide feedback on temps and whatnot? Power supply, etc.


----------



## CamsX

Quote:


> Originally Posted by *Mysticking32*
> 
> Is anyone running a crossfire setup with the r9 390x's?
> 
> Could you run some benchmarks? And provide feedback over temps and what not? Power supply etc.


Check the list on page one, and look for the posts from that person. There are about 3 or 4 people running xfire, I think. It'll be easier than hoping for one of them to read this specific post.

I remember seeing some Firestrike score getting over 14k for them.


----------



## Pohernori

Quote:


> Originally Posted by *battleaxe*
> 
> Do you know what the VRM cooling is like on your card? I'm curious; I'm trying to decide which 390X to get right now. I'm looking for one with VRM cooling that is not part of the GPU fin package, similar to the XFX 390X, if you know what that looks like.


It's pretty average. It's not bad, but I wouldn't call it the best. VRM temps were at 70°C while I was overvolting the card earlier.


----------



## kizwan

Quote:


> Originally Posted by *koxy*
> 
> Anyone know how to force Msi afterburner to apply OC setting on Windows 10 setup ? I have Apply oc setting on windows startup and start with windows checked but every time i reboot i have stock settings and stock fan profile


It's possible AMD CCC resets everything when it loads each time you restart/boot into Windows.

*I have not tried this before*, but you can try setting a delay for MSI AB (assuming you're using MSI AB). Go to...

*Task Scheduler >> double-click "MSIAfterburner" entry >> "Triggers" tab >> click "At log on" >> click "Edit..." button >> under "Advanced settings" section, set/tick "Delay task for"*

One minute or above should be good I think.


----------



## MrOldboy

I'm getting ready to install my Gigabyte R9 390 and was wondering if anyone has opinions on Windows 7 vs. 10 driver support and/or performance. I've read there might be a tiny bump in performance on Windows 10, but I might just wait until I get a new HDD or SSD to go from 7 to 10. No games use DX12 yet, so I don't see a big reason to move to 10 other than... why not? It's free.

Also, there are a few bent fins and scratches on my heatsink. Is this normal for a non-reference video card? I've only had blower models in the past.


----------



## jaydude

Quote:


> Originally Posted by *MrOldboy*
> 
> I'm getting ready to install my Gigabyte R9 390 and was wondering if anyone has any opinions on Windows 7 vs 10 driver support and/or performance? I've read that there might be a tiny bump in performance due to windows 10, but I might just wait until I get a new HDD or SSD to go from 7 to 10. No games use dx12 yet so I don't see a big reason to go to 10 other than...Why Not? It's free.
> 
> Also there are a few bent fins and scratches on my heatsink. Is this normal for a non-reference video card? I've only had blower models in the past.


Got you covered









http://www.3dmark.com/3dm/7862942 Windows 7 64bit

http://www.3dmark.com/3dm/7957629 Windows 10 64bit

Slight increase in performance according to firestrike bench









Edit: I should point out the drivers were different for W7 and W10; I think the W10 one was a beta.


----------



## MrOldboy

Quote:


> Originally Posted by *jaydude*
> 
> Got you covered
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/7862942 Windows 7 64bit
> 
> http://www.3dmark.com/3dm/7957629 Windows 10 64bit
> 
> Slight increase in performance according to firestrike bench
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I should point out the driver was different for W7 and W10, I think the W10 one was beta


Interesting. Very slight, like I read. I'll probably hold off until I get a new HDD/SSD. I'm hoping some real gains show up once DX12 games start shipping. Hopefully this card lasts me a long time.


----------



## CamsX

Quote:


> Originally Posted by *MrOldboy*
> 
> Interesting. Very slight like I read. I'll probably hold off until I get a new HDD/SSD. I'm hoping some real gains are seen once dx12 games start shipping. Hopefully this card lasts me a long time.


Yeah, if you're just replacing the GPU, stick with your current OS. If you're installing a new HDD, or in my case a new CPU, then go for the upgrade.

Installed Win10 two days ago from scratch. So far so good.


----------






## desetnik

I just did my first Heaven benchmark run with my MSI R9 390 at stock.



Compared to the bjorn review site I got 0.9 FPS less and a score 20 points lower, on W10 while they used W8 and 2666MHz RAM.
Their temperatures were 36C/72C idle/load,
while mine were 60C/80C idle/load.

HWMonitor reported 80C, which is less than what I get in The Witcher 3, which runs up to 83C. The 83C was recorded by MSI Afterburner.


----------



## CamsX

Quote:


> Originally Posted by *desetnik*
> 
> I just did my first Heaven benchmark run with my MSI R9 390 at stock.
> 
> Compared to the bjorn review site I got 0.9 FPS less and a score 20 points lower, on W10 while they used W8 and 2666MHz RAM.
> Their temperatures were 36C/72C idle/load,
> while mine were 60C/80C idle/load.
> 
> HWMonitor reported 80C, which is less than what I get in The Witcher 3, which runs up to 83C. The 83C was recorded by MSI Afterburner.


Were they using an open test bench? That could explain the temp difference.

How is your case airflow? The Storm Trooper is an oldish case model, AFAIK. Maybe your HDD caddies are blocking the airflow. Something to consider.


----------



## desetnik

Actually, I do have 2 HDDs there. One is mounted as low as possible, the other as high as possible, for more air. I will try to move them elsewhere and see what the temperatures are then. As far as the case airflow goes, it has 2x120mm front, 1x200mm top, and 1x120mm back, with an 80mm intake on the side, but a low-CFM one. I don't think the 80mm makes any difference, intake or exhaust.


----------



## CamsX

Quote:


> Originally Posted by *desetnik*
> 
> Actually, I do have 2 HDDs there. One is mounted as low as possible, the other as high as possible, for more air. I will try to move them elsewhere and see what the temperatures are then. As far as the case airflow goes, it has 2x120mm front, 1x200mm top, and 1x120mm back, with an 80mm intake on the side, but a low-CFM one. I don't think the 80mm makes any difference, intake or exhaust.


Yeah, with hot cards case airflow is key. I have 2x140mm + 1x200mm intakes and 3x140mm exhausts, with absolutely nothing between the fans and the graphics card (look for the pic in my Rigs), and I'm still debating whether switching the 200mm for 2x140mm will improve airflow or not.


----------



## maxlimits

Quote:


> Originally Posted by *desetnik*
> 
> Actually I do have 2 HDDs there. One is mounted as low as possible, the other as high as possible for more air. I will try to move them elsewhere and see what the temperatures are then. As for case airflow, it has 2x120mm front, 1x200mm top and 1x120mm rear, plus an 80mm side intake, but a low-CFM one. I don't think the 80mm makes any difference as exhaust or intake.


You should set a custom fan curve; maybe that'll lower your idle temperatures.

Here is the Club 3D R9 390 in my mATX case. Cabling is a nightmare with a non-modular PSU. I might fix it later.


----------



## desetnik

Here's how it looks from the inside. It really is dirty; I actually haven't cleaned the case itself properly since I bought it 3 or 4 years ago, I forget. Only the fans are cleaned.

It's a mess




I will remove the HDD cage so those 2 fans have better airflow, and will post results tomorrow.


----------



## bazookatooths

Got around to toying with my XFX R9 390. Temps with the stock fans (which don't spin up until 60C) maxed out at 80C.
With Afterburner's default custom fan curve it maxed out at 70C.
EDIT: Core voltage +100mV and Power Limit at +50%

I settled on 1175/1720


----------



## Jon55

Wow, this thread is huge! Forgive me for (probably) asking the same question.

I'm looking to replace my very much aging reference 7970 with a 390 (the 390X doesn't seem worth it from what I can tell). My primary concerns are noise and build quality. A few extra frames are nice, but after dealing with the reference 7970's noise level (it's INSANELY loud), I'd much rather have a quiet card that is well built.

So far I'm looking at the MSI 390 Gaming LE or the XFX 390 Black Edition DD, but honestly I really don't know what the situation is. Please advise!


----------



## desetnik

My previous 7870 was an XFX Black Edition and it was dead silent. My current MSI R9 390 gets really loud during gaming, but you can only hear it when the speaker volume is very low or if I remove my headset. I don't trust reviews that much, since I've seen too many people having different experiences with my card. Some claim the MSI is dead silent while others say it's kind of loud. Maybe you will have more luck than me, or I'm just too sensitive to these things.


----------



## CerealKillah

Quote:


> Originally Posted by *Jon55*
> 
> Wow this thread is huge! Forgive me for (probably) asking the same question.
> 
> I'm looking to replace my very much aging reference 7970 with a 390 (the 390X doesn't seem worth it from what I can tell). My primary concerns are noise and build quality. A few extra frames are nice, but after dealing with the reference 7970's noise level (it's INSANELY loud), I'd much rather have a quiet card that is well built.
> 
> So far I'm looking at the MSI 390 Gaming LE or the XFX 390 Black Edition DD, but honestly I really don't know what the situation is. Please advise!


I don't think you would regret either one of these, to be honest.


----------



## CamsX

Quote:


> Originally Posted by *Jon55*
> 
> Wow this thread is huge! Forgive me for (probably) asking the same question.
> 
> I'm looking to replace my very much aging reference 7970 with a 390 (the 390X doesn't seem worth it from what I can tell). My primary concerns are noise and build quality. A few extra frames are nice, but after dealing with the reference 7970's noise level (it's INSANELY loud), I'd much rather have a quiet card that is well built.
> 
> So far I'm looking at the MSI 390 Gaming LE or the XFX 390 Black Edition DD, but honestly I really don't know what the situation is. Please advise!


Fan noise is a very subjective topic, or at least low noise is, I would say. There is a point where loud is just loud for everyone. All R9 390 cards have very capable coolers (maybe excluding the ASUS model), at least at stock speeds. If you plan to overclock, then that definitely narrows down your options.

It also depends on how much heat is acceptable to you. A custom fan curve with speeds under 55-60% should be barely audible on all models, but the heatsink and fan design differences can result in temperatures ranging from 70C up to 88C, also depending on your gaming needs.

Which almost always brings me back to my selection. The Sapphire R9 390 Nitro is very quiet, from my subjective point of view, and very efficient at cooling for my current needs. Just for comparison, in my case the graphics card under 60% fan speed (2000-2200rpm) is inaudible. What I can clearly hear are the 2x140mm fans cooling the Kraken X60 at ~1300rpm.


----------



## Jon55

Quote:


> Originally Posted by *CamsX*
> 
> Fan noise is a very subjective topic, or at least low noise is, I would say. There is a point where loud is just loud for everyone. All R9 390 cards have very capable coolers (maybe excluding the ASUS model), at least at stock speeds. If you plan to overclock, then that definitely narrows down your options.
> 
> It also depends on how much heat is acceptable to you. A custom fan curve with speeds under 55-60% should be barely audible on all models, but the heatsink and fan design differences can result in temperatures ranging from 70C up to 88C, also depending on your gaming needs.
> 
> Which almost always brings me back to my selection. The Sapphire R9 390 Nitro is very quiet, from my subjective point of view, and very efficient at cooling for my current needs. Just for comparison, in my case the graphics card under 60% fan speed (2000-2200rpm) is inaudible. What I can clearly hear are the 2x140mm fans cooling the Kraken X60 at ~1300rpm.


Thank you guys for all your responses!

I don't want to say I'm sensitive to noise, but my mATX Corsair 350D case sits on my desk only a few feet away from me. I had an MSI 7850, which was nearly silent under load and ran very cool, but got the opportunity to upgrade to a reference 7970 for cheap, and I almost regretted it simply because the stock blower was insanely loud.

I keep seeing good things about the Nitro card, and I really dig that it's quiet. Two questions regarding the Nitro:

1) did you OC it? If so, have you noticed any more noise?
2) would my Corsair 750 watt PSU be enough for this card? (I have a somewhat OC'd i5 3570 + H60 on an ASUS Gene V mobo, 16GB DDR3, a couple of case fans, and a single SSD).


----------



## bazookatooths

If you're worried only about noise, just keep any brand at stock settings; the fans don't kick on until 60C and spin so slowly you won't be able to hear them. I know that for sure on the XFX card, but I'm pretty sure all the cards are like that.


----------



## CamsX

Quote:


> Originally Posted by *Jon55*
> 
> Thank you guys for all your responses!
> 
> I don't want to say I'm sensitive to noise, but my mATX Corsair 350D case does sit on my desk only a few feet away from me. I had an MSI 7850, which was nearly silent under load and ran very cool, but got the opportunity to upgrade to a 7970 reference card for cheap - and I almost regretted it simply because the stock blower was insanely loud.
> 
> I keep seeing good things about the Nitro card, and I really dig that it's quiet. Two questions regarding the Nitro:
> 
> 1) did you OC it? If so, have you noticed any more noise?
> 2) would my Corsair 750 watt PSU be enough for this card? (I have a somewhat OC'd i5 3750 + H60, on an ASUS Gene V mobo, 16GB DDR3, a couple case fans, and a single SSD).


Correct, all the cards have a fan-off mode with the stock fan profile. On the Nitro it's a custom mode that turns the fans off/on individually as it sees fit below 30% fan speed.

Your PSU will be more than enough for any 390/390X you choose. It would only fall short if you plan to Crossfire.

My current mild OC brings my Nitro to the MSI stock OC speeds of 1060 core / 1025 memory without any voltage changes. No issues reaching that, and temperature remains below 70°C under load in BF4, Dirt Rally (1440p), Assetto Corsa, Heaven, Valley and Firestrike. It may have reached 70°C in Crysis 3 @1440p, max settings, no AA. Please note I have very good airflow in my case.


----------



## benprestwood

Hey guys, could I join?


----------



## desetnik

Quote:


> Originally Posted by *CamsX*
> 
> Yeah, with hot cards case airflow is key. I have 2x140mm + 1x200mm intakes and 3x140mm exhausts, with absolutely nothing between the fans and the graphics card (look for the pic in my Rigs), and I'm still debating whether switching the 200mm for 2x140mm would improve airflow or not.


I can't put the HDDs anywhere else except in the cage, sigh...
However, there is some room for air to go through the HDD cages. Unfortunately, the front fans seem to have zero effect on GPU temperatures, whether at lowest speed or highest. Any GPU-intensive game really hits the card hard in terms of temps. It's hot here, so the ambient in the room is 28C; during gaming it was 30C once. Pretty hot room.


----------



## Jon55

Nice, and it's still pretty quiet even under full load?


----------



## Dundundata

Quote:


> Originally Posted by *bazookatooths*
> 
> Got around to toying with my XFX R9 390 Temps with stock fans that spin at 60c maxed out at 80c
> Custom fan default on Afterburner maxed out at 70c
> EDIT: Core Volts +100mV and Power Limit at +50%
> 
> I settled on 1175/1720


very nice







I am between this and MSI...


----------



## kalidae

Quote:


> Originally Posted by *desetnik*
> 
> Heres how it looks like from inside. It really is dirty I actually didn't clean the case itself properly since I bought it 3 years ago or 4 I forgot. Only fans are cleaned.
> 
> Its a mess
> 
> 
> 
> 
> I will remove the hdd case so those 2 fans have better flow and will post results tomorrow.


You should pick yourself up a little Corsair H75 liquid cooler. I just bought one to use temporarily because I changed my case from a Define R4 to a Noctis 450, and I can't use my custom watercooling loop since the Noctis has nowhere to mount my bay res and pump. I'm very impressed by the H75; it's only half the size, radiator-wise, of my custom loop, but it is keeping the same overclock nice and cool like my custom loop did. The H75 is cheap, looks good, and it's something you will have for years to come. I'm talking about for your CPU though, not your GPU.


----------



## CamsX

Quote:


> Originally Posted by *Jon55*
> 
> Nice, and it's still pretty quiet even under full load?


To me it is quiet enough. The A/C-like hum coming from my case is not generated by the GPU fan.


----------



## Gboss

Same here; dunno what the fuss about noise levels on these cards is. I can barely hear my GPU over the hum from my Nepton cooler, lol.


----------



## CamsX

20 minutes of the Crysis 3 campaign, 1st stage. 1440p, Ultra High, no AA. Frames ranged from 38 min to 61 max.

Also my custom fan setting.


----------



## wrknight09

Nice thread, add me to the list please!




MSI r9 390 Gaming.

Stock Clocks and Cooling (for now).


----------



## desetnik

Quote:


> Originally Posted by *kalidae*
> 
> You should pick yourself up a little Corsair H75 liquid cooler. I just bought one to use temporarily because I changed my case from a Define R4 to a Noctis 450, and I can't use my custom watercooling loop since the Noctis has nowhere to mount my bay res and pump. I'm very impressed by the H75; it's only half the size, radiator-wise, of my custom loop, but it is keeping the same overclock nice and cool like my custom loop did. The H75 is cheap, looks good, and it's something you will have for years to come. I'm talking about for your CPU though, not your GPU.


I was already thinking about buying an aftermarket cooler or an H75 for my CPU, since the CPU started running at 65-75 degrees in games a few months ago. It used to run at 50-65.
I'm not planning to OC the CPU to its limit since it's already good enough for games; I'm hoping for a stable 4.2-4.5GHz. Currently I lack any OC knowledge, so I don't know what to expect. Until then, research! Just too bad my 390 isn't running cool.


----------



## orlfman

Just got my new Sapphire Nitro R9 390s in Crossfire to replace my 970s in SLI, along with an EVGA SuperNOVA G2 1000 and a Cooler Master HAF XB EVO case.

In the Heaven benchmark my 970s (Gigabyte G1s) in SLI would score 104fps @ 1080p, ultra settings, extreme tessellation. My 390s get 110fps with the exact same settings and processor/RAM/motherboard/SSD. AMD has really improved their tessellation performance. With tessellation off I would get 134fps with the 970s in SLI, and now 149fps on my 390s in Crossfire.

Gotta say, my 390s are amazing and I'm really happy with them. Like them more than my 970s.


----------



## CamsX

Nice job with the HAF XB. Looks pretty clean. What are your card temps while running Heaven?


----------



## orlfman

Quote:


> Originally Posted by *CamsX*
> 
> Nice job with the HAF XB. Looks pretty clean. What are your card temps while running Heaven?


Thanks!
The top card hits 85C core and the bottom card hits 75C core after 30 minutes of running it in a loop. The coolers on the Nitros work great and I'm really happy with them. Though I do miss having a backplate, they don't seem to need it.


----------



## snakeplissken1

Hi all!
Hi AgentSmith, I'd love to join the club!



Recently upgraded from a VTX 7950 V3 to a Sapphire 390 Nitro (330€) and it was a great boost. Witcher 3 went from 1920 at medium settings to 1440 maxed (w/o HairWorks). Really love this card. Haven't tested much, but looking at the data there should be room for a decent OC, I guess. But I like a small OC to run 24/7 at low noise.

Got a quick question: I haven't OC'ed much since my 7950 days. The power limit hasn't been touched so far, but is it needed if the card still boosts to 1060 without touching it?

And strangely, my Sapphire shows +13mV on the core each time I boot. Checked with TriXX and Afterburner and both show me the same. Has Sapphire increased this core voltage in the BIOS? I have the core mV set back down to 0 through Afterburner at startup. Still, it runs fine without any artifacting so far.

Will post Heaven/Valley results later, maybe, after doing some tests.

PC:
Intel i5 @ 4GHz
Sapphire 390 Nitro @ 1060 core / 1500 mem, +0mV, PL 0, custom fan profile (48% max fan @ 70-72C core, VRM1/2 @ 78/80C)
16GB RAM
Asus P8P67


----------



## MrOldboy

Got my Gigabyte R9 390 installed. I ended up just installing Windows 10 even though there isn't much need to at the moment. Currently running the 15.8 beta driver.

I may try to get a slight overclock on the card, but I don't really have a need to since I'm getting solid framerates in games. For example, 60 in the Shadow of Mordor benchmark with the game maxed out.

One issue I am having is some weird coil whine from my PSU. I never had any issues before with my GTX 480 except occasional coil whine when pushed hard, but that only came from the card itself. This buzzing/coil whine is coming from my PSU, an HX620W. Could the extra power draw (~25W more for the 390 compared to the 480) be the cause?

It happens in Firestrike and the Shadow of Mordor benchmark.

For reference, with everything stock (3570K, 8GB RAM), I'm getting around a 9500 score in Firestrike.


----------



## kalidae

Quote:


> Originally Posted by *desetnik*
> 
> I was already thinking about buying an aftermarket cooler or an H75 for my CPU, since the CPU started running at 65-75 degrees in games a few months ago. It used to run at 50-65.
> I'm not planning to OC the CPU to its limit since it's already good enough for games; I'm hoping for a stable 4.2-4.5GHz. Currently I lack any OC knowledge, so I don't know what to expect. Until then, research! Just too bad my 390 isn't running cool.


The fans that come with the H75 will be dead silent if you don't overclock. I have my 4690K (3.5GHz stock) clocked to 4.5GHz, and I just changed the fans to the Corsair SP120 Performance Edition since I already had those from my custom loop. But I gotta say, for gaming my CPU doesn't go over 65 degrees with the fans at 60%, so it's pretty quiet.


Here is a picture of my system in my Define R4. I recently moved it to the Noctis because the R4, while silent and an excellent case, restricts airflow quite a lot, much like the H440.







I call this "Project Dark Dreams", and moving my system into the Noctis was stage one. Stage 2 is fitting my custom loop with hardline tubing, and stage 3 is adding more black and removing some red, so maybe changing fans, new cables, etc. It's far from complete, but for now I'm stoked with the H75, so I'm not in a huge rush to get my custom loop installed again.


----------



## CamsX

Quote:


> Originally Posted by *orlfman*
> 
> Thanks!
> The top card hits 85C core and the bottom card hits 75C core after 30 minutes of running it in a loop. The coolers on the Nitros work great and I'm really happy with them. Though I do miss having a backplate, they don't seem to need it.


85°C for the hot card with only 1 slot of space in between is really good, I must say. Crossfire is so tempting, but I'll refrain for now. Need to save for my FreeSync monitor; I spent what I had on a new CPU/mobo kit. D'oh!

And yeah, the backplate is not really needed for your case orientation anyway. I just hope the inner heatsink framing is strong enough to prevent any sagging in the long run. Only time will tell.


----------



## CamsX

Quote:


> Originally Posted by *snakeplissken1*
> 
> Hi All !
> Hi Agentsmith , love to join the club !
> 
> Recently upgraded from a VTX 7950 V3 to a Sapphire 390 Nitro (330€) and it was a great boost. Witcher 3 went from 1920 at medium settings to 1440 maxed (w/o HairWorks). Really love this card. Haven't tested much, but looking at the data there should be room for a decent OC, I guess. But I like a small OC to run 24/7 at low noise.
> 
> Got a quick question: I haven't OC'ed much since my 7950 days. The power limit hasn't been touched so far, but is it needed if the card still boosts to 1060 without touching it?
> 
> And strangely, my Sapphire shows +13mV on the core each time I boot. Checked with TriXX and Afterburner and both show me the same. Has Sapphire increased this core voltage in the BIOS? I have the core mV set back down to 0 through Afterburner at startup. Still, it runs fine without any artifacting so far.


Congrats on the upgrade. I also noticed the stock voltage difference with my Nitro as soon as I installed it. I don't see any harm in this, and I also tested at +0mV without any particular issue.

I just stopped using TriXX after they changed the interface. It can monitor more sensors, but the interface doesn't seem practical. Finally getting the hang of Afterburner.

Don't expect anything ground-breaking regarding overclocking. Low noise and max OC don't go that well together, and nobody with a Sapphire card has been able to reach the performance some of the MSI cards are able to achieve.


----------



## CamsX

Quote:


> Originally Posted by *kalidae*
> 
> I call this "Project Dark Dreams", and moving my system into the Noctis was stage one. Stage 2 is fitting my custom loop with hardline tubing, and stage 3 is adding more black and removing some red, so maybe changing fans, new cables, etc. It's far from complete, but for now I'm stoked with the H75, so I'm not in a huge rush to get my custom loop installed again.


Cool build.

I've never done a custom loop or anything like that, but I've always thought it's more about looks and personal achievement than pure performance. I also think they need a lot of maintenance, but I might be wrong. AIO water kits are much simpler and have become very efficient for enthusiasts and mild overclockers.


----------



## bazookatooths

Quote:


> Originally Posted by *Dundundata*
> 
> very nice
> 
> 
> 
> 
> 
> 
> 
> I am between this and MSI...


Yes, both are pulling some nice numbers.


----------



## kalidae

Quote:


> Originally Posted by *CamsX*
> 
> Cool build.
> 
> I've never done a custom loop or anything like that, but I've always thought it's more about looks and personal achievement, than pure performance. I also think they need a lot of maintenance, but I might be wrong. AIO water kits are much simpler and have become very efficient for enthusiasts and mild overclockers.


You are partially right. I have 3 all-in-one liquid coolers, a Corsair H75, an H110 and an Antec 920 Kühler, and while all 3 of those are great performers and excellent bang for the buck, they can't compare to a custom loop. Watercooling isn't about the looks; that's just what it has become. It's about high performance and silence. I have my H75 cooling my overclocked 4690K at the same 1GHz overclock, the same volts and the same temps as my custom loop, but the difference is that it's much louder, because the fans have to spin much faster to achieve the same thing. This also means the headroom for overclocking isn't as high as with the custom loop, because the fans are already spinning much faster than they did in the custom. An all-in-one is much louder because of this. They can't hold the same temps at the same fan speeds: the pumps aren't the same quality and can't move the same amount of fluid, the radiators are aluminium rather than copper, and the waterblock and mounting bracket aren't as good.

As for maintenance, you drain the loop once a year; some go years, and others drain and change the colour every few weeks. If you set up a proper drainage system, it's a 5-minute job. Watercooling is just about silence and performance and finding the perfect balance of the two, and an all-in-one can't do that like a custom loop.

Also, your GPU is unloading what, 60-75 degrees, into your case? That gets sucked in by the radiator fans of the all-in-one cooler. If you have a GPU block in the loop, then all that heat isn't going into the case; it's being vented out the top. The inside of the case remains cool because it's filled with cool air from the intakes. The CPU and GPU put out the most heat, so if you can get that out of the case through a radiator or two, the only heat sources left in your case are the motherboard and the hard drives, but you can watercool them too. It's an amazing thing.

My PSU fan only spins at heavy load, so for the most part it's not spinning at all; the GPU fans are the same, and the radiator and case fans spin at 25% when I'm watching movies or YouTube or doing light gaming, and it's dead silent. You can only hear my PC if you put your ear against the case, and you only know it's on because of the bright LEDs, and that's what it's all about. But the question is, hmm, is it worth the money? If someone stole all my custom watercooling parts, I wouldn't go out and buy them all over again; I'd buy the Swiftech H220-X or the new EK all-in-one cooler, because they are a custom loop that comes prefilled and all-in-one, the same as any other AIO, but every part is custom from their lineup and it's fully expandable, for a fraction of the price. That's what I would recommend to anybody wanting to get into custom watercooling.


----------



## CamsX

Pretty nice, thanks for the insight. I doubt I'll ever do one in the near future. Those parts aren't available in my country, and even with the improved cooling and silence, the cost and logistics involved aren't worth the hassle for me.

Now, I do hope someone designs a Kraken G10 v2 so I can slap an AIO on my 390 like I did with my 7970. Dreams, maybe?


----------



## battleaxe

Not trying to brag here, just curious, as I'm still considering the 390X. Can any of you guys beat this score with a 390X? I don't want a problem, so please no drama. I finally got this thing to work with +200mV in TriXX. BTW AgentSmith, I had to slowly pump the volts up while running Heaven in order to not get a black screen: start the bench, then increase the volts up to maximum about 25mV at a time, then push the clocks up. This worked for me, and as you can see, I'm now running a higher core than before. In my case, the RAM actually did better in AB, I'm guessing because of the Aux voltage control? I don't think there is one in the new TriXX software, unless I missed it somewhere. So my VRM2 is actually cooler than when running aux volts in AB, but my core can go higher since I can get to +200mV. I'm wondering if you guys could try this on your 390X cards and see if it works?

Anyway, a new personal best for me on the 290X new edition, which I still think is just a rebadged 390X with only 4GB of RAM...
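For anyone who wants to see the step-up idea in one place, here's a rough Python sketch of it. This is purely illustrative: `set_core_offset_mv()` is a made-up stand-in for whatever your tuning tool exposes (in reality you'd be clicking the slider in TriXX/Afterburner while Heaven runs), and nothing here talks to real hardware.

```python
# Hypothetical sketch of the gradual voltage ramp described above:
# raise the core voltage offset in small steps while a stress test runs,
# instead of jumping straight to the maximum (which caused black screens).
import time

def set_core_offset_mv(mv):
    # Stand-in for the real tool; here we just log the change.
    print(f"offset now +{mv}mV")

def ramp_voltage(target_mv, step_mv=25, settle_s=30):
    """Step the core voltage offset up gradually, pausing between steps."""
    applied = 0
    while applied < target_mv:
        applied = min(applied + step_mv, target_mv)
        set_core_offset_mv(applied)   # hypothetical tool call
        time.sleep(settle_s)          # let the stress test run at this level
    return applied

ramp_voltage(200, step_mv=25, settle_s=0)  # 0s settle only for the demo
```

Only after the full offset is applied and stable would you start pushing the clocks up, per the post above.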


----------



## Renner

http://www.fudzilla.com/news/graphics/38667-powercolor-unveils-699-priced-devil-13-dual-core-r9-390

700€, gents. If someone picks this thing up, please report here ASAP.


----------



## Gumbi

That's an awesome score, dude! I hit 1600 yesterday on my 290! I'm missing a fan on my Vapor-X at the moment, which is hurting my core temps by a few degrees, so I'm not sure I can match what you have, but I'll try!


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> That's an awesome score dude! I hit 1600 yesterday on my 290! Missing a fan on my Vaporx at the moment which is gimping my core temps by a few degrees so not sure I can match what you have but I'll try!


Yes, I'm super happy with this card. But as I said, even the board is different from the old-edition 290X, so I think this really is a 390X with only 4GB of RAM in it, or very similar anyway. I've got these on AIO coolers, and now that I know it's the aux voltage that is pushing up VRM2 (the memory controller), I can keep those temps down a bit better.

These cards just chew through games like butter. It's really nice to see AMD has figured Crossfire out so well. As long as my frames are above 100 I don't notice any microstutter at all. I'm not sure why this is, since my monitor can't even display 100FPS, but it is considerably less jumpy when frames are higher than, say, 80fps. Thankfully, with this much power, that's really easy to do even at 4K. But I'm so satisfied that I've still been thinking about getting a set of 390Xs, or even Fury Xs, if they ever move past the reference-only version, that is...

I personally think AMD is on a major uptick here. If consumers can just wait a bit, the drivers have always come through for AMD users. I think we are witnessing this now with the 290X and 390X cards; we are seeing some great performance out of them, if you ask me. I couldn't be happier with AMD right now. I think the Fury X will be rolling over the 980 Ti in a few months' time too. It's not like it's very far behind right now to begin with (about 4% at 4K, generally speaking).


----------



## Gumbi

I tried 200mV at 1250/1650 there and it artefacts fairly heavily after a minute or so as core temps rise.

My VRMs were 69C max after 5 minutes, but the core was 82C and probably would have risen even more. Not sure what the central fan could do to alleviate this. The VRM cooling on this is, IMO, the best of the entire 290/390 series.

I might settle for 150mV (125 yesterday, with the core at 72 or 74 max) and see what I can push with that.

As for you seeing a difference between 100 and 80 fps on a (presumably) 60Hz screen, the answer is likely simple: frame timings.

It's possible for 30fps to be smoother than 60fps if you have more consistent frame timings.

You might have high frames, but if every few frames there is a major stutter, it can ruin the experience.
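To put some numbers on the frame-timing point, here's a small, purely illustrative Python sketch (all frame times are made up, and `frame_stats` is just a name I picked): two capture runs with nearly identical average FPS, where the percentile frame time exposes which one actually stutters.

```python
# Illustrative only: average FPS can hide stutter that a high-percentile
# frame time reveals.
from statistics import mean

def frame_stats(frame_times_ms):
    """Summarize a list of per-frame render times in milliseconds."""
    avg_fps = 1000.0 / mean(frame_times_ms)
    worst = max(frame_times_ms)
    # 95th-percentile frame time: a rough proxy for perceived stutter.
    p95 = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.95)]
    return avg_fps, p95, worst

smooth = [16.7] * 60                  # steady ~60fps, every frame on time
stuttery = [12.0] * 57 + [110.0] * 3  # similar average, but big spikes

for name, run in (("smooth", smooth), ("stuttery", stuttery)):
    avg_fps, p95, worst = frame_stats(run)
    print(f"{name}: {avg_fps:.0f} avg fps, p95 {p95:.0f} ms, worst {worst:.0f} ms")
```

Both runs average roughly 59-60fps, but the stuttery run's 95th-percentile frame time is over 100ms, which is exactly the "every few frames has a major stutter" case described above.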


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> I tried 200mv 1250/1650 there and it artefacts fairly heavily after a min or so as core temls rise.
> 
> My VRMs were 69 max after 5 mins but the core was 82 degrees and probably would have risen even more. Not sure what the central fan could do to alleviate this. Thr VRM cooling on this is IMO the best of the entire 290/390 series.
> 
> I might settle for 150mv (125 yesterday and core was 72 or 74 max) and see what I can push with that.
> 
> As regards you seeing a difference between 100 and 80 fps on a (presumably) 60hz screen, the answer is likely simple. Frame timings.
> 
> It's possible for 30fps to be smoother than 60fps if you have more consisntent frame timings.
> 
> You might have high frames but if every few frames has a major stutter it can ruin the experience.


Try lowering your core just a bit and test again. The flashing likely means the core clock is too high.


----------



## Gumbi

For sure. It also told me I probably can't handle 200mV on air; I most certainly can't handle it without the central GPU fan. I'll see if I can hit 1240MHz at 150mV or something.


----------



## Jon55

Quote:


> Originally Posted by *Gboss*
> 
> Here too, dunno what the fuss about noise levels on these cards. I can barely hear my GPU over the hum from my Nepton cooler lol


Good to hear! How are you liking your PowerColor PCS+ 390? I'm very torn between it and the MSI 390. I've had MSI cards in the past and they were great, but the PowerColor card is getting great reviews. I don't have any experience with PowerColor though.


----------



## xer0h0ur

What has powercolor been smoking? Da hell is the point of using Grenada Pro?

http://wccftech.com/powercolor-launches-devil-13-r9-390-dual-grenada-pro-graphics-card-features-5120-sps-16-gb-gddr5-vram/


----------



## Jon55

Quote:


> Originally Posted by *xer0h0ur*
> 
> What has powercolor been smoking? Da hell is the point of using Grenada Pro?
> 
> http://wccftech.com/powercolor-launches-devil-13-r9-390-dual-grenada-pro-graphics-card-features-5120-sps-16-gb-gddr5-vram/


Wow that's stupid. Sandwich cards like this are _never_ good.


----------



## desetnik

I think I will RMA my 390. Looking at reviews on Amazon and here, everyone is getting good temperatures even with an OC. Mine stays at 80-84C with the fan speed at 80-90%. I wonder if anyone else here gets these temps at stock like me, or if I'm the only one with bad airflow, or if the thermal paste on this card is bad. I removed the side panel, but it had zero effect on temps.


----------



## Mysticking32

Quote:


> Originally Posted by *desetnik*
> 
> I think I will RMA my 390. Looking at reviews on Amazon and here, everyone is getting good temperatures even with an OC. Mine stays at 80-84C with the fan speed at 80-90%. I wonder if anyone else here gets these temps at stock like me, or if I'm the only one with bad airflow, or if the thermal paste on this card is bad. I removed the side panel, but it had zero effect on temps.


Which 390? Sapphire?

It depends on what game you're playing. My card used to get to 83C playing The Witcher 3 and only into the high 70s with GTA 5. Not sure what's up with The Witcher 3.


----------



## desetnik

Quote:


> Originally Posted by *Mysticking32*
> 
> Which 390? Sapphire?
> 
> It depends on what game you're playing. My card used to get to 83C playing The Witcher 3 and only into the high 70s with GTA 5. Not sure what's up with The Witcher 3.


MSI. Witcher 3 gets to 83C. Lords of the Fallen, the same. BF4 on ultra never broke 75C. Act of Aggression runs at 82C without an fps limit. Planetside 2 runs most of the time at 68C.


----------



## Noirgheos

Quote:


> Originally Posted by *desetnik*
> 
> MSI. Witcher 3 gets to 83C. Lords of the Fallen, the same. BF4 on ultra never broke 75C. Act of Aggression runs at 82C without an fps limit. Planetside 2 runs most of the time at 68C.


Yeah, that doesn't seem right. My friend has a stock-everything MSI R9 390X and gets 72C max under load. He also has a Define S, so that's pretty good airflow.


----------



## orlfman

I've got a general question. My top Nitro 390's VRMs get into the low 90C range during gaming and benchmarking, while my bottom card reaches the high 70s to low 80s. I tried googling but couldn't find much info about it. I read one review on Reddit for a 390 and the guy said they should stay below 125C, but he likes them under 80C.

So I'm curious, what is the safe temp range for the 390's VRMs?


----------



## rankdropper84

http://www.3dmark.com/3dm11/10262576

1142mhz core 1755mhz memory


----------



## rankdropper84

Quote:


> Originally Posted by *desetnik*
> 
> MSI. Witcher 3 gets to 83C. Lords of the Fallen, the same. BF4 on ultra never broke 75C. Act of Aggression runs at 82C without an fps limit. Planetside 2 runs most of the time at 68C.


The highest temp I have seen my 390 get is 69°C, on a hot day with no AC, playing The Witcher 3. Playing games like GTA 5 in a 74°F ambient temp room, my GPU gets to a max of about 60°C lol


----------



## bazookatooths

Quote:


> Originally Posted by *desetnik*
> 
> MSI. The Witcher 3 gets to 83°C, Lords of the Fallen the same. BF4 on ultra never broke 75°C. Act of Aggression runs at 82°C without an fps limit. Planetside 2 runs most of the time at 68°C.


How is your case airflow, and what's your fan curve? I'm overclocked to 1175/1720 on the XFX, and I finally got around to setting a custom fan profile: 30% at 60°C, 55% at 70°C, up to 70% at 100°C. Gaming I get low 60s max, and my CPU fan is the only thing I can barely hear. If I cap at 60 fps it doesn't get over the low 50s and is inaudible.

Mine has only gotten to 80°C running the Heaven bench with only 30% fan speed, which blew my mind. So in theory you should be able to run 30% fan speed (no noise) and stay in the 70-75°C range?

So those numbers do seem strange. Best of luck, and I hope you get this sorted. Try a side fan and a couple of front fans; I took off my bottom intake because I didn't need it.
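For anyone curious what a profile like that actually does between the set points: Afterburner-style curves are just linear interpolation between (temperature, fan%) pairs. Here's a minimal sketch; the points are the profile quoted above, but the code itself is purely illustrative, not Afterburner's actual implementation:

```python
# Illustrative sketch of an Afterburner-style fan curve:
# linear interpolation between (temperature °C, fan speed %) points.
# The points are the custom profile quoted above; below the first
# point the fan holds its minimum, above the last it holds its maximum.

CURVE = [(60, 30), (70, 55), (100, 70)]  # (temp_c, fan_pct)

def fan_speed(temp_c, curve=CURVE):
    """Return fan duty cycle (%) for a given core temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # interpolate linearly between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # unreachable with a sorted curve

if __name__ == "__main__":
    for t in (50, 65, 80, 105):
        print(f"{t}°C -> {fan_speed(t):.0f}%")
```

So at 65°C this profile lands halfway between the 60°C and 70°C points, at about 42% fan.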


----------



## kalidae

Quote:


> Originally Posted by *orlfman*
> 
> I've got a general question. My top Nitro 390's VRMs get into the low 90s (°C) during gaming and benchmarking, while my bottom card's reach the high 70s to low 80s. I tried Googling but couldn't find much info about it. I read one Reddit review of a 390 where the guy said they should stay below 125°C, but he likes them under 80°C.
> 
> So I'm curious: what is the safe temp range for 390 VRMs?


I'm not sure what the safe temps for the VRMs are, but your top card is sucking in all the heat from the bottom card, so that explains the higher temps. You could add some more case fans, or just speed your current fans up if you are worried about the heat. It also really depends on your case: where the intake fans are located and how close they are to the GPUs. The Corsair Air 540 is awesome when it comes to keeping GPUs cool, because it has three 120mm intake fans really close to the GPUs, with no drive cages in the way and no restriction, and two of those fans blow air directly between and below the crossfire setup.


----------



## kizwan

Quote:


> Originally Posted by *desetnik*
> 
> I think I will RMA my 390. Looking at reviews on Amazon and here, everyone is getting good temperatures even with an OC. Mine stays at 80-84°C with fan speed at 80-90%. I wonder if anyone else here gets these temps at stock like me, or if I'm the only one with bad airflow, or if the thermal paste on this card is bad. I removed the side panel but it had zero effect on temps.


What is your ambient temp? Comparing your temps with others' isn't a good idea, because they may have a lower ambient temp than yours. You can try strapping a high-CFM fan to the card to help cool it down a little.
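One way to make those comparisons fairer is to look at the delta over ambient rather than the raw core temp. A quick sketch of the idea (all the sample figures below are hypothetical):

```python
# Rough sketch: compare cooler performance by delta over ambient,
# so two cards measured in rooms at different temperatures can be
# compared fairly. All figures below are hypothetical examples.

def delta_over_ambient(core_c, ambient_c):
    """Degrees the cooler adds above room temperature."""
    return core_c - ambient_c

def f_to_c(f):
    """Convert Fahrenheit to Celsius (many posters report ambient in °F)."""
    return (f - 32) * 5 / 9

if __name__ == "__main__":
    # e.g. 83°C in a 30°C room is a *smaller* delta than 78°C in a 20°C room
    mine = delta_over_ambient(83, 30)
    theirs = delta_over_ambient(78, 20)
    print(mine, theirs, mine < theirs)
```

By that measure, a card hitting 84°C in a hot room can still have a perfectly healthy cooler.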


----------



## rankdropper84

Quote:


> Originally Posted by *kizwan*
> 
> What is your ambient temp? Comparing your temps with others' isn't a good idea, because they may have a lower ambient temp than yours. You can try strapping a high-CFM fan to the card to help cool it down a little.


With an ambient air temperature of 75°F, my 390 will get up to 63°C in Unigine and 3DMark. Playing The Witcher 3 for about an hour, my max temp per HWMonitor was 65°C. I use stock fan profiles, too.


----------



## orlfman

Quote:


> Originally Posted by *kalidae*
> 
> I'm not sure what the safe temps for the VRMs are, but your top card is sucking in all the heat from the bottom card, so that explains the higher temps. You could add some more case fans, or just speed your current fans up if you are worried about the heat. It also really depends on your case: where the intake fans are located and how close they are to the GPUs. The Corsair Air 540 is awesome when it comes to keeping GPUs cool, because it has three 120mm intake fans really close to the GPUs, with no drive cages in the way and no restriction, and two of those fans blow air directly between and below the crossfire setup.


Right now I have two 120mm Scythe 2400rpm GrandFlexes as intake in my HAF XB EVO. One blows air directly toward the cards and is a few inches away from them. I have one 2000rpm Cooler Master Blade Master as rear exhaust, which aligns with the CPU cooler... not so much the cards. I also zip-tied a 120mm Enermax 2000rpm Vegas fan on top of my HAF XB, directly above the cards. This really helped lower temps and I can feel it... it pumps out a lot of air, but I still feel a lot of heat around the top card that isn't getting pumped out.

I went ahead and ordered two Noctua NF-A14 3000rpm fans to use as my new intakes, and one Noctua NF-F12 3000rpm fan to replace the Enermax. Figured those industrial fans should provide enough airflow and exhaust to help the temps.


----------



## kalidae

Quote:


> Originally Posted by *orlfman*
> 
> Right now I have two 120mm Scythe 2400rpm GrandFlexes as intake in my HAF XB EVO. One blows air directly toward the cards and is a few inches away from them. I have one 2000rpm Cooler Master Blade Master as rear exhaust, which aligns with the CPU cooler... not so much the cards. I also zip-tied a 120mm Enermax 2000rpm Vegas fan on top of my HAF XB, directly above the cards. This really helped lower temps and I can feel it... it pumps out a lot of air, but I still feel a lot of heat around the top card that isn't getting pumped out.
> 
> I went ahead and ordered two Noctua NF-A14 3000rpm fans to use as my new intakes, and one Noctua NF-F12 3000rpm fan to replace the Enermax. Figured those industrial fans should provide enough airflow and exhaust to help the temps.


Oh okay. I used to have a HAF XB and recently sold it to a friend of mine (I change cases like every 4 months and my gf hates it haha); it's a really good case. The two fans on the front should be pushing enough cool air into the case, though I suppose it's really only one fan cooling the GPUs, since the other is basically inline with the rear exhaust. Still, it should be doing an alright job; the only issue is that it's just two intake fans. That said, I was running 7870 crossfire in mine and wasn't having any heat issues, though the 390s run hotter than the 7870s. When I swapped to an Air 540, the extra front fan did make a difference in the overall temps of the entire system. Honestly, three intake fans will always be better than two. I'm thinking maybe your case is too small for the amount of heat being produced inside versus the amount of intake it's receiving. I guess the best thing about the HAF XB is that you can remove the top and side panels so it's an open test bench; try that and leave it completely open for a while. You could also rig up another one or two intake fans on the side blowing air onto the GPUs. Just an idea.


----------



## CamsX

Quote:


> Originally Posted by *orlfman*
> 
> right now i have two 120mm scythe 2400rpm grandflexs as intake in my haf xb evo. one blows air directly towards the cards and is about a few inches away from them. i have one 2000 rpm cooler master blade master as exhaust in the rear which aligns with the cpu cooler... not so much the cards. i also zip tied a 120mm enermax 2000rpm vegas fan on the top of my haf xb directly above the cards. this really helped lower temps and i can feel it... it pumps out a lot of air but i still feel a lot of heat around the top card that isn't able to get pumped out.
> 
> i went ahead and ordered two noctua nf-a14 3000rpm fans to use as my new intake fans and one noctua nf-f12 3000rpm fan to replace my enermax fan. figured those industrial fans should really provide enough air flow and exhaust to help the temps.


That's why I asked earlier about your temps. The hot card has to work with the heated air coming off the top of the lower (cooler) card. From what's been posted in this thread, core temp should never get over 94°C or you will get throttling; above that is also bad for card longevity.

For the best stability/performance, VRMs should be below 80°C, with 85°C (I think) the max to prevent failures.
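Those rules of thumb are easy to turn into a quick log check. A sketch using the thread's figures (94°C core throttle point, VRMs ideally under 80°C, ~85°C cap); the sample readings are made up, and real ones would come from a monitoring tool like HWiNFO or GPU-Z:

```python
# Sketch: flag readings against the rule-of-thumb limits quoted above.
# Thresholds are the thread's figures; the sample readings are made up.
# Real readings would come from a monitoring tool (HWiNFO, GPU-Z, ...).

CORE_THROTTLE_C = 94   # throttling kicks in around here
VRM_IDEAL_C     = 80   # best stability/performance below this
VRM_MAX_C       = 85   # stay under this to prevent failures

def check(core_c, vrm_c):
    """Return a list of warnings for one (core, VRM) temperature sample."""
    warnings = []
    if core_c >= CORE_THROTTLE_C:
        warnings.append(f"core {core_c}°C: at/above throttle point")
    if vrm_c >= VRM_MAX_C:
        warnings.append(f"VRM {vrm_c}°C: above safe maximum")
    elif vrm_c >= VRM_IDEAL_C:
        warnings.append(f"VRM {vrm_c}°C: above ideal range")
    return warnings

if __name__ == "__main__":
    for sample in [(71, 68), (83, 82), (95, 93)]:
        print(sample, check(*sample) or "OK")
```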


----------



## orlfman

Quote:


> Originally Posted by *CamsX*
> 
> That's why I asked earlier about your temps. The hot card has to work with the heated air coming off the top of the lower (cooler) card. From what's been posted in this thread, core temp should never get over 94°C or you will get throttling; above that is also bad for card longevity.
> 
> For the best stability/performance, VRMs should be below 80°C, with 85°C (I think) the max to prevent failures.


Yeah, that's what I keep hearing about wanting the VRMs under 80°C. Core temps are fine on both cards, well under 85°C, with the bottom staying in the 70s. It's the top card's VRMs I'm unsure about; the highest they've gotten is 93°C, during the Heaven benchmark. Gaming, they're around the mid 80s.
Quote:


> Originally Posted by *kalidae*
> 
> Oh okay. I used to have a HAF XB and recently sold it to a friend of mine (I change cases like every 4 months and my gf hates it haha); it's a really good case. The two fans on the front should be pushing enough cool air into the case, though I suppose it's really only one fan cooling the GPUs, since the other is basically inline with the rear exhaust. Still, it should be doing an alright job; the only issue is that it's just two intake fans. That said, I was running 7870 crossfire in mine and wasn't having any heat issues, though the 390s run hotter than the 7870s. When I swapped to an Air 540, the extra front fan did make a difference in the overall temps of the entire system. Honestly, three intake fans will always be better than two. I'm thinking maybe your case is too small for the amount of heat being produced inside versus the amount of intake it's receiving. I guess the best thing about the HAF XB is that you can remove the top and side panels so it's an open test bench; try that and leave it completely open for a while. You could also rig up another one or two intake fans on the side blowing air onto the GPUs. Just an idea.


Yeah, that's why I went ahead and ordered the two new 3000rpm 140mm fans. Figured they can push a ton of air to make up for the second fan being lined up with the CPU, plus the 120mm 3000rpm fan to run as exhaust, jury-rigged directly above the cards on the top lid mesh.


----------



## kalidae

Quote:


> Originally Posted by *orlfman*
> 
> Yeah, that's what I keep hearing about wanting the VRMs under 80°C. Core temps are fine on both cards, well under 85°C, with the bottom staying in the 70s. It's the top card's VRMs I'm unsure about; the highest they've gotten is 93°C, during the Heaven benchmark. Gaming, they're around the mid 80s.
> Yeah, that's why I went ahead and ordered the two new 3000rpm 140mm fans. Figured they can push a ton of air to make up for the second fan being lined up with the CPU, plus the 120mm 3000rpm fan to run as exhaust, jury-rigged directly above the cards on the top lid mesh.


It's going to be loud as a jet with those high-rpm fans, but it's going to help, that's for sure. If it does help, then you might want to look into another case somewhere down the line. I just got a Noctis 450 and I can't recommend it enough: same frame as the H440 but 10x more airflow, and a PWM fan hub built in. Best case I have ever owned, and I have owned a lot.


----------



## Dundundata

I've decided to go with the XFX 390, but I think I should get a new PSU. Is this a good choice, XFX 750 Gold for $100?

http://www.newegg.com/Product/Product.aspx?Item=N82E16817207033

There's also a seasonic 750/gold for $120

http://www.newegg.com/Product/Product.aspx?Item=N82E16817151132

I suppose I could spend a little more and get an 850w if I ever decide to crossfire...


----------



## rdr09

Quote:


> Originally Posted by *Dundundata*
> 
> I've decided to go with the XFX 390, but I think I should get a new PSU. Is this a good choice, XFX 750 Gold for $100?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817207033
> 
> There's also a seasonic 750/gold for $120
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817151132
> 
> I suppose I could spend a little more and get an 850w if I ever decide to crossfire...


850, if not much more. You might add another 390 in the future.

edit:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817438030


----------



## Dundundata

Quote:


> Originally Posted by *rdr09*
> 
> 850, if not much more. You might add another 390 in the future.
> 
> edit:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438030


Thanks, although I'd like to stick with at least a Gold. I don't mind spending a bit more ($150) if it's worth it, and I've heard XFX/Seasonic are supposed to be the top brands?


----------



## kizwan

Quote:


> Originally Posted by *Dundundata*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> 850 if not much more. might add another 390 in the future.
> 
> edit:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438030
> 
> 
> 
> Thanks although I'd like to stick with at least a Gold. I don't mind spending a bit more ($150) if it's worth it, and I've heard XFX/Seasonic are supposed to be the top brands?

The 80 Plus rating, whether Bronze, Silver, Gold, Platinum or Titanium, is just good for your electricity bill. It doesn't dictate whether the PSU is good quality or not; quality meaning voltage regulation, the quality of the components used in it, and whether it takes out your other hardware when it fails.

Seasonic is a top and trusted brand. XFX depends on which OEM was used to make the PSU.

BTW, depending on your computer's specification, 850W may not be enough for a pair of 390s. The 390 at stock is already overclocked, which means its power consumption may be higher than a 290's at stock clocks.
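A back-of-the-envelope way to size the PSU for crossfire; every wattage below is a ballpark assumption rather than a measurement, which is exactly why the extra headroom matters:

```python
# Back-of-the-envelope PSU sizing for a crossfire build. Every wattage
# here is a ballpark assumption, not a measurement; an overclocked 390
# can pull well past its ~275 W rating, which is the point above.

PARTS_W = {
    "r9_390":   275,  # per card, stock; overclocked cards draw more
    "cpu_oc":   150,  # overclocked quad-core, rough figure
    "board_etc": 75,  # motherboard, RAM, drives, fans
}

def system_draw(num_gpus, oc_margin=1.2):
    """Estimated peak draw in watts; oc_margin pads the GPU figure."""
    gpus = PARTS_W["r9_390"] * num_gpus * oc_margin
    return gpus + PARTS_W["cpu_oc"] + PARTS_W["board_etc"]

def psu_ok(psu_w, num_gpus, headroom=0.9):
    """True if estimated draw stays under `headroom` of the PSU rating."""
    return system_draw(num_gpus) <= psu_w * headroom

if __name__ == "__main__":
    for psu in (750, 850, 1000):
        print(psu, "W ->", "OK" if psu_ok(psu, 2) else "tight/insufficient")
```

With these assumed figures, two 390s land around 885W peak, which is why 850W comes out "tight" and the 1000W recommendations in this thread make sense.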


----------



## Renner

Personally, choosing between those two PSUs, I would go Seasonic.


----------



## Dorland203

IMO, I would buy a 1000W PSU at minimum to crossfire 390s.


----------



## desetnik

Quote:


> Originally Posted by *bazookatooths*
> 
> How is your case airflow, and what's your fan curve? I'm overclocked to 1175/1720 on the XFX, and I finally got around to setting a custom fan profile: 30% at 60°C, 55% at 70°C, up to 70% at 100°C. Gaming I get low 60s max, and my CPU fan is the only thing I can barely hear. If I cap at 60 fps it doesn't get over the low 50s and is inaudible.
> 
> Mine has only gotten to 80°C running the Heaven bench with only 30% fan speed, which blew my mind. So in theory you should be able to run 30% fan speed (no noise) and stay in the 70-75°C range?
> 
> So those numbers do seem strange. Best of luck, and I hope you get this sorted. Try a side fan and a couple of front fans; I took off my bottom intake because I didn't need it.


My fan speed is 0% when the card idles at 60°C, 60% at 75°C, and 100% at 85°C.

I got a max of 80°C on Heaven too, but the fan was at like 80%, not 30%. Airflow isn't ideal, but I still doubt it would cause such high temperatures the way it's set up now; it's not like everything is closed in. Playing games on my PC is a jet-like experience! I remember benchmarking my 7870 last year on the same setup and it made no sound, while this one is a different story.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> What has PowerColor been smoking? Da hell is the point of using Grenada Pro?
> 
> http://wccftech.com/powercolor-launches-devil-13-r9-390-dual-grenada-pro-graphics-card-features-5120-sps-16-gb-gddr5-vram/


What's even dumber is that they already did this with the Devil 13, except the last one used Hawaii XT!


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What's even dumber is that they already did this with the Devil 13, except the last one used Hawaii XT!


Welcome back. haha.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Welcome back. haha.


Yeah, been out...

My motherboard went BOOM!


----------



## xer0h0ur

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What's even dumber is that they already did this with the Devil 13, except the last one used Hawaii XT!


But that was just an air-cooled 295X2 with a custom PCB. This time around they didn't even use the top chip available to them. I wonder if AMD told them not to make a card that beats the 295X2.


----------



## Dundundata

Thanks for all your help, finally pulled the trigger. XFX 390 and 750 Seasonic, should be plenty powerful. If I ever decide to Crossfire I'll upgrade PSU, but it won't be anytime soon. I'll post up when everything is installed and make it official!


----------



## Depauville Kid

My buddy is looking to grab a 390X. Is there anyone who can vouch for a coil-whine-free card, or any to steer clear of? His current 770 is a wicked buzzer and he's hoping to avoid that on this card if possible.

Thanks.


----------



## Renner

My MSI 390 is pretty much whine-free, and I have a really quiet system after getting rid of the stock CPU cooler. However, I have heard from some customers that my Corsair HX850i can suffer from coil whine, though I haven't heard any so far myself. I don't recall reading anyone complaining about coil whine on the 390(X) series. It may depend on your PSU as well.

Regarding Dundundata's choice, it's a good one. I was looking for a 650W PSU and couldn't find one, then a 750W, which wasn't available here either, so I had to go with 850W because I really wanted this particular model. But God, that 24-pin... I don't know whether it was harder to push into the PSU itself or into the MB. Always some BS. After getting this much power I was thinking about CF of these cards, but given that the card uses over 2 slots, the second one would sit right next to the sound card and block one of its vents, so it's not really ideal. That Devil 13 dual GPU, however, would be perfect.

This card should last me several years; I lived 5 years with a 1GB 6850, so I think this one will do just fine. By the time I start considering a new rig, new nodes and memory systems will be the standard, and if the Fury Nano is a glimpse into the future, I may even go with a mini-ITX build that won't even cause my PSU fan to start spinning under full load (I have a 7-year warranty on this one, so I'm planning to use it in the future as well).


----------



## jaydude

Quote:


> Originally Posted by *Depauville Kid*
> 
> My buddy is looking to grab a 390X. Is there anyone who can vouch for a coil-whine-free card, or any to steer clear of? His current 770 is a wicked buzzer and he's hoping to avoid that on this card if possible.
> 
> Thanks.


No coil whine on my Gigabyte 390 so far; MSI doesn't have it either from what I have read. I would steer clear of Asus. Maybe the Strix is fine, but the DC2 doesn't have VRM cooling IIRC, so it's likely to be more susceptible to it.

Also, if overclocking is a priority, then Gigabyte is a no-go: it's voltage-locked.


----------



## Depauville Kid

Anyone have the straight XFX 390 who can comment on coil whine?


----------



## Renner

Regarding Asus: indeed. I've seen tests where the Asus 390 reaches 86 degrees under load while the Sapphire 390 Nitro only hits 66... and that's with 3 fans; my dual-fan MSI performs far better than that. I don't know, Asus has disappointed me with their cards for the past gen or two, at least regarding AMD: first slapping a 780 cooler on the 290s, then Elpida memory chips even on their Strix models, and now this. Avoid it at all costs.


----------



## Oregonduck007

My MSI R9 390 should be here in 3 days!! Can't f***ing wait!


----------



## Renner

Quote:


> Originally Posted by *Oregonduck007*
> 
> My MSI R9 390 should be here in 3 days!! Can't f***ing wait!


I know how it feels. I could barely wait for mine. Then after that, I had to wait several days for the new PSU. Then, after I tried assembling everything, I figured out it couldn't fit into my old case. Then, after I got a new case, I had to hold off playing anything while waiting for the new CPU cooler, since it was overheating with the stock one... Took a while before I could game like a human.


----------



## semiroundboss

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What's even dumber is that they already did this with the Devil 13, except the last one used Hawaii XT!


I wonder if it can be unlocked to Grenada XT. Might be worth it then.


----------



## Jhereck

Help needed here :

https://community.amd.com/message/2669920#2669920

And is there a program that can force the 2D memory clock on Windows 10? (MSI R9 390)

Thanks !


----------



## Agent Smith1984

Quote:


> Originally Posted by *semiroundboss*
> 
> I wonder if it can be unlocked to Grenada XT. Might be worth it then.


Only if it'll take the original Devil BIOS, and I highly doubt it.


----------



## Derek129

My video driver keeps crashing while playing GTA V and it's absolutely driving me insane. Does anybody know a fix for this?


----------



## CamsX

Finally, 10.5k in FireStrike.

Graphics Score
12776

Physics Score
10064

Combined Score
4780

[email protected]
[email protected]/1525

http://www.3dmark.com/3dm/8485892


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, been out...
> 
> My motherboard went BOOM!


Yikes, that sucks. The VRMs didn't survive the summer? Hope you get everything sorted soon.


----------



## geekforspeed

Hey, I just made an account on this forum because my day of OC experiments with my recently bought 390 wasn't as fulfilling as I'd have liked.
After a lot of you reported core clocks between 1100 and 1200MHz, I was eager to get a decent bump in performance.
I set the TDP to +20% (MSI Afterburner) and started messing around at about 1120MHz core clock.
Everything seemed fine at first while playing some Ground Zeroes, until I suddenly noticed some heavy dips in FPS.
I installed FurMark to check what might be the issue, and it seems my core clock just randomly drops down to 300MHz every now and then for a split second. This starts happening after a few minutes of heavy GPU load.
Does anyone have an idea? I've already tried fiddling with vCore and TDP settings; nothing got rid of the problem.
Did I really get screwed in the silicon lottery, is my Sapphire 390 Nitro defective, or am I doing something completely wrong?
The most I've managed to get out of my 390 while staying stable was around 1086MHz, with TDP +20% and vCore increased ever so slightly.
What I probably should add: I noticed my GPU not running at its advertised 1010MHz at stock settings; only if I increase the TDP does it reach that clock speed.
Help would be much appreciated. Thanks in advance.
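Dips like that show up clearly if you log the core clock. A sketch of scanning logged samples for drops below the base clock; the sample values and threshold here are illustrative assumptions, and real data would come from something like GPU-Z's sensor log or Afterburner's hardware monitor:

```python
# Sketch: scan a sequence of logged core-clock samples for throttle dips
# like the ones described above (brief drops to ~300 MHz under load).
# The samples and threshold are illustrative; real data would come from
# GPU-Z's sensor log or Afterburner's hardware monitor.

def find_dips(samples_mhz, base_mhz=1010, tolerance=0.9):
    """Return (index, clock) pairs where the clock fell below
    tolerance * base_mhz while the card should be under full load."""
    floor = base_mhz * tolerance
    return [(i, mhz) for i, mhz in enumerate(samples_mhz) if mhz < floor]

if __name__ == "__main__":
    # a short stretch of samples with two dips to the 300 MHz idle state
    log = [1120, 1120, 1118, 300, 1120, 1119, 300, 1120]
    dips = find_dips(log)
    print(f"{len(dips)} dip(s): {dips}")
```

Counting dips per minute at different power limits would at least show whether the power limit or something else is the culprit.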


----------



## Lazy Dog

I just looooove my two little dragons

Spoiler: Warning: Spoiler!

Spoiler: Some benches

I know both of them can do 1200/1700, but the weather here in Athens is just too damn hot to try... ~40 degrees Celsius.

The top card reaches 91°C after 2 hours of BF4 with fans at 100%; the bottom one never breaks the 70°C mark (with fans around 50%).


----------



## Rockarolla

Which brand is best for watercooling?


----------



## flopper

Quote:


> Originally Posted by *Rockarolla*
> 
> Which brand is best for watercooling?


XFX seems to use the reference 290X layout, which allows a waterblock to fit. However, check with the manufacturer first before ordering.


----------



## BradleyW

Quote:


> Originally Posted by *Rockarolla*
> 
> Which brand is best for watercooling?


Any, as long as it's reference.


----------



## Dorland203

Quote:


> Originally Posted by *geekforspeed*
> 
> Hey, I just made an account on this forum because my day of OC experiments with my recently bought 390 wasn't as fulfilling as I'd have liked.
> After a lot of you reported core clocks between 1100 and 1200MHz, I was eager to get a decent bump in performance.
> I set the TDP to +20% (MSI Afterburner) and started messing around at about 1120MHz core clock.
> Everything seemed fine at first while playing some Ground Zeroes, until I suddenly noticed some heavy dips in FPS.
> I installed FurMark to check what might be the issue, and it seems my core clock just randomly drops down to 300MHz every now and then for a split second. This starts happening after a few minutes of heavy GPU load.
> Does anyone have an idea? I've already tried fiddling with vCore and TDP settings; nothing got rid of the problem.
> Did I really get screwed in the silicon lottery, is my Sapphire 390 Nitro defective, or am I doing something completely wrong?
> The most I've managed to get out of my 390 while staying stable was around 1086MHz, with TDP +20% and vCore increased ever so slightly.
> What I probably should add: I noticed my GPU not running at its advertised 1010MHz at stock settings; only if I increase the TDP does it reach that clock speed.
> Help would be much appreciated. Thanks in advance.


Have you disabled ULPS and set unofficial overclocking mode to "*without* PowerPlay support"?
What PSU are you using?
You should always set the power limit to +50%.
My MSI 390X can reach 1150/1625 with +50mV/+25mV core/AUX voltage and +50% power limit.


----------



## s0ad33

Hi

Will an R9 390 Nitro fit in a Cooler Master CM RC-690 II case? I've seen pictures of a 290 Tri-X fitting in one, but the Nitro should be a few mm longer.


----------



## Depauville Kid

Quote:


> Originally Posted by *Depauville Kid*
> 
> Anyone have the XFX straight 390 and can comment on coil whine?


My buddy is going to pull the trigger on the XFX r9 390, unless anyone has any reason not to?

Thanks everyone.


----------



## battleaxe

Quote:


> Originally Posted by *Depauville Kid*
> 
> My buddy is going to pull the trigger on the XFX r9 390, unless anyone has any reason not to?
> 
> Thanks everyone.


Good card from what we have seen here. Has a nice VRM cooler section.


----------



## Depauville Kid

Quote:


> Originally Posted by *battleaxe*
> 
> Good card from what we have seen here. Has a nice VRM cooler section.


Thanks. He and I were looking at the XFX and the Sapphire. I recommended the XFX to him just because the Sapphire card is so large. He has room in his case for it, but it's just too much.

Thanks again.


----------



## semiroundboss

Quote:


> Originally Posted by *Depauville Kid*
> 
> My buddy is going to pull the trigger on the XFX r9 390, unless anyone has any reason not to?
> 
> Thanks everyone.


XFX if he wants dual slot and/or wants to liquid cool.
The Sapphire Nitro, for a little more, gives better temps and has a dual BIOS, so he could potentially try unlocking it into a 390X without bricking the card; keep in mind the unlock is not guaranteed, and it's a 2.2-slot card. Though I'd recommend the PowerColor R9 390, as I've heard it gives the best temps; it's triple slot. If he needs dual slot, go with XFX; if he can fit a bigger card, then PowerColor or Sapphire is the way to go.


----------



## Depauville Kid

Quote:


> Originally Posted by *semiroundboss*
> 
> XFX if he wants dual slot and/or wants to liquid cool.
> The Sapphire Nitro, for a little more, gives better temps and has a dual BIOS, so he could potentially try unlocking it into a 390X without bricking the card; keep in mind the unlock is not guaranteed, and it's a 2.2-slot card. Though I'd recommend the PowerColor R9 390, as I've heard it gives the best temps; it's triple slot. If he needs dual slot, go with XFX; if he can fit a bigger card, then PowerColor or Sapphire is the way to go.


He won't overclock or water-cool. As I said, he can fit any card, but he just doesn't want a card that large. His GTX 770 already sags and the Sapphire card is significantly larger. I think we're set on the XFX.


----------



## semiroundboss

Quote:


> Originally Posted by *Depauville Kid*
> 
> He won't overclock or water cool. As I said, he can fit any card, but just doesn't want a card that large. His GTX 770 already sags and the Sapphire card is significantly larger. I think we're set on the XFX.


Then XFX is his card. The backplate will help with sagging.


----------



## geekforspeed

Quote:


> Originally Posted by *Dorland203*
> 
> Have you disabled ULPS and set unofficial overclocking mode to "*without* powerplay support"?
> What PSU are you using?


In fact I did not have those settings enabled, but after doing so, not much has changed. I'm still getting those drops after 1-2 minutes of running FurMark.
I have a 630W be quiet! PSU.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> Yikes, that sucks. The VRMs didn't survive the summer? Hope you get everything sorted soon.


Yeah, popped a MOSFET...

Should be up in a week or so with a Formula-Z, and trying out the ASUS 390X Strix...

I sold my MSI because of the 2.5-slot cooler; I'm going to need 2-slot cards for crossfire on the new board.

I'll get new members added tomorrow!


----------



## CamsX

Quote:


> Originally Posted by *s0ad33*
> 
> Hi
> 
> Will an R9 390 Nitro fit in a Cooler Master CM RC-690 II case? I've seen pictures of a 290 Tri-X fitting in one, but the Nitro should be a few mm longer.


I would say it would be a very tight fit, if it fits at all.

Here is a side-by-side of the 690 versus how it fits in my NZXT case. Just by looking at the distance from the rubber grommets to the motherboard tray holes, I think they are further apart than the ones in the 690, and the Nitro is right at their limit.

If you go for it, just be ready to jump on a new case if it doesn't fit.


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, popped a MOSFET...
> 
> Should be up in a week or so with a Formula-Z, and trying out the ASUS 390X Strix...
> 
> I sold my MSI because of the 2.5-slot cooler; I'm going to need 2-slot cards for crossfire on the new board.
> 
> I'll get new members added tomorrow!


I'm glad you're getting it sorted. Also looking forward to seeing how the Strix does in your hands.

That way we can confirm whether or not the Asus card is as capable as the other brands.


----------



## CamsX

Quote:


> Originally Posted by *Depauville Kid*
> 
> He won't overclock or water cool. As I said, he can fit any card, but just doesn't want a card that large. His GTX 770 already sags and the Sapphire card is significantly larger. I think we're set on the XFX.


The card is large, but even without a backplate, the internal frame of the heatsink was designed to prevent any sagging. I'll confirm in a couple of months whether Sapphire's claims are true or not.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CamsX*
> 
> The card is large, but even without a backplate, the internal frame of the heatsink was designed to prevent any sagging. I'll confirm in a couple of months whether Sapphire's claims are true or not.


Really looking forward to it! Will keep everyone posted on this ROG build!


----------



## geekforspeed

Quote:


> Originally Posted by *geekforspeed*
> 
> In fact I did not have those settings enabled, but after doing so, not much has changed. I'm still getting those drops after 1-2 minutes of running FurMark.
> I have a 630W be quiet! PSU.


Anyone?
Getting kinda desperate over here.


----------



## diggiddi

Quote:


> Originally Posted by *geekforspeed*
> 
> Anyone?
> Getting kinda desperate over here.


FurMark es no bueno; it just burns up your GPU. It may be that the GPU is throttling due to the heat load.


----------



## geekforspeed

Quote:


> Originally Posted by *diggiddi*
> 
> FurMark es no bueno; it just burns up your GPU. It may be that the GPU is throttling due to the heat load.


It doesn't exceed 80°C, which should be fine with this line of GPUs, I guess?


----------



## Gumbi

Quote:


> Originally Posted by *geekforspeed*
> 
> It doesn't exceed 80°C, which should be fine for this line of GPUs, I guess?


Power throttling. FurMark is a waste of time. Run a Heaven bench at max settings at 1080p and see how you get on. Post your results here (most stock 390(X)s score 55-60 FPS in Heaven at max settings at 1080p).


----------



## bazookatooths

Quote:


> Originally Posted by *Depauville Kid*
> 
> Anyone have the XFX straight 390 and can comment on coil whine?


I have it paired with an EVGA 750W G2; no problems here.


----------



## Depauville Kid

Quote:


> Originally Posted by *bazookatooths*
> 
> I have it paired with an EVGA 750W G2; no problems here.


Thanks.


----------



## K1mer0

Guys, Gigabyte R9 390 8GB GDDR5 or XFX R9 390 8GB GDDR5 DD Black Edition...which one do you prefer...?

thanks


----------



## By-Tor

Quote:


> Originally Posted by *K1mer0*
> 
> Guys, Gigabyte R9 390 8GB GDDR5 or XFX R9 390 8GB GDDR5 DD Black Edition...which one do you prefer...?
> 
> thanks


Between those two it would be the XFX, because EK is only producing full-cover water blocks for the XFX, ASUS and PowerColor 390/390X cards.

But for me the PowerColor would be my first choice out of all the 390/390X cards.

My 2 cents


----------



## CamsX

Quote:


> Originally Posted by *K1mer0*
> 
> Guys, Gigabyte R9 390 8GB GDDR5 or XFX R9 390 8GB GDDR5 DD Black Edition...which one do you prefer...?
> 
> thanks


XFX if you plan to overclock. Otherwise, choose whatever matches your build color better.

Damn, that reminded me of my wife's fashion tips.


----------



## K1mer0

XFX it is! The main advantage of the Gigabyte is its three DisplayPort 1.2 outputs; the XFX has only one DP 1.2.

thanks guys,


----------



## kizwan

Quote:


> Originally Posted by *Depauville Kid*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Depauville Kid*
> 
> Anyone have the XFX straight 390 and can comment on coil whine?
> 
> 
> 
> My buddy is going to pull the trigger on the XFX r9 390, unless anyone has any reason not to?
> 
> Thanks everyone.

Coil whine is pretty random; it is not brand-specific.
Quote:


> Originally Posted by *geekforspeed*
> 
> Hey, just made an account on this forum because my day of experimenting with OC on my recently bought 390 wasn't as fulfilling as I'd liked it to be.
> After a lot of you reported core clocks between 1100 and 1200MHz, I was eager to get a decent bump in performance.
> I set the TDP to +20% (MSI Afterburner) and started messing around at about 1120ish MHz core clock.
> Everything seemed fine at first while playing some Ground Zeroes, when I suddenly noticed some heavy dips in FPS.
> I installed FurMark to check what might be the issue, and it seems my core clock just randomly drops down to 300MHz every now and then for a split second. This starts to happen after a few minutes of heavy GPU load.
> Does anyone have an idea? I've already tried fiddling with vCore and TDP settings; nothing seemed to get rid of the problem.
> Did I just really get screwed over by the silicon lottery, or is my Sapphire 390 Nitro defective? Or am I doing something completely wrong?
> The most I've managed to get out of my 390 while running stable was around 1086MHz with TDP +20% and vCore increased ever so slightly.
> What I probably should add: I noticed my GPU not running at its advertised 1010MHz on stock settings; only if I increase the TDP does it seem to reach said clock speed.
> Help would be much appreciated. Thanks in advance.


Just a couple of pieces of advice.

Always max out the power limit when overclocking. That way your card won't hit the TDP limit as easily.
DO NOT use FurMark for any reason except frying eggs, and you have a kitchen for that.
Your be quiet! PSU has two rails, and each rail supports 30A max, which means 360W max for each +12V rail. If there's no option on the connectors at the back of the PSU for 12V1 and 12V2, then all your hardware is pulling power from the +12V1 rail. If that's the case, it would explain why your card is throttling: not enough power for the GPU and the other hardware.
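
To put rough numbers on that rail math (a quick sketch only; the 30A-per-rail figure is from the PSU spec above, while the individual component draws are made-up examples, not measurements):

```python
# Rough single-rail power budget for a two-rail PSU,
# following the 12V x 30A = 360W figure above.

RAIL_VOLTAGE = 12.0   # volts
RAIL_AMPS = 30.0      # amps per +12V rail (from the PSU spec quoted above)

rail_watts = RAIL_VOLTAGE * RAIL_AMPS   # 360 W per rail

# Hypothetical loads, assuming everything lands on one rail:
loads = {"GPU (R9 390, heavy load)": 275, "CPU": 90, "fans/drives/etc.": 30}
total = sum(loads.values())

print(f"Per-rail capacity: {rail_watts:.0f} W")      # 360 W
print(f"Combined draw on one rail: {total} W")       # 395 W
print("Over budget!" if total > rail_watts else f"Headroom: {rail_watts - total:.0f} W")
```

With numbers like these, a single rail is overloaded even though the PSU's total wattage looks sufficient, which matches the throttling symptom described above.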


----------



## djohny24

Hello! I'm new here, with a "rare" Gigabyte 390, hehe.

Some pics:

Voltage is locked, but it runs perfectly at 1090/1640MHz.

I made a review vs the Asus 970 Mini. If you wanna take a look, here it is: http://foro.noticias3d.com/vbulletin/showthread.php?t=443266

Is there any other user with this card? Any unlocked BIOS?

Oh, I changed the thermal paste today, applying Gelid GC Extreme. Temps dropped from 77°C to 74°C using Kombustor at 1080p with MSAA 4X.

This is the PCB:

And the heatsink; it works perfectly and all parts are covered.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, popped a mosfet...
> 
> Should be up in a week or so with a formula-z and trying out the ASUS 390X Strix...
> 
> I sold my MSI cause of the 2.5-slot cooler. I'm going to need 2-slot cards for crossfire on the new board.
> 
> I'll get new members added tomorrow!


'Twill be interesting to see how she performs... Why the X model all of a sudden?

The Asus gets very hot, but I believe it's mainly due to an extremely conservative fan profile.


----------



## Jon55

Very nice! In that last pic, did you replace the thermal pads on the VRMs/RAM (when you also replaced the stock thermal paste)?


----------



## orlfman

just want to say i got my new noctua ippc fans. 2x 140mm 3000 rpms for intake and 1x 120mm 3000 rpm as exhaust. got them hooked up to my swiftech 8 pwm splitter. dropped my top card vrm idle from 62c down to 53c (i have a 144hz monitor so it keeps my top cards memory running at its full 1500mhz instead of downclocking to 150mhz when at idle, and this happened with my 970 sli setup too) and my top cards vrm load 91c down to 76c! the extra air flow really improved the temps.


----------



## Gumbi

Quote:


> Originally Posted by *orlfman*
> 
> just want to say i got my new noctua ippc fans. 2x 140mm 3000 rpms for intake and 1x 120mm 3000 rpm as exhaust. got them hooked up to my swiftech 8 pwm splitter. dropped my top card vrm idle from 62c down to 53c (i have a 144hz monitor so it keeps my top cards memory running at its full 1500mhz instead of downclocking to 150mhz when at idle, and this happened with my 970 sli setup too) and my top cards vrm load 91c down to 76c! the extra air flow really improved the temps.


I recently added 2000RPM IPPC Noctua fans to my setup (one 140mm exhaust, one 120mm side intake and two 120mm front intakes) and saw my temps improve a lot too.

Have them hooked up to my voltage controller, so I get the best of both worlds.

2000RPM is super loud and I only run it while gaming; 3000RPM must be crazy.

That problem with 144Hz is a known one. I run my monitor at 120Hz and it sorts it out (120Hz for normal stuff, then up to 144Hz when gaming). That allows the memory to idle at 150MHz.


----------



## CamsX

Quote:


> Originally Posted by *orlfman*
> 
> just want to say i got my new noctua ippc fans. 2x 140mm 3000 rpms for intake and 1x 120mm 3000 rpm as exhaust. got them hooked up to my swiftech 8 pwm splitter. dropped my top card vrm idle from 62c down to 53c (i have a 144hz monitor so it keeps my top cards memory running at its full 1500mhz instead of downclocking to 150mhz when at idle, and this happened with my 970 sli setup too) and my top cards vrm load 91c down to 76c! the extra air flow really improved the temps.


I'm glad to hear that. Awesome job on the VRMs. Further proof of what Agent achieved with his airflow configuration.


----------



## CamsX

I have finally overclocked my R9 390 Nitro beyond my comfort zone. Still nothing extreme or anything, just looking for stock voltage limits.

Core @ 1110
Memory @ 1600
Stock Voltage

http://www.3dmark.com/3dm/8495010
Overall: *10995*
Graphics Score: 13355
Physics Score: 10122
Combined Score: 5008

Just short of 11k in FireStrike, 1500 in Heaven. No artifacts that I could notice. Will test further with games before I try to increase it again.


----------



## jaydude

Quote:


> Originally Posted by *djohny24*
> 
> Hello! I'm new here, with a "rare" Gigabyte 390, hehe.
> 
> Some pics:
> 
> Voltage is locked, but it runs perfectly at 1090/1640MHz.
> 
> I made a review vs the Asus 970 Mini. If you wanna take a look, here it is: http://foro.noticias3d.com/vbulletin/showthread.php?t=443266
> 
> Is there any other user with this card? Any unlocked BIOS?
> 
> Oh, I changed the thermal paste today, applying Gelid GC Extreme. Temps dropped from 77°C to 74°C using Kombustor at 1080p with MSAA 4X.
> 
> This is the PCB:
> 
> And the heatsink; it works perfectly and all parts are covered.


Nice pics. I also replaced the thermal paste with some CM Essential E1; temps dropped a few degrees, which is nice.

As for the BIOS, I have been looking around and have not come across anything. I am hoping to lower the voltage and see how it affects temps, but as of now that seems far-fetched, and the lack of a dual BIOS would worry me too much to try.

Edit: Great review. There are no other Gigabyte 390 reviews, so this is the first I have seen.

Great work


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> 'Twill be interesting to see how she performs... Why the X model all of a sudden?
> 
> The Asus gets very hot, but I believe it's mainly due to an extremely conservative fan profile.


Cause I'm doing this once and doing it all the way... Fury to me is a ripoff, and even the 390X is priced too high compared to the 390, but I have realised that it takes a 390 @ 1200 to match an X @ 1150, so if I can get 1180-1200 on two 390Xs it'll be pretty potent. Not to say the cheapskate in me won't still emerge at the last minute and just get the 390 Strix.

I'll be buying the motherboard, and the first gpu this week, and then buying the second gpu at Christmas.

This busted board put a delay on my crossfire plans though.


----------



## CamsX

Hmm, noticed some weird textures while playing BF4. Lots of flickering colors. Is this Core or Memory?


----------



## kizwan

Quote:


> Originally Posted by *CamsX*
> 
> Hmm, noticed some weird textures while playing BF4. Lots of flickering colors. Is this Core or Memory?


Flickering, with or without colours, would be memory if I'm not mistaken. Try increasing the voltage a little bit.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Cause I'm doing this once and doing it all the way... Fury to me is a ripoff, and even the 390X is priced too high compared to the 390, but I have realised that it takes a 390 @ 1200 to match an X @ 1150, so if I can get 1180-1200 on two 390Xs it'll be pretty potent. Not to say the cheapskate in me won't still emerge at the last minute and just get the 390 Strix.
> 
> I'll be buying the motherboard, and the first gpu this week, and then buying the second gpu at Christmas.
> 
> This busted board put a delay on my crossfire plans though.


Haha I know the feeling.

Having read reviews of the Strix, I don't understand why, but it seems to draw considerably more power under load than other comparable 390Xs... maybe it's because they let it run hotter?

For example, Guru3D had it drawing 30 watts more than other 390Xs under load. It also hit 94°C at +100mV and 60% fan speed!

I hope you can put a stamp on them and figure out what's up. Maybe it is just a simple matter of airflow.

Asus nailed the cooler design for the Strix Fury, so they shouldn't have any problem with the 390X model!


----------



## djohny24

Thanks guys!

I left the same thermal pads; they are new. And yes, my review is the first Gigabyte review, I don't know why... but it's the smallest 390 on the market!

Yeah, I also want to lower the voltage; it must be quite high.

On our forum, another user and I tried Ashes of the Singularity. Although it's not the same system, it was interesting.

970: http://fotos.subefotos.com/0785807492045003b0262d95c31eb3ebo.jpg
390: http://i.imgur.com/sP754hE.jpg


----------



## Noirgheos

Alright guys, for pure cooling, MSI or XFX? No water cooling, just the default air coolers on them.

And just to clarify, I can OC the XFX to match the performance of the MSI, right? Seeing as at stock the MSI 390X performs 3-4 FPS better.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Haha I know the feeling.
> 
> Having read reviews of the Strix, I don't understand why, but it seems to draw considerably more power under load than other comparable 390Xs... maybe it's because they let it run hotter?
> 
> For example, Guru3D had it drawing 30 watts more than other 390Xs under load. It also hit 94°C at +100mV and 60% fan speed!
> 
> I hope you can put a stamp on them and figure out what's up. Maybe it is just a simple matter of airflow.
> 
> Asus nailed the cooler design for the Strix Fury, so they shouldn't have any problem with the 390X model!


Uggg...

Took a look around, and you are right...

The ASUS has some nasty temps.....

I have been trying to get away from poor temps, not put myself right back in the middle of them.

Maybe I try an MSI 390X instead of the 390.....?

I really wish I could find some middle ground motherboard for this rig.... (keeping in mind I am sticking firm to my red/black color scheme)

I am not going to be looking at the ASRock killer at all, as it is basically the big brother of the one that just went boom on me.

The MSI 970 Gaming seems a nice board, but I believe it's only going to fetch me around 4.7GHz or so (though it would probably be very reliable doing it).

The MSI 990FXA-Gaming HAS NO REVIEWS AT ALL, and that bothers me, because at around $135 it's more in the price range I'd like to be in, and if it would hold 4.8-4.9GHz I'd be all over it (though I know the elusive 5GHz will likely not happen).

The Sabertooth is my automatic first pick from a price/performance/options standpoint, but it's ugly as sin (sorry kitty owners) and I can't make it work with my scheme.

Then the next step is the Crosshair Formula, and that thing is still $200, which just seems like so much money for an AM3+ board, considering the age of the platform.
But, if there is any shot at 5GHz, I suppose it'll be on that board....

Decisions decisions.


----------



## By-Tor

Sad that there has never been much AMD motherboard love from the manufacturers. They seem to make tons of motherboards with different options for Intel, but not too many for AMD.

I used a Sabertooth with my 8350. At that time I had a green theme, with green lights in the case to hide the Sabertooth's colors for the most part. Everything looked black.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Uggg...
> 
> Took a look around, and you are right...
> 
> The ASUS has some nasty temps.....
> 
> I have been trying to get away from poor temps, not put myself right back in the middle of them .
> 
> Maybe I try an MSI 390X instead of the 390.....?
> 
> I really wish I could find some middle ground motherboard for this rig.... (keeping in mind I am sticking firm to my red/black color scheme)
> 
> I am not going to be looking at the ASRock killer at all, as it is basically the big brother of the one that just went boom on me.
> 
> The MSI 970 Gaming seems a nice board but I believe it's only going to fetch me around 4.7Ghz or so (though probably be very reliable doing it)
> 
> The MSI 990FXA-Gaming HAS NO REVIEWS AT ALL, and that bothers me because at around $135 it's more in the price range I'd like to be in and if it would hold 4.8-4.9GHz I'd be all over it (though I know the elusive 5GHz will likely not happen)
> 
> The Sabertooth is my automatic first pick from a price/performance/options standpoint, but it's ugly as sin (sorry kitty owners) and I can't make it work with my scheme.
> 
> Then the next step is the Crosshair Formula, and that thing is still $200, which just seems like so much money for an AM3+ board, considering the age of the platform.
> But, if there is any shot at 5GHz, I suppose it'll be on that board....
> 
> Decisions decisions.


I assumed you had done the research!

Ya, some reviews paint a very poor picture of temps for that card. Of course you'll get better temps with your airflow setup, but even then I dunno if it can fully make up for it. It gets soundly beaten by MSI etc. in reviews.

If they had a Vapor-X 390X I would say go for it 110%. MSI seems like the best option, and you're already familiar with its superior cooling.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> I assumed you had done the research!
> 
> Ya, some reviews paint a very poor picture of temps for that card. Of course you'll get better temps with your airflow setup, but even then I dunno if it can fully make up for it. It gets soundly beaten by MSI etc. in reviews.
> 
> If they had a Vapor-X 390X I would say go for it 110%. MSI seems like the best option, and you're already familiar with its superior cooling.


Honestly, I have been looking at reviews from my phone all weekend since my rig is down, and was just looking at raw performance data and overclocking numbers....

Now that I have gone back and looked at temps, it's pretty disappointing...

Asus in my opinion, has fallen off on the GPU side of things the last few years..

Their 280x was terrible as it had a plethora of artifacting issues for almost everyone involved (including myself)

Their 290 was an epic failure because they slapped a 780 cooler on the card that didn't even cool it any better than the reference cooler.

Now they have this top tier 390 with a tri-fan cooler, and it still can't cool their GPU to the level of other cards out there.. I just don't understand.

Mind you, if I get the Crosshair, I can't run two MSI cards in crossfire with much success at all, because of the close spacing, and the thickness of the MSI cooler.

One thing also, is that probably what helps the MSI, Sapphire, and the PowerColor cards so much with cooling, is the fact that they are all 2.5 slot cards.

However, one could argue that XFX was able to bring a decent cooler to the table in a 2-slot form using only two fans... so what is ASUS's excuse here?

This particular situation has ALMOST got me considering the MSI Gaming GTX 980....

Great cooler, great clocker, and it's 2 slots, which would work great on a close PCIE slot configuration.


----------



## desetnik

I've made myself a custom fan profile for my 390. I didn't like the 50-60°C idle much, or 80+°C under load. Now it's running 15% until 40°C, and I increased the default fan speed by 10%, so the fan is at 70% instead of 60% when the card is in the 75°C area. The max I got in The Witcher is 77°C, but only in a short test in demanding areas. I also checked the VRM temps for the first time, and they are low: VRM1 gets to 66°C max. But the VRM2 sensor is either wrong or doesn't exist; it sits at 50°C all the time. I feel like the thermal paste wasn't properly applied on my card. I'll post again once I bring the core temps down.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, I have been looking at reviews from my phone all weekend since my rig is down, and was just looking at raw performance data and overclocking numbers....
> 
> Now that I have gone back and looked at temps, it's pretty disappointing...
> 
> Asus in my opinion, has fallen off on the GPU side of things the last few years..
> 
> Their 280x was terrible as it had a plethora of artifacting issues for almost everyone involved (including myself)
> 
> Their 290 was an epic failure because they slapped a 780 cooler on the card that didn't even cool it any better than the reference cooler.
> 
> Now they have this top tier 390 with a tri-fan cooler, and it still can't cool their GPU to the level of other cards out there.. I just don't understand.
> 
> Mind you, if I get the Crosshair, I can't run two MSI cards in crossfire with much success at all, because of the close spacing, and the thickness of the MSI cooler.
> 
> One thing also, is that probably what helps the MSI, Sapphire, and the PowerColor cards so much with cooling, is the fact that they are all 2.5 slot cards.
> 
> However, one could argue that XFX was able to bring a decent cooler to the table in a 2 slot form, using only 2 fans..... so what is ASUS excuse here?
> 
> This particular situation has ALMOST got me considering the MSI Gaming GTX 980....
> 
> Great cooler, great clocker, and it's 2 slots, which would work great on a close PCIE slot configuration.


XFX might be worth a whirl; they seem to be on par with the better coolers for the 390 series. They really cracked down on their VRM cooling and improved it drastically.

I have a feeling Asus used the same cooler again...


----------



## Dundundata

I'll post up some temps when I get my XFX. It will probably be the weekend after next, or maybe this weekend if it arrives sooner (can't wait!). My mobo is only the MSI PC Mate Z97; I probably should have got the Gaming, though I'm not a fan of red. In my minuscule OC experience it's worked fine on the CPU, but I never really pushed it.

The XFX does look like it has a low profile, which should be good for crossfire. From what I've read the temps should be good too.


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, I have been looking at reviews from my phone all weekend since my rig is down, and was just looking at raw performance data and overclocking numbers....
> 
> Now that I have gone back and looked at temps, it's pretty disappointing...
> 
> Asus in my opinion, has fallen off on the GPU side of things the last few years..
> 
> Their 280x was terrible as it had a plethora of artifacting issues for almost everyone involved (including myself)
> 
> Their 290 was an epic failure because they slapped a 780 cooler on the card that didn't even cool it any better than the reference cooler.
> 
> Now they have this top tier 390 with a tri-fan cooler, and it still can't cool their GPU to the level of other cards out there.. I just don't understand.
> 
> Mind you, if I get the Crosshair, I can't run two MSI cards in crossfire with much success at all, because of the close spacing, and the thickness of the MSI cooler.
> 
> One thing also, is that probably what helps the MSI, Sapphire, and the PowerColor cards so much with cooling, is the fact that they are all 2.5 slot cards.
> 
> However, one could argue that XFX was able to bring a decent cooler to the table in a 2 slot form, using only 2 fans..... so what is ASUS excuse here?
> 
> This particular situation has ALMOST got me considering the MSI Gaming GTX 980....
> 
> Great cooler, great clocker, and it's 2 slots, which would work great on a close PCIE slot configuration.


Sapphires are 2-slot cards. There is an R9 390X Tri-X (that's a lot of trees and axes) out there with your name on it.


----------



## Noirgheos

Is anyone having crashing issues in DX11 games on their 390/390X? I asked on /r/AMD and got very mixed responses. I'd like to hear what people here are experiencing. Please leave your OS and driver version.


----------



## robmcrock

Quote:


> Originally Posted by *desetnik*
> 
> I've made myself a custom fan profile for my 390. Didn't like that idle 50-60C much and 80+ on load. Now its running 15% until 40C and I increased the default fan speed by 10% so the fan is at 70 instead of 60% when the card is in 75C area. Max I got in witcher Is 77C but only for a short test in demanding areas. But first time I checked VRM temps and they are low. The VRM1 gets to 66C max. But VRM2 sensor is wrong or there isn't one I don't know its at 50C all the time. I feel like thermal paste wasn't properly applied on my card. I'll post again once I bring down core temps.


Any chance you could post a screen shot of the curve? Cheers


----------



## HalongPort

Hello guys,

I'm looking to upgrade my rig: a 2500K, 660 Ti and an E9 480W.
I can't max games like GTA V, BF4 or modded Skyrim at 1080p.
Since my case is rather old, cards longer than 305mm don't fit.

290 PCS+ 295€
290x PCS+ 345€
390 PCS+ 340€
390 MSI 360€
390 XFX 320€
My Gigabyte 660 Ti is pretty loud at load, but I don't mind since I'm gaming with headphones.
I don't know which one to get.
Reading the forums, it seems the PCS+ has the best cooling solution.
I also thought about waiting till Q2 for the new 16nm cards, but who knows if AMD/Nvidia will deliver in time.

What would you do?

Thanks for reading,
Max


----------



## desetnik

Quote:


> Originally Posted by *robmcrock*
> 
> Any chance you could post a screen shot of the curve? Cheers


I lost the curve because I thought it would save to different profiles. I think it looked like this.

If you are interested in my current curve, it's the same: fan at 15% at 40°C, 60% at 75°C and 100% at 85°C. That's how MSI support told me the fan should be running, except the 15%, which keeps the card in the 38-42°C area at idle. Still hotter compared to what other people get.
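
For anyone wanting to reproduce that profile in software, here's a minimal sketch of the piecewise-linear interpolation that fan-control tools typically apply between curve points. The points are taken from the post above; the code itself is just illustrative, not any tool's actual implementation:

```python
# Piecewise-linear fan curve: (temp °C, fan %) points from the post above.
CURVE = [(40, 15), (75, 60), (85, 100)]

def fan_speed(temp_c: float) -> float:
    """Interpolate fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]                      # flat 15% up to 40°C
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]                     # pegged at 100% past 85°C
    # Walk consecutive point pairs and interpolate within the matching segment.
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(35))   # 15 (below the first point)
print(fan_speed(80))   # 80.0 (halfway between 60% and 100%)
```

Between 75°C and 85°C the duty ramps 4% per degree, which is why the card spends most of its time hovering just under the 75°C point.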


----------



## flopper

Quote:


> Originally Posted by *HalongPort*
> 
> Hello guys,
> 
> I'm looking to upgrade my rig: a 2500K, 660 Ti and an E9 480W.
> I can't max games like GTA V, BF4 or modded Skyrim at 1080p.
> Since my case is rather old, cards longer than 305mm don't fit.
> 
> 290 PCS+ 295€
> 290x PCS+ 345€
> 390 PCS+ 340€
> 390 MSI 360€
> 390 XFX 320€
> My Gigabyte 660 Ti is pretty loud at load, but I don't mind since I'm gaming with headphones.
> I don't know which one to get.
> Reading the forums, it seems the PCS+ has the best cooling solution.
> I also thought about waiting till Q2 for the new 16nm cards, but who knows if AMD/Nvidia will deliver in time.
> 
> What would you do?
> 
> Thanks for reading,
> Max


390.
placeholder until 16nm is out.
I get mine at the end of the month.


----------



## Noirgheos

Anyone know if there are any backplates available for the Sapphire R9 390X?


----------



## Dorland203

Quote:


> Originally Posted by *Noirgheos*
> 
> Is anyone haveing crashing issues in DX11 games on their 390/390X? I asked on /r/AMD, and I got very mixed responses. I'd like to hear what people here are experiencint. Please leave your OS and driver version.


Do you run any background programs? Do you record your gameplay?


----------



## Noirgheos

Quote:


> Originally Posted by *Dorland203*
> 
> Do you run any background programs? Do you record your gameplay?


It was just a general question, I don't even have my 390X yet.


----------



## Geoclock

Is a used ASUS 390X DirectCU II worth buying for $60 less than retail?
How is the ASUS warranty for second owners?


----------



## Dorland203

Quote:


> Originally Posted by *Noirgheos*
> 
> It was just a general question, I don't even have my 390X yet.


From my experience, my MSI 390X runs games smoothly and crash-free with MSI Afterburner and Fraps in the background monitoring GPU and FPS. I have encountered crashes when I used Mirillis Action to record games: it crashes in DX11 games but records DX9/DX10 games without trouble. There may be more programs that cause crashes when running in the background.


----------



## navjack27

no problems here bud. win7 390x 15.8 beta


----------



## milan616

I think most people would think the fan profile on my MSI 390x is crazy, but I'm running 1100/1600 @ -19mV core/+25mV aux. And I mostly play Dota 2 so not very demanding.



Just a quick question for people about the Heaven benchmark. After my computer is up for a while it will always crash immediately in DX11 with extreme tessellation, but not in DX9 or OpenGL engines nor at lower levels of tessellation. If I restart the PC it generally works fine. Anyone else encounter this?

Also have a few instances of the computer locking up when opening GPU-Z or the info tab of Afterburner.


----------



## kizwan

Quote:


> Originally Posted by *HalongPort*
> 
> Hello guys,
> 
> I'm looking to upgrade my rig: a 2500K, 660 Ti and an E9 480W.
> I can't max games like GTA V, BF4 or modded Skyrim at 1080p.
> Since my case is rather old, cards longer than 305mm don't fit.
> 
> 290 PCS+ 295€
> 290x PCS+ 345€
> 390 PCS+ 340€
> 390 MSI 360€
> 390 XFX 320€
> My Gigabyte 660 Ti is pretty loud at load, but I don't mind since I'm gaming with headphones.
> I don't know which one to get.
> Reading the forums, it seems the PCS+ has the best cooling solution.
> I also thought about waiting till Q2 for the new 16nm cards, but who knows if AMD/Nvidia will deliver in time.
> 
> What would you do?
> 
> Thanks for reading,
> Max


A 390, of course. If I were you, it'd be between the MSI and the XFX.
Quote:


> Originally Posted by *Geoclock*
> 
> Is a used ASUS 390X DirectCU II worth buying for $60 less than retail?
> How is the ASUS warranty for second owners?


Avoid ASUS at all costs. ASUS 290 and 390 cards have poorly performing coolers.


----------



## UlyssesRex

I know this may be a stupid question, but I'm worried. I just bought an MSI R9 390 (non-LE) and it can reach 90 °C in Furmark with a _custom fan profile_ and default settings. It may even get hotter, but I don't want to risk reaching the 95 °C limit. The EVGA GTX 970 FTW I sold for this card couldn't even pass 75 °C with the same settings, so I'm wondering if my temperatures are abnormal? Also, I've been seeing some people posting their benches and they're very low under load. In the Bioshock Infinite benchmark and MGS:V, my temperatures never pass 65 °C, though! Will I ever hit the safety limit (temperature wise) for this card in actual games, and not just benchmarks designed to test your card? Also, would it be advisable to overclock my card? For the record, my case is blowing out cool air while running Furmark and the room temperature is 23 °C. The card was running at 60 °C before I used my custom profile.

*Edit:* I forgot some vital information. I'm using a 2560x1080 monitor, and the rest of my components are very cool. I modified my fan curve and I can idle at 40 °C with no active fans (couldn't before; not sure why it's alright now). Also, I tried the Heaven benchmark at the highest settings and my card reached about 79 °C after around five minutes, then just fluctuated around that mark. Safe to OC?

Thanks!


----------



## kizwan

Quote:


> Originally Posted by *UlyssesRex*
> 
> I know this may be a stupid question, but I'm worried. I just bought an MSI R9 390 (non-LE) and it can reach 90 °C in Furmark with a _custom fan profile_ and default settings. It may even get hotter, but I don't want to risk reaching the 95 °C limit. The EVGA GTX 970 FTW I sold for this card couldn't even pass 75 °C with the same settings, so I'm wondering if my temperatures are abnormal? In the Bioshock Infinite benchmark and MGS:V, my temperatures never pass 65 °C, though! Will I ever hit the safety limit (temperature wise) for this card in actual games, and not just benchmarks designed to test your card? Also, would it be advisable to overclock my card? For the record, my case is blowing out cool air while running Furmark and the room temperature is 23 °C.
> 
> Thanks!


One piece of advice: do not use FurMark. The power draw, heat generation and temperatures under FurMark are unrealistic. Modern cards are supposedly TDP-limited when running FurMark, so you can't make an apples-to-apples comparison between AMD and NVIDIA; NVIDIA cards may be a little more aggressive in dealing with FurMark.

If you want to test real-world stability or your cooler's performance, test with games or a few loops of benchmarks. Anyway, temperature is very unlikely to kill Hawaii/Grenada cards, because once the core reaches 94/95°C it will throttle down to keep the temp within a safe range. The only thing that can kill these cards is pushing the voltage too high, in which case the first thing to go is the VRMs.
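
The throttle behaviour described above (core clock dropping once the die hits the 94/95°C point, recovering once it cools) can be pictured as a simple control loop. This is only a toy illustration of the idea, not AMD's actual PowerTune logic, and the clock step sizes are made up:

```python
# Toy model of the temperature throttling described above: past the
# threshold the clock is shed quickly; well below it, the clock recovers.

THROTTLE_TEMP = 94      # °C, roughly where Hawaii/Grenada starts throttling
STOCK_CLOCK = 1000      # MHz, example stock core clock
MIN_CLOCK = 300         # MHz, the low state people see in monitoring tools

def next_clock(current_clock: int, core_temp: float) -> int:
    """Step the core clock based on temperature (illustrative only)."""
    if core_temp >= THROTTLE_TEMP:
        return max(MIN_CLOCK, current_clock - 100)   # shed clocks fast
    if core_temp < THROTTLE_TEMP - 5:
        return min(STOCK_CLOCK, current_clock + 50)  # recover gradually
    return current_clock                             # hold in the band between

print(next_clock(1000, 95))  # 900 (over temp, throttling)
print(next_clock(900, 88))   # 950 (cooled off, recovering)
```

This also explains the brief 300MHz dips people report in FurMark: the load pushes the card over its limits, it drops to the low state for a moment, cools, and climbs right back.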


----------



## orlfman

Quote:


> Originally Posted by *milan616*
> 
> I think most people would think the fan profile on my MSI 390x is crazy, but I'm running 1100/1600 @ -19mV core/+25mV aux. And I mostly play Dota 2 so not very demanding.
> 
> 
> 
> Just a quick question for people about the Heaven benchmark. After my computer is up for a while it will always crash immediately in DX11 with extreme tessellation, but not in DX9 or OpenGL engines nor at lower levels of tessellation. If I restart the PC it generally works fine. Anyone else encounter this?
> 
> Also have a few instances of the computer locking up when opening GPU-Z or the info tab of Afterburner.


last few days i've been pretty much leaving two instances of gpu-z running in the background, logging both of my 390s while i play games for 3-7 hours straight. no complaints.

i've let heaven run for close to 40 minutes a few times with gpu-z logging in the background.

the only thing that has caused me to blue screen has been speccy, and that was because it loads the cpu-z library to scan my processor.

outside of that, BF4 crashed once, but then again it's BF4. the game won't connect to any server if i leave origin running. i have to close out of it, join a server, and let it reopen origin to launch the game.


----------



## AverdanOriginal

Quote:


> Originally Posted by *UlyssesRex*
> 
> I know this may be a stupid question, but I'm worried. I just bought an MSI R9 390 (non-LE) and it can reach 90 °C in Furmark with a _custom fan profile_ and default settings. It may even get hotter, but I don't want to risk reaching the 95 °C limit. The EVGA GTX 970 FTW I sold for this card couldn't even pass 75 °C with the same settings, so I'm wondering if my temperatures are abnormal? Also, I've been seeing some people posting their benches and they're very low under load. In the Bioshock Infinite benchmark and MGS:V, my temperatures never pass 65 °C, though! Will I ever hit the safety limit (temperature wise) for this card in actual games, and not just benchmarks designed to test your card? Also, would it be advisable to overclock my card? For the record, my case is blowing out cool air while running Furmark and the room temperature is 23 °C. The card was running at 60 °C before I used my custom profile.
> 
> *Edit:* I forgot some vital information. I'm using a 2560x1080 monitor, and the rest of my components are very cool. I modified my fan curve and I can idle at 40 °C with no active fans--couldn't before, not sure why it's alright now. Also, I tried Heaven Benchmark at the highest settings and my card reached about 79 °C after around five minutes but just fluctuated around that mark. Safe to OC?
> 
> Thanks!


Quote:


> Originally Posted by *kizwan*
> 
> An advice, do not use furmark. The power draw, heat generation & high temps with furmark are unrealistic. Modern cards supposedly TDP limited when running furmark, so you can not have apple-to-apple comparison between AMD & NVIDIA because NVIDIA cards maybe a little more aggressive when dealing with furmark.
> 
> If you want test real-world stability or your card cooler cooling performance, test with games or a few loop of benchmarks. Anyway, temp is very unlikely going to kill Hawaii/Grenada cards because once the core reach 94/95C, the core will throttle down to keep the temp within safe range. The only thing that can kill these cards if you push too high voltage which means the first thing that will go is VRMs.


Had the same issue at the beginning with Furmark. kizwan is correct in stating that Furmark is unrealistic. For easy comparison, use Heaven Benchmark or Firestrike.

Secondly, if your case is blowing out "COOL" air while your card is climbing toward 90 °C... then, and I am just guessing here... something is wrong with your airflow. It seems hot air might be pooling in your case and only the cool air is getting pushed out.

So I would suggest: check the cabling in your case (tidy it up if need be), and check that you have sufficient and correct airflow, bottom front to top back.
70-75 °C (+/-) with normal settings, normal airflow, and 23 °C ambient should be realistic with this card in Heaven Benchmark.


----------



## kizwan

Anyone getting *ARIeRecord ATI EEU PnP start/stop failed* error in Event Viewer?


----------



## sportsczy

Sup guys and gals.

I just bought an Asus R9 390 Gaming... it seems to run fine, but the temps have me worried a bit. I tweaked fan speeds in MSI Afterburner and put it on the 1070 MHz factory overclock (also bumped memory to 1520 MHz and power to 110% as per specs). When running at full load in Heaven Benchmark, the temps rise to 77 degrees and stay stable there (with fans running at 74%; it's loud, but i knew that).

Is this normal?


----------



## sportsczy

Quote:


> Originally Posted by *sportsczy*
> 
> Sup guys and gals.
> 
> I just bought a Asus R9 390 Gaming... seems to run fine but temps got me worried a bit. I tweaked fan speeds on MSI Afterburner and i put it on the 1070mhz factory overclock (also bumped memory to 1520MhZ and power to 110% as per specs). When running at full load on Heaven Benchmark, the temps rise to 77 degrees and stays stable there (with fans running at 74%; it's loud but i knew that).
> 
> Is this normal?


Oh, and is there any rational way to get rid of the LED lights on the GPU? Highly annoying.


----------



## Noirgheos

Quote:


> Originally Posted by *sportsczy*
> 
> Sup guys and gals.
> 
> I just bought a Asus R9 390 Gaming... seems to run fine but temps got me worried a bit. I tweaked fan speeds on MSI Afterburner and i put it on the 1070mhz factory overclock (also bumped memory to 1520MhZ and power to 110% as per specs). When running at full load on Heaven Benchmark, the temps rise to 77 degrees and stays stable there (with fans running at 74%; it's loud but i knew that).
> 
> Is this normal?


For the ASUS model that's pretty decent. Especially with an OC on.


----------



## sportsczy

Thanks. I may have to watercool this system. It's too loud and i don't like all this hot air over the PC components.


----------



## Gumbi

Quote:


> Originally Posted by *sportsczy*
> 
> Thanks. I may have to watercool this system. It's too loud and i don't like all this hot air over the PC components.


More evidence that the Asus cooler sucks :/ All I can say is the better airflow your case has, the cooler your card will be. A few of us have had good success with high airflow setups.


----------



## bazookatooths

Quote:


> Originally Posted by *sportsczy*
> 
> Sup guys and gals.
> 
> I just bought a Asus R9 390 Gaming... seems to run fine but temps got me worried a bit. I tweaked fan speeds on MSI Afterburner and i put it on the 1070mhz factory overclock (also bumped memory to 1520MhZ and power to 110% as per specs). When running at full load on Heaven Benchmark, the temps rise to 77 degrees and stays stable there (with fans running at 74%; it's loud but i knew that).
> 
> Is this normal?


Seems hot, man. What's it idle at?
I don't go over 80 °C at 30% fan speed when benching at 1175/1720.


----------



## sportsczy

Idle is at 42 degrees


----------



## sportsczy

If i leave the GPU fan setting at stock... it runs between 88-90 degrees at full load. The reviews say that's normal. I don't like it one bit lol. It needs to be below 80.

I'm also using an NZXT S340 case, which is a midtower. It has good room and airflow, but it's obviously nowhere near as roomy as a full tower. Probably cost me 2-3 degrees right there.

Edit: Meh. Fan speed over 64% is just too loud. So i adjusted everything to get my max stable temp at that fan speed. I get 80 degrees at 64% fan speed while running Unigine for 10 mins. On the bright side, i have a little space heater for the winter months
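For anyone doing the same fan-curve juggling: an Afterburner-style custom fan profile is just straight-line interpolation between (temperature, duty) points. A sketch with made-up points, capped at a 64% duty ceiling like the one described above:

```python
# Sketch of an Afterburner-style custom fan curve: straight-line
# interpolation between (temperature, fan duty) points. These points are
# examples only, capped at a 64% duty ceiling as a noise limit.

CURVE = [(40, 20), (60, 35), (75, 50), (80, 64)]  # (deg C, fan %)

def fan_duty(temp_c):
    """Interpolate fan duty for a temperature; clamp outside the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return round(d0 + frac * (d1 - d0))

print(fan_duty(70))  # 45 -> between the 60 C and 75 C points
print(fan_duty(90))  # 64 -> clamped at the duty ceiling
```

Real fan-control software typically adds hysteresis on top of this so the fan speed doesn't hunt back and forth around a single point.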


----------



## desetnik

Quote:


> Originally Posted by *sportsczy*
> 
> If i leave the GPU fan setting at stock... it runs between 88-90 degrees at full load. The reviews say that's normal. I don't like it one bit lol. It needs to be below 80.
> 
> I'm also using a NZXT S340 case which is a midtower. It has good room and airflow but it's obviously nowhere as roomy as a full tower. Probably cost me 2-3 degrees right there.
> 
> Edit: Meh. Fan speed over 64% is just too loud. So i adjusted everything to get my max stable temp at that fan speed. I get 80 degrees at 64% fan speed while running Unigine for 10 mins. On the bright side, i have a little space heater for the winter months


You are still getting better temps than I do on my MSI. I added a better side fan to my case, cleaned it properly, and changed the HDD positions in the cage so that one fan is completely free to push cold air in, and I'm still getting 81 °C at 75-80% fan speed. Even the ambient in my room is lower now than it was a few days ago. During winter my room gets really cold; my legs and fingers are freezing. So this card will keep me warm with all the hot air coming out of this heater!


----------



## sportsczy

If Nvidia hadn't pissed me off so badly with the GTX 970 that STUTTERS when gaming and watching vids on occasion at 1440p... i wouldn't have gone AMD. Oh well. Nobody's perfect. At least i was fully aware of the AMD issues beforehand while Nvidia hoodwinked everyone.


----------



## Dundundata

I don't think it's AMD really, it depends which card you buy and from what I've read ASUS just doesn't have good cooling.


----------



## Piccolo55

Quote:


> Originally Posted by *Dundundata*
> 
> I don't think it's AMD really, it depends which card you buy and from what I've read ASUS just doesn't have good cooling.


The dcII or the strix?


----------



## sportsczy

The 290, 295, and 290X are notorious for running very hot... even the Sapphire Tri versions. I was just expecting better results than the norm from the new DCII cooling system from Asus. It seems to be pretty much on par with the Tri in terms of cooling ability. The fans on it are loud past 65%, or it wouldn't be a problem. If i could run it at 70% without it sounding like a hurricane, i would be maxing out in the mid 70s temperature-wise.

In any case, i was paying for the clocks on it being the highest of any R9 390 and for its OCing capabilities. Not disappointed there at all. It's also built like a brick. Very solid.


----------



## sportsczy

For those considering Asus... the big thing was that the VRAM was overheating on the Strix. That's resolved in the DCII.


----------



## Noirgheos

Quote:


> Originally Posted by *sportsczy*
> 
> 290, 295 and 290x are notorious for running very hot... even the sapphire tri versions. I was just expecting better results than the norm from the new dcII cooling system from Asus. Seems like it's pretty much on par with the tri in terms of cooling ability, Fans on it are loud after 65% or it wouldn't be a problem. If i could run it at 70% without it sounding like a hurricane, i would be maxing out in the mid 70s in terms of temp.
> 
> In any case, i was paying for the clocks on it being the highest of any R9 390 and it's OCing capabilities. Not disappointed there at all. It's also built like a brick. Very solid.


Strange. I just got my girlfriend a 290X Vapour-X. Its max temp is 69 °C under load (we used Valley).


----------



## Durlag

Hey guys I bought a Gigabyte R9 390 a few weeks ago and noticed it runs pretty hot and is REALLY REALLY loud. It idles at 60C but when I put it under load it sounds like a leaf blower. I've tried custom fan profiles but I'm not sure what to do. Some people are suggesting I get an RMA and exchange it for a MSI which apparently runs quieter.

Any insight into my card is appreciated. Thanks!


----------



## Agent Smith1984

High idle temp?

Cut the fans on... They are meant to idle silently...

Please read first post.

High load temps?

Make sure airflow is sufficient.

Also try new TIM, but honestly there is no reason why a volt locked g1 should run too hot.

Not trying to be short or rude at all, i promise, but we have gotten to the point of covering temp issues on a daily basis now.....

I really want this thread to be helpful, but to primarily be about overclocking and performance issues/solutions/results...

Hope you get it sorted


----------



## Renner

Quote:


> Originally Posted by *Durlag*
> 
> Hey guys I bought a Gigabyte R9 390 a few weeks ago and noticed it runs pretty hot and is REALLY REALLY loud. It idles at 60C but when I put it under load it sounds like a leaf blower. I've tried custom fan profiles but I'm not sure what to do. Some people are suggesting I get an RMA and exchange it for a MSI which apparently runs quieter.
> 
> Any insight into my card is appreciated. Thanks!


Make sure you have good airflow. I had my share of issues till I sorted that out myself. I don't hear much from my own MSI during loads, but then again my case has noise suppression. A guy a few pages back bought a Gigabyte card; he may share his thoughts, or check back for his post.

Edit: his post is on page 218.


----------



## OutrideGaming

Is it still too late to join the "cool kids club?"


----------



## sportsczy

I just ran Valley to see... 74 degrees. The Asus DCII is pretty thin. I remember the Sapphire one being longer and thicker. Maybe i'm mistaken.


----------



## Durlag

Quote:


> Originally Posted by *Agent Smith1984*
> 
> High idle temp?
> 
> Cut the fans on... They are meant to idle silently...
> 
> Please read first post.
> 
> High load temps?
> 
> Make sure airflow is sufficient.
> 
> Also try new TIM, but honestly there is no reason why a volt locked g1 should run too hot.
> 
> Not trying to be short or rude at all, i promise, but we have gotten to the point of covering temp issues in a daily basis now.....
> 
> I really want this thread to be helpful, but to primarily be about overclocking and performance issues/solutions/results...
> 
> Hope you get it sorted


Alright, I apologize. Out of curiosity, what is "new TIM"?

Quote:


> Originally Posted by *Renner*
> 
> Make sure you have a good airflow. I had my share of issues till I made it sure for myself. My own MSI doesn't hear much during loads, but then again my case has noise suppression. I saw a guy pages ago who bought a Gigabyte card, he may share his thoughts, or check his post back a bit.
> 
> Edit: his post is at the page 218.


Well, my cable management is about as good as it's going to get. I'm using a Cooler Master N200 and I'm not sure what more I can do for airflow.

Still contemplating whether it's worth exchanging for a different manufacturer, MSI or Sapphire, although I'm not sure it would even make a difference.


----------



## navjack27

lol nice outridegaming!

also agent smith, that's why i'm not saying anything about my 390x's temps. i have ZERO case fans in this ****house thermaltake case, the side is off, and i use a fan profile. no crashing, no issues. noise... yeah, but i have headphones. i'd post overclocking stuff but i couldn't get it much higher than the stock OC mode.


----------



## navjack27

i feel like if you aren't overheating or throttling in GAMES you play with SETTINGS you actually use, then ur golden dude.


----------



## Renner

Quote:


> Originally Posted by *OutrideGaming*
> 
> Is it still too late to join the "cool kids club?"


What is your ASIC quality rating?

@Durlag: I have two fans in the front for air intake. Only one came with the case; I added the fan I used in my previous case. It really helped: previously the only intake was a bit above my card, and the second one now intakes directly below it. It really improved my temps, especially at idle, and it's also great that the airflow passes over my SSD and HDD; those were around 40 degrees previously, now around 30.


----------



## Gumbi

Quote:


> Originally Posted by *sportsczy*
> 
> 290, 295 and 290x are notorious for running very hot... even the sapphire tri versions. I was just expecting better results than the norm from the new dcII cooling system from Asus. Seems like it's pretty much on par with the tri in terms of cooling ability, Fans on it are loud after 65% or it wouldn't be a problem. If i could run it at 70% without it sounding like a hurricane, i would be maxing out in the mid 70s in terms of temp.
> 
> In any case, i was paying for the clocks on it being the highest of any R9 390 and it's OCing capabilities. Not disappointed there at all. It's also built like a brick. Very solid.


Couple of things.

Calling it a space heater is a bit of a misnomer; it's producing just as much heat as other 390s, it's just not removing it from the card as efficiently.

The Asus model is known for poor cooling. My Vapor X 290 has superb cooling: whisper quiet at 40% fan, and under 70 °C on the VRMs and core at stock...

The Tri-X has great cooling too.

Optimise your case airflow and repaste your GPU; beyond that I can't help you, as the Asus is known to have shoddy cooling. It's really too bad, as the card looks great.


----------



## sportsczy

72.5% ASIC for me.

I wear headphones 99% of the time anyhow, so the noise isn't a problem for me so much as for my wife lol. I got it to stabilize in the 74-75 °C range now at fan speeds of 65-70%. it's fine.

I knew the card looked too thin when i opened it. I was expecting it to take 2 1/2 PCI-E slots and it only ate up 2. I immediately thought to myself: either the heatsink is thinner or the fans will be crap. Well, it looks like the fans are crap.

It performs very well though. I haven't done a thing other than the factory OC and i'm at 1070/1520. I'm sure i could push it to around 1150/1700 without much trouble, but i don't really want to given the cooling. Once i get watercooling, i'll really push the bad boy.

As far as quality of components other than the fans... it feels like the best-made card i have had, alongside my EVGA GTX 970. In fact, it feels better made than the EVGA. Just a shame that they can't get the cooling right.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Durlag*
> 
> Alright I apologize. Out of curiosity what is "New TIM"?
> Well my cable management is basically as best as it is going to get. I'm using a Cooler Master N200 and I'm not sure what more I can do for airflow.
> 
> Still contemplating if it's worth exchanging for a different manufacturer, MSI or a Sapphire although I'm not sure it would even make a difference.


New thermal compound.

Use something non-conductive (no AS5!).

They dump mounds of compound on them from the factory, and what you really want is a nice thin layer that effectively transmits heat to the cooler.


----------



## sportsczy

Is Gelid GC Extreme ok for a GPU? Just making sure before i screw it up lol.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sportsczy*
> 
> Is Gelid GC Extreme ok for a GPU? Just making sure before i screw it up lol.


Excellent choice!


----------



## BradleyW

Quote:


> Originally Posted by *sportsczy*
> 
> Is Gelid GC Extreme ok for a GPU? Just making sure before i screw it up lol.


Great stuff!


----------



## Durlag

Quote:


> Originally Posted by *Renner*
> 
> What is your ASIC quality rating?
> 
> @Durlag: I have two fans in the front for the air intake. I only had one coming with the case, and added the fan I used in my previous case on the back. It really helped a bit, previously only original one was a bit above my card, and this secondary is intaking directly below it. Really improved my temps, especially in idle, and its also great that it goes through my SSD and HDD, those were around 40 degrees previously, now around 30.


Would it be worth it to look into adding fans to my N200 then? I'm not even worried so much about the temperatures, but the noise levels are basically unbearable under load and it makes me super nervous. I have a couple more days before I can't return it to the retailer so I'll mull it over and do some more tests


----------



## sportsczy

So i put the GC Extreme on and ran Unigine Heaven for 10 mins... a high temp of 74 degrees at 75% fan speed. A bit loud, but i wanted to keep my GPU below 75 under heavy duress.

What was the crap TIM they put on there, ffs. It was thick and all over the place; it took me 30 minutes to clean all the junk off before applying my own. It did the trick. I also got a chance to really look at Asus' heatsink and fans. They all suck horribly. A shame, because the rest of it is very well built.


----------



## OutrideGaming

Quote:


> Originally Posted by *Renner*
> 
> What is your ASIC quality rating?


I am sorry, could you explain what you mean?

Okay, well I'm new to the PC world. I've only ever owned laptops (not really good for any form of gaming), so here I am. Now don't get me wrong: without even touching the OCs I can run everything I've thrown at Talos (yes, I named my first computer. Get the bullying out of the system







) as high as the game's settings go. It'll definitely pick up some fan speed on a normal curve in MSI Afterburner (the best I've found to control temps, since MSI's Gaming App seems like a "what it thinks" vs "run-at-100%-to-fight-death" sort of thing). I understand overclocking; I've done it with my android phone (not much there, though). So here I am.

I have done my homework, as well as reading the front page of this thread. 1160 MHz seems really good, about 100 over the built-in OC mode. But that would require me to add core voltage and power limit, and I'm quite wary of doing that without guidance. $300+ for a single component definitely has me on edge, and I'd rather not damage it without it being worth it







I have done some stress tests, as well as cpu burn-ins and just gaming to test the system out. (There was a noob glitch: I placed too little thermal paste and had to remove the cpu fan, clean, re-apply, then re-attach. It was definitely a pain, but I've learned my lesson xD.)

Anyway, on the gpu stress tests: I did a normal preset at 1080p and got a score of: Ozone3D External Link
Definitely nice to see that, but it's not a 4x GTX 980 or 390X setup







but it feels good (considering that with a laptop, anything near this would have been godly). But I've seen others go much higher as far as the 390 goes. I'll be posting my system specs (nothing extreme, sadly) later on, so any thoughts and/or tips are welcome. I'd love to get into overclocking too. My real question is: what would be the max stable day-to-day temp for the MSI Gaming 390? When I do a stress test the gpu goes over 85 and hits 90 (I shut Furmark down at that point) with the fan at 100%. This seems too high and definitely not good. Any advice? Would overclocking decrease the temp? Would anything outside of more fans or decreased usage do it?

Thanks







And glad to be here







i've read through some posts. This should be interesting haha


----------



## sportsczy

So here's some performance info on the Asus R9 390 DCII:
- 1070/1520 at 110% power
- Unigine Heaven score of 2500 on Ultra settings
- 74 degrees max at full load for 10 mins; fan speed maxed at 75%
- ASIC score of 72.5%

General impressions: applied Gelid GC Extreme, which dropped the temps from the high 70s to the mid 70s. Fans are loud past 64% but aren't unbearable until you reach 75%. Not impressed by the cooling, but i knew that. The components are extremely high quality, however, and it has a backplate. If available, i'll put a waterblock on it over the next 2 months. It will overclock like a beast because it has the core components to do it; the cooling is holding it back.


----------



## sportsczy

Just did a quick OC and ran Unigine Heaven at +63 mV, 1170/1700, and 110% power. It was stable: 77 degrees at a max fan speed of 79%.

What i have noticed is that the OC gives only negligible returns in Unigine Heaven. I haven't tested games and i won't; i really don't like OCing unless i need it. Just wanted to give some data on the Asus DirectCU III for the thread.

Here's a pic of the card:


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> New thermal compound.
> 
> Use something non conductive ( no as5!)
> 
> They dump mounds of compound on them from the factory, and what you really want is a nice thin layer that effectively transmits heat to the cooler.


Damn, used AS5 on my 7970+G10 card. Hope I'm not hurting anything.


----------



## CamsX

Quote:


> Originally Posted by *OutrideGaming*
> 
> I am sorry, could you explain what you mean?
> 
> Okay well I'm new to the pc world. I've only ever owned laptops (not really good for any form of gaming) so here I am. Now don't get me wrong, without even touching the OC's I can run everything I've thrown at Talos (yes, I named my first computer. Get the bullying out of the system
> 
> 
> 
> 
> 
> 
> 
> ) as high as the games settings go. It'll definitely pick up some fan speed (normal curve from AMD Afterburner (Best i've found to control temp, since gaming app by MSI seems like a "what it thinks" vs "Run-at-100%-to-fight-death" sort of thing. I understand Overclocking. I've done it with my android (not much there though) So here I am.
> 
> I have done my homework, as well as reading the front page of this forum. 1160mhz seems really good, about 100 over the oc mode built in. But that would require me to add core volatage and power limit, and I'm well past weary on doing that without guidance. $300+ for a single component definitely had me on the edge and I'd rather not damage without it being worth it
> 
> 
> 
> 
> 
> 
> 
> I have done some stress tests, as well as cpu burn ins and just gaming to test the system out. (There was a noob glitch. Placed too little thermal paste. Had to remove cpu fan, clean and re-apply, then re attach. It was definitely a pain but I've learned my lesson xD).
> 
> Anyway, on the gpu stress tests. I did a normal preset of 1080p and got a score of : Ozone3D External Link
> Definitely nice to see that, but it's not a 4 GTX 980 or 390x setup
> 
> 
> 
> 
> 
> 
> 
> but It feels good (counting that with a laptop anything near this would have been godly). But I've seen others go for much higher as far as the 390 goes. I'll be posting my system specs (nothing extreme sadly) later on so any thoughts and / or tips are welcome. I'd love to also get into Overclocking though. My real question is what would be the max stable day-to-day temp for the MSI Gaming 390? When I do a stress test the gpu begins going over 85 and hit 90 (shut the furmark down at that point) with the fan at 100%. This seems too high and definitely not good. Any advice? Would overclocking decrease the temp? Would anything outside of more fans or decrease usage do it?
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> And glad to be here
> 
> 
> 
> 
> 
> 
> 
> i've read through some posts. This should be interesting haha


Welcome. Nice post.

Jumping into overclocking so quickly means you are comfortable with the subject, so you are in for a lot of fun messing with desktop hardware. Just be prepared to spend a lot on components on a yearly basis. Not that you'll need to, but it gets addictive; there is always something missing!









Others here have surpassed 1200/1700 clocks with stable results (MSI). For day-to-day use, I would say you need to find a compromise between temps, noise, and performance.

If you get 1150-1175 core and 1680-1700 memory clocks stable *WITH* the Gaming's VRM temps under 80 °C and core temps under 82-85 °C, I'd say you are golden. Pay special attention to the VRMs. The noise depends on your fan curve, but that also affects your temps, which is why I say it's all a compromise. (Install GPU-Z to monitor your VRM temps.)
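GPU-Z can also log its sensors to a text file (comma-separated), which makes it easy to check a whole session against those thresholds afterwards. A sketch: the column headers vary by card and GPU-Z version, so the names below are assumptions you'd match to your own log file.

```python
# Sketch: scan a GPU-Z sensor log for the max core/VRM temps of a session,
# to check against the ~85 C core / ~80 C VRM targets mentioned above.
# GPU-Z writes a comma-separated log; the column headers below are
# assumptions -- match them to whatever your own log file actually says.
import csv
import io

# Tiny inline stand-in for a real "GPU-Z Sensor Log.txt".
SAMPLE = """Date, GPU Temperature [C], VRM Temperature [C]
2016-01-10 20:00:01, 71.0, 64.0
2016-01-10 20:00:02, 84.0, 79.0
"""

def max_temps(lines, core_col="GPU Temperature", vrm_col="VRM Temperature"):
    """Return (max core temp, max VRM temp) seen in a GPU-Z sensor log."""
    max_core = max_vrm = float("-inf")
    for row in csv.DictReader(lines, skipinitialspace=True):
        for key, val in row.items():
            if key is None or val is None:
                continue
            try:
                temp = float(val)
            except ValueError:
                continue  # dates and other non-numeric cells
            if key.strip().startswith(core_col):
                max_core = max(max_core, temp)
            elif key.strip().startswith(vrm_col):
                max_vrm = max(max_vrm, temp)
    return max_core, max_vrm

core, vrm = max_temps(io.StringIO(SAMPLE))
print(f"core max: {core} C, VRM max: {vrm} C")
# With a real log: max_temps(open("GPU-Z Sensor Log.txt", newline=""))
```

As noted further down the thread, some cards don't expose a VRM sensor at all, in which case the VRM column simply won't be there.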

Good luck.


----------



## sportsczy

FYI... some cards don't have a VRM sensor... so if it doesn't show up, don't panic lol.


----------



## Durlag

Quote:


> Originally Posted by *Agent Smith1984*
> 
> New thermal compound.
> 
> Use something non conductive ( no as5!)
> 
> They dump mounds of compound on them from the factory, and what you really want is a nice thin layer that effectively transmits heat to the cooler.


Not sure if I want to void the warranty just yet on my R9 390.

Honestly contemplating returning it for a quieter card.

This is what the NCIX guy said in response to my ticket. If I'm purely going by his response, then my card would certainly be declared defective in my books. It's really, really loud.

"We're not aware of any ongoing issues with these cards being loud, however this can be a factory defect if it's excessively noisy. Could you give me a rough idea of how loud this card is running? Can you hear the noise across the room your PC is in? Additionally, do you have your case panels off the case or is it closed?

Does this noise sound like the standard fan noise? Or a "whine" noise with a very electronic sound to it?

It should be fairly quiet under load, albeit not silent. "


----------



## Gumbi

Quote:


> Originally Posted by *sportsczy*
> 
> Just did a quick OC and ran Unigine Heaven with +63mv 1170/1700 and 110%. It was stable. 77 degrees at 79% fan speed max.
> 
> What i have noticed is that OC only gives negligible returns in terms of Unigine Heaven. Haven't tested games yet and i won't. Really don't like OCing unless i need it. Wanted to give some Data on Asus DirectCU III for the thread.
> 
> Here's a pic of the card:


That's actually quite a good overclock at 1170 MHz core! You should be seeing definite gains in your FPS from that. Are you benching at 1080p? Any less and Heaven tends to be CPU-bound.

Have you decided to stick with Asus or switch to XFX Agent Smith?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> That's actually quite a good overclock for 1170mhz core! You should be seeing definite gains in your FPS from that. Are you benching at 1080p? Any less and Heaven tends to be CPU bound.
> 
> Have you decided to stick with Asus or switch to XFX Agent Smith?


Well, the OC results for the Asus have been decent in the 1170-1190 range from what I have seen, but the cooler does scare me a bit.

What I do know, is that I would immediately change out the TIM to Noctua and get it a little cooler if I do get the Asus.

The XFX is also very tempting, but the Asus looks really good for my build theme, and given I am going to be running an Asus Crosshair Mobo, the theme would really work nicely together.

Really torn as of now... I haven't even shipped the 390 off yet, because this morning the buyer had a payment issue with PayPal and the money did not clear their account.....

It may even be that I keep my MSI 390 now and buy another


----------



## milan616

Quote:


> Originally Posted by *CamsX*
> 
> Damn, used AS5 on my 7970+G10 card. Hope I'm not hurting anything.


You aren't, AS5 is non-conductive.

_Not Electrically Conductive:
Arctic Silver 5 was formulated to conduct heat, not electricity.
(While much safer than electrically conductive silver and copper greases, Arctic Silver 5 should be kept away from electrical traces, pins, and leads. While it is not electrically conductive, the compound is very slightly capacitive and could potentially cause problems if it bridges two close-proximity electrical paths.)_

Unless you slathered it on like mayo on sandwiches in Undercover Brother you're doing just fine.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, the OC results for the Asus have been decent in the 1170-1190 range from what I have seen, but the cooler does scare me a bit.
> 
> What I do know, is that I would immediately change out the TIM to Noctua and get it a little cooler if I do get the Asus.
> 
> The XFX is also very tempting, but the Asus looks really good for my build theme, and given I am going to be running an Asus Crosshair Mobo, the theme would really work nicely together.
> 
> Really torn as of now... I haven't even shipped the 390 off yet cause of this morning the buyer had a payment issue with paypal and the money did not clear their account.....
> 
> It may even be that I keep my MSI 390 now and buy another


It's somewhat of a mystery to me how Asus could have messed up so badly. Is it still the same cooler carried over from the 780 --> 290 --> 390? A horrific TIM job? A bad cooler design overall? A mix of all the above?

Unless I saw one in front of me, I really couldn't know for sure.

The XFX looks like a nice card tbh, a pity you aren't sold on the colour/design of it!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> That's actually quite a good overclock for 1170mhz core! You should be seeing definite gains in your FPS from that. Are you benching at 1080p? Any less and Heaven tends to be CPU bound.
> 
> Have you decided to stick with Asus or switch to XFX Agent Smith?
> 
> 
> 
> Well, the OC results for the Asus have been decent in the 1170-1190 range from what I have seen, but the cooler does scare me a bit.
> 
> What I do know is that I would immediately swap out the TIM for Noctua paste and get it running a little cooler if I do get the Asus.
> 
> The XFX is also very tempting, but the Asus looks really good for my build theme, and given I am going to be running an Asus Crosshair Mobo, the theme would really work nicely together.
> 
> Really torn as of now... I haven't even shipped the 390 off yet because this morning the buyer had a payment issue with PayPal and the money did not clear their account.....
> 
> It may even be that I keep my MSI 390 now and buy another
Click to expand...

After having both the XFX DD 290X and 390X here, I honestly can't give them enough praise for all the work they did on the newer cards......everything that bugged me about the 290X has been changed and improved, even if only in a small way.

VRM temps have been MASSIVELY improved, the overall airflow path is better, the backplate is awesome (though MSI's is a touch nicer), it doesn't sag much at all, and the fans have a higher RPM (a bit more noise, but better cooling again)......the only couple of gripes I have with it....

The PCIe power connectors should have been flipped so the clips face the backplate (it would make removing them easier), the small XFX logo on the side would look great with a small white LED in there, and the diamond pattern isn't for everyone, but I will say that photos make it seem more pronounced than it is in real life.

Overall it's a great GPU for its price, but choosing between it and the MSI is not easy.....


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Well, the OC results for the Asus have been decent in the 1170-1190 range from what I have seen, but the cooler does scare me a bit.
> 
> What I do know is that I would immediately swap out the TIM for Noctua paste and get it running a little cooler if I do get the Asus.
> 
> The XFX is also very tempting, but the Asus looks really good for my build theme, and given I am going to be running an Asus Crosshair Mobo, the theme would really work nicely together.
> 
> Really torn as of now... I haven't even shipped the 390 off yet because this morning the buyer had a payment issue with PayPal and the money did not clear their account.....
> 
> It may even be that I keep my MSI 390 now and buy another
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's somewhat of a mystery to me how Asus could have messed up so badly. Is it still the same cooler carried over from the 780 --> 290 --> 390? A horrific TIM job? A bad cooler design overall? A mix of all the above?
> 
> Unless I saw one in front of me, I really couldn't know for sure.
> 
> The XFX looks like a nice card tbh, a pity you aren't sold on the colour/design of it!
Click to expand...

The Strix doesn't work so well on these cards because of the relatively small die size and the direct contact between the pipes and the die (hence the DCU part). They aren't using a heat spreader like the other companies are, so this gets mixed results......the bigger the die, the better the cooler works, and after looking at the Strix it almost looks as though it was designed for the Fury and Fury X once you take the HBM modules into consideration.

tbh the DCU II actually performs better than the Strix for the most part, from the little I've seen of it....


----------



## JRR1285

I hope to be a member of this club in the not-so-distant future. Anyone here play MechWarrior Online? It's my most played game and I am curious how well this card does with that title.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> It's somewhat of a mystery to me how Asus could have messed up so badly. Is it still the same cooler carried over from the 780 --> 290 --> 390? A horrific TIM job? A bad cooler design overall? A mix of all the above?
> 
> Unless I saw one in front of me, I really couldn't know for sure.
> 
> The XFX looks like a nice card tbh, a pity you aren't sold on the colour/design of it!


Here is the cooler for the ASUS Strix



You can see that they at least made it so that some of the heat pipes actually contact the core on this cooler....

Here is the cooler for the XFX...



Drastically DIFFERENT!!

The Asus uses larger pipes, but the XFX puts several more pipes in contact with the core, and for heat dissipation that may be the best approach.


----------



## OutrideGaming

I'm definitely looking forward to someone water-cooling this GPU to see how far it can actually go while staying stable. The only way I personally see air cooling working would be to keep it out of the case and just have fans blowing on it... Is that bad?







xD


----------



## Gumbi

Just 3 heatpipes contacting the core directly, vs 5 from XFX! Poor design indeed. They seem to have slacked on the VRM cooling too, as the results on that front haven't been impressive either.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Just 3 heatpipes contacting the core directly, vs 7 from XFX! Poor design indeed. They seem to have slacked on the VRM cooling too, as the results on that front haven't been impressive either.


Again, that's because of the heat spreader....I honestly cannot think of a 390/X design that doesn't use one apart from Asus' coolers......they really should ditch the "Direct Contact Unit" approach and move to a heat spreader like the rest of them.


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> After having both the XFX DD 290x and 390x here i honestly can't give them enough praise for all the work they did for the newer cards......everything that bugged me about the 290x has been changed and improved even in a small way.
> 
> vrm temps have been MASSIVELY improved, the overall airflow path is better, the backplate is awesome (though MSI's a touch nicer), it doesn't sag much at all, the fans have a higher rpm (a bit more noise but better cooling again)......the only couple of gripes i have with it....
> 
> .


Yeah, as expected; they had so much feedback from the 200 series that they went all out for the 300 series too, and it's paid off IMO.
Impressed with the XFX VRM cooling.

Quote:


> Originally Posted by *OutrideGaming*
> 
> I'm definitely looking forward to someone water-cooling the gpu to see how far and stable it can actually go. The only way I personally see air cooling this would be to keep it out of the case and just have fans blowing on it... Is that bad?
> 
> 
> 
> 
> 
> 
> 
> xD


Watercooling seldom gives much more headroom, as the limits of the cards are what they are.
Maybe 100MHz of OC difference, but normally you watercool for temps and noise.
That said, watercooling is the way to go for these.


----------



## Offender_Mullet

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Here is the cooler for the XFX...
> 
> 
> 
> Drastically DIFFERENT!!
> 
> The Asus uses larger pipes, but the XFX puts several more pipes touching the core, and for heat dissipation, that may be the best approach.


Smith, is that the XFX 390*X*? I'm contemplating getting that or the Sapphire Tri-X version. Here is the cooler for the Sapphire:



Both cards seem to have great cooling from the reviews; I'm just not sure which one would be 'better'. I haven't paid attention to GPU heatsinks in quite a while.


----------



## CerealKillah

Quote:


> Originally Posted by *Offender_Mullet*
> 
> Smith, is that the XFX 390*X*? I'm contemplating to get that or the Sapphire Tri-X version. Here is the cooler for the Sapphire:
> 
> 
> 
> Both cards seem to have great cooling from reviews, just not sure which one would be 'better'. Haven't paid attention to gpu heatsinks in quite a while.


It is my understanding that the XFX 390 and 390X have the same cooler.


----------



## Sgt Bilko

Quote:


> Originally Posted by *CerealKillah*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Offender_Mullet*
> 
> Smith, is that the XFX 390*X*? I'm contemplating to get that or the Sapphire Tri-X version. Here is the cooler for the Sapphire:
> 
> 
> 
> Both cards seem to have great cooling from reviews, just not sure which one would be 'better'. Haven't paid attention to gpu heatsinks in quite a while.
> 
> 
> 
> It is my understanding that the XFX 390 and 390X have the same cooler.
Click to expand...

Yep, the XFX 390 and 390X have the same cooler, as do the Sapphire R9 390 Nitro and R9 390X Tri-X.









Here's the 390x Tri-X Heatsink:



The only difference between the Nitro and Tri-X in terms of heatsink is the colour.


----------



## sportsczy

Considering buying adhesive thermal pads so I can mod the Asus to have contact between the components and the pipes... That's all the other brands are doing. Not expensive either. Thoughts?

Gelid also makes a PCI fan unit with 2 fans, with guaranteed better quality fans than the ones that come stock. My only concern is that it will likely sit about 2-4 cm from the pipes. Is that close enough?


----------



## Gumbi

Quote:


> Originally Posted by *sportsczy*
> 
> Considering buying adhesive thermal pads so I can mod the Asus to have contact between the components and the pipes... That's all the other brands are doing. Not expensive either. Thoughts?
> 
> Gelid also makes a PCI fan unit with 2 fans, with guaranteed better quality fans than the ones that come stock. My only concern is that it will likely sit about 2-4 cm from the pipes. Is that close enough?


I definitely think that having direct contact between components and heatsink is the way to go...far more efficient thermal dissipation. Otherwise you're relying on airflow alone really to cool those components.

Particularly important is the cooling of the VRMs.

Grabbing an aftermarket cooler seems overkill when you've already got a (supposedly) aftermarket one. I'd say replace the TIM, maybe go the thermal pad route like you suggest above, and OPTIMISE CASE AIRFLOW.

I can't stress that enough. These cards looove cool air around them.


----------



## sportsczy

OK, will get a thermal pad. Better to just lay down a sheet, or cut it up to fit each component? I already changed the TIM to GC-Extreme and it made a significant difference.

Need to spend time with SpeedFan lol.

Just the TIM alone.... Running the Heaven benchmark, the Asus I have maxes out at 74 degrees with 70% fan speed. The problem is noise; the fans get loud past 60%. So I'm trying to stay in the mid-70s range while bringing the fan speed down.


----------



## Noirgheos

Is it possible to make it so that the stock clock of my Sapphire 390X is 1170MHz? It OCs to that no problem, but is there a way to apply it without needing a program, so the card thinks that's its stock clock?

Is there a way to do this for the fan curve as well?


----------



## sportsczy

There is, but I don't recommend it. You can flash your GPU BIOS with the stable OC settings; it's not very complicated to do either. I'm on my phone or I'd link you to instructions.

But again, I wouldn't. There is always a small chance you could brick your card.


----------



## xer0h0ur

You would have to screw up flashing the BIOS twice, and even then you can fix it if you manage to screw up the BIOS on both switch positions. There really is no bricking a card by BIOS flashing unless you're screwing with the voltage.
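For anyone who does go this route, the rough flow with the ATIFlash command-line tool usually looks like the sketch below. The adapter index (0) and the .rom file names are placeholders, not anything specific to these cards; always list the adapters and save a backup of the stock BIOS before programming anything.

```shell
# Sketch of a typical ATIFlash session (elevated Windows prompt or DOS USB stick).
# Adapter index 0 and the file names are illustrative -- check yours with -i first.
atiflash -i                      # list adapters; note the index of the 390/390X
atiflash -s 0 stock390.rom       # save the current BIOS to a backup file FIRST
atiflash -p 0 oc390.rom          # program the modified BIOS to adapter 0
# Reboot afterwards. If the card then fails to POST, flip the dual-BIOS switch,
# boot on the good BIOS, and reflash the backup to the bad position.
```

On dual-BIOS cards like these, it's worth flashing only one switch position so the untouched BIOS stays available as a recovery fallback.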


----------



## Geoclock

What about the ASUS R9 390/X DirectCU II coolers?
On the forums no one likes them, but I didn't see any actual owners complaining.
All the owners give good reviews.

Any pictures of it?


----------



## jackalopeater

Quote:


> Originally Posted by *Geoclock*
> 
> What about the ASUS R9 390/X DirectCU II coolers?
> On the forums no one likes them, but I didn't see any actual owners complaining.
> All the owners give good reviews.
> 
> Any pictures of it?


Please, I beg you, go another route. I have one and it's okay so long as your case has VERY good airflow. My core got up to around 74C, but my VRMs would hit 95C within minutes and after long game sessions would be well over 100C.

I like the card; it's not bad, and I've compared my numbers and results with a reviewer who had the Strix, and my DCU II was performing better overall. So look at XFX, MSI, Sapphire; everyone I know with those has had much better luck.

For FULL DISCLOSURE, I was sent mine by AMD so I got it for free, and I'm still recommending you go for a different model.

Oh, and it's a horrible overclocker unless you ramp the fans past 60%, which is super loud, and even then I was disappointed by its OC capability.

But here are some pics I took of it


http://imgur.com/a


and playing BF4 at 4k High settings


http://imgur.com/4fC7I


----------



## jackalopeater

Quote:


> Originally Posted by *xer0h0ur*
> 
> You would have to screw up flashing the BIOS twice and even still you can fix that if you manage to screw up the BIOS on both switch positions. There really is no bricking a card by BIOS flashing unless you're screwing with the voltage.


Lol, I just borked a BIOS flash on an R7 250 (trying to recover its fan profile) and it went blah....tossed it in another slot, used another GPU, and just reflashed it back to the stock BIOS. No problem.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jackalopeater*
> 
> Please, I beg you, go another route. I have one and it's okay so long as your case has VERY good air flow. My core got up to around 74c but my vrms would get to 95c within minutes and after long game sessions would be well over 100c.
> 
> I like the card, it's not bad and I've compared my numbers and results with a reviewer who had the Strix and my DCUII was performing better overall. So look at XFX, MSI, Sapphire, everyone I know with those have had much better luck.
> 
> For FULL DISCLOSURE, I was sent mine by AMD so I got it for free and I'm still recommending you go for a different model.
> 
> Oh, and it's a horrible overclocker unless you ramp the fans past 60%, which is super loud, and even then I was disappointed by its OC capability.
> 
> but here are some pics I took of it
> 
> 
> http://imgur.com/a
> 
> 
> and playing BF4 at 4k High settings
> 
> 
> http://imgur.com/4fC7I


Yep, I'm steering very clear of the ASUS... Just keeping my MSI for now, and adding another.

And just so everyone knows, 4K will add 3-5C to your core temp over anything you do at 1080p!

And forget 99% GPU usage; you will regularly hit the full 100%.


----------



## jackalopeater

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, I'm steering very clear of the ASUS... Just keeping my MSI for now, and adding another.
> 
> And just so everyone knows, 4k will add 3-5c to your core temp over anything you do at 1080!
> 
> And forget 99% gpu usage, you will regularly get the full 100%


100% agreed, and then some. I found my DCU II at 4K playing BF4 was in the upper 80s on the core with the fan at 75%.


----------



## Geoclock

What about the ASUS R9 390 heatsink and fan?
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, I'm steering very clear of the ASUS... Just keeping my MSI for now, and adding another.
> 
> And just so everyone knows, 4k will add 3-5c to your core temp over anything you do at 1080!
> 
> And forget 99% gpu usage, you will regularly get the full 100%


Would the MSI 390X be a waste of money?
Is the 390 the better choice?
I know the $100 price difference isn't worth it, but for $50 more is the 390X worth it over the 390?

PS: My problem is I have $300 credit at Newegg, and if I buy somewhere else, even cheaper, I have no idea how to spend the available credit; I'm not going to buy anything major in the next 2-3 years.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Geoclock*
> 
> What about the ASUS R9 390 heatsink and fan?
> Would the MSI 390X be a waste of money?
> Is the 390 the better choice?
> I know the $100 price difference isn't worth it, but for $50 more is the 390X worth it over the 390?
> 
> PS: My problem is I have $300 credit at Newegg, and if I buy somewhere else, even cheaper, I have no idea how to spend the available credit; I'm not going to buy anything major in the next 2-3 years.


$50 more? Yes, the X would be worth it; at $100 it's really, really questionable.


----------



## sportsczy

The DCU III is much better than Asus's DCU II. Night and day. They triple-fanned it and the VRM is adequately cooled. It overclocks well too (+63MHz, 1170/1700 without even pushing it at all; first attempt, and I got busy before I tested higher; bet I can get it to 1200+/1800+).

The only issue I have is that the fans get pretty loud past 65%. But the temp stays stable at 77 degrees with Heaven going for 10 mins at 65%, so it's OK. Changed the TIM.


----------



## Geoclock

S H ! T, I just bought a brand new unopened MSI 390X from eBay for $375.

I hope it will work as it should. The price at Newegg is $430, and the 390 is $335, so it's just $40 more.
I guess it is worth it at this price.


----------



## Dorland203

My MSI 390X at 1175/1700


----------



## Geoclock

Nice benchmarks for the MSI 390X. How cool does it run?


----------



## Dorland203

Quote:


> Originally Posted by *Geoclock*
> 
> Nice Benchmarks for MSI 390x, how cool it runs ?


Max temp is 78C after 1 loop of Valley. This is with my Cooler Master Elite 431 Plus case and 35C ambient temp.


----------



## Geoclock

Quote:


> Originally Posted by *Dorland203*
> 
> Max temp is 78C after 1 loop of Valley. This is with my Cooler Master Elite 431 Plus case and 35C ambient temp.


Nice, very nice. I'm glad I spent a little more for a quality card.
I have a Cooler Master Sniper, so the airflow should be good. I had an XFX 290X and returned it for credit; I guess I'll buy a 1440p monitor.
Thanks man.


----------



## sportsczy

Quote:


> Originally Posted by *sportsczy*
> 
> The DCU III is much better than Asus's DCU II. Night and day. They triple-fanned it and the VRM is adequately cooled. It overclocks well too (+63MHz, 1170/1700 without even pushing it at all; first attempt, and I got busy before I tested higher; bet I can get it to 1200+/1800+).
> 
> The only issue I have is that the fans get pretty loud past 65%. But the temp stays stable at 77 degrees with Heaven going for 10 mins at 65%, so it's OK. Changed the TIM.


Just to add... gamed for about 45 mins and VRM temps maxed at 77. Witcher 3 too, so loads were at or near max most of the time.


----------



## desetnik

Quote:


> Originally Posted by *sportsczy*
> 
> Just to add... gamed for about 45 mins and VRM temps maxed at 77. Witcher 3 too so loads were at max or near it most of the time.


Witcher really puts a load on cards; most of the time I see 100% usage.
Today I played W3 for 4 hours and my VRM never exceeded 70C while the core was 81C.
MSI truly does a good job with the VRMs.
Makes me wonder: if my airflow is bad, then why is the VRM in the ideal range? Or am I wrong?
Once I get rid of this core issue I'll start OCing her and post results.


----------



## sportsczy

Yeah, I'll take care of the Asus VRM issue very shortly. It's not even an issue, tbh, on the DCU III, but it can be better... going to get some VRM thermal pads that are thick enough to touch the piping of the heatsink. For some reason, they leave the VRM as-is with nothing to help. With the DCU II it was an absolute disaster, because the fans didn't provide adequate airflow to begin with. With the DCU III they moved to the 3-fan setup and the airflow is good, although a bit loud for my taste.

I'm also going to switch my PC case to a Fractal Design R5. That should pretty much solve the noise and case airflow issues I'm having right now. I eventually want to watercool, and I need a bigger case anyhow.

Anyone have an R5?


----------



## Darkeylel

Hmmmm, very curious to try out water cooling for this card, just for the quietness. Because god damn, when the fans wind up on my Gigabyte card it's like my whole case is going to fly away.....

Anyone got any pointers on where I can start? Are there any pre-built kits, like for the CPU coolers?


----------



## Noirgheos

Anyone here with an MSI 390 or 390X feel like doing me a little favour? Can you take pictures of it installed in your PC, at angles that show the GPU sag and the top of the backplate? Thanks in advance.


----------



## sportsczy

Quote:


> Originally Posted by *Darkeylel*
> 
> Hmmmm very curious to try out water cooling for this card just for quietness. Because god damm when the fans windup on my Gigabyte card it's like my whole case is going to fly away.....
> 
> Any one got any places where I can start ? Is there any pre built kits like for the CPU coolers ?


Another option is the Arctic Accelero Xtreme III. The install is a bit tricky because you have to do a small mod so that the power connector fits (use pliers). But other than that, it's massively quieter with significantly better cooling. The Prolimatech MK-26 is another great option, but you need 180mm of clearance for it to fit.

I always swap out the coolers on both my CPU and GPU. Stock cooling is pathetic no matter how you look at it. Watercooling gets expensive, so I usually go for aftermarket air cooling.


----------



## Darkeylel

Quote:


> Originally Posted by *sportsczy*
> 
> Another option is Arctic Accelero Xtreme III. Install is a bit tricky because you have to do a small mod so that power connector fits (use pliers). But other than that, it's massively more quiet and significantly better cooling. Prolimatech MK-26 is another great option, but you need 180mm CPU clearance for it to fit.
> 
> I always swap out coolers for both my CPU and GPU. Stock cooling is pathetic no matter how you look at it. Watercooling gets expensive so i usually go for after market air cooling.


I was looking at getting the NZXT G10; one of my mates has it running on his 290X and his temps are amazing haha


----------



## sportsczy

The only thing I would pay attention to on the G10 is how you're going to cool the other components and the VRM; that's my big question with that solution.


----------



## Darkeylel

Yeah, I'm not too sure; that's why I've come here. Not sure if the G10 even fits the 390X.


----------



## By-Tor

Quote:


> Originally Posted by *Darkeylel*
> 
> Hmmmm very curious to try out water cooling for this card just for quietness. Because god damm when the fans windup on my Gigabyte card it's like my whole case is going to fly away.....
> 
> Any one got any places where I can start ? Is there any pre built kits like for the CPU coolers ?


Having a Gigabyte card may make it hard to use a full-cover block. EK will only be producing full-cover blocks for XFX, Asus and PowerColor, but that's not to say another company won't produce one. Most block companies make universal blocks for the core, and then you can add heatsinks to the memory and VRMs.

You can find CPU-loop kits from most of the big names in water cooling.

It can be expensive and could cost as much as, if not more than, the items you're cooling, depending on the product choices you make.


----------



## rankdropper84

Are the other 390 cards really loud? My Sapphire is cool as a cucumber (max 67-69C) and dead quiet to boot. All these posts are making me question whether I should recommend non-Sapphire 390 cards to others.


----------



## rankdropper84

Also, my VRM temps according to GPU-Z never get above 70C. Usually they run a lot cooler than 70C though.


----------



## Agent Smith1984

Looks like I'm just keeping my MSI 390..

Was going to replace my busted board with an Asus Crosshair, but man, I just can't justify $200 on a mobo for AM3+...

Went with the dollar-friendly MSI 970 Gaming, which is PROVEN to handle 4.7-5GHz, and all I really want is my 4.8GHz back since I'm used to that anyway.

The cool thing is that this will be an MSI Gaming themed rig now, and the red/blacks will all be matched spot on.

Then once I add my second 390, I should be good to go for quite some time.

My next GPU change is going to come along with a new board/CPU/RAM....


----------



## Offender_Mullet

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks like I'm just keeping my MSI 390..
> 
> Was going to replace my busted board with Asus Crosshair, but man, I just can't justify $200 on a mobo for AM3+...
> 
> Went with the dollar friendly MSI 970 Gaming which is PROVEN to handle 4.7-5GHz, and all I really want is my 4.8GHz back since I'm used to that anyways.
> 
> The cool thing will be the fact that this will be an MSI Gaming themed rig now, and the red/blacks will all be matched spot on.
> 
> Then once I add my second 390, I should be good to go for quite some time.
> 
> Next GPU change for me is going to come along with board/CPU/and RAM....


If I knew your board blew, I would've sold you a Gigabyte 990FXA-UD5.

I'm going back and forth between the 390X and the 390. From all the reviews I've been poring through, there seems to be very little performance difference between the two to justify the $60 upcharge for the 390X. I know as soon as I buy one, Newegg will troll me and drop the price a couple of weeks later. They always do that. I'm getting so impatient though, ugh!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Offender_Mullet*
> 
> If I knew your board blew, I would've sold you a Gigabyte 990FXA-UD5.
> 
> I'm going back-and-forth between the 390X and 390. From all the reviews I've been pouring though, there seems very little performance difference between the 2 to justify the $60 upcharge for the 390x. I know as soon as I buy one, NewEgg will troll me and drop the price in a couple weeks. They always do that. I'm getting so impatient though ugh!


Honestly man, if you are getting the MSI 390, you are good to go. They really all seem to break a 1170MHz core clock, which will get you a tad more performance than a 390X at 1100MHz.









The XFX also seems to clock pretty good too.

Of course, if you do get the 390X, the core will likely hit 1180-1200MHz, putting it in the performance range of a 390 at roughly 1250+MHz, and that is a speed these cards simply will not run at.
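As a back-of-the-envelope check on the 390-vs-390X trade-off above, here's a sketch comparing raw shader throughput at those clocks (2560 vs 2816 stream processors are the published Hawaii/Grenada shader counts; note that real games scale sub-linearly with shader count, which is why an overclocked 390 can trade blows with a 390X despite the paper gap):

```python
# Theoretical single-precision throughput: shaders * 2 ops per cycle (FMA) * clock.
# This ignores memory bandwidth, front-end limits, and sub-linear game scaling,
# so treat it as an upper bound, not a benchmark prediction.

def tflops(shaders: int, mhz: int) -> float:
    """Peak FP32 throughput in TFLOPS for a GCN-style GPU."""
    return shaders * 2 * mhz * 1e6 / 1e12

r9_390_oc = tflops(2560, 1170)   # R9 390 overclocked to 1170MHz
r9_390x   = tflops(2816, 1100)   # R9 390X overclocked to 1100MHz

print(f"390 @ 1170MHz:  {r9_390_oc:.2f} TFLOPS")
print(f"390X @ 1100MHz: {r9_390x:.2f} TFLOPS")
print(f"390X advantage on paper: {r9_390x / r9_390_oc - 1:.1%}")
```

The paper advantage works out to only a few percent, small enough that game-to-game variation and sub-linear shader scaling can erase it.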


----------



## Offender_Mullet

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly man, if you are getting the MSI 390, you are good to go. They really all seem to break 1170 core clock, which will get you a tad faster than the 390X at 1100MHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The XFX also seems to clock pretty good too.
> 
> Of course, if you do get the 390X, the core will likely hit 1180-1200MHz, putting it in the performance range of a 390 at roughly 1250+ MHz, and that is a speed that these cards simply will not run at.


I'm most likely going with XFX but thanks though! Seeing more pics of the torn-down card, I like how they improved on their cooling from the previous terrible 'Ghost' solutions they used.


----------



## CamsX

Quote:


> Originally Posted by *rankdropper84*
> 
> Also my vrm temps according to gpuz never get above 70C. Usually they run alot colder than 70c though


Those VRM and core temps are perfectly normal for the Nitro and Tri-X cards, especially if you are gaming at 1080p. Sapphire has done a very good job with these cards' cooling solution. I would expect them to remain under 75°C at 1440p gaming without issues.

Once you start overclocking them, you are most likely limited by their internal configuration rather than by temperature.

I'm at 1100/1600 on stock voltages and still under 70°C in 1080p gaming with my Nitro. It's still the card I would recommend, if it fits your case.


----------



## gerpogi

Hey all, I've been trying to decide between the 390X and the 980 for a while now and finally jumped on the 390X ship, mostly because I think Nvidia is slipping =\. So count me in! I have the XFX model.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks like I'm just keeping my MSI 390..
> 
> Was going to replace my busted board with Asus Crosshair, but man, I just can't justify $200 on a mobo for AM3+...
> 
> Went with the dollar friendly MSI 970 Gaming which is PROVEN to handle 4.7-5GHz, and all I really want is my 4.8GHz back since I'm used to that anyways.
> 
> The cool thing will be the fact that this will be an MSI Gaming themed rig now, and the red/blacks will all be matched spot on.
> 
> *Then once I add my second 390, I should be good to go for quite some time*.
> 
> Next GPU change for me is going to come along with board/CPU/and RAM....


I'm thinking of Crossfiring my 290X Lightning, but I'm not sure which card to add: either another 290X, or go with a Grenada, in which case either a Sapphire or XFX 390/X.
Hopefully DX12 games will come with the ability to combine VRAM.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks like I'm just keeping my MSI 390..
> 
> Was going to replace my busted board with Asus Crosshair, but man, I just can't justify $200 on a mobo for AM3+...
> 
> Went with the dollar friendly MSI 970 Gaming which is PROVEN to handle 4.7-5GHz, and all I really want is my 4.8GHz back since I'm used to that anyways.
> 
> The cool thing will be the fact that this will be an MSI Gaming themed rig now, and the red/blacks will all be matched spot on.
> 
> Then once I add my second 390, I should be good to go for quite some time.
> 
> Next GPU change for me is going to come along with board/CPU/and RAM....


Well, I'll be curious to see how you go with that MSI board; I was pretty sure 4.7GHz was its limit for most chips, but hopefully you can prove me wrong there.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, I'll be curious to see how you go with that MSI board; I was pretty sure 4.7GHz was its limit for most chips, but hopefully you can prove me wrong there.


Well, I certainly plan on trying.

The reviews are mostly older, so the 4.7 average is coming at 1.45V-1.5V. I saw that and thought: okay, worst case I get 4.7GHz to tide me over until Zen (for $99, mind you); best case, I get my 4.9GHz back at 1.464V, since the board happily holds that kind of voltage.

If I blow it up, I'll RMA it, swap the new ASRock in, get the replacement for the MSI, sell them both for a cool $60 each as refurbs, and get a kitty for $160 (and just deal with the dookie until I upgrade everything)


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well I'll be curious to see how you got with that MSI board, i was pretty sure 4.7Ghz was it's limit for most chips but hopefullyyou can prove me wrong there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, I certainly plan on trying.
> 
> The reviews are mostly older, so the 4.7 average is coming at 1.45V-1.5V. I saw that and thought: okay, worst case I get 4.7GHz to tide me over until Zen (for $99, mind you); best case, I get my 4.9GHz back at 1.464V, since the board happily holds that kind of voltage.
> 
> If I blow it up, I'll RMA it, swap the new ASRock in, get the replacement for the MSI, sell them both for a cool $60 each as refurbs, and get a kitty for $160 (and just deal with the dookie until I upgrade everything)
Click to expand...

Haha....great plan


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Haha....great plan


LOL, sounded good to the wife too









I like testing, building and toying around ya know?

It'd be good to be able to tell people how the msi board does, and then also see if those nikos hold up for a year until zen comes.

That little board looks really good, so we shall see what happens.

If you can't tell by my rig pic, I like a little flash... it impresses all my console-enslaved friends and family and makes a nice conversation piece in my living room.


----------



## orlfman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well i certainly plan on trying.
> 
> The reviews are mostly older so the 4.7 average is coming at 1.45v - 1.5v. I saw that and thought, okay, worst case i get 4.7ghz to tie me over until zen (for $99 mind you), best case, i get my 4.9ghz back at 1.464v since the board is happily holding that kind of voltage.
> 
> If i blow it up, I'll rma, swap the new asrock in, get the replacement for the msi, sell them both for a cool $60 each as refurbs, and get a kitty for $160 ( and just deal with the dookie until i upgrade everything
> 
> 
> 
> 
> 
> 
> 
> )


Kitty as in a cat? Paying for a kitty?! I didn't have to pay for my cats... they sorta came to me and decided to live in my house.

I have one cat personally, a stray I found as a three-week-old in my backyard. Heard this high-pitched "squueeeee" in my backyard and wondered what it was, only to find a small little kitten with closed eyes under my chair. Recently a momma cat that I've noticed roaming around my neighborhood for about a year gave birth to two kittens in my living room... I have a "cattie" door so my cat can go out in the backyard to roam around... she apparently invited the momma cat in as a guest.

one of the furballs


http://imgur.com/pQorZo9


----------



## Oregonduck007

count me in!


----------



## bazookatooths

Current RANK 6


----------



## Oregonduck007

My first shot at overclocking my R9 390 went well: 1129MHz core and 1638MHz memory on stock voltage. Score in Unigine Valley was 2837.

http://www.3dmark.com/3dm11/10285675


----------



## Mysticking32

Just updating on my status. So, to recap: my R9 280X started artifacting and I sent it to MSI to have it repaired. But I was in the market for a new graphics card, so to replace it I bought the Sapphire R9 390 Nitro. Wonderful card and wonderful temps. Sadly the overclocking was very limited, and it only reached around 1080/1700MHz. I returned the card and forked over an extra 100 to buy the MSI R9 390X. Again a great card, but the first one I got was reaching temps over 90 degrees C in The Witcher 3, so I had to take that back. The replacement they sent got to 83 in The Witcher 3, so I was happy with my purchase.

Fast forward to the status of my R9 280X. About a week ago I got an email from their customer service asking if I would accept a swap of my 280X for a 390X. I'm like, uhhhh, yeah. lol. Anyways, I couldn't believe they gave me a free upgrade. Today the card arrived and I have it installed now. I'm going to take the other card back to Amazon for a refund. (I was going to CrossFire, but I would have to buy a new mobo and a new power supply.)

The temps are great and I haven't gone above 70 in The Witcher 3. I've attached some benchmarks. Ran a **** ton of them lol.

http://www.3dmark.com/compare/fs/5964091/fs/5948277


----------



## Renner

Folks, I have an update regarding coil whine. A friend of mine just complained about it on her XFX 390X, and even posted a video showing it:




Does that sound like coil whine?

She played with the voltages and clocks, though. Here are the rest of the specs: 4790K @ 4.6GHz (H100i), 16GB Kingston HyperX Savage 2400, 2x 250GB SSD, 2x 120GB SSD, 2TB HDD, 1.5kW PSU (AX1500i).


----------



## Agent Smith1984

Quote:


> Originally Posted by *orlfman*
> 
> kitty as a cat? paying for a kitty?! i didn't have to pay for my cats... they sorta came to me and decided to live in my house.
> 
> i have one cat personally, a stray i found as a three week old in my backyard. herd this high pitch "squueeeee" in my backyard and wondered what it was. only to find a small little kitten with closed eyes under my chair. recently a moma cat that i have noticed roaming around my neighborhood about a year ago gave birth to two kittens in my living room... i have a "cattie" door for my cat can go in the backyard to roam around... she apparently invited in the moma cat in as a guest.
> 
> one of the furballs
> 
> 
> http://imgur.com/pQorZo9


LOL!

They are cute when they're little aren't they?

But i was referring to this kitty:


----------



## desetnik

Quote:


> Originally Posted by *Renner*
> 
> Folks, I'm having an update regarding coil whines. Friend of mine just complained about those with her XFX390X, and even posted a video showing it:
> 
> 
> 
> 
> 
> 
> Does that seem like a coil whine?
> 
> She played with the voltages and clocks though. This is the rest of the specs: 4790K @ 4.6 GHz, (h100i), 16GB Kingston HyperX Savage 2400, 2x 250GB SSD, 2x 120GB SSD, 2 TB HDD, 1.5kW PSU (Ax1500i).


Hmm. I had exactly the same whine with older drivers, 15.5 or 15.6, I forget which, but that was on an old 7870. It never made those sounds until I installed those drivers. I didn't test it with the 390, so I can't be sure; it seems like a very rare thing to happen. If she has older drivers, that could be the cause for some reason. It was loud, but only with the card under load. Try removing the drivers and installing new ones. I bet lots of people won't believe me, but it happened, and I couldn't believe it myself. Just in case, try this.


----------



## OutrideGaming

Quote:


> Originally Posted by *flopper*
> 
> Watercooling seldom give much more headroom as the limit of the cards are what they are.
> 100mhz maybe in OC difference but you really watercool for temps and noise normally.
> However watercooling is the way to go for these


Haha, I see. Well, the thing is, running FurMark at just 1080p and leaving it for about 20 minutes, it'll keep climbing until it reaches about 90 degrees C. At that very moment the screen lags, and it starts dropping GPU usage until it cools down to 85C again, then keeps going until it hits 90, rinse and repeat. Idk why. I don't understand how I'm at 100% fan speed and it just will not stop climbing in temp :/


----------



## bazookatooths

Quote:


> Originally Posted by *Renner*
> 
> Folks, I'm having an update regarding coil whines. Friend of mine just complained about those with her XFX390X, and even posted a video showing it:
> 
> 
> 
> 
> 
> 
> Does that seem like a coil whine?
> 
> She played with the voltages and clocks though. This is the rest of the specs: 4790K @ 4.6 GHz, (h100i), 16GB Kingston HyperX Savage 2400, 2x 250GB SSD, 2x 120GB SSD, 2 TB HDD, 1.5kW PSU (Ax1500i).


From what I've read, it's all about electric fields bouncing around off different components, and it seems random; I've never experienced it myself.
I've seen people solve the issue by:
1> waiting it out - it goes away by itself after a while
2> getting a different power supply
3> getting rid of / swapping components you don't use, for less signal bouncing

I'm not an expert on this, just what I've read on the internet. That PSU seems about 3x overkill.

EDIT: I would take it back if mine was making that noise. Have you always experienced coil whine with all your video cards?


----------



## Renner

Quote:


> Originally Posted by *bazookatooths*
> 
> EDIT: I would take it back if mine was making that noise. Have you always experienced coil whine with all your video cards?


Yeah, like these...


----------



## Oregonduck007

Not sure if i could hear anything over that music.


----------



## flopper

Quote:


> Originally Posted by *OutrideGaming*
> 
> Haha I see. Well the thing is though. Running FurMark at just 1080p, and leaving it for about 20 minutes, it'll keep climbing until it reaches about 90 degrees C. At that very moment, the screen lags, and it starts dropping gpu usage until it cools down to 85 C again, then keeps going until it hits 90, rinse and repeat. Idk why. I don't understand how I'm getting 100% fan usage and it just will not stop climbing in temp :/


FurMark will burn your card.
Do not run it.

Gaming doesn't come close to pushing a card the way FurMark does, and that's why running it can blow up your card.
DON'T RUN FURMARK


----------



## battleaxe

Quote:


> Originally Posted by *OutrideGaming*
> 
> Haha I see. Well the thing is though. Running FurMark at just 1080p, and leaving it for about 20 minutes, it'll keep climbing until it reaches about 90 degrees C. At that very moment, the screen lags, and it starts dropping gpu usage until it cools down to 85 C again, then keeps going until it hits 90, rinse and repeat. Idk why. I don't understand how I'm getting 100% fan usage and it just will not stop climbing in temp :/


Your card is throttling. It's protecting itself.
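
The throttle-and-recover cycle described in the FurMark post above (GPU usage dropping at ~90C, resuming around 85C) is classic hysteresis control. Here's a minimal sketch of that logic in Python; the clock and threshold values are example numbers of my own choosing, not AMD's actual PowerTune parameters:

```python
# Hysteresis throttling, roughly how the card protects itself:
# cut clocks when the hot threshold is crossed, restore them only
# after the temperature falls back below a lower threshold.
THROTTLE_AT = 90   # degrees C, start protecting the card (example)
RESUME_AT = 85     # degrees C, safe to restore full clocks (example)
FULL_CLOCK = 1040  # MHz, stock boost (example value)
LOW_CLOCK = 950    # MHz, throttled clock (example value)

def next_clock(temp, current_clock):
    """Return the clock for the next interval given the current temp."""
    if temp >= THROTTLE_AT:
        return LOW_CLOCK
    if temp <= RESUME_AT:
        return FULL_CLOCK
    return current_clock  # between thresholds: keep whatever we had

# Simulate the rinse-and-repeat cycle seen in FurMark:
clock = FULL_CLOCK
history = []
for temp in [80, 86, 90, 92, 88, 85, 83, 89, 90]:
    clock = next_clock(temp, clock)
    history.append(clock)
print(history)
```

The gap between the two thresholds is why you see the card bounce between 85 and 90 instead of pinning at one temperature.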


----------



## Gumbi

Quote:


> Originally Posted by *battleaxe*
> 
> Your card is throttling. It's protecting itself.


Protecting itself from its abusive owner!


----------



## Geoclock

Quote:


> Originally Posted by *Renner*
> 
> Yeah, like these...


Holy S H 1 T, that was NEW AMERICAN RAP, wasn't it? L O L









Did you see the system was like 5 years uncleaned of dust? That PC was "PROTESTING" against its owner


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Protecting itself from its abusive owner!


Most people don't know that a testing/benchmark program could hurt a piece of hardware. Logic would indicate that it's a stability test, not something destructive. So I think most consumers trust FurMark, even though we know not to... Still, I don't think someone is 'dumb' because they use FurMark. Sometimes when we are new at this, we just don't know any better. Gotta learn somehow, and we all start somewhere, ya know?


----------



## sportsczy

Quick Q... is a VRM max temp of 82C safe? Doing some overclocking, and the VRM is the sticking point. That will all change soon with some upgrades I have in mind. But for the time being, is that too high or OK?


----------



## sportsczy

Quote:


> Originally Posted by *battleaxe*
> 
> Most people don't know that a testing/benchmark program could hurt a piece of hardware. Logic would indicate that its a stability test, not something destructive. So I think most consumers trust Furmark, even though we know not to... Still, I don' think someone is 'dumb' because they use Furmark. Sometimes when we are new at this, we just don't know any better. Gotta learn somehow and we all start somewhere, ya know?


So is Heaven better or Valley? Seems Valley is less strenuous. But there is a point where you want to make sure that the card is truly stable. If Valley is enough, i'll just use that.


----------



## Gumbi

Quote:


> Originally Posted by *sportsczy*
> 
> So is Heaven better or Valley? Seems Valley is less strenuous. But there is a point where you want to make sure that the card is truly stable. If Valley is enough, i'll just use that.


Heaven is a bit more stressful (use tessellation and AA). Generally they are stressful enough. A game like Witcher 3 or Crysis 3 might be more stressful though, and will sniff out instability fairly quickly.

82C VRM is fine. They are specced up to 125C, but keeping them under 85C is desirable (though letting them get higher is not gonna damage your card...).


----------



## sportsczy

Thank you.


----------



## Renner

Quote:


> Originally Posted by *Geoclock*
> 
> Holly S H 1 T, that was NEW AMERICAN RAP wasn't it ? L O L
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you see system was like 5 year uncleaned from dust, that PC was "PROTESTING" against owner


Yeah, I know. We were telling him but he didn't care, because "In soviet Russia - dust filter keeps you"...


----------



## OutrideGaming

Alright so stay away from Furmark. Lesson learned, harshly, but learned







haha, but what program should I use to test my GPU's ability to overclock?

And will it always throttle? That's actually really awesome that it does that









*Opens case. Pats GPU* I'm sorry baby. No more furs, maybe some boots?


----------



## Dundundata

Got my XFX installed, but I can't for the life of me install the AMD driver. I've run the Catalyst install manager and it crashes when trying to install the driver (Win7). I've tried uninstalling, deleting everything AMD on the C drive and in the registry, and running DDU. Happens on 15.7.1 and 15.8.


----------



## sportsczy

Just a quick update on my Asus DCU3... without adding any voltage: +10%/1140/1600. Tried 1150, but saw some artifacts. Temps were 78 degrees on the core and 84 degrees on the VRM.


----------



## sportsczy

Quote:


> Originally Posted by *Dundundata*
> 
> Got my XFX installed but I can't for the life of me install the amd driver. I've run the catalyst install manager and it crashes when trying to install the driver (win7). I've tried uninstalling, deleting everything amd on c drive and registry and running DDU. Happens on 15.7.1 and 15.8.


Try going to device manager and installing driver with the "have disk" option.


----------



## Dorland203

Quote:


> Originally Posted by *OutrideGaming*
> 
> Alright so stay away from Furmark. Lesson learned, harshly, but learned
> 
> 
> 
> 
> 
> 
> 
> haha but what program should I use to test my gpu's ability to overclock?
> 
> And will it always throttle? That's actually really awesome that it does that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Opens case. Pats GPU* I'm sorry baby. No more furs, maybe some boots?


I use Heaven or Valley for a quick test. I run 3 rounds of Heaven (or Valley). If it doesn't crash, I play a heavy game for a final test. I recommend Crysis 3 at the highest playable settings, or Metro Last Light. Run the game for at least 1 hour. If it passes 1 hour, I say it is a stable overclock.
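
The benchmark half of that routine can be scripted. Here's a rough sketch, assuming your benchmark can be launched from the command line and exits non-zero when it crashes; the `heaven_cmd` path and flags below are placeholders, not Heaven's actual CLI:

```python
import subprocess

def run_stability_rounds(cmd, rounds=3):
    """Run the benchmark command `rounds` times; return how many passed.

    A crash or driver reset normally surfaces as a non-zero exit code,
    so any failed round means the overclock isn't stable yet.
    """
    passed = 0
    for i in range(rounds):
        result = subprocess.run(cmd)
        if result.returncode == 0:
            passed += 1
        else:
            print(f"Round {i + 1} failed (exit code {result.returncode})")
            break
    return passed

# Placeholder command -- substitute your actual benchmark launcher:
heaven_cmd = ["heaven.exe", "-fullscreen", "1"]  # hypothetical path/flags
# if run_stability_rounds(heaven_cmd) == 3:
#     print("Passed 3 rounds -- now go play Crysis 3 for an hour.")
```

The hour of gaming afterwards still matters: games hit the card with workloads a looping benchmark never does.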


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Heaven is a bit more stressful (use tess and AA). Generally they are stressful enough. A game like Witcher 3 or Crysis 3 might be more stressful though, and will sniff out instability fairly quickly.
> 
> 82c VRM is fine. They are specced upto 125c, but keeping them under 85 is desirable (though letting them get higher is not gonna damage your card...).


Quote:


> Originally Posted by *OutrideGaming*
> 
> Alright so stay away from Furmark. Lesson learned, harshly, but learned
> 
> 
> 
> 
> 
> 
> 
> haha but what program should I use to test my gpu's ability to overclock?
> 
> And will it always throttle? That's actually really awesome that it does that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Opens case. Pats GPU* I'm sorry baby. No more furs, maybe some boots?


Quote:


> Originally Posted by *Dorland203*
> 
> I use Heaven or Valley for a quick test. I run 3 rounds of Heaven (or Valley). If it doesn't crash, I play a heavy game for a final test. I recommend Crysis 3 at the highest playable settings, or Metro Last Light. Run the game for at least 1 hour. If it passes 1 hour, I say it is a stable overclock.


Heaven is a bit tougher to pass in my experience also. Then on to your games to give it some real-world scenarios. Benching is just a rough indicator, and usually games are a bit tougher, at least ones like BF4, Crysis 3, etc...

While 82C is fine, the VRMs start to lose their efficiency around 75C. So you may notice you don't get much more OC once reaching those temps and above. It's not going to kill them, but you won't gain much either.


----------



## Dundundata

Quote:


> Originally Posted by *sportsczy*
> 
> Try going to device manager and installing driver with the "have disk" option.


Is there a way to download only the driver? I'm only seeing it bundled with the install manager.


----------



## sportsczy

You don't need to download the driver only... click the "update driver" option and then choose "browse my computer for driver software". You can pick Catalyst there and it will run the install. Hopefully that solves it. If not, the next option is to go to the graphics card maker's site and download the correct BIOS (very important that you get the right one) and then flash your card with that BIOS. Back up the current BIOS with GPU-Z first, to be safe, in case you need to revert. Sometimes the issue is that the BIOS is corrupted.


----------



## sportsczy

7-Zip will unpack an .exe file if it comes to that... but I'm not sure you can just install a driver file without the rest of it on either Nvidia or AMD. You'd need to research that.


----------



## battleaxe

delete


----------



## CamsX

Quote:


> Originally Posted by *Dundundata*
> 
> Is there a way to download only the driver, I'm only seeing it bundled with the install manager


Make sure you are using DDU in SAFE MODE, and then reinstall the driver in normal mode.

I've used 15.7 in Windows 8.1 and 15.8 in Windows 10, both without problems. My biggest problem was getting black screen issues by uninstalling the driver (I think it was 15.6) in normal mode by mistake, and then having a hard time getting into safe mode. Bloody awful Windows 8 design.


----------



## Judge Dredd 3D

Looking for some one who could share the XFX 390X Black edition bios...
Thanks!


----------



## CC268

Does anyone have any info on the ASUS Strix 390 (NOT THE 390X)? I am interested in buying one, but want to make sure it is a good card. I like the MSI card too but the ASUS will go with my build better (color wise).


----------



## Dundundata

Thanks, sorry if this was a bit OT. I fixed it...by installing Windows 10.


----------



## sportsczy

The big thing with the Catalyst drivers in Windows 10 is that movies stutter. Tried MPC-HC, Potplayer and even
Quote:


> Originally Posted by *CC268*
> 
> Does anyone have any info on the ASUS Strix 390 (NOT THE 390X)? I am interested in buying one, but want to make sure it is a good card. I like the MSI card too but the ASUS will go with my build better (color wise).


I have it. I'm assuming you mean the DCU3 gaming version with 3 fans. The components are great and it OCs well. But the cooling is subpar compared to MSI, Sapphire and XFX. Because of that, you need to rev the fans higher, and noise is a problem. Since I was going to swap the cooling out anyhow, I was more curious than concerned about it. If you're going to keep everything stock, then the others are better according to almost everyone here.


----------



## diggiddi

Quote:


> Originally Posted by *sportsczy*
> 
> So is Heaven better or Valley? Seems Valley is less strenuous. But there is a point where you want to make sure that the card is truly stable. If Valley is enough, i'll just use that.


I have found that Fire Strike is the most stressful; if it passes Fire Strike it'll pass Valley and Heaven, but not the other way around.


----------



## Oregonduck007

HELP! My MSI R9 390 is making a terrible whine at full load. Some people say it's the PSU, but I'm fairly certain it is not. I have a Thermaltake Toughpower 750 Gold. Any thoughts? Or should I just return the card?


----------



## Judge Dredd 3D

Are you 100% sure that the PSU is not the problem?
I had a Corsair HX1050 that was making all of my video cards whine like crazy; I had it replaced and that solved the problem...
I tried with my Asus GTX 680 DirectCU II, Sapphire R9 Fury Tri-X OC, Zotac GTX 980 AMP! Extreme, PowerColor R9 290X and my current XFX R9 390X. Try another PSU if you have one available. GL


----------



## Oregonduck007

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> Are you 100% sure that the PSU is not the problem?
> I had a Corsair HX1050 and was making all of my VCs while like crazy, I had it replaced and solved the problem...
> I tried with my Asus GTX680 DirectCUII, Sapphire R9 Fury Tri-X OC, Zotac GTX980 Amp! Extreme, Power Color R9 290X and my current XFX R9 390X. try another PSU if you have one available. GL


Nope. I'm trying different games to see if the whine persists. So far when I try to play Sniper Ghost Warrior 2, the game itself is all messed up, with invisible AI and my character having a black box over most of the screen, to go along with that terrible whine. My first thought would be to remove the card and uninstall the drivers, etc.
When I play Far Cry 4 the whine is not nearly as bad, but still noticeable.


----------



## Judge Dredd 3D

Go to guru3d.com and download DDU (Display Driver Uninstaller), reboot into safe mode and uninstall all drivers, then download the WHQL drivers and try again.
If you have another video card, try that and run a test game to see if the whine continues.
If the problem disappears, it means the video card was bad and needs an RMA/refund.
If the problem persists with another video card, it means your PSU is bad and a replacement is due.

Hope for the best!


----------



## Oregonduck007

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> Go to guru3d.com and download DDU Unistaller, reboot in to safe mode and uninstall all drivers, download WHQL drivers and try again.
> If you have another video card try that and run a test game so see if the whine continues.
> If the problem despairs it means the VC was bad and needs RMA/Refund.
> If the problem persist after using another Video card it means your PSU is bad and replacement its due.
> 
> Hope for the best!


Thanks for the advice! Let's hope it works!


----------



## tangelo

Quote:


> Originally Posted by *Oregonduck007*
> 
> Not sure if i could hear anything over that music.


Missed the joke, did we?


----------



## Dundundata

Please add me to da club!











Stock cooling


----------



## Dundundata

Quote:


> Originally Posted by *Renner*
> 
> Folks, I'm having an update regarding coil whines. Friend of mine just complained about those with her XFX390X, and even posted a video showing it:
> 
> 
> 
> 
> 
> 
> Does that seem like a coil whine?
> 
> She played with the voltages and clocks though. This is the rest of the specs: 4790K @ 4.6 GHz, (h100i), 16GB Kingston HyperX Savage 2400, 2x 250GB SSD, 2x 120GB SSD, 2 TB HDD, 1.5kW PSU (Ax1500i).


I just got my XFX 390 hooked up and ran one game, and I can definitely hear some coil whine. I don't think it's that bad personally, especially with all the sounds of the game going. It's not constant. I have an RM750i PSU, so same brand. Also same chip.


----------



## kalidae

My Sapphire Nitro 390 at 1170/1750, max temp of 73 degrees at 65% fan speed.


----------



## Gumbi

Quote:


> Originally Posted by *kalidae*
> 
> My sapphire nitro 390 at 1170 / 1750 max temp of 73 degrees at 65% fan speed.


Sweet dude... How hot do the VRMs get? What kind of voltage you pushing to get that core clock? Those speeds are great!


----------



## kalidae

Quote:


> Originally Posted by *Gumbi*
> 
> Sweet dude... How hot do the VRMs get? What kind of voltage you pushing to get that core clock? Those speeds are great!


Thanks man. +100mV in Afterburner, and my VRMs are at 73 and 80 degrees with one loop in Valley, but playing BF4 for an hour to check overclock stability I saw VRM maxes of 78 and 92 degrees. I was able to get the GPU core to 1200MHz, but I got artifacts, so I kept backing down the core until they were gone. 1180 was good, but then I started running 3DMark 11 and Fire Strike and still had artifacts, so I backed it down to 1170 and it's all good. I did get 1800MHz on the memory and could have gone higher; it passed benches fine, but the score didn't increase. I actually lost points, so I went back to 1750 on the memory. This was also with a max fan speed of 61%.


----------



## imrunning

Hi all, I just got my MSI R9 390 a few weeks back. I've been happy with its performance so far, but the temps I'm getting are the worrying part. The past few weeks I've been making a lot of changes to my 4-year-old rig, from replacing the 80mm fans with 120mm ones, to purchasing a new PSU, to finally deciding to get a new case. Perhaps this is why it took me a while to finally sign up to this official thread to seek advice from similar users.

My Rig (current)
Case - Corsair Carbide 200R
Proc - i5 2500K @ 4.2GHz with Cooler Master V8 HSF
Motherboard - Asus P8P67 Deluxe
GFX - MSI R9 390 Gaming 8G (stock)
PSU - Super Flower Leadex Gold 650W 80+ Gold
HDD - 2x 650GB WD Black, 1x 50GB Seagate (old HDD)
RAM - 2x 4GB 1066 DDR3 Corsair Vengeance
Optical drive - 1x DVD/RW

Fans
- 2x Corsair SP120 Performance Edition (front intake)
- 1x Corsair AF120 Performance Edition (side panel intake)
- 1x Corsair 120mm stock case fan (bottom intake)
- 1x Corsair 120mm stock case fan (top intake)
- 1x Corsair AF120 Performance Edition (exhaust)
* Please see the attached pic for the case layout (and yes, I know my rig now sounds like a turbine)

I'm currently in Singapore, with ambient temps of 28-32 degrees Celsius. My card idles around 40-45C, but gaming or FurMark will take it to a max of 94C, after which the core speed starts throttling down to a range of 990-1030MHz (1040MHz stock). Now, I've tried a lot of combinations for my fans, negative pressure and positive pressure, and all of them result in reaching max temp under load; the only difference is how long it takes to get there.

I even tried switching on the room air conditioning, setting it to 21C, and letting FurMark run, and the max stable temp I got was 83-84C. Now, this might be an ideal temp for this card, but considering most reviews and users hit max load temps of 70-80C in their rigs, it might not be sustainable for me if I have to run my rig in an air-conditioned room every time I want to play games. (Let me know if I'm wrong in this area.)

I even tried bringing the card back to the local distributor for RMA, but when they tested it in their center, the card hovered around 80-84C on an open test bench running FurMark, which they said is normal, so they wouldn't exchange the card. (Their test center was air-conditioned.) I asked their advice on whether I'm allowed to remove the card's HSF and apply my own thermal paste, and they said MSI might void the warranty; however, they advised me to email MSI personally about this.

Pardon the lengthy post, but I've run out of ideas for reaching the temps for this card shown by reviews, benchmarkers, and similar users here. I appreciate any advice from you guys! Thanks in advance!


----------



## kalidae

Quote:


> Originally Posted by *imrunning*
> 
> Hi all, just had my msi r9 390 a few weeks back. Been happy with its performance so far but the temps i got is the worrying part. Pass few weeks i've been doing a lot of changes to my 4yr old rig from replacing the 80mm fans to 120mm, purchasing new psu and finally decided to get new case. Perhaps this is why it took me a while to finally decided to sign up to this official thread to seek similar users' advice.
> 
> My Rig (current)
> Case - Corsair Carbide 200r
> proc - i5 2500k @4.2 ghz with Coolermaster v8 hsf
> motherboard - Asus p8p97 deluxe
> gfx - Msi r9 390 gaming 8G (stock)
> psu - superflower leadex gold 650w 80+ gold
> hdd - 2 x 650gb WD black 1x 50gb seagate (old hdd)
> rams - 2x 4gb 1066 ddr3 corsair vengeance
> opticle drive - 1x dvd/rw
> 
> fans
> - 2x corsair SP120 performance endition (front intake)
> - 1x corsair AF120 performance edition (side panel intake)
> - 1x corsiar 120mm stock case fan (bottom intake)
> - 1x corsair 120mm stock case fan (top intake)
> - 1x corsair AF120 performance edition (exhaust)
> * please see attached pic for case layout ( and yes i know my rig now sounds like a turbine)
> 
> I'm currently in Singapore with avg temp of 28-32 degrees celcius. My card idle around 40-45C. But on gaming or furmark it will reach max 94C after which the core speed will start throttling down to a range of 990-1030mhz. (1040mhz stock). now I've tried alot of combination for my fans, negative pressure and positive pressure and all will results in reaching max temp during load, difference is by how long it will reach to max temp.
> 
> I even tried switching on the room air condition and set it to 21C and let furmark run and max stable temp i got was 83-84C. Now this might be ideal temp for this card but taking into account most reviews and user able to hit max load temp of 70+ to 80 in their rig might not be sustainable for me if i were to run my rig in air con room each time i were to play games. (let me know if i'm wrong in this area)
> 
> I've even tried bringing back the card to local distro for rma but when they test in their center, the card hovers around 80-84C in open test bench running furmark. Which they mentioned is normal hence unable to have the card change. (their test center was air condition. I've tried to seek their advice to whether am i allowed to remove the card's hsf and apply my own thermal paste and they said that MSI might void the warranty. However they advice to have me email personally to MSI on this matter.
> 
> Pardon for this lengthy post but I've run out of ideas on reaching desired temps for this card as shown by reviews, bench markers and similar users here. Appreciate any advice from you guys! Thanks in advance!


Hi, you shouldn't really run FurMark. It makes cards run hotter than they would in real-world scenarios like gaming. 90 degrees is crazy hot, and I do believe FurMark would force that, but gaming shouldn't get it that high. Run Valley on a loop for 30 minutes and see what temps you get.


----------



## Gumbi

80C isn't bad if your ambients are that high. Most people's ambients are ~20C, not 30.

Next, STOP USING FURMARK. It will throttle virtually any card, no matter the cooling.

Do a loop of Heaven. What do you get? Also, make a custom fan curve in MSI Afterburner; that might help.
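
For what it's worth, a custom fan curve is just a set of (temperature, fan %) points with linear interpolation between them. Here's a quick sketch of how a curve like the one you'd draw in Afterburner maps temperature to fan speed; the points below are example values for illustration, not a recommendation:

```python
# Example fan curve points: (temp C, fan %). The curve editor
# interpolates linearly between points; below the first point and
# above the last, the fan speed is clamped.
CURVE = [(40, 25), (60, 45), (75, 70), (85, 100)]

def fan_speed(temp):
    """Linearly interpolate the fan % for a given GPU temperature."""
    if temp <= CURVE[0][0]:
        return CURVE[0][1]
    if temp >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp <= t1:
            return s0 + (s1 - s0) * (temp - t0) / (t1 - t0)

print(fan_speed(50))  # halfway between the 40C and 60C points
print(fan_speed(90))  # above the last point, clamped to max
```

The point of a steeper curve is to get the fan ramping before the card reaches its throttle point, rather than after.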


----------



## CC268

Quote:


> Originally Posted by *sportsczy*
> 
> The big thing with the Catalyst drivers in Windows 10 is that movies stutter. Tried MPC-HC, Potplayer and even
> I have it. I'm assuming you mean the DCU3 gaming version with 3 fans. Components are great and it OCs well. But cooling is subpar compared to MSI, Sapphire and XFX. Because of that, you need to rev fans higher and noise is a problem. SInce i was going to swap cooling out anyhow, i was more curious than concerned about it. If you're going to keep everything stock, then the others are better according to almost everyone here.


I was looking at this card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121974&nm_mc=KNC-GoogleAdwords-PC&cm_mmc=KNC-GoogleAdwords-PC-_-pla-_-Desktop+Graphics+Cards-_-N82E16814121974&gclid=Cj0KEQjwvdSvBRDahavi3KPGrvUBEiQATZ9v0LDrRaQFv7IHTSdMXk_xC8VKu0B0PNZghXAexG93UAUaAgpT8P8HAQ&gclsrc=aw.ds

Is that the same as yours?

I hate how the Sapphire looks; it would just ruin the look of my build.


----------



## Agent Smith1984

Quote:


> Originally Posted by *imrunning*
> 
> Hi all, just had my msi r9 390 a few weeks back. Been happy with its performance so far but the temps i got is the worrying part. Pass few weeks i've been doing a lot of changes to my 4yr old rig from replacing the 80mm fans to 120mm, purchasing new psu and finally decided to get new case. Perhaps this is why it took me a while to finally decided to sign up to this official thread to seek similar users' advice.
> 
> My Rig (current)
> Case - Corsair Carbide 200r
> proc - i5 2500k @4.2 ghz with Coolermaster v8 hsf
> motherboard - Asus p8p97 deluxe
> gfx - Msi r9 390 gaming 8G (stock)
> psu - superflower leadex gold 650w 80+ gold
> hdd - 2 x 650gb WD black 1x 50gb seagate (old hdd)
> rams - 2x 4gb 1066 ddr3 corsair vengeance
> opticle drive - 1x dvd/rw
> 
> fans
> - 2x corsair SP120 performance endition (front intake)
> - 1x corsair AF120 performance edition (side panel intake)
> - 1x corsiar 120mm stock case fan (bottom intake)
> - 1x corsair 120mm stock case fan (top intake)
> - 1x corsair AF120 performance edition (exhaust)
> * please see attached pic for case layout ( and yes i know my rig now sounds like a turbine)
> 
> I'm currently in Singapore with avg temp of 28-32 degrees celcius. My card idle around 40-45C. But on gaming or furmark it will reach max 94C after which the core speed will start throttling down to a range of 990-1030mhz. (1040mhz stock). now I've tried alot of combination for my fans, negative pressure and positive pressure and all will results in reaching max temp during load, difference is by how long it will reach to max temp.
> 
> I even tried switching on the room air conditioning, set it to 21C, and let Furmark run; the max stable temp I got was 83-84C. Now this might be an ideal temp for this card, but considering most reviews and users hit max load temps of 70-something to 80 in their rigs, it may not be sustainable for me if I have to run my rig in an air-conditioned room every time I want to play games. (Let me know if I'm wrong on this.)
> 
> I've even tried bringing the card back to the local distributor for RMA, but when they tested it at their center, the card hovered around 80-84C on an open test bench running Furmark, which they said is normal, hence they were unable to replace the card. (Their test center was air-conditioned.) I asked whether I'm allowed to remove the card's HSF and apply my own thermal paste, and they said MSI might void the warranty; however, they advised me to email MSI personally about this.
> 
> Pardon the lengthy post, but I've run out of ideas for reaching the temps this card shows in reviews, benchmarks and similar users' rigs here. Appreciate any advice from you guys! Thanks in advance!


You have to get more air exiting your case!
You're blowing a bunch of air in, which helps some, but that air is immediately mixed with the hot air the card dumps out. The card is smoldering in its own heat.

Also, as stated over and over on this very page: do not test Hawaii-based cards with Furmark!

I am going to add this to the cooling section of the op tomorrow.....
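For anyone wanting to sanity-check their own airflow balance, here's a rough sketch of the arithmetic. The CFM numbers below are illustrative guesses, not measured ratings for these exact fans:

```python
# Rough intake-vs-exhaust balance check. The CFM ratings below are
# illustrative guesses, not measured values for any specific fans.
intake_cfm = [62, 62, 52, 40, 40]   # 2x front, 1x side, bottom + top
exhaust_cfm = [52]                  # a single rear exhaust

total_in, total_out = sum(intake_cfm), sum(exhaust_cfm)
print(f"intake {total_in} CFM vs exhaust {total_out} CFM")
if total_in > 2 * total_out:
    # Far more air pushed in than pulled out: the GPU's hot exhaust
    # lingers and recirculates instead of leaving the case.
    print("heavily intake-biased: flip a top fan to exhaust")
```

When intake dwarfs exhaust like this, the case pressurizes and the card's own hot exhaust has nowhere to go but back through the cooler.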


----------



## CC268

I guess MSI is my best bet for a red and black build and something that has good OC ability and good cooling?


----------



## kalidae

Quote:


> Originally Posted by *CC268*
> 
> I was looking at this card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121974&nm_mc=KNC-GoogleAdwords-PC&cm_mmc=KNC-GoogleAdwords-PC-_-pla-_-Desktop+Graphics+Cards-_-N82E16814121974&gclid=Cj0KEQjwvdSvBRDahavi3KPGrvUBEiQATZ9v0LDrRaQFv7IHTSdMXk_xC8VKu0B0PNZghXAexG93UAUaAgpT8P8HAQ&gclsrc=aw.ds
> 
> Is that the same as yours?
> 
> I hate how the Sapphire looks it would just ruin the look of my build.


I felt the same way about the Sapphire card; it's pretty plain looking, and there are other 390's out there that look better and have a backplate. My build is black and red, though, so the black Nitro card still fits in, even if an MSI or Asus would look better. Still, the Sapphire runs so cool and quiet that it's one of the best 390's; if it had a backplate, I think it would be the best 390 available.







I really need to do something about that blue LED on the card though... Sapphire, why did you make it blue? White would have been more neutral and better for every build, and damn it, why no backplate?!


----------



## CC268

I appreciate your feedback! I just wish I could get some more info on the ASUS Strix card - everyone claims it is very hot and loud, in which case I would just buy the MSI card.


----------



## Dundundata

Tested out some Witcher 3 with the XFX; this game looks amazing! I am running a 1015 core clock and got 1080/60 with only Hairworks off and foliage distance set to High (and blur off). Temps got up to 72C on the GPU @ 50% fan speed. This is without any tweaking and I am already super impressed. I'll do a custom fan curve next; unfortunately, with my new PSU I can't fit my bottom intake fan anymore. I didn't even notice any coil whine; it was noticeable when I ran Witcher 1, though not bad really.


----------



## kalidae

Quote:


> Originally Posted by *CC268*
> 
> I appreciate your feedback! I just wish I could get some more info on the ASUS STRIX card - overall it seems like a good card?


It's overpriced imo. It's a good card and it looks awesome, but it only outperforms the other brands' 390's out of the box; once you overclock them all, they can all beat or match the Strix, it just comes down to luck. It runs hotter and is louder than the Sapphire Nitro 390, but it looks better than the Sapphire, and the Sapphire is way cheaper. Really it's up to you which you want, and whether you want to pay more for the looks and the name. The word "Strix" adds a premium.


----------



## kizwan

Quote:


> Originally Posted by *imrunning*
> 
> Hi all, just had my msi r9 390 a few weeks back. Been happy with its performance so far but the temps i got is the worrying part. Pass few weeks i've been doing a lot of changes to my 4yr old rig from replacing the 80mm fans to 120mm, purchasing new psu and finally decided to get new case. Perhaps this is why it took me a while to finally decided to sign up to this official thread to seek similar users' advice.
> 
> My Rig (current)
> Case - Corsair Carbide 200r
> proc - i5 2500k @4.2 ghz with Coolermaster v8 hsf
> motherboard - Asus p8p97 deluxe
> gfx - Msi r9 390 gaming 8G (stock)
> psu - superflower leadex gold 650w 80+ gold
> hdd - 2 x 650gb WD black 1x 50gb seagate (old hdd)
> rams - 2x 4gb 1066 ddr3 corsair vengeance
> opticle drive - 1x dvd/rw
> 
> fans
> - 2x corsair SP120 performance endition (front intake)
> - 1x corsair AF120 performance edition (side panel intake)
> - 1x corsiar 120mm stock case fan (bottom intake)
> - 1x corsair 120mm stock case fan (top intake)
> - 1x corsair AF120 performance edition (exhaust)
> * please see attached pic for case layout ( and yes i know my rig now sounds like a turbine)
> 
> I'm currently in Singapore with avg temp of 28-32 degrees celcius. My card idle around 40-45C. But on gaming or furmark it will reach max 94C after which the core speed will start throttling down to a range of 990-1030mhz. (1040mhz stock). now I've tried alot of combination for my fans, negative pressure and positive pressure and all will results in reaching max temp during load, difference is by how long it will reach to max temp.
> 
> I even tried switching on the room air condition and set it to 21C and let furmark run and max stable temp i got was 83-84C. Now this might be ideal temp for this card but taking into account most reviews and user able to hit max load temp of 70+ to 80 in their rig might not be sustainable for me if i were to run my rig in air con room each time i were to play games. (let me know if i'm wrong in this area)
> 
> I've even tried bringing back the card to local distro for rma but when they test in their center, the card hovers around 80-84C in open test bench running furmark. Which they mentioned is normal hence unable to have the card change. (their test center was air condition. I've tried to seek their advice to whether am i allowed to remove the card's hsf and apply my own thermal paste and they said that MSI might void the warranty. However they advice to have me email personally to MSI on this matter.
> 
> Pardon for this lengthy post but I've run out of ideas on reaching desired temps for this card as shown by reviews, bench markers and similar users here. Appreciate any advice from you guys! Thanks in advance!


Flip the fan up top from intake to exhaust. Add one more fan up top, also exhaust.


----------



## Agent Smith1984

Quote:


> Originally Posted by *kizwan*
> 
> Flip the fan up top from intake to exhaust. Add one more fan up top, also exhaust.


100% agreed


----------



## CC268

Wait, the Strix card is the cheapest for me? It's $319... MSI is $340 and Sapphire is $330.


----------



## imrunning

Quote:


> Originally Posted by *Gumbi*
> 
> 80 c isn't bad if your ambients are so high. Most people's ambients are ~20c. Not 30.
> 
> Next, STOP USING FURMARK. It will throttle virtually any card, no matter the cooling.
> 
> Do a loop of Heaven. What do you get? Also, make a custom fan curve in MSI Afterburner, that might help.


Okay, noted on not running Furmark. I only ran Furmark because the guys at the RMA center ran it to test my card. Anyway, any game or 3DMark or Heaven test I run will surely reach 94C; it just takes a little longer to get there. The only game that ran cool was Arkham Knight, and that was because it was locked at 30fps. When unlocked to 120fps, it reached 94C too.

@agent smith: I guess I missed the warning not to run Furmark. Haha! I will switch the top intake to exhaust in this case.

@gumbi: so with my ambient temp at 30C, is it normal to hit 94C at max load in a closed-case environment?

Oh yeah, one test I didn't mention: I was able to hit an 84C max temp under load without the aircon, by pointing a small box fan directly at my CPU with the side panel removed.


----------



## kizwan

Quote:


> Originally Posted by *imrunning*
> 
> okay noted on not to run furmark. just ran furmark cause the guys at RMA center ran furmark to test my card. anyways any games or 3dmark or heaven test i ran will surely reach 94C. its just a little longer to reach. The only game that ran cool was arkham knight and that was because it was lock at 30fps. However if unlock to 120fps, it reached 94C too.
> 
> @agent smith : I guess i miss out reading the warning not to ran furmark, agent smith. haha! i will switch the top intake to exhaust in this case.
> 
> @gumbi: that means with my ambient temp at 30, is it normal to hit 94C on max load in a close case environment?
> 
> Oh yah, also i did one test that i did not mentioned, was able to hit 84C max temp on load w/o needing the aircon. Which was to place a small box fan directly to my cpu with side panel remove.


Configure a custom fan profile using MSI Afterburner. You can get it down to the 80s, maybe 70s Celsius, without aircon.


----------



## imrunning

Quote:


> Originally Posted by *kizwan*
> 
> Configure custom fan profile using MSI Afterburner. You can get it down to 80s, maybe 70s Celsius without aircon.


Already did a custom fan profile in that software. I've set the card's fans to run at max speed at 80C, which is around 2200+ RPM.


----------



## kalidae

Quote:


> Originally Posted by *CC268*
> 
> Wait the STRIX card is the cheapest for me? It is $319...MSI is $340 and Sapphire is $330


That's lucky; over here in Australia the Strix is $599 and the Nitro is $509.


----------



## Agent Smith1984

My MSI 390 tops out at 74C with +100mV/+50mV offsets, set to 1200/1750...

How does it manage those temps?

By setting a custom fan curve that hits 90% @ 80C, and running two high-flow exhaust fans...

Prior to adding the high-flow fans (95 CFM each), my temps were hitting 90C with +100mV, and the OC was limited to 1180MHz... That was with two stock 120mm case fans that only moved around 40 CFM each.


----------



## imrunning

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My msi 390 tops out at 74c with 100mv/50mv offset set 1200/1750....
> 
> How does it manage those temps?
> 
> By setting a custom fan curve that hits 90% @ 80c, and running two high flow exhaust fans...
> 
> Prior to adding the high flow fans (95cfm each), my temps were hitting 90c with 100mv, and the OC was limited to 1180mhz... That was with two stock120mm case fans that only moved around 40cfm each.


That's a good temp, I wish mine was like that... what is your ambient temp?


----------



## bazookatooths

I think Heaven will show you results very quickly; crank everything up to max at your resolution.
Quote:


> Originally Posted by *Dundundata*
> 
> Tested out some Witcher 3 with the XFX, this game looks amazing! I am running core clock 1015 and got 1080/60 only turning Hairworks off and setting foliage distance to High (and I turned off blur). Temps got up to 72 on the GPU @ 50% fan speed. This is without any tweaking and I am already super impressed. I'll do a custom fan curve next, unfortunately with my new PSU I can't fit my bottom intake fan anymore. I didn't even notice any coil whine, it was noticeable when I ran Witcher 1, though not bad really.


Yes, I had the same issue with my PSU; I had to remove my bottom intake. Luckily my side vent is super low, but I'm still thinking of getting a bigger case.


----------



## kalidae

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My msi 390 tops out at 74c with 100mv/50mv offset set 1200/1750....
> 
> How does it manage those temps?
> 
> By setting a custom fan curve that hits 90% @ 80c, and running two high flow exhaust fans...
> 
> Prior to adding the high flow fans (95cfm each), my temps were hitting 90c with 100mv, and the OC was limited to 1180mhz... That was with two stock120mm case fans that only moved around 40cfm each.


Really nice overclock, Agent Smith. How loud is your computer with your GPU fans at 90% and your 95 CFM fans at 100%?


----------



## Renner

Smith, can you post us your fan curve, pls?

Also, I don't do a lot of benchmarking, but I do play games... With the default fan curve I can reach 80 degrees in some titles while using virtual super resolution via the driver. One such game is Batman: Arkham Knight. After it was patched recently, on entering it, it set itself to max settings and 3200x1800 (the max my 23-inch monitor can emulate), and these are the benchmark results:



I didn't show this screen before. In the game itself I'm easily reaching 30fps or more. I just briefly mentioned that fact on our domestic IT forum, and some dude was so enraged by it that he actually created an account there just to send me a PM calling me a liar and a jerk: his overclocked i7 4790, overclocked 970 and 32GB of RAM show a slideshow at that res, so why do I lie, above all to myself; his 970 is the same strength as my card, if not stronger; what's the point of bragging about all that VRAM when you can't turn on the most important features; that's Radeon, strength on paper... and so on.

I mean, lol. I guess the double bandwidth and memory on this card are really helping. It's not my fault he was a sucker.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Renner*
> 
> Smith, can you post us your fan curve, pls?
> 
> Also, I don't do a lot of benchmarking, but I do play games... At default fan curve I can reach 80 degrees in some titles while using virtual super resolution via driver. One of such games is Batman Arkham Knight. After it was patched recently and entering it, it set itself at max settings and 3200x1800 res (max that my 23 inch monitor can emulate), and these are benchmark results:
> 
> 
> 
> I didn't show this screen before. In game itself I'm reaching 30fps or more easily. I just briefly mentioned that fact on our domestic IT forum, and some dude was so enraged by that he actually created an acc there just to send me a PM, calling me a liar, a jerk, my clocked i7 4790, clocked 970, 32GB of RAM are showing slides at that res. Why do you lie, above all - you lie to yourself. My 970 is the same strength as your card, if not stronger. Whats worth bragging with all of that VRAM when you can't turn the most important features, that's Radeon - strength on a paper... And so on...
> 
> I mean lol. I guess double the bandwith and memory on this card are really helping. Its not my fault he was a sucker.


Sorry man, whole rig down until Wednesday waiting on new mobo.

Will get it for you when I'm up again.


----------



## sportsczy

You do realize that the extra VRAM is designed to keep performance up when playing at 1440p and even 4K, right? For 1080p, the GTX 970 is fine. But it's useless at higher resolutions... I'm coming from a 970 that drove me crazy with stuttering at 1440p. That's all gone now. The reason? You don't have 4GB with the 970, but 3.5GB. That's become an issue now, although it wasn't when the card came out, since higher-resolution gaming was practically non-existent.

Also, "emulate" doesn't mean the graphics card is delivering that resolution; it's downsampling. All it really does is refine AA. It's COMPLETELY DIFFERENT from a graphics card rendering actual 1440p or higher. It also depends heavily on the VRAM requirements of each game, and new games have become much, much heavier. Try it with Witcher 3 at 1440p (like I did) and see what happens lol.

You're the one who needs to stop fooling himself.
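For a sense of the load VSR/DSR adds: the card renders the full higher resolution internally before scaling it down, so the cost grows roughly with pixel count. A quick back-of-the-envelope using the 3200x1800 figure mentioned above:

```python
# VSR/DSR renders internally at the higher resolution and scales down,
# so GPU load grows roughly with pixel count:
native = 1920 * 1080     # a native 1080p panel
vsr = 3200 * 1800        # the VSR resolution mentioned above
print(f"{vsr / native:.2f}x the pixels of native 1080p")  # 2.78x
```

So "emulated" 3200x1800 is nearly triple the shading work of 1080p, which is why VRAM and bandwidth start to matter so much.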


----------



## Renner

Quote:


> Originally Posted by *sportsczy*
> 
> You do realize that the extra VRAM is designed to keep performance up when playing at 1440p and even 4K, right? For 1080p, the GTX 970 is fine. But it's useless at higher resolutions... I'm coming from a 970 that drove me crazy with stuttering at 1440p. That's all gone. The reason? You don't have 4GB with the 970 but 3.5GB. That's become an issue now although it wasn't when it came out, since higher-resolution gaming was practically non-existent.
> 
> Also, "emulate" doesn't mean that the graphics card is delivering that resolution. It's downsampling. All it does is really refine AA. It's COMPLETELY DIFFERENT than a graphics card rendering actual 1440p or higher. It also highly depends on the VRAM requirements of each game and the new games have become much much heavier. Try it with Witcher 3 at 1440p (like I did) and see what happens lol.
> 
> You're the one that needs to stop fooling himself.


I know the 970 is a 1080p card, and about its 3.5GB issue, but tell that bit about fooling yourself to the dude who sent me that message, who keeps telling himself that his card keeps up with this one (relative to this very thread) at those settings, while calling me a liar. He also emulated those res options via the GeForce driver, btw. I'm just laughing at how silly and butthurt some people can be over these things. Reminds me of this:


----------



## Dundundata

I'm getting >4GB on Witcher 3

I was looking up fan curves and found one of Agent Smith's posts. Right now I have it set at 35% up to 50C, then 80% at 80C and 100% at 90C. Simple, and it lowered my temps a few degrees; I'm mostly getting high 60s to 70. Been running Heaven and bumped my card to 1100/1600 on stock voltage with good temps; my room is rather cool today. I have a spare Noctua now that I might swap in for the stock exhaust fan.

I've also noticed the coil whine is directly related to FPS, which is why I noticed it in Witcher 1 and not 3. What made me realize this is that whenever I quit Heaven, the frame rate jumps to over 8000 for a couple of seconds.

All-in-all I am very impressed with this card. Next step is to see how far I can push it.
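A curve like that is just piecewise-linear interpolation between (temp, fan %) points, which is what Afterburner does between the points you drag on its graph. A minimal sketch using the numbers above:

```python
# Piecewise-linear fan curve: interpolate fan % between (temp C, fan %)
# points. Default points mirror the curve described above
# (35% up to 50C, 80% at 80C, 100% at 90C).
def fan_speed(temp_c, points=((50, 35), (80, 80), (90, 100))):
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if temp_c <= t1:
            # Linear blend between the two surrounding points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]  # past the last point: hold max speed

print(fan_speed(40))   # 35 (below the first point)
print(fan_speed(65))   # 57.5 (halfway between 50C and 80C)
print(fan_speed(85))   # 90.0
```

The flat 35% floor keeps idle quiet, while the steep ramp past 80C throws noise at the problem only when the card actually needs it.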


----------



## sportsczy

Ah ok... Well, it's a shame to get butthurt when you're proving your point with a false example lol. Sorry about that.

The 970 is right at the edge where 3.5GB is going to be a real problem.


----------



## Oregonduck007

Quote:


> Originally Posted by *kalidae*
> 
> My sapphire nitro 390 at 1170 / 1750 max temp of 73 degrees at 65% fan speed.


Nice! did you change the voltage at all?


----------



## Mr.Pie

Quote:


> Originally Posted by *sportsczy*
> 
> Ah ok... Well it's a shame to get butthurt when you're proving your point with a false example lol. Sorry about that.
> 
> The 970 is right at the edge where 3.5GB is going to be a real problem.


Batman Arkham knight for me fully maxed at 1440p (nvidia gameworks off) peaked about 4.2GB VRAM usage. So glad I got a 390 instead!


----------



## CamsX

Quote:


> Originally Posted by *Oregonduck007*
> 
> Nice! did you change the voltage at all?


He did, +100mV. Would it be possible to reach a similar overclock with less voltage?


----------



## Gumbi

Quote:


> Originally Posted by *Oregonduck007*
> 
> Nice! did you change the voltage at all?


He said 100mv.


----------



## Oregonduck007

Quote:


> Originally Posted by *Renner*
> 
> I know that 970 is a 1080p card, and about its 3.5GB issue, but tell the thing about fooling yourself to that dude who sent me that message, who keeps telling himself that his card keeps with this one (related to this very thread) at those settings, and calling me a liar. He also emulated those res options over GF driver, btw. I'm just laughing how silly and butthurt some people can be over some things. Reminds me of this:


That video almost had me in tears!


----------



## Oregonduck007

Quote:


> Originally Posted by *Dundundata*
> 
> I'm getting >4GB on Witcher 3
> 
> I was looking up fan curves and I found one of agent smith's posts. Right now I have it set at 35% to 50C, then 80/80, 100/90. Simple and it lowered my temps a few degrees, mostly getting high 60s-70. Been running Heaven and bumped my card to 1100/1600 on stock voltage with good temps, my room is rather cool today. I have a spare Noctua now I might swap out with the stock exhaust fan.
> 
> I've also noticed the coil whine is directly related to FPS, which is why I noticed it in Witcher 1 and not 3. What made me realize this is whenever I quit Heaven the frames jump to over 8000 for a couple seconds.
> 
> All-in-all I am very impressed with this card. Next step is to see how far I can push it.


After testing my card over and over to pinpoint my coil whine, I concluded that it is indeed caused by massive frame jumps as well. Not sure if I should limit my MSI's frame rate to 120-ish...
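A frame cap just sleeps out the remainder of each frame's time budget, which is what RTSS and driver limiters do under the hood; it curbs the uncapped spikes (menus, quitting Heaven) that excite coil whine. A minimal sketch of the idea; `render_frame` and the 120fps figure are placeholders, not any specific limiter's API:

```python
import time

# Cap a render loop at `cap_fps` by sleeping out each frame's leftover
# time budget, instead of letting the GPU spin at thousands of fps.
def frame_limited_loop(render_frame, cap_fps=120, frames=3):
    budget = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # idle instead of rendering 8000 fps

frame_limited_loop(lambda: None)  # even a do-nothing "frame" takes 1/120 s
```

Capping at (or slightly above) your refresh rate costs nothing visible and keeps the VRMs from being hammered by uncapped frame spikes.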


----------



## Mr.Pie

Quote:


> Originally Posted by *Oregonduck007*
> 
> After testing my card over and over to pin point my coil whine, i concluded that it is indeed being caused by massive frame jumps as well. Not sure if i should limit my MSI's frame rate to 120ish..


my MSI R9 390 only has coil whine when I launch heaven or close it down........no coil whine at all in any other application.

Strange


----------



## Sgt Bilko

Quote:


> Originally Posted by *CamsX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Oregonduck007*
> 
> Nice! did you change the voltage at all?
> 
> 
> 
> He did, +100mV. Would it be possible to reach a similar overclock with less voltage?
Click to expand...

Mine does 1150/1700 on stock voltage









Only on heaven did i notice a couple of artifacts so daily for me is 1125/1700


----------



## bazookatooths

Managed to get to 104% Rank 4







, Anyone on this board 109% Rank 1?

To all the guys with 3D Mark is it worth the price of purchase vs. the free one to compare to other users?
Thinking about purchasing it as benchmarking is so much fun.


----------



## Oregonduck007

Quote:


> Originally Posted by *Mr.Pie*
> 
> my MSI R9 390 only has coil whine when I launch heaven or close it down........no coil whine at all in any other application.
> 
> Strange


No whine at all. Strange indeed.
No whine at all when I run Heaven on my MSI 390. I got this score while I was sharing my screen with a buddy. On a side note, it says Windows 8 but I'm running Windows 10 64-bit.


----------



## Oregonduck007

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine does 1150/1700 on stock voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only on heaven did i notice a couple of artifacts so daily for me is 1125/1700


Sweet, dude! I managed 1130/1660 on stock voltage with very little to no artifacting.


----------



## Dundundata

How can you tell when the artifacts are "good enough"? I've run Heaven and seen massive artifacts/colors, and run it with different settings and only seen smaller ones (like little bluish dots on the cannon).


----------



## Noirgheos

Quote:


> Originally Posted by *Dundundata*
> 
> How can you tell when the artifacts are "good enough"? I've run Heaven and saw massive artifacts/colors, and run it with different settings and only seen smaller ones (like little blueish dots on the cannon)


The moment you see an artifact that means it's unstable. Dial it back until you don't.

You have an XFX 390X correct? Would you mind telling me your idle/load temps and posting your fan curve?

Also, to the general public of this thread:

XFX or MSI for cooling?


----------



## battleaxe

Quote:


> Originally Posted by *Noirgheos*
> 
> The moment you see an artifact that means it's unstable. Dial it back until you don't.
> 
> You have an XFX 390X correct? Would you mind telling me your idle/load temps and posting your fan curve?
> 
> Also, to the general public of this thread:
> 
> XFX or MSI for cooling?


Neither. Powercolor PCS.


----------



## Dundundata

Quote:


> Originally Posted by *Noirgheos*
> 
> The moment you see an artifact that means it's unstable. Dial it back until you don't.
> 
> You have an XFX 390X correct? Would you mind telling me your idle/load temps and posting your fan curve?
> 
> Also, to the general public of this thread:
> 
> XFX or MSI for cooling?


I have the XFX 390 (non X) with temps 35/70. I just set this up so still testing and making changes but this is my current fan curve.


----------



## kalidae

Quote:


> Originally Posted by *CamsX*
> 
> He did, +100mV. Would it be possible to reach a similar overclock with less voltage?


On stock voltage I was able to run 1125/1600. I think I could run the memory higher, but when I benched it, it didn't perform any better on stock volts, so I left it at 1600. Last night I was bored, so I thought, right, I'm going to mess around and see what this card can do. I went straight to MSI Afterburner, and since I knew these 390's don't really go over 1200MHz on the core, I clocked it to 1200 and added the maximum +100mV, then started benching. Valley gave me artifacts straight away but didn't crash, which suggests that with more voltage 1200 would be achievable. I backed it off to 1190 and still had some artifacts; at 1180 it was fine, no artifacts, and it passed all the Unigine benchmarks. Then I started the Futuremark benches and, yep, artifacts. Dropped the core back to 1170: no artifacts, successful benchmark. With the memory I bumped it straight up to 1700 and everything benched fine, same at 1750; at 1800 there were no issues running benches, but my scores actually decreased 20 points, so I dropped it back to 1750 and called it a day.

It seems at +100mV, 1170 on the core is as far as I can go without artifacts. I think reducing the voltage would probably cause artifacts again, but I didn't try it. Honestly, adding the extra 100mV didn't make my card much hotter; even with the fan at 61%, the card only reached the very low 70s, like 72, which is about 3 degrees hotter than at stock volts with the same fan curve.

The 390 Nitro at completely stock settings out of the box:


The Nitro at +100mV, 1170/1750:


It added 9 fps to my average, but in games like BF4 it feels like more than that. I run a 144Hz monitor with all settings on ultra, and the frames stay over 100 much more consistently, for the most part in the 120s, depending on the map and location of course.
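The step-down search described above (start at an artifacting clock, back off until a bench passes clean) can be sketched as a simple loop. `passes_benchmark` here is a stand-in for an actual Valley/3DMark run plus an eyeball check for artifacts, not a real API:

```python
# Sketch of the manual OC search: step the core clock down from an
# artifacting point until a benchmark run is clean.
# `passes_benchmark` is a stand-in for a real Valley/3DMark pass.
def find_stable_core(start_mhz, passes_benchmark, step=10):
    clock = start_mhz
    while not passes_benchmark(clock):
        clock -= step  # artifacts seen: back off one step
    return clock

# Pretend anything at or below 1170 MHz runs artifact-free:
stable = find_stable_core(1200, lambda mhz: mhz <= 1170)
print(stable)  # 1170
```

In practice you'd use 10-25MHz steps, settle on the highest clean clock, then re-run the heaviest benchmark a few times to confirm it stays artifact-free.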


----------



## Noirgheos

Quote:


> Originally Posted by *Dundundata*
> 
> I have the XFX 390 (non X) with temps 35/70. I just set this up so still testing and making changes but this is my current fan curve.


Is it loud?


----------



## Judge Dredd 3D

Quote:


> Originally Posted by *Noirgheos*
> 
> The moment you see an artifact that means it's unstable. Dial it back until you don't.
> 
> You have an XFX 390X correct? Would you mind telling me your idle/load temps and posting your fan curve?
> 
> Also, to the general public of this thread:
> 
> XFX or MSI for cooling?


I have the XFX 390X; going past 40% fan speed gets pretty noisy. I tried 50% and it was loud... very.
If you want maximum cooling and silent operation, get the Sapphire. If you want balanced cooling and decent looks, get the MSI. Steer clear of the Asus; it gets pretty hot at idle and load.


----------



## Allan P

I'm planning to buy a R9 390/390x soon. Are there any good reasons to justify buying the 390x instead of a 390?


----------



## Dundundata

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> I have the XFX 390X; going past 40% fan speed gets pretty noisy. I tried 50% and it was loud... very.
> If you want maximum cooling and silent operation, get the Sapphire. If you want balanced cooling and decent looks, get the MSI. Steer clear of the Asus; it gets pretty hot at idle and load.


EDIT: Well, now that you mention it, I guess they do make noise. I sit a bit away from the computer and must not be that sensitive to it. Actually I find the sound of fans rather soothing! I don't find it bad or distracting, especially with game audio playing, but I suppose I can see how it could be. I never even noticed until you posted this and I went and listened!


----------



## Judge Dredd 3D

Quote:


> Originally Posted by *Allan P*
> 
> I'm planning to buy a R9 390/390x soon. Are there any good reasons to justify buying the 390x instead of a 390?


The R9 390 for 1080p and the R9 390X for 1440p gameplay; get the Sapphire or MSI for better cooling. The MSI 390X comes clocked at 1100MHz.


----------



## diggiddi

Quote:


> Originally Posted by *Allan P*
> 
> I'm planning to buy a R9 390/390x soon. Are there any good reasons to justify buying the 390x instead of a 390?


2816 stream processors vs 2560.


----------



## kalidae

Quote:


> Originally Posted by *bazookatooths*
> 
> 
> 
> Managed to get to 104% Rank 4
> 
> 
> 
> 
> 
> 
> 
> , Anyone on this board 109% Rank 1?
> 
> To all the guys with 3D Mark is it worth the price of purchase vs. the free one to compare to other users?
> Thinking about purchasing it as benchmarking is so much fun.


I had never seen this UserBenchmark before, so thank you very much for showing me it; it's pretty damn cool. I ran a test on mine. It won't let me upload the picture for some reason, but the overall component status was 96%; my i5 4690K scored rank 2 at 103% and my Nitro 390 scored rank 3 at 106%. My card is pretty heavily overclocked, so to score 109% they would have to be running 1200MHz on the core.

3DMark is worth buying. I also love benching my system and components, and it's handy to have not just for benching but also for checking stability; 3DMark found instability where Unigine Heaven and Valley did not. Also, if you love benching, get into overclocking both the CPU and GPU and watch your scores rise. Your FX CPU should be able to reach 5GHz or pretty damn close to it.


----------



## Allan P

I guess I'll choose the 390X then. Thanks to both of you. I'm actually going to get the Best Buy-exclusive XFX one for the lifetime warranty and watercool it. I shall look into stream processors to educate myself on what they are.


----------



## kalidae

Quote:


> Originally Posted by *Allan P*
> 
> I guess I'll choose the 390x then. Thanks to both of you's. I'm actually going to get the Best-buy exclusive XFX one for the lifetime warranty and watercool it. I shall look into Stream Core Processors to educate myself on what these are.


I was in the same situation, but over here 390's are in the $500-$600 range and 390X's are $600+, so I thought stuff it, I'll get a 390 and overclock it; overclocked, it does outperform a stock 390X, though if you overclock the 390X it will of course beat a 390. That said, I don't know if it's worth the extra $150 or so for me.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dundundata*
> 
> How can you tell when the artifacts are "good enough"? I've run Heaven and saw massive artifacts/colors, and run it with different settings and only seen smaller ones (like little blueish dots on the cannon)


Zero artifacts should be what you aim for, artifacting means the clocks are unstable and can damage your GPU over time.
Quote:


> Originally Posted by *Allan P*
> 
> I guess I'll choose the 390x then. Thanks to both of you's. I'm actually going to get the Best-buy exclusive XFX one for the lifetime warranty and watercool it. I shall look into Stream Core Processors to educate myself on what these are.


A simple way of putting it: more SPs = more FPS.

There are also ROPs and TMUs that factor in a bit as well, but that's a bit more in depth.


----------



## Noirgheos

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> I have the XFX 390X, going past 40% fan speed will get pretty noisy, I tried 50% and was loud... very.
> If you want maximum cooling and silent operation get the Sapphire. If you want balanced cooling and decent looks get the MSI. Stay clear of the ASUS; it gets pretty hot at idle/load.


How were your temps at below 50% fans? Load temps I mean.

I also would get Sapphire, but they don't have a backplate. I've also heard bad things about the MSI for temps... especially here.


----------



## jaydude

I just noticed something on my Gigabyte 390: there's no difference in fan speed between 85% and 100% in Afterburner; RPM is 3600 at both settings.


----------



## imrunning

Quote:


> Originally Posted by *kalidae*
> 
> Hi, you shouldn't really run furmark. It makes cards run hotter than they would in real world scenarios like gaming. 90 degrees is crazy hot and i do believe furmark would force that but gaming it shouldn't get that high. Run valley on a loop for 30 minutes and see what temps you get.


I've run Unigine's Valley and it hits 94°C in around 10 minutes. I've also tried the advice of flipping my top intake to exhaust, but it still gives me 94°C max in the same time frame. What I notice is that the VRM1 graph hits a max of 84°C while VRM2 is stuck at 47°C.

Attached some pics of temp graph...

This graph is with 5 intake 1 exhaust


This graph is with 4 intake and 2 exhaust


Fan profile


----------



## bazookatooths

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> R9 390 for 1080p and R9 390X for 1440p gameplay, get the Sapphire or Msi for better cooling. The msi 390x comes with 1100mhz gpu.


Quote:


> Originally Posted by *Dundundata*
> 
> EDIT: Well now that you mention it I guess they do make noise. I sit a bit away from the computer and I must not be that sensitive to it. Actually I find the sound of fans rather soothing! I don't find it to be bad or distracting especially with audio from games playing but I suppose I see how it could be. I never even noticed until you posted this and I went and gave a listen!


I agree with Dundundata, I can't hear mine until 60%+. I have the XFX model. OC'd @ 1175/1720 it stays @ 65°C gaming for hours @ 50-55% fan speed, so I cannot hear it over my CPU/case fans.
Also clocked my FX-8120 up to 4.45GHz, so a bit hotter temps overall for me. I would clock my CPU back down to 4.2GHz to cool the case temps down if the GPU fans were too loud, but that is not an issue here.


----------



## Judge Dredd 3D

Quote:


> Originally Posted by *bazookatooths*
> 
> I agree with Dundundata, I cant hear mine until 60%+, I have the XFX model. OC @ 1175/1720 stays @65c gaming for hours @50-55% fan speed so I cannot hear it, over my CPU/Case fans.
> Also clocked my fx8120 up to 4.45ghz so a bit hotter temps overall for me. I would clock my CPU back down to 4.2ghz to cool the case temps down, if the GPU fans were too loud, but that is not an issue here.


I have a Cooler Master HAF and a Cooler Master Nepton 280L; everything else is pretty quiet in my rig, maybe that's why I can hear the GPU fans spinning at more than 40%... that case has little to no sound insulation.


----------



## kalidae

Quote:


> Originally Posted by *imrunning*
> 
> I've ran Unigine's valley and hits 94C in around 10 mins... I've also tried the advice of flipping my top intake to exhaust but still giving me 94C max in same time frame. what i notice is that VRM1 graph hits max of 84 while VRM2 stuck at 47C.
> 
> Attached some pics of temp graph...
> 
> This graph is with 5 intake 1 exhaust
> 
> 
> This graph is with 4 intake and 2 exhaust
> 
> 
> Fan profile


Wow dude, okay, that sucks. My only thought is that the heatsink isn't making full contact. Your VRMs are running cool and must have full contact, yet your GPU is really hot, and I'm wondering if your heatsink is actually mounted at a very slight angle so it makes contact with one but not the other. Since the company you bought it from won't let you return it, maybe you should just remove the heatsink, apply some good thermal paste, and remount it and see how you go, because something is really not right; your system has a lot of airflow and your GPU should be running way cooler than that. Or you could probably bypass the place you bought it from, get in touch with MSI (or whoever it is) directly, and explain the situation. As soon as you mention the shop used FurMark for testing, MSI will know straight away that the shop doesn't know what they are doing.


----------



## bazookatooths

Quote:


> Originally Posted by *kalidae*
> 
> I had never seen this userbenchmark before so thank you very much for showing me this. It's pretty damn cool. I ran a test on mine. It won't let me upload the picture for some reason but overall component status was 96% my i5 4690k scored rank 2 103% and my nitro 390 scored rank 3 106%. My card is pretty heavily overclocked so to score the 109% they would have to be running 1200mhz on the core.
> 
> The 3D marks are worth buying. I also love benching my system and components. The 3D marks are handy to have not just for benching but also for checking stability. 3dmark found instability where unigine heaven and valley did not. Also if you love benching get into overclocking both cpu and gpu and watch your scores rise. Your fx cpu should be able to reach 5ghz or pretty damn close to it.


Excellent ranks, I'm jealous!

:thumb: Yes, this chip would easily hit 5GHz; I haven't even gone over 1.4 volts yet. Problem is no water cooling, I'd have to buy a bigger case for a double rad. LIFE GOALS!
I want to get an Intel CPU, I love the results everyone is having with them, although I have to wait till I find a good deal, no Microcenters around here. Okay, I will definitely be buying 3DMark then, more fun benching!


----------



## Judge Dredd 3D

Quote:


> Originally Posted by *Dundundata*
> 
> EDIT: Well now that you mention it I guess they do make noise. I sit a bit away from the computer and I must not be that sensitive to it. Actually I find the sound of fans rather soothing! I don't find it to be bad or distracting especially with audio from games playing but I suppose I see how it could be. I never even noticed until you posted this and I went and gave a listen!


Oops... I hope now that you are aware of it, it will not be a distraction. I sit next to it at ear level and the case has no noise insulation, so yeah... it's not horrible but it's there.


----------



## imrunning

Quote:


> Originally Posted by *kalidae*
> 
> Wow dude okay that sucks. My only thought is that the heatsink isn't making full contact, your vrms are running cool and must have full contact for your gpu is really hot and I'm wondering if your heatsink is actually mounted on a very slight angle as to make contact with one but not the other. Since this company you bought it from won't let you return it then maybe you should just remove the heat sink and apply some good thermal paste and remount it and see how you go because something is really not right, your system has a lot of airflow and your gpu should be running way cooler than that. Or you could probably bypass the place you bought it from and get in touch with msi or whoever it is directly and explain the situation. As soon as you mention the shop used furmark for testing then msi will know straight away that the shop doesn't know what they are doing.


Sigh... I guess I'll get in touch with MSI and ask whether removing the HSF would void the warranty or not. I just ran the Valley test with the side panel open and the temp managed to hover around 92-93°C... a 1°C improvement. The local distro guys have also made contact and said to wait for their R9 390s to restock and they'd give it another test. Thanks guys for the help...


----------



## kalidae

Quote:


> Originally Posted by *bazookatooths*
> 
> Excellent ranks i'm jealous, I
> 
> 
> 
> 
> 
> 
> 
> :thumb:Yes this chip would easily hit 5ghz I havent even went over 1.4volts yet, problem is no water cooling , I'd have to buy a bigger case for double rad, LIFE GOALS!
> I want to get an intel CPU love the results everyone is having with them, although have to wait til I find a good deal, no microcenters around here. Okay I will def. be buying 3dmark then more fun benching!


The AMD FX processors are plenty good. My gf has my old system, which has an 8350 overclocked to 4.5GHz, and it beats my 4690K at 4.5GHz in Cinebench. In the winter I used to run that 8350 at 4.8GHz, then drop it back to 4.5GHz in the warmer months. I just leave it at 4.5 now, but for the price, honestly, the FX processors are beastly.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kalidae*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bazookatooths*
> 
> Excellent ranks i'm jealous, I
> 
> 
> 
> 
> 
> 
> 
> :thumb:Yes this chip would easily hit 5ghz I havent even went over 1.4volts yet, problem is no water cooling , I'd have to buy a bigger case for double rad, LIFE GOALS!
> I want to get an intel CPU love the results everyone is having with them, although have to wait til I find a good deal, no microcenters around here. Okay I will def. be buying 3dmark then more fun benching!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The amd fx processors are plenty good. My gf has my old system which has an 8350 overclocked to 4.5ghz and it beats my 4690k at 4.5ghz in cinebench. In the winter I used to run that 8350 at 4.8ghz then drop it back to 4.5ghz in the warmer months. I just leave it at 4.5 now but for the price honestly the fx processors are beastly.
Click to expand...

The 8350 has double the cores of your i5


----------



## kalidae

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 8360 has double the cores of your i5


Sure does, that's why I used it in my AMD system, and it was a really good overclocker too. But I upgraded to Intel and the i5 beats the AMD at everything except Cinebench. In saying that, the 8350 has been out for years, so it doesn't compare to a 4690K but more to the 3570K.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kalidae*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 8360 has double the cores of your i5
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sure does that's why I used it in my amd system and it was a really good overclocker too. But I upgraded to intel and the i5 beats the amd at everything except for cinebench, in saying that the 8350 has been out for years so it doesn't compare to a 4690k but more like the 3570k.
Click to expand...

Well yeah; actually the 8350 is comparable to the 3770K (if you're talking about year of release).

The 3570K is roughly comparable to the 8320, or maybe a well-overclocked 6300.

AMD needs a new CPU and I personally cannot wait for Zen to launch.


----------



## kizwan

Quote:


> Originally Posted by *Oregonduck007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr.Pie*
> 
> my MSI R9 390 only has coil whine when I launch heaven or close it down........no coil whine at all in any other application.
> 
> Strange
> 
> 
> 
> No whine at all. strange indeed.
> Quote:
> 
> 
> 
> Originally Posted by *Mr.Pie*
> 
> my MSI R9 390 only has coil whine when I launch heaven or close it down........no coil whine at all in any other application.
> 
> Strange
> 
> Click to expand...
> 
> 
> 
> No whine at all when i run heaven on my MSI 390, I got this score while i was sharing my screen with a buddy. On a side note**( it says windows 8 but im running windows 10 64bit.)
Click to expand...

Coil whine on a GPU usually comes from the chokes. Usually the manufacturer puts glue on the coils (inside the chokes). The cards that have coil whine most likely have glue that's insufficient, too thin, or already degraded. Considering the 390s are not old cards, I'd bet on the first two.
Quote:


> Originally Posted by *Dundundata*
> 
> How can you tell when the artifacts are "good enough"? I've run Heaven and saw massive artifacts/colors, and run it with different settings and only seen smaller ones (like little blueish dots on the cannon)


Artifacts = *NOT* good enough.
Quote:


> Originally Posted by *Allan P*
> 
> I'm planning to buy a R9 390/390x soon. Are there any good reasons to justify buying the 390x instead of a 390?


For benching go for 390X. For gaming, best bang for the buck would be 390.
Quote:


> Originally Posted by *imrunning*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kalidae*
> 
> Hi, you shouldn't really run furmark. It makes cards run hotter than they would in real world scenarios like gaming. 90 degrees is crazy hot and i do believe furmark would force that but gaming it shouldn't get that high. Run valley on a loop for 30 minutes and see what temps you get.
> 
> 
> 
> I've ran Unigine's valley and hits 94C in around 10 mins... I've also tried the advice of flipping my top intake to exhaust but still giving me 94C max in same time frame. what i notice is that VRM1 graph hits max of 84 while VRM2 stuck at 47C.
> 
> Attached some pics of temp graph...
> 
> This graph is with 5 intake 1 exhaust
> 
> 
> This graph is with 4 intake and 2 exhaust
> 
> 
> Fan profile
Click to expand...

You need two exhausts up top (high-CFM fans). The front intake is not ideal because the HDDs basically cause a restriction that interferes with the fresh air coming from outside. Once the air has passed through the HDDs, the airflow is already too weak to help cool the GPU. As for the side panel, if you can mod it to fit two 120mm high-CFM fans pointing directly at the GPU, that will help cool it.

I noticed you use the stock Corsair case fans (top & bottom). Those fans are pretty weak; you should change both to high-CFM fans.

The MSI AB monitoring shows the fan doesn't ramp up properly. What was the max RPM the fan hit when running the benchmark? It looks like 2XXX RPM, which is low IMO.

Basically you want:
- back: 1 x 120mm exhaust
- top: 2 x 120mm exhaust
- front: 2 x 120mm intake
- bottom: 1 x 120mm intake
- side panel: 1 x 120mm intake (or mod it for 2 x 120mm intake); this works out well if they're pointed directly at the GPU

3 x 120mm high-CFM fans for exhaust should be powerful enough to exhaust the hot air & keep the case temp low enough.

*Edit:* you might want to experiment with the side panel between exhaust and intake. The GPU has an open-fan cooler, so exhaust on the side panel may give better cooling.
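As a sanity check on a layout like the one above, here's a quick sketch of the intake-vs-exhaust balance; the ~70 CFM per fan is an assumption, real numbers depend on the fan model:

```python
# Case airflow balance for kizwan's suggested layout, assuming
# ~70 CFM per high-CFM 120mm fan (an assumption, not a spec).
def net_airflow_cfm(intake_cfm, exhaust_cfm):
    # Positive result = positive case pressure (more air in than out),
    # which helps keep dust out through unfiltered gaps.
    return sum(intake_cfm) - sum(exhaust_cfm)

intakes = [70, 70, 70, 70]   # 2x front, 1x bottom, 1x side panel
exhausts = [70, 70, 70]      # 1x back, 2x top
print(net_airflow_cfm(intakes, exhausts))  # 70 -> slight positive pressure
```

In practice restrictions (HDD cages, filters) cut the effective intake well below the rated CFM, which is kizwan's point about the front fans.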


----------



## sportsczy

Stupid question... what power limit % should I use? I've been using +10%, but I use +19% with Nvidia. Is 10% the correct amount?


----------



## Sgt Bilko

Quote:


> Originally Posted by *sportsczy*
> 
> Stupid question... what power limit % should i use? I've been using +10%, but i use +19%with Nvidia. Is 10% the correct amount?


Whatever you want to use, really; it doesn't increase your temps much. It just lets the card draw a bit more power and allows you to maintain a static clock speed in the majority of games (unless you use Vsync or the game doesn't require that much GPU power).
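To put rough numbers on what the slider does, here's a small sketch; the 275 W baseline is an assumed reference board power for a 390, not an official figure:

```python
# The power limit slider raises the board power ceiling before the
# card throttles clocks. Baseline TDP of 275 W is assumed here.
def max_board_power(tdp_watts: float, power_limit_pct: float) -> float:
    return tdp_watts * (1 + power_limit_pct / 100.0)

print(max_board_power(275, 10))  # +10% -> 302.5 W allowed
print(max_board_power(275, 19))  # +19% -> 327.25 W allowed
```

So the "correct" amount is simply whatever keeps the card from throttling at your overclock; it doesn't force extra draw on its own.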


----------



## navjack27

I've got a new overclock that I'm working on for my R9 390X.


It's stable in this benchmark so far, and in this one too:
http://www.userbenchmark.com/UserRun/385483

Without Hyper-Threading, for some reason I get a higher GPU score:
http://www.userbenchmark.com/UserRun/385535


----------



## kalidae

Do you have a 720p monitor? I would like to see the score at 1080p. Good work, the overclock is nice.


----------



## navjack27

No, I just wanted it in a window. Sure, I'll run it fullscreen with everything maxed.


----------



## kalidae

I noticed that UserBenchmark scores differently every time. My 1st run my GPU was rank 3 at 106%, then the next it was rank 4 at 102%, then the next was 103%; the scores change slightly every time.


----------



## navjack27

I ran it at 1080p, 8x AA, everything maxed.

Unigine Heaven Benchmark 4.0

FPS: 64.9
Score: 1635
Min FPS: 17.5
Max FPS: 135.0


----------



## kalidae

Quote:


> Originally Posted by *navjack27*
> 
> i ran it on 1080p 8x aa max everything
> 
> Unigine Heaven Benchmark 4.0
> 
> FPS:
> 64.9
> Score:
> 1635
> Min FPS:
> 17.5
> Max FPS:
> 135.0




This is my score with my Nitro 390. Clocks are at 1170/1750. I'm stuck here and can't go any higher unless I can flash the BIOS to allow me to add more voltage. The cool thing is you still have headroom to keep going, and I can't wait to see what you end up with.
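For anyone wondering what a memory clock like the 1750 above buys you, here's a rough bandwidth sketch, assuming Grenada's 512-bit bus and GDDR5's 4 transfers per clock:

```python
# Peak memory bandwidth from the memory clock shown in Afterburner/Trixx.
# Grenada uses a 512-bit bus; GDDR5 moves 4 bits per pin per clock.
def mem_bandwidth_gbs(bus_bits: int, mem_clock_mhz: float) -> float:
    return bus_bits / 8 * mem_clock_mhz * 4 / 1000.0

print(mem_bandwidth_gbs(512, 1500))  # stock 390: 384.0 GB/s
print(mem_bandwidth_gbs(512, 1750))  # at 1750 MHz: 448.0 GB/s
```

That ~17% bandwidth bump is part of why memory overclocks help these cards at higher resolutions.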


----------



## kalidae

www.userbenchmark.com/UserRun/385582

A link to my UserBenchmark score. I overclocked my GPU last night and just finished overclocking my CPU. For some reason my GPU scored rank 4 at 103% when I have seen it score rank 3 at 106%.


----------



## Gumbi

Quote:


> Originally Posted by *kalidae*
> 
> 
> 
> This is my score with my nitro 390. Clocks are at 1170/1750. I'm stuck at this and can't go any higher unless I can flash the bios to allow me to add more voltage. The cool thing is you still have headroom to keep on going and I can't wait to see what you end up with.


Use Sapphire Trixx, you can add 200mv with that, which is more than enough on air.


----------



## kalidae

Quote:


> Originally Posted by *Gumbi*
> 
> Use Sapphire Trixx, you can add 200mv with that, which is more than enough on air.


oooh cheers for that!


----------



## kalidae

Okay: 390 Nitro at 1201/1750, +156mV, max fan speed 89%, max GPU temp 75°C, max VRMs 77/93°C after one run of Valley.


----------



## Gumbi

Quote:


> Originally Posted by *kalidae*
> 
> Okay 390 nitro at 1201/1750 +156mv max fan speed 89% max gpu temp 75 degree max vrms 77/93 degrees after one run of valley.


Pretty sweet; the VRMs are getting a bit toasty with that much voltage, though.


----------



## kalidae

Quote:


> Originally Posted by *Gumbi*
> 
> Pretty sweet, VRMs getting a bit toasty with that much voltage


Yeah they are. I'm just stuffing around to see if I can get it completely stable at 1200/1750 in all the Unigine and Futuremark tests and BF4, but I don't think I'll keep it at these clocks.


----------



## kalidae

Nitro 390 at 1201/1750, +180mV, 100% fan speed, max GPU temp 75°C, VRMs 77/93°C.


----------



## Gumbi

Quote:


> Originally Posted by *kalidae*
> 
> Yeah they are I'm just stuffing around to see if I can get it completely stable at 1200 1750 in all all unigine and future mark and bf4 but I don't think I'll keep it at these clocks.


Some games will push the GPU more than Valley, so watch those VRM temps; I wouldn't push any more voltage through them.


----------



## Anvi

I have a few questions regarding R9 390..

I've noticed that GPU/MEM overclocks crashes the computer if I don't use "*Unofficial overclock mode without powerplay*". Unfortunately Power Limit function in AB doesn't work with unofficial overclocking mode.

Are you guys using powerplay, and if so, how did you manage to get it stable?

Also, I'm getting random black screens with "*"Display driver stopped responding and has recovered*" as described here:
https://community.amd.com/thread/183430

Does anyone have a solution for that?


----------



## kalidae

Quote:


> Originally Posted by *Gumbi*
> 
> Some games will push the GPU more than Valley, ao watch those VRM temps, I wouldn't push any more voltage through them.


It's okay, I just wanted to see if I could pass all my benchmarks at 1200/1750, and I did, but with the fans at 100%, and that was too loud. I was also able to score 109% in UserBenchmark. I dropped my clocks back to 1170/1750; it's much quieter and safer.


----------



## Dundundata

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> I have coolermaster HAF and a Cooler Master nepton 280l, everything else its pretty quiet in my rig maybe thats why I can hear the gpu fans spinning at more than 40%... that case has little to no sound insulation.


Ah, that makes sense then; my case is all air and I have a lot of fans. I'm going full Noctua this week except for the two Cooler Masters on the Hyper 212. I found a slim fan so I can get an intake back on the bottom. I imagine if my case were at ear level, noises would be more distracting!

And thanks, I will make sure there are no artifacts! It's amusing that I'm having as much fun tinkering with the setup as playing games.


----------



## Noirgheos

To those with an XFX 390X, I have another favour to ask of you, can you take pictures of the card in your rig? Primarily so I can observe sag and the backplate. Thanks.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Noirgheos*
> 
> To those with an XFX 390X, I have another favour to ask of you, can you take pictures of the card in your rig? Primarily so I can observe sag and the backplate. Thanks.





There you go. My rig ain't pretty (mainly due to swapping parts over all the time) but she does alright.


----------



## Noirgheos

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> 
> There you go, My rig ain't pretty (mainly due to swapping parts over all the time
> 
> 
> 
> 
> 
> 
> 
> ) but she does alright


Looks OK. It looks a little warped, or that may just be the angle. I guess I'll get whichever is cheaper, the XFX or the MSI, when it's time to buy an R9 390X. Right now it seems like XFX has better core cooling but MSI has better VRM cooling. If it were up to me, I'd take the XFX, but whichever is cheaper will do, I guess. Thanks!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Noirgheos*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> There you go, My rig ain't pretty (mainly due to swapping parts over all the time
> 
> 
> 
> 
> 
> 
> 
> ) but she does alright
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks ok. Looks a little warped, or that may just be the angle. I guess I'll get whichever is cheaper, the XFX or MSI when it's time to buy an R9 390X. Right now it seems like XFX has better core cooling, but MSI has better VRM cooling. If it was up to me, I'd take XFX, but whichever is cheaper will do I guess. Thanks!
Click to expand...

Actually, you'd have that backwards; I'd be willing to bet that XFX has the best VRM cooling of any 390X out there, with the monster heatsink it has.

The core can be kept well under control with a custom fan curve (mine never goes over 70°C even overvolted) at the expense of some noise, and I did have an MSI R9 380 in my rig a little while ago for comparison:




Pretty much every GPU out nowadays will have some sag to it; no backplate or bracing will stop it 100%.


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Actually you'd have that backwards, I'd be willing to bet that XFX has the best vrm cooling out of any 390x out there with the monster heatsink it has.


It seems XFX has that VRM covered well.
They science the hell out of it.


----------



## imrunning

Quote:


> Originally Posted by *kizwan*
> 
> You need two exhaust up top (high CFM fan). The front intake is not ideal because the HDD's basically causing restriction which interfere the fresh air coming from outside. Once the air passed through the HDD's, air flow already too weak to help cooling the gpu. The side panel, if you can mod it that will allow you to put two 120 high CFM fan pointing directly to the gpu, that will help cooling the gpu.
> 
> I noticed you use the stock corsair case fan (top & bottom). That fans are pretty weak. You should changed both to high CFM fans.
> 
> The MSI AB monitoring show the fan doesn't ramp up properly. What was the max RPM the fan spin when running the benchmark? Look like in 2XXX RPM which is low IMO.
> 
> Basically you want:-
> - back : 1 x 120mm exhaust
> - top : 2 x 120mm exhaust
> - front : 2 x 120mm intake
> - bottom : 1 x 120mm intake
> - side panel : 1 x 120mm intake (or mod it for 2 x 120mm intake) will work out properly if they're directly on the gpu.
> 
> 3 x 120mm high CFM fans for exhaust should be powerful enough for exhausting the hot air & keep the case temp low enough.
> 
> *Edit:* you might want to experiment for the side panel between exhaust or intake. The gpu have open fan cooler type, so exhaust on the side panel may give better cooling.


The front fans are two static-pressure fans (3.1 mm/H2O, 62 CFM). Only one faces the HDDs directly, while the other blows directly at the graphics card. The side-panel fan is an airflow-optimised 62 CFM fan. (I tried intake and exhaust for this config; basically intake suppressed the temps better before hitting 94°C, while the exhaust config hit 94°C faster.) I'll purchase another airflow fan for top exhaust to make a total of three soon. But for now my case already sounds like a turbine.

My GPU's max fan speed under load is around 2500 RPM on this MSI R9 390. What's the official max RPM for this GPU?


----------



## Agent Smith1984

Quote:


> Originally Posted by *gerpogi*
> 
> Hey all Ive been trying to decide between the 390x and the 980 for a while now and finally jumped on the 390x ship. Mostly because I think nvidia is slipping =\.so count me in! I have the xfx model


Nice build, please submit proper proof to be added to the list.

Quote:


> Originally Posted by *Oregonduck007*
> 
> 
> count me in!


I will get you on the list once you submit correct proof, but I need to know if this is the 390 or the 390X, thanks!
Quote:


> Originally Posted by *kalidae*
> 
> I felt the same way about the sapphire card because it's pretty plain looking and there are other 390's out there that look better and have a backplate but my build is black and red so the black nitro card still fits in though an msi or asus would look better. The sapphire card though it runs so cool and quiet and is one of the best 390's if it had a backplate I think it would be the best 390 available.
> 
> 
> 
> 
> 
> 
> 
> I really need to do something about that blue led on the card though... sapphire why did u make it blue? White would have been more neutral and better for every build and god damn it why no backplate?!


If you could get a piece of paper with your name in the shot, I could get you added to the list. Very nice build, btw!!!

Also, as far as sag goes, they will all have it.

It's more indicative of the motherboard than the GPU.

I use a rear PCI slot cover and bend it to act as a stand for mine, and it's hardly even noticeable... (see rig pic in sig).

New motherboard landing tomorrow!!! Looking forward to a teardown, rad cleaning, and some new TIM to test with!!!


----------



## Sgt Bilko

Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Actually you'd have that backwards, I'd be willing to bet that XFX has the best vrm cooling out of any 390x out there with the monster heatsink it has.
> 
> 
> 
> It seems so that XFX has that VRM covered well.
> they science the hell out of it
Click to expand...

Well, I think I might try and prove it.

Now, I know that FurMark is banned from any sensible comparison, but this is truly a worst-possible-case scenario, and let's be honest... most cards have backplates prime for frying some bacon on, right?









Here is my Fan curve setup for my 390x along with the clock speeds:



And this is what happens when you throw this card under Furmark:



^That....is why you never run FurMark with these cards









But even so... that is quite impressive considering I let it go for about 10 minutes, more than enough time for things to heat up a bit.
And I'll stress this much: FurMark is NEVER an accurate representation of gaming temps; as I said before, my VRMs and core never exceed 70°C.









(Don't ask me to do that again because I won't)


----------



## kizwan

Quote:


> Originally Posted by *Anvi*
> 
> I have a few questions regarding R9 390..
> 
> I've noticed that GPU/MEM overclocks crashes the computer if I don't use "*Unofficial overclock mode without powerplay*". Unfortunately Power Limit function in AB doesn't work with unofficial overclocking mode.
> 
> Are you guys using powerplay and if so, how did u manage to get it stable?
> 
> Also, I'm getting random black screens with "*"Display driver stopped responding and has recovered*" as described here:
> https://community.amd.com/thread/183430
> 
> Does anyone have a solution for that?


https://community.amd.com/message/2671041#2671041


----------



## Noirgheos

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well I think i might try and prove it
> 
> Now i know that Furmark is banned from any sensible comparison but this is truly a worst possible case scenario and let's be honest......most cards have backplates prime for frying some bacon on right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is my Fan curve setup for my 390x along with the clock speeds:
> 
> 
> 
> And this is what happens when you throw this card under Furmark:
> 
> 
> 
> ^That....is why you never run FurMark with these cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But even so......that is quite impressive considering i let it go for about 10 minutes.....more than enough time for things to heat up a bit.
> And I'll stress this much, FurMark is NEVER an accurate representation of gaming temps, as i said before my vrm's and core never exceed 70c
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Don't ask me to do that again because i won't
> 
> 
> 
> 
> 
> 
> 
> )


Ayy, so that clock you have there (1200MHz) is with the Double Dissipation Black Edition? At that point it should outperform the MSI 390X, right?


----------



## kizwan

Quote:


> Originally Posted by *imrunning*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You need two exhaust up top (high CFM fan). The front intake is not ideal because the HDD's basically causing restriction which interfere the fresh air coming from outside. Once the air passed through the HDD's, air flow already too weak to help cooling the gpu. The side panel, if you can mod it that will allow you to put two 120 high CFM fan pointing directly to the gpu, that will help cooling the gpu.
> 
> I noticed you use the stock corsair case fan (top & bottom). That fans are pretty weak. You should changed both to high CFM fans.
> 
> The MSI AB monitoring show the fan doesn't ramp up properly. What was the max RPM the fan spin when running the benchmark? Look like in 2XXX RPM which is low IMO.
> 
> Basically you want:-
> - back : 1 x 120mm exhaust
> - top : 2 x 120mm exhaust
> - front : 2 x 120mm intake
> - bottom : 1 x 120mm intake
> - side panel : 1 x 120mm intake (or mod it for 2 x 120mm intake) will work out properly if they're directly on the gpu.
> 
> 3 x 120mm high CFM fans for exhaust should be powerful enough for exhausting the hot air & keep the case temp low enough.
> 
> *Edit:* you might want to experiment for the side panel between exhaust or intake. The gpu have open fan cooler type, so exhaust on the side panel may give better cooling.
> 
> 
> 
> 
> 
> 
> 
> The front fans are 2 static pressure 3.1mm/h20 with 62cfm flow. only one is facing the hdd directly while the other one have directly blow to the gfx. The side panel fan are airflow optimised fan with 62 cfm flow. (i tried intake and exhaust for this config and basically intake do a better suppression on the temps till it hit 94C while exhaust config hits 94C faster). I'll purchase another air flow fans for top exhaust to make a total of 3 soon. But for now my case is already sounding like a turbine.
> 
> my gpu's max fan speed on load is around 2500rpm for this MSI r9 390. Whats the official max rpm for this gpu?
Click to expand...

I don't know about the max RPM on the MSI, though. Unfortunately you can't escape noise with air cooling, especially at high ambient temps. If you want quiet, watercooling is the way to go.
Quote:


> Originally Posted by *Noirgheos*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well I think i might try and prove it
> 
> Now i know that Furmark is banned from any sensible comparison but this is truly a worst possible case scenario and let's be honest......most cards have backplates prime for frying some bacon on right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is my Fan curve setup for my 390x along with the clock speeds:
> 
> 
> 
> And this is what happens when you throw this card under Furmark:
> 
> 
> 
> ^That....is why you never run FurMark with these cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But even so......that is quite impressive considering I let it go for about 10 minutes.....more than enough time for things to heat up a bit.
> And I'll stress this much: FurMark is NEVER an accurate representation of gaming temps. As I said before, my VRMs and core never exceed 70c
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Don't ask me to do that again because I won't
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> 
> 
> 
> 
> 
> Ayy, so that clock you have there (1200MHz) is with the Double Dissipation Black Edition? At that point it should outperform the MSI 390X, right?

Not if the 390X is anywhere between 1170 and 1200. If it's the Gigabyte 390X, definitely yes. LoL


----------



## Agent Smith1984

Crazy thing is....

XFX and MSI are the only 390s we've seen hit 1200MHz on here with any stability.

Both of them have comparable core cooling, and both have excellent VRM cooling.

*It's apparent that two things are damn near fact at this point:*

1) The XFX and MSI cards are the best binned cards of this series
2) The VRM cooling being able to hold temps below 80C is also contributing to the OC potential of the cards.

I believe it's a combination of those two things that are allowing them to reach 1200MHz.

Now when compared to the 290 series, here is what is known..

Hawaii can reach clocks over 1250MHz with stability if cooled properly and appropriate voltage is added (usually in the 200mv range), but those very same cards may need 50-75mv just to get into the 1100-1150MHz range for daily use.

Grenada seems to be able to hit core speeds of 1100-1150 with no added voltage at all, but going any further than 1200MHz seems to be a total no go at any voltage....

*I believe this because of these two things....*

1) A refined manufacturing process has created a lower-leakage chip capable of higher clocks at less voltage in a lower clock range, versus the older process that allowed a higher clock ceiling at much higher voltage, because the applied voltage "bleeds" with the higher-leakage chip. This is the same premise we see with CPUs and their different ASIC qualities.
Some would attest that you want the low-leakage chip for air cooling and basic overclocking, and the high-leakage unit for high-end/extreme cooling/overclocking

2) I believe the BIOS tweaks and/or PCB revisions implemented on the 390 series have limited their TDP to around 325-350w TOPS, and that is why exceeding voltage offsets over 100mv yields little-to-no help with higher clocks, and in some cases hurts the max overclock. Many may find their max clocks come in the 60-75mv range (*SO PLEASE TEST THERE, AND DO NOT ASSUME YOU'LL GET IT ALL AT 100MV*)

When I mention these numbers, remember I am speaking in terms of usable/stable clocks. It's not to say you'd want to run 1200MHz at 100mv on a daily basis, but my particular card ran that way for over 3 weeks before I blew up my motherboard, and will probably be run that way again, because a single card at 4K needs every bit of help it can get....
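To put rough numbers on point 2, here's a quick back-of-envelope sketch of how dynamic power scales with clock and voltage (roughly P ∝ f·V²). Every baseline figure below (stock voltage, stock clock, stock board power) is an assumption for illustration, not a measured value:

```python
# Rough dynamic-power scaling sketch (P ~ f * V^2) to show why large voltage
# offsets run into a ~325-350w board power cap. Baselines are assumptions.
STOCK_V = 1.20   # assumed stock core voltage (V)
STOCK_F = 1000   # assumed stock core clock (MHz)
STOCK_P = 275    # assumed stock board power (W)

def est_power(clock_mhz, offset_mv):
    """Estimate board power after an overclock, assuming P scales with f * V^2."""
    v = STOCK_V + offset_mv / 1000.0
    return STOCK_P * (clock_mhz / STOCK_F) * (v / STOCK_V) ** 2

print(round(est_power(1200, 100)))  # 1200MHz at +100mv
print(round(est_power(1200, 200)))  # 1200MHz at +200mv
```

Even with generous assumptions, a +200mv offset lands far above a 325-350w cap, which would explain why the extra voltage stops helping: past the cap, the card can only power-throttle.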









I still plan on going crossfire, but my board and my son's HDD both failing within a week, plus both my daughters' birthdays falling within a month of each other, has put a slight delay on that.

Once I go crossfire, there is a good chance I will be looking to run ~1100MHz or so on stock voltage to improve power usage and temps.

Of course, it's not to say some max clock suicide benchies won't be worked up for good measure


----------



## Gumbi

I think that's a good summary. I'm not so sure about binning though; I just think it's down to better cooling overall (like you said, core AND VRM both being cool).

The Sapphire card has superb core cooling but slacks a bit on one of the VRMs (one user had 93C at 156mv in Heaven or Valley). Not necessarily bad cooling, but I think MSI or XFX could hold that in the lower 80s.


----------



## milan616

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Once I go crossfire, there is a good chance I will be looking to run ~1100MHz or so on stock voltage to improve power usage and temps.


I only have room for one in my ITX/MSI 390X build. I hit 1100 on stock voltage ... and then I tried taking the voltage down at the same clock. I was only able to drop -19mV, but I'm happy with it. I run a custom fan profile to keep the noise down, since the fans are right up against the vent holes on my BitFenix Prodigy, which sits on my desk, and I use open-back headphones.

I really want to redo the thermal paste and put on better thermal pads, but that damn little warranty sticker is killing me!


----------



## Berkeley

Hello everybody,

could anyone please upload the original BIOS of the XFX DD 390 card, saved with the GPU-Z tool?

Many thanks in advance!


Spoiler: How it works:



Just click "save to file". Then you can zip the file (e.g. winrar) in order to upload it here.


----------



## kalidae

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Crazy thing is....
> 
> XFX and MSI are the only 390's we've seen hit 1200MHz on here with any stability.
> 
> Both of them have comparable core cooling, and both have excellent VRM cooling.
> 
> *It's apparent that two things are damn near fact at this point;*
> 
> 1) The XFX and MSI cards are the best binned cards of this series
> 2) The VRM cooling being able to hold temps below 80C is also contributing to the OC potential of the cards.
> 
> I believe it's a combination of those two things that are allowing them to reach 1200MHz.
> 
> Now when compared to the 290 series, here is what is known..
> 
> Hawaii can reach clocks over 1250MHz with stability if cooled properly and appropriate voltage is added (usually in the 200mv range), but those very same cards may need 50-75mv just to get into the 1100-1150MHz range for daily use.
> 
> Grenada seems to be able to hit core speeds of 1100-1150 with no added voltage at all, but going any further than 1200MHz seems to be a total no go at any voltage....
> 
> *I believe this because of these two things....*
> 
> 1) A refined manufacturing process has created a lower leakage chip capable of higher clocks on less voltage in a lower clock range, versus the older process that would allow a higher clock ceiling with much higher voltage, because that applied voltage "bleeds" with the higher leakage chip. This is the same premise we see with CPU's and their different ASIC qualities.
> Some would attest that you want the low-leakage chip for air cooling and basic overclocking, and the high-leakage unit for high-end/extreme cooling/overclocking
> 
> 2) I believe the BIOS tweaks and/or PCB revisions implemented on the 390 series have limited their TDP to around 325-350w TOPS, and that is why exceeding voltage offsets over 100mv yields little-to-no help with higher clocks, and in some cases hurts the max overclock. Many may find their max clocks come in the 60-75mv range (*SO PLEASE TEST THERE, AND DO NOT ASSUME YOU'LL GET IT ALL AT 100MV*)
> 
> When I mention these numbers, remember I am speaking in terms of usable/stable clocks. It's not to say you'd want to run 100mv 1200MHz on a daily basis, but my particular card ran that way for over 3 weeks before I blew up my motherboard, and will probably be ran at that again, because using a single card on 4K needs every bit of help it can get....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still plan on going crossfire, but my board failing, and my son's HDD both failing within a week, and both my daughter's birthdays within in a month of each other, has put a slight delay on that.
> 
> Once I go crossfire, there is a good chance I will be looking to run ~1100MHz or so on stock voltage to improve power usage and temps.
> 
> Of course, it's not to say some max clock suicide benchies won't be worked up for good measure


If you go back a page, my Sapphire 390 was able to achieve 1200/1750 @ 184mv, but one VRM did reach 93C in Valley, Heaven, and FireStrike. No need to keep it at those clocks though, not for me, because I'm just running 1080p, and that was with the fan curve at 100%, which is just too loud for me. 1170 on the core is completely stable and only requires 100mv, but getting another 30MHz stable (1200MHz) required an extra 84mv, so 184mv total.


----------



## Agent Smith1984

Quote:


> Originally Posted by *kalidae*
> 
> If you go back a page, my Sapphire 390 was able to achieve 1200/1750 @ 184mv, but one VRM did reach 93C in Valley, Heaven, and FireStrike. No need to keep it at those clocks though, not for me, because I'm just running 1080p, and that was with the fan curve at 100%, which is just too loud for me.


Can it handle that clock/voltage in an hour of Crysis 3?

And again, the numbers I discussed were referring to getting 1200 around the 100mv mark.

The additional voltage doesn't help these cards much. Did you try to see how far it would go in 100-125mv range?

Was the difference only 5-10MHz compared to 184mv?
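For anyone following the testing advice above, here's a tiny hypothetical helper for laying out that kind of sweep: hold the offset fixed and step the clock, trying the lower offsets before reaching for more voltage. The function name, ranges, and step sizes are all made up for illustration:

```python
# Hypothetical OC sweep planner: test each clock step at the lower voltage
# offsets first, instead of jumping straight to a huge offset.
def sweep_plan(start_mhz, stop_mhz, step_mhz, offsets_mv):
    """Return (clock, offset) pairs to test, lowest offset first."""
    plan = []
    for mv in offsets_mv:
        clock = start_mhz
        while clock <= stop_mhz:
            plan.append((clock, mv))
            clock += step_mhz
    return plan

# Example: probe 1150-1200MHz in 10MHz steps at +100mv, then +125mv.
for clock, mv in sweep_plan(1150, 1200, 10, [100, 125]):
    print(f"test {clock} MHz at +{mv} mV, then stability-check with an hour of gaming")
```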


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Crazy thing is....
> 
> XFX and MSI are the only 390's we've seen hit 1200MHz on here with any stability.
> 
> Both of them have comparable core cooling, and both have excellent VRM cooling.
> 
> *It's apparent that two things are damn near fact at this point;*
> 
> 1) The XFX and MSI cards are the best binned cards of this series
> 2) The VRM cooling being able to hold temps below 80C is also contributing to the OC potential of the cards.
> 
> I believe it's a combination of those two things that are allowing them to reach 1200MHz.
> 
> Now when compared to the 290 series, here is what is known..
> 
> Hawaii can reach clocks over 1250MHz with stability if cooled properly and appropriate voltage is added (usually in the 200mv range), but those very same cards may need 50-75mv just to get into the 1100-1150MHz range for daily use.
> 
> Grenada seems to be able to hit core speeds of 1100-1150 with no added voltage at all, but going any further than 1200MHz seems to be a total no go at any voltage....
> 
> *I believe this because of these two things....*
> 
> 1) A refined manufacturing process has created a lower leakage chip capable of higher clocks on less voltage in a lower clock range, versus the older process that would allow a higher clock ceiling with much higher voltage, because that applied voltage "bleeds" with the higher leakage chip. This is the same premise we see with CPU's and their different ASIC qualities.
> Some would attest that you want the low-leakage chip for air cooling and basic overclocking, and the high-leakage unit for high-end/extreme cooling/overclocking
> 
> 2) I believe the BIOS tweaks and/or PCB revisions implemented on the 390 series have limited their TDP to around 325-350w TOPS, and that is why exceeding voltage offsets over 100mv yields little-to-no help with higher clocks, and in some cases hurts the max overclock. Many may find their max clocks come in the 60-75mv range (*SO PLEASE TEST THERE, AND DO NOT ASSUME YOU'LL GET IT ALL AT 100MV*)
> 
> When I mention these numbers, remember I am speaking in terms of usable/stable clocks. It's not to say you'd want to run 100mv 1200MHz on a daily basis, but my particular card ran that way for over 3 weeks before I blew up my motherboard, and will probably be ran at that again, because using a single card on 4K needs every bit of help it can get....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still plan on going crossfire, but my board failing, and my son's HDD both failing within a week, and both my daughter's birthdays within in a month of each other, has put a slight delay on that.
> 
> Once I go crossfire, there is a good chance I will be looking to run ~1100MHz or so on stock voltage to improve power usage and temps.
> 
> Of course, it's not to say some max clock suicide benchies won't be worked up for good measure


Agreed


----------



## Offender_Mullet

Just ordered an XFX R9 390 Black Edition.


----------



## kalidae

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Can it handle that clock/voltage in an hour of Crysis 3?
> 
> And again, the numbers I discussed were referring to getting 1200 around the 100mv mark.
> 
> The additional voltage doesn't help these cards much. Did you try to see how far it would go in 100-125mv range?
> 
> Was it only within 5-10mhz difference of 184mv?


I had it at 1170/1750 stable at 100mv, then I downloaded Trixx and went straight to 1200MHz on the core, starting at 150mv; I remember you saying that after 1200 you gain nothing. Valley showed artifacts at that voltage, so I increased the voltage to 160mv and Valley was fine, but Heaven showed artifacts at one point, just after it circles the dragon statue. So I increased to 180mv (it adjusted itself to 184); it was getting late and I was running out of time. Heaven ran perfectly at those clocks and voltage, so I ran FireStrike, 3DMark 11, and UserBenchmark, which isn't much of a benchmark, but I wanted to see if 1200MHz was enough to get the max 109% rating, and it was; all of those ran fine.

After that I went to bed. I'll do further testing with games when I get home from work, but I'm confident it will play games just fine. The thing about this card is it never driver-crashes to a black screen; it just gives artifacts. Even 1200MHz at 100mv, it's just artifacts. It's crazy that it needs an extra 84mv just to gain an extra 30MHz on the core. I don't plan to keep the card at these clocks for everyday use, just for benchmarking.

I hate having case fans running at 100% even though I have quiet editions, so all these benches were done with 3 intake fans running at 50%; the VRMs topped out at 93 and the GPU fan was at 100%. I'm sure if I turned all my case fans to max the VRMs would run much cooler, especially since this is a high-flow case with no drive cages or anything causing restriction. But again, I'll drop the clocks back to a point where I don't need the GPU fans ramping over 60%; if I can hear it, I don't want it, haha. But yes, I will game at these clocks when I finish work and report the outcome.

Also, one thing to note about my system: every installed case fan is connected to a PWM fan hub, so all the fans increase and decrease speed based on CPU temps. In something like Valley, which isn't very CPU-intensive, my case fans won't ramp up, so my GPU temps run hotter than when I play something CPU-intensive like BF4. In Valley my case fans run at around 700 RPM, but in BF4 they increase to 1200 RPM; I think that's about 80%, so still quiet. They are Corsair AF120 Quiet Editions.
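To illustrate that hub behavior: the case fans track CPU temp, so a GPU-only load like Valley never ramps them. A minimal sketch of that kind of curve follows; the temperature/duty points are invented for the example, not Corsair's:

```python
# Sketch of a PWM hub curve: case fan duty follows CPU temp only, so a
# GPU-heavy but CPU-light load leaves the fans near idle. Points are invented.
CURVE = [(40, 35), (60, 50), (75, 80), (85, 100)]  # (CPU temp C, fan duty %)

def fan_duty(cpu_temp):
    """Linearly interpolate fan duty from the curve; clamp at both ends."""
    if cpu_temp <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if cpu_temp <= t1:
            return d0 + (d1 - d0) * (cpu_temp - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(50))  # Valley-style load: CPU stays cool, case fans stay slow
print(fan_duty(80))  # BF4-style load: CPU heats up, case fans ramp
```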


----------



## Oregonduck007

No whine at all when I run Heaven on my MSI 390. I got this score while sharing my screen with a buddy. On a side note, it says Windows 8 but I'm running Windows 10 64-bit.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice build, please submit proper proof to be added to the list.
> I will get you on the list once you submit correct proof, but I need to know if this is the 390 or the 390X, thanks!
> If you could get a piece of paper with your name in there I could get you added to the list. Very nice build, btw!!!
> 
> Also, as far as sag goes, they will all have it.
> 
> It's more indicative of the motherboard than the GPU.
> 
> I use a rear PCI slot cover and bend it to act as a stand for mine, and it's hardly even noticeable.... (see rig pic in sig).
> 
> New motherboard landing tomorrow!!! Looking forward to a teardown, rad cleaning, and some new TIM to test with!!!


Sorry, it's an MSI R9 390 (non-X).


----------



## ChaosAD

I was about to order an MSI 390 within this week or next. Then I saw a sale on a second-hand, brand new, sealed MSI 390X for 5 euro more. It was a no-brainer deal and I got it. The only sad thing is that I'll have it in my hands in three weeks. Let's hope I got a good one! It will be a nice upgrade from my current GTX 275, which I borrowed from a friend


----------



## navjack27

Yeah, it seems like the 390X from MSI is a great clocker with zero voltage adjustment. You could go to higher core clocks, but it never holds that clock when you add voltage to aux or core; actually, adding voltage got me lower scores in benchmarks. I mainly just play CS:GO, and the added ~100fps average with the settings I use is great! I played for an hour or two today and it only went to a black screen and back to the game (within a sec or two) once. Then I clocked it back down; maybe I'll add volts or lower it 5MHz or something. But my room door is closed and it's getting hot in here, lol.

It seems like when I overclock, once the card gets to around 60C, artifacts show and stability becomes an issue, at least in games with heavy shaders going on. So DX11 might want a lower clock, while DX9 games are fine with a higher one. I saw SOME of those blue square artifacts during my Heaven run, but it was so minimal it could have just been the fact that it was 4am and I was tired, lol. Better than the old SLI GTX 660s I had: if you clocked those wrong, they just crashed. At least AMD gives you a warning with squares and such.


----------



## Geoclock

Just got an MSI 390X, going to install it now.
Before that, I have a question: on first examination, the thermal pads on the SFC chips look a little off (like 2 heatsink plates, but pads for 6).
Do you recommend taking apart at least that side and correcting it?
To tell the truth, I don't think I'd be able to do it by removing just half of the heatsink.


----------



## navjack27

I was wondering that myself: whether I should remove the heatsink and put AS5 on that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *navjack27*
> 
> i was wondering that myself. if i should remove the heatsink and put AS5 on that.


Try something other than AS5; Arctic MX-4, NT-H1, or Gelid GC Extreme all work better on GPUs IMO, and they're non-conductive.

I know you can use AS5 on them safely, but I don't think the risk is worth it when you have better alternatives


----------



## navjack27

okay good to know! thanks


----------



## Geoclock

Catalyst 15.7.1 didn't work. Which one should I try?


----------



## CC268

Hey guys, hoping you can help me out here. I plan on getting the MSI 390X from my local Fry's Electronics since the 390 is sold out and it is even tough to find online.

Will the 390X be a good card for the next few years or is it worth it to go up to the 980TI?


----------



## kizwan

Quote:


> Originally Posted by *Geoclock*
> 
> 15.7.1 cat didn't work. Which one i should try?


Why didn't it work? Try the latest 15.8.


----------



## kalidae

Quote:


> Originally Posted by *CC268*
> 
> Hey guys, hoping you can help me out here. I plan on getting the MSI 390X from my local Fry's Electronics since the 390 is sold out and it is even tough to find online.
> 
> Will the 390X be a good card for the next few years or is it worth it to go up to the 980TI?


It just depends on how much you want to spend. Both cards will be good for the next few years. The 980 Ti will last longer, but it costs more. That said, the 390X is a damn good card and will do its job for years to come; down the line you could always buy a second one, and that will still be cheaper than buying a second 980 Ti


----------



## Rob27shred

Quote:


> Originally Posted by *CC268*
> 
> Hey guys, hoping you can help me out here. I plan on getting the MSI 390X from my local Fry's Electronics since the 390 is sold out and it is even tough to find online.
> 
> Will the 390X be a good card for the next few years or is it worth it to go up to the 980TI?


A 390X will be good for the next few years. I'm rocking the XFX 390 right now and it is pushing all the newer games at max settings at 1440p while maintaining very solid frame rates. In The Witcher 3 I'm not even getting frame-rate drops in the busiest parts of Novigrad with everything set to ultra (no HairWorks) @1440p. With the extra power of the X variant you should be good to go for a while, especially if AMD cards really do get the upper hand with DX12!

Also while the 980ti is a beast of a card the added performance is not worth another $300 to $400 IMHO. For that price you could crossfire 2 390Xs & probably outperform a single 980ti. So unless money isn't a restricting factor for you I'd say go with the 390X.


----------



## kizwan

I don't think the 980 Ti would last longer. Once Nvidia releases new GPUs, support for the 980 Ti will get less and less.


----------



## bazookatooths

I gained about 3-4% better performance with 15.8 beta drivers.


----------



## Noirgheos

Quote:


> Originally Posted by *Rob27shred*
> 
> A 390X will be good for the next few yrs. I'm rocking the XFX 390 right now & it is pushing all the newer games at max settings, in 1440p, & maintaining very solid frame rates. In the Witcher 3 I'm not even getting FR drops in the busiest parts of Novigrad with everything set to ultra (no hairworks) @1440p. With the extra power of the X variant you should be good to go for awhile, especially if AMD cards really do get the upper hand with DX12!
> 
> Also while the 980ti is a beast of a card the added performance is not worth another $300 to $400 IMHO. For that price you could crossfire 2 390Xs & probably outperform a single 980ti. So unless money isn't a restricting factor for you I'd say go with the 390X.


I added you on Steam if you were wondering who it is. I see you're a fellow MGS fan.


----------



## CC268

Thanks guys - I guess I could save some serious cash going with the 390X, just wasn't sure if the 980TI was worth it - I appreciate the feedback


----------



## bazookatooths

Though this was an interesting bench.



For the price, it looks like the 390 is king, even at 1440p with only a couple of settings dropped to high and the rest on ultra. Another guy was saying you'd need 2x 980 Ti to run most games @60fps on all ultra settings (he owns one). 390/390X all the way. Heck, 2x 390X


----------



## Rob27shred

Quote:


> Originally Posted by *Noirgheos*
> 
> I added you on Steam if you were wondering who it is. I see you're a fellow MGS fan.


Accepted







A very big Metal Gear fan, the og Metal Gear for NES is the 1st game that truly hooked me!


----------



## Geoclock

The 15.8 beta drivers worked for my MSI 390X too. Haven't overclocked it YET


----------



## kalidae

Add me to the club please










1170/1750 at +100mv is the fastest stable clock I have been able to get out of this Nitro 390. Max GPU temp is 74 and max VRM temp is 83 with my fan profile maxing out at 65%

Scores @1170/1750



I tried 1200/1750 at +184mv and it passed all my benchmarks with no artifacts; however, after 35 minutes of BF4 on ultra it crashed. Even at +200mv, BF4 still crashed. Here are the scores



The card was running too hot, and even with the fan at 100% the VRMs were hitting over 90 degrees. I would not have kept it at these clocks and voltages, but attempted it just for fun.

Currently working on a lower, safer, and cooler overclock right now. As I only have a 650w PSU and my 4690K is already overclocked to 4.7GHz, I'm worried that with a highly clocked card and CPU my PSU won't be able to keep up. Also, I just want a quiet system, so if I can get 1150/1650 or so at maybe +60mv, with a max fan speed of 55%, I'll be happy. Keep in mind all these tests were done with the case fans below 100% speed; they are connected to a fan hub, so unless the CPU is getting hot the fans won't spin past 50% (700 RPM), and Unigine doesn't stress the CPU. When I game, the case fans ramp up as the CPU is used more, so my overall temps while gaming are generally cooler.


----------



## djohny24

Has anyone here unlocked a PowerColor 390 PCS+ to a 390X? Is there any possibility? I bought one, so I will try it


----------



## kizwan

Is anyone game for running FireStrike @1100/1700? I just want to see the graphics score from all brands at the same clock. Any card should be able to do 1100/1700, right?
Quote:


> Originally Posted by *djohny24*
> 
> Is there any Powercolor 390 Pcs+ mutated to 390x here? is there any posibility? I bought one, so i will try it


There's no success story of unlocking a 390 to a 390X so far.


----------



## Gumbi

Quote:


> Originally Posted by *kalidae*
> 
> Add me to the club please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1170/1750 +100mv is the fastest and most stable clocks I have been able to get this nitro 390. Max gpu temps are 74 and vrms max is 83 with my fan profile maxing out at 65%
> 
> Scores @1170/1750
> 
> 
> 
> I tried 1200/1750 +184mv and it passed all my benchmarks with no artifacts, however after 35 minutes of bf4 on ultra it crashed. Even +200mv bf4 still crashed. Here are the scores
> 
> 
> 
> The card was running too hot, and even with the fan running at 100% the VRMs were hitting over 90 degrees. I would not have kept it at these clocks and voltages, but attempted it just for fun.
> 
> Currently working on a lower,safer and cooler overclock right now. As I only have a 650w psu and my 4690k is already overclocked to 4.7ghz I'm worried that with a highly clocked card and cpu that my psu won't be able to keep up, also I just want a quiet system so if I can get 1150 /1650 or so at maybe +60mv and possibly a max fan speed of 55% then ill be happy. Also keep in mind all these tests were done with all case fans at lower than 100% fan speed, they are connected to a fan hub so unless the cpu is getting hot the fans won't spin more than 50% 700rpm and unigine doesn't stress the cpu but when I game then case fans will ramp up as the cpu is used more so my overall temps while gaming are generally cooler.


A good 650w is fine with an i5, even if you are pumping up the volts on a 390. 200mv will have your card drawing 350-plus watts, though


----------



## kalidae

Quote:


> Originally Posted by *Gumbi*
> 
> A good 650w is fine with an i5, even if you are pumping up the volts on a 390. 200mv will have your card drawing 350 plus watts tho


It's a Corsair RM650; it's gold rated and a really nice PSU. I just freak out a bit because I don't actually know what I'm pulling at the wall. I think an overclocked i5, which does have a few added volts, plus the overclocked 390 would be getting close even at 100mv. I will get a bigger PSU eventually, since I'd like to get another 390 down the line.


----------



## Gumbi

It's absolutely fine. A 4670K at ~1.3V doesn't draw that much power. Your setup with a stock 390 would draw less than 450 watts; adding voltage to the 390 is fine, you have TONNES of headroom.
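A quick back-of-envelope budget shows why; every figure here is an assumption for illustration, not a measurement:

```python
# Rough PSU headroom check for an overclocked i5 + single overclocked 390.
# All wattages below are assumed worst-case figures, not measured values.
parts = {
    "i5 4690K @ 4.7GHz":      130,  # assumed CPU package power under load (W)
    "R9 390, heavy OC":       350,  # assumed worst-case GPU board power (W)
    "board/RAM/drives/fans":   60,  # assumed rest-of-system draw (W)
}
total = sum(parts.values())
headroom = 650 - total  # Corsair RM650 rated capacity
print(f"estimated load: {total} W, headroom: {headroom} W")
```

Even with worst-case numbers the estimate stays comfortably under the PSU's rating, which is why a second 390 (not the first) is the point where a bigger unit becomes worth considering.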


----------



## kalidae

Quote:


> Originally Posted by *Gumbi*
> 
> It's absolutely fine. A 4670k at 1.3v~ doesn't draw that much power. Your setup with a stock 390 would draw less than 450 watts, adding voltage to the 390 is fine, you have TONNES of headroom.


hmm that makes me feel better. Thanks for that mate.


----------



## Agent Smith1984

WOOHOO

New board is delivering today, should be back up and running later tonight.

Missing my beloved Crysis 3 deathmatches @ 4k Very High....









The settings I mean.... are very high... not me!


----------



## THUMPer1

Not sure if anyone knows: Sapphire has a new 390, with a 1040MHz clock and a backplate.


__ https://twitter.com/i/web/status/643809081271164928


----------



## Noirgheos

Quote:


> Originally Posted by *THUMPer1*
> 
> Not sure if anyone knows. Sapphire has a new 390. 1040MHz clock and a back plate.
> 
> 
> __ https://twitter.com/i/web/status/643809081271164928


I really want a 390X with a backplate. Please Sapphire.


----------



## Geoclock

Nice backplate, they should fill it up, missing 1" LOL...


----------



## bazookatooths

Quote:


> Originally Posted by *THUMPer1*
> 
> Not sure if anyone knows. Sapphire has a new 390. 1040MHz clock and a back plate.
> 
> 
> __ https://twitter.com/i/web/status/643809081271164928


Thought they all had backplates. Is this the only one that doesn't?


----------



## Agent Smith1984

I'd be pissed if I were a current Nitro owner.

The lack of backplate was my biggest gripe with the Nitro to begin with!!


----------



## sinholueiro

Quote:


> Originally Posted by *Noirgheos*
> 
> I really want a 390X with a backplate. Please Sapphire.


The XFX one has one, I think.


----------



## milan616

Quote:


> Originally Posted by *Geoclock*
> 
> Nice backplate, they should fill it up, missing 1" LOL...


You really only need a backplate over the PCB area to prevent PCB flex. Having some open area allows for better cooling, as air flows more freely over that section of the heatsink.


----------



## Noirgheos

Quote:


> Originally Posted by *sinholueiro*
> 
> The XFX one has one, I think.


Yeah if Sapphire doesn't put one on, I'll go XFX.


----------



## orlfman

Quote:


> Originally Posted by *THUMPer1*
> 
> Not sure if anyone knows. Sapphire has a new 390. 1040MHz clock and a back plate.
> 
> 
> __ https://twitter.com/i/web/status/643809081271164928


And I just bought two Nitros.... and can't refund them, since Newegg doesn't allow refunds on the Nitros, only replacement.

Eh, with my case I really don't need the backplate for support, but it still would have been nice.


----------



## Agent Smith1984

Yeah, that's kind of disappointing for Nitro rev. 1 owners....

It explains why they randomly dropped the Nitro down to $309 recently, though...


----------



## bazookatooths

That would piss me off. Did they only drop the price once word got around about it not having a backplate, or has it been that price since release?


----------



## Geoclock

So I tested my MSI 390X at stock settings in Valley, then OC'ed to 1100/6080 with +0.12 voltage, and the benches are lower.
Can you guys explain what's going on?
Thanks


PS: First game results, in Wolfenstein: The Old Blood: pixelated artifacts even at stock settings.
Downclocked 100MHz, but it didn't help.
Any ideas?


----------



## kalidae

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd be pissed if I'm a current nitro owner.
> 
> The lack of backplate was my biggest gripe with the Nitro to begin with!!


Yep, I'm pretty pissed off about that. In the 290 series, some had backplates. I watched a review on YouTube after I bought my Nitro and noticed the older model card had a backplate, and guess what? It's the exact same design as this one. I hope I can get one separately.


----------



## Oregonduck007

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd be pissed if I'm a current nitro owner.
> 
> The lack of backplate was my biggest gripe with the Nitro to begin with!!


Same. I really wanted to like the Nitro, but the missing backplate killed it for me. So I went with the MSI R9 390 even though it was 349.99 at the time..


----------



## kalidae

Quote:


> Originally Posted by *Oregonduck007*
> 
> Same, i really wanted to like the nitro, but no back plate really did it for me. So i went with the MSI R9 390 even though it was 349.99 at the time..


I was stupid and bought my Nitro without really doing any research on the 390 cards. I already knew a lot about the 290 series because I had researched them heavily and was going to get one; then the 390 came out. I went to the store to buy the ASUS card because it was the cheapest, and I knew the cooling sucked on it, but I was going to put a waterblock on it anyway. I got there and all the cards were sold out except for the Nitro. So I bought it, thinking everything else has a backplate and surely there would be a waterblock for it since it's still basically a 290.... Got home: no backplate, and EK will not make a waterblock for it. I was pissed, but it was my fault for buying it when I should have just waited for the ASUS card to come back in stock.... The next day it was in stock. ***. Lucky for me, this Nitro is a really good performer as far as Nitros go, so I'm stoked with its overclocking ability, cooling, and sound, but I think I'll still buy an XFX or ASUS 390 and give my gf this Nitro, purely because I want a waterblock.


----------



## Lindwurm

Apfft, dammit. I ordered a Nitro on the 12th for $330, and the price already dropped. I don't care about the backplate, but still.


----------



## Mysticking32

http://www.3dmark.com/compare/fs/5995609/fs/5995655

Overclocked R9 390X vs stock. It makes a small difference; I saw an increase of 5 FPS in the first graphics test. I don't think I'd be overclocking at this level daily though.

Settings in MSI Afterburner:

1160MHz core
1650MHz memory (crashed on anything past that)
+100mV
+50% power limit

Getting about the same performance as an R9 Fury at stock settings now, so I'm very happy about that. I would try installing Sapphire's TriXX to get a higher overclock, but I usually don't go above +100mV anyway. Not worth the risk.

Anyway, those overclocked settings are only for games where performance dips below 60 FPS, which isn't many.


----------



## Mysticking32

Quote:


> Originally Posted by *Geoclock*
> 
> So I tested my MSI 390X at stock settings in Valley, then OC'd to 1100/6080 with +0.12V, and the benchmarks are lower.
> Can you guys explain what's going on?
> Thanks
> 
> 
> PS: First game results: Wolfenstein: The Old Blood shows pixelated artifacts even at stock settings.
> Downclocked 100MHz, but it didn't help.
> Any idea?


Hey man, you have a defective card. Send it back now before it's too late. I had the same thing happen with my first R9 390X. Not worth the trouble, so just return it and get a replacement. If you ordered it from Amazon you can do it right now, choose one-day shipping, and get it on Thursday. Then when the new card arrives you can ship back your current card so you're not left without one for two days.

Depends on where you ordered from though.
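One small thing worth untangling from Geoclock's numbers above: "1100/6080" and memory clocks like "1500-1650" describe the same setting in different units. Some tools report the actual GDDR5 clock, others the effective data rate, which is 4x higher because GDDR5 transfers four bits per pin per clock. A quick sketch of the conversion (function names are mine, purely for illustration):

```python
# GDDR5 is quad-pumped: the effective data rate is 4x the actual
# memory clock, which is why "1520 MHz" and "6080 MHz" can describe
# the same memory setting in different tools.

def effective_rate(actual_mhz: int) -> int:
    """Effective GDDR5 data rate (MHz) from the actual memory clock."""
    return actual_mhz * 4

def actual_clock(effective_mhz: int) -> int:
    """Actual memory clock (MHz) from an effective GDDR5 data rate."""
    return effective_mhz // 4

print(effective_rate(1520))  # 6080, the figure quoted above
print(actual_clock(6000))    # 1500, the 390/390X stock memory clock
```

So Geoclock's 6080 is the same thing as a 1520MHz memory clock in Afterburner or TriXX.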


----------



## Geoclock

Thanks. I bought it new from an eBay member; MSI sent him a new replacement, and he didn't keep it and sold it on eBay.
Just registered it. I called customer support and they told me to reinstall the drivers, and to RMA it if that doesn't help.
So I just reinstalled and will test.
How much will shipping cost by USPS?


----------



## Mysticking32

Quote:


> Originally Posted by *Geoclock*
> 
> Thanks. I bought it new from an eBay member; MSI sent him a new replacement, and he didn't keep it and sold it on eBay.
> Just registered it. I called customer support and they told me to reinstall the drivers, and to RMA it if that doesn't help.
> So I just reinstalled and will test.
> How much will shipping cost by USPS?


Mine cost 12 dollars, though I'm not sure which service my mail room used to ship it. Should be somewhere around that.


----------



## Geoclock

Reinstalled and tried Crysis 2-3, Far Cry 4, Wolf: New Blood, Wolf II, and GTA V.
All are OK except Wolfenstein: The Old Blood.
Do you think it's worth trying to adjust some settings, or should I just send it in for RMA?


----------



## Mysticking32

Just an announcement: Amnesia is free on Steam until later this Wednesday, if anyone wants to grab it.


----------



## Oregonduck007

Quote:


> Originally Posted by *Mysticking32*
> 
> Just an announcement. Amnesia is free on steam until later this Wednesday, if anyone wants to grab it.


Already got my copy! Heard it's pretty creepy.


----------



## Darkeylel

Soooo, after a solid day of playing Dragon Age: Inquisition, I decided to jump back in after an hour's break or so and ran into a problem. My PC just turns off completely, no error message or anything. My temps are fine; nothing above 74C on my GPU and nothing over 50C on my CPU.

Thought I had it fixed, but after playing for an hour it just did it again. Completely up in the air about this...


----------



## Gumbi

Quote:


> Originally Posted by *Darkeylel*
> 
> Soooo after a solid day of playing dragon age inquisition, I decided to jump back into after a hours break or so and have ran into a problem. My Pc just turns off completely no error msg or anything, my temps are fine haven't hit anything above 74c on my GPU and nothing over 50c on my CPU.
> 
> Thought I had fixed but after playing for a hour it just did it again completely up in the air about this......


Sounds suspiciously like PSU issues.


----------



## Darkeylel

Quote:


> Originally Posted by *Gumbi*
> 
> Sounds suspiciously like PSU issues.


Ahhh, the one thing I was thinking of and hoping it isn't... What's the best way to test if it's the PSU?


----------



## Mysticking32

Quote:


> Originally Posted by *Oregonduck007*
> 
> Already got my copy! heard its pretty creepy.


I know right lol. I can't wait to play it.
Quote:


> Originally Posted by *Darkeylel*
> 
> Soooo after a solid day of playing dragon age inquisition, I decided to jump back into after a hours break or so and have ran into a problem. My Pc just turns off completely no error msg or anything, my temps are fine haven't hit anything above 74c on my GPU and nothing over 50c on my CPU.
> 
> Thought I had fixed but after playing for a hour it just did it again completely up in the air about this......


Hey man, go into the Windows event log. Is the error you're getting Kernel-Power Event 41?
If so, it may very well be the power supply. It could also be the memory, however; run MemTest and see if you get any errors. Or it could be the drivers. Install DDU and choose the "clean and shutdown" option for a new graphics card, take your card out and blow any dust away, put the card back in, and boot the computer. Then install the latest 15.8 beta drivers from AMD. Should work now.


----------



## Darkeylel

Quote:


> Originally Posted by *Mysticking32*
> 
> Hey man. Go into windows event log and is the error you're getting kernel event power 41??
> If so it may very well be the power supply. It could also be the memory however. Run mem test and see if you get any errors. Or it could be the drivers. Install ddu and choose the clean and shutdown option for new graphics card. Take your card out and blow any dust away. Place the card back in and boot the computer. Then install the latest 15.8beta drivers from amd. Should work now.


Yeah, it's error 41, sigh...

MemTest was the first thing I did, haha, all good in that department. Running the 15.8 drivers at the moment; going to uninstall them and see how it goes.


----------



## Mysticking32

Quote:


> Originally Posted by *Darkeylel*
> 
> Yea it's error 41 sigh......
> 
> Mem test was the first thing I did haha all good in that department, running the 15.8 drivers at the moment going to uninstall them and see how it goes


Don't forget the steps involving DDU.


----------



## Darkeylel

Quote:


> Originally Posted by *Mysticking32*
> 
> Don't forget the steps involving ddu


So I did all the steps involved, played for a solid 2 hours, and it worked great; temps were fine. Then, just as I was about to make a post on here, it turned itself off...


----------



## toxick

Used Trixx

+200mV

http://www.3dmark.com/3dm11/10281266

+100mV

http://www.3dmark.com/3dm11/10300954


----------



## Klocek001

What's the difference between the old Tri-X, the new Tri-X, and the Nitro cooling?


----------



## Gumbi

Quote:


> Originally Posted by *Klocek001*
> 
> what's the difference between the old trix, new trix and nitro cooling ?












And the Nitro is a rebranded new-edition Tri-X.


----------



## Klocek001

Quote:


> Originally Posted by *Gumbi*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And Nitro is rebranded new edition Trix.


thx. can you post a normal link and pic ?


----------



## Gumbi

Am on phone, took me 5 edits to fix









http://www.pcgameshardware.de/screenshots/original/2015/02/Sapphire_R9_290X_New_Edition-pcgh.png


----------



## Klocek001

Quote:


> Originally Posted by *Gumbi*
> 
> Am on phone, took me 5 edits to fix
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.pcgameshardware.de/screenshots/original/2015/02/Sapphire_R9_290X_New_Edition-pcgh.png


Still broken. PM maybe?


----------



## Gumbi

Quote:


> Originally Posted by *Klocek001*
> 
> still broken. PM maybe ?


It works fine for me. Google "Sapphire Tri x new edition vs old" and scroll down about 20 pics.


----------



## milan616

There, hope this helps


----------



## Gumbi

Thanks man... was relying on my phone and it was just not cooperating :/

As you can see, the new edition is pretty much the same, except slightly better in just about every aspect.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Thanks man... was relying on phone and it was just not co-operating :/
> 
> As you can see the new edition is pretty much the same except slightly better in pretty much every aspect


I own one.


----------



## Gumbi

Quote:


> Originally Posted by *battleaxe*
> 
> I own one.


Apart from the "vapor chamber", backplate, and IFC (big whoop), is it identical PCB-wise to the Vapor-X?

With the extra phase (I know the 290 Vapor-X is 5+1 and the old Tri-X was 5; the 290X Vapor-X is 10, though, so I'm not sure about the new 290 Tri-X) and the 375W TDP, it seems very similar.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Apart from the"vapor chamber" and backplate, and IFC (big whoop) is it identical PCB-wise to the VaporX?
> 
> With the extra phase (I know 290 VaporX is 5+1, and old trix was 5; 290x vaporx is 10 tho so not sure about 290 trix new) and 375w tdp it seems very similar.


No, the PCB is different. An old block will not fit on the new one. I wish it would.

VRM1 and VRM2 are both where VRM1 used to be; the top of the current VRM1 area has a small cluster, and that is VRM2. I had to do a lot of experimenting to figure this out. So now the air cooler that sits on VRM1 cools VRM2 as well. Kinda weird. There are a few other VRMs around the board, but those are not monitored through GPU-Z. I just put some good RAM sinks on those, just in case; most were not connected/covered by the air cooler block at all.

I'm not sure what the extra power means in the real world. Mine runs up to about 1260MHz/1680MHz for benching purposes. Better than most, not as good as some of the old ones.


----------



## MTDEW

MTDEW - Sapphire - Tri-X 390X - Air/stock - 1055 / 1500
No overclocking just yet.


----------



## navjack27

Here is what I got with my 390X at its stock overclocked speed, with no CPU overclock and no hyperthreading.


This is for Geoclock, in re: his post.

Here is a GPU overclock of 1150/1650 and +25 on core voltage. Should probably do +50 for zero artifacts; there were minimal ones, though.


Same as the last on the GPU, except +50 on core voltage, but with my CPU @ 4GHz.


Max 72C with one run and a fan profile.


----------



## Offender_Mullet

Got the 390 in today. Nice, solid build quality on this. Very impressed with XFX.


----------



## FooSkiii

Stock cooler on the GPU; working on getting a water block for it soon!
XFX DD R9 390


----------



## Noirgheos

Anybody here play Witcher 3? At 1080p, I'd like to see what FPS you get with everything maxed but shadows and foliage distance set to high, and HBAO+ enabled, please. Also, obviously, no HairWorks. If anybody can oblige, I'd be real happy.

Also, post your GPU clocks please.


----------



## desetnik

Quote:


> Originally Posted by *Noirgheos*
> 
> Anybody here play Witcher 3? At 1080p, I'd like to see what FPS you get with everything maxed, but shadows put to high, and foliage distance to high, and use HBAO+ please. Also obviously no hairworks. If anybody can oblige, I'd be real happy.
> 
> Also, post your clocks on your GPU please.


My stock 390 runs the game with everything on ultra (only foliage distance on high) at 50+ FPS. Mostly it's at 60 while just running around, except in the dense forests where it can fall to 45, but that's rare, less than 5% of my gameplay time. I'm also playing with HairWorks on, using the mod that removes it from Geralt, and tessellation set to x8 in Catalyst; the FPS loss was minimal in combat with multiple wolves. Personally HairWorks is meh, but I still use it for fun. If I could OC the card I don't think my FPS would ever fall under 50, since it runs really well.


----------



## Noirgheos

Quote:


> Originally Posted by *desetnik*
> 
> My stock 390 runs the game at everything ultra with only foliage distance to high at 50+. Mostly its at 60 while just running around except in the dense forests where it can fall to 45 but thats rare its like less than 5% of my gameplay time. Also I'm playing with hairworks on with mod that removes it from geralt and the tesselation is to x8 in catalyst, the fps loss was minimal in combat with multiple wolves. Personally hairworks is meh but I still use it for fun. If I could OC the card I don't think my fps would ever fall under 50 or more since it runs really good.


Thanks for the response.


----------



## kizwan

Quote:


> Originally Posted by *navjack27*
> 
> here is what i got with my 390x at its stock overclocked speed, with no cpu overclock and no hyperthreading.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> this is for Geoclock in re to his post
> 
> here is with a gpu overclock of 1150/1650 and +25 on core voltage. prob should do +50 for ZERO artifacts, there was minimal tho.
> 
> 
> same as the last on the gpu except +50 on core voltage BUT with my cpu @ 4GHz
> 
> 
> max 72C with one run and a fan profile


What was the CPU clocked at for the first Valley run? That pretty much shows +25mV isn't enough juice for that clock, and the CPU does affect the score a little bit, adding a couple of FPS at higher CPU frequency.

BTW, Haswell did have extra unlocked bins, didn't it? If so, you should be able to hit 4.1GHz, shouldn't you?


----------



## Oregonduck007

Quote:


> Originally Posted by *Offender_Mullet*
> 
> Got the 390 in today. Nice, sold build quality on this. Very impressed with XFX.


Nice! Welcome to the club lol.


----------



## Oregonduck007

Quote:


> Originally Posted by *Noirgheos*
> 
> Anybody here play Witcher 3? At 1080p, I'd like to see what FPS you get with everything maxed, but shadows put to high, and foliage distance to high, and use HBAO+ please. Also obviously no hairworks. If anybody can oblige, I'd be real happy.
> 
> Also, post your clocks on your GPU please.


I could run a few Witcher tests later tonight. I'll be running it on an MSI R9 390 at 1102MHz/1599MHz.


----------



## Derek129

So sick of my video driver crashing while playing GTA V...


----------



## navjack27

With the 4790S I'm not sure, kizwan. The clocks on the default run were whatever auto is on my Z97-A: like 3.6GHz down to 3.2GHz, and sometimes 4GHz if something uses just one core, which is rare. That's why I ratio-lock it to 40.

EDIT: I actually emailed Intel and Asus about how the default settings on this mobo NEVER worked at spec. I'm sure you know what I mean: a CPU has spec speeds and voltages it's supposed to run at, and there is no way to get that. Auto does God-knows-what to the voltages and speeds. With the stock cooler it was impossible to be stable; now that I have an H80i GT everything is dandy. But I still prefer to have it core-locked at 40x for 4GHz with no power savings enabled.
Intel and Asus had no idea what I was talking about. Not sure if it was how I typed it or whatever, but yeah. It's weird.


----------



## Oregonduck007

Quote:


> Originally Posted by *Noirgheos*
> 
> Anybody here play Witcher 3? At 1080p, I'd like to see what FPS you get with everything maxed, but shadows put to high, and foliage distance to high, and use HBAO+ please. Also obviously no hairworks. If anybody can oblige, I'd be real happy.
> 
> Also, post your clocks on your GPU please.


Same as desetnik: with everything set to high and HBAO+ on, I was at a solid 60 FPS. MSI R9 390, 1102MHz/1599MHz.


----------



## kizwan

Quote:


> Originally Posted by *navjack27*
> 
> with the 4790 S i'm not sure kizwan. the clocks on the default run were whatever auto is on my z97-a. like 3.6ghz to 3.2ghz and sometimes 4ghz if something just uses one core, which is rare. thats why i ratio lock it to 40.
> 
> EDIT: i actually emailed intel and asus about how default settings on this mobo NEVER WORKED at spec. i'm sure you know what i mean. a cpu has spec speeds things run at, and voltages. THERE IS NO WAY TO GET THAT. auto does GOD KNOWS what to the voltages and speeds. with stock cooler it was impossible to be stable. now that i have an h80i gt everything is dandy. but i still prefer to have it core locked at 40x for 4ghz with no power savings enabled.
> intel and asus had no idea what i was talking about. not sure if it was how i typed it or whatever, but yeah. its weird.


My mistake, non-K Haswell CPUs don't have extra bins. The highest your CPU can go is 3.6GHz with all cores active; only with one core active can it go up to 4GHz.


----------



## LongRod

Quote:


> Originally Posted by *Derek129*
> 
> so sick of my video driver crashing while playing gtav....................................................


Apparently AMD has fixed this internally (according to a post by AMD_Matt in the huge thread on the AMD forums about the 300 series driver-crashing issues); it's in testing atm, which is why the driver hasn't been released.

Hopefully not too much longer...


----------



## Dorland203

Does anyone use Aiseesoft Total Video Converter or any similar programs? Can you enable video acceleration to convert video files?


----------



## Rhys93B

Having built my expensive PC with the aim of playing GTA 5 in glorious 1440p, it is incredibly annoying to find out I can play it for about 2 minutes before it crashes due to "AMD drivers stopped working", or whatever the error phrase was.

It's actually a good card (XFX R9 390). The highest temperature I've seen it hit so far is 71C, and that was after 3 hours of BioShock Infinite at 1440p.

I hope they release a fix soon, as it is so frustrating being limited to older and less demanding games.


----------



## milan616

Quote:


> Originally Posted by *Oregonduck007*
> 
> I could run a few Witcher tests later tonight. Ill be running it on a MSI R9 390 and be running it at 1102mhz/1599mhz.


If you ever get OCD about it: I learned that you can move the sliders in Afterburner with the keyboard left/right arrow keys after clicking them. 1102/1599 made me crazy; 1100/1600 is the way.


----------



## Noirgheos

Alright, my EVGA G2 750W PSU has been making a buzzing noise for 4 months whenever the PC boots up, and then it goes away. Nonetheless, it is annoying. Can someone recommend a good 650W or 750W PSU? Has to be 80+ Gold. Max $160 CAD please.


----------



## Noirgheos

Quote:


> Originally Posted by *Rhys93B*
> 
> Having built my expensive PC with the aim of playing GTA 5 in glorious 1440P, it is incredibly annoying to find out I can play it for about 2 minutes before it crashes due to AMD drivers stopping working or whatever the error phrase was.
> 
> It's actually a good card (XFX R9 390). The highest temperature I;ve seen it get so far is 71C and that was after 3 hours of Bioshock Infinite at 1440p.
> 
> I hope they release a fix soon as it is so frustrating being limited to older and less demanding games.


Well, AMD has said they've fixed that issue and that they'll release a new driver for it by the end of September. That's just what I heard, but I'm certain they said it's fixed.


----------



## By-Tor

Quote:


> Originally Posted by *Noirgheos*
> 
> Alright, my EVGA G2 750W PSU has been making a buzzing noise for 4 months whenever the PC boots up, and then it goes away. Nonetheless, it is annoying. Can someone recommend a good 650W or 750W PSU? Has to be 80+ Gold. Max $160 CAD please.


I've been using this 850w XFX PSU for a couple of years now and it has been working great for me. A couple weeks back I ran Furmark and the watt meter on my desk said my system was pulling 1015 watts out of the wall and just kept on pulling until the test was over.

http://www.newegg.com/Product/Product.aspx?Item=N82E16817207028

They do make this 750w non-modular PSU if you don't mind all the cables in the case...

http://www.newegg.com/Product/Product.aspx?Item=N82E16817207033


----------



## kalidae

Quote:


> Originally Posted by *Noirgheos*
> 
> Alright, my EVGA G2 750W PSU has been making a buzzing noise for 4 months whenever the PC boots up, and then it goes away. Nonetheless, it is annoying. Can someone recommend a good 650W or 750W PSU? Has to be 80+ Gold. Max $160 CAD please.


I'm using the Corsair RM650; it's Gold rated. It's got 0dB fanless technology and I have never actually heard the PSU running, not once. The fan probably spins from time to time, but I've never seen or heard it spin. I don't know the price in CAD because I'm an Aussie, but it's worth checking out.


----------



## Gumbi

Quote:


> Originally Posted by *Rhys93B*
> 
> Having built my expensive PC with the aim of playing GTA 5 in glorious 1440P, it is incredibly annoying to find out I can play it for about 2 minutes before it crashes due to AMD drivers stopping working or whatever the error phrase was.
> 
> It's actually a good card (XFX R9 390). The highest temperature I;ve seen it get so far is 71C and that was after 3 hours of Bioshock Infinite at 1440p.
> 
> I hope they release a fix soon as it is so frustrating being limited to older and less demanding games.


Revert your drivers...


----------



## Mysticking32

Quote:


> Originally Posted by *Oregonduck007*
> 
> Same with desetnik. With everything set to high and HBAO+ I was at a solid 60 FPS. MSI R9 390 1102mhz/1599mhz.


Same with the 390X. I prefer high settings just so I stay over 60 FPS; runs butter smooth. I use AMD's tool to cap the FPS at 60, though.

Really a handy tool. Temps and power consumption are lowered, which is a plus for anyone. MSI R9 390X, 1100MHz/1525MHz.


----------



## Mysticking32

Quote:


> Originally Posted by *Rhys93B*
> 
> Having built my expensive PC with the aim of playing GTA 5 in glorious 1440P, it is incredibly annoying to find out I can play it for about 2 minutes before it crashes due to AMD drivers stopping working or whatever the error phrase was.
> 
> It's actually a good card (XFX R9 390). The highest temperature I;ve seen it get so far is 71C and that was after 3 hours of Bioshock Infinite at 1440p.
> 
> I hope they release a fix soon as it is so frustrating being limited to older and less demanding games.


Hey man, I was getting the same error. It turns out the game crashes when you overclock the card. I don't really get this at all, but instead of using Sapphire's TriXX tool I started using AMD's OverDrive and it stopped crashing as much.

Hell, games like Rome 2 still crash for me. I doubt they'll ever fix that though.


----------



## Rhys93B

Quote:


> Originally Posted by *Gumbi*
> 
> Revert your drivers...


That was quite vague. Revert to what?

From what I understand there is no fix available. A guy named Matt from AMD said they may have a fix, but they're still testing it internally; no time-frame was given for release. If the problem were fixable simply by reverting to previous drivers, surely they would have suggested that already?


----------



## toxick

So far I've had five Hawaii GPUs: three R9 290s and two R9 390s.
Of the R9 290s, two were BBA (Sapphire and Asus) and the third was a Sapphire Tri-X OC with 6+8 power. The Sapphire BBA reached 1220 on the GPU, 24/7. I did a few tests on it and swapped the cooling for a Raijintek Morpheus Core Black Edition, very cute, but after a while I purchased the Asus R9 290 BBA and was forced to abandon the custom cooling, because I couldn't mount the cards in CrossFire.
My first R9 390 was a Club 3D: maximum temperature 72C in normal use (Crysis 3, etc.), OC 1175 on the GPU at +150mV, and VRM temperature maxing at 65C.
And the last one, a Sapphire R9 390 Nitro, does 1200 on the GPU at +200mV, but the VRM reaches 92C.
From what I've seen, Club 3D's VRM temperatures are much lower.


----------



## FooSkiii

Hey guys, I was thinking of getting this to water cool my 390. What do you guys think?

https://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html

Has anybody tried using this yet?


----------



## Oregonduck007

Quote:


> Originally Posted by *Mysticking32*
> 
> Same with 390x. I prefer high settings just so I get over 60 fps. Runs butter smooth. I use amd's tool to cap the fps at 60fps though.
> 
> Really a handy tool. Temps and power consumption is lowered. Which is a plus for anyone. MSI r9 390x 1100mhz/1525mhz.


Nice to know! BTW, have you maxed out your MSI R9 390X yet? I got my MSI 390 to max at 1200/1700 with +100mV.


----------



## Gumbi

Quote:


> Originally Posted by *FooSkiii*
> 
> Hey guys I was thinking of trying to get this to water cool my 390 what do you guys think????
> 
> https://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html
> 
> Has anybody tried to use this yet?


I personally don't see the point, given all the 390s come with decent aftermarket coolers... the cooler is nice though, but make sure you have adequate VRM cooling. I've seen that bracket keep the core icy cold but let the VRMs get very hot. You need heatsinks as well as that fan blowing cool air onto them, and make sure the card itself has some cool air being fed to it.


----------



## FooSkiii

Quote:


> Originally Posted by *Gumbi*
> 
> I personally don't see the point given all the 390 models are decent aftermarket models... the cooloer is nice though, but make sure you have adequate VRM cooling... I've seen that cooler have the core icy cold but allow the VRMs to get very hot. You need heatsinks as well as that fan blowing cool air onto them, and make sure the card itself has some cool air being fed to it.


That's what I was thinking, 'cause from the look of it the VRMs won't be cooled as much. Bummer.
I was also thinking about a full water block, but then it gets real pricey lol


----------



## CaptainZombie

I recently had a 980 that I sold a few days ago and was considering getting a GTX 960 to tide me over till next year to see what cards come out. I did have a 970 before that, but ditched that card over the 3.5GB issue. I was lurking around the forums today and see that the 390 w/8GB is a pretty good card. I have an i7 4790K in my ITX build hooked up to a 4K TV, but would be OK with gaming at 1080p since at times 4K can get wonky on my end. If I am not OC'ing the card, would I be OK with a 550W PSU?

How is the MSI 390 compared to Sapphire or Gigabyte? I've never owned an AMD card, but I have tested a few in the past and hated the coil whine and heat output.


----------



## Dundundata

Quote:


> Originally Posted by *Noirgheos*
> 
> Anybody here play Witcher 3? At 1080p, I'd like to see what FPS you get with everything maxed, but shadows put to high, and foliage distance to high, and use HBAO+ please. Also obviously no hairworks. If anybody can oblige, I'd be real happy.
> 
> Also, post your clocks on your GPU please.


I am running it with everything on ULTRA except foliage distance on high, and getting a pretty solid 60 FPS. HairWorks takes a decent FPS hit on occasion; it does look nice though. This is on an XFX 390 running at 1100/1600 on stock voltage. I am just starting to test some higher OC numbers, so it will be interesting to see how far I can push this card.

Enabling HBAO+, there was no real noticeable drop in frames. Please note these were all just simple tests/observations; I hope to go more in depth soon.


----------



## Darkeylel

Just want to say thanks to Mysticking32. The PC is up and working again (knocking on wood for dear life). Rewired my PC and did a fresh install of drivers, which seems to have done the trick. The PSU seems stable again, so I'm in two minds: either something was causing a short somewhere, or my BIOS actually decided to roll over and die.

But thanks again for the help.


----------



## Dundundata

Quote:


> Originally Posted by *kalidae*
> 
> I'm using the corsair rm650 it's gold rated. It's got 0 db fanless technology and i have never actually heard the psu running. Not once. The fan probably spins from time to time, never seen it spin and i have never heard it spin. I don't know about the price in cad because im aussie but it's worth checking out.


Just picked up/installed the RM750i


----------



## Gumbi

Quote:


> Originally Posted by *CaptainZombie*
> 
> I recently had a 980 that I sold a few days ago and was considering getting a GTX960 to tied me over till next year to see what cards come out. I did have a 970 before that but ditched the card with the 3.5GB issue. I was lurking around the forums today and see that the 390 w/8GB is a pretty good card. I do have an i7 4790k in my ITX build hooked up to a 4KTV, but would be ok with gaming at 1080p since at times 4K can get wonky on my end. If I am not OC'ing the card, would I be ok with a 550W PSU?
> 
> How is the MSI 390 when compared to Sapphire or Gigabyte? I've never owned an AMD card, but have tested a few in the past and hated the coil whine and heat out put.


MSI 390 is great, one of the best 390 models out there. Coil whine is no more prevalent in AMD model cards than it is in nVidia cards...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CaptainZombie*
> 
> I recently had a 980 that I sold a few days ago and was considering getting a GTX960 to tied me over till next year to see what cards come out. I did have a 970 before that but ditched the card with the 3.5GB issue. I was lurking around the forums today and see that the 390 w/8GB is a pretty good card. I do have an i7 4790k in my ITX build hooked up to a 4KTV, but would be ok with gaming at 1080p since at times 4K can get wonky on my end. If I am not OC'ing the card, would I be ok with a 550W PSU?
> 
> How is the MSI 390 when compared to Sapphire or Gigabyte? I've never owned an AMD card, but have tested a few in the past and hated the coil whine and heat out put.
> 
> 
> 
> MSI 390 is great, one of the best 390 models out there. Coil whine is no more prevalent in AMD model cards than it is in nVidia cards...
Click to expand...

MSI and XFX are the top cards this gen from what we've seen; Sapphire and PowerColor are also pretty good.

Stay away from Gigabyte for the love of all that is holy... I have seen way, way too many broken cards from them lately (not just talking about the 390/X here). Asus is OK afaik; the Strix is decent, but the DCU II has the same heat issues the 290X did.

Haven't seen Club3D, VisionTek or HIS cards yet, so I can't comment on them.


----------



## kalidae

Quote:


> Originally Posted by *Dundundata*
> 
> Just picked up/installed the RM750i


Nice! What's the difference between the RM and RMi? Is it just digital or something? I haven't checked those out yet.


----------



## flopper

Quote:


> Originally Posted by *CaptainZombie*
> 
> If I am not OC'ing the card, would I be ok with a 550W PSU?
> 
> How is the MSI 390 when compared to Sapphire or Gigabyte? I've never owned an AMD card, but have tested a few in the past and hated the coil whine and heat out put.


550W is fine.
Coil whine can happen with any brand.
Heat seems OK with the newer coolers all around.


----------



## Gumbi

Quote:


> Originally Posted by *flopper*
> 
> 550w are fine.
> coil whine can happen with any brand.
> Heat seems ok with the newer coolers all around.


550W is fine IF YOU HAVE AN INTEL CHIP. If you have an AMD chip, be careful, and definitely don't do it if you are overclocking the AMD CPU.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *flopper*
> 
> 550w are fine.
> coil whine can happen with any brand.
> Heat seems ok with the newer coolers all around.
> 
> 
> 
> 550 w is fine IF YOU HAVE AN INTEL CHIP. If you have an AMD chip be careful, and definitely don't do it if you are overclockig the AMD CPU.
Click to expand...

I know it's not a proper comparison but I've run an FX-6300 at 4.7Ghz and an R9 290 (small oc 1100/1400) on an XFX TS 550w PSU









Now if it were an FX-8xxx series CPU and you had it clocked at 5.0GHz or so, then I would say no, but at 4.5-4.6 I think you'd be alright.


----------



## Gumbi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I know it's not a proper comparison but I've run an FX-6300 at 4.7Ghz and an R9 290 (small oc 1100/1400) on an XFX TS 550w PSU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now if it was an FX-8xxx series CPU and you had it clocked at 5.0 or so then i would say no but at 4.5-4.6 i think you'd be alright


Ya, that would be fine, but I wouldn't be putting 100mV through that card.


----------



## flopper

Quote:


> Originally Posted by *Gumbi*
> 
> 550 w is fine IF YOU HAVE AN INTEL CHIP. If you have an AMD chip be careful, and definitely don't do it if you are overclocking the AMD CPU.


A 550W PSU is fine even on an AMD CPU, as long as it's not a no-name unit.
Gaming with a stock 290/390 is within margins.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I know it's not a proper comparison but I've run an FX-6300 at 4.7Ghz and an R9 290 (small oc 1100/1400) on an XFX TS 550w PSU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now if it was an FX-8xxx series CPU and you had it clocked at 5.0 or so then i would say no but at 4.5-4.6 i think you'd be alright
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ya that would be fine, but I wouldn't be putting 100mv through that card
Click to expand...

Oh, of course not, but if they only want to run stock then they'd be fine tbh.









For overclocking, a nice 750W PSU would do the trick; for Crossfire, 850-1000W, and anything above that is 1200W+ territory anyway.
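The rule-of-thumb sizing discussed above can be sketched as a quick back-of-the-envelope calculation. This is only a sketch under assumed TDP/board-power figures; `estimated_draw` and `psu_ok` are hypothetical helpers, and real transient draw can spike well above these averages:

```python
# Back-of-the-envelope PSU sizing, in the spirit of the thread's advice.
# All wattage figures below are ballpark assumptions, not measurements.

def estimated_draw(cpu_tdp_w, gpu_board_power_w, other_w=75):
    """CPU + GPU plus ~75W for motherboard, RAM, drives, and fans."""
    return cpu_tdp_w + gpu_board_power_w + other_w

def psu_ok(psu_rating_w, draw_w, headroom=0.85):
    """Keep sustained draw under ~85% of the PSU's rating for comfort."""
    return draw_w <= psu_rating_w * headroom

# Stock i7-4790K (~88W) + stock R9 390 (~275W) on a 550W unit:
stock = estimated_draw(88, 275)              # 438W -> fits a 550W PSU
print(stock, psu_ok(550, stock))             # 438 True

# Heavily overclocked FX (~200W) + overvolted 390 (~350W):
oc = estimated_draw(200, 350)                # 625W -> wants a 750W unit
print(oc, psu_ok(550, oc), psu_ok(750, oc))  # 625 False True
```

The 85% headroom factor is a conservative guess; it matches the thread's conclusion that stock runs are fine on a quality 550W unit while overclocked AMD setups should step up to ~750W.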


----------



## jackalopeater

Been absent for a while; lots of good stuff rolling through.

I've been spending a bit of time with my new Nano from AMD, after spending a while with Tonga, Hawaii/Grenada, and now Fiji.

If I were spending my own money, I'd be all over the XFX or MSI R9 390 all day, or a pair of them for that matter.









*Not an advertisement, but just my personal opinion*

This test was run at stock clocks but with +25% power limit on the 390X and +50% on the Nano (trying to keep clocks locked on both).
Dirt Rally, Ultra preset (no additional AA)
FX-8370 @ 4.6GHz
16GB AMD R9 Gamer RAM, 2133MHz
Crosshair V Formula-Z
Cooler Master V1200 Platinum PSU


----------



## Noirgheos

Quote:


> Originally Posted by *jackalopeater*
> 
> Been absent for a while, lots of good stuff rolling through.
> 
> I've been spending a bit of time with my new Nano from AMD and after spending a while with Tonga, Hawaii/Grenada, and now Fiji.
> 
> If I were spending my money I'd be all over the XFX or MSI R9 390 all day or a pair of them for that matter
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Not an advertisement, but just my personal opinion*
> 
> This test was run at stock clocks but +25 Power Limit on the 390x and +50 Power Limit on the Nano ( trying to keep clocks locked on both)
> Dirt Rally Ultra Preset (no additional AA)
> FX 8370 @4.6
> 16gb AMD R9 Gamer Ram 2133mhz
> Crosshair V Formula Z
> Cooler Master V1200platinum PSU


You're telling me the Nano outperforms the 390X? Damn. Well it does cost more I guess.


----------



## Dorland203

Quote:


> Originally Posted by *Noirgheos*
> 
> You're telling me the Nano outperforms the 390X? Damn. Well it does cost more I guess.


The Nano has the full Fiji chip, so I think it should be faster than a 390X.


----------



## CaptainZombie

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CaptainZombie*
> 
> I recently had a 980 that I sold a few days ago and was considering getting a GTX960 to tied me over till next year to see what cards come out. I did have a 970 before that but ditched the card with the 3.5GB issue. I was lurking around the forums today and see that the 390 w/8GB is a pretty good card. I do have an i7 4790k in my ITX build hooked up to a 4KTV, but would be ok with gaming at 1080p since at times 4K can get wonky on my end. If I am not OC'ing the card, would I be ok with a 550W PSU?
> 
> How is the MSI 390 when compared to Sapphire or Gigabyte? I've never owned an AMD card, but have tested a few in the past and hated the coil whine and heat out put.
> 
> 
> 
> MSI 390 is great, one of the best 390 models out there. Coil whine is no more prevalent in AMD model cards than it is in nVidia cards...
Click to expand...

That is good to hear about the coil whine. I've watched several reviews; does this card get very hot? I'd be placing it in a 250D.

How many slots is this card, 2 or 3?
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CaptainZombie*
> 
> I recently had a 980 that I sold a few days ago and was considering getting a GTX960 to tied me over till next year to see what cards come out. I did have a 970 before that but ditched the card with the 3.5GB issue. I was lurking around the forums today and see that the 390 w/8GB is a pretty good card. I do have an i7 4790k in my ITX build hooked up to a 4KTV, but would be ok with gaming at 1080p since at times 4K can get wonky on my end. If I am not OC'ing the card, would I be ok with a 550W PSU?
> 
> How is the MSI 390 when compared to Sapphire or Gigabyte? I've never owned an AMD card, but have tested a few in the past and hated the coil whine and heat out put.
> 
> 
> 
> MSI 390 is great, one of the best 390 models out there. Coil whine is no more prevalent in AMD model cards than it is in nVidia cards...
> 
> Click to expand...
> 
> MSI and XFX are the top cards in this gen from what we've seen, Sapphire and Powercolor are also pretty good.
> 
> Stay away from Gigabyte for the love of all that is holy.....i have seen way way too many broken cards from them lately (not just talking about the 390/x here) and Asus is ok afaik.....the Strix is decent but the DCU II has the same heat issues the 290x did.
> 
> Havent seen Club3D's, Visiontek's or HIS yet to comment on them.
Click to expand...

I'll stick with the MSI then; it seems like the best of the bunch. I've heard that Gigabyte has had issues across the board.
Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CaptainZombie*
> 
> If I am not OC'ing the card, would I be ok with a 550W PSU?
> 
> How is the MSI 390 when compared to Sapphire or Gigabyte? I've never owned an AMD card, but have tested a few in the past and hated the coil whine and heat out put.
> 
> 
> 
> 550w are fine.
> coil whine can happen with any brand.
> Heat seems ok with the newer coolers all around.
Click to expand...

Awesome. Glad the 550W is enough. I have an i7 4790K, for the guy that was asking.

How have AMD's drivers been of late?

Thanks to everyone; you've been very helpful.


----------



## flopper

Quote:


> Originally Posted by *Noirgheos*
> 
> You're telling me the Nano outperforms the 390X? Damn. Well it does cost more I guess.


One could say it's the same as the Fury X without the watercooling bit.
Maybe even better.

I'm about to buy a 390, since for me it's simply a case of waiting for next year's die shrink; you only gain a few fps with a card at twice the money.
Not a big fan of spending a lot for not much in return.
Next year will be really interesting in the graphics arena, though.


----------



## bazookatooths

Witcher 3 with an overclocked R9 390 usually sits around 77 fps with no HairWorks; stock sits around 66.

This game seems to love the 8 cores on my 8120.


----------



## flopper

Quote:


> Originally Posted by *bazookatooths*
> 
> Witcher 3 with overclock r9 390 usually sit around 77 fps no hairworks, stock sits around 66.
> 
> This game seems to love 8 cores on my 8120.


Can't wait for DX12 games to be the norm.


----------



## Darkstalker420

Can you add me please
















Thanx.


----------



## Gumbi

Another XFX, noice. Let us know how she clocks/cools.


----------



## alexelite

Anyone running the VTX3D R9 390? I have mine but haven't had a play with it yet. Any advice?


----------



## Geoclock

I have a hexa-core Dell T3600 workstation with a 635W PSU, and I get shutdowns with the MSI 390X in games and in the Windows Experience Index; it worked well with the R9 290. Strange.
Here is the pic


----------



## alexelite

This is my VTX3D. I can upload a pic of the rig if required, cheers.


----------



## Gumbi

Quote:


> Originally Posted by *Geoclock*
> 
> I have Dell Workstation a heca Core T3600 with 635w PSU and i have shutdowns with MSI 390x in games and Windows Experience index, worked well with r9 290, strange.
> Here is the pic


390s draw more power than 290s (higher stock voltage), and your PSU is pretty trash; the extra power draw could be what's crashing it.


----------



## bazookatooths

Quote:


> Originally Posted by *flopper*
> 
> cant wait for Dx12 games to be the norm


I am excited, although only up to 6 cores, because Intel only has a 6-core consumer CPU. It will be awesome; AMD was also seeing up to an 80% boost on their GPUs. We will see though...


----------



## kizwan

Quote:


> Originally Posted by *Geoclock*
> 
> I have Dell Workstation a heca Core T3600 with 635w PSU and i have shutdowns with MSI 390x in games and Windows Experience index, worked well with r9 290, strange.
> Here is the pic


Your PSU has five +12V rails; did you connect the PCIe connectors to different rails?


----------



## Geoclock

It's a Dell and I don't think it's possible; I'll double-check with customer support just in case. Thanks.


----------



## jackalopeater

Quote:


> Originally Posted by *flopper*
> 
> One can say its the same as the Fury X without the watercooling bit.
> Maybe even better.
> 
> I am about to buy a 390 as its simply a case for me to wait for next years die shrink due to the few fps one have with a twice the money card.
> not a big fan to spend for not much in return.
> Next year be really interesting though in the graphics arena


I still stand by it: if I were putting my own money up, the R9 390 would honestly be the card I would look at.


----------



## By-Tor

Quote:


> Originally Posted by *jackalopeater*
> 
> I still stand that if I were putting my own money up, the R9 390 would be the card I would look at honestly


I too think the 390s are the sweet-spot (money/performance) cards of the new series. Add a second card, for a total pair price of what you would spend on a single Fury X, and outperform it.

Just my 2 cents.


----------



## jackalopeater

Quote:


> Originally Posted by *By-Tor*
> 
> I to think the 390's are the sweet spot (money/performance) cards for the new series. Add a second card with a total price for the pair of what you would spend on a single Fury X and out perform it.
> 
> just my 2 cents


I 100% agree, even made that statement several times myself


----------



## By-Tor

Quote:


> Originally Posted by *jackalopeater*
> 
> I 100% agree, even made that statement several times myself


I have thought about replacing my pair of 290X's with a pair of 390's, but unless I go to a 4k monitor I really don't see a huge jump in performance and would rather wait to see what the next gen. brings us.


----------



## battleaxe

Quote:


> Originally Posted by *By-Tor*
> 
> I have thought about replacing my pair of 290X's with a pair of 390's, but unless I go to a 4k monitor I really don't see a huge jump in performance and would rather wait to see what the next gen. brings us.


Personally, going from a 290X to a 390 is not worth it even at 4K. Just IMO from the data I have seen; it seems a mostly lateral move. A 390X, okay, maybe worth it.


----------



## Dundundata

Quote:


> Originally Posted by *kalidae*
> 
> nice! what's the difference between the rm and rmi? Is it just digital or something? I haven't checked those out yet.


check this out

RMi750


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *By-Tor*
> 
> I have thought about replacing my pair of 290X's with a pair of 390's, but unless I go to a 4k monitor I really don't see a huge jump in performance and would rather wait to see what the next gen. brings us.
> 
> 
> 
> Personally, going from a 290x to 390 is not worth it even on 4k. Just IMO from the data I have seen. Seems a mostly lateral move. 390x okay, maybe worth it.
Click to expand...

Upgrading from a pair of 290Xs to a pair of 390s seems like a good idea if the games use more than 4GB at 4K. Then again, with just a pair of 390s you'd probably need to tone down the settings a little to keep a comfortable FPS at 4K, which probably means they'd use less than 4GB anyway. He could overclock the cards, but cooling will be an issue with a pair of heaters in there.


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> Upgrading from a pair of 290X to a pair of 390 seems like good idea if the games use more than 4GB at 4K. Then again with just a pair of 390, probably need tone down the settings a little bit to keep comfortable FPS at 4K which probably means it may use less than 4GB. He could overclock the cards but cooling will be an issue with a pair of heater in there.


What games use more than 4GB? Batman? What else? For the games I play this would definitely be a non-issue... I guess it does depend, though...


----------



## By-Tor

Quote:


> Originally Posted by *kizwan*
> 
> Upgrading from a pair of 290X to a pair of 390 seems like good idea if the games use more than 4GB at 4K. Then again with just a pair of 390, probably need tone down the settings a little bit to keep comfortable FPS at 4K which probably means it may use less than 4GB. He could overclock the cards but cooling will be an issue with a pair of heater in there.


At the moment I'm using just a single Asus 24" 144Hz 1080p monitor, and have thought about going with a 27" 144Hz 1440p monitor in the near future. Not worried about the heat, because whatever card I get will be watercooled. My 290Xs run at 1200/1600MHz without issue in my loop and I have never seen them hit 50°C; they mostly stay in the low 40s for most things like gaming and only heat up in benches...


----------



## flopper

Quote:


> Originally Posted by *jackalopeater*
> 
> I still stand that if I were putting my own money up, the R9 390 would be the card I would look at honestly


Quote:


> Originally Posted by *By-Tor*
> 
> I to think the 390's are the sweet spot (money/performance) cards for the new series. Add a second card with a total price for the pair of what you would spend on a single Fury X and out perform it.
> 
> just my 2 cents


My 290 died in April.
I was gonna buy a Fury, but my screens were too old, so a new Acer XG270HU (2560x1440, 144Hz) was needed. I am so glad I did that; with the new screen it was like the sun opened up.
So my old 6850 had to spend some time working again.

I thought about what I actually use, and for now it made the most sense to buy a 390 versus a Fury. I will, however, upgrade to a next-generation 16nm HBM2 card once those are out, as I plan to run 3x 2560x1440 screens in Eyefinity some time next year.

Today, though, I went and got a Sapphire Nitro 390 8GB card.
First impression: dead silent so far.


----------



## CaptainZombie

Is the Nitro R9 390 a better card than the MSI 390, or are they neck and neck for the most part? I think the Sapphire would probably be cutting it close in my case (250D).


----------



## By-Tor

Very nice....


----------



## flopper

Quote:


> Originally Posted by *CaptainZombie*
> 
> Is the NITRO R9 390 a better card than the MSI 390 or are they neck and neck for most part? I think the Sapphire probably would be cutting it close in my case (250D).


MSI cards seem to OC well, as do XFX's.
The Sapphire card was the value-priced one, so that was the choice for me this time; XFX was an option also.
I figure if it clocks to 1100MHz or in that ballpark, it's a toss-up anyway for gaming.


----------



## Higgenbobber

I've tested out the Sapphire, MSI, and Gigabyte versions of the card so far, all in the same setup.

Got the Gigabyte first; no problems really, except limited overclocking.

Tried the MSI next and the temperatures drove me crazy. It was consistently reaching over 80 degrees, and almost 90, in any game I played, and idle temperatures were above 60.

I returned the Gigabyte and got the Sapphire Nitro card, and I think I'm sticking with this one-- the temperatures are MUCH lower than the MSI's (on full load I would usually be around 71-73), and for some reason an overclock of 1100 on the Sapphire got higher scores than 1135 on the MSI in Heaven benchmarking.

Sidenote: although the Gigabyte is factory voltage-locked, one thing it does have going for it is that it's the smallest 390 on the market (I think). And its temperatures were decent, lower than the MSI card's.


----------



## CC268

For any of you gaming on a 1440p monitor: how is your 390/390X doing? Will the 390/390X handle games well at 1440p for the next 2 years or so?

I am thinking about getting a 390 or 390X to tide me over until Pascal comes out. Can't seem to pony up and spend the cash on a 980 Ti with Pascal coming out next year.


----------



## kizwan

Quote:


> Originally Posted by *By-Tor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Upgrading from a pair of 290X to a pair of 390 seems like good idea if the games use more than 4GB at 4K. Then again with just a pair of 390, probably need tone down the settings a little bit to keep comfortable FPS at 4K which probably means it may use less than 4GB. He could overclock the cards but cooling will be an issue with a pair of heater in there.
> 
> 
> 
> At the moment I'm using just a single Asus 24" 144hz, 1080p monitor and have thought about going with a 27" 144hz, 1440p monitor in the near future. Not worried about the heat because whatever card I get it will be watercooled. My 290x's run at 1200/1600mhz without issue in my loop and I have never seen them hit 50c, they mostly stay in the low 40's for most things like gaming and only heat up in benches...
Click to expand...

I think it's probably a good idea to hold off on the GPU upgrade for now and upgrade the monitor instead.

If you go ahead with the GPU upgrade, are you going to get a Powercolor 390? You know I would be able to beat that too, right? LOL J/K








Quote:


> Originally Posted by *Higgenbobber*
> 
> I've tested out the Sapphire, MSI, and Gigabyte versions of the card so far all in the same setup.
> 
> Got the gigabyte first, no problems really except limited overclocking
> 
> Tried out the MSI next and the temperatures drove me crazy. It was consistently reaching over 80 degrees and almost 90 in any game i played, and idle temperatures were above 60.
> 
> I returned the gigabyte and got the sapphire nitro card and I think I'm sticking with this one-- the temperatures are MUCH lower than the MSI (on full load I would usually be around 71-73), and for some reason an overclock of *1100 on the sapphire got higher scores than 1135 MSI in Heaven* benchmarking.
> 
> Sidenote: Although the gigabyte is factory locked in voltage, one thing it does have going for it is the fact it's the smallest 390 on the market (i think). And the temperatures were decent and lower than the MSI card.


----------



## flopper

Quote:


> Originally Posted by *CC268*
> 
> For any of you gaming on a 1440p monitor - how is your 390/390X doing? Will the 390/390X handle games well on a 1440p for the next 2 years or so?
> 
> I am thinking about getting a 390 or 390X to hold me off until Pascal comes out. Can't seem to pony up and spend the cash on a 980TI with Pascal coming out next year.


It should handle it just fine for the next 2 years.
DX12 games will make it shine even more.


----------



## Higgenbobber

I'll post the Heaven scores when I get home from work this evening. What's weird is that it was consistent, too.


----------



## By-Tor

Quote:


> Originally Posted by *kizwan*
> 
> I think probably good idea to hold off gpu upgrade for now & upgrade the monitor instead.
> 
> If you go ahead with the gpu upgrade, are you going to get Powercolor 390? You know I would be able to beat that too, right? LOL J/K


Powercolor would be my first choice, yes.... I've owned the last 4 generations of Powercolor cards and have had great luck with them. I did RMA one of my 7950s and received great service.
There is a chance my 290X LCS factory waterblocks may fit the Powercolor PCS+ 390 cards. Waiting to hear back from EK on this.

I didn't know this was a competition!!


----------



## Mr.Pie

Quote:


> Originally Posted by *CC268*
> 
> For any of you gaming on a 1440p monitor - how is your 390/390X doing? Will the 390/390X handle games well on a 1440p for the next 2 years or so?
> 
> I am thinking about getting a 390 or 390X to hold me off until Pascal comes out. Can't seem to pony up and spend the cash on a 980TI with Pascal coming out next year.


I'm playing on a single 390 on 1440p

going great so far. Haven't fired up witcher 3 yet but i can max Arkham knight without any issues (gameworks off)

Planetside 2 runs great with a few settings toned down


----------



## Geoclock

Anyhow, I found out only two 6-pin PCIe cords come out of the PSU; I don't know if the rails are combined or split, but there are no other PCIe connectors to be used.
Sad; I'm moving it to my custom-built PC. I had artifacts just in Wolfenstein: The Old Blood and wanted to check whether it was a computer problem or the card itself.
All other games play fine.


----------



## Oregonduck007

Quote:


> Originally Posted by *Higgenbobber*
> 
> I've tested out the Sapphire, MSI, and Gigabyte versions of the card so far all in the same setup.
> 
> Got the gigabyte first, no problems really except limited overclocking
> 
> Tried out the MSI next and the temperatures drove me crazy. It was consistently reaching over 80 degrees and almost 90 in any game i played, and idle temperatures were above 60.
> 
> I returned the gigabyte and got the sapphire nitro card and I think I'm sticking with this one-- the temperatures are MUCH lower than the MSI (on full load I would usually be around 71-73), and for some reason an overclock of 1100 on the sapphire got higher scores than 1135 MSI in Heaven benchmarking.
> 
> Sidenote: Although the gigabyte is factory locked in voltage, one thing it does have going for it is the fact it's the smallest 390 on the market (i think). And the temperatures were decent and lower than the MSI card.


Quite the opposite for me. My MSI R9 390 runs around 70 on full load and around 36 idle (while OC'd at 1130/1640). Though the fans are a bit loud at times, which can take away from a gaming session; I don't mind it because I use headphones. Overall the OC ability of the MSI 390, matched with the sexy backplate and Frozr cooler, makes the card just right for me.


----------



## kalidae

Quote:


> Originally Posted by *Dundundata*
> 
> check this out
> 
> RMi750


They somehow took the RM series and made it even better. Very well done, Corsair. When I upgrade to 390 Crossfire I'll have to upgrade my PSU as well; looks like an RMi will be my choice. The only thing I don't like about the RM is the cables... they are a real pain to manage, being the flat, stiff rubber type with very little flexibility. I use cable extensions and an N450 case, so it's not an issue anymore, but in my 250D without extensions it was really hard to get neat cable management. They kept the same cable type with the RMi.


----------



## gerpogi

Here we go


----------



## kalidae

Quote:


> Originally Posted by *gerpogi*
> 
> Here we go


That looks awesome. Very nice build.


----------



## gerpogi

Quote:


> Originally Posted by *kalidae*
> 
> That looks awesome. Very nice build.


Thank you! It took a lot of work wrapping the case with carbon fibre; it's actually my first build. And the cable management on this case is a nightmare... but it looks very sleek and awesome, so it's well worth it.


----------



## CC268

Thanks for the responses guys...tough decision!


----------



## Sgt Bilko

Quote:


> Originally Posted by *gerpogi*
> 
> Here we go


Very nice looking build you got there









How you liking the XFX card so far?


----------



## Agent Smith1984

I'm all but about to just smash what's left of my computer.....

I dunno if it's a DOA new board, a bad PSU, or a bad CPU, but I'm just disgusted with the wrench I feel like this assrock board has thrown in my gears.

Testing the new PSU tomorrow...

Next I'll try a CPU swap...


----------



## Rob27shred

Quote:


> Originally Posted by *kizwan*
> 
> Is anyone game for running Firestrike @1100/1700? Just want to see graphics score from all brands at same clock. Any card should be able to do 1100/1700, right?
> There's no successful story of unlocking 390 to 390X so far.


----------



## Higgenbobber

Quote:


> Originally Posted by *kizwan*
> 
> I think probably good idea to hold off gpu upgrade for now & upgrade the monitor instead.
> 
> If you go ahead with the gpu upgrade, are you going to get Powercolor 390? You know I would be able to beat that too, right? LOL J/K


Here are my comparisons for Sapphire and MSI core overclocking-- I ran benchmarks at 1100, 1110, and 1135. And yes, these are accurate: same setup and all, nothing mislabeled, and the results were repeatable.

MSI 1100:


Sapphire 1100:


MSI 1110:


Sapphire 1110

MSI 1135:


Sapphire 1135:


So yeah, the Sapphire OC'd to 1100 on my computer is better than the MSI OC'd to 1135-- craaaaazy.


----------



## Oregonduck007

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm all but about to just smash whats left of my computer.....
> 
> I dunno if it's a DOA new board, bad psu, or bad CPU but I'm just disgusted with the wrench i feel like this assrock board had thrown in my gears.
> 
> Testing new psu tomorrow...
> 
> Next I'll try cpu swap...


Sorry to hear







. hope for the best!


----------



## flopper

1500 points more than my old crashed-and-burned 290 in Firestrike graphics score.


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Higgenbobber*
> 
> Here are my comparisons for Sapphire and MSI core overclocking-- I ran benchmarks at 1100, 1110, and 1135. And yes these are accurate, same setup and all, didn't mislabel anything and was repeatable.
> 
> MSI 1100:
> 
> 
> Sapphire 1100:
> 
> 
> MSI 1110:
> 
> 
> Sapphire 1110
> 
> MSI 1135:
> 
> 
> Sapphire 1135:
> 
> 
> So yeah, sapphire OC at 1100 clock for my computer is better than MSI OC at 1135-- craaaaazy






Impressive. So is this a 390x I assume? Nice scores BTW...


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm all but about to just smash whats left of my computer.....
> 
> I dunno if it's a DOA new board, bad psu, or bad CPU but I'm just disgusted with the wrench i feel like this assrock board had thrown in my gears.
> 
> Testing new psu tomorrow...
> 
> Next I'll try cpu swap...






That blows dude. Totally.


----------



## Higgenbobber

Quote:


> Originally Posted by *battleaxe*
> 
> 
> Impressive. So is this a 390x I assume? Nice scores BTW...
> 
> That blows dude. Totally.


They're both the standard 390s from MSI and Sapphire. And just so you know, I didn't use one of Heaven's presets-- they're custom settings.


----------



## gerpogi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Very nice looking build you got there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How you liking the XFX card so far?


It's very good in terms of performance. It tends to get a bit hot though, because my PC case is somewhat restrictive. I used to have a GTX 980 and I don't see any difference compared to it aside from Metal Gear Solid, so I'm happy.


----------



## battleaxe

Quote:


> Originally Posted by *Higgenbobber*
> 
> They're both the standard 390s from MSI and Sapphire. And just so you know I didn't use one of Heaven's presets-- they're custom settings


Oh I see. I was thinking those were Extreme tessellation like I run mine... bummer. I was hoping those cards killed mine so I could go buy a new one.


----------



## kalidae

Quote:


> Originally Posted by *battleaxe*
> 
> 
> Impressive. So is this a 390x I assume? Nice scores BTW...
> 
> That blows dude. Totally.


Tessellation is on Normal; that's why the scores look like 390X scores.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gerpogi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Very nice looking build you got there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How you liking the XFX card so far?
> 
> 
> 
> its very good in terms of performance. it tends to get abit hot though because my pc case is somewhat restrictive. i used to have a gtx 980 and i dont see any difference compared to it aside from metal gear solid . so im happy
Click to expand...

Awesome stuff









Always great to see people happy with their purchase


----------



## battleaxe

Quote:


> Originally Posted by *kalidae*
> 
> Tesselation is on normal that's why the scores look like 390x scores.


I want to cry now because you just ruined all my fun. (puts wallet back in pocket)...


----------



## Rob27shred

Quote:


> Originally Posted by *flopper*
> 
> 1500p more than my old crashed and burned 290 in firestrike graphics score.


Nice (the higher score, not your card dying); that shows these 300 series cards are more than just a rebrand. It makes me feel better too, because I was almost gonna go with a 290X to save money. Sorry to hear about your card though; a 390/X would make a great replacement!


----------



## kalidae

Quote:


> Originally Posted by *battleaxe*
> 
> I want to cry now because you just ruined all my fun. (puts wallet back in pocket)...


Haha, just be happy with your beastly 290X, then get your wallet back out and buy a different hardware upgrade.







If you are looking to buy stuff and want a new case, get yourself a Noctis 450. I can't recommend this case enough, and I'm the type of person that buys a new case every couple of months... I'm sticking with this thing until the day I die... or until they release a new model, hehe.


----------



## Dorland203

My MSI 390X at 1175/1625, +50mV/+0mV core/AUX voltage, +50% power limit.

Ambient temp is 30°C. My case is a Cooler Master Elite 431, which has very poor airflow: 5 HDDs block the frontal intake, and messy cables block the bottom airflow. The Corsair H80i GT AIO cooler occupies the rear and one of the top exhaust fan positions, which leaves me one top fan for exhaust. I have to take the side panel off. Max core temp is 74°C; max VRM temp is 75°C.








Edit:
Running Firestrike at the same clocks.


----------



## kalidae

My Firestrike score for my Nitro 390 at exactly 1100/1700, for whoever wanted to compare different cards at the same clocks.

Graphics score: 13354
Graphics test 1: 63.56
Graphics test 2: 53.44


----------



## diggiddi

^Could you rock 1180/1550 thx^


----------



## kalidae

Quote:


> Originally Posted by *diggiddi*
> 
> ^Could you rock 1180/1550 thx^


Yep running it now



Sapphire Nitro 1180/1550
Graphics score: 13931
Graphics test 1: 66.51
Graphics test 2: 55.60

----------



## diggiddi

Quote:


> Originally Posted by *kalidae*
> 
> Yep running it now
> 
> 
> 
> Sapphire nitro 1180/1550
> Graphics score 13931
> Graphics test 1 66.51
> Graphics test 2 55.60


Thanks, +rep.
OK, I got 12085; updated to the latest version and got 12197 @ 4.4GHz.
What's your OS?


----------



## kizwan

Quote:


> Originally Posted by *Rob27shred*
> 
> .
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Is anyone game for running Firestrike @1100/1700? Just want to see graphics score from all brands at same clock. Any card should be able to do 1100/1700, right?
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
Click to expand...

Thanks!







Why is the core clock reported as 1121?
Quote:


> Originally Posted by *kalidae*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> My firestrike score for my nitro 390 @1100/1700 exactly for whoever wanted to compare different cards at the same clocks.
> 
> 13354
> 63.56
> 53.44


Thanks!








Quote:


> Originally Posted by *Higgenbobber*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I think probably good idea to hold off gpu upgrade for now & upgrade the monitor instead.
> 
> If you go ahead with the gpu upgrade, are you going to get Powercolor 390? You know I would be able to beat that too, right? LOL J/K
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here are my comparisons for Sapphire and MSI core overclocking-- I ran benchmarks at 1100, 1110, and 1135. And yes these are accurate, same setup and all, didn't mislabel anything and was repeatable.
> 
> 
> 
> 
> 
> MSI 1100:
> 
> 
> Sapphire 1100:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI 1110:
> 
> 
> Sapphire 1110
> 
> 
> 
> 
> 
> 
> 
> 
> MSI 1135:
> 
> 
> Sapphire 1135:
> 
> 
> 
> 
> So yeah, sapphire OC at 1100 clock for my computer is better than MSI OC at 1135-- craaaaazy

Thanks. That's pretty consistent.


----------



## kalidae

Quote:


> Originally Posted by *diggiddi*
> 
> Thanks +rep
> Ok I got 12085 updated to latest version and got 12197 @4.4ghz
> Whats your OS?


I'm running Windows 8.1 64-bit, with the latest non-beta drivers. From memory this Nitro 390 at stock clocks (1010/1500) scores 1200... I think. I'm out right now, but I'll run it at stock and post it up when I get home.


----------



## diggiddi

Quote:


> Originally Posted by *kalidae*
> 
> I'm running windows 8.1 64bit. I'm using the latest non beta drivers. From memory this card nitro 390 at stock clocks 1010/1500 score 1200...I think. I'm out right now but I'll run it at stock and post it up when I get home.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> Why core clock is reported as 1121?
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks. That's pretty consistent.


If I'm not confusing the numbers (graphics scores), at that OC the 390 matches a stock 390X (depending on the brand's base core clock). The 290 with the original BIOS needs 1200 on the core to match both.

Here was my 290 at 1200...

http://www.3dmark.com/3dm/7690148?

A VRAM OC adds a bit of difference too, as does the OS.


----------



## kalidae

Quote:


> Originally Posted by *kizwan*
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> Why core clock is reported as 1121?
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks. That's pretty consistent.


Rob's Firestrike run was actually at 1121/1700, 21MHz more than my run at exactly 1100/1700, yet it scored only 16 points higher on graphics than my Nitro; if I added another 20MHz of core clock, mine would be higher than that. I'm starting to think some cards run better than others, OR Firestrike is not the best way to compare graphics scores.

Edit: Nitro 390 at 1121/1700 to match Rob's clocks.
Graphics score: 13585
Test 1: 64.58
Test 2: 54.42
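One way to sanity-check whether a 21MHz clock gap explains a score gap is a linear-scaling estimate. This is a rough upper bound (graphics scores rarely scale perfectly with core clock), and `expected_score` is just an illustrative helper, not anything from Firestrike:

```python
def expected_score(base_score, base_clock_mhz, new_clock_mhz):
    """Upper-bound estimate: assume graphics score scales linearly
    with core clock (memory clock held constant)."""
    return base_score * new_clock_mhz / base_clock_mhz

# kalidae's Nitro: 13354 at 1100 MHz -> linear estimate for 1121 MHz
print(round(expected_score(13354, 1100, 1121)))  # -> 13609
```

The measured 13585 at 1121/1700 lands within about 0.2% of the linear estimate, so the Nitro scales about as well as clocks alone predict; a card scoring only ~13370 at the same clocks really is underperforming.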


----------



## kalidae

Quote:


> Originally Posted by *diggiddi*


Okay, Nitro 390 at stock clocks 1010/1500:
Graphics score: 12270
Test 1: 58.27
Test 2: 49.20



What card are you running?


----------



## diggiddi

Quote:


> Originally Posted by *kalidae*
> 
> Okay nitro 390 stock clocks 1010/1500
> Graphics score 12270
> Test 1 58.27
> Test 2 49.20
> 
> 
> 
> What card are you running?


290x lightning what is your cpu and clock speed?


----------



## kalidae

Quote:


> Originally Posted by *diggiddi*
> 
> 290x lightning what is your cpu and clock speed?


It's a 4690K @4.5GHz.

This is my score with the 4690K @4.7GHz and Nitro 390 @1200/1750; the GPU was too hot to keep these clocks though.



The 290x lighting is a very nice card.


----------



## Darkstalker420

Have to say I'm impressed with this thing







Just one thing is bugging me slightly, and I wondered if anyone could clear the issue up for me. Understand I have upgraded from a 6870, so a lot of what this card does/can do is a mystery to me as yet LMAO!!

Installed the 15.8 drivers and set the options to my liking. I also set the frame rate limiter to 60FPS. After driving about in GTA V (1080p max settings except "advanced" options) for a while, I exited and loaded up Afterburner to see my core clock NEVER hitting 1015MHz?!? It hovered around the high 900s but in some cases dropped to as low as 744MHz!! Temps were VERY low (custom fan profile, under 50C!!) so I doubt it was temp throttling. At no point during the game was it apparent the card was running slower. No stutter/hitching etc.

Now I would like full performance out of it if possible, but wondered if this was a result of activating the FPS limit? Then I guess the card wouldn't need 100% all the time. Just wanted to find out if it IS an issue or how it's meant to be. No OC, and power limit set to 0 at the time it happened.

Thanx.


----------



## kalidae

Quote:


> Originally Posted by *Darkstalker420*
> 
> Have to say im impressed with this thing
> 
> 
> 
> 
> 
> 
> 
> Just one thing is bugging me slightly and wondered if anyone could clear the issue up for me. But understand i have upgraded from a 6870 so a lot of what this card does/can do is a mystery to me as yet LMAO!!
> 
> Installed 15.8 drivers and set options to my liking. I also set the frame rate limiter to 60FPS. After driving about in GTA V (1080p max settings except "advanced" options) for a while i exited and loaded up Afterburner to see my core clock NEVER hitting 1015Mhz?!? It hovered around the high 900's but in some cases dropped to as low as 744Mhz!! Temps were VERY low (custom fan profile under 50c!!) so i doubt it was temp throttling. At no point during the game was it apparent the card was running slower. No stutter/hitching etc.
> 
> Now i would like full performance out of it if possible but wondered if this was a result of activating the FPS limit? Then i guess the card wouldn't need 100% all the time. Just wanted to find out if it IS an issue or how it's meant to be. No OC and powerlimit set to 0 at the time it happened.
> 
> Thanx.


I'm not exactly sure what's going on there, but it might have something to do with the GPU not needing 100% usage to run that game at those settings at just 1080p. Run GTA 5 with the advanced settings maxed as well and see if your GPU runs at 100%. I don't think you are putting enough load on the card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darkstalker420*
> 
> Have to say im impressed with this thing
> 
> 
> 
> 
> 
> 
> 
> Just one thing is bugging me slightly and wondered if anyone could clear the issue up for me. But understand i have upgraded from a 6870 so a lot of what this card does/can do is a mystery to me as yet LMAO!!
> 
> Installed 15.8 drivers and set options to my liking. I also set the frame rate limiter to 60FPS. After driving about in GTA V (1080p max settings except "advanced" options) for a while i exited and loaded up Afterburner to see my core clock NEVER hitting 1015Mhz?!? It hovered around the high 900's but in some cases dropped to as low as 744Mhz!! Temps were VERY low (custom fan profile under 50c!!) so i doubt it was temp throttling. At no point during the game was it apparent the card was running slower. No stutter/hitching etc.
> 
> Now i would like full performance out of it if possible but wondered if this was a result of activating the FPS limit? Then i guess the card wouldn't need 100% all the time. Just wanted to find out if it IS an issue or how it's meant to be. No OC and powerlimit set to 0 at the time it happened.
> 
> Thanx.


Take off the FPS limiter and it should go to full speed, if not then bump up your power limit to +20% in CCC or MSI afterburner


----------



## Darkstalker420

+Rep. Cheers mate









Thanx.


----------



## flopper

Quote:


> Originally Posted by *Rob27shred*
> 
> Nice (the higher score not your card dying) that shows these 300 series are more than just a re-brand. It makes me feel better too cause I was almost gonna go with a 290X to save money. Sorry to hear about your card though, a 390/X would make a great replacement!


Yea, I chose a new card with a warranty and an updated process node.
The 8GB I don't regard as a selling point, as 4GB would be plenty.
It's silent and runs cool on both core and VRM,
way different from my old reference 290, which did a good job, but I did push it for a long time.
Died in battle; what else can one wish for?








Quote:


> Originally Posted by *kalidae*
> 
> 
> 
> My firestrike score for my nitro 390 @1100/1700 exactly for whoever wanted to compare different cards at the same clocks.
> 
> 13354
> 63.56
> 53.44


My scores match those. Twins?


----------



## kalidae

Quote:


> Originally Posted by *flopper*
> 
> Yea I choose a new card having warranty and an updated process node.
> 8gb I dont regard as an option as 4gb would be plenty.
> its silent and runs cool both on core and vrm.
> way different than my old 290 reference that did a good job but I did push it for a long time.
> Died in battle what else can one wish for?
> 
> 
> 
> 
> 
> 
> 
> 
> My scores match those. Twins?


We will be identical twins if u have a nitro hehe


----------



## navjack27

I'm going for a GREAT overclock on my MSI R9 390X. So far I've noticed I should avoid going 1650 on memory; I get a better score about 10MHz below that. Right now, testing with Unigine Valley at 1180/1645, I'm getting:
FPS: 75.1
Score: 3144
Min FPS: 34.4
Max FPS: 143.9

Temps are OK, nothing crazy high. I'm testing 1200/1640 right now. Wish me luck. I'll post with voltages and screens of results once I'm sure of how high I can get.

EDIT: OMG, 1200/1640 with +90ish on the voltage. BLUE SQUARES, but not that bad; it finished the run with no issue or crash. I'll up the voltage, but here:
FPS: 75.9
Score: 3175
Min FPS: 34.7
Max FPS: 146.0


----------



## Sgt Bilko

Sooo......I know nobody asked the question and we all know but it's nice to have confirmation.....

390x will Quadfire with a 290x and 295x2











Just thought I'd have some fun and try it out









EDIT: Yes i know it isn't tidy and no i don't care about that.


----------



## navjack27

how did you power that @_@


----------



## Sgt Bilko

Quote:


> Originally Posted by *navjack27*
> 
> how did you power that @_@


Rig and 390x are running off my XFX 850w PSU, 295x2 and 290x are running off a Corsair AX1200i


----------



## kalidae

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sooo......I know nobody asked the question and we all know but it's nice to have confirmation.....
> 
> 390x will Quadfire with a 290x and 295x2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just thought I'd have some fun and try it out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Yes i know it isn't tidy and no i don't care about that.


Haha, Frankenstein PC. Beast.


----------



## Mr.Pie

Quick question: what's the performance "hit" if you Crossfire an R9 390 and a 390X together?

Helping a friend piece together his build, and it turns out Amazon sent him a 390X instead of a 390... and they're letting him keep it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mr.Pie*
> 
> quick question - whats the performance "hit" if you x-fire a R9 390 and 390X together?
> 
> Helping a friend piece together his build and turns out Amazon sent him a 390X instead of a 390....and they're letting him keep it.


Not much; they run at independent clock speeds, so it will be a little faster than R9 390 Crossfire but a little slower than R9 390X Crossfire.

Just tell him to put the 390X in the top slot for games that don't support Crossfire and you'll be good to go


----------



## Mr.Pie

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not much, they run at independent clock speeds so it's will be a little faster than R9 390 crossfire but a little but slower than R9 390x Crossfire.
> 
> Just tell him to have the 390x in the top slot for games that don't support Crossfire and you'll be good to go


cheers.

How would CFX run then? I haven't set up a multi-GPU system in ages... last time I set one up you had to have the same clocks on both cards... does the same still apply?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mr.Pie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Not much, they run at independent clock speeds so it's will be a little faster than R9 390 crossfire but a little but slower than R9 390x Crossfire.
> 
> Just tell him to have the 390x in the top slot for games that don't support Crossfire and you'll be good to go
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> cheers.
> 
> how would CFX run then? I haven't setup a multi-GPU setup in ages.....last time I set 1 up you had to have the same clocks for both cards.....does the same still apply?

It's pretty much plug and play nowadays; the 390 will either clock itself up to match the 390x or they will each run at whatever their stock clocks are.

I'm running a 295x2, 290x and 390x atm, and the clock speeds are sitting at 1030/1300, 1030/1300, 1000/1250 and 1050/1500; they run at those speeds in everything I've tested so far


----------



## navjack27

I can't get anything stable without artifacting over 1150/1650 with +50 on core and +50% power target; even that is cutting it close. I CAN hit 1200 core, but it's not what I deem "safe" for normal usage. Hmmm, well I tried; I'm not the best at these things though.


----------



## Geoclock

My MSI 390X also has similar results, no artifacts under 1150/1650.
So what does that mean?
Do regular 390s OC better than the 390X?
ASIC is 79.4


----------



## Rob27shred

Quote:


> Originally Posted by *flopper*
> 
> Yea I choose a new card having warranty and an updated process node.
> 8gb I dont regard as an option as 4gb would be plenty.
> its silent and runs cool both on core and vrm.
> way different than my old 290 reference that did a good job but I did push it for a long time.
> Died in battle what else can one wish for?

Very true, glad to hear it died an honorable death. The Spartan way, right!


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> EDIT: Yes i know it isn't tidy and no i don't care about that.


Mine looks like you emptied a trashcan and threw in Tarzan's lianas and a jungle without the apes.









Quote:


> Originally Posted by *Mr.Pie*
> 
> quick question - whats the performance "hit" if you x-fire a R9 390 and 390X together?
> 
> Helping a friend piece together his build and turns out Amazon sent him a 390X instead of a 390....and they're letting him keep it.


It normally sits between the two in scores.

Quote:


> Originally Posted by *navjack27*
> 
> i cant get anything stable without artifacting over 1150/1650 with +50 on core and +50% power target. even that is cutting it close. i CAN hit 1200 core but its not what i deem "safe" for normal usage. hmmm. well i tried, i'm not the best at these things tho.


Quote:


> Originally Posted by *Geoclock*
> 
> Mine MSI 390x also has similar results, no artifacts under 1150/1650 .
> So what it means?
> Regular 390 are better OC than 390x ?
> ASIC is 79.4


As speculated here, AMD likely used a different power/die process, so we get a more stable card that's also lacking a bit in OC headroom.
I run mine atm at 1140/1575MHz and have no need to go higher for daily gaming use.


----------



## kizwan

Quote:


> Originally Posted by *Darkstalker420*
> 
> Have to say im impressed with this thing
> 
> 
> 
> 
> 
> 
> 
> Just one thing is bugging me slightly and wondered if anyone could clear the issue up for me. But understand i have upgraded from a 6870 so a lot of what this card does/can do is a mystery to me as yet LMAO!!
> 
> Installed 15.8 drivers and set options to my liking. I also set the frame rate limiter to 60FPS. After driving about in GTA V (1080p max settings except "advanced" options) *for a while i exited and loaded up Afterburner to see my core clock NEVER hitting 1015Mhz*?!? It hovered around the high 900's but in some cases dropped to as low as 744Mhz!! Temps were VERY low (custom fan profile under 50c!!) so i doubt it was temp throttling. At no point during the game was it apparent the card was running slower. No stutter/hitching etc.
> 
> Now i would like full performance out of it if possible but wondered if this was a result of activating the FPS limit? Then i guess the card wouldn't need 100% all the time. Just wanted to find out if it IS an issue or how it's meant to be. No OC and powerlimit set to 0 at the time it happened.
> 
> Thanx.


You should run Afterburner in the background while playing games. In case I misunderstood and you actually were running Afterburner in the background: did you set the frame rate limit to 60FPS in CCC? If yes, that's probably the reason, because the GPU probably doesn't need to work hard to maintain 60FPS at 1080p.
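The downclocking described here can be sketched with some illustrative numbers. The function and figures below are hypothetical, just to show why a capped game never needs the full 1015MHz: the driver only needs enough clock to hit the cap, not the card's maximum.

```python
def required_clock(max_clock_mhz, uncapped_fps, fps_cap):
    """Estimate the core clock needed to sustain fps_cap, assuming FPS
    scales roughly linearly with core clock (a simplification)."""
    if uncapped_fps <= fps_cap:
        return max_clock_mhz  # the card is already the bottleneck
    return max_clock_mhz * fps_cap / uncapped_fps

# A 390 at 1015 MHz that could render ~105 FPS uncapped only needs
# roughly 580 MHz to hold a 60 FPS limit, hence sub-1015 MHz readings.
print(round(required_clock(1015, 105, 60)))  # -> 580
```

With the limiter off (or in a heavier scene), `uncapped_fps` drops toward the cap and the clock climbs back toward full speed, which matches the "high 900s, sometimes 744MHz" behaviour reported above.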


----------



## CaptainZombie

I'm heading to microcenter in a bit to possibly pickup the MSI 390. Since the card is 2.5 slots will I be ok in the 250D case? I've read some people had to remove the interior filter and there was just enough room. Any thoughts?

Always being an NVIDIA owner, how are the temps? I know I asked a few days ago but now my head is playing overthinking mode.


----------



## CaptainZombie

Quote:


> Originally Posted by *CaptainZombie*
> 
> I'm heading to microcenter in a bit to possibly pickup the MSI 390. Since the card is 2.5 slots will I be ok in the 250D case? I've read some people had to remove the interior filter and there was just enough room. Any thoughts?
> 
> Always being an NVIDIA owner, how are the temps? I'd hate to go back to a 970, the card just feels like a bad value at this point. I know I asked a few days ago but now my head is playing overthinking mode.


----------



## caenlen

can someone give me a short list of games that use more than 4gb of vram at 1440p with 4x msaa enabled?


----------



## Mr.Pie

Quote:


> Originally Posted by *caenlen*
> 
> can someone give me a short list of games that use more than 4gb of vram at 1440p with 4x msaa enabled?


Arkham knight fully maxed (gameworks off) peaks more than 4gb for me at 1440p. Haven't monitored my usage in other games


----------



## Oregonduck007

Quote:


> Originally Posted by *Mr.Pie*
> 
> quick question - whats the performance "hit" if you x-fire a R9 390 and 390X together?
> 
> Helping a friend piece together his build and turns out Amazon sent him a 390X instead of a 390....and they're letting him keep it.


Lucky!
wish they would have made that mistake with my card


----------



## bazookatooths

Quote:


> Originally Posted by *CaptainZombie*
> 
> I'm heading to microcenter in a bit to possibly pickup the MSI 390. Since the card is 2.5 slots will I be ok in the 250D case? I've read some people had to remove the interior filter and there was just enough room. Any thoughts?
> 
> Always being an NVIDIA owner, how are the temps? I know I asked a few days ago but now my head is playing overthinking mode.


I am so jealous of people that have a Microcenter. I'm in Oklahoma and all we have is Best Buy. It's okay, but the selection is slim; the only thing I like is being able to return anything for 15 days, no questions asked, with a receipt. I can't imagine how fun it would be shopping at that store.


----------



## flopper

Quote:


> Originally Posted by *CaptainZombie*
> 
> Always being an NVIDIA owner, how are the temps? I know I asked a few days ago but now my head is playing overthinking mode.


Seems fine to me.
I use an open case, so these aren't average user temps, but at 1140MHz BF4 didn't go over 57°C.
Really solid card, I say.


----------



## CaptainZombie

Quote:


> Originally Posted by *bazookatooths*
> 
> I am so jealous of people that have a microcenter i'm in oklahoma and all we have is bestbuy. Its okay but the selection is slim, only thing I like is being able to return anything for 15 days no questions asked w/ receipt. I can't imagine how fun it would be shopping at that store.


Quote:


> Originally Posted by *flopper*
> 
> seems fine to me.
> I use a open case so its not a average user temps but 1140mhz bf4 didnt go over 57C.
> really solid card I say.


I'm not too close to Microcenter, they are about 20 miles one way which is not bad at all. Glad we have them around as an option.

I bought the MSI 390 and was able to fit it in the 250D albeit a little snug. I had to remove the filter on the panel since the fan was hitting it. Since I have the 250D, everything is like a hot box in there. Hoping this works out as it seems like a good card.

As of right now after messing around with a bit of Mad Max at maxed out settings on 1080p, my card hit about 73 degrees at max. I'm trying Dying Light next on 1440p to see how that does.

I have an ASIC quality of 70.2%, not sure how that is for this card.


----------



## Dundundata

Quote:


> Originally Posted by *bazookatooths*
> 
> I am so jealous of people that have a microcenter i'm in oklahoma and all we have is bestbuy. Its okay but the selection is slim, only thing I like is being able to return anything for 15 days no questions asked w/ receipt. I can't imagine how fun it would be shopping at that store.


Microcenter is great! Some good deals that can compete with online


----------



## Metalcrack

Quote:


> Originally Posted by *Dundundata*
> 
> Microcenter is great! Some good deals that can compete with online


They won't price match with Newegg (etc.) on most things, other than processors, which they do match/beat.

Just picked up an open-box XFX 390 with a $20.00 Steam card for $286.00.


----------



## kalidae

Quote:


> Originally Posted by *CaptainZombie*
> 
> I'm not too close to Microcenter, they are about 20 miles one way which is not bad at all. Glad we have them around as an option.
> 
> I bought the MSI 390 and was able to fit it in the 250D albeit a little snug. I had to remove the filter on the panel since the fan was hitting it. Since I have the 250D, everything is like a hot box in there. Hoping this works out as it seems like a good card.
> 
> As of right now after messing around with a bit of Mad Max at maxed out settings on 1080p, my card hit about 73 degrees at max. I'm trying Dying Light next on 1440p to see how that does.
> 
> I have an ASIC quality of 70.2%, not sure how that is for this card.


I have a 250d. That case warped my motherboard really really bad to the point where the computer would freeze or the image would start flickering or the pc would just shutoff. Which was a shame because I really liked that case.


----------



## CaptainZombie

Quote:


> Originally Posted by *kalidae*
> 
> I have a 250d. That case warped my motherboard really really bad to the point where the computer would freeze or the image would start flickering or the pc would just shutoff. Which was a shame because I really liked that case.


I remember the first versions of this case had this issue and that Corsair were offering some kind of bracket to fix the problem. Then I guess with later revisions they fixed it. I've had this case now for a few months so hopefully I have one that was revised. The nice thing with the 250D is that I can open the front door on the case while gaming to help expel some of the heat out.

I'm doing a bit more gaming with Mad Max now at 1440p maxed out and will have to check the temps here in a bit.


----------



## kalidae

Quote:


> Originally Posted by *CaptainZombie*
> 
> I remember the first versions of this case had this issue and that Corsair were offering some kind of bracket to fix the problem. Then I guess with later revisions they fixed it. I've had this case now for a few months so hopefully I have one that was revised. The nice thing with the 250D is that I can open the front door on the case while gaming to help expel some of the heat out.
> 
> I'm doing a bit more gaming with Mad Max now at 1440p maxed out and will have to check the temps here in a bit.


The 250D is truly awesome. Yes, mine was the first revision. I didn't bother getting a bracket from Corsair because the damage was already done, and tbh I didn't trust the case anymore. It was a brand-new mITX build that only lived for 2 months before I had to replace the case and motherboard. The board warped to the point where, when I would install my heatsink and tighten all 4 corners, the heatsink would sit at an angle and not make full contact. I had to loosen some corners and tighten others and eyeball whether it looked like it was making full contact with the CPU. It made me sad because it was so small and sexy.





Then it evolved into a define r4



Then it evolved again into an N450



Started off small and ended up being huge


----------



## navjack27

Quote:


> Originally Posted by *Geoclock*
> 
> Mine MSI 390x also has similar results, no artifacts under 1150/1650 .
> So what it means?
> Regular 390 are better OC than 390x ?
> ASIC is 79.4


mine is 76.3%


----------



## CaptainZombie

Quote:


> Originally Posted by *kalidae*
> 
> The 250D is truly awesome. Yes mine was the first revision, I didn't bother getting a bracket from corsair because the damage was already done and tbh I didn't trust the case anymore. It was a brand new mitx build that only lived for 2 months before I had to replace the case and motherboard. The board warped to the point where when I would install my heatsink, tighten all 4 corners, the heatsink would sit on an angle and not make full contact. I had to losen off corners and tighten some and use my eyes to see if it looked like it was making full contact with the cpu. It made me sad because it was so small and sexy.


Thanks for sharing the pics; looks like you went from small to huge there. LOL!

You should have contacted Corsair about the motherboard issue to see what they could have done. It really sucks that it messed up your motherboard that badly. I'm hoping the weight of this card doesn't cause any issues for my motherboard.

Quote:


> Originally Posted by *navjack27*
> 
> mine is 76.3%


So I wonder if mine at 70.2% is not good then.

For watercooling, do the 290 blocks work on these 390s, or are they totally different PCBs?


----------



## Dundundata

Hello







I've been running Afterburner alongside 3dmark Firestrike and looking for some tips on overclocking. Sorry if this has been asked, I'm a bit confused about the order I should be bumping things up. Particularly what I should be setting my power limit to, and when I should be increasing Core/Aux voltage. Also should I be increasing memory or core clock first? Thanks!


----------



## By-Tor

Quote:


> Originally Posted by *CaptainZombie*
> 
> For watercooling, does the 290 blocks work on these 390's or are they totally different PCB's?


Some do, but not many.

XFX, Asus and Powercolor cards fit 290x blocks that EK already makes.

Check here for the card you have.
http://configurator.ekwb.com/


----------



## diggiddi

Max your power limit, then increase your core clock, then core voltage next, and aux voltage last.


----------



## diggiddi

Quote:


> Originally Posted by *kalidae*
> 
> Its a [email protected] 4.5ghz.
> 
> This is my score with the the 4690k @4.7ghz and nitro 390 @1200/1750 the gpu was to hot to keep these clocks though.
> 
> 
> 
> *The 290x lighting is a very nice card*.


For sure







I'l most likely pair it with a 390/X


----------



## Dundundata

Been messing with Firestrike for hours now and came up with this. I think it could be better, but it's hard increasing the core clock. Aux voltage has not been increased. Witcher 3 runs at 77°C @ 75% fan with these settings.


----------



## MTDEW

Quote:


> Originally Posted by *Dundundata*
> 
> Hello
> 
> 
> 
> 
> 
> 
> 
> I've been running Afterburner alongside 3dmark Firestrike and looking for some tips on overclocking. Sorry if this has been asked, I'm a bit confused about the order I should be bumping things up. Particularly what I should be setting my power limit to, and when I should be increasing Core/Aux voltage. Also should I be increasing memory or core clock first? Thanks!


Max the Power Limit slider first to prevent the card from throttling due to not enough power.
Note the name Power *LIMIT*... maxing this does *not* make the GPU use more power, it just allows the GPU access to all the power if it needs it.
So you can safely max the Power Limit slider and the GPU will still only use the power it needs to maintain its clock speed... so there really isn't any reason not to max this setting and leave it maxed (unless of course you WANT the GPU to throttle and use less power).
*Note:* Use Afterburner's/RivaTuner's on-screen display to be sure the core isn't throttling while testing.

Then work on the core clock first, leaving the memory clock alone.
Find your max core clock first, then set it back to default and test your memory clocks.
Once you know your max stable core and memory clocks individually, test both overclocks together to find a "happy medium" that's stable.
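The search described above (raise one clock at a time, keep the last stable value) can be sketched as a small loop. `apply_setting` and `is_stable` here are placeholders for whatever you actually drive the card and stress test with (an Afterburner profile plus a benchmark run), not a real API:

```python
def find_max_stable(apply_setting, is_stable, start, step, limit):
    """Raise a clock in `step` MHz increments until the stress test
    fails or `limit` is reached; return the last stable value."""
    best = start
    clock = start + step
    while clock <= limit:
        apply_setting(clock)  # e.g. push the new clock via your OC tool
        if not is_stable():   # e.g. run a benchmark, watch for artifacts
            break
        best = clock
        clock += step
    return best
```

Run it once for the core with memory at stock, once for memory with the core at stock, then back both off slightly when testing the combined overclock, as described above.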


----------



## flopper

Quote:


> Originally Posted by *MTDEW*
> 
> i
> Max the Power Limit slider first to prevent the card from throttling due to not enough power.
> Note the name Power *LIMIT*...maxing this does *not* make the GPU use more power, it just allows the GPU access to all the power if it needs it.
> So you can safely max the Power Limit slider and the GPU will still only use the power it needs to maintain its clock-speed....so there really isn't any reason not to max this setting and leave it maxed. (unless of course you WANT the GPU to throttle and use less power of course)
> *Note:* Use Afterburners/Rivatuners on-screen display to be sure the core isn't throttling while testing.
> 
> Then work on core clock first , leaving the memory clock alone.
> Find your max core clock first, then set it back to default and test your memory clocks.
> Once you know your max stable core and memory clocks individually, then test both overclocks together to find a "happy medium" that's stable.


One can check the first page and make an educated guess also.
So for me that'd be: run the RAM at 1600, or in my case just about that, 1575MHz,
with a core clock of 1140MHz as a 24/7 setting that works fine for gaming.
Haven't tested the max, as I don't run many benchmarks; I just want it to work for gaming.

Really impressed with this 390 though; my old reference 290 was good, but this new 390 version absolutely rocks.
If AMD had released this version back in the day it would be their biggest seller yet.


----------



## MitsosTheGreat

Hi! My card is the Sapphire R9 390 Nitro, and I have a big problem with OC. To get 1100MHz on the core I have to set the voltage to +100mV; if I give it less mV I see artifacts. Is that normal?


----------



## _ray_

Hi guys, I have come back to AMD (MSI 390) after swapping out my 970 (got fed up with NVidia's lies), and it has been a great card so far. I had a few hiccups with black screens and driver crashes, but I fixed it all by clean installing Windows 10.

Since I am quite new to AMD cards, I'd like to ask a few questions about getting the best out of my GPU:

-What are the best options to set up in the control centre for optimal gaming?
-Do I need any utilities to unlock special settings, like with NVidia?
-Finally, I have managed to OC my card to a stable 1150/1600 @+63mV; is that going to be safe for prolonged gaming sessions? My current temps are below 65°C.

Many thanks.


----------



## flopper

Quote:


> Originally Posted by *MitsosTheGreat*
> 
> Hi! My card is the R9 390 Shapphire Nitro and have big problem with o / c. To have 1100mhz core should I set the voltage to + 100mV. If you give less mv see artifacts. Is normal?


Every card is different with OC.
I use +81 and 1140MHz without trouble.
You might have been unlucky with the card and got a bad overclocker.

Quote:


> Originally Posted by *_ray_*
> 
> Since I am quite new to AMD cards I like to ask a few questions regarding getting the best our of my GPU:
> 
> -What are the best options to setup in control centre for optimal gaming ?
> -Do I need any utilities to unlock special settings like NVidia ?
> -Finally, I have managed to OC my card to a stable 1150/1600 @ +63mV. Is that going to be safe for prolonged gaming sessions? My current temps are below 65C.
> 
> Many thanks.


Safe with OC?
It depends, but normally you'll be fine for years of gaming in those ranges.
Nothing I would worry about myself.


----------



## Mysticking32

Quote:


> Originally Posted by *_ray_*
> 
> Hi guys, I have come back to AMD (MSI 390) after swapping it with 970 (got fed up with NVidia lies) and it has been a great card so far. I had a few hiccups with black screen and driver crashing but I fixed it all by clean installing Windows 10.
> 
> Since I am quite new to AMD cards I like to ask a few questions regarding getting the best our of my GPU:
> 
> -What are the best options to setup in control centre for optimal gaming ?
> -Do I need any utilities to unlock special settings like NVidia ?
> -Finally, I have managed to OC my card to a stable 1150/1600 @ +63mV. Is that going to be safe for prolonged gaming sessions? My current temps are below 65C.
> 
> Many thanks.


Welcome to Team Red. As long as those clocks are stable and you see no artifacts at those settings, then yes, it's safe. Especially at those wonderful temps; 65C is amazing.

The features you're looking for are in AMD's Catalyst Control Center. Tons of settings to play around with there, including Virtual Super Resolution. If you're into recording, you can get AMD's Raptr program; it uses the GPU to record, similar to ShadowPlay.


----------



## MitsosTheGreat

Quote:


> Originally Posted by *flopper*
> 
> every card is different with OC.
> I use +81 and 1140mhz without trouble.
> You might have been unlucky and gotten a bad overclocker.
> Safe with OC?
> It depends, but normally you'll be fine for years of gaming in those ranges.
> Nothing I would worry about myself.


What is your temperature at full load? Is there a possibility that, in the future, a custom BIOS could let me get a better OC on my card?


----------



## Noirgheos

Quote:


> Originally Posted by *Dundundata*
> 
> Been messing with Firestrike for hours now and came up with this. I think it could be better but it's hard increasing the core clock. Aux voltage has not been increased. Witcher 3 runs at 77C @75%fan with this setting.


XFX 390 I think? Is it loud at that speed?


----------



## Dundundata

Quote:


> Originally Posted by *Noirgheos*
> 
> XFX 390 I think? Is it loud at that speed?


Yes and it does start to get a bit loud at that point.


----------



## CaptainZombie

Quote:


> Originally Posted by *By-Tor*
> 
> Some do, but not many.
> 
> XFX, Asus and Powercolor cards fit 290x blocks that EK already makes.
> 
> Check here for the card u have.
> http://configurator.ekwb.com/


Looks like the MSI doesn't have support for full blocks. Thanks for the link.

So far this seems like a beast of a card for the $300-$330 price range. I was having some issues with NVIDIA drivers recognizing my 1440p and 4K setup on this TV, but I don't seem to have that issue with the AMD drivers.


----------



## Noirgheos

Quote:


> Originally Posted by *Dundundata*
> 
> Yes and it does start to get a bit loud at that point.


To be fair you do have a pretty big overclock.


----------



## Dundundata

Quote:


> Originally Posted by *Noirgheos*
> 
> To be fair you do have a pretty big overclock.


I think I've about hit my limit 1140/1700 with +100mV core (did not increase Aux). I'm getting stable runs in Firestrike with a score ~11550. I could probably dial it back a bit for general gaming unless I really want to push it. I can max everything in Witcher 3, although I either need to turn hairworks off or set foliage to high to maintain a solid 60. Personally I keep hairworks on.

I am wondering something though, since I initially did not plan on gaming with this PC I went a bit budget on PSU and Mobo. PSU has been upgraded but I'm wondering if I could achieve any better numbers by upgrading the mobo. Currently I have the MSI Z97 PCMate but I was looking at maybe the MSI gaming. Could I expect better raw numbers? It will be a bit of work to rip my computer apart!

Bah why not just Crossfire 2 of these badboys!









Also curious what difference there would be between the 15.8 beta driver and 15.7.1. I'm using the beta.


----------



## Rob27shred

Quote:


> Originally Posted by *Dundundata*
> 
> Been messing with Firestrike for hours now and came up with this. I think it could be better but it's hard increasing the core clock. Aux voltage has not been increased. Witcher 3 runs at 77C @75%fan with this setting.


What core & memory speeds are you running? I am getting similar Firestrike scores http://www.3dmark.com/fs/6033745 with +35% on the power limit, core at 1,125 (+10.8% on the GPU clock setting in CCC), & memory at 1,710. I'm just using Catalyst Control Center as well. Also, your temps are a little higher than mine at a lower fan speed, so I'd keep an eye on it. I'm getting around 72C average with the OC & fans set to 65% while gaming. It could just be case differences or something simple like that, so I wouldn't worry unless the temps get higher. I will say though, having the fans at 65% on my XFX DD Core 390 is much more tolerable than 75%; they are fairly loud at that speed.

Maybe try settings similar to mine with CCC to see if you can get the temp & fan speed down a little. Afterburner did not play well with my card for some reason. I installed it, then started up The Witcher 3 intending to use it just as a monitor for the time being so I could get a baseline of my rig's stock speeds. When I loaded into the game it was artifacting like crazy, & I hadn't even OCed anything yet! So I ended up uninstalling Afterburner & problem solved, no more artifacting in The Witcher 3. Ever since then I have just used CCC to OC my card & have gotten good results too.


----------



## _ray_

According to GPU-Z my card is pulling 1.324V VDDC @ +63mV during the Firestrike bench, whereas without any OC it is 1.266V. Is this normal?


----------



## Gumbi

Quote:


> Originally Posted by *_ray_*
> 
> According to GPU-Z my card is pulling 1.324V VDDC @ +63mV during the Firestrike bench, whereas without any OC it is 1.266V. Is this normal?


It depends. Is that the max voltage? A voltage spike like that is fine, but if it's constant it's unusual. I doubt that's the constant voltage, though, as that would be very hard to cool on air.
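As a sanity check on the numbers in the posts above: stock VDDC plus the software offset should land near the reading under load, and the small shortfall is consistent with ordinary voltage droop. A quick back-of-envelope (values taken from the posts, not measured by me):

```python
stock_vddc = 1.266  # V, reported by GPU-Z with no OC applied
offset_mv = 63      # +63mV set in the OC tool

expected = stock_vddc + offset_mv / 1000
print(f"expected under load: {expected:.3f} V")  # 1.329 V vs. 1.324 V observed
```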


----------



## _ray_

Playing Witcher 3 my VDDC is at a constant 1.245V. Temps are around 65C @ +50mV. Should I be worried about it?


----------



## Gumbi

Quote:


> Originally Posted by *_ray_*
> 
> Playing Witcher 3 my VDDC is at a constant 1.245V. Temps are around 65C @ +50mV. Should I be worried about it?


No, that's absolutely fine. As long as it's cool, it's fine.

Is it spiking to ~1.25, or is it generally ~1.245 with some spikes above it? (Just curious myself.)


----------



## flopper

Quote:


> Originally Posted by *MitsosTheGreat*
> 
> What is your temperature at full load? Is there a possibility that, in the future, a custom BIOS could let me get a better OC on my card?


I have an open case; it holds at 62C max while gaming.
1100 vs 1140MHz: one might say "darn, that's more," but in pure FPS it won't be a big difference, if any.


----------



## _ray_

Its at a constant 1.25v.


----------



## Dundundata

Quote:


> Originally Posted by *Rob27shred*
> 
> What core & memory speeds are you running? I am getting similar Firestrike scores http://www.3dmark.com/fs/6033745 with +35% on the power limit, core at 1,125 (+10.8% on the GPU clock setting in CCC), & memory at 1,710. I'm just using Catalyst Control Center as well. Also, your temps are a little higher than mine at a lower fan speed, so I'd keep an eye on it. I'm getting around 72C average with the OC & fans set to 65% while gaming. It could just be case differences or something simple like that, so I wouldn't worry unless the temps get higher. I will say though, having the fans at 65% on my XFX DD Core 390 is much more tolerable than 75%; they are fairly loud at that speed.
> 
> Maybe try settings similar to mine with CCC to see if you can get the temp & fan speed down a little. Afterburner did not play well with my card for some reason. I installed it, then started up The Witcher 3 intending to use it just as a monitor for the time being so I could get a baseline of my rig's stock speeds. When I loaded into the game it was artifacting like crazy, & I hadn't even OCed anything yet! So I ended up uninstalling Afterburner & problem solved, no more artifacting in The Witcher 3. Ever since then I have just used CCC to OC my card & have gotten good results too.


I don't get any artifacts in Witcher 3 on stock or OC settings.

I gave CCC a try at your settings and got instant artifacts in Firestrike. There doesn't seem to be any voltage control in CCC, and I'm running +75mV in Afterburner to get 1125/1700, and +100mV for 1140/1700. That may be why our temps are different; it might be that mine just doesn't clock as well. I can get to 1100 without adding voltage. Another reason could be ambient temp, since I've had my AC off the past day, but adding that core voltage definitely brings my temps up. It doesn't go past 77-78C, and temps go down depending on what I'm doing in the game. Fan speed is still loud at 65%, and I'm not too concerned about it, even at 75%. I do run a custom curve in Afterburner so I can always mess with that.

I was trying to get to 1150 and I might still try but probably will need to +Aux voltage and I'd like to keep temps under 80. I'm going to mess with my voltage some more if I can bring it down a bit it might help.

Oh, and one thing I noticed is that we are using different drivers! Maybe it's because I am using the beta driver? Also, your physics score is very high! Must be the Skylake chip?







I haven't OCed my 4790k at all.

UPDATE: I've been messing with the 1125/1700 profile and got the core voltage down to +63mV; temps are better now, low 70s. I'm still curious to try the 15.7.1 driver and see if there's any difference.


----------



## CaptainZombie

I had asked this last night and hadn't received a response, but is 70.2% ASIC quality not good for this card? I haven't tried OCing it yet to see how well it does.


----------



## _ray_

Deleted


----------



## diggiddi

Quote:


> Originally Posted by *CaptainZombie*
> 
> I had asked this last night and hadn't received a response, but is a 70.2% ASIC quality not good for this card? I haven't tried OC this card yet to see how well it does.


ASIC quality shouldn't matter that much; either it's a good overclocker or it's not, so just give it a shot.


----------



## rankdropper84

http://www.3dmark.com/3dm11/10298366


----------



## MitsosTheGreat

Is there a custom BIOS for the R9 390 Nitro?


----------



## flopper

Quote:


> Originally Posted by *MitsosTheGreat*
> 
> Is there a custom BIOS for the R9 390 Nitro?


My google-fu says no.


----------



## Geoclock

My MSI 390X still has artifacts, but only in Wolfenstein: The Old Blood. I reinstalled the drivers, and Windows too, and was only able to play it at LOW settings.
I replaced the Windows SSD with a regular HDD and the artifacts disappeared.
Weird... especially since the game is installed on a different HDD.


----------



## FooSkiii

Hey guys does this look right???


----------



## tangelo

I have a Cooler Master *HAF 912* Mid Tower, similar to the one in this photo:


I have one *200mm* fan in the front pushing air inside and one *200mm* fan on the top as exhaust. The back and side have places and air holes for extra fans, but I'm not using them at the moment. The temps inside my case are usually around *30-35C* depending on where you take the reading. I've monitored these temps at the three hard drives sitting on the dock next to the front fan, and at the mobo.

I'm going to get the *MSI R9 390* and will probably aim for the highest stable OC I can get. Will the configuration of my case and its fans lower my chances of getting a proper OC, or is it more about the silicon lottery?

Any input is highly welcome.


----------



## flopper

Quote:


> Originally Posted by *tangelo*
> 
> more about silicon lottery?
> 
> Any input is highly welcome.


It's the randomness of silicon.
Basically, heat isn't much of a limit, as the cards are made to deal with high temperatures.

My old 290 wasn't a good overclocker.
My 390 is a lot better.


----------



## Dundundata

I would probably add an exhaust fan in the back and an intake on the side.


----------



## Pokealong1227

Interesting forum. My 390 hasn't been a great overclocker so far. To get 1130MHz on the core and 1600MHz on the memory I have to run +81mV and a +50% power limit, but I believe this is the fault of my motherboard, because I can't overclock my CPU without crashes and freezes (4+1 power phase). Guess that's what I get for cheaping out and buying a budget-oriented motherboard. I'll be going for an MSI Gaming motherboard once I get the money and seeing if I can get better clocks, but even at the stock 1060MHz it is as fast as, if not faster than, my 970 (which randomly bricked) was.


----------



## MitsosTheGreat

Is it safe for 24/7?


----------



## Pokealong1227

Quote:


> Originally Posted by *MitsosTheGreat*
> 
> 
> 
> Is it safe for 24/7?


You can only get 1105? That seems super low; the stock OC on the MSI Gaming edition is already 1060. Anyway, my suggestion to stress the card less would be to keep two profiles: one at your highest stable overclock for gaming, and another at stock speeds with stock voltage and power limit.


----------



## diggiddi

looks good^


----------



## MitsosTheGreat

Quote:


> Originally Posted by *Pokealong1227*
> 
> You can only get 1105? Seems super low. Stock OC on MSI gaming editions is already 1060. Anyways, my suggestion to stress the card less would be to have two profiles. One profile at your highest stable overclock and use that for gaming, and another profile at stock speeds with stock voltages and power limits.


When I increase the core to 1110, for example, I get artifacts.
Could the mobo or PSU be to blame (Gigabyte 990XA-UD3, FX 8320 @ 4.6GHz, Cooler Master GX 650W)?


----------



## rankdropper84

Quote:


> Originally Posted by *MitsosTheGreat*
> 
> When I increase the core to 1110, for example, I get artifacts.
> Could the mobo or PSU be to blame (Gigabyte 990XA-UD3, FX 8320 @ 4.6GHz, Cooler Master GX 650W)?


Try turning your memory clock down and recheck for artifacts. I have that same mobo and can get over 1100 using +63mV. Honestly, I would say your PSU is your weak point, since you're probably running the volts up on your 8320 while also running the volts up on your 390... that doesn't leave much headroom.

Side question: what would everyone here say is the best 390 with two fans? My buddy wants to get one, but unfortunately his case doesn't support such a long GPU. How are everyone's temps on a dual-fan 390? I was talking to someone who has an XFX 390 and he said his temps get over 80C sometimes, which is a stark contrast to my Sapphire's temps. The only time I ever get over 70C on mine is when pushing the core clock close to 1200 and running the mV up.

Short version: does a dual-fan 390 that gets good temps exist, and if so, which is the best one? Thanks.


----------



## rankdropper84

Quote:


> Originally Posted by *Pokealong1227*
> 
> Interesting forum. My 390 hasn't been a great overclocker so far, to get 1130mhz on core and 1600mhz on memory I have to do +81mv and +50% power limit but I believe this is the fault of my motherboard because I can't overclock my cpu without having crashes and freezes(4+1 power phase). Guess that's what I get for cheaping out and buying a more budget oriented motherboard. Will be going for a MSI Gaming motherboard once I get the money and seeing if I can get better clocks, but even at the stock 1060mhz it is as fast if not faster than my 970 that randomly bricked was.


I would say your results are right in line with what I have seen others get


----------






## rankdropper84

Quote:


> Originally Posted by *flopper*
> 
> It's the randomness of silicon.
> Basically, heat isn't much of a limit, as the cards are made to deal with high temperatures.
> 
> My old 290 wasn't a good overclocker.
> My 390 is a lot better.


It's the same concept as cpu binning.


----------



## MitsosTheGreat

Quote:


> Originally Posted by *rankdropper84*
> 
> Try turning your memory clock down and recheck for artifacts. I have that same mobo and can get over 1100 using 63mv. Honestly I would say your psu is your weak point since your probably running the volts up on your 8320 while also running the volts up on your 390...doesn't leave much headroom
> 
> Side question. What would everyone here say is the best 390 GPU that has two fans. My buddy is wanting to get one but unfortunately his case doesn't support such a long GPU. How's everyones temps that have a dual fan 390? Was talking to someone who has a xfx 390 and he said his temps get over 80C sometimes which is a stark contrast to my sapphires temps. Only time I ever get over 70C on mine is when pushing the core clock close to 1200 and running the MV up.
> 
> Short version: Does a dual fan 390 exist that gets good temps and if so what is the best one. Thanks


With the memory clock turned down I have the same problem. I'll check my power supply; I hope that's the issue.

See here and tell me what's wrong, if you can:



Could it be the processor?


----------



## Pokealong1227

Quote:


> Originally Posted by *MitsosTheGreat*
> 
> When I increase the core to 1110, for example, I get artifacts.
> Could the mobo or PSU be to blame (Gigabyte 990XA-UD3, FX 8320 @ 4.6GHz, Cooler Master GX 650W)?


Try lowering your memory clock a little bit. You shouldn't need +100mV for 1105, but maybe that's just how your card is. Some cards are better than others.

Edit: Sorry, didn't see you had already replied to someone.

Edit 2: I looked at your video on YouTube. I see no reason why your FPS should fluctuate so much. When I had a bad PSU I would get spikes down to 20-30fps in games, but it typically didn't happen only when I looked at a specific area; it was fairly random and would last 20 seconds or so at a time. It's definitely not your processor making that big of a difference, though, especially since it's topping out at 50-55%. I also had an issue where my overclocked CPU caused problems because my motherboard couldn't deliver proper power for overclocking. Try reverting the CPU to stock clocks and see if it helps.


----------



## Pokealong1227

Quote:


> Originally Posted by *rankdropper84*
> 
> Try turning your memory clock down and recheck for artifacts. I have that same mobo and can get over 1100 using 63mv. Honestly I would say your psu is your weak point since your probably running the volts up on your 8320 while also running the volts up on your 390...doesn't leave much headroom
> 
> Side question. What would everyone here say is the best 390 GPU that has two fans. My buddy is wanting to get one but unfortunately his case doesn't support such a long GPU. How's everyones temps that have a dual fan 390? Was talking to someone who has a xfx 390 and he said his temps get over 80C sometimes which is a stark contrast to my sapphires temps. Only time I ever get over 70C on mine is when pushing the core clock close to 1200 and running the MV up.
> 
> Short version: Does a dual fan 390 exist that gets good temps and if so what is the best one. Thanks


I like my MSI Gaming edition. Not sure how much shorter it is than the other cards, if at all, but it's dual fan. Temp at idle is around 60C, but that's because it has a zero-fan mode. Temps stay fairly low, especially if you set a custom fan profile. Then again, I have an AIO CPU liquid cooler, so my case has 2 intake and 2 exhaust fans in total and almost no cables to block airflow.


----------



## Pokealong1227

Quote:


> Originally Posted by *rankdropper84*
> 
> I would say your results are right in line with what I have seen others get


Really? I figured it was super low; I've seen plenty of people hitting 1200. Maybe I was just seeing the best-case scenarios. I have no real problem with my clocks, though. They got me a 5-10fps boost in most games and are stable. The fans can get a little audible, but since my CPU cooler is near silent and my case and PSU fans are quiet, it's not a big deal.


----------



## Noirgheos

Wow, it seems like a lot of review sites **** on the XFX 390X for being loud... It's making the MSI one so tempting. If only it didn't have ****ty temps...


----------



## Pokealong1227

Quote:


> Originally Posted by *Noirgheos*
> 
> Wow, it seems like a lot of review sites **** on the XFX 390X for being loud... It's making the MSI one so tempting. If only it didn't have ****ty temps...


70c at 35% fan speed in-game is considered ****ty? Haha


----------



## Noirgheos

Quote:


> Originally Posted by *Pokealong1227*
> 
> 70c at 35% fan speed in-game is considered ****ty? Haha


Where? If this is true I'll buy it in a heartbeat.


----------



## Pokealong1227

Quote:


> Originally Posted by *Noirgheos*
> 
> Where? If this is true I'll buy it in a heartbeat.


Oh, my apologies; looking back, you said 390X, and I have the 390. The MSI 390X Gaming OC actually runs around 5 degrees warmer, maxing out at 76C under load, at close to the same fan speeds I'd assume. Compare that to the R9 290X at 90-95 degrees and it seems like a cool fall evening. Haha

http://www.guru3d.com/articles-pages/msi-radeon-r9-390x-gaming-8g-oc-review,9.html

That link shows peak temperatures at load.


----------



## Dundundata

Couple updates for those interested (XFX 390)

OC's\Firestrike graphics scores

1125/1700 +63mV \ ~13600
1140/1700 +88mV \ ~13750

Temps actually seemed to improve: peak 75C in Witcher 3 @ 1140. I think it's ambient temps; it's been getting colder at night. 1150 is going to be pushing it with this card, but I am determined to try. I might also be able to lower GPU temps further. The case has a lot of fans (Noctua), but they are set up in the BIOS linked to the CPU/mobo, and since those parts never get very hot they are probably not pushing enough air while the GPU is heating up, so that's next on the agenda.
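For perspective on what those last few MHz buy, here are the same Firestrike graphics scores expressed as percentage gains (numbers taken straight from the runs above):

```python
# (core MHz, mem MHz) -> Firestrike graphics score, from the runs above
runs = {
    (1125, 1700): 13600,  # +63mV
    (1140, 1700): 13750,  # +88mV
}
base_score = runs[(1125, 1700)]
for (core, mem), score in runs.items():
    gain = (score - base_score) / base_score * 100
    print(f"{core}/{mem}: {score} ({gain:+.1f}%)")
# the extra 15MHz (and +25mV) is worth roughly +1.1%
```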


----------



## Noirgheos

Quote:


> Originally Posted by *Dundundata*
> 
> Couple updates for those interested (XFX 390)
> 
> OC's\Firestrike graphics scores
> 
> 1125/1700 +63mV \ ~13600
> 1140/1700 +88mV \ ~13750
> 
> Temps actually seemed to improve: peak 75C in Witcher 3 @ 1140. I think it's ambient temps; it's been getting colder at night. 1150 is going to be pushing it with this card, but I am determined to try. I might also be able to lower GPU temps further. The case has a lot of fans (Noctua), but they are set up in the BIOS linked to the CPU/mobo, and since those parts never get very hot they are probably not pushing enough air while the GPU is heating up, so that's next on the agenda.


By any chance, do you have a water cooler for your CPU? Also, here in Canada I think I'll have to remove the OC during the summer; it gets really hot in my room. During most of the year though, my room stays at 19-22C. I think that's good.

Anyway, what was your fan curve for those temps?

And P.S.: since the XFX 390X and the MSI 390X are the same chip, they should perform the same at the same speeds, right?


----------



## Rob27shred

Quote:


> Originally Posted by *Dundundata*
> 
> I don't get any artifacts in Witcher 3 on stock or OC settings.
> 
> I gave CCC a try at your settings and got instant artifacts in Firestrike. It doesn't seem like there is any voltage control in CCC and I'm running +75mV in Afterburner to get 1125/1700 and 100mV 1140/1700. That may be the reason temps are different, it might be mine just doesn't clock as well. I can get to 1100 without adding voltage. Another reason could be ambient temp since I've had my AC off the past day, but adding that core voltage definitely brings my temps up. It doesn't go past 77-78C, and temps go down depending on what I'm doing in the game. Fan speed is still loud at 65 and I'm not too concerned about it, even at 75. I do run a custom curve in Afterburner so I can always mess with that.
> 
> I was trying to get to 1150 and I might still try but probably will need to +Aux voltage and I'd like to keep temps under 80. I'm going to mess with my voltage some more if I can bring it down a bit it might help.
> 
> Oh, and one thing I noticed is that we are using different drivers! Maybe it's because I am using the beta driver? Also, your physics score is very high! Must be the Skylake chip?
> 
> 
> 
> 
> 
> 
> 
> I haven't OCed my 4790k at all.
> 
> UPDATE: I've been messing with the 1125/1700 profile and got the core voltage down to +63mV; temps are better now, low 70s. I'm still curious to try the 15.7.1 driver and see if there's any difference.


Yeah, I got the 6700K for my new rig and have it running at 4.5GHz stable







, and the temps on it have barely jumped as well. Skylake seems to OC pretty well! Although there really is not much of a performance gain between my 6700K & your 4790K, so I'm sure you could easily see the same out of yours if you wanted to







I have heard CCC does not actually raise the voltage, & TBH it really doesn't seem to. I think it just allows the GPU to use a tiny bit of extra juice when it is above stock speeds. I do have my card set just below where I start getting artifacts, so ours probably aren't too far apart in OCing. You are right too; I didn't think about it earlier, but the true extra voltage from using AB definitely would add a little heat to the mix. Glad to hear you dialed it in a little better & found lower temps too. As far as the drivers go, I was using the beta as well for a while & didn't see a huge difference in bench scores. The official driver gets me slightly higher scores in FS, but it's the same on average in Heaven & Valley between the 2. I mainly rolled back to have valid FS scores when I started OCing my card.


----------



## Dundundata

Quote:


> Originally Posted by *Noirgheos*
> 
> By any chance, do you have a water cooler for your CPU? Also, here in Canada I think I'll have to remove the OC during the summer; it gets really hot in my room. During most of the year though, my room stays at 19-22C. I think that's good.
> 
> Anyway what was your fan curve with those temps?
> 
> And P.S. Since the XFX 390X and the MSI 390X are the same chip, they should perform the same at the same speeds right?


I have the Hyper 212 EVO with 2 fans and my CPU runs rather cool. During the summer the AC is on most of the time; I only recently stopped using it.

Fan curve is 35% up to 50C, then 80% at 80C and 100% at 90C.

Not really sure about the last part; I imagine so. The MSI can probably clock higher more easily, but it also comes down to the silicon lottery. My benchmarks are good and it runs games like a champ. I'm still not sure how big a difference those last MHz make in real-world FPS, but it sure is fun to try.
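A curve like that ("35% up to 50C, then 80/80 and 100/90") is just piecewise-linear interpolation between temperature/duty points, which is essentially what Afterburner's custom curve editor does. A rough sketch, with the points assumed from the post:

```python
# (temp C, fan %) points: flat 35% up to 50C, then 80% at 80C, 100% at 90C
CURVE = [(0, 35), (50, 35), (80, 80), (90, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan duty between curve points, clamped at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

# e.g. fan_speed(65) sits halfway up the 50C -> 80C ramp, at 57.5%
```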


----------



## Dundundata

Quote:


> Originally Posted by *Rob27shred*
> 
> Yeah, I got the 6700K for my new rig and have it running at 4.5GHz stable
> 
> 
> 
> 
> 
> 
> 
> , the temps on it have barely jumped as well. Skylake seems to OC pretty well!
> 
> I was using the beta as well for awhile & didn't see a huge difference with bench scores. The official driver does get me slightly higher scores on FS but it's the same on average in Heaven & Valley between the 2. I mainly rolled back to have valid FS scores when I started OCing my card.


Those i7 chips are quite impressive. I clocked mine up to 4.6 a while back just messing around and it was no sweat, but I never saw the need, so I've been running it stock.

I did try the 15.7.1 driver and didn't really notice any difference. Went back to the beta just for the heck of it.


----------



## Derek129

Does anyone else have any major problems with their 390? I can hardly ever play GTA V without the game crashing with a video error, and DiRT Rally is choppy as all hell. This card is driving me insane. I'm almost considering getting rid of it. $400 anchor....


----------



## Pokealong1227

Quote:


> Originally Posted by *Derek129*
> 
> Does anyone else have any major problems with their 390? I can hardly ever play GTA V without the game crashing with a video error, and DiRT Rally is choppy as all hell. This card is driving me insane. I'm almost considering getting rid of it. $400 anchor....


If it was fine before changing to the 390, get it replaced; it's faulty. I've had absolutely no issues whatsoever in any games.


----------



## componentgirl90

I just purchased an XFX DD R9 390X. A big improvement on my R9 270. I do get a slight buzzing, almost clicking, sound when the card is under load. Is this coil whine or something? I hope it isn't a sign of a major problem.

@Derek - I have no problems with games atm. TBH I have heard of some problems with booting. Any new card, even a refresh, runs the risk of issues right after release. It may be a driver problem or something.


----------



## navjack27

I've started using MSI Afterburner with PowerPlay disabled so it's always at its high clocks; I just have a thing against dynamic clocking. This won't hurt anything, right? Temps are fine since there's no real load most of the time.


----------



## ValValdesky

Hey guys, I might be getting one of these 390X cards. Tonight I found a sweet deal on a 4790K plus a 390X, relatively cheap with some old parts traded in.

The problem is my current PSU, a Corsair CX600M bought back in December. It powered an FX-8350 with a reference AMD 7950 without problems, and right now it's running an i5 4440 and a G1 Gaming GTX 970. I've tested it with a multimeter and it actually delivers a little more power than spec, which was unexpected given what I've heard about the Corsair CX series. Do you guys think I could run the 4790K and the 390X without problems on the CX600M, or would I need a beefier PSU? I have no plans to OC for now.

Another of my worries is that the card is an ASUS DirectCU II... from what I've found so far, they just reused the old 290X cooler, so I would love to hear from owners.


----------



## Mr.Pie

Quote:


> Originally Posted by *ValValdesky*
> 
> Hey guys, I might be getting one of these 390X cards. Tonight I found a sweet deal on a 4790K plus a 390X, relatively cheap with some old parts traded in.
> 
> The problem is my current PSU, a Corsair CX600M bought back in December. It powered an FX-8350 with a reference AMD 7950 without problems, and right now it's running an i5 4440 and a G1 Gaming GTX 970. I've tested it with a multimeter and it actually delivers a little more power than spec, which was unexpected given what I've heard about the Corsair CX series. Do you guys think I could run the 4790K and the 390X without problems on the CX600M, or would I need a beefier PSU? I have no plans to OC for now.
> 
> Another of my worries is that the card is an ASUS DirectCU II... from what I've found so far, they just reused the old 290X cooler, so I would love to hear from owners.


From what I hear, I would avoid any ASUS AMD cards for now... they've got rather ****ty coolers compared to the other board partners.

The CX600M would be fine for an i7 and a 390X.


----------



## _ray_

Is anyone else having painfully slow boot-up/shutdown on Windows 10 64-bit with the latest 15.8 beta drivers? Sometimes it will just hang on boot or shutdown until I hard reset my PC.

It's driving me insane.


----------



## ValValdesky

Quote:


> Originally Posted by *Mr.Pie*
> 
> from what I hear I would avoid any Asus AMD cards for now.......they've got rather ****ty coolers compared to other board partners.
> 
> The CX600M would be fine for an i7 and a 390X.


Kinda figured it out, might as well just skip the GPU and get that i7 instead.


----------



## Gumbi

Quote:


> Originally Posted by *ValValdesky*
> 
> Hey guys, I might be getting one of these red 390X cards. Tonight I found a sweet deal for a 4790K with a 390X, relatively cheap, with a trade-in of some old parts I had.
> 
> The problem is that currently I have a Corsair CX600M PSU, bought back in December. It was used to power an FX-8350 with a reference AMD 7950 without problems, and right now it's running an i5 4440 and a G1 Gaming GTX 970. I've tested it with a multimeter and it delivers a little more power than what's in the specs, which was actually unexpected given what I've heard about the Corsair CX series. The question is: do you guys think I would be able to run the 4790K and the 390X without problems on the CX600M, or would I need to buy a beefier PSU? I have no plans to OC for now.
> 
> Another of my worries is that the card is an Asus DirectCU II... from what I've found so far, they just reused the old cooler from the 290X, so I would love to hear experiences from owners.


You should be OK; the 4790K using less power than the FX chip will counteract the 390X using more than the 7950...

A mild overclock on both the CPU and GPU will be possible too, but be careful of putting too much extra voltage into the GPU.

In short, you'll be fine.


----------



## flopper

Quote:


> Originally Posted by *_ray_*
> 
> Is anyone else having painfully slow boot-up/shutdown on Windows 10 64-bit with the latest 15.8 beta drivers? Sometimes it will just hang on boot or shutdown until I hard reset my PC.
> 
> It's driving me insane.


It's not the card, normally.
Latest BIOS?
Reinstall and clean the drivers with DDU.

Using 15.8 and Win 10 64-bit, and it's flawless.


----------



## _ray_

Did a DDU uninstall but the problem persists with either 15.7 or 15.8... getting fed up. Everything works when I uninstall the drivers. Are you using MSI Afterburner?


----------



## imrunning

Quote:


> Originally Posted by *kalidae*
> 
> Wow dude okay that sucks. My only thought is that the heatsink isn't making full contact, your vrms are running cool and must have full contact for your gpu is really hot and I'm wondering if your heatsink is actually mounted on a very slight angle as to make contact with one but not the other. Since this company you bought it from won't let you return it then maybe you should just remove the heat sink and apply some good thermal paste and remount it and see how you go because something is really not right, your system has a lot of airflow and your gpu should be running way cooler than that. Or you could probably bypass the place you bought it from and get in touch with msi or whoever it is directly and explain the situation. As soon as you mention the shop used furmark for testing then msi will know straight away that the shop doesn't know what they are doing.


Finally gotten my replacement card.

The replacement card runs much, much cooler now. In an open-case test it managed to hit 70-71C running Unigine Valley and 3DMark.
And with the case closed, the GPU temp maxes out at 75-76C!

Open case, Unigine Valley, ambient temp 35-36C:
Max stable GPU temp at 70-71C.
http://s853.photobucket.com/user/imrunning/media/valley open case ambient.jpg.html
http://s853.photobucket.com/user/imrunning/media/Valley open case temp.jpg.html

Closed case, Unigine Valley: max stable GPU temp at 75-76C.
http://s853.photobucket.com/user/imrunning/media/Valley close case temp.jpg.html

FYI, I've gotten a confirmation email from MSI that reapplying the thermal paste won't void the warranty... I was going to do that if I hadn't gotten the replacement card... but anyway, thanks all once again!


----------



## Sonic B0000M

I'm sorry guys, I've sold my two R9 390s...

I now own a Zotac GTX 980 Ti Amp Extreme.

4K 60fps in Mad Max and MGS V. It's one crazy card.


----------



## LongRod

Quote:


> Originally Posted by *Sonic B0000M*
> 
> I'm sorry guys, I've sold my two R9 390s...
> 
> I now own a Zotac GTX 980 Ti Amp Extreme.
> 
> 4K 60fps in Mad Max and MGS V. It's one crazy card.


I can't blame you, it's a beast of a card even when compared to 390 X-Fire.

Enjoy your new GPU!


----------



## Noirgheos

Quote:


> Originally Posted by *componentgirl90*
> 
> I just purchased an XFX DD R9 390X. A big improvement on my R9 270. I do get a slight buzzing, almost clicking sound when the card is under load. Is this coil whine or something? I hope it isn't a sign of a major problem.
> 
> @ Derek - I have no problems with games atm. TBH I have heard of problems with booting. Any new card, even a refresh, runs the risk of running into problems after release. It may be a driver issue or something.


It shouldn't be. What I would do is just let Valley run 24/7 to put load on the card. If it doesn't go away after a day, it's there to stay. It shouldn't be harmful.


----------



## flopper

Quote:


> Originally Posted by *_ray_*
> 
> Did DDU uninstall but problem presists with either 15.7 or 15.8.. getting fed up. everything works when i uninstall drivers. are you using MSI Afterburner ?


Yes, I use MSI Afterburner in kernel mode.

Motherboard BIOS up to date?
Change the BIOS switch on the card?
Move to a different PCI-e slot?

Need to determine if it's a hardware or software issue.


----------



## Streetdragon

After I lost one of my 290 Vapor cards I want to CrossFire again, but I can't find any Vapor in my trusted shops anymore.
The question is: can I CrossFire an R9 390 or even an R9 390X from Sapphire with my Sapphire Vapor?
What would be the best choice? Money is not the problem^^

I asked Sapphire support and they said that they are not compatible, or only with trouble, while other forum posts say that it is possible. Has someone here already tested it? I didn't find an answer with the search^^


----------



## Geoclock

Newegg cut the price on just the MSI 390X to $370 after MIR, with a free gift included.
Guys, I own one, and to tell you the truth I don't think it's even worth it.
Really disappointed with the performance.
I don't think DX12 will do some kind of miracle for it.


----------



## Noirgheos

Quote:


> Originally Posted by *Geoclock*
> 
> Newegg cut the price on just the MSI 390X to $370 after MIR, with a free gift included.
> Guys, I own one, and to tell you the truth I don't think it's even worth it.
> Really disappointed with the performance.
> I don't think DX12 will do some kind of miracle for it.


What do you mean? Is it performing subpar compared to benchmarks?


----------



## Geoclock

Benchmarks, gameplay, and everything.
At least it isn't a $400 card anymore, indeed.


----------



## Noirgheos

Quote:


> Originally Posted by *Geoclock*
> 
> Benchmarks, gameplay, and everything.
> At least it isn't a $400 card anymore, indeed.


Then RMA it. This card is as powerful as, or even better than, a 980 in most games.


----------



## Agent Smith1984

Well, the decision just got a lot easier....

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127872&cm_re=390x-_-14-127-872-_-Product


----------



## Dundundata

@Ray, no problems here mate Win10 Pro x64, 15.8

@ComponentGirl90, buzzing could be related to coil whine. I have the non-x version of your card and get some high pitch whine under certain circumstances - like really high FPS in some menus, sometimes benchmarking OC in Firestrike (no sound makes it more prominent). Without hearing yours it's hard to say but seems normal to me. Never noticed a clicking noise though.

@ValValdesky, I have the same chip and PSU but never tried it. Decided to upgrade to RM750i when installing the 390. Might want to upgrade when you decide to OC.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, the decision just got a lot easier....
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127872&cm_re=390x-_-14-127-872-_-Product


Son-of-a-herndle....









Are the VRM coolers on this MSI connected to the GPU air cooler? Or are they separate like the XFX's?


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, the decision just got a lot easier....
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127872&cm_re=390x-_-14-127-872-_-Product


Do you still have your 390? Wouldn't it make more sense to CF it with another 390?


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> Son-of-a-herndle....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are the VRM coolers on this MSI connected to the GPU air cooler? Or are they separate like the XFX's?


It seems like the VRM is cooled by the main heatsink.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> It seems like the vrm is cooled by the main heatsink


That's what I was afraid of. Can anyone confirm this?


----------



## LongRod

Quote:


> Originally Posted by *battleaxe*
> 
> Son-of-a-herndle....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are the VRM coolers on this MSI connected to the GPU air cooler? Or are they separate like the XFX's?


Connected straight to the main heatsink.


----------



## battleaxe

Quote:


> Originally Posted by *LongRod*
> 
> Connected straight to the main heatsink.


Yeah, so I'm waiting for the XFX to come down in price then. I like its VRM cooler. I plan to add an AIO to it and keep the VRM-section cooler intact. XFX did a nice job on theirs.


----------



## Noirgheos

Alright, I'm seeing a lot of different opinions here...

For temps alone, and considering the XFX is $40 cheaper, which cools better? XFX or MSI?


----------



## POLJDA

R9 390 MSI


----------



## battleaxe

Quote:


> Originally Posted by *Noirgheos*
> 
> Alright, I'm seeing a lot of different opinions here...
> 
> For temps alone, and considering the XFX is $40 cheaper, which cools better? XFX or MSI?


They seem about the same for the core. I like the VRM section on the XFX better though, as you can just remove the air cooler and add an AIO water block to get it under water for some higher OCing. Plus, the VRM cooler on the XFX is more than adequate, better than most. I also like that it is separate, as you don't get sympathetic heating from the air cooler. Just my opinion.


----------



## Agent Smith1984

Sold my 390...

Also, the msi has excellent vrm cooling!


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sold my 390...
> 
> Also, the msi has excellent vrm cooling!


I agree. I just like that the XFX VRM section is removed from the core cooler; it's separate. That's all I was saying. Both are about the same from what I can tell if you stay on air, though. The XFX only has the advantage if using an AIO cooler.


----------



## CaptainZombie

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sold my 390...
> 
> Also, the msi has excellent vrm cooling!


Why did you end up getting rid of your 390? Run into any issues?


----------



## Agent Smith1984

Quote:


> Originally Posted by *CaptainZombie*
> 
> Why did you end up getting rid of your 390? Run into any issues?


No, never. I was going to go a different route; long story short, my whole system is down and out right now, and I am doing some contemplating on what to do next.

The 390X being down to $400 now seems pretty sweet; CrossFire is no longer an option for me because of my layout.

I'm kind of on hiatus right now.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No, never. I was going to go a different route; long story short, my whole system is down and out right now, and I am doing some contemplating on what to do next.
> 
> The 390X being down to $400 now seems pretty sweet; CrossFire is no longer an option for me because of my layout.
> 
> I'm kind of on hiatus right now.


That blows... but... time to go Intel now... hehhehe...!


----------



## Noirgheos

Quote:


> Originally Posted by *battleaxe*
> 
> They seem about the same for the core. I like the VRM section on XFX better though as you can just remove the air cooler and add an AIO water block to get it under water for some higher OC'ing. Plus, the VRM cooler on the XFX is more than adequate/better than most. I also like that it is separate as you don't get sympathetic heating from the air cooler. Just my OP


Alright, I've asked this multiple times, but I want a definite answer. The XFX is 10MHz less than the MSI. Will that even make a 1FPS difference?


----------



## Agent Smith1984

NO
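To put a rough number on that (a back-of-the-envelope sketch; the clocks below are illustrative assumptions, not the exact factory specs of either card):

```python
# Best-case FPS impact of a 10 MHz factory-clock gap, assuming
# (optimistically) that FPS scales 1:1 with core clock.
# The 1050/1060 MHz figures are illustrative, not exact factory specs.
xfx_clock = 1050   # MHz, assumed lower factory clock
msi_clock = 1060   # MHz, assumed higher factory clock
base_fps = 60.0

fps_gain = base_fps * (msi_clock / xfx_clock - 1)
print(f"best-case gain: {fps_gain:.2f} FPS")   # well under 1 FPS
```

Even with perfectly linear scaling, 10 MHz on a ~1050 MHz base is under 1%, so at 60 FPS the gap is roughly half a frame.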


----------



## CaptainZombie

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No, never. I was going to go a different route; long story short, my whole system is down and out right now, and I am doing some contemplating on what to do next.
> 
> The 390X being down to $400 now seems pretty sweet; CrossFire is no longer an option for me because of my layout.
> 
> I'm kind of on hiatus right now.


That really sucks, hate when those kinds of things happen.

Is the 390X a much better card than the 390? I've read that you can OC the 390 to match the 390X at stock, is that true? I'm seeing Newegg has several 390Xs down to $399, which is a pretty good price.

Has anyone tried the Kraken G10 on the MSI 390 yet? When I had the MSI 970, it was easy to install, and the nice thing is that they have a VRM plate that keeps the VRMs cool when you install the G10. Is it the same for the MSI 390?


----------



## Higgenbobber

Quote:


> Originally Posted by *Noirgheos*
> 
> Alright, I've asked this multiple times, but I want a definite answer. The XFX is 10MHz less than the MSI. Will that even make a 1FPS difference?


The difference between all these cards at stock settings is pretty negligible-- and even with overclocking, the most FPS you'll probably gain is like 5 (if you're already over 60 and manage a good OC).

Let's be honest, all this overclocking stat-chasing is just a perversion of ours.

And regarding your concern about MSI temps, I dealt with ****ty temps myself with the MSI card (compared to Gigabyte and Sapphire)-- but it might've just been a ****ty card, since there are a lot of people who don't seem to have the problem (I'm guessing MSI just has a higher proportion of temp-****ty cards). I really could've been fine sticking with the MSI card and getting a slightly better overclock (like +1 FPS for me), but high temps are a pet peeve of mine, so I finally settled on the Sapphire.

And if you're just playing at 1080p like I am, then you're probably never going to have to worry about which 390 you have.


----------



## Agent Smith1984

Quote:


> Originally Posted by *CaptainZombie*
> 
> That really sucks, hate when those kinds of things happen.
> 
> Is the 390X a much better card than the 390? I've read that you can OC the 390 to match the 390X at stock, is that true? I'm seeing Newegg has several 390Xs down to $399, which is a pretty good price.
> 
> Has anyone tried the Kraken G10 on the MSI 390 yet? When I had the MSI 970, it was easy to install, and the nice thing is that they have a VRM plate that keeps the VRMs cool when you install the G10. Is it the same for the MSI 390?


Not sure if the Kraken works on 390 or not.....

The 390 is within 5% of the 390X at the same clock speeds, but just remember that they will both hit the same 1170-1200MHz range, so whatever you get to on an X, you are going to need around 50MHz more to perform the same on the non-X.


----------



## Alerean

I'm sure this has been asked a million times, so sorry in advance. PowerColor 390 PCS+ or XFX 390 Double Dissipation _predominantly for water cooling_ (an EK block)? Once the fans are off I guess there's not much difference, but since I have the choice to buy either and can't find much info on VRM/PCB differences I figured I'd ask anyway.


----------



## componentgirl90

Once you overclock the card, what sort of performance gains do people see? (Sorry if this has been asked before, but I mean, look at the length of this thread lol.)

My XFX 390x from 1050 to 1180...would I get 12%?


----------



## CaptainZombie

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not sure if the Kraken works on 390 or not.....
> 
> The 390 is within 5% of the 390X at the same clock speeds, but just remember that they will both hit the same 1170-1200Mhz range, so whatever you get to on an X, you are going to need around 50MHz more to perform the same on the non-X


Yup, makes sense with the OC.

I'll poke around in the Kraken G10 thread to see if anyone has tried to put the G10 on the MSI. Being a first-time AMD GPU owner, I am still on the fence until I do further testing and playing around with the card. Out of all the games I have tried so far, Shadow of Mordor really put this card to the test. My fans went crazy while playing it at 1440p.

Is there a good program to see what framerates I am getting and how much VRAM is getting eaten up while gaming? Something in Catalyst, or another program that would be recommended?


----------



## componentgirl90

Quote:


> Originally Posted by *Noirgheos*
> 
> It shouldn't be. What I would do is just let Valley run 24/7 and put load on the card. If it doesn't go away after a day it's there to stay. Shouldn't be harmful.


Ok, I might stick some chewing gum or whatever is appropriate there lol. I read that somewhere. I normally have headphones in anyway when it's under load.


----------



## tangelo

Quote:


> Originally Posted by *battleaxe*
> 
> That's what I was afraid of. Can anyone confirm this?


What do you mean? I thought this was a good thing. Some reviews have said that the VRM is actively cooled, so it's better than just heatsinks on the VRM as some manufacturers do.

edit: managed to catch up on the thread, so no need to answer. Teaches me not to reply before going through the whole thread first.


----------



## flopper

Quote:


> Originally Posted by *componentgirl90*
> 
> Once you overclock the card, what sort of performance gains do people see? (Sorry if this has been asked before but I mean, look at the length of this post lol).
> 
> My XFX 390x from 1050 to 1180...would I get 12%?


It's in the ballpark, but real FPS gains are normally in the 3 to 5 FPS range.
You gain more by adjusting a setting or two.

For myself it's a waste of time to spend on overclocking, as I'd rather adjust a setting or two.
Frame-locked at 125 FPS in BF4 at 1440p with adjusted settings; I'd rather have that than max out the OC and risk the card.
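Rough math on the 1050 -> 1180 question (a sketch; the 0.6-0.8 scaling factors below are rule-of-thumb guesses for how FPS tracks core clock, not measurements for any particular game):

```python
# Sanity check on a 1050 -> 1180 MHz overclock. The clock gain is ~12%,
# but FPS rarely scales 1:1 with core clock, so bracket it with a few
# guessed scaling factors.
old_clock, new_clock = 1050, 1180
clock_gain = new_clock / old_clock - 1          # ~0.124, i.e. ~12.4%

for scaling in (0.6, 0.8, 1.0):
    print(f"scaling {scaling}: ~{clock_gain * scaling * 100:.1f}% FPS gain")
```

So the full 12% only shows up if the game is purely core-bound; in practice something like 7-10% is the more realistic band, which at 60 FPS lines up with the 3-5 FPS figure above.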


----------



## Noirgheos

Quote:


> Originally Posted by *componentgirl90*
> 
> Ok, I might stick some chewing gum or whatever is appropriate there lol. I read that somewhere. I normally have headphones in anyway when it's under load.


And just to ask... What are your temps under load? Also is it the default fan curve?


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No, never. I was going to go a different route; long story short, my whole system is down and out right now, and I am doing some contemplating on what to do next.
> 
> The 390X being down to $400 now seems pretty sweet; CrossFire is no longer an option for me because of my layout.
> 
> I'm kind of on hiatus right now.


Damn man, you started this whole thread. Well, maybe really go for Skylake: get the ASRock OC Formula or MSI Titanium (the Gigabyte G1 is just too expensive over in Europe) to get enough spacing between the cards, and go CrossFire with that 390... well, at least that's what I'm thinking right now







.... in the long run, that is, and once money starts growing on trees over here


----------



## navjack27

How can you tell if your GPU is getting the performance it is supposed to be getting? I'm sure I'm fine, but sometimes I wonder to myself. This is supposed to be AMD's flagship under the Fury cards. It's a 390X and maybe I expected more? It's fast, don't get me wrong, just sometimes things don't seem to add up. Or maybe that's just the feeling of going team red sometimes.


----------



## Dundundata

Quote:


> Originally Posted by *Higgenbobber*
> 
> The difference between all these cards at stock settings is pretty negligible-- and even with overclocking, the most FPS you'll probably gain is like 5 (if you're already over 60 and manage a good OC).
> 
> Let's be honest, all this overclocking stat-chasing is just a perversion of ours.










Truly. It's oddly fun though.


----------



## navjack27

Yep. There is just something magical about finding a max clock that's only going to be used for ONE TEST and bragging rights, or for personal self-conflict because you couldn't FIND a high clock.


----------



## Rob27shred

Quote:


> Originally Posted by *Dundundata*
> 
> Those i7's chips are quite impressive. Clocked mine up to 4.6 awhile back just messing around and it was no sweat, but never saw the need so been running it stock.
> 
> I did try out the 15.7.1 driver and didn't really notice any difference. Went back to the beta just for the heck of it


The beta is probably the best to go with right now, since they are still working the kinks out of the 300 series cards. That is a pretty nice OC on your CPU, but I agree there is really not much to be gained gaming-wise from OCing a newer CPU. If you have a CPU that is a few gens old, then maybe there is some performance to be gained, but my 6700K & your 4790K still have a very strong kick at stock speeds.


----------



## bazookatooths




----------



## navjack27

Quote:


> Originally Posted by *bazookatooths*


Sanity check, thx: http://www.3dmark.com/3dm/8662632
+50 core voltage
+13 aux
+50 power limit
1140/1650

My room is hot and the card got to 75C; around that speed, even with PowerPlay disabled, it clocks down at or near that temp.

Going up to 1150/1650:
+75 core voltage
+25 aux
+50 power limit
Hit about 78-80C.
No downclocking.
Some artifacts.
http://www.3dmark.com/3dm/8662741


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> Sanity check, thx: http://www.3dmark.com/3dm/8662632
> +50 core voltage
> +13 aux
> +50 power limit
> 1140/1650


How do you know when to start adding AUX voltage? For example why did you +13 aux instead of increasing the core voltage more?


----------



## navjack27

Okay, how I overclock:

Start with memory at a known max, which is 1650.
Start with core at the factory overclock, which was 1100.
Run with the power limit at max, it don't hurt nuffin.
Set up my fan curve to go up to 100% when it's at 55C.
Set it so it's at 0% at 40C.
Run a baseline benchmark.

Go up by 10MHz on the core until I get a crash.
Go down by 5MHz and add +5mV on the core.
Keep adding the latched voltages until I get a run with minimal artifacts.
Keep going up on core MHz until I just don't feel comfortable anymore.
Aux I'm still not sure about. I think it's a voltage that helps the stability of the other voltages being supplied, but it never goes down when I run at full load. The core voltage fluctuates like CRAZY.
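That back-off-and-bump loop can be sketched in Python. Everything below is a hypothetical stand-in for what you'd do by hand in Afterburner: `run_benchmark` is a placeholder, and the simulated card in `make_sim` is purely illustrative, not a model of any real 390/390X.

```python
def make_sim(limit_at_offset):
    """Simulated card: a benchmark run passes iff the core clock is at or
    below a stability limit that rises with the voltage offset."""
    def run_benchmark(clock_mhz, mv_offset):
        return clock_mhz <= limit_at_offset(mv_offset)
    return run_benchmark

def find_max_core(run_benchmark, start=1100, mv_cap=100):
    """Mirror of the manual loop: +10 MHz until a crash, then back off
    5 MHz and latch +5 mV, until the voltage cap is reached."""
    clock, mv = start, 0
    while True:
        if run_benchmark(clock + 10, mv):
            clock += 10                      # stable: step the core up
        elif mv + 5 <= mv_cap:
            clock, mv = clock - 5, mv + 5    # crashed: back off, add voltage
        else:
            return clock, mv                 # out of voltage headroom

# Hypothetical card: ~1 MHz of headroom per +2 mV above a 1140 MHz floor.
sim = make_sim(lambda mv: 1140 + mv // 2)
print(find_max_core(sim))                    # -> (1190, 100)
```

Real tuning obviously replaces `run_benchmark` with an actual Valley/Firestrike pass and your own artifact judgment; the point is just that the procedure is a greedy search with a fixed back-off step.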


----------



## kalidae

Quote:


> Originally Posted by *Noirgheos*
> 
> Alright, I've asked this multiple times, but I want a definite answer. The XFX is 10MHz less than the MSI. Will that even make a 1FPS difference?


Just buy the XFX 390. The base one, not the overclocked one, because the PCB might be different, I'm not sure. The XFX has great cooling, a backplate, and the card itself looks sweet, PLUS EK has a waterblock for it. So if you ever decide to go full watercooling in the future, you can actually watercool your GPU. That's the problem I have right now... no waterblock for my Nitro 390.


----------



## Geoclock

Quote:


> Originally Posted by *navjack27*
> 
> Sanity check, thx: http://www.3dmark.com/3dm/8662632
> +50 core voltage
> +13 aux
> +50 power limit
> 1140/1650
> 
> My room is hot and the card got to 75C; around that speed, even with PowerPlay disabled, it clocks down at or near that temp.
> 
> Going up to 1150/1650:
> +75 core voltage
> +25 aux
> +50 power limit
> Hit about 78-80C.
> No downclocking.
> Some artifacts.
> http://www.3dmark.com/3dm/8662741


Same here, can't pass 1150 when a regular 390 does.


----------



## Noirgheos

Quote:


> Originally Posted by *Geoclock*
> 
> Same here, can't pass 1150 when a regular 390 does.


Yep, out of all the 390X cards, the MSI one seems to have the most failures, or issues.


----------



## Geoclock

Quote:


> Originally Posted by *Noirgheos*
> 
> Yep, out of all the 390X cards, the MSI one seems to have the most failures, or issues.


They have to fix the cooling, I guess, and then we'll RMA it.
That's why they have it on sale.


----------



## bazookatooths

Only gained 46 points in 3DMark going from 1180 to 1200 on the core clock; that's the max I can get with MSI Afterburner. I thought the old 95W 8120 would hold me back a lot more, but I was pleasantly surprised. For stock I run @ 1125/1620 with no extra core voltage.

EDIT: Next upgrade


----------



## Dundundata

Quote:


> Originally Posted by *bazookatooths*
> 
> Only gained 46 points in 3DMark going from 1180 to 1200 on the core clock; that's the max I can get with MSI Afterburner. I thought the old 95W 8120 would hold me back a lot more, but I was pleasantly surprised. For stock I run @ 1125/1620 with no extra core voltage.


Nice, what's your graphics scores look like?


----------



## bazookatooths

Quote:


> Originally Posted by *Dundundata*
> 
> Nice, what's your graphics scores look like?


14065


----------



## MTDEW

Personally I prefer running my Sapphire R9 390x at default clocks of 1055 / 1500 for gaming/daily use as I feel its more than enough for me and I can easily keep my core and VRM temps in the mid 60c's with a quiet fan Profile.

But this is overclock.net and "if everyone else jumped off a bridge....?"









So this is just for fun of course.
Here is what I would consider my GPU's max limits: 1180 / 1700 with +100mv / 100% fan / +50% PWR.
Any more on the core and I get artifacts.
Any more on memory and I get black screens.
GPU temp - 75C / VRM1 temp - 102C / VRM2 temp - 98C (remember: 100% fan speed & +100mv).
So there is no way on earth I'd even try using TriXX to get more voltage and push for 1200 core with these kinds of VRM temps. Besides, as I said, I think these kinds of overclocks have no real use for me other than testing, because I consider these clocks unrealistic for any prolonged use.
1180 / 1700 Firestrike

*1100 / 1700 is what I would consider my GPUs "sweet spot"* since I can hit that without a voltage increase while easily keeping the core and VRM temps under 70c with a quiet fan profile.
Now If I was told I had to overclock, these are the clocks I would personally choose for daily use.
So for those who were asking for 1100/1700 scores to use for comparison a few pages back, here is mine.
1100 / 1700 Firestrike

1140 / 1700 isn't too bad either but requires a mild voltage increase of +30mv....Anything over 1140 requires +50mv or more and my VRM temps start to climb fast.
1140 / 1700 Firestrike

Notice how my physics score actually goes down after 1100mhz on my setup? Odd

Anyway, I'm done "testing" and I'm back to gaming at my default 1055 / 1500 clocks and couldn't be happier for the perf/price I paid.
(I got my Sapphire 390x for $399 with free shipping and free Mad Max game on Newegg....I see the price is back up now and the MSI is now $399 instead, so one could assume that 4th QTR prices/deals should be even better yet)

If you could combine Sapphires core cooling with MSI or XFX VRM cooling, you'd have the perfect air cooled 390/390x IMO.
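For a feel of why the VRM temps climb so fast past +50mv: dynamic power goes roughly as frequency times voltage squared, so a modest clock bump paired with a big voltage bump costs disproportionate power. A sketch (the 1.20 V stock voltage here is a guessed baseline, not Sapphire's actual figure):

```python
# Why +100 mV hurts VRM temps far more than the clock bump alone suggests:
# dynamic power scales roughly with frequency * voltage^2.
# The 1.20 V stock voltage is a guessed baseline, not a measured value.
stock_clock, stock_v = 1055, 1.20
oc_clock, oc_v = 1180, 1.30          # +100 mV on the core

power_ratio = (oc_clock / stock_clock) * (oc_v / stock_v) ** 2
print(f"~{(power_ratio - 1) * 100:.0f}% more dynamic power")  # ~31% for ~12% more clock
```

Roughly a third more heat through the same VRMs for a ~12% clock gain, which is why the "sweet spot" without a voltage bump is so much friendlier.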


----------



## milan616

My VRAM won't go past 1600, sadly, but I'm with you on keeping the volts low. The diminishing returns on core clock just aren't worth it when you are on air cooling. I've got 1100 MHz stable at -37mV. The most annoying thing is that without BIOS edits (no dual BIOS, scary, thanks MSI+Obama) I can't get the idle core voltage down, because the idle/low-usage GPU state plus the 1600 MHz VRAM speed makes it shoot to 1258mV. When I'm actually running something that slots into performance mode, it drops to 1180mV.


----------



## MTDEW

So the MSIs don't have a dual BIOS?
That does kinda suck if they don't! (Not a total deal breaker, but it still sucks nonetheless.)
Do the XFX 390/390X have a dual BIOS?
I know both the Sapphire Nitro 390 and Sapphire 390X do... but that's all I really researched before buying, because my last 6 AMD GPUs have been Sapphire, so I only looked closely at those reviews to be sure they had a dual BIOS before buying.

*EDIT:* Ahhh... I see now, the MSI Gaming doesn't have a dual BIOS and comes with a slightly higher default voltage to help with OCing.
Which kinda throws all our +XXmv and core temp comparisons off a bit.
Of course the MSIs are going to seem to run warmer when compared to other 390/390Xs at the same +XXmv settings, since their base voltage is a bit higher to begin with.
Which makes the MSI VRM cooling seem all the more impressive when comparing at the same +XXmv, too.


----------



## flopper

Quote:


> Originally Posted by *MTDEW*
> 
> Personally I prefer running my Sapphire R9 390x at default clocks of 1055 / 1500 for gaming/daily use as I feel its more than enough for me and I can easily keep my core and VRM temps in the mid 60c's with a quiet fan Profile.
> 
> But this is overclock.net and "if everyone else jumped off a bridge....?"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you could combine Sapphires core cooling with MSI or XFX VRM cooling, you'd have the perfect air cooled 390/390x IMO.


My VRMs don't go over 60C.
Open case and 1140MHz.
Sapphire Nitro 390.
So my sample is doing great on all counts, everything I would ever want.


----------



## milan616

Quote:


> Originally Posted by *MTDEW*
> 
> So the MSIs don't have a dual BIOS?
> That does kinda suck if they don't! (Not a total deal breaker, but it still sucks nonetheless.)
> 
> *EDIT:* Ahhh... I see now, the MSI Gaming doesn't have a dual BIOS and comes with a slightly higher default voltage to help with OCing.


Yeah, if I hadn't picked mine up cheap in a private sale (a warranty replacement someone got, NIB) I would be more upset about it. Hawaii/Grenada BIOS editing is far more complicated than older setups, which had something like four simple core+mem+voltage PowerPlay states, so I've been wary.


----------



## MTDEW

Quote:


> Originally Posted by *flopper*
> 
> My VRMs don't go over 60C.
> Open case and 1140MHz.
> Sapphire Nitro 390.
> So my sample is doing great on all counts, everything I would ever want.


Yeah, that seems doable with the right fan profile/ambient temps.
Like I said in the middle of my post, with my 390x my VRM temps stay below and/or keep pace with my core temps (VRM1 is the warmest) until I start adding +50mv or more; then my VRM temps outpace my core temps pretty quickly. No big deal to me, as I said: as far as I'm concerned, by then you're already at the point where the extra voltage, power usage, and temps outpace any actual real-world FPS gains at those overclocked speeds anyway.

Of course it goes without saying that this is just my personal opinion.









I still think MSI did a great job tailoring their new cooling for the 390/390x to give their GPUs the best chance at max/extreme overclocks on air cooling, IMO.
If you want anything more, you may as well go water cooling, but I personally think you'd have to be sure you hit the silicon lottery and have a "good one" before justifying that expense.
It would suck to slap a block on a 390/390x and find out you've got an overclocking dud regardless of cooling.


----------



## flopper

Quote:


> Originally Posted by *MTDEW*
> 
> Yeah, that seems doable with the right fan profile/ambient temps.
> Like I said in the middle of my post, with my 390x my VRM temps stay below and/or keep pace with my core temps (VRM1 is the warmest) until I start adding +50mv or more; then my VRM temps outpace my core temps pretty quickly. No big deal to me, as I said: as far as I'm concerned, by then you're already at the point where the extra voltage, power usage, and temps outpace any actual real-world FPS gains at those overclocked speeds anyway.
> 
> Of course it goes without saying that this is just my personal opinion.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still think MSI did a great job tailoring their new cooling for the 390/390x to ensure their GPUs have best chance at max/extreme overclocks on air cooling IMO.
> If you want anything more, you may as well go water cooling, but I personally think you'd have to be sure you hit the silicon lottery and have a "good one" before justifying that expense.
> It would suck to slap a block on a 390/390x and find out you've got an overclocking dud regardless of cooling.


Overall, it's 1100-1160MHz for these cards.
The process node, manufacturing, and GCN design seem to pan out there.
I just adjust settings for FPS if I need more of them, rather than stretching the OC.

Plan to run 3x1440p screens for eyefinity next year and then I need a dieshrink card fury 2 or whatever it be named.
That I look forward to this winter.


----------



## componentgirl90

Quote:


> Originally Posted by *Noirgheos*
> 
> And just to ask... What are your temps under load? Also is it the default fan curve?


Temps are 76 degrees Celsius when playing Battlefield 4 at maximum settings. No idea what a default fan curve is!


----------



## MTDEW

Quote:


> Originally Posted by *flopper*
> 
> overall, its a 1100-1160mhz for these cards.
> the process node and manufacturing and design of gcn seems to pan out there.
> I just adjust settings for fps if I need more of them than to stretch oc.
> 
> Plan to run 3x1440p screens for eyefinity next year and then I need a dieshrink card fury 2 or whatever it be named.
> That I look forward to this winter.


Agreed, you're better off turning down some unnoticeable game settings to gain more performance if needed, than running your GPU overclocks on the edge of stability for less of a performance gain.









And yeah, I think we all know we're just "stop gapping" until the new GPUs next year.
I'm just wondering what the pricing will be like, it seems we're already being primed with $650 - $700 flagship GPU pricing....and with the rumored huge performance boost for 16nm who knows how much higher AMD/Nvidia will think customers will be willing to pay for a flagship GPU.
Around this time next year should be interesting.

OK, enough off topic talk for me. (sorry about that guys/gals...it just came out...LOL)


----------



## Agent Smith1984

So, just to give ASUS the benefit of the doubt.....

I got a great price on a strix 390, and will be testing it next.

I'm applying new TIM, and assuming my high-flow case setup should make a nice test bed for it.

I am going to be switching to an ASUS 990FX Sabertooth for its CPU overclockability, and also for its awesome PCIe slot spacing (CrossFire now in the plans again).


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, just to give ASUS the benefit of the doubt.....
> 
> I got a great price on a Strix 390, and will be testing it next.
> 
> I'm applying new TIM, and assuming my high-flow case setup should make a nice test bed for it.
> 
> I am going to be switching to an ASUS 990FX Sabertooth for its CPU overclockability, and also for its awesome PCIe slot spacing (CrossFire now in the plans again).


Cross your fingers. From what I've seen the VRM temps drop drastically when you increase the airflow, which tells me there is poor/no contact between the VRM sinks (if there are any?) and the main cooler.

Test it out anyway with average airflow and then high airflow and see how she does. I'm sure the build quality overall is decent, as it is with all 390s really.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Cross your fingers. From what I've seen the VRM temps drop drastically when you increase the airflow, which tells me there is poor/no contact between the VRM sinks (if there are any?) and the main cooler.
> 
> Test it out anyway with average airflow and then high airflow and see how she does. I'm sure the build quality overall is decent, as it is with all 390s really.


Who better to bring you guys more info on other 390's than the originator of the club right?









I am wondering if the beefed up power circuitry will offer me anything in the over-100mv range if I can cool her down some.

I like the card because it looks great, and it's a two-slot solution..... so if some new Noctua TIM and some high-flow fans keep her happy, she'll have a twin for certain by Christmas.


----------



## Dundundata

Quote:


> Originally Posted by *bazookatooths*
> 
> 14065


Is that your 1180 clock or 1125?


----------



## flopper

Quote:


> Originally Posted by *MTDEW*
> 
> And yeah, I think we all know we're just "stop gapping" until the new GPUs next year.
> I'm just wondering what the pricing will be like, it seems we're already being primed with $650 - $700 flagship GPU pricing....and with the rumored huge performance boost for 16nm who knows how much higher AMD/Nvidia will think customers will be willing to pay for a flagship GPU.
> Around this time next year should be interesting.


The 300 or 390 range of cards next year being better than this year's Fury/980 Ti is most likely.
So while prices will continue to be high, you'll get a whole lot more next year.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Who better to bring you guys more info on other 390's than the originator of the club right?


----------



## bazookatooths

Quote:


> Originally Posted by *Dundundata*
> 
> Is that your 1180 clock or 1125?


This is the best score I can achieve with no artifacts.

http://www.3dmark.com/3dm/8671734?

1200MHz gives small amounts of artifacts in 3DMark; I would need over +100mV, and I don't really want to go above that.

Also, the PowerColor R9 390 is $289.99 with rebate:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131672&cm_re=powercolor-_-14-131-672-_-Product&hootPostID=a3c1de07813199e8a4397b03d081a4d0&RandomID=24216452418221620150923104435


----------



## MitsosTheGreat

GOOD NEWS for sapphire r9 390 users.


----------



## FooSkiii

Hey guys, does anybody on here play League of Legends with an R9 390?
If so, what is your top FPS uncapped at high res?


----------



## componentgirl90

Quote:


> Originally Posted by *flopper*
> 
> the 300 or 390 range of cards next year be better than fury/980ti this year is most likely.
> so while the prices will continue to be high you get a whole lot more next year.


What do you mean the 390 will be better next year?


----------



## Derek129

Well, I sent my R9 390 out for RMA today. I know the problems with my card are most likely driver related, but what the hell, let's see what Gigabyte can tell me. I was also having the shutdown/sleep problem with the card installed in my computer.


----------



## Gumbi

Quote:


> Originally Posted by *FooSkiii*
> 
> Hey guys anybody play League Of Legends on here that has a r9 390???
> if so what is your top fps uncapped at high res.???


Irrelevant, it's a CPU bound game.


----------



## FooSkiii

Quote:


> Originally Posted by *Gumbi*
> 
> Irrelevant, it's a CPU bound game.


kool thnx bro


----------



## tangelo

Quote:


> Originally Posted by *componentgirl90*
> 
> What do you mean the 390 will be better next year?


I think he meant that the next gen cards that will replace 390 series are better.


----------



## Dundundata

Holy thermal paste Batman!

So I got an Xbox One/Windows controller (pretty sweet) today and loaded up Witcher 3. It was a bit hotter in my room, and with a decent OC on, temps were nearing 80C. This just didn't seem right. So I went ahead and voided my warranty (stupid sticker screws!) and I'm glad I did. What I saw was shocking. I think the thermo5000 dab-o-matic at the factory was running haywire that day, because the paste was an absolute gobbed-on mess. There was so much it came out the side of the chip/heatsink interface and congealed on the side. Cleaned everything off and applied a dab of some Cooler Master paste left over from installing my CPU.

Well, temps are much better now, about 10C cooler! Barely touched 70C with a decent OC. I could perhaps reapply a different paste if that would make any difference? But it has worked fine for my CPU.

Sorry, I should have taken some pics of the whole debacle, but it was in the heat of the moment (zing!)


----------



## Mysticking32

Yeah, the thermal paste on the MSI cards in particular seems to be iffy. I have had three R9 390Xs: the first one reached temps of up to 90C, the second up to 85C. So MSI finally sent me one from the factory, and that one only reaches 72C.

So it's definitely a thermal paste issue.


----------



## kalidae

Quote:


> Originally Posted by *MitsosTheGreat*
> 
> 
> 
> GOOD NEWS for sapphire r9 390 users.


Ooooo I want a backplate!!!


----------



## navjack27

So far we see that the cards should have better paste application. Besides that, we are pushing as much as we can from the 390X. Upping volts rids artifacts but introduces heat, and downclocking at times. The best clock is the max memory with the max core you can get with no voltage adjustment.
Yes, a 390 can clock faster, buttttt remember: what the 390 gets with an overclock is what the 390X gets most of the time at the factory overclock. Higher numbers on core and mem ≠ better performance.
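That stock-voltage approach amounts to a simple walk-up search: raise the core in small steps at 0mV offset and back off one step at the first sign of trouble. A minimal sketch of the idea; `is_stable` is a hypothetical stand-in for a manual Heaven/FireStrike pass, not a real API:

```python
# Sketch of the stock-voltage sweet-spot search: step the clock up in
# small increments and stop at the first instability. In practice the
# "stability check" is a manual benchmark run, not a function call.

def find_max_stable_clock(base_mhz, step_mhz, limit_mhz, is_stable):
    """Return the highest clock that passes is_stable(), walking up from base."""
    best = base_mhz
    clock = base_mhz + step_mhz
    while clock <= limit_mhz:
        if not is_stable(clock):
            break          # first artifact/crash: the previous step was the max
        best = clock
        clock += step_mhz
    return best

# Example with a fake stability oracle (pretend artifacts start at 1150 MHz):
fake_oracle = lambda mhz: mhz < 1150
print(find_max_stable_clock(1040, 10, 1250, fake_oracle))  # -> 1140
```

Once you have that core number, repeat the same walk on the memory clock while holding the core there.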


----------



## navjack27

Quote:


> Originally Posted by *MTDEW*
> 
> Personally I prefer running my Sapphire R9 390x at default clocks of 1055 / 1500 for gaming/daily use as I feel its more than enough for me and I can easily keep my core and VRM temps in the mid 60c's with a quiet fan Profile.
> 
> But this is overclock.net and "if everyone else jumped off a bridge....?"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So this is just for fun of course.
> Here is what I would consider my GPUs max limits of 1180 / 1700 with +100mv / 100% fan / +50% PWR
> Any more on the core I get artifacts.
> Any more on memory I get black screens.
> GPU temp - 75c / VRM1 temp - 102c / VRM2 temp -98c (remember 100% fan speed & +100mv)
> So there is no way on earth I'd even try using Trixx to get more voltage and try for 1200 core with these kind of VRM temps, besides as I said I think these kinds of overclocks have no real use for me anyway other than for testing, because I consider these clocks unrealistic for any prolonged use.
> 1180 / 1700 Firestrike
> 
> *1100 / 1700 is what I would consider my GPUs "sweet spot"* since I can hit that without a voltage increase while easily keeping the core and VRM temps under 70c with a quiet fan profile.
> Now If I was told I had to overclock, these are the clocks I would personally choose for daily use.
> So for those who were asking for 1100/1700 scores to use for comparison a few pages back, here is mine.
> 1100 / 1700 Firestrike
> 
> 1140 / 1700 isn't too bad either but requires a mild voltage increase of +30mv....Anything over 1140 requires +50mv or more and my VRM temps start to climb fast.
> 1140 / 1700 Firestrike
> 
> Notice how my physics score actually goes down after 1100mhz on my setup? Odd
> 
> Anyway, I'm done "testing" and I'm back to gaming at my default 1055 / 1500 clocks and couldn't be happier for the perf/price I paid.
> (I got my Sapphire 390x for $399 with free shipping and free Mad Max game on Newegg....I see the price is back up now and the MSI is now $399 instead, so one could assume that 4th QTR prices/deals should be even better yet)
> 
> If you could combine Sapphires core cooling with MSI or XFX VRM cooling, you'd have the perfect air cooled 390/390x IMO.


The only place you beat me was in physics, really. I disable hyperthreading on my i7.
http://www.3dmark.com/3dm/8662741

with hyperthreading
http://www.3dmark.com/3dm/8655791


----------



## Weird0ne

http://www.techpowerup.com/gpuz/details.php?id=uw3d4

My OCs seem stable; I just got the card today.
1150/1725 w/ 50% PL and +75mV.

Gonna lower my clocks, VRM1 is climbing to 110C apparently...
Or are the VRMs rated for 110C+ on the XFX DD 390?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Weird0ne*
> 
> http://www.techpowerup.com/gpuz/details.php?id=uw3d4
> 
> My OC's seem stable, just got the card today.
> 1150/1725 w/ 50% PL and +75mV.
> 
> Gonna lower my clocks, VRM1 is climbing 110 apparently...
> Or are the VRM rated for 110+ on XFX DD 390?


VRMs are rated pretty high, but the sweet spot seems to be between 70-80C on my card.

Now, I love overclocking as much as the rest of us do, but the real-world gains from it sometimes aren't worth the extra heat.


----------



## Weird0ne

Quote:


> Originally Posted by *Sgt Bilko*
> 
> VRM's are rated pretty high but the sweet spot seems to be in between 70-80c on my card.
> 
> Now i love overclocking as much as the rest of us do but real world gains from it sometimes arent worth the extra heat


Just ran +50mV/50PL doing 1135/1650, and the VRM temps were just as high.
Might just run 0mV/50PL and leave it at that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Weird0ne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> VRM's are rated pretty high but the sweet spot seems to be in between 70-80c on my card.
> 
> Now i love overclocking as much as the rest of us do but real world gains from it sometimes arent worth the extra heat
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just ran +50mV/50PL doing 1135/1650, the VRM was just as high temped.
> Might just run 0mV/50PL and leave it at that.

If you can get 1100/1650 on stock voltage with just the power limit bumped up, then I'd leave it there and call it a day.


----------



## Weird0ne

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If you can get 1100/1650 on stock voltage with just power limit bumped up then id leave it there and call it a day


I can actually do 1100/1700 stock.
Gonna push it to my max on stock


----------



## bazookatooths

Quote:


> Originally Posted by *Dundundata*
> 
> Holy thermal paste Batman!
> Well temps are much better now about 10C cooler! Barely touched 70C with a decent OC. I could perhaps reapply a different paste if that will make any difference? But it has worked fine for my CPU.
> Sorry I should have took some pics of the whole debacle but it was in the heat of the moment (zing!)










Nice, now that sounds about right. Curious though, what is your VRM temp with a mild OC after gaming a while?


----------



## Rob27shred

Quote:


> Originally Posted by *Dundundata*
> 
> Holy thermal paste Batman!
> 
> So I got an XboxOne/Windows controller (pretty sweet) today and loaded up Witcher 3. A bit hotter in my room and with a decent OC on temps were nearing 80C. This just didn't seem right. So I went ahead and voided my warranty (stupid sticker screws!) and I'm glad I did. What I saw was shocking. I think the thermo5000 dab-o-matic at the factory was running haywire that day, because the paste was an absolute gobbed on mess. There was so much it came out the side of the chip/heatsink interface and congealed on the side. Cleaned everything off and applied a dab of some cooler master paste leftover from installing my CPU.
> 
> Well temps are much better now about 10C cooler! Barely touched 70C with a decent OC. I could perhaps reapply a different paste if that will make any difference? But it has worked fine for my CPU.
> 
> Sorry I should have took some pics of the whole debacle but it was in the heat of the moment (zing!)


Wait, so you did this to your 390? I thought those temps sounded kinda high in your previous posts here. You've got me worried now, though, since we have the same card. My temps are fine ATM, but it will drive my OCD into overdrive thinking about the mess of thermal paste possibly in my card. I have some MX-4 and Cooler Master paste laying around, and also some of the cleaning and conditioning products from AS left over... I think I'll just keep an eye on my temps, and if they start creeping up as time passes, I'll do the same as you did.


----------



## Gumbi

I may end up getting a 390 Nitro on an RMA and am interested in its cooling performance vs the Vaporx.

From what I've seen, the VRMs are a bit warmer on the Nitro. At 50% fan speed, how hot do the core and VRMs get on a bench run of Heaven at 1080p maxed?


----------



## flopper

Quote:


> Originally Posted by *Weird0ne*
> 
> http://www.techpowerup.com/gpuz/details.php?id=uw3d4
> 
> My OC's seem stable, just got the card today.
> 1150/1725 w/ 50% PL and +75mV.
> 
> Gonna lower my clocks, VRM1 is climbing 110 apparently...
> Or are the VRM rated for 110+ on XFX DD 390?


If it reaches 110 Celsius, the cooling is off.


----------



## Vellinious

I figured this would be a good place to find a few guys with a 390 to take part in a little experiment a guy is doing in a Google+ group. Details in the link. Would like to see some really strong 390 scores.

https://goo.gl/85bCL4


----------



## Gumbi

Quote:


> Originally Posted by *flopper*
> 
> If it reaches 110 Celsius, the cooling is off.


110C seems insanely high. Something is definitely wrong, especially given the relatively safe voltage of +50mV... I would make sure your cooler is mounted correctly, optimise your case airflow, etc.

What are your ambients like?


----------



## Weird0ne

Quote:


> Originally Posted by *flopper*
> 
> If it reaches 110 Celsius, the cooling is off.


Quote:


> Originally Posted by *Gumbi*
> 
> 110 seems insanely high. Something is definitely wrong, especially given the relatively safe voltage of 50mv... I would say make sure your cooler is mounted correctly, optimise your case airflow etc.
> 
> What are your ambients like?


Right now it's 45~/45~/50~

Dropped to 40~ all across the board currently while doing nothing.


----------



## Gumbi

Quote:


> Originally Posted by *Weird0ne*
> 
> Right now it's 45~/45~/50~


Err, context? What do you mean?

Do a bench run of Heaven 1080p max at stock settings and report your core/vrm results. And include your ambient temps too.


----------



## Weird0ne

Quote:


> Originally Posted by *Gumbi*
> 
> Err, context? What do you mean?
> 
> Do a bench run of Heaven 1080p max at stock settings and report your core/vrm results. And include your ambient temps too.


Ambient temps.


----------



## Gumbi

Quote:


> Originally Posted by *Weird0ne*
> 
> Ambient temps.


You gave 3 numbers. Which is the ambient one? And how hot does the card (core and VRMs) get under load in the Heaven benchmark?


----------



## rdr09

Quote:


> Originally Posted by *Vellinious*
> 
> I figured this would be a good place to find a few guys with a 390 to take part in a little experiment a guy is doing in a Google+ group. Details in the link. Would like to see some really strong 390 scores.
> 
> https://goo.gl/85bCL4


we have one local . . .

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

invite them over.


----------



## Vellinious

Quote:


> Originally Posted by *rdr09*
> 
> we have one local . . .
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> invite them over.


Yeah, not many 390s there.

He wants to compare the 390 vs the 970 on the "Extreme" preset. No idea why he chose that, but...I digress. Just a casual, fun thing. /shrug

Ya just can't find much from the 390 owners.


----------



## omslemming

Add me to the member list thanks









Proof (GPU-Z link): http://www.techpowerup.com/gpuz/details.php?id=eeb8b

Manufacturer and model: MSI Gaming 8G 390X

Cooling: Aftermarket


----------



## omslemming

I've a question about the cooling on the 390X. I've got the MSI Gaming, and I'm aware of the silent running feature, with the fans not kicking in till 65C.

The thing is, I'm not sure exactly how hard I should push this card. I did a basic overclock yesterday to a 1150 core clock and 1650 memory clock with a slight voltage adjustment of +50, together with an overclock on my processor, and I got some great results; my linked 3DMark score is http://www.3dmark.com/fs/6055092. The thing is, I've since toned it down, as I found that when gaming like this the card was reaching 80C+ and I didn't feel comfortable at that temperature. It's been a change for me anyway, as I upgraded from an HD 7970, which didn't have the silent fan mode, so going from a card that idles at 30-40 degrees to one that idles at 65C took some getting used to without feeling a bit `iffy`.

So yeah, I'm new to the 390X; what can this thing handle? Is 80-83C an OK temperature to keep up for an extended session without damage, or was I right to tone the OC down a bit? (I'm back on the MSI Gaming software's stock OC settings now, core at 1100 and memory at 1525, and temperatures hit about 65-70 when gaming, which to me feels comfier... `but` am I being too paranoid? Can I push this more without damage? I'm pretty new to the OC scene and the 390X, so I'm still working out what's `OK` to be running at, temps-wise.)


----------



## tangelo

The postman delivered today.
So please Agent Smith, add me to the club.

Going to do more in depth overclocking later, but here is what I'm running at the moment.



MSI R9 390 Gaming 8G
Stock cooling. Running 1100/1550.
Stock volts. +20 Power Limit
Temps during Valley: 72-73C with fans at 45-55%, with my own fan curve in action. Ambient temp inside the case ~30-35C.
ASIC quality 70.8%, so I guess it's an average chip(?)


----------



## Gumbi

Quote:


> Originally Posted by *omslemming*
> 
> Ive a question about the cooling on the 390x, ive got the msi gaming and im aware of the silent running feature with the fans not kicking in till 65 c
> 
> The thing is im not sure exactly how hard I should push this card? I did a basic overclock yesterday to 1150 core clock and 1650 memory clock with a slight voltage adjustment of +50 together with an overclock on my processer and I got some great results my linked 3dmark score is http://www.3dmark.com/fs/6055092 The thing is ive since toned it down as I found when gaming with this the card was reaching 80 c+ and I didnt feel comfortable at this temperature. its been a change for me anyway as I upgraded from a hd 7970 which didnt have the silent fan mode so going from a card that idles at 30-40 degrees to one that idles at 65 c took some getting used to as it was without feeling a bit `iffy`
> 
> so yeah im new to the 390x what can this thing handle? is 80-83 c an ok temperature to keep up for an extended session without damage or was I right to tone the oc down a bit(im back on the msi gaming software stock oc settings now of core at 1100 and memory at 1525 and tempratures hit about 65-70 when gaming, which to me feels comfier....`but` am I being to paranoid, can I push this more without damage? pretty new to the oc scene and the 390x so still working out whats `ok` to be running at temps wise.


Yup, it's fine. Make a custom fan curve in MSI Afterburner if you want cooler temps.
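For anyone new to fan curves: conceptually, a custom curve just maps GPU temperature to fan duty cycle, with linear interpolation between the points you set. A minimal sketch of that mapping; the points below are made-up examples, not Afterburner defaults:

```python
# Illustrative fan curve: (temperature C, fan duty %) points, with linear
# interpolation in between -- the same shape you'd drag out in the
# Afterburner curve editor. These example points are arbitrary.

CURVE = [(40, 20), (60, 40), (75, 65), (85, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Fan duty (%) for a given GPU temperature, clamped at the end points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(round(fan_speed(70), 1))  # -> 56.7
```

Steeper points in the 70-85C region are what keep these cards from soaking at high temps under load.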


----------



## bazookatooths

http://www.extremetech.com/gaming/214834-fable-legends-amd-and-nvidia-go-head-to-head-in-latest-directx-12-benchmark

Looks like we are outperforming the GTX 980. A good day indeed.


----------



## Dundundata

Quote:


> Originally Posted by *bazookatooths*
> 
> 
> 
> 
> 
> 
> 
> 
> Nice now that sounds about right, curious tho what is your VRM temp with a mild oc and gaming awhile?


I ran a little test on Witcher 3 with my 1125/1700 OC +63mV and here are the results


----------



## bazookatooths

Quote:


> Originally Posted by *Dundundata*
> 
> I ran a little test on Witcher 3 with my 1125/1700 OC +63mV and here are the results


Thank you, that looks exactly like mine! I usually hit a max of 80-81C on the VRM.


----------



## geoffropuff

Hi guys, I just got my brand new MSI 390X yesterday, but I have a few questions. First, does it show up in your Device Manager as "R9 390 series" rather than mentioning the 390X? I figure this is normal behavior, but I wanted to check.

Second, I don't know about on your machines but I am running one DVI and one displayport but the screens mirror each other on the bios screens until the desktop loads. I know this isn't a big deal but also wondering if it's normal behavior.

Third, I wasn't able to catch it on video, but when loading the blue Welcome screen for Windows, the desktop, or the blue shutting-down screen, it loads strangely. Rather than an instant load of the whole page, it seems to load from the bottom up, and I can very briefly see tearing on the screen. This does not seem normal. Is it?

Lastly, I have not tried to OC or anything, because I wanted to see that I could game at normal settings before messing with anything. I've only had time to play one game, Heroes of the Storm, but I experienced some weird artifacting. Here's a link to a video of the issue: 




It seems to be isolated to that one section, as you can see it only happens on the arch with the Anubis, but not on the other walls, the floor, or the gold pieces.

I have no idea what could be causing the artifacts or the screen tearing. Anyone have any idea?


----------



## Gumbi

Yes, it's normal for it to be called "390 series".


----------



## Gumbi

Quote:


> Originally Posted by *Gumbi*
> 
> I may end up getting a 390 Nitro on an RMA and am interested in its cooling performance vs the Vaporx.
> 
> From what I've seen the vrms are a bit warmer for the Nitro. At 50% speed speed, how hot does the core and vrms get on a bench run of Heaven 1080p maxed?


Anyone?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gumbi*
> 
> Anyone?


It's the best cooler there is, well, besides the Vapor-X. But the difference is really small.


----------



## tangelo

Any idea what's going on here? I tried reinstalling Valley and it changed nothing. Also, Heaven shows zero degrees at all times.


----------



## Gumbi

Quote:


> Originally Posted by *Ha-Nocri*
> 
> It's the best cooler there is, well beside vapor-x. But the difference is rly small.


That's what I gather, but from what I've read here, the VRMs sometimes get fairly hot. My Vapor-X keeps my VRMs in the 60s with ease, even at high voltages.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gumbi*
> 
> That's what I gather, but from what I've read here the VRMs sometimes get fairly hot. My Vapor X keeps my VRMs in the 60s with ease even at high voltages.


I really don't think so. My 290 is a Tri-X, which is like the previous version of the Nitro cooler, and the VRMs are well under control. I don't think the new version is worse...


----------



## diazz694

Hello, I want to buy the Gigabyte Radeon R9 390 8GB WindForce 2X, but I read on Amazon and Newegg that many users have a reboot issue:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125792

Is it better to go for MSI or ASUS?


----------



## Streetdragon

diazz694: Go Sapphire!

BTW, thanks for not answering my question / any help!
But here it is: an R9 290 Vapor-X crossfired with an R9 390 Nitro.




With a small Overclock:
http://www.3dmark.com/3dm/8692415?


----------



## kizwan

Quote:


> Originally Posted by *tangelo*
> 
> Any idea whats going on in here? Tried reinstalling Valley and it changed nothing. Also, Heaven shows zero degrees all times.


Known bug. You won't get correct temp.


----------



## Gumbi

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I rly don't think so. My 290 is Tri-X, which is like previous version of Nitro cooler, and VRM's are well under control. I don't think new version is worse...


When I say hot, I'm holding them to the high standard of my Vapor-X, i.e., getting into the 80s or more when pumping in extra voltage.

Like I said, my Vapor-X can keep even +100mV in the mid-to-low 60s.


----------



## sportsczy

It's interesting... although my OC ran as smooth as can be in the benchmarks (+70mV/1170/1750), I was crashing randomly when I was doing nothing GPU related. Got a few "machine error" blue screens too. I started testing things... and found that things stabilized if I added aux voltage. So instead of a flat +70mV to the core, I split that into +50mV core and +25mV aux. Seems to work so far.


----------



## Weird0ne

XFX 390 DD @ 1100/1700
Using Aida64 sensors for the temps.

Core/VRM1/VRM2
40/40/45~ at idle.
74~/85~/85~ after running Heaven.
+75mV 1145/1700
80~/95~/80~ after running heaven.

~ denotes +/- 1-3 in this case.


----------



## Dundundata

Quote:


> Originally Posted by *Weird0ne*
> 
> XFX 390 DD @ 1100/1700
> Using Aida64 sensors for the temps.
> 
> Core/VRM1/VRM2
> 40/40/45~ at idle.
> 74~/85~/85~ after running Heaven.
> +75mV 1145/1700
> 80~/95~/80~ after running heaven.
> 
> ~ denotes +/- 1-3 in this case.


Seems high. Did you set up a fan curve for your GPU?


----------



## Weird0ne

Quote:


> Originally Posted by *Dundundata*
> 
> seems high, did you setup a fan curve for your GPU?


No, I did not; I'm using the Auto setting.
My VRM1 was hitting 120C+ when I was running FurMark, but it didn't go that high in Heaven.


----------



## battleaxe

Quote:


> Originally Posted by *Weird0ne*
> 
> No I did not, using the Auto setting.
> My VRM1 was hitting 120+ when I was running Furmark but didn't go that high in Heaven.


Wow... don't do that again... FurMark is not good for GPUs.

120C+ is not good for GPUs either.


----------



## Weird0ne

Quote:


> Originally Posted by *battleaxe*
> 
> Cow... don't don't do that again... Furmark not good for GPU's.
> 
> 120C+ not good for GPU's either.


I didn't know FurMark was bad, but I knew 80-90C+, depending on the card, is bad lol.


----------



## diggiddi

Quote:


> Originally Posted by *Streetdragon*
> 
> Diazz694: Go Sapphire!
> 
> BTW Thx for not answering my question/ any help!
> But here it is. R9 290 Vapor-X crossfired with a R9 390 Nitro
> 
> 
> 
> 
> With a small Overclock:
> http://www.3dmark.com/3dm/8692415?


Yours got the backplate heh


----------



## tangelo

Quote:


> Originally Posted by *Streetdragon*
> 
> Diazz694: Go Sapphire!
> 
> BTW Thx for not answering my question/ any help!
> But here it is. R9 290 Vapor-X crossfired with a R9 390 Nitro
> 
> 
> 
> 
> With a small Overclock:
> http://www.3dmark.com/3dm/8692415?


May I ask what widget you use on the right side of the screen to show temps etc.?


----------



## Dundundata

Quote:


> Originally Posted by *Weird0ne*
> 
> I didn't know Furmark was bad, but I knew 80 - 90+ depending on the card is bad lol.




That's mine. If you're using something else, try turning it up to, say, 65% and testing again. Don't know if you saw my post, but I had to reapply the thermal paste on my card. Not sure how this relates to VRM temp, though.


----------



## Weird0ne

Quote:


> Originally Posted by *Dundundata*
> 
> 
> 
> That's mine. If you're using something else try turning it up to say, 65% and testing again. Don't know if you saw my post but I had to reapply the thermal paste on my card. Not sure how this relates to VRM temp though.


Thanks for the curve picture








From what I know from building computers and aftermarket GPU coolers, adding new thermal paste won't affect the VRM temps unless I also apply a thermal solution to the pads with a heatsink.

And I just got this 390 from Best Buy using gift cards, and it has a lifetime warranty, so I'm not sure how eager I am to open up a 3-day-old card to change the paste xD


----------



## Mysticking32

Quote:


> Originally Posted by *Weird0ne*
> 
> Thanks for the curve picture
> 
> 
> 
> 
> 
> 
> 
> 
> From what I know from building computers/after market GPU coolers, adding new thermal paste won't affect the VRM temps unless I apply a thermal solution onto the pads with a heatsink.
> 
> And I just got this 390 from bestbuy using giftcards and it has lifetime warrenty, so I'm not sure how forward I am to opening up a 3 day old card to change the paste xD


If you just bought it, you should be able to take it to Best Buy for a replacement. I think their policy is within 14 days.


----------



## Weird0ne

Quote:


> Originally Posted by *Mysticking32*
> 
> if you just bought it you should be able to take it to best buy for a replacement. I think their policy is within 14 days.


Inform them I want an exchange because I don't like the temps I'm getting? xD


----------



## Mysticking32

Quote:


> Originally Posted by *Weird0ne*
> 
> Inform them I want an exchange because I don't like the temps I'm getting? xD


Lol. Your temps seem fine to me. But if you are going to take it back I'd do it within the time frame.


----------



## Weird0ne

Quote:


> Originally Posted by *Mysticking32*
> 
> Lol. Your temps seem fine to me. But if you are going to take it back I'd do it within the time frame.


I think I'm alright with my temps xD


----------



## Coruscation

Hi people. I may become a 390X owner later this very day, but there's an issue I badly want a reliable answer to. I was directed to this thread and figured you guys would know.

The issue concerns power supply. When I asked for advice on the most popular forum in my native country, people said a 550w PSU would do just fine for the Sapphire 390x. I looked around and decided to go for a 650w PSU just to stay on the safe side. Or so I thought. But looking and asking around more I'm developing serious concerns. In some places I'm told I actually need 750w. 750w is what the retailer recommends, but from what I read people say those values are vastly inflated. On Tom's Hardware they recommend 700w for a single 390x. Perhaps that is with plenty of overclocking room accounted for, though.

But that all makes me wonder if I was really given a horribly wrong recommendation that would have screwed up my system completely from my fellow countrymen PC enthusiasts. I've looked up as many load tests around the web as I can, and a full system with a 390x never seems to even hit 500w, though it is between 400w and 500w. I'm completely new to building PCs so I may be misunderstanding something, but doesn't this indicate a 650w PSU should be fine? I know that the brand and model are important. I am on a tight budget, so the one I went for was the Fractal Design Integra M 650w. It's rated as tier 3: safe and stable to use on Tom's Hardware, just not perhaps for overclocking, but as this is my first build ever I'm going to steer away from OC for the time being at least.

If the official owners club could alleviate this worry I'd feel safe going for that Sapphire 390x which is on a fantastic price level in my country right now, being only about $30 more than the 390. Can I feel safe with a 650w PSU? The idea that I'd have to get an expensive, high-wattage PSU just to be able to use this card at all without risking all hell breaking loose really throws me off, and if that's the case I won't be able to go for it due to budget concerns, and NVIDIA's 970 would be the other option.

I apologize for the lengthy post, but I hope someone has the time to answer. The card looks great and at such a price it's a steal, but this worry about the PSU is eating away at me.


----------



## flopper

Quote:


> Originally Posted by *Coruscation*
> 
> Hi people. I'm a potentially prospective 390x owner later this very day but there's an issue I badly want an answer to that I know I can rely on, I was directed to this thread and I figure you guys would indeed know.
> 
> The issue concerns power supply. When I asked for advice on the most popular forum in my native country, people said a 550w PSU would do just fine for the Sapphire 390x. I looked around and decided to go for a 650w PSU just to stay on the safe side. Or so I thought. But looking and asking around more I'm developing serious concerns. In some places I'm told I actually need 750w. 750w is what the retailer recommends, but from what I read people say those values are vastly inflated. On Tom's Hardware they recommend 700w for a single 390x. Perhaps that is with plenty of overclocking room accounted for, though.
> 
> But that all makes me wonder if I was really given a horribly wrong recommendation that would have screwed up my system completely from my fellow countrymen PC enthusiasts. I've looked up as many load tests around the web as I can, and a full system with a 390x never seems to even hit 500w, though it is between 400w and 500w. I'm completely new to building PCs so I may be misunderstanding something, but doesn't this indicate a 650w PSU should be fine? I know that the brand and model are important. I am on a tight budget, so the one I went for was the Fractal Design Integra M 650w. It's rated as tier 3: safe and stable to use on Tom's Hardware, just not perhaps for overclocking, but as this is my first build ever I'm going to steer away from OC for the time being at least.
> 
> If the official owners club could alleviate this worry I'd feel safe going for that Sapphire 390x which is on a fantastic price level in my country right now, being only about $30 more than the 390. Can I feel safe with a 650w PSU? The idea that I'd have to get an expensive, high-wattage PSU just to be able to use this card at all without risking all hell breaking loose really throws me off, and if that's the case I won't be able to go for it due to budget concerns, and NVIDIA's 970 would be the other option.
> 
> I apologize for the lengthy post, but I hope someone has the time to answer. The card looks great and at such a price it's a steal, but this worry about the PSU is eating away at me.


Manufacturers often pad their wattage recommendations because a lot of people run cheap, low-rated PSUs that can't handle sustained load that well.
Basically, they add extra watts to protect themselves from RMA situations.
A good-quality PSU will work even at 500W; check the amperage on the 12V rail.
A 650W unit with Bronze certification is fine.
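The advice to check the amps on the 12V rail comes down to simple arithmetic: amps times 12V gives the usable wattage, and you compare that against your biggest loads. A quick sketch in Python; all the wattage figures in it are ballpark illustrations, not measured values for any specific system:

```python
# Rough PSU headroom check for a single-GPU build.
# All component wattages below are illustrative estimates.

def rail_watts(amps_12v: float) -> float:
    """Usable wattage on the +12V rail(s)."""
    return amps_12v * 12.0

def headroom(psu_12v_amps: float, gpu_w: float, cpu_w: float,
             rest_w: float = 50.0) -> float:
    """Watts left on the 12V rail after the major loads (rest_w covers
    drives, fans, and the board)."""
    return rail_watts(psu_12v_amps) - (gpu_w + cpu_w + rest_w)

# e.g. a 650W unit with 54A on the 12V rail, a ~330W peak 390X,
# and a ~90W CPU under gaming load:
spare = headroom(54, gpu_w=330, cpu_w=90)
print(f"{spare:.0f} W spare")  # prints "178 W spare"
```

Since reviews put peak draw for these cards around 275-330W, any decent 650W unit with 50A or more on the 12V rail leaves comfortable headroom.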


----------



## Coruscation

Thank you for the input. I've had it pointed out to me elsewhere that if I go with a 650w and run into some kind of problem with the system, customer support might have grounds for turning me away because it could be considered my own fault for undershooting in wattage even if the problem really had nothing to do with it.

I'm raising the bar to a 750W PSU of the same series, just to be safe in that respect too, since it's only a bit over $10 more. But the 390X it is.


----------



## Exenth

Had my R9 390 for a month and started OCing it yesterday. I've now reached a stable OC and I'm quite happy with the performance gain.
Great card!


----------



## areamike

Current Sapphire Nitro 390 owner checking in.

Just got my card 3 days ago and haven't had much time to play with it yet. I am hoping to do a little overclocking on it, but not much. I will say, this is one BIG card. Makes my old XFX 7850 look like a baby.


----------



## Streetdragon

Quote:


> Originally Posted by *diggiddi*
> 
> Yours got the backplate heh


Yes. The newer cards have it!


Quote:


> Originally Posted by *tangelo*
> 
> May I ask what is that widget you use on the right side on the screen to show temps etc?


It's the Open Hardware Monitor.
Nice and easy to use^^

Hmm, the Nitro gets some hot VRM temps. With an OC, VRM1 reaches 100°C and VRM2 80°C. Not cool.


----------



## Rob27shred

Quote:


> Originally Posted by *Exenth*
> 
> 
> 
> Had my R9 390 for a month started OCing it yesterday, now i have reached a stable Oc and i'm quite happy with the performance gain i had
> Great Card


Nice, I'm running almost the same values on my XFX 390 and have been very stable at them as well.


----------



## yafatana

Hello, I just want to ask if my Seasonic G650 can run my Sapphire R9 390 Nitro Tri-X. Sapphire recommends a 750W PSU, but according to reviews the card pulls around 330W when gaming. My other parts: i7 6700K, Z170X Gaming 5, Noctua NH-D15, G.Skill 3000MHz, 1 SSD, 3 hard drives, 1 optical drive, 8 system fans, Seasonic G650. I just built the PC but haven't played any games yet.


----------



## fyzzz

Quote:


> Originally Posted by *yafatana*
> 
> Hello I just want to ask if my seasonic g650 can run my sapphire r9 390 nitro trix which according to sapphire they recommend 750 watt psu and according to reviews the card pulls 330 watt when gaming my other parts i7 6700k z170x gaming 5 noctua nh d15 gskill 3000mhz 1 ssd 3 hard drives 1 optical drive 8 system fans seasonic g650 I just built that pc but I did not played any game yet


Not a problem. The recommendation usually accounts for cheap PSUs, so your 650 will handle it just fine.


----------



## derkington

Can I join?


----------



## kizwan

Quote:


> Originally Posted by *yafatana*
> 
> Hello I just want to ask if my seasonic g650 can run my sapphire r9 390 nitro trix which according to sapphire they recommend 750 watt psu and according to reviews the card pulls 330 watt when gaming my other parts i7 6700k z170x gaming 5 noctua nh d15 gskill 3000mhz 1 ssd 3 hard drives 1 optical drive 8 system fans seasonic g650 I just built that pc but I did not played any game yet


The 750W recommendation is to cover all the bases. They don't know the specs of their customers' PCs, so 750W is pretty much a safe bet. That G650 is plenty for a 390.


----------



## semiroundboss

Has anyone overclocked higher than 1200 MHz?


----------



## Sgt Bilko

Quote:


> Originally Posted by *semiroundboss*
> 
> Has anyone overclocked higher than 1200 MHz?


I haven't tried yet; I'm going to wait until I reinstall Windows and get my 3DMark sorted out so I can do some more Quadfire benches.

It does take quite a lot to get to 1200 though, so I don't expect it to get any higher than that.


----------



## Dundundata

Anyone else (I think at least one of you) getting high-pitched buzzing/coil whine with their XFX? I've started to notice it more. I ordered another card to test, and I'm pretty sure it's the inductors. It happens when the GPU is under load; for instance, I hear it playing Witcher 3, but as soon as I pause it stops. It gets louder with an OC.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dundundata*
> 
> anyone else (i think at least one of you) getting high pitch buzzing/coil whine with their xfx? Ive started to notice it more. I ordered another to test it out, pretty sure it's the inductors. Happens when gpu is under load. For instance i hear it playing witcher3, but as soon as i pause it stops.


I've had a little bit of it here and there. It's kind of funny... reminds me of a little 2-stroke engine sometimes haha

I've been tempted to RMA, but it's not annoying me and it's not doing it as much anymore.


----------



## Dundundata

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've had a little bit of it here and there, It's kind of funny....reminds me of a little 2-stroke engine sometimes haha
> 
> I've been tempted to RMA but it's not annoying me and it's not doing it so much anymore


At first I only noticed it in some menus, but it's more constant now. Then again, it might have been like that all along and I just didn't notice. In any case, I'll be testing a different card Monday and see how it goes; who knows, it could be a bad PSU/GPU combo.

Maybe I should run it for a while and see if it changes at all.


----------



## Agent Smith1984

Once the saber-kitty and new PSU show up, I'll be pushing this 390 Strix to its limits...

Pics to come!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Once the saber-kitty and new psu show up, I'll be pushing this 390 strix tip it's limits...
> 
> Pics to come!


I will really be looking forward to seeing what the Strix is like.....hopefully you get a great clocker!


----------



## Noirgheos

Can anybody with a Sapphire 390/390X get on support chat with them? Apparently I need to own a Sapphire GPU to start a chat. I'd like you to ask whether they plan to release a 390X with a backplate and higher clocks, not just the 390.


----------



## rdr09

Quote:


> Originally Posted by *Gasbah*
> 
> Hello from Finland!
> My second post in here. I have a 4690k and two R9 390's running in crossfire. I already posted the picture in Corsair air 540 owners club but Im gonna share it here too. Love what you guys are doing here!
> Running the gpu's at 4k gets pretty hot so Im looking forward to watercooling solutions in the future. I have never done custom water cooling so Im a noob there.


Have you watercooled yet? We have a similar setup. I have a thin 240 up top and a thick 360 in the front . . .


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I will really be looking forward to seeing what the Strix is like.....hopefully you get a great clocker!


If she's got 1200MHz in her, I'll find it! The Asus has a bad rap on temps, but I think some new TIM, a custom fan profile, and my high-flow case fans will keep things under control.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If she's got 1200mhz in her, I'll find it! The asus has a bad rap on temps, but i think some new tim, custom fan profile, and my high flow case fans will keep things under control.


The question is, is that gonna compensate for fundamentally bad cooling, or bring out the best in her?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> The question is, is that gonna compensate for fundamentally bad cooling, or bring out the best in her?


That will be the question....

HardOCP got 1140MHz on their Strix with no added voltage, and 1185MHz at +75mV, so it should fall in line with the MSI from a performance standpoint...

My concern will be keeping the VRMs below 80°C, and what I'll have to do to make that happen....


----------



## flopper

Never owned a card that had coil whine.
Not sure if I'm lucky or if the hardware I buy just works.

Anyhow, my 390 is about as close to perfect as it gets.


----------



## Schwulibertz

Hey guys,

got my R9 390 from Sapphire some weeks ago but never really put any effort into OC'ing, so I spent the last 3 days figuring out the max values my card can handle.

I ended up with this:
GPU Clock: 1145
Mem Clock: 1700
VDDC Offs.: 106



http://www.techpowerup.com/gpuz/details.php?id=umxs2

I even ended up buying 3DMark on Steam.
Here are my results:
http://www.3dmark.com/3dm/8716738?&locale=en_GB

I also ran the Unigine Heaven Benchmark:


Unrelated question:
What CPU would you recommend as an upgrade for my i5 4670K?









Great forum keep it up!









G.


----------



## Waleh

Hello all! I'm planning on building a high end mini itx system and wanted to use an XFX 390X as my GPU (the XFX is one of the only iterations that will fit the case). My question is, how will the 390x handle 1440p if I want around 60 FPS? Should I get a 1080p monitor instead? I'm currently using an old 720p monitor and I think at this point anything will be a huge improvement. The GPU will be paired with a 6700k for reference. I enjoy playing games like BF4, GTA5, and other AAA titles. I've seen some benchmarks but I figured I'd ask the folks here to get an alternate point of view.


----------



## Gumbi

Quote:


> Originally Posted by *Waleh*
> 
> Hello all! I'm planning on building a high end mini itx system and wanted to use an XFX 390X as my GPU (the XFX is one of the only iterations that will fit the case). My question is, how will the 390x handle 1440p if I want around 60 FPS? Should I get a 1080p monitor instead? I'm currently using an old 720p monitor and I think at this point anything will be a huge improvement. The GPU will be paired with a 6700k for reference. I enjoy playing games like BF4, GTA5, and other AAA titles. I've seen some benchmarks but I figured I'd ask the folks here to get an alternate point of view.


Honestly, it'll be fine. If you drop the graphics settings down a bit you'll get great performance.


----------



## Dundundata

@Schwulibertz; 4790k, but you might want to just keep what you have and maybe OC it a bit

@Waleh; 1440p


----------



## tangelo

Spent the evening OCing the card.

Stopped at 1150/1600 as the temps started to go a little too high for my taste. It was stable, but I don't think I'm going to run it at these clocks. Gonna stick with 1100/1600 in the future.

During Fire Strike both the GPU and VRM went up to 81°C with fan speed at 82%.

http://www.3dmark.com/fs/6092301


----------



## MrDave

I had an ASUS R9 290X OC that would constantly black-screen crash while sitting idle at the desktop, even though I could game all night in 3D mode without a single crash. I bought it in January 2015. I tried so many GPU tweaks and timings and even bought an overpowered single-rail power supply, but nothing cured the crashes: not any driver version from 2013 through current, and not fresh clean installs of three operating systems (Win 7.1, Win 8.1, and Windows 10).

In early August 2015 I finally submitted the card to ASUS for an RMA. After 10 days it was still sitting in "Received but Waiting" status. I contacted ASUS, and they said they didn't have the parts or a replacement card to ship me and were sending a letter with an upgrade offer. To my surprise, I was offered a STRIX R9 390X OC DCUIII 8GB GDDR5 Gaming edition card. I gladly accepted, and a week later I received a brand new retail-packaged unit.

ASUS made good on providing me with a card that no longer crashes my system while idle! I know it's the same Hawaii core, but it's a later generation, and the idle clock jumping that caused my lockups doesn't appear to happen at the 2D desktop. I am one happy camper with this card and upgrade!


----------



## Waleh

Quote:


> Originally Posted by *Dundundata*
> 
> @Schwulibertz; 4790k, but you might want to just keep what you have and maybe OC it a bit
> 
> @Waleh; 1440p


So I won't really have any issues at 1440p? I really want a 1440p monitor


----------



## Mysticking32

Quote:


> Originally Posted by *Waleh*
> 
> So I won't really have any issues at 1440p? I really want a 1440p monitor


You'll have to tone down some settings on GTA 5 to get 60 FPS consistently, but yes, the 390X is perfect for 1440p. I run BF4 with everything maxed at 1440p and get 60 FPS.


----------



## danielhowk

I've heard the R9 390 outspecs the GTX 970 in most cases.
Which is the best brand to get for the R9 390? This is my first time shopping for an ATI card; I've been using Nvidia all my life and am hoping to get the best experience from ATI.


----------



## gerpogi

Quote:


> Originally Posted by *danielhowk*
> 
> heard the r9 390 outspec the gtx 970 in most cases.
> which is the best brand to get for r9 390 ? this is my first time shopping for ati card. been using nvidia for all my life. hoping i'd get the best experience from ati .
> so which is the best brand to get for r9 390?


Other than Gigabyte, anything will do. The more popular brands are PowerColor, MSI, and Sapphire. Sapphire cards do not have backplates, though. Gigabyte 390s are voltage-locked and run hotter than the others, but they do look sexy.


----------



## diggiddi

Correction: the new Sapphire cards do have backplates now.


----------



## Mysticking32

The best 390 to get is the MSI one. And Sapphire cards do have backplates now, at least the last time I checked.

MSI seems to have the best overclocking capability and has a nice backplate.


----------



## gerpogi

Quote:


> Originally Posted by *diggiddi*
> 
> Correction the new Sapphire cards do have backplates now


Not in production yet, though. From what I read they are still planning to, so it kind of still doesn't exist.


----------



## bazookatooths

Could anyone post brands with VRM temps under 80°C at full load with an OC? Thank you in advance.


----------



## diggiddi

Quote:


> Originally Posted by *Streetdragon*
> 
> Yes. The newer cards have it!
> 
> Its the Open Hardware Monitor.
> Nice and easy to use^^
> 
> Hmm the Nitro get some hot Vrm Temps. With OC VRM1 reach 100C and VRM2 80C not cool


Quote:


> Originally Posted by *gerpogi*
> 
> Not being produced yet though. They are still planning to from what I read so it kinda still doesn't exist


See the post I just quoted from 3 or so pages back


----------



## bazookatooths

Quote:


> Originally Posted by *gerpogi*
> 
> Other than gigabyte, anything will do. More popular brands are powercolor, msi, and sapphire. Sapphire cards do not have backplates though. Gigabyte 390's are voltage locked and run hotter compared to others but It does look sexy though.


The most popular cards are MSI, Sapphire, and XFX, by miles.


----------



## danielhowk

Quote:


> Originally Posted by *bazookatooths*
> 
> Most popular cards are MSI Sapphire and XFX by miles.


Quote:


> Originally Posted by *diggiddi*
> 
> See the post I just quoted from 3 or so pages back


Quote:


> Originally Posted by *bazookatooths*
> 
> Anyone could post brands with VRM temp under 80c with OC full load. Thank you in advance.


Quote:


> Originally Posted by *gerpogi*
> 
> Not being produced yet though. They are still planning to from what I read so it kinda still doesn't exist


Does the Sapphire R9 390 turn its fans off at idle, like the MSI? Do the fans only come on once it passes 60°C?


----------



## gerpogi

Quote:


> Originally Posted by *bazookatooths*
> 
> Most popular cards are MSI Sapphire and XFX by miles.


Didn't I just say that? Other than XFX, of course. Doesn't matter.
Quote:


> Originally Posted by *danielhowk*
> 
> does sapphire r9 390 at idle does the fan off ? like msi ? once 60c pass only the fan on ?


I don't really know, but their website does say it's possible for the fans to stop completely.


----------



## Schwulibertz

Quote:


> Originally Posted by *danielhowk*
> 
> does sapphire r9 390 at idle does the fan off ? like msi ? once 60c pass only the fan on ?

The Sapphire's fans DO turn off below a certain temperature.
You can change that temperature with Sapphire's TriXX tool by changing the fan profile.


----------



## iludez

Hey guys, my mate just bought a Sapphire R9 390 Tri-X, and when he plays War Thunder it does a weird little alt-tab for a few seconds.
This did not happen with the 980 he had. Any ideas? He's running a clean install of Win 10 with drivers updated; all Nvidia drivers were removed beforehand.


----------



## Schwulibertz

Quote:


> Originally Posted by *Dundundata*
> 
> @Schwulibertz; 4790k, but you might want to just keep what you have and maybe OC it a bit
> 
> @Waleh; 1440p


I already OC'd the i5 4670K.
It comes at 3.4GHz stock; I now run it at 3.8GHz.


----------



## perithimus

I have a thermaltake smart 650w 80+ bronze. Will I need to upgrade my psu for a 390? here's a link to my psu

http://www.bestbuy.com/site/thermaltake-smart-series-650w-bronze-power-supply-black/8733872.p?id=1219365660388&skuId=8733872&ref=199&loc=rGMTN56tf/w&acampID=7252269&siteID=rGMTN56tf_w-VHh82Rf7VTq9y.KL8grZCQ

If I don't need to upgrade my psu I might just get a 390x instead. My specs are here

Corsair carbide spec 01
Intel i7 4790
gigabyte z97n gaming 5
1tb hdd
8gb adata ram
Pending
thermaltake 650w psu
windows 10 pro


----------



## Schwulibertz

Quote:


> Originally Posted by *perithimus*
> 
> I have a thermaltake smart 650w 80+ bronze. Will I need to upgrade my psu for a 390? here's a link to my psu
> 
> http://www.bestbuy.com/site/thermaltake-smart-series-650w-bronze-power-supply-black/8733872.p?id=1219365660388&skuId=8733872&ref=199&loc=rGMTN56tf/w&acampID=7252269&siteID=rGMTN56tf_w-VHh82Rf7VTq9y.KL8grZCQ
> 
> If I don't need to upgrade my psu I might just get a 390x instead. My specs are here
> 
> Corsair carbide spec 01
> Intel i7 4790
> gigabyte z97n gaming 5
> 1tb hdd
> 8gb adata ram
> Pending
> thermaltake 650w psu
> windows 10 pro


I have a Thermaltake 630W 80+ and everything runs fine, so you should be good.


----------



## bazookatooths

Quote:


> Originally Posted by *gerpogi*
> 
> Didn't I just say that? Other than xfx ofcourse. Doesn't matter
> I don't know really but it does say on their website that it's possible for the fans to stop completely



Z's

Well, don't cry, I was just correcting you; PowerColor is not remotely close to those 3.


----------



## POLJDA

Update MSI r9 390


----------



## danielhowk

Should I go with the MSI R9 390 or the Sapphire R9 390?
It seems a close call, but which is better? The price is the same.


----------



## Gumbi

Mother of God, may we see a bench run of Heaven maxed at 1080p? Is that 1250MHz stable? How hot do the core/VRMs get? You have a gooooolden chip, my friend.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *POLJDA*
> 
> Update MSI r9 390
> 
> 
> 
> 
> Mother of God, may we see a bench run of Heaven maxed at 1080p? Is that 1250mhz stable? How hot does the core/vrms get. You have a gooooolden chip my friend.
Click to expand...

That wouldn't be stable with just +100mV... but I would love to be proven wrong....


----------



## navjack27

I guess this is for anyone who reads this thread and wonders what another MSI 390X owner who overclocks uses as a 24/7 fan curve.
Here:
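The attached screenshot doesn't come through here, so for anyone wondering what a custom fan curve actually is: it's just a list of (temperature, fan %) points that the fan controller linearly interpolates between. A minimal sketch; these particular points are invented for illustration and are not the actual 24/7 curve from the post:

```python
# A fan curve is a set of (temp °C, fan %) points; between points the
# controller interpolates linearly. These points are made up for
# illustration, not taken from any specific card or post.

CURVE = [(40, 20), (60, 40), (75, 70), (85, 100)]

def fan_speed(temp_c: float) -> float:
    """Fan duty (%) at a given core temperature, clamped at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # prints 60.0
```

The steeper the slope between the hottest points, the more aggressively the fans ramp under a heavy OC, which is exactly the trade-off against noise that a 24/7 curve tunes.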


----------



## tangelo

Quote:


> Originally Posted by *bazookatooths*
> 
> Anyone could post brands with VRM temp under 80c with OC full load. Thank you in advance.


My MSI R9 390's VRM runs stable at 70-72°C with OC clocks of 1100/1600 after hours of 100% GPU usage.
At 1150/1600 the VRM rose to 80+ on a single Fire Strike run. It might be lower if I ran the fans at 100%, but I haven't tried.


----------



## MitsosTheGreat

Best card for OC (390)? The MSI R9 390 or the Sapphire R9 390 (with backplate)?


----------



## Schwulibertz

So yesterday my PC crashed after playing The Crew for a few hours.
Oddly enough, it ran just fine until I changed the settings one too many times.








Today I decided to redo the numbers for my R9 390 Nitro (thanks to @navjack27 for his little OC'ing instruction on page 281).

Im now at the following values:
Core Voltage: +94
Power Limit: +50
Core Clock: 1150
Memory Clock: 1700

Fire Strike:
http://www.3dmark.com/3dm/8727582?


http://www.techpowerup.com/gpuz/details.php?id=zhand

I just ran Unigine Heaven for 30 minutes and the temperatures on my card leveled out at about 78°C.










So far I think that's about all I can get out of this card, but I'm pretty happy.

Also, it's a little bit silly how much fun I'm having with this.

Oh... and f*** Sapphire for not selling their card with a backplate from the beginning.

G.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That wouldn't be stable with just +100mV......but i would like to be proved wrong....


ME TOO!!!

@POLJDA

Please give us a Fire Strike run at those clock speeds. I have you added, but I'd seriously like to see some proof of stability on that OC, because if you are getting 1250 at +100mV, you surely have a gem on your hands.


----------



## Mysticking32

His card must be praised!


----------



## semiroundboss

Wow, never knew that existed. I wonder if it's as bad as the 290/290X reference cooler.


----------



## MrDave

http://www.3dmark.com/3dm/8729827

My card is an ASUS STRIX R9 390X OC, even though 3D Mark reports it as a 290X.


----------



## Faster_is_better

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Once the saber-kitty and new psu show up, I'll be pushing this 390 strix tip it's limits...
> 
> Pics to come!


Did you punish that card yet?







Curious how well it clocks.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Faster_is_better*
> 
> Did you punish that card yet?
> 
> 
> 
> 
> 
> 
> 
> Curious how well it clocks.


Not yet, waiting on the other goodies to show up









Don't worry though, I'll keep you posted.

I'm shooting for 1180 core minimum.


----------



## Dundundata

Quote:


> Originally Posted by *Mysticking32*
> 
> His card must be praised!


----------



## Mysticking32

AMD has just released the 15.9 beta driver. I can't do any testing tonight; got some work to do for class in the morning. But I'll test it eventually.


----------



## Cannon19932006

http://www.3dmark.com/fs/6099606

on the 15.9 beta at 1175/1750


----------



## ChaosAD

Just got my brand new MSI 390X yesterday, for the price of a 390. Did some testing today. My first test was to see how low I could go on voltage: I managed to run Valley at 1080/1500 with -100mV; at 1100 core I got artifacts. Here is my score.


----------



## Gumbi

Quote:


> Originally Posted by *ChaosAD*
> 
> Just got my brand new MSI 390X yesterday, for the price of a 390. Did some testing today. My first test was to see how low i can go with Voltage. I managed to run Valley @ 1080/1500 with -100mv, at 1100 core i got artifacts. Here is my score.


Pretty sweet dude. How cool were the temps at that voltage? And how high did the fan spin?


----------



## Weird0ne

What program should I use to test stability?
I ran my R9 390 at 1150/1750.
Quote:


> Originally Posted by *ChaosAD*
> 
> Just got my brand new MSI 390X yesterday, for the price of a 390. Did some testing today. My first test was to see how low i can go with Voltage. I managed to run Valley @ 1080/1500 with -100mv, at 1100 core i got artifacts. Here is my score.


+100mV/50PL for 1080/1500 seems pretty bad.
Mine runs +75mV/50PL @ 1145/1700


----------



## Gumbi

Quote:


> Originally Posted by *Weird0ne*
> 
> What program should I test stability on?
> Ran my R9 390 to 1150/1750
> +100mV/50PL for 1080/1500 seems pretty bad.
> Mine runs +75mV/50PL @ 1145/1700


He said -100mV (minus). I'm presuming it's minus, as +100mV would be hilariously bad for 1080MHz.

I could do 1140MHz core at +25mV on my 290.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> He said - 100mv (minus). I'm presuming it's minus, as 100mv is hilariously bad for 1080mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I could do 1140mhz core at 25mv on my 290.


Yeah, he's undervolting by 100mV and getting 1080; that's pretty impressive.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, he's undervolting 100mv and getting 1080, that's pretty impressive


For sure, I'd love to see power draw/thermal numbers for that. 1080/1500 undervolted is no joke; with heavy overvolting these cards max out at 1200/1750 AT BEST.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> For sure, I'd love to see power draw/thermal numbers for that. 1080/1500 is no joke, with heavy overvolting the cards are maxing out at 1200/1750 AT BEST.


I can't remember whether the option is there or not, but I wonder if it's possible to undervolt aux as well?


----------



## ChaosAD

Quote:


> Originally Posted by *Weird0ne*
> 
> What program should I test stability on?
> +100mV/50PL for 1080/1500 seems pretty bad.


Fire Strike needs a little more voltage than Valley to be stable, so better to test with that.
I said -100mV, which is undervolting.

I also ran FS and noticed that I had to raise the voltage to -70mV to run without artifacts. So it's 1080/1500 at -70mV/+50PL, max temp 66°C with 24°C ambient! (Idles at 33°C with the fan at 30%.)

Edit: You can undervolt aux and I can test it, but first tell me what it's for!
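The method above of stepping the voltage down until artifacts appear, then backing off one notch, can be written out as a simple search. A sketch; `run_stability_test` is a hypothetical stand-in for actually benching Valley or Fire Strike at each offset and watching for artifacts:

```python
# Find the lowest stable voltage offset by stepping down, the way the
# post describes. run_stability_test is a placeholder: in practice you
# run Valley or Fire Strike at each offset and check for artifacts.

def find_min_offset(run_stability_test, start_mv=0, floor_mv=-100, step_mv=10):
    """Return the lowest offset (mV) that still passes the test."""
    best = start_mv
    mv = start_mv
    while mv - step_mv >= floor_mv:
        mv -= step_mv
        if run_stability_test(mv):
            best = mv
        else:
            break  # first failure: the previous offset was the limit
    return best

# Toy example: pretend everything down to -70 mV is stable.
print(find_min_offset(lambda mv: mv >= -70))  # prints -70
```

Since each "test" is a long bench run, a coarse step of 10-25mV keeps the search manageable; you can always re-test the final offset with a longer run.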


----------



## Agent Smith1984

Quote:


> Originally Posted by *ChaosAD*
> 
> Firestrike needs a little more voltage than Valley to be stable. So better test with that.
> I said -100mv, which is undervolting
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also run FS and noticed that i have to increase voltage to -70mv to run without artifacts. So its 1080/1500 at -70mv/+50PL, max temp 66C with 24C ambient! (Idles at 33C with fan at 30%)
> 
> 
> 
> Edit: You can undervolt aux and i can test it, but first tell me what is that for!


Well, not sure exactly, but it seems to directly impact the ability to overclock the memory past 1600-1650MHz on most cards, though many have not needed it to reach 1700-1750MHz
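For anyone wondering how much those memory clocks are actually worth, here's a back-of-the-envelope bandwidth calc. The 512-bit bus is Hawaii/Grenada spec, and GDDR5 transfers 4 bits per pin per clock; treat it as a rough sketch, not a guarantee of real-world scaling:

```python
# Rough theoretical-bandwidth estimate for the 390/390X's 512-bit GDDR5 bus.
BUS_WIDTH_BITS = 512

def bandwidth_gbs(mem_clock_mhz):
    """Theoretical bandwidth in GB/s at a given memory clock (MHz)."""
    effective_rate = mem_clock_mhz * 4                  # GDDR5 is quad-pumped (MT/s)
    return BUS_WIDTH_BITS / 8 * effective_rate / 1000   # bytes per transfer * MT/s -> GB/s

print(bandwidth_gbs(1500))  # stock 390: 384.0 GB/s
print(bandwidth_gbs(1750))  # a 1750MHz OC: 448.0 GB/s
```

By that math a 1500-to-1750 bump is about 17% more theoretical bandwidth, which lines up with why memory OCs tend to matter more at higher resolutions.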


----------



## Mysticking32

I want to see the card at 1250 damn it! Stop teasing us! lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *Mysticking32*
> 
> I want to see the card at 1250 damn it! Stop teasing us! lol


I'm almost positive that user is not running a stable 1250mhz....

Not calling anyone a liar, just saying I would certainly need to see proof to truly believe that....


----------



## bazookatooths

1250 ver
Quote:


> Originally Posted by *Gumbi*
> 
> For sure, I'd love to see power draw/thermal numbers for that. 1080/1500 is no joke, with heavy overvolting the cards are maxing out at 1200/1750 AT BEST.


My card refuses to operate at anything above 1200 or 1750, not even a single MHz more

http://www.3dmark.com/fs/6099186

As for the physics score, I've read it is more CPU-intensive; Intel chips get higher scores than AMD CPUs thanks to superior single-core speed and more IPC


----------



## Dundundata

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, not sure exactly, but it seems to directly impact the ability to overclock the memory past 1600-1650MHz on most cards, though many have not needed it to reach 1700-1750MHz


My XFX does 1700 with ease; the core clock is another story. I can get 1140, but it gets tough after that. I'm gonna work on getting a stable 1150/1700 (or thereabouts) for fun.


----------



## RicoDee

Serious upgrade: my faithful GTX 580 died on me last week, so I wanted to try something different, and hopefully better too







MSI R9 390 ...

RicoDee


----------



## Weird0ne

1150/1725 @ +75mV/50PL
Quote:


> Originally Posted by *Dundundata*
> 
> my xfx does 1700 with ease, the core clock is another story. i can get 1140 but it gets tough after that. i'm gonna work on getting a stable 1150/1700 (or thereabouts) for fun.


My 390 can push 1145/1700 on +75mV.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ChaosAD*
> 
> Firestrike needs a little more voltage than Valley to be stable. So better test with that.
> I said -100mv, which is undervolting
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also run FS and noticed that i have to increase voltage to -70mv to run without artifacts. So its 1080/1500 at -70mv/+50PL, max temp 66C with 24C ambient! (Idles at 33C with fan at 30%)
> 
> 
> 
> Edit: You can undervolt aux and i can test it, but first tell me what is that for!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, not sure exactly, but it seems to directly impact the ability to overclock the memory past 1600-1650MHz on most cards, though many have not needed it to reach 1700-1750MHz
Click to expand...

Well afaik the AUX voltage is PCIe slot voltage (at least that's the conclusion we came to in the 290/x owners club) and the only Hawaii based card with memory voltage unlocked is the R9 290x Lightning.


----------



## Mysticking32

When you guys overclock, do you leave it there all the time or only apply it when needed? When I overclock to 1160/1650 I get about a 4 to 5 fps increase with +100mV. I don't think I'd want to be pushing those settings daily.

And so far I haven't really needed the extra 5 fps. I switched to high settings in The Witcher 3 so I could get a consistent 60+ fps. If I ran it on ultra, the extra 5 fps would come in handy.

Here are my benchmark scores on stock r9 390x vs overclocked.
http://www.3dmark.com/compare/fs/5995609/fs/5995655
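To put that 4-5 fps in perspective, a quick sketch of the scaling (the 1050MHz stock core below is an assumption based on the reference 390X; your card's shipping clock may differ):

```python
# Compare the size of the core overclock to the size of the fps gain.
stock_core, oc_core = 1050, 1160   # MHz (stock clock assumed, adjust for your card)
stock_fps, gained_fps = 60, 5      # roughly the numbers reported above

core_gain = (oc_core - stock_core) / stock_core * 100
fps_gain = gained_fps / stock_fps * 100

print(f"core +{core_gain:.1f}%, fps +{fps_gain:.1f}%")
```

So a ~10% core bump buying ~8% more fps is actually decent scaling; whether it's worth +100mV of heat daily is another question.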


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well afaik the AUX voltage is PCIe slot voltage (at least that's the conclusion we came to in the 290/x owners club) and the only Hawaii based card with memory voltage unlocked is the R9 290x Lightning.


Wonder if the aux voltage now controls the memory instead, but the third-party software hasn't been updated to reflect it? Sounds crazy, but there is no doubt that my memory was stuck at 1600MHz without some aux voltage. I dunno? Wish there was some more information....


----------



## Noirgheos

No one upgrade to 15.9 drivers. They have a memory leak bug and cause crashing.


----------



## zorack

Hey there guys, this is my first message and I'm here because even after some reading and research I still cannot decide what to do. I must admit I am kinda paranoid and that negatively affects my decision-making, so I wanted you guys' advice. Btw this thread is really nice, but it's also gigantic









The question would be: MSI R9 390 or Sapphire R9 390. Here are the facts I am taking into account:

I am not a huge performance freak, but I want my R9 to be a long-term card, so a 5-10% performance difference could be a big deal. The Sapphire 390 has a 1010MHz base clock and is expected to top out around 1125MHz, while the MSI has a 1060MHz base clock and is expected to top out around 1200MHz.

However, I really don't like noisy/loud vgas, and I've heard some great comments about Sapphire 390 temperatures and noise, and some mixed comments about MSI 390 temps and noise, some people also reported to have weird high temperatures even on idle with the MSI one. I believe both cards do not have problems with coil whine, is this correct?

Another thing to consider: I am not from the US, and the MSI 390 is 30 dollars more expensive than the Sapphire one; that would be around an 8% price difference.

So, what you guys think I should go for? thanks in advance to people willing to help a fool like me


----------



## Mysticking32

Quote:


> Originally Posted by *Noirgheos*
> 
> No one upgrade to 15.9 drivers. They have a memory leak bug and cause crashing.


I'm on the 15.9 drivers now and I haven't had a single crash. They seem to be working fine for me?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mysticking32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noirgheos*
> 
> No one upgrade to 15.9 drivers. They have a memory leak bug and cause crashing.
> 
> 
> 
> I'm on the 15.9 drivers now and I haven't had a single crash. They seem to be working fine for me?
Click to expand...

I installed them last night and hit 4GB of VRAM usage on the desktop... and Mad Max was stuttering all over the place. Back on the 15.7.1 drivers now and it's butter smooth


----------



## Schwulibertz

Quote:


> Originally Posted by *zorack*
> 
> Hey there guys, this is my first message and I'm here because even after some reading and research I still cannot decide what to do. I must admit I am kinda paranoid and that negatively affects my decision-making . So i wanted you guys advice. Btw this thread is really nice but it's also gigantic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The question would be: MSI R9 390 or Sapphire R9 390. Here are the facts I am taking into account:
> 
> I am not a huge performance freak but I want my R9 to be a long term vga, and therefore 5-10% performance difference could be a big deal. I should remember that Sapphire 390 has 1010 mhz base clock and is expected to overclock limit around 1125 mhz, and the MSI has base clock 1060 mhz and is expected to overclock limit around 1200 mhz.
> 
> However, I really don't like noisy/loud vgas, and I've heard some great comments about Sapphire 390 temperatures and noise, and some mixed comments about MSI 390 temps and noise, some people also reported to have weird high temperatures even on idle with the MSI one. I believe both cards do not have problems with coil whine, is this correct?
> 
> Another thing to be considered: I am not from the US, and the MSI 390 is 30 dollars more expensive than the Sapphire one. that would be around a 8% price difference.
> 
> So, what you guys think I should go for? thanks in advance to people willing to help a fool like me


If you really aren't into a rather loud rig, I think overclocking itself is kinda problematic, because you will always need a higher fan speed to compensate for the higher temps caused by increased clocks and/or power consumption.

I can't complain about my Sapphire R9 390, but ideally I would want watercooling for a more silent setup, and there are no matching watercooling units for the Nitro (yet).
That's something I regret about getting this card.

So I think it would be a better idea to just go with a reference-design card to avoid incompatibility with future upgrades.
Unless of course you don't want/need watercooling.


----------



## Cannon19932006

15.9 seems to have a massive memory leak; I randomly see 8GB of usage just on the desktop.


----------



## Ha-Nocri

Yes, there is a memory leak in the 15.9 drivers, every time you resize a window. Really bad. If you have MSI Afterburner you can watch it fill up as you resize.


----------



## Dundundata

Quote:


> Originally Posted by *Mysticking32*
> 
> When you guys overclock do you leave it there all the time


No, most of the time I run stock. For newer games I'll do a decent OC, but not max


----------



## Dundundata

XFX coil whine update: got a second card to try out, same problem. Maybe a PSU issue? Ran Witcher 3 overnight on OC, worked fine. Temps are good, but the high-pitched noise can be annoying. Trying to break the card in, if that's even a thing. Need to try some other games that will push the card, and try the old 15.7.1 driver.

If I can't solve this I may sell the card and get an MSI. Any whine issues with that card? I am totally fine with fan noise. If temps are an issue I can reapply paste. I know the MSI 390 OCs quite well; is it worthwhile to grab a 390X?


----------



## Mik4

XFX R9 390X card with 15.8 drivers here and no coil whine. I have watercooling on it, but I don't think it makes much of a difference.


----------



## componentgirl90

I get a noise too from my XFX DD 390X. It sounds like something in between a buzz and a clicking. It also gets more like a high-pitched squeal at times. It's not very audible through the case, but it's definitely there.

I have used the card for a week or so now and it hasn't gone away. I don't know if this signifies future problems or not. I read that it doesn't usually mean anything serious will happen, but I don't know because I am not an expert.

I can't hear it when gaming. I can't even hear the normal card noise itself with headphones in. I have to say this is probably the noisiest card I have ever had. It sounds a bit like a low-powered washing machine on full go. Tbh this is what I would want a high-end card to sound like. I am not bothered by the heat, noise or power consumption.

It is a beast of a card.


----------



## Mysticking32

New drivers that fix the memory leak are up. 15.9.1 beta


----------



## Judge Dredd 3D

Quote:


> Originally Posted by *componentgirl90*
> 
> I get a noise too from my XFX DD 390x. It sounds like something in between a buzz and a clicking. It also gets more like a high pitch squeal at times. Its not very audible through the case, buts its definitely there.
> 
> I have used the card for a week or so now and it hasn't gone away. I don't know if this signifies future problems or not. I read that it doesn't usually mean anything serious will happen but I don't know because I am not an expert.
> 
> I can't hear it when gaming. I can't even hear the normal card noise itself with heaphones in. I have to say this is probably the noisiest card I have ever had. It sounds a bit like a low powered washing machine on full go. Tbh this is what I would want a high end card to sound like. I am not bothered by the heat, noise or power consumption.
> 
> It is a beast of a card.


So the dual MSI R9 390Xs are going back for a refund; I dared to dream that I could run these in a CF configuration.
It was like a small nuclear reactor inside my case, btw I had good airflow and a brand new Corsair HX1200i PSU because I was concerned about the coil whine.
The coil whine is real on the MSI R9 390X, equally bad or worse than on my XFX R9 390X DD.
Trying to run the MSI cards in CF resulted in artifacts even at the normal stock 1080MHz; it didn't matter that I placed the second card in the last PCI slot with the side panel removed on my case... the games and benchmarks would stutter due to thermal throttling. Those cards are now going back for an RMA refund. My point is: yes, the XFX suffers from coil whine, and yes it's a little noisy above 45% fan speed, but it is a good overclocker and runs cooler than the MSI. Also the MSI fans are LOUD...

Update on my rig: sold my XFX R9 390X, RMA'd my 2x MSI R9 390X and ordered a Zotac GTX 980 Ti AMP! Extreme. I want to love ATI (ermm, AMD), but their offerings right now run too hot and all of their new cards have horrible coil whine (yes, I also had a Sapphire R9 Fury Tri-X).


----------



## Cannon19932006

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> So the dual MSI R9 390X are going back for a Refund, I dared to dream that I could run these in a CF configuration.
> It was like a small nuclear reactor inside my case, btw I had good air flow and a brand new Corsair HX1200i PSU because I was concerned about the coil whine.
> The coil whine its real on the MSI R9 390X equally or worst than my XFX R9 390X DD.
> Trying tu run the the MSI in CF resulted in artifacts running on the normal stock 1080Mhz gpu, it didn't mattered I placed the second card on the last pci slot with the side panel removed on my case... the games and benchmarks would studer due to thermal Throttle. Those cards are now going back for rma refund. My point is Yes, the XFX suffers from Coil Whine and yes its a little noisy after 45% fan speed but it is a good overcloker and runs cooler than the MSI. Also the MSI fans ar3 LOUD...


I can't say for Crossfire, but I literally have none of the other problems with my MSI 390x that you listed, maybe just a really bad card.


----------



## Judge Dredd 3D

Quote:


> Originally Posted by *Cannon19932006*
> 
> I can't say for Crossfire, but I literally have none of the other problems with my MSI 390x that you listed, maybe just a really bad card.


I had two MSI R9 390Xs; both had the coil whine and had a hard time running in CF. I specifically replaced my Corsair HX1050 PSU with a Corsair HX1200i because I suspected the PSU was causing the coil whine on the XFX... but nope, the XFX and MSI cards still had the coil whine problem. Yes, I also ran Valley for a while to see if that could ease the coil whine, but after a week of daily usage it was still there. R9s have a coil whine problem, plain and simple.
I have my PC behind a UPS from APC, and also replaced my mobo to try to minimize the coil whine... nothing helped.

The next step would be to re-wire the entire electrical system in my house... but that would be me being in denial that the R9s have a coil whine problem.. <--- getting redundant.


----------



## Cannon19932006

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> I had Two MSI R9 390Xs, both had the coil whine and had a hard time running them in CF. I specifically replaced my Corsair HX1050 PSU for a Corsair HX1200i because I had the suspicion of the PSU causing the Coil whine on the XFX... but nop the XFX and MSI cards still had the coil whine proble, yes also ran valley for a while to see if that could ease the coil whine but after a week of daily usage it was still there. *R9s have a coil whine problem plain and simple.*
> I have my PC behind a UPS from APC, also replaced my mobo to try to minimize the coil whine... nothing helped.
> 
> The nest step would be to re-wire my entire electrical system in my house.. but that would be me being in denial that the R9s have a coil whine problem.. <--- getting redundant.


I haven't had any coil whine with either of the 390x's I've had, a Gigabyte 390x g1, and an MSI 390x gaming, and I haven't noticed users complaining about it in this club.


----------



## componentgirl90

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> I had Two MSI R9 390Xs, both had the coil whine and had a hard time running them in CF. I specifically replaced my Corsair HX1050 PSU for a Corsair HX1200i because I had the suspicion of the PSU causing the Coil whine on the XFX... but nop the XFX and MSI cards still had the coil whine proble, yes also ran valley for a while to see if that could ease the coil whine but after a week of daily usage it was still there. R9s have a coil whine problem plain and simple.
> I have my PC behind a UPS from APC, also replaced my mobo to try to minimize the coil whine... nothing helped.
> 
> The nest step would be to re-wire my entire electrical system in my house.. but that would be me being in denial that the R9s have a coil whine problem.. <--- getting redundant.


But why is the coil whine so bad? What is it about the coil whine that you think means you have to replace it? I hardly notice it.


----------



## Geoclock

I have no COIL WHINE on MSI 390x.

So how is new 15.9.1 Beta if you tried it out?


----------



## Dundundata

Could it have something to do with the PSU? I have a Corsair as well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cannon19932006*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Judge Dredd 3D*
> 
> I had Two MSI R9 390Xs, both had the coil whine and had a hard time running them in CF. I specifically replaced my Corsair HX1050 PSU for a Corsair HX1200i because I had the suspicion of the PSU causing the Coil whine on the XFX... but nop the XFX and MSI cards still had the coil whine proble, yes also ran valley for a while to see if that could ease the coil whine but after a week of daily usage it was still there. *R9s have a coil whine problem plain and simple.*
> I have my PC behind a UPS from APC, also replaced my mobo to try to minimize the coil whine... nothing helped.
> 
> The nest step would be to re-wire my entire electrical system in my house.. but that would be me being in denial that the R9s have a coil whine problem.. <--- getting redundant.
> 
> 
> 
> I haven't had any coil whine with either of the 390x's I've had, a Gigabyte 390x g1, and an MSI 390x gaming, and I haven't noticed users complaining about it in this club.
Click to expand...

I've had 2 x R9 290's, 2 x R9 290x's a R9 295x2 and a R9 390x and only the 390x has any sort of coil whine and even then only when the temps get above 80c (which is rare) or at high fps.


----------



## Dead Meat

Quick question to anyone who might know: Does it matter what array of PCI power cables I run to my GPU?

My MSI R9 390 Gaming has the 8+6 pin power interface and my PSU has two 8-pin PCI power outputs ( http://www.newegg.com/Product/Product.aspx?Item=N82E16817182323 ).

Both of the outputs have cables that have the pigtailed dual 8 pin/6+2 pin connectors, so I can hook up the video card with just one cable.

I have tried both configurations, single PCI power to both 8 and 6 pin and each connector on the GPU to one PCI connector on the PSU. I have noticed no significant voltage droop in MSI Afterburner.

Another bit of info that might be helpful: I seem not to be able to OC the card as high as others here on OCN. I'm currently running it at 1125MHz core, 1600MHz memory, +63mV, and +50% power with a considerable fan curve. Temps top out at 88 to 90C, so it's not throttling.


----------



## Cannon19932006

Quote:


> Originally Posted by *Dead Meat*
> 
> Quick question to anyone who might know: Does it matter what array of PCI power cables I run to my GPU?
> 
> My MSI R9 390 Gaming has the 8+6 pin power interface and my PSU has two 8-pin PCI power outputs ( http://www.newegg.com/Product/Product.aspx?Item=N82E16817182323 ).
> 
> Both of the outputs have cables that have the pigtailed dual 8 pin/6+2 pin connectors, so I can hook up the video card with just one cable.
> 
> I have tried both configurations, single PCI power to both 8 and 6 pin and each connector on the GPU to one PCI connector on the PSU. I have noticed no significant voltage droop in MSI Afterburner.
> 
> Another bit of info that might be helpful is that I seem to not be able to OC the card as high as others here on OCN. I'm currently running it at 1125Mhz core, 1600Mhz memory, +63mv, and +50% power with a considerable fan curve. Temps top out at 88 to 90C, so its not throttling.


Your psu has a SINGLE 12V rail so using 1 cable that has the pigtailed 8+6/2 is just fine. For psu's with multiple 12v rails, multiple separate PCIe cables are recommended.


----------



## Dead Meat

Sweet. Thanks man.

I was kind of hesitant to assume, based on the fact that at the peak theoretical load of 300W+ I'd be pulling a maximum of 25A from a single cable. I know that's not a lot for a 12V line, but I was worried about splitting one 12V cable to feed a single card.
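For the math-inclined, here's where those amps come from. The per-connector limits below are the PCIe spec ceilings (75W slot, 75W 6-pin, 150W 8-pin); real cards can transiently exceed them, so take this as a ballpark only:

```python
# Ballpark of where a 390's power budget comes from, per PCIe spec limits,
# and the current flowing through one pigtailed cable feeding both plugs.
RAIL_V = 12.0
limits_w = {"slot": 75, "6-pin": 75, "8-pin": 150}

total_w = sum(limits_w.values())                 # spec ceiling across all sources
cable_w = limits_w["6-pin"] + limits_w["8-pin"]  # both connectors on one pigtail
cable_a = cable_w / RAIL_V

print(total_w)   # 300 W total budget
print(cable_a)   # 18.75 A through the single cable
```

So even at the full spec ceiling, the single pigtailed cable carries under 19A because the slot supplies part of the load, which is why a good single-rail PSU handles it fine.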


----------



## kizwan

Quote:


> Originally Posted by *Dead Meat*
> 
> Sweet. Thanks man.
> 
> I was kind of hesitant to assume based on the fact that at the peek theoretical load of 300W+ I'd be pulling a maximum of 25A from a single 8pin. I know that's not a lot for a 12V line but I was worried about splitting a 12V supply to a single card.


It's always a good idea to use separate PCIe cables directly from the PSU for each PCIe power connector.


----------



## Judge Dredd 3D

Had a Corsair HX1050 and decided to replace it with a Corsair HX1200i; still the same problem.


----------



## Noirgheos

Damn all this coil whine is making me wary of getting a 390X. So far though everyone who's reported it has a Corsair PSU. Could this be a common trend? I have an XFX XTR 750W.


----------



## Mysticking32

I've had 3 r9 390x's and none have coil whine.


----------



## Judge Dredd 3D

Quote:


> Originally Posted by *Noirgheos*
> 
> Damn all this coil whine is making me wary of getting a 390X. So far though everyone who's reported it has a Corsair PSU. Could this be a common trend? I have an XFX XTR 750W.


Don't let that deter you from getting a 390X, the XFX R9 390X is a great card and in my opinion it runs cooler and quieter than the MSI one.
Coil whine is not going to destroy your PC; it only matters if you are building a super quiet PC.


----------



## Noirgheos

Quote:


> Originally Posted by *Judge Dredd 3D*
> 
> Don't let that deter you from getting a 390X, the XFX R9 390X is a great card and in my opinion it runs cooler and quieter than the MSI one.
> Coil whine is not going to destroy your PC, it only means that you have to consider if yoibare building a super quiet pc.


Well I do have an S340 with sound padding on the inside that I ripped from a H440, and a Corsair H80i for my CPU. I try for it to be quiet, but if only the GPU is audible I don't care as long as it's cool.


----------



## FleiPei

Yesterday I got an R9 390 Nitro + backplate. This version comes with a 1040MHz core clock.

*I was able to reach 1100MHz core and 1700MHz memory without increasing the juice.*

At 1110MHz, 3DMark 11 showed slight artifacts. BTW, 3DMark 11 is from my point of view the best benchmark to verify if an OC is stable.

With Unigine Heaven I didn't get artifacts even at 1140 core. BF Hardline showed no artifacts either. 3DMark 11 did









*By increasing the voltage up to +100mV my card reaches 1150MHz stable. But the higher temperature, which results in higher noise, is not worth the performance gain.*


----------



## Noirgheos

Quote:


> Originally Posted by *FleiPei*
> 
> Yesterday, I got a r9 390 nitro + backplate. This version comes with 1040 core clock.
> 
> *I was able to reach 1100 mhz core and 1700 mhz memory without increasing the juice.*
> 
> At 1110 mhz 3D Mark 2011 has shown little artifacts. BTW 3D Mark 2011 is from my point of view the best benchmark to verify if an oc is stable.
> 
> With unigine heaven I didnt get artifacts even on 1140 core. BF hardline has shown no artifacts as well. 3D Mark 2011 did
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *By increasing the voltage up to +100 mV my card reaches 1150 mhz stable. But the higher temperature, which results in an higher noise is not worth the performance gain.*


My God now I wish Sapphire would release a 390X version.


----------



## zorrobyte

Sapphire Nitro 390 w/ Backplate

Can't push clocks much higher, even with overvolt..

Cools well, Furmark @ 79C solid and still quiet (quiet as in sitting right next to me, out of case on my desk)


----------



## ChaosAD

When you say you hear coil whine, is it during benchmarking, during gaming, or all the time? I can't hear anything while running Valley or 3DMark, or while playing Diablo. The only time I can hear my card scream is at the exit screen of Valley, which reports 7500 fps.

And btw, I tested 3DMark, which needs more voltage than Valley to run stable, and I can run at 1120/1600 (mem not tested to max) with 0mV/+50PL and no artifacts at all









Does this score seem ok?


----------



## Gumbi

Quote:


> Originally Posted by *zorrobyte*
> 
> 
> 
> Sapphire Nitro 390 w/ Backplate
> 
> Can't push clocks much higher, even with overvolt..
> 
> Cools well, Furmark @ 79C solid and still quiet (quiet as in sitting right next to me, out of case on my desk)


Please don't abuse your card with Furmark; the clocks may well have throttled, tbh. Using a real-world bench like Heaven is much more representative and won't cook your card.


----------



## Mister300

I am new to the OC group and I have a few questions. My profession is research director and I have done Chemistry for over 20 years so I have seen much tech come and go over the years. I run an electron microscope and x ray diffraction research center so I deal with number crunching and high end imaging.

My gaming rig is self-built and consists of the following:

5820 K OC at 4.3 GHz multiplier changed only
MSI Krait Mobo
Hyper X 16 gig @ 2667 stock.
EVGA 1 KW G2
XFX 390X
Kraken X-61
1920 X 1080 monitor HP
NZXT H440 white case

Here's the thing: OCing is fun, but my rig laughs at any game you throw at it at stock settings. When I benchmark, the OC is about 5 fps faster, but since I am maxed out on eye candy I see no performance reason to do it. Arguably, at higher res there could be a slight difference in playability. The issue of screen tearing is more critical to me, so I will buy a FreeSync monitor around December when more affordable choices hit the market.

My question is: why does it matter what program is used? I have read from many posters that Furmark is bad or potentially damaging.

It should not matter how the GPU is loaded; 100% is 100%, and it should not depend on the program.

For example GPU Z rendering 100% load

Furmark 100%

Heaven 100%

Professional animators rendering 100%

Any valid scientific proof here or just personal experience? I believe over-volting and heat are the killers here.


----------



## Dundundata

Quote:


> Originally Posted by *Noirgheos*
> 
> Well I do have an S340 with sound padding on the inside that I ripped from a H440, and a Corsair H80i for my CPU. I try for it to be quiet, but if only the GPU is audible I don't care as long as it's cool.


I thought about adding some sound padding, it just might solve it. It's definitely quieter with the side panel on. I might be picking up an MSI today and see how that goes.

The whine gets louder the more volts the card gets, and only when the GPU is being taxed under full load.


----------



## BradleyW

Quote:


> Originally Posted by *Mister300*
> 
> I am new to the OC group and I have a few questions. My profession is research director and I have done Chemistry for over 20 years so I have seen much tech come and go over the years. I run an electron microscope and x ray diffraction research center so I deal with number crunching and high end imaging.
> 
> My gaming rig is self built and composes of the following:
> 
> 5820 K OC at 4.3 GHz multiplier changed only
> MSI Krait Mobo
> Hyper X 16 gig @ 2667 stock.
> EVGA 1 KW G2
> XFX 390X
> Kraken X-61
> 1920 X 1080 monitor HP
> NZXT H440 white case
> 
> The question I have is OCing is fun but my rig laughs at any game you throw at it at stock settings. When I benchmark the OC is about 5 FPS faster but since I am maxed out on eye candy I see no performance reason to do it. Arguably at higher res there could be a slight difference in playabiilty. The issue of screen tearing is more critical to me so I will buy a free sync monitor around Dec when more affordable choices hit the market.
> 
> The question I have is why does it matter what program is used I have read from many posters the Furmark is bad or potentially damaging.
> 
> It should not matter how the GPU is loaded down 100% is 100% and should not be dependent on the program.
> 
> For example GPU Z rendering 100% load
> 
> Furmark 100%
> 
> Heaven 100%
> 
> Professional animators rendering 100%
> 
> Any valid scientific proof here or just personal experience? I believe over-volting and heat are the killers here.


The only proof is a wide range of images showing GPUs that have burnt up while running Furmark.
I believe the software issues commands that far exceed the power delivery requirements, causing an overload. More proof is found when investigating both Nvidia and AMD drivers: they've implemented throttling procedures for when Furmark is running.

Furmark's aim is to build up heat and power without actually stressing or saturating the pipelines and command queues constructively. The idea is "throw power and heat at it; if it survives, it's good to go." There are far more intelligent programs that utilize the card better in accordance with its architecture, without forcing the card beyond its rated power delivery. Such software is far more effective at diagnosing instability issues.

All software utilizes hardware differently. Usage measurements don't mean much on their own: a game using the GPU at "100%" is not using 100% of the GPU, simply because no game or API out there can saturate every shader or stream processor at the same time. However, low-level APIs are able to use more of those cores at a given time, and more effectively.

The GPU you have uses GCN, which has a deep pipeline. DX11 cannot feed that pipeline fully, not even close. Your GPU has never run at true 100% usage in its life.
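A toy way to see how two "100% usage" workloads can draw very different power: CMOS dynamic power scales roughly as alpha * C * V^2 * f, where alpha is the fraction of transistors actually switching each cycle. A power virus like Furmark drives alpha up; a game can't. The alpha values below are made up purely for illustration:

```python
# Rough CMOS dynamic-power sketch: P ~ alpha * C_eff * V^2 * f.
# Two workloads can both report "100% usage" yet toggle different
# fractions of the transistors (alpha). Numbers are illustrative only.
def dynamic_power(alpha, v, f_mhz, c_eff=1.0):
    """Relative dynamic power; c_eff is an arbitrary effective capacitance."""
    return alpha * c_eff * v**2 * f_mhz

game = dynamic_power(alpha=0.6, v=1.2, f_mhz=1050)   # hypothetical game load
virus = dynamic_power(alpha=0.9, v=1.2, f_mhz=1050)  # hypothetical power virus

print(f"power virus draws ~{virus / game:.2f}x the game's dynamic power")
# same clocks, same voltage, same reported utilization
```

That ratio is why the same "100%" in GPU-Z can mean very different temps and VRM stress depending on the program.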


----------



## Agent Smith1984

Quote:


> Originally Posted by *Mister300*
> 
> I am new to the OC group and I have a few questions. I'm a research director by profession and have done chemistry for over 20 years, so I have seen much tech come and go over the years. I run an electron microscope and X-ray diffraction research center, so I deal with number crunching and high-end imaging.
> 
> My gaming rig is self-built and consists of the following:
> 
> 5820K OC'd to 4.3 GHz (multiplier changed only)
> MSI Krait mobo
> HyperX 16 GB @ 2667 stock
> EVGA 1 kW G2
> XFX 390X
> Kraken X61
> 1920x1080 HP monitor
> NZXT H440 white case
> 
> OCing is fun, but my rig laughs at any game you throw at it at stock settings. When I benchmark, the OC is about 5 FPS faster, but since I am maxed out on eye candy I see no performance reason to do it. Arguably, at higher resolutions there could be a slight difference in playability. Screen tearing is more critical to me, so I will buy a FreeSync monitor around December when more affordable choices hit the market.
> 
> The question I have is: why does it matter what program is used? I have read from many posters that Furmark is bad or potentially damaging.
> 
> It should not matter how the GPU is loaded; 100% is 100%, and should not depend on the program.
> 
> For example GPU Z rendering 100% load
> 
> Furmark 100%
> 
> Heaven 100%
> 
> Professional animators rendering 100%
> 
> Any valid scientific proof here or just personal experience? I believe over-volting and heat are the killers here.


No scientific proof here, but it's well known that Furmark will place more load on the card than anything else, despite reading the same 100% utilization that other benches do.

For the purpose of explaining... it's basically the same difference as running Linpack with IBT on a CPU versus Mersenne primes with Prime95: both put load at 100%, but one (Linpack) will cause a higher load temperature than the other.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No scientific proof here, but it's well known that Furmark will place more load on the card than anything else, despite reading the same 100% utilization that other benches do.
> 
> For the purpose of explaining... it's basically the same difference as running Linpack with IBT on a CPU versus Mersenne primes with Prime95: both put load at 100%, but one (Linpack) will cause a higher load temperature than the other.


This is true.

Get that damn 390 Asus up and running son!


----------



## Mister300

Thanks for your input; I value everyone's thoughts.


----------



## BradleyW

Quote:


> Originally Posted by *Mister300*
> 
> Thanks for your input; I value everyone's thoughts.


You are welcome.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *BradleyW*
> 
> The only proof is a wide range of images showing GPUs that have burnt up while running Furmark.
> I believe the software issues commands which far exceed the card's power delivery requirements, causing an overload. More proof is found in both Nvidia's and AMD's drivers: each has implemented throttling procedures for when Furmark is detected.
> 
> Furmark's aim is to build up heat and power without constructively stressing or saturating the pipelines and command queues. The idea is "throw power and heat at it; if it survives, it's good to go". There are far more intelligent programs which utilize the card better in accordance with its architecture, without forcing the card beyond its rated power delivery. Such software is far more effective at diagnosing instability issues.
> 
> All software utilizes hardware differently. Usage measurements don't mean anything on their own. A game using the GPU at 100% is not using 100% of the GPU, simply because no game or API out there can saturate every shader or stream processor at the same time. However, low-level APIs are able to use more of these cores at a given time, and more effectively.
> 
> The GPU you have uses GCN, which has a deep pipeline. DX11 cannot feed that pipeline fully, not even close. Your GPU has never run at 100% usage in its life.






Edit:

To read the GPU usage more accurately, you'll need this:
https://graphics.stanford.edu/~mdfisher/GPUView.html


----------



## Noirgheos

Quote:


> Originally Posted by *Mister300*
> 
> I am new to the OC group and I have a few questions. I'm a research director by profession and have done chemistry for over 20 years, so I have seen much tech come and go over the years. I run an electron microscope and X-ray diffraction research center, so I deal with number crunching and high-end imaging.
> 
> My gaming rig is self-built and consists of the following:
> 
> 5820K OC'd to 4.3 GHz (multiplier changed only)
> MSI Krait mobo
> HyperX 16 GB @ 2667 stock
> EVGA 1 kW G2
> XFX 390X
> Kraken X61
> 1920x1080 HP monitor
> NZXT H440 white case
> 
> OCing is fun, but my rig laughs at any game you throw at it at stock settings. When I benchmark, the OC is about 5 FPS faster, but since I am maxed out on eye candy I see no performance reason to do it. Arguably, at higher resolutions there could be a slight difference in playability. Screen tearing is more critical to me, so I will buy a FreeSync monitor around December when more affordable choices hit the market.
> 
> The question I have is: why does it matter what program is used? I have read from many posters that Furmark is bad or potentially damaging.
> 
> It should not matter how the GPU is loaded; 100% is 100%, and should not depend on the program.
> 
> For example GPU Z rendering 100% load
> 
> Furmark 100%
> 
> Heaven 100%
> 
> Professional animators rendering 100%
> 
> Any valid scientific proof here or just personal experience? I believe over-volting and heat are the killers here.


You mean screen tearing with VSYNC off? And do you experience coil whine?


----------



## kizwan

Quote:


> Originally Posted by *Mister300*
> 
> Thanks for your input; I value everyone's thoughts.


Furmark is a power virus that can make the GPU draw more power than any game or benchmark. It can overtax the VRMs, which is what damaged GPUs in the past. However, modern GPUs are designed to run TDP-limited when they detect Furmark. So it's still not a _proper_ tool for testing overclock stability: it still makes the GPU generate a lot of heat, and while the card is TDP-limited it isn't properly or fully utilized anyway.
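A rough way to see why over-volting is the bigger heat multiplier in all of this: dynamic power in CMOS scales roughly as P ∝ V² × f, so a voltage bump costs quadratically while a clock bump costs only linearly. A back-of-the-envelope sketch (the baseline voltage/clock numbers are made up purely for illustration):

```python
# Rough dynamic-power scaling estimate: P ~ V^2 * f (CMOS approximation).
# Baseline numbers below are illustrative, not measured values for any card.

def relative_power(v_base, f_base, v_new, f_new):
    """Return new dynamic power relative to baseline, using P ~ V^2 * f."""
    return (v_new / v_base) ** 2 * (f_new / f_base)

# Hypothetical card: 1.20 V @ 1000 MHz baseline; overclock +100 mV, +150 MHz.
scale = relative_power(1.20, 1000, 1.30, 1150)
print(f"~{scale:.2f}x baseline dynamic power")  # ~1.35x
```

The +15% clock alone would be ~1.15x; the extra ~20% comes from the voltage term, which is why "over-volting and heat are the killers" is a fair summary.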


----------



## Mister300

Funny, my VRMs never exceed 70°C under any load conditions, and the GPU stays under 85°C at 50% fan speed.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Dead Meat*
> 
> Sweet. Thanks man.
> 
> I was kind of hesitant to assume, based on the fact that at the peak theoretical load of 300W+ I'd be pulling a maximum of 25A from a single 8-pin. I know that's not a lot for a 12V line, but I was worried about splitting a 12V supply to a single card.


Quote:


> Originally Posted by *kizwan*
> 
> It's always good idea to use separate PCIe cables directly from PSU for each PCIe power connectors.


I also tested whether I could see a difference. I have a be quiet! Straight Power 600W CM PSU, which is multi-rail. Here's a Heaven run at 1200/1700 @ +100mV/+38 aux with the corresponding GPU-Z screenshots. (Don't read too much into that overclock; I got serious artifacting on both runs, and I believe that's more an issue with my old motherboard/RAM/CPU than with the card alone, since the card didn't need extra voltage until 1140/1680.)

With just 1 cable used for the GPU:



With 2 cables used for the GPU:



I did each run twice. From what I can tell, using 2 cables (2 split cables, each plugged into the PSU separately) gives a slightly higher VDDC Current In (~0.7 A) and a slightly lower VDDC Current Out (~6 A), plus a bit more power in and out compared to using 1 cable. So with a multi-rail PSU, plugging in 2 different VGA cables gives the card a bit more power, and thus a bit more stability, though I don't think it matters much at stock. Of course, cable management in a cube case like mine gets much more complicated. Do you think it would matter if I just took one 6+2 lead off each of the 2 PSU plugs (meaning you use half of each cable)? Or might that confuse the PSU or hurt the power flow? Not sure.
Here's a small pic to show what I mean:


----------



## Cannon19932006

Quote:


> Originally Posted by *kizwan*
> 
> It's always good idea to use separate PCIe cables directly from PSU for each PCIe power connectors.


Any reasoning behind this on single-rail PSUs?


----------



## Streetdragon

Quote:


> Originally Posted by *FleiPei*
> 
> Yesterday I got an R9 390 Nitro + backplate. This version comes with a 1040 MHz core clock.
> 
> *I was able to reach 1100 MHz core and 1700 MHz memory without increasing the juice.*
> 
> At 1110 MHz, 3DMark 11 showed small artifacts. BTW, 3DMark 11 is, from my point of view, the best benchmark to verify whether an OC is stable.
> 
> With Unigine Heaven I didn't get artifacts even at 1140 core. BF Hardline showed no artifacts either; 3DMark 11 did.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *By increasing the voltage up to +100 mV my card reaches 1150 MHz stable. But the higher temperature, which results in higher noise, is not worth the performance gain.*


Yep, I can do the same: 1100/1700 with +0 voltage. Stock is +19, so I even lowered it while raising clocks. The GPU gets up to 75°C and both VRMs to 85°C (CrossFire).


----------



## kizwan

Quote:


> Originally Posted by *Cannon19932006*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> It's always good idea to use separate PCIe cables directly from PSU for each PCIe power connectors.
> 
> 
> 
> Any reasoning behind this on single-rail PSUs?
Click to expand...

When you're using the same cable, daisy-chained, for both PCIe power connectors, the card draws all of its power through the same wires. Sure, you can usually get away with that, but there's no reason to if you can use separate cables from the PSU for each PCIe power connector and split the load. I have seen cables melt and damage the hardware in the process, so it's always a good idea to be cautious.
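For reference, the arithmetic behind this is simple: the ~300W / 25A figure quoted earlier all flows through one cable when daisy-chained, and separate cables halve what each carries. A quick sketch (the 300W figure comes from the discussion above and is illustrative, not measured for any particular card or PSU):

```python
# Per-cable current for a ~300 W card on a 12 V rail.
# The 300 W load is the ballpark figure from the thread, not a measurement.

def amps_per_cable(watts, volts=12.0, cables=1):
    """Total current, split evenly across the given number of cables."""
    return watts / volts / cables

total = amps_per_cable(300)            # one daisy-chained cable carries it all
split = amps_per_cable(300, cables=2)  # two separate cables halve the load
print(f"1 cable: {total:.1f} A, 2 cables: {split:.1f} A per cable")
# 1 cable: 25.0 A, 2 cables: 12.5 A per cable
```

Halving the current also quarters the resistive heating in each cable (I²R), which is why melted daisy-chain cables are the failure people report.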


----------



## Mister300

No whine; yes, with VSync off. I find tearing varies from game to game; Metro 2033 is notorious and I must use RivaTuner to override.


----------



## Noirgheos

Quote:


> Originally Posted by *Mister300*
> 
> No whine; yes, with VSync off. I find tearing varies from game to game; Metro 2033 is notorious and I must use RivaTuner to override.


Still, as long as it's capped to your monitor's refresh rate, there should be no tearing.


----------



## BradleyW

Quote:


> Originally Posted by *Mister300*
> 
> No whine; yes, with VSync off. I find tearing varies from game to game; Metro 2033 is notorious and I must use RivaTuner to override.


If you disable Vsync, you'll have tearing.
If you enable Vsync, you won't have tearing.
If you cap the fps to the same value as your refresh rate whilst Vsync is disabled, you'll get "minor" tearing.
If you cap the fps to the same value as your refresh rate whilst Vsync is enabled, you'll experience less input lag.
If you enable Vsync, and you see tearing, you have a GPU driver issue or a game specific issue.
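Those rules are easy to mis-remember mid-tweak; here they are encoded as a tiny lookup (this just restates the list above, nothing new):

```python
# Encode the VSync / fps-cap tearing rules from the list above as a function.

def tearing_behavior(vsync: bool, fps_capped_to_refresh: bool) -> str:
    """Expected result for a given VSync + fps-cap combination."""
    if vsync and fps_capped_to_refresh:
        return "no tearing, reduced input lag"
    if vsync:
        return "no tearing"
    if fps_capped_to_refresh:
        return "minor tearing"
    return "tearing"

print(tearing_behavior(vsync=False, fps_capped_to_refresh=True))  # minor tearing
```

(Per the last rule in the list, tearing *with* VSync enabled points at a driver or game-specific bug rather than a settings problem.)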


----------



## Dundundata

So I stopped by Microcenter today and picked up an MSI 390...AMAZING!

This card is an absolute beast. Beat my best OC with ease. Temps are impressive as well. I don't want to post numbers yet since I just got it up and running, but it already beats my old card hands down. And the best part....no whine!









Probably won't get too much testing done tonight but it looks like I have to put my XFX on the auction block for a bit of a loss.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dundundata*
> 
> So I stopped by Microcenter today and picked up an MSI 390...AMAZING!
> 
> This card is an absolute beast. Beat my best OC with ease. Temps are impressive as well. I don't want to post numbers yet since I just got it up and running, but it already beats my old card hands down. And the best part....no whine!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Probably won't get too much testing done tonight but it looks like I have to put my XFX on the auction block for a bit of a loss.


Yes, the MSI is a great card, btw...

I'll get new members added tomorrow...

I've been so busy at work, not to mention my main rig is still down with a dead CPU


----------



## vatrak

Bought myself a new PC and went with a 390x. Can't wait to get all the parts and build that baby next week!

Going from a 2500k and a HD7950 to a 6600k and a 390x on a 144hz monitor (old one is 60hz) should be pretty sweet!


----------



## Dundundata

Ok a few numbers for the MSI









1150/1650 +19mV

Witcher 3 temps (max) w/custom fan curve:
core, 66C
vrm1, 61C
vrm2, 51C
fan, ~55%

Card is very quiet, can't wait to push the OC further and see what she can do


----------



## Noirgheos

Quote:


> Originally Posted by *Dundundata*
> 
> So I stopped by Microcenter today and picked up an MSI 390...AMAZING!
> 
> This card is an absolute beast. Beat my best OC with ease. Temps are impressive as well. I don't want to post numbers yet since I just got it up and running, but it already beats my old card hands down. And the best part....no whine!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Probably won't get too much testing done tonight but it looks like I have to put my XFX on the auction block for a bit of a loss.


Looks like I'm getting an MSI 390X. Mind testing some DX11 games? A lot of people are reporting crashes on the 300 series cards. In fact, can anyone with games like Witcher 3, or any new game try it out with their card?


----------



## Noirgheos

Quote:


> Originally Posted by *Dundundata*
> 
> Ok a few numbers for the MSI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1150/1650 +19mV
> 
> Witcher 3 temps (max) w/custom fan curve:
> core, 66C
> vrm1, 61C
> vrm2, 51C
> fan, ~55%
> 
> Card is very quiet, can't wait to push the OC further and see what she can do


Mind posting a screenshot of the fan curve?


----------



## kalidae

Quote:


> Originally Posted by *Mister300*
> 
> I am new to the OC group and I have a few questions. I'm a research director by profession and have done chemistry for over 20 years, so I have seen much tech come and go over the years. I run an electron microscope and X-ray diffraction research center, so I deal with number crunching and high-end imaging.
> 
> My gaming rig is self-built and consists of the following:
> 
> 5820K OC'd to 4.3 GHz (multiplier changed only)
> MSI Krait mobo
> HyperX 16 GB @ 2667 stock
> EVGA 1 kW G2
> XFX 390X
> Kraken X61
> 1920x1080 HP monitor
> NZXT H440 white case
> 
> OCing is fun, but my rig laughs at any game you throw at it at stock settings. When I benchmark, the OC is about 5 FPS faster, but since I am maxed out on eye candy I see no performance reason to do it. Arguably, at higher resolutions there could be a slight difference in playability. Screen tearing is more critical to me, so I will buy a FreeSync monitor around December when more affordable choices hit the market.
> 
> The question I have is: why does it matter what program is used? I have read from many posters that Furmark is bad or potentially damaging.
> 
> It should not matter how the GPU is loaded; 100% is 100%, and should not depend on the program.
> 
> For example GPU Z rendering 100% load
> 
> Furmark 100%
> 
> Heaven 100%
> 
> Professional animators rendering 100%
> 
> Any valid scientific proof here or just personal experience? I believe over-volting and heat are the killers here.


I don't believe you really need a FreeSync monitor to avoid screen tearing. I was in the market for a new monitor just a month or two ago and was looking at 1080p, 1440p, and 4K monitors. I ended up going with a 1080p 144Hz monitor without FreeSync, because I read that as long as you can keep the fps high, like 90+, screen tearing will be minimal to non-existent. So I bought an ASUS 144Hz panel, and I have not seen any screen tearing at all, and believe me, I am looking for it. I used to play on a 42" TV, and screen tearing was really, really bad in all games. This monitor is awesome; I'm so glad I didn't fork out the extra money for FreeSync. I honestly don't believe it's needed: just get a decent 144Hz panel, keep the fps high, and you'll be free of screen tearing. I'm actually playing a heavily modded Skyrim right now, my fps is between 40-100, and I still don't get screen tearing or motion blur even on those low fps dips (in the fields). Just my recommendation, and a much cheaper option.


----------



## Gumbi

Quote:


> Originally Posted by *Dundundata*
> 
> Ok a few numbers for the MSI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1150/1650 +19mV
> 
> Witcher 3 temps (max) w/custom fan curve:
> core, 66C
> vrm1, 61C
> vrm2, 51C
> fan, ~55%
> 
> Card is very quiet, can't wait to push the OC further and see what she can do


That's... a superb overclock. Keep pushing it; that should hit 1200 MHz core or higher!


----------



## Sgt Bilko

And today I learned, after some messing around, that my 390X doesn't have coil whine at all... it's actually the fans: once they hit max or near-max RPM they make a vibration sound that sounds a lot like coil whine.

The easiest way to test it is to just raise and lower the fan speed on the desktop and see if you can hear it... it's faint but can be noticeable. I'm going to see if I can track down a Kraken + G10 bracket at some point and replace the cooler, since the DD 390X has a beefy VRM heatsink.


----------



## diggiddi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And today I learned, after some messing around, that my 390X doesn't have coil whine at all... it's actually the fans: once they hit max or near-max RPM they make a vibration sound that sounds a lot like coil whine.
> 
> The easiest way to test it is to just raise and lower the fan speed on the desktop and see if you can hear it... it's faint but can be noticeable. I'm going to see if I can track down a Kraken + G10 bracket at some point and replace the cooler, since the DD 390X has a beefy VRM heatsink.


Yess doo eeet! Will you be able to keep the backplate?


----------



## Sgt Bilko

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> And today I learned, after some messing around, that my 390X doesn't have coil whine at all... it's actually the fans: once they hit max or near-max RPM they make a vibration sound that sounds a lot like coil whine.
> 
> The easiest way to test it is to just raise and lower the fan speed on the desktop and see if you can hear it... it's faint but can be noticeable. I'm going to see if I can track down a Kraken + G10 bracket at some point and replace the cooler, since the DD 390X has a beefy VRM heatsink.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yess doo eeet! Will you be able to keep the backplate?
Click to expand...

I can't see why not, tbh; it all depends on how long the screws are for the Kraken, I suppose.

This is just an idea, though; I'm not going out of my way to make it happen.


----------



## Dundundata

Quote:


> Originally Posted by *Noirgheos*
> 
> Mind posting a screenshot of the fan curve?


Sure, I just used my old one. This card runs cooler, though, so it's probably overkill, but I can't really hear the fans. You can also use MSI's mode where the fans don't spin below a certain temp, but I like low temps!

I was very surprised by how this card handled FireStrike. At first I gave it more mV, but it kept passing at lower voltage.
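For anyone building their own curve: a custom fan curve is just piecewise-linear interpolation between (temp, fan%) points. A sketch with made-up points (not the actual curve from anyone's screenshot):

```python
# Piecewise-linear fan curve: interpolate fan % between (temp °C, fan %) points.
# The points below are illustrative, not anyone's actual curve.
import bisect

CURVE = [(30, 20), (50, 35), (65, 55), (75, 80), (85, 100)]

def fan_percent(temp_c: float) -> float:
    """Fan duty for a temperature, clamped at the curve's endpoints."""
    temps = [t for t, _ in CURVE]
    if temp_c <= temps[0]:
        return CURVE[0][1]
    if temp_c >= temps[-1]:
        return CURVE[-1][1]
    i = bisect.bisect_right(temps, temp_c)
    (t0, f0), (t1, f1) = CURVE[i - 1], CURVE[i]
    return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(66))  # 57.5 — just above the 65 °C point
```

Tools like Afterburner do exactly this between the points you drag on the graph; steeper segments near your target temp trade noise for headroom.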


----------



## Gumbi

Quote:


> Originally Posted by *Dundundata*
> 
> Sure, I just used my old one. This card runs cooler though so it's probably overkill but I can't really hear the fans. You can also use MSI's mode where the fans don't run below a certain temp but I like low temps!
> 
> I was very surprised how this card handled Firestrike. At first I gave it more mV but it kept passing at lower voltage.


Hey man, you should try pushing the card a bit more. Try 1200/1600 at +75mV and see if you can pull that off... honestly, 1200 core at +100mV is good going; any less is damn good.


----------



## Lazy Dog

Rock solid at -69mV for the past week.









This is from last night playing BF4 in a 144 tickrate server.


Spoiler: Warning: Spoiler!


----------



## flopper

Quote:


> Originally Posted by *kalidae*
> 
> I don't believe you really need a freesync monitor to avoid screen tearing. I was in the market for a new monitor just a month or 2 ago and was looking into 1080 1440 and 4k monitors. I ended up going with a 1080p 144hz monitor with no freesync becsuse I read that as long as you can keep with fps up high like 90+ then screen tearing will be minimal to non existent. So i bought an asus 144hz and i have not seen any screen tearing at all and believe me I am looking for it. I used to play on a 42" TV and screen tearing was really really bad in all games. This monitor is awesome, I have to say I am so glad I didn't fork out the extra money for freesync. I honestly do not believe it's needed, just a decent 144hz panel and keep those fps high and you'll be free of screen tear, I'm actually playing a really moded skyrim right now and my fps is in between 40-100 and i still don't get screen tearing or motion blur even on those low fps dips (on fields) Just my recommendation and a much cheaper option.


There are 75Hz FreeSync 1080p screens for under €200 now from AOC.
FreeSync is the better option today.

I tried BF4 with and without FreeSync, and FreeSync wins easily.


----------



## battleaxe

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And today I learned, after some messing around, that my 390X doesn't have coil whine at all... it's actually the fans: once they hit max or near-max RPM they make a vibration sound that sounds a lot like coil whine.
> 
> The easiest way to test it is to just raise and lower the fan speed on the desktop and see if you can hear it... it's faint but can be noticeable. I'm going to see if I can track down a Kraken + G10 bracket at some point and replace the cooler, since the DD 390X has a beefy VRM heatsink.


Yeah, the XFX heatsink on the VRM is best in class IMO, and it lends itself well to AIOs. I think the XFX is the best card out there right now, which is nice considering they were among the weakest last gen.


----------



## supermiguel

I've heard a lot of good things about XFX.


----------



## Sgt Bilko

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> And today I learned, after some messing around, that my 390X doesn't have coil whine at all... it's actually the fans: once they hit max or near-max RPM they make a vibration sound that sounds a lot like coil whine.
> 
> The easiest way to test it is to just raise and lower the fan speed on the desktop and see if you can hear it... it's faint but can be noticeable. I'm going to see if I can track down a Kraken + G10 bracket at some point and replace the cooler, since the DD 390X has a beefy VRM heatsink.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, the XFX heatsink on the VRM is best in class IMO. And it lends itself well for AIO's. I think the XFX is the best card out there right now. Nice considering they were among the weakest last gen.
Click to expand...

I've been going back and forth with a friend who just got a PowerColor Devil 390X, and the VRM temps he is reporting are better than what my DD can manage. If I get some solid temps from him, I'll report back with them.









I've talked with XFX about the 390X, and they said one of the biggest complaints they got about the 290X was the VRM cooling: the core temps were fine, but the VRMs were just bad for the most part, so they focused most of their effort on improving them without sacrificing core temps... I'd say they did a pretty good job.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've been going back and forth with a friend who just got a PowerColor Devil 390X, and the VRM temps he is reporting are better than what my DD can manage. If I get some solid temps from him, I'll report back with them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've talked with XFX about the 390X, and they said one of the biggest complaints they got about the 290X was the VRM cooling: the core temps were fine, but the VRMs were just bad for the most part, so they focused most of their effort on improving them without sacrificing core temps... I'd say they did a pretty good job.


Did you & your friend run benchmarks at the same clocks & compare?


----------



## battleaxe

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've been going back and forth with a friend who just got a PowerColor Devil 390X, and the VRM temps he is reporting are better than what my DD can manage. If I get some solid temps from him, I'll report back with them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've talked with XFX about the 390X, and they said one of the biggest complaints they got about the 290X was the VRM cooling: the core temps were fine, but the VRMs were just bad for the most part, so they focused most of their effort on improving them without sacrificing core temps... I'd say they did a pretty good job.


Yeah, agreed. If I end up picking up a 390x the XFX is the one I will get.

I think I may delve into a full loop soon though, so will probably wait on the GPU's until next gen HBM2 instead. Still thinking about it all.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I've been going back and forth with a friend who just got a PowerColor Devil 390X, and the VRM temps he is reporting are better than what my DD can manage. If I get some solid temps from him, I'll report back with them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've talked with XFX about the 390X, and they said one of the biggest complaints they got about the 290X was the VRM cooling: the core temps were fine, but the VRMs were just bad for the most part, so they focused most of their effort on improving them without sacrificing core temps... I'd say they did a pretty good job.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you & your friend run benchmarks at the same clocks & compare?
Click to expand...

He is running some benchmarks atm and just enjoying it, tbh; when he gets some time, I'll do my own benches and we can compare.








Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I've been going back and forth with a friend who just got a PowerColor Devil 390X, and the VRM temps he is reporting are better than what my DD can manage. If I get some solid temps from him, I'll report back with them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've talked with XFX about the 390X, and they said one of the biggest complaints they got about the 290X was the VRM cooling: the core temps were fine, but the VRMs were just bad for the most part, so they focused most of their effort on improving them without sacrificing core temps... I'd say they did a pretty good job.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, agreed. If I end up picking up a 390x the XFX is the one I will get.
> 
> I think I may delve into a full loop soon though, so will probably wait on the GPU's until next gen HBM2 instead. Still thinking about it all.
Click to expand...

Yeah, Arctic Islands is where I'll be going after this; the Fury and Fury X don't interest me enough to fork out the silly amount of cash they cost here (R.I.P. Aussie dollar).


----------



## MK-Professor

I have to say the Sapphire R9 390 is the quietest & coolest GPU I've had in the last 10 years.

With an OC of 1105/1730 at +50mV, after a couple of Heaven benchmark loops at a room temperature of 23°C, it maxed out at 66°C with a fan speed of 54% (in normal gaming I get around 60-63°C with the fan speed at 50% or under).


----------



## TristanL

I (obviously) didn't read the whole thread, but does anybody have experience with the GPU load jumping between 100% and 0% and back? A friend has had a Gigabyte G1 390 since yesterday and is a bit irritated by the GPU-Z measurements, although he thinks it doesn't affect his in-game FPS.
I read somewhere that this was already a thing with 290 GPUs, and some say disabling VSync helps.

Any experiences or thoughts?

(OS is Windows 10 Pro x64 with the latest (stable) Catalyst)


----------



## battleaxe

Quote:


> Originally Posted by *TristanL*
> 
> I (obviously) didn't read the whole thread, but does anybody have experience with the GPU load jumping between 100% and 0% and back? A friend has had a Gigabyte G1 390 since yesterday and is a bit irritated by the GPU-Z measurements, although he thinks it doesn't affect his in-game FPS.
> I read somewhere that this was already a thing with 290 GPUs, and some say disabling VSync helps.
> 
> Any experiences or thoughts?
> 
> (OS is Windows 10 Pro x64 with the latest (stable) Catalyst)


Need more information. During what? Just gameplay? What resolution?

Neither of my cards does this at all, but there are things that can cause it. Set the power limit to max for starters.


----------



## MK-Professor

Quote:


> Originally Posted by *TristanL*
> 
> I (obviously) didn't read the whole thread, but does anybody have experience with the GPU load jumping between 100% and 0% and back? A friend has had a Gigabyte G1 390 since yesterday and is a bit irritated by the GPU-Z measurements, although he thinks it doesn't affect his in-game FPS.
> I read somewhere that this was already a thing with 290 GPUs, and some say disabling VSync helps.
> 
> Any experiences or thoughts?
> 
> (OS is Windows 10 Pro x64 with the latest (stable) Catalyst)


Use MSI Afterburner and go to Settings -> General -> enable "Unified GPU usage monitoring".
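If he wants harder evidence than the live graph, GPU-Z can also log its sensors to a file, and that log is plain comma-separated text that's easy to scan for drops afterwards. A sketch (the exact column name varies between versions; treat "GPU Load [%]" as an assumption and check your own log's header row):

```python
# Scan a GPU-Z sensor log (plain comma-separated text) for GPU-load drops.
# The column name "GPU Load [%]" is an assumption; check your log's header row.
import csv

def count_load_drops(path, column="GPU Load [%]", threshold=5.0):
    """Count logged samples where GPU load fell to ~0%."""
    drops = 0
    with open(path, newline="") as f:
        reader = csv.reader(f)
        # GPU-Z pads header fields with spaces, so strip them before matching.
        header = [h.strip() for h in next(reader)]
        idx = header.index(column)
        for row in reader:
            if len(row) <= idx:
                continue  # skip short/blank lines
            try:
                load = float(row[idx])
            except ValueError:
                continue  # skip non-numeric cells
            if load <= threshold:
                drops += 1
    return drops

# Example usage: count_load_drops("GPU-Z Sensor Log.txt")
```

Correlating the drop timestamps with what was on screen (loading, menu, alt-tab) usually shows whether the 0% dips are real stalls or just idle moments.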


----------



## LeSwede

Can I join?

MSI Radeon R9 390 8GB Gaming
Clock: Stock
Cooling: Stock


----------



## Lasso

Picked up a 390 today









Clock: Stock
Cooling: Stock


----------



## Agent Smith1984

Quote:


> Originally Posted by *LeSwede*
> 
> Can i join?
> 
> MSI Radeon R9 390 8GB Gaming
> Clock: Stock
> Cooling: Stock


You're in!

How are temps with that setup?

Anyone I've seen running two MSIs has had hell keeping them cool, because the 2.5-slot design puts the cards so close together that the top card generally starves for air.


----------



## LeSwede

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You're in!
> 
> How are temps with that setup?
> 
> Anyone I've seen running two MSIs has had hell keeping them cool, because the 2.5-slot design puts the cards so close together that the top card generally starves for air.


The top card runs at 90°C under max load, hence why I'm looking into liquid cooling for them xD


----------



## Agent Smith1984

Quote:


> Originally Posted by *LeSwede*
> 
> The top card runs at 90°C under max load, hence why I'm looking into liquid cooling for them xD


Your only option would be a GPU-only water block plus heatsinks and fans on the VRM and VRAM....

There are no full-cover blocks for the MSI cards.


----------



## LeSwede

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Your only option would be a GPU only water block and some sinks and fans on the VRM and VRAM....
> 
> There are no full cover blocks for the MSI cards.


I saw on EK's website that some RX 200 series full-cover blocks will fit 300 series GPUs, but I can't confirm anything for my MSI ones D:


----------



## TristanL

Quote:


> Originally Posted by *battleaxe*
> 
> Need more information. During what? Just gameplay? what resolution?
> 
> Neither of my cards do this at all. But there are things that can cause it. Set the power limit to max for starters.


Confirmed games are Path of Exile, H1Z1, and something else I don't remember (I'm still writing on behalf of a friend); the resolution is 1080p, so nothing fancy here.
He describes the problem as slight micro-stutters (at least in Path of Exile) while having stable FPS.

Increasing the power limit to +20%, turning on VSync, and doing a clean driver installation (with DDU) apparently had no effect on the issue.


----------



## Agent Smith1984

Quote:


> Originally Posted by *LeSwede*
> 
> I saw on EK's website that some RX 200 series full-cover blocks will fit 300 series GPUs, but I can't confirm anything for my MSI ones D:


It's already been confirmed (see OP).

EK does not, and will not, produce a full-cover block for these....

Kind of a shame considering how popular the card is, and how well it already clocks on air alone.


----------



## battleaxe

Quote:


> Originally Posted by *TristanL*
> 
> Confirmed games are Path of Exile, H1Z1, and something else I don't remember (I'm still writing on behalf of a friend); the resolution is 1080p, so nothing fancy here.
> He describes the problem as slight micro-stutters (at least in Path of Exile) while having stable FPS.
> 
> Increasing the power limit to +20%, turning on VSync, and doing a clean driver installation (with DDU) apparently had no effect on the issue.


What are the temps? Not throttling?


----------



## TristanL

Quote:


> Originally Posted by *battleaxe*
> 
> What are the temps? Not throttling?


In some heavy situations it reaches maybe 80°C, which should be okay.
To make it clear: the main "issue" is that he sees a graph behaving a bit unorthodoxly, and the "fact" that he might be a bit of a "hypochondriac" when it comes to PC problems does not make it any easier for me.

(maybe I shouldn't have introduced him to these "tools")


----------



## battleaxe

Quote:


> Originally Posted by *TristanL*
> 
> In some heavy situations it reaches maybe 80°C, which should be okay.
> To make it clear: the main "issue" is that he sees a graph behaving a bit unorthodoxly, and the "fact" that he might be a bit of a "hypochondriac" when it comes to PC problems does not make it any easier for me.
> 
> (maybe I shouldn't have introduced him to these "tools")


Post a screeny. Some variations in the core clock on AB are normal. It will occasionally spike up and down, but most of the time it should run at full clocks. A screeny would be helpful just to be sure the usage is normal or represents an irregularity. Below is an example of usage shown on GPU-Z.


----------



## Dundundata

Took a bit more to get to 1200/1650, +100/+38. Amazing it can do it.


----------



## TristanL

Quote:


> Originally Posted by *battleaxe*
> 
> Post a screeny. Some variations in the core clock on AB are normal. It will occasionally spike up and down, but most of the time it should run at full clocks. A screeny would be helpful just to be sure the usage is normal or represents an irregularity. Below is an example of usage shown on GPU-Z.


All I can deliver right now is this GPU-Z screen; fluctuations are there, but it seems like they do not always affect the core clock:

 (it was taken while playing Path of Exile)


----------



## Agent Smith1984

Quote:


> Originally Posted by *TristanL*
> 
> all i can deliver right now is this GPU-Z screen, fluctuations are there but it seems like they do not always affect the Core Clock:
> 
> (it was taken while playing Path of Exile)


That's just PowerTune noticing how easy the game is to run; look how low the GPU utilization itself is....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dundundata*
> 
> Took a bit more to get to 1200/1650, +100/+38. Amazing it can do it.


Looks great!

If you lift that aux voltage to 50mv, the vram will probably hit 1700mhz+ also.

Over 14k+ firestrike graphics score for $330! Gotta love it


----------



## TristanL

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's just power tune noticing how easy the game is to run, look how low the gpu utilization itself is....


The thing is that the GPU load jumps from 100% to 0% and back


----------



## Mysticking32

Quote:


> Originally Posted by *Dundundata*
> 
> Took a bit more to get to 1200/1650, +100/+38. Amazing it can do it.


That's better than my 390x lol. God damn


----------



## AverdanOriginal

Quote:


> Originally Posted by *LeSwede*
> 
> The top card runs at 90c on max load, hence why I'm looking into liquid cooling for them xD


90 on max load in crossfire is normal. Maybe change the TIM and add a side fan (via the case if possible, or simply mounted dodgy-style directly on the side of the cards)? If you do, let us know if it improves; it would be interesting to hear.


----------



## AverdanOriginal

Quote:


> Originally Posted by *MK-Professor*
> 
> I have to say Sapphire R9 390 is the quietest & coolest GPU that I had in the last 10 years.
> 
> With OC 1105/1730 +50vm
> After a couple of Heaven benchmark loops, with room temperature 23C it maxed at 66C with fan speed 54%(in normal gaming I get around 60-63C with fan speed 50% or under)


Quote:


> Originally Posted by *Dundundata*
> 
> Took a bit more to get to 1200/1650, +100/+38. Amazing it can do it.


Nice overclocks.
Is it just me, or does it seem like the MSI R9 390 tends to be better at overclocking the core, while the Sapphire Nitro R9 390 tends to be a bit better at overclocking the memory?


----------



## LeSwede

Quote:


> Originally Posted by *TristanL*
> 
> the thing is that the GPU Load jumps from 100% to 0% and reverse


This happens for me too, it's simply the GPU only working when it needs to. Nothing wrong here


----------



## kalidae

Quote:


> Originally Posted by *flopper*
> 
> there are 75hz freesync 1080p screens for under 200euro now from aoc.
> freesync is the better option today.
> 
> Tried BF4 with and without freesync and freesync wins easily.


75Hz is still gimping your system. People say you can't see over 60 FPS or whatever, but you can feel it, if that makes sense. No point in having a beast of a PC if you only get 60-75 FPS tops. And for FreeSync you need to stay within a certain FPS range for it to work. With a beefy system and a 144Hz monitor, just keep the FPS up and you won't get screen tearing; I can tell you that I don't have FreeSync and I don't need it. I'm playing BF4 and my minimum FPS is like 80, so most of the time it's way above that. It's the smoothest gameplay ever.

Everything over here in Australia is expensive, and a FreeSync monitor is like 600+ for a decent one and the best are 1000, but I could never play on anything less than 144Hz now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mysticking32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dundundata*
> 
> Took a bit more to get to 1200/1650, +100/+38. Amazing it can do it.
> 
> 
> 
> 
> 
> That's better than my 390x lol. God damn
Click to expand...

It's not far off my 390x either:
http://www.3dmark.com/fs/5766796


----------



## Sgt Bilko

Quote:


> Originally Posted by *kalidae*
> 
> Quote:
> 
> 
> 
> Originally Posted by *flopper*
> 
> there are 75hz freesync 1080p screens for under 200euro now from aoc.
> freesync is the better option today.
> 
> Tried BF4 with and without freesync and freesync wins easily.
> 
> 
> 
> 75hz is still gimping your system people say you can't see over 60 fps or whatever but you can feel it if that makes sense. No point in having a beast of a pc if you only get 60-75 fps tops. And for freesync you need to stay within a certain fps for it to work. With a beefy system and a 144hz monitor just keep the fps up and you won't get screen tearing, I can tell you that I don't have freesync and i don't need it. I'm playing bf4 and my fps minimum is like 80 so most of the time it's way above that, it's the smoothest game play ever.
> 
> Everything over here in Australia is expensive and a freesync monitor is like 600+ for a decent one and the best are 1000, but I could never play on anything less than 144hz now.
Click to expand...

AOC just released some newer 1080p 144Hz monitors. Not sure what the FreeSync range on them is, but I'm going 4K FreeSync anyways..... just waiting for payday to roll around


----------



## kalidae

Quote:


> Originally Posted by *Sgt Bilko*
> 
> AOC just released some newer 1080p 144Hz monitors. Not sure what the FreeSync range on them is, but I'm going 4K FreeSync anyways..... just waiting for payday to roll around


Mmmm 4k you have an absolute beast of a machine. I'd do the same thing haha.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kalidae*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> AOC just released some newer 1080p 144hz monitors, not sure what the Freesync range on them isbut I'm going 4k Freesync anyways.....just waiting for Payday to roll around
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mmmm 4k you have an absolute beast of a machine. I'd do the same thing haha.
Click to expand...

Well I should update my Sig Rig but i have a 295x2 paired with the 390x atm ready for 4k


----------



## Mysticking32

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's not far off my 390x either:
> http://www.3dmark.com/fs/5766796


What's your card overclocked to?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mysticking32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's not far off my 390x either:
> http://www.3dmark.com/fs/5766796
> 
> 
> 
> 
> 
> What's your card overclocked to?
Click to expand...

1200/1700 for that run iirc


----------



## seanpatrick

Just got my Sapphire 390. This is the newer version with the backplate


----------



## Mysticking32

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 1200/1700 for that run iirc


Very nice. I wish my card could get to 1200 lol.

Here's my score. http://www.3dmark.com/fs/5995609


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TristanL*
> 
> in some heavy situations it maybe reaches 80°C which should be okay.
> To make it clear: The main "issue" is that he sees a Graph which is behaving a bit unorthodox and the "fact" that he might be a bit of a "hypochondriac" when it comes to PC problems does not make it any easier for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (maybe i shouldn't had introduce him to these "Tools")
> 
> 
> 
> Post a screeny. Some variations in the core on AB are normal. It will occasionally spike up and down. But most of the time it should run at full clocks. A screeny would be helpful just to be sure the usage is normal or represents and irregularity. Below is an example of usage shown on GPU-Z.
Click to expand...

I think when he said gpu usage, he meant the gpu load, not the frequency.

@TristanL depending on the game, it's quite normal for GPU usage/load to fluctuate from 100% to 0% and vice versa, especially at 1080p. If game performance isn't affected, then there's nothing to worry about; it's just cosmetic. If your friend is using MSI AB for monitoring, just enable unified GPU usage.


----------



## LeSwede

Hey,

Just did some benching in Unigine Heaven and Valley

Unigine Valley, 1 card:


Unigine Valley, 2 cards:


Unigine Heaven, 2 cards:


----------



## rdr09

Quote:


> Originally Posted by *LeSwede*
> 
> Hey,
> 
> Just did some benching in Unigine Heaven and Valley
> 
> Unigine Valley, 1 card:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Unigine Valley, 2 cards:
> 
> 
> Unigine Heaven, 2 cards:
> 
> 
> Spoiler: Warning: Spoiler!


try disabling and re-enabling crossfire prior to running Valley. if you scroll down the op . . . you'll see tweaks that will help boost your scores.

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

i scored a 70 with a single 290 oc'ed.

edit: what psu are you using?


----------



## Hitm4n

Hello community,

Just got a Nitro 390 as a replacement for my 280X Tri-X; a good upgrade on my part.

280x @1175/1735 :9804 graphics score
http://www.3dmark.com/3dm/8703840

i'm working on the 390 atm but i'll need your help about some stuff.

390 Nitro @1200/1600 :13888 graphics score
http://www.3dmark.com/3dm/8770965

Using TriXX:
I need +200mv and +50 power to get to 1200; it won't go further... is it OK to leave this 24/7?
On the memory side it's not really stable past 1600.... Firestrike just goes black and that's it, I have to reboot.

Using Afterburner I get higher on memory with the additional auxiliary memory voltage (1650)

Is there no auxiliary memory voltage setting within TriXX (14.9.0)?

Driver used: 15.8 Beta (should I try the 15.15, or which is best to use?)

i really would like to hit that 14k score...









Thank you


----------



## Coolslammer

Just got the msi r9 390! Overclocking now. I'll post my oc later.

Sent from my SAMSUNG-SM-G890A using Tapatalk


----------



## Hitm4n

Quote:


> Originally Posted by *Hitm4n*
> 
> Hello community,
> 
> Just got a nitro 390 as a replacement of my 280x tri-x. good upgrade for my part.
> 
> 280x @1175/1735 :9804 graphics score
> http://www.3dmark.com/3dm/8703840
> 
> i'm working on the 390 atm but i'll need your help about some stuff.
> 
> 390 Nitro @1200/1600 :13888 graphics score
> http://www.3dmark.com/3dm/8770965
> 
> Using Trix:
> I need the +200mv and +50 power to get to 1200, it won't go further ... is it ok to leave this 24/7 ?
> On the memory side it's not passing really stable after 1600....firestrike just goes black and that's it, i have to reboot.
> 
> Using Afterburner i get higher on memory with the additional Auxiliary memory voltage (1650)
> 
> There is no Auxiliary memory voltage within Trix ? ( 14.9.0)
> 
> driver used : 15.8Beta ( should i try the 15.15 or wich is best used ?)
> 
> i really would like to hit that 14k score...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you


well, I've hit exactly 14,000 graphics score @ 1200/1615 under TriXX


----------



## gerpogi

Has anyone tried using a kraken g10 on a 390x? Does it fit?


----------



## Noirgheos

Can anyone here with a 390 Nitro w/backplate or not, see how high they can OC?


----------



## EricM280

Quote:


> Originally Posted by *gerpogi*
> 
> Has anyone tried using a kraken g10 on a 390x? Does it fit?


I can only speak on behalf of the XFX 390/390X. I didn't have any problems using the Kraken G10 or Corsair's HG10 A1; no mods were necessary on either. To be completely honest though, strangely enough, I found the best temps I got were using the Arctic Hybrid II-120, possibly just because of the VRAM cooling included. That one, though, because it's "universal", takes quite a bit more time to install. Still, that is the one I am currently running. I will include temp screenshots when I get home if interested.


----------



## gerpogi

Quote:


> Originally Posted by *EricM280*
> 
> I can only speak on behalf of the XFX 390/390x. I didn't have any problems using the Kraken G10 or Corsairs HG10 A1. No mods were necessary on either. To be completely honest though, strange enough, i found the best temps i got were using the Arctic Hybrid II-120. Possibly just because of the VRAM cooling included. That one though, because its "universal" takes quite a bit more time to install. Still, that is the one I am currently running. I will include temp screenshots when i get home if interested.


Thank you so much! Pics would be nice. I have the xfx core model so that's good to hear


----------



## Hemanse

Ordered an R9 390 yesterday; should be able to pick it up on Monday. Got a few questions that people might feel like answering









I have been a bit hesitant about buying a new GPU, but after my 770 died I kinda had to bite the bullet. What I am most worried about is ending up with coil whine; is coil whine just as big of a problem with the 300 series as it is with the Nvidia 900 series? Just gotta cross my fingers that I get one without.

And how are people liking their MSI 390? Went with MSI over Sapphire since it was cheaper, it sounded like it was quieter, and it has a backplate.


----------



## Dundundata

Quote:


> Originally Posted by *Hemanse*
> 
> Ordered a R9 390 yesterday, should be able to pick it up on monday, got a few questions that people might feel like answering
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have been a bit hesitant in buying a new GPU, but after my 770 died i kinda had to bite the bullet, what i am most worried about is ending up with coil whine, is coil whine just as big of a problem with the 300 series as it is with 900 nvidia series? Just gotta cross my fingers that i get one without.
> 
> And how are people liking their MSI 390? Went with MSI over Sapphire since it was cheaper, sounded like it was quieter and the fact that it has a backplate.


I got whine with my XFX, but I think I'm in the minority. I absolutely love my MSI; great OCer, quiet, and it runs really cool. Oh yeah and the backplate has a dragon on it too


----------



## Hemanse

Quote:


> Originally Posted by *Dundundata*
> 
> I got whine with my XFX, but I think I'm in the minority. I absolutely love my MSI; great OCer, quiet, and it runs really cool. Oh yeah and the backplate has a dragon on it too


At least there is a 14-day return policy here, so if I do get coil whine, I should at least be able to return it. Crossing my fingers that I don't, tho.

You exchanged the XFX for the MSI then? I have an XFX R7 370 in my machine now that I borrowed from my smaller rig. Can't say I'm impressed by it; then again, the 370 isn't exactly a powerhouse, and dear lord is it loud


----------



## Dundundata

In Afterburner there is an option to 'Extend official overclocking limits'. I had this checked with my old card, but just realized I never checked it with the MSI. It doesn't seem to affect my OCing ability at all though. Would I gain anything from enabling it?

After playing Witcher 3 for awhile...


----------



## MechaDurka

MechaDurka XFX R9 390 stock air cooling 1050mhz core/1600mhz memory


----------



## seanpatrick

Quote:


> Originally Posted by *Noirgheos*
> 
> Can anyone here with a 390 Nitro w/backplate or not, see how high they can OC?


I haven't tried messing with the voltages (I probably won't) but I can get 1100 / 1625 stable without touching them. The fans are indeed whisper quiet, much better than my Sapphire Dual-X 280x that it replaced. I haven't been able to push the temp past 70 while benchmarking, and at 60 they're dead silent. That being said I'm rocking an Air 540 case with 3 front intake fans blowing right over it, but it's still pretty impressive.


----------



## Oregonduck007

Quote:


> Originally Posted by *Hemanse*
> 
> Ordered a R9 390 yesterday, should be able to pick it up on monday, got a few questions that people might feel like answering
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have been a bit hesitant in buying a new GPU, but after my 770 died i kinda had to bite the bullet, what i am most worried about is ending up with coil whine, is coil whine just as big of a problem with the 300 series as it is with 900 nvidia series? Just gotta cross my fingers that i get one without.
> 
> And how are people liking their MSI 390? Went with MSI over Sapphire since it was cheaper, sounded like it was quieter and the fact that it has a backplate.


Love my MSI R9 390. Solid overclocker and good cooling. I got mine to 1190/1700 with just +50.


----------



## flopper

Quote:


> Originally Posted by *Hitm4n*
> 
> Hello community,
> 
> Just got a nitro 390 as a replacement of my 280x tri-x. good upgrade for my part.
> 
> Using Trix:
> I need the +200mv and +50 power to get to 1200, it won't go further ... is it ok to leave this 24/7 ?
> On the memory side it's not passing really stable after 1600....firestrike just goes black and that's it, i have to reboot.
> 
> Thank you


No, not OK.

I would stick with what +100mv can do,
and even then back off a few MHz.


----------



## ChaosAD

Why do I get a better graphics score in 3DMark with PCIe 2.0 x16 than with 3.0? By changing it to 3.0 I lose about 400 points.


----------



## kizwan

Quote:


> Originally Posted by *ChaosAD*
> 
> Why do i get better graphics score in 3dmark with pcie 2.0x16 than with 3.0? By changing it to 3.0 i lose about 400 points.


Repeatedly? 400 points is not much & I think it's within the margin of error.
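As a quick sanity check on the "margin of error" point: assuming a ballpark graphics score of around 12,500 (in line with scores posted earlier in the thread, not a measurement from this post), a 400-point swing is only a few percent:

```python
# Rough check: how big is a 400-point swing relative to a typical
# Firestrike graphics score for these cards? The ~12,500 baseline is an
# assumption based on scores posted earlier in the thread.
baseline = 12500
delta = 400
swing = delta / baseline
print(f"{swing:.1%} of the total score")  # → 3.2% of the total score
```

A few percent is about the run-to-run variance you'd expect anyway, so repeating the test several times per setting is the only way to tell the configurations apart.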


----------



## Pwned24

Sapphire Tri X R9 390 8GB


----------



## Mysticking32

Quote:


> Originally Posted by *ChaosAD*
> 
> Why do i get better graphics score in 3dmark with pcie 2.0x16 than with 3.0? By changing it to 3.0 i lose about 400 points.


What speed is your 3.0 slot running at? Is it x8 or less? That's probably why


----------



## ChaosAD

Quote:


> Originally Posted by *kizwan*
> 
> Repeatedly? 400 points is not much & I think within margin of error.


Yes repeatedly, as soon as i change to 2.0 my score goes up.

Quote:


> Originally Posted by *Mysticking32*
> 
> What speed is your 3.0? Is it less than x8? Is it x8? That's probably why


It's at x16, both 2.0 and 3.0


----------



## Stige

Any way to improve the VRM cooling on 390 DCIII Strix?

The GPU runs at nice temps, but the VRM gets close to frying, near 100C under heavy load even at stock voltage. I didn't imagine it would be THAT bad.

Anyone got a fix for this? New thermal pads?


----------



## sportsczy

For the DCUIII, you need to change the fan settings. Because the fans are so damn loud, they have them running at nothing to fool you into believing it's quiet. Once you got the fans up, the VRM was fine for me. The loudness, however, made me change to Kraken watercooling and heatsinks for the VRM.


----------



## Stige

How are your temps now? I did some checking and there are no full cover blocks available for the card, so I'm trying to think of alternative solutions as well.

I already have the fans at much higher speeds than normal just to keep the VRMs at 90-95C instead of 100C+.


----------



## sportsczy

http://s1173.photobucket.com/user/raptormadrid/media/GPUz.png.html

http://s1173.photobucket.com/user/raptormadrid/media/3DMark.png.html

Asus DCU III with Kraken G10. 50mv+ on overall and 25+ on Aux.


----------



## sportsczy

Quote:


> Originally Posted by *Stige*
> 
> How are your temps now? I did some checking and there are no full cover blocks for the card available so trying to think of alternative solutions aswell.
> 
> I have the fans already at much higher speeds than normal just to keep the VRMs at 90-95C instead of 100C+.


Great. I got the Gelid VRM heatsink kit, a Kraken G10, Cosmos heatsinks for the memory, a Noctua 92mm fan for the Kraken, a Cougar Vortex 120 for the radiator, and a Corsair H55. The core doesn't go higher than 55-60 and the VRM hit 70 at most... and this is with a pretty significant OC. Whisper quiet... I set the fans at 65% constant.


----------



## Zack Foo

Hi guys, I'm back. Saw my name on the list but just wanna correct it: I upgraded from the Asus 390 to the MSI 390X, BTW.
This is with -100mv. Temp at full load is 72C max, room temperature around 25.
I tried -100 for aux along with -100 for the core too, BTW. The score increased to 2886, which is weird I would say. But temp is the same.


----------



## Coolslammer

Hey guys, I was wondering about my MSI R9 390. When overclocking, it maxes out at around 1141 core clock with +100. It has never crashed, but it gets artifacting in OCCT. Is my card a bad overclocker, or is something else wrong?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Coolslammer*
> 
> Hey guys, I was wondering about my MSI R9 390. When overclocking, it maxes out at around 1141 core clock with +100. It has never crashed, but it gets artifacting in OCCT. Is my card a bad overclocker, or is something else wrong?


What are your temps? Have you tried finding max oc at 50-75mv? Some people are finding better clocks in that range due to tdp limitations....

Is your msi 390 the "le" version?


----------



## Noirgheos

Can anyone find some benchmarks where the 390X is consistently matching or besting the 980 at 1080p? I've seen so many sites show the 980 winning, or the 390X winning, I just don't know what to believe anymore.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Can anyone find some benchmarks where the 390X is consistently matching or besting the 980 at 1080p? I've seen so many sites show the 980 winning, or the 390X winning, I just don't know what to believe anymore.


At 1080p it's dependent on the game....

390X wins some battles, 980 wins others...

Overclocked, the 980 will win overall because Maxwell has so much headroom, but 1080p isn't where the 390 shines.

It's at 1440p and above where the 390 starts gaining ground on the 980 and its limited frame buffer...

And don't forget the price difference!


----------



## rdr09

Quote:


> Originally Posted by *Noirgheos*
> 
> Can anyone find some benchmarks where the 390X is consistently matching or besting the 980 at 1080p? I've seen so many sites show the 980 winning, or the 390X winning, I just don't know what to believe anymore.


if you have a bencher in you . . . the 980 for sure. others who have used both will have differing views but here are some . . .

see last posts . . .

http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/4480#post_24465874


----------



## Coolslammer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are your temps? Have you tried finding max oc at 50-75mv? Some people are finding better clocks in that range due to tdp limitations....
> 
> Is your msi 390 the "le" version?


My temps are around 90 degrees, and no, my card is not the LE edition


----------



## Mister300

How I decide on a video card purchase:

Do not get caught up in marketing claims or benchmarking/spec 'measurbators'. I had one NVIDIA fanboy tell me his 980 smokes my 390X. BTW, his 980 has only a 3 FPS difference in one or two game benchmarks, not a scientifically valid reason to buy one over the other. Plus my son has one, so I can subjectively compare the two cards.

Where do you want to be in 3 to 4 years? The card with more VRAM will have more future-proofing. NVIDIA users are going to find out that 4 gigs of VRAM will not be enough down the road; I found that out when I got a 5870 with 1 gig onboard. If you have the cash to buy new tech every year, VRAM is not an issue.

What tech is most important? ROP count. For example, I almost bought a 980 Ti at 750 USD. The only tech factor that makes it better than the 390X is its 96 ROPs vs our 64 ROPs. Wonder why it is praised for 4K?

Fill rates are crucial for 4K; more ROPs are the factor OEMs are overlooking, so until this count increases, 4K is hard to achieve at insane FPS. I believe there is a limit on how many ROPs are physically possible to incorporate. We found out that HBM didn't make AMD into a 4K killer. Only time will tell if HBM is the answer.
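To put some rough numbers on the ROP argument above: peak pixel fill rate is approximately ROPs multiplied by core clock. This is a back-of-envelope sketch with illustrative reference clocks; real throughput also depends on overdraw, blending, and memory bandwidth.

```python
# Back-of-envelope pixel fill rate: ROPs x core clock. The clock values below
# are illustrative reference numbers; real-world throughput also depends on
# overdraw, blending, and memory bandwidth, so treat this as a rough guide.

def fill_rate_gpix(rops, core_mhz):
    """Peak pixel fill rate in Gpixels/s."""
    return rops * core_mhz / 1000.0

r9_390x = fill_rate_gpix(64, 1050)     # 390X: 64 ROPs at 1050 MHz
gtx_980_ti = fill_rate_gpix(96, 1000)  # 980 Ti: 96 ROPs at 1000 MHz

print(f"390X:   {r9_390x:.1f} Gpix/s")    # → 67.2 Gpix/s
print(f"980 Ti: {gtx_980_ti:.1f} Gpix/s")  # → 96.0 Gpix/s
```

That ~43% raw fill-rate gap, rather than shader count, is one plausible reason the 980 Ti pulls ahead at 4K even though the cards are closer at lower resolutions.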


----------



## Agent Smith1984

Quote:


> Originally Posted by *Coolslammer*
> 
> my temps are around 90 degrees. and no my card is not the le edition


Are you running stock fan profile?

90C is on the high side, especially for overclocking, and that probably means your VRMs are well over 80C.

My MSI 390 with custom profile and high flow case fans was hitting 76C core/75C VRM with 100mv+ and doing 1200/1700!

Get it cooler, and see how it does then..... I've not seen an MSI yet that won't break 1160 pretty easily.


----------



## RicoDee




----------



## Agent Smith1984

Quote:


> Originally Posted by *RicoDee*


----------



## Dundundata

Quote:


> Originally Posted by *Coolslammer*
> 
> my temps are around 90 degrees. and no my card is not the le edition


That sounds way off. How is the cooling in your case and what are your ambient temps? I've never used OCCT; most people seem to test with Unigine Heaven or 3DMark Firestrike. Try using MSI Afterburner for OC instead and setting a fan curve something like this, and start with stock clocks and see how your temps are (HWiNFO is good for reading core/VRM temps):
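A custom curve like that is just linear interpolation between a few (temperature, fan%) points. A minimal sketch with made-up illustrative points (not a recommendation, and not Afterburner's actual internals):

```python
# Hypothetical fan curve points (temp in C -> fan %), similar in shape to a
# custom Afterburner curve; the numbers here are illustrative only.
CURVE = [(40, 30), (60, 50), (75, 75), (85, 100)]

def fan_speed(temp_c):
    """Linearly interpolate fan % between curve points, clamped at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(round(fan_speed(70)))  # interpolated between the 60C and 75C points → 67
```

The idea is to ramp gently in the normal gaming range and only hit 100% near the card's thermal limit, so noise stays reasonable while the VRMs still get airflow when it matters.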


----------



## Derek129

Well, I sent my card to Gigabyte for RMA a week ago. The current RMA status says there are no problems and they've tested it with multiple configurations. What a load of ****. Obviously I sent it in for a reason... biggest $450 mistake of my life


----------



## Coolslammer

No, I overclock with AB and stress with OCCT. And I stress test with my fans at 100%


----------



## Streetdragon

Quote:


> Originally Posted by *RicoDee*


so no fun for sapphire nitro......


----------



## Waleh

delete please


----------



## kizwan

Quote:


> Originally Posted by *RicoDee*


That is an old note which only applicable to 290's.


----------



## fyzzz

Quote:


> Originally Posted by *Streetdragon*
> 
> so no fun for sapphire nitro......


I don't think they mean the msi and gigabyte 390(x)'s, but instead the 290(x)'s. Also look at the compatibility list


----------



## Agent Smith1984

SAD SAD NEWS...

Had me excited.... I will verify and remove the information from the OP...

I was really hoping the MSI had water coming, cause those cards are begging for it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Agent Smith1984*
> 
> SAD SAD NEWS...
> 
> Had me excited.... I will verify and remove the information from the OP...
> 
> I was really hoping the MSI had water coming, cause those cards are begging for it.


Not sure why I was personally excited, since I no longer have an MSI... lol

I guess I was excited for everybody else.


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> SAD SAD NEWS...
> 
> Had me excited.... I will verify and remove the information from the OP...
> 
> I was really hoping the MSI had water coming, cause those cards are begging for it.


I found a website, byski or something like that; they are from China and they apparently make custom blocks for any card. The MSI 390X can be done too, according to their reply to my email.


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not sure why I was personally excited, since I no longer have an MSI... lol
> 
> I guess I was excited for everybody else.


I thought you had an MSI 390?


----------



## Dundundata

Quote:


> Originally Posted by *Coolslammer*
> 
> No, I overclock with ab and stress with occt. And I stress test with my fans at 100%


Hmmm, I wonder if it's a thermal paste issue; can you exchange it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Zack Foo*
> 
> What ar
> I tot you had a msi 390?


I did, I sold it and am about to try out the Strix 390


----------



## Mister300

How easy is the Kraken to mount? I have the XFX 390X Core Edition too.


----------



## muhd86

I have Gigabyte R9 390 8GB Gaming G1 models in tri-fire on a Rampage V board. The thing is, temps are way too high; even in 3DMark Firestrike benchmarks, temps hit 90-93C even at full fan speeds. That's only the top 2 GPUs; the 3rd has plenty of air to breathe, and its temps don't go above 65C, max 70C.

So how do I cool them, short of water cooling? Would a fan placed near the GPUs, blowing air on top and from the sides, help? It's risky to let them run at 90C all the time; I cancel the benchmark when it touches 90C.

So what's the solution here?


----------



## Waldos Platypus

Been lurking for a few years here, and finally setup an account. Recently got a Gigabyte G1 390, and hoping to join the group soon with pic. Thanks for posting fantastic material thus far on this thread, been quite helpful.

Also, hello.


----------



## AverdanOriginal

Quote:


> Originally Posted by *muhd86*
> 
> i have gigabyte r9 390 8gb gaming g1 models in tri fire rampage v board - the thing is temps are way to high even in 3d mark fire strike benchmarks temps hit 90 93c even with full fans speeds , thats only the top 2 gpus - as the 3rd has plenty of air to breath temps dont go above 65c max 70c for that gpus .
> 
> so how do i cool them except water cooling , would a fan placed near the gpus blowing air on top and from the sides help ...coz its risking it to let them run at 90c all the time , i cancel the benchmark when it touches 90c .
> 
> so whats the soloution here


Depends, really. It's hard to tell without a pic of your airflow setup. I don't have a 2- or 3-way setup myself, but here are a couple of suggestions:

1. If your case airflow is set up so that cool air comes in at the front and exits at the back and top, installing a fan on top of the cards might be counterproductive: you'd make the hot air collect in the bottom back corner of your case, and that's probably where your cards are. One on the side would be good, maybe a bigger one that covers all three cards. This might only bring 1-3°C better temps, though.
2. Replace the TIM on each card. In some cases the difference was reported to be up to 6°C (the higher your temps, the bigger the difference).
3. Refrain from overclocking.
4. Try undervolting. You can lower the voltage and keep your average FPS steady with these cards. I have a setting on my MSI R9 390 for really hot days (when it gets to around 35°C in my room): -81mV at 1020/1680. The Heaven score is the same as at 1040/1500 with normal voltage, BUT the temps went down from 78°C to 71°C.

All together you might see an improvement of around 2°C (fan) + 5-6°C (TIM) + 5-7°C (undervolting) = 12-15°C at best. For the top two cards it's probably only 6-8°C, but that would bring your temps into the 80s, and that's fine I guess. Hope that helps.

EDIT: Wait a sec, isn't the Gigabyte voltage hard-locked? That would make undervolting hard, since you can't lower the voltage either, I guess.
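That rough temperature budget can be tallied in a couple of lines; a minimal sketch, using the poster's own estimated per-measure savings (forum estimates, not measurements):

```python
# Stack the estimated per-measure savings from the post above.
# Ranges are the poster's rough estimates in °C, not measured values.
savings = {
    "side fan": (2, 2),
    "TIM replacement": (5, 6),
    "undervolting": (5, 7),
}

best_case_low = sum(lo for lo, _ in savings.values())
best_case_high = sum(hi for _, hi in savings.values())
print(f"Combined estimate: {best_case_low}-{best_case_high} °C cooler")
# Prints: Combined estimate: 12-15 °C cooler
```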


----------



## Dundundata

I'm gonna say take the sides off your case and get a couple of these


----------



## Darkeylel

So from what I'm reading here, the Kraken X61 with the G10 bracket fits the 390X?


----------



## Noirgheos

This probably isn't the right place to post this, but can anyone help me with something? My PSU (EVGA G2 750W 80+ Gold) makes a very loud buzzing sound at startup, which eventually goes away but can come back at any time; it seems to happen most often when the PC is under load. I've tried running Valley for 24 hours to "burn it in", but that didn't do anything. I also removed my GPU to check whether it was the GPU; it's not.

I really don't want to RMA. Any suggestions?

P.S. If you Google "EVGA G2 750W buzzing", you'll get results about coil whine, videos and such. This is NOT coil whine; it is much too rattly to be coil whine.


----------



## Dundundata

The fan?


----------



## Oregonduck007

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are your temps? Have you tried finding max oc at 50-75mv? Some people are finding better clocks in that range due to tdp limitations....
> 
> Is your msi 390 the "le" version?


LE version? Can you tell me the differences between them?


----------



## Rob27shred

Fortunately no coil whine with my XFX 390, even with an OC. I played The Witcher 3 for hours the other day and didn't notice any, at least. I'll be sure to pay a little more attention to it now though.


----------



## Zack Foo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I did, I sold it and am about to try out the Strix 390


While the Strix has a really good design, I would not recommend it over the MSI one. The cooling is not as good, and overclocking seems weak; mine and a few others I know of have the same problem. It can't quite push over 1160.


----------



## Stige

I can do 1150 on stock voltage on my 390 Strix but if I even think of upping the voltage the VRM gets so hot I start to get artifacts.

The VRM needs a serious improvement in cooling, the core temps are not an issue at all.


----------



## poii

http://www.computerbase.de/2015-10/neun-radeon-r9-390-x-partnerkarten-test/

A nice 390(X) Round-up including 9 different cards with (sustained) clock speeds, noise, temperature, power draw (complete system) and performance.

Asus Radeon R9 390X Strix OC
Club3D Radeon R9 390 royalQueen
MSI Radeon R9 390 Gaming
MSI Radeon R9 390X Gaming
PowerColor Radeon R9 390 PCS+
PowerColor Radeon R9 390X Devil
Sapphire Radeon R9 390 Nitro
Sapphire Radeon R9 390X Tri-X
XFX Radeon R9 390X Black Edition

Tested!


----------



## Exenth

Decided to overclock again, and ended up with a totally different result. First I had 1120/1700, and now I was able to push my R9 390 Nitro to 1180/1680 with +100mV. Don't know if it was the new driver, or my younger self was just stupid.


----------



## Gumbi

Quote:


> Originally Posted by *Exenth*
> 
> Decided to overclock again, and ended up with a totally different result. First I had 1120/1700, and now I was able to push my R9 390 Nitro to 1180/1680 with +100mV. Don't know if it was the new driver, or my younger self was just stupid.


Very respectable overclock... How hot do the core and VRMs get under load, and at what fan speed?


----------



## Exenth

Quote:


> Originally Posted by *Gumbi*
> 
> Very respectable overclock... How hot does the core and VRMs get under load and at what fan speed?


In Unigine Valley my GPU temp is around 72°C. VRM1 gets hot on Sapphire cards, so no surprise that it hits 90°C; VRM2 is around 70°C.
The fan fluctuates between 80% and 90%.
It ran for about 1.5 hours.

That's my fan curve:


----------



## sportsczy

Just a note... as the OP stated, OCing for the sake of OCing is a mistake on this card. For example, you could perhaps get to 1200/1800, but the actual gain in performance is negligible compared to 1180/1750 if you check the benchmark results. In my case the gains became tiny and really not worth the extra OC. For my card the sweet spot was 1170/1750, although I could do 1200. Just something to keep in mind.
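The diminishing returns described above can be made concrete with a quick sketch. The scores below are made-up placeholders to illustrate the shape of the curve, not real benchmark results:

```python
# Hypothetical (core MHz, benchmark score) pairs -- illustrative only.
runs = [
    (1040, 12500),
    (1100, 13000),
    (1150, 13350),
    (1200, 13450),
]

# Marginal points gained per extra MHz shrink as the clock climbs.
for (c0, s0), (c1, s1) in zip(runs, runs[1:]):
    gain_pct = 100.0 * (s1 - s0) / s0
    per_mhz = (s1 - s0) / (c1 - c0)
    print(f"{c0} -> {c1} MHz: +{gain_pct:.1f}% score ({per_mhz:.2f} pts/MHz)")
```

The last step costs the most voltage and heat for the least gain, which is exactly why stopping at the "sweet spot" makes sense.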


----------



## Exenth

Quote:


> Originally Posted by *sportsczy*
> 
> Just a note... as the OP stated, OCing for the sake of OCing is a mistake for this card. For example, you could get to 1200/1800 perhaps... but the actual gain in performance is negligible compared to 1180/1750 if you check the benchmark results. In my case, gains became tiny and really not worth the extra OC. For my card, the sweet spot was 1170/1750 although i could do 1200. Just something to keep in mind.


Yeah, I noticed that after 1150 there was no real gain. I just wanted to know what the max of my card was; now I'm back to 1150, which seems to be the sweet spot for my Nitro.


----------



## Zack Foo

Quote:


> Originally Posted by *Exenth*
> 
> Decided to overclock again, and ended up with a totally different result. First I had 1120/1700, and now I was able to push my R9 390 Nitro to 1180/1680 with +100mV. Don't know if it was the new driver, or my younger self was just stupid.


That is quite a big voltage bump, won't the temperatures be quite high?
Try finding the clocks at a lower voltage; I noticed that mine scores better with less. I was pushing 1200 at +95mV and later realised it scores better at +80mV. Same with the AUX: it scores way better at +35 than at +75.


----------



## Zack Foo

Quote:


> Originally Posted by *sportsczy*
> 
> Just a note... as the OP stated, OCing for the sake of OCing is a mistake for this card. For example, you could get to 1200/1800 perhaps... but the actual gain in performance is negligible compared to 1180/1750 if you check the benchmark results. In my case, gains became tiny and really not worth the extra OC. For my card, the sweet spot was 1170/1750 although i could do 1200. Just something to keep in mind.


I noticed the Valley score only increases by around 100 from 1150 to 1200, around a 5 FPS difference I think, which supports what you're saying.


----------



## sportsczy

I suggest you find the max OC at a set voltage... in your case, whatever voltage 1150 needs. That's literally what I did: I saw that the gains from 1500 up were small once I got to 1180, so I settled on 1170, as it was the highest I could get at the same voltage as 1150.


----------



## gerpogi

Quote:


> Originally Posted by *EricM280*
> 
> I can only speak on behalf of the XFX 390/390x. I didn't have any problems using the Kraken G10 or Corsairs HG10 A1. No mods were necessary on either. To be completely honest though, strange enough, i found the best temps i got were using the Arctic Hybrid II-120. Possibly just because of the VRAM cooling included. That one though, because its "universal" takes quite a bit more time to install. Still, that is the one I am currently running. I will include temp screenshots when i get home if interested.


Btw, do I need to get heatsinks for the XFX model when using the Kraken G10? Or will that black heatsink fit under there? And for the HG10, wouldn't I need a reference blower fan?


----------



## EricM280

Quote:


> Originally Posted by *gerpogi*
> 
> Btw, do I need to get heatsinks for the XFX model when using the Kraken G10? Or will that black heatsink fit under there? And for the HG10, wouldn't I need a reference blower fan?


So the Kraken G10 comes with a fan that mounts on top and blows downwards. I personally didn't like this design, as it really doesn't help cool the VRAM, especially on the 390 where it's well known to run hot. If you can come up with something else to add as cooling, I would definitely recommend it. You do not need the reference blower for this bracket, though.


----------



## sportsczy

Quote:


> Originally Posted by *EricM280*
> 
> So the Kraken G10 comes with the fan that mounts on top and blows downwards. I personally didn't like this design as it really didn't help cooling the VRAM especially on the 390 where it is well known that they run hot. If you can come up with something else to add as cooling to it, i would recommend it definitely. You do not need to have the reference blower for this bracket though.


I used the following Gelid kit to go with the Kraken G10 and it worked perfectly. Keep in mind it doesn't line up with the screw opening on your card. You need to get adhesive thermal tape to hold it.

http://www.amazon.com/gp/product/B00K73F60E?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00

Also got this to cool the GPU RAM properly:

http://www.amazon.com/gp/product/B007XA969G?psc=1&redirect=true&ref_=od_aui_detailpages00


----------



## EricM280

Quote:


> Originally Posted by *sportsczy*
> 
> I used the following Gelid kit to go with the Kraken G10 and it worked perfectly. Keep in mind it doesn't line up with the screw opening on your card. You need to get adhesive thermal tape to hold it.
> 
> http://www.amazon.com/gp/product/B00K73F60E?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00
> 
> Also got this to cool the GPU RAM properly:
> 
> http://www.amazon.com/gp/product/B007XA969G?psc=1&redirect=true&ref_=od_aui_detailpages00


Awesome, thanks! I will have to look into that. Did it make a solid difference in VRAM temps for you? That's the one thing I didn't like with both the Kraken and Corsair brackets. I am hoping to eventually go XFire with my 390s, but the Hybrid II cooler http://www.amazon.com/ARCTIC-Accelero-Graphics-Efficient-DCACO-V860001-GB/dp/B00I3ETT84/ref=sr_1_1?s=pc&ie=UTF8&qid=1444153581&sr=1-1&keywords=Arctic+Hybrid+II would be a tight fit with the VRAM heatsinks.


----------



## bazookatooths

Well finally finished my new build.


----------



## By-Tor

Grats on the build....

Wow that CPU voltage...


----------



## CamsX

Quote:


> Originally Posted by *bazookatooths*
> 
> Well finally finished my new build.


Hi, what are your current memory settings?

I was able to reach 4.55GHz @ 1.375V with my 6600K, but the memory refuses to POST at the nominal XMP profile speed of 2666MHz. Currently at 2650MHz with 18-33-33-44 2T timings.


----------



## sportsczy

Quote:


> Originally Posted by *EricM280*
> 
> Awesome thanks! I will have to look into that. Did that make a solid difference in VRAM temp for you? that is the one thing i didn't like with both the Kracken and Corsair brackets. I am hoping to eventually go XFire with my 390s but the Hybrid II cooler http://www.amazon.com/ARCTIC-Accelero-Graphics-Efficient-DCACO-V860001-GB/dp/B00I3ETT84/ref=sr_1_1?s=pc&ie=UTF8&qid=1444153581&sr=1-1&keywords=Arctic+Hybrid+II would be a tight fight with the VRAM heatsinks.


The VRMs, running benchmarks for 10 minutes, max out at 75 degrees with an OC of +50 power on main, +25 on aux, and 1170/1750. GPU temp maxed out at 60 with an H55 and a Cougar Vortex fan at 65% fan speed. Whisper quiet.

Benchmarks aren't reality either... in games it runs much cooler than that.


----------



## bazookatooths

Quote:


> Originally Posted by *By-Tor*
> 
> Grats on the build....
> 
> Wow that CPU voltage...


Yes, it gave me quite the scare, but I found out the reading is currently bugged.
Quote:


> Originally Posted by *CamsX*
> 
> Hi, what are your current memory settings?
> 
> I was able to reach 4.55ghz @ 1.375v with my 6600k, but memory refuses to post at the nominal XMP profile speed for 2666mhz. Currently at 2650mhz with 18-33-33-44 2T latencies.


4.6GHz @ 1.275V Vcore. I've been able to boot and pass 3DMark at 4.9GHz @ 1.45V Vcore, which surprised me. I have yet to tweak the RAM settings. Gigabyte just released a BIOS to improve DDR XMP compatibility; I've had no problems setting mine to the XMP 2400MHz 16-16-16-39 2T 1.2V profile. I've been reading reviews of the Ballistix RAM, and a few people say they OC to 2666MHz CL12-12-12-28 1T at 1.35V stable, so I will let you know when I test. Looking at the OC chart, Silicon Lottery shows 4.5GHz at that voltage while others are getting 4.8GHz, so it's just luck. I would first check for the current BIOS, then check voltages and timings and set the RAM manually.


----------



## tangelo

Quote:


> Originally Posted by *Oregonduck007*
> 
> Le version? can you tell me the differences between them?


The LE has lower default clocks. It's clocked at 1010, whereas the "normal" version is 1060 (or 1040 if you're using "gaming mode" instead of OC mode).

EDIT: There is speculation on the forums that the LE versions are lower-binned cards, but I don't know if it's true.


----------



## fyzzz

Does anyone have an XFX 390 DD BIOS? If so, I'd like a copy, thanks. The reason I'm looking for a 390 non-X BIOS is that I want to take a peek inside it and test it to see if anything has changed from the leaked 390X BIOS I'm running on my 290 right now. I suppose this is the right thread to request this.


----------



## gerpogi

Quote:


> Originally Posted by *sportsczy*
> 
> I used the following Gelid kit to go with the Kraken G10 and it worked perfectly. Keep in mind it doesn't line up with the screw opening on your card. You need to get adhesive thermal tape to hold it.
> 
> http://www.amazon.com/gp/product/B00K73F60E?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00
> 
> Also got this to cool the GPU RAM properly:
> 
> http://www.amazon.com/gp/product/B007XA969G?psc=1&redirect=true&ref_=od_aui_detailpages00


I'm no expert in PC parts, so bear with me. But under the XFX heatsink, isn't there already a black plate covering the VRAM? And another black heatsink on the right side? Can I just leave them on, or do they have to come off for the Kraken G10?


----------



## sportsczy

Not sure how the XFX is set up. But if it comes with heatsinks that cover the VRAM and they don't need to be removed once the stock cooler is off, you can give it a try. My Asus had a backplate too, and I kept that on. Took everything else off, though.


----------



## gerpogi

Quote:


> Originally Posted by *sportsczy*
> 
> Not sure how the XFX is set up. But if it comes with heatsinks that cover the VRAM and they don't need to be removed once the stock cooler is off, you can give it a try. My Asus had a backplate too, and I kept that on. Took everything else off, though.


I see. You didn't have to use the g10 backplate?


----------



## sportsczy

No, because the G10 backplate was unnecessary and, more importantly, the screws would have become too short if the backplate were used.


----------



## gerpogi

Quote:


> Originally Posted by *sportsczy*
> 
> No, because the G10 backplate was unnecessary and, more importantly, the screws would have become too short if the backplate were used.


I see. Ty for that info. What card do you have and which cooler are you using?


----------



## sportsczy

Asus DCU III and Corsair H55


----------



## gerpogi

Quote:


> Originally Posted by *sportsczy*
> 
> Asus DCU III and Corsair H55


I see. Hopefully it works out well for me
Ty for the help guys


----------



## Oregonduck007

Quote:


> Originally Posted by *tangelo*
> 
> Try raising the volts or going down on clocks.
> The LE has lower default clocks. It's clocked at 1010, whereas the "normal" version is 1060 (or 1040 if you're using "gaming mode" instead of OC mode).
> 
> EDIT: There is speculation on the forums that the LE versions are lower-binned cards, but I don't know if it's true.


Interesting, would like to know more about that.


----------



## CamsX

Quote:


> Originally Posted by *bazookatooths*
> 
> Yes gave me quite the scare , but found out it is currently bugged.
> 4.6ghz @ 1.275v Vcore, I've been able to boot up and pass 3dMark @ 4.9ghz @ 1.45v Vcore, which suprised me. I have yet to tweak the ram settings. Gigabyte just released a bios Improve DDR XMP compatibility, I have had no problems with setting mine to XMP 2400mhz 16-16-16-39 2T 1.2v. I been reading the reviews of ballistix ram and a few people are saying they OC to 2666MHz CL12-12-12-28-1T 1.35V stable , so I will let you know when i test. Looking at the OC chart the silicon lottery shows 4.5ghz at that voltage while others are getting 4.8ghz, so its just luck. I would first check for current bios then check voltages and timings and manually set the ram.


Thank you for your comments. Also using Gigabyte motherboard. Hate having ram timing problems. Otherwise it is stable. Not really needing to go any further on CPU overclocking, so I'll stay where I'm at.


----------



## bazookatooths

Quote:


> Originally Posted by *CamsX*
> 
> Thank you for your comments. Also using Gigabyte motherboard. Hate having ram timing problems. Otherwise it is stable. Not really needing to go any further on CPU overclocking, so I'll stay where I'm at.


Yes, I had a question for you: have you tried the fast boot utility or super fast? Every time I try that, or RAM fast boot, I can't POST.

I also tried OC settings on the RAM; no go for me, so I'll just leave it at stock for now. At these speeds and timings I doubt it would be noticeably faster, as it is already such overkill.


----------



## muhd86

..............edited............


----------



## muhd86

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Depends, really. It's hard to tell without a pic of your airflow setup. I don't have a 2- or 3-way setup myself, but here are a couple of suggestions:
> 
> 1. If your case airflow is set up so that cool air comes in at the front and exits at the back and top, installing a fan on top of the cards might be counterproductive: you'd make the hot air collect in the bottom back corner of your case, and that's probably where your cards are. One on the side would be good, maybe a bigger one that covers all three cards. This might only bring 1-3°C better temps, though.
> 2. Replace the TIM on each card. In some cases the difference was reported to be up to 6°C (the higher your temps, the bigger the difference).
> 3. Refrain from overclocking.
> 4. Try undervolting. You can lower the voltage and keep your average FPS steady with these cards. I have a setting on my MSI R9 390 for really hot days (when it gets to around 35°C in my room): -81mV at 1020/1680. The Heaven score is the same as at 1040/1500 with normal voltage, BUT the temps went down from 78°C to 71°C.
> 
> All together you might see an improvement of around 2°C (fan) + 5-6°C (TIM) + 5-7°C (undervolting) = 12-15°C at best. For the top two cards it's probably only 6-8°C, but that would bring your temps into the 80s, and that's fine I guess. Hope that helps.
> 
> EDIT: Wait a sec, isn't the Gigabyte voltage hard-locked? That would make undervolting hard, since you can't lower the voltage either, I guess.


Well, I can't control the voltage, and the fans only start after 60°C, so I'm trying to load a profile that runs them at 50% speed when Windows loads.

But temps still go up to 90°C even at full fan speed. I will try again with the A/C on in a cooler room, and I'll try installing 2 fans near the GPUs to see if blowing cool air in and hot air out helps.

Damn, these GPUs get hot.
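For what it's worth, the fan profile being described is just a piecewise-linear map from temperature to fan duty. Here's a minimal sketch of that interpolation; the curve points are made-up examples rather than recommended values, and actually applying the duty cycle is left to whatever tuning tool you use:

```python
# Piecewise-linear fan curve: temperature (°C) -> fan duty (%).
# Points are illustrative; a real profile comes from your tuning tool.
CURVE = [(0, 50), (60, 50), (75, 80), (85, 100)]  # 50% floor from boot

def fan_duty(temp_c: float) -> float:
    """Interpolate the fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # clamp at the last point

print(fan_duty(40))   # inside the flat 50% floor -> 50.0
print(fan_duty(70))   # between the 60 and 75 °C points -> 70.0
print(fan_duty(95))   # past the last point -> 100
```

The flat segment up to 60°C is what gives the "50% from Windows startup" behaviour without the fans ramping audibly at idle.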


----------



## sportsczy

Got around to OCing my CPU and it made a difference:

http://s1173.photobucket.com/user/raptormadrid/media/3DMark_1.png.html


----------



## Darkeylel

So I sent an email to NZXT asking about the Kraken X61 and G10 for my Gigabyte card, and this is their response:

Zoe Lee (NZXT)
Oct 6, 10:57 PM

Hi Sir,

We are also limited to be not able to check this card physically. What we can sure is the mounting holes around GPU is compatible for G10 bracket. But the problem is maybe you also need to copper shim between X61 contact surface and GPU. and Using G10 also require you to remove the backplate on graphics card.

Zoe Lee -

NZXT.

So from that I'm guessing I can go out and buy the Kraken and G10 and shouldn't have a problem?


----------



## Pwned24

Hi guys, why is my Sapphire 390 crashing when I play games like GTA V and The Witcher 3 maxed out? The temps don't go higher than 88°C, and my CPU is an i5 4590.


----------



## flopper

Quote:


> Originally Posted by *Pwned24*
> 
> Hi Guys why is my Sapphire 390 crashing when i play games like GTA V and The Witcher 3 Maxed out? The temps dont go higher than 88 and my cpu is i5 4590


could be a PSU power issue.


----------



## flopper

Quote:


> Originally Posted by *muhd86*
> 
> well i can not control the voltage / also fans only start after 60c , so i am trying to load a profile for them to work at 50 percent speed when windows loads up .
> 
> but temps go till 90c even with full fan speed - will try again with the a/c on with a cooler room - will try to install 2 fans near the gpus to see if that helps blowing air in and hot air out .
> 
> damn these gpus get hot .


If it reaches 90°C, it's likely the cooler isn't making proper contact.
Now and then they have too much paste between the core and the cooler, and that's a likely culprit.


----------



## kalidae

Quote:


> Originally Posted by *Pwned24*
> 
> Hi Guys why is my Sapphire 390 crashing when i play games like GTA V and The Witcher 3 Maxed out? The temps dont go higher than 88 and my cpu is i5 4590


My Nitro does that too, not for GTA but for my heavily modified Skyrim. If my GPU gets over 85°C it starts playing up: I get artifacts, the game starts skipping, and it will even crash from time to time. I just moved into a new place, and the room my PC is in gets very, very hot. I have to run all my case fans at 100% and my GPU fan at a constant 80%, and I also dropped my GPU back to stock clocks and voltages. Now it maxes out at around 72°C. I don't think these Nitros like going over 80 degrees. I'm thinking about selling it and buying either an XFX or an Asus 390, just so I can get a waterblock and install my custom loop once again.


----------



## tolis626

Hey everyone!

I've had my MSI 390x for like a month now, but haven't had the chance to properly play with it or benchmark it. I got that chance these last few days, so here I am. I've been reading quite a lot from this thread and decided to finally join. So without further ado...
Firestrike:


Firestrike Extreme:


These were taken last night with the GPU at 1170/1680 with +75mV core/+20mV AUX. I also went higher but scored worse for some reason (Throttling perhaps?). The CPU is a 4790k at 4.7GHz.

I'm pretty happy with the card so far. It's by no means whisper quiet or anything, but it's not too noisy and the noise it makes isn't the high-pitched whining I find annoying, but rather a whoosh, like air moving, so I'm ok with it. Temps do climb all the way to 80-85C (Been up to 91 once, but that was a +100mV core/+50mV AUX run) but stay there and I'm okay with it. I mean, the 290 and 290x reference cards, and also some aftermarket ones, would run at 95C non-stop with the slightest load. Many were used for hardcore gaming and, worse, mining and they're still alive. I haven't heard of many Hawaii chips frying, so 80C seems like a good spot to be for now. I would change the TIM, but I'm afraid of voiding the warranty so early. Maybe later in the year, although it's going to be my first time modifying a high(-ish) end GPU, so I'm somewhat scared. Still, how hard can it be?

My three questions would be:
- How far would you keep AUX voltage for 24/7 use? I'm quite comfortable messing with the core voltage, but AUX... I dunno. Need some input there. It's at +20mV currently, not too high.
- Has anyone played DA: Inquisition with their 390/390x? If so, did you get some massive stutters and did your FPS tank when going around forest areas? I mean, I'm usually at 60+FPS everywhere, except for open forest areas, where the framerate will drop drastically, sometimes to the high 30s but mostly mid 40s, causing massive stuttering. And by massive, I mean nigh on unplayable. Even the mouse stops responding correctly. Quite annoying actually. Only way I get around this is by disabling MSAA completely, leaving everything else at the highest settings. Which brings me to question #3.
- Do settings in CCC for games even apply? I'm trying to force AA from there, and it doesn't seem to do anything. It doesn't override the game's settings, and it doesn't change performance or image quality. What's up with that?

I'd appreciate some thoughts on my "issues", but mostly I'm happy to be here. Sign me up!

PS : 430GB/s bandwidth? That's insane. Take that Fury!


----------



## Pwned24

I have a Hexa 700W PSU. It should be more than enough according to the calculator (the rails are also sufficient).


----------



## kalidae

Quote:


> Originally Posted by *Pwned24*
> 
> I have a Hexa 700W PSU. It should be more than enough according to the calculator. (The rails are also sufficient).


What case and case fans do you have?


----------



## rdr09

Quote:


> Originally Posted by *Pwned24*
> 
> I have a Hexa 700W PSU. It should be more than enough according to the calculator. (The rails are also sufficient).


700W is enough. A Hexa, no.


----------



## Pwned24

Quote:


> Originally Posted by *kalidae*
> 
> What case and case fans do you have?


I don't think temperature is the issue (It can run on higher temps and still not crash)


----------



## Pwned24

Quote:


> Originally Posted by *rdr09*
> 
> 700W is enuf. Hexa - no.


Well, the specs on the box are good enough :/ How do you even know if your GPU isn't being fed enough power? Shouldn't it just NOT turn on?


----------



## MadPolygon

Anyone got experience with this cooler? http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/19917/alphacool-nexxxos-gpx-ati-r9-390-m01-incl.-backplate-black?c=6470


----------



## kalidae

Quote:


> Originally Posted by *Pwned24*
> 
> I don't think temperature is the issue (It can run on higher temps and still not crash)


I think it is. If you can keep the temps in the 70s you shouldn't have issues; when my 390 Nitro reaches the 80s I start having problems with freezing, frame stutter, artifacts and crashing. If it's running in the 80s or higher like you said, there is definitely a cooling issue, and if your GPU is in the 80s or higher, your VRMs would be even hotter again. I don't think these cards should be running above 80 degrees at all, which leads me to think either your room is too hot like mine, or your card isn't getting much airflow.


----------



## CamsX

Quote:


> Originally Posted by *bazookatooths*
> 
> Yes had a question for you , have you tried using the fast boot utility or super fast, everytime I try that or RAM fast boot I cant post
> 
> Also tried OC settings on the RAM , no go for me so just leave them stock for now at these speeds and frequencies I doubt it will be noticeably faster as it is already such overkill


Not really. I definitely noticed that it takes longer to post than my previous AMD kit. But was worried that enabling fast post/boot would cause problems like the one you are having, so never attempted it.


----------



## tolis626

Quote:


> Originally Posted by *kalidae*
> 
> I think it is, if you can keep the temps in the 70's you shouldn't have issues, if my 390 nitro reaches the 80s I start having problems with freezing, frame stutter artifacts and crashing. If it's running in the 80s or higher like you said then there is definitely cooling issues. If your gpu is in the 80s or higher then your vrms would be even higher again.I don't think these cards should be running above 80 degrees at all which leads me to think either your room is to hot like mine or your card isn't getting much air flow.


I said it in my post on the previous page and I'll say it again: the 390 and 390X, which essentially use the same GPU as the 290 and 290X, can be run at 90+°C no problem. There are cards that are 2 years old, that were used for mining while getting cooked at 95°C, and they still work. And those are mostly reference models. I'd assume a Sapphire or MSI 390(X) would have no problem running at 80-85°C.


----------



## rdr09

Quote:


> Originally Posted by *Pwned24*
> 
> Well the specs at the box are good enough :/ . How do you even know if your gpu isnt being fed enough power? Shouldn't it just NOT turn on?


Some PSUs go out and take other components with them. Check again whether that PSU is in this list:

http://www.overclock.net/t/183810/faq-recommended-power-supplies

Mine is not there (Cougar 700W), and it OC'ed my i7 to 4.9GHz and my 290 to 1330/1630 during benching without issue. It just whined.
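As a rough sanity check on whether 700W is enough on paper, here's a sketch. The wattages are ballpark spec-sheet figures (AMD quotes 275W typical board power for the 390, the i5-4590 is an 84W TDP part, and the rest-of-system figure is a guess), and transient spikes can exceed them, so treat this as illustrative:

```python
# Ballpark system draw vs. a 700 W PSU -- illustrative figures only.
loads_w = {
    "R9 390 (typical board power)": 275,
    "i5-4590 (TDP)": 84,
    "board, RAM, drives, fans (guess)": 100,
}

total = sum(loads_w.values())
headroom = 700 - total
print(f"Estimated draw {total} W, leaving {headroom} W of headroom")
```

On paper there's plenty of headroom, which is the point above: with a borderline unit the failure mode is usually quality (ripple, protections) rather than raw wattage.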


----------



## kalidae

Quote:


> Originally Posted by *tolis626*
> 
> I said it in my post in the previous page and I'll say it again. The 390 and 390x, which essentially use the same GPU as the 290 and 290x, can be ran at 90+C no problem. There are cards that are 2 years old, they have been used for mining while getting cooked at 95C and they still work. And these are mostly reference models. I'd suppose a Sapphire or MSI 390(x) would have no problem running at 80-85C.


Past experience with the 290s shows that yes, they can run at 90 degrees easily, no problem. My experience with my 390 is that it cannot run at 90 degrees without problems. This isn't a 290; essentially yes it is, but there have been changes. Run your 390 or 390X at close to 90 degrees and see if it's as stable as the 290s were. Mine isn't; if it were, it would still be overclocked, and I wouldn't be considering forking out another $500 just to buy a model that can take a waterblock. I love my Nitro, it looks sweet and it's quiet, but when its temps reach the 80s it becomes very unstable: Diablo 3 will freeze for a moment, Skyrim will show artifacts or even crash. If this card were capable of running at 90 degrees all day long like the previous generation, I wouldn't have any of these problems. This isn't a 290, and I don't believe they are meant to run as hot as a 290.


----------



## Stige

Your VRM is overheating, not the core.
That's why you get the artifacts. Check your VRM temps; they are probably 100°C+.


----------



## Gumbi

Quote:


> Originally Posted by *kalidae*
> 
> Past experience with the 290s show that yes they can run at 90 degrees easily no problem. My experience with my 390 is no it can not run at 90 degrees and expect to have no problems. This isn't a 290, essentially yes it is but there has been changes. Play with your 390 or 390x at close to if not 90 degrees and see if it's stable like the 290s were. Mine isn't, if it was it would still be overclocked and i wouldn't be considering forking out another $500 just to buy a model that can take a waterblock. I love my nitro, it looks sweet and it's quiet but when it's temps reach the 80s it becomes very unstable, diablo 3 will freeze for a moment, skyrim will show artifacts or even crash. If this card was capable of running at 90 degrees all day long like the previous versions then I wouldn't have any of these problems, this isn't a 290 and i don't believe they are meant to run as hot as a 290.


How much voltage are you running through them? The Tri-X cooler can handle core cooling no problem, but VRM 1 tends to get hot with them, which tells me that if your core is getting into the 80s, VRM 1 is getting very toasty.


----------



## kalidae

Quote:


> Originally Posted by *Stige*
> 
> Your VRM is overheating, not core.
> That is why you get the artifacts. Check your VRM temps; they are probably 100C+.


You sir would be right. I didn't have to check, but I know you are right. Which goes back to Pwned24: I was giving advice about his 390 Nitro crashing during GTA V. His card is overheating and he needs more cooling. Like I said to him, if his GPU is reaching 80-90 degrees, then his VRMs would be way higher.

Pwned24, the only reason I'm suggesting you need better cooling is that I have the exact same card as you, I experienced the same issue, and I resolved it by giving my Nitro better cooling.


----------



## kalidae

Quote:


> Originally Posted by *Gumbi*
> 
> How much voltage are you running through them? The Tri-X cooler can handle core cooling no problem, but VRM 1 tends to get hot with them, which tells me that if your core is getting into the 80s, VRM 1 is getting very toasty.


I'm running my 390 at stock clocks and voltage now to keep my temps low (Skyrim is maxing the GPU out at about 74). At +50 mV the card would get too hot, my GPU would reach around 84-86, so my VRMs would be way worse, and that's when I'd start having issues. As for my Skyrim: the ENB I run is known for stressing the hell out of graphics cards. The room my PC is in now (I just moved house) reaches over 30 degrees, as it's coming into summer here and the sun just smashes this room in the afternoon. My card has been stable at 1170/1750 +100mV. I dropped it back to 1140/1650 +50mV and left it at that for a fair while. Once I moved house the GPU just got too hot, because of the room but also because I started modding Skyrim more and more; heck, I get 40 fps out in an open field. If I turn on the mod "grass on steroids" I get 21 fps, and if you consider how old Skyrim is and how powerful the 390 is, you can imagine how modified my Skyrim is to get such low fps. My Skyrim is very, very rough on this card and the temps show it. Nothing else I play gives this card this kind of workout. I'm not complaining; it's a great card and it does everything I want it to do. It just needs a waterblock and it would be perfect.


----------



## Stige

It is easy to check your VRM temps with GPU-Z, so do so...
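Following up on the GPU-Z suggestion: if you turn on GPU-Z's "Log to file" option while you game, you can pull the peak VRM temps out of the log afterwards instead of watching the sensors live. A minimal, hedged sketch; the exact column headers (e.g. "VRM Temperature1 [C]") are assumptions, so check your own log's header row and adjust:

```python
# Hedged sketch: find peak VRM temps from a GPU-Z sensor log.
# GPU-Z's "Log to file" option writes a comma-separated log; the
# column names it uses may differ from the ones assumed here.
import csv

def peak_temps(log_path, keyword="VRM"):
    """Return {column_name: max_value} for every column whose header
    mentions `keyword` (e.g. the VRM temperature sensors)."""
    peaks = {}
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            for name, value in row.items():
                if name and keyword in name:
                    try:
                        v = float(value)
                    except (TypeError, ValueError):
                        continue  # skip blank or garbage samples
                    peaks[name] = max(peaks.get(name, v), v)
    return peaks

# Example: warn if either VRM ran hotter than 100C during the session
# peaks = peak_temps("GPU-Z Sensor Log.txt")
# hot = {k: v for k, v in peaks.items() if v >= 100}
# print(hot or "VRMs stayed under 100C")
```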


----------



## Noirgheos

Quote:


> Originally Posted by *kalidae*
> 
> I'm running my 390 at stock clocks and voltage now to keep my temps low (Skyrim is maxing the GPU out at about 74). At +50 mV the card would get too hot, my GPU would reach around 84-86, so my VRMs would be way worse, and that's when I'd start having issues. As for my Skyrim: the ENB I run is known for stressing the hell out of graphics cards. The room my PC is in now (I just moved house) reaches over 30 degrees, as it's coming into summer here and the sun just smashes this room in the afternoon. My card has been stable at 1170/1750 +100mV. I dropped it back to 1140/1650 +50mV and left it at that for a fair while. Once I moved house the GPU just got too hot, because of the room but also because I started modding Skyrim more and more; heck, I get 40 fps out in an open field. If I turn on the mod "grass on steroids" I get 21 fps, and if you consider how old Skyrim is and how powerful the 390 is, you can imagine how modified my Skyrim is to get such low fps. My Skyrim is very, very rough on this card and the temps show it. Nothing else I play gives this card this kind of workout. I'm not complaining; it's a great card and it does everything I want it to do. It just needs a waterblock and it would be perfect.


Sounds to me like you got a good card to OC, but not for temps...


----------



## flopper

Looks good for the Frostbite engine going forward, and for the 390 owners who will play Battlefront:

http://www.guru3d.com/articles_pages/star_wars_battlefront_beta_vga_graphics_performance_benchmarks,5.html


----------



## kalidae

Quote:


> Originally Posted by *Stige*
> 
> It is easy to check your VRM temps with GPU-Z, so do so...



Trust me, I spent days overclocking and benchmarking my Nitro to the point where I knew what it was and wasn't capable of. I know my VRM1 gets very, very hot, and I know that if my core is over 80, especially 85 or higher, that one VRM is going to cause me issues in graphically intensive games. I should have checked my VRMs in GPU-Z, but I'm at a stage where I'm done with testing and benchmarking and just want to play. When these issues started happening I was mid-game and fully into it; I didn't want to go back to checking VRMs and voltages and clocks and all that. I knew I had temp issues and I just wanted to get back into the game. Just so you know that I do know what I'm talking about, here are some tests. For my tests and benchmarks I did monitor my VRM and core temps.


1200/1750 max voltage in Trixx: ran benchmarks stable but crashed in BF4 after a little while. From memory my core reached 88 or so, my VRM1 was way too high, and the GPU fan was at 100%. I'm pretty sure my case fans were too, but that doesn't matter as I was using a different case at the time. The case I use now has much better cooling.


1170/1750 +100mV: all tests ran perfectly and gaming was completely stable. The card did run hot but it was safe. I dropped it back further because these clocks weren't really needed and I didn't want my fans running high. I ended up at 1140/1650 +50mV and it ran sweet, cool and quiet for what I was playing at the time, which was just BF4 and Diablo 3.

Then I moved house. The room is extremely hot and I started playing my modded Skyrim. It stress-tests this card harder than any benchmark or game I have run, to the point where the clocks I had been running safely for the last month were now causing overheating issues. I loaded up Afterburner and instead of a max temp of 73 or so I saw 88; previous tests told me 88 on the core means VRM1 would be at 100 or more. I should have opened GPU-Z just to see my exact VRM temps, but honestly I was over all those tests and checks and adjusting voltages and clocks. I just wanted to play, so boom, I dropped it back to stock clocks and volts and turned all my case fans to 100%, and all of a sudden I had no issues. Max GPU temp was around 74-76 when playing my Skyrim, which is as hot as BF4 got with +50mV at 1140/1650, but that was with my GPU fan at 80% and case fans at 100%, whereas when playing BF4 the case fans were at 80% or lower (they run on a hub) and the GPU fan topped out at 60%. Skyrim was able to heat my card up more at stock clocks and voltages than any game or benchmark did with a decent overclock and voltage increase. This is the game where I realised, wow, I might actually need two cards to run it perfectly with all my mods running. I also started thinking, damn it, I need a waterblock.


----------



## kalidae

Quote:


> Originally Posted by *MadPolygon*
> 
> Anyone got experience with this cooler? http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/19917/alphacool-nexxxos-gpx-ati-r9-390-m01-incl.-backplate-black?c=6470


Mate, I just want to let you know that I love you. Thank you so much for linking that waterblock; it will solve all my problems. You just saved me forking out 500 bucks to buy a 390 that will take an EK waterblock; now I can keep my Nitro and watercool it. And yes, I have seen reviews of those Alphacool waterblocks, for different cards, but the reviews were always good. Tiny Tom Logan reviewed one ages ago. I can't remember what card it was for, I just remember that he only had good things to say about it.


----------



## kizwan

Quote:


> Originally Posted by *kalidae*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MadPolygon*
> 
> Anyone got experience with this cooler? http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/19917/alphacool-nexxxos-gpx-ati-r9-390-m01-incl.-backplate-black?c=6470
> 
> 
> 
> Mate, I just want to let you know that I love you. Thank you so much for linking that waterblock; it will solve all my problems. You just saved me forking out 500 bucks to buy a 390 that will take an EK waterblock; now I can keep my Nitro and watercool it. And yes, I have seen reviews of those Alphacool waterblocks, for different cards, but the reviews were always good. Tiny Tom Logan reviewed one ages ago. I can't remember what card it was for, I just remember that he only had good things to say about it.
Click to expand...

I believe that's just a huge heatsink with a waterblock for the core only, so don't expect to get good cooling performance for the VRMs.


----------



## kalidae

Quote:


> Originally Posted by *kizwan*
> 
> I believe that's just a huge heatsink with waterblock for the core only. So don't expect to get good cooling performance for VRM.







It's a hybrid type: the plate for the memory and the VRMs is raised to make contact, and the waterblock that cools the GPU also cools the plate, which helps cool the VRMs; it has the huge heatsink too. It's actually a pretty smart design. The waterblock is reusable across different cards as well; all you have to do is buy the plate for your new card. He tried this on a 290 with a reference cooler and ran a loop overnight (he normally uses Valley), and the GPU sat at 50 degrees, where a reference 290 would normally run at 90; his hot VRM, which would normally run at 100, dropped down to the 70s. It's not going to be as good as a full-cover EK waterblock, but for a Nitro 390 it would still be an improvement and so far it's the only option, plus you could reuse it for your next card if you wanted to. I already have everything I need for a custom loop except a GPU block, so if I get this block I'll be installing my whole loop again. It's worth trying, and I think I'd rather give this a go than spend 500 on a new XFX 390, another 140 on an EK block and another 30 on a backplate. So it dropped his GPU temp by 40 degrees and his hottest VRM by a little over 20; even if it only dropped the Nitro 390's hottest VRM by 10 degrees, I'd be happy with that.


----------



## MadPolygon

Quote:


> Originally Posted by *kalidae*
> 
> Mate I just want to let you know that I love you. Thank you so much for linking that waterblock it will solve all my problems, you just saved me forking out 500 bucks just to buy a 390 that will take an ek waterblock. Now I can keep my nitro and watercool it. And yes I have seen reviews on those alphacool waterblocks,but for different cards but the reviews were always good. Tiny Tom logan reviewed one ages ago but i can't remember what card it was for I just remember that he only had good things to say about it.


Absolutely no problem

From the reviews I saw, core temp is mostly on par with an EK full-cover block. What I'm worried about are the VRM temps.

I really wish EK had made a proper full-cover block at least for the MSI and Nitro 390...

Quote:


> Originally Posted by *kizwan*
> 
> I believe that's just a huge heatsink with waterblock for the core only. So don't expect to get good cooling performance for VRM.


That's the thing I'm worried about.


----------



## kalidae

Quote:


> Originally Posted by *MadPolygon*
> 
> Absolutely no problem. From the reviews I saw, core temp is mostly on par with an EK full-cover block. What I'm worried about are the VRM temps.
> I really wish EK had made a proper full-cover block at least for the MSI and Nitro 390...
> That's the thing I'm worried about.


Yeah, I'm not sure how well it will cool the VRMs; it seems like it might do an okay job, possibly better than the Tri-X cooler, but maybe not. I'm going to give it a shot sometime soon. Even if it only cools the VRMs on par with the air cooler, at least my PC will look awesome and be quiet with a full custom loop once again.


----------



## MadPolygon

Quote:


> Originally Posted by *kalidae*
> 
> Yeah, I'm not sure how well it will cool the VRMs; it seems like it might do an okay job, possibly better than the Tri-X cooler, but maybe not. I'm going to give it a shot sometime soon. Even if it only cools the VRMs on par with the air cooler, at least my PC will look awesome and be quiet with a full custom loop once again.


Hope you post some results here when you have done some testing.

While I don't think I will have problems with VRM temps even if the GPX heatsink is only as good as the Nitro cooler, since I run my 390 Nitro at 1120MHz undervolted by quite a bit, I would still prefer some cooler temps.


----------



## KNG HOLDY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> *The OCN Official R9 390 / 390X Owners' Club!*
> 
> *Sapphire* = No water block available, but has excellent air cooling. Requires 8+8 pin connectors versus 8+6. Some would prefer that in hopes of obtaining higher OC results, but unfortunately OC results so far on the Nitro and Tri-X are only mediocre to good.


I want to buy a Sapphire Radeon R9 390 Nitro, and according to this:
http://www.alphacool.com/download/compatibility%20list%20ATI.pdf

an Alphacool NexXxoS GPX - ATI R9 390 M01 is compatible with the Sapphire Radeon R9 390 Nitro.

Is the thread owner wrong, or Alphacool? Or is it just my bad English and I ****ed up and didn't understand it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *KNG HOLDY*
> 
> I want to buy a Sapphire Radeon R9 390 Nitro, and according to this:
> http://www.alphacool.com/download/compatibility%20list%20ATI.pdf
> 
> an Alphacool NexXxoS GPX - ATI R9 390 M01 is compatible with the Sapphire Radeon R9 390 Nitro.
> 
> Is the thread owner wrong, or Alphacool? Or is it just my bad English and I ****ed up and didn't understand it?


That block was only recently discovered; I just haven't updated the OP yet.


----------



## Cannon19932006

What's the minimum you would recommend on power supply with an overclocked CPU and 2x390X running stock?


----------



## LeSwede

Quote:


> Originally Posted by *Cannon19932006*
> 
> What's the minimum you would recommend on power supply with an overclocked CPU and 2x390X running stock?


I use an AMD FX-9370 and two MSI R9 390s; during gaming my PC draws 800W+ from the wall. So anything over that, I suppose.


----------



## Cannon19932006

Quote:


> Originally Posted by *LeSwede*
> 
> I use an AMD FX-9370 and two MSI R9 390s; during gaming my PC draws 800W+ from the wall. So anything over that, I suppose.


So it'd probably be pretty close with 4.5GHz 5820k and 2 stock 390x's on an 850w.


----------



## tolis626

Quote:


> Originally Posted by *Cannon19932006*
> 
> So it'd probably be pretty close with 4.5GHz 5820k and 2 stock 390x's on an 850w.


If you don't overclock them too much, you should be okay. 800W from the wall is more like 700-750W for the system itself after factoring in PSU efficiency, and I also think the 9370 draws quite a bit more juice than your 5820K would. If you leave the cards at stock clocks, I think you'll be looking at just under 700W, maybe even 650W if you don't stress your system too much. You might even be able to overclock a bit without adding voltage.

All that applies as long as you have a good quality unit from a reputable brand.
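The wall-draw arithmetic above is easy to sketch out. Note the 90% efficiency figure and the per-component wattages below are illustrative assumptions, not measurements of any specific PSU or card:

```python
# Rough sketch of the wall-draw vs. DC-load arithmetic. The 90%
# efficiency figure and the per-part wattages are assumptions.

def dc_load_from_wall(wall_watts, efficiency=0.90):
    """The PSU delivers roughly wall_watts * efficiency to the parts."""
    return wall_watts * efficiency

def wall_from_dc_load(dc_watts, efficiency=0.90):
    """Going the other way: what the meter would read for a given load."""
    return dc_watts / efficiency

# 800 W at the wall on a ~90%-efficient unit is ~720 W of actual load:
print(round(dc_load_from_wall(800)))  # -> 720

# Two stock 390Xs (~275 W each, assumed) plus ~150 W for an OC'd CPU
# and ~75 W for the rest of the system:
load = 2 * 275 + 150 + 75
print(load, round(wall_from_dc_load(load)))
```

Which is why an 850W unit is "probably fine" for two stock 390Xs but has little headroom once you start adding voltage.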


----------



## LeSwede

Quote:


> Originally Posted by *Cannon19932006*
> 
> So it'd probably be pretty close with 4.5GHz 5820k and 2 stock 390x's on an 850w.


Not really; keep in mind that my FX-9370 draws 220W on its own. You would probably be fine with 850W.


----------



## Dundundata

All this talk about temps, I ran a little test because I know my card runs cool. Played Witcher 3 for about an hour.

MSI 390
1200/1650, +100/38

Ambient temp: ~21C
GPU: 73C (max)
VRM1: 68C
VRM2: 50C

Now, the reason I write this is that I've seen people, even with the same card, reporting much higher temps, and I was curious what the difference is. Obviously my ambient temp isn't very high, so I'm sure that helps. My case is a mid tower with the drive bay removed and a good array of fans (see profile for specs). In actuality I run a milder overclock of 1150/1650, +19, so temps are even cooler. I've learned from experience with my XFX that it could be thermal paste, since I was able to drop that card's GPU temp by 10C, but that still doesn't account for VRMs.


----------



## Gumbi

Quote:


> Originally Posted by *Dundundata*
> 
> All this talk about temps, I ran a little test because I know my card runs cool. Played Witcher 3 for about an hour.
> 
> MSI 390
> 1200/1650, +100/38
> 
> Ambient temp: ~21C
> GPU: 73C (max)
> VRM1: 68C
> VRM2: 50C
> 
> Now, the reason I write this is that I've seen people, even with the same card, reporting much higher temps, and I was curious what the difference is. Obviously my ambient temp isn't very high, so I'm sure that helps. My case is a mid tower with the drive bay removed and a good array of fans (see profile for specs). In actuality I run a milder overclock of 1150/1650, +19, so temps are even cooler. I've learned from experience with my XFX that it could be thermal paste, since I was able to drop that card's GPU temp by 10C, but that still doesn't account for VRMs.


Damn dude, 1150 core with 19mv? Ridiculous.

Those temps are also excellent. You must have low ambients coupled with very good airflow.


----------



## famous1994

Got the MSI R9 390 Gaming yesterday, so far so good. It's a decent upgrade over my 7970.


----------



## LeSwede

Quote:


> Originally Posted by *Dundundata*
> 
> All this talk about temps, I ran a little test because I know my card runs cool. Played Witcher 3 for about an hour.
> 
> MSI 390
> 1200/1650, +100/38
> 
> Ambient temp: ~21C
> GPU: 73C (max)
> VRM1: 68C
> VRM2: 50C
> 
> Now, the reason I write this is that I've seen people, even with the same card, reporting much higher temps, and I was curious what the difference is. Obviously my ambient temp isn't very high, so I'm sure that helps. My case is a mid tower with the drive bay removed and a good array of fans (see profile for specs). In actuality I run a milder overclock of 1150/1650, +19, so temps are even cooler. I've learned from experience with my XFX that it could be thermal paste, since I was able to drop that card's GPU temp by 10C, but that still doesn't account for VRMs.


My MSI R9 390 runs at 90c when playing BF4, ambient is ~23c


----------



## Gumbi

Quote:


> Originally Posted by *LeSwede*
> 
> My MSI R9 390 runs at 90c when playing BF4, ambient is ~23c


These chips LOVE high airflow setups. Quality fans feeding cool air in from the side and front helps a ton.


----------



## tolis626

For those who have already done it, on a scale of 1 to 10, how complicated is it to change the thermal paste on the MSI 390x? The whole disassembly process seems kinda daunting at first.

Also, is there any chance that MSI would honor the warranty afterwards? I'd guess the answer is "No, you just voided it", but I dunno.


----------



## Kinaesthetic

I know most people don't read or even look into the Online Deals forum but do check this thread more often, so: Newegg is currently running a $50-off promo code for ALL R9 390/390Xs on their website. It works for cards that have a mail-in rebate too, so you can stack up some serious savings.

http://www.overclock.net/t/1576349/newegg-50-off-any-r9-390-390x-promo-code-ign390/0_50#post_24488732


----------



## Dundundata

Quote:


> Originally Posted by *Gumbi*
> 
> Damn dude, 1150 core with 19mv? Ridiculous.
> 
> Those temps are also excellent. You must have low ambients coupled with very good airflow.


That was the first OC I tried, and I just played with the volts until it was stable. At that OC I'm getting 67/63/50. I will probably find a middle ground somewhere and see what I can do with ~50mV. Not that it makes much of a difference at that point, maybe a few frames.


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> For those who have already done it, on a scale of 1 to 10, how complicated is it to change the thermal paste on the MSI 390x? The whole disassembly process seems kinda daunting at first.
> 
> Also, is there any chance that MSI would honor the warranty afterwards? I'd guess the answer is "No, you just voided it", but I dunno.


Personally I'd call it about a 3 out of 10. Very easy, IMO.


----------



## Zack Foo

Quote:


> Originally Posted by *Gumbi*
> 
> These chips LOVE high airflow setups. Quality fans feeding cool air in from the side and front helps a ton.


Holy **** bro, please sort out your airflow; don't kill the card. That card is supposed to run really cool, bro.


----------



## mandrix

I have a water block coming Monday from EK, and hope to order a PowerColor PCS 390X 8GBD5-PPDHE tonight. The PowerColor GPUs I've bought in the past have been good overclockers, so I thought I would go with them again.
I missed the $399 price from the Egg, unfortunately, but they still have a $20 rebate.


----------



## Gumbi

Quote:


> Originally Posted by *mandrix*
> 
> I have a water block coming Monday from EK, and hope to order a PowerColor PCS 390X 8GBD5-PPDHE tonight. The PowerColor GPUs I've bought in the past have been good overclockers, so I thought I would go with them again.
> I missed the $399 price from the Egg, unfortunately, but they still have a $20 rebate.


The PowerColor 290(X) PCS+ cards were great clockers, and their coolers were very good (the same cooler is on their 390 edition, I believe, which tells you how solid the original design was).

Let us know how you get on with clocking that beast!


----------



## mandrix

Quote:


> Originally Posted by *Gumbi*
> 
> The PowerColor 290(X) PCS+ cards were great clockers, and their coolers were very good (the same cooler is on their 390 edition, I believe, which tells you how solid the original design was).
> 
> Let us know how you get on with clocking that beast!


I will. I'll run it without the water block for a bit first to make sure it's OK. (In the past I never even tested cards before installing the water blocks, but money is tight these days.)


----------



## DR4G00N

Alright, I'll soon be buying a 390 to replace my (sort of buggy) 780 Ti, and then another later for CrossFire.

My question is:
Sapphire NITRO vs. MSI Twin Frozr V vs. ASUS STRIX.
Which one is the best in terms of VRM, VRAM (quality, not quantity) and overclocking? I'll be using the H75 and Kraken G10 from my 780 Ti on it, so cooler performance doesn't matter at all.

I'm leaning towards the NITRO because it has dual 8-pins. (Which I know doesn't matter at all, but it looks nicer.)


----------



## Cannon19932006

Quote:


> Originally Posted by *DR4G00N*
> 
> Alright, I'll soon be buying a 390 to replace my (sort of buggy) 780 Ti, and then another later for CrossFire.
> 
> My question is:
> Sapphire NITRO vs. MSI Twin Frozr V vs. ASUS STRIX.
> Which one is the best in terms of VRM, VRAM (quality, not quantity) and overclocking? I'll be using the H75 and Kraken G10 from my 780 Ti on it, so cooler performance doesn't matter at all.
> 
> I'm leaning towards the NITRO because it has dual 8-pins. (Which I know doesn't matter at all, but it looks nicer.)


The MSI and the NITRO are pretty much equal; the STRIX has craptastic VRM cooling from what I've read.


----------



## fyzzz

I am going to ask again: does anybody have an XFX R9 390 (non-X) DD BIOS they'd be willing to share? Thanks in advance.


----------



## Hemanse

So I just got my MSI R9 390 today and so far I'm pretty happy with it. Sadly it has some coil whine, but I'm going to run it for a bit before even thinking about asking the local store I bought it from for a checkup. It's dead silent at idle; at very high fps it squeals quite a bit, but that seems to be a common thing with the newer cards. During normal gameplay I can't hear it with the case closed and open-back headphones on.

A few questions, though, if someone fancies helping:

1. Is it normal not to be able to read the VRM temps on "VRM Temperature 2" in GPU-Z? It seems to be stuck at 47C no matter how I push the card.
2. I don't have a lot of experience with overclocking and zero experience with AMD cards. So far I have been able to push 1150/1600 with +50 power limit and no added voltage; it runs Valley smoothly and tops out at 74C. Seems about right?


----------



## Sgt Bilko

Quote:


> Originally Posted by *fyzzz*
> 
> I am going to ask again: does anybody have an XFX R9 390 (non-X) DD BIOS they'd be willing to share? Thanks in advance.


Well you'd be wanting these people then:

@Superjit94, @Bigm, @xhitekredneckx, @BlaXey, @undyingbread, @CerealKillah, @bazookatooths, @FooSkiii, @Darkstalker420, @Weird0ne and @MechaDurka

And am I seriously the only person in here with an XFX DD 390X?


----------



## Dundundata

@Hemanse, that stinks about the whine; mine has none. Seems to be a random phenomenon. What kind of high FPS are you talking about?

Try HWiNFO for reading temps. Hmm, maybe I have the same issue; I'll have to check later.

That OC sounds really good. You might want to try 3DMark Fire Strike, as I think it's harder to pass.


----------



## DR4G00N

Quote:


> Originally Posted by *Cannon19932006*
> 
> The MSI and the NITRO are pretty much equal; the STRIX has craptastic VRM cooling from what I've read.


Thanks for the info. After checking out some reviews, I guess I'll be going with the MSI card because of its VRAM cooling plate (less work on my end). I've had good experiences with MSI's TF cards in the past, so I'm expecting a high-quality part.

I'll be ordering it next week.


----------



## Weird0ne

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well you'd be wanting these people then:
> 
> @Superjit94, @Bigm, @xhitekredneckx, @BlaXey, @undyingbread, @CerealKillah, @bazookatooths, @FooSkiii, @Darkstalker420, @Weird0ne and @MechaDurka
> 
> And am I seriously the only person in here with an XFX DD 390X?


Not sure how to extract bios or I would.


----------



## Hemanse

Quote:


> Originally Posted by *Dundundata*
> 
> @Hemanse, that stinks about the whine, mine has none. Seems to be a random phenomenon. What kind of high FPS are you talking about?
> 
> Try HWinfo for reading temps, hmmm maybe I have the same issue, I'll have to check later.
> 
> That OC sounds really good, you might want to try 3d mark Firestrike as I think it's harder to pass.


When I say high FPS I mean in the 500-600s. As a small test I ran League of Legends without v-sync, which gives me a very high fps count; that's when the card whines the most, but I have also heard that those high fps numbers tend to bring out the noise.

And yeah, you were right: Firestrike didn't really like that OC. I dialed it back to 1110/1550 and it runs smoothly without any artifacts. I'm going to just keep it at that; I'm not really comfortable with dialing up the voltage just yet.


----------



## fyzzz

Quote:


> Originally Posted by *Weird0ne*
> 
> Not sure how to extract bios or I would.


Press the arrow that I have marked with a red circle, then "Save to file".


----------



## Sgt Bilko

Quote:


> Originally Posted by *Weird0ne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well you'd be wanting these people then:
> 
> @Superjit94, @Bigm, @xhitekredneckx, @BlaXey, @undyingbread, @CerealKillah, @bazookatooths, @FooSkiii, @Darkstalker420, @Weird0ne and @MechaDurka
> 
> And am I seriously the only person in here with an XFX DD 390X?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure how to extract bios or I would.
Click to expand...

Run GPU-Z > next to the BIOS Version box there is an arrow symbol > hit that, then "Save to File" > upload the file to OCN as an attachment.

You might need to rename the file to .txt to upload it as an attachment though; I know from the 295X2 owners' thread there were some issues with .rom files being uploaded.
Quote:


> Originally Posted by *fyzzz*
> 
> false
> Quote:
> 
> 
> 
> Originally Posted by *Weird0ne*
> 
> Not sure how to extract bios or I would.
> 
> 
> 
> Press the arrow that I have marked with a red circle, then "Save to file".
> 
> 
Click to expand...

Please?


----------



## Dundundata

Quote:


> Originally Posted by *Hemanse*
> 
> When I say high FPS I mean in the 500-600s. As a small test I ran League of Legends without v-sync, which gives me a very high fps count; that's when the card whines the most, but I have also heard that those high fps numbers tend to bring out the noise.
> 
> And yeah, you were right: Firestrike didn't really like that OC. I dialed it back to 1110/1550 and it runs smoothly without any artifacts. I'm going to just keep it at that; I'm not really comfortable with dialing up the voltage just yet.


That's usually when it can happen; it can also come with a voltage increase, so hopefully not. You can probably run 1150 with a slight mV increase. You can always limit your frames in those games to help with the whine. I don't know if you have RivaTuner, but it comes with Afterburner and lets you cap FPS without v-sync.


----------



## undyingbread

Bro, I will give you the BIOS; I have the XFX DD edition.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dundundata*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hemanse*
> 
> When I say high FPS I mean in the 500-600s. As a small test I ran League of Legends without v-sync, which gives me a very high fps count; that's when the card whines the most, but I have also heard that those high fps numbers tend to bring out the noise.
> 
> And yeah, you were right: Firestrike didn't really like that OC. I dialed it back to 1110/1550 and it runs smoothly without any artifacts. I'm going to just keep it at that; I'm not really comfortable with dialing up the voltage just yet.
> 
> 
> 
> That's usually when it can happen; it can also come with a voltage increase, so hopefully not. You can probably run 1150 with a slight mV increase. You can always limit your frames in those games to help with the whine. I don't know if you have RivaTuner, but it comes with Afterburner and lets you cap FPS without v-sync.
Click to expand...

FRTC in CCC: you can cap it in there, but it's limited to a value between 55 and 95 fps. The upside is you don't have to use RivaTuner (the damn thing always causes issues on my rig); the downside is that if you have a 120-144Hz monitor, you miss out on the extra frames.

It works out great for me, seeing as my QNIX sits at 100Hz, so I cap it at 95fps and all is well.
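For what it's worth, a frame cap is conceptually just holding each frame until its time budget has elapsed. A toy sketch of the idea in Python (illustrative only; this is not how FRTC or RivaTuner are actually implemented):

```python
import time

def run_capped(render_frame, cap_fps=95, frames=300):
    """Call render_frame() at most cap_fps times per second by
    sleeping away whatever is left of each frame's time budget."""
    budget = 1.0 / cap_fps
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()  # the work the GPU/CPU would do each frame
        elapsed = time.perf_counter() - t0
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle instead of spinning
    return frames / (time.perf_counter() - start)  # effective fps
```

Capping like this is why coil whine quiets down: the card stops sprinting through thousands of trivial frames per second in menu screens.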


----------



## tolis626

Quote:


> Originally Posted by *DR4G00N*
> 
> Thanks for the info. After checking out some reviews I guess I'll be going with the MSI card because of it's VRAM cooling plate (less work on my end). I've had good experiences with MSI's TF cards in the past so I'm expecting a high quality part.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll be ordering it up next week.


The MSI's quality is top notch, no arguing there. BUT! If you're going to go the water cooling route, I'd say consider the Sapphire. My card throttles because of TDP limitations before it overheats or crashes. Unless water cooling for you means silent operation, in which case you're going to be more than fine. But for overclocking... I think the Nitro would be better.


----------



## Hemanse

What's the experience of other people with the MSI R9 390 when it comes to coil whine? Just curious, as it seems like a 50/50 chance when buying.


----------



## tolis626

Quote:


> Originally Posted by *Hemanse*
> 
> What's the experience of other people with the MSI R9 390 when it comes to coil whine? Just curious, as it seems like it's a 50/50 chance when buying.


Mine does whine a little, but only a little at high voltages (Over +50mV) and under full load, which is unnoticeable with any other sound source nearby, and at crazy high FPS. By crazy high, I mean something stupid like the 200+ FPS that Valley hits on the closing screen. There it's audible, but unless you plan on overclocking to play minesweeper, I think you'll be fine.


----------



## Gumbi

Quote:


> Originally Posted by *tolis626*
> 
> Mine does whine a little, but only a little at high voltages (Over +50mV) and under full load, which is unnoticeable with any other sound source nearby, and at crazy high FPS. By crazy high, I mean something stupid like the 200+ FPS that Valley hits on the closing screen. There it's audible, but unless you plan on overclocking to play minesweeper, I think you'll be fine.


You mean 2000+







And yes that's unavoidable.


----------



## tolis626

Quote:


> Originally Posted by *Gumbi*
> 
> You mean 2000+
> 
> 
> 
> 
> 
> 
> 
> And yes that's unavoidable.


Maybe...









Now that I think of it, I don't think I've ever seen a card that doesn't whine under these circumstances. Some whine more, some less, but all that I've seen do.


----------



## Snailgun

Here's my story: I bought an MSI R9 390 in July and was experiencing driver crashes and all that stuff the forums are full of. But after a week of struggling with these problems the card just stopped working at all: all I had was screen flickering on Windows startup. So I sent it to the store service, which said there was a problem with the BIOS chip and that my card was not the first one they got with this issue. They fixed it and returned it to me around a month later, at the end of August. The whole of September I was just playing MGS V without any issues, but after that GTA V, Witcher 3, COD: AW, Rainbow Six: Siege and now finally Star Wars: Battlefront still crash. Of all the graphically advanced games I tried, it seems like just BF4 and MGSV (and benchmarks) work okay. During this time I tried almost everything: Windows 8.1 and 10, drivers from 15.2 to 15.9.1, underclocking, overvolting, different MB settings. Nothing helped.
There is not much point to this post since there's a 50-page thread on the AMD forums with people having the same issues; I just wanted to speak my mind until there are solutions, if there ever are any. And excuse me for my English.


----------



## Hemanse

Quote:


> Originally Posted by *Snailgun*
> 
> Here's my story: I bought an MSI R9 390 in July and was experiencing driver crashes and all that stuff the forums are full of. But after a week of struggling with these problems the card just stopped working at all: all I had was screen flickering on Windows startup. So I sent it to the store service, which said there was a problem with the BIOS chip and that my card was not the first one they got with this issue. They fixed it and returned it to me around a month later, at the end of August. The whole of September I was just playing MGS V without any issues, but after that GTA V, Witcher 3, COD: AW, Rainbow Six: Siege and now finally Star Wars: Battlefront still crash. Of all the graphically advanced games I tried, it seems like just BF4 and MGSV (and benchmarks) work okay. During this time I tried almost everything: Windows 8.1 and 10, drivers from 15.2 to 15.9.1, underclocking, overvolting, different MB settings. Nothing helped.
> There is not much point to this post since there's a 50-page thread on the AMD forums with people having the same issues; I just wanted to speak my mind until there are solutions, if there ever are any. And excuse me for my English.


That sounds kinda crappy. My options were to either go for a 970 or a 390; the 390 seemed like the best performer, so I went for that. I have seen quite a few people complain about Windows 10 crashes, but I'm still on Windows 7 though.


----------



## Weird0ne

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Run GPU-Z > Next to the Bios Version box there is an Arrow symbol > Hit that then "Save to File" > Upload file to OCN as attachment.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You might need to rename the file to .txt to upload it as an attachment though, i know from the 295x2 owners thread there was some issues with .rom files being uploaded
> Please?


Quote:


> Originally Posted by *fyzzz*
> 
> I am going to ask again, does somebody have a xfx r9 390 (non x) dd bios and are willing to give me a copy? Thanks in advance


 XFXR9390DDBios.zip 99k .zip file


Told me I had to use a Zip.


----------



## Agent Smith1984

I've had two 390's so far, an msi and an Asus....

No coil whine, no crashes from either in Windows10..

Not saying it's not a problem for some, but i don't think it's a "thing"... Not as of yet anyways...

I've pushed the msi to the max, and it had no whine at all, not had a chance to thoroughly torture the strix yet though.

Sometimes motherboards and psu (both power ripple and poor grounding) can contribute to whine.


----------



## tangelo

Quote:


> Originally Posted by *Hemanse*
> 
> 1. Is it normal to not be able to read the VRM temps on "VRM Temperature 2" in GPU-Z? It seems to be stuck on 47C no matter how I push the card.
> 2. I don't have a lot of experience with overclocking and zero experience with AMD cards. So far I have been able to push 1150/1600 with +50 power limit and no added voltage; it runs Valley smooth and tops out at 74C. Seems about right?


1. Yes
2. Yes


----------



## Sgt Bilko

Quote:


> Originally Posted by *Weird0ne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Run GPU-Z > Next to the Bios Version box there is an Arrow symbol > Hit that then "Save to File" > Upload file to OCN as attachment.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You might need to rename the file to .txt to upload it as an attachment though, i know from the 295x2 owners thread there was some issues with .rom files being uploaded
> Please?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> I am going to ask again, does somebody have a xfx r9 390 (non x) dd bios and are willing to give me a copy? Thanks in advance
> 
> Click to expand...
> 
> XFXR9390DDBios.zip 99k .zip file
> 
> 
> Told me I had to use a Zip.
Click to expand...

Awesome stuff









Hopefully that gets the other guy what he needs and +Rep to you


----------



## bazookatooths

Anyone planning on purchasing this 390?

It's billed as the fastest R9 390 in the world!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131680&cm_re=r9_390-_-14-131-680-_-Product


----------



## Sgt Bilko

Quote:


> Originally Posted by *bazookatooths*
> 
> Anyone planning on purchasing this 390?
> 
> Says the fastest R9 390 in the world!
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131680&cm_re=r9_390-_-14-131-680-_-Product


Well it's 2 x R9 390 cores on the same PCB so yeah....it would be faster than any single GPU card


----------



## bazookatooths

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well it's 2 x R9 390 cores on the same PCB so yeah....it would be faster than any single GPU card


Well thx for the clarification, I would have never figured that out







Just posted in case anyone has yet to see it.


----------



## Waleh

Hello folks, is anyone here using the Gigabyte G1 390? I'm looking at a mini-ITX case that can only fit the Gigabyte 390 (the other 390s will not fit). How does this card perform? I know it is voltage locked, but that's not a problem for me. I'm mostly concerned about thermals and noise (I just don't want to hear a jet taking off at full load). Also, I've read that there's an issue where the GPU causes the PC to shut down; is this still a problem? Thanks guys


----------



## Sgt Bilko

Quote:


> Originally Posted by *bazookatooths*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well it's 2 x R9 390 cores on the same PCB so yeah....it would be faster than any single GPU card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well thx for the clarification, I would have never figured that out
> 
> 
> 
> 
> 
> 
> 
> Just posted in case anyone has yet to see it.
Click to expand...

It was posted a little while ago iirc, and unless you are going for 4K then the original 290X Devil 13 or a 295X2 would better suit your needs tbh.

This has more VRAM, but it's 2 x 2560 SPs vs 2 x 2816 SPs on the 290X Devil and 295X2.

Just my two cents though


----------



## Pwned24

Hi guys, why is my new Sapphire R9 390 Nitro Tri-X crashing in The Witcher 3? I removed the overclocks and everything has been reset. I set the fans to 100% in the last session and the temp did not reach 70 degrees Celsius. I played on Ultra settings, SSAO, some post-processing off, 1080p. If it helps, I have an i5 4590 3.3GHz, 16GB RAM and a 700W Hexa PSU.


----------



## Darkeylel

Quote:


> Originally Posted by *Waleh*
> 
> Hello folks, is anyone here using the Gigabyte G1 390? I'm looking at a mini-itx case that can only fit the Gigabyte 390 (the other 390's will not fit). How does this card perform? I know it is voltage locked but that's not a problem for me. I'm mostly concerned about thermals and noise (I just don't want to hear a jet taking off at full load). Also, I've read that there's an issue where the GPU causes the PC to shutdown, is this still a problem? Thanks guys


I have the 390X and it runs fine. It does get a bit noisy after a couple of hours of playing The Witcher 3, but it's not as noisy as my 7950 under full load.


----------



## tbob22

Not using Afterburner anymore. I had a manual fan curve set, and it seems like after I put the machine to sleep it automatically sets the fan to manual and 0%. I was playing a game and it started stuttering... looked at the temp and it was at 98C. Great.

I decided to test it: closed Afterburner, reopened it and put the machine to sleep... sure enough, it sets the fan to manual and 0% on wake. I wonder how long that has been going on. I've set an alarm for GPU temp now; hopefully my GPU doesn't randomly die.

Afterburner uninstalled.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Waleh*
> 
> Hello folks, is anyone here using the Gigabyte G1 390? I'm looking at a mini-itx case that can only fit the Gigabyte 390 (the other 390's will not fit). How does this card perform? I know it is voltage locked but that's not a problem for me. I'm mostly concerned about thermals and noise (I just don't want to hear a jet taking off at full load). Also, I've read that there's an issue where the GPU causes the PC to shutdown, is this still a problem? Thanks guys


Hi. I have an MSI R9 390 and I'm happy with it. It took me some time though to get the temps down, and I think I still have room for more temp reduction, but that's fine tuning. Not sure about the Gigabyte though.
If your build must be a mini-ITX case and "only" the Gigabyte G1 will fit, I would actually suggest not going for a 390. Why? Mainly because the potential for good airflow is limited in mini-ITX cases. There are better cards built especially for mini-ITX cases: the Fury Nano, the mini GTX 970 from Asus (I think?) and some others.


----------



## AverdanOriginal

Quote:


> Originally Posted by *tbob22*
> 
> Not using Afterburner anymore. I had a manual fan curve set, and it seems like after I put the machine to sleep it automatically sets the fan to manual and 0%. I was playing a game and it started stuttering... looked at the temp and it was at 98C. Great.
> 
> I decided to test it: closed Afterburner, reopened it and put the machine to sleep... sure enough, it sets the fan to manual and 0% on wake. I wonder how long that has been going on. I've set an alarm for GPU temp now; hopefully my GPU doesn't randomly die.
> 
> Afterburner uninstalled.


Hi, I have a similar problem: when I change my overclocking settings in Afterburner more than 3-4 times, it gets stuck. Actually I got a black screen (and not because the overclock was too high). I was thinking of switching to AMD OverDrive, but then I can't adjust the fan curve.
Anyone else experiencing instability, maybe through Afterburner?


----------



## Derek129

I have the Gigabyte R9 390 and would not recommend it at all. Yes, the size of it is perfect, but it is very loud while gaming. I also have the shutdown issue where the card will not let the PC shut down or go into sleep mode; it just keeps booting right back up. I'm actually just waiting for it to come back from RMA. They said there is absolutely nothing wrong with it after testing with multiple configurations, which is total BS; I sent it in for a reason... I'll probably sell it once I get it back.


----------



## Derek129

Also forgot to mention: it crashes like crazy in GTA V and other games, and FPS has been terrible in every game.


----------



## Dundundata

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've had two 390's so far, an msi and an Asus....
> 
> No coil whine, no crashes from either in Windows10..
> 
> Not saying it's not a problem for some, but i don't think it's a "thing"... Not as of yet anyways...
> 
> I've pushed the msi to the max, and it had no whine at all, not had a chance to thoroughly torture the strix yet though.
> 
> Sometimes motherboards and psu (both power ripple and poor grounding) can contribute to whine.


It's weird, because I tried 2 XFX cards and they both had whine, so I was either unlucky twice or they just didn't like my setup. Then the MSI is quiet as a mouse (a mouse with fans).


----------



## mandrix

I was able to go ahead and order my PowerColor PCS 390X, should be delivered Wednesday. The EK block I had to order directly from EK and it will be here Monday...it made the trip from Slovenia to the US in one day, that's pretty cool.


----------



## Noirgheos

Is anyone here having those DX11 crashing issues reported in the AMD forums?


----------



## DarX098

Hello guys, is a 650W Gold PSU enough for a Nitro R9 390? Any limitations for overclocking?


----------



## Gumbi

Quote:


> Originally Posted by *DarX098*
> 
> Hello guys, is a 650W Gold PSU enough for a Nitro R9 390? Any limitations for overclocking?


What model?

And no, it should be perfect; even at +100mV these cards draw less than 550W total, depending on system config. You're good


----------



## kalidae

Quote:


> Originally Posted by *DarX098*
> 
> Hello guys, is a 650W Gold PSU enough for a Nitro R9 390? Any limitations for overclocking?


I'm running my Nitro 390 with a 650W Gold PSU, a Corsair RM650. It's fine, and I didn't hit any PSU limits with my overclocks, and that was also with a 4690K overclocked to 4.5GHz
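Rough numbers back this up. Here's a quick back-of-the-envelope sketch; all figures are ballpark assumptions, not measurements (roughly 275 W stock board power for a 390, about 75 W extra for a +100 mV overclock, around 120 W for an overclocked 4690K, and 75 W for drives, fans and the motherboard):

```python
# Rough PSU headroom estimate for an overclocked R9 390 build.
# Every wattage figure here is an assumed ballpark, not a measurement.

def psu_headroom(psu_watts, gpu_watts, cpu_watts, other_watts=75):
    """Return remaining watts after summing the estimated component draw."""
    total = gpu_watts + cpu_watts + other_watts
    return psu_watts - total

# ~275 W stock board power plus ~75 W for a +100 mV overclock,
# ~120 W for an overclocked 4690K, default 75 W for the rest.
headroom = psu_headroom(psu_watts=650, gpu_watts=275 + 75, cpu_watts=120)
print(headroom)  # 105 -> about 100 W to spare on a 650 W unit
```

Even with pessimistic estimates there's comfortable headroom on a quality 650 W unit, which matches the experience above.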


----------



## TristanL

It seems like my friend is having some annoying issues with his 390 (Gigabyte G1): many driver crashes in the Battlefront beta and ARK, for example (the latest beta driver for Battlefront is installed atm). Some say the 300 series might have some DX11 problems, and that the 15.15 driver and/or a BIOS update might help. Any thoughts?

(Rig: Xeon E3, Gigabyte H97 board, Windows 10 x64 Pro; I might deliver detailed information later since I don't have it in my head right now.)


----------



## flopper

Quote:


> Originally Posted by *Noirgheos*
> 
> Is anyone here having those DX11 crashing issues reported in the AMD forums?


Normally it's user error.
I know that isn't what people want to hear, but:
do you have the latest BIOS for the motherboard?
Did you do a fresh install over the previous driver and card?
Did you overclock, and is it stable for gaming?

It's more often a driver issue from another component that the user neglected to upgrade/update.
If that isn't the case, it's either a GPU hardware or software issue.
Most don't do proper troubleshooting, and that causes problems for any driver team trying to replicate the issue; that's how they work: replicate the bug in the game so they can fix it.

If you have issues, reset everything to default, isolate and test each component if possible, update all drivers, etc.
I had some issues with the Battlefront beta, so I figured changing a setting or two in the game would fix it, and it did.
I could have complained and waited for a fix, but a beta is 4 days, so I tested a few things and then it was playable.


----------



## Snailgun

Quote:


> Originally Posted by *flopper*
> 
> Normally it's user error.
> I know that isn't what people want to hear, but:
> do you have the latest BIOS for the motherboard?
> Did you do a fresh install over the previous driver and card?
> Did you overclock, and is it stable for gaming?
> 
> It's more often a driver issue from another component that the user neglected to upgrade/update.
> If that isn't the case, it's either a GPU hardware or software issue.
> Most don't do proper troubleshooting, and that causes problems for any driver team trying to replicate the issue; that's how they work: replicate the bug in the game so they can fix it.
> 
> If you have issues, reset everything to default, isolate and test each component if possible, update all drivers, etc.
> I had some issues with the Battlefront beta, so I figured changing a setting or two in the game would fix it, and it did.
> I could have complained and waited for a fix, but a beta is 4 days, so I tested a few things and then it was playable.


I'm offended by this post. Everyone who has a normally functioning R9 390 must check their privilege.

If you look at my post 2 pages earlier, you can see that I did everything you mentioned. Could you say more about what issues you had and what solution you found?


----------



## tangelo

Quote:


> Originally Posted by *Noirgheos*
> 
> Is anyone here having those DX11 crashing issues reported in the AMD forums?


Yup.


----------



## Roninnn

Hello everyone,

I'm new to the club. Unfortunately I can't seem to get my card to work







It just refuses to send a signal to the monitor/TV. It's a Sapphire 390 Nitro and the monitor is a Dell U2515H. My mobo is kinda old (GA-890FXA-UD5), so I'm thinking it might be the culprit. Can't be the PSU because it's 850w [email protected] The fans are spinning and the LED is on, but my TV and my monitor don't detect anything coming from the PC. I tried using HDMI-to-HDMI and mDP-to-DP, swapped PCI-E slots, replugged connectors... nothing helps. I know for a fact that the card is not DOA, because it works flawlessly on another system. What do you guys think I should do?


----------



## flopper

Quote:


> Originally Posted by *Roninnn*
> 
> Hello everyone,
> 
> I'm new to the club. Unfortunately I can't seem to get my card to work
> 
> 
> 
> 
> 
> 
> 
> It just refuses to send signal to monitor/tv. It's a Sapphire 390 nitro and the monitor is Dell u2515h. My mobo is kinda old(GA-890FXA-UD5) so I'm thinking it might be the culprit. Can't be the PSU because it's 850w [email protected] The fans are spinning and the led is on but my tv and my monitor don't detect anything coming from the pc. I tried using HDMI-to-HDMI and mDP-to-DP, swapped PCI-E slots, replugged connectors... nothing helps. I know for a fact that the card is not DOA, because it works flawlessly on another system. What do you guys think I should do?


The motherboard BIOS is the likely culprit; you probably need a newer one.
Also check the cable, and try another monitor on the computer.


----------



## tangelo

Quote:


> Originally Posted by *Roninnn*
> 
> Hello everyone,
> 
> I'm new to the club. Unfortunately I can't seem to get my card to work
> 
> 
> 
> 
> 
> 
> 
> It just refuses to send signal to monitor/tv. It's a Sapphire 390 nitro and the monitor is Dell u2515h. My mobo is kinda old(GA-890FXA-UD5) so I'm thinking it might be the culprit. Can't be the PSU because it's 850w [email protected] The fans are spinning and the led is on but my tv and my monitor don't detect anything coming from the pc. I tried using HDMI-to-HDMI and mDP-to-DP, swapped PCI-E slots, replugged connectors... nothing helps. I know for a fact that the card is not DOA, because it works flawlessly on another system. What do you guys think I should do?


I had this same problem after I installed my MSI. I needed to enable the integrated GPU on my CPU; then it booted into Windows with it, and after a reboot the card worked. *shrug*

Try googling "R9 390 install black screen"


----------



## bazookatooths

Quote:


> Originally Posted by *Weird0ne*
> 
> XFXR9390DDBios.zip 99k .zip file
> 
> 
> Told me I had to use a Zip.


Had a question: is your BIOS 015.049.000.000.000000?


----------



## Hemanse

Hm this is what my MSI R9 390 sounds like under load:






The sound is a bit low due to the phone recording, but that sounds rather crappy to me. That is with the side panel on, recording next to the PC. Am I just being picky, or is that bad coil whine?


----------



## Dundundata

Sounds like bad coil whine


----------



## DR4G00N

For those of you with R9 390 Gaming 8G or 390 NITRO's,

What are your max OCs, and what problems (if any) did you encounter while OC'ing (TDP limits, temp limits, etc.)?


----------



## Liranan

Quote:


> Originally Posted by *TristanL*
> 
> It seems like my friend is having some annoying issues with his 390 (Gigabyte G1): many driver crashes in the Battlefront beta and ARK, for example (the latest beta driver for Battlefront is installed atm). Some say the 300 series might have some DX11 problems, and that the 15.15 driver and/or a BIOS update might help. Any thoughts?
> 
> (Rig: Xeon E3, Gigabyte H97 board, Windows 10 x64 Pro; I might deliver detailed information later since I don't have it in my head right now.)


Do drivers crash in idle (normal desktop use)? If so the card is bad and needs to be returned as this usually only happens when the GPU or RAM are defective.


----------



## BradleyW

Quote:


> Originally Posted by *TristanL*
> 
> It seems like my friend is having some annoying issues with his 390 (Gigabyte G1): many driver crashes in the Battlefront beta and ARK, for example (the latest beta driver for Battlefront is installed atm). Some say the 300 series might have some DX11 problems, and that the 15.15 driver and/or a BIOS update might help. Any thoughts?
> 
> (Rig: Xeon E3, Gigabyte H97 board, Windows 10 x64 Pro; I might deliver detailed information later since I don't have it in my head right now.)


AMDMatt over at OverclockersUK said AMD are aware that the R9 300 series are crashing a lot in various games. A fix should be seen soon. Either that or you have a defective card. Also, some cards can crash a lot when they are becoming faulty, but this is not likely.


----------



## Superjit94

Quote:


> Originally Posted by *Roninnn*
> 
> Hello everyone,
> 
> I'm new to the club. Unfortunately I can't seem to get my card to work
> 
> 
> 
> 
> 
> 
> 
> It just refuses to send signal to monitor/tv. It's a Sapphire 390 nitro and the monitor is Dell u2515h. My mobo is kinda old(GA-890FXA-UD5) so I'm thinking it might be the culprit. Can't be the PSU because it's 850w [email protected] The fans are spinning and the led is on but my tv and my monitor don't detect anything coming from the pc. I tried using HDMI-to-HDMI and mDP-to-DP, swapped PCI-E slots, replugged connectors... nothing helps. I know for a fact that the card is not DOA, because it works flawlessly on another system. What do you guys think I should do?


I had a similar issue. I just uninstalled and completely removed my drivers, cleared the CMOS and reinstalled the drivers, and that fixed it. See if that works.


----------



## tolis626

Hey guys, could anyone hazard a guess about the maximum "safe" AUX voltage? Has anyone been running a high aux voltage for a long time? I really want to hit that 1700MHz memory on my MSI, but I've gone up to +25mV aux and I can do 1680MHz. 1700MHz would take +30mV or more, and I'm already somewhat skeptical about +25mV.


----------



## Allan P

Just posting my screenshots to join the group. I've had my XFX 390x for a few weeks so far and I love it. It's a huge upgrade from my GTX 650.


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, could anyone hazard a guess about the maximum "safe" AUX voltage? Has anyone been running a high aux voltage for a long time? I really want to hit that 1700MHz memory on my MSI, but I've gone up to +25mV aux and I can do 1680MHz. 1700MHz would take +30mV or more, and I'm already somewhat skeptical about +25mV.


I've run up to +60mv for running Heaven. I tried higher but it didn't do anything but add heat. I actually went up to +100mv, nothing was gained though in my case.


----------



## Liranan

Quote:


> Originally Posted by *BradleyW*
> 
> AMDMatt over at OverclockersUK said AMD are aware that the R9 300 series are crashing a lot in various games. A fix should be seen soon. Either that or you have a defective card. Also, some cards can crash a lot when they are becoming faulty, but this is not likely.


It's very common for this to happen. It's happened to me numerous times with both AMD and nVidia cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Allan P*
> 
> Just posting my screenshots to join the group. I've had my XFX 390x for a few weeks so far and I love it. It's a huge upgrade from my GTX 650.


I'm not alone anymore!!









Speaking of the 390X though, a friend picked up 3 XFX DDs a few days ago and it turns out all 3 will do 1150 core on stock voltage... hopefully yours will too


----------



## TristanL

Quote:


> Originally Posted by *BradleyW*
> 
> AMDMatt over at OverclockersUK said AMD are aware that the R9 300 series are crashing a lot in various games. A fix should be seen soon. Either that or you have a defective card. Also, some cards can crash a lot when they are becoming faulty, but this is not likely.


No problems occur in other games with a heavy load. Either way, my friend decided to send it back; he was too annoyed by it :/


----------



## tangelo

Quote:


> Originally Posted by *Hemanse*
> 
> Hm this is what my MSI R9 390 sounds like under load:
> 
> 
> 
> 
> 
> 
> The sound is a bit low due to the phone recording, but that sounds rather crappy to me. That is with the side panel on, recording next to the PC. Am I just being picky, or is that bad coil whine?


Damn. Mine is whisper quiet


----------



## flopper

Quote:


> Originally Posted by *DR4G00N*
> 
> For those of you with R9 390 Gaming 8G or 390 NITRO's,
> 
> What are your max OCs, and what problems (if any) did you encounter while OC'ing (TDP limits, temp limits, etc.)?


I run 1140MHz on the core; 1170MHz is doable, but 1140 is what I normally aim for.
Pushing the Battlefront beta showed that the card gets hot, and I wouldn't push it beyond 1140 myself.
It's still the only game I've tried that actually pushes the card somewhat.
Quote:


> Originally Posted by *tangelo*
> 
> Damn. Mine is whisper quiet


same with normal use and most gaming.


----------



## TsukikoChan

Hi all,
Long-time lurker, and hoping to be a proud owner of a 390X very soon. I've been keeping up with the thread for the past few dozen pages, and I have a wee question for the owners. I was waiting on the Furies for ages and decided not to go for them, since they aren't as good value for money as the 390X, which I can switch out for an HBM2 card next year or the year after.

I've narrowed my choice down to either the MSI or the Sapphire 390X. I know the main differences appear to be VRM & core temps, length, warranty and overclockability, but which would be the better purchase for someone who doesn't intend to push the OC too heavily (at the start at least; once games start becoming sluggish I will push it hard)? I use a Fractal R4 case so I have decent cooling in place, I play at 1080p on a 120Hz monitor (I'll move up to 1440p once a 120/144Hz + FreeSync monitor becomes available and affordable), and I'm currently on a 7870, so an upgrade is long overdue

ty guys! <3


----------



## DR4G00N

Quote:


> Originally Posted by *flopper*
> 
> I run 1140MHz on the core; 1170MHz is doable, but 1140 is what I normally aim for.
> Pushing the Battlefront beta showed that the card gets hot, and I wouldn't push it beyond 1140 myself.
> It's still the only game I've tried that actually pushes the card somewhat.
> Same with normal use and most gaming.


Thanks a bunch.









It would be great if more people would post their oc's.


----------



## gupsterg

Quote:


> Originally Posted by *Weird0ne*
> 
> XFXR9390DDBios.zip 99k .zip file
> 
> 
> Told me I had to use a Zip.


Quote:


> Originally Posted by *bazookatooths*
> 
> Had a question is your BIOS 015.049.000.000.000000


That bios is 015.049.000.000.000000


----------



## Streetdragon

My R9 390 Nitro is cool enough on the GPU, but VRM1 hits 90+ while gaming and VRM2 hits 90+ too, while at idle VRM1 is OK but VRM2 runs at 80-90 (I have 2 screens).
So I'm a bit pissed that I didn't know the VRMs only have a cheap little heatspreader on them -.-
http://www.alphacool.com/product_info.php/info/p1652_Alphacool-NexXxoS-GPX---ATI-R9-390-M01---incl--backplate---black.html
This one would fit, but I need some fittings and I can't find out which size I need.
Can I use this one? http://www.highflow.nl/aansluitingen/fittingen/bitspower/10mm-3-8-id-5-8-od-bitspower-schroeffitting-90-graden-rotary-g1-4-matt-black-bp-mb90r2lcpf-cc3.html


----------



## Dundundata

Quote:


> Originally Posted by *DR4G00N*
> 
> Thanks a bunch.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It would be great if more people would post their oc's.


MSI 390
1200/1650, +100/38

Ambient temp: ~21C
GPU: 73C (max)
VRM1: 68C
VRM2: 50C
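For a sense of what a memory overclock like that 1650 MHz buys, effective bandwidth on Hawaii/Grenada's 512-bit bus can be estimated from the clock that GPU-Z/Afterburner report. This is a sketch, assuming GDDR5's usual 4 transfers per reported clock cycle (the 390's stock 1500 MHz works out to the advertised 384 GB/s under that assumption):

```python
# Estimate effective memory bandwidth from the memory clock reported
# by GPU-Z/Afterburner, assuming 4 GDDR5 transfers per reported cycle.

def mem_bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    """Effective bandwidth in GB/s for a GDDR5 card with the given bus."""
    transfers_per_sec = mem_clock_mhz * 1e6 * 4   # effective data rate
    return transfers_per_sec * bus_width_bits / 8 / 1e9

print(mem_bandwidth_gbs(1500))  # stock 390: 384.0 GB/s
print(mem_bandwidth_gbs(1650))  # the 1650 MHz OC above: 422.4 GB/s
```

So a 1650 MHz memory OC is roughly a 10% bandwidth bump over stock.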


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dundundata*
> 
> MSI 390
> 1200/1650, +100/38
> 
> Ambient temp: ~21C
> GPU: 73C (max)
> VRM1: 68C
> VRM2: 50C


Looks good man, those cards (in my opinion) will do that pretty readily so long as the airflow in the case is good and the fan profile is "custom".

Your ambient temp is ideal....


----------



## Dundundata

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks good man, those cards (in my opinion) will do that pretty readily so long as the air flow is good in the case, and the fan profile is "custom"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You ambient temp is ideal....


I could probably tweak it a bit, maybe add some more to the memory, but I'm running 1150/1650 normally as that's plenty. I just wanted to hit that 1200 mark







Ambient might actually get worse come winter with the windows shut and the heat on, but we'll see! I spent some time clearing space in the case, routing cables, and trying to get things out of the way... not the cleanest setup out there, but it works. I highly recommend the Noctua fans; they move a lot of air and are quiet. I've been running them at 100% and they are not very loud at all. It's probably unnecessary, but a few are hooked up via Molex so there's no fan control for them atm.


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> MSI 390
> 1200/1650, +100/38
> 
> Ambient temp: ~21C
> GPU: 73C (max)
> VRM1: 68C
> VRM2: 50C


Man, my GPU won't get close to that. It can do close to 1200MHz, but it can't keep it up because it will throttle the voltage back to more sane levels, which causes artifacting. I think it has to do with temps. It's not like 80 or 90C will damage the card, but it certainly gets less stable over 80C. At this point I do have to sort of apologize to kalidae; he was right that going over 80C does mess with stability.

And also, seeing your temps and the fact that the ambient temperature here is in the 20-25C range too this time of year, I can come to only one conclusion: my GPU must be so gunked up with thermal paste that it's not even funny. But alas, I ***** bricks each time I think about tearing it apart to replace the TIM. The thought of it dying on me and having MSI tell me "No warranty for you, dude" frightens me more than I care to describe.









PS : Is it me or do MSI 390's tend to overclock better than their 390x counterparts?


----------



## Dundundata

I'm not sure what's involved with replacing the TIM on the MSI. It was fairly easy on the XFX: 4 screws and the entire heatsink/fan assembly separated from the GPU. They did have little stickers on 2 of the screws, so the warranty may be voided. I would look into it though, because if it's just a paste issue it should be an easy fix.


----------



## DR4G00N

Quote:


> Originally Posted by *Dundundata*
> 
> MSI 390
> 1200/1650, +100/38
> 
> Ambient temp: ~21C
> GPU: 73C (max)
> VRM1: 68C
> VRM2: 50C


Nice clocks & temps.








Quote:


> Originally Posted by *Dundundata*
> 
> I'm not sure what's involved with replacing the TIM on the MSI. It was fairly easy on the XFX: 4 screws and the entire heatsink/fan assembly separated from the GPU. They did have little stickers on 2 of the screws, so the warranty may be voided. I would look into it though, because if it's just a paste issue it should be an easy fix.


Both MSI and XFX don't really care if you remove the cooler (even if it has stickers) as long as it's not damaged in the process. At least that's how it was for the HD 7000 series, don't see why they would have changed it.

The MSI is looking to be my best bet here. Since it's going to be water-cooled, and with winter right around the corner (<15°C ambient), I should be able to get quite a respectable OC out of it.


----------



## Allan P

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Allan P*
> 
> Just posting my screenshots to join the group. I've had my XFX 390x for a few weeks so far and I love it. It's a huge upgrade from my GTX 650.
> 
> 
> 
> I'm not alone anymore!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Speaking of the 390x though, a friend picked up 3 XFX DD's a few days ago and it turns out all 3 will do 1150 core on stock voltage.....hopefully yours will too
Click to expand...

That's really great to know. I was wondering how high I could increase it before it would need more voltage. So far I'm at 1100 core on stock voltage. It's a good thing that it's cooled by my custom loop.







Quote:


> Originally Posted by *DR4G00N*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dundundata*
> 
> MSI 390
> 1200/1650, +100/38
> 
> Ambient temp: ~21C
> GPU: 73C (max)
> VRM1: 68C
> VRM2: 50C
> 
> 
> 
> Nice clocks & temps.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dundundata*
> 
> I'm not sure what's involved with replacing the TIM on the MSI. It was fairly easy on the XFX: 4 screws and the entire heatsink/fan assembly separated from the GPU. They did have little stickers on 2 of the screws, so the warranty may be voided. I would look into it though, because if it's just a paste issue it should be an easy fix.
> 
> Click to expand...
> 
> Both MSI and XFX don't really care if you remove the cooler (even if it has stickers) as long as it's not damaged in the process. At least that's how it was for the HD 7000 series, don't see why they would have changed it.
> 
> The MSI is looking to be my best bet here. With it going to be WC'ed and the fact that winter is right around the corner ( <15c ambient) I should be able to get quite a respectable oc out of it.
Click to expand...

XFX will honor the warranty even if you remove the cooler as long as you're in the US. I think that the warranty void sticker only takes effect if you are from outside of this country.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Allan P*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Allan P*
> 
> Just posting my screenshots to join the group. I've had my XFX 390x for a few weeks so far and I love it. It's a huge upgrade from my GTX 650.
> 
> 
> 
> I'm not alone anymore!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Speaking of the 390x though, a friend picked up 3 XFX DD's a few days ago and it turns out all 3 will do 1150 core on stock voltage.....hopefully yours will too
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> That's really great to know. I was wondering how high I could increase it before it would need more voltage. So far I'm at 1100 core on stock voltage. It's a good thing that it's cooled by my custom loop.
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DR4G00N*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dundundata*
> 
> MSI 390
> 1200/1650, +100/38
> 
> Ambient temp: ~21C
> GPU: 73C (max)
> VRM1: 68C
> VRM2: 50C
> 
> Click to expand...
> 
> Nice clocks & temps.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dundundata*
> 
> I'm not sure what's involved with replacing the TIM on the MSI. It was fairly easy on the XFX: 4 screws and the entire heatsink/fan assembly separated from the GPU. They did have little stickers on 2 of the screws, so the warranty may be voided. I would look into it though, because if it's just a paste issue it should be an easy fix.
> 
> Click to expand...
> 
> Both MSI and XFX don't really care if you remove the cooler (even if it has stickers) as long as it's not damaged in the process. At least that's how it was for the HD 7000 series, don't see why they would have changed it.
> 
> The MSI is looking to be my best bet here. With it going to be WC'ed and the fact that winter is right around the corner ( <15c ambient) I should be able to get quite a respectable oc out of it.
> 
> Click to expand...
> 
> XFX will honor the warranty even if you remove the cooler as long as you're in the US. I think that the warranty void sticker only takes effect if you are from outside of this country.
Click to expand...

^ Correct......XFX will only honor the warranty after removing the cooler if you are in the US.

Very important to mention that


----------



## tolis626

Quote:


> Originally Posted by *Sgt Bilko*
> 
> ^ Correct......XFX will only honor the warranty after removing the cooler if you are in the US.
> 
> Very important to mention that


Damn you Americans and your privileges!









Seriously though, ever tried buying electronics from Europe? Yikes. Prices are insane, deals are rare and after sales service is usually lacking.

Anyways... I may bite the bullet and paste that MSI with some MX4 I have lying around. It served me well on my CPU, so... Yeah.


----------



## gupsterg

Quote:


> Originally Posted by *Streetdragon*
> 
> My R9 390 Nitro is cool enough on the GPU, but VRM1 hits 90+ while gaming and VRM2 hits 90+ too, while at idle VRM1 is OK but VRM2 runs @ 80-90 (I have 2 screens).
> So I'm a bit pissed that I didn't know the VRMs only have a cheap little heatspreader on them -.-


Looking at images of the 390 Nitro & 390X Tri-X, they don't use a little heat spreader; the VRMs are connected to the main plate of the cooler just like on the Tri-X 290/X.

Link:- http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390-nitro-8gb-review/4/
Link:- http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/2/

The 290 Tri-X I used to own never touched 90C on the VRMs while OC'd, and I'd very much expect a 390/X Nitro/Tri-X to behave similarly. Not saying your experience is wrong.








IIRC the highest was below 80C on VRM 1 & high 50s for VRM 2 with an OC of 1100/1475 +25mv.

Perhaps the backing on the thermal pads is still there? See this thread.


----------



## Dundundata

Quote:


> Originally Posted by *Hemanse*
> 
> 1. Is it normal not being able to read the VRM temps on "VRM Temperature 2" in GPU-Z, seems to be stuck on 47c no matter how i push the card


Huh, I'm having the same issue; it seems to change by a few degrees on reboot but stays at ~50C.


----------



## kalidae

Quote:


> Originally Posted by *tolis626*
> 
> At this point I do have to sort of apologize to kalidae. He was right that over 80C does mess with stability.


Don't worry about it brother. I worked out very early on that this card doesn't like being over 80C. I had a Define R4 case with a custom watercooling loop for the CPU, with all the fans, both radiator and case, set to 5v to make it dead silent. When I put the 390 in and started playing Diablo 3, the game basically sat at 144fps for the most part, but when there were lots of mobs around and the GPU had to actually put in effort, it would freeze for about 1 to 2 seconds. I noticed that my core temps were only 83 degrees or sometimes a bit more, and I thought if this is basically a 290 it should handle way hotter than that. As an experiment I turned all the case fans up to 12v, my temps dropped into the 70s, and all of a sudden my freezing problems were gone. When I started adding voltage to the card for overclocks, as soon as my core temps hit the 80s I would start having problems again. I ended up leaving the door of the Define R4 open because it restricts airflow a lot. In the end I couldn't have the silent PC it once was, because this card just requires so much more cooling than my previous 270x. I eventually bought a new case designed for airflow and moved away from the silent design.


----------



## Gumbi

Quote:


> Originally Posted by *gupsterg*
> 
> Looking at images of 390 Nitro & 390X Tri-X they don't use little heat spreader, but the are connected to the main plate of cooler just like the Tri-X 290/X.
> 
> Link:- http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390-nitro-8gb-review/4/
> Link:- http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/2/
> 
> The 290 Tri-X I used to own never touched 90c on VRMs, was OC'd and I'd very much think a 390/X Nitro/Tri-X would behave similar, not saying experience is wrong
> 
> 
> 
> 
> 
> 
> 
> .
> 
> IIRC highest was below 80c on VRM 1 & high 50s for VRM 2 with a OC of 1100 / 1475 +25mv.
> 
> Perhaps the backing off the thermal pads is still there? view this thread.


For only 25mv that is kind of hot for VRM 1... it easily indicates it would go over 90 if you pumped 100mv through it.


----------



## Exenth

I have a question for the Sapphire R9 390 owners here, what is the difference between the 2 BIOSes, which you can switch with the button on the card?


----------



## mandrix

My PCS 390X is supposed to be here tomorrow, which is good since I've pretty much killed my 7950's.







I took one out a few days ago and yesterday the other one started causing reboots and artifacting. Both cards are 3 months out of warranty so that's that and I'm running on the iGPU. Shame, both cards were great overclockers at one time.
Anyone need EK waterblocks/backplates for reference 7950's cheap? lol.


----------



## tangelo

Quote:


> Originally Posted by *Dundundata*
> 
> huh I am having the same issue, it seems to change by a few degrees on reboot but stays ~50C


Like I previously said, it's completely normal. There is only one VRM sensor on the MSI.


----------



## Pwned24

I have a Sapphire R9 390 Nitro, and while GPU temps don't go over 75-78C, my VRM temps get real hot (the highest I got was 90+). This is with 100% fans.
My settings:
1140/1600
+63 voltage
Allow 50% more power (the highest power I recorded in GPU-Z was 300 watts)


----------



## kalidae

Quote:


> Originally Posted by *Exenth*
> 
> I have a question for the Sapphire R9 390 owners here, what is the difference between the 2 BIOSes, which you can switch with the button on the card?


I'm pretty sure one BIOS is stock and can't be overclocked, and the other can. I never really played around with it on my Sapphire 390; I just installed the card, pushed the button straight away, and haven't touched it since.


----------



## TsukikoChan

anyone help? :-(
Quote:


> Originally Posted by *TsukikoChan*
> 
> Hi all,
> Long time lurker, and hoping to be a proud owner of a 390x very soon. I've been keeping up with the thread for the past few dozen pages and I have a wee question for the owners. I was waiting on the Furies for ages but decided not to go for them, since they aren't as good value for money as the 390x, which I can switch out for an HBM2 card next year or the year after.
> 
> I've narrowed my choice down to either the MSI or Sapphire 390x, and I know the main differences appear to be VRM & core temps, length, warranty and overclockability. But which would be the better purchase for someone who doesn't intend to push the OC too heavily (at the start at least; once games start becoming sluggish I will push it hard)? I use a Fractal R4 case so I've got decent cooling in place. I play at 1080p on a 120hz monitor (will move up to 1440p once a 120/144hz + FreeSync monitor becomes available and affordable), and I'm currently on a 7870, so an upgrade is long overdue
> 
> ty guys! <3


----------



## tolis626

Quote:


> Originally Posted by *kalidae*
> 
> Don't worry about it brother. I worked out very early on that this card doesn't like being over 80C. I had a Define R4 case with a custom watercooling loop for the CPU, with all the fans, both radiator and case, set to 5v to make it dead silent. When I put the 390 in and started playing Diablo 3, the game basically sat at 144fps for the most part, but when there were lots of mobs around and the GPU had to actually put in effort, it would freeze for about 1 to 2 seconds. I noticed that my core temps were only 83 degrees or sometimes a bit more, and I thought if this is basically a 290 it should handle way hotter than that. As an experiment I turned all the case fans up to 12v, my temps dropped into the 70s, and all of a sudden my freezing problems were gone. When I started adding voltage to the card for overclocks, as soon as my core temps hit the 80s I would start having problems again. I ended up leaving the door of the Define R4 open because it restricts airflow a lot. In the end I couldn't have the silent PC it once was, because this card just requires so much more cooling than my previous 270x. I eventually bought a new case designed for airflow and moved away from the silent design.


I dunno man... In my case, airflow shouldn't be a problem at all. I have 8 140mm fans (and high quality ones at that) blowing air all over the place, 2 of them feeding air from the bottom of the case directly to the GPU. I dread to think what would be happening if my setup weren't so stupidly well ventilated.
Quote:


> Originally Posted by *Pwned24*
> 
> I have a Sapphire R9 390 Nitro and while GPU temps dont go over 75-78C , my VRM temps get real hot (highest i got was 90+) This is with 100% Fans
> My Settings
> 1140/1600
> +63 Voltage
> Allow 50% more power *(highest i recorded power in GPU-Z was 300 Watts)*


Well, I've seen 380W and commonly see 350-360W (although usually momentarily), so there's that.









In all honesty, I wouldn't worry about VRM temps unless they go over 100C. Just let them do their thing; they're supposed to run hot. If you're worried nevertheless, just lower your voltage.


----------



## kalidae

Quote:


> Originally Posted by *tolis626*
> 
> I dunno man... On my case airflow shouldn't be a problem at all. I have 8 140mm fans (And high quality ones at that) blowing air all over the place, 2 of them feeding air from the bottom of the case directly to the GPU. I dread to think what would be happening if my setup wasn't so stupidly well ventilated.
> Well, I've seen 380W and commonly see 350-360W (although usually momentarily), so there's that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all honesty, I wouldn't care about VRM temps unless they go over 100C. Just let them do their thing, they're supposed to be cooking. If you're afraid nevertheless, just lower your voltage.


Sounds like you have really good airflow, and that's what these cards need. I'm going to try one of those Alphacool waterblocks on my Nitro and install my custom loop again. It's coming into summer and I want to keep my GPU as cool as possible with a decent overclock, and I don't want to hear my computer at all, so I'll probably run a 240 and a 360 radiator just for the CPU and GPU, with the rad fans on slow.


----------



## gupsterg

Quote:


> Originally Posted by *Gumbi*
> 
> For only 25mv that is kind of hot for VRM 1... it easily indicates it would go over 90 if you pumped 100mv through it.


Probably.








Quote:


> Originally Posted by *Exenth*
> 
> I have a question for the Sapphire R9 390 owners here, what is the difference between the 2 BIOSes, which you can switch with the button on the card?


Quote:


> Originally Posted by *kalidae*
> 
> I'm pretty sure one bios is stock and can't be overclocked and the other can. I never really played around with it on my sapphire 390. I just installed the card and pushed the button straight away and haven't touched it since.


With the logo on the switch lit, the UEFI BIOS is used; not lit is the legacy one, i.e. no UEFI module in that ROM.


----------



## bichael

Just got my new toy







Probably overkill for what I need but couldn't resist the temptation any longer!

I had been looking at second-hand 290s, which I probably would have got from Taobao as prices on eBay UK seem high in comparison, but then this 390 came up locally in Singapore at a decent price. I figured that even though it cost a bit more, it was a better deal given it was boxed, still under warranty and not that old, versus a random card in a bag from Taobao. Paid 400 SGD, which works out to around £190. It does look like new apart from a bit of warping, but hopefully that shouldn't be an issue.

Will be putting an Alphacool GPX block on it when I can get hold of just the 'upgrade kit', as I'll move the 'core' part over from my 270x. I'm 99% sure it's the same board as the 290 PCS+, which I think is also noted in the first post of this thread.


----------



## gupsterg

Quote:


> Originally Posted by *bichael*
> 
> I'm 99% sure it's the same board as the 290PCS+, which I think is also noted on the first post of this thread.


Wouldn't be surprised.

When the first pics of the XFX 390 PCB surfaced on the web, it was also a 290/X PCB.

With other past GPUs that were rebranded with higher clocks, etc., the early cards from some manufacturers also used the PCB of the card that came before.


----------



## Gumbi

Quote:


> Originally Posted by *gupsterg*
> 
> Wouldn't be surprised.
> 
> When the first pics of the XFX 390 PCB surfaced on web it also have a 290/X PCB.
> 
> Even other past GPUs which have been rebranded with higher clock,etc the early cards have had PCB of the one before from some manufactuers.


Powercolor's PCS design was one of the best designed 290s and likely required little modification.


----------



## gupsterg

Agreed, great cooler







but the PCB used was the reference design: good for waterblock compatibility, etc., but I wouldn't be surprised if coil whine starts after several months of use like it did on my Tri-X 290. Though I do recall reading that they apply glue to the inductors, which Sapphire doesn't on the Tri-X.


----------



## bichael

There were two versions of the PCB for the 290 PCS (I've been researching a lot lately...): it started out as reference, then changed to a custom one (which was also used for the Turbo Duo, VTX and Club 3D cards). The 390 PCS follows on using the same 'updated' one, it seems. As to whether the new board is noticeably better, I couldn't say....


----------



## Dundundata

Quote:


> Originally Posted by *tangelo*
> 
> Like I previously said. It's completely normal. There is only one sensor for VRM's in MSI


Thanks!


----------



## tbob22

Quote:


> Originally Posted by *gupsterg*
> 
> Agreed great cooler
> 
> 
> 
> 
> 
> 
> 
> but PCB was ref design used, good for waterblock compatibility, etc but wouldn't be surprised if coil whine start like my Tri-X 290 did after several months of use but I do recall reading they apply glue to the inductors which Sapphire don't to Tri-X.


No coil whine here; I've been running it since June. It had also been running forced to 0% fan speed for who knows how long (maybe 3 months?) due to some bug in Afterburner that only appears after sleep _(no problem on restart/Afterburner restart)_. I noticed some stuttering, checked temps, and it was at 98C. It still seems to be working fine, but I'm sure that wasn't too good for it.


----------



## tolis626

Hey, does anyone know why the 12V rail reading in GPU-Z keeps fluctuating and is always below 12V? It's usually 11.75V with spikes to 11.88V and it quite often dips to 11.66V under load. How accurate could that be?


----------



## DR4G00N

Quote:


> Originally Posted by *tolis626*
> 
> Hey, does anyone know why the 12V rail reading in GPU-Z keeps fluctuating and is always below 12V? It's usually 11.75V with spikes to 11.88V and it quite often dips to 11.66V under load. How accurate could that be?


That would depend on your PSU, and software readings are only semi-accurate. Only by hooking a multimeter up to a Molex plug's 12V wire could you know what the voltage actually is.

The 12v rail on my 1300 G2 stays at 12.30v idle and when the gpu and cpu are put under heavy load it occasionally drops to 12.24V.
With my old 7950 Tri-fire I'd drop down to around 11.9V under load.
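For reference, the ATX spec allows a ±5% tolerance on the 12V rail, so 11.40V is the floor. A quick arithmetic sketch (not any official tool, just the math) checking the GPU-Z readings quoted above, keeping in mind that software readings are only approximate:

```python
# ATX 12 V rail tolerance is +/-5%, i.e. 11.40 V to 12.60 V is in spec.
NOMINAL, TOL = 12.0, 0.05

def in_spec(volts):
    """True if the reading falls within the +/-5% ATX window."""
    return NOMINAL * (1 - TOL) <= volts <= NOMINAL * (1 + TOL)

def droop_pct(volts):
    """How far below nominal the reading sits, as a percentage."""
    return (NOMINAL - volts) / NOMINAL * 100

# The three readings mentioned earlier in the thread:
for v in (11.88, 11.75, 11.66):
    print(f"{v:.2f} V: {droop_pct(v):.1f}% droop, in spec: {in_spec(v)}")
```

All three readings stay above the 11.40V floor, so even if the sensor is accurate, the rail is within spec; only a multimeter would settle it for sure.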


----------



## tolis626

Quote:


> Originally Posted by *DR4G00N*
> 
> That would depend on your psu and software readings are only semi accurate. Only if you hook up a multi meter up to a molex plug's 12v wire could you know what the voltage actually is.
> 
> The 12v rail on my 1300 G2 stays at 12.30v idle and when the gpu and cpu are put under heavy load it occasionally drops to 12.24V.
> With my old 7950 Tri-fire I'd drop down to around 11.9V under load.


What do you use to monitor it? I sadly don't have a multimeter (although maybe I should get one, it's not like they're expensive), so I have to rely on software. I also happen to have the exact same PSU, so there's that.

Thanks man. 

PS : Since you'll probably feel my pain, the noise the 1300G2 makes at idle is killing me. 

Sent from my GT-I9300 using Tapatalk


----------



## DR4G00N

Quote:


> Originally Posted by *tolis626*
> 
> What do you use to monitor it? I sadly don't have a multimeter (although maybe I should get one, it's not like they're expensive), so I have to rely on software. I also happen to have the exact same PSU, so there's that.
> 
> Thanks man.
> 
> PS : Since you'll probably feel my pain, the noise the 1300G2 makes at idle is killing me.
> 
> Sent from my GT-I9300 using Tapatalk


Aida 64 Extreme is what I use but HWInfo64 works fine too.

PS : What?!? Sorry, I can't hear you over my case fans.








Seriously, the 1300 G2's fan is extremely quiet compared to my 140mm front intake fans (practically leaf blowers; they push just as much air as one, too).


----------



## tolis626

Quote:


> Originally Posted by *DR4G00N*
> 
> Aida 64 Extreme is what I use but HWInfo64 works fine too.
> 
> PS : What?!? Sorry, I can't hear you over my case fans.
> 
> 
> 
> 
> 
> 
> 
> Seriously, the 1300 G2's fan is extremely quiet compared to my 140mm front intake fans (practically leaf blowers; they push just as much air as one, too).


Well, under load it's OK. Almost silent. I don't mind it then. But at idle everything is silent, save from the slight buzzing of my AIO pump and the PSU. At least it's not high-pitched.

What fans are you using anyway? Mine are 140mm too, and I have 8 of them. They are all Phanteks PF-F140SP fans. At idle they are dead silent and under load they are... OK. Not too quiet as they have to spin at like 1300 RPM, but they aren't screaming or anything.


----------



## GorillaSceptre

Hey guys,

Which 390x has the best cooler? The ambient temps where i live are brutal in summer, 30c+ most of the time.. So i need something pretty beastly, water isn't an option right now.

Thanks.


----------



## battleaxe

Hey, for all of you who use an AIO to cool the core... I made something that cools the VRMs like no other. Not small, but also not pathetic at cooling the VRMs.


----------



## Gumbi

I have to say the VRM cooling is superb on my Vapor-X. I can keep the VRMs sub-50C at 55% fan speed when running a mild OC at stock volts.

I have a 290X Vapor-X 8GB now and I'm gunning for your Heaven scores battleaxe







Hit 1649 at 1180/1600 yesterday, 81mv, core 60c and VRMs mid 50s I think







100% fan speed.


----------



## Cannon19932006

Quote:


> Originally Posted by *tolis626*
> 
> Hey, does anyone know why the 12V rail reading in GPU-Z keeps fluctuating and is always below 12V? It's usually 11.75V with spikes to 11.88V and it quite often dips to 11.66V under load. How accurate could that be?


It may or may not be accurate, but generally as long as your system is stable and it's not dropping under 11.5 you should be okay.


----------



## tolis626

Quote:


> Originally Posted by *Cannon19932006*
> 
> It may or may not be accurate, but generally as long as your system is stable, and it's not dropping under 11.5 you should be okay.


Hmmm... Define stable. Under high overclocks it gets unstable after a while. Maybe I could stabilize those. Maybe not. Can't know yet.









Would there be any way to remedy this? I was thinking about pulling a second cable from my PSU, so that I have separate wires for the 8-pin and the 6-pin connectors. I doubt that would do much though. It's not like my PSU is struggling or anything.


----------



## DR4G00N

Quote:


> Originally Posted by *tolis626*
> 
> Well, under load it's OK. Almost silent. I don't mind it then. But at idle everything is silent, save from the slight buzzing of my AIO pump and the PSU. At least it's not high-pitched.
> 
> What fans are you using anyway? Mine are 140mm too, and I have 8 of them. They are all Phanteks PF-F140SP fans. At idle they are dead silent and under load they are... OK. Not too quiet as they have to spin at like 1300 RPM, but they aren't screaming or anything.


2x Corsair H110's stock fans (@ 7v, 880ish RPM) as front intake, 1x Cougar V12H (1200 RPM) as bottom intake, 2x Cougar V12H's as push/pull rear exhaust for my H75 & 2x Fractal R2 140MM's (1000 RPM) as pull top exhaust for my H110.

I use headphones and the system is about 4-5 feet away from where I sit, so I can't hear it at all.


----------



## hugoolly

Hey guys, I'm new to the 390 club, went for an ASUS DCU2 version.
I'm just about ready to get myself a larger case and start some water cooling, but not sure whether to go fully custom or take the easy way out.
Has anyone used the NZXT G10 with the ASUS 390?


----------



## mandrix

390X came in today. No time to do much yet but here's a gpu-z shot.


----------



## tangelo

New beta drivers are out. Any progression on crashing DX11 games?


----------



## MTDEW

Quote:


> Originally Posted by *tangelo*
> 
> New beta drivers are out. Any progression on crashing DX11 games?


It's looking good and seems to have fixed things for a lot of people who were having issues. (see link below)
LINK


----------



## TehMasterSword

Hey guys! Long time lurker, new joiner!



http://imgur.com/fTqGBLf


Upgraded from a Strix 960 (now donated to a friend). Absolutely loving the performance of the card, but a little wary of the temps: 66C at idle, 90C at max load with max settings and 2x AA in the Valley benchmark, and 85C in MGS V at max settings. I took the card apart and replaced the thermal compound with my own, which brought the idle temps down a few degrees, but it still hit 90C in Valley. Really wish I could put this bad boy under water =/

Anyone know of any solutions? From what I've read, ASUS doesn't use a reference board so no full waterblocks are compatible, and an NZXT G10 leaves the VRMs to fend for themselves.


----------



## wirefox

I have my 2x 7970s running 1210/1525... but I feel like the VRAM is holding BF4 back a bit.

Has anyone upgraded from 2x 7970 to a 390x? I'm playing mostly BF4 at 1080p on a 120hz 27" monitor.

How's the sound under load? I'm very noise conscious. I've also been reading there are no water blocks for this card... I have a great loop, so it might be a waste. I read a 290x block might work... anyone tried this?

Otherwise I may go green :/ 980ti or something

wirefox


----------



## smithydan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> *PowerColor*= Offers full cover waterblocks, and also has excellent air cooling.


Just confirming, does this use the reference waterblock or otherwise?


----------



## By-Tor

Quote:


> Originally Posted by *smithydan*
> 
> Just confirming, does this use the reference waterblock or otherwise?


You can use this EK website to find out which block fits which card.

http://www.ekwb.com/configurator/

This block fits both the 390 and 390x Powercolor cards, but not sure they are reference..
http://www.ekwb.com/configurator/step1_complist?gpu_gpus=1712

I'm running a pair of Powercolor 290x LCS cards that came from Powercolor with EK blocks installed and I do believe they are reference, but I could be wrong..

http://www.powercolor.com/us/products_features.asp?id=521


----------



## GorillaSceptre

Quote:


> Originally Posted by *TehMasterSword*
> 
> Hey guys! Long time lurker, new joiner!
> 
> 
> 
> http://imgur.com/fTqGBLf
> 
> 
> Upgraded from a Strix 960 (now donated to a friend). Absolutely loving the performance of the card, but a little wary of the temps. 66 on idle, 90 at max load, max settings, 2x AA in Valley Benchmark, 85 in MGS V max settings. I took the card apart and replaced the thermal compound with my own which brought down the idle temps a few degrees, but still hit 90 in Valley. Really wish I could put this bad boy under water =/
> 
> Anyone know of any solutions? From what I've read, ASUS doesn't use a reference board so no full waterblocks are compatible, and an NZXT G10 leaves the VRMs to fend for themselves.


What's your ambient temps?


----------



## Gumbi

Yeah, what are your ambients? Look at optimising case airflow and having a custom fan curve.
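For what it's worth, the curve you drag around in Afterburner is just a piecewise-linear map from GPU temperature to fan duty cycle. A minimal sketch of the idea (the temperature/speed points below are made-up examples, not recommendations for any specific card):

```python
# Hypothetical fan curve: (temp_C, fan_%) points, linearly interpolated,
# clamped to the first/last point outside the defined range.
def fan_speed(temp_c, curve=((40, 20), (60, 40), (75, 70), (85, 100))):
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: maximum speed
    # Find the segment containing temp_c and interpolate along it.
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(30))    # 20 (floor)
print(fan_speed(67.5))  # 55.0 (halfway between the 40% and 70% points)
print(fan_speed(90))    # 100 (ceiling)
```

A steeper segment between ~70C and ~85C is what makes an "aggressive" curve: the fan ramps hard right where these cards start losing stability.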


----------



## mandrix

At 23-25C ambient my 390X idles at around 59C. I fired up Fire Strike and almost freaked out... It's strange hearing the fans; on my last 3 cards I put water blocks on before I ever connected them, so I'm used to almost total quiet.
Probably stick this block on tomorrow if I have enough distilled left.

After glancing at all the 390X that EK has full blocks for, I'm having a hard time figuring out what is reference and what is not, since it seemed like no one block fits a lot of cards.
But anyway I bought the FC 290X SE which is supposed to fit my PowerColor 390X...guess I'll know for sure soon! Had to order directly from EK though and pay a bunch of shipping. I guess PPC's didn't want to stock them although they have the backplates.


----------



## By-Tor

Quote:


> Originally Posted by *mandrix*
> 
> At 23-25 Ambient my 390X idles at around 59C. I fired up Fire Strike and almost freaked out...It's strange hearing the fans, my last 3 cards I put water blocks on before I ever connected them so I'm used to almost total quiet.
> Probably stick this block on tomorrow if I have enough distilled left.
> 
> After glancing at all the 390X that EK has full blocks for, I'm having a hard time figuring out what is reference and what is not, since it seemed like no one block fits a lot of cards.
> But anyway I bought the FC 290X SE which is supposed to fit my PowerColor 390X...guess I'll know for sure soon! Had to order directly from EK though and pay a bunch of shipping. I guess PPC's didn't want to stock them although they have the backplates.


Would like to hear how that block works out myself. I have a pair of Powercolor 290x LCS cards that came from Powercolor with EK blocks on them, and I wonder if they might fit the 390/390x cards.


----------



## mandrix

Quote:


> Originally Posted by *By-Tor*
> 
> Would like to hear how that block works out myself. I have a pair of Powercolor 290x LCS cards that came from Powercolor with EK blocks on them and wonder if they mite fit the 390/390x
> cards.


Well if you go strictly by what the configurator shows, the block only fits two cards so I'm not sure.
You could look at the bare card pic of the 390 on EK's site and compare to yours?


----------



## TehMasterSword

Quote:


> Originally Posted by *Gumbi*
> 
> Yeah, what are your ambients? Look at optimising case airflow and having a custom fan curve.


25-27C

I've got an NZXT H440 and my CPU is cooled by a Kraken x61 expelling heat out the top. All the air in the case is cool till it hits the GPU. I'll go ahead and make a more aggressive fan curve.

EDIT: HAHAHAHAHAHA GG WP ASUS. I set a super aggressive curve in MSI AB causing my GPU to sound like a jet engine BUT.... core temps dropped an entire 9C. I'll probably tone it down a little bit and just get used to the noise since I wear headphones anyway.
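For anyone curious, a custom fan curve like this is just a piecewise-linear map from GPU temperature to fan duty. A rough sketch of how the interpolation works (the breakpoints below are made-up examples of an aggressive curve, not my actual Afterburner settings):

```python
# Piecewise-linear fan curve: GPU temp (°C) -> fan speed (%).
# Breakpoints are hypothetical examples, not real Afterburner settings.
CURVE = [(40, 30), (60, 55), (75, 85), (85, 100)]

def fan_speed(temp_c):
    """Interpolate a fan percentage for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between adjacent breakpoints
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pinned to max above the last point

print(fan_speed(70))  # 75.0
```

Pushing the breakpoints left (more fan at lower temps) is what turns a quiet stock profile into the "jet engine" curve that drops temps 9C.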


----------



## kalidae

Quote:


> Originally Posted by *TehMasterSword*
> 
> 25-27C
> 
> I've got an NZXT H440 and my CPU is cooled by a Kraken x61 expelling heat out the top. All the air in the case is cool till it hits the GPU. I'll go ahead and make a more aggressive fan curve.
> 
> EDIT: HAHAHAHAHAHA GG WP ASUS. I set a super aggressive curve in MSI AB causing my GPU to sound like a jet engine BUT.... core temps dropped an entire 9C. I'll probably tone it down a little bit and just get used to the noise since I wear headphones anyway.


The problem is your case. The H440 has terrible airflow because it's optimised for silence. I had the same problem with my Nitro 390 and my Define R4, a case also designed for silence but with very average airflow. I swapped the case for an NZXT N450, which has the same internal design as the H440 but 10x better airflow (it's even advertised that way). It definitely helped temps.


----------



## GorillaSceptre

90C temps with a 25C ambient doesn't sound right at all, for a custom cooler anyway. Are 390X's really that hot?

I'm looking to get one, but with my ambient temps there's no way that would work.


----------



## Gumbi

Quote:


> Originally Posted by *TehMasterSword*
> 
> 25-27C
> 
> I've got an NZXT H440 and my CPU is cooled by a Kraken x61 expelling heat out the top. All the air in the case is cool till it hits the GPU. I'll go ahead and make a more aggressive fan curve.
> 
> EDIT: HAHAHAHAHAHA GG WP ASUS. I set a super aggressive curve in MSI AB causing my GPU to sound like a jet engine BUT.... core temps dropped an entire 9C. I'll probably tone it down a little bit and just get used to the noise since I wear headphones anyway.


Wow, the Asus cooling is poor. :/ I have excellent case cooling and ~18C ambients... but I can keep my 290X Vapor-X at 62C easily at 55% fan speed. Granted, the Vapor-X is beastly, but still, the Asus should be better than that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *GorillaSceptre*
> 
> 90C temps with a 25C ambient doesn't sound right at all, for a custom cooler anyway. Are 390X's really that hot?
> 
> I'm looking to get one, but with my ambient temps there's no way that would work.


Mine hit a max of 69C today whilst gaming (22-28C ambient)


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TehMasterSword*
> 
> 25-27C
> 
> I've got an NZXT H440 and my CPU is cooled by a Kraken x61 expelling heat out the top. All the air in the case is cool till it hits the GPU. I'll go ahead and make a more aggressive fan curve.
> 
> EDIT: HAHAHAHAHAHA GG WP ASUS. I set a super aggressive curve in MSI AB causing my GPU to sound like a jet engine BUT.... core temps dropped an entire 9C. I'll probably tone it down a little bit and just get used to the noise since I wear headphones anyway.
> 
> 
> 
> Wow the Asus cooling is poor. :/ I have excellent case cooling and 18c~ ambients... but I can keep my 290X vaporx at 62c easily at 55% fan speed. Granted, the VaporX is beastly but still, the Asus should be better than that.
Click to expand...

It's not; you can't compare the Vapor-X's cooling to any other card except the Lightning, due to the vapor chamber, beefy heatsink and its VRM placement.

Asus reused the same cooler that was on the 290X for the 390X (in the case of the DCUII); the Strix is better, but only with a more aggressive fan profile than what it comes with out of the box.

The issue, imo, is the way Asus has done it. The other companies use a plate that transfers heat to the pipes, which spreads the load out more evenly than Asus' direct-contact pipes do. With a small(ish) die like Hawaii/Grenada, only 2-3 of the 5 heatpipes make direct contact with the die, and the others are left to absorb heat only passively.

I'll also point out that with ambients like that I'm surprised your temps aren't lower tbh :/
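Rough back-of-the-envelope on the 2-3 pipe claim, assuming a square ~438 mm² Hawaii/Grenada die and typical 6 mm heatpipes packed side by side (both just ballpark figures, not measurements of the actual cooler):

```python
import math

# How many direct-contact heatpipes can physically sit over the die?
# Assumptions: square die of 438 mm² (Hawaii), 6 mm pipe diameter, no gaps.
die_side_mm = math.sqrt(438)               # ~20.9 mm per side
pipe_mm = 6                                # assumed typical heatpipe diameter
pipes_on_die = int(die_side_mm // pipe_mm) # full pipes fitting across the die
print(f"die side ~{die_side_mm:.1f} mm -> ~{pipes_on_die} of 5 pipes over the die")
```

So even with perfect alignment, roughly 3 of the 5 pipes sit on the die; the rest only pick up heat through the neighbouring pipes.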


----------



## Gumbi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's not, you cannot compare the Vapor-X's cooling to any other card except the Lightning due to the vapor chamber, beefy heatsink and it's vrm placement.
> 
> Asus reused the same cooler that was on the 290x for the 390x (in the case of the DCUII) and the Strix is better but only with a more aggressive fan profile than what it comes with out of the box.
> 
> The issue imo is the way Asus has done it, because the rest of the companies use a plate that transfers heat to the pipes it spreads the load out more evenly than Asus' Direct Contact pipes do and with a small(ish) die like Hawaii/Grenada it means that only 2-3 of the 5 heatpipes make direct contact with the die and the others are left to only absorb heat passively.
> 
> I'll also point out that with ambients like that I'm surprised your temps aren't lower tbh :/


It's with my overclock







And with case cooling not maxed.

And it's disappointing Asus didn't revamp the cooler at all :/


----------



## Agent Smith1984

I'm done testing the Asus Strix 390 now, and have put it up on the block in the Marketplace if anyone is interested.
http://www.overclock.net/t/1577053/asus-r9-390-strix-1-month-old-mint-condition-with-box-and-accessories-included

The card did 1180/1750 @ +50mv / +25mv AUX, and while it did not run as cool on the VRM as my MSI, it never broke 85C with a custom fan curve.

The card has somewhat of a bad rap, I think.....

As far as max overclock goes, I squeezed a few runs out at 1210/1775 (have to find them), and the graphics score was around 14,380, but I found max stability was at around 1195/1750 with +100mv, which ran a tad too hot for me (81C core / 88C VRM).

At +50mv the core could be kept to around 77C, which was pretty decent at such a high clock speed...... I am sure with a TIM swap it would do a tad better.

Next card I am testing will HOPEFULLY be the Devil 13









We shall see!!!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's not, you cannot compare the Vapor-X's cooling to any other card except the Lightning due to the vapor chamber, beefy heatsink and it's vrm placement.
> 
> Asus reused the same cooler that was on the 290x for the 390x (in the case of the DCUII) and the Strix is better but only with a more aggressive fan profile than what it comes with out of the box.
> 
> The issue imo is the way Asus has done it, because the rest of the companies use a plate that transfers heat to the pipes it spreads the load out more evenly than Asus' Direct Contact pipes do and with a small(ish) die like Hawaii/Grenada it means that only 2-3 of the 5 heatpipes make direct contact with the die and the others are left to only absorb heat passively.
> 
> I'll also point out that with ambients like that I'm surprised your temps aren't lower tbh :/
> 
> 
> 
> It's with my overclock
> 
> 
> 
> 
> 
> 
> 
> And with case cooling not maxed.
> 
> And it's disappointing Asus didn't revamp the cooler at all :/
Click to expand...

That is starting to make more sense then









and they did revamp it for the Strix, but it's the way it's done more than anything.

I'll admit, after seeing the DCUII 7000-series cards I was expecting Hawaii's versions to be awesome, but it wasn't meant to be.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm done testing the Asus Strix 390 now, and have put it up on the block in the Marketplace if anyone is interested.
> http://www.overclock.net/t/1577053/asus-r9-390-strix-1-month-old-mint-condition-with-box-and-accessories-included
> 
> The card did 1180/1750 @ 50mv/25mv AUX, and while it did not run as cool on the VRM as my MSI, it never broke 85C with a custom fan curve.
> 
> The card has somewhat of a bad rap I think.....
> 
> As far as max overclock, I squeezed a few runs out at 1210MHz/1775 (have to find them), and the graphics score was around 14,380, but I found max stability was at around 1195/1750 with 100mv, which ran a tad too hot for me (81C/88CVRM)
> 
> At 50mv the core could be kept to around 77C which was pretty decent with such a high clock speed...... I am sure with a TIM swap it would do a tad better
> 
> Next card I am testing will HOPEFULLY be the Devil 13
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We shall see!!!


Pretty much exactly what I've heard.....the stock fan profile isn't aggressive enough









and I actually know someone who has a 390X Devil, and they are reporting some of the best VRM temps I've heard of.


----------



## mandrix

Jeez Louise, anyone know how to get the backplate off the PCS 390X?
I can't put the water block on if I can't get this freaking backplate off. I can't see how anyone could access all the screws with the cooler in the way.

DUHH I got it. lol, that's some sticky TIM tape!


----------



## Zack Foo

Pretty good, I'd say. It hasn't crashed any of my games yet since the update. I've played CS:GO, Dota 2, Rocket League, and Transformers.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine hit a max of 69c today whilst gaming (28-22c ambient)


Ah, that sounds better, does it ever go past 30C where you live?

I'm trying to find out which 390X has the best cooler.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm done testing the Asus Strix 390 now, and have put it up on the block in the Marketplace if anyone is interested.
> http://www.overclock.net/t/1577053/asus-r9-390-strix-1-month-old-mint-condition-with-box-and-accessories-included
> 
> The card did 1180/1750 @ 50mv/25mv AUX, and while it did not run as cool on the VRM as my MSI, it never broke 85C with a custom fan curve.
> 
> The card has somewhat of a bad rap I think.....
> 
> As far as max overclock, I squeezed a few runs out at 1210MHz/1775 (have to find them), and the graphics score was around 14,380, but I found max stability was at around 1195/1750 with 100mv, which ran a tad too hot for me (81C/88CVRM)
> 
> At 50mv the core could be kept to around 77C which was pretty decent with such a high clock speed...... I am sure with a TIM swap it would do a tad better
> 
> Next card I am testing will HOPEFULLY be the Devil 13
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We shall see!!!


That's actually a very good overclock; pity the cooling is so poor. How were the VRMs?

@Bilko, ya, since I have my 3 fans working again (and now have a 290X Vapor-X 8GB vs a 290 Vapor-X) I am remembering how beastly the Vapor-X cooling is. The VRMs are kept at 50C or below with my current setup. I still have to play with overclocks, but I can easily game at 1200/1650 (maybe more on memory, have to see though) at +110mv.


----------



## TehMasterSword

Quote:


> Originally Posted by *kalidae*
> 
> The problem is your case. The H440 has terrible airflow because it's optimised for silence. I had the same problem with my Nitro 390 and my Define R4, a case also designed for silence but with very average airflow. I swapped the case for an NZXT N450, which has the same internal design as the H440 but 10x better airflow (it's even advertised that way). It definitely helped temps.


I'll test it again when I get off work, this time with the side panel off and fans at max speed.

And yes, Sgt Bilko is right. Only 2 or 3 pipes touch the tiny die.

Gorilla, my card is just the 390.

AgentSmith, can you show me your fan curve?


----------



## Mysticking32

Overclocking results on my msi r9 390x.

Going from 1110MHz to 1160MHz led to a nice ~5 percent increase in graphics score. Very happy with those results, but as per usual I'm staying at 1110. If I ever need the extra performance I can overclock it.

All in all, it's the equivalent of an R9 Fury when overclocked. Max temp: 77C @ 1160, 73C @ 1110.

http://www.3dmark.com/compare/fs/5995609/fs/6220331
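Quick sanity check on the scaling (the ~5% score figure is from the compare link above; the clock math alone):

```python
# Clock bump vs score gain, clocks from my runs above.
old_mhz, new_mhz = 1110, 1160
clock_gain_pct = (new_mhz - old_mhz) / old_mhz * 100
print(f"core clock: +{clock_gain_pct:.1f}%")  # +4.5%
```

A ~4.5% core bump giving ~5% more graphics score means the card is scaling essentially linearly with core clock, i.e. it isn't memory-bottlenecked at these settings.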


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> That's actually a very good overclock, pity the cooling is so poor. How were the VRMs?
> 
> @Bilko, ya, since I have my 3 fans working again (and now have a 290X Vaporx 8gb vs a 290 vaporx) I am remembering how beastly the Vaporx cooling is. The VRMs are kept at 50 or below with my current setup. I still have to play with overclocks but I can easily game at 1200/1650 (maybe more memory, have to see though) at 110mv.


The VRMs hit around 88C at +100mv, which is higher than I'm used to, but not as bad as some others I have read about.

At 50mv, the VRM never broke 85C.

I really enjoyed working with this card, and it looks especially great....

I'll do $265 for anybody who wants to score a crazy deal today!!
http://www.overclock.net/t/1577053/asus-r9-390-strix-1-month-old-mint-condition-with-box-and-accessories-included


----------



## tolis626

Hey guys, a couple of things.

First off, I may be getting a bit paranoid here, but can someone post your 12V reading from GPU-z?

Secondly, did anyone else install the 15.10 driver? I think my memory is less stable with that driver. It's either that, or it wasn't being stressed enough. And on that note, what's a good way to stress the memory controller? While gaming it will fluctuate from 5-10% to 100%, but it's usually in the 20-40% range.

Third, regarding the MSI 390X: how much do these cards sag? God damn, I just noticed today and it's bent quite a bit. Looks scary tbh, although I guess that's the price of that huge cooler. Also, removing the cooler... which screws do I need to undo? The 4 around the GPU die on the back only, or are there more?

I also ran it with my side panel off today. Temps were exactly the same, which means my airflow is more than adequate, so the TIM is the issue here. Which leads me to hope that my OC will improve if I change the TIM and improve my temps. Unfortunately, I also noticed that my card has some bad coil whine, especially at high voltages. Fortunately, I'm looking at my PC from the side and it blocks the noise like 90%, so it doesn't bother me even when not using headphones.

Damn I'm frustrated today...


----------



## Faster_is_better

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm done testing the Asus Strix 390 now, and have put it up on the block in the Marketplace if anyone is interested.
> http://www.overclock.net/t/1577053/asus-r9-390-strix-1-month-old-mint-condition-with-box-and-accessories-included
> 
> The card did 1180/1750 @ 50mv/25mv AUX, and while it did not run as cool on the VRM as my MSI, it never broke 85C with a custom fan curve.
> 
> The card has somewhat of a bad rap I think.....
> 
> As far as max overclock, I squeezed a few runs out at 1210MHz/1775 (have to find them), and the graphics score was around 14,380, but I found max stability was at around 1195/1750 with 100mv, which ran a tad too hot for me (81C/88CVRM)
> 
> At 50mv the core could be kept to around 77C which was pretty decent with such a high clock speed...... I am sure with a TIM swap it would do a tad better
> 
> Next card I am testing will HOPEFULLY be the Devil 13
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We shall see!!!


Pretty good clocks, these cards would really shine with watercooling, but I think block availability is poor. I see this card will be getting around...


----------



## Waleh

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi. I have an MSI R9 390 and I'm happy with it. It took me some time to get the temps down, and I think I still have room for more temp reduction, but that's fine tuning. Not sure about the Gigabyte though.
> If your build must be a mini-ITX case and "only" the Gigabyte G1 will fit, I would actually suggest not going for a 390. Why? Mainly because the potential for good airflow is limited in mini-ITX cases. There are better cards built especially for mini-ITX cases: the Fury Nano, the mini GTX 970 from Asus (I think?) and some others.


So, you are saying I should not get the G1 390/Windforce 390? My only other option at my price point is a GTX970 like the EVGA 970. I really want a 390 but the gigabyte version seems to be plagued with issues sadly








. I know someone with the case I want and a G1 390 who maxes 80 degrees during gaming (70% fan speed). However, my bigger concern is the shutdown problem with the GPU. Maybe I should just get a 970..


----------



## Agent Smith1984

Quote:


> Originally Posted by *Waleh*
> 
> So, you are saying I should not get the G1 390/Windforce 390? My only other option at my price point is a GTX970 like the EVGA 970. I really want a 390 but the gigabyte version seems to be plagued with issues sadly
> 
> 
> 
> 
> 
> 
> 
> . I know someone with the case I want and a G1 390 who maxes 80 degrees during gaming (70% fan speed). However, my bigger concern is the shutdown problem with the GPU. Maybe I should just get a 970..


Check my for-sale post; that Strix destroys the G1 and the 970, well below your price point!


----------



## Waleh

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Check my for sale post, that strix destroys the g1 and the 970 well below your price point!


Sadly, if I get a 390 I can only get the Gigabyte version because it is the only one to fit in my ITX case. Also, here in Canada, the 970 and 390 are similarly priced.


----------



## TehMasterSword

Guys..... Will Amazon honor a refund for my Strix 390 if I took it apart to replace the thermal paste? I can't find any relevant info, but one of the screws had a tamper evident sticker on it, so if they open the box and look, they will know


----------



## Sgt Bilko

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Mine hit a max of 69c today whilst gaming (28-22c ambient)
> 
> 
> 
> Ah, that sounds better, does it ever go past 30C where you live?
> 
> I'm trying to find out which 390X has the best cooler.
Click to expand...

Yep, it hits 40C or so here in summer (just getting the warmer weather now).

I'm also running my 390X in TriFire with my 295X2.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Waleh*
> 
> Sadly, if I get a 390 I can only get the Gigabyte version because it is the only one to fit in my ITX case. Also, here in Canada, the 970 and 390 are similarly priced.


Quote:


> Originally Posted by *Waleh*
> 
> So, you are saying I should not get the G1 390/Windforce 390? My only other option at my price point is a GTX970 like the EVGA 970. I really want a 390 but the gigabyte version seems to be plagued with issues sadly
> 
> 
> 
> 
> 
> 
> 
> . I know someone with the case I want and a G1 390 who maxes 80 degrees during gaming (70% fan speed). However, my bigger concern is the shutdown problem with the GPU. Maybe I should just get a 970..


The problem I see here is that you are dead set on building a mini case with an ITX mobo. If that is really the case, then consider the following:
1. Airflow potential is very limited, since everything is packed into the case.
2. The graphics card needs to be small, hence less space for a big cooler.
3. Smaller card, smaller cooler, small price point means a lower-power graphics card... = GTX 970







......









OK, fun aside. I see there are two different Gigabytes available now, both the same size: G1 Gaming and Windforce 2X. Not sure if both are voltage locked.
Besides the voltage lock, these cards tend to run hotter than a GTX 970, but when undervolted to match a stock GTX 970 they produce about the same heat as well.

My suggestion would be: IF you really want to build a mini-ITX PC and want it to be quiet, either go with the Asus GTX 970 DirectCU Mini OC... OR, if you have the money to spend, the R9 Nano.
It hurts me to suggest a GTX 970 in this case, but I don't think it makes sense to suggest a hot card for a small build.
I'll post my experience with exchanging the TIM on my MSI R9 390 in a minute; maybe that helps too.


----------



## AverdanOriginal

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, a couple of things.
> 
> First off, I may be getting a bit paranoid here, but can someone post your 12V reading from GPU-z?
> 
> Secondly, did anyone else install the 15.10 driver? I think my memory is less stable with that driver. It's either that, or it wasn't being stressed enough. And on that note, what's a good way to stress the memory controller? While gaming it will fluctuate from 5-10% to 100%, but it's usually in the 20-40% range.
> 
> Third, regarding the MSI 390x. How much do these cards sag? God damn, I just noticed today and it's bent quite a bit. Looks scary tbh. Although I guess that's the price of that huge cooler. Also, removing the cooler... What screws do I need to unscrew? The 4 around the GPU die on the back only, or are there more?
> 
> I also ran it with my side panel off today. Temps were exactly the same, which means my airflow is more than adequate. So the TIM is the issue here. Which leads me to hope that my OC will improve if I change my TIM and improve my temps. Unfortunately, I also noticed that my card has some bad coil whine, especially at high voltages. Fortunately, I'm looking at my PC from the side and it blocks the noise like 90%, so it doesn't bother me even when not using headphones.
> 
> Damn I'm frustrated today...




Here is a GPU-Z shot from my current system, old TIM and all my case fans at full 12V, running Unigine Heaven @ 1920x1080, Quality Ultra, Tessellation Extreme, AA x8.
If you want I can send you some more via PM, but I just checked, and whether it's Firestrike or Heaven my max 12V reading is always 11.75V.

Give me a minute and I'll put up a full report of exchanging the TIM on my MSI R9 390 and the results.... LOVE IT


----------



## tolis626

Quote:


> Originally Posted by *AverdanOriginal*
> 
> 
> 
> Here is a GPU-Z with my current system, old TIM and all my case fansat full 12V. Running Unigine Heaven @1920x1080, Quality Ultra, Tesselation Extreme, AA x8.
> If you want I can send you some more via PM, but I just checked an wether it's Firestrike or Heaven my max 12V is always 11.75V
> 
> Give me a minute and I put a full report of exchanging the TIM on my MSI R9 390 and results.... LOVE IT


Yes please! Dude, you just made my day. Or night, to be more precise.

THANK YOU!









PS: Just because I started getting all paranoid, I replaced my 8+6 harness cable with two separate cables and connected the card with those. Same results. So yeah, it's not the PSU; probably the measurement is borked. That's a relief.


----------



## mandrix

EK water block is on, ambient is 25C. Much better than the 59C idle without the block.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AverdanOriginal*
> 
> 
> 
> Here is a GPU-Z with my current system, old TIM and all my case fansat full 12V. Running Unigine Heaven @1920x1080, Quality Ultra, Tesselation Extreme, AA x8.
> If you want I can send you some more via PM, but I just checked an wether it's Firestrike or Heaven my max 12V is always 11.75V
> 
> Give me a minute and I put a full report of exchanging the TIM on my MSI R9 390 and results.... LOVE IT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes please! Dude, you just made my day. Or night, to be more precise.
> 
> THANK YOU!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS : Just because I started getting all paranoid, I replaced my 8+6 harness cable with two seperate cables and connected the card with those. Same results. So yeah, it's not the PSU. Probably the measurement is borked. That's a relief.
Click to expand...

Don't trust software readouts for PSU.

Only way to be sure is by using a multimeter.


----------



## Waleh

Quote:


> Originally Posted by *AverdanOriginal*
> 
> The problem I see here is that you are dead set on building a mini case with an ITX Mobo. If that is really the case, then consider the following:
> 1. airflow potential is very limited since everything is packed in the pc case
> 2. Graphicscard needs to be small, hence less space for a big cooler
> 3. smaller card, smaller cooler, small price point means lower power of the graphicscard... = GTX 970
> 
> 
> 
> 
> 
> 
> 
> ......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ok fun aside. I see there are two different Gigabytes available now, both the same size, G1 Gaming and Windforce 2X. Not sure if both are voltage locked.
> Besides the voltage lock, these cards tend to go hotter then a GTX 970, but when undervolted to be at the same as stock GTX 970 they produce the same heat aswell.
> 
> My suggestion would be, IF you really want to build a mini-ITX pc build and want it to be quiet either go with Asus GTX 970 Direct CU MINI OC... OR if you have the money to spend the R9 Nano.
> It hurts me to suggest a GTX 970 in this case, but I don't think it makes sense to suggest a hot card for a small build.
> I'll post in minute my experience with exchanging the TIM on my MSI R9 390, maybe that helps too.


Thanks for your help chief! The only reason I want a mini-itx case is because I will be transporting the PC and I want as much juice as possible in the tiny Sugo SG13







. The nano is too expensive for my budget same with the GTX980 which would also fit in the case. I may get the MSI GTX 970 or EVGA SSC ACX 2.0+. I'll keep reading around. +rep


----------



## AverdanOriginal

OK. After finally testing many different fan setups in my case to get the best performance with the most versatility and quietness, my temps in the case are the same as with all sides and the top open. So airflow is perfect.

Now I wanted to see for myself if changing the TIM on the MSI R9 390 really does improve the temps by up to 6°C.

So I unscrewed the 4 screws holding the GPU to the cooler and unplugged the fan from the PCB (I also unscrewed some screws connecting the backplate on the front side, but I am not sure that was necessary).

I thought I would need to wiggle the card a bit, as seen in some YouTube videos, but actually it came off really easily (mind you, I took the card out without having turned on my PC for the whole day, so it was cool).
What I saw once I got a glimpse of the bare GPU core was really frustrating: thermal paste all over the place, and the viscosity of the TIM was really fluid, so probably cheap TIM with heaps of silicone in it (just my guess).
Additionally, I saw that the thermal pads on the chokes were not covering everything perfectly. See picture below.









As you can see, the TIM is all over the place, and the actual core barely had enough on it. It also looks like there was a major air bubble in the middle.

So I cleaned everything off (which took me at least 15 minutes, if not longer) and also noticed some scratches on the cooler (see picture below).



Now I changed the TIM (used Gelid GC-Extreme), put everything back together and plugged in the fan (actually I nearly forgot to plug the fan back in







)
I used the spread method, since this is the bare core and you want to make sure everything is covered with TIM (different from applying TIM to a CPU with an IHS).

Now to the results (open in a new tab to get better resolution):



In the Excel sheet you can see a slight overclock of 1130/1680 with mv +13 and aux +13, in order to produce a little more heat than my normal 24/7 setting of 1120/1650 @ mv 0 and aux 0. The case fans were reduced in speed so that they are not loud; standing 2 meters away they were not audible (very silent).

The grey line was with the old TIM: Heaven and Firestrike both produced 78°C @ 76% fan and around 20°C ambient temp.
The second line is with the new TIM, keeping my custom fan curve: a drop of 3°C in each benchmark, but note the ambient temp, which was around 23°C. *So a DELTA of 6°C.* Also, the fan only spun at 69%.

*So I forced a constant 76% on my fans to see the real difference (3rd line). I got a delta of 7°C in Heaven and a delta of 9°C in Firestrike. Considering measurement inaccuracy, a temperature drop of 6-9°C from changing the TIM is realistic.*
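The ambient correction spelled out, with the numbers from my runs (same custom fan curve in both cases):

```python
# Ambient-corrected comparison of old vs new TIM, same fan curve.
old_gpu, old_amb = 78, 20   # °C, stock MSI TIM, ~20 °C ambient
new_gpu, new_amb = 75, 23   # °C, Gelid GC-Extreme, ~23 °C ambient
delta_old = old_gpu - old_amb   # 58 °C over ambient
delta_new = new_gpu - new_amb   # 52 °C over ambient
print(f"improvement: {delta_old - delta_new} °C")  # 6 °C
```

Comparing temps over ambient rather than raw temps is what turns the apparent 3°C drop into the real 6°C improvement.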

I also noticed a lower VDDCin with the lower temps; not sure what to make of it. Also, the slight artifacting in Firestrike did not appear again. Due to the lower temp? Maybe, not sure.

Now I can put in my new i5-6600k with the asus hero VIII board to get a realistic go at overclocking









Hope this helps some people that plan on exchanging the TIM on their R9 390.


----------



## Dundundata

Try HWinfo for 12V. GPUz gives me 11.63, while HW is 12.144. I tend to believe the latter because Corsair has their own program for my PSU and it gives me 12.1.


----------



## Dundundata

@Averdan, that seems to be a common problem; pretty much what my XFX looked like: a gobbed-on mess. And it looks like yours was missing some right in the middle







You'd think they could get this part of the manufacturing process right, seeing how many RMAs they must get for heat and how vital it is to the card. I am tempted to see what they did on my MSI, but my temps are great, so I must restrain myself from messing with it! I never really liked the spread method as much as the pea/BB dab, but that's another debate entirely.


----------



## tangelo

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, a couple of things.
> 
> First off, I may be getting a bit paranoid here, but can someone post your 12V reading from GPU-z?


MSI R9 390 @ 1100/1600 run at 99-100 GPU usage for 4+ hours

12V Min 11.63V
12V Max 11.88V
12V Avg 11.73V


----------



## TehMasterSword

Quote:


> Originally Posted by *kalidae*
> 
> The problem is your case. The H440 has terrible airflow because it's optimised for silence. I had the same problem with my Nitro 390 and my Define R4, a case also designed for silence but with very average airflow. I swapped the case for an NZXT N450, which has the same internal design as the H440 but 10x better airflow (it's even advertised that way). It definitely helped temps.


Back after some testing!

I ran Valley at the same settings again today. Ended up hitting a constant 88C. I took the side panel off. Still steady. I cranked the case fans up. Still a steady 88C. I turned on the default fan curve with MSI AB and the fans spun up to jet engine speeds, lowering the temps all the way down to a constant 80C. I turned the case fans back to their normal curve. Same temps. I put the panel back on. Same temps. Seems it's not my case airflow, but more evidence that ASUS shipped the card with an absurdly low fan curve, hoping nobody would notice the card is screaming hot. This should mean my temps in MGS V will be around 76C with this fan curve. Guess I'll just have to learn to live with a jet engine in my case until someone makes a FULL water block for it!

EDIT: YEAAAAH BOIIIIII. Amazon customer service confirmed best. They didn't care that I opened up the card. They are accepting it for a full refund, free shipping. What card should I swap it out for?


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> @Averdan, that seems to be a common problem, pretty much what my XFX looked like-a gobbed on mess. And it looks like yours was missing some right in the middle
> 
> 
> 
> 
> 
> 
> 
> You'd think they could get this part of the manufacturing process right, seeing how many RMAs they must get for heat and how vital it is to the card. I am tempted to see what they did to my MSI but my temps are great so I must restrain myself from messing with it! I never really liked the spread method as much as the pea/BB dab, but that's another debate entirely.


Application method on GPUs is quite a different thing than on CPUs. I don't think the pea method would work that well. How would you even make sure that you cover the whole core while you also don't apply too much TIM? I think it's not the most reliable method. I would either go with spreading or even the star pattern method that EK suggests on their waterblocks.
Quote:


> Originally Posted by *Dundundata*
> 
> Try HWinfo for 12V. GPUz gives me 11.63, while HW is 12.144. I tend to believe the latter because Corsair has their own program for my PSU and it gives me 12.1.


Oh I've tried HWinfo and it's the same reading. Seems to be just software being unreliable.
Quote:


> Originally Posted by *tangelo*
> 
> MSI R9 390 @ 1100/1600 run at 99-100 GPU usage for 4+ hours
> 
> 12V Min 11.63V
> 12V Max 11.88V
> 12V Avg 11.73V


Thanks man. Same numbers as me. It seems to be normal after all. Gave me a scare first though.


----------



## Dundundata

I did the pea method on the xfx and temps dropped 10C. It should spread evenly the same as a CPU once you put the heatsink on.


----------



## Gildejean

Hello everyone!

First I would like to thank AgentSmith for starting this, and all the members of the club who are always posting OC results and info and helping each other! I have visited OCN a few times for reviews, but I'd never heard of something like an "Owners Club" before. I thought it was so AWESOME that I decided to read every post of this thread whenever I could. It took me 5 DAYS to get here and I never got bored; it was like reading a book, and I would do it again! xD
I knew only the basics of OC and had never tried OCing anything except my Android, which is a totally different process and doesn't take much experience. I've learned so much about vendors (GPU-wise), clocks, voltage, temps (especially temps) and benchmarks (like NEVER use Furmark haha) in this time that I feel like a pro! (Not really, just kidding.) Thank you all!

So, to my situation... My mobo died about a month ago, and my whole setup was old parts anyway. I just started my first real job, so I decided to do some serious money-saving and build a new rig from zero, regardless of how long it takes; a brand new PC setup is worth it. Until then, I'm mobile-only.







The hardware I'm looking for is:

CPU: Core I5 6600k skylake
Mobo: Asus z170 pro gaming
RAM: 8GB DDR4 2133 (16GB later)
GPU: Sapphire Nitro 390 with backplate
Storage: 240GB Kingston SSD
CPU cooling: CM Seidon 240M
PSU: EVGA SuperNOVA 750w B2
Display: Benq xl2430t 24" 144hz
Case: Corsair Carbide Spec03
Already own from the old PC: 1TB HDD, DVD optical drive, a Xigmatek 500W PSU that I know won't be enough, and an Asus HD 3850 X2 dual GPU that has reached its limit.
I went for the Sapphire because it has the best cooling, and by the time I get this rig it will be the end of summer here in Brazil (winter in the US, I guess), with ambient temps reaching 40C on extreme days, and I don't want to run the AC every time I'm gaming. I still have a few questions and I hope you guys can help me out.

1. My main use for this rig is gaming, and maybe some benching now that I've really gotten into this. I'm an MMO guy, so I'll mostly be playing games where this GPU can nail 150+ FPS: shooters, MOBAs, and open-world RPGs. I was always a console player, but since I'm not getting a current-gen console I'm switching to PC. How much will a good OC (1150/1650 maybe) help in real gaming performance? I know I can't expect a +30 FPS gain, but maybe 10 in, hm... BF4?

2. I'll be playing at 1080p, maybe VSR 1440p. Is there any need to go for the 390X instead? And is there much difference between VSR and real 1440p performance-wise? I ask because I really want to enjoy the 144Hz feature, so I plan on getting 60+ FPS, maybe 100 or the max I can get.

3. Can this PSU handle the i5 OC'd to 4.5GHz and everything else fine? I plan to buy a second 390 in a year or so, and I know I'd have to upgrade then; what should I get, 1000W minimum?

4. Do any of you guys know a shop that ships to Brazil besides eBay? I have trust issues with eBay. Hardware here is overpriced; we're used to buying from local stores because shipping prices here are abusive. By my math, importing + taxes is still cheaper than buying locally. And we don't have many options; for water cooling, for example, all I can find is CM and Corsair.

I'm so sorry for the post length, but come on, after reading the whole thread I kinda deserve it, right? Hahaha. Sorry for any grammar mistakes, and thank you all again! Feel free to give your opinion on anything, even the non-GPU parts of the rig.


----------



## Geoclock

Does replacing the thermal paste void the MSI warranty?
My card also got some incorrect pad placement.
It runs at 60-65C under load, but it's better to make sure everything is OK inside.


----------



## bichael

Quote:


> Originally Posted by *Waleh*
> 
> Thanks for your help chief! The only reason I want a mini-itx case is because I will be transporting the PC and I want as much juice as possible in the tiny Sugo SG13
> 
> 
> 
> 
> 
> 
> 
> . The nano is too expensive for my budget same with the GTX980 which would also fit in the case. I may get the MSI GTX 970 or EVGA SSC ACX 2.0+. I'll keep reading around. +rep


I think the VTX 390 would also fit; it looks to be more or less the same card as the VTX and PowerColor Turbo Duo 290s. Those are about 267mm long, i.e. no longer than the PCB, and I believe an SG13 can take around 270mm max. They seem to share the PCB with the PowerColor PCS+ cards but run at lower clocks. Obviously, being smaller twin-fan coolers they're unlikely to be the best out there, but from what reviews I did find they seem reasonable. A 970 may be a better option depending on priorities and pricing, but it's something to think about anyway.









My 390 is going in an SG05, hence all the reading up on lengths, but I will be watercooling with an external rad...


----------



## gupsterg

I was wondering if some MSI 390 / X owners can state their VID and ASIC ratings, I'm just curious to correlate something I noted with 290/X. I was after results with factory bios and no OC present but factory clocks. Please use APP linked below for VID, best to take reading after a reboot and no OC app running in background.

Link:- The Stilt VID app
Quote:


> Originally Posted by *tolis626*
> 
> Hey, does anyone know why the 12V rail reading in GPU-Z keeps fluctuating and is always below 12V? It's usually 11.75V with spikes to 11.88V and it quite often dips to 11.66V under load. How accurate could that be?


That reading is what the voltage controller on the card thinks the incoming 12V is.

I wouldn't worry about it.

Voltage control chip is same as 290/X, all 3 290/X cards I've experienced do the same.


----------



## kizwan

Quote:


> Originally Posted by *AverdanOriginal*
> 
> OK. After finally having tested many different fan set-ups in my case to get the best performance with the most versatility and quietness, my temps in my case are the same as if I have all sides and the top open. So Airflow is perfect.
> 
> Now I wanted to see for myself if changing the TIM on the MSI R9 390 really does improve the Temps by up to 6°.
> 
> So I unscrewed the 4 screws holding the GPU to the cooler and unplugged the fan from the PCB (I also unscrewed some screws connecting the backplate on the front side, but I'm not sure that was necessary).
> 
> I thought I would need to wiggle the card a bit as seen on some youtube videos but actually it came off really easy (mind you I took the card out without having turned on my PC for the whole day so it was cool).
> What I saw once I got a glimpse of the bare GPU core was really frustrating: thermal paste all over the place, and the TIM's viscosity was really fluid, so probably cheap TIM with heaps of silicone in it (just my guess).
> Additionally, I saw that the thermal pads on the VRAMs were not covering everything perfectly. See picture below.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As you can see, the TIM is all over the place, yet the actual core barely had enough on it. It also looks like there was a major air bubble in the middle.
> 
> So I cleaned everything off (which took me at least 15 minutes, if not longer) and also noticed some scratches on the cooler. (See picture below.)
> 
> 
> 
> Now I changed the TIM (used Gelid GC-Extreme), put everything back together, and plugged in the fan (actually, I nearly forgot to plug the fan back in).
> I used the spread method, since this is the bare core and you want to make sure everything is covered with TIM (different from applying TIM to a CPU with an IHS).
> 
> Now to the results (open in a new tab for better resolution):
> 
> 
> 
> In the Excel sheet you can see a slight overclock of 1130/1680 with mV +13 and AUX +13, in order to produce a little more heat than my normal 24/7 setting of 1120/1650 @ mV 0 and AUX 0. The case fans were slowed down so they aren't loud; standing 2 meters away they were not audible (very silent).
> 
> The grey line was with the old TIM: Heaven and Firestrike both produced 78°C @ 76% fan and around 20°C ambient.
> The second line is with the new TIM, keeping my custom fan curve: a drop of 3°C in each benchmark, but note the ambient temp, which was around 23°C. *So a DELTA of 6°C.* Also, the fan only spun at 69%.
> 
> *So I forced a constant 76% fan speed to see the real difference (3rd line). I got a delta of 7°C in Heaven and a delta of 9°C in Firestrike. Considering measurement inaccuracy, a temperature drop of 6-9°C from changing the TIM is realistic.*
> 
> I also noticed a lower VDDCin with the lower temps; not sure what to make of that. And the slight artifacting in Firestrike did not appear again. Due to the lower temp? Maybe, not sure.
> 
> Now I can put in my new i5-6600k with the asus hero VIII board to get a realistic go at overclocking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this helps some people that plan on exchanging the TIM on their R9 390.


Those are actually chokes. The *VRMs* (the parts you want to cool) are on the right side, along the chokes. *VRAM* refers to the graphics card's memory ICs.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Dundundata*
> 
> @Averdan, that seems to be a common problem, pretty much what my XFX looked like-a gobbed on mess. And it looks like yours was missing some right in the middle
> 
> 
> 
> 
> 
> 
> 
> You'd think they could get this part of the manufacturing process right, seeing how many RMAs they must get for heat and how vital it is to the card. I am tempted to see what they did to my MSI but my temps are great so I must restrain myself from messing with it! I never really liked the spread method as much as the pea/BB dab, but that's another debate entirely.


Yeah my temps were alright, but I thought "hey if I can get another 6°C improvement with 30 min work... why not"







And you're right, you'd think they would put more effort into these heatpipes. But hey, even on the new Skylake CPUs you can gain 10°C or more by delidding and changing the TIM. For the everyday user, though, it probably wouldn't make a difference.
Quote:


> Originally Posted by *tolis626*
> 
> Application method on GPUs is quite a different thing than on CPUs. I don't think the pea method would work that well. How would you even make sure that you cover the whole core while you also don't apply too much TIM? I think it's not the most reliable method. I would either go with spreading or even the star pattern method that EK suggests on their waterblocks.


Like you said Dundundata, that is an entirely different debate







But like tolis626 said: on a CPU's IHS I also use the pea method, since there's already a heatspreader over the die, but on the GPU I opted for the spread method to make sure everything is covered (it also depends on what thermal paste/liquid metal you're using, of course). The ±1°C difference between a perfect and near-perfect spread is OK with me.


----------



## AverdanOriginal

Quote:


> Originally Posted by *kizwan*
> 
> That's Choke actually. *VRMs* (the parts you want to cool) are on the right side along the chokes. *VRAMs* are actually referring to graphic card memory ICs.


Yeah you are right. Edited my post. thanks for the info


----------



## tolis626

Quote:


> Originally Posted by *gupsterg*
> 
> I was wondering if some MSI 390 / X owners can state their VID and ASIC ratings, I'm just curious to correlate something I noted with 290/X. I was after results with factory bios and no OC present but factory clocks. Please use APP linked below for VID, best to take reading after a reboot and no OC app running in background.
> 
> Link:- The Stilt VID app


I just did that and it gives me a VID (or VDD?) of 1.275V. That seems kinda high, doesn't it? At stock clocks/voltages it only goes up to about 1.19V. Maybe 1.2V, but I don't think it ever goes that high.

Anyways, my ASIC is 74.9%. That's... quite good actually. I like it.
Quote:


> Originally Posted by *gupsterg*
> 
> That voltage relates to what the voltage chip on card thinks is 12v coming in.
> 
> I wouldn't worry about it
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Voltage control chip is same as 290/X, all 3 290/X cards I've experienced do the same.


It's the first time I took notice and it bugged me quite a bit. More than it should actually. Since all is right, I don't care what the reading is, though.


----------



## RWGTROLL




----------



## gupsterg

Quote:


> Originally Posted by *tolis626*
> 
> I just did that and it gives me a VID (or VDD?) of 1.275V. That seems kinda high, doesn't it? At stock clocks/voltages it only goes up to about 1.19V. Maybe 1.2V, but I don't think it ever goes that high.


Many thanks! I haven't been following the thread fully, so would you mind stating whether you have the MSI 390 or 390X?

Why you're seeing VDDC lower than VID: a) software monitoring isn't accurate, b) it droops, and c) a fair amount of voltage control is done depending on load; I've seen that running benchmark *x* hits a different max VDDC than benchmark *y*.
Quote:


> Originally Posted by *tolis626*
> 
> Anyways, my ASIC is 74.9%. That's... quite good actually. I like it.


ASIC rating = leakage ID. I have read a few posts by The Stilt, and the simple explanation is that it's the opposite of what GPU-Z advises about ASIC rating.

Some informative posts by The Stilt:-

1st post
2nd post
3rd post
4th post

Those I think cover it well, but there are also other posts by him with further golden nuggets of info.

The only other thing I recall reading was that he stated a low leakage-ID ASIC can sustain a voltage increase better (long-term use), whereas a high leakage-ID ASIC can't.

*Why I asked for the test?*

All factory ROMs operate using Electronic Variable Voltage (EVV); this means a VID is set based on some calculations that depend on the GPU's properties. The other thing that affects it is the default GPU clock.

I found on my 290X that if I set the ROM's default GPU clock to 1000MHz I get a VID of 1.25V, and as I set a higher default GPU clock in the ROM (not in an OC app), the VID goes down. This is due to another calculation going on; if you set the GPU voltage manually instead of using EVV, this effect stops.

I have not yet flashed a 390/X ROM to my Vapor-X 290X due to its custom PCB, so I wanted to know if it's the same with the 390/X.

So, can someone with a lower-default-clock 390/X than the MSI 390/X, like a Sapphire Nitro 390/X, run the same test and state their ASIC quality?

Cheers.
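To make the clock-dependent VID behaviour concrete, here is a toy sketch in Python. The base values and the linear step are purely illustrative placeholders, not AMD's actual EVV math; it only mirrors the shape of what I observed (VID falling as the ROM's default clock rises):

```python
# Toy model of the EVV behaviour described above: the ROM's default GPU clock
# feeds a second calculation that lowers the resulting VID. The constants here
# are illustrative placeholders only, NOT AMD's real formula.

def evv_vid(default_clock_mhz, base_vid=1.25, base_clock=1000, step_v_per_50mhz=0.00625):
    """Return a hypothetical VID for a given ROM default clock (MHz)."""
    steps = (default_clock_mhz - base_clock) / 50
    return round(base_vid - steps * step_v_per_50mhz, 5)

# A higher default clock in the ROM yields a lower computed VID, matching the
# 290X observation; manually setting the voltage would bypass this entirely.
for clock in (1000, 1050, 1100):
    print(clock, evv_vid(clock))
```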


----------



## tolis626

Quote:


> Originally Posted by *gupsterg*
> 
> Many thanks
> 
> 
> 
> 
> 
> 
> 
> , I have not been following thread fully so you wouldn't mind stating do you have MSI 390 or 390X?
> 
> Why you're seeing VDDC lower than VID is a) software monitoring not accurate b) it droops c) a fair amount of voltage control is done depending on load I have seen if I run *x* benchmark I hit differing MAX VDDC than *y* benchmark.
> ASIC rating = Leakage ID , I have read a few posts by the Stilt and the simple explanation is its opposite to what GPU-Z advise on ASIC rating.
> 
> Some informative posts by The Stilt:-
> 
> 1st post
> 2nd post
> 3rd post
> 4th post
> 
> Those I think cover it well but there are also other posts by him with other golden nuggets of info
> 
> 
> 
> 
> 
> 
> 
> .
> 
> *Why I asked for the test?*
> 
> All factory roms operate using Electronic Variable Voltage (EVV) this mean based on some calcs depending on GPU properties a VID is set. The other thing that affects it is default GPU clock.
> 
> I found on my 290X that if I set ROM to have default GPU clock of 1000MHz I get VID 1.25v and as I set higher default GPU clock in ROM (not OC app) then it goes down. This is due to another calc which is going on, if you then set GPU voltage manually instead of EVV this effect stops.
> 
> I have not yet flashed a 390/X rom to my Vapor-X 290X due to custom PCB, so wanted to know if it is same with 390/X.
> 
> SO can someone with lower default clock 390/X than MSI 390/X like a Sapphire Nitro 390/X run same test and state ASIC quality?
> 
> Cheers.


Crap. I just read the Stilt's posts and crash-landed. I thought higher ASIC "quality" meant better overclocking. Damn, and I was happy about it. Dud overclockers seem to like me in general. Oh well...


----------



## Pwned24

Hey, so I got the new 15.10 beta drivers and thought they were good. Witcher 3 still crashes, though less often. What I don't understand is the feeling that the GPU is being underutilized. The fan speeds are the same, yet my GPU doesn't even break 70C (I usually reach 75C). GPU usage also used to be a constant 100% in Witcher 3, but according to Afterburner it now fluctuates. Also, the power drawn by the card is significantly lower (less than 230W despite the power limit being set to +50%). Is anyone else seeing the same issues?


----------



## tolis626

Quote:


> Originally Posted by *Pwned24*
> 
> Hey So I got the new 15.10 Beta Drivers and I thought it was good. My Witcher 3 Program crashes still however less often. What I don't understand though is the feeling that the GPU is being underutilized? The fan speeds are the same yet my GPU doesn't even break 70C (i usually reach 75C) GPU usage also used to be a constant 100 in Witcher 3 but according to Afterburner it now fluctuates. Also, the power being used by the card is significantly lower (less than 230Watts despite +50% power set). Is anyone else getting the same issues?


I haven't tested any of the things you just said, but I'll do so next time I start Inquisition. What I have noticed is that my card crashes easier from memory overclocks. I mean, I was stable at 1675MHz with +20mV AUX before, but now it crashes in about 2 minutes into the game even with +25mV. Strange...


----------



## Agent Smith1984

Quote:


> Originally Posted by *RWGTROLL*


You are added!

PLEASE PLEASE post up some overclocking results for that Devil. I want to see how that card does on its factory hybrid water cooling.

I am thinking 1200MHz may come easy for that card......


----------



## AliNT77

Quote:


> Originally Posted by *RWGTROLL*


Can you post your Firestrike graphics score with everything at default?

Thanks in advance.


----------



## Dundundata

Pretty excited to see what el Diablo can do!


----------



## diggiddi

Guys, in pursuit of crossfire options here: get another 290X, or go for a 390/X? I'm at 1080p.
I already have ze 290X Lightning, and I'm looking at picking up a Samsung curved 4K TV in the future.
Main games are Crysis 3, BF3/4, and F1. What to do????


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> Guys in pursuit of crossfire options here, get another 290x or go for 390/x? I'm at 1080p
> I already have ze 290x lightning, looking at picking up a Samsung curved 4k tv in the future
> main games are Cry 3, BF3/4, F1 what to do????


No need to go for a 390 if you aren't adding another 390 later for 8GB usable VRAM.

If you get a 390 now, you can still only use 4GB.

Disregard all the talk of VRAM stacking, as we likely won't see that for a few years, and that's if programmers even decide to use it at all.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No need to go for a 390 if you aren't adding another 390 later for 8GB usable VRAM.
> 
> If you get a 390 now, you can still only use 4GB.
> 
> Disregard all the talk of VRAM stacking, as we likely won't see that for a few years, and that's if programmers even decide to use it at all.


Cool I was thinking of using it as Primary card in situations where 4+GB Vram is needed in single card mode +rep though
BTW, how are/were you getting more than 30FPS at 4K using HDMI?


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> Cool I was thinking of using it as Primary card in situations where 4+GB Vram is needed in single card mode +rep though
> BTW how are/were you getting more the 30FPS on 4k using HDMI?


Using a DP to HDMI adapter

Check this out:
http://www.amazon.com/gofanco%C2%AE-DisplayPort-Converter-UltraHD-Display/dp/B00OSBDVF8/ref=sr_1_2?ie=UTF8&qid=1445018190&sr=8-2&keywords=dp+to+hdmi+4k+60hz+adapter


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Using a DP to HDMI adapter
> 
> Check this out:
> http://www.amazon.com/gofanco%C2%AE-DisplayPort-Converter-UltraHD-Display/dp/B00OSBDVF8/ref=sr_1_2?ie=UTF8&qid=1445018190&sr=8-2&keywords=dp+to+hdmi+4k+60hz+adapter


Thx repped again


----------



## Agent Smith1984

So, I held on to the Asus Strix afterall....

(I would note that I received payment for it from someone who was actually in another country and didn't have a confirmed address; when confronted, they decided to cancel the transaction. So the intention was to sell the card.)

I am picking up another new toy tonight though...

AMD FX-8370 for $130









5GHz here I come!!


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Using a DP to HDMI adapter
> 
> Check this out:
> http://www.amazon.com/gofanco%C2%AE-DisplayPort-Converter-UltraHD-Display/dp/B00OSBDVF8/ref=sr_1_2?ie=UTF8&qid=1445018190&sr=8-2&keywords=dp+to+hdmi+4k+60hz+adapter


Hi Agent, I was going to ask you as well, since I've been searching for a DP to HDMI converter/adapter but only found 3840x2160 @ 30Hz ones. The one you linked also lists 3840x2160 @ 30Hz as its specification. Wouldn't that mean a maximum of 30 FPS, or can it perform better than specified? Do you get more than 30 FPS at 4K with that adapter?
It would help a lot.


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi Agent, was gonna ask you aswell, since I was searching for a DP to HDMI converter/adapter but only found 3840x2160 @ 30Hz. The one you linked also only has as a specification 3840x2160 @ 30 Hz. Wouldn't that mean you can have a maximum of 30 FPS? or can it perform better than specified? do you get with that adapter on 4k more than 30 FPS?
> Would help a lot.


Wow, you are right! That would explain all the tearing! Derp









But seriously, it looks like you are correct, yet my games were all breaking 30fps per Fraps and I got no tearing... Could it be because the TV itself has hardware to reduce motion blur and tearing issues?


----------



## abu77

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, I held on to the Asus Strix afterall....
> 
> (would note that I got a payment from someone for it who was actually in another country and did not have a confirmed address, when confronted they decided to cancel the transaction, so the intention was to sell the card)
> 
> I am picking up another new toy tonight though...
> 
> AMD FX-8370 for $130
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 5GHz here I come!!


Please, can you tell me what settings you used to get those temps and overclocks? I have an Asus Strix 390 and its temps are so high: 86C load at stock speeds. How loud does yours get, too? Mine is very loud at around 70% fan speed. Could my card be faulty, or is it a software problem? Thank you.


----------



## Derek129

So I decided to replace the thermal paste on my 390. This is what the factory-applied paste looked like; doesn't look right to me. I was seeing 70+ degrees in most games, and after the repaste, gaming on GTA V for an hour last night, my max temp was 66.


----------



## Darkeylel

So, unfortunately, my Gigabyte 390X died. Pretty sad; first time this has happened out of 6 Gigabyte cards.

Now for the long-arse warranty claim. Huzzah!


----------



## Gildejean

No answers for my questions


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gildejean*
> 
> No answers for my questions


What was your question?


----------



## RWGTROLL

Test 2 with +50% Power Limit in Afterburner


----------



## Sgt Bilko

Quote:


> Originally Posted by *RWGTROLL*
> 
> 
> 
> Test 2 with +50% Power Limit in Afterburner


Hmmm, so by the looks of that, it's only the XFX 390Xs that need the extra power limit to stay at max clocks (at stock).

Damn good VRM temps though, and that AIO sure is doing well keeping those core temps down.


----------



## Conathan

1. 

2. MSI R9 390 Gaming 8G

3. Stock

Been lurking for a bit, but decided to finally join because I'm looking for insight.

In my post is my current OC.

My aux voltage is also set to 75+.

This is as stable as I could get my card to run, but I feel like I'm pumping way more voltage into my card than most people.

I have very little knowledge of this; my method was just to bump the core and memory separately in increments of +5, and when I started seeing artifacts, bump the voltage up +5.

I started seeing artifacts no matter the voltage at 1160 on the core, and starting at 1720 on the memory.

Just wanna know if anyone more knowledgeable on the subject can point out something I'm doing wrong, or give me some tips.

Also, here's a little more info on the rest of my system.



The rest like my psu is in my sig.
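For what it's worth, the step-until-artifacts procedure described above can be sketched as a simple search loop. `is_stable` here is a stand-in for whatever stability check you actually run (a benchmark pass, watching for artifacts, etc.), not a real API:

```python
# Sketch of a "+5 MHz at a time" overclock search: raise the clock in fixed
# steps until the stability check fails, then back off one step.
# is_stable is a hypothetical callback, not a real driver/tool interface.

def find_max_core(start_mhz, limit_mhz, step, is_stable):
    """Return the highest clock (in MHz) that still passes is_stable."""
    clock = start_mhz
    while clock + step <= limit_mhz and is_stable(clock + step):
        clock += step
    return clock

# Example with a fake stability check that fails above 1155 MHz,
# mirroring the artifacting reported at 1160:
print(find_max_core(1040, 1300, 5, lambda mhz: mhz <= 1155))  # prints 1155
```

In practice each `is_stable` call is minutes of gaming or benchmarking, which is why people usually coarse-step first (+25) and then fine-step (+5) near the limit.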


----------



## Dundundata

You could try lowering the memory under 1700 and bumping up the core clock. See what you can do with +100/50mV. My MSI seems to have a tougher time with high memory OC but can do core increases easily.


----------



## Conathan

Quote:


> Originally Posted by *Dundundata*
> 
> You could try lowering the memory under 1700 and bumping up the core clock. See what you can do with +100/50mV. My MSI seems to have a tougher time with high memory OC but can do core increases easily.


I've done that; I raised them both independently.

Even without the memory OC'd, I couldn't go over 1155.

I've been monitoring it tonight while gaming, and I think it's a temperature constraint.

Might just have to work on my airflow and find a cooling alternative.


----------



## flopper

Quote:


> Originally Posted by *Gildejean*
> 
> Hello everyone!
> 
> 1: My main use for this rig is gaming, and maybe some benching now that i got really into this thing. I'm a MMO guy, só i mostly ill be playing games that this GPU can nail 150+ FPS like FPS's, moba's, and open world RPGs, was always a console player but since im not getting an atual gen console, gonna switch to PC, how a good OC (1150/1650 maybe) its gonna help in real gaming performance? I know i cant expect +30 FPS gain but maybe 10 in hm... BF4?
> 
> 2. I'll be playing in 1080p, maybe VSR 1440p, theres a need to go for 390x instead? And theres much difference between VSR and real 1440p performance wise? This question is cause i really wanna enjoy the 144hz feature só i plan on going 60+ FPS, maybe some 100 or max i can get.
> 
> 3. This PSU can handle the i5 OC'ed to 4,5ghz and everything fine? Plan to buy a second 390 in a year or so and i know i would have to upgrade, what should i get there? 1000w min?
> 
> 4. Do any of you guys know a shop that can ship for Brazil besides eBay? I have trust issues with these. Hardware here are overpriced, we are used to go on local Store, cause by shipping here
> The prices are abusive. By my math, importing + taxes still cheaper than buy over here. And we dont have plenty of options, for WC for exemple... All i can find CM and Corsair.
> 
> Im so sorry for post lenght, but cmon, after reading the whole thread, i kinda deserve it right hahaha. Sorry for any grammar mistakes and thank you all again! Feel free to give your opinion in anything even not GPU related on the rig.


1. MMOs are mainly CPU-bound; a 390 does fine. OCing the CPU is the better option.
2. A 390 is fine for 1080p, overkill even.
You don't need 144+ fps to enjoy a 144Hz screen; 144Hz with a FreeSync monitor is the way to go.
If you're buying a monitor today, nothing but FreeSync makes sense IMO.
3. For crossfire, 1000W with a quality PSU.
4. No idea.
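To put rough numbers on point 3: a quick power-budget estimate. The TDP figures below are ballpark public specs, not measurements from this thread:

```python
# Rough power-budget check for the proposed build.
# Wattages are approximate vendor TDP figures, chosen for illustration.

parts = {
    "R9 390 (stock)": 275,            # W; can draw more when overclocked
    "i5-6600K @ 4.5 GHz": 120,        # OC'd CPU estimate
    "board/RAM/SSD/HDD/fans": 75,     # everything else, lumped together
}
total = sum(parts.values())
headroom = 750 - total                # EVGA SuperNOVA 750 B2
print(total, headroom)                # one 390: plenty of headroom

# A second 390 for crossfire pushes the estimate right up against the PSU's
# rating once both cards are overclocked, hence the usual ~1000W advice.
print(total + 275)
```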


----------



## Dundundata

@Conathan, What are your core/vrm temps?


----------



## Conathan

Quote:


> Originally Posted by *Dundundata*
> 
> @Conathan, What are your core/vrm temps?


Under 100% load GPU is 95C, VRM1 is 82C, and VRM2 is 49C.


----------



## Dundundata

Ignore VRM2; apparently there is only one sensor.

That is definitely way too high. You might want to look into checking the thermal paste. Does your case have good airflow?


----------



## Conathan

Quote:


> Originally Posted by *Dundundata*
> 
> Ignore vrm2 apparently there is only 1 sensor
> 
> That is definitely way too high might want to look into checking the thermal paste. Does your case have good airflow?


Yeah I was gonna look into that and possibly find an aftermarket cooling solution or water loop.

The case has pretty good airflow. I haven't done anything to improve it, but it's a wide-open case with plenty of fans.

The card has plenty of breathing room.


----------



## RWGTROLL

These are some of the benchmarks for my PowerColor R9 390X Devil. Also, these are my GPU temps while running the benchmarks:

GPU Temps ( C )
Minimum, Maximum, Average
GPU: 35 ,60, 45
VRM: 36, 64, 56


----------



## Sgt Bilko

Dirty Rally hehe

Must be some Bikini bystander mod i don't know about









(Just messing with ya







)

Good results though and interesting that Mantle is lower than DX11 with BF4


----------



## rdr09

Quote:


> Originally Posted by *RWGTROLL*
> 
> This is some of the benchmarks for my Powercolor R9 390x Devil
> 
> 
> GPU Temps ( C )
> Minimum, Maximum, Average
> GPU: 34 ,49, 45.1
> VRM: 37, 64, 57
> 
> 
> 
> GPU Temps. (C)
> Minimum, Maximum, Average
> GPU: 30.0, 46.0, 37.0
> VRM: 35.0, 61.0, 44.4
> 
> 
> 
> GPU Temps ( C )
> Minimum, Maximum, Average
> GPU: 31, 45, 37
> VRM: 34, 65, 42
> 
> 
> 
> GPU Temps ( C ):
> Minimum, Maximum, Average
> GPU: 30.0, 46.0, 36.0
> VRM: 34.0, 66.0, 42.6
> 
> 
> 
> GPU Temps ( C )
> Minimum, Maximum, Average
> GPU: 35 ,60, 45
> VRM: 36, 64, 56


Isn't that like two 390Xs? At 1080p you need to raise the resolution scale to 150% or so, or you won't be able to push the GPUs much. Also, you can't leave the CPU at stock; IMO, any CPU.


----------



## RWGTROLL

Quote:


> Originally Posted by *rdr09*
> 
> isn't it that like 2 390Xs? at 1080 you need to raise the rez scale to 150% or something or you won't be able to push the gpus much. also, you can't leave the cpu at stock. imo, any cpu.


no it is a single 390x. The devil 13 are the dual gpu cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RWGTROLL*
> 
> This is some of the benchmarks for my Powercolor R9 390x Devil
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GPU Temps ( C )
> Minimum, Maximum, Average
> GPU: 34 ,49, 45.1
> VRM: 37, 64, 57
> 
> 
> 
> GPU Temps. (C)
> Minimum, Maximum, Average
> GPU: 30.0, 46.0, 37.0
> VRM: 35.0, 61.0, 44.4
> 
> 
> 
> GPU Temps ( C )
> Minimum, Maximum, Average
> GPU: 31, 45, 37
> VRM: 34, 65, 42
> 
> 
> 
> GPU Temps ( C ):
> Minimum, Maximum, Average
> GPU: 30.0, 46.0, 36.0
> VRM: 34.0, 66.0, 42.6
> 
> 
> 
> GPU Temps ( C )
> Minimum, Maximum, Average
> GPU: 35 ,60, 45
> VRM: 36, 64, 56
> 
> 
> 
> 
> 
> 
> isn't it that like 2 390Xs? at 1080 you need to raise the rez scale to 150% or something or you won't be able to push the gpus much. also, you can't leave the cpu at stock. imo, any cpu.
Click to expand...

The 390x2 Devil 13 is the dual GPU card, he has the 390x Devil which is an AIO/Air cooled card.

The key is the "13": if it has that in the name, then it's dual GPU (apart from the 6970 Devil 13)


----------



## rdr09

Quote:


> Originally Posted by *RWGTROLL*
> 
> no it is a single 390x. The devil 13 are the dual gpu cards.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> The 390x2 Devil 13 is the dual GPU card, he has the 390x Devil which is an AIO/Air cooled card.
> 
> Key is the "13", if it has that in the name then it's Dual GPU (apart from the 6970 Devil 13
> 
> 
> 
> 
> 
> 
> 
> )


I see. Then I suggest checking usage.

What did you use to measure fps with Mantle?

My 7950 gets higher min at 1080.


----------



## RWGTROLL

Quote:


> Originally Posted by *rdr09*
> 
> isn't it that like 2 390Xs? at 1080 you need to raise the rez scale to 150% or something or you won't be able to push the gpus much. also, you can't leave the cpu at stock. imo, any cpu.


Quote:


> Originally Posted by *rdr09*
> 
> I see. Then I suggest check usage.
> 
> What did you use to measure fps with Mantle?
> 
> My 7950 gets higher min at 1080.


http://www.overclock.net/t/1530583/fta-frame-time-analyzer-v1-0-1-supports-bf4-civ-be-da-i

It was the most accurate tool, I think. Fun fact: it also works for Battlefront.
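If you ever want to sanity-check FTA's numbers by hand, the conversion from a frame-time log is just fps = 1000 / frametime_ms. A minimal sketch (the input list here is made-up data, not FTA's actual file format):

```python
def fps_stats(frame_times_ms):
    """Min and average FPS from per-frame render times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps)

# three hypothetical frames: 16ms, 20ms, 12.5ms
lo, avg = fps_stats([16.0, 20.0, 12.5])
print(f"min {lo:.1f} fps, avg {avg:.1f} fps")
```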


----------



## rdr09

Quote:


> Originally Posted by *RWGTROLL*
> 
> http://www.overclock.net/t/1530583/fta-frame-time-analyzer-v1-0-1-supports-bf4-civ-be-da-i
> 
> It was the most accurate tool I think. Also fun fact it works for Battlefront.


Thanks, +rep. Fraps does not work when I tested in Mantle.

Seriously, though, my 7950 gets a higher min in BF4 maxed out than that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RWGTROLL*
> 
> http://www.overclock.net/t/1530583/fta-frame-time-analyzer-v1-0-1-supports-bf4-civ-be-da-i
> 
> It was the most accurate tool I think. Also fun fact it works for Battlefront.
> 
> 
> 
> Thanks +rep. Fraps does not when I tested in mantle.
> 
> Seriously, though, my 7950 gets a higher min in BF4 maxed out than that.
Click to expand...

It's a pity that Fraps doesn't work with Mantle....it should get DX12 support though.


----------



## RWGTROLL

Quote:


> Originally Posted by *rdr09*
> 
> Thanks +rep. Fraps does not when I tested in mantle.
> 
> Seriously, though, my 7950 gets a higher min in BF4 maxed out than that.


I will run it again right now, one sec. Also, are you talking about the DX11 score or both?


----------



## cclau901

I am going to buy an XFX R9 390, and I am going to watercool it with an EK block. Is there a way for me to check whether the card I chose has the revisions that make it incompatible with the waterblock?


----------



## rdr09

Quote:


> Originally Posted by *RWGTROLL*
> 
> I will run it right now again 1 sec also are you talk about dx 11 score or both


Mantle does not work with FRAPS. Here was my stock 7950 and an i7 4.5GHz in BF4 Maxed MP 64 . . .



Now this was a stock 290 with the same cpu in BF4 MP 64 . . .



You've got to check if the GPU is pegged using AB (Afterburner). Like this . . .



that's with i7 HT off.

Do you mind running Firestrike?


----------



## TehMasterSword

Quote:


> Originally Posted by *cclau901*
> 
> I am going to buy a xfx r9 390, and I am going to watercool it with an EK block. Is there a way for me to check if the card I chose has the revisions on it that makes it not compatible with the waterblock?


Oh, I would also like to know this since I'm planning on picking up an XFX when my refund comes. What did the revisions do?


----------



## Zaber123

Installed my PCS+ 390 yesterday and after some initial hiccups it is a beast. +100mV +50 power limit gave me 1150/1620. Temps and noise are really nice, staying in the low to mid 70s and being quieter than my cpu fans.

For the record, the initial problem was a frame stutter every 2 seconds. I was running a DVI monitor and an HDMI TV cloned. Only displaying to one monitor at a time fixed my stutter issue.


----------



## RWGTROLL

Quote:


> Originally Posted by *rdr09*
> 
> Thanks +rep. Fraps does not when I tested in mantle.
> 
> Seriously, though, my 7950 gets a higher min in BF4 maxed out than that.


Quote:


> Originally Posted by *rdr09*
> 
> Mantle does not work with FRAPS. Here was my stock 7950 and an i7 4.5GHz in BF4 Maxed MP 64 . . .
> 
> 
> 
> Now this was a stock 290 with the same cpu in BF4 MP 64 . . .
> 
> 
> 
> You got to check if the gpu is pegged using AB. Like this . . .
> 
> 
> 
> that's with i7 HT off.
> 
> Do you mind running Firestrike?


this is what I got at stock straight out of the box

http://www.3dmark.com/3dm/8943139?


----------



## RWGTROLL

Quote:


> Originally Posted by *rdr09*
> 
> Mantle does not work with FRAPS. Here was my stock 7950 and an i7 4.5GHz in BF4 Maxed MP 64 . . .
> 
> 
> 
> Now this was a stock 290 with the same cpu in BF4 MP 64 . . .
> 
> 
> 
> You got to check if the gpu is pegged using AB. Like this . . .
> 
> 
> 
> that's with i7 HT off.
> 
> Do you mind running Firestrike?


If you could, I would like to know exactly what settings you are using, because I have all motion blur disabled, the Ultra preset, and FOV at 90. I don't play at these settings since the eye candy is useless to me; I play very competitively. If you would like to know what I normally play at, let me know.


----------



## gupsterg

Cheers for the ROM, +Rep.

Be interested to know the VID of your card: do a clean reboot with no OC apps running and try this program, Link:- The Stilt VID app

Please also state your ASIC rating.


----------



## RWGTROLL

Quote:


> Originally Posted by *TehMasterSword*
> 
> Oh, I would also like to know this since I'm planning on picking up an XFX when my refund comes. What did the revisions do?


What waterblock company are you going for? Also, most of the companies have it on their sites:

http://www.ekwb.com/configurator/waterblock/3831109830192#DB_inline?height=260&width=530&inline_id=comp_table


----------



## RWGTROLL

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers for the ROM +Rep
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Be interested to know VID of your card
> 
> 
> 
> 
> 
> 
> 
> , do a clean reboot, no OC apps running and try this program, Link:- The Stilt VID app
> 
> Please also state ASIC rating
> 
> 
> 
> 
> 
> 
> 
> .


I will do the other one later; going to game for now, or maybe do some research, don't know yet.


----------



## Zaber123

I thought the stutter was gone; it isn't. I think the problem has to do with the framerate being locked to 59fps whenever I enable vsync. I have noticed this in The Witcher 3 and Dying Light on Windows 10 with the latest beta driver. Has anyone had a similar juddering issue?


----------



## Dundundata

Quote:


> Originally Posted by *Zaber123*
> 
> I thought the stutter was gone, it isn't. I think the problem has to do with the framerate being locked to 59 fps whenever I enable vsync. I have noticed this on the Witcher 3 and on Dying Light with Windows 10 and the latest beta driver. Has anyone had a similar issue with juttering?


what are you using to enable vsync?


----------



## agntallen

Def need to join this. I just got an R9 390 for my new rig.

I tried looking up my situation in this thread, but I figured it would be best to ask: is this worth RMA'ing?
Originally my DVI out was not working, only HDMI. It was probably due to me forgetting to plug in the other 8-pin PCIe plug. I plugged it back in, but HDMI was still the only output working.
I waited until the next day to see if the DVI output would work, and oddly enough it does now.
Is it worth the time to get an RMA if the issue only happened once? Or do you think the card is fine now?
I don't mind not having a working computer while the replacement is on its way; I just wanted to get the opinion of others on here.

btw rdr09, do you still play BF4? If so, let's play!


----------



## cclau901

Quote:


> Originally Posted by *TehMasterSword*
> 
> Oh, I would also like to know this since I'm planning on picking up an XFX when my refund comes. What did the revisions do?


Sorry it took so long, but the revisions gave the card new inductors, according to EK. If you go here you can navigate to AMD > Radeon R9 390 > XFX; you can see there are two of each type of XFX R9 390 DD. EK only has blocks compatible with two of the four, and the air coolers they come with all look the same, so you can't tell the old ones from the revised ones.


----------



## Zaber123

Quote:


> Originally Posted by *Dundundata*
> 
> what are you using to enable vsync?


I have tried in-game, CCC, and RadeonPro, but the CCC profile setting and RadeonPro setting are not working for me. I can see the RadeonPro overlay, but the frames are uncapped. That is the case for both The Witcher 3 and Dying Light.


----------



## rdr09

Quote:


> Originally Posted by *RWGTROLL*
> 
> If you could i would like to know what exact setting you are using because I do have all motion blur disabled ,Ultra preset and, FOV is 90. I don't play at this setting as the eye candy is useless to me because I play very competitive. If you would like to know what I play at normally let me know.


I think you need to disable the iGPU if you are not using it. I don't see anything wrong with your FS score.

In BF4, I just set everything at the highest setting (let me see if I can find a screenie), and my FOV is about the same as yours.

I do not, however, mess with CCC. I uncheck CCC in msconfig under startup programs. I think I read you adjusted tessellation in there.

CCC combined with other apps such as AB can cause conflicts and may affect a game.

Here were my BF4 settings at 1080p, DX11 and Mantle (I never play in DX11 except for monitoring fps) . . .



Like I said, CCC is at default. Overdrive is disabled.

edit: @agntallen, I haven't played since July. I left the States and moved overseas, and my stuff is still sitting at the port.


----------



## TehMasterSword

Quote:


> Originally Posted by *cclau901*
> 
> Sorry it took so long, but the revisions gave the card new inductors according to ek. If you go here it you can go to AMD, Radeon R9 390, XFX, then you can see that there are two of each type of XFX r9 390 DD's. EK only has blocks compatible with two of the four, and the air coolers they come with all looks the same, so you can't tell the old from the revised ones.


Thank you! You just possibly saved me from ordering the wrong card and not being able to watercool it in the near future.


----------



## cclau901

Quote:


> Originally Posted by *TehMasterSword*
> 
> Thank you! Just you possibly saved me from ordering the wrong card and not being able to watercool it in the near future.


Haha, I'm in the same boat, and no problem


----------



## TehMasterSword

Quote:


> Originally Posted by *cclau901*
> 
> Haha, I'm on the same boat, and no problem


Welp, I've been trying to figure out which rev. Amazon is selling, with no luck. Looks like I'll be going with PowerColor. They only have ONE card out, NO revisions, a confirmed reference PCB with already excellent stock cooling, and not bad looks. If you come across any info on it within the next week, would you mind posting it here? I REALLY would like an XFX if I can be sure it's the reference board!


----------



## kizwan

Quote:


> Originally Posted by *TehMasterSword*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cclau901*
> 
> Haha, I'm on the same boat, and no problem
> 
> 
> 
> Welp, I've been trying to figure out how to tell which Rev. Amazon is selling with no luck. Looks like I'll be going with Powercolor. They only have ONE card out, NO revisions, confirmed reference PCB with already excellent stock cooling and not bad looks. If you come across any info on it within the next week, would you mind posting it here? I REALLY would like an XFX if I can be sure its the reference board!
Click to expand...

PowerColor's 390 cards are not reference PCB, though EK does have a full-cover waterblock for them, because the custom board design is the same as PowerColor's custom-cooled 290 cards.

http://www.ekwb.com/configurator/waterblock/3831109830192


----------



## cclau901

Quote:


> Originally Posted by *TehMasterSword*
> 
> Welp, I've been trying to figure out how to tell which Rev. Amazon is selling with no luck. Looks like I'll be going with Powercolor. They only have ONE card out, NO revisions, confirmed reference PCB with already excellent stock cooling and not bad looks. If you come across any info on it within the next week, would you mind posting it here? I REALLY would like an XFX if I can be sure its the reference board!


I have found out that XFX does not void the warranty even if you take off the stock cooler. So you can just buy a 390, check whether it has the new or old inductors, and get an RMA if it has the new ones.
Here's where I got all of this information: http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/3420#post_24517162
You can check the FAQs on the XFX website if you want to make sure they do not void your warranty.


----------



## dmcl325i

Anyone else running the PowerColor R9 390? Wondering what sort of increase in clock speed vs power limit people are using, as I can't seem to get stable beyond about +6% clock speed and +3 to +4% power limit. I have gotten up to about +10% clock speed but can't find a sweet spot like I did at +4 and +6% clock speed.

This is on stock fans, etc.
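For comparing notes, the Afterburner/Trixx percentage sliders map to absolute clocks with simple arithmetic. A quick sketch (the ~1010MHz stock clock is just an example; check your own card in GPU-Z):

```python
def oc_clock(stock_mhz: float, percent: float) -> float:
    """Absolute core clock implied by a +percent slider setting."""
    return stock_mhz * (1 + percent / 100)

# e.g. +6% on a card with a 1010MHz stock clock
print(round(oc_clock(1010, 6), 1))
```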


----------



## Sgt Bilko

Quote:


> Originally Posted by *cclau901*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TehMasterSword*
> 
> Welp, I've been trying to figure out how to tell which Rev. Amazon is selling with no luck. Looks like I'll be going with Powercolor. They only have ONE card out, NO revisions, confirmed reference PCB with already excellent stock cooling and not bad looks. If you come across any info on it within the next week, would you mind posting it here? I REALLY would like an XFX if I can be sure its the reference board!
> 
> 
> 
> I have found out that XFX does not void the warranty even if you take off the stock cooler. So you can just buy a 390, check if it has the new or old inductors, and get an RMA if it has the new ones.
> Here's where I got all of this information: http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/3420#post_24517162
> You can check the XFX website in their FAQ's if you want to make sure that they do not void your warranty.
Click to expand...

No......RMA'ing when there is no problem with the product is bad for consumers, it just leads to higher prices in the long run due to processing and service costs.


----------



## Dundundata

Quote:


> Originally Posted by *Zaber123*
> 
> I have tried In-Game, CCC, and Radeon Pro. But the CCC profile setting and RadeonPro setting are not working for me. I can see the RadeonPro overlay, but the frames are uncapped. That is the case for the Witcher 3 and Dying Light.


You can enable vsync and choose your max frames if you use Rivatuner coupled with a program called Direct3D (D3D) Overrider. You cap the frames with Rivatuner and you force vsync/triple buffering with D3D. There are global settings or you can assign each game their own custom settings.


----------



## Dundundata

.


----------



## flopper

Quote:


> Originally Posted by *agntallen*
> 
> def need to join this. i just got a r9 390 for my new rig.
> 
> i tried looking up my situation in this thread, but i figured it would be best to ask if this is this worth RMA'ing?
> Originally my dvi out was not working & only hdmi. It was probably due to me forgetting to plug in the other 8 pin pci-e plug. plugged it back in but the hdmi out was the only one working.
> i waited the next day to see if the dvi output would work & oddly enough it works.
> is it worth the time to get a RMA if there was that issue one time? or do you think the card is fine now?
> i don't mind not having a working computer while the replacement is on its way. just wanted to get the opinion of others on here.
> 
> btw rdr09, do you still play bf4? if so let's play !


If it works now, that's your answer.


----------



## Zaber123

Quote:


> Originally Posted by *Dundundata*
> 
> You can enable vsync and choose your max frames if you use Rivatuner coupled with a program called Direct3D (D3D) Overrider. You cap the frames with Rivatuner and you force vsync/triple buffering with D3D. There are global settings or you can assign each game their own custom settings.


It turns out my issue was with the 15.10 beta driver. I went back to 15.7 and all my games on both monitors are now locking to 60 fps instead of 59.

I'd still like to figure out what is going on with RadeonPro so I can use the Dynamic Framerate Control. I can add visual effects with it fine (FXAA, SMAA), but the vsync settings aren't taking hold.


----------



## TehMasterSword

Quote:


> Originally Posted by *cclau901*
> 
> I have found out that XFX does not void the warranty even if you take off the stock cooler. So you can just buy a 390, check if it has the new or old inductors, and get an RMA if it has the new ones.
> Here's where I got all of this information: http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/3420#post_24517162
> You can check the XFX website in their FAQ's if you want to make sure that they do not void your warranty.


But I am already in the middle of a return XD


----------



## Dundundata

Can someone explain VRM Power Out vs Power In? I am NOT very good with amps/watts/volts, but I'm learning.

Also, what is a "safe" voltage for these GPUs, or does it not matter as long as temps aren't too high?


----------



## Bartouille

Quote:


> Originally Posted by *Dundundata*
> 
> Can someone explain VRM Power Out vs Power in, I am NOT very good with amps/watts/volts but I'm learning.
> 
> Also what is a "safe" voltage for these GPU's, or does it not matter as long as temps aren't too high?


VRM Power In is the power (in watts, of course) that goes into the VRM before the 12V input is converted down to whatever the output voltages are (say, 1.2V for the core and 1V for the memory interface). But VRMs aren't perfect, so there is always some loss in the conversion. VRM Power Out is the real power that ends up going to the core/memory interface; the rest is wasted as heat (that's what makes VRMs so hot). Power Out is actually the one that matters, because that's what the driver checks for the power limiter. Power Out is always lower than Power In.

This is AFAIK, so correct me if I'm wrong.
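To put that in numbers: efficiency is just Power Out over Power In, and the difference is what the VRM dissipates as heat. A sketch with hypothetical readings (not measurements from any particular card):

```python
def vrm_loss(power_in_w: float, power_out_w: float):
    """Watts dissipated in the VRM, and conversion efficiency."""
    loss = power_in_w - power_out_w
    efficiency = power_out_w / power_in_w
    return loss, efficiency

# hypothetical HWiNFO-style readings
loss, eff = vrm_loss(250.0, 215.0)
print(f"{loss:.0f} W lost in the VRM, {eff:.0%} efficient")
```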


----------



## gupsterg

This is what I also understand from reading

a) posts regarding this in HWiNFO support threads.
b) tech spec data pdfs of VRMs and matching the bracketed terms.


----------



## Ultra-m-a-n

@Agent Smith1984

I was going to PM you about the card, but I figured I might as well contribute to the thread. First off, I'd like to say thank you! Thanks for the good deal, for shipping to APO, and for being a good seller and selling me a great card. Running this card has been awesome, and it finally pushed me to get a new monitor. Going from 1440x900 to 2560x1440 has been amazing, and the 390 is awesome at this resolution. It has been killing every game I throw at it.

Anyway, I tried tweaking the card this weekend and last, and I am currently running at 1125/1620 @ +19mV/+50% power limit.

I honestly think I'm starving it of voltage; I might go to +25mV, but so far it has been pretty stable. No artifacts and pretty consistent performance, just random black-screen crashes in the Armored Warfare beta, hence my suspicion about voltage. I did a few runs of Valley, though, and the card was fine.

I am curious to compare with the OC you were running. Also, what program were you using to tweak the card? I'm just using standard MSI Afterburner.


----------



## tolis626

Guys, isn't anyone else getting more crashes with 15.10 compared to 15.9.1? Overclocks that were rock stable previously (Mainly memory overclocks) crash now. It's not even funny. I'm running my memory at 1650MHz at the same aux voltage I used to run 1675MHz and it just crashed on me while playing a little Witcher 3. I wanted to punch my screen. ****...

On a side note, holy crap is that game killing the GPU. Even with Hairworks disabled (It looks so cool with it enabled, but it does slash 10-15FPS from my framerate, so...) it barely manages 50-55FPS with everything maxed. I mean, Inquisition is hard to run too, but this is on another level. Any advice on what I may be able to do to have it run better? Also, what's up with AA not being configurable? It's just a tickbox to turn it on or off. And sharpening looks like crap, or is it just me?

So many questions... I may sound annoying.


----------



## Dundundata

Did you set foliage distance to High? This writeup is pretty good:

http://www.gamersnexus.net/game-bench/1952-complete-witcher-3-graphics-optimization-guide-and-performance


----------



## tangelo

Question: *Is the card acting normally?*

MSI R9 390



While playing Sniper Elite III at 1080p with all details maxed, the GPU load goes up and down in a non-random fashion. The core clock also fluctuates a bit, but not that much. After I Alt+Tab to Windows, the clocks go back to the stock 1060MHz I was using for gaming at the time. Power limit is at +50.

I don't notice anything odd while gaming. The game runs smoothly at 60fps with some odd drops to 40 now and then, but nothing constant or anything that would correlate with the GPU load jumping.

I run the game in DX11. Need to check if this happens with Mantle too...


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> Did u set foliage distance to high? This writeup is pretty good.
> 
> http://www.gamersnexus.net/game-bench/1952-complete-witcher-3-graphics-optimization-guide-and-performance


Everything is maxed out, so yeah, probably. I'll check it out, thanks!









Quote:


> Originally Posted by *tangelo*
> 
> Question: *Is the card acting normal?*
> 
> MSI R9 390
> 
> 
> 
> While playing Sniper Elite III on 1080p with all details maxed the gpu load goes up and down in a non-random fashion. The core clock also fluctuates a bit but not that much. After I ALT+TAB to win the clocks went back to the stock 1060 that I was using for gaming at that time. Power Limit is at +50.
> 
> I don't notice anything special while gaming. The game runs smoothly at 60fps with some odd drops to 40 now and then, but nothing constant or anything that would correlate with the GPU load jumping.
> 
> I run the game on DX11. Need to check if this happends with Mantle too...


Are you using vsync? If yes, and I'm guessing you are, that's normal. If your GPU can push 100FPS at a given point in that game and you force it to render 60, it's not going to sit at 100% GPU usage, so it will lower the clock to save power. Don't worry about it. If you disable vsync, you'll see your clock speed stay at a near-constant 1060MHz.
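A back-of-the-envelope way to see it: if the card could render ~100FPS uncapped but vsync holds it to 60, it only needs ~60% of its throughput, so usage (and clocks) drop. The numbers here are purely illustrative:

```python
def expected_usage(vsync_fps: float, uncapped_fps: float) -> float:
    """Rough fraction of GPU throughput needed under a vsync cap."""
    return min(1.0, vsync_fps / uncapped_fps)

print(expected_usage(60, 100))  # capped well below what the card can do
```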


----------



## tangelo

Quote:


> Originally Posted by *tolis626*
> Are you using Vsync?


Damn I feel stupid. I completely forgot about that. Yeah I have vsync on.

Thanks!


----------



## FooSkiii

Hey, anybody on here play ARK: Survival Evolved?
I've been playing it and it keeps crashing after about an hour of gameplay...
Any ideas?


----------



## tolis626

Quote:


> Originally Posted by *dmcl325i*
> 
> Anyone else running the powercolor r9 390? Wondering what sort of increase in clock speed vs power limit people are using as i cant seem to get stable beyond about +6% clock speed and +3 to +4% power limit. I have got up to about +10% clock speed but cant find a similar sweet spot like i did at +4 and +6% clock speed.
> 
> This is on stock fans, etc.


Why would you stick to a +4% power limit? If you're doing any sort of overclocking, just crank it to +50%. It's perhaps the only setting with no chance of damaging your card in the long run; it just lets the card draw the extra power it needs, and you'll need that to be stable at higher clocks.


----------



## SpecFree

Just got my MSI 390 last week, and during the first night of use I had the fans hitting 60% while gaming.

Yet yesterday I noticed the fans not doing quite the same; it runs a lot cooler than it did just a few days ago. Anyone have any idea what could cause this?


----------



## Gumbi

Quote:


> Originally Posted by *SpecFree*
> 
> Just got my MSI 390 last week, and doing the first night of usage, i had the fans hitting 60% doing gaming.
> 
> Yet yesterday i noticed the fans not doing quite the same today? it runs alot cooler then it did just a few days ago, anyone have any idea on what could cause this?


Cooler ambient temps?

Did you modify the fan curve? Check temps in GPU-Z while gaming.


----------



## Ultra-m-a-n

Quote:


> Originally Posted by *SpecFree*
> 
> Just got my MSI 390 last week, and doing the first night of usage, i had the fans hitting 60% doing gaming.
> 
> Yet yesterday i noticed the fans not doing quite the same today? it runs alot cooler then it did just a few days ago, anyone have any idea on what could cause this?


Cooler ambient temps would do that; I had a similar situation. With the cooler winter temps, I opened my windows and my CPU was sitting at 38C, with my 390 at 65C under gaming load in PlanetSide 2. Needless to say, I was using my computer to blow warm air on me, and I was wearing a sweater


----------



## SpecFree

Quote:


> Originally Posted by *Gumbi*
> 
> Cooler temps?
> 
> 
> 
> 
> 
> 
> 
> Did you modify the fan curve? Check temps in GPUz while gaming.


I'll look at it when I get home from work. It might've just been the ambient temps like Ultra said, but I run a very quiet system and normally the GPU fans are the only thing I can hear; last night, all I could hear was the clicking from the HDD :/

I'll report back when I investigate


----------



## Gumbi

Quote:


> Originally Posted by *SpecFree*
> 
> Ill look at it when i get home from work, it might've just been the ambient temps like ultra said, but i run a very quiet system and normally i pick up on the gpu fans being the only thing i can hear, and yeah last night all i could hear was the clicking from the HDD :/
> 
> Ill report back when i investigate


Was it the same game?


----------



## KNG HOLDY

Hi guys,

I want to buy a 390 and I still don't know which one I should get.

I want to watercool it later, so I guess an XFX or an ASUS.

The "8192MB XFX Radeon R9 390 Double Dissipation Black Edition Aktiv PCIe 3.0 x16" costs ~330€ here in Germany, and the ASUS one ~360€.

Which one would you pick? It's my first AMD product.

edit:// I won't watercool it before 2016; does the XFX one have a zero-fan mode too?


----------



## SpecFree

Quote:


> Originally Posted by *Gumbi*
> 
> Was it the same game?


Indeed it was. I first thought it might be the thermal paste having settled, but I'm unsure now.


----------



## TehMasterSword

Get the XFX, but make sure you don't get the revision 1 version. Only the original card uses the reference PCB compatible with EK 290X water blocks. Rev. 1 does not fit.


----------



## KNG HOLDY

Quote:


> Originally Posted by *TehMasterSword*
> 
> Get the XFX, but make sure you don't get the revision 1 version. Only the original card uses the reference PCB compatible with EK 290X water blocks. Rev. 1 does not fit.


Does the XFX one have the zero-fan mode at idle?
Is the "8192MB XFX Radeon R9 390 Double Dissipation Black Edition Aktiv PCIe 3.0 x16" the right one, or is it the rev. 1? The page I want to buy the GPU from doesn't say anything about the revision.

I can get an "8192MB XFX Radeon R9 390 Double Dissipation Black" for 330€ and an "8192MB XFX Radeon R9 390 Double Dissipation Edition" for 320€.


----------



## TehMasterSword

Quote:


> Originally Posted by *KNG HOLDY*
> 
> does the xfx one own the zerofan mode in idle?
> is "8192MB XFX Radeon R9 390 Double Dissipation Black Edition Aktiv PCIe 3.0 x16" the right one or is it the rev. 1? the page i want to buy the gpu from doesnt tell me anything of rev
> 
> i can get a "8192MB XFX Radeon R9 390 Double Dissipation Black " for 330€ and a "8192MB XFX Radeon R9 390 Double Dissipation Edition " for 320€


I'm pretty confident every 390 has the zero-fan mode.

According to EK, you want the regular, non-revised card. I couldn't find info on it either, so I am going for a PowerColor. Maybe contact the vendor?


----------



## KNG HOLDY

I guess only the "AMD Radeon R9 390 Double Dissipation Core Edition" has the reference design, but I can't find the card in many German shops, lol.


----------



## Zaber123

Quote:


> Originally Posted by *dmcl325i*
> 
> Anyone else running the powercolor r9 390? Wondering what sort of increase in clock speed vs power limit people are using as i cant seem to get stable beyond about +6% clock speed and +3 to +4% power limit. I have got up to about +10% clock speed but cant find a similar sweet spot like i did at +4 and +6% clock speed.
> 
> This is on stock fans, etc.


Just crank that to +50%. All it's doing is allowing your card to draw the power it needs. With my PCS+ 390 I went from 1010/1500 to 1150/1620 with +50% and +100mV; any less voltage was a no-go.

Also, customize your fan curve. Find the highest noise level you're comfortable with and set the curve to ramp up to that around 75 degrees. My preference is to have it really spin up after 80, but the cards can handle more.
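If you're reasoning about (or scripting) a custom curve, Afterburner-style fan curves are just piecewise-linear interpolation between points. A sketch — these curve points are made up for illustration, not a recommendation:

```python
# (temp °C, fan %) points -- hypothetical; tune to your own noise tolerance
CURVE = [(40, 20), (60, 35), (75, 55), (80, 75), (90, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan speed between curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pegged at 100% past the last point

print(fan_percent(75), fan_percent(85))
```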


----------



## caenlen

Two questions.

1. Can I crossfire a 390 in a second slot with my 390x still as primary without any issues? Or do I have to pay extra to get the benefit of Crossfire and get another 390x?

2. Will Crossfire work fine with a Freesync monitor?


----------



## battleaxe

Quote:


> Originally Posted by *caenlen*
> 
> Two questions.
> 
> 1. Can I crossfire a 390 in a second slot with my 390x still as primary without any issues? Or do I have to pay extra to get the benefit of Crossfire and get another 390x?
> 
> 2. Will Crossfire work fine with a Freesync monitor?


Yes, 390/390X works fine.

FreeSync is a non-issue with Crossfire in most cases anyway.


----------



## Sgt Bilko

Quote:


> Originally Posted by *TehMasterSword*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KNG HOLDY*
> 
> does the xfx one own the zerofan mode in idle?
> is "8192MB XFX Radeon R9 390 Double Dissipation Black Edition Aktiv PCIe 3.0 x16" the right one or is it the rev. 1? the page i want to buy the gpu from doesnt tell me anything of rev
> 
> i can get a "8192MB XFX Radeon R9 390 Double Dissipation Black " for 330€ and a "8192MB XFX Radeon R9 390 Double Dissipation Edition " for 320€
> 
> 
> 
> I'm pretty confident every 390 has the zero fan mode.
> 
> According to EK, you want the regular, non Revised card. I, too, couldn't find info on it so I am going for a powercolor. Maybe contact the vendor?
Click to expand...

XFX doesn't have a 0dB mode for the 390 and 390X.


----------



## Sgt Bilko

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *caenlen*
> 
> Two questions.
> 
> 1. Can I crossfire a 390 in a second slot with my 390x still as primary without any issues? Or do I have to pay extra to get the benefit of Crossfire and get another 390x?
> 
> 2. Will Crossfire work fine with a Freesync monitor?
> 
> 
> 
> Yes 390/390x works fine.
> 
> In most cases on Xfire anyway freesync a non-issue.
Click to expand...

Yes.....I've done 295x2 + 290x + 390x Quadfire and i normally run 295x2 + 390x Trifire daily and it works beautifully


----------



## KNG HOLDY

Quote:


> Originally Posted by *Sgt Bilko*
> 
> XFX doesnt have a 0dB mode for the 390 and 390x


:c which R9 390 should I take :C I want to run it with air cooling for a few months and watercool it later
I wanted a Sapphire first (no full-cover waterblocks), then I wanted an XFX, and I couldn't tell which versions support full waterblocks :c

It's a hard road to my first AMD GPU :c


----------



## diggiddi

Quote:


> Originally Posted by *caenlen*
> 
> Two questions.
> 
> 1. Can I crossfire a 390 in a second slot with my 390x still as primary without any issues? Or do I have to pay extra to get the benefit of Crossfire and get another 390x?
> 
> 2. Will Crossfire work fine with a Freesync monitor?


Yes to both questions; you should have no problem. But it might be better to overclock the 390 to match the 390X for the sake of smoothness (someone else can chime in on this).
Run the 390X in Firestrike and overclock the 390 to meet, or get as close as possible to, that score.


----------



## Sgt Bilko

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *caenlen*
> 
> Two questions.
> 
> 1. Can I crossfire a 390 in a second slot with my 390x still as primary without any issues? Or do I have to pay extra to get the benefit of Crossfire and get another 390x?
> 
> 2. Will Crossfire work fine with a Freesync monitor?
> 
> 
> 
> Yes to both questions, you should have no problem but It might be better if you Overclock the 390 to match the 390x for sake of smoothness, someone else can chime in on this
> Run the 390X in Firestrike and Overclock the 390 to meet or get as close to that score as possible
Click to expand...

My 295x2 runs at lower core and memory clocks and it's still smooth.

You don't need to do that anymore unless you're OCD about the numbers.








Quote:


> Originally Posted by *KNG HOLDY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> XFX doesnt have a 0dB mode for the 390 and 390x
> 
> 
> 
> :c which r9 390 should i take :C i want to run it with aircooling for a few month and watercool it later
> wanted a sapphire first (no fullcover waterblocks) then i wanted a xfx and i couldnt tell which versions support full waterblocks :c
> 
> its a hard road to my first amd gpu :c
Click to expand...

Powercolor's PCS+ supports the blocks iirc, and yes, it would be hard to find out which version of the XFX supports them. I might ask XFX, because I'm curious about it myself.


----------



## KNG HOLDY

Quote:


> Originally Posted by *Sgt Bilko*
> 
> My 295x2 runs at a lower core and mem clock speed and it's still smooth.
> 
> You dont need to do that anymore unless you are OCD about the numbers
> 
> 
> 
> 
> 
> 
> 
> 
> Powercolor's PCS+ supports the blocks iirc and yes it would be hard to find out which version of the XFX would support it, i might ask XFX because I'm curious about it myself


ty for your nice help!








iirc?









do u know the idle temp of the PowerColor? could i just turn the fans off overnight if i want to download stuff?


----------



## Geoclock

Should I update to AMD Catalyst 15.10 Beta?
I'm using the previous 15.9.1 beta and it works fine.


----------



## mandrix

Quote:


> Originally Posted by *KNG HOLDY*
> 
> ty for your nice help!
> 
> 
> 
> 
> 
> 
> 
> 
> iirc?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> do u know the idle temp of the powercolour? could i just turn the fans off overnight if i want to download stuff?


My PCS 390X idled at 58-59C until I waterblocked it. That's with a 25C ambient at highest.


----------



## Sgt Bilko

Quote:


> Originally Posted by *KNG HOLDY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> My 295x2 runs at a lower core and mem clock speed and it's still smooth.
> 
> You dont need to do that anymore unless you are OCD about the numbers
> 
> 
> 
> 
> 
> 
> 
> 
> Powercolor's PCS+ supports the blocks iirc and yes it would be hard to find out which version of the XFX would support it, i might ask XFX because I'm curious about it myself
> 
> 
> 
> ty for your nice help!
> 
> 
> 
> 
> 
> 
> 
> 
> iirc?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> do u know the idle temp of the powercolour? could i just turn the fans off overnight if i want to download stuff?
Click to expand...

You're welcome









iirc = if i recall correctly


----------



## KNG HOLDY

could i take the PowerColor 390 and just turn the fans off when i'm idling? or should i just take my GPU out when i download stuff overnight?


----------



## tangelo

Quote:


> Originally Posted by *Geoclock*
> 
> Should i update to AMD Catalyst™ 15.10 Beta ?
> I'm using previous 15.9.1. beta and works fine.


I had major problems getting 15.10b to even boot to desktop. So if you don't have an immediate need to update, wait for the next non-beta IMHO.


----------



## RWGTROLL

Quote:


> Originally Posted by *KNG HOLDY*
> 
> could i take the powercolor 390 and just turn the fans off when im idleing? or should i just take out my gpu when i download stuff over night?


i have a question: why are you worried about it running all night? The fans are going to last a long time if you take care of the card.


----------



## kalidae

Quote:


> Originally Posted by *RWGTROLL*
> 
> i have a question. Why are you worried about it running all night ? the fans are going to last a long time if you take care of the card.


I'm assuming his pc is in his bedroom and he wants silence.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kalidae*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RWGTROLL*
> 
> i have a question. Why are you worried about it running all night ? the fans are going to last a long time if you take care of the card.
> 
> 
> 
> I'm assuming his pc is in his bedroom and he wants silence.
Click to expand...

My thoughts as well.


----------



## Mysticking32

I usually like the noise my pc makes because at night it drowns out other disruptive noises. (live in a dorm) And I absolutely love that it does that. But if you don't have that problem then I can definitely see how it could get annoying.


----------



## KNG HOLDY

yeah, I'm really picky







atm I turn off every fan in my system when I sleep, and I even bought an EVGA 750W G2 just because of its 0dB idle mode









but I don't know how warm an R9 390 from PowerColor can get


----------



## tolis626

Quote:


> Originally Posted by *KNG HOLDY*
> 
> yeah im really picky
> 
> 
> 
> 
> 
> 
> 
> atm i turn off every fan in my system when i sleep and even bourght a evga 750w g2 just because of the idle 0db mode
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but i dont know how warm a r9 390 from powercolour can get


I think that Powercolor's cards have a 0db mode. But even if they don't, making a custom fan profile in AB will probably solve this issue for you. Maybe you'll even be able to get them to stop spinning completely or it will at least be completely silent at low enough speeds. I think you'll be OK.


----------



## fyzzz

Quote:


> Originally Posted by *tolis626*
> 
> I think that Powercolor's cards have a 0db mode. But even if they don't, making a custom fan profile in AB will probably solve this issue for you. Maybe you'll even be able to get them to stop spinning completely or it will at least be completely silent at low enough speeds. I think you'll be OK.


Just look up a review. From TechPowerUp: "Idle fan noise is fantastic as the card completely turns its fans off in idle (up to 60°C)"


----------






## hugoolly

Quote:


> Originally Posted by *TehMasterSword*
> 
> Hey guys! Long time lurker, new joiner!
> 
> 
> 
> http://imgur.com/fTqGBLf
> 
> 
> Upgraded from a Strix 960 (now donated to a friend). Absolutely loving the performance of the card, but a little wary of the temps. 66 on idle, 90 at max load, max settings, 2x AA in Valley Benchmark, 85 in MGS V max settings. I took the card apart and replaced the thermal compound with my own which brought down the idle temps a few degrees, but still hit 90 in Valley. Really wish I could put this bad boy under water =/
> 
> Anyone know of any solutions? From what I've read, ASUS doesn't use a reference board so no full waterblocks are compatible, and an NZXT G10 leaves the VRMs to fend for themselves.


I've been doing a little research on this lately; you can buy small heat sinks that simply stick onto the VRMs with thermal adhesive tape (which usually comes pre-applied). This will let the G10's fan cool the VRMs much more effectively.
http://www.amazon.co.uk/dp/B00GTH9ZK4/ref=wl_it_dp_o_pC_nS_ttl?_encoding=UTF8&colid=US355HYM4RAB&coliid=I2QATO9XZHY4XS


----------



## thanozr

Add me too pls.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Wow, you are right! That would explain all the tearing! Derp
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But seriously, it looks like you are correct, but my games were all breaking 30fps per fraps and i got no tearing.... Could it be because the tv itself has hardware to reduce motioon blur and tearing issues?


Hmm, interesting. I'm not sure, but I always thought 30Hz or 60Hz is roughly equivalent to 30fps or 60fps (plus/minus something) on that screen. Perhaps someone here in the forum could shed some light on this?!









So you're saying you recorded higher FPS than 30 via FRAPS or RivaTuner (or whichever software you're using), although the screen should only be getting 30Hz due to the adapter?
What jumps to my mind, and this is just a reeeeaaalllly wild guess: maybe FRAPS/RivaTuner only records the fps that the card produces, not the fps that actually get shown on the TV? So let's say your card sends 50 FPS to the screen, but due to the adapter only 30 reach the screen; the screen then adjusts the images with its hardware, you see a smooth 30 fps, but FRAPS/RivaTuner counts 50 fps the whole time. Could that be it?

Or it's way simpler: the 3840x2160 @ 30Hz is only a stated minimum from the hardware manufacturer, and in some cases the cables can do more than 30Hz









Probably the second option


----------



## Dundundata

Rivatuner will show whatever the card is producing


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dundundata*
> 
> Rivatuner will show whatever the card is producing


Yep, and I was getting really good results. However, it looks like I was still visually capped at 30FPS, which I admittedly didn't notice, except for some occasional tearing that I had chalked up to the frames that were breaking 60FPS....


----------



## gerpogi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well you'd be wanting these people then:
> 
> @Superjit94, @Bigm, @xhitekredneckx, @BlaXey, @undyingbread, @CerealKillah, @bazookatooths, @FooSkiii, @Darkstalker420, @Weird0ne and @MechaDurka
> 
> And am I seriously the only person in here with a XFX DD 390x?


I have one


----------



## Mister300

I have a XFX DD 390X great value card


----------



## By-Tor

Quote:


> Originally Posted by *hugoolly*
> 
> I've been doing a little research on this lately, you can buy small heat sinks that simply stick into the vrms with thermal adhesive tape (which usually come pre-applied). This will allow the g10s fan to actively cool the vrm a lot more effectively.
> http://www.amazon.co.uk/dp/B00GTH9ZK4/ref=wl_it_dp_o_pC_nS_ttl?_encoding=UTF8&colid=US355HYM4RAB&coliid=I2QATO9XZHY4XS


Used them for years with great results.

7950's


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, and I was getting really good results, however it looks like I was still visually capped at 30FPS, which I admittidly didn't notice, except for some occasional tearing, which I had chocked up to the frames that were breaking 60FPS....


I believe these do 4k @60Hz
http://www.amazon.com/dp/B00Z9QE7OU/ref=wl_it_dp_o_pC_nS_ttl?_encoding=UTF8&colid=94LKZGB2CXBS&coliid=IO3PZ9L37JWSX&psc=1


----------



## RicoDee

OR this one too :

http://www.amazon.com/HDMI-Cable-6ft-Connectors-PlayStation/dp/B00NQ9OQU2/ref=sr_1_2?s=electronics&ie=UTF8&qid=1445400399&sr=1-2&keywords=4K%4060Hz


----------



## RicoDee

Quote:


> Originally Posted by *diggiddi*
> 
> I believe these do 4k @60Hz
> http://www.amazon.com/dp/B00Z9QE7OU/ref=wl_it_dp_o_pC_nS_ttl?_encoding=UTF8&colid=94LKZGB2CXBS&coliid=IO3PZ9L37JWSX&psc=1
> 
> or this one :
> 
> http://www.amazon.com/HDMI-Cable-6ft-Connectors-PlayStation/dp/B00NQ9OQU2/ref=sr_1_2?s=electronics&ie=UTF8&qid=1445400399&sr=1-2&keywords=4K%4060Hz


----------



## navjack27

i'm wondering what the lowest core and memory clocks people can get with their 390X. I run mine all day at 1100/1650 +50, but recently I've wanted to toy around with underclocking, undervolting, and power limiting when I sleep and the computer is really not doing much. So far I've got 700/800 at -50, but I feel weird going lower or messing with the voltage.


----------



## tolis626

Quote:


> Originally Posted by *navjack27*
> 
> i'm wondering what the lowest core and memory clocks people can get with their 390x. i run mine all day at 1100/1650 +50 but as of recently i've wanted to toy around with underclocking and volting and power limiting when i sleep and the computer is really not doing much. so far i got 700/800 -50. but i feel weird going lower or messing with the voltage.


Dude, my "power saving" profile is 1050/1575 at -100mV and my "light gaming" one is 1100/1600 at -50mV. You can do way, way better than what you currently have. Decreasing the frequency at idle doesn't do anything, as it will downclock to 300MHz anyway.
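For a rough sense of why undervolting pays off, dynamic power in CMOS scales roughly with f·V². A back-of-the-envelope comparison, using hypothetical voltages (actual stock VID varies card to card):

```python
# Rough CMOS dynamic-power scaling: P ~ f * V^2. Constant factors cancel
# when comparing two operating points, so relative numbers suffice.
# Voltages below are hypothetical examples, not measured values.
def relative_power(freq_mhz, volts):
    return freq_mhz * volts ** 2

stock = relative_power(1100, 1.20)        # assumed stock operating point
undervolted = relative_power(1050, 1.10)  # roughly a -100mV profile
print(f"undervolted profile draws ~{undervolted / stock:.0%} "
      f"of stock dynamic power")  # → ~80%
```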


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yep, and I was getting really good results, however it looks like I was still visually capped at 30FPS, which I admittidly didn't notice, except for some occasional tearing, which I had chocked up to the frames that were breaking 60FPS....


I am really satisfied with the performance of my MSI R9 390, and even more so since my Skylake upgrade, BUT I don't know why they only provided the card with HDMI 1.4a (which maxes out around 4K @ 30Hz) alongside the DisplayPort 1.2 that can do 4K @ 60Hz. You can get plenty of computer monitors today with 4K and a DisplayPort connection, but finding a 4K TV with DisplayPort is difficult, and if it has one, it is a really expensive TV. I thought back then, "OK, just get an adapter", but looking through Amazon and other sites I have only found 4K @ 30Hz max so far. It would be sufficient, but like you said, Agent, you might get tearing effects.
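As a back-of-the-envelope check on why HDMI 1.4 stalls around 4K @ 30Hz while DP 1.2 manages 4K @ 60Hz, you can compare the raw pixel data rate against the commonly quoted effective link rates (after 8b/10b encoding; blanking overhead is ignored here, so real requirements are a bit higher):

```python
# Uncompressed pixel data rate vs. effective link bandwidth.
# Link figures are the commonly quoted post-encoding rates:
#   HDMI 1.4: 340 MHz TMDS -> ~8.16 Gbps effective
#   DP 1.2:   HBR2 x4 lanes -> ~17.28 Gbps effective
def gbps(width, height, hz, bpp=24):
    """Pixel data rate in Gbps at the given refresh and bits per pixel."""
    return width * height * hz * bpp / 1e9

HDMI_1_4 = 8.16
DP_1_2 = 17.28

for hz in (30, 60):
    need = gbps(3840, 2160, hz)
    print(f"4K @ {hz}Hz needs ~{need:.2f} Gbps "
          f"(fits HDMI 1.4: {need < HDMI_1_4}, fits DP 1.2: {need < DP_1_2})")
```

4K @ 60Hz works out to roughly 11.9 Gbps of pixel data, which overruns HDMI 1.4 but fits comfortably in DP 1.2, matching what everyone in the thread is running into.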
Quote:


> Originally Posted by *diggiddi*
> 
> I believe these do 4k @60Hz
> http://www.amazon.com/dp/B00Z9QE7OU/ref=wl_it_dp_o_pC_nS_ttl?_encoding=UTF8&colid=94LKZGB2CXBS&coliid=IO3PZ9L37JWSX&psc=1


Thx, but that's Mini DisplayPort to HDMI, and as stated in the specs on Amazon: "ULTRA HD: Support HDMI Ultra HD solution up to 4K@30Hz". Again only 30Hz
Quote:


> Originally Posted by *RicoDee*
> 
> OR this one too :
> 
> http://www.amazon.com/HDMI-Cable-6ft-Connectors-PlayStation/dp/B00NQ9OQU2/ref=sr_1_2?s=electronics&ie=UTF8&qid=1445400399&sr=1-2&keywords=4K%4060Hz


Thx as well, but that is HDMI to HDMI and not HDMI to DisplayPort.

Maybe this one??? It doesn't say so in the specs, but customers confirmed 4K @ 60Hz.








http://www.amazon.com/aLLreLi-DisplayPort-Adapter-Gold-Plated-Connector/dp/B00ZA067MA/ref=pd_sim_23_2?ie=UTF8&refRID=159RSWNCD3ZX9BZ36ABG&dpID=314vlE80xWL&dpSrc=sims&preST=_AC_UL160_SR160%2C160_

Edit: OK, I checked on the German Amazon and there the *specs again state 4K only @ 30Hz*
Whhyyyyyy, only why I ask you????


----------



## diggiddi

I think that was the one I was trying to link, but I had several tabs open


----------



## koxy

Guys, any news about a water block for the MSI R9 390? The card is fine when idling, but at full load...


----------



## Sgt Bilko

Quote:


> Originally Posted by *koxy*
> 
> Guys any news about water block for msi r9 390 ? Card is fine when idling but at full load...


No block, and there probably never will be one for it; demand isn't high enough to justify it.


----------



## Agent Smith1984

Sooooo........

Alphacool has an ALMOST FULL COVER BLOCK, for the MSI!!


----------



## Sgt Bilko

A full-cover block implies VRM + VRAM cooling as well; while this one does that, it's with a passive heatsink.

Nothing against the block. I'm actually happy that Alphacool makes them, since they are extremely versatile, but a full-cover water block it is not


----------



## Gumbi

Hey guys, I'm trying to get a feel for Firestrike scores, having not used it before. Ran through the benches on the free version and scored almost 14.8k graphics on my 290X, and just wondering how it compares to the top 390X scores.

This was at a heavy overclock of 1232/1641mhz with 200mv pumping through


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Hey guys, I'm trying to get a feel for Firestrike scores, having not used it before. Ran through the benches on the free version and scored almost 14.8k graphics on my 290X, and just wondering how it compares to the top 390X scores.
> 
> This was at a heavy overclock of 1232/1641mhz with 200mv pumping through


Dead on with a 390X clocked the same..... However, since 390s generally won't break 1200MHz on the core, a 390 has to have its memory clocked at 1780+ to compete, if it even can.


----------



## navjack27

Quote:


> Originally Posted by *tolis626*
> 
> Dude, my "power saving" profile is 1050/1575 at - 100mV and my "light gaming" one is 1100/1600 at - 50mV. You can do way, way better than what you currently have. Decreasing the frequency at idle doesn't do anything, as it will downclock to 300MHz anyway.


nah, not the way I run Afterburner. I turn that off so when I'm actually gaming it keeps me at my high clocks. Turning PowerPlay and all that back on is more of a hassle before I go to bed; I'd get distracted and lose 45 min messing with clock speeds. I disable power saving on everything and set everything manually instead.

EDIT: yeah, I just brought all the sliders down and saved it in profile 5 in Afterburner. 30C idle lol


----------



## tolis626

Quote:


> Originally Posted by *navjack27*
> 
> nah not the way i run afterburner. i turn that off so when i'm actually gaming it keeps me at my high clocks. turning back on powerplay and all that is more of a hassle before i go to bed, i'd get distracted and loose 45min messing with clock speeds. i disable power saving stuff on everything and manually set everything instead.


Ermmm... Sorry, what?









The way I use it, I have a few profiles saved in AB (you can have up to five). The two main ones are the power-saving one and the gaming one (1160/1675MHz at +65/+25mV). I have mapped each profile to a hotkey, Ctrl+Alt+x, where x is 1 through 5. So, say I want to enable profile 1 to leave the PC downloading something overnight: I just press Ctrl+Alt+1. Then I want to play Witcher 3? I press Ctrl+Alt+3 to enable my overclocked profile. Any simpler and I would have hated it.


----------



## Agent Smith1984

Guys,

I have been digging and digging, and apparently, when it comes to getting 60FPS on my practically new 4K TV..... I AM SCREWED WITH AMD!!!

There are NO DP to HDMI 2.0 converters on the market..... I am almost sure of it at this point.

If anyone finds one that is verified for sure as DP 1.2 -> HDMI 2.0 (4K60), please advise and I will create a special section just for you in the OP!!!









If I don't find something by Friday.... I am sadly leaving the club and going from my original choice of (2x MSI 390), to a single GTX 980Ti....

This really grinds my gears


----------



## Ultra-m-a-n

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys,
> 
> I have been digging, and digging, and apparently, when it comes to getting 60FPS on my practically new 4K TV..... I AM SCREWED WITH AMD!!!
> 
> There are NO DP to HDMI 2.0 converters on the market..... I am almost sure of it at this point.
> 
> If anyone finds one that is verified for sure as DP1.4 -> HDMI 2.0 (4k60), please advise and I will create a special section just for you on the OP!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I don't find something by Friday.... I am sadly leaving the club and going from my original choice of (2x MSI 390), to a single GTX 980Ti....
> 
> This really grinds my gears


Well, there is hope. There is this tweet from Club3D, and as of Oct 5 they are still testing, so no word on a release...


__ https://twitter.com/i/web/status/631090223355723777
but at this point it is still vaporware.

The tech to do this exists, but it has been over a year now and nothing has been released.
http://www.paradetech.com/products/displayport-format-converters/ps176/


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys,
> 
> I have been digging, and digging, and apparently, when it comes to getting 60FPS on my practically new 4K TV..... I AM SCREWED WITH AMD!!!
> 
> There are NO DP to HDMI 2.0 converters on the market..... I am almost sure of it at this point.
> 
> If anyone finds one that is verified for sure as DP1.4 -> HDMI 2.0 (4k60), please advise and I will create a special section just for you on the OP!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I don't find something by Friday.... I am sadly leaving the club and going from my original choice of (2x MSI 390), to a single GTX 980Ti....
> 
> This really grinds my gears


Damn, I should have never brought this up. Sry dude.
I also checked the world wide web and a couple of other forums where heaps of people are looking for such an adapter.

Like Ultra-m-a-n already showed, there is the Club3D version planned for the end of this year.

I also found this Bizlink DP 1.2 to HDMI 2.0 adapter, also planned for the end of this year. They already showed it at CES 2015:
http://www.clubic.com/salon-informatique-tic/ces/actu-ces_2015_du_displayport_a_l_hdmi_2_0_chez_bizlink-749111.html

It seems all manufacturers are working on and testing it. The first one to get such an adapter to market will probably strike gold. Let's keep our fingers crossed that we'll see one appear in the next 2 months.


----------



## Scorpion49

Hey guys, I just picked up an MSI 390X. I was hoping for good things, but this card is so freaking loud. It seems to ride along at around 75°C with my case open, but the fans sound like two reference 290Xs; it is impossibly loud for something with such a massive heatsink. Is this normal for these? I haven't used an MSI card since the Twin Frozr III days, and I don't remember them being this loud either.


----------



## tangelo

Quote:


> Originally Posted by *Scorpion49*
> 
> Hey guys, I just picked up an MSI 390X. I was hoping for good things, but this card is so freaking loud. It seems to ride along at around 75*C with my case open, but the fans sound like a reference 290X x2, it is impossibly loud for something with such a massive heatsink. Is this normal for these? I have never used an MSI card since the TFIII days and I don't remember them being this loud either.


Close your case so the airflow will be better. Use a custom fan profile and adjust it to your liking to find the sweet spot for temp/noise.


----------



## Scorpion49

Quote:


> Originally Posted by *tangelo*
> 
> Close your case so the airflow will be better. Use custom fan profile and adjust it to your liking and try to find the sweet spot for temp/noise.


So, what you're saying is that yes, they are naturally noisy. Got it.


----------



## tangelo

Quote:


> Originally Posted by *Scorpion49*
> 
> So, what you're saying is that yes, they are naturally noisy. Got it.


Kinda, yeah. More accurate, imho, would be "they run hot". So your cooling solution, case setup, etc. all contribute to how noisy it will be.
I have an airflow-oriented case, and I can't say my R9 390 is noisy. I can notice the fans when they are at ~80%, and that happens very rarely.

I know the R9 390 is not the 390X, but both of these cards have been described as "noisy" while other users say they are very quiet. So go figure


----------



## Scorpion49

Quote:


> Originally Posted by *tangelo*
> 
> Kinda yeah. More accurate imho would be "they run hot". So depending on your cooling solution, case setup etc. they all contribute to how noisy it will be.
> I have a airflow oriented case, and I can't say my R9 390 is noisy. I can notice the fans when they are at ~80% and that happens very rarely.
> 
> I know R9 390 is not 390X but both of these cards have been described as "noisy" while other users say they are very quiet. So go figure


I've owned dozens of Hawaii cards; I was just shocked that one with a gigantic aftermarket cooler and a fully custom PCB could be as loud as a pair of reference 290X cards in Crossfire. The temps aren't bad and the fan RPM isn't very high either, so I guess the fans are just noisy as all hell, which is slightly disappointing. I was debating between this and the XFX, and I think the XFX would have been the better choice now.


----------



## diggiddi

Quote:


> Originally Posted by *Ultra-m-a-n*
> 
> Well there is hope.. There is this tweet from Club3d, and as of Oct 5 they are still testing, so no word on release...
> 
> 
> __ https://twitter.com/i/web/status/631090223355723777%5B%2FURL
> but at this point it is still vaporware.
> 
> The tech to do this exist, but it has been over a year now and nothing has been released.
> http://www.paradetech.com/products/displayport-format-converters/ps176/


Rip, rip rep!


----------



## GorillaSceptre

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys,
> 
> I have been digging, and digging, and apparently, when it comes to getting 60FPS on my practically new 4K TV..... I AM SCREWED WITH AMD!!!
> 
> There are NO DP to HDMI 2.0 converters on the market..... I am almost sure of it at this point.
> 
> If anyone finds one that is verified for sure as DP1.4 -> HDMI 2.0 (4k60), please advise and I will create a special section just for you on the OP!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I don't find something by Friday.... I am sadly leaving the club and going from my original choice of (2x MSI 390), to a single GTX 980Ti....
> 
> This really grinds my gears


A single 980 Ti won't push 4k/60, so it looks like a catch 22 for you either way.









Chasing that 4k dream is a struggle.. Don't 980's have HDMI 2.0? Maybe going SLI with those would be the best option?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ultra-m-a-n*
> 
> Well there is hope.. There is this tweet from Club3d, and as of Oct 5 they are still testing, so no word on release...
> 
> 
> __ https://twitter.com/i/web/status/631090223355723777%5B%2FURL
> but at this point it is still vaporware.
> 
> The tech to do this exist, but it has been over a year now and nothing has been released.
> http://www.paradetech.com/products/displayport-format-converters/ps176/


DUDE!!!

Thank you so much, I'd have never found it. +Rep on my end too!!

There is now hope!!


----------



## Agent Smith1984

Quote:


> Originally Posted by *GorillaSceptre*
> 
> A single 980 Ti won't push 4k/60, so it looks like a catch 22 for you either way.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Chasing that 4k dream is a struggle.. Don't 980's have HDMI 2.0? Maybe going SLI with those would be the best option?


Honestly, I gamed at 4K on a single 390 using high/very high custom settings, with AWESOME results considering it was one card... I was in the 45-75 FPS range for Crysis 3 and BF4 at true 4096x2160 resolution (I'll dig my results post up if I can find it). Mind you, that was with clocks at 1200/1700 24/7!

So two 390's will handle max-quality 4K gaming for the most part, but that's only because they carry 8GB of VRAM!!

The 980 with its 4GB will do okay in 80% of games, but there will be times when it chokes on VRAM, so the 980Ti seemed most logical with its "close-to-a-295x2" performance and its 6GB frame buffer.....

I will see what the ETA is on the little adapter that Club3D has, and go from there..... but don't be surprised if I start posting 980Ti benchies for you guys in the next few weeks


----------



## Dundundata

That stinks about no HDMI 2.0. Hopefully you can find an adapter, because if I do upgrade my monitor it will most likely be a 4K TV + Crossfired MSIs. Please do keep us updated on this!

As for fan speed, mine is rather quiet, but the fans never get above the low 60% range. I don't think XFX fans are any quieter.


----------



## Ultra-m-a-n

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, I gamed at 4K on a single 390 using high/very high custom settings, with AWESOME results considering it was one card... I was in the 45-75 FPS range for Crysis 3 and BF4 at true 4096x2160 resolution (I'll dig my results post up if I can find it) Mind you that was with clocks at 1200/1700 24/7!
> 
> So, two 390's will handle max quality 4K gaming for the most part, but that's only because they carry 8GB of VRAM!!
> 
> The 980 with it's 4GB will do okay in 80% of games, but there will be times when it chokes on VRAM, so the 980Ti seemed most logical with it's "close-to-a-295x2" performance, and it's 6GB frame buffer.....
> 
> I will see what the ETA is on the little adapter that CLub 3d has, and go from there..... but don't be surprised if I start posting 980Ti benchies for you guys in the next few weeks


Haha, I did a quick search on that stuff after reading your post, as this interests me a little bit too; my parents just bought a no-name 4K TV, and I am not even sure it has HDMI 2.0, so it is currently @ 30Hz.

Anyway, looking at the other tweets, they are saying they want to release the adapter before Xmas. So not quite by Friday









Here's the tweet:
https://pbs.twimg.com/media/CRvjp_uWIAA_SzD.jpg

Annnnnd here's the pic from that tweet


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> Hey guys, I'm trying to get a feel for Firestrike scores, having not used it before. Ran through the benches on the free version and scored almost 14.8k graphics on my 290X, and just wondering how it compares to the top 390X scores.
> 
> This was at a heavy overclock of 1232/1641mhz with 200mv pumping through
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dead on with a 390x clocked the same..... However, since 390's will not break 1200mhz on core, the 390 has to have the memory clocked in the 1780+ to compete, if it even cane.
Click to expand...

I'm testing a 390 atm that breaks 1200 on the core; I even benched it at 1250/1500 last night, actually.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Im testing a 390 atm that breaks 1200 core speed, even benched it at 1250/1500 last night actually.












Please share more!!

XFX I'm guessing? A different one???


----------



## tolis626

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Im testing a 390 atm that breaks 1200 core speed, even benched it at 1250/1500 last night actually.


Could you please test something for me? When you get the time, see if increasing memory clocks affects core stability. It keeps nagging me but I don't have enough time these days to test it out myself thoroughly. You could test like 1250/1650MHz or something like that. I'm curious to see what you'll come up with.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I'm testing a 390 atm that breaks 1200MHz core speed, even benched it at 1250/1500 last night actually.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please share more!!
> 
> XFX I'm guessing? A different one???
Click to expand...

It's a Sapphire Nitro; it acts more like my 290s did than my 390X, though.

I've only done some quick GPUPI and Firestrike runs with it atm though.

Will be testing gaming later on with it.

As for the Club3D adaptor, it shouldn't be too far away, from what I've heard.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I'm testing a 390 atm that breaks 1200MHz core speed, even benched it at 1250/1500 last night actually.
> 
> 
> 
> Could you please test something for me? When you get the time, see if increasing memory clocks affects core stability. It keeps nagging me but I don't have enough time these days to test it out myself thoroughly. You could test like 1250/1650MHz or something like that. I'm curious to see what you'll come up with.
Click to expand...

1250 is a benching speed, it's not properly stable (needs +200mV), but that's what my XFX DD 290's needed for 1250/1500 as well.

But I'll run it through if I get some time (very busy this week)


----------



## Gumbi

How are you cooling her? I imagine the VRMs get hot at that voltage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> How are you cooling her? I imagine the VRMs get hot at that voltage.


Stock cooling 100% fan speed for short bursts, I've got the card on loan for a couple of weeks for a review.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Stock cooling 100% fan speed for short bursts, I've got the card on loan for a couple of weeks for a review.


Sarge, is this a newer model with backplate?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Stock cooling 100% fan speed for short bursts, I've got the card on loan for a couple of weeks for a review.
> 
> 
> 
> Sarge, is this a newer model with backplate?
Click to expand...

No backplate on this one, the ones with them just went on sale here in Aus this week.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> No backplate on this one, the ones with them just went on sale here in Aus this week.


So weird to see a 390 even bench stable at 1250...

I mean, so many cards on the list, and none breaking 1200...

I've had two and they behaved very similarly to each other...

Might have a golden card, as far as 300's are concerned!


----------



## Dundundata

Hmmm I haven't tried breaking 1200, but maybe I should


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> No backplate on this one, the ones with them just went on sale here in Aus this week.
> 
> 
> 
> So weird to see a 390 even bench stable at 1250...
> 
> I mean, so many cards on the list, and none breaking 1200...
> 
> I've had two and they behaved very similarly to each other...
> 
> Might have a golden card, as far as 300's are concerned!
Click to expand...

I did some checking on HWBot and there is another Nitro that is at least bench stable at the same speeds as this one, so it's possible that the Nitro cards are better than most 390's, at least in terms of overclocking.

I can't make any definite conclusions based on one card of course, but maybe a few in here can give it a go?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I did some checking on HWBot and there is another Nitro that is at least bench stable at the same speeds as this one, so it's possible that the Nitro cards are better than most 390's, at least in terms of overclocking.
> 
> I can't make any definite conclusions based on one card of course, but maybe a few in here can give it a go?


I think it's typical Sapphire non-binning to be honest...

So many Tri-X's all over the net that only clock to 1120, and then all these others that do 1250-1300.

Maybe it's the same case with the 390?
I say that 'cause so far the overclocking results and reviews for them have been less than stellar.

On another note...

I need some help regarding clocking my 9590 on the kitty... If you get a chance


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I did some checking on HWBot and there is another Nitro that is at least bench stable at the same speeds as this one, so it's possible that the Nitro cards are better than most 390's, at least in terms of overclocking.
> 
> I can't make any definite conclusions based on one card of course, but maybe a few in here can give it a go?
> 
> 
> 
> I think it's typical Sapphire non-binning to be honest...
> 
> So many Tri-X's all over the net that only clock to 1120, and then all these others that do 1250-1300.
> 
> Maybe it's the same case with the 390?
> I say that 'cause so far the overclocking results and reviews for them have been less than stellar.
> 
> On another note...
> 
> I need some help regarding clocking my 9590 on the kitty... If you get a chance
Click to expand...

Sure, send me a message or ask in the kitty club, and I along with some others can help you out.


----------



## LeSwede

I have some weird performance issues. I've tested pretty much every single thing and I still can't get them over a 2500 score D:

1 card:


2 cards:


----------



## Agent Smith1984

Quote:


> Originally Posted by *LeSwede*
> 
> I have some weird performance issues. I've tested pretty much every single thing and I still can't get them over a 2500 score D:
> 
> 1 card:
> 
> 
> 2 cards:


Can you show a firestrike result also?

Are the two cards clocked the same?

Is the top card throttling under load?

I have found that when using crossfire on these cards, using different clock speeds causes major issues....

And even if they are set at the same clock, if the top card is throttling due to temps, it will cause the same issues.

Can we get a GPU-Z or Afterburner screenshot of what's going on during the run?


----------



## CamsX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I think it's typical Sapphire non-binning to be honest...
> 
> So many Tri-X's all over the net that only clock to 1120, and then all these others that do 1250-1300.
> 
> Maybe it's the same case with the 390?
> I say that 'cause so far the overclocking results and reviews for them have been less than stellar.
> 
> On another note...
> 
> I need some help regarding clocking my 9590 on the kitty... If you get a chance


Hello all, been away enjoying my card.

Is VRM cooling more relevant to core overclocking or memory overclocking? Or both?

1250 on the core for Sapphire seems extremely good. Congrats Sgt. Bilko.


----------



## xboxshqip

Can anyone tell me anything about the ASUS STRIX Radeon R9 390? It's arriving at my door November 3, and the hype is unstoppable.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xboxshqip*
> 
> Can anyone tell me anything about the ASUS STRIX Radeon R9 390? It's arriving at my door November 3, and the hype is unstoppable.


Just had one for two weeks...

Use a custom fan profile, and high flow case fans, and she'll do great!


----------



## TehMasterSword

Crossfire between a 390 and a 390X will work.
Quote:


> Originally Posted by *xboxshqip*
> 
> Can anyone tell me anything about the ASUS STRIX Radeon R9 390? It's arriving at my door November 3, and the hype is unstoppable.


Temps are going to be high. Use a more aggressive fan curve and wear headphones.


----------



## qwaarjet

Quote:


> Originally Posted by *xboxshqip*
> 
> Can anyone tell me anything about the ASUS STRIX Radeon R9 390? It's arriving at my door November 3, and the hype is unstoppable.


I have a 390X STRIX; it was loud and hot, but then I decided to try undervolting, which worked great. It's overclocked and undervolted and my temps are great now. VRM went from 106C to the mid-80s. Core temp is in the mid-70s and the card is much quieter.


----------



## Ha-Nocri

Quote:


> Originally Posted by *qwaarjet*
> 
> I have a 390X STRIX; it was loud and hot, but then I decided to try undervolting, which worked great. It's overclocked and undervolted and my temps are great now. VRM went from 106C to the mid-80s. Core temp is in the mid-70s and the card is much quieter.


Damn, ASUS is terrible with video cards if that is true for every sample of the card. Their 290 is among the worst.


----------



## Agent Smith1984

Telling you guys...

Do NOT crossfire a 390 with a 390X unless you are clocking the 390 about 50MHz higher than the 390X (in which case you have wasted money on the 390X). You want both GPUs to perform identically or you will have stutter...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Telling you guys...
> 
> Do NOT crossfire a 390 with a 390X unless you are clocking the 390 about 50MHz higher than the 390X (in which case you have wasted money on the 390X). You want both GPUs to perform identically or you will have stutter...


Not true. I ran a 290 with my 295X2 and it worked fine for me; the 290 either stayed at its original clocks or clocked itself up to match the 295X2. I've also done it with a 390X + 290X (different speeds) and I plan to do it with a 390X + 390 as well


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not true. I ran a 290 with my 295X2 and it worked fine for me; the 290 either stayed at its original clocks or clocked itself up to match the 295X2. I've also done it with a 390X + 290X (different speeds) and I plan to do it with a 390X + 390 as well


That's crazy 'cause I have gotten terrible results with differing GPU speeds...

First noticed it with my 290's...
If I clocked my better 290 to its max clocks (1175/1600), and the lesser clocking one to its limits (1140/1450), I noticed while monitoring the usage that it would get constant hitches and have terrible stutters, and if I clocked them together, they would perform perfectly. Same thing with the MSI 390 and ASUS 390 together....

And most recently, I tested it with a standard 7970 @ 925/1375 and a DD BE @ 1050/1500, and the performance was terrible until I clocked the standard to match the BE...

It was driving me nuts, and the only remedy was to clock-lock....

So you had NO issues at all?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Not true. I ran a 290 with my 295X2 and it worked fine for me; the 290 either stayed at its original clocks or clocked itself up to match the 295X2. I've also done it with a 390X + 290X (different speeds) and I plan to do it with a 390X + 390 as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's crazy 'cause I have gotten terrible results with differing GPU speeds...
> 
> First noticed it with my 290's...
> If I clocked my better 290 to its max clocks (1175/1600), and the lesser clocking one to its limits (1140/1450), I noticed while monitoring the usage that it would get constant hitches and have terrible stutters, and if I clocked them together, they would perform perfectly. Same thing with the MSI 390 and ASUS 390 together....
> 
> And most recently, I tested it with a standard 7970 @ 925/1375 and a DD BE @ 1050/1500, and the performance was terrible until I clocked the standard to match the BE...
> 
> It was driving me nuts, and the only remedy was to clock-lock....
> 
> So you had NO issues at all?
Click to expand...

When I'm gaming I game at stock clocks (most of the time), but no, I've not noticed any issues on my end with it


----------



## TsukikoChan

I received my Sapphire 390X 2 days ago (I'll put my application forward for the club over the weekend, want some screenshots first) wooo

Strangely the fans are always on :< 20-25% on idle, with temps of ~45 degrees GPU (55 for VRM). I thought these are meant to be zero-fan until about 60 degrees? My case has decent cooling (Fractal R4 with lots of fans) so I'm wondering what's up.. I tried to set a custom fan curve in Sapphire TriXX (so that it was 0 until 50-60 and then up to 30-40%) but it would not apply, it seemed :<
Any ideas guys?

On the plus side, loving the speed of this beast! Coming from a 7870 to this, my games have really gotten so smoooooooooooooooooth :-D


----------



## AverdanOriginal

Quote:


> Originally Posted by *RWGTROLL*
> 
> These are some of the benchmarks for my PowerColor R9 390X Devil. Also these are my average GPU temps while running the benchmarks:
> 
> GPU Temps ( C )
> Minimum, Maximum, Average
> GPU: 35 ,60, 45
> VRM: 36, 64, 56


Hi, you are using Windows 10, correct? I noticed that your score with Mantle resulted in a lower average FPS in BF4 compared to using DX11.
I just switched to Windows 10, and my impression is that Mantle seems to work worse in Dragon Age Inquisition than Mantle on Windows 7: more stuttering and less smooth frames. (I also installed my new Skylake setup, so I am not sure if that could also be the cause. I haven't tested it yet, since neither RivaTuner nor Fraps work with Dragon Age, but I saw you already posted an alternative which should work with Mantle. +Rep)
So has anyone else who switched to Windows 10 noticed a fall in FPS and/or frametimes when using Mantle compared to DX11?


----------



## Agent Smith1984

I am going to lose my mind with my system....

I have a friggin crazy memory leak in Windows 10....

I noticed it yesterday.... it's some sort of driver issue or something....

Anyone have this issue with Windows 10?

I never noticed it when using a 16GB RAM kit, but I switched over to my 8GB Trident-X kit, and now I get constant 66%+ RAM usage as soon as the system boots.

GRRRR


----------



## SpecFree

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am going to lose my mind with my system....
> 
> I have a friggin crazy memory leak in Windows 10....
> 
> I noticed it yesterday.... it's some sort of driver issue or something....
> 
> Anyone have this issue with Windows 10?
> 
> I never noticed it when using a 16GB RAM kit, but I switched over to my 8GB Trident-X kit, and now I get constant 66%+ RAM usage as soon as the system boots.
> 
> GRRRR


Before I upgraded to 16GB I had the same issue; I believe this was the thing I ended up doing:

Win+R,
type regedit,
then go to HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Services\Ndu and change the Start value to 4. This will turn off data usage monitoring, which may be bad for those with data quotas.

It might not be the exact issue you are having, but try it out
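For anyone who'd rather do that tweak from the command line, the equivalent one-liner would be something like the following (run from an elevated Command Prompt; same caveat about it disabling data-usage tracking, and export the key first if you want an easy way back):

```shell
:: Back up the key, then set the Ndu service's Start value to 4 (disabled)
reg export "HKLM\SYSTEM\ControlSet001\Services\Ndu" "%USERPROFILE%\ndu-backup.reg" /y
reg add "HKLM\SYSTEM\ControlSet001\Services\Ndu" /v Start /t REG_DWORD /d 4 /f
```

Reboot afterwards for it to take effect; to undo it, import the backup you exported.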


----------



## Agent Smith1984

Quote:


> Originally Posted by *SpecFree*
> 
> Before I upgraded to 16GB I had the same issue; I believe this was the thing I ended up doing:
> 
> Win+R,
> type regedit,
> then go to HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Services\Ndu and change the Start value to 4. This will turn off data usage monitoring, which may be bad for those with data quotas.
> 
> It might not be the exact issue you are having, but try it out


That was actually the first thing I tried....

No luck!

I am going to look a little deeper into it this evening.

I am also going to add my second trident-x kit to see if going 16GB stops the issue, though if it is using too much RAM, it's still an issue whether I can "afford" the RAM loss or not, so I'd like to get it figured out.


----------



## Gumbi

I've been on Windows 10 since it was released a few months ago. Upgraded from standard Corsair Vengeance 2 × 4GB to 2 × 4GB of Samsung wonder RAM and never had any issues, memory leaks etc.

Have you gone to the Task Manager to see if anything pops out as using too much RAM?

Dealt with a Java memory leak at work the other day which was a nightmare; running any Java executable (including the uninstall!) would result in thousands of Java processes running and maxing out the RAM usage, bringing the computer to a standstill. I know the pain it can cause :/


----------



## flopper

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi, you are using Windows 10, correct? I noticed that your score with Mantle resulted in a lower average FPS in BF4 compared to using DX11.
> I just switched to Windows 10, and my impression is that Mantle seems to work worse in Dragon Age Inquisition than Mantle on Windows 7: more stuttering and less smooth frames. (I also installed my new Skylake setup, so I am not sure if that could also be the cause. I haven't tested it yet, since neither RivaTuner nor Fraps work with Dragon Age, but I saw you already posted an alternative which should work with Mantle. +Rep)
> So has anyone else who switched to Windows 10 noticed a fall in FPS and/or frametimes when using Mantle compared to DX11?


DICE don't update Mantle for Windows anymore.
DX12 is what they are going for.


----------



## Scorpion49

Got another question about this MSI 390X. It is advertised as 1100/6100 but my card out of the box only does 1080/6000. Is the advertised speed an overclock applied by that gaming software that MSI seems very intent on pushing?


----------



## Gumbi

Quote:


> Originally Posted by *Scorpion49*
> 
> Got another question about this MSI 390X. It is advertised as 1100/6100 but my card out of the box only does 1080/6000. Is the advertised speed an overclock applied by that gaming software that MSI seems very intent on pushing?


Yes I think so.


----------



## SpecFree

I've been running FurMark for the last 15 min, and I've been hitting about 84 degrees with case fans running at 75% and the card's fans set to the stock fan curve. Should I be worried, or is this somewhat normal?

MSI R9 390 btw


----------



## DJNIKEL24

Hey guys. Just got my MSI 390 lite edition (don't ask), coming from a Sapphire Vapor-X 7970. I just installed it last night and only played one game, NBA 2K16, but I'm noticing that I'm getting the same FPS as the 7970. I was really hoping for 120 FPS, but just like the 7970 it hits 120 and then dips between 90-105 FPS every time I cross half court. I've tried numerous tweaks, turning settings down to medium and even overclocking just to 1105 for now, but can't seem to bump that FPS up. It's so weird. Any advice, thoughts or ideas?


----------



## Gumbi

Quote:


> Originally Posted by *SpecFree*
> 
> I've been running FurMark for the last 15 min, and I've been hitting about 84 degrees with case fans running at 75% and the card's fans set to the stock fan curve. Should I be worried, or is this somewhat normal?
> 
> MSI R9 390 btw


Don't kill your GPU with Furmark. Your temps seem fine. Run Heaven or Valley Benchmark for a more realistic number.


----------



## Gumbi

Quote:


> Originally Posted by *DJNIKEL24*
> 
> Hey guys. Just got my MSI 390 lite edition (don't ask), coming from a Sapphire Vapor-X 7970. I just installed it last night and only played one game, NBA 2K16, but I'm noticing that I'm getting the same FPS as the 7970. I was really hoping for 120 FPS, but just like the 7970 it hits 120 and then dips between 90-105 FPS every time I cross half court. I've tried numerous tweaks, turning settings down to medium and even overclocking just to 1105 for now, but can't seem to bump that FPS up. It's so weird. Any advice, thoughts or ideas?


You are very possibly CPU limited.


----------



## Agent Smith1984

Just......

Bought......

A fury......


----------



## diggiddi

Let us know how the 4GB HBM works with 4K vs the 390


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just......
> 
> Bought......
> 
> A fury......


Which model? The XFX? It does look nice I must say


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Which model? The XFX? It does look nice I must say


Yep!

$519.99....

Couldn't pass it up!!!


----------



## SpecFree

Quote:


> Originally Posted by *Gumbi*
> 
> Don't kill your GPU with Furmark. Your temps seem fine. Run Heaven or Valley Benchmark for a more realistic number.


Just went through Heaven instead

Scored

FPS:
61.2

Score:
1541

Min FPS:
8.5

Max FPS:
134.0

System

Platform:
Windows NT 6.2 (build 9200) 64bit

CPU model:
Intel(R) Core(TM) i5-6600K CPU @ 3.50GHz (3503MHz) x4

GPU model:
AMD Radeon (TM) R9 390 Series 15.201.1151.0 (4095MB) x1 < 4gb?

Settings

Render:
Direct3D11

Mode:
1920x1080 8xAA fullscreen

Preset:
Custom

Quality:
High

Tessellation:
Extreme


----------



## Agent Smith1984

I will be anxious to compare the numbers to the archive of 4k results I have for my 390 at 1200/1700 clock speeds.....

The Fury is NOT a good clocker, especially with no voltage control yet, and I'm not holding my breath to get it either...

If I'm not impressed with the card, it'll be going back (have heard nightmares about coil whine, which is reason enough to return it anyways)....

If it goes back, I'm buying two 390's for crossfire, and calling it a day for at least 2 years....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I will be anxious to compare the numbers to the archive of 4k results I have for my 390 at 1200/1700 clock speeds.....
> 
> The Fury is NOT a good clocker, especially with no voltage control yet, and I'm not holding my breath to get it either...
> 
> If I'm not impressed with the card, it'll be going back (have heard nightmares about coil whine, which is reason enough to return it anyways)....
> 
> If it goes back, I'm buying two 390's for crossfire, and calling it a day for at least 2 years....


+

yeah, right. Like you'll actually be able to do that. LOL


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> +
> 
> yeah, right. Like you'll actually be able to do that. LOL


Well, I meant for this rig....

(2) 390's is the end of the road for what FX can handle, even at 5GHz.....

Now... Zen? lol, that will see several GPU's I'm sure.


----------



## DJNIKEL24

Quote:


> Originally Posted by *Gumbi*
> 
> You are very possibly CPU limited.


I have an FX-8350. Man I hope that's not it. Maybe it's just the game?


----------



## Gumbi

Quote:


> Originally Posted by *DJNIKEL24*
> 
> I have an FX-8350. Man I hope that's not it. Maybe it's just the game?


It could be the game. If the game relies heavily on the CPU and only uses 2 cores, then an FX chip is really going to limit you. But that only goes for a few games. You should be fine generally speaking.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DJNIKEL24*
> 
> Hey guys. Just got my MSI 390 lite edition (don't ask), coming from a Sapphire Vapor-X 7970. I just installed it last night and only played one game, NBA 2K16, but I'm noticing that I'm getting the same FPS as the 7970. I was really hoping for 120 FPS, but just like the 7970 it hits 120 and then dips between 90-105 FPS every time I cross half court. I've tried numerous tweaks, turning settings down to medium and even overclocking just to 1105 for now, but can't seem to bump that FPS up. It's so weird. Any advice, thoughts or ideas?


Overclock the CPU and the 7970 will probably even stay over 120FPS at 1080P.....

You need around 4.6GHz on the FX-8 to really make it a valid "gamer" at high FPS.


----------



## DJNIKEL24

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Overclock the CPU and the 7970 will probably even stay over 120FPS at 1080P.....
> 
> You need around 4.6GHz on the FX-8 to really make it a valid "gamer" at high FPS.


I can't seem to get over 4.2/4.3.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DJNIKEL24*
> 
> I can't seem to get over 4.2/4.3.


Off topic here, but I can probably give you a lot more help through PM or over in the 83** Vishera club.

It would help to know your entire setup specs first though (ie: board, cooling, PSU, RAM, etc....)


----------



## kizwan

Quote:


> Originally Posted by *SpecFree*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> Don't kill your GPU with Furmark. Your temps seem fine. Run Heaven or Valley Benchmark for a more realistic number.
> 
> 
> 
> Just went through Heaven instead
> 
> Scored
> 
> FPS:
> 61.2
> 
> Score:
> 1541
> 
> Min FPS:
> 8.5
> 
> Max FPS:
> 134.0
> 
> System
> 
> Platform:
> Windows NT 6.2 (build 9200) 64bit
> 
> CPU model:
> Intel(R) Core(TM) i5-6600K CPU @ 3.50GHz (3503MHz) x4
> 
> GPU model:
> AMD Radeon (TM) R9 390 Series 15.201.1151.0 (4095MB) x1 < 4gb?
> 
> Settings
> 
> Render:
> Direct3D11
> 
> Mode:
> 1920x1080 8xAA fullscreen
> 
> Preset
> Custom
> 
> Quality
> High
> 
> Tessellation:
> Extreme
Click to expand...

That seems too low, considering you're running it at Custom-High-Extreme. You should run it at Custom-Ultra-Extreme; much easier to compare, because that's what most people run the Heaven benchmark at. What clocks did you run at? I'm not sure whether Gumbi's CPU-limited point applies to you, but try overclocking your CPU more.


----------



## Gumbi

Quote:


> Originally Posted by *kizwan*
> 
> That seems too low, considering you're running it at Custom-High-Extreme. You should run it at Custom-Ultra-Extreme; much easier to compare, because that's what most people run the Heaven benchmark at. What clocks did you run at? I'm not sure whether Gumbi's CPU-limited point applies to you, but try overclocking your CPU more.


Nope, it's bang on actually. 1080p, 8xAA and max tess should be ~60 FPS depending on your clocks.


----------



## kizwan

Well, I've never run at High quality before, always Ultra. I'm assuming High should score better. Looking at the max FPS, I think my assumption is correct. Even if not, you can tell something went wrong there by looking at the min FPS. I won't call the score bang on though. Still, I assumed he was running at stock clocks when I said the score is too low, in addition to the settings he is running Heaven at.

*Edit:* Ooops! I looked at the wrong screenshots here (samples I have here for comparison). Sorry. So I retract my analysis, but the min FPS does drop lower than it should. Basically something is messing with his bench that may also affect his gaming experience, so he should check that out.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Overclock the CPU and the 7970 will probably even stay over 120FPS at 1080P.....
> 
> *You need around 4.6GHz on the FX-8* to really make it a valid "gamer" at high FPS.


This is what I have always said: under 4.6GHz you are leaving food on the table, unless you are using Mantle


----------



## Gumbi

Quote:


> Originally Posted by *kizwan*
> 
> Well, I've never run at High quality before, always Ultra. I'm assuming High should score better. Looking at the max FPS, I think my assumption is correct. Even if not, you can tell something went wrong there by looking at the min FPS. I won't call the score bang on though. Still, I assumed he was running at stock clocks when I said the score is too low, in addition to the settings he is running Heaven at.
> 
> *Edit:* Ooops! I looked at the wrong screenshots here (samples I have here for comparison). Sorry. So I retract my analysis, but the min FPS does drop lower than it should. Basically something is messing with his bench that may also affect his gaming experience, so he should check that out.


It could be just a stutter. I've seen others with similar drops.


----------



## Dundundata

Quote:


> Originally Posted by *SpecFree*
> 
> I've been running FurMark for the last 15 min, and I've been hitting about 84 degrees with case fans running at 75% and the card's fans set to the stock fan curve. Should I be worried, or is this somewhat normal?
> 
> MSI R9 390 btw


I recommend a custom fan curve for the card. Are you taking a look at your VRM temp too? Heaven is good, but have you tried Firestrike?


----------



## ImJJames

Can someone with a 390X post their Firestrike 1.1 scores? 4770K/4790K CPUs preferred. GPU overclocked preferred.

Just want to do a comparison with my 290X

http://www.3dmark.com/fs/5963038


----------



## Gumbi

Quote:


> Originally Posted by *ImJJames*
> 
> Can someone with a 390X post their Firestrike 1.1 scores? 4770K/4790K CPUs preferred. GPU overclocked preferred.
> 
> Just want to do a comparison with my 290X
> 
> http://www.3dmark.com/fs/5963038


I did a 14776 graphics score with my 290X @ 1232/1641. 4790K @ 4.9GHz.


----------



## Hemanse

Got my MSI R9 390 back today; been wondering if anyone has a good fan curve other than the standard one?
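Since fan curves keep coming up: there's no one right profile, but the shape most people set in Afterburner/TriXX is basically a straight line between two anchor points. A throwaway sketch of that math (every temperature and percentage here is an illustrative placeholder, not a recommendation for this card):

```python
def fan_percent(temp_c, t_low=50, t_high=80, f_low=30, f_high=85):
    """Linear fan curve: f_low% at/below t_low C, f_high% at/above t_high C."""
    if temp_c <= t_low:
        return f_low
    if temp_c >= t_high:
        return f_high
    # Linear interpolation between the two anchor points
    return f_low + (f_high - f_low) * (temp_c - t_low) / (t_high - t_low)

print(fan_percent(40))   # 30
print(fan_percent(65))   # 57.5
print(fan_percent(90))   # 85
```

Pick anchors that keep your core out of the throttle range; the only point is that fan % ramps linearly between the two temps instead of jumping.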


----------



## Thorbrant

Hey guys! I just got a brand new computer with an R9 390 graphics card. I overclocked my i5 6600K to 4.5GHz and now I've tried to give my graphics card a boost. I have a Sapphire R9 390 Nitro, and I can't get it stable with values others got here (however, I didn't read all 360 pages). I tried to go to 1200/1600, but even with +190mV, that didn't work. I got it halfway stable in Heaven at 1150 with +190mV, but I got artifacts in GRID 2. I'm really new to overclocking a graphics card. I use TriXX 5.0. My results in the Heaven benchmark and 3DMark are below what I've seen here.



These are already OC results; stock results score below 1000.

Did I miss something essential here or do I just have bad luck with my graphics card?


----------



## Hemanse

Quote:


> Originally Posted by *Thorbrant*
> 
> Hey guys! I just got a brand new computer with an R9 390 graphics card. I overclocked my i5 6600K to 4.5GHz and now I've tried to give my graphics card a boost. I have a Sapphire R9 390 Nitro, and I can't get it stable with values others got here (however, I didn't read all 360 pages). I tried to go to 1200/1600, but even with +190mV, that didn't work. I got it halfway stable in Heaven at 1150 with +190mV, but I got artifacts in GRID 2. I'm really new to overclocking a graphics card. I use TriXX 5.0. My results in the Heaven benchmark and 3DMark are below what I've seen here.
> 
> These are already OC results; stock results score below 1000.
> 
> Did I miss something essential here or do I just have bad luck with my graphics card?


You can't really just look at everyone else's numbers and copy them; every card has its limits. Some cards OC well, some average, and some badly. It's not called the silicon lottery for nothing.

You pretty much have to bump up the numbers little by little; if you then start seeing artifacts or crashes, you dial the OC back a bit.

JayzTwoCents did a pretty nice video on overclocking a little while back.


----------



## uacGALACTIC

I've noticed that GPU-Z reports fan speeds at 1280 RPM with a temp of 41 degrees Celsius. When I examine the fans, they're not spinning at all. I know this card has a feature that won't spin up the fans unless it's under load, but then shouldn't GPU-Z say "0 RPM"?


----------



## xboxshqip

Thanks for the tips. Yeah, custom fan profile on the card, and case airflow shouldn't be a problem; I have a HAF 932.

Hope all works great


----------



## AverdanOriginal

Quote:


> Originally Posted by *RWGTROLL*
> 
> http://www.overclock.net/t/1530583/fta-frame-time-analyzer-v1-0-1-supports-bf4-civ-be-da-i
> 
> It was the most accurate tool I think. Also fun fact it works for Battlefront.


Hi,

A little bit off topic I guess

I tried yesterday to measure my FPS in Dragon Age Inquisition with this tool, but I can't get it to record a benchmark or make a benchmark log. All I see is that you need a premade benchmark log and can analyse it with this tool. Am I missing something? Help would be highly appreciated, since I want to test whether Mantle on Win 10 performs worse and whether driver 15.10 beta or 15.7 gives me better FPS.
thx.


----------



## Thorbrant

Quote:


> Originally Posted by *Hemanse*
> 
> You can't really just look at everyone else's numbers and copy them. Every card has its limits: some cards OC well, some average, and some badly. It's not called the silicon lottery for nothing.
> 
> 
> 
> 
> 
> 
> 
> You pretty much have to bump the numbers up little by little; if you start seeing artifacts or crashes, dial the OC back a bit.
> 
> 
> 
> 
> 
> JayzTwoCents did a pretty nice video on overclocking a little while back.


Thank you! And I just realized there is a table with results in the first post... I think I've stabilized it at 1100/1600 without giving my graphics card extra voltage.


----------



## Gumbi

Your numbers are quite low; I'm almost sure you're throttling. Have you maxed out the power limit (+50%)? You should do that. Also, be careful going above +100mV, as that card's VRM cooling, while decent, isn't stellar, and the VRMs get quite hot the more voltage you pump into them.

Here's an image of me getting 67 FPS at a very high overclock. You should be around ~60 FPS average, maxing out at 120-130.


----------



## Thorbrant

Quote:


> Originally Posted by *Gumbi*
> 
> Your numbers are quite low; I'm almost sure you're throttling. Have you maxed out the power limit (+50%)? You should do that. Also, be careful going above +100mV, as that card's VRM cooling, while decent, isn't stellar, and the VRMs get quite hot the more voltage you pump into them.
> 
> Here's an image of me getting 67 FPS at a very high overclock. You should be around ~60 FPS average, maxing out at 120-130.


I was wondering about the max FPS as well. At first I thought it was because it ran in windowed mode, but I don't get more in fullscreen. Most of the time the FPS sits steady at 60, and I have the feeling it throttles there, yeah, but I don't know why. Anyway, here's my Firestrike benchmark:



It doesn't look that bad, does it?


----------



## Gumbi

I only started using Firestrike recently, so I don't have a great feel for the numbers, but I can get 14.7k with my max overclock, so 13.4k seems completely fine. Run GPU-Z and note the max core/VRM temps while you're benching: if the core gets into the mid-80s it might start throttling, and if the VRMs get into the 90s/100s it might throttle too.


----------



## Thorbrant

Ok, I had vsync activated. Now the results look much more like the others here:


----------



## Gumbi

HAH, that would explain it. Yes, your results are now perfectly normal. Are you monitoring your temps? What clocks was that at?


----------



## Thorbrant

Temperatures are stable at around 70°C. These results are now at 1100/1600 without any change to the voltage. I'm going to keep it this way; I don't feel very comfortable with higher voltage settings in everyday use.


----------



## Gumbi

Quote:


> Originally Posted by *Thorbrant*
> 
> Temperatures are stable at around 70°C. These results are now at 1100/1600 without any change to the voltage. I'm going to keep it this way; I don't feel very comfortable with higher voltage settings in everyday use.


Sounds good


----------



## Streetdragon

I need a bit of help, guys...
There is something strange with my R9 390 Nitro. Without any overclock, my voltage is at 1086mV. Normal... BUT if I clock the VRAM higher than stock, the voltage jumps to 1258mV. I only changed the memory speed... where does the card get that voltage from? While gaming, the voltage drops to 1120mV or thereabouts.

I have 2 screens!


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just......
> 
> Bought......
> 
> A fury......


As someone who had four different Fury cards and RMA'd or sold all of them for massive coil whine, I sincerely wish you luck. The XFX was by far the worst; you know it's bad when I would rather play at low settings at 1440p on a GTX 950 than have that thing in my rig for one second longer.


----------



## TehMasterSword

It's happening, boys! Amazon finally refunded me for my returned Strix 390 and my new Powercolor 390 will be here Tuesday. Has anyone done a THOROUGH benchmark/testing of this card, Agent Smith style? Would anyone be interested in some pretty graphs?


----------



## Agent Smith1984

Quote:


> Originally Posted by *TehMasterSword*
> 
> It's happening, boys! Amazon finally refunded me for my returned Strix 390 and my new Powercolor 390 will be here Tuesday. Has anyone done a THOROUGH benchmark/testing of this card, Agent Smith style? Would anyone be interested in some pretty graphs?


I'd love to see the powercolor thoroughly tested!


----------



## TehMasterSword

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd love to see the powercolor thoroughly tested!


Alright! This will be the first time I test a card for performance outside of what just benefits me. I'm thinking of testing...

Temps/Speeds/Valley Score/Fire Strike Score/GTA V, Witcher 3 FPS @ stock settings and fan curve

Then doing it all over again with the best OC I can get and a more aggressive curve.

Any relevant info people might be interested in that I missed? All of this will be done w/ 4690k water cooled @ 4.6Ghz in an H440 with fans running at max.


----------



## Agent Smith1984

Quote:


> Originally Posted by *TehMasterSword*
> 
> Alright! This will be the first time I test a card for performance outside of what just benefits me. I'm thinking of testing...
> 
> Temps/Speeds/Valley Score/Fire Strike Score/GTA V, Witcher 3 FPS @ stock settings and fan curve
> 
> Then doing it all over again with the best OC I can get and a more aggressive curve.
> 
> Any relevant info people might be interested in that I missed? All of this will be done w/ 4690k water cooled @ 4.6Ghz in an H440 with fans running at max.


Sounds good to me man!


----------



## tolis626

Hey, I realized I'm not on the owners list, so here's my stuff. 1185MHz has been the maximum stable clock I've managed, but for some reason my highest Firestrike score was taken with my card running at 1170/1700. Oh well...

PS : At 1185MHz with +100mV it overheats within 5-10 minutes of Valley, even with my side panel off and with relatively cool ambient temps. What the hell kind of goo is MSI using as a TIM for these cards? Dayum...


----------



## uacGALACTIC

I've noticed that GPU-Z reports a fan speed of 1280rpm at a temp of 41 degrees Celsius, but when I look at the fans they're not spinning at all. I know this card has a feature that won't spin up the fans unless it's under load, but then shouldn't GPU-Z say "0rpm"?
Any help is much appreciated.

Also, I played Unreal Tournament 4 for a while today with settings all cranked to max (v-sync, AA), looked, and all 3 fans were running. So they do function under load. But why does the software report fan activity when there is none?


----------



## yafatana

I get stutter in games with my Sapphire R9 390 Nitro; please help me fix it.
I built this PC:
i7 6700K
GA-Z170X Gaming 5
Seasonic G650
Sapphire R9 390
I stress tested the video card with the Valley benchmark and get a max of 190 FPS and a min of 39 FPS, and I also got some stutter while running the test.
Now, when I try to play The Settlers 7: Paths to a Kingdom, I get a stutter every 5 seconds for half a second.
I'm running the latest AMD beta driver, 15.10.
How can I fix it, please?

The sound does not stutter.
Also, just to note, I have both the Intel and Sound Blaster X-Fi audio drivers installed.


----------



## Agent Smith1984

Quote:


> Originally Posted by *TehMasterSword*
> 
> Alright! This will be the first time I test a card for performance outside of what just benefits me. I'm thinking of testing...
> 
> Temps/Speeds/Valley Score/Fire Strike Score/GTA V, Witcher 3 FPS @ stock settings and fan curve
> 
> Then doing it all over again with the best OC I can get and a more aggressive curve.
> 
> Any relevant info people might be interested in that I missed? All of this will be done w/ 4690k water cooled @ 4.6Ghz in an H440 with fans running at max.


Sounds good to me man!
Quote:


> Originally Posted by *tolis626*
> 
> 
> 
> Hey, I realized I'm not on the owners list, so here's my stuff. 1185MHz has been the maximum stable clock I've managed, but for some reason my highest Firestrike score was taken with my card running at 1170/1700. Oh well...
> 
> PS : At 1185MHz with +100mV it overheats within 5-10 minutes of Valley, even with my side panel off and with relatively cool ambient temps. What the hell kind of goo is MSI using as a TIM for these cards? Dayum...


I'll get you in Monday bud. My spreadsheet is at work, I normally do updates to it on Monday and Friday.

Thanks for the proper submission!


----------



## Agent Smith1984

Quote:


> Originally Posted by *yafatana*
> 
> I get stutter in games with my Sapphire R9 390 Nitro; please help me fix it.
> I built this PC:
> i7 6700K
> GA-Z170X Gaming 5
> Seasonic G650
> Sapphire R9 390
> I stress tested the video card with the Valley benchmark and get a max of 190 FPS and a min of 39 FPS, and I also got some stutter while running the test.
> Now, when I try to play The Settlers 7: Paths to a Kingdom, I get a stutter every 5 seconds for half a second.
> I'm running the latest AMD beta driver, 15.10.
> How can I fix it, please?
> 
> The sound does not stutter.
> Also, just to note, I have both the Intel and Sound Blaster X-Fi audio drivers installed.


If you could, post a screenshot of the GPU-Z monitor or Afterburner graphs while this is taking place.

Thanks


----------



## yafatana

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you could, post a screenshot of the GPU-Z monitor or Afterburner graphs while this is taking place.
> 
> Thanks


Thank you, man, for helping me.
Here are some screenshots from Afterburner after running The Settlers 7: Paths to a Kingdom for 1 minute of stutter.
Note: I'm using RivaTuner to lock the FPS to 60; these screenshots are with the FPS locked to 60, but even without locking the FPS to 60 I get the same stutter.
Another note: I didn't change any settings in Catalyst Control Center, as I don't know what to change. There's also no overclock, all stock speeds; I don't know anything about overclocking.















CPU temp is normal.

Please let me know if you need any other info.
I'm so sad because of the stutter; please help me fix it. Thanks.


----------



## Thorbrant

If you don't mind, here's my submission for the owners list.



I hope it's all you need. My graphics card runs stable at 1100/1600 without extra voltage.


----------



## goncalo16

Hello guys, I have the ASUS STRIX R9390 DC3OC 8GD5 GAMING, and I'm new to these things. I would like to overclock it.

*These are my specs:*
CPU: Intel Core i7 6700K (4.0GHz) Socket 1151 + Corsair Hydro H100i GTX
GPU: Asus Strix R9390 DC3OC 8GD5 Gaming
PS: XFX TS Series 750w Black Edition 80 Plus Gold Rated
MB: Asus Maximus VIII Ranger + Asus ROG Front Base
RAM: G.Skill Kit 16GB DDR4 3000MHz Ripjaws V Red (2 x 8GB)
HDD: 1x1TB WD Blue, 1x500GB WD Blue, 1x500GB WD Green
SSD: SSD Samsung 850 Pro 256GB SATA III
Case: Cooler Master 690 II Advanced (USB 3)

I used Asus Suite 3 to OC my CPU; it's now at 4.7GHz (the program did everything automatically and stopped at 4.7GHz; I think it's a good way to OC a CPU for noobs like me). Now I would like to OC my graphics card. I have Asus GPU Tweak II and use the OC mode profile, but I know I can go further.

Can you suggest a safe OC for my R9 390?


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you could, post a screen shot of GPU-Z monitor, or Afterburner graphs while this is taking place.
> 
> THanks










Here you go. But what are the lower resolutions you're able to run your 4K TV at? Any 21:9 ratios, e.g. 3440x1440, 3440x1080, or 2560x1080?


----------



## mandrix

I've seen a few people say their card does 1100/1600 without additional voltage. But are you sure? For example, I can set AB for 1100/1600 and the voltage scales up to 1.242 automagically. (according to AB anyway)

Also, my machine kept rebooting when I ran GPU benchmarks. Now, I make and sleeve my own cables throughout my computer, but I noticed that if I used SeaSonic's PCIe cables there were no reboots.
After checking the cables a little closer, I saw that SeaSonic used 16 gauge for the positive wires and 18 gauge for the ground wires, whereas I had always used only 18 gauge.

I since replaced the positive wires with 16 gauge, and guess what? No more reboots.
Weird, huh?


----------



## AverdanOriginal

Quote:


> Originally Posted by *yafatana*
> 
> Thank you, man, for helping me.
> Here are some screenshots from Afterburner after running The Settlers 7: Paths to a Kingdom for 1 minute of stutter.
> Note: I'm using RivaTuner to lock the FPS to 60; these screenshots are with the FPS locked to 60, but even without locking the FPS to 60 I get the same stutter.
> Another note: I didn't change any settings in Catalyst Control Center, as I don't know what to change. There's also no overclock, all stock speeds; I don't know anything about overclocking.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CPU temp is normal.
> 
> Please let me know if you need any other info.
> I'm so sad because of the stutter; please help me fix it. Thanks.


Yeah, you can see that the frame times as well as the FPS are jumping like crazy. Since you haven't overclocked your card, I would first check that you have all the most up-to-date patches for the game. If you have, try going back to the official AMD driver 15.7 instead of 15.10. Many people have reported unstable gaming and memory leakage with the 15.10 beta driver in different games; maybe your game belongs on that list.
If that doesn't solve the problem, just a wild guess, but perhaps turn on the FPS cap in Catalyst (to 90, as that was your max FPS) or turn on v-sync in the game if that is possible.
What Hz can your screen do? What output/input are you using: HDMI, DisplayPort, or maybe just DVI-D?
It seems (though I can't read it exactly) that your card sits at 60 FPS, runs the game at 60, and then falls down to 30 (30-ish, I can't make out the exact number).
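To put a number on frame-time spikes like the ones in those graphs, you can compute a simple stutter metric from an exported frame-time log (e.g. Afterburner's monitoring log). This is just an illustrative sketch; the helper name and the sample data are made up, not taken from any real log:

```python
def stutter_stats(frame_times_ms):
    """Average FPS plus the 99th-percentile frame time (a common stutter metric)."""
    n = len(frame_times_ms)
    avg_ms = sum(frame_times_ms) / n
    worst_ms = sorted(frame_times_ms)[int(n * 0.99)]  # 99th-percentile frame time
    return 1000.0 / avg_ms, worst_ms

# A steady ~60 FPS log vs. one with a half-second hitch every ~5 seconds:
steady = [16.7] * 300
hitchy = [16.7] * 295 + [500.0] * 5
print(stutter_stats(steady))
print(stutter_stats(hitchy))
```

The point is that an average-FPS number alone hides the hitching; the 99th-percentile frame time is what balloons when a game stutters for half a second at a time.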


----------



## AverdanOriginal

Quote:


> Originally Posted by *goncalo16*
> 
> Hello guys, i have ASUS STRIX R9390 DC3OC 8GD5 GAMING, and i'm new in this things. i would like to overclock it.
> 
> *this are my specs:*
> CPU: Intel Core i7 6700K (4.0GHz) Socket 1151 + Corsair Hydro H100i GTX
> GPU: Asus Strix R9390 DC3OC 8GD5 Gaming
> PS: XFX TS Series 750w Black Edition 80 Plus Gold Rated
> MB: Asus Maximus VIII Ranger + Asus ROG Front Base
> RAM: G.Skill Kit 16GB DDR4 3000MHz Ripjaws V Red (2 x 8GB)
> HDD: 1x1TB WD Blue, 1x500GB WD Blue, 1x500GB WD Green
> SSD: SSD Samsung 850 Pro 256GB SATA III
> Case: Cooler Master 690 II Advanced (USB 3)
> 
> I used Asus Suite 3 to OC my CPU, now is at 4.7 Ghz (the program did everything automatically and stoped at 4.7Ghz, i think it's a good way to OC CPU for noobs like me), and i would like to OC my graphic card, i have Asus GPU Tweak II, and i use OC mode profile but i know i can go further..
> 
> Can you suggest me safe OC to my R390?


Hard to say. Every card behaves differently, even more so in different PC setups. Asus cards are normally not the best-binned, but you never know.
First, turn your power target all the way up (it won't hurt). Then there are two ways (or maybe more; I always use these two).
The fast way, though not the most accurate since you won't keep a log of each change, is to overclock on the fly:
1. Open MSI Afterburner (or in your case Asus GPU Tweak II), set the power target to max, open Unigine Heaven in windowed mode (Quality: Ultra, Tessellation: Extreme, Anti-Aliasing: 8x), and let it run (not the benchmark).
2. Now raise the memory clock in small steps (maybe 10MHz, or 25-30MHz at the beginning). After every step, wait a few minutes; if Heaven shows no stuttering, artifacts, or anything unusual, raise it again. Keep doing this until Heaven misbehaves (artifacts, stuttering, weird colours...). Write down that memory clock, then bring the memory clock back to stock and do the same for the GPU clock until you see strange things happening in Heaven. Write down that GPU clock too.
3. Then set the memory and GPU clocks to the written-down values, or maybe 20-50MHz below them, and run the Unigine Heaven benchmark in fullscreen with the settings above. If it passes the benchmark, let it run for another half hour or a full hour, and your overclock can be called stable. I (and I guess some others) also do a run of Firestrike 1.1 and some gaming with tough games like Crysis 3, The Witcher 2 and 3, or Dragon Age: Inquisition, as these games can stress the card pretty well. Just to be on the safe side.

My preferred way:
Make a table in Excel; after every step, run the Heaven and Firestrike benchmarks and note down the scores, temps, and whether you saw any artifacts. That way I get a much better feel for the card and can go back and see where the most stable bench was. It takes way more time, but you have comparable data to fall back on in case something stuffs up later.

Hope that helps.
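The step-and-log approach described above can even be sketched as a tiny script that pre-builds the log table for you. Purely illustrative (the file name, clock range, and step size are made-up examples); the actual clock changes are still done by hand in Afterburner or GPU Tweak II, and you fill in each row after the corresponding Heaven run:

```python
import csv

def clock_steps(start_mhz, limit_mhz, step_mhz):
    """Clock values to test, stepping up from stock toward a chosen limit."""
    return list(range(start_mhz, limit_mhz + 1, step_mhz))

# e.g. core clock from a hypothetical 1050 MHz default, up in 10 MHz steps
steps = clock_steps(1050, 1120, 10)

# One CSV row per step; fill in score/temps/artifacts after each Heaven run.
with open("oc_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["core_mhz", "mem_mhz", "heaven_score", "max_temp_c", "artifacts"])
    for core in steps:
        writer.writerow([core, 1500, "", "", ""])
```

Sorting the finished CSV by score then makes it easy to spot where the gains flatten out or the artifacts start.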


----------



## yafatana

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Yeah, you can see that the frame times as well as the FPS are jumping like crazy. Since you haven't overclocked your card, I would first check that you have all the most up-to-date patches for the game. If you have, try going back to the official AMD driver 15.7 instead of 15.10. Many people have reported unstable gaming and memory leakage with the 15.10 beta driver in different games; maybe your game belongs on that list.
> If that doesn't solve the problem, just a wild guess, but perhaps turn on the FPS cap in Catalyst (to 90, as that was your max FPS) or turn on v-sync in the game if that is possible.
> What Hz can your screen do? What output/input are you using: HDMI, DisplayPort, or maybe just DVI-D?
> It seems (though I can't read it exactly) that your card sits at 60 FPS, runs the game at 60, and then falls down to 30 (30-ish, I can't make out the exact number).


I have a 1680x1050 Xerox screen. I'm using RivaTuner to lock the FPS to 60 in The Settlers 7: Paths to a Kingdom because the game does not support v-sync; without RivaTuner I was getting up to 500 FPS, and still the same stutter. I also read something about enabling Overdrive in Catalyst to +50; will that help? When the stutter happens, the FPS drops at the same time. I played Euro Truck Simulator for 5 minutes and there was no stutter, and I played The Settlers 7 with an old HD 6700 on the same 15.10 driver and there was no stutter. I'm using the DVI port for the R9 390.


----------



## Streetdragon

Quote:


> Originally Posted by *mandrix*
> 
> I've seen a few people say their card does 1100/1600 without additional voltage. But are you sure? For example, I can set AB for 1100/1600 and the voltage scales up to 1.242 automagically. (according to AB anyway)


I have this issue too. Google doesn't help. Hope someone can explain this behavior.


----------



## DJNIKEL24

Quote:


> Originally Posted by *mandrix*
> 
> I've seen a few people say their card does 1100/1600 without additional voltage. But are you sure? For example, I can set AB for 1100/1600 and the voltage scales up to 1.242 automagically. (according to AB anyway)


I get those speeds exactly without touching the voltage and I don't notice AB adjusting them.


----------



## mandrix

Quote:


> Originally Posted by *Streetdragon*
> 
> I have this issue too. Google doesn't help. Hope someone can explain this behavior.


Quote:


> Originally Posted by *DJNIKEL24*
> 
> I get those speeds exactly without touching the voltage and I don't notice AB adjusting them.


Probably depends on the manufacturer BIOS.
Either way it doesn't bother me.


----------



## gupsterg

Quote:


> Originally Posted by *mandrix*
> 
> I've seen a few people say their card does 1100/1600 without additional voltage. But are you sure? For example, I can set AB for 1100/1600 and the voltage scales up to 1.242 automagically. (according to AB anyway)


Quote:


> Originally Posted by *Streetdragon*
> 
> I have this issue too. Google doesn't help. Hope someone can explain this behavior.


Quote:


> Originally Posted by *DJNIKEL24*
> 
> I get those speeds exactly without touching the voltage and I don't notice AB adjusting them.


Quote:


> Originally Posted by *mandrix*
> 
> Probably depends on the manufacturer BIOS.
> Either way it doesn't bother me.


I'll have a go at explaining what I noticed with 290/X ROMs, and I believe it's the same for other Hawaii GPU cards.

Firstly, a ROM can have a GPU core voltage offset built in. If you see a preset GPU core voltage offset in MSI AB, your ROM has one; the Sapphire 390X Nitro ROM, for example, has it. Yet I would think such an owner would still report that they're using stock voltages.

I have also noted that in the Sapphire 390X Nitro ROM, VDDCI (Aux voltage in MSI AB) is 1.05V vs. 1.00V in other 390/X ROMs. Again, an owner would think of, or report, that as stock voltage / not having touched it.

Next we must talk about ASIC profiling (Leakage ID, aka ASIC Quality).
Quote:


> Originally Posted by *The Stilt*
> 
> High ASIC "Quality" (Leakage) = Lower operating voltage, larger current draw, hotter, less energy efficient (due higher losses)
> Low ASIC "Quality" = Higher operating voltage, lower current draw, cooler, more energy efficient


So say one owner's GPU has ASIC quality x; it will have VID x. Another owner's has ASIC quality y; it will have VID y.

So when both of these owners say "I use +25mV for such-and-such an OC of GPU/RAM", they will have differing VID/VDDC.

Next we must talk about EVV (Electronic Variable Voltage). In every stock ROM, only the lowest GPU voltage state (DPM 0) is the same; EVV is used for DPM 1-7 so that, with all this profiling going on, each ROM does not have to be tailored to an exact GPU's properties.

Another thing that happens under EVV is that the default GPU clock affects the VID.

For example, when I flashed my card with a default GPU clock of 1100MHz, I got a different VID than at 1000MHz. Everything in my testing was the same, and the same ROM was used, just with an edited GPU clock.

So as the default GPU clock in a ROM goes higher, the set VID will be lower.

So an owner of a 390X with, say, 1050MHz as the default GPU clock will have a higher VID, and thus should reach a higher OC without touching voltages, than an owner whose card defaults to 1100MHz.
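The two directional effects in that explanation can be captured in a purely illustrative toy model. The coefficients below are invented for illustration only (the real EVV math is not public); only the direction of each effect mirrors the post: higher ASIC quality gives a lower set VID, and a higher default GPU clock in the ROM also gives a lower set VID.

```python
def toy_vid(asic_quality_pct, default_clock_mhz):
    """Made-up numbers; only the sign of each term mirrors the behaviour described."""
    base_v = 1.300  # hypothetical voltage ceiling
    return round(base_v
                 - 0.002 * asic_quality_pct              # higher ASIC quality -> lower VID
                 - 0.0001 * (default_clock_mhz - 1000),  # higher default clock -> lower VID
                 3)

card_a = toy_vid(70, 1000)  # lower ASIC quality card, 1000 MHz default clock
card_b = toy_vid(85, 1000)  # higher ASIC quality, same default clock -> lower VID
card_c = toy_vid(70, 1100)  # same card, ROM edited to a 1100 MHz default -> lower VID
```

So two owners quoting the same "+25mV" offset are really starting from different baselines, which is why copied overclock numbers don't transfer between cards.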


----------



## Levys

Here is my validation http://www.techpowerup.com/gpuz/details.php?id=54nsk

XFX R9 390X DD

3rd party watercooling ( R9 290X kryographics + active backplate )

Hope this works


----------



## Scorpion49

Anyone know when we can get the MSI 390X waterblock from alphacool?


----------



## yafatana

Quote:


> Originally Posted by *yafatana*
> 
> Thank you, man, for helping me.
> Here are some screenshots from Afterburner after running The Settlers 7: Paths to a Kingdom for 1 minute of stutter.
> Note: I'm using RivaTuner to lock the FPS to 60; these screenshots are with the FPS locked to 60, but even without locking the FPS to 60 I get the same stutter.
> Another note: I didn't change any settings in Catalyst Control Center, as I don't know what to change. There's also no overclock, all stock speeds; I don't know anything about overclocking.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CPU temp is normal.
> 
> Please let me know if you need any other info.
> I'm so sad because of the stutter; please help me fix it. Thanks.


To the OP: please give me some advice on my problem. Thanks.


----------



## derkington

Quote:


> Originally Posted by *TsukikoChan*
> 
> i received my sapphire 390x 2 days ago (i'll put my application forward for the club over the weekend, want some screenshots first) wooo
> 
> strangely the fans are always on :< 20-25% on idle, with temps of ~45degrees gpu (55 for vrm). i thought these are meant to be zero-fan based until about 60degrees? my case has decent cooling (fractal r4 with lots of fans) so wondering what's up.. i tried to set a custom fan curve in sapphire trixx (so that it was 0 until 50-60 and then up to 30-40%) but it would not apply it seemed :<
> any ideas guys
> 
> on the plus side, loving the speed of this beast! coming from a 7870 to this my games have really gotten so smoooooooooooooooooth :-D


Got the same card; my fans spin at 28% under light load, with an idle core temp of 49 degrees and VRM at 59. My case fans at their slowest speed (PWM control) are louder, so I don't really notice it. I suppose if the GPU temp dropped to 10C or something like that, the fans on the card would spin even slower.

While digging about (because I was curious as well), I saw that there is a new Sapphire 390X, with a core bump from 1055MHz to 1080MHz and a backplate, according to the site: http://www.sapphiretech.com/catapage_pd.asp?cataid=69&lang=eng
The card is already awesome but this is a nice touch







.


----------



## Rmosher

Here is my validation link: http://www.techpowerup.com/gpuz/details.php?id=dd52x

Sapphire r9 390x Tri-X. Stock/air. Stock voltage.

I love this thing!


----------



## andrewyou

Hi,
I have a quick question about the Sapphire R9 390X.
Is my PSU OK for this card, or do I need to get another one ASAP?

My rig :
I5 2500k stock
1xSATA HDD
2x4 GB RAM
PSU : Seasonic S12II-620 Bronze 620W

Thank you !


----------



## Gumbi

Quote:


> Originally Posted by *andrewyou*
> 
> hi
> I have a quick question about the sapphire r9 390x.
> Is my PSU ok for this card or do i need to get another one asap ?
> 
> My rig :
> I5 2500k stock
> 1xSATA HDD
> 2x4 GB RAM
> PSU : Seasonic S12II-620 Bronze 620W
> 
> Thank you !


It's fine.


----------



## diggiddi

Quote:


> Originally Posted by *derkington*
> 
> While digging about (because I was curious as well), I saw that there is a new Sapphire 390X, with a core bump from 1055MHz to 1080MHz and a backplate, according to the site: http://www.sapphiretech.com/catapage_pd.asp?cataid=69&lang=eng
> The card is already awesome but this is a nice touch
> 
> 
> 
> 
> 
> 
> 
> .


This is my take on it: they've had these cards for a while now, but they needed stock of the original cards to dwindle before releasing the new ones into the wild, or else everyone would buy these and leave the older ones sitting on retailers' shelves. That's my story and I'm sticking to it.


----------



## Rmosher

I have an EVGA G2 1000 watt PSU. Sapphire recommends a 750 watt minimum. If I were you I would definitely upgrade to a 750 minimum. Just make sure you get a quality power supply. Don't go with a crappy brand. I learned this the hard way!


----------



## derkington

Quote:


> Originally Posted by *andrewyou*
> 
> hi
> I have a quick question about the sapphire r9 390x.
> Is my PSU ok for this card or do i need to get another one asap ?
> 
> My rig :
> I5 2500k stock
> 1xSATA HDD
> 2x4 GB RAM
> PSU : Seasonic S12II-620 Bronze 620W
> 
> Thank you !


I am running this card on a Seasonic M12II-620 EVO 620W. My rig is as follows:

Core i5 4690K
Asus Z97-K MB
Kingston HyperX 8GB
Corsair Hydro H75
2 x 140mm Phanteks PH-F140MP case fans
1 x Akasa Viper High Performance 140mm
Samsung SSD 850 EVO 250GB
SAMSUNG HD103UJ 1TB
ST3500418AS 500GB

I always crank the graphics settings up to max (Crysis 3, for example) and have never had any power issues.


----------



## bazookatooths

So I've been running VSR at 1440p on my 1080p monitor in various games, and this video card is really making me happy!








I also like to watch higher-res videos at the 1800p setting!


----------



## Klutz0

Hey guys, I recently put together a PC with a Gigabyte *G1 Gaming R9 390*.

I just noticed this in the OP:
Quote:


> Gigabyte = No waterblocks.


Why not? Will there ever be any?
If not, and I wanted to put in a custom water loop, what would my options be?

Also, this is my first time trying out overclocking of any kind and was hoping to get some feedback!

I used AMD CCC's OverDrive to overclock and tried to find a comprehensive guide of some sort but didn't find much and have some questions:

Power limit setting: what does this do? Should I just push it to +50% right off the bat?
Memory clock: whenever I tried to raise this even a little, I seemed to get artifacts and screen tearing in 3DMark... is that normal?
After some testing and trying things out, I ended up with a 40% power limit increase and a 7% GPU clock increase, with the memory staying at 1500MHz. Is that reasonable?


Spoiler: AMD OverDrive settings







Can someone validate that the 3DMark results are more or less what they should be?
3DMark results comparison before (10499) and after (11006) overclocking.
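For what it's worth, the gain in that comparison roughly tracks the clock bump. A quick back-of-the-envelope check, using the two scores quoted above:

```python
# Graphics score before/after the ~7% core clock increase reported above.
before, after = 10499, 11006
gain = (after - before) / before
print(f"score gain: {gain:.1%}")
```

A score gain a bit under the clock gain is normal, since the benchmark is not purely core-clock-bound (especially with the memory left at stock).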


----------



## diggiddi

use Firestrike instead
http://www.overclock.net/t/633816/how-to-overclock-your-amd-ati-gpu


----------



## Klutz0

Quote:


> Originally Posted by *diggiddi*
> 
> use Firestrike instead
> http://www.overclock.net/t/633816/how-to-overclock-your-amd-ati-gpu


You mean Afterburner?


----------



## diggiddi

Quote:


> Originally Posted by *Klutz0*
> 
> You mean Afterburner?


I mean use it instead of GPU Tool for stress testing; monitor with AB, TriXX, etc.


----------



## TsukikoChan

Quote:


> Originally Posted by *derkington*
> 
> Got the same card; my fans spin at 28% under light load, with an idle core temp of 49 degrees and VRM at 59. My case fans at their slowest speed (PWM control) are louder, so I do not really notice it. I suppose if the GPU temp dropped to 10C or something like that, the fans on the card would spin even slower.
> 
> While digging about (because I was curious as well), I saw that there is a new Sapphire 390X, with a core bump from 1055MHz to 1080MHz and a backplate, according to the site: http://www.sapphiretech.com/catapage_pd.asp?cataid=69&lang=eng
> The card is already awesome but this is a nice touch
> 
> 
> 
> 
> 
> 
> 
> .


There's a new version of mine already? *flips table* *puts table back upright* After I just got it ;_;
It looks as though it's just a backplate change and a bit of an OC applied. I can live with that, especially as they'd probably charge an extra 20-40 quid for that backplate.

Yup, same temps I get on my card XD


----------



## Gumbi

Quote:


> Originally Posted by *Rmosher*
> 
> I have an EVGA G2 1000 watt PSU. Sapphire recommends a 750 watt minimum. If I were you I would definitely upgrade to a 750 minimum. Just make sure you get a quality power supply. Don't go with a crappy brand. I learned this the hard way!


This is nonsense. 620 watts is perfect for an Intel + 390(x) build.


----------



## Stige

Quote:


> Originally Posted by *Gumbi*
> 
> This is nonsense. 620 watts is perfect for an Intel + 390(x) build.


This man is correct; there's zero reason to go overkill with 750W for any single-GPU setup.

A good 600W should be more than fine for any single-GPU setup.

Heck, I ran Crossfire HD 7950s on a crappy 620W Antec with a 5GHz 2500K without any issues.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> This man is correct, zero reason to go overkill 750W for any single GPU setup.
> 
> A good 600W should work more than fine for any single GPU setup.
> 
> 620W Antec crap and 5GHz 2500K without any issues.


I have a 290 on that exact PSU. It's fine.


----------



## XxxxVulcanxxxX

Hi, I'm happy to be part of the 390X owners group; I have the MSI Gaming 8G version. I've had the card a few weeks now and I'm just starting to push its overclock.

I am 100% stable in all games at 1100/1650 at stock voltages; any more and I get issues, so it looks like I'm going to start on the voltage sliders next.

Also, any idea why I still show up in 3DMark/Firestrike as a 290? It's not a major concern; it's just annoying that when I'm doing comparisons I get the poor man's version of this card, lol.


----------



## Levys

Quote:


> Originally Posted by *andrewyou*
> 
> hi
> I have a quick question about the sapphire r9 390x.
> Is my PSU ok for this card or do i need to get another one asap ?
> 
> My rig :
> I5 2500k stock
> 1xSATA HDD
> 2x4 GB RAM
> PSU : Seasonic S12II-620 Bronze 620W


To give you an idea:

I'm running an FX-8350 + R9 390X + 2 sticks of 8GB + a complete XSPC water cooling loop on a Crosshair V Formula-Z mobo + 4x 120mm, 3x 140mm, and 3x 200mm fans + a fan controller and a bunch of lights, all on an old Corsair HX 650W PSU. It runs fine at stock as well as overclocked to a 5.0GHz CPU and 1200/1600 GPU.

So don't worry; the proposed 750W is somewhat overrated.


----------



## Rmosher

I just tend to stick to what the manufacturer recommends. And I guess I like overkill lol


----------



## mandrix

My 390X will do higher than 1160/1610 with only power adjustments....but after lots of testing 1160/1610 gives best scores and it automagically sets voltage as high as 1.242 peak without me touching it.


----------



## DJNIKEL24

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You need around 4.6GHz on the FX-8 to really make it a valid "gamer" at high FPS.


I'm not a pro at overclocking, but I just installed a water cooler for my FX-8350 and AMD OverDrive went from 4.3 previously to 6.2 after water. Here's my Heaven benchmark photo that I took with my phone. Can you guys confirm that my CPU speed is 6GHz? I'm not sure if that's what it means, because other forums are saying it's basically impossible to get 6 on this CPU. Also, the 390 is OC'd just over 1100/6000 without major voltage, since I'm still learning. Lol
Edit for another question: are there any disadvantages to putting the 390 fans at 100%? I don't mind noise since I mostly have headphones on. I set them to 100 yesterday and the temp difference was more than 20c; it usually maxes at 70c, but with the fans all the way up it was maxing just under 50c. Thanks again, guys.


----------



## diggiddi

Quote:


> Originally Posted by *DJNIKEL24*
> 
> 
> I'm not a pro at overclocking but i just installed a watercooler for my fx8350 and amd overdrive went from 4.3 previously to 6.2 after water. Here's my heaven benchmark photo that I took with my phone. Can you guys confirm that my cpu speed is 6ghz? I'm not sure if that's what it means cause other forums are saying that it's basically impossible to get 6 on this cpu. Also the 390 is oc'd just over 1100/6000 without major voltage since I'm still learning. Lol
> Edit for another question. Are there any disadvantages to putting the 390 fans at 100%? I don't mind noise since I have headphones on mostly. I set them at 100 yesterday and the temp difference was more than 20c. Usually maxes at 70c but with fans up all the way was maxing just under 50c. Thanks again guys.


Yeah, that 6.2GHz reading is most likely bogus; download HWiNFO64 to monitor your system.
Running fans at 100% will just wear them out quicker, but the GPU runs cooler, as you noticed. It's entirely up to you what you prefer.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DJNIKEL24*
> 
> 
> I'm not a pro at overclocking but i just installed a watercooler for my fx8350 and amd overdrive went from 4.3 previously to 6.2 after water. Here's my heaven benchmark photo that I took with my phone. Can you guys confirm that my cpu speed is 6ghz? I'm not sure if that's what it means cause other forums are saying that it's basically impossible to get 6 on this cpu. Also the 390 is oc'd just over 1100/6000 without major voltage since I'm still learning. Lol
> Edit for another question. Are there any disadvantages to putting the 390 fans at 100%? I don't mind noise since I have headphones on mostly. I set them at 100 yesterday and the temp difference was more than 20c. Usually maxes at 70c but with fans up all the way was maxing just under 50c. Thanks again guys.


Run firestrike and post up, I'll get you pointed in the right direction bud!


----------



## Rmosher

Ok guys, I busted out my watt meter and started running some tests to see what I was pulling from the wall and made some interesting discoveries. First off a brief overview of my system specs:

FX 8350 OCed to 4.5 with a vcore of 1.4 volts.
Six case fans (three intake and three exhaust).
Two CPU fans in push/pull.
Toshiba HDD+LG optical drive+Multi card reader.
Two sticks of 8 GB Crucial RAM.
And of course my Sapphire R9 390X OCed to 1100 core and 1750 VRAM with stock voltage. Here are the results:

Idle on desktop- *191-206* watts
Cinebench R15- *328* watts
Prime95 blend- *355* watts
Unigine Heaven Extreme HD- *425* watts
Unigine Valley Extreme HD- *447* watts
Firestrike- *447* watts
Tomb Raider benchmark (max settings, FXAA, TressFX ON)- *440* watts
Assetto Corsa benchmark (max settings, 2x aa)- *476* watts

And now for the insane metric!

Prime95blend+Furmark 1080p preset+Valley Extreme HD preset+Cinebench R15 all at the same time!- *660 watts*.

Surprised my rig didn't freeze up and crash running all that at the same time lol

So in conclusion you _can_ run a 390X with a *high quality* 620 watt PSU. I can't stress enough the importance of a good, high-quality unit. I think everyone would agree you should always get a top-tier PSU regardless of wattage. Hope you guys find this helpful....
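For anyone translating wall readings like these into a PSU choice, here's a minimal sketch of the arithmetic (Python; the numbers are purely illustrative: the ~85% efficiency figure is an assumption typical of a decent 80 PLUS unit, not a measurement of any PSU in this thread):

```python
# Rough arithmetic behind PSU sizing from wall-socket readings.
# ASSUMPTIONS (illustrative, not measured from this thread):
#   - ~85% efficiency, typical of a decent 80 PLUS unit under load
#   - a Kill-A-Watt reads AC draw at the wall; the PSU delivers less as DC

def dc_load_from_wall(wall_watts, efficiency=0.85):
    """Convert an AC wall reading into the DC load the PSU actually delivers."""
    return wall_watts * efficiency

def utilization(dc_watts, psu_rating):
    """Fraction of the PSU's rated capacity in use at a given DC load."""
    return dc_watts / psu_rating

# Worst-case synthetic reading from the post above: 660 W at the wall.
dc = dc_load_from_wall(660)
print(round(dc))                       # 561 W of actual DC load
print(round(utilization(dc, 620), 2))  # 0.9 -> ~90% of a 620 W unit
```

Roughly 90% sustained utilization under a synthetic worst case that no game reaches is tight but workable, which lines up with the conclusion above; a cheap unit with an optimistic label would not have that margin.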


----------



## Klutz0

Can I join the club?










Manufacturer: Gigabyte
Series: G1 Gaming 390
Cooling: Stock/Air
Stock Clocks: 1025/1500
OC Clocks: 1100/1600
Voltage Core/Aux: -
GPU-Z link: http://www.techpowerup.com/gpuz/details.php?id=88r45


----------



## Rmosher

Quote:


> Originally Posted by *Klutz0*
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Manufacturer: Gigabyte
> Series: G1 Gaming 390
> Cooling: Stock/Air
> Stock Clocks: 1025/1500
> OC Clocks: 1100/1600
> *Voltage Core/Aux*: -
> GPU-Z link: http://www.techpowerup.com/gpuz/details.php?id=88r45


I'm surprised I didn't have the AUX voltage slider in AB for my Sapphire. Is it just the MSI cards that have that?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Rmosher*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Klutz0*
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Manufacturer: Gigabyte
> Series: G1 Gaming 390
> Cooling: Stock/Air
> Stock Clocks: 1025/1500
> OC Clocks: 1100/1600
> *Voltage Core/Aux*: -
> GPU-Z link: http://www.techpowerup.com/gpuz/details.php?id=88r45
> 
> 
> 
> 
> I'm surprised I didn't have the AUX voltage slider in AB for my Sapphire. Is it just the MSI cards that have that?

All Hawaii/Grenada cards have it in there.

btw, i just hooked up my Kill-a-Watt and I'm at 547w from the wall in Unigine Valley with an R9 390 at stock









Sleeping Dogs Benchmark: 608w from the wall


----------



## Rmosher

Quote:


> Originally Posted by *Sgt Bilko*
> 
> All Hawaii/Grenada cards have it in there.
> 
> btw, i just hooked up my Kill-a-Watt and I'm at 547w from the wall in Unigine Valley with an R9 390 at stock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sleeping Dogs Benchmark: 608w from the wall


Wow! What are you running for specs?! lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *Rmosher*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> All Hawaii/Grenada cards have it in there.
> 
> btw, i just hooked up my Kill-a-Watt and I'm at 547w from the wall in Unigine Valley with an R9 390 at stock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sleeping Dogs Benchmark: 608w from the wall
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow! What are you running for specs?! lol

It's my sig rig but I'll list it anyway:

FX-9590 @ 5.0 / 1.5v
Asus Sabertooth 990FX R2
16GB (2x8GB) G.SKILL TridentX 2400Mhz 10-12-12-31
GPU changes but in this case it's a Sapphire R9 390 Nitro
XSPC Raystorm CPU block with a 750 v4 bay res/pump
5 x Noctua iPPC NF-F12's @ 1800rpm (Bitfenix Recon Fan Controller)
Samsung 840 Evo 250GB (Main Boot)
Samsung 850 Evo 500GB (Game Drive)
Seagate 2TB SSHD (Other Game Drive)
Seagate 4TB HDD (Storage Drive)
Corsair AX1200i PSU

That's about it......


----------



## Rmosher

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's my sig rig but I'll list it anyway:
> 
> FX-9590 @ 5.0 / 1.5v
> Asus Sabertooth 990FX R2
> 16GB (2x8GB) G.SKILL TridentX 2400Mhz 10-12-12-31
> GPU changes but in this case it's a Sapphire R9 390 Nitro
> XSPC Raystorm CPU block with a 750 v4 bay res/pump
> 5 x Noctua iPPC NF-F12's @ 1800rpm (Bitfenix Recon Fan Controller)
> Samsung 840 Evo 250GB (Main Boot)
> Samsung 850 Evo 500GB (Game Drive)
> Seagate 2TB SSHD (Other Game Drive)
> Seagate 4TB HDD (Storage Drive)
> Corsair AX1200i PSU
> 
> That's about it......


Ah ha, I see why you're pulling that much more than me: the 9590 and the pumps, etc. Actually that's not too bad all considered. BTW, I just noticed that you have to *click* the arrow next to core voltage. I thought it was supposed to scroll down; I noticed it by accident. *a thousand face palms*


----------



## DJNIKEL24

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Run firestrike and post up, I'll get you pointed in the right direction bud!


Ok, thanks, I will do it tonight. Meanwhile, I found the problem was AMD OverDrive inaccurately reading my speeds. I deleted it, overclocked via my BIOS instead, and reached 5GHz on my 8350 with small voltage adjustments. It seems to be running stable. Also, the 390 OC'd nicely after that at 1160/6350 and Heaven improved. I will run Firestrike, see what that's about, and report back. My newest Heaven photo is attached.


----------



## Klutz0

Quote:


> Originally Posted by *Rmosher*
> 
> I'm surprised I didn't have the AUX voltage slider in AB for my Sapphire. Is it just the MSI cards that have that?


I'm afraid I'm not sure what you're referring to...

The Gigabyte card I have is voltage locked. I didn't have access to any voltage sliders, AUX or otherwise.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DJNIKEL24*
> 
> 
> Ok thanks I will do it tonight. Meanwhile I found the problem was amd overdrive inaccurately reading my speeds. I deleted it and overclocked via my bios and reached 5ghz on my 8350 with little voltage adjustments. Seems to be running stable. Also the 390 oc'd nicely after that at 1160/6350 and heaven improved. I will run firestrike and see what that's about and report back. This is my newest heaven photo attached


Even getting to 5GHz is like pulling teeth on the 8350 without the right board and cooling...

You need to run IBT AVX and verify it is not overheating or throttling (or both)... and also verify that it is actually stable.


----------



## DJNIKEL24

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Even geting to 5GHz is like pulling teeth on the 8350 without the right board and cooling...
> 
> You need to run IBT AVX and verify it is not overheating or throttling, or even both.... and also verify that it is actually stable.


Ok. I just ran Prime95 for about 35 minutes. Temps didn't go higher than 65 on the CPU. I will look into those tonight.


----------



## Blackcurrent

I'm thinking of purchasing an AMD card, specifically the Sapphire 390. As a first-time AMD user, I'm scared about driver stability, having heard horror stories about their past drivers. Should I be concerned? Is it a valid purchase over the 970?


----------



## Scorpion49

Quote:


> Originally Posted by *Blackcurrent*
> 
> I'm thinking of purchasing a AMD card specifially the Sapphire 390. First time AMD user, I'm scared about driver stability having heard horror story about their past drivers. Should I be concerned? Is the purchase valid over the 970.


Both sides always have had and always will have small driver issues depending on your setup and the games you play. No use worrying about it unless you find mention of problems with what you specifically want to do. Overall, except for power consumption a 390 is a stronger card than a 970.


----------



## Hemanse

I'm guessing running my card at 1150/1525 with no added voltage at 78-80 degrees full load is okay, right? With the side of my case off it drops to around 73-74 at full load; probably just very poor airflow I need to do something about soon.


----------



## derkington

Quote:


> Originally Posted by *Blackcurrent*
> 
> I'm thinking of purchasing a AMD card specifially the Sapphire 390. First time AMD user, I'm scared about driver stability having heard horror story about their past drivers. Should I be concerned? Is the purchase valid over the 970.


You may have read this already; in my opinion this site was quite harsh when the 300 series came out. However, I think this is a good review and a true picture of how good the 390 is: http://www.tomshardware.com/reviews/sapphire-nitro-r9-390-8g-d5,4245.html - I have a 390X and drivers were never really an issue for me; in fact I have been on the last two betas.


----------



## diggiddi

Quote:


> Originally Posted by *Hemanse*
> 
> Im guessing running my card at 1150 / 1525 with no added voltage at 78-80 degrees full load is okay right? With the side of my case off it drops to around 73-74 at full load, probably just very poor airflow i need to do something about soon.


That sounds good to me; my second 290X needs all the volts it can get to do that. What is your fan curve like? Or is it stock?


----------



## Hemanse

Quote:


> Originally Posted by *diggiddi*
> 
> That sounds good to me, my second 290X needs all the volts it can get to do that, what is your fan curve like? or is it stock?


I changed the curve a bit. I have never really set a fan curve, always just used the default one, but I wanted to run with a bit of fan speed instead of the Zero Frozr thing.



This is how mine looks at the moment.


----------



## diggiddi

That's quite a gentle curve, which explains your temps. I have mine ramp up to 100% at 75°C.
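As a rough illustration of what a custom curve does, here's a small piecewise-linear interpolation sketch (Python). The curve points are made up for the example, loosely echoing the "100% at 75°C" idea; they are not anyone's actual Afterburner profile from this thread.

```python
# Piecewise-linear fan curve sketch, like the graphs set in Afterburner.
# The points below are ILLUSTRATIVE ONLY, not a real profile.

CURVE = [(30, 20), (50, 35), (65, 60), (75, 100)]  # (temp in C, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Interpolate fan speed between curve points; clamp outside the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # 80.0 -> halfway between the 65C and 75C points
print(fan_percent(80))  # 100  -> clamped at the curve's top point
```

A gentle curve keeps the early segments shallow (quiet but warmer); an aggressive one steepens the last segment so the fans ramp hard before the card reaches its thermal limit.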


----------



## Rob27shred

I just realized I never got my name on the official 390X/390 owners club. Here's my pics, it's an XFX DD R9 390 core edition. Please add me when you get a chance.


----------



## battleaxe

Quote:


> Originally Posted by *Rob27shred*
> 
> I just realized I never got my name on the official 390X/390 owners club. Here's my pics, please add me when you get a chance.


Let us know how this card does. I've had my eye on them as well as the Nitro's.


----------



## Rmosher

Quote:


> Originally Posted by *Hemanse*
> 
> I changed the curve a bit, i have never really set a fan curve, always just used the default one, but i wanted to run with a bit of fan speed instead of the zero frozr thing.
> 
> 
> 
> This is how mine looks at the moment.


Here is the fan curve I use in AB for my Sapphire 390X:


I see a max temp of about 62c under gaming and about 64-65c under stress testing. What kind of case do you have? Might be room for improvement in there.


----------



## Hemanse

Quote:


> Originally Posted by *Rmosher*
> 
> Here is the fan curve I use in AB for my Sapphire 390X:
> 
> 
> I see a max temp of about 62c under gaming and about 64-65c under stress testing. What kind of case do you have? Might be room for improvement in there.


That curve is kinda aggressive; I like my card pretty quiet. But I have a CM Trooper case, which is basically a piece of garbage, and I'm gonna replace it soon: you can't pull air in from the front because of the way the drive cages are located, so right now my airflow is rather lackluster to say the least.

Your temps are pretty damn crazy for a 390X, but I guess they would get lower with those fan speeds.

In most of the reviews I have read of the MSI 390 the temps under load hover around 73-75, while JayzTwoCents had his card at around 80 overclocked.


----------



## Rmosher

Quote:


> Originally Posted by *Hemanse*
> 
> That curve is kinda aggresive, i like my card pretty quiet, but i have a CM Trooper case which is basically a piece of garbage, im gonna replace it soon, you cant pull air in from the front because of the way the drive cages are located. so right now my airflow is rather lackluster to say the least.
> 
> Your temps are pretty damn crazy for a 390x, but i guess they would get lower with those fan speeds
> 
> 
> 
> 
> 
> 
> 
> Most of the reviews i have read of the MSI 390 the temps under load hover around 73-75, while Jayztwocents had his card at around 80 overclocked.


My fans hit about 65% max speed and I can barely hear them, to be honest. I also have 6 case fans: 2 intakes on the front, 1 on the side, plus 3 exhausts, 2 on the top and 1 on the back. My bottom front case fan is a recently purchased EK Vardar F3. Saw them on JayzTwoCents and wanted to give them a shot. I went with a static-pressure fan in that position because of the HDD cage right there that it has to push through. I noticed my entire system temp dropped by about 2 degrees Celsius, which is crazy for just adding a case fan! I love the Vardar fan, will definitely be using more of them in the future.

I had a Sapphire R7 260X before this card and had it OCed as far as it would go with 100% fan speed, and that thing would regularly hit 78c. My 390X totally stock (with the stock fan curve) would only hit about 60-61c. Here is the stock fan curve it came with:

My 390X is pretty quiet up to about 70% before it starts getting intrusive to me. At 100% it sounds like a jet engine lol


----------



## Hemanse

Quote:


> Originally Posted by *Rmosher*
> 
> My fans hit about 65% max speed and I can barely hear them to be honest. I also have 6 case fans, 2 intakes on the front, 1 on the side as well as 3 exhausts, 2 on the top and 1 on the back.. My bottom front case fan is a recently purchased EK Vardar F3. Saw them on jayztwocents and wanted to give them a shot. I went with a static pressure in that fan position because of the HDD cage right there that it has to push through. I noticed my entire system temp dropped by about 2 degrees Celsius which is crazy for just adding a case fan! I love the Vardar fan, will definitely be using more of them in the future. I had a Sapphire R7 260X before this card and had it OCed as far as it would go with 100% fan speed and that thing would regularly hit 78c. My 390X totally stock (with stock fan curve) only would hit about 60-61c. here is the stock fan curve it came with
> 
> 
> 
> My 390X is pretty quiet up to about 70% before it starts getting intrusive to me. At a 100% it sounds like a jet engine lol


Can I ask what kind of case you are using? I'm thinking about getting a Phanteks Enthoo Pro; it seems like a nicely priced and very well-built case with good airflow. The reviews I read of the Sapphire 390X you have come nowhere near your temps, but I guess a lot of factors weigh in. I'm 99.9% sure mine are related to bad airflow; right now I'm pretty sure my Noctua fans are just circulating hot air inside my case.

Also thinking about moving the card down a PCI-E slot; I have it in the top one right now and it is so damn close to my NH-D14.


----------



## Rmosher

Quote:


> Originally Posted by *Hemanse*
> 
> Can i ask what kinda case you are using? Im thinking about getting a Phanteks Enthoo Pro, seems like a nicely priced and very well build case with good airflow. When i read reviews of the Sapphire 390x you have they atleast come nowhere near your temps, but i guess alot of factors weigh in, im 99.9% sure mine are related to bad airflow, right now im pretty sure my noctua fans are just circulating hot air inside my case
> 
> 
> 
> 
> 
> 
> 
> Also thinking about taking the card down a PCI-E slot, have it in the top one right now and it is so damn close to my NH-D14


I have a CM HAF 912 and have been pretty happy with it. I read those reviews too and was expecting to be somewhere in the 70-75c range with my 390X. I have been pleasantly surprised at the cooling performance. I think Sapphire makes a heck of a cooler!


----------



## Hemanse

Quote:


> Originally Posted by *Rmosher*
> 
> I have a CM HAF 912. Been pretty happy with it. I read those reviews too and I was expecting to be somewhere between the 70-75c range with my 390X. I have been pleasantly surpised at the cooling performance. I think Sapphire makes a heck of a cooler!


Yeah, I actually would have gone with the Sapphire card myself, but it's not a very well-known brand here in Europe, or at least not in Denmark, so I would have had to pay $50 more for the Sapphire over the MSI. Did not really seem worth it.


----------



## Rmosher

Quote:


> Originally Posted by *Hemanse*
> 
> Yeah i actuallu would have gone with the Sapphire card myself, but its not a very well known brand here in Europe, or atleast not in Denmark, so i would have to pay 50$ more for the Sapphire over the MSI, did not really seem worth it


Oh I see, I didn't know that. From what I have seen MSI makes a damn good card too. Your English is very good by the way.


----------



## Hemanse

Quote:


> Originally Posted by *Rmosher*
> 
> Oh I see, I didn't know that. From what I have seen MSI makes a damn good card too. Your English is very good by the way.


Thanks







But yeah, I'm fine with the MSI card so far. When I get my case situation fixed I'm sure the temps will go down a bit; with the side panel off they drop by 8-10c.


----------



## mrbull3tproof

Quote:


> Originally Posted by *Rmosher*
> 
> I have a CM HAF 912. Been pretty happy with it.


I have the same case and I can tell you that having only the rear fan as exhaust (intakes: 2 front, 2 top) not only slightly lowered temps, but also reduced dust in the case (which was already nearly nonexistent thanks to DEMCiflex filters on the intakes and on the side without a fan).
Thanks to positive pressure, dust is no longer sucked into the case through the front perforated vents over the hard drive cages.


----------



## diggiddi

Quote:


> Originally Posted by *Rob27shred*
> 
> I just realized I never got my name on the official 390X/390 owners club. Here's my pics, it's an XFX DD R9 390 core edition. Please add me when you get a chance.


So are you crossing it with the 290/x you just got?


----------



## kubiks

Woohoo, finally checking in. Here's my Crossfire 390X setup: MSI R9 390X. I NEED WATERBLOCKS! Might just go with the Alphacool setup. I've been trawling the internet for months looking for some sort of block and did not want to go with the Kraken G10!


----------



## kubiks

They are blocked by a huge DC blower I have hooked up to help cool them!


----------



## kubiks

And here's one more with the old 850 watt EVGA power supply that turned out not to be enough for my setup! I now have a Corsair RM1000i. The setup has pulled a peak of 940 watts from the wall (on benchmarks only) with my 4790K OC'd to 4.8, RAM at 1.65v @ 2400, and both cards clocked to 1130/1580.


----------



## DJNIKEL24

Quote:


> Originally Posted by *kubiks*
> 
> And heres one more with the old 850 watt evga power supply that turned out to not be enough for my setup! I now have a corsair Rm1000i. Setup has pulled a peak of 940 watts from the wall on benchmarks only with my 4790k oc'd to 4.8, with ram at 1.65 @ 2400 and both carrds clocked to 1130/1580


What kind of temps are you getting? I really want to crossfire two MSI LE 390s, but I wanna avoid water blocks because I dunno how comfortable I am with installing them. Also, I'm currently on a Cooler Master 850w gold-rated PSU. You think it wouldn't be enough for crossfire?


----------



## Sgt Bilko

Quote:


> Originally Posted by *DJNIKEL24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kubiks*
> 
> And heres one more with the old 850 watt evga power supply that turned out to not be enough for my setup! I now have a corsair Rm1000i. Setup has pulled a peak of 940 watts from the wall on benchmarks only with my 4790k oc'd to 4.8, with ram at 1.65 @ 2400 and both carrds clocked to 1130/1580
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What kind of temps are you getting? I really want to crossfire two msi le 390s but I wanna try to avoid water blocks cause I dunno how comfortable I am with installing them. Also I'm currently with a cool master 850 gold rated psu. You think it wouldn't be enough for crossfire?
Click to expand...

Well, you don't have to worry about a full-cover block, seeing as there aren't any apart from the one from Alphacool (a universal GPU block + heatsink for the VRAM/VRMs).

Fill out the rigbuilder and I might be able to tell you if 850w is enough (the link is in my sig).

All in all, with crossfire, if you have good airflow then you won't need watercooling to keep the temps down.


----------



## mrbull3tproof

Quote:


> Originally Posted by *kubiks*
> 
> And heres one more with the old 850 watt evga power supply that turned out to not be enough for my setup! I now have a corsair Rm1000i. Setup has pulled a peak of 940 watts from the wall on benchmarks only with my 4790k oc'd to 4.8, with ram at 1.65 @ 2400 and both carrds clocked to 1130/1580


Why did you mount your PSU with the intake facing the top? That way the fan on the bottom card doesn't get enough air, because it's stolen by the PSU intake.


----------



## Rob27shred

Quote:


> Originally Posted by *diggiddi*
> 
> So are you crossing it with the 290/x you just got?


Yes sir!







Just waiting on my new PSU to arrive today. Right now I have an EVGA 850w P2 in my rig so I ordered a Seasonic X 1250w XM2 to make sure I have enough juice. Should have pics & bench scores up by tonight.


----------



## kubiks

Quote:


> Originally Posted by *mrbull3tproof*
> 
> Why did you mount your PSU intake to the top? This way GPU fan on the bottom card doesn't get enough air because it's stolen by PSU intake.


That was the first temporary setup. It was only for a few days to test power consumption. It's no longer set up as pictured.

To respond to the other question. I'm going to water cool both. Opened slot reaches 92 under bunch load, 85 on dota 2. Lower card never breaks 75 degrees. I'm going to try the alpha cool naxhos on the top card and report my findings. Vrm temp not a problem so far on air


----------



## kubiks

Lol God damn auto correct. Bench and nexxos were what I meant. I want a full cover but don't want to have to modify one for a different carday as none exist for 390x yet. With the vrm and core temps and the oversized nexxos black which cools a decent part of the heatsink, I'm comfortable with buying one to hold me over till a full cover is released


----------



## kubiks

That's if Alphacool doesn't take 3 months to reach me. Full-cover blocks may be out by then!


----------



## Agent Smith1984

Quote:


> Originally Posted by *kubiks*
> 
> That's if alpha cool doesn't take 3 monthso to reach me. Full coverage may be out by then!


EK says they have no plans to release a full-cover block.....

If they aren't, I doubt anyone else will either.

The Alphacool is about the best solution right now for both Sapphire and MSI owners.

Otherwise, get a reference-design set of cards.


----------



## bichael

Although I only had it on a 270X before, the Alphacool GPX worked great. Nice low temps even with a small pump and low-speed fans. VRM cooling may not be quite as good as a regular full-cover block, but for me I'm willing to trade that off for the lower price.

I'll feed back how it does on the 390 as soon as I get it on. The main problem so far is the delay in waiting for Aquatuning to get in an upgrade kit so I can move it onto my 390 PCS+...


----------



## Levys

Quote:


> Originally Posted by *Agent Smith1984*
> 
> EK says they have no plans to release a full cover.....
> 
> If they aren't, I doubt anyone else with either.
> 
> The alphacool is about the best solution right now for both Sapphire and MSI owners right now.
> 
> Otherwise get a reference design set of cards.


I have sent my MSI R9 390X back and got an XFX 390X for that reason (no waterblock).
The XFX card is great, and my Kryographics R9 290X block was a perfect fit because the same PCB is used.
Clocks to around 1200/1650.

I wonder if more people went for this solution, as I can't seem to find any (all stock or ghetto-ish).
Btw, the lighting doesn't do my rig justice.


----------



## Casterina

I'm considering purchasing the R9 390, but which brand has the best cooler? I'll be overclocking the card.


----------



## TehMasterSword

Sapphire or Powercolor have the beefiest coolers.


----------



## Rob27shred

Quote:


> Originally Posted by *battleaxe*
> 
> Let us know how this card does. I've had my eye on them as well as the Nitro's.


My 390 has done really well by itself for me. Not sure if I just got lucky & got a good OCer, but I have been running mine at 1117 core & 1675 mem, stable, without touching the voltage at all. Just upped the power limit to 50 in CCC. I now have it crossfired with a 290X & everything has been able to stay the same.


----------



## kubiks

I say we llol pool our knowledge and take measurement of our cards and compare to full size blocks for last series. I believe we can find a 290 x back that can be molded to fit our cards. I'm willing to drop 100 bucks on a block to try. With ekwb just saying no to a whole line of cards, desperate times call for desperate measures. No pun intended lol


----------



## kubiks

There we go with auto correct again. These are amazing cards, and people are putting them down without even trying them! My two beat a few 980 ti sli benches on firestrike


----------



## IanDrexP

Will be joining SOON! CANT WAIT for my 390 to arrive!!!


----------



## Rob27shred

Quote:


> Originally Posted by *kubiks*
> 
> There we go with auto correct again. These are amazing cards, and people are putting them down without even trying them! My two beat a few 980 ti sli benches on firestrike




This is the result of my 2nd Firestrike run with my 390 crossfired with my 290X. I probably need to fine-tune everything though; I have both cards running at different clock speeds (the stable no-added-voltage OC I found for each: 390 at 1117 core / 1675 mem, 290X at 1112 core / 1420 mem). Temps are pretty up there for the main card (390) as well, although not so bad with the side panel removed, so I definitely need to mod my case for a side intake fan. Or figure out a water-cooling solution, but I have to really slow down on spending at this point & I doubt my case would fit one anyway.

Edit: Well, apparently my 390 is running at 1112 core / 1500 mem according to Firestrike. Not sure what that's about, since in CCC I have them both set to the above stated speeds. I do remember hearing something about the stronger card downclocking to match the weaker one in crossfire, but that doesn't make sense in my situation. The 390 stock is 1015 core / 1500 mem & the 290X stock is 1030 core / 1250 mem, hmm... Oh well, just happy to have it all set up & working at this point!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Rob27shred*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kubiks*
> 
> There we go with auto correct again. These are amazing cards, and people are putting them down without even trying them! My two beat a few 980 ti sli benches on firestrike
> 
> 
> 
> 
> 
> This is the result of my 2nd firestrike run with my 390 crossfired with my 290X. I probably need to fine tune everything though, I have both cards running at different clock speeds (the stable non voltage added OC I found for either 390 core 1117 mem 1675 290X core 1112 mem 1420) Temps are pretty up there for the main card (390) as well, although not so bad with the side panel removed so I definitely need to mod my case for a side intake fan. Or figure out a water cooling solution but I have to really slow down on spending at this point & I doubt my case would fit any anyways.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Well apparently my 390 is running at core 1112 mem 1500 according to Firestrike, not sure what that's about since in CCC I have them both set to the above stated speeds. I do remember hearing something about the stronger card downclocking to match the weaker one in crossfire but that don't make sense in my situation. The 390 stock is core 1015 mem 1500 & the 290X stock is core 1030 mem 1250, hmm... Oh well just happy to have it all set up & working at this point!

I've run all types of Hawaii + Grenada crossfire setups now, and the card with the higher base clock NEVER lowers itself down. If anything, the card with the lower base clock has overclocked itself to match the higher-clocked one.

Run down of what Crossfire setups I've done so far:

R9 290 + R9 290
R9 290x + R9 290
R9 290 + R9 295x2
R9 290 + R9 290 + R9 295x2
R9 290x + R9 295x2
R9 290x + R9 290 + R9 295x2
R9 390 + R9 295x2
R9 390 + R9 390x
R9 390x + R9 295x2
R9 390x + R9 290x

When I say it like that it seems like a lot


----------



## wickedsunny

I am buying a sapphire R9 390 this week for 3D rendering in Blender.

Can anyone tell me where can I get a cooler block for it?


----------



## diggiddi

Don't think there is one


----------



## wickedsunny

Quote:


> Originally Posted by *diggiddi*
> 
> Don't think there is one


I read an update on the first page of this thread that Alphacool now has semi-cover blocks for the Sapphire and MSI cards.

But I can't find the link for them
Quote:


> CONFIRMED (UPDATED 7/15/15)
> 
> The only full-cover EK blocks that are cross-compatible between 290 and 390 series cards are those manufactured for Asus DCU, XFX DD, and PowerColor PCS+ cards.
> 
> However, Alphacool has now begun producing semi-cover blocks for the Sapphire and MSI cards, which mate a water block to a large passive heatsink to cool the VRMs. Good news for owners of those cards, as both are quite popular!


----------



## wickedsunny

I checked in the compatibility list

http://www.alphacool.com/download/compatibility%20list%20ATI.pdf

This might work, can anyone confirm it:

Alphacool NexXxoS GPX - ATI R9 390 M01 - mit Backplate - Schwarz

Sapphire Radeon R9 390 Nitro, 8GB GDDR5, DVI, HDMI, 3x DisplayPort, lite retail (11244-00-20G)


----------



## Blackcurrent

Add me to the club baby. Just bought a brand new Sapphire 390X Nitro, first-time AMD user. So far I love the card, except that it stutters for a good minute when I start playing games and then smooths out. Is that normal? Using the latest beta drivers.


----------



## tangelo

Quote:


> Originally Posted by *Blackcurrent*
> 
> Add me to the club baby. Just bought a brand new Sapphire 390X Nitro, first-time AMD user. So far I love the card, except that it stutters for a good minute when I start playing games and then smooths out. Is that normal? Using the latest beta drivers.


The latest beta drivers caused stutter in some games for some people. Expecting/hoping this is fixed in the next non-beta driver.


----------



## Levys

Can't wait for the new Omega drivers next month


----------



## LexLuthor

Hi..
I purchased an XFX R9 390 in July and couldn't be happier with my choice.
For reference, I had a Crossfire system with an XFX 7970 + Sapphire 7950 for my Eyefinity setup, but I needed to be "a bit" more up to date in terms of hardware. So I was deciding between a used R9 290/X and a brand new R9 390.
I've been reading this thread since the beginning (thanks, Agent Smith







), but I could never post before because I have problems with my internet connection, so I can't upload photos and stuff... yet...








But I couldn't resist the temptation to finally write at least one post.
My XFX R9 390 is a beast. I can OC it to 1100/1600 just by sliding the Power limit up in Afterburner. Just that. Pretty amazing.
But when I try to reach 1200, total failure. I even installed Trixx to add more voltage, but it didn't help.
Of course, I haven't fully dedicated myself to tuning the card, sliders and values, but so far it's a great card. I managed to run it at 1150/1650 at stock voltage (just the Power slider at max), but I think I could reach more.
About temps: well, after having an XFX 6870 that ran as hot as the sun (+100 °C when playing, but without any failures) and a 7970 that wasn't as hot as I expected, I hesitated a bit about getting this R9 390. After reading several posts in this thread, I went for it... and couldn't be happier...








This card runs as cool as my old Sapphire 7950 (Sapphire being one of the best brands for VGA cooling solutions). It's amazing.
At full OC with a custom fan curve, I only see just over 70 °C on the core and just over 80 °C on the VRMs... Even when trying to reach 1200 on the core (the few times I did), the core only went to about 82 °C and the VRMs to about 89 °C...
It really does run cool.
I will keep up with this thread, and the moment I'm back online, I'll upload test results.

Best regards..


----------



## tolis626

Quote:


> Originally Posted by *gupsterg*
> 
> I'll have go to explain what I noticed with 290/X roms and I believe this is the same for other Hawaii GPU cards.
> 
> Firstly a ROM can have a GPU core voltage offset within ROM. If in MSI AB you see a preset GPU core voltage offset your ROM has it. For example the Sapphire 390X Nitro ROM has it.
> 
> Now I would think an owner would report their using stock voltages.
> 
> I have also noted in Sapphire 390X Nitro ROM VDDCI (Aux voltage in MSI AB) is 1.05V vs 1.00v in other 390/X ROMs. Again an owner would think or report as stock voltage / not having touched it.
> 
> Next we must talk about ASIC profiling (Leakage ID (ASIC Quality)).
> SO say an owner has x ASIC Quality they will have x VID for GPU, another owner has y ASIC Quality they will have y VID for GPU.
> 
> SO when both these owners state I use +25mv for x OC of GPU/RAM they will have differing VID / VDDC.
> 
> Next we must talk about EVV (Electronic Variable Voltage) in every stock ROM only the lowest state of GPU voltage is the same ie DPM 0. Now why EVV is used for DPM 1 - 7 is so with all this profiling going on each ROM does not have to be tailored to a exact GPUs properties.
> 
> Now one thing that also happens under EVV is the default GPU clock affects VID.
> 
> For example when I flashed my card with a default GPU clock of 1100MHz I get a differing VID to 1000MHz. Everything in my testing was the same and the same ROM was used but edited GPU clock.
> 
> SO as default GPU clock in a ROM goes higher set VID will be lower.
> 
> SO an owner of a 390X with say 1050MHz as default GPU clock will have higher VID thus should reach a higher OC without touching voltages than an owner with 1100MHz.


Well... Can anyone confirm that the Sapphire cards have a 1.05V VDDCI at stock (Basically a +50mV AUX)? If so, it would explain why Sapphire's cards' memories overclock better on average. It would also mean that running +50mV AUX would be rather "safe", so to speak.

It just seems too simple to be true...


----------



## Rmosher

Sapphire R9 390X owner here, I see 1.047 AUX voltage in AB. GPUZ Confirms this. Hope that helps


----------



## tolis626

Quote:


> Originally Posted by *Rmosher*
> 
> Sapphire R9 390X owner here, I see 1.047 AUX voltage in AB. GPUZ Confirms this. Hope that helps


Without touching anything, I suppose? If so, awesome. I may be able to get the 1700+MHz I want out of my memory after all. Thanks friend!


----------



## Strife21

Interested in pulling the trigger on an R9 390X; I currently have a 270X. I read some people were having issues with the drivers and such; has most of this been resolved? It would be going into a new Skylake build I made.


----------



## Rmosher

Yes, that is completely stock, right down to the stock fan curve. When I load my overclock I can get 1100/1750 on core and RAM at stock voltage. Was pretty pumped to see a 7000 MHz effective memory clock! The memory overclock alone made a nice improvement in fps, between 3 and 6 fps, and gives me a memory bandwidth of 448 GB/s







Love this card, what a beast.
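As a sanity check on those numbers: GDDR5 is quad-pumped (four transfers per clock per pin), so a 1750 MHz memory clock on the 390/390X's 512-bit bus works out to exactly the 448 GB/s quoted above. A minimal sketch (the function name is just illustrative):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s. GDDR5 performs 4 transfers per memory clock."""
    effective_mts = mem_clock_mhz * 4          # effective rate in MT/s
    bytes_per_transfer = bus_width_bits / 8    # bus width in bytes
    return effective_mts * bytes_per_transfer / 1000

# R9 390/390X has a 512-bit bus; 1750 MHz is the overclock quoted above
print(gddr5_bandwidth_gbs(1750, 512))  # -> 448.0
print(gddr5_bandwidth_gbs(1500, 512))  # stock 1500 MHz -> 384.0
```

The same formula explains the "7000 MHz effective" figure: 1750 MHz x 4 transfers per clock.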


----------



## Rmosher

I have had no problems with drivers at all, though I don't use beta drivers. This is my second AMD card; I'm not sure where the whole "AMD drivers are bad" thing comes from. I've used both NVIDIA and AMD/ATI, and both have bad drivers sometimes. Just the nature of things, I guess.


----------



## gupsterg

Quote:


> Originally Posted by *tolis626*
> 
> Well... Can anyone confirm that the Sapphire cards have a 1.05V VDDCI at stock (Basically a +50mV AUX)? If so, it would explain why Sapphire's cards' memories overclock better on average. It would also mean that running +50mV AUX would be rather "safe", so to speak.
> 
> It just seems too simple to be true...


This GPU-Z image is from a Sapphire 390 Nitro review.



Review page link:- http://www.legitreviews.com/dirt-rally-performance-review-geforce-gtx-970-versus-radeon-r9-390_167054/2

I have also a) noted it in images posted in threads, and b) I can post the relevant screenshot of that section of the ROM; if you ref the Hawaii BIOS editing thread you can verify it.

This information is also correct for Sapphire 390X Nitro.


----------



## Rob27shred

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've run all types of Hawaii + Grenada crossfire setups now and the card with the higher base clock NEVER lowers itself down, If anything the card with the lower base clock has overclocked itself to match the higher clocked one.
> 
> Run down of what Crossfire setups I've done so far:
> 
> R9 290 + R9 290
> R9 290x + R9 290
> R9 290 + R9 295x2
> R9 290 + R9 290 + R9 295x2
> R9 290x + R9 295x2
> R9 290x + R9 290 + R9 295x2
> R9 390 + R9 295x2
> R9 390 + R9 390x
> R9 390x + R9 295x2
> R9 390x + R9 290x
> 
> When i say it like that it seems like alot


LOL, nice to know though!







I wasn't sure if the cards clocking down to the lower speed was true or not. Maybe my memory is reading 1500 in FS because that's the best the 290X can do without adding voltage? The 290X started to get unstable past 1425 in CCC with the power limit turned up to +50 when I did a quick OC on it. This is my 1st Crossfire setup, so I have a lot to learn yet.


----------



## TehMasterSword

Quick update: my BEAUTIFUL PowerColor 390 came in on Tuesday and I've been enjoying Hearts of Stone on it this week! The card runs fantastically; I have yet to see it break 77C in games, FAR cooler than the Strix. I'll be putting it through its paces and overclocking it this weekend!


----------



## Superb677

Hey everyone. I'm actually new to the site. I wish I would have joined earlier because so far I am loving it. I still have to finish adding in my rig details but I definitely want to join the club. Love this GPU more and more with each new game I play. I have 1x 140mm and 3x Cooler Master SickleFlow 120mm fans for case cooling and a 140mm Arctic Freezer CPU Cooler. Thanks everyone! Looking forward to getting to know you all!







I'll add my OC numbers below along with Unigine Valley and 3D Mark Fire Strike Benchmark scores.

*OC Numbers:*

Core Clock: 1135 MHz
Memory Clock: 1625 MHz

*Unigine Valley Benchmark:*

*FPS:* 64.8
*Score:* 2713
*Min FPS:* 30.1
*Max FPS:* 125.3

_1920x1080 8xAA_

*3D Mark (Fire Strike 1.1)*

*OVR Score:* 11,178
*Graphics Score:* 13,550
*Physics Score:* 10,469
*Combined Score:* 5,056


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> Well... Can anyone confirm that the Sapphire cards have a 1.05V VDDCI at stock (Basically a +50mV AUX)? If so, it would explain why Sapphire's cards' memories overclock better on average. It would also mean that running +50mV AUX would be rather "safe", so to speak.
> 
> It just seems too simple to be true...


aah, i was wondering about this. same thing happened to me (sapphire 390x), the sapphire trixx software showed +50mv on the voltage and i thought i did it without remembering ^^ glad to know it's not just me and that they did that by default haha.


----------



## tolis626

Quote:


> Originally Posted by *TsukikoChan*
> 
> aah, i was wondering about this. same thing happened to me (sapphire 390x), the sapphire trixx software showed +50mv on the voltage and i thought i did it without remembering ^^ glad to know it's not just me and that they did that by default haha.


Yup! I would think that if a manufacturer does it, it's kinda safe. I mean, yes, the Sapphire has a different PCB, cooler, etc., but come on, if it were at all dangerous they wouldn't risk it. Unless the MSI and Sapphire cards work so differently that it's cause for concern, but I seriously doubt that. So, +50mV AUX for me. On a side note, when buying this card I thought overclocking the memory was useless. Glad to see I was wrong: there is a performance improvement just from upping the memory clocks, even if it takes a significant overclock to see it.

Now... another thing. I finally upgraded to Windows 10 yesterday (had to force the damn thing, it wouldn't do it on its own even though I had reserved my copy) and Catalyst is gone. I mean, the GPU works fine and Afterburner recognises the driver as 15.10; I just can't get into CCC. I tried uninstalling and reinstalling the drivers, but nothing changed. Any ideas? At this point I'm too bored to do much more, and I'm thinking about waiting for the 15.11 driver, but still... driver cleaning maybe?

PS: I can't seem to find 15.9 anywhere... only 15.10 and 15.7.1. What's up with that?


----------



## Dundundata

Wasn't 15.9 the buggy driver? I've been on 15.8 for a while now. Did you do a clean install of Windows? Did you try the DDU uninstaller?


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> Wasn't 15.9 the buggy driver? I've been on 15.8 for awhile now. Did u do a clean install of Windows? Did you try DDU uninstaller?


Yup, that solved it. My boredom leads to problems once more. Oh well.









Thanks man!

15.9 had the memory leak, you're correct. 15.9.1, the fixed version, worked really well for me though. 15.10 messed up my overclocks, as I've already posted. Anyway, I can live with it till the next one is out.


----------



## gerpogi

Quote:


> Originally Posted by *tolis626*
> 
> Yup, that solved it. My boredom leads to problems once more. Oh well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks man!
> 
> 15.9 had the memory leak, you're correct. 15.9.1, the fixed version, worked really well for me though. 15.10 messed my overclocks as I've already posted about. Anyway, I can live with it til the next one is out.


How do you guys get these driver updates? I've been stuck on 15.7 for the longest time. Those updates only show up as beta drivers when I manually check for updates, but that's about it.

Edit: nvm, you guys were talking about beta drivers, derp


----------



## diggiddi

Someone normally posts here when a new driver is out


----------



## TehMasterSword

Just got off work and spent the last 2 hours running benchmarks. UNFORTUNATELY I don't have FRAPS or any other FPS counter installed that I could use for my run of GTA V (or any other game without a built-in benchmark), so my GTA V result is an estimate. ALSO unfortunately, I don't have Microsoft Excel, so I don't have an easy way of making a bar graph; for now, here are my results in text form!

Stock clocks

1010/1500
Idle 61C

OC

1120/1630
Idle 61C

Stock results

Valley (Ultra, 1080p, x8 AA)
FPS 59.7
Score 2497
Temp 73C

GTA V (Max EVERYTHING)
25FPS Avg
Temp 70C

Fire Strike

3DMark Score 9908
Graphics Score 11969
Physics Score 9021
Combined Score 4623
Graphics Test 157.02 fps
Graphics Test 247.86 fps
Physics Test 28.64 fps
Combined Test 21.5 fps

OC Results

Valley
64.6FPS
Score 2703
Temp 76C

Fire Strike

3DMark Score 10060
Graphics Score 12214
Physics Score 9023
Combined Score 4679
Graphics Test 158.12 fps
Graphics Test 248.89 fps
Physics Test 28.65 fps
Combined Test 21.77 fps

GTA V
30FPS Avg
Temp 72C

TL;DR: I wouldn't say the temperatures are excellent, but I can't break 80C no matter what I throw at this card. I got a ~10% core OC without touching the voltage, with measurable gains in every benchmark! I'm VERY happy with this card!

EDIT: Just realized I haven't been added to the list! Here is my screen shot
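If you want to put exact numbers on those gains without Excel, a few lines of Python on the figures above will do it (a quick sketch; the values are copied from the post, and `pct_gain` is just an illustrative helper name):

```python
def pct_gain(before: float, after: float) -> float:
    """Percentage improvement from 'before' to 'after', one decimal place."""
    return round((after / before - 1) * 100, 1)

print(pct_gain(1010, 1120))    # core clock OC: ~10.9%
print(pct_gain(2497, 2703))    # Valley score: ~8.2%
print(pct_gain(11969, 12214))  # Fire Strike graphics score: ~2.0%
```

Worth noting that the Valley score tracks the core overclock fairly closely, while the Fire Strike graphics score moves much less.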


----------



## Superb677

Quote:


> Originally Posted by *TehMasterSword*
> 
> Just got off work, spent the last 2 hours running benchmarks. UNFORTUNATELYYYY I don't have FRAPS installed or any other FPS counter that I could use for my run of GTA V or any other game that doesn't have a built in benchmarker, so my results for GTA V is an estimate. ALSO unfortunately, I don't actually have Microsoft excel, so I don't have an easy way of making a bar graph, so for now, here are my results in text form!
> 
> Stock clocks
> 
> 1010/1500
> Idle 61C
> 
> OC
> 
> 1120/1630
> Idle 61C
> 
> Stock results
> 
> Valley (Ultra, 1080p, x8 AA)
> FPS 59.7
> Score 2497
> Temp 73C
> 
> GTA V (Max EVERYTHING)
> 25FPS Avg
> Temp 70C
> 
> Fire Strike
> 
> 3DMark Score 9908
> Graphics Score 11969
> Physics Score 9021
> Combined Score 4623
> Graphics Test 157.02 fps
> Graphics Test 247.86 fps
> Physics Test 28.64 fps
> Combined Test 21.5 fps
> 
> OC Results
> 
> Valley
> 64.6FPS
> Score 2703
> Temp 76C
> 
> Fire Strike
> 
> 3DMark Score 10060
> Graphics Score 12214
> Physics Score 9023
> Combined Score 4679
> Graphics Test 158.12 fps
> Graphics Test 248.89 fps
> Physics Test 28.65 fps
> Combined Test 21.77 fps
> 
> GTA V
> 30FPS Avg
> Temp 72C
> 
> TL;DR- I wouldn't say that the temperatures are excellent, but I can't break 80C no matter what I throw at this card. I was able to get a ~10% OC without touching the voltage resulting in a ~10% increase in all benchmarks! I'm VERY happy with this card!
> 
> EDIT: Just realized I haven't been added to the list! Here is my screen shot


Did you try using MSI Afterburner's on-screen display for fps and the other counters to bench GTA V? I don't think you should only be getting an average of 30 fps with that card, especially with the OC. I'm running an XFX R9 390 OC'd to 1135 MHz core / 1625 MHz memory, and I'm getting well over 60 fps, sitting more in the 70-80 range most of the time. Then again, I'm running at 1080p on a 60 Hz monitor, and you may be running something more demanding like 1440p or 4K.


----------



## TehMasterSword

I'm running a 1080p 144 Hz monitor. The kicker is this is max everything. Literally. AA cranked up to x8 or x16, etc. There were a few options with absurdly high settings that don't improve the graphics much but hurt performance a lot. I could easily turn down a few knobs and still have a pretty game running well above 100 fps, but what's the point of a benchmark for comparing to other cards if it's not maxed?









Also, I do have MSI AB and the on-screen counter. Does AB have a built-in FPS-counting tool that will give me an average FPS over time?


----------



## Superb677

Quote:


> Originally Posted by *TehMasterSword*
> 
> I'm running a 1080p 144hz monitor. The kicker is this is max everything. Literally. AA kicked up to x8 or x16, etc. There were a few options that had absurdly high settings, don't increase graphics that much, but decrease performance by a lot. I could easily turn down a few knobs and still have a pretty gaming running well above 100, but whats the point of a benchmark to compare to other cards if it's not MAX?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I do have MSI AB, and I do have the onscreen counter. Does AB have a built in FPS counting tool that will give me an Avg FPS over time?


Ah, okay, makes sense for sure. I do have everything on max settings, except I forgot that a lot of people say the Advanced Graphics settings are super taxing even with our huge 8 GB of memory. I turned them off and got literally double the fps I was getting with them turned on/up. The game really didn't look any less appealing with them off either, so I figured I'd rather take the FPS boost. Regardless, I bet the game looks spectacular at 1440p (hopefully my next upgrade).

As far as an FPS-counting tool, you can always use the graph: tell it to monitor your FPS, increase the span of time it shows in the settings, and when you close out of the game, take a look at where your jumps and dips are. That's what I do.


----------



## wickedsunny

Guys, I am buying a Sapphire Nitro R9 390 for 3D rendering, so it will be running for several hours at a time. Should I get it with a backplate or without?


----------



## 6speed

I've been saving up to get the MSI R9 390, hopefully soon, preferably before Fallout 4. I didn't really think it would be an issue, but after checking some PSU calculators, some say my PSU might not be enough. I have a Corsair TX650; does anyone think this could be an issue? It does have 2x PCIe 8-pin connectors, so I'm good there, and my sig is up to date. Thanks in advance for any help.


----------



## wickedsunny

Quote:


> Originally Posted by *6speed*
> 
> I've been saving up to get the MSI R9 390 hopefully here soon, pre Fallout 4 preferably. I didn't really think it would be an issue but after checking some PSU calc's some say my PSU might not be enough. I have a Corsair TX650, does anyone think this could be an issue? It does have 2x PCI 8-pin connectors so i'm good there, and my sig is up to date. Thanks in advance for any help.


I am using a 700 W Cooler Master Silent Pro; my system draws about 300 W, 500 W max, so add up the wattage for your other components.

Best to pick out your parts on the pcpartpicker.com website and select the PSU at the very end; it will show the compatibility list.


----------



## kizwan

Quote:


> Originally Posted by *6speed*
> 
> I've been saving up to get the MSI R9 390 hopefully here soon, pre Fallout 4 preferably. I didn't really think it would be an issue but after checking some PSU calc's some say my PSU might not be enough. I have a Corsair TX650, does anyone think this could be an issue? It does have 2x PCI 8-pin connectors so i'm good there, and my sig is up to date. Thanks in advance for any help.


Your CPU @ 5GHz may be pulling around 200W, so you have around 450W left for the GPU. It should be enough.


_Source: http://www.overclock.net/t/910467/the-ultimate-sandy-bridge-oc-guide-p67a-ud7-performance-review#post_11945976_
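The arithmetic behind that estimate, sketched out (the wattage figures are rough numbers from this thread, not measurements; ~275 W is the commonly quoted typical board power for an R9 390):

```python
PSU_WATTS = 650   # Corsair TX650 rated output
CPU_EST_W = 200   # heavily overclocked CPU, rough estimate from the thread
GPU_EST_W = 275   # R9 390 typical board power, approximate

remaining = PSU_WATTS - CPU_EST_W   # budget left after the CPU
headroom = remaining - GPU_EST_W    # margin after the GPU's typical draw

print(remaining)  # -> 450
print(headroom)   # -> 175, left for the board, drives, fans, and spikes
```

Even allowing for transient spikes well above typical draw, that margin is why a quality 650 W unit is generally considered enough for a single 390.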


----------



## Sgt Bilko

For everyone that wanted to watercool their 390x's it looks like XFX might have a card coming that will be compatible



http://www.eteknix.com/xfx-brings-back-blower-style-r9-390x/


https://twitter.com/i/web/status/659423187847876608%2Fphoto%2F1
Nothing has been confirmed yet, but if the PCB is an unmodified reference design, then EK 290X blocks should work without issues


----------



## Superb677

For all my fellow 390 users, I was hoping you can help me with some clarification on what Memory Voltage and Aux Voltage actually means/does in MSI Afterburner. Does it improve anything when overclocking? Is it safe? Thanks in advance


----------



## gupsterg

Quote:


> Originally Posted by *Superb677*
> 
> I was hoping you can help me with some clarification on what Memory Voltage and Aux Voltage actually means/does in MSI Afterburner.


Are you saying the Memory Voltage slider is active for your card in MSI AB?

Many thanks







.


----------



## Superb677

Quote:


> Originally Posted by *gupsterg*
> 
> Are you saying the Memory Voltage slider is active for your card in MSI AB?


Oh, now that you mention it, I realize it is not active. lol








Let me reword my question then. What would be the difference between Aux Voltage and Core Voltage?


----------



## Dundundata

MasterSword, Afterburner should come with RivaTuner which has an excellent OSD when coupled with AB. You just need to pick which sensors you want displayed in AB.


----------



## Scorpion49

So I just pulled the trigger on the Alphacool 390X block for the MSI Gaming card, hopefully it works well. Anything has to be better than the leaf-blower MSI cooler I suppose.


----------



## gupsterg

Quote:


> Originally Posted by *Superb677*
> 
> Oh now that you mentioned it, I realize it is not active. lol
> 
> 
> 
> 
> 
> 
> 
> 
> Let me reword my question then. What would be the difference between Aux Voltage and Core Voltage?


AUX Voltage = VDDCI, which AFAIK is the I/O bus voltage (between memory and GPU core), not the 8x IMCs in the die.

Core Voltage = VDDC, which is the GPU core voltage; I think this also feeds the 8x IMCs in the die.

Memory Voltage (MVDD): AFAIK only the 290-series MSI Lightning and Asus Matrix cards have this, as they have an extra chip controlling it, whereas the standard voltage control chip (IR3567) on its own doesn't support software RAM voltage adjustment AFAIK.


----------



## Superb677

Quote:


> Originally Posted by *gupsterg*
> 
> AUX Voltage = VDDCI , which AFAIK is the I/O bus voltage (between memory and GPU core), not the 8x IMC in the die.
> 
> Core Voltage = VDDC , which is the GPU core voltage and I think this also feeds the 8x IMC in the die.
> 
> Memory Voltage (MVDD) AFAIK only the 290 series MSI Lightning and Asus Matrix cards have, as they have an extra chip controlling it. Where as the standard voltage control chip (IR3567) on its own doesn't support software RAM voltage adjustment AFAIK.


Ah okay, very helpful, thank you so much. So if I'm not mistaken, increasing the AUX voltage would help overclocking, especially the memory clock. Is there any concern for the safety of the card when increasing it?


----------



## gupsterg

Personally I wouldn't go over 1.05V, i.e. +50mV in MSI AB; I've seen Sapphire 390/X ROMs ship with this.

The info I got from The Stilt states never to go over 1.10V, i.e. +100mV in MSI AB.

MSI AB doesn't allow more, unless there is a command-line switch that can be added to the shortcut properties to raise it, just as the limit on GPU core voltage can be raised.

Other OC apps I don't know about, as I don't use them.

Recently an owner in the 290/X owners thread posted that it helped attain better OCs on a poor-clocking card, while for me it had no effect on my card's OC ability. So I'm guessing it depends case by case.
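The limits quoted above can be restated as a tiny (purely illustrative) helper; the names and thresholds just encode the post: +50 mV matches what Sapphire ships in ROM, and +100 mV is the ceiling The Stilt advises never to exceed:

```python
SAFE_AUX_OFFSET_MV = 50   # Sapphire 390/X ROMs ship VDDCI at 1.05V (+50mV)
MAX_AUX_OFFSET_MV = 100   # 1.10V, the stated hard ceiling

def classify_aux_offset(offset_mv: int) -> str:
    """Label an AUX (VDDCI) offset against the limits quoted above."""
    if offset_mv > MAX_AUX_OFFSET_MV:
        return "unsafe"
    if offset_mv > SAFE_AUX_OFFSET_MV:
        return "caution"
    return "ok"

print(classify_aux_offset(50))   # -> ok
print(classify_aux_offset(75))   # -> caution
print(classify_aux_offset(150))  # -> unsafe
```

As noted above, whether extra AUX voltage actually helps a given card's overclock seems to vary case by case.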


----------



## dartmaul15

Greetings to all you overclocker overlords, one of your humble peasants calls for your aid.

Okay, jokes aside, I need your help. I'm planning to get my hands on an R9 390, and I'm wondering about three things. First, which of the cards is better for stock-config clocking? I know the MSI one dominates in terms of hard overclocking, but I suspect there's a lot of voltage tweaking involved there. I've broken one GPU doing that, and don't want to risk it. So, which model is best?

Secondly, I've noticed that the Sapphire version supposedly consumes as much as 370W of power in STOCK config? That's a huge difference from the MSI one, which according to my sources uses 275W.

Thirdly, what increase in voltages can I expect to see from a stock-configuration overclock?

I'd love to hear some answers. I'm aware they will be general, thanks to our old pal the silicon lottery, so don't worry about that.

Thanks in advance.


----------



## tolis626

Quote:


> Originally Posted by *dartmaul15*
> 
> Greetings to all you overclocker overlords, one of your humble peasants calls for your aid.
> 
> Okay, jokes aside, I need your help. I'm planning to get my hands on an R9 390, and I'm wondering about three things. First, which of the cards is better for stock-config clocking? I know the MSI one dominates in terms of hard overclocking, but I suspect there's a lot of voltage tweaking involved there. I've broken one GPU doing that, and don't want to risk it. So, which model is best?
> 
> Secondly, I've noticed that the Sapphire version supposedly consumes as much as 370W of power in STOCK config? That's a huge difference from the MSI one, which according to my sources uses 275W.
> 
> Thirdly, what increase in voltages can I expect to see from a stock-configuration overclock?
> 
> I'd love to hear some answers. I'm aware they will be general, thanks to our old pal the silicon lottery, so don't worry about that.
> 
> Thanks in advance.


Well, there's no way the Sapphire consumes 370W at stock. I doubt the card can pull that much sustained wattage at all, even when overclocked; you'd need a beefier PCB for that kind of stuff. You might see momentary spikes in power draw that go really high, but those shouldn't worry you. I'd think all 390 cards consume roughly the same amount of power, seeing as the core is unchanged, once variations in voltage and clocks are taken into account.

Regarding overclocking at stock volts... don't even factor that into your decision. It's 100% random. Some say the MSI cards tend to overclock better, but I'd say no; it's random, as overclocking always is. You can expect something in the 1100-1150MHz range at stock volts; most cards do something like that.

Also, what do you mean about seeing an increase in voltages from a stock-configuration overclock? The way I understand it, you want to know how far you can push your overvolt/overclock with stock cooling? If so, the Sapphire cards arguably have the best cooler. Throw +100mV at it (the max Afterburner allows) and it's still OK. The MSI cooler is really good too, but MSI seems to have screwed up their TIM application, so be aware of that, especially if, like me, you don't like tinkering with your shiny new card. XFX and PowerColor are fine choices too. I would just stay away from Gigabyte for the 300 series; that cooler of theirs seems awful and they're voltage-locked, so... yeah.


----------



## kubiks

So, I pulled the trigger: I've purchased two Alphacool NexXxoS GPX blocks for my MSI Gaming 8GB 390X Crossfire setup. I'm doin' it! It's going to take a while to get the stuff shipped from Germany, I'm sure, but I will be sure to report my findings! Can't wait. I didn't intend to watercool when I was planning my new system, and I ended up with cards there are no full blocks for. I have been watching this thread and scouring the internet for waterblocks to no avail. Even if this turns out badly, at least we can all learn from it. Regardless of the outcome, it will make for some GREAT information for the guide.


----------



## Scorpion49

Quote:


> Originally Posted by *kubiks*
> 
> So , I pulled the trigger. I've purchased Two Alphacool nexxxos GPX blocks for my Msi Gaming 8gb 390x crossfire setup. Im doin' it! It's gonna take a while to get the stuff shipped from germany im sure, but I will be sure to report my findings! Can't wait. I didnt intend to watercool when i was planning my new system, and i ended up with cards there are no full blocks for. I have been watching this thread and scouring the internet for waterblocks to no avail. Even if this turns out badly, at least we can all learn from it. Regardless of the outcome, it will make for some GREAT information in the guide.


modmymods has them in stock though....


----------



## Josh81

No R9 380 thread, so posting here: just picked up two for $180 each and am Crossfiring them after running a 660 Ti for the past year (my cat spilled my fiancée's coffee on my computer and ruined a bunch of parts; I had 2x 660 Ti).

Arriving today, can't wait to post up some benchmarks.

Edit : rest of computer

i5-4690k
16 gb ddr3 ram
asus a97
500 gb SSD

I went the Crossfire route due to my tremendous success with 2x 660 Ti, and I won't be going past 1440p gaming (probably won't upgrade my monitor from 1080p to 1440p for a few months).


----------



## Levys

Quote:


> Originally Posted by *6speed*
> 
> I've been saving up to get the MSI R9 390 hopefully here soon, pre Fallout 4 preferably. I didn't really think it would be an issue but after checking some PSU calc's some say my PSU might not be enough. I have a Corsair TX650, does anyone think this could be an issue? It does have 2x PCI 8-pin connectors so i'm good there, and my sig is up to date. Thanks in advance for any help.


I had the same PSU in my FX-8350 rig with an R9 290, and now I'm using an HX650W. It should be just fine.


----------



## Levys

Quote:


> Originally Posted by *kubiks*
> 
> So , I pulled the trigger. I've purchased Two Alphacool nexxxos GPX blocks for my Msi Gaming 8gb 390x crossfire setup. Im doin' it! It's gonna take a while to get the stuff shipped from germany im sure, but I will be sure to report my findings! Can't wait. I didnt intend to watercool when i was planning my new system, and i ended up with cards there are no full blocks for. I have been watching this thread and scouring the internet for waterblocks to no avail. Even if this turns out badly, at least we can all learn from it. Regardless of the outcome, it will make for some GREAT information in the guide.


Are you sure those blocks will fit the custom PCB design of the MSI R9 390X? It's about a cm (roughly 0.4in) wider than the standard R9 290X PCB and the layout is very different. My kryographics R9 290X waterblock did not fit the MSI card, and I had to go for the XFX card (with the reference PCB design). I'm glad I did though; so far I've had it run 3DMark at 1225/1750 without any artifacts.

Just a warning, unless you know something I don't


----------



## wickedsunny

Quote:


> Just a warning, unless you know something I don't


They have a specification list; he probably checked that.


----------



## flopper

Quote:


> Originally Posted by *Josh81*
> 
> No R9 380 thread, just picked up two for $180 each and am crossfiring them after having a 660 TI for the past year (my cat spilled my fiance's coffee on computer and ruined a bunch of my parts, i had 2x 660 ti)
> 
> Arriving today, can't wait to post up some benchmarks.
> 
> Edit : rest of computer
> 
> i5-4690k
> 16 gb ddr3 ram
> asus a97
> 500 gb SSD
> 
> I went the crossfire route due to my tremendous success with 2x 660 TI, and I won't be going past 1440p gaming. (probably won't upgrade monitor for a few months from 1080p-->1440p)


Personally I would rather go with the most powerful single card, and for 1440p a 390 works great.
Next year I'll be upgrading once the next gen is out.


----------



## wickedsunny

Quote:


> Originally Posted by 6speed View Post
> 
> I've been saving up to get the MSI R9 390 hopefully here soon, pre Fallout 4 preferably. I didn't really think it would be an issue but after checking some PSU calc's some say my PSU might not be enough. I have a Corsair TX650, does anyone think this could be an issue? It does have 2x PCI 8-pin connectors so i'm good there, and my sig is up to date. Thanks in advance for any help.


Check this out to find your power usage:

http://outervision.com/power-supply-calculator


----------



## dartmaul15

Quote:


> Originally Posted by *tolis626*
> 
> Well, there's no way the Sapphire consumes 370W at stock. I doubt the card can pull that much sustained wattage at all, even when overclocked. You'd need a beefier PCB for that kind of stuff. Now you might see spikes of power usage sometimes that go really high, but those are momentary and you shouldn't pay attention. I'd think all 390 cards will consume roughly the same amount of power, seeing as the core is unchanged. When variations in voltage and clocks are taken into account, of course.
> 
> Regarding overclocking at stock volts... Don't even factor that into your decision. It's 100% random. Some say the MSI cards tend to overclock better, but I'd say no. It's random as always, as is overclocking in general. You can expect something in the 1100-1150MHz range at stock volts, most cards do something like that.
> 
> Also, what do you mean about seeing an increase in voltages from a stock configuration overclock? The way I understand it is you want to know how far you can push your overvolt/overclock at with stock cooling? If so, the Sapphire cards arguably have the best cooler. Throw +100mV at it (The max Afterburner allows) and it's still ok. The MSI cooler is really good too, but MSI seems to have screwed up with their TIM application, so be aware of that, especially if, like me, you don't like tinkering with your shiny new card. XFX and Powercolor are fine choices too. I would just stay away from Gigabyte for the 300 series. That cooler of theirs seems awful and they're voltage locked, so... Yeah.


I figured it could be due to the cooler that the Sapphire one used that much power. And it was the numbers listed here; keep in mind it's a Norwegian site. I cross-checked it with other places, and it turned out nobody had anything, so I assumed the worst and decided to ask.

Yes, that's what I meant: how much I can safely increase the stock voltage. And thanks for the lengthy answer. It's easy to enable voltage control on them, right? My current Sapphire Radeon 5870 seems to be voltage locked (done what's needed in MSI Afterburner, and even tried flashing the BIOS), so I don't have any experience with it.

The reason I was asking is that I am planning to crossfire two R9 390s later (i.e. build my rig to handle it, get one card now, and one later on), and I was torn between an 850W and a 1000W PSU. With everything bone stock, a CF setup sits at 700W power draw. I assume a 1000W one would be better for overclocking if I'm running crossfire? Or will the EVGA SuperNOVA 850W G2 be enough? If not, I'm considering the 1000W one.


----------



## wickedsunny

Quote:


> Originally Posted by *dartmaul15*
> 
> i figured it could be due to the cooler that th sapphire one used that much power. And it was the numbers listed here. keep in mind it's a norwegian site. I cross checked it with other places, and turned out nobody had anything, so i asumed the worst and decided to ask.
> 
> Yes, that's hat i meant. How much i can safely increase the stock voltage. And thanks for the lengthy answer. It's easy to enable voltage controll on them, right? my current saphire radeon 5870 seems to be voltage locked (done what's needed in msi afterburner, and even tried flashing the bios), so don't have any experice with it.
> 
> The reason i was asking was because I am planning to crossfire two r9 390 later (ie build my rig to handle it, get one card now, and one later on) and i was torn between an 850 and a 1000w PSU. With all bone stock, a CF setup sits at 700w powerdrain. I asume a 1000w one would be better for overclocking if i'm running crossfire? Or will the evga super nova 850w g2 be enough? If not, i'm considering the 1000w one.


Check the link I gave above. To be on the safe side you will need 1100W. I am going to get 2 Sapphire R9 390s myself; ordered one for now.


----------



## dartmaul15

Quote:


> Originally Posted by *wickedsunny*
> 
> Check the link I gave above. To be on safe side you will need 1100 W. I am going to get 2 sapphire R9 390 myself, ordered one for now.


Been there, checked it already.








Unclocked I hit 708W, and given how each card can be clocked up 100W or so, it'll hit 900ish. Add in some CPU overclocking (won't touch the power limit) and I'm just a bit unsure on what to go for. An 850W one is too weak, that's clear. But will 1000W be enough? The price gap from 1000W and up is HUGE over here. But hell, I guess the SuperNOVA G2 1300W will last well into the next ice age. And it gives plenty of leeway for case modding and putting in an excessively high number of HDDs.
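For what it's worth, the arithmetic above can be written down as a quick sketch. The 708W figure and the ~100W-per-card overclocking margin are this post's own worst-case estimates, not measurements of mine:

```python
# Rough PSU sizing for a 2x R9 390 CrossFire build.
# All numbers are illustrative, taken from the discussion above:
# ~708 W whole-system draw at stock, plus a pessimistic ~100 W of
# extra draw per card when overclocked.

STOCK_DRAW_W = 708
OC_EXTRA_PER_CARD_W = 100
NUM_CARDS = 2

def worst_case_draw():
    """Pessimistic whole-system draw with both cards overclocked."""
    return STOCK_DRAW_W + OC_EXTRA_PER_CARD_W * NUM_CARDS

draw = worst_case_draw()
print(f"worst-case draw: {draw} W")
for psu_w in (850, 1000, 1300):
    print(f"{psu_w:>5} W PSU -> {draw / psu_w:.0%} load")
```

In practice the per-card overclocking figure is pessimistic (you rarely add a full 100W per card), so the real sustained load on a 1000W unit would sit comfortably lower than the worst case this prints.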


----------



## Gumbi

Quote:


> Originally Posted by *dartmaul15*
> 
> been there, checked it allready
> 
> 
> 
> 
> 
> 
> 
> 
> Unclocked i hit 708w, and given how each card can be clocked up 100w or so, it'll hit 900ish. Add in some CPU clocking (won't touch power clock) I'm just a bit unsure on what to go for. A 850 one is too weak, that's clear. But will 1000w be enough? the prize gap from 1000w and over is HUGE over here. But hell, i guess the supernova g2 1300w will last well into the next ice age. And it gives plenty of leeway for casemodding, and putting in excessively high number of HDDs


It should be plenty. You won't be adding 100 watts to each card with overclocking; with 2 cards, unless you have insane case airflow, your overclocking will be more limited by heat.


----------



## kubiks

Quote:


> Originally Posted by *Levys*
> 
> Are you sure those blocks will fit the custom pcb desing of the MSI R9 390X? Its about a cm ( +- half inch) wider than the standard r9 290x pcb and the layout is very different. My kryographics r9 290x waterblock did not fit the MSI card. an i had to go for the XFX card.( with original pcb design) I'm glad I did though, so far I had it run 3D mark at 1225/1750 without any artifacts.
> 
> Just a warning, unless you know something I don't


Check Alphacool's configurator. It's the GPX 390 M02.


----------



## wickedsunny

Quote:


> Originally Posted by *dartmaul15*
> 
> been there, checked it allready
> 
> 
> 
> 
> 
> 
> 
> 
> Unclocked i hit 708w, and given how each card can be clocked up 100w or so, it'll hit 900ish. Add in some CPU clocking (won't touch power clock) I'm just a bit unsure on what to go for. A 850 one is too weak, that's clear. But will 1000w be enough? the prize gap from 1000w and over is HUGE over here. But hell, i guess the supernova g2 1300w will last well into the next ice age. And it gives plenty of leeway for casemodding, and putting in excessively high number of HDDs


I will personally go for an Antec/Cooler Master 1500W. I am planning to put 3 Sapphires in one rig for 3D rendering in Blender.

So if you ever wanna go 3-way CrossFire, keep that in mind.


----------



## kubiks

Quote:


> Originally Posted by *Scorpion49*
> 
> modmymods has them in stock though....


They weren't listed on modmymods when I purchased them. Thanks for the concern though.


----------



## dartmaul15

Quote:


> Originally Posted by *Gumbi*
> 
> It should be plenty. You won't be adding 100 watts to each card with overclocking, as with 2 cards, unless you have insane case airflow your overclocking will be more limited by heat.


A Corsair H60 CPU cooler (don't need more for Intel) in push-pull config, slight overpressure on the case, and a Fractal Define R5 with high-airflow Noctua fans (up to 140 cubic meters per hour). Doubt I can get it much cooler without watercooling, or without the fans sounding like a hurricane hunter taking off.








Obviously won't use all the fan slots,







as that would get my PC dangerously close to generating enough lift for takeoff







but I'll make sure it's running cool enough.

So a 1000W one is enough?
Quote:


> Originally Posted by *wickedsunny*
> 
> I will personally go for a antec/cooler master 1500 W. I am planning to add 3 sapphire in one for 3D rendering in Blender.
> 
> So if you ever wanna go 3 SLI, keep that in mind.


Problem is, with 3-way CrossFire I will be broke as hell, and have an F-35 jet engine. Seriously, you can't realistically keep 3-way CrossFire cool without waterblocks. At least not overclocked XD
So even 2-way CrossFire is future-proofing by my standards.


----------



## Scorpion49

Quote:


> Originally Posted by *kubiks*
> 
> They weren't listed on modmymods when I purchased them. Thank for the concern though


Ah, I ordered mine yesterday. Was waiting for a US retailer to have them. Hopefully they work great, looks like no active VRM cooling on them but no other choices at the moment.


----------



## tolis626

Quote:


> Originally Posted by *Scorpion49*
> 
> Ah, I ordered mine yesterday. Was waiting for a US retailer to have them. Hopefully they work great, looks like no active VRM cooling on them but no other choices at the moment.


Well, if I had to hazard a guess, I'd say they would be just fine. Yes, the VRMs may not be receiving water to cool them, but that's a humongous heatsink. Think about it, it's about the size of the stock heatsink for many cards. As long as your case airflow is ok, I wouldn't expect your VRMs to hit more than 60-70C, even when overclocking. Not to mention that you're gonna get at least some passive heat transfer from the heatsink to the water that passes through the GPU block.

I don't like how these Alphacool blocks look, and they aren't as elegant a solution as a full-cover block, but they are genius. They perform really well from what I've seen and don't cost an arm and a leg. I will surely have them on my list when/if I make a custom loop. If they could make them a little better looking, I wouldn't consider anything else as long as my budget is limited. Especially that backplate... I'd try to find a way to use my card's stock one.


----------



## Scorpion49

Quote:


> Originally Posted by *tolis626*
> 
> Well, if I had to hazard a guess, I'd say they would be just fine. Yes, the VRMs may not be receiving water to cool them, but that's a humongous heatsink. Think about it, it's about the size of the stock heatsink for many cards. As long as your case airflow is ok, I wouldn't expect your VRMs to hit more than 60-70C, even when overclocking. Not to mention that you're gonna get at least some passive heat transfer from the heatsink to the water that passes through the GPU block.
> 
> I don't like how these Alphacool blocks look and they aren't as elegant a solution as a full cover block, but they are genius. They perform really well from what I've seen and don't cost an arm and a leg. I will surely have them on my list when/if I make a custom loop. If they could make them a little better looking, I wouldn't consider anything else as long as my budget is limited. Especially that backplate... I'd try to find a way to use my cards stock one.


The VRM on these things get very, very hot even with direct airflow. Hawaii is notorious for hot VRM's and I've had enough of them on water to not be very hopeful. I had some of the early EK 290X blocks that passively cooled the VRM and with even a mild overclock I was seeing 100C+. I'll withhold my judgement until I see it in action but I'm not super optimistic.


----------



## thanozr

Just a little OC.

http://www.techpowerup.com/gpuz/details.php?id=2aneu


----------



## rdr09

Quote:


> Originally Posted by *Scorpion49*
> 
> The VRM on these things get very, very hot even with direct airflow. Hawaii is notorious for hot VRM's and I've had enough of them on water to not be very hopeful. I had some of the early EK 290X blocks that passively cooled the VRM and with even a mild overclock I was seeing 100C+. I'll withhold my judgement until I see it in action but I'm not super optimistic.


EK full waterblocks on Hawaii? They are better than other brands; as a matter of fact, the VRMs run cooler than the core at any OC.


----------



## Scorpion49

Quote:


> Originally Posted by *rdr09*
> 
> EK Full waterblocks on hawaiis? they are better than other brands, VRMs are cooler than the core at any oc as a matter of fact.


The first gen EK blocks (back in 2013 when Hawaii came out) only cooled the core. There was no VRM channel like there is now. With only the passive properties of the block and no airflow things got very hot. I tried those and some other ones, I think they were Koolance, that had the same problem.


----------



## rdr09

Quote:


> Originally Posted by *Scorpion49*
> 
> The first gen EK blocks (back in 2013 when Hawaii came out) only cooled the core. There was no VRM channel like there is now. With only the passive properties of the block and no airflow things got very hot. I tried those and some other ones, I think they were Koolance, that had the same problem.


I bought mine at launch; I was one of the first members of the club. I watercooled mine after a week on air.


----------



## Scorpion49

Quote:


> Originally Posted by *rdr09*
> 
> i bought mine at launch. i was one of the first members of the club. i watercooled mine after a week on air.


Ah yeah you're right, I looked back through my pictures of them and I had Koolance and Swiftech blocks and switched to the EK when they were available because of the high VRM1 temps. Anyways, I wasn't trying to knock EK or whatever it is you got defensive about, the point still stands that Hawaii needs adequate cooling on the power delivery parts and passive waterblocks have proven to not be great at it.


----------



## wickedsunny

Quote:


> Originally Posted by *Scorpion49*
> 
> The first gen EK blocks (back in 2013 when Hawaii came out) only cooled the core. There was no VRM channel like there is now. With only the passive properties of the block and no airflow things got very hot. I tried those and some other ones, I think they were Koolance, that had the same problem.


Even a Sapphire with three fans needs extra cooling?

What are the average temps of an R9 390 without water cooling?


----------



## Scorpion49

Quote:


> Originally Posted by *wickedsunny*
> 
> Even a sapphire with three fans needs cooling?
> 
> What are the average temps. of R9 390 without water cooling?


You've misunderstood my post I think, your sapphire card is fine.


----------



## rdr09

Quote:


> Originally Posted by *Scorpion49*
> 
> Ah yeah you're right, I looked back through my pictures of them and I had Koolance and Swiftech blocks and switched to the EK when they were available because of the high VRM1 temps. Anyways, I wasn't trying to knock EK or whatever it is you got defensive about, the point still stands that Hawaii needs adequate cooling on the power delivery parts and passive waterblocks have proven to not be great at it.


Just trying to put out the right info to help others. You are probably referring to the universal blocks, 'cause Koolance full blocks are just as good as EK's.


----------



## wickedsunny

Quote:


> Originally Posted by *Scorpion49*
> 
> You've misunderstood my post I think, your sapphire card is fine.


Sigh, that is a relief.


----------



## Scorpion49

Quote:


> Originally Posted by *rdr09*
> 
> just trying to put out the right info to help others. you are prolly referring to the universal blocks 'cause koolance full blocks are just as good as ek's.


These right here. VRM cooling was absolutely terrible:


----------



## rdr09

Quote:


> Originally Posted by *Scorpion49*
> 
> These right here, VRM cooling was absolutely terrible:


Yah, now I remember. Those are no better than universals. They came out with a revision; EK had it right the first time.


----------



## Scorpion49

Quote:


> Originally Posted by *rdr09*
> 
> yah, now i remember. those are no better than universals. they came out with a revision. ek had it right the first time.


Yeah, the EK blocks were great, I just got them backwards because it's been so many years, haha. Well anyway, I hope the Alphacool blocks work well; it's kind of an interesting design, with a universal block stuck into a full-cover heatsink. The price is good too, less than $100 even with a full backplate. Some people think they're ugly, but I kind of like how it looks.


----------



## fat4l

OK, so I finished my ROM for 1100MHz/1500MHz.
GPU1: 1.212V set in DPM7
GPU2: 1.250V set in DPM7

Real max voltage is:
GPU1: 1.195V
GPU2: 1.234V

Would be nice to see how many volts your 290X/390X needs for 1100/1500MHz clocks
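The gap between the voltage set in the DPM table and the voltage actually read back is the load-line droop. A throwaway sketch using just the numbers quoted above (the set/real values are from this post; nothing here is measured by me):

```python
# Set (DPM7) vs. real voltage from the post, in volts.
# The difference is vdroop under load from the VRM's load-line.
cards = {
    "GPU1": {"set": 1.212, "real": 1.195},
    "GPU2": {"set": 1.250, "real": 1.234},
}

for name, v in cards.items():
    droop_mv = (v["set"] - v["real"]) * 1000
    print(f"{name}: set {v['set']:.3f} V, real {v['real']:.3f} V, "
          f"droop {droop_mv:.0f} mV")
```

Both cards droop by roughly the same 16-17mV, which suggests the difference between the two is binning (GPU2 simply needs more voltage), not the load-line.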


----------



## kubiks

Quote:


> Originally Posted by *Scorpion49*
> 
> Ah, I ordered mine yesterday. Was waiting for a US retailer to have them. Hopefully they work great, looks like no active VRM cooling on them but no other choices at the moment.


Yeah, I hear ya. I ordered mine a few days ago, but I checked for US retailers and didn't see anyone carrying it, even at ModMyMods, though I may have missed it. According to the various reviews, they cool the VRMs a bit, and the core plate extends onto the heatsink, so it cools more than just the core. People are so caught up in bashing semi-cover blocks, but we've been using them for years. It's a personal preference or a necessity depending on the situation. Necessity this time!

What struck me the most about this product is that the core cools VERY VERY WELL. Lower temps than most blocks, but it's a bit restrictive. There's an ExtremeRigs review of the block on YouTube, and the core cools better than on all other tested full blocks. Pretty impressive.


----------



## AverdanOriginal

So guys,
Finally, after having installed my new sweet Skylake mothership (i5-6600K @ 4.5GHz with BCLK 125MHz, multi @ 36x, cache @ 4.25GHz and FCLK @ 1.25GHz), having tweaked my airflow a bit more with Air Penetrators from SilverStone, and having changed the TIM, I went ahead and went all the way (well, at least for me, all the way) trying to find the highest possible overclock on my MSI R9 390.

Ended up with *1210MHz Core @ +100mV* and *1700MHz Memory @ +25 aux*.








Any higher and the artifacting in Firestrike got crazy (even with higher aux voltage), and it didn't even finish a bench run.

Here my Firestrike 1.1 Scores:
http://www.3dmark.com/3dm/9117603

The card reached 80°C







with ambient being 22.6°C (though my fans were spinning at only 84%)

It seems this overclock pretty much lands around the same as most others here with an MSI R9 390. Also, as most have mentioned, more overclocking will not necessarily bring you more FPS or much higher scores in Firestrike or Heaven. So 1200/1700 seems kind of normal for the high-binned MSIs, and the sweet spot?!
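One way to sanity-check those diminishing returns: the graphics score can scale at best linearly with core clock, so the theoretical ceiling for a given bump is easy to compute. A rough sketch, assuming a ~1040MHz factory clock for the MSI 390 (that clock is my assumption; linear scaling is the optimistic case, real gains are lower):

```python
# Best-case (linear) scaling estimate for a core overclock.
# Real Firestrike gains are lower because memory bandwidth,
# the CPU and other bottlenecks don't scale with core clock.

def best_case_gain(base_mhz, oc_mhz):
    """Upper bound on relative performance gain from a core clock bump."""
    return oc_mhz / base_mhz - 1.0

STOCK_MHZ = 1040  # assumed MSI R9 390 Gaming factory clock
for oc in (1130, 1210):
    print(f"{oc} MHz: at most {best_case_gain(STOCK_MHZ, oc):.1%} over stock")
```

So going from 1130 to 1210MHz can buy at most ~7% more on top, before any bottleneck eats into it, which matches the "not much higher scores" observation above.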

For everyday gaming at 1080p I use the normal overclock of *1130/1680 with 0mV and 0 aux adjustment*.
For next summer I already have my nice undervolt of *1030/1680 with -87mV*, which brings me another -10°C in case I get 37°C in my room again.

Here the detailed steps (I love documenting things in order to see what went wrong or also to follow my stupid thoughts later on)



At the overclocking step, moving up from 1140/1500, I switched to Skylake, hence the higher Firestrike scores. (Although I once forgot to turn off the framerate cap in CCC







)

Any hints, tips, suggestions are always welcome.









@AgentSmith: could you please add this overclock to the list on the first page? (My account is already listed.)


----------



## kizwan

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> just trying to put out the right info to help others. you are prolly referring to the universal blocks 'cause koolance full blocks are just as good as ek's.
> 
> 
> 
> These right here, VRM cooling was absolutely terrible:
Click to expand...

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> yah, now i remember. those are no better than universals. they came out with a revision. ek had it right the first time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah the EK blocks were great, I just got them backwards because its been so many years haha. Well anyway, I hope the Alphacool blocks work well, kind of an interesting design with the universal block stuck into a full-cover heatsink. The price is good too, less than $100 even with a full backplate. Some people think they're ugly but I kind of like how it looks.
Click to expand...

I was going to ask you to post pictures of the EK initial/first gen waterblock for the Hawaii because I have the EK first gen full waterblock for Hawaii. :lol:


----------



## Streetdragon

I'm looking forward to the Alphacool NexXxoS GPX tests from you guys!
Wanna overclock my Nitro without melting the VRMs ^^


----------



## TsukikoChan

So, Catalyst is being replaced with Crimson now? I wonder when the first version comes out and how it will affect us.


----------



## flopper

Quote:


> Originally Posted by *TsukikoChan*
> 
> so, catalyst is being replaced with crimson now? i wonder when the first version comes out and how it will affect us


Soon. They're going for looks, which is a step forward.
Since they formed the Radeon group there will be a bit more focus on software along the way, as VR etc. are coming.
They changed the app source from .NET to Qt, which should cut the app's loading time by seconds if the source is correct.

I am more laid back atm, waiting for Zen and the next-generation (Greenland) islands next year.
Going all Crimson next year


----------



## tolis626

So today I decided to do a little overclocking and benchmarking and at 1185/1725MHz with +100/+55mV these just happened.

http://www.3dmark.com/3dm/9125612?

http://www.3dmark.com/fs/6377920

14725 graphics in normal Firestrike and 6481 in Extreme? Holy crap. And it didn't even break 75C. I love it when it's cold. I mean, it's like 15-20C still, but it's getting lower. Maybe I'll give Trixx a go later on...


----------



## vane100

Hey guys,

Small question. I've recently started OC'ing my MSI R9 390 8G.

Current MSI Afterburner settings
+100 mV
+50 power
Core Clock 1170 MHz
Mem clock 1700 MHz

While running GTA 5 I'm getting 0 artifacts under 100% load, playing for an hour or so. The only thing is that my GPU gets to around 80 degrees Celsius.
My idle temps are around 63 degrees. I'm not sure whether these temps can harm my GPU in the long run?

What are normal (non harmful) 100% load temperatures for a MSI R9 390?

Thanks!


----------



## Cannon19932006

Alright guys, got a scenario for you.

I just got a great new job and have quite a bit of extra spending cash, and I want to get another 390X. The question is, do I get the MSI 390X and have a matching pair?

Or do I get some other 390x variant.

Which 390x should I get guys?


----------



## Gumbi

Quote:


> Originally Posted by *Cannon19932006*
> 
> Alright guys, got a scenario for you.
> 
> I just got a great new job and have quite a bit of extra spending cash and want to get another 390x, the question is, do I get the MSI 390x and have matching pair.
> 
> Or do I get some other 390x variant.
> 
> Which 390x should I get guys?


Another MSI would be fine. They have some of the better cooling in the 390(x) series of cards (especially VRMs).

Make sure you have very good case cooling if you want to use 2 390xs.


----------



## Gumbi

Quote:


> Originally Posted by *vane100*
> 
> Hey guys,
> 
> Small question. I've recently started OC'ing my MSI R9 390 8G.
> 
> Current MSI Afterburner settings
> +100 mV
> +50 power
> Core Clock 1170 MHz
> Mem clock 1700 MHz
> 
> While running GTA 5 Im getting 0 artifacts under 100% load for playing like 1 hour or so. The only thing is that my GPU gets to around 80 degrees celsius.
> My idle temps are around 63 degrees. I'm not sure whether these temps can harm my GPU on the long run?
> 
> What are normal (non harmful) 100% load temperatures for a MSI R9 390?
> 
> Thanks!


Completely fine. Game on


----------



## Cannon19932006

Anyone know if the MSI 390x Gaming *LE* is any good?


----------



## H1vemind

Hi there guys and gals, I just ordered myself a Sapphire Nitro 390 and hope to join when it gets here. I've been following the thread for a while to help me decide whether or not to go for it.


----------



## Casterina

What's the best software to stress/benchmark the card so I can see the temps when under load?


----------



## tangelo

Quote:


> Originally Posted by *Casterina*
> 
> What's the best software to stress/benchmark the card so I can see the temps when under load?


I would suggest Valley and/or Firestrike in 3DMark. Stay away from FurMark.


----------



## battleaxe

Quote:


> Originally Posted by *Cannon19932006*
> 
> Anyone know if the MSI 390x Gaming *LE* is any good?


It's a lower-binned version. If you don't plan to OC then it would be fine. Otherwise... Get the better one.


----------



## diggiddi

3dmark Firestrike


----------



## Rob27shred

Quote:


> Originally Posted by *Casterina*
> 
> What's the best software to stress/benchmark the card so I can see the temps when under load?


Unigine Valley & Heaven would be best for stressing; 3DMark Firestrike would be best for benching (the scores are kept in a database & comparable with others') IMO.
http://unigine.com/en/products/benchmarks/heaven
http://unigine.com/en/products/benchmarks/valley
http://www.futuremark.com/benchmarks/3dmark?_ga=1.136484912.1213891817.1446507546


----------



## Mister300

Perfectly fine; you can mod the fan curve in AB if you are concerned. BTW, my XFX inquiry said the XFX 390X card can run in the mid-90°C range and still be replaced under warranty if it were to break. Mine hits 82°C max in some scenarios with stock cooling in an NZXT 440. Would recommend new thermal paste.
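A custom fan curve like the one Afterburner lets you draw is just piecewise-linear interpolation between (temperature, duty) points. A minimal sketch of that idea, with made-up points rather than a recommendation for any particular card:

```python
# Piecewise-linear fan curve: maps GPU temperature (°C) to fan duty (%).
# The points below are illustrative only.
CURVE = [(40, 20), (60, 35), (75, 60), (85, 100)]

def fan_duty(temp_c):
    """Interpolate fan duty between the surrounding curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(70))
```

The practical point: steepening the segment between ~75°C and ~85°C is what keeps a Hawaii/Grenada card out of the 90s under sustained load.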


----------



## tolis626

Quote:


> Originally Posted by *gupsterg*
> 
> Personally I wouldn't go over 1.05V ie +50mv in MSI AB, seen Sapphire 390/X roms have this.
> 
> The info I got from the Stilt states never to go over 1.10v ie +100mv in MSI AB.
> 
> MSI AB doesn't allow more, unless there is a command line switch that can be added to shortcut properties to increase it, just like the limit on GPU core voltage can be raised.
> 
> Other OC apps I don't know about as don't use them.
> 
> Recently an owner in the 290/X owners thread posted it helped attain better OCs on a poor clocking card, with myself I found it had no effect on my card's OC ability. SO I'm guessing depends case by case.


Even if a bit late, may I ask what the Stilt told you? My card seems to struggle with memory overclocking, especially after the 15.10 driver. I've reverted back to 15.7.1 and it's somewhat better, but it still takes +55mV AUX to do 1700MHz, and even that is semi-stable and causes my core overclocks to crash easier. I mean, it takes like an extra 5mV for the core to be stable compared to what was stable without getting the memory so high.

I know I'm getting deep into "it's f-ing stupid" territory, but I'm kind of disappointed I can't get high memory clocks, especially seeing others hitting 1750MHz with ease.


----------



## kubiks

Wow, so I posted a few days ago after ordering my pair of Alphacool NexXxoS GPX 390 M02 blocks from Aquatuning in Germany. I immediately had buyer's remorse, as another member found them at a US vendor and they were actually in stock! Needless to say, I was pretty bummed out. Yesterday I got an email that they had shipped out from the warehouse in Germany. This morning I woke up and checked the tracking... they are already through US customs in Philly. Holy crap! How could this possibly be? My order for the rads and fittings/pump from Performance-PCs only moved from Cocoa, Florida to Orlando, Florida in the time it took my two waterblocks to go through customs twice and cross an ocean!

I am blown away by the service and shipping from Aquatuning. I was hesitant to purchase internationally and wanted to wait for a domestic vendor. I did pay for express shipping, but it was only 3 dollars more than the standard shipping. Just letting you all know in case you are on the fence about ordering from them.


----------



## Rmosher

Quote:


> Originally Posted by *tolis626*
> 
> Even if a bit late, may I ask what the Stilt told you? My card seems to struggle with memory overclocking, especially after the 15.10 driver. I've reverted back to 15.7.1 and it's somewhat better, but it still takes +55mV AUX to do 1700MHz, and even that is semi-stable and causes my core overclocks to crash easier. I mean, it takes like an extra 5mV for the core to be stable compared to what was stable without getting the memory so high.
> 
> I know I'm getting deep into "it's f-ing stupid" territory, but I'm kind of disappointed I can't get high memory clocks, especially seeing others hitting 1750MHz with ease.


What make is your card again? It seems that the Sapphire cards in particular are for some reason more likely to hit 1700+ MHz memory speeds. I have a Sapphire 390X and was able to hit 1750 on memory with no voltage bump. But other makes don't seem to fare so well in this regard as far as I can tell. Not sure why.


----------



## tolis626

Quote:


> Originally Posted by *Rmosher*
> 
> What make is your card again? It seems that the Sapphire cards in particular are for some reason more likely to hit 1700+ Mhz memory speeds. I have a Sapphire 390X and I was able to hit 1750 on memory with no voltage bump. But other makes don't seem to fare so well in this regard as far as I can tell. Not sure why


It's an MSI 390x. Sapphire cards have 50mV added to their AUX voltage by default, so that's a thing. But even so, I can't go too far. Bummer...


----------



## Dundundata

My MSI core clocks great, the memory not so much. My XFX was the opposite. I wouldn't worry about it.


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> My MSI core clocks great, the memory not so much. My XFX was the opposite. I wouldn't worry about it.


Well, I can't say my MSI's core is a great overclocker. The max it's hit while being stable is 1185MHz at +100mV. Anything higher and I get artifacting. I think it has something to do with temperatures though, so I will give it a try again once I grow enough balls to change my card's TIM.









Thing is, I'm wondering whether overclocking the core affects the memory and vice versa. It seems a bit more... Sensitive when both are higher. It certainly crashes quicker if it's going to crash. I'll keep abusing it in the next few days and see what I can come up with.

For now, I seem to have found a stable-ish, usable-ish overclock of 1165/1725MHz at +75/+60mV. Now normally I'm able to do 1170MHz with +75mV, but I tried running Heaven at 1170/1725 and it crashed. At 1165/1725 it went through half an hour of Valley though. So there's that. Alas, +60mV AUX does seem quite high... I'm really struggling to decide whether I should keep using it like that or not.


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Well, I can't say my MSI's core is a great overclocker. The max it's hit while being stable is 1185MHz at +100mV. Anything higher and I get artifacting. I think it has something to do with temperatures though, so I will give it a try again once I grow enough balls to change my card's TIM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thing is, I'm wondering whether overclocking the core affects the memory and vice versa. It seems a bit more... Sensitive when both are higher. It certainly crashes quicker if it's going to crash. I'll keep abusing it in the next few days and see what I can come up with.
> 
> For now, I seem to have found a stable-ish, usable-ish overclock of 1165/1725MHz at +75/+60mV. Now normally I'm able to do 1170MHz with +75mV, but I tried running Heaven at 1170/1725 and it crashed. At 1165/1725 it went through half an hour of Valley though. So there's that. Alas, +60mV AUX does seem quite high... I'm really struggling to decide whether I should keep using it like that or not.


Have you tried running Trixx and pushing more than 100mv? If not, give that a try.


----------



## XxxxVulcanxxxX

I'm using the MSI 390X, but on an older PCIe Gen 2 slot. What (if anything) am I losing out on? My FPS are all where they should be and my Firestrike score is a respectable 10748, but I'm struggling to get a stable overclock over 1120MHz core or anything over 1650 on the VRAM. This could be heat related though, as I'm touching 82°C in stress testing...

Would the older PCIe generation be the reason I keep getting the card recognized as a 290X?


----------



## tolis626

Quote:


> Originally Posted by *battleaxe*
> 
> Have you tried running Trixx and pushing more than 100mv? If not, give that a try.


That's a big no-no for now. It already has a tendency to overheat at +100mV, so I'm afraid to go any higher without doing something about the cooling first. As soon as I replace the TIM, I may go up to +125mV or something, but certainly not higher. I'm too broke to buy another GPU right now, so I can't risk it.









Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> I'm using the MSI 390X, but on an older PCIe Gen 2 slot. What (if anything) am I losing out on? My FPS are all where they should be and my Firestrike score is a respectable 10748, but I'm struggling to get a stable overclock over 1120MHz core or anything over 1650 on the VRAM. This could be heat related though, as I'm touching 82°C in stress testing...
> 
> Would the older PCIe generation be the reason I keep getting the card recognized as a 290X?


The MSI gets recognized as an 8GB 290x for some reason in Firestrike. It's normal, don't worry about it. Also, as long as you're running it at full x16 bandwidth, you'll most probably be fine. I'd say you'll be completely fine but, you know, one can never be sure. Just think of it as PCIe 3.0 x8, it's the same thing.

Now, regarding overclocking. You will need to push some volts into it to go higher than 1120MHz. Use Afterburner for that and also set a more aggressive fan curve in Afterburner's settings. The stock MSI one is aimed towards silent operation, so it tends to let your card run hot. Also, make sure your case airflow is adequate. These cards can greatly benefit from good airflow. 1650MHz for the RAM is fine. It already gives you over 400GB/s of bandwidth. Unless you're like me and you're chasing numbers, in which case... Yeah... Going higher will require you to raise your AUX voltage.

Just start by sticking to the basics, don't mess with everything at the same time.
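As a rough sanity check on that 400GB/s figure, here's a back-of-the-envelope sketch (assuming the 390's 512-bit bus and GDDR5's four data transfers per memory clock; `gddr5_bandwidth_gbs` is just an illustrative helper, not a real tool):

```python
# Back-of-the-envelope GDDR5 bandwidth estimate for a 512-bit bus (R9 390/390X).
# GDDR5 is quad-pumped: it moves 4 bits per pin per memory clock.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    transfers_per_sec = mem_clock_mhz * 1e6 * 4   # effective transfer rate
    bytes_per_transfer = bus_width_bits / 8       # bus width in bytes
    return transfers_per_sec * bytes_per_transfer / 1e9

print(gddr5_bandwidth_gbs(1500))  # stock 1500 MHz -> 384.0 GB/s
print(gddr5_bandwidth_gbs(1650))  # 1650 MHz -> 422.4 GB/s, i.e. "over 400 GB/s"
```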


----------



## XxxxVulcanxxxX

Quote:


> Originally Posted by *tolis626*
> 
> That's a big no-no for now. It already has a tendency to overheat at +100mV, so I'm afraid to go any higher without doing something about the cooling first. As soon as I replace the TIM, I may go up to +125mV or something, but certainly not higher. I'm too broke to buy another GPU right now, so I can't risk it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The MSI gets recognized as an 8GB 290x for some reason in Firestrike. It's normal, don't worry about it. Also, as long as you're running it at full x16 bandwidth, you'll most probably be fine. I'd say you'll be completely fine but, you know, one can never be sure. Just think of it as PCIe 3.0 x8, it's the same thing.
> 
> Now, regarding overclocking. You will need to push some volts into it to go higher than 1120MHz. Use Afterburner for that and also set a more aggressive fan curve in Afterburner's settings. The stock MSI one is aimed towards silent operation, so it tends to let your card run hot. Also, make sure your case airflow is adequate. These cards can greatly benefit from good airflow. 1650MHz for the RAM is fine. It already gives you over 400GB/s of bandwidth. Unless you're like me and you're chasing numbers, in which case... Yeah... Going higher will require you to raise your AUX voltage.
> 
> Just start by sticking to the basics, don't mess with everything at the same time.


I am running the full PCIe Gen 2.0 x16. I didn't think it would make much of a difference; the scores I'm getting are in the right ballpark. I'm not trying to break world records, I just feel like I'm lagging behind what other users are getting. I have a Define S case with 6 140mm fans, so airflow is not an issue. I did drop a massive 15°C in The Witcher 3 by putting a fan on the side panel blowing directly onto the card and an extra exhaust in the roof; before that I actually had a red screen of death and heat soak issues in the case. Might just stick where I am for now and try again when I get the watercooling installed in the new year...

Love the card though, playing pretty much everything maxed out, at 1440p in most cases. Not bad for a £250 card (a friend bought it by mistake and lost the receipt so he couldn't return it; he bought a 980 Ti instead).


----------



## Dundundata

Besides benching I've been quite timid with my card only running +25mV core @1150/1650. I've clocked higher and temps aren't an issue but I get around 60fps in Witcher 3 with hairworks on (most of the time).

Can anyone recommend a good fan controller? I have 6 case fans running @100%, which is fine for the most part as they are not too loud, but it's really unnecessary when I'm not gaming. Ideally it would fit in a 5.25 drive and not be as deep as a disc drive, or something I can mount elsewhere. Really I just want to be able to turn all the fans down and then crank them up when OCing. So actually the less dials/switches I need to turn the better.


----------



## Gumbi

Quote:


> Originally Posted by *Dundundata*
> 
> Besides benching I've been quite timid with my card only running +25mV core @1150/1650. I've clocked higher and temps aren't an issue but I get around 60fps in Witcher 3 with hairworks on (most of the time).
> 
> Can anyone recommend a good fan controller? I have 6 case fans running @100%, which is fine for the most part as they are not too loud, but it's really unnecessary when I'm not gaming. Ideally it would fit in a 5.25 drive and not be as deep as a disc drive, or something I can mount elsewhere. Really I just want to be able to turn all the fans down and then crank them up when OCing. So actually the less dials/switches I need to turn the better.


I can't recommend a specific one, but I have an excellent one built into my case, with 2 channels (up to 3 fans or 18w per channel - which is plenty btw).

I have 4 Noctua 2k RPM fans hooked up which run whisper quiet at the flick of a slider, and give me *superb* temps when cranked up. I can pump 200mV into my Vapor-X like it's nothing and have VRMs in the low 50s easily.

The case is a Nanoxia Deep Silence 1.


----------



## Darkeylel

Quote:


> Originally Posted by *Dundundata*
> 
> Besides benching I've been quite timid with my card only running +25mV core @1150/1650. I've clocked higher and temps aren't an issue but I get around 60fps in Witcher 3 with hairworks on (most of the time).
> 
> Can anyone recommend a good fan controller? I have 6 case fans running @100%, which is fine for the most part as they are not too loud, but it's really unnecessary when I'm not gaming. Ideally it would fit in a 5.25 drive and not be as deep as a disc drive, or something I can mount elsewhere. Really I just want to be able to turn all the fans down and then crank them up when OCing. So actually the less dials/switches I need to turn the better.


I had this on my previous build worked a treat

http://www.pccasegear.com/index.php?main_page=product_info&products_id=27219&cPath=408

Haven't installed it into my new build just haven't had the time to rewire everything


----------



## kubiks

Quote:


> Originally Posted by *Dundundata*
> 
> My MSI core clocks great, the memory not so much. My XFX was the opposite. I wouldn't worry about it.


Good thing the core is what we really like to clock the most! Besides, with crossfired R9 390X MSI Gamings, that's 16 gigs of VRAM. I haven't seen big performance boosts from clocking the VRAM, even with a 100MHz boost.


----------



## kubiks

Quote:


> Originally Posted by *Streetdragon*
> 
> im looking forward for the Alphacool nexxxos GPX tests from you guys!
> Wanna overclock my Nitro without VRM-Melting^^


I cant wait either! Tomorrow is the day, probably be up and running with deltas and specs this weekend!


----------



## gupsterg

Quote:


> Originally Posted by *tolis626*
> 
> Even if a bit late, may I ask what the Stilt told you?


a) stay below 1.10v

b) higher than 1.05v makes no difference

c) difference with increased VDDCI is marginal

Nothing more.


----------



## mandrix

So 1.05V for VDDCI. Can anyone say what is considered a "safe" max voltage for VDDC?

My PCS 390X will automatically set VDDC to 1.242V (momentarily; it averages about 1.16V during benching) when I use AB to increase the core clock. But I found that if I add +30mV VDDC it rounds out to 1.25V in AB and helps get a slightly higher core clock.
I would like to test higher VDDC, unless it's likely to damage the card.


----------



## AverdanOriginal

Quote:


> Originally Posted by *TsukikoChan*
> 
> so, catalyst is being replaced with crimson now? i wonder when the first version comes out and how it will affect us


Quote:


> Originally Posted by *flopper*
> 
> soon, they going for looks which is a step forward.
> since they formed the radeon group there be a bit more focus on software along the way as VR etc..are coming.
> they changed from the .net to .QT for the app source code which will change the loading of the app by seconds if the source is correct.
> 
> I am more laid back atm, waiting for zen and the next generation green awesome island next year.
> going all crimson next year


Correct. I just read up on this, and the source (PCGH) also stated that it will boot much quicker (only 0.6 seconds in the future instead of ~8 seconds).
Additional features they will most probably implement:
- Design: better looks, split into Gaming-Video-Display-System. Additionally, on the bottom there will be two tabs, "Update" and "Messages" (or so), where you can look into error messages in case you experienced a crash of some sort.
- Overdrive (and this might be really interesting): OC profiles customizable per game, so for each game you can adjust the GPU overclock to your needs. You can set max fan speed and max temperature on top of max overclock and power target for each of your games.

The only thing I'm missing is information on whether they will include a fan curve and voltage adjustment. If so, I would even consider uninstalling MSI Afterburner.
Oh, and I hope they include true 4K downsampling for the R9 300 series, since it currently only goes to 1800p and not 2160p.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Casterina*
> 
> What's the best software to stress/benchmark the card so I can see the temps when under load?


Quote:


> Originally Posted by *tangelo*
> 
> I would suggest Valley and/or Firestrike on 3dmark. Stay away from furmark


Quote:


> Originally Posted by *Rob27shred*
> 
> Unigine Valley & Heaven would be best for stressing, 3DMark Firestirke would be best for benching (the scores are kept in a database & comparable with others) IMO.
> http://unigine.com/en/products/benchmarks/heaven
> http://unigine.com/en/products/benchmarks/valley
> http://www.futuremark.com/benchmarks/3dmark?_ga=1.136484912.1213891817.1446507546


Yep, I'd second that.
Heaven purely stresses your GPU, so it's good for testing just the graphics card overclock; an additional CPU overclock affects Heaven scores very little, while in Firestrike 1.1 a faster CPU can bring much higher scores.

For checking stability, a quick run through Firestrike is a good first test (it will show artifacting or stop much sooner than Heaven or Valley). For endurance, let Heaven run for an hour.

The difference between Valley and Heaven is that Valley splits the load more between GPU and CPU. So to compare GPUs, Heaven is the better option since it really only compares the GPUs, while a Valley benchmark can show bigger differences due to different hardware setups (different CPUs, motherboards, RAM...).

But most importantly, like tangelo already said: *STAY AWAY from FURMARK*.


----------



## TsukikoChan

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Correct. I just read up on this, and the source (PCGH) also stated that it will boot much quicker (only 0.6 seconds in the future instead of ~8 seconds).
> Additional features they will most probably implement:
> - Design: better looks, split into Gaming-Video-Display-System. Additionally, on the bottom there will be two tabs, "Update" and "Messages" (or so), where you can look into error messages in case you experienced a crash of some sort.
> - Overdrive (and this might be really interesting): OC profiles customizable per game, so for each game you can adjust the GPU overclock to your needs. You can set max fan speed and max temperature on top of max overclock and power target for each of your games.
> 
> The only thing I'm missing is information on whether they will include a fan curve and voltage adjustment. If so, I would even consider uninstalling MSI Afterburner.
> Oh, and I hope they include true 4K downsampling for the R9 300 series, since it currently only goes to 1800p and not 2160p.


The oc per game will be awesome, esp as you said if they add in custom fan profiles per game too. i doubt we'd get rid of afterburner or trixx for a while yet 
The only thing I'm hoping for is better dx9/11 driver optimisations. i know dx12 and vulkan are their priorities now but it would be nice


----------



## BradleyW

Quote:


> Originally Posted by *TsukikoChan*
> 
> The oc per game will be awesome, esp as you said if they add in custom fan profiles per game too. i doubt we'd get rid of afterburner or trixx for a while yet
> The only thing I'm hoping for is better dx9/11 driver optimisations. i know dx12 and vulkan are their priorities now but it would be nice


Even if they improve it, they won't state it on the notes because that means AMD would be admitting to the issue in the first place. AMD *never* state all their fixes on notes.


----------



## DarknightOCR

Hello, back to PCs. I bought a Sapphire R9 390 Tri-X Nitro.
It turns out that EK has no compatible block.
I've been studying the PCBs, and I will buy a block for the R9 290; I think with some small changes it can be made to mount.
I'll only have to cut/trim in 2 or 3 places.
I'll share the results afterwards.


----------



## AverdanOriginal

Quote:


> Originally Posted by *tolis626*
> 
> Well, I can't say my MSI's core is a great overclocker. The max it's hit while being stable is 1185MHz at +100mV. Anything higher and I get artifacting. I think it has something to do with temperatures though, so I will give it a try again once I grow enough balls to change my card's TIM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thing is, I'm wondering whether overclocking the core affects the memory and vice versa. It seems a bit more... Sensitive when both are higher. It certainly crashes quicker if it's going to crash. I'll keep abusing it in the next few days and see what I can come up with.
> 
> For now, I seem to have found a stable-ish, usable-ish overclock of 1165/1725MHz at +75/+60mV. Now normally I'm able to do 1170MHz with +75mV, but I tried running Heaven at 1170/1725 and it crashed. At 1165/1725 it went through half an hour of Valley though. So there's that. Alas, +60mV AUX does seem quite high... I'm really struggling to decide whether I should keep using it like that or not.


Actually, when I overclock my graphics card I tend to find my highest memory OC first (leaving the core at stock), then return the memory to stock and see how far I can OC the core. Then you have your max on each. When you combine both, you nearly always need to dial both down, since the card has to split its power budget between the two clocks. So memory OC and core OC definitely affect each other, and your OC of 1165/1725 is actually quite good.
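That routine (max each clock separately, then back both off together) can be sketched as a simple search loop. This is purely illustrative: `is_stable` is a hypothetical stand-in for actually running a stress test (Heaven/Valley) at the given clocks, and the step sizes are arbitrary.

```python
# Sketch of the "max each clock separately, then back both off together"
# overclocking routine. is_stable(core, mem) is a stand-in for running a
# real stress test (e.g. Heaven/Valley) at those clocks.
def max_core(core, mem, step, is_stable):
    # Raise the core clock until the test fails, memory held at stock.
    while is_stable(core + step, mem):
        core += step
    return core

def max_mem(core, mem, step, is_stable):
    # Raise the memory clock until the test fails, core held at stock.
    while is_stable(core, mem + step):
        mem += step
    return mem

def combined_oc(stock_core, stock_mem, core_step, mem_step, is_stable):
    best_core = max_core(stock_core, stock_mem, core_step, is_stable)
    best_mem = max_mem(stock_core, stock_mem, mem_step, is_stable)
    # The two maxima are rarely stable together, so dial both down
    # until the stress test passes at the combined clocks.
    core, mem = best_core, best_mem
    while not is_stable(core, mem):
        core -= core_step
        mem -= mem_step
    return core, mem
```

With a toy stability model where the card only tolerates a limited combined overclock, this lands a little below each individual maximum, which is exactly the behavior people report here.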


----------



## AverdanOriginal

Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> I am running the full PCIe Gen 2.0 x16. I didn't think it would make much of a difference; the scores I'm getting are in the right ballpark. I'm not trying to break world records, I just feel like I'm lagging behind what other users are getting. I have a Define S case with 6 140mm fans, so airflow is not an issue. I did drop a massive 15°C in The Witcher 3 by putting a fan on the side panel blowing directly onto the card and an extra exhaust in the roof; before that I actually had a red screen of death and heat soak issues in the case. Might just stick where I am for now and try again when I get the watercooling installed in the new year...
> 
> Love the card though, playing pretty much everything maxed out, at 1440p in most cases. Not bad for a £250 card (a friend bought it by mistake and lost the receipt so he couldn't return it; he bought a 980 Ti instead).


Hi,
I switched recently from an AMD Phenom II X6 1055T @ 3.7GHz on a PCIe Gen 2 board to Skylake. Scores improved in Firestrike but not in Heaven; the improvement was mainly related to the CPU switch, not necessarily the PCIe generation. BUT what I noticed was that adding more voltage resulted in less artifacting, hence with PCIe Gen 3 I was able to overclock better.
At the same clocks, though, you would not see much difference in FPS.

Seeing your R9 390 as a 290 in Firestrike might be related to an older version of Firestrike. Do you have the most up-to-date version?


----------



## AverdanOriginal

Quote:


> Originally Posted by *Dundundata*
> 
> Besides benching I've been quite timid with my card only running +25mV core @1150/1650. I've clocked higher and temps aren't an issue but I get around 60fps in Witcher 3 with hairworks on (most of the time).
> 
> Can anyone recommend a good fan controller? I have 6 case fans running @100%, which is fine for the most part as they are not too loud, but it's really unnecessary when I'm not gaming. Ideally it would fit in a 5.25 drive and not be as deep as a disc drive, or something I can mount elsewhere. Really I just want to be able to turn all the fans down and then crank them up when OCing. So actually the less dials/switches I need to turn the better.


Nice OC with fairly low mv







I am using the Lamptron FC5 v2: 4 channels with 30W each. I needed the 30 watts since I wanted to dim all my LEDs on one regulator (3 LEDs at 6.4W each), and I have 4 fans connected across two channels. I'm planning to install another color LED on the fourth channel since it's still free.







Size is fairly small and LCD shows RPM and Volts/Temperature.


----------



## kubiks

I love a good project. I don't really care for ekwb products, so I avoided that road, buying a full block and backplate and modding. I feel ekwb is overpriced and I don't care for the designs.

Regardless, that's a great project, and I'm definitely looking forward to seeing your results. If the mods are simple, you may see a ton of people in the community picking that up.


----------



## tsgrayson

Hello 390x owners! I have been a long time nvidia user but have been thinking the 390x in crossfire or trifire might be good for me for my new system. My question is for those that run this setup, any problems or regrets? I am looking at pairing it with an Acer XR34.


----------



## Dundundata

Averdan, I actually just placed an order for a Lamptron FC6 20Wx4. No LEDs at the moment but I've been thinking of adding a strip, to see all those sexy parts







. Nice to know I can hook it up to the fan controller.

As for the card, I absolutely love the MSI...great clocks and runs really cool in my case o'fans. Core and VRM temps are excellent.


----------



## SamuelGamer

Its Pretty Noice.


----------



## XxxxVulcanxxxX

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi,
> I switched recently from an AMD Phenom II X6 1055T @ 3.7GHz on a PCIe Gen 2 board to Skylake. Scores improved in Firestrike but not in Heaven; the improvement was mainly related to the CPU switch, not necessarily the PCIe generation. BUT what I noticed was that adding more voltage resulted in less artifacting, hence with PCIe Gen 3 I was able to overclock better.
> At the same clocks, though, you would not see much difference in FPS.
> 
> Seeing your R9 390 as a 290 in Firestrike might be related to an older version of Firestrike. Do you have the most up-to-date version?


@AverdanOriginal I just looked on 3DMark and all my scores now show that I have a 390X. They must have updated something on their end, as it was showing a 290X only a few days ago; I have 3DMark through Steam so it's always been kept up to date, so this must be a behind-the-scenes update to their database that affected my older scores. I had a suspicion the PCIe 2.0 slot might have an impact, but I'm just not willing to spend more upgrading for a few extra MHz. At least not until AMD show us what Zen can really do, and Intel's inevitable counterattack.

@kubiks I fitted an Alphacool cooler to a friend's 290X tonight, and what a tedious and boring job sticking all the thermal pads on was... If you're fitting 2 in one sitting, good luck to you







Can't wait to get one on mine though. His temps were sitting at 50°C in GTA V (he had work, so limited testing for tonight).


----------



## vsseracer

Cards I am looking at:

http://www.amazon.com/gp/product/B011D7AAA8?keywords=r9%20390&qid=1446686392&ref_=sr_1_1&refinements=p_n_feature_four_browse-bin%3A6066318011&s=pc&sr=1-1

http://www.amazon.com/gp/product/B00ZGF3UAQ?keywords=r9%20390&qid=1446686392&ref_=sr_1_3&refinements=p_n_feature_four_browse-bin%3A6066318011&s=pc&sr=1-3

http://www.amazon.com/gp/product/B015DXHEAW?keywords=r9%20390&qid=1446686392&ref_=sr_1_6&refinements=p_n_feature_four_browse-bin%3A6066318011&s=pc&sr=1-6

Any preference?


----------



## battleaxe

Quote:


> Originally Posted by *vsseracer*
> 
> Cards I am looking at:
> 
> http://www.amazon.com/gp/product/B011D7AAA8?keywords=r9%20390&qid=1446686392&ref_=sr_1_1&refinements=p_n_feature_four_browse-bin%3A6066318011&s=pc&sr=1-1
> 
> http://www.amazon.com/gp/product/B00ZGF3UAQ?keywords=r9%20390&qid=1446686392&ref_=sr_1_3&refinements=p_n_feature_four_browse-bin%3A6066318011&s=pc&sr=1-3
> 
> http://www.amazon.com/gp/product/B015DXHEAW?keywords=r9%20390&qid=1446686392&ref_=sr_1_6&refinements=p_n_feature_four_browse-bin%3A6066318011&s=pc&sr=1-6
> 
> Any preference?


Sapphire for sure


----------



## Agent Smith1984

Quote:


> Originally Posted by *Cannon19932006*
> 
> Anyone know if the MSI 390x Gaming *LE* is any good?


To be honest man..... I am ALL FOR getting matching MSI cards, BUT, and this is a big BUT..... if you don't care about looks, I would throw an XFX or Sapphire in the top, cause the 2.5 slot card is so tall, that it will be choked for air with another above it..... Just a tip though.....









Quote:


> Originally Posted by *tsgrayson*
> 
> Hello 390x owners! I have been a long time nvidia user but have been thinking the 390x in crossfire or trifire might be good for me for my new system. My question is for those that run this setup, any problems or regrets? I am looking at pairing it with an Acer XR34.


Similar tip for you too sir.... if going with one card, get an MSI, and I will continue to suggest that card.... but if going crossfire, get an XFX or Sapphire for it's standard dual slot height.








Quote:


> Originally Posted by *SamuelGamer*
> 
> 
> Its Pretty Noice.


Get us some proper submission for club entry and I'll get you added bud!








Congrats on the card... I had the Strix 390 for a bit, and though it was a bit on the hot side, it still clocked pretty good (1185/1725), and I really liked the looks of it......
Quote:


> Originally Posted by *battleaxe*
> 
> Sapphire for sure


100% agreed between those, unless you just GOTTA HAVE THAT SEXY RED STRIX for your windowed build.....


----------



## wickedsunny

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To be honest man..... I am ALL FOR getting matching MSI cards, BUT, and this is a big BUT..... if you don't care about looks, I would throw an XFX or Sapphire in the top, cause the 2.5 slot card is so tall, that it will be choked for air with another above it..... Just a tip though.....
> 
> 
> 
> 
> 
> 
> 
> 
> Similar tip for you too sir.... if going with one card, get an MSI, and I will continue to suggest that card.... but if going crossfire, get an XFX or Sapphire for it's standard dual slot height.
> 
> 
> 
> 
> 
> 
> 
> 
> Get us some proper submission for club entry and I'll get you added bud!
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats on the card... I had the Strix 390 for a bit, and though it was a bit on the hot side, it still clocked pretty good (1185/1725), and I really liked the looks of it......
> 100% agreed between those, unless you just GOTTA HAVE THAT SEXY RED STRIX for your windowed build.....


The Sapphire is very long, but is it thinner than the MSI? How many PCIe slots does it cover?


----------



## AverdanOriginal

Quote:


> Originally Posted by *Dundundata*
> 
> Averdan, I actually just placed an order for a Lamptron FC6 20Wx4. No LEDs at the moment but I've been thinking of adding a strip, to see all those sexy parts
> 
> 
> 
> 
> 
> 
> 
> . Nice to know I can hook it up to the fan controller.
> 
> As for the card, I absolutely love the MSI...great clocks and runs really cool in my case o'fans. Core and VRM temps are excellent.


Nice choice
















For the LEDs I'd suggest the Nanoxia Rigid LEDs. They are rated as dimmable and are pre-tested. They are so bright I ended up installing only 2 instead of 3.







Getting the UV version sent to me now, since my front fans have UV fan blades... Wanna see how that looks with UV light shining on them


----------



## AverdanOriginal

Quote:


> Originally Posted by *wickedsunny*
> 
> The Sapphire is very long, but is it thinner than the MSI? How many PCIe slots does it cover?


The Sapphire R9 390(X) is about 42mm thick and the MSI about 51mm, so the Sapphire covers 2 slots while the MSI covers 2.5. That makes a big difference for airflow in crossfire.


----------



## wickedsunny

Quote:


> Originally Posted by *AverdanOriginal*
> 
> The Sapphire R9 390(X) is about 42mm thick and the MSI about 51mm. So the Sapphire covers 2 Slots while the MSI covers 2.5 Slots. That makes a big difference for air flow in terms of crossfire.


Thanks Man.

I am not going to crossfire, but I need them for rendering in Blender.

Currently my Quadro 4000 goes up to 80°C under full load, and I have something like 5 Cooler Master fans in an intake/exhaust setup.

So that extra space will help keep 2x Sapphire cool in the long run; maybe I could even add a slot blower between them to exhaust the extra heat out the rear.


----------



## ansha

Is it normal for the GPU core voltage to jump up to 1.25V when browsing the web in Windows? It's at 0.89V idle but jumps all the way to 1.25V. Interestingly, under full load in games it doesn't go above 1.19V.
The card in question is a PowerColor PCS+ 390.

You can see in the pic how the memory clocks bounce up and down together with the voltage the whole time Chrome is open.


----------



## Gumbi

Turn off hardware acceleration.


----------



## 5dragons

Guys, I'm going to buy the 390X, and I want to know if the MSI is worth it. I've seen that it has some problems with power consumption.


----------



## Gumbi

Quote:


> Originally Posted by *5dragons*
> 
> Guys, I'm going to buy the 390X, and I want to know if the MSI is worth it. I've seen that it has some problems with power consumption.


What do you mean? What PSU do you have?


----------



## 5dragons

600 w EVGA


----------



## Gumbi

Quote:


> Originally Posted by *5dragons*
> 
> 600 w EVGA


What model exactly? Should be fine unless you are running a heavily overclocked AMD CPU. (Any Intel CPU/overclock is fine).


----------



## 5dragons

600W Bronze EVGA. I have an FX-8350, and in the future I'm thinking of buying a new cooler and getting to a 4.5GHz OC.


----------



## Gumbi

Quote:


> Originally Posted by *5dragons*
> 
> 600W Bronze EVGA. I have an FX-8350, and in the future I'm thinking of buying a new cooler and getting to a 4.5GHz OC.


You should be fine but you'll probably only be able to heavily overclock one of those two components with that CPU. The FX 8350 sucks a lot of juice and even more when overclocked.


----------



## 5dragons

Thanks a lot. If I'm going to heavily OC both, I'll probably buy a new PSU.


----------



## Gumbi

Sounds good.


----------



## Cannon19932006

New 3dmark score with the 2nd 390x

http://www.3dmark.com/fs/6406978


----------



## xboxshqip

add me to the club guys, the beast has arrived.


----------



## mandrix

Quote:


> Originally Posted by *Cannon19932006*
> 
> New 3dmark score with the 2nd 390x
> 
> http://www.3dmark.com/fs/6406978


sweet.


----------



## Rob27shred

Quote:


> Originally Posted by *Cannon19932006*
> 
> New 3dmark score with the 2nd 390x
> 
> http://www.3dmark.com/fs/6406978


Very nice score there! Here is what I get with my 390 & 290X crossfired, fully OCed http://www.3dmark.com/fs/6352937, everyday driver OC http://www.3dmark.com/fs/6389397


----------



## Aerugo

I got the STRIX R9 390 as well a few weeks ago.
The last AMD/ATI video card I owned was an HD4850; I'd been with Nvidia since then.
I almost jumped on a 970/980 for G-SYNC purposes, but the premium for G-SYNC in the card/monitor combo was too steep.

IMO, Nvidia could easily have used the DisplayPort 1.2a Adaptive-Sync standard, which AMD dubs FreeSync, but they chose not to.

I've been extremely pleased with the performance/price of the 390. It's more than 'good enough' for the gaming I do.
BTW, paired it up with an ASUS MG279Q FreeSync monitor at the same time.


----------



## tolis626

So... I've finally gotten my card stable at 1170/1700MHz at +80/+55mV, and it got me thinking... Is it too much? I mean, for everyday gaming. I'd love to hear your opinions, guys. Would YOU run your card that high? I'm kinda concerned about its longevity, but I wouldn't cry too much if it died in 3 years. I'll probably have switched to something better by then.

PS: I know I'm getting a little tiresome, but bear with me.


----------



## jon666

The 390 is now sitting at 1160MHz core, 1525 on memory, with 106mV added. I originally tried dropping memory to 1400 in the hope of getting a higher core clock, but after playing around with settings for a day I couldn't get things stable unless I brought the memory back up as well. Might throw my universal Swiftech block on it to cool things down. Temps get high enough in Battlefield 4 to throttle occasionally after hours and hours of gaming.


----------



## Gumbi

Quote:


> Originally Posted by *tolis626*
> 
> So... I've finally gotten my card stable at 1170/1700MHz at +80/+55mV and it got me thinking... Is it too much? I mean, for everyday gaming. I would love to hear your opinion guys. Would YOU run your card that high? I'm kinda concerned about its longevity, but I wouldn't cry too much if it died in 3 years. I will probably have switched to something better by then.
> 
> PS : I know I'm getting a little bit tiring, but bear with me.


It's fine. How hot does it get under load? Core/VRMs? And how high is that voltage in practice? Not talking about spikes now, just what the voltage generally stays at under load.

As long as it doesn't get above 1.35V you should be OK. AFAIK, current technically damages cards more than voltage. And since a cooler card draws less current, 1.4V at 60 degrees is less damaging over the long term than 1.4V at 80 degrees.
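To put a rough number on that intuition (purely a rule-of-thumb sketch, not a measurement: the roughly-doubling-per-10°C figure is a commonly quoted approximation for silicon leakage, not anything specific to Hawaii):

```python
# Illustrative only: static (leakage) power in silicon rises steeply with
# temperature; a common rule of thumb is roughly doubling every ~10 C.
# This is why the same 1.4V is harder on a card at 80C than at 60C.

def leakage_scale(temp_c, ref_temp_c=60.0, doubling_c=10.0):
    """Relative leakage vs. a reference temperature (rule-of-thumb model)."""
    return 2 ** ((temp_c - ref_temp_c) / doubling_c)

print(leakage_scale(80))  # -> 4.0, i.e. ~4x the leakage of the same card at 60C
```

So the same voltage at 80°C pushes roughly four times the leakage current of a 60°C card under this rule of thumb, which is the gist of the "cooler is safer" argument.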


----------



## kubiks

OK, my Alphacool waterblock is in. One card is connected to the loop until the weekend; the second card goes in tomorrow. So far, running Kombustor, the card is maxing at 44°C and idling at 30°C. VRM temps are not an issue; they were at 75°C after 15 minutes of Kombustor at 1440p with 8x AA. There is finally an affordable way to tame the 390X dragon. Pictures coming tonight of the block install and the temporary crossfire setup.


----------



## kubiks

Also, overclocked in Afterburner.


----------



## tolis626

Quote:


> Originally Posted by *Gumbi*
> 
> It's fine. How hot does it get under load? Core/VRMs? And how high is that voltage in practice? Not talking about spikes now, just what the voltage generally stays at under load.
> 
> As long as it doesn't get above 1.35V you should be OK. AFAIK, current technically damages cards more than voltage. And since a cooler card draws less current, 1.4V at 60 degrees is less damaging over the long term than 1.4V at 80 degrees.


With the current ambients (I'd guess they fluctuate between 15 and 25°C during the day) and the settings I posted... I get a max of 81°C. It will get hotter with more prolonged sessions, but I doubt it will get much hotter. It used to get to like 84-85°C, but cooler weather helps. The VRMs usually end up in the 75-80°C range (MSI got that right), so they're OK. While playing, I think my voltage fluctuates between 1.23V and 1.265V. I don't think it ever reaches 1.27V unless I go all the way to +100mV. I'll check again and see if I'm right. If I'm not, I'll correct this post, but until then, these are the values I can give you.

Now, it does get to about 1.35V, but those are spikes when idling with my OC profile applied. So yeah, it doesn't matter. What's funny is that sometimes Afterburner loses its **** and reports some crazy temps. Like right now, it reports my max temp as 416°C. It has reported over 500°C in the past. Valley is even funnier, reporting temps in excess of 60,000°C. That's right, my GPU is hotter than the sun's internals. Sweet.









Luckily, GPU-z reports everything right, so I don't worry about melting everything.

EDIT : Yup, I remembered correctly. It fluctuates between 1.234V and 1.258V. That's with BF4.


----------



## Cannon19932006

Quote:


> Originally Posted by *Rob27shred*
> 
> Very nice score there! Here is what I get with my 390 & 290X crossfired, fully OCed http://www.3dmark.com/fs/6352937, everyday driver OC http://www.3dmark.com/fs/6389397


Very Nice.


----------



## kubiks

Also, overclocked in Afterburner to 1675MHz memory with +20mV auxiliary, and 1175MHz core with +100mV, with no real noticeable temp rise. Maybe 1 or 2°C.


----------



## bruthaflex

Hey guys,
I am new to the forums, I have been starting a new system build around an XFX R9 390 Double Dissipation.
Here is my build list, I have already purchased the GPU,







I am still deciding on the rest of the components.:

PCPartPicker part list: http://pcpartpicker.com/p/hCQtGX
Price breakdown by merchant: http://pcpartpicker.com/p/hCQtGX/by_merchant/

CPU: Intel Core i5-4430 3.0GHz Quad-Core Processor
Motherboard: ASRock H97M-ITX/AC Mini ITX LGA1150 Motherboard
Memory: Corsair Vengeance 8GB (2 x 4GB) DDR3-1600 Memory
Storage: Sandisk SSD PLUS 120GB 2.5" Solid State Drive
Storage: Kingston SSDNow V300 Series 240GB 2.5" Solid State Drive
Video Card: Sapphire Radeon R9 390 8GB Nitro Video Card
Case: Cooler Master Elite 130 Mini ITX Tower Case
Power Supply: Corsair Builder 600W 80+ Bronze Certified ATX Power Supply
Optical Drive: LG UH12NS30 Blu-Ray Reader, DVD/CD Writer
Total: $0.00
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-11-06 14:25 EST-0500

I would like a stable build for modded Skyrim, Mount and Blade, Fallout 3/4, Rainbow Six Siege, Ghost Recon Wildlands, and GTA 5. I would like to use the mini-ITX case if possible; I do not plan on overclocking.

I have been reading about many issues concerning DX11 and the drivers for this card. I know there are a great many R9 390 users who are not having issues, and I thought this would be a good place to ask people with experience with this video card, especially since I am building from scratch.

Which OS: Win 7 or Win 10?
Which drivers?
Any other advice you can give to get this build started on the right foot would be greatly appreciated.


----------



## EternalRest

XFX has blower style 390 for sale on NewEgg.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150759


----------



## BradleyW

According to Guru3D, the 390X is 28fps faster than the 290X in Black Ops III @ 1080p, yet they are almost identical cards.


----------



## Gumbi

Quote:


> Originally Posted by *BradleyW*
> 
> According to Guru3D, the 390X is 28fps faster than the 290X in Black Ops III @ 1080p, yet they are almost identical cards.


Link?

They are the same chips. The 390s just have the memory overvolted a bit more, so they max out at 1700-1750 instead of 1550-1650. The 390s also come clocked higher by default.


----------



## BradleyW

Quote:


> Originally Posted by *Gumbi*
> 
> Link?
> 
> They are the same chips. The 390s just have the memory overvolted a bit more so they max out at 1700-1750 instead of 1550-1650. The 390s come clocked higher by default though.


http://www.overclock.net/t/1579588/guru3d-call-of-duty-black-ops-iii-pc-graphics-performance-benchmark-review#post_24580248


----------



## tbg299

I attempted an overclock on my R9 390 8GB Gaming and am downright disappointed with the results. No matter what I do, I can't get past a 1130MHz core clock without artifacting occurring in the Valley benchmark. Could there be something I'm overlooking, or is my card just a dud? I mean, I've read several reviews where they were able to get around 1200MHz on this card easily.


----------



## diggiddi

Up your voltage in increments, set the power limit to max, and maybe use a more aggressive fan profile.


----------



## TehMasterSword

Anyone else move up to the 15.11 beta drivers? It fixed my constant crashing in GTA V!


----------



## kubiks

Well, it's a mess right now but it works! Temp setup until the crossfire bridge comes.


----------



## kubiks

One more.


----------



## Dundundata

Quote:


> Originally Posted by *tbg299*
> 
> I attempted an overclock on my R9 390 8GB Gaming and am downright disappointed with the results. No matter what I do, I can't get past a 1130MHz core clock without artifacting occurring in the Valley benchmark. Could there be something I'm overlooking, or is my card just a dud? I mean, I've read several reviews where they were able to get around 1200MHz on this card easily.


What are you using to OC? What voltage increases are you running? How are your temps?


----------



## Rob27shred

Quote:


> Originally Posted by *BradleyW*
> 
> According to Guru3D, the 390X is 28fps faster than the 290X in Black Ops III @ 1080p, yet they are almost identical cards.


ALMOST being the key word in your statement. The memory is clocked 250MHz faster on the 390X/390, which gives it more memory bandwidth. I have an XFX DD 390 & MSI Gaming OC 290X in my rig, & the 390 beats the 290X in all single-card benches.
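For what it's worth, the bandwidth gap from that memory clock bump is easy to compute: Hawaii has a 512-bit memory bus, and GDDR5 transfers data at 4x the reported memory clock. A quick sketch using the stock clocks (1250MHz on the 290X, 1500MHz on the 390/390X):

```python
# Peak memory bandwidth for Hawaii cards: 512-bit bus, GDDR5 transferring
# at 4x the reported memory clock (quad data rate).

def bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    """Peak memory bandwidth in GB/s."""
    transfers_per_sec = mem_clock_mhz * 4 * 1e6
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

print(bandwidth_gbs(1250))  # 290X stock     -> 320.0 GB/s
print(bandwidth_gbs(1500))  # 390/390X stock -> 384.0 GB/s
```

That's a 20% bandwidth advantage at stock, which lines up with the 390 pulling ahead in bandwidth-bound benches.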


----------



## Gumbi

Quote:


> Originally Posted by *tbg299*
> 
> I attempted an overclock on my R9 390 8Gb Gaming and am downright disappointed with the results. No matter what I do, I can't get past a 1130 MHz core clock without artifacting occuring on Valley benchmark. Could there be something I'm overlooking or is my card just a dud? I mean, I've read several reviews where they were able to get around 1200 MHz on this card easily.


1200MHz is NEVER easy on any 390; not sure what reviews you're reading. They often push the cards further than what can reasonably be run as a 24/7 overclock without excellent case cooling.

What are your load core/VRM temps, and how much voltage are you putting through the card? You can DEFINITELY do more than 1130 on that card if your temps are OK and you know what you're doing.


----------



## Gumbi

Quote:


> Originally Posted by *Rob27shred*
> 
> ALMOST being the keyword in your statement. The memory is clocked 250mHz faster on the 390x/390 which gives a faster texture fill rate I think. I have a XFX DD 390 & MSI Gaming OC 290x in my rig & the 390 beats the 290x in all single card benchs.


I'll have to call you out on that. My 290X puts up some pretty nice benches. The only edge the 390(X)s have is that their memory is overvolted a bit more and can consistently do 1700-1750MHz.


----------



## wickedsunny

Just got my R9 390 Saphire trix OC

When I boot, my motherboard BIOS image (the Gigabyte splash screen) comes up pixelated, crooked, or black. No problem playing games, but there are some weird noises and artefacts while rendering with Blender in one particular scene. I have run some benchmark scenes in Blender and they all come out fine; just my own scene comes out a bit weird.

Could this be because I am using beta drivers? I installed the drivers through Device Manager in Safe Mode, not by a proper installation like normal NVIDIA drivers, so Catalyst Control Center is not working.

Should I try removing the drivers and installing the stable version from normal Windows? It is the Windows auto-installer that creates the problem and doesn't allow installing; that is why I tried installing in Safe Mode.

Benchmarks are great, though.


----------



## Rob27shred

Quote:


> Originally Posted by *Gumbi*
> 
> I'll have to call you up on that. My 290X gives some pretty nice benches. The only edge the 390 (x)s have is that their memory is overvolted a bit more and can consistently do 1700-1750 mhz.


I was talking about stock speeds, but it looks like you are right either way. I had it backwards; looking at my Firestrike results shows my 290X has better stock scores by a hair.







Just go by the graphics scores on those links also; the 390 one was before I OCed my CPU.
290X stock - http://www.3dmark.com/fs/6330120
390 stock - http://www.3dmark.com/fs/5896981


----------



## Geoclock

Which Windows OS is best for gamers? I have an MSI 390X.
I recently switched from 8 to 8.1 and now get very bad game loading and performance. I also updated to the latest Catalyst, but it's the same.
Go back to 8, or up to 10?


----------



## Rob27shred

Quote:


> Originally Posted by *Geoclock*
> 
> Which Windows OS is for gamers? Have MSI 390x.
> Recently i switched from 8 to 8.1 and have very bad game loads and play . Also changed Catalyst by latest but same.
> Go back to 8 or 10 ?


I am running Win10 & haven't had any issues with it so far, although my experience with it seems to be in the minority. My desktop has only ever had Win10 on it, but I upgraded my laptop to Win10 from 8.1, & the only game that gave me problems after the switch was Diablo 3. If 8 & 10 are your only options, I'd say go to 10.


----------



## Dundundata

Been using Win10 + 15.8 driver...smooth sailing


----------



## Slowpoke66

Heya!

Bought a new card for my backup rig and would like to join the club:



Firestrike: http://www.3dmark.com/fs/6414821

Maybe I'll get another one for CF and replace my two ASUS 290 DCII in my main rig. Sadly, I'm rather short on cash atm...


----------



## Dundundata

Great numbers Slowpoke, welcome! What kind of score do you get with the 2 290s?


----------



## Slowpoke66

Quote:


> Originally Posted by *Dundundata*
> 
> Great numbers Slowpoke, welcome! What kind of score do you get with the 2 290s?


Thx!

290 CF in Firestrike: http://www.3dmark.com/fs/5801415


----------



## Levys

Quote:


> Originally Posted by *tbg299*
> 
> I attempted an overclock on my R9 390 8GB Gaming and am downright disappointed with the results. No matter what I do, I can't get past a 1130MHz core clock without artifacting occurring in the Valley benchmark. Could there be something I'm overlooking, or is my card just a dud? I mean, I've read several reviews where they were able to get around 1200MHz on this card easily.


Why is this AMD's fault? I'm clocked at 1225/1750 with +90mV core and +90mV aux with no artifacts (XFX R9 390X on custom water).
Could be a bad card. What brand is it? Did you add voltage?


----------



## Levys

Quote:


> Originally Posted by *kubiks*
> 
> One mor


If you want the backplate to really matter for your temps (mostly VRM), then put some pads like these in between.

Those are Arctic thermal pads, 1.5mm thickness, and they really help conduct the heat from the PCB to the backplate. You can thank me later.


----------



## vsseracer

I'm so close to buying a MSI 390 for 1440p monitor but can't detach myself from the urge to buy a 970......


----------



## rdr09

Quote:


> Originally Posted by *vsseracer*
> 
> I'm so close to buying a MSI 390 for 1440p monitor but can't detach myself from the urge to buy a 970......


$300 tops for a 4GB card. Well, if it has 4.


----------



## sal0

I'm getting a new card since my old HD 7850 just doesn't cut it anymore in GTA V and some other cases. I'm from Europe and just found out I can get a Sapphire Tri-X 290X 8GB for 328€, while the Sapphire 390 Nitro is 336€.

The 290X should have the advantage since it's the fully unlocked chip in the first place? From the KitGuru review, the PCB of the cards looks identical; only the BIOS/software level should be different. If so, would flashing the 290X to a 390X help anything?

Any help or comments would be appreciated









links to cards:
http://www.mindfactory.de/product_info.php/8192MB-Sapphire-Radeon-R9-290X-Tri-X-OC-Aktiv-PCIe-3-0-x16--Lite-Retail-_992580.html

http://www.mindfactory.de/product_info.php/8192MB-Sapphire-Radeon-R9-390-Nitro-inkl--Backplate-Aktiv-PCIe-3-0-x16--_1012937.html


----------



## Blackcurrent

Sigh, I don't get what is going on. I've tried two different R9 390s, both from different shops, and I have a stutter/freeze problem in GTA V and Witcher 3. It resolves itself after a minute, but for the first minute the game just freezes for a good second a few times. I've tried reinstalling W10, different drivers, a different PCIe slot, OCing the CPU, downclocking and increasing voltage; no luck... Tomorrow is the last day I can swap the card, so maybe I should go for a 970. My GTX 670 previously gave me no issues...


----------



## Gumbi

Can any 390(X) owner beat this score? Run at 100% fan and 1241/1644 (blame those odd numbers on Trixx's terrible interface). Ambients are quite low and helped, maybe 14°C in my room at the moment, but it was a stable run! +200mV in Trixx is 1.4V when stressed fully, though in many games it wouldn't be glued to 1.4V as it was in Firestrike. Still, the temps are AMAZING on the Vapor-X cooler. Ambients helped, as well as my case cooling setup (I have 4 Noctua 2k fans hooked up to a fan controller, plus 2 other random-make fans of lesser speeds).

I uploaded the image to show the GPU-Z logs. Also, my PSU's 12V sensor dips a fair bit when using so much voltage. I don't think it's a cause for concern, but I am going to be grabbing a new PSU soon (this SuperFlower Amazon Bronze 650 watt has served me well for 5 years). Probably going to grab a SuperFlower 650 watt Gold (either Golden Green or Leadex).

My card is easily drawing ~350 watts with this voltage, if not more, and the CPU is a 4790k at 4.9GHz/1.31V. So the system is being heavily loaded when running this bench! Probably 550-600 watts!

http://www.3dmark.com/3dm/9201024?


----------



## ShadowBSE

What's a good starting point to OC my PowerColor 390 PCS+ (MSI Afterburner settings)? I don't want to increase the voltage.

Also, what's the temperature I should aim for at 100% GPU usage (and what's the maximum temp)? Any suggestions on a custom fan profile?

Thanks in advance!
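On the custom fan profile question: an Afterburner custom curve is just a set of temperature-to-fan-speed points with linear interpolation between them. A sketch of how one behaves (the curve points below are hypothetical, not a recommendation):

```python
# A hypothetical custom fan curve of the kind set in Afterburner:
# (temp C, fan %) points, linearly interpolated, clamped at the ends.
CURVE = [(40, 25), (60, 45), (75, 70), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    # find the segment containing temp_c and interpolate linearly
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # between the (60, 45) and (75, 70) points -> ~62%
```

The steeper the segments near your target max temp, the harder the fan fights to hold the card there, at the cost of noise.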


----------



## fat4l

Quote:


> Originally Posted by *Gumbi*
> 
> Can any 390(x) owner beat this score? Run at 100% fan and 1241/1644 (blame those odd numbers on Trixx's terrible interface). Ambients are quite and helped, maybe 14 c in my room at the moment. but it was a stable run!. Plus 200mv in Trixx is 1.4v when stressed fully, but in many games it wouldn't be glued to 1.4v as it was in Firestrike. Still, the temps are AMAZING on the VaporX cooler. Ambients helped, as well as my case cooling setup (have 4 Nocuta 2k fans hooked up to a fan controller, and 2 other random make fans too of lesser speeds).
> 
> i uploaded the image to show GPUz logs. Also, my PSU 12v sensor dips a nice bit when using so much voltage, I don't think it's a cause for concern, but I am going to be grabbing a new PSU soon (this SuperFlower Amazon Bronze 650 watt has served me well for 5 years). Probably going to grab a SuperFlower 650 watt Gold (either Golden Green or Leadex).
> 
> My card is easily drawing 350~ watts with this voltage, if not more, and CPU is 4790k at 4.9ghz/1.31v. So system is being heavily loaded when running this bench! probably 550 - 600 watts!
> 
> http://www.3dmark.com/3dm/9201024?


That's a lot of volts








What is the "max" for VDDC ? Can u re-run FS and check/upload image pls ?









Anyway, I can try to beat the score


----------



## tolis626

Quote:


> Originally Posted by *Gumbi*
> 
> Can any 390(x) owner beat this score? Run at 100% fan and 1241/1644 (blame those odd numbers on Trixx's terrible interface). Ambients are quite and helped, maybe 14 c in my room at the moment. but it was a stable run!. Plus 200mv in Trixx is 1.4v when stressed fully, but in many games it wouldn't be glued to 1.4v as it was in Firestrike. Still, the temps are AMAZING on the VaporX cooler. Ambients helped, as well as my case cooling setup (have 4 Nocuta 2k fans hooked up to a fan controller, and 2 other random make fans too of lesser speeds).
> 
> i uploaded the image to show GPUz logs. Also, my PSU 12v sensor dips a nice bit when using so much voltage, I don't think it's a cause for concern, but I am going to be grabbing a new PSU soon (this SuperFlower Amazon Bronze 650 watt has served me well for 5 years). Probably going to grab a SuperFlower 650 watt Gold (either Golden Green or Leadex).
> 
> My card is easily drawing 350~ watts with this voltage, if not more, and CPU is 4790k at 4.9ghz/1.31v. So system is being heavily loaded when running this bench! probably 550 - 600 watts!
> 
> http://www.3dmark.com/3dm/9201024?


I didn't beat you, but I'd say I'm too close for comfort.









http://www.3dmark.com/fs/6378264

14725 vs 14770. I'm like 10MHz away from your score.









Still, this Vapor-X... Damn, they can take some punishment. Too bad Sapphire decided not to use that cooler for the 300 series, or even the Fury. It would have made almost too much sense. And there is the PCB to back it up; I doubt most 390 cards can really draw 1.4V like it's no big deal.

Also, 4.9GHz at 1.31V on the 4790k? You bastard... Mine only does 4.7GHz at 1.31V (and that's not 100% stable; it needs 1.315V). For 4.8GHz I need something stupid like 1.4V, and 4.9GHz isn't even doable. Goddamn silicon lottery...


----------



## Gumbi

@fat4l I'll rerun Firestrike, but I'm pretty sure the max is 1.406V (basically 1.4V); the current reading was in line with the straight line seen in the log, which seems to indicate it's capped there by the BIOS.

@tolis626 Very nice score! It seems your memory is helping you pull in the extra points to make up for my faster core. Could you bench 1185/1644 and see what score you get? I want to see how much of a difference the ~100 extra MHz on the memory is making.

And yes, my 4790k is definitely a golden chip.

Yours isn't bad at all actually; most do 4.5/4.6 at ~1.31V. Feels great doing 4.9 with ease on air


----------



## tolis626

Quote:


> Originally Posted by *Gumbi*
> 
> @fat4l I'll rerun Firestrike, but the max I'm pretty sure is the 1.406v (basically 1.4v), the current reading was in line with the straight line seen in the log which seems to indicate it's capped out at that by the BIOS.
> 
> @tolis626 Very nice score! it seems your memory is helping you pull through the extra points to make up for my faster core. Could you bench 1185/1644 and see what score you get, i want to see how much of a difference the 100~ extra mhz on the memory is making.
> 
> And yes, my 4790k is definitely a golden chip
> 
> 
> 
> 
> 
> 
> 
> Yours isn't bad at all actually, most do 4.5/4/6 at 1.31v~. Feels great doing 4.9 with ease on air


You asked, and I shall deliver. Memory at 1725MHz is good for some 250-300 points; you were right.

http://www.3dmark.com/fs/6431129

Still... I'm tempted to install Trixx and beat that score of yours. I shouldn't...


----------



## Gumbi

Quote:


> Originally Posted by *tolis626*
> 
> You asked, and I shall deliver. Memory being at 1725MHz is good for some 250-300 points, you were right.
> 
> http://www.3dmark.com/fs/6431129
> 
> Still... I'm tempted to install Trixx and beat that score of yours. I shouldn't...


Wow, very nice. Didn't realise it makes such a difference. I should run 1185/1644 and compare it to your score!

@fat4l Max reported voltage was actually 1.422V







.


----------



## fat4l

Quote:


> Originally Posted by *tolis626*
> 
> Also, 4.9GHz at 1.31V on the 4790k? You bastard... Mine only does 4.7GHz at 1.31V (And that's not 100% stable, needs 1.315v). For 4.8GHz I need something stupid like 1.4V and 4.9GHz isn't even doable. Goddamn silicon lottery...


Mine is doing 5100MHz at 1.35V stable...








Quote:


> Originally Posted by *Gumbi*
> 
> Wow, very nice. Didn't realise, definitely makes a difference. I should run 1185/1644 and compare it to your score!
> 
> @fat4l Max reported voltage was actually 1.422
> 
> 
> 
> 
> 
> 
> 
> .


Do you run it at this voltage on a daily basis?


----------



## tolis626

Quote:


> Originally Posted by *fat4l*
> 
> Mine is doing 5100MHz at 1.35V stable...


...

F*ck you man, f*ck you...









Since we already went a bit OT: have you guys messed with your uncore speed on your 4790k's? I remember from the Haswell overclocking thread that underclocking/undervolting the uncore improves the cores' overclockability, but I have yet to see this in action.


----------



## gupsterg

The 390/390X has slightly tighter RAM timings, so at like-for-like RAM clocks it should pull ahead. Obviously it needs to be at the same GPU frequency to compare, and ideally run on the same system.

The Stilt did some tighter timings for Hynix AFR & Elpida ICs. When I use those in, say, the 1375MHz strap and set 1375MHz, I get the same bench as the stock ROM with 1500MHz RAM; the only way 1500MHz pulls ahead is if I modify that strap to use the 1250MHz stock timings.

IMO @Gumbi your 3DMark score should improve with that little mod (not that it's a low score by any means







).
Quote:


> Originally Posted by *tolis626*
> 
> ...
> 
> F*ck you man, f*ck you...


Check out my sweet i5 4690K on air in my sig







...
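The timings point above can be illustrated with simple latency math: absolute access latency is cycles divided by clock, so tighter timings at a lower clock can land at the same nanosecond latency as looser timings at a higher clock. The cycle counts below are made up purely for illustration, not actual strap values:

```python
# Toy illustration of the strap/timings trade-off: absolute latency is
# timing_cycles / clock, so fewer cycles at 1375MHz can match more
# cycles at 1500MHz. Cycle counts are hypothetical.

def latency_ns(timing_cycles, clock_mhz):
    """Absolute latency in nanoseconds for a timing expressed in clock cycles."""
    return timing_cycles / clock_mhz * 1000

print(latency_ns(44, 1500))  # looser timings at 1500MHz -> ~29.3 ns
print(latency_ns(40, 1375))  # tighter timings at 1375MHz -> ~29.1 ns
```

That's why a tightened 1375MHz strap can bench level with a stock-timing 1500MHz setting: the higher clock buys bandwidth, but the tighter timings claw back the latency.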


----------



## Gumbi

Quote:


> Originally Posted by *tolis626*
> 
> ...
> 
> F*ck you man, f*ck you...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since we already went a bit OT, have you guys messed with your uncore speed on your 4790k's? I remember the Haswell overclocking thread and they were saying that underclocking/undervolting uncore imrproves the core's overclockability, but I have yet to see this in action.


Haven't looked into it myself. I should, though; honestly, I was so happy with how easily this overclock came that I left it without doing extensive optimisation.
Quote:


> Originally Posted by *gupsterg*
> 
> 390/X have slightly tighter ram timings so for like RAM clocks should pull ahead. Obviously need to be same GPU frequency for compare and ideally run on same system.
> 
> The Stilt did some tighter timings for Hynix AFR & Elpida ICs, when I use those in say 1375MHz strap and set 1375MHz I get same bench as stock ROM with 1500MHz RAM, only way 1500MHz pulls ahead is if I modify that strap to use 1250MHz stock timings.
> 
> IMO @Gumbi your 3dMark score should improve with that little mod (not that its a low score by any means
> 
> 
> 
> 
> 
> 
> 
> ).


I'll look into it! But seeing as I have Elpida, I don't know if it'll give me any more performance; it may end up just clocking lower with that mod overall and making no difference in score!
Quote:


> Originally Posted by *fat4l*
> 
> Mine is doing 5100MHz at 1.35V stable...
> 
> 
> 
> 
> 
> 
> 
> 
> Do u run it at this voltage on daily basis ?


Daaaamn.... I'd say I could do 5GHz at that voltage, but that's too hot on air (unless I delidded it). Currently have a Dark Rock Pro 2, which is a superb air cooler and basically near the limits of air cooling for a CPU. 1.31V gets to about 86°C when I do x264 encoding for an extended period.

I run my CPU at that voltage constantly, yes (of course it idles at super low volts most of the time). And no, on my GPU I only set +200mV for bench runs. Cooling is not a problem, but it's inefficient and wasteful.

I can do 1160/1550 at +63mV no problem, which is sooo much cooler (not that 66°C core / 50/50 VRMs is hot; of course that was at low ambients and 100% fans) for marginally less performance.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> 390/X have slightly tighter ram timings so for like RAM clocks should pull ahead. Obviously need to be same GPU frequency for compare and ideally run on same system.
> 
> The Stilt did some tighter timings for Hynix AFR & Elpida ICs, when I use those in say 1375MHz strap and set 1375MHz I get same bench as stock ROM with 1500MHz RAM, only way 1500MHz pulls ahead is if I modify that strap to use 1250MHz stock timings.
> 
> IMO @Gumbi your 3dMark score should improve with that little mod (not that its a low score by any means
> 
> 
> 
> 
> 
> 
> 
> ).
> Check out my sweet i5 4690K on air in my sig
> 
> 
> 
> 
> 
> 
> 
> ...


I will have to wait for the mem mod as well








Right now I can do a 12050 score at 1200/1500MHz, and that's pretty crap...

And ur i5 is very nice


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> And ur i5 is very nice


Cheers! Shame my GPU doesn't OC as well







.

My cooler cost me £13.50 (2nd-hand eBay job); the chip was brand new (£135, my 2nd one). RB stress mode ends up a bit higher for temps due to the 290X kicking out heat: max core temp 78°C over 8hrs, room temp 23-24°C. x264 with the same 4.9/4.3 profile, but as the 290X isn't loaded, max temp 74°C over 8hrs, same room temp.
Quote:


> Originally Posted by *fat4l*
> 
> I will have to wait for the mem mod as well


Should have it ready this evening for you







.
Quote:


> Originally Posted by *Gumbi*
> 
> I'll look into it! but seeing as I have Elpida, I don't know if it'll give me any more performance, it may end up just clocking lower with that mod overall and making no difference in score!


Using Stilts timings for me max ram clock is about 1450MHz IIRC, any higher I see artifacts at desktop







.

Stock 1250MHz I've tested to 1550MHz, not gone higher as been busy with other bits.

You should be OK with the stock 1250MHz strap timings at 1500MHz+; Fyzzz uses those on his 290 with Hynix BFR from what I recall, Graphics Score 15191.


----------



## wickedsunny

*Guys, how do I underclock my GPU?*

I have bought a Sapphire R9 390 Tri-X for 3D rendering in Blender.

The issue is that it is great for games but causes random artefacts while rendering in Blender. Sometimes it is fine, sometimes it produces image artefacts.

It came overclocked by default to a 1040MHz core clock rather than 1000MHz.

What will help reduce the artefacts: a lower core clock or a lower memory clock?

Do I need to reduce the memory clock as well if I reduce the core clock, and vice versa?

Is it possible these artefacts are an issue because of insufficient power?

Here is my present config.

Windows 7 Ultimate 64-bit SP1
Intel Core i7 950 @ 3.07GHz
Gigabyte Technology Co., Ltd. X58A-UD7 (Socket 1366)
24.0GB Triple-Channel DDR3 @ 799MHz
8192MB ATI AMD Radeon R9 390 Series (Sapphire/PCPartner)
Cooler Master Seidon 120V Plus CPU cooler
4 Cooler Master Sickle Flow fans + 2 stock fans
3 Seagate hard disks + ASUS DVD drive
PSU is a Cooler Master Silent Pro 700W

My UPS (APC Back-UPS RS 600) is not working; it gives a beeping sound when I run a render on the GPU or run a benchmark. So currently the PC is connected directly to the wall socket. I will need to buy another UPS.

Any help will be appreciated.
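On the power question: dynamic GPU power scales roughly with frequency times voltage squared, so lowering the core clock (which usually lets the voltage drop too) cuts power faster than the clock number alone suggests. A rough sketch of the math (the voltages below are hypothetical examples, not measured values for this card):

```python
# Rough illustration, not a measurement tool: dynamic power ~ f * V^2,
# so dropping the core clock, which typically also allows a lower
# voltage, saves disproportionately more power.

def relative_dynamic_power(f_new, f_old, v_new, v_old):
    """Ratio of dynamic power after a clock/voltage change (P ~ f * V^2)."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# e.g. going from the factory 1040MHz OC back to the 1000MHz reference
# clock, assuming voltage can come down from ~1.20V to ~1.17V (hypothetical):
ratio = relative_dynamic_power(1000, 1040, 1.17, 1.20)
print(f"core power scales to ~{ratio:.0%} of before")  # prints: ~91%
```

That said, artifacts usually point at instability (clocks/voltage/heat) rather than total power draw, so testing on a known-good PSU is still worth doing.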


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers! shame my GPU doesn't OC as well
> 
> 
> 
> 
> 
> 
> 
> .


Well, I would say that it depends on voltage. I think you are putting low volts into the card








If we had some statistics showing how many volts people run, we would know more. They just say +100mV in AB, but what does it actually mean? We both know it's DPM7 +100mV, and that can actually be 1.3V, 1.35V, etc.
Shame people don't want to run the EVV application to see what their stock voltage is.
My super expensive Ares III is not a great clocker either, which, considering there are only 500 of them, is very sad. Asus should have put more effort into the card.
My cores need 1.212V and 1.250V at 1100MHz (1.195V and 1.234V max real). I think your card is close to this, right?
But yes, we, and especially you, deserve more for all the effort and work you have done and the knowledge you have regarding Hawaii








Quote:


> Originally Posted by *gupsterg*
> 
> My cooler cost me £13.50 (2nd hand ebay job) chip was brand new (£135, 2nd one). RB stress mode ends up bit higher for temps due to the 290X kicking out heat, max core temp 78c over 8hrs, room temp 23-24c. X264 same 4.9 / 4.3 profile but as the 290X isn't loaded max temp 74c over 8hrs, same room temp.


Is it delidded?

Quote:


> Originally Posted by *gupsterg*
> 
> Should have it ready this evening for you
> 
> 
> 
> 
> 
> 
> 
> .


No need to rush. Better to double-check, hehe


----------



## tolis626

Hey fat4l and Gumbi, what Cinebench scores do you get with your CPUs? Also, what memory are you using? I'm kinda curious now, maybe I will mess with my CPU a bit more later today...

Also, gupsterg, nice one. Not impressive speed-wise, but man, you could run that puppy in an mITX case with a small cooler and get wicked good performance. Nice stuff!


----------



## Gumbi

Quote:


> Originally Posted by *tolis626*
> 
> Hey fat4l and Gumbi, what Cinebench scores do you get with your CPUs? Also, what memory are you using? I'm kinda curious now, maybe I will mess with my CPU a bit more later today...
> 
> Also, gupsterg, nice one. Not impressive speed wise but man, you could run that puppy in an mITX case with a small cooler and get wicked good performance. Nice stuff!


991-997. My RAM is very good: Samsung Wonder RAM that I recently bought off a friend, clocked at 2200MHz at 1.6V with timings of 9-10-12-20.


----------



## kizwan

Quote:


> Originally Posted by *TehMasterSword*
> 
> Anyone else move up to the 15.11 beta drivers? It fixed my constant crashing in GTA V!


Is this WHEA_UNCORRECTABLE_ERROR crash?


----------



## wickedsunny

Quote:


> Originally Posted by *kizwan*
> 
> Is this WHEA_UNCORRECTABLE_ERROR crash?


15.11 beta drivers? Where is the link? I checked today and there was no update available.


----------



## kizwan

Here you go...

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## wickedsunny

Quote:


> Originally Posted by *kizwan*
> 
> Here you go...
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


Thanks.

Also, I was getting random crashes in Blender and my renders had artifacts. I used MSI Afterburner and increased the power limit to max without increasing the core clock.

I am not getting any more crashes, and my renders look good too. The R9 390 is a power hog; it really does need a lot of power from the PSU.

I also rooted out all the Nvidia folders from the system and registry, then defragged the Windows drive, just in case they were the problem.


----------



## DarknightOCR

Here's one for a joke with friends.
This is the new 4790K, and the best one I've ever had.

The 390 Nitro also looks to behave very well:
no artifacts, everything flawless.


----------



## gupsterg

@fat4l

I don't give the card a high-ish voltage offset; VRM temp is amazingly low, GPU temps sorta suck. My case is inverted ATX, so the orientation of the heatsink is the right way up (which should be better).


Spoiler: My slightly modded SilverStone TJ06








Agree, it's a shame more people aren't trying the Stilt's VID app.

Regarding your Ares III, I can sorta see where you're coming from, but as always the silicon lottery is a big factor. I guess you're sorta paying for a great water cooling block factory-fitted to the card as well.


Spoiler: My cards info with ROM from factory















My factory ROM doesn't have the +31mV, but it's programmed into the voltage chip, so the VID is the Stilt's DPM7 +31mV.

For me the factory ROM with tightened RAM, set to 1090/1475 with DPM7 @ 1.250V, is best for 24/7 use. Due to the offset in the voltage chip, drooped MAX VDDC for 3DMark FS = 1.250V. *SO* technically you could say I'm giving it 6.25mV more than out of the box to get 1090/1475 vs 1030/1325, *OR* say +37.5mV as usually stated.

Using the same ROM, if I set clocks to 1100/1525 with DPM7 1.262V (+31mV from the voltage chip) to be artifact-free over long testing, drooped MAX VDDC for 3DMark FS = 1.258V. *SO* technically, with an extra 18.75mV over how it would be from the factory, I get 1100/1525 vs 1030/1325, *OR* say +43.75mV as usually stated. This ROM benches slightly better than an MSI 390X Gaming (1100/1525 from factory), going by the 3DMark scores I compared in various reviews.

With my 1090 ROM I see about 75C max in 3DMark FS; the 1100 ROM starts touching 77C over lengthy looped testing.
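The offset arithmetic above can be sketched in a few lines of Python (values taken from this post; the DPM7 base and chip offset vary per card, so treat them as assumptions):

```python
# Sketch of the Hawaii VID arithmetic described above.
# The IR3567B voltage controller can carry a factory-programmed offset
# on top of the ROM's DPM7 VID; droop under load then lowers the real VDDC.

def effective_vid(dpm7_v, chip_offset_mv=0.0, sw_offset_mv=0.0):
    """Requested VID before droop: DPM7 base + chip offset + software offset."""
    return dpm7_v + chip_offset_mv / 1000.0 + sw_offset_mv / 1000.0

# The 1100/1525 profile from this post: DPM7 1.262V + 31mV in the voltage chip
vid = effective_vid(1.262, chip_offset_mv=31)
print(round(vid, 3))  # 1.293 (droops to a max of ~1.258V in 3DMark FS)
```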

I think I gotta sort my fan profile for the case fans; even games I'd class as heavy GPU load end up at lower temps than 3DMark. I'm guessing that because my fan profile is bound to CPU temp, in games they step up a few notches, whereas in 3DMark they're just lazing around at 600-700RPM IIRC.
Quote:


> Originally Posted by *fat4l*
> 
> But yes, we, especially you, deserve more for all the effort, work u have done and knowledge u have regarding hawaii


Cheers for your kind words; TBH I'm just giving back what I've gained from the community IMO







.
Quote:


> Originally Posted by *fat4l*
> 
> Is it delided ?


Nope. HSF not lapped, nor the IHS; just using AS5.
Quote:


> Originally Posted by *tolis626*
> 
> Also, gupsterg, nice one. Not impressive speed wise


I'm sorry, how is an i5 4690K doing 4.9GHz @ 1.255V with 4.3GHz cache @ 1.10V _and_ 2400MHz @ 1T (at stock timings at present, 11-13-14-32) on air not impressive?








Quote:


> Originally Posted by *tolis626*
> 
> but man you could run that puppy in an mITX case with a small cooler and get wicked good performance. Nice stuff!


I can't see anyone getting away with stress testing my CPU clocked as it is with a small cooler (I'm guessing you mean air) in an mITX case; for gaming/general use I guess they may be able to.


----------



## TehMasterSword

No, I only get that when I am pushing my CPU OC. I used to get DirectX initialization errors after a few minutes of play. Now I was able to finish the game!


----------



## fat4l

Quote:


> Originally Posted by *tolis626*
> 
> Hey fat4l and Gumbi, what Cinebench scores do you get with your CPUs? Also, what memory are you using? I'm kinda curious now, maybe I will mess with my CPU a bit more later today...
> 
> Also, gupsterg, nice one. Not impressive speed wise but man, you could run that puppy in an mITX case with a small cooler and get wicked good performance. Nice stuff!


~1030-1036.
I'm using an Asus M7 Hero and 4x4GB Corsair Dominator Platinum 2666MHz CL10-12-12-31...


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Regarding your Ares III I can sorta see where you coming from, but as always silicon lottery is a big factor. I guess your sorta paying for a great water cooling block factory fitted to card as well.


Well, I can tell you that temps on my 295X2 with the Aquacomputer waterblock were MUCH better. I'm saying MUCH better. Now, at 1200MHz with ~1.35-1.38V max real, I'm getting about 60C max on both cores.
With the 295X2 I had about 10C less....
This EK block is sadly not the best...
Nor are the cores. If I were Asus, I would be binning the cores and putting the best 1000 (500x2) on the Ares III. It's not a mass-produced card, so it deserves it IMO.
Quote:


> Originally Posted by *gupsterg*
> 
> 
> 
> Spoiler: My cards info with ROM from factory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My factory ROM not have +31mv but it's programmed into voltage chip, so VID is Stilts DPM7 + 31mv.
> 
> Using same ROM if I set clocks as 1100 / 1525 with DPM7 1.262V (+31mv from Volt.chip) to be artifact free over long testing, drooped MAX VDDC for 3dMark FS = 1.258V.


Aha. I get it all now.
So what you're saying, basically, is: @1100MHz you need DPM7 1.262V + 0.031V = 1.293V VID, which is about 1.258V real, right?
I think it also depends on your cooling. If you had 50-60C max, it would require less voltage for 1100MHz.

If so, then I'm really curious what voltages other people need to hit 1100MHz, or let's say 1200MHz, artifact-free in FS.
If we had this database we could easily compare....
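A minimal sketch of what such a database could look like (Python, purely hypothetical; the two entries below are just the data points mentioned in this thread):

```python
# Hypothetical clock-vs-voltage database for artifact-free FireStrike runs.
# Entries: (owner, card, core MHz, real drooped VDDC in volts).
results = [
    ("gupsterg", "Vapor-X 290X", 1100, 1.258),
    ("fat4l", "Ares III", 1100, 1.195),
]

def volts_at(core_mhz):
    """Every recorded real VDDC for a given core clock, for easy comparison."""
    return [(owner, card, v) for owner, card, mhz, v in results if mhz == core_mhz]

print(volts_at(1100))
```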









BTW, are the volt-offset chips common? Or which cards usually have them?

Quote:


> Originally Posted by *gupsterg*
> 
> I'm sorry how is i5 4690K doing 4.9GHz @ 1.255v with 4.3GHz Cache @ 1.10v _and_ 2400MHz @ 1T (is @ stock timings @ present 11-13-14-32) on air not impressive?


To me, your chip is very awesome. I would say golden. If I were you I would delid it, put Thermal Grizzly on it (or CLU/CLP), and go for 5G easyyyyyy


----------



## gupsterg

I get where you're coming from regarding your Ares III now; I would think/feel the same.

Yes, I need a VID of 1.293V so VDDC is MAX 1.258V; as you already know from your own testing, real VDDC ends up lower, 1.156.

A database would be so great.
Quote:


> BTW are the volt-offset chips common ? Or what cards have them usually?


No extra chip; the normal voltage control chip (IR3567B) can come pre-programmed from the factory. I can't change the offset within the chip, and my ROM has no offset, so I can't change it like we did with yours. What I can do is flash a ROM with a voltage offset within it, and then the one in the voltage chip is ignored. Then I can make it zero or anything I want, like I did with yours. I gained an updated Sapphire Vapor-X 290X ROM via tech support that allows me to do this.
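The precedence rule described here can be sketched in a few lines; this is just my reading of the post (a ROM-programmed offset overrides the one baked into the voltage chip), not anything official:

```python
# Rough sketch of the offset precedence described above: if the ROM carries
# its own voltage offset, the IR3567B's factory-programmed offset is ignored.

def active_offset_mv(rom_offset_mv, chip_offset_mv):
    """None for rom_offset_mv means the ROM carries no offset of its own."""
    return chip_offset_mv if rom_offset_mv is None else rom_offset_mv

print(active_offset_mv(None, 31))  # 31 -> factory chip offset applies
print(active_offset_mv(0, 31))     # 0  -> a ROM offset of zero overrides it
```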
Quote:


> To me, ur chips is very awesome. I would say golden. If I was you I would delid it and put Thermal Grizzly on it(Or CLU/CLP) and go for 5G easyyyyyy


Very tempting indeed, *but* TBH another 100MHz or so really isn't gonna make a huge difference. Plus there's a) delidding risk, b) die cracking

For some reason whilst surfing I ended up on the MLG forum and saw this thread. I can only imagine how gutted the guy must have been to lose that chip (an i7 4770K doing HWBot Prime at 6.63GHz on the 1st attempt).

TBH I'm tempted to use the [email protected] profile.

a) to me that's low voltage, and it's not like my GPU is being bottlenecked by the CPU (on my original i5 I used to run 4.4GHz @ 1.180V for 24/7 use).

b) there shouldn't be any risk of degradation; then I may just end up keeping it and never selling it, like my semi-antique Q6600







.


----------



## tolis626

Quote:


> Originally Posted by *gupsterg*
> 
> [
> I'm sorry how is i5 4690K doing 4.9GHz @ 1.255v with 4.3GHz Cache @ 1.10v _and_ 2400MHz @ 1T (is @ stock timings @ present 11-13-14-32) on air not impressive?
> 
> 
> 
> 
> 
> 
> 
> 
> I can't see anyone getting away with my CPU clocked at what it is, with small cooler (I'm guessing you mean air) in a mITX case getting away with stress testing it, gaming/general use I guess they may be able to.


The hell did I read in the first place? I thought I read 4.6GHz at 1.205V. Man, my reading skills go to crap when I haven't drunk my coffee.









And holy crap, is that a nice chip. 1.1V for 4.3GHz cache? Mine needs like 1.225V to do that, maybe even more. And the memory is doing quite well... May I ask if you changed anything to get the memory working that high, other than the command rate?

And now that I've read your post correctly, yeah... Certainly not small-cooler friendly. But a 120mm AIO would definitely manage it quite well, so it would still do great in an mITX case. Apparently you guys won the CPU lottery all together. Let me go cry in my corner while hugging my average chip...









@fat4l

You... You...









I managed a 975 at some point, but I don't even remember what settings I used for that. I think it was like in my sig, but with 4.4GHz cache or something. Maybe 4.5GHz. Still, I'm quite far from your score. Damn it, I want your CPU.


----------



## Gumbi

Quote:


> Originally Posted by *gupsterg*
> 
> I get where you coming from regarding your Ares III now, I would think / feel the same.
> 
> Yes I need VID of 1.293v so VDDC is MAX 1.258v, as you already know from your testing real VDDC ends up lower 1.156.
> 
> A database would be so great.
> No extra chip, the normal voltage control chip (IR3567B) can come pre programmed from factory. I can't change offset within chip, my ROM have no offset so I can't change it like we did with yours. What I can do is flash a ROM with a voltage offset within it and then the one in voltage chip is ignored. Then I can make it zero or anything I want, like I did with yours. I gained a updated Sapphire Vapor-X 290X ROM via tech support
> 
> 
> 
> 
> 
> 
> 
> , that allows me to do this.
> Very tempting indeed *but* TBH another 100MHz or so really isn't gonna make a huge difference. Plus there's a) delidding risk b) die cracking
> 
> For some reason whilst surfing I ended up on MLG forum and saw this thread. I can only imagine how gutted the guy must have been to lose that chip (i7 4770K doing hwbot prime at 6.63ghz 1st attempt).
> 
> TBH I'm tempted to use the [email protected] profile.
> 
> a) to me that's low voltage and its not like my GPU is being bottle necked by CPU (on my original i5 I used to do 4.4GHz @ 1.180v as 24/7 use).
> 
> b) shouldn't be any risk of degradation, then may just end up keeping it and never selling like my semi antique Q6600
> 
> 
> 
> 
> 
> 
> 
> .


That is an insane chip. You could possibly hit 5.1GHz at 1.32/1.33V, which would be quite hot on air, but I do 1.31V handily enough on air for 4.9GHz. 5.1 on air would be INSANE.


----------



## dartmaul15

So, Black Friday and Cyber Monday are drawing close, and this guy has a lot of money to spend. A brand new PC is in the making (as some of you know, since I asked about OC Vcore and voltages for the R9 390 earlier), and I'm unsure on the GPU.

I've narrowed it down to either MSI or Sapphire. But I'm simply unsure whether to go for the R9 390 or the R9 390X. Given the money I'm putting into my rig, the difference in cost will not be that great.

So I'm mostly interested in the overclocked performance: how the R9 390 compares to the R9 390X when they're both overclocked. And what values can you expect to see on the stock cooling for them?

On a side note: what's wrong with the owners sheet? I can't scroll left/right on it.


----------



## Blackcurrent

Nobody seems to know why my PC would have mini freezes/stutter with my 390? It's really odd that it resolves itself after a minute or two of playing. Why would that be? My 670 previously gave superb performance; it's like the 390 has to warm up or something


----------



## Dundundata

If you are on a PC you can right-click on the owners sheet and open it in a new tab


----------



## battleaxe

Quote:


> Originally Posted by *dartmaul15*
> 
> so, black friday and cyber sunday is drawing close, and this guy got a lot of money to spend. A brand new pc is in the making (as some of you know, since i asked for OC Vcore and voltages for the r9 390 earlier), and i'm unsure on the GPU.
> 
> I've narrowed it down to either MSI or Sapphire. But I am simply unsure if i'm to go for the r9 390 or the r9 390x. Given the money i put into my rig, the difference in cost will not be that great.
> 
> So i'm mostly interrested in knowing the overclocked performance; how the r9 390 compares to the r9 390x when they're both overclocked. And what values can you expect to see on the stock cooling for them?
> 
> on a side note; what's wrong with the owner sheet? I can't scroll left/right on it.


Go for the 390X, I say. You won't be sorry you got the better card, and you may be sorry if you didn't. Unless you are all about value; then the 390 is technically the better value.


----------



## dartmaul15

Quote:


> Originally Posted by *battleaxe*
> 
> go for the 390x I say. You won't be sorry you got a better card, and you may be sorry you didn't. Unless you are all about value, then the 390 is a better value technically.


Exactly what differences in performance (when overclocked) can I expect to see between them? With an i7 Haswell processor (I know an i7 is considered overkill for gaming, but I am aiming for overkill there)


----------



## Blackcurrent

Muhahahaha. Bought a 390X Sapphire card, it gave me stutter. Bought a 390 non-X Nitro card from a different shop, it also gave me stutter. Bought a GTX 970 Strix card, butter smooth baby, GG AMD. I've tried everything to fix the stutter, no fix. Never again, AMD.


----------



## rdr09

Quote:


> Originally Posted by *Blackcurrent*
> 
> Muhahahaha. Bought a 390X Sapphire card gave me stutter. Bought a 390 non X Nitro card from a different shop also gave me stutter. Bought a GTX 970 Strix card butter smooth baby GG AMD. I've tried everything to fix the stutter no fix. Never again AMD.


others have had a similar experience to you . . .


Spoiler: Warning: Spoiler!


----------



## wickedsunny

Quote:


> Originally Posted by *Blackcurrent*
> 
> Muhahahaha. Bought a 390X Sapphire card gave me stutter. Bought a 390 non X Nitro card from a different shop also gave me stutter. Bought a GTX 970 Strix card butter smooth baby GG AMD. I've tried everything to fix the stutter no fix. Never again AMD.


Which games are giving you the stutters?

I am not getting any stutter with the Sapphire Nitro R9 390 when running Unigine Heaven extreme or Path of Exile.

The Sapphire is a power HOG; make sure you max its power limit in Afterburner, as that could be a reason for stutters.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Blackcurrent*
> 
> Nobody seems to know why my pc would have mini freezes/stutter with my 390? It's really odd that it resolves itself after a min or two after playing. Why would that be? My 670 previously gave superb performance, its like the 390 has to warm up or something


If you are gaming at high res/settings and are using RAPID mode on your Samsung drive, you will have a slight shortage of RAM, because RAPID mode forces system RAM to be used as a temporary disk cache. This in turn causes your pagefile to swell to a large size (due to the shortage of system RAM), and will cause textures to swap from the pagefile at times....

I suggest using 16GB of system RAM, as it is generally good to have at least twice as much system RAM as GPU VRAM.

For example, the Titan X can't even use all of its 12GB of VRAM without having at least 24GB of system RAM.... I'm not sure of the reasons for this, but I have tested tons of cards and RAM setups in the past few months, and 16GB worked much better with my 8GB 390 cards than 8GB of RAM did (I noticed the more RAM I had, the more VRAM the card would allow itself to use; you would think it would be the opposite????)

It's all pretty confusing..... Try disabling your pagefile altogether and see if that stops the stutter....
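The rule of thumb in this post (at least twice as much system RAM as VRAM) is easy to sanity-check; a throwaway Python sketch, purely illustrative:

```python
# Rule of thumb from the post above: system RAM >= 2x GPU VRAM.

def min_system_ram_gb(vram_gb):
    """Minimum suggested system RAM, in GB, for a given amount of VRAM."""
    return 2 * vram_gb

for card, vram in [("R9 390", 8), ("Titan X", 12)]:
    print(f"{card}: {vram}GB VRAM -> at least {min_system_ram_gb(vram)}GB system RAM")
```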


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I get where you coming from regarding your Ares III now, I would think / feel the same.
> 
> Yes I need VID of 1.293v so VDDC is MAX 1.258v, as you already know from your testing real VDDC ends up lower 1.156.
> 
> A database would be so great.
> No extra chip, the normal voltage control chip (IR3567B) can come pre programmed from factory. I can't change offset within chip, my ROM have no offset so I can't change it like we did with yours. What I can do is flash a ROM with a voltage offset within it and then the one in voltage chip is ignored. Then I can make it zero or anything I want, like I did with yours. I gained a updated Sapphire Vapor-X 290X ROM via tech support
> 
> 
> 
> 
> 
> 
> 
> , that allows me to do this.
> Very tempting indeed *but* TBH another 100MHz or so really isn't gonna make a huge difference. Plus there's a) delidding risk b) die cracking
> 
> For some reason whilst surfing I ended up on MLG forum and saw this thread. I can only imagine how gutted the guy must have been to lose that chip (i7 4770K doing hwbot prime at 6.63ghz 1st attempt).
> 
> TBH I'm tempted to use the [email protected] profile.
> 
> a) to me that's low voltage and its not like my GPU is being bottle necked by CPU (on my original i5 I used to do 4.4GHz @ 1.180v as 24/7 use).
> 
> b) shouldn't be any risk of degradation, then may just end up keeping it and never selling like my semi antique Q6600
> 
> 
> 
> 
> 
> 
> 
> .


Well, everyone is different. I know 100MHz will not bring any difference, but I'm more about "being special" and going over the "magic" number.... 5G baby!








With a proper backplate (the EK one, for example), and if you use the IHS but don't glue it back, you shouldn't worry about die cracking, and there's no chance the PCB will bend...

From my perspective, I was even crazy enough to run my CPU "direct die"... no IHS, with the waterblock straight on it, lol...
That was a very risky procedure and I could have lost a £700 CPU...







....but... I love going for the best, you know...

You might know this: is there any rule as to which brand of 290X/390X is the best in terms of OCing?
For example the Asus Matrix / MSI Lightning cards... are they really better than a usual 290X?


----------



## Blackcurrent

Trust me, I've tried that. I've tried everything; it didn't work out, sadly. I'm rocking a 970 now and it runs really smooth. My first choice was a 390X but that didn't work out, second choice was a 390 and that didn't work out, so I went with the 970. Anyhow, ciao!


----------



## xboxshqip

So far so good. With fan speed at 40% at idle the card stays at 30C; I just don't like the zero-fan-speed thing, 50C at idle is too much for my taste.

At full load the fan goes to 60% and temps sit around 60C, with room temps of 20C.

I gotta say I love my ASUS Radeon R9 390, but it's still to be seen how well those temps hold up in summer, when the ambient temperature here hits 40C, double that of winter.

The card is quiet at 50% fan speed; at 60% the noise can be noticed if you play some stealth games, but usually with speakers on the noise goes away. At 70-80% it sounds like a jet.


----------



## flopper

Quote:


> Originally Posted by *wickedsunny*
> 
> Which games are giving you the sttuters?
> 
> I am not getting any stutter with sapphire Nitro R9 390 when running uniheaven extreme or Paths of Exile.
> 
> Sapphire is a power HOG make sure you max its power limit in after burner, that could be a reason for stutters.


I call that user error;
something specific to his system.


----------



## gupsterg

Quote:


> Originally Posted by *tolis626*
> 
> The hell did I read in the first place? I thought I read 4.6GHz at 1.205V. Man, my reading skills go to crap when I haven't drank my coffee.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> And holy crap is that a nice chip. 1.1V for a 4.3GHz cache? Mine needs like 1.225V to do that, maybe even more. And the memory is doing quite well... May I ask if you have changed anything to get the memory working that high other than the command rate?
> 
> And now that I read your post correctly, yeah... Certainly not small cooler friendly. But a 120mm AIO would definitely manage it quite well, so it would still do great in an mITX case. Apparently you guys won the CPU lottery all together. Let me go cry in my corner while hugging my average chip...


My reading skills at times also go with lack of caffeine. This chip just had a lower VID and lower other voltages than my original i5; some info in this post.

When deciding to raise the cache, I initially tested my known-good 4.4GHz CPU OC (1.010v) with 4.4GHz cache (1.10v), as I thought it would be great to have results with cache at the same speed as the CPU to compare with other profiles. It passed 36 loops of x264 but failed 6.5hrs of [email protected], so as cache doesn't make a huge difference I just opted to test higher CPU clocks at 4.3GHz cache. I've been tempted to up the cache ratio/voltage further and test, but I've been busy.

Regarding the RAM, again I've been too lazy/busy to try tightened timings; I just dropped CR to 1T and it worked with both my chips, didn't need to adjust anything for it. The RAM is rated for 2400MHz even if the Intel chip spec isn't. The datasheet doesn't state CR; I did look at the SPD data via AIDA but couldn't see CR info in it. The CPU-Z SPD tab shows slacker timings with 1T for 2400MHz; the mobo on auto picks the timings I've highlighted in AIDA, but at 2T CR.


Spoiler: My RAM info






Quote:


> Originally Posted by *Gumbi*
> 
> That is an insane chip. You could possibly hit 5.1ghz ghz at 1.32/1.33v which woukd be quite hot on air, but I do 1.31 handily enough on air for 4.9ghz. 5.1 onair woukd be INSANE.


Cheers! Initially I bought it to sell on, as I got it cheap and thought I'd make a few £ on it, but I couldn't resist opening it and testing. If I hadn't, I never would've known what I'd let slip through my hands! LOL

I once built a system for someone with a K processor they were never gonna OC from day 1, only later when they needed it. Can you imagine someone having this CPU and not OC'ing it from day 1? What a waste IMO.
Quote:


> Originally Posted by *fat4l*
> 
> Well everyone is different. I know 100MHz will not bring any difference but I'm more about "being special" and going over the "magic" number....5G baby!
> 
> 
> 
> 
> 
> 
> 
> 
> With proper backplate(Ek one for example) and if you use ihs but not glue it back you shouldnt worry about die cracking and there's no chance the pcb will bend...


I agree 5G is the magic number! (Stop tempting me.) The poster on the MLG forum did use the IHS and states "Probably too much cold sessions", so do you reckon exotic cooling (LN2) sessions exaggerate the chance of a cracked die? Have you read of it happening to people on water?
Quote:


> Originally Posted by *fat4l*
> 
> U could know this, is there any rule as for "which brand 290x/390X is the best" in terms of OCing ?
> For example asus matrix/msi lightning cards...are they rly better than usual 290x ?


No idea. I believe companies only bin-test for the clocks they're gonna sell a card at for market. I'd think they just can't spend the resources testing further; otherwise the consumer would be paying the price. To me, the STD edition 290 Tri-X I once had made me feel wow on OC'ing: stock clock 947/1250, the ROM had +25mV, and without changing the voltage offset it hit 1100/1475. Looking at an MSI AB screenie, MAX VDDC of 1.250V, the same as my 290X 1090/1475 ROM. Perhaps there's a market for Silicon Lottery to expand into GPUs?

TBH what I think is you've got as much chance of getting a great-clocking GPU on a ref PCB card as on custom ones. Where the custom ones stand out to me is, say, VRM design.

For the average owner (like me), say on air, the ref VRM doesn't seem under-built, but the extra phases help with temps. Whereas for an owner using LN2 it must come into play, IMO, at the higher VDDC/clocks benchers go for.

For example, I was doing research on my Vapor-X 290X via a) reviews which had VRM component data, b) some other info. AFAIK my card has 10 phases for the GPU (VDDC) + 2 for RAM (MVDDC) + 1 for RAM I/O (VDDCI). As the GPU/RAM phases use the same spec of mosfet as the ref PCB 290X, and a slide from Sapphire highlights the ref PCB at 40A a phase, that's 400A for the GPU. I can't ever see myself reaching a VID/clock that creates that kind of loading (AFAIK).

Here is that slide for 390/X owners to reference, as Sapphire used the 290/X new edition PCB for their 390/X Nitro.


Spoiler: Warning: Spoiler!







Now, this is why I think the Vapor-X 290X with 10 phases has lower temps: if I take the example in the slide of 200A and divide it over 10, that's 20A a phase.
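The per-phase arithmetic above is simple enough to sketch (numbers as quoted from the Sapphire slide in this post; the even-split-across-phases assumption is mine):

```python
# Per-phase VRM loading, assuming current splits evenly across phases.
PHASE_RATING_A = 40  # ref-PCB rating per phase, per the Sapphire slide

def per_phase_a(total_a, phases):
    """Current each phase carries if the total load splits evenly."""
    return total_a / phases

load = per_phase_a(200, 10)          # the slide's 200A example over 10 phases
print(load, load / PHASE_RATING_A)   # 20.0 A/phase, i.e. 50% of the rating
```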

Why I think the 390/X have an extra phase is due to what the Stilt said about Grenada.

Plus this:-

High ASIC "Quality" (Leakage) = Lower operating voltage, larger current draw, hotter, less energy efficient (due higher losses)
Low ASIC "Quality" = Higher operating voltage, lower current draw, cooler, more energy efficient

*Or* the companies decided to add an extra phase due to the slightly higher default clocks, or to allow OC room on most std-spec 390/X; dunno really.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> I agree 5G is the magic number! (stop tempting me
> 
> 
> 
> 
> 
> 
> 
> ) , the poster on MLG forum did use IHS and states "Probably too much cold sessions" so you reckon exotic cooling (LN2) sessions exaggerate chance of cracked die? you read if people get on water?


Well, most probably. You know that if you cool something to -100C it becomes very fragile, so...
The only issue with water is die cracking (one of the corners) if used without the IHS; however, if you do it properly and use the proper kit for it (EK for example), the chances are minimal.
If you use the IHS and a proper backplate, I doubt the IHS can crack..


----------



## Levys

Quote:


> Originally Posted by *Gumbi*
> 
> Can any 390(X) owner beat this score? Run at 100% fan and 1241/1644 (blame those odd numbers on Trixx's terrible interface). Ambients are quite low and that helped, maybe 14C in my room at the moment, but it was a stable run! Plus 200mV in Trixx is 1.4V when stressed fully, but in many games it wouldn't be glued to 1.4V as it was in Firestrike. Still, the temps are AMAZING on the Vapor-X cooler. Ambients helped, as well as my case cooling setup (I have 4 Noctua 2k fans hooked up to a fan controller, and 2 other random-make fans of lesser speeds too).
> 
> i uploaded the image to show GPUz logs. Also, my PSU 12v sensor dips a nice bit when using so much voltage, I don't think it's a cause for concern, but I am going to be grabbing a new PSU soon (this SuperFlower Amazon Bronze 650 watt has served me well for 5 years). Probably going to grab a SuperFlower 650 watt Gold (either Golden Green or Leadex).
> 
> My card is easily drawing 350~ watts with this voltage, if not more, and CPU is 4790k at 4.9ghz/1.31v. So system is being heavily loaded when running this bench! probably 550 - 600 watts!
> 
> http://www.3dmark.com/3dm/9201024?


I came very close to your GPU score of 14749 while only adding 100mV on the core and 90 on aux, but clocked higher at 1225/1750, no artifacts (I haven't tried any higher).
I have an FX 8350 chip so the CPU score is a bit out of reach, and when I clock both to the max (CPU at 5.0GHz), my 650W PSU fails me.
http://www.3dmark.com/3dm/9093935?

Nice score btw


----------



## Levys

Quote:


> Originally Posted by *Blackcurrent*
> 
> Muhahahaha. Bought a 390X Sapphire card gave me stutter. Bought a 390 non X Nitro card from a different shop also gave me stutter. Bought a GTX 970 Strix card butter smooth baby GG AMD. I've tried everything to fix the stutter no fix. Never again AMD.


Are you sure the Nvidia drivers were correctly removed and shizz?


----------



## Temuka

Quote:


> Originally Posted by *wickedsunny*
> 
> Which games are giving you the sttuters?
> 
> I am not getting any stutter with sapphire Nitro R9 390 when running uniheaven extreme or Paths of Exile.
> 
> Sapphire is a power HOG make sure you max its power limit in after burner, that could be a reason for stutters.


What do you mean by maxing out its power limit?


----------



## GorillaSceptre

Just put my 290X up for sale, some of the results in Black Ops 3 made me think the 390X isn't just a re-brand.. I also want the 8GB.









The Vapor-X cooler is outstanding though; is there anything comparable for the 390X, cooler-wise? I have insane ambients so I need all the cooling I can get. What's the best one?


----------



## BradleyW

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Just put my 290X up for sale, some of the results in Black Ops 3 made me think the 390X isn't just a re-brand.. I also want the 8GB.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Vapor-X cooler is outstanding though, is there anything comparable for the 390X, cooler wise? I have insane ambient's so i need all the cooling i can get, what's the best one?


The 290X is only getting smashed at 1080p. At 1440p it is only a few frames behind the 390X. Of course it is just a rebrand. If the 8GB of VRAM were the advantage, why are the 4GB Fury cards running just as fast as the 8GB cards?


----------



## Agent Smith1984

I would think that if any of these cards has a shot at clocking high in the 150-200mV range, it would be the Sapphire, with its higher TDP capability (only card with 8+8 power), but I haven't seen any results to prove that....


----------



## Gumbi

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Just put my 290X up for sale, some of the results in Black Ops 3 made me think the 390X isn't just a re-brand.. I also want the 8GB.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Vapor-X cooler is outstanding though, is there anything comparable for the 390X, cooler wise? I have insane ambient's so i need all the cooling i can get, what's the best one?


The short answer is no







You'd want very good case cooling to bring the best out of them to match the Vapor-X cooling. Why not grab an 8GB Vapor-X second hand? I did, and I have been extremely happy with it.


----------



## GorillaSceptre

Quote:


> Originally Posted by *BradleyW*
> 
> 290X is only getting smashed at 1080p. At 144p it is only a few frames behind the 390X. Of course it is just a rebrand. If the 8GB VRAM was the advantage, why are the 4GB Fury cards running just was fast as the 8GB cards?


But why is it getting smashed at any res? I can get my money back right away; if I wait I'll probably lose out. I usually heavily mod my games, so the 8GB will come in handy later on. I don't really care for COD anyway; it's more for the VRAM.









Quote:


> Originally Posted by *Gumbi*
> 
> The short answer is no
> 
> 
> 
> 
> 
> 
> 
> You'd want very good casecooling to bring the best out of them to match the VaporX cooling. Why not grab an 8GB VaporX second hand? I did, and have been extremely happy with it.


Can't get them where i live or i would.


----------



## BradleyW

Quote:


> Originally Posted by *GorillaSceptre*
> 
> But why is it getting smashed at any res? I can get my money back right away, if i wait i'll probably lose out. I usually heavily mod my games, so the 8GB will come in handy later on. I don't really care for COD anyway, it's more for the Vram.
> 
> 
> 
> 
> 
> 
> 
> 
> Can't get them where i live or i would.


The 290X is not getting smashed at any res. It is only performing badly at 1080p on this title. Nothing to do with VRAM. If VRAM were the cause, the Fury X would also perform worse than the 390X. 1440p puts the 290X right back up with the 390X.


----------



## Tivan

Quote:


> Originally Posted by *GorillaSceptre*
> 
> But why is it getting smashed at any res? I can get my money back right away, if i wait i'll probably lose out. I usually heavily mod my games, so the 8GB will come in handy later on. I don't really care for COD anyway, it's more for the Vram.
> 
> 
> 
> 
> 
> 
> 
> 
> Can't get them where i live or i would.


Driver version and power limit/thermal constraints (if a reference model was used for testing) come to mind.

Also, the 390X has a slightly better-tuned BIOS. A lot of little things can make a big difference.


----------



## GorillaSceptre

Quote:


> Originally Posted by *BradleyW*
> 
> 290X is not getting smashed at any Res. It is only performing badly at 1080p on this title. Nothing to do with VRAM. If VRAM was the cause, the Fury X would also perform worse than the 390X. 1440p puts the 290X right back up with the 390X.


You used the word smashed lol..

If you don't know what's causing the issue then how do you know what isn't? The fact is that i game at 1080p, and the 390x is beating the 290x by nearly 25fps (from a couple sites that I've seen), i can get my money back and get a card with 8GB of memory instead of 4. Why wouldn't i?

I just wanted to know which 390x has the best cooler.


----------



## BradleyW

Quote:


> Originally Posted by *GorillaSceptre*
> 
> You used the word smashed lol..
> 
> If you don't know what's causing the issue then how do you know what isn't? The fact is that i game at 1080p, and the 390x is beating the 290x by nearly 25fps (from a couple sites that I've seen), i can get my money back and get a card with 8GB of memory instead of 4. Why wouldn't i?
> 
> I just wanted to know which 390x has the best cooler.


I used the word smashed in response to your use of the word.

Show me any other modern AAA title benchmarked where the 390X beats the 290X by 25fps. That's right, you can't and neither can I. The difference is "usually" 2-3fps at most.

If the 390X was so much better, why is the performance very similar to the 290X in every other game? Better yet, why are they neck and neck at 1440p? There is clearly a *software issue* with the 290X at 1080p *on this particular title.*


----------



## Gumbi

They use the same chip; the only difference is the 390 cards have better memory (it routinely clocks to 1700-1750MHz). The reason they beat 290s in benches is that they are being compared to reference 290s, and the 390s themselves come with high stock clocks.

Any other major discrepancy is almost certainly a software issue.
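For a sense of why those memory clocks matter, here's a rough bandwidth estimate. This is my own back-of-envelope sketch, assuming GDDR5's quad data rate and the 512-bit Hawaii/Grenada bus; the 1250/1500 MHz stock clocks are the commonly listed reference values, not something measured here:

```python
# Back-of-envelope GDDR5 bandwidth on Hawaii/Grenada's 512-bit bus.
# GDDR5 transfers 4 bits per pin per command-clock cycle (quad data rate),
# so bandwidth (GB/s) = clock_MHz * 4 * (bus_bits / 8) / 1000.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_bits=512):
    return mem_clock_mhz * 4 * (bus_bits / 8) / 1000

ref_290x  = gddr5_bandwidth_gbs(1250)  # reference 290/290X memory clock
stock_390 = gddr5_bandwidth_gbs(1500)  # typical 390/390X stock memory clock
oc_390    = gddr5_bandwidth_gbs(1750)  # the 1750 MHz overclock mentioned above

print(ref_290x, stock_390, oc_390)  # 320.0 384.0 448.0
```

So the 390's stock memory alone is worth roughly 20% more bandwidth than a reference 290, before any core clock difference.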


----------



## BradleyW

Quote:


> Originally Posted by *Gumbi*
> 
> They use the same chip, the only difference is, the 390 cards have better memory (they routinely clock from 1700-1750mhz). The reason they beat 290s in benches is they are being compared to reference 290s, and the 390s themselves come with high stock clocks.
> 
> Any other major discrepancy is almost certainly a software issue.


*Exactly this.*

Also the 290X tested is often reference which throttles anyway.
Quote:


> Originally Posted by *GorillaSceptre*
> 
> You used the word smashed lol..
> 
> If you don't know what's causing the issue then how do you know what isn't? The fact is that i game at 1080p, and the 390x is beating the 290x by nearly 25fps (from a couple sites that I've seen), i can get my money back and get a card with 8GB of memory instead of 4. Why wouldn't i?
> 
> I just wanted to know which 390x has the best cooler.


I used the word smashed in response to your use of the word.

Show me any other modern AAA title benchmarked where the 390X beats the 290X by 25fps. That's right, you can't and neither can I. The difference is "usually" 2-3fps at most.

If the 390X was so much better, why is the performance very similar to the 290X in every other game? Better yet, why are they neck and neck at 1440p? There is clearly a *software issue* with the 290X at 1080p *on this particular title.*


----------



## GorillaSceptre

Quote:


> Originally Posted by *BradleyW*
> 
> *I used the word smashed in response to your use of the word.*
> 
> Show me any other modern AAA title benchmarked where the 390X beats the 290X by 25fps. That's right, you can't and neither can I. The difference is "usually" 2-3fps at most.
> 
> If the 390X was so much better, why is the performance very similar to the 290X in every other game? better yet, why are they neck and neck at 1440p? There is clearly a *software issue* with the 290X at 1080p *on this particular title.*


Okay, this is getting silly now.. We're arguing about who used the word smashed first on a forum, look back a page.. You used the damn word, the end.

I never said the 390x was so much better.. What are you going on about? No one knows what the issue is, i said it might be the memory clock difference in the COD thread, i couldn't test that theory as i don't have the game, nor does my 290x clock that high.

Regardless of what is causing the problem, the 390x is beating the 290x at 1080p, my primary res, i can get a 390x at no additional cost to me and avoid the "issue" completely, i'm also guaranteed it won't happen in other games too. While also getting 8 Gigs of Vram to boot.

Now, does anyone know which 390x has the best cooler?


----------



## Gumbi

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Okay, this is getting silly now.. We're arguing about who used the word smashed first on a forum, look back a page.. You used the damn word, the end.
> 
> I never said the 390x was so much better.. What are you going on about? No one knows what the issue is, i said it might be the memory clock difference in the COD thread, i couldn't test that theory as i don't have the game, nor does my 290x clock that high.
> 
> Regardless of what is causing the problem, the 390x is beating the 290x at 1080p, my primary res, i can get a 390x at no additional cost to me and avoid the "issue" completely, i'm also guaranteed it won't happen in other games too. While also getting 8 Gigs of Vram to boot.
> 
> Now, does anyone know which 390x has the best cooler?


Where are you paying as much for a 290X as a 390X, LOL. Also, the 290X performance is a glitch; it won't remain that way.


----------



## BradleyW

Quote:


> Originally Posted by *Gumbi*
> 
> *Where are you paying as much for a 290X as a 390X,LOL*. Also, the 290X performance is a glitch, it won't remain that way.


That's a very good point. Sums it all up.


----------



## wickedsunny

Quote:


> Originally Posted by *Temuka*
> 
> What do you mean under maxing out its power limit?


In Afterburner you have an option to let the GPU use the excess power limit. I need to max it out to render without any artefacts in Blender; otherwise it will throttle or crash Blender.

Could be because of my 4-year-old 700 W PSU, which is drawing a lot of power.

Power limit means it will use extra power only when required under extreme loads.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Gumbi*
> 
> Where are you paying as much for a 290X as a 390X,LOL. Also, the 290X performance is a glitch, it won't remain that way.


Who said anything about _paying_ as much? That's why it's a good deal.. Fyi, before going "LOL", stop and think for a moment that U.S prices don't represent the entire planet..

Regardless of whether or not it's a glitch, it's not affecting the 390x. So i should turn down a good deal because you say it won't remain that way? Can you also guarantee it won't happen in future titles too? I didn't ask for advice on whether or not i should keep the 290x, i asked 390x owners what cards, in their opinion, have the best cooler..
Quote:


> Originally Posted by *BradleyW*
> 
> That's a very good point. Sums it all up.


Oh, that sums it all up does it, lmao whatever..

Both of you should check the title of this thread, it's the 390/x owners club, i don't see 290/x anywhere in there. Go defend your purchase somewhere else.

Helpful as always AMD users..


----------



## BradleyW

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Who said anything about _paying_ as much? That's why it's a good deal.. Fyi, before going "LOL", stop and think for a moment that U.S prices don't represent the entire planet..
> 
> Regardless of whether or not it's a glitch, it's not affecting the 390x. So i should turn down a good deal because you say it won't remain that way? Can you also guarantee it won't happen in future titles too? I didn't ask for advice on whether or not i should keep the 290x, i asked 390x owners what cards, in their opinion, have the best cooler..
> Oh, that sums it all up does it, lmao whatever..
> 
> Both of you should check the title of this thread, it's the 390/x owners club, i don't see 290/x anywhere in there. Go defend your purchase somewhere else.
> 
> Helpful as always AMD users..


It is not against the TOS to participate in discussion in an owners thread when one does not have the product in question within their possession.


----------



## GorillaSceptre

Quote:


> Originally Posted by *BradleyW*
> 
> It is not against the TOS to participate in discussion in an owners thread when one does not have the product in question within their possession.


Never said it was against TOS. I asked a question, you chose to quote my post and start an argument about the 290x vs 390x, while ignoring the main reason i made the post in the first place.. Very helpful, thanks.


----------



## Gumbi

I'm not from the US. GG. I can guarantee it's not going to be the norm, because they use the same core. IT'S THE SAME CARD.

The ONLY reason a 390X will provide a big boost in performance over a 290X (instead of nice small gains here and there) will be if a game uses more than 4GB of VRAM.

For the record I think a 390 is a great purchase...


----------



## BradleyW

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Never said it was against TOS. I asked a question, you chose to quote my post and start an argument about the 290x vs 390x, while ignoring the main reason i made the post in the first place.. Very helpful, thanks.


To answer your original question: the PowerColor Radeon R9 390X Devil.
My debate was based on your inaccurate and unjustified assumption that the 1080p benchmark is an accurate reflection of 390X vs. 290X performance. I've explained why this is not the case in previous posts, and evidence is readily available to support my argument.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Gumbi*
> 
> I'm not from the US. GG. I can guarantee it's not going to be the norm, because they use the same core. IT'S THE SAME CARD.
> 
> The ONLY reason a 390X will provide a big boost in performance over a 290X (instead of nice small gains here and there) will be if a game *uses more than 4GB of VRAM.*
> 
> For the record I think a 390 is a great purchase...


Did i say you lived in the U.S? I said their prices don't represent all countries.. Is it so hard to believe that in countries with low turn around on GPU's, that those GPU's will continue to keep high prices? That was also in response to your sarcastic "LOL". So yeah, GG.









Really? only if it uses more than 4GB of Vram? It's almost as if i mentioned that in my original posts..









@BradleyW

What was wrong with my original statement? There is a difference between them, one that makes up a 25fps difference. You said so yourself, it can't be the Vram amount, because it catches back up in 1440p, so what is it? The answer is we don't know, but there is obviously a difference.. It probably won't happen in other titles, it might be a one off problem that will be patched, none of that matters to me as i can get a 390x for no additional cost.

I'll have a look at the Powercolor, thanks.


----------



## GorillaSceptre

If anyone else can give me some suggestions, i'll offer free rep.


----------



## Gumbi

Based on the above information, Occam's razor would suggest the bench itself is flawed, not the card.


----------



## battleaxe

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Okay, this is getting silly now.. We're arguing about who used the word smashed first on a forum, look back a page.. You used the damn word, the end.
> 
> I never said the 390x was so much better.. What are you going on about? No one knows what the issue is, i said it might be the memory clock difference in the COD thread, i couldn't test that theory as i don't have the game, nor does my 290x clock that high.
> 
> Regardless of what is causing the problem, the 390x is beating the 290x at 1080p, my primary res, i can get a 390x at no additional cost to me and avoid the "issue" completely, i'm also guaranteed it won't happen in other games too. While also getting 8 Gigs of Vram to boot.
> 
> Now, does anyone know which 390x has the best cooler?


XFX has one of the best 390x around. Sapphire Nitro is good too. As is the PCS. I'd probably pick in that order. I may pick up a XFX 390x this weekend too.

Jeez... tough getting an answer around here eh?


----------



## wickedsunny

Quote:


> Originally Posted by *Gumbi*
> 
> Can any 390(x) owner beat this score? Run at 100% fan and 1241/1644 (blame those odd numbers on Trixx's terrible interface). Ambients are quite low and helped, maybe 14 C in my room at the moment, but it was a stable run! Plus 200mv in Trixx is 1.4v when stressed fully, but in many games it wouldn't be glued to 1.4v as it was in Firestrike. Still, the temps are AMAZING on the Vapor-X cooler. Ambients helped, as well as my case cooling setup (I have 4 Noctua 2k fans hooked up to a fan controller, plus 2 other slower fans of random make).
> 
> i uploaded the image to show GPUz logs. Also, my PSU 12v sensor dips a nice bit when using so much voltage, I don't think it's a cause for concern, but I am going to be grabbing a new PSU soon (this SuperFlower Amazon Bronze 650 watt has served me well for 5 years). Probably going to grab a SuperFlower 650 watt Gold (either Golden Green or Leadex).
> 
> My card is easily drawing 350~ watts with this voltage, if not more, and CPU is 4790k at 4.9ghz/1.31v. So system is being heavily loaded when running this bench! probably 550 - 600 watts!
> 
> http://www.3dmark.com/3dm/9201024?


This is my Sapphire Nitro R9 390 score, with my very old CPU, an i7-950. The GPU is at factory default settings, except for maxed power limit in Afterburner.

I was thinking of underclocking. I am using the 15.10 beta drivers.

I have not used the PowerTune monitoring; I will try using that as well. GPU temp never went over 65 C.

http://www.3dmark.com/3dm11/10510540


----------



## AverdanOriginal

Wow.... Everybody take a chill pill.... Love that since the middle of the year you see not only the Nvidia vs. AMD discussion, but also 290X vs. 390X vs. 970 vs. 390 vs. 290.... (forget about the 970)

All those PCB+Chip bases ARE different.
Quote:


> Originally Posted by *Gumbi*
> 
> I'm not from the US. GG. I can guarantee it's not going to be the norm, because they use the same core. *IT'S THE SAME CARD.*
> 
> The ONLY reason a 390X will provide a big boost in pefformance over a 290X (instead of nice small gains herw and there) will be if a game uses more than 4GB of VRAM.
> 
> For the record I think a 390 is a great purchase...


Now that is exactly the biggest misconception. You need to differentiate between cards and chips.

True!!! The 290X and 390X have the same chip. But the biggest difference between the newer 390(X)s and the older 290(X)s is the newly designed PCBs, the new cooling systems, the new BIOS....
So they are different cards with the same chip.
It's like comparing the same CPU on two different motherboards, with two different CPU coolers (and even built into two different PC cases). Of course the chip will behave differently.
NOW I am not taking into consideration genuinely refreshed 290Xs with updated PCBs.

Quote:


> Originally Posted by *BradleyW*
> 
> 290X is not getting smashed at any Res. It is only performing badly at 1080p on this title. Nothing to do with VRAM. *If VRAM was the cause, the Fury X would also perform worse than the 390X*. 1440p puts the 290X right back up with the 390X.


The Fury X uses HBM while the R9 390X uses GDDR5; you can't really compare those.

NOW that being said, there should normally be no reason to "upgrade" from a 290(X) to a 390(X), unless you don't mind spending 300-450 euros for a minimal increase in FPS.








Quote:


> Originally Posted by *GorillaSceptre*
> 
> If anyone else can give me some suggestions, i'll offer free rep.


Sapphire seems to run the coolest on air, but is not as well binned as MSI or XFX (according to the chart on the first page of this forum).
If you want to cool with water you probably have to go with XFX or Powercolor. Although I read somewhere that someone, was it Alphacool?, now offers a water cooling block for the MSI R9 390(X).

Small side note: I hope my reasoning will not fire up the discussion again


----------



## battleaxe

Quote:


> Originally Posted by *GorillaSceptre*
> 
> If anyone else can give me some suggestions, i'll offer free rep.


I'd say XFX offers the best 390x overall. Good for air, good for cooling with an AIO since it has a good VRM cooler that is separate and good for full cover blocks as you can get those for it too.

I'm so tempted to pick one up as well. Saw one at MC today for 399.99 after rebate. Nice card.

Do you have a seller for your 290x? I may be interested. PM me if so, and price.


----------



## GorillaSceptre

Quote:


> Originally Posted by *battleaxe*
> 
> Jeez... tough getting an answer around here eh?


Tell me about it..








Quote:


> Originally Posted by *AverdanOriginal*
> 
> Sapphire seems to run the coolest on air, but is not as well binned as MSI or XFX (according to the chart on the first page of this forum).
> If you want to cool with water you probably have to go with XFX or Powercolor. Although I read somewhere that someone, was it Alphacool?, now offers a water cooling block for the MSI R9 390(X).
> 
> Small side note: I hope my reasoning will not fire up the discussion again


Thanks.









Quote:


> Originally Posted by *battleaxe*
> 
> I'd say XFX offers the best 390x overall. Good for air, good for cooling with an AIO since it has a good VRM cooler that is separate and good for full cover blocks as you can get those for it too.
> 
> I'm so temped to pick one up as well. Saw one at MC today for 399.99 after rebate. Nice card.
> 
> Do you have a seller for your 290x? I may be interested. PM me if so, and price.


Appreciate it, i'll take a closer look at the XFX, i just have to see if my supplier friend has them in stock.









A friend is hooking me up, so i'm sort of selling it to a distributor.







I live in South Africa anyway, so the shipping can be pricey, you'd probably find a much better deal locally than i could give you.









I wish they made a Vapor-X 390X, the cooler really is something special.. I had a 37C ambient yesterday and it was still "only" in the high 70's/low 80's.


----------



## DarknightOCR

My score on Firestrike



CPU 4790K @ 4.6Ghz - 1.18v

R9 390 Nitro - 1200/1710 +90mv No Artifacts


----------



## jon666

^The voltage for both of those has me jealous.

Edit:
Results from overclocked CPU and xfire pitcairn 7870's. http://www.3dmark.com/fs/4058192
Stock CPU and current overclock on 390 http://www.3dmark.com/fs/6449717

Waiting on CPU block to come back before I can put everything under water again. xfire setup was watercooled.


----------



## battleaxe

I wish my system would run Firestrike... stupid thing. It hasn't been able to run it for like 6 months, and I have no idea why. Annoying.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I wish my system would run Firestrike... stupid thing. Hasn't been able to run it for like 6 mos, and I have no idea why. Annoying.


Woah, what???

What do you mean it isn't "able"?

Not launching, or crashing?


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Woah, what???
> 
> What do you mean it isn't "able"?
> 
> Not launching, or crashing?


It launches then immediately crashes. No matter what settings. Has nothing to do with clocks. Haven't been able to figure it out, so I just gave up sorta... lol


----------



## Agent Smith1984

Remove the app, remove the Futuremark service, then reinstall BOTH.

Happened to me with 3dmark11 when I first got the Fury....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Remove the app, remove the Futuremark service, then reinstall BOTH.
> 
> Happened to me with 3dmark11 when I first got the Fury....


Already did that about 5 times. Still nothing. 3dMark11 finally works on Win10 now. That didn't even work on Win7. But Firestrike still won't run.


----------



## Sgt Bilko

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> *I used the word smashed in response to your use of the word.*
> 
> Show me any other modern AAA title benchmarked where the 390X beats the 290X by 25fps. That's right, you can't and neither can I. The difference is "usually" 2-3fps at most.
> 
> If the 390X was so much better, why is the performance very similar to the 290X in every other game? better yet, why are they neck and neck at 1440p? There is clearly a *software issue* with the 290X at 1080p *on this particular title.*
> 
> 
> 
> Okay, this is getting silly now.. We're arguing about who used the word smashed first on a forum, look back a page.. You used the damn word, the end.
> 
> I never said the 390x was so much better.. What are you going on about? No one knows what the issue is, i said it might be the memory clock difference in the COD thread, i couldn't test that theory as i don't have the game, nor does my 290x clock that high.
> 
> Regardless of what is causing the problem, the 390x is beating the 290x at 1080p, my primary res, i can get a 390x at no additional cost to me and avoid the "issue" completely, i'm also guaranteed it won't happen in other games too. While also getting 8 Gigs of Vram to boot.
> 
> Now, does anyone know which 390x has the best cooler?

Powercolor 390x Devil, Sapphire 390x Tri-X, XFX 390x DD, MSI Gaming 390x, Powercolor 390x PCS+

^ those are the best 390x cards in terms of cooling

If you don't mind noise then get a triple fan design one but if you do then XFX or MSI.


----------



## Rmosher

In your firestrike score results it detects your card as a 390x but whenever I run firestrike it is detected as a 290x. Anyone know why? I know it's up to date through steam and everything. Just curious


----------



## GorillaSceptre

Quote:


> Originally Posted by *Rmosher*
> 
> In your firestrike score results it detects your card as a 390x but whenever I run firestrike it is detected as a 290x. Anyone know why? I know it's up to date through steam and everything. Just curious


Users on Reddit (i know, but sometimes there's good info there) were saying the Steam version is more likely to do that, it also apparently gives lower scores.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you are gaming at high res/settings, and are using RAPID mode on your Samsung drive, you will have a slight shortage of RAM, because RAPID mode forces system RAM to be used as a temporary disk cache. This in turn causes your pagefile to swell up to a large size (due to the shortage in system RAM), and will cause textures to swap from the pagefile at times....
> 
> I suggest using 16GB of system RAM, as it is generally good to have at least twice as much system RAM as GPU RAM.
> 
> For example, the Titan X can't even use all of its 12GB of VRAM without having at least 24GB of system RAM.... Not sure of the reasons for this, but I have tested tons of cards and RAM setups in the past few months, and 16GB worked much better with my 8GB 390 cards than 8GB of RAM did (I noticed the more RAM I had, the more VRAM the card would allow itself to use; you would think it would be the opposite????)
> 
> It's all pretty confusing..... Try disabling your pagefile altogether and see if that stops the stutter....


Interesting point @Agent Smith1984
I never thought there would be such a close connection between system RAM and VRAM.
I always thought you needed to allocate a pagefile twice the size of your system RAM. But I have 16GB of RAM, so I was not entirely comfortable allocating 32GB to my pagefile.
While I would think 4GB would be enough for the pagefile, in games like Watch Dogs I noticed I could only play on ultra settings if I allocated at least 8-10GB to my pagefile. But that's of course game-specific bad programming.

So I have 16GB of DDR4 RAM and an MSI R9 390 8GB. From your experience, this is the way to go, correct? Lucky me








Any suggestions for pagefile size in Win10? Disabling it is not an option.


----------



## Rmosher

Huh, well the lower score thing sucks lol. Thanks for the answer. By the way, I have a Sapphire R9 390X Tri-X. Wicked good cooling: max temp of 64C running the Unigine Heaven Ultra HD preset for an hour and a half. That's with a +38mV OC also!


----------



## Gumbi

Quote:


> Originally Posted by *Rmosher*
> 
> Huh, well the lower score thing sucks lol. Thanks for the answer. By the way; I have a Sapphire R9 390x Tri-x. Wicked good cooling, max temp of 64C running unigine heaven ultra hd preset for an hour and a half. That's with a +38 mv OC also!


And how are the VRMs? Nice!


----------



## AverdanOriginal

Quote:


> Originally Posted by *Rmosher*
> 
> In your firestrike score results it detects your card as a 390x but whenever I run firestrike it is detected as a 290x. Anyone know why? I know it's up to date through steam and everything. Just curious


Quote:


> Originally Posted by *GorillaSceptre*
> 
> Users on Reddit (i know, but sometimes there's good info there) were saying the Steam version is more likely to do that, it also apparently gives lower scores.


I'd second that. Most of the people having this problem installed the Steam version. Another member of this forum had the problem, but after an update it displayed the correct card. I had the problem as well, since I just downloaded it from the first page I found on Google. I thought maybe they hadn't updated yet, since back then the card had only been available on the market for 2-3 weeks, so I looked for a beta version or so directly on 3dmark.com. Got the newest version and it displayed the correct R9 390 for me.

So I'd suggest uninstalling the version you are using now and downloading the official free version directly from 3DMark. Also check that you really uninstalled everything of the former Firestrike install.


----------



## Agent Smith1984

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Interesting point @Agent Smith1984
> I never thought, that there would be a such a close connection between System RAM and VRAM.
> I always thought you need to allocate twice the size of pagefile as System Ram. But I have 16GB RAM so I was not entirely comfortable with allocating 32GB to my pagefile.
> While I would think 4GB would be enough for Pagefile, in games like Watchdog I noticed I could only play with Ultra Settings If I allocate at least 8-10GB to my pagefile. But that's of course game specific bad programming.
> 
> So I have 16GB of DDR4 RAM and a MSI R9 390 8GB. From your experience, this is the way to go correct? Lucky me
> 
> 
> 
> 
> 
> 
> 
> 
> Anay suggestions for pagefile size in Win10? disabling it is not an option


Just make your pagefile a few gigs max, in my opinion. It will force your games to use RAM instead of the SSD, and we know which of those two is much faster!!!

16GB of system RAM with 8GB of VRAM is perfect.
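The rule of thumb in this exchange boils down to a one-liner. This is just a sketch of the forum heuristic ("at least twice as much system RAM as VRAM"), not an official AMD or Microsoft guideline:

```python
# Forum rule of thumb, not an official guideline: keep system RAM at
# least `factor` times the card's VRAM so textures stay out of the pagefile.
def min_system_ram_gb(vram_gb, factor=2):
    return vram_gb * factor

print(min_system_ram_gb(8))   # 8GB 390  -> 16GB system RAM
print(min_system_ram_gb(12))  # Titan X  -> 24GB, as in the quote above
```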


----------



## GorillaSceptre

Quote:


> Originally Posted by *Rmosher*
> 
> Huh, well the lower score thing sucks lol. Thanks for the answer. By the way; I have a Sapphire R9 390x Tri-x. Wicked good cooling, max temp of 64C running unigine heaven ultra hd preset for an hour and a half. That's with a +38 mv OC also!


Those are great temps.







What's your ambient temp though?

I was leaning towards the Sapphire because of how fantastic my Vapor-x cooler is. I just don't like that it has no backplate.. Shallow but still.


----------



## fyzzz

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Those are great temps.
> 
> 
> 
> 
> 
> 
> 
> What's your ambient temp though?
> 
> I was leaning towards the Sapphire because of how fantastic my Vapor-x cooler is. I just don't like that it has no backplate.. Shallow but still.


I believe there is another version of the Sapphire 390(X) that has a backplate and a clock speed of 1040 MHz.


----------



## Rmosher

Quote:


> Originally Posted by *Gumbi*
> 
> And how are the VRMs? Nice!




I am still surprised how cool this thing runs.


----------



## Rmosher

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Those are great temps.
> 
> 
> 
> 
> 
> 
> 
> What's your ambient temp though?
> 
> I was leaning towards the Sapphire because of how fantastic my Vapor-x cooler is. I just don't like that it has no backplate.. Shallow but still.


There is indeed a newer version with a backplate that just came out if that's your thing. http://www.sapphiretech.com/productdetial.asp?pid=F107EA58-6C33-4F74-A739-03FF852D34A1&lang=eng
And my ambient temp is about 22 Celsius.


----------



## GorillaSceptre

Quote:


> Originally Posted by *fyzzz*
> 
> I believe there is another version of the Sapphire 390(X) that has a backplate and a clock speed of 1040 MHz.


Quote:


> Originally Posted by *Rmosher*
> 
> There is indeed a newer version with a backplate that just came out if that's your thing. http://www.sapphiretech.com/productdetial.asp?pid=F107EA58-6C33-4F74-A739-03FF852D34A1&lang=eng
> And my ambient temp is about 22 Celsius.


Yes!







I can only choose a card from one shop though, i hope they have stock of them..

I can only dream of 22C. The last few days have had the highest recorded temperatures on record where i live, 40C today, and about 35C inside..

No overclocking for me in the summer.


----------



## Rmosher

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Yes!
> 
> 
> 
> 
> 
> 
> 
> I can only choose a card from one shop though, i hope they have stock of them..
> 
> I can only dream of 22C. The last few days have had the highest recorded temperatures on record where i live, 40C today, and about 35C inside..
> 
> No overclocking for me in the summer.


Holy ***** that's too hot for me! I thrive in the cold winters, I despise the humid summers lol. Good luck with your GPU shopping, make sure to post some benches when you get squared away


----------



## GorillaSceptre

Quote:


> Originally Posted by *Rmosher*
> 
> Holy ***** that's too hot for me! I thrive in the cold winters, I despise the humid summers lol. Good luck with your GPU shopping, make sure to post some benches when you get squared away


Yeah, it's insane. Summers are usually nice around here, but 10C higher than normal is unbearable..

Will do, as long as i don't get bad throttling.


----------



## gupsterg

Quote:


> Originally Posted by *GorillaSceptre*
> 
> If anyone else can give me some suggestions, i'll offer free rep.


First let me introduce The Stilt to you.

From what I know this overclocker has an affiliation with AMD, and thus access to info and hardware that "general" people wouldn't.

Next, here is his info regarding Grenada, aka the 390/X (he's also posted this somewhere on OCN IIRC).

Next, all the 390/X ROMs I've seen only support one RAM IC, and here's The Stilt's info on it (plus, in one of the posts in that thread and on another forum, he states that IC has slightly tighter timings).

Next, besides the RAM IC, here's how the 390/X reaches higher RAM clocks compared with the 290/X.
Quote:


> To get around the MEMCLK barrier in Grenada, AMD has latched some of the memory controller related timings (through bios).


Link:- Quote from

Now, you may recall from the Guru3D post that The Stilt stated the 390/X are higher-leakage ASICs; here's some concise info on what that means.
Quote:


> High ASIC "Quality" (Leakage) = Lower operating voltage, larger current draw, hotter, less energy efficient (due higher losses)
> Low ASIC "Quality" = Higher operating voltage, lower current draw, cooler, more energy efficient


Link:- 
Quote from

So to sum up: the 390/X is a rebrand of the 290/X with 2x the RAM, a differing RAM IC with slightly tighter timings, _and_ a BIOS mod to "enhance" the IMC to achieve higher RAM speeds.
Quote:


> Originally Posted by *GorillaSceptre*
> 
> The fact is that i game at 1080p, and the 390x is beating the 290x by nearly 25fps (from a couple sites that I've seen)


This review shows a Crysis 3 gap of 6 FPS minimum / 7 FPS average @ 1080p, *but* be aware the MSI 390X Gaming is clocked at 1100/1525 and they compare it with a stock AMD 290X at 1000/1250. Link:- Test setup

Here is a differing site's review, a GTA IV 6.6 FPS difference @ 1080p; again note it's a reference 290X against the 1100/1525 MSI 390X Gaming.
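The reviews above pit an 1100MHz 390X against a 1000MHz reference 290X, so here's a quick sketch of how much of the gap the factory overclock alone could explain, assuming (optimistically) perfectly linear scaling with core clock:

```python
def clock_gain_pct(core_ref_mhz, core_new_mhz):
    """Upper-bound performance gain from a core clock bump,
    assuming perfectly linear scaling with core frequency."""
    return (core_new_mhz / core_ref_mhz - 1) * 100

# Reference 290X core vs MSI 390X Gaming core, as used in the reviews above
print(round(clock_gain_pct(1000, 1100), 1))  # 10.0
```

That ~10% lines up with the 10-14% gaps the reviews report, which supports the point that clocks, not the silicon, explain most of the difference.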

Here HardOCP down clocked a MSI 390X Gaming to ref 290X clocks.

My 290X 4GB clocked to the same speeds as the MSI 390X Gaming benches the same as it in 3DMark. I've not had time to test other things, *but* after reading all the info I have on the subject I doubt there's any difference, _and_ if there is, it will be so small that the purchase price of a 290X makes the 390X seem way overpriced for a few FPS, if that.

Agree 290/X are hard to come by now, but 2nd hand can be had cheaper.

Please provide the info where the 390X is beating the 290X by 25 FPS; then at least we can check whether it is a same-clocks comparison.


----------



## GorillaSceptre

Quote:


> Originally Posted by *gupsterg*
> 
> Please provide this info where 390X is beating 290X by 25FPS, then at least we can compare if it is same clocks comparative.


We were talking about Black Ops 3, not the 290X vs 390X in general. Why do you think I bought a 290X in the first place? In Black Ops 3 there is a 25 FPS difference; I don't know why, and I don't really care, but lots of people in the Black Ops thread would be happy to hear you and "The Stilt" give insight.

I want a 390X for the VRAM. Let's not beat a dead horse here; that argument was put to bed long ago..


----------



## gupsterg

Sorry if I got the wrong end of the stick, can you provide this Black Ops 3 FPS comparison? I'm just curious to see the benches.

The HardOCP review also had a test where they went to 4K res with Witcher 3, GTA IV, etc., and again a 10-14% difference.

Perhaps it's a driver thing? If you flash a 290/X to a 390/X it won't use the same driver, due to how a real 390/X is detected via a hardware identification method.


----------



## mus1mus

If I were to pick a 390, which will have the best chance of a great clocker?


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> If I were to pick a 390, which will have the best chance of a great clocker?


Sapphire, XFX or MSI seem to be the better ones I've seen
Quote:


> Originally Posted by *gupsterg*
> 
> Sorry if I got wrong end of stick
> 
> 
> 
> 
> 
> 
> 
> , can you provide this Black OPS 3 FPS compare? I'm just curious to see the benches.
> 
> The HardOCP review also had a test where they went 4K res with Witcher 3, GTA IV , etc and again 10-14% difference.
> 
> Perhaps its a driver thing? if you flash a 290/X to 390/X it won't use same driver, due to how a real 390/X is detected via a hardware identification method.


I have a 290X + 390X sitting here... but I don't have Blops III


----------



## AverdanOriginal

Quote:


> Originally Posted by *GorillaSceptre*
> 
> We were talking about Black Ops 3.. Not the 290x vs 390x in general, why do you think i bought a 290x in the first place? In Black Ops 3 there is a 25 fps difference, don't know why, and i don't really care, but lots of people in the Black Ops thread would be happy to hear your and "The Stilt'" give insight.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I want a 390x for the Vram, lets not beat a dead horse here, that argument was put to bed long ago..


Quote:


> Originally Posted by *gupsterg*
> 
> Sorry if I got wrong end of stick
> 
> 
> 
> 
> 
> 
> 
> , can you provide this Black OPS 3 FPS compare? I'm just curious to see the benches.
> 
> The HardOCP review also had a test where they went 4K res with Witcher 3, GTA IV , etc and again 10-14% difference.
> 
> Perhaps its a driver thing? if you flash a 290/X to 390/X it won't use same driver, due to how a real 390/X is detected via a hardware identification method.


I read over the weekend in PC Games Hardware that Black Ops 3, in its current version, apparently suffers from huge memory leakage. That might explain the much better performance of the 8GB R9 390 over the 4GB R9 290. A similar situation occurred early on with Batman: Arkham Knight.

EDIT: I checked again, and they updated the test. For the comparison of 15 GPUs they needed to reduce the graphics settings from "Extra" (I guess they mean something like Ultra High) to "High". They needed to do this since an 8GB 390X was outperforming a 4GB GTX 980 by up to 50% higher FPS on Extra settings. Sadly they have no direct comparison of the 390X to the 290X, only the 390X to the 290 on the High setting.
The test is in German, but the graph should be understandable








http://www.pcgameshardware.de/Call-of-Duty-Black-Ops-3-Spiel-55478/Specials/Test-Benchmarks-1176980/


----------



## Temuka

Has anyone played Fallout 4 on a 390 Nitro/MSI? How are the FPS and the smoothness of the game?


----------



## Levys

Quote:


> Originally Posted by *Temuka*
> 
> Did someone played Fallout 4 on 390 Nitro/MSI ? How are FPS and smoothness of the game ?


I'm playing it with the XFX R9 390X and it holds roughly 60 FPS at stock; it's mostly the long-distance view/shadows that cripple performance. (BTW, the Fallout config tool can help a bit... check Google.)

Great game btw


----------



## Hemanse

Had to take my MSI R9 390 back today. It had a bit of coil whine, which I could learn to live with, but my biggest gripe was the amount of noise the damn thing made. Not sure if I just had a dud, but to keep my card at 75 degrees Celsius while gaming the fans had to spin up to around 60-70%, and at those levels it got pretty damn obnoxious. Hard getting into a game when all you can hear, even with headphones on, is the sound of a tiny jet engine to the left on the floor, and I don't even have my PC on the table.

So now I gotta figure out if I need to go 970, which does run cooler overall so the fans can at least run at a lower %, or if I should try out a Sapphire card, which seems to be a bit quieter and have a more pleasant sound overall


----------



## Scorpion49

Quote:


> Originally Posted by *Hemanse*
> 
> Had to take my MSI R9 390 back today, it had a bit of coil whine which i could learn to live with, but my biggest gripe it was the amount of noise the damn thing made, not sure if i just had a dud, but to keep my card at 75 degrees celcius while gaming the fans to to spin up to around 60-70%, at those levels it got pretty damn obnoxius. Hard getting into a game when all you can hear even with headphones on is the sound of a tiny jet engine to left on the floor, dont even have my pc on the table
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So now i gotta figure out if i need to go 970 which does run cooler overall, so the fans can atleast run at a lower %, or if i should try out a Sapphire card which seems to be a bit quieter and have a more pleasant sound overall


My MSI 390X is always at 100% fan speed in games and still runs at 82-85°C. That cooler seems to be absolutely horrific at dealing with the amount of heat these cards can put out. I stopped using it altogether; it can gather dust with my FreeSync monitor while I enjoy my GTX 950 and my old 1080p 120Hz screen. At least that one doesn't make me deaf even while wearing headphones.


----------



## Agent Smith1984

NM


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> My MSI 390X is always at 100% fan speed in games and still runs at 82-85*C. That cooler seems to be absolutely horrific at dealing with the amount of heat these cards can put out. I stopped using it altogether, it can gather dust with my freesync monitor while I enjoy my GTX 950 and my old 1080p 120hz screen, at least it doesn't make me deaf even while wearing headphones.


Put an AIO cooler on it. (redmod)

Shuts them right up.









I've done it about 10 times now to both Nvidia and AMD cards. Hard to beat the lower temps and noise. Hard to go back to air honestly. Not sure I could.


----------



## Scorpion49

Quote:


> Originally Posted by *battleaxe*
> 
> Put an AIO cooler on it. (redmod)
> 
> Shuts them right up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've done it about 10 times now to both Nvidia and AMD cards. Hard to beat the lower temps and noise. Hard to go back to air honestly. Not sure I could.


Yeah, I have an actual waterblock coming for it already, but I don't know if I'm going to use it. I'm so, so tired of having to compromise one thing or another to use AMD cards. After I sold my 970 and G-Sync monitor, it's been nothing but frustration for almost an entire year. Four Furies, three 290Xs, a dead 270 and this 390X later, and I'm still not able to *just have fun and enjoy my games*.


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, I have an actual waterblock coming for it already but I don't know if I'm going to use it. I'm so, so tired of having to compromise one thing or another to use AMD cards. After I sold my 970 and Gsync monitor, its been nothing but frustration for almost an entire year. 4 Furies, 3 290X, a dead 270 and this 390X later and I'm still not able to *just have fun and enjoy my games*.


Odd, your experience is so far from what I have experienced. I've easily had as much or more trouble with Nvidia cards. I have no preference between them, really; right now I may lean slightly more toward AMD if anything. I'm still mad about getting a 3.5GB card in my 970 that was supposed to be 4GB. Still makes me mad. It runs fine, sure, but so do my 290/290Xs. I'm sure it's frustrating though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, I have an actual waterblock coming for it already but I don't know if I'm going to use it. I'm so, so tired of having to compromise one thing or another to use AMD cards. After I sold my 970 and Gsync monitor, its been nothing but frustration for almost an entire year. 4 Furies, 3 290X, a dead 270 and this 390X later and I'm still not able to *just have fun and enjoy my games*.


I feel like you are probably a small minority of people.....

I've gone through four 7900-series cards, one 280X, two 290s, two 390s, and now a Fury in the last 2 years, and haven't really had any problems except with the 280X, which had a widespread issue with VRAM overvoltage due to Asus' own mistake... and then a little coil noise on my current Fury, but nothing major in any case where I couldn't "enjoy my games.."

Sorry to hear about your results though....


----------



## Scorpion49

Quote:


> Originally Posted by *battleaxe*
> 
> Odd, your experience is so far from what I have experienced. I've easily had as much or more trouble with Nvidia cards. I have no preference between them really. Right now, I may lean slightly more toward AMD if anything. I'm still mad about getting a 3.5gb card on my 970 that was supposed to be 4gb. Still makes me mad. It runs fine sure, but so does my 290/290x's. I'm sure its frustrating though.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> I feel like you are probably a small minority of people.....
> 
> I've gone through 4) 7900 series cards, 1) 280x, 2) 290's, 2) 390's and now a Fury in the last 2 years and haven't really had any problems except with the 280x which had a wide spread issue with VRAM overvoltage due to Asus own mistake..... and then a little coil noise on my current Fury, but nothing major in any case where I couldn't "enjoy my games.."
> 
> Sorry to hear about your results though....


I doubt anyone besides extreme benchers or reviewers (probably miners too) has seen as many AMD cards as I have; since the 7900 series launched I can count at least 60 without even thinking about it, and I know there were plenty more. And there is always, ALWAYS a problem. Every single time, bar none. Either a driver issue with the game I want to play, coil whine, DOA junk (looking at you, 7950s), FreeSync issues, or something else that makes me have to fiddle-fart around with them for ages before giving up and moving on to the next one. Nvidia has a buttload of problems too; it just so happened that the last Nvidia card I used was pretty straight and everything just managed to work without issues. Right now I've been troubleshooting Fallout 4 for about 12 hours and managed to play through the intro. My 950 runs it like a champ; add the 390X to the mix and I get 27 FPS, hard locked.


----------



## Dundundata

I have no issues with my MSI 390; it runs quiet and cool. Maybe there's something wrong with the fan, Hemanse? Scorpion, the only thing I can think of is a thermal paste issue.

Edit: wrote this before I saw the other posts...

Been playing Fallout 4 and it maintains a solid 60. The game set "godrays quality" to High vs Ultra and I haven't tested to see what kind of a hit it takes; I'm guessing similar to foliage distance in Witcher 3.


----------



## Ultra-m-a-n

Quote:


> Originally Posted by *Temuka*
> 
> Did someone played Fallout 4 on 390 Nitro/MSI ? How are FPS and smoothness of the game ?


I was playing last night, with GodRays on low and it was smooth on my 1440p 60hz monitor.

My CPU is an overclocked X5650 (hex core xeon) @ 4.5GHz.

No stuttering or immersion breaking for me... I do suspect once better drivers are out, along with updates to the game, the framerate will be much smoother for all of us with the AMD cards.


----------



## Temuka

Quote:


> Originally Posted by *Ultra-m-a-n*
> 
> I was playing last night, with GodRays on low and it was smooth on my 1440p 60hz monitor.
> 
> My CPU is an overclocked X5650 (hex core xeon) @ 4.5GHz.
> 
> No stuttering or immersion breaking for me... I do suspect once better drivers are out, along with updates to the game, the framerate will be much smoother for all of us with the AMD cards.


I really hope so. I'll get my hands on the game in 3 days and will also test it out with my 390 Nitro


----------



## tangelo

Quote:


> Originally Posted by *Temuka*
> 
> Did someone played Fallout 4 on 390 Nitro/MSI ? How are FPS and smoothness of the game ?


I play it with everything on ultra/max @ 1080p. The info about my rig is in my sig.

Pretty solid 60 FPS, with some strange dips to 40-ish, especially when outside and during sand storms.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Scorpion49*
> 
> I doubt anyone besides extreme benchers or reviewers (probably miners too) have seen as many AMD cards as I have, since the 7900 series launched I can count at least 60 without even thinking about it and I know there were plenty more. And there is always, ALWAYS a problem. Every single time, bar none. Either a driver issue with the game I want to play, coil whine, DOA junk, (looking at you 7950's), Freesync issues or something else that makes me have to fiddle fart around with them for ages before giving up and moving to the next one. Nvidia has a buttload of problems too, it just so happened that the last Nvidia card I used was pretty straight and everything just managed to work without issues. Right now I've been troubleshooting Fallout 4 for about 12 hours and managed to play through the intro. My 950 runs it like a champ, add the 390X to the mix and I get 27fps hard locked.


I've never had any major issues with either vendor. Guess I'm just lucky..

I have to ask, because I read a lot of posts like this: if you have countless problems with AMD, why would you buy 60 variations of their products? Doesn't make sense lol.


----------



## Hemanse

Quote:


> Originally Posted by *Scorpion49*
> 
> My MSI 390X is always at 100% fan speed in games and still runs at 82-85*C. That cooler seems to be absolutely horrific at dealing with the amount of heat these cards can put out. I stopped using it altogether, it can gather dust with my freesync monitor while I enjoy my GTX 950 and my old 1080p 120hz screen, at least it doesn't make me deaf even while wearing headphones.


I'd go crazy if I had to listen to that 24/7. At 100% fan speed it sounds like the PC is about ready to take off. I think I might just go with a quiet 970 or 980 instead; the 970 might be a tad slower and have less VRAM, but I can live with that if the card is silent


----------



## rdr09

Quote:


> Originally Posted by *Hemanse*
> 
> Would go crazy if i had to listen to that 24/7, at 100% fan speed it sounds like the PC is about ready to take off, i think i might just go with a quiet 970 or 980 instead, the 970 might be a tad slower and have less VRAM, but i can live with that if the card is silent instead


you might want to visit the geforce forum.


----------



## Zack Foo

Quote:


> Originally Posted by *Hemanse*
> 
> Would go crazy if i had to listen to that 24/7, at 100% fan speed it sounds like the PC is about ready to take off, i think i might just go with a quiet 970 or 980 instead, the 970 might be a tad slower and have less VRAM, but i can live with that if the card is silent instead


I'm using an MSI 390X and I'm pretty sure he has bad airflow in his case. Like, pretty freaking sure. I was having that temp too, 80-ish with 60% fan speed, but after I cleared up some space for good airflow the card never goes over 80 with 50% fan speed in games like GTA V and Black Ops 3 (tried yesterday, BTW).


----------



## Mysticking32

I too was having temp problems with my 390Xs. Went through two and am now on the third. The first had temps up to 90 playing The Witcher 3 (and artifacts at stock settings). The second hit 85, but it worked fine. MSI sent me a free one to replace my R9 280X, so I sent the second 390X back to Amazon for a full refund (basically got a free card).

This one works perfectly. Temps never over 75. Overclocked to 1110MHz on the core and 1525MHz on the memory. I'm pretty sure it had something to do with the thermal paste when the first models shipped.


----------



## Strife21

Well, I got an XFX R9 390, but VRM temp 1 seems high. It's around 90°C. I thought they fixed the issue with a better heatsink. Not sure what to do


----------



## Sgt Bilko

Quote:


> Originally Posted by *Strife21*
> 
> Well I got a XFX r9 390 vrm temp 1 seems high though. Its around 90C. I thought they fixed the issue with a better heat sink. Not sure what to do


RMA it or at least ask XFX about it: http://xfxforce.com/support , it shouldn't be that high unless you have really poopy airflow


----------



## Strife21

Quote:


> Originally Posted by *Sgt Bilko*
> 
> RMA it or at least ask XFX about it: http://xfxforce.com/support , it shouldn't be that high unless you have really poopy airflow


Yeah, I have really good airflow; that's why it's worrying me. That's the load temp BTW, not idle.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Strife21*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> RMA it or at least ask XFX about it: http://xfxforce.com/support , it shouldn't be that high unless you have really poopy airflow
> 
> 
> 
> Yea I have really good air flow thats why its worrying me. Thats load temp btw not idle.
Click to expand...

VRM 1 is for the memory and it's located here:



It doesn't have a heatsink, unlike VRM 2 (which is for the core), but even under a decent overclock (1600-1700MHz) it really shouldn't go over 75-80°C


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Strife21*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> RMA it or at least ask XFX about it: http://xfxforce.com/support , it shouldn't be that high unless you have really poopy airflow
> 
> 
> 
> Yea I have really good air flow thats why its worrying me. Thats load temp btw not idle.
> 
> Click to expand...
> 
> VRM 1 is for the memory and it's located here:
> 
> 
> 
> It doesn't have a heatsink unlike Vrm 2 (which is for the Core) but even under a decent overclock (1600-1700Mhz) it shouldn't go over 75-80c really
Click to expand...

I think you got that reversed for the VRM1 & VRM2.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Strife21*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> RMA it or at least ask XFX about it: http://xfxforce.com/support , it shouldn't be that high unless you have really poopy airflow
> 
> 
> 
> Yea I have really good air flow thats why its worrying me. Thats load temp btw not idle.
> 
> Click to expand...
> 
> VRM 1 is for the memory and it's located here:
> 
> 
> 
> It doesn't have a heatsink unlike Vrm 2 (which is for the Core) but even under a decent overclock (1600-1700Mhz) it shouldn't go over 75-80c really
> 
> Click to expand...
> 
> I think you got that reversed for the VRM1 & VRM2.
Click to expand...

Stock Clocks running GPU-Z's render test:


Same Clocks but with +50mV Core Voltage:


I think you might be right......


----------



## Strife21

Quote:


> Originally Posted by *Sgt Bilko*
> 
> VRM 1 is for the memory and it's located here:
> 
> 
> 
> It doesn't have a heatsink unlike Vrm 2 (which is for the Core) but even under a decent overclock (1600-1700Mhz) it shouldn't go over 75-80c really


So the GTA 5 benchmark gets it to 80°C with stock clocks; the Valley benchmark gets it to 86°C at stock. I'll try adjusting the fan curve.


----------



## Strife21

Quote:


> Originally Posted by *Strife21*
> 
> So GTA5 benchmark gets it to 80C with stock clocks, Valley benchmark gets it to 86C at stock. I'll try adjusting the fan curve.


Now at stock with an increased fan curve: 71°C in the GTA 5 bench
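For reference, the kind of custom curve that tames these VRM temps is just piecewise-linear interpolation between a few temp/fan-% points, which is what tools like MSI Afterburner or Trixx apply. The curve points below are hypothetical, just steeper than a typical stock curve:

```python
# Hypothetical curve points (temp in °C -> fan %), steeper than stock
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_speed(temp_c):
    """Piecewise-linear interpolation between curve points,
    clamped to the first/last point outside the curve's range."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(67.5))  # 57.5 -- halfway between the 60°C and 75°C points
```

The trade-off is exactly what's being discussed in this thread: pulling the curve up drops the load temp by ~10°C at the cost of noise.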


----------



## Agent Smith1984

When did noise start taking priority over performance in the enthusiast PC world??!!??


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> When did noise start taking priority over performance in the enthusiast PC world??!!??


Why not both?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Why not both?


Yeah, both is nice. This XFX Fury does a good job of limiting fan noise/temps while performing very well, but the "cricket" chirping in my case under load is a tad annoying. However, it appears to have lessened a lot since I got the card... Maybe all that 4K gaming?









http://www.3dmark.com/fs/6468992


----------



## joeh4384

Quote:


> Originally Posted by *Agent Smith1984*
> 
> When did noise start taking priority over performance in the enthusiast PC world??!!??


I don't mind some noise but it is nice to not have to run your GPU fans at jet airplane levels.


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, both is nice, this XFX Fury does a good job with limiting fan noise/temps, while performing very well, but the "cricket" chirping in my case under load is a tad annoying, however it appears to have lessoned a lot since I have gotten the card.... Maybe all that 4k gaming?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/6468992


I don't even dare think about how much power your system consumes... Like, seriously, if they release voltage modification software for the Fury, what's gonna happen?

Nice space heater bro.









On a more serious note, damn that Fury needs some volts and clocks. At 1000/500MHz it's just under 10% faster than an overclocked 390x, at least in Firestrike. Not that I care about that, but the 980ti is quite far ahead. If people could at least get their Furies to 1150-1200MHz (And some overclock on the memory side, of course) just so we could see how they perform, it'd be great. But no, no voltage control. Sigh...


----------



## fat4l

Quote:


> Originally Posted by *Gumbi*
> 
> Can any 390(x) owner beat this score? Run at 100% fan and 1241/1644 (blame those odd numbers on Trixx's terrible interface). Ambients are quite and helped, maybe 14 c in my room at the moment. but it was a stable run!. Plus 200mv in Trixx is 1.4v when stressed fully, but in many games it wouldn't be glued to 1.4v as it was in Firestrike. Still, the temps are AMAZING on the VaporX cooler. Ambients helped, as well as my case cooling setup (have 4 Nocuta 2k fans hooked up to a fan controller, and 2 other random make fans too of lesser speeds).
> 
> i uploaded the image to show GPUz logs. Also, my PSU 12v sensor dips a nice bit when using so much voltage, I don't think it's a cause for concern, but I am going to be grabbing a new PSU soon (this SuperFlower Amazon Bronze 650 watt has served me well for 5 years). Probably going to grab a SuperFlower 650 watt Gold (either Golden Green or Leadex).
> 
> My card is easily drawing 350~ watts with this voltage, if not more, and CPU is 4790k at 4.9ghz/1.31v. So system is being heavily loaded when running this bench! probably 550 - 600 watts!
> 
> http://www.3dmark.com/3dm/9201024?


So I tried it now......
290X 1250/1520MHz.
[email protected]

14849 Graphics Score


----------



## Strife21

Well I took the XFX R9 390 back. The VRM temp was just too high. Picked up an MSI R9 390 Gaming instead and wow what a difference.

max core temp 65C, (with fans only running at 36%)
max VRM 1 temp 65C

And my favorite part of all no annoying coil whine. I tried two XFX and both had horrible coil whine.


----------



## Dundundata

Quote:


> Originally Posted by *Strife21*
> 
> Well I took the XFX R9 390 back. The VRM temp was just too high. Picked up an MSI R9 390 Gaming instead and wow what a difference.
> 
> max core temp 65C, (with fans only running at 36%)
> max VRM 1 temp 65C
> 
> And my favorite part of all no annoying coil whine. I tried two XFX and both had horrible coil whine.


I had the same experience, 2 xfx's both with whine. I get similar temps with the msi, vrm is usually lower than core...highest core gets is around 70 overclocked in a warm room. Plus it's quiet. Guess it's the luck of the draw and perhaps hardware combinations?


----------



## Strife21

Quote:


> Originally Posted by *Dundundata*
> 
> I had the same experience, 2 xfx's both with whine. I get similar temps with the msi, vrm is usually lower than core...highest core gets is around 70 overclocked in a warm room. Plus it's quiet. Guess it's the luck of the draw and perhaps hardware combinations?


Yeah, so overclocked to 1125/1550 and the max temp is 72 on the core and 69 on the VRM. Only issue is I could overclock the memory to 1700 on the XFX. I feel better with a quieter and cooler card though.


----------



## mus1mus

I wonder if anyone in here has tried modding their BIOS to squeeze out some extra performance.

The 200 series cards are doing great on modded bios.


----------



## KNG HOLDY

My MSI R9 390 just arrived. I played about 1 hour of GTA 5 and hit a max of 92°C. Should I still try to overclock, or shouldn't I?


----------



## fat4l

WTH is with these temps lol?? 92°C lol?
I thought these new "uber" coolers were much better...


----------



## KNG HOLDY

I'm running at 1160 core clock / 1650 memory clock, first time I've OCed a GPU :c That's OK for an MSI R9 390, right? I get a max of 88°C.
If I try to add +10MHz to the memory clock my PC crashes; if I add +10MHz to the core clock the Valley benchmark crashes

Got a 2519 score w/o OC and 2765 with (~+9.8%)
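For anyone checking their own runs, the OC gain quoted above is just the relative score improvement:

```python
def pct_gain(before, after):
    """Relative improvement of one benchmark score over another, in percent."""
    return (after - before) / before * 100

# Valley scores from the post above: stock vs the 1160/1650 overclock
print(round(pct_gain(2519, 2765), 1))  # 9.8
```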


----------



## Strife21

Quote:


> Originally Posted by *KNG HOLDY*
> 
> im running at 1160 core clock / 1650 memory clock first time i oced a gpu :c thats ok for a msi r9 390 right? i get a max of 88°C
> if i try to add +10mhz memory clock my pc crashes if i add +10mhz core clock valley benchmark crashes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> got a 2519 score w/o oc and 2765 with (~+9,8%)


No, that is not okay. My MSI R9 390 runs at 70-72°C. That is way too hot. They don't throttle until 93°C, but it shouldn't be running that hot. How is the airflow in your case?


----------



## MarshMellow

Has anyone found a full Waterblock for the 390/390X DCIII ????????


----------



## Dundundata

Idk what is going on with your cards, sorry, but those temps don't sound good at all


----------



## Derek129

My Giga 390 never passes 70°C under hours of gaming. It can still sound like a jet, but it's not a bad card in the long run


----------



## KNG HOLDY

My PCIe 3.0 slot is broken, so I have to run my GPU at the bottom, about 5cm above my HDDs (Air 540 case).

I've got 3 intake and 2 exhaust fans, all running at max 700rpm (all on radiators for my CPU; I used them before for the CPU and GPU, but the GPU just arrived today, so no watercooling till 2016 I guess)


----------



## lightsout

Is the info in the OP still relevant in regards to which cards are good?

I see that the MSI gaming has voltage control. What would be some of the best cards for a max air OC.

I'm talking 390's btw.


----------



## flopper

Quote:


> Originally Posted by *lightsout*
> 
> Is the info in the OP still relevant in regards to which cards are good?
> 
> I see that the MSI gaming has voltage control. What would be some of the best cards for a max air OC.
> 
> I'm talking 390's btw.


Normally an MSI card; they seem to do slightly better overall.


----------



## lightsout

Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Is the info in the OP still relevant in regards to which cards are good?
> 
> I see that the MSI gaming has voltage control. What would be some of the best cards for a max air OC.
> 
> I'm talking 390's btw.
> 
> 
> 
> Normally a MSI card.
> they seem to do slightly better overall.
Click to expand...

thanks +rep


----------



## lightsout

Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Is the info in the OP still relevant in regards to which cards are good?
> 
> I see that the MSI gaming has voltage control. What would be some of the best cards for a max air OC.
> 
> I'm talking 390's btw.
> 
> 
> 
> Normally a MSI card.
> they seem to do slightly better overall.
Click to expand...

How do you like your Nitro card? It seems they have one of the best coolers. Anything you don't like about it?


----------



## kubiks

So I never made a final post after installing the two Alphacool NexXxoS GPX 390 M02 blocks on my CrossFire MSI R9 390X setup. Installation is smooth and a breeze; the thermal pad installation was the annoying part. Considering there is no water cooling on the VRM section (with the exception of one VRM module close to the core), I felt that one slip-up there could be a problem. So far, though, so good: VRM temps are slightly lower than stock. With the two cards under water and in series (for the time being, until I can find a freaking bridge; the MSI Z97 Gaming 7 has a single-card layout, maybe 2.4 inches between cards, and it's hard to find a decent one), with one Swiftech MCRx 420 and an Alphacool UT60 280 rad, I am getting 29° idle and 45° at full load in Kombustor at 1440p. The UT60 is passive for now, as fans don't seem to make a huge difference, and I am just setting up my new Corsair Obsidian 900D; I need to make space for the dual 140 Phanteks SP fans I have.

These cards are great under water. I can push them pretty well (up to 1175/1700) with no real stability issues. I haven't taken the time to sit down and OC the cards properly; I'm still working on getting my 4790K to its sweet spot. I know there is a great wealth of info in this thread, so I'll check it out soon. OCing a dual 390X setup hits your PSU hard, and I am usually hovering around 1000 watts when OCed, 600-750 when not.

I mean, there's not much to say about these blocks. Core temps are low, as these are great waterblocks. Reviews have shown them to outperform full-cover EKs on the core alone, but these blocks are quite restrictive; I've read that it may be that restriction and block design that lends itself to the good cooling. I am running a Phobya DC12-260 with 4.7m head, and it's working perfectly, but when I bring the CPU in from its own loop I'll probably move to an MCP 655 or go with a series pump setup. Considering the low price tag for a block with a backplate, and only spending 200.00 on the blocks, this was a great purchase IMO. Ordered from Aquatuning, and they shipped internationally for 19.00, with the products showing up only TWO AND A HALF DAYS after ordering. Quite impressed with that turnaround time. The blocks even match my black, red and white build, so I'm a happy camper.

I know many people were curious about waterblocks, and this Alphacool block is rather new, so if any of you have any questions or want any tips on the installation, PM me or respond to my post.

Here they are in the new setup. Considering moving the 280 UT60 up to the roof and eliminating the Swiftech H220X.


----------



## KNG HOLDY

@kubiks: could you even keep the original MSI dragon backplate, or does the block only fit with the Alphacool one?


----------



## bichael

Eagerly awaiting my 'upgrade kit' so I can put my GPX block on.

I did manage to install the card at least: 12300 Firestrike graphics score, 8400 total, obviously held back by my dual core (I feel bad upgrading when I got such a great chip, but no doubt I eventually will). Measured around 370W max on my power meter, so the 450W SFX PSU is doing fine. GPU temp did get up to 81°C, but then that was with the fan section zip-tied to the side of my case.







No way a 2.5-slot cooler is going in an SG05...


----------



## KNG HOLDY

BTW, my screenshot so you can add me to the list:

http://www2.pic-upload.de/img/28862750/Unbenannt.png

Two questions:
It's the first time I've overclocked a GPU; is 1150/1600 an OK OC?
Second: on my ASRock Z77 Pro3 the PCIe 3.0 x16 slot is broken, and the GPU is running at x4 version 1.1 according to GPU-Z (I thought it was at least PCIe 2.0 x16). Will upgrading the motherboard to finally use PCIe 3.0 again be worth it? I'd personally prefer to wait a few months and buy Skylake in 2016.


----------



## flopper

Quote:


> Originally Posted by *KNG HOLDY*
> 
> btw my screen so that u can add me to the list:
> 
> http://www2.pic-upload.de/img/28862750/Unbenannt.png
> 
> two questions:
> its the first time i overclocked a gpu is 1150/1600 an ok oc?
> second question on my asrock z77 pro 3 the pcie 3.0x16 slot is broken and the gpu is on the x4 version 1.1 according to gpuz (i thought it was atleast an pcie 2.0 x16) will upgrading the motherboard to finally use pcie 3.0 again be worth it? i personally prefer to wait like a few month and buy skylake 2k16


The OC is a good one.

Wait; while the x4 link might be a slight bottleneck, it won't be worth changing the board just for that.


----------



## kubiks

Quote:


> Originally Posted by *KNG HOLDY*
> 
> @kubiks: could u even keep the original dragon msi backplate ? or does the block only fit with the alphacool one?


The stock MSI backplate won't fit; you need to use the Alphacool plate, which also acts as part of the heatsink.


----------



## Charcharo

I hope it is not too late to join the club








http://www.techpowerup.com/gpuz/details.php?id=2ds9e

*This is how this is done... right?

Would love to be with you guys here. This is my first personal build in 6 years







and am so far... mostly happy with it.


----------



## xboxshqip

Unigine Heaven, factory OC: 1050MHz core, 1500MHz VRAM.


----------



## rdr09

Quote:


> Originally Posted by *xboxshqip*
> 
> Unigine 'Heaven' Factory OC 1050 Mhz Clock 1500 Mhz Vram.


I've been wanting to compare a stock 390 with my 290; I have to OC my 290 to 1150 core to get 58 in that bench.

If you don't mind and have time . . . could you run Firestrike at stock clocks as well? +rep.


----------



## Streetdragon

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kubiks*
> 
> So I never made a final post after installing the two Alphacool Nexxxos GPX 390 M02 blocks on my crossfire x MSI R9 390x setup. Installation is smooth and a breeze. The thermal pad installation was the annoying part. Considering there is no water cooling to the VRM section (with the exception of one VRM module close to the core), I felt that one slip up there could be a problem. So far though, so good. VRM temps slightly lower than stock, but with the two cards under water and in series (for the time being until i can find a freaking bridge. MSI z97 gaming 7 has a single card design, might be 2.4 inch between cards and hard to find a decent one) with one Swiftech MCRx 420 and a Alphacool 280 ut 60mm rad I am getting Idle at 29* and full load Kombuster at 1440p 45*. The Ut 60 is passive for now, as it seems fans arent making a huge deal and I am just setting up my new Corsair Obsidian 900D, I need to make space for the dual 140 phanteks sp fans I have.
> 
> These cards are great under water. I can push them pretty well ( up to 1175/1700) with no real stability issues. I havent taken the time to sit down and OC the cards properly, still working on getting my 4790k to its sweet spot. I know there is a great wealth of info in this thread so Ill check it out soon. OCing on a dual 390x setup hits your psu hard, and I am usually hovering around 1000 watts when OC, 600-750 when not.
> 
> I mean, theres not much to say about these blocks. Core temps as low, as these are great waterblcoks. Reviews have shown them to outperform full cover EKs on the core alone, but these blocks are quite restrictive. Ive read that it may be that restriction and block design that lends itself to the good cooling. I am running a Phobya dc12 260 with 4.7m head, and its working perfectly, but when I bring the cpu in from its own loop, Ill probably head to the MCP 655 or go witha series pump loop. Considering the low price taag for a block with a backplate and only spending 200.00 on the blocks, this was a great purchase IMO. Ordered from aquatuning, and they shipped internationally for 19.00 with the products showing up only TWO AND A HALF DAYS after ordering. Quite impressed with that turn around time. The blocks even match my black red and white build, so I'm a happy camper.
> 
> I know many people were curious about waterblocks, and this alphacool block is rather new, so if any of you have any questions or want any tips on the installation, pm or respond to my post.
> 
> Here they are in the new setup.. Considering moving the 280 ut60 up to the roof and eliminating the Swiftech h220x.





+Rep, awesome!
Maybe I missed the VRM temps. How are the temps over time, VRM1 and VRM2?


----------



## gupsterg

Quote:


> Originally Posted by *rdr09*
> 
> i've been wanting to compare a stock 390 with my 290. i have to oc my 290 to 1150 core to get 58 in that bench.
> 
> if you don't mind and if you have time . . . do you mind running Firestrike at stock clocks as well? +rep.


Perhaps mod the RAM timings on your 290.

When the 390X came out and I saw MSI had the highest out-of-box clocks at 1100/1525, I set out to create a ROM to match/beat benches for that card, so I could run my card at the same clocks 24/7.

Firstly I was aiming to get the same 3DMark score as the MSI 390X. Most reviews give the overall 3DMark score, but part of that is made up of the physics/combined tests, which from what I saw vary more depending on the CPU in the rig.

The graphics score I found in an image in the Guru3D review: my card beat that 12967 score by 3%, with 13406.

I also saw a Valley score on Bit-Tech which my ROM beats by 3% again.

The 390/X ROMs have slightly faster RAM timings, so the only way I was matching/beating a 390X clock-for-clock in the above tests was by placing tighter timings in my ROM.
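For anyone wanting to check that kind of comparison themselves, the gain is just the ratio of graphics scores; a quick sketch in Python, using the 12967/13406 numbers quoted above:

```python
def percent_gain(score_new: float, score_ref: float) -> float:
    """Relative gain of score_new over score_ref, in percent."""
    return (score_new - score_ref) / score_ref * 100.0

# Graphics scores from the post: modded 290 ROM vs. the MSI 390X review figure.
gain = percent_gain(13406, 12967)
print(f"{gain:.1f}%")  # about 3.4%, i.e. the ~3% quoted above
```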


----------



## rdr09

Quote:


> Originally Posted by *gupsterg*
> 
> Perhaps mod the RAM timings for your 290.
> 
> When the 390X came out and I saw MSI had the highest clocked out of box 1100/1525 I had been aiming to create a rom to match/beat benches for that card so I could have my card at same clocks 24/7.
> 
> Firstly I was aiming to get same 3dMark score as MSI 390X, most reviews had overall 3dMark score, part of that is made up of physics/combined tests which vary more depending on CPU in rig from what I saw.
> 
> Now graphics score I found in image on Guru3D review. This 12967 score my card beat by 3%, 13406.
> 
> I also saw a valley score on Bit Tech which my rom beats by 3% again.
> 
> The 390/X roms have slightly faster ram timings, so only way I was matching/beating a 390X clock for clock in above tests is by placing tighter timings in my rom.


Thanks for the link. The scores for the 290 were probably recycled; comparing against my runs from over a year ago, it just matches my system with an i7 Sandy at 4.5GHz with HT off (effectively an i5) and the 290 at stock. Also, it says it used the 15.15 driver; must be a typo.

I was just really curious about the comparison between the 290 and the 300 series. So, my 1200 core matches your 13400 GS. I play at stock because, with two 290s, it is my Sandy that needs OC'ing. +rep.

edit: here was a run from Oct last year at stock with just a Sandy . . .

http://www.3dmark.com/3dm/4250058

Beat an i7 5960X @ 4.4GHz. Unless I am missing something.


----------



## Scorpion49

I finally got my Alphacool MSI block as well. Here are some pics of the install; one thing I noticed right away is that MSI doesn't put full pads on half the memory modules (also, the PCB is very brown, ick). I did 30 minutes of Valley at stock clocks to see how temps were. I'm sure it will get hotter on the core during games, because I have a single 360mm radiator cooling both the 390X and a 5930K at 4.5GHz, but what I'm seeing so far is 65°C on the core and 69/48°C on VRM1/VRM2. I may need to run my D5 a little faster; it's only at like 15% with just the CPU.


----------



## KNG HOLDY

I've played with my first AMD GPU for 2 days and had ZERO problems besides crashes while OCing.

I'm so happy; after all the stuff I read about AMD GPUs I was really concerned about whether I should buy an R9 390 :c
I want to watercool the GPU later; I have one 320 rad and one 240 rad. Can I get an even higher OC when I've already reached my max and hit 75°C with all fans at 100%? I just tried setting core voltage to +100 and power limit to +50, and I want to test whether I can get the card stable at 1200/1650 or 1150/1700, but I can't pass 1150/1650 :c


----------



## Waldos Platypus

Hey ladies and gents,

I bought a Gigabyte 390 G1 because it was the shortest available, and fit into my mini-itx case (Lian-Li pc-q08b). I've tried searching the comments, but didn't find if anyone successfully used a NZXT Kraken G10 or a Corsair HG10 on a Gigabyte 390? I realize I will probably have to change cases, but the fans are simply too loud for a case that small. If someone has, please point me to their post, or let me know how it worked for you/them. Thanks in advance.


----------



## mrbull3tproof

Quote:


> Originally Posted by *Waldos Platypus*
> 
> Hey ladies and gents,
> 
> I bought a Gigabyte 390 G1 because it was the shortest available, and fit into my mini-itx case (Lian-Li pc-q08b). I've tried searching the comments, but didn't find if anyone successfully used a NZXT Kraken G10 or a Corsair HG10 on a Gigabyte 390? I realize I will probably have to change cases, but the fans are simply too loud for a case that small. If someone has, please point me to their post, or let me know how it worked for you/them. Thanks in advance.


The Arctic Accelero Xtreme IV works perfectly for me (also very quiet), but I'm afraid that's not watercooling.

http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/690#post_24197589

Of course there's no way it will fit your case.


----------



## Charcharo

Hmm guys, I have a question... or should I say a few.

I currently have an i5 4460 CPU and an R9 390 (PCS+) and most of the time I am quite happy. I am currently at 1440x900 but I hope to soon make the jump to 1440 or 1600p. Also am on Windows 7

Now I see that in some games like Fallout 4 the CPU seems to be quite central ( http://www.eurogamer.net/articles/digitalfoundry-2015-the-best-pc-hardware-for-fallout-4-4023 ) . What I am asking is whether my R9 390 is getting bottlenecked by the i5 4460, whether the 4690 and 4790 are actually worth it anyway and whether I will get a performance increase when I switch to Windows 10 and an SSD.

*Damn that looks like a somewhat awkward comment now*


----------



## kubiks

So
Quote:


> Originally Posted by *Streetdragon*
> 
> +Rep awesome!
> Maybe i didnt saw the VRM temps.. How are the Temps over time? VRM1 AND VRM2?


So after playing Fallout 4 for 4 hours, core temp on both cards was 40°C, with VRM1 at 52°C / VRM2 at 41°C. It seems Fallout may not be the best test though, as it doesn't tax the GPUs very hard.

Two days ago, I ran MSI Kombustor at 1440p for 2 hours. The results at the end showed both cores at roughly 55°C, with VRM1 at 80°C and VRM2 at 62°C. Not bad for two hours of 140fps stress testing.


----------



## kubiks

Keep in mind I have one 280 rad (60mm thick), one 4x120 rad and one 240 rad.


----------



## kubiks

Quote:


> Originally Posted by *Charcharo*
> 
> Hmm guys, I have a question... or should I say a few.
> 
> I currently have an i5 4460 CPU and an R9 390 (PCS+) and most of the time I am quite happy. I am currently at 1440x900 but I hope to soon make the jump to 1440 or 1600p. Also am on Windows 7
> 
> Now I see that in some games like Fallout 4 the CPU seems to be quite central ( http://www.eurogamer.net/articles/digitalfoundry-2015-the-best-pc-hardware-for-fallout-4-4023 ) . What I am asking is whether my R9 390 is getting bottlenecked by the i5 4460, whether the 4690 and 4790 are actually worth it anyway and whether I will get a performance increase when I switch to Windows 10 and an SSD.
> 
> *Damn that looks like a somewhat awkward comment now*


You'll definitely see a performance increase running Fallout from the SSD. As for your processor, I wouldn't upgrade that CPU until something new comes out; that is a good gaming processor. You may be able to squeeze a bit more power from it through an OC.

By the way, with two 390Xs and a 4790K OC'd to 4.8, I still can't run Fallout 4 at 1440p ultra.

I must be doin' it wrong.

As for Windows 10 and DX12, I cannot comment on that as I have no idea.

Good luck!


----------



## xboxshqip

Quote:


> Originally Posted by *xboxshqip*
> 
> Unigine 'Heaven' Factory OC 1050 Mhz Clock 1500 Mhz Vram.


Quote:


> Originally Posted by *rdr09*
> 
> i've been wanting to compare a stock 390 with my 290. i have to oc my 290 to 1150 core to get 58 in that bench.
> 
> if you don't mind and if you have time . . . do you mind running Firestrike at stock clocks as well? +rep.


Sure, can you provide a link to the exact version you want me to try? Thanks.


----------



## rdr09

Quote:


> Originally Posted by *xboxshqip*
> 
> Sure, can you provide a link of the exact version you wont me to try, thanks.


This version includes the demos, so it is going to test how good the cooling on your card is.









https://www.techpowerup.com/downloads/2497/futuremark-3dmark-2013-v1-5-915/

Sometimes the short version goes on sale on Steam.


----------



## lightsout

Where are all the Nitro owners? Seems like the card has great cooling; any disadvantages? The price is good too.

I am still debating between the non-X 390 and a 970. I know the 390 is a bit better, but Nvidia seems to just work. Although I hate to "have" to flash the BIOS to get some proper overclocking going.


----------



## jon666

Use Sapphire TriXX for overclocking. My Nitro seems to be holding up pretty well. I think I might slap a universal waterblock on it tonight though.


----------



## seanpatrick

I've got the Sapphire Nitro w/ backplate. I've got it OC'd to 1110MHz core with a 1685MHz memory clock, with no voltage boost through Afterburner (though with a +10 power limit). It's quiet and stays at 70°C max after hours of gaming, so I'm liking it.

I am having trouble with what seems to be the common complaint of random horizontal lines flashing briefly across the screen a couple of times an hour or so (totally random), but I think that might be a driver issue that has been discussed in other threads. I'm currently waiting to see if the next driver update addresses it. If it gets bad enough I guess I'll have to RMA it; if the lines are still there once in a while, it's bye-bye AMD unfortunately.


----------



## lightsout

Quote:


> Originally Posted by *jon666*
> 
> Use sapphire trixx for overclocking. My nitro seems to be holding up pretty good. I think I might slap on a universal waterblock on it tonight though.


Quote:


> Originally Posted by *seanpatrick*
> 
> I've got the Sapphire Nittro w/ backplate. I've got it O'C'd to 1110 mhz with a 1685 clock, with no voltage boost through afterburner (though with a +10 power limit). It's quiet and stays at 70 MAX after hours of gaming, so I'm liking it.
> 
> I am having trouble with what seems to be the common complaint of random horizontal lines flashing briefly across the screen a couple of times an hour or so (totally random) - but I think that might be a driver issue that has been discussed in other threads. I'm currently waiting to see if the next driver update addresses it. If it gets bad enough I guess I'll have to RMA it - if the lines are still there once in a while it's bye-bye AMD unfortunately.


Thanks guys. Can you control voltage in Afterburner? And is the horizontal line thing on all 390s or just the Nitro?


----------



## Charcharo

Quote:


> Originally Posted by *kubiks*
> 
> Youll definitely see a performance increase running fallout on the SSD. As for your processor, I wouldnt upgrade that cpu unitl something new comes out. That is a good gaming processor. You may be able to squeeze a bit more power from it thru OC.
> 
> By the way, with two 390x;s and a 4790k oc'ed to 4.8, I still cant run fallout 4 at 1440p ultra
> 
> 
> 
> 
> 
> 
> 
> I must be doin' it wrong.-
> 
> As for windows 10 and DX12. I cannot comment on that as I have no idea.
> 
> good luck!


From what I know, I cannot overclock a 4460. I am thinking of overclocking (a bit) my R9 390 though.

I guess the SSD + Windows 10 may still help me some.


----------



## rdr09

Quote:


> Originally Posted by *kubiks*
> 
> Youll definitely see a performance increase running fallout on the SSD. As for your processor, I wouldnt upgrade that cpu unitl something new comes out. That is a good gaming processor. You may be able to squeeze a bit more power from it thru OC.
> 
> By the way, with two 390x;s and a 4790k oc'ed to 4.8, I still cant run fallout 4 at 1440p ultra
> 
> 
> 
> 
> 
> 
> 
> I must be doin' it wrong.-
> 
> As for windows 10 and DX12. I cannot comment on that as I have no idea.
> 
> good luck!


I doubt AMD will beat Nvidia for multi-GPU support on Fallout 4.


----------



## kizwan

Quote:


> Originally Posted by *KNG HOLDY*
> 
> i played with my first amd gpu for 2 days and i had ZERO problems besides crashes while oc
> 
> im so happy after all the stuff i read about amd gpus i was so concerned if i should rly buy an r9 390 :c
> i want to watercool the gpu later i have 1 320 and 1 240 rad, can i get an even higher oc when i already reached my max and get a max temp of 75°C with all fans at 100%? i just tried to set core voltage to +100 and power limit to the +50 and want to test if i can get the card to be stable at 1200/1650 or 1150/1700 but i cant pass 1150/1650 :c


Did you try increasing the AUX voltage to +50mV? At 1200 & above on the core with a high memory overclock, it's likely you need a lot of voltage to get it stable in gaming. I think the nice spot for gaming is the 11XX range for the core.
Quote:


> Originally Posted by *kubiks*
> 
> So
> Quote:
> 
> 
> 
> Originally Posted by *Streetdragon*
> 
> +Rep awesome!
> Maybe i didnt saw the VRM temps.. How are the Temps over time? VRM1 AND VRM2?
> 
> 
> 
> \
> So after playing Fallout 4 for 4 hours core temp on both cards was at 40*, and VRM1 was 52* / VRM2 41*. It seems that Fallout may not be the best test though as it doesnt tax the gpus very hard.
> 
> Two days ago, I ran MSI Kombustor at 1440p for 2 hours. The results at the end showed the cores to both be roughly 55* and Vrm1 80* Vrm 2 62*. Not bad for two hours of 140fps stresstesting
Click to expand...

If you have GTA V, see what temps you get with it.
Quote:


> Originally Posted by *seanpatrick*
> 
> I've got the Sapphire Nittro w/ backplate. I've got it O'C'd to 1110 mhz with a 1685 clock, with no voltage boost through afterburner (though with a +10 power limit). It's quiet and stays at 70 MAX after hours of gaming, so I'm liking it.
> 
> I am having trouble with what seems to be the common complaint of random horizontal lines flashing briefly across the screen a couple of times an hour or so (totally random) - but I think that might be a driver issue that has been discussed in other threads. I'm currently waiting to see if the next driver update addresses it. If it gets bad enough I guess I'll have to RMA it - if the lines are still there once in a while it's bye-bye AMD unfortunately.


Does this problem happen at stock clocks? Monitor: 60Hz/120Hz/144Hz? HDMI or DP?


----------



## rdr09

Quote:


> Originally Posted by *rdr09*
> 
> Thanks for the link. The scores for the 290 were prolly recycled. i compared to my runs of over a year ago and it just matches my system with just an i7 Sandy at 4.5GHz HT off (i5) and the 290 at stock. Also, it says it used 15.15 driver. must be typo.
> 
> I was just really curious about the comparison between 290 and the 300. So, my 1200 core matches your 13400 GS. I play at stock 'cause with 2 290s . . . it is my Sandy that needs oc'ing. +rep.
> 
> edit: here was a run of Oct last year at stock with just a Sandy . . .
> 
> http://www.3dmark.com/3dm/4250058
> 
> Beat an i7 5960K @ 4.4GHz. Unless i am missing something.


I think even the 390X result is off . . .

http://www.3dmark.com/3dm/7690148?

considering the physics score of the 5960X is 20K. Mine is only 11K. It got rigged. lol


----------



## TsukikoChan

I've been using my Sapphire 390X for a few weeks now and am quite happy with it, although the fans are noticeable at anything above 40% fan speed :< I wear headphones so it's OK, but it's still strange to hear it spin up sometimes.
Using TriXX I've got it sitting at 1132/~1600MHz (I can't remember the exact memory clock I use) with +50-54mV and +50% power; it runs stable in games (I could probably push it more). I put a custom fan curve on it so it now rarely goes above 64°C in games (VRM sits between 54 and 65°C in games); leaving it on automatic fan control tends to make it a bit noisier. I'll try to take some screenshots tonight of my settings and curve, and try to get a benchmark on the go.
It runs The Evil Within at a near-solid 58-60fps (I think at high/ultra quality) and Shadow of Mordor at ultra settings (+HD textures) at between 65 and 100fps (80-100 in the first area, 65-90 in the second area). The card is a beast!

One thing I've noticed since moving to the 390X, though: sometimes (very rarely) my PC freezes up. I mean completely, not just the screen. Sometimes this happens during Dota 2, SoM or The Evil Within. I've been on VoIP at the same time and nothing is heard or sent, like the entire PC has locked up. I need to manually power down and reboot to get it working again. Happens like 1-2 times a week. Anyone else get this? :< I didn't get this prior to the 390X.
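For anyone unfamiliar with them, a custom fan curve like the one mentioned above is just a set of (temperature, fan %) points with interpolation in between; a minimal sketch in Python, where the curve points are made up for illustration (not TsukikoChan's actual TriXX settings):

```python
# Hypothetical fan curve: (GPU temp in °C, fan speed in %) points, sorted by temp.
CURVE = [(30, 20), (50, 30), (64, 40), (75, 70), (85, 100)]

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate the fan % for a temperature; clamp at the curve ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(57))  # 35.0: halfway between the 50°C and 64°C points
```

The point of a manual curve over automatic control is exactly what the post describes: you can trade a few degrees for a quieter ramp in the range where your card actually sits during games.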


----------



## xboxshqip

FireStrike 1.1
http://www.3dmark.com/3dm/9313372
Quote:


> Originally Posted by *rdr09*
> 
> this version includes the demos, so this is going to test how good your cooling on the card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.techpowerup.com/downloads/2497/futuremark-3dmark-2013-v1-5-915/
> 
> sometimes the short version goes on sale in steam.


There you go
http://www.3dmark.com/3dm/9313372


----------



## kizwan

Quote:


> Originally Posted by *xboxshqip*
> 
> FireStrike 1.1
> http://www.3dmark.com/3dm/9313372
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> this version includes the demos, so this is going to test how good your cooling on the card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.techpowerup.com/downloads/2497/futuremark-3dmark-2013-v1-5-915/
> 
> sometimes the short version goes on sale in steam.
> 
> 
> 
> There you go
> http://www.3dmark.com/3dm/9313372
Click to expand...

290 with 390 modded BIOS. Comparison just for fun. ~1.25V on core & 1.047V AUX.

http://www.3dmark.com/compare/fs/6509421/fs/6508894


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> 290 with 390 modded BIOS. Comparison just for fun. ~1.25V on core & 1.047V AUX.
> 
> http://www.3dmark.com/compare/fs/6509421/fs/6508894


Hey man.

What's the max load voltage you're getting when you input 1.25 on DPM? Say, at +200 on TriXX?


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 290 with 390 modded BIOS. Comparison just for fun. ~1.25V on core & 1.047V AUX.
> 
> http://www.3dmark.com/compare/fs/6509421/fs/6508894
> 
> 
> 
> Hey man.
> 
> What's the max load voltage you getting when you input 1.25 on DPM? Say at +200 on trixx?
Click to expand...

Max voltage was 1.445V (with the Elpida card) before Vdroop with +200mV. With 947/1250, DPM7 was 1.16 to 1.18V if I remember correctly, but once the clocks were set to 1000/1300, the DPM7 voltage jumped up to 1.25V on both cards (Elpida & Hynix).
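Those numbers line up with how software offsets stack on the DPM table: the peak is roughly the DPM7 base voltage plus the TriXX/Afterburner offset, and the load reading then droops below that. A rough sketch (the droop value here is a made-up example, not a measurement):

```python
def peak_voltage(dpm7_base_v: float, offset_mv: float) -> float:
    """Requested peak core voltage: DPM7 table entry plus software offset."""
    return dpm7_base_v + offset_mv / 1000.0

peak = peak_voltage(1.25, 200)            # 1.25V DPM7 + 200mV offset
print(f"peak before droop: {peak:.3f}V")  # 1.450V, close to the 1.445V observed

# Vdroop under load depends on the board's VRM/LLC behaviour; 30mV is illustrative.
print(f"under load (example 30mV droop): {peak - 0.030:.3f}V")
```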


----------



## mus1mus

Hmm. I must be doing something wrong.

If I touch the voltage table (DPM 7) I can only get 1.344 at load, 1.359 before droop.

If I just leave it alone, 1.42ish.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Hmm. I must be doing something wrong.
> 
> If I touch the Voltage table (dpm 7) I can only get 1.344 at load. 1.359 before droop.
> 
> If I just let it, 1.42ish.


How is that possible?

I used your modded BIOS with 1.25V at DPM 7 and it registered above 1.4V with +200mV. I'll flash your modded BIOS back and record the voltage while running Firestrike.


----------



## Waldos Platypus

Thanks for the reply, but you're absolutely correct when saying it won't fit. I might need to bite the bullet and get a different case.


----------



## gupsterg

Quote:


> Originally Posted by *rdr09*
> 
> Thanks for the link.


No worries on the link.
Quote:


> Originally Posted by *rdr09*
> 
> I think even the 390X results is off . . .
> 
> http://www.3dmark.com/3dm/7690148?
> 
> considering the physics score of the 5960X is 20K. Mine is only 11K. It got rigged. lol


No, the 20K physics score is not off; use the search facility on the 3DMark database.

Link:- top 3 results in database for 5960X with 1x 390X

The i7 5960X has 8 physical cores; with HT it can do 16 threads. The i7 2700K is half that.

Link:- i7 2700K vs i7 5960X

Look at the info here regarding each 3DMark 13 test/scoring method. Under the red text heading "Physics" you'll see important info on why the 5960X scores more than the 2700K.


----------



## rdr09

Quote:


> Originally Posted by *gupsterg*
> 
> No worries on link
> 
> 
> 
> 
> 
> 
> 
> .
> No the 20K physics is not off
> 
> 
> 
> 
> 
> 
> 
> , use the search facility on 3dmark database
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Link:- top 3 results in database for 5960X with 1x 390X
> 
> The i7 5960X has 8 physical cores, with HT it can do 16 threads, i7 2700K is half that.
> 
> Link:- i7 2700K vs i7 5960X
> 
> Look at info here regarding each 3dMark 13 test/scoring method. You'll see under red text heading physics, important info relating to why the 5960X is scoring more than 2700K.


I know the physics score is right on; it's the graphics score of the 390X that is low. It should be above 13K. Same with the overall score for the stock 290: theirs matches mine with just an i7 Sandy @ 4.5GHz.









Actually, they picked a low-clocked 390X. I've seen some clocked at 1080 out of the box; I guess that was what was available.

@xbox, thank you for the FS run. I really do have to OC 100MHz to match a 390. Wish I could do what kizwan did.


----------



## bbrotha

Hi, I recently purchased an MSI 390 card and I have a few doubts that perhaps you guys can help me with. This is my setup:

i7 4790 (non-K) @ 3.6GHz
16GB RAM @ 1600MHz
750W EVGA PSU
MSI Gaming 5 mobo
MSI 390
1440p 27" monitor

It seems that my card is getting bottlenecked somehow. I run most games at 1440p at 50-60fps, but when I play some old games it still gives me 60fps; even if I downgrade the resolution to 720p, it seems stuck at 60fps. Also, the GPU won't go higher than 60°C and I have never heard the fans spin up very high, so I'm guessing it is not giving full performance.

Anyone had similar issues? Is there a possibility that something is capping my performance?


----------



## Charcharo

Quote:


> Originally Posted by *bbrotha*
> 
> Hi, I recently purchase a MSI 390 card and I got a few doubts that perhaps you guys can help me with, this is my setup:
> 
> i7 4790 (non-k) @ 3.6 ghz
> 16 gb ram @ 1600 mhz
> 750 watts evga psu
> MSI Gaming 5 mobo
> MSI 390
> 1440p 27" monitor
> 
> It seem that my card is getting bottleneck somehow, I run mostly every game in 1440p at 50-60 fps, but when I play some old games it still gives me the 60fps, even if I downgrade the resolution to 720p, it seems it is stuck at 60fps. Also the gpu wont go higher than 60c and I never have listened the fans go extremely high, so I'm guessing it is not giving full performance.
> 
> Anyone had a similar issues? Is there a possibility that something is capping my performance?


Really depends on the old games too (to an extent).

Which ones are giving you issues?


----------



## bbrotha

For example, Skyrim: I even run it at 720p and it doesn't go higher than 60fps. I'm guessing it should be around 90fps maybe...


----------



## gupsterg

Quote:


> Originally Posted by *rdr09*
> 
> i know the physics score if right on. the graphics score of the 390X is low. it should be above 13K. Same with the overall score for the stock 290. Theirs is matching mine with just an i7 Sandy @ 4.5GHz.


I've seen 13k for the graphics score on its own, not for the overall; most reviews state the overall score from what I recall.
Quote:


> Originally Posted by *rdr09*
> 
> Actually, they picked the low clocked 390X. I've seen some clocked at 1080 out the box. i guess that was what's available.


The Guru3D & Bit-Tech reviews I linked were for the MSI 390X Gaming, which is 1100/1525 out of the box and was tested at those speeds.

Guru3D don't state it on the test setup page, but on the overclocking page under the heading "this sample". Link:- http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,26.html

Bit-Tech test setup Link:- http://www.bit-tech.net/hardware/graphics/2015/07/09/msi-r9-390x/2


----------



## Charcharo

Quote:


> Originally Posted by *bbrotha*
> 
> for example, Skyrim, I even run it at 720p and it doesn't go higher than 60fps, I'm guessing it should be around 90fps maybe...


From what I know, Skyrim is V-sync capped at 60 and if the game goes above it, it will start to bug out... badly.


----------



## bbrotha

OK, I guess I'll try another game that isn't capped and see if I get different results, thanks!


----------



## rdr09

Quote:


> Originally Posted by *gupsterg*
> 
> I've seen 13k for graphics on its own not for overall in reviews, most reviews state overall from what I recall.
> The guru3d & bit tech reviews I linked were for the MSI 390X Gaming which is 1100/1525 out of the box and was tested at those speeds.
> 
> Guru3d don't state on test setup page but on overclocking page, heading this sample Link:- http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,26.html
> 
> Bit tech test setup Link:- http://www.bit-tech.net/hardware/graphics/2015/07/09/msi-r9-390x/2


If it is clocked at 1100, then it should definitely score higher than it did in the screenshot of the link you posted. I've seen in this very thread some stock 1080 cards score over 13K in GS.

Like I said, even their 290 overall score is questionable. And, just recently, their W3 bench numbers are questionable. We may have another Anand on our hands.


----------



## Charcharo

Quote:


> Originally Posted by *bbrotha*
> 
> ok, I guess I would try with any other game that its not capped and see if I can get different results, thanks!


If you want to mutilate your video card on an old but good-looking game, I can always suggest:

STALKER Clear Sky; ver 1.5.10 + Sky Reclamation (DX10.1, alpha-tested MSAA x4 on). This is the single most punishing vanilla old-game DX10.1 bench I can think of. It will also hammer your CPU to hell and back (A-Life). MSAA on all foliage... simply insane. And with the bug-fix Sky Reclamation Mod it is one hell of a great game too; nothing in the new ones gets close.

Also STALKER Lost Alpha can kill a system, but wait for its update that adds DX11.

What else can hammer a card that is old... IDK right now. Crysis 1 is always a choice. Or modding games a lot.


----------



## seanpatrick

Quote:


> Originally Posted by *kizwan*
> 
> Did you try increasing the AUX voltage to +50mV? At 1200 & above on core with a high memory overclock, it's likely you need a lot of voltage to get it stable in gaming. I think the nice spot for gaming is in the 11XX range for the core.
> If you have GTA V, see what temps you get with it.
> Is this problem happened at stock clock? Monitor - 60Hz/120Hz/144Hz? HDMI or DP?


This happens at stock clocks. It happens on both monitors, both run at 60hz, 1080p; dvi to monitor and HDMI to projector. I've replaced the cpu and motherboard but the issue remains. I've clean installed the last few catalyst drivers as well, no change.


----------



## tolis626

Quote:


> Originally Posted by *seanpatrick*
> 
> This happens at stock clocks. It happens on both monitors, both run at 60hz, 1080p; dvi to monitor and HDMI to projector. I've replaced the cpu and motherboard but the issue remains. I've clean installed the last few catalyst drivers as well, no change.


Maybe I'm stating the obvious, but you never know... Have you tried with V-sync on? This sounds an awful lot like tearing to me. Don't just check the tickbox, check to see if it is indeed working. Also, don't force your framerate to 60Hz or anything of that nature for now. Just try everything normal, just with V-sync on. If the issue persists, I dunno... I doubt it's AMD's fault though. First time I'm hearing of this...


----------



## gupsterg

Quote:


> Originally Posted by *rdr09*
> 
> If it is clocked at 1100, then it should definitely score higher that what it was in the screenshot of the link you posted. i've seen in this very thread some with stock 1080 score over 13K in GS.
> 
> Like i said, even their 290 overall score is questionable. And, just recently, their W3 bench numbers are questionable. We may have another Anand in our hands.


Dunno about benches in this thread, as only recently subbed to it and started taking an interest in posts.

I do browse the 3dmark data every so often and only view valid scores, but I recall reading something about how scores can be manipulated by a user's setup (dunno how true that is), so I do bear that in mind. The other thing I consider is whether the score was obtained artifact-free; there must be some scores that Futuremark's software deems valid but which actually artifacted. So on the whole I just use it as a rough guide.

Generally I take all benches in reviews with a pinch of cynicism/suspicion.

I've been a subscriber to CustomPC (a mag in the UK) since 2003 and so far have never found one of its benches to be unrealistic. Their reviews I highly regard, and they're the only ones I don't treat with cynicism/suspicion. Bit-tech has some of that magazine's reviews online; IIRC they have some kind of affiliation with it. They use the same test rig for a few years IIRC (or until it's deemed a bottleneck), so comparisons across reviews can be done. They also do these mega graphics card roundups which I find handy; IIRC those are not published online. So far, every time I've bought a CPU/GPU and compared my rig with their data, I've not found an anomaly.


----------



## rdr09

Quote:


> Originally Posted by *gupsterg*
> 
> Dunno about benches in this thread, as only recently subbed to it and started taking an interest in posts.
> 
> I do browse the 3dmark data every so often and only view valid scores, but I recall reading something about how scores can be manipulated by a user's setup (dunno how true that is), so I do bear that in mind. The other thing I consider is whether the score was obtained artifact-free; there must be some scores that Futuremark's software deems valid but which actually artifacted. So on the whole I just use it as a rough guide.
> 
> Generally I take all benches in reviews with a pinch of cynicism/suspicion.
> 
> I've been a subscriber to CustomPC (a mag in the UK) since 2003 and so far have never found one of its benches to be unrealistic. Their reviews I highly regard, and they're the only ones I don't treat with cynicism/suspicion. Bit-tech has some of that magazine's reviews online; IIRC they have some kind of affiliation with it. They use the same test rig for a few years IIRC (or until it's deemed a bottleneck), so comparisons across reviews can be done. They also do these mega graphics card roundups which I find handy; IIRC those are not published online. So far, every time I've bought a CPU/GPU and compared my rig with their data, I've not found an anomaly.


When two or more people are getting pretty similar results at 1100 or whatever clocks, and then all of a sudden a member comes up with 1000 more points at the same clocks... that gives it away.

It should be fairly easy to ask the members here for a favor: an 1100 run. But I've browsed before, and that is why I pretty much know I have to OC my 290 200MHz more to match a 390X at stock, or close to it.


----------



## fat4l

Here is my score for a single card, 1250/1700MHz.
15030 Graphics score


----------



## seanpatrick

Quote:


> Originally Posted by *tolis626*
> 
> Maybe I'm stating the obvious, but you never know... Have you tried with V-sync on? This sounds an awful lot like tearing to me. Don't just check the tickbox, check to see if it is indeed working. Also, don't force your framerate to 60Hz or anything of that nature for now. Just try everything normal, just with V-sync on. If the issue persists, I dunno... I doubt it's AMD's fault though. First time I'm hearing of this...


There are many more than me unfortunately:

https://community.amd.com/thread/191232

My problem happens whether I'm gaming, or just on the desktop doing nothing - so I doubt it's a V-sync issue. My refresh rate isn't forced, it's just what's been automatically selected, and I might add, has never been problematic on any of my previous cards (including the 280x series).


----------



## tolis626

Quote:


> Originally Posted by *seanpatrick*
> 
> There are many more than me unfortunately:
> 
> https://community.amd.com/thread/191232
> 
> My problem happens whether I'm gaming, or just on the desktop doing nothing - so I doubt it's a V-sync issue. My refresh rate isn't forced, it's just what's been automatically selected, and I might add, has never been problematic on any of my previous cards (including the 280x series).


Ok... Maybe try changing your refresh rate to 59Hz from 60Hz? Or maybe the other way around, if it's set to 59Hz by default. Try that and see if it helps. If that doesn't help either and your games don't crash... Then I'm out of ideas too.


----------



## kubiks

My firestrike crossfire 390x under water. mild oc

I am fairly new to 3DMark; what's the difference between the graphics score and the total score? My gfx score was over 25k.


----------






## SystemTech

Hey guys, need some quick guidance...

I have the opportunity to upgrade my card to a 390X (from an ASUS DCU 390) and only have to pay the difference (~$100 depending on brand).

Is the difference worth the $100?

My 1 year plans are to watercool and get a second card (whichever i choose).
Going 390X might mean a change of brand to MSI and then full coverage blocks are limited (although i love the look of the alphacools).

What are your thoughts? Change or stay?
I'm very tempted to switch to an MSI 390X Gaming, but that might just be the e-peen talking.


----------



## seanpatrick

Quote:


> Originally Posted by *kubiks*
> 
> My firestrike crossfire 390x under water. mild oc
> 
> I am fairly new to 3d mark, whats the difference between graphics score and the total score? MY gfx score was over 25k


This post reminds me of Reddit 'Gone Wild' posts like 'I'm new here, be gentle' or 'shy girl, be nice' - meanwhile the girl is SMOKING hot and obviously not that shy at all...

18000+ score? I'd say you're doing alright


----------



## Nickyvida

Just got my MSI 390. Jumped over to AMD for the first time, so I'm not too sure what to do. Pretty much dumbfounded navigating through CCC.


----------



## Dundundata

Use Afterburner instead


----------



## flopper

Quote:


> Originally Posted by *SystemTech*
> 
> Hey guys, need some quick guidance...
> 
> I have the opportunity of upgrading my card to a 390X (from a ASUS DCU 390) and only have to pay the difference (~$100 depending on brand).
> 
> Is the difference worth the $100?
> 
> My 1 year plans are to watercool and get a second card (whichever i choose).
> Going 390X might mean a change of brand to MSI and then full coverage blocks are limited (although i love the look of the alphacools).
> 
> What are your guys thoughts. Change or stay???
> Im very tempted to switch it up to a MSI 390X Gaming but that might just be the e-peen talking


For 1440p, yes.
For 1080p, maybe.

Overall, about a 100MHz better OC on average on an X version.


----------



## Nickyvida

Quote:


> Originally Posted by *Dundundata*
> 
> Use Afterburner instead


Thanks. I'm switching between the two at the moment; starting to feel my way around now. Is 15.11 the latest driver AMD has for the 390?


----------



## flopper

Quote:


> Originally Posted by *Nickyvida*
> 
> Thanks. im switching between the two at the moment. Starting to feel my way round it now. Is 15.11 the latest drivers AMD has for the 390?


15.11.1 beta


----------



## mus1mus

So, if a 290X is $20 cheaper than the 390, which is the better buy?


----------



## SystemTech

Quote:


> Originally Posted by *flopper*
> 
> for 1440p yes.
> for 1080p maybe.
> 
> overall a 100mhz better OC at average on a x version.


My thoughts as well. Well, I'm already hunting for a 1440p screen, so I'm guessing that answers that, haha.
MSIs seem to be the best bang for buck too.


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> So, if a 290X is $20 cheaper than the 390, which is the better buy?


Probably the 390, unless the 290X is a lightning, Trix new version, VaporX, or PCS+ model.


----------



## AverdanOriginal

Quote:


> Originally Posted by *kubiks*
> 
> My firestrike crossfire 390x under water. mild oc
> 
> I am fairly new to 3d mark, whats the difference between graphics score and the total score? MY gfx score was over 25k


I am not sure, but I think the difference is the following:

Graphics score is your graphics card (or cards, in your case).
Physics score is your CPU + mobo + RAM.
Combined score is... duh, a combination of both of the above.
Total score is everything combined and weighted (but not a simple average, it seems).

Hope that helps
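For what it's worth, the overall number isn't an average: Futuremark's technical guide describes it as a weighted harmonic mean of the sub-scores. A small sketch follows; the 0.75/0.15/0.10 weights are my recollection of that guide and may differ between 3DMark versions, so treat them as an assumption.

```python
def fire_strike_overall(graphics: float, physics: float, combined: float) -> float:
    """Weighted harmonic mean of the three Fire Strike sub-scores.

    Weights (0.75 graphics, 0.15 physics, 0.10 combined) are assumed
    from Futuremark's technical guide and may not match every version.
    """
    w_g, w_p, w_c = 0.75, 0.15, 0.10
    return (w_g + w_p + w_c) / (w_g / graphics + w_p / physics + w_c / combined)

# The harmonic mean is dragged down by the weakest sub-score, which is
# why the overall number sits below a strong graphics score.
print(round(fire_strike_overall(12500, 11000, 5000)))  # roughly 10680
```

This is also why a 25k graphics score in CrossFire can end up as an ~18k overall: the CPU-bound physics score pulls the harmonic mean down.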


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> Probably the 390, unless the 290X is a lightning, Trix new version, VaporX, or PCS+ model.


It's between the 290X Twin Frozr and a Nitro 390. Both new.

I'm leaning towards the MSI due to FC block availability, and I have a couple of 290s already. One is on an RMA; the other clocks really well.

But the Nitro is equipped with 8GB of VRAM, which may have an edge considering I have a 1600p monitor.

Any performance graphs from the Nitro owners around here?

I'm also not looking at stock clocks, as you can see; I'm pretty much jacking it up.
http://www.3dmark.com/3dm11/10544612


----------



## TsukikoChan

Looks like Sapphire Trixx got updated in the last day; it gives a lot more options for Fury cards. Wonder how it changes things for the 390X.

https://www.reddit.com/r/3t79oc/sapphire_trixx_v521_is_out_amd_fury_voltage/


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> It's between the 290X Twin Frozer and a Nitro 390. Both new.
> 
> I'm leaning towards the MSI due to the FC block availability, and I have a couple of 290s already. One is on an RMA, the other, clocks really good.
> 
> But the nitro is equipped with 8GB VRAMs that may have an edge considering I have a 1600p monitor.
> 
> Any performance graphs from the nitro owners round here?
> 
> I'm also not looking for stock clocks as you can see, I am a pretty jackin' it up.
> http://www.3dmark.com/3dm11/10544612


Hmm, well, the MSI cooling is only average on the 290(X) cards, and the 8GB will serve you well at 1600p, so it might be best to go with that card, tbh.

That being said, if you are going on water the cooling is irrelevant... In that case I'd give the edge to the MSI 290X, UNLESS you value the 8GB more.


----------



## lightsout

Quote:


> Originally Posted by *TsukikoChan*
> 
> looks like sapphire trixx got updated in the last day, gives a lot more options for Fury cards.. wonder how it changes things for the 390x
> 
> __
> https://www.reddit.com/r/3t79oc/sapphire_trixx_v521_is_out_amd_fury_voltage/


The update also took voltage control away for some cards. Or maybe it was the previous update.


----------



## TsukikoChan

Quote:


> Originally Posted by *lightsout*
> 
> The update also took voltage control away for some cards. Or maybe it was the previous update.


Hmm, curious. I use the previous version on my Sapphire 390X and it does give me voltage control. I'll be installing this one tonight and will let you know how it looks afterwards.


----------



## lightsout

Nitro owners: can you control voltage with AB, or do you have to use Trixx?


----------



## jon666

Last time I tried, it was only Trixx that allowed voltage changes.


----------



## Gumbi

Quote:


> Originally Posted by *jon666*
> 
> Last time I tried it was only trixx that allowed changes in voltage


Afterburner has a setting which unlocks voltage control... All 390Xs bar the Gigabyte one can be overvolted in Afterburner AND Trixx.


----------



## Notarnicola

I'm just pushing my MSI 390 to see what I can get out of it!
Is 200mV too much? Temps are still under 80ºC on the core and under 70ºC on the VRM!


----------



## Gumbi

Quote:


> Originally Posted by *Notarnicola*
> 
> I'm just pushing my MSI 390 to see what I can get out of it!
> Is 200mV too much? Temps are still under 80ºC on the core and under 70ºC on the VRM!


For bench runs it's fine, but it's a tad much for air generally. I'd prefer to keep my temps under 70 too for that.

The MSI cooling is very good, but not THAT good.


----------



## Dundundata

Quote:


> Originally Posted by *Nickyvida*
> 
> Thanks. im switching between the two at the moment. Starting to feel my way round it now. Is 15.11 the latest drivers AMD has for the 390?


If you are going to overclock, up the power limit to max. Also I recommend a custom fan profile. Rivatuner should be included and you can setup the OSD so you can see gpu temps, gpu load, fan speed etc. in games and benchmarks.


----------



## Dundundata

Quote:


> Originally Posted by *Notarnicola*
> 
> I'm just pushing my MSI 390 to see what I can get out of it!
> Is 200mV too much? Temps are still under 80ºC on the core and under 70ºC on the VRM!


Please post your results at that voltage!


----------



## jon666

AAAAAAAAAGGGHH. I broke this http://www.datasheet4u.com/share_search.php?sWord=78M05 which belongs to this http://www.datasheetarchive.com/6r8%20COIL-datasheet.html (only the center piece). Currently looking for a soldering gun. The power went out right after I started working the stock cooler loose. At least my universal waterblock will mate with this card no problem.


----------



## lightsout

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jon666*
> 
> Last time I tried it was only trixx that allowed changes in voltage
> 
> 
> 
> Afterburner has a setting which unlocks voltage control... All 390Xs bar the Gigabyte one can be overvolted in Afterburner AND Trixx.
Click to expand...

Is that a fact? Great news if so can anyone else vouch for this info?


----------



## Notarnicola

1200/1500 150mV power limit: 10%
Temps GPU: max 77ºC min 38ºC
Temps VRM1: max 66ºC min 36ºC



3dmark at 1190/1500 150mV power limit: 10%
Temps GPU: max 80ºC min 38ºC
Temps VRM1: max 69ºC min 36ºC


----------



## Gumbi

@ Notarnicola

150mv and 10% power limit? You're almost certainly power throttling.


----------



## battleaxe

Quote:


> Originally Posted by *Notarnicola*
> 
> 
> 
> 
> 
> 1200/1500 150mV power limit: 10%
> Temps GPU: max 77ºC min 38ºC
> Temps VRM1: max 66ºC min 36ºC
> 
> 
> 
> 3dmark at 1190/1500 150mV power limit: 10%
> Temps GPU: max 80ºC min 38ºC
> Temps VRM1: max 69ºC min 36ºC


Quote:


> Originally Posted by *Gumbi*
> 
> @ Notarnicola
> 
> 150mv and 10% power limit? You're almost certainly power throttling.


Pretty much. That score is low for those clocks.


----------



## Notarnicola

Quote:


> Originally Posted by *Gumbi*
> 
> @ Notarnicola
> 
> 150mv and 10% power limit? You're almost certainly power throttling.


Got the same


1200/1500 163mV power limit: 10%


1200/1500 150mV power limit: 50%

just max and min changed a bit!


----------



## Gumbi

Try 3DMark. It's a lot more intense than Valley.


----------



## mus1mus

I agree with the guys.

I scored 2 FPS more with 3045 at same clocks on a 290.

But I am running cooler. Maybe that matters as well.


----------



## jon666

Bought a soldering gun, been reading up on how to fix stuff on GPU's with one of those. Will probably practice on a dead 7870. Will probably re-seat universal waterblock once I get everything else on my 390 figured out. I occasionally have to convince myself that volt modding a 390 would be a wasted endeavor. I might try it just to do it. Might do it since I already damaged that flipping voltage regulator.


----------



## Nickyvida

Quote:


> Originally Posted by *flopper*
> 
> 15.11.1 beta


Oh, I see. I heard about Crimson, but I'm not sure if it's only for the Fury line. Is it for all AMD graphics cards?


----------



## Nickyvida

Quote:


> Originally Posted by *Dundundata*
> 
> If you are going to overclock, up the power limit to max. Also I recommend a custom fan profile. Rivatuner should be included and you can setup the OSD so you can see gpu temps, gpu load, fan speed etc. in games and benchmarks.

Hmm, as in slide the power limit to the right?

I'm not really sure how to OC a graphics card; I've never done it before. But I have Rivatuner set up already, and it works great in-game so far.

Is there a way to flash my BIOS to a 390X's, or enable the disabled 390X compute units, by the way?


----------



## SystemTech

Quote:


> Originally Posted by *SystemTech*
> 
> My thoughts as well. Well im already hunting for a 1440p screen so im guessing that answers that haha.
> MSI's seem to the best bang for buck too


So it's pretty much confirmed: MSI R9 390X Gaming incoming, and my ASUS R9 390 DCUII is going into my brother's rig.


Hopefully collecting it today but maybe tomorrow.


----------



## Jonindo21

Hey guys,

Bought an MSI R9 390 the other day, ran 3DMark Fire Strike and got a score of around 9800. Is that normal? BTW, I have 8GB/1333MHz RAM and an i5 3470 @ 3.4GHz.

Cheers!


----------



## Slowpoke66

Quote:


> Originally Posted by *Jonindo21*
> 
> Hey guys,
> 
> Bought an MSI R9 390 the other day, ran 3DMark Fire Strike and got a score of around 9800. Is that normal? BTW, I have 8GB/1333MHz RAM and an i5 3470 @ 3.4GHz.
> 
> Cheers!


Is that the total score? Focus on the graphics score for comparison.

Here's a link to my stock vs my highest OC in Firestrike with the same card: http://www.3dmark.com/compare/fs/6511569/fs/6520009


----------



## Jonindo21

Quote:


> Originally Posted by *Slowpoke66*
> 
> Is that the total score? Focus on the graphics score for comparison.
> 
> Here's a link to my stock vs my highest OC in Firestrike with the same card: http://www.3dmark.com/compare/fs/6511569/fs/6520009


I'll have to get back to you on that in about 3 hours, after my Electromagnetics exam.


----------



## Vexile

I'm having trouble deciding which R9 390 to buy, since the prices in my country are messed up (we use a different currency).
I'm planning to OC the card as much as I can.

POWERCOLOR AXR9 390 8GBD5-PPDHE - ~$386
SAPPHIRE NITRO R9 390 8G GD5 L - ~$400
MSI R9 390 GAMING 8G - ~$424

I was leaning towards the Nitro card, since it probably has the best cooling, but I read it doesn't overclock so well, or at least the MSI one overclocks much better with slightly worse temperatures.

Suggestions?


----------



## navjack27

Oh, you want the 390, not the X. Well, just for data to chew on: my 390X is still stable at 1100/1650 and gets great fps and temps in everything. Of course I don't leave it at the stock fan curve, that's dumb. Maybe it'll hit 65C, or as high as 70C if it's a bad hot day and I don't open my window or something. My case has, like... no fans in it... don't judge me. I'm getting some new memory, so I'll run some new benchmarks for nice numbers in a couple of days.

my last benches -
fire strike - http://www.3dmark.com/fs/6406813
api - http://www.3dmark.com/aot/83893


----------



## navjack27

Nickyvida,
there aren't disabled computes in a 390x man, you got them all

EDIT: didn't mean to make double post, sorry


----------



## Nickyvida

Quote:


> Originally Posted by *navjack27*
> 
> Nickyvida,
> there aren't disabled computes in a 390x man, you got them all
> 
> EDIT: didn't mean to make double post, sorry


Thanks for the data.


Really quite helpful

Whoops. I meant to say flash my 390 to the 390X BIOS, because of the disabled compute units and shaders in my 390 compared to the 390X (2560 shaders for the 390 vs 2816 for the 390X, and 40 compute units for the 390 vs 44 for the 390X).

I've heard of people getting the extra shaders or compute units (from those disabled on the 290) by flashing their BIOS to the 290X's, if my memory serves me right. But it is a risky process and I'm a tech idiot...

Just another question, hope it's not too dumb.

I used to keep my 780's fan speed at 100% all the time, and with it dying unexpectedly at 2 years 5 months, does having the fan speed set to 100% all the time decrease the life expectancy of the GPU? Or was it gaming frequently that did it in?

I'm going off the sticker on the Twin Frozr V cooler, so I'm not too sure what increase in lifetime MSI specifies. I'm thinking it's the fan blades, but I could be wrong.


----------



## SystemTech

I think they are referring to the fact that the fans can actually stop under light GPU loads, meaning you basically have a passive GPU when you're browsing the web, working in Office, etc., and only once the card hits certain temperature ranges do the fans actually start up.
Therefore, because the fans are not working nearly as much, their lifespan increases greatly.

With regards to your 780: yes, at 100% for 2 years, it is impressive that it lasted so long. These are high-RPM fans and therefore have a shorter lifespan compared to, say, a 120mm standard case fan.
I normally set the fans up manually on a linear line (for every 1ºC the card goes up, the fan goes up 1%, with a 10% head start): at 90ºC the fans are at 100%, at 80ºC they're at 90%, at 70ºC it's 80%, and so on, down to a minimum of 20%. That way the revolutions are a LOT lower than 100% 24/7.
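That linear curve with a 10-point head start is easy to sketch. A minimal example (the 20% floor and 100% ceiling are taken from the description above; the function itself is just an illustration, not any particular tool's API):

```python
def fan_percent(temp_c: float) -> int:
    """Fan duty for a given core temp: linear 1% per 1 degree with a
    10-point head start, clamped between a 20% floor and 100%."""
    return int(max(20, min(100, temp_c + 10)))

# 90C -> 100%, 80C -> 90%, 70C -> 80%, and the floor catches the rest.
for t in (70, 80, 90):
    print(t, fan_percent(t))
```

In Afterburner you'd draw the equivalent curve by hand in the fan settings; the point is just that the duty cycle tracks temperature instead of sitting at 100% constantly.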


----------



## Nickyvida

Quote:


> Originally Posted by *SystemTech*
> 
> I think they are referring to the fact that the fans can actually stop under light GPU loads, meaning you basically have a passive GPU when you're browsing the web, working in Office, etc., and only once the card hits certain temperature ranges do the fans actually start up.
> Therefore, because the fans are not working nearly as much, their lifespan increases greatly.
> 
> With regards to your 780: yes, at 100% for 2 years, it is impressive that it lasted so long. These are high-RPM fans and therefore have a shorter lifespan compared to, say, a 120mm standard case fan.
> I normally set the fans up manually on a linear line (for every 1ºC the card goes up, the fan goes up 1%, with a 10% head start): at 90ºC the fans are at 100%, at 80ºC they're at 90%, at 70ºC it's 80%, and so on, down to a minimum of 20%. That way the revolutions are a LOT lower than 100% 24/7.


Ah, I see, thanks for clarifying. I was thinking that having the fan speed set to 100% would greatly increase the lifespan of the card because of the reduced temperatures, hence what I did with the 780. How should I set up the card so I don't overheat it while gaming but also don't wear out the fans?


----------



## Dundundata

If your case has good airflow and your ambients aren't too high, I don't see a reason to run the fans at 100% all the time, even when gaming. My card hits around 70ºC max with fan speed less than 70%.


----------



## Jonindo21

Sorry that took so long. Here's the results from stock speeds and then overclocked:

STOCK: 1040MHz Core Clock; 1500MHz Memory Clock


OVERCLOCKED: 1120MHz Core Clock; 1740MHz Memory Clock


Also, for some reason, when I increase the voltage at my overclock there seems to be a performance drop?


----------



## flopper

Quote:


> Originally Posted by *Vexile*
> 
> I'm having trouble deciding which R9 390 to buy, since the prices in my country are messed up, we use different currency.
> I'm planning to OC the card as much as it can.
> 
> POWER COLOR AXR9 390 8GBD5-PPDHE - ~386$
> SAPPHIRE NITRO R9 390 8G GD5 L - ~400$
> MSI R9 390 GAMING 8G - ~424$
> 
> I was leaning towards the Nitro card, since it's probably with the best cooling, but i read it's not overclocking so well or atleast the MSI one overclocks much better with a bit worse temperatures.
> 
> Suggestions?


Hmm, a 390 is around an 1100-1160MHz OC on average.
In terms of fps the difference isn't what I would call significant, so basically I bought the cheapest 390 here, which for me was the Sapphire 390 Nitro.
I have an 1140MHz OC without issue and can do 1160MHz, but then it's getting strained/pushed.

Normally I google a little on the brand I'm buying, and if there don't seem to be any particular problems, it's a go.


----------



## DarknightOCR

My Nitro does 1220/1750 on the stock cooler, going up to +175mV / +100mV, 50% power limit.

Without touching the voltage it's around 1150/1620.


----------



## Vexile

Nitro it is then. Looks better anyway imo ; )


----------



## Piccolo55

Quote:


> Originally Posted by *Vexile*
> 
> Nitro it is then. Looks better anyway imo ; )


There's a new nitro model now too


----------



## diggiddi

Quote:


> Originally Posted by *Jonindo21*
> 
> Sorry that took so long. Here's the results from stock speeds and then overclocked:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> STOCK: 1040MHz Core Clock; 1500MHz Memory Clock
> 
> 
> OVERCLOCKED: 1120MHz Core Clock; 1740MHz Memory Clock
> 
> 
> 
> 
> Also, for some reason at my overclock when I increase the voltage, there seems to be a performance drop?


http://www.3dmark.com/3dm/9338488

At the same speed you have me beaten; my graphics score is 11982 (4.8GHz CPU), but my highest OC, 1230/1620 with the CPU at 4.6GHz, is in the 14K range.
I'm on Win7; what OS are you using, btw? It seems Grenada has improved on Hawaii a little bit.


----------



## PhillyB

Hi all, I am debating upgrading from my 280x to a 390x and am not really sure if I should or not at this point. I use eyefinity (5040x1050) for games and the 280x works most the time without a problem, but am running into difficulty with modern games wanting more than the 280x can give. For example, Fallout 4 averages 30fps to 45fps which I'm fine with. But, the drops to 20 in large urban areas are cutting into my limits of enjoyment.

With the next gen cards coming out next year, is it worth upgrading now and using the 390x for a couple years? If i buy now, I would use the card in my next build at the end of 2016.

edit: with the next gen cards I would probably need new monitors or active adaptors for the display port to dvi. with a 390x, i wouldn't.


----------



## diggiddi

Quote:


> Originally Posted by *DarknightOCR*
> 
> my nitro is 1220/1750 with stock cooler rising to + 175mv / + 100mv 50%
> 
> without touching the voltage is 1150/1620 + -


390 or 390x? those are some good clocks


----------



## diggiddi

Quote:


> Originally Posted by *PhillyB*
> 
> Hi all, I am debating upgrading from my 280x to a 390x and am not really sure if I should or not at this point. I use eyefinity (5040x1050) for games and the 280x works most the time without a problem, but am running into difficulty with modern games wanting more than the 280x can give. For example, Fallout 4 averages 30fps to 45fps which I'm fine with. But, the drops to 20 in large urban areas are cutting into my limits of enjoyment.
> 
> With the next gen cards coming out next year, is it worth upgrading now and using the 390x for a couple years? If i buy now, I would use the card in my next build at the end of 2016.
> 
> edit: with the next gen cards I would probably need new monitors or active adaptors for the display port to dvi. with a 390x, i wouldn't.


What is your budget? 'Cos next gen won't be cheap.


----------



## PhillyB

preferably around $400 for the card


----------



## diggiddi

Grab a 390x then


----------



## PhillyB

Quote:


> Originally Posted by *diggiddi*
> 
> Grab a 390x then


I was looking at the MSI 390x Gaming LE and an Alphacool block. Watercooling is a requirement for whatever I get.


----------



## kubiks

You should keep the ASUS card as well; I'm getting great results with the CrossFire setup.


----------



## mus1mus

A 290X may be a better buy, considering the blocks.

Or an XFX 390X which EK supports.


----------



## kubiks

Quote:


> Originally Posted by *PhillyB*
> 
> I was looking at the MSI 390x Gaming LE and an Alphacool block. Watercooling is a requirement for whatever I get.


I highly recommend the Alphacool blocks. Both are installed and performing flawlessly.


----------



## PhillyB

Quote:


> Originally Posted by *mus1mus*
> 
> A 290X maybe a better buy. Considering the blocks.
> 
> Or an XFX 390X which EK supports.


EK supports the early model, but I'm not sure which one you'd actually receive, because they kept the same model number between the different inductor revisions.

I was tempted by the xfx 290x 8gb, but its the same price as the MSI 390x.
Quote:


> Originally Posted by *kubiks*
> 
> I highly recommend the alphacool blocks. Both installed and perform flawlessly


I am using an Alphacool block on my 280x and am fairly happy with it. Very restrictive because it uses pins, but it cools pretty effectively.


----------



## Gdourado

Hello,
How are you doing?
I have a question.
I have a Sapphire 390X Tri-X.
It requires two 8-pin connectors for power.
The PSU is an XFX 850W PRO Black Edition.
It is fully modular.
The PCIe cables run from the PSU to two 6+2-pin connectors.
Each cable has two PCIe connectors, like those SATA cables where a single cable can carry two, three or four SATA connections.
My question is: is a single PSU cable feeding both 6+2-pin connectors enough?
Or should I use two independent cables, each running from the PSU to one 8-pin GPU connection?

Thanks.
Cheers


----------



## Gdourado

The cables are like this


----------



## kizwan

Quote:


> Originally Posted by *Gdourado*
> 
> Hello,
> How are you doing?
> I have a question.
> I have a Sapphire 390X Tri-X.
> It requires two 8-pin connectors for power.
> The PSU is an XFX 850W PRO Black Edition.
> It is fully modular.
> The PCIe cables run from the PSU to two 6+2-pin connectors.
> Each cable has two PCIe connectors, like those SATA cables where a single cable can carry two, three or four SATA connections.
> My question is: is a single PSU cable feeding both 6+2-pin connectors enough?
> Or should I use two independent cables, each running from the PSU to one 8-pin GPU connection?
> 
> Thanks.
> Cheers


You should be able to use one cable for both 8-pin connectors. I personally recommend using a separate cable (coming from the PSU) for each 6- or 8-pin connector.


----------



## Charcharo

Guys, do you think an i5 4460 is bottlenecking the PowerColor R9 390?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Charcharo*
> 
> Guys, do you think an i5 4460 is bottle-necking the Power Color R9 390 GPU?


No way....


----------



## Charcharo

I'm asking just because... well, at times I'm somewhat disappointed at how it seems to be doing.

In games like Metro it seems to be doing perfectly (the GPU). I generally have no complaints with how it does in Witcher 3.

But games like Armored Warfare, World of Tanks, FC4 and Bugout 4 (though I suspect I need to wait for the new drivers)... they don't run badly, but... not well enough, I think.

Also, I'm currently at only 1440x900, and without an SSD, though an SSD and a new monitor are my next purchases (in that order). I'm also running Windows 7.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Charcharo*
> 
> Am asking just because... well at times I am somewhat dissapointed at how it seems to be doing.
> 
> In games like Metro it seems to be doing perfectly (the GPU). I generally have no complaints with how it does in Witcher 3.
> 
> But games like Armored Warfare, World of Tanks, FC4 and Bugout 4 (though I suspect I need to wait for the new drivers)... they don't run badly, but... not good enough, I think.
> 
> Also am currently at only 1440x900. And without an SSD. Though a SSD and a new monitor are my next purchases (in that order). Am also running Windows 7.


Okay, at 900p you technically DO have a CPU bottleneck, but it's not a performance-limiting bottleneck.

These cards really shine at resolutions of 1080p and above, which puts the workload back on the GPU instead of the CPU.

I bet your GPU usage in SOME titles is less than 100% due to the CPU capping out cores, especially in FC4, which doesn't always thread so well without a high resolution.

Also, definitely get yourself an SSD, not so much from an in-game performance standpoint, but man... boot and load times are just much better. It's everything I can do to tolerate using a traditional hard drive on my workstation.


----------



## Charcharo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, at 900p you technically DO have a CPU bottleneck, but it's not a performance limiting bottleneck.
> 
> These cards will really shine at resolutions of 1080 and above, and will get the work load back on the GPU, instead of the CPU.
> 
> I bet your GPU usage in SOME titles is less than 100% due to CPU capping out cores.... Especially in FC4 that doesn't always thread so well without high resolution.
> 
> Also, definitely get yourself an SSD, not so much from an in game performance standpoint, but man.... boot and load times are just much better. It's everything I can do to tolerate using a traditional hard drive on my work station....


Hmm, so you think that at 1440P I will start really appreciating this beast of a GPU?

I hope so. As for the SSD, I looked at some benchmarks that stated that a few games like for example Bugout 4 and World of Tanks DO benefit from an SSD for some odd reason.

Also heard Windows 10 would improve performance even if just a bit.


----------



## Kenpachi7144

Hello there guys... Here goes a question that has probably been asked before, but will an overclocked 390 match/beat a 390X in performance? I'm asking because I'm in the market for a new single card to replace my SLI 760s, and the 390 is the best bang for my buck right now. I want to retire my 760s because they only have 2GB of VRAM. I can still play most (all) games on ultra at 60+ FPS at 1080p, but now I want a 390, and after long hours of research I can't find anything to justify the extra $100 the 390X costs, so here I am. Any reply is greatly appreciated. Thank you.


----------



## kizwan

Quote:


> Originally Posted by *Kenpachi7144*
> 
> Hello there guys. . . Here goes my question that has probably been asked before but will an overclocked 390 match/beat a 390x in performance? Reason for asking this is because I'm in the market for a new single card to replace my SLI 760s, and the 390 is the best bang for my buck right now. Reason I want to retire my 760s is because of they only have 2gb of vram. I can still play most(all) games on ultra 60+fps at 1080 but now I want a 390 and after long hrs of research I can't find anything to justify the extra $100 the 390x costs, so here I am. Any reply is greatly appreciated. Thank you.


Yes, an overclocked 390 will beat a 390X at stock. If both are overclocked to the same clocks, let's say 1100/1600, then the 390 will not be able to beat the 390X.


----------



## Jonindo21

Quote:


> Originally Posted by *diggiddi*
> 
> http://www.3dmark.com/3dm/9338488
> 
> At same speed you have me beaten, my Graphics score is 11982 (4.8ghz cpu) but my highest OC 1230/1620 at 4.6ghz cpu is in the 14K range
> Win7 what OS are you using btw. It seems the Grenada has improved on the Hawaii a lil bit


I am using Windows 10


----------



## Jonindo21

Quote:


> Originally Posted by *diggiddi*
> 
> http://www.3dmark.com/3dm/9338488
> 
> At same speed you have me beaten, my Graphics score is 11982 (4.8ghz cpu) but my highest OC 1230/1620 at 4.6ghz cpu is in the 14K range
> Win7 what OS are you using btw. It seems the Grenada has improved on the Hawaii a lil bit


Pushed it a bit more:

1150/1740
http://www.3dmark.com/3dm/9344400?

Although there is zero artifacting during benchmarks, in Kombustor (in Afterburner), when I run a quick test the FPS drops the higher I push the core/memory clocks past 1150/1740. Have I hit the wall? I have core voltage at +100mV and aux at +50mV, and temps are fine. I haven't run FireStrike at any clocks past 1150/1740, as I'm worried I'll do some damage; I'm especially hesitant as I only bought it last week. I've heard about a 'golden ratio' for the 200 series, where memory clock = 1.25-1.4x the core clock. Is there such a ratio for the 300 series? Have I hit the wall, or am I just doing something wrong?
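For what it's worth, the rumored ratio is easy to sanity-check. A minimal sketch, assuming the 200-series rule of thumb (memory clock roughly 1.25-1.4x core) even applies to Grenada; the bounds are forum lore, not an AMD spec:

```python
# Sketch of the rumored "golden ratio" rule of thumb from the 200 series:
# memory clock roughly 1.25-1.4x the core clock. The bounds are
# assumptions from forum lore, not a documented AMD figure.

def memory_clock_window(core_mhz, low=1.25, high=1.40):
    """Return the (min, max) memory clock suggested by the ratio."""
    return (core_mhz * low, core_mhz * high)

lo, hi = memory_clock_window(1150)
print(f"Suggested memory range for a 1150 MHz core: {lo:.0f}-{hi:.0f} MHz")
```

By that rule, 1740 MHz memory on a 1150 MHz core is a ratio of about 1.51, already above the window, which would line up with the FPS falloff past that point, if the rule carries over at all.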


----------



## DarknightOCR

Quote:


> Originally Posted by *diggiddi*
> 
> 390 or 390x? those are some good clocks


390 Nitro


----------



## LuisFilipepio

Hi guys,

I could really use your help, since I'm a total noob at this and am overwhelmed by the amount of information I have to process.

Recently I got a monitor as a gift, a BenQ XL2730Z, so after a little research I thought about upgrading my GPU to be able to use FreeSync (I currently have an HD 5750).

After a lot of reading I settled on the MSI R9 390/390X (in Portugal we have almost no Sapphire cards in local stores), and I'm going to decide this week which one to get.

My PC is not state of the art:

8GB DDR3 RAM

AMD Phenom II X6 1050T CPU

RECOM PE-550 Plus PSU

Asustek M4A87TD/USB3 motherboard

So my two main questions are:

Is there really enough difference between the two cards to justify the €100?

Is my PSU enough to handle either of them, and if not, which one do you recommend?

Thanks in advance,


----------



## DarknightOCR

Take a look at PCComponentes in Spain; you can get good prices there, and it arrives here the next day.

If you need anything, PM me.

Sorry for writing in Portuguese.


----------



## Agent Smith1984

Quote:


> Originally Posted by *LuisFilipepio*
> 
> Hi guys,
> 
> I could really use your help, since i'm a total noob at this, and am overwhelmed by the ammount of information i have to process.
> 
> Recently i got a monitor as a gift, a Benq XL2730Z, so after a little research i though about upgrading my GPU to be able to use the Freesync tecnology.(actually i have a HD 5750)
> 
> After a lot of reading i settle my self between the MSI R9 390/390X (in portugal we dont have almost any Sapphire in local stores), and i'm going to decide this week which one to get
> 
> My PC is not state of the art.
> 
> 8 GB Ram DDR3
> 
> AMd Phenom II X6 1050 t CPU
> 
> RECOM PE-550 Plus PSU
> 
> Motherboard Asustek M4A87TD / USB3
> 
> So my 2 main questions are:
> 
> Is there really much difference between the 2 cards that justify the 100€?
> 
> Is my PSU enough to handle any of them, and if not, which one do you recomend?
> 
> Thanks in advance,


The 1050T is still a tried and true CPU when overclocked! 3.8GHz+ is ideal...

In your situation, I advise saving money and getting the 390. It'll run on that PSU, but don't expect much overclocking. In my opinion, a 390 with a mild overclock pairs up very nicely with an X6 Phenom II!


----------



## Jonindo21

Out of curiosity, who actually prefers playing at 4k with medium/high settings compared to 1080p ultra settings? With my r9 390 is it worth going 4k if later on down the road I buy another one and crossfire?

Cheers


----------



## battleaxe

Quote:


> Originally Posted by *Jonindo21*
> 
> Out of curiosity, who actually prefers playing at 4k with medium/high settings compared to 1080p ultra settings? With my r9 390 is it worth going 4k if later on down the road I buy another one and crossfire?
> 
> Cheers


Me. All day, every day, and I cannot go back. I have a 1080p display on the other side of the desk and it just looks grainy, fuzzy, and pixelated, by comparison. There is no comparison actually. 1080 looks like absolute crap next to 4k. Crap.

Just my humble (yet correct) opinion here of course.


----------



## Jonindo21

Quote:


> Originally Posted by *battleaxe*
> 
> Me. All day, every day, and I cannot go back. I have a 1080p display on the other side of the desk and it just looks grainy, fuzzy, and pixelated, by comparison. There is no comparison actually. 1080 looks like absolute crap next to 4k. Crap.
> 
> Just my humble (yet correct) opinion here of course.


Damn making me jealous D: Would an r9 390 do okay at 4k you think?


----------



## mus1mus

Quote:


> Originally Posted by *Jonindo21*
> 
> Damn making me jealous D: Would an r9 390 do okay at 4k you think?


No.

You need at least 2 of them to get decent framerates.


----------






## battleaxe

Quote:


> Originally Posted by *Jonindo21*
> 
> Damn making me jealous D: Would an r9 390 do okay at 4k you think?


Quote:


> Originally Posted by *mus1mus*
> 
> No.
> 
> You need at least 2 of them to get decent framerates.


Ehhh... I got by with just one for a while. You just have to disable all the AA settings in most games. It still looks way better than 1080p even then; there's a pretty big performance penalty for AA in most games. Even with one card, I'd still rather game at 4K. That's an easy decision if you ask me. Just my opinion.


----------



## Agent Smith1984

I'm all about the 4K too, guys!

My overclocked 390 would handle 4K at high to very high settings with no AA, at 35-65 FPS, and it was great...


----------






## BurgerRipper

Quote:


> Originally Posted by *Jonindo21*
> 
> Out of curiosity, who actually prefers playing at 4k with medium/high settings compared to 1080p ultra settings? With my r9 390 is it worth going 4k if later on down the road I buy another one and crossfire?
> 
> Cheers


I will. The same thing happened to me when I jumped from 720p to 1080p: too much difference. You get more of everything even with worse settings; resolution just makes the game experience better.


----------



## Jonindo21

Hmmm, 4K seems so tempting now. I mean it did before, but I had the impression that my 390 would be crap at 4K.

I suppose DX12 will most likely give anyone with a DX12-compatible GPU a decent performance boost in future DX12 games, so a 390 may in fact be more viable at 4K than I first thought...


----------



## desetnik

I've been comparing my FPS results in co-op Dying Light with a friend.
In the same areas where I get 51 FPS, he gets 10 FPS more.
The even bigger problem is that he's playing at 2560x1080 while my res is 1920x1080, with otherwise the same settings and the same Windows.
His configuration is a PCS+ R9 390 at stock, a 4790K at 4.1GHz and 16GB RAM.
Also, his driver is 15.7 while I'm using 15.11.1.

I've been using this card for a while; this is the first time I've compared results, and they look bad.


----------



## jazz995756

Quick question: I have the XFX 390 and want to watercool it. Unfortunately I'm low on cash, but I do have some spare closed-loop water coolers lying around. Would the NZXT G10 (or Corsair's equivalent) work?


----------



## seanpatrick

Hey guys -

So I've been reading through some threads, and people have suggested you really need about 28 amps on the 12V rail for a 290/390. I've got a 1000W PSU (Rosewill RBR 1000-M http://www.realhardtechx.com/index_archivos/Page2917.htm) and have noticed that my 12V capacity is broken up into 4 rails, at 20, 20, 30, and 30 amps. I'm overclocking my 390 a bit and am wondering if my PSU is less than ideal. I just assumed that the total wattage was MORE than sufficient, but had never really thought about the power distribution. Am I pushing it with this, or will I be OK?


----------



## diggiddi

Quote:


> Originally Posted by *jazz995756*
> 
> Quick question I have the XFX 390 and want to water cool it unfortunately I'm low on cash but I do have extra closed loop water coolers laying around I was wanting to know if the nzxt g10 would work or corsairs equivalent to that?


That cooler should work fine

Quote:


> Originally Posted by *seanpatrick*
> 
> Hey guys -
> 
> So I've been reading through some threads and people have suggested you really need about 28 amps on the 12v rail for their 290 / 390. I've got a 1000 watt psu ( Rosewill RBR 1000-M http://www.realhardtechx.com/index_archivos/Page2917.htm) and have noticed that my 12v rail is broken up into 4, at 20, 20, 30, 30 .... I'm overclocking my 390 a bit - and am wondering if my PSU is less than ideal? I just assumed that the total wattage was MORE than sufficient, but had never really thought about the power distribution. Am I pushing it with this or will I be ok?


I'm running crossfire 290X Lightnings on an Antec HCG 750W; you should be fine.


----------



## kizwan

Quote:


> Originally Posted by *LuisFilipepio*
> 
> Hi guys,
> 
> I could really use your help, since i'm a total noob at this, and am overwhelmed by the ammount of information i have to process.
> 
> Recently i got a monitor as a gift, a Benq XL2730Z, so after a little research i though about upgrading my GPU to be able to use the Freesync tecnology.(actually i have a HD 5750)
> 
> After a lot of reading i settle my self between the MSI R9 390/390X (in portugal we dont have almost any Sapphire in local stores), and i'm going to decide this week which one to get
> 
> My PC is not state of the art.
> 
> 8 GB Ram DDR3
> 
> AMd Phenom II X6 1050 t CPU
> 
> RECOM PE-550 Plus PSU
> 
> Motherboard Asustek M4A87TD / USB3
> 
> So my 2 main questions are:
> 
> Is there really much difference between the 2 cards that justify the 100€?
> 
> Is my PSU enough to handle any of them, and if not, which one do you recomend?
> 
> Thanks in advance,


With that PSU you have 264W for the GPU. I'm not sure whether that's enough, though.
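To put that in perspective, here is a minimal headroom check. Both inputs are assumptions: the 264 W budget is my figure above, and ~275 W is the typical board power commonly cited for the R9 390/390X, not a measurement of this card:

```python
# Quick headroom check: claimed usable GPU power budget on this PSU
# versus the typical board power commonly cited for the R9 390/390X.
# Both numbers are assumptions from the thread, not measurements.

GPU_BUDGET_W = 264          # claimed usable GPU budget on the RECOM PE-550
R9_390_BOARD_POWER_W = 275  # commonly cited typical board power

headroom_w = GPU_BUDGET_W - R9_390_BOARD_POWER_W
print(f"Headroom at stock: {headroom_w} W")
```

A negative result means the card can out-draw the budget under full load, which is why a mild overclock at most is advisable on that PSU.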
Quote:


> Originally Posted by *desetnik*
> 
> I've been comparing my fps results in coop Dying light with friend.
> On same areas where I get 51fps he gets 10 fps more.
> Now the even bigger problem is. He is playing on 2560x1080 while my res is 1920x1080. The only difference in settings same windows.
> His configuration is pcs+ r9 390 stock, 4790k 4.1ghz and 16gb ram.
> Also his driver is 15.7 while I'm using the 15.11.1
> 
> I've been using this card for a while, this is first time I compared the results and they are bad.


Your friend has a CPU with better and faster single-thread performance. He is still at 1080p anyway. If he were at 1440p, I would be surprised if he had higher FPS than you at 1080p.
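A quick pixel count shows why the comparison is lopsided in the first place; this is just arithmetic on the two resolutions from the posts above:

```python
# Pixel-count comparison behind the 2560x1080 vs 1920x1080 question:
# the ultrawide pushes a third more pixels per frame, so equal or
# higher FPS there is already the heavier GPU workload.

def pixels(width, height):
    return width * height

ratio = pixels(2560, 1080) / pixels(1920, 1080)
print(f"2560x1080 renders {ratio:.2f}x the pixels of 1920x1080")
```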
Quote:


> Originally Posted by *seanpatrick*
> 
> Hey guys -
> 
> So I've been reading through some threads and people have suggested you really need about 28 amps on the 12v rail for their 290 / 390. I've got a 1000 watt psu ( Rosewill RBR 1000-M http://www.realhardtechx.com/index_archivos/Page2917.htm) and have noticed that my 12v rail is broken up into 4, at 20, 20, 30, 30 .... I'm overclocking my 390 a bit - and am wondering if my PSU is less than ideal? I just assumed that the total wattage was MORE than sufficient, but had never really thought about the power distribution. Am I pushing it with this or will I be ok?


There are 2 x 8-pin PCIe and 2 x 6-pin PCIe connectors on the PSU. With multiple +12V rails, I recommend connecting each 6- or 8-pin PCIe connector on the GPU to a separate PCIe connector on the PSU; basically, each PCIe connector on the GPU is individually connected directly to the PSU. I can't find a clearer picture, but I think the DC connectors on the PSU are connected to different +12V rails.
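To put rough numbers on why separate rails are the safer wiring, here is a back-of-the-envelope sketch. The ~275 W board power and the even split between the two plugs are assumptions for illustration; real boards don't balance the plugs exactly:

```python
# Back-of-the-envelope rail math for a multi-rail PSU, assuming a ~275 W
# card split as: up to 75 W from the PCIe slot, with the remainder shared
# evenly by the two 8-pin plugs. The even split is an idealization.

CARD_W = 275     # commonly cited R9 390/390X typical board power
SLOT_W = 75      # PCIe slot power limit
RAIL_V = 12.0

plug_w = (CARD_W - SLOT_W) / 2       # watts per 8-pin plug
plug_a = plug_w / RAIL_V             # amps per plug
shared_rail_a = 2 * plug_a           # both plugs on one rail

print(f"Per 8-pin plug: {plug_w:.0f} W ({plug_a:.1f} A)")
print(f"Both plugs on a single rail: {shared_rail_a:.1f} A")
```

Roughly 17 A still fits under a 20 A rail at stock, but an overclocked card plus transient spikes eats into that margin quickly, which is the practical argument for giving each plug its own rail.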


----------



## rdr09

Quote:


> Originally Posted by *Jonindo21*
> 
> Hmmm 4k seems so tempting now. I mean it did before, but I had the impression that my 390 would be crap at 4k.
> 
> I suppose dx12 will most likely give anyone with a dx12 compatible gpu a decent performance boost in future dx12 games. So a 390 may in fact be more viable at 4k than I first thought.....


Like mus1mus said, you're gonna need two for 4K. In some games the maps will be hard to see... and in some games, like FPS ones, you'd feel like a cheater.


----------



## Agent Smith1984

Here are my results on BF4 and Crysis 3 with an overclock:

http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/1430#post_24303616


----------



## lightsout

Searching this thread, I found one person with an XFX DD 390 getting 90C on the VRM temps.

Anyone else with this card have different (or similar) results? It's weird that XFX is advertising some awesome new design for VRM cooling and it's still that high.


----------



## battleaxe

Quote:


> Originally Posted by *lightsout*
> 
> Searching this thread I found one person with an XFX DD 390 getting 90c on the vrm temps.
> 
> Anyone else with this card have different (or similar) results? Weird that XFX is advertising some awesome new design for vrm cooling and it is still that high.


How hot is the core when getting those temps? The hot air from the core cooler would blow onto the VRM cooler and heat it up as well. If not, then sounds like the VRM thermal pads are not doing their job very well. Those VRM's should be a lot cooler than that.


----------



## lightsout

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Searching this thread I found one person with an XFX DD 390 getting 90c on the vrm temps.
> 
> Anyone else with this card have different (or similar) results? Weird that XFX is advertising some awesome new design for vrm cooling and it is still that high.
> 
> 
> 
> How hot is the core when getting those temps. The hot air from the core cooler would blow onto the VRM cooler and heat it up as well. If not, then sounds like the VRM thermal pads are not doing their job very well. Those VRM's should be a lot cooler than that.

Yeah, idk, it wasn't my card. Just wanted to hear from other owners of the card. Do you have experience with the cooler?


----------



## battleaxe

Quote:


> Originally Posted by *lightsout*
> 
> Yeah idk it wasn't myy card. Just wanted to hear from other owners of the card, do you have experience with the cooler?


Not personally, but I have done tons of experimenting with separate air coolers on the VRMs of the 290/290X cards, and I know what heats them up. So the new 390/390X from XFX should run nice and cool on the VRMs; it has the best VRM coolers on the market for air-cooled cards. If it's not working right, then something wasn't assembled correctly, or the core is heating it up too much (needing more airflow could be another issue).


----------



## lightsout

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Yeah idk it wasn't myy card. Just wanted to hear from other owners of the card, do you have experience with the cooler?
> 
> 
> 
> Not personally, but I have done tons of experimenting with separate air coolers on the VRM's on the 290/290x card, and know what heats them up. So the new 390/390x from XFX should run nice and cool on the VRM's. It has the best VRM coolers on the market for air cooled cards. If its not working right, then something wasn't assembled correctly, or the core is heating it up too much. (needs more airflow could be another issue)

Thanks, I just asked because this card is in the Newegg BF sale, although the price isn't the best: $289 ($259 after a $30 MIR). That's good, but I hate MIRs. XFX cards don't seem to have great resale value, though.


----------



## battleaxe

Quote:


> Originally Posted by *lightsout*
> 
> Thanks, I just asked because this card is on the newegg BF sale. Although the price isn't the best $289 ($259 after $30 MIR) Which is good but I hate MIR's. XFX don't seem to have great resale value though.


The new XFX 390/390X cards are some of the best clockers around from what I have seen on here. Great binning. They took what they did wrong on the 290/290X and improved it big time; now they are one of the best. If I were buying a 390/390X, it would be an XFX, easily. Very worth that asking price too. Where are you getting this deal, if I may ask?

Also, the XFX is just begging to have an AIO strapped to it, BTW. The VRMs are already taken care of with their own separate coolers.

Edit: NVM, I found the sale. Not bad.


----------



## lightsout

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Thanks, I just asked because this card is on the newegg BF sale. Although the price isn't the best $289 ($259 after $30 MIR) Which is good but I hate MIR's. XFX don't seem to have great resale value though.
> 
> 
> 
> The new XFX 390/390x are some of the best clockers around from what I have seen on here. Great binning. They took what they did wrong on the 290/290x and improved it big time. Now they are one of the best. If I was buying a 390/390x it would be an XFX easily. Very worth that asking price too. Where are you getting this deal if I may ask?
> 
> Also, the XFX is just begging to have an AIO strapped to it too BTW. The VRM's are already taken care of too with their own separate coolers.
> 
> Edit: NVM, I found the sale. Not bad.

I see that the VRMs have a heatsink on them. Hopefully it would not get in the way if someone did strap an AIO on.

I found a review on HardOCP. Their card was a poor overclocker, not even hitting 1100. That seems pretty low; maybe they got a poor card, or were doing it wrong, idk?
http://www.hardocp.com/article/2015/09/21/xfx_r9_390_double_dissipation_8gb_video_card_review/1#.Vk-VOXarQuU

Here's where I got the info for the BF sale.
http://www.bfads.net/Black-Friday/NewEgg/Ad?page=2#viewer


----------



## battleaxe

Quote:


> Originally Posted by *lightsout*
> 
> I see that the vrms have a heat sink on them. Hopefuly it would not get in the way if someone did strap an AIO on them.
> 
> I found a review on hardocp. Their card was a poor overclocker not even hitting 1100. That seems pretty low maybe they got a poor card or were doing it wrong idk?
> http://www.hardocp.com/article/2015/09/21/xfx_r9_390_double_dissipation_8gb_video_card_review/1#.Vk-VOXarQuU
> 
> Here's where I got the info for the BF sale.
> http://www.bfads.net/Black-Friday/NewEgg/Ad?page=2#viewer


From what I've seen in this thread, most are going 1150+, so yeah, I would say they got a turd.

@agent Smith1984 should be able to chime in more; he's seen more here than I have. Maybe he has something to add.


----------



## lightsout

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> I see that the vrms have a heat sink on them. Hopefuly it would not get in the way if someone did strap an AIO on them.
> 
> I found a review on hardocp. Their card was a poor overclocker not even hitting 1100. That seems pretty low maybe they got a poor card or were doing it wrong idk?
> http://www.hardocp.com/article/2015/09/21/xfx_r9_390_double_dissipation_8gb_video_card_review/1#.Vk-VOXarQuU
> 
> Here's where I got the info for the BF sale.
> http://www.bfads.net/Black-Friday/NewEgg/Ad?page=2#viewer
> 
> 
> 
> From what I've seen on this thread, most are going 1150+ so yeah I would say they got a turd.
> 
> @agent Smith1984 should be able to chime in more, he's seen more here than I have. Maybe he has something to add.

I was looking at the results in the OP of this thread, and it does appear MSI is king, with possibly all of the 1200 cards besides maybe one or two. MSI and Sapphire seem to be consistently good clockers; XFX is average in those results. Just something I noticed.

I do like that the card is a prime candidate for an AIO, which is something I have done in the past, but for simplicity's sake I'll probably avoid it.


----------



## Dundundata

When I had my XFX it was an OK OCer, better on memory than core. I was able to hit 1140 core but could never get 1150; the memory hit 1700 with ease, though. Kind of the opposite of the MSI I have now, where the core clocks more easily than the memory. I must say, though, I think a lot comes down to luck of the draw, even in cooling.

For instance, my XFX ran hot until I re-applied paste, then it was good, although I forget what the VRM temps were. My MSI runs really cool though, usually sub-70C, and the VRMs... they are even lower! Although I've read about people having issues with the cooling and fan noise on that card, so again, luck of the draw.

I think what we really need is better quality control/testing.

Oh, and my XFX sold within a day, and for good money too, so no worries there.


----------



## lightsout

Quote:


> Originally Posted by *Dundundata*
> 
> When I had my xfx it was an ok OCer, better on memory than core. I was able to hit 1140core but could never get 1150, the memory hit 1700 with ease though. Kind of the opposite of the MSI I have now, where core clocks easier than memory. I must say though I think alot comes down to luck of the draw, even in cooling.
> 
> For instance my xfx ran hot until I re-applied paste, then it was good, although I forget what the vrm temps were. My MSI runs real cool though, usually sub 70C, and the VRM's....they are even lower! Although I've read people having issues with the cooling and fan noise on that card, so again luck of the draw.
> 
> I think what we really need is better quality control/testing.
> 
> Oh and my xfx sold within a day and for good money too, so no worries there.


Thanks for the info +rep. I would be happy with either of those cards I think. Probably the MSI more so but it will come down to price.


----------



## CrazyElf

Out of curiosity, has anyone found a use for the extra 4GB of VRAM? What about in Crossfire?
Quote:


> Originally Posted by *lightsout*
> 
> I see that the vrms have a heat sink on them. Hopefuly it would not get in the way if someone did strap an AIO on them.
> 
> I found a review on hardocp. Their card was a poor overclocker not even hitting 1100. That seems pretty low maybe they got a poor card or were doing it wrong idk?
> http://www.hardocp.com/article/2015/09/21/xfx_r9_390_double_dissipation_8gb_video_card_review/1#.Vk-VOXarQuU
> 
> Here's where I got the info for the BF sale.
> http://www.bfads.net/Black-Friday/NewEgg/Ad?page=2#viewer


I would highly recommend that you avoid the XFX cards. They had a history of terrible overheating problems with their VRMs.
http://www.legitreviews.com/xfx-radeon-r9-290-double-dissipation-video-card-review_138612/12

Note that the review was merely gaming. Anything compute-intensive would mean an even hotter GPU, by perhaps as much as 20-30C!

They've also been known to replace their GPUs with crappier PCBs.
http://hardforum.com/showthread.php?t=1601246

I would agree that MSI is a good choice. Although the Twin Frozr is not the end-all be-all, it is amongst the best two-slot coolers, and on average their cards seem to do very well. It's a pity that there was no 390X Lightning, as the Lightning series has since been discontinued. I wish they would keep them in production for longer.

Also, MSI seems to have a pretty good RMA process, which is always nice. They can be slow to process, but they come through in the end and are pretty generous in that regard.

I've had mixed experiences with Sapphire RMA. I used to have terrible experiences, but lately I've been hearing that they have gotten better. The Tri-X is a solid cooler, though (note that I emphasized cooling potential a lot, as these Hawaii cards run quite hot).

If the price is a problem, see if you can find some decent used 290Xs (or 290s) on Kijiji.


----------



## lightsout

Quote:


> Originally Posted by *CrazyElf*
> 
> Out of curiosity, has anyone found a use for the extra 4GB of VRAM? What about in Crossfire?
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> I see that the vrms have a heat sink on them. Hopefuly it would not get in the way if someone did strap an AIO on them.
> 
> I found a review on hardocp. Their card was a poor overclocker not even hitting 1100. That seems pretty low maybe they got a poor card or were doing it wrong idk?
> http://www.hardocp.com/article/2015/09/21/xfx_r9_390_double_dissipation_8gb_video_card_review/1#.Vk-VOXarQuU
> 
> Here's where I got the info for the BF sale.
> http://www.bfads.net/Black-Friday/NewEgg/Ad?page=2#viewer
> 
> 
> 
> I would highly recommend that you avoid the XFX cards. They had a history of terrible overheating problems with their VRMs.
> http://www.legitreviews.com/xfx-radeon-r9-290-double-dissipation-video-card-review_138612/12
> 
> Note that the review was merely gaming. Anything compute intensive would mean an even hotter GPU, by perhaps as much as 20-30C!
> 
> They've also been known to replace their GPUs with crappier PCBs.
> http://hardforum.com/showthread.php?t=1601246
> 
> ...
Click to expand...

XFX's marketing for these 390's says they have totally redesigned the VRM cooling; there's now a plate over the RAM and heatsinks on the VRMs. Not sure how much of that is fluff, but there was a guy in this thread saying his hit 90C and he had to return it. Waiting for an actual owner to say yea or nay on that, or at least share their experience.

MSI's really do seem to be good clockers. And yeah, I would prefer MSI's RMA; every brand has horror stories, but like you said, MSI seems to be pretty consumer-friendly.

I would also like to know about your RAM question. Is anyone in here using more than 4GB on a single-card config?


----------



## battleaxe

Quote:


> Originally Posted by *CrazyElf*
> 
> Out of curiosity, has anyone found a use for the extra 4GB of VRAM? What about in Crossfire?
> I would highly recommend that you avoid the XFX cards. They had a history of terrible overheating problems with their VRMs.
> http://www.legitreviews.com/xfx-radeon-r9-290-double-dissipation-video-card-review_138612/12
> 
> Note that the review was merely gaming. Anything compute intensive would mean an even hotter GPU, by perhaps as much as 20-30C!
> 
> They've also been known to replace their GPUs with crappier PCBs.
> http://hardforum.com/showthread.php?t=1601246
> 
> ...


You're comparing the 290 series with the 390 series. Irrelevant. The 390 is completely different. Completely.

That being said, my MSI RMA experience was rather stellar. I've had to RMA with them myself, and was quite happy.


----------



## Sgt Bilko

Quote:


> Originally Posted by *lightsout*
> 
> I was looking at the results in the OP of this thread and it does appear MSI is king, with possibly all the 1200 cards. Besides maybe one or two. MSI and sapphire seem to be consistently good clockers, XFX is average in those results. Just something I noticed.
> 
> I do like that the card is a prime candidate for an AIO, which is something I have done in the past. But for simplicity's sake will probably avoid.

If you want the best-clocking cards, then get a 290x, seeing as they hit higher core clocks more easily than the 300 series can.

Quote:


> Originally Posted by *lightsout*
> 
> XFX's marketing stuff for these 390's says they have totally redesigned their vrm cooling, they now have a plate over the ram and heatsinks on the vrms. Not sure how much is fluff but there was a guy in this thread saying his hit 90c and he had to return. Waiting for an actual owner to say yay or nay on that. Or at least share their experience.
> 
> MSI's really do seem to be good clockers. And yeah I would prefer MSI's RMA, they all have horror stories but like you said MSI seems to be pretty consumer friendly.
> 
> I would also like to know about your ram question. Is anyone in here using more than 4gb's on a single card config?

The VRM cooling on the DDs is really, really good, among the best in the 300 series. And yes, they still have the potential to hit 90C if your core temps and case airflow are horrible.
Quote:


> Originally Posted by *battleaxe*
> 
> You're comparing the 290 series with 390 series. Irrelevant. The 390 is completely different. Completely.
> 
> That being said. MSI RMA experience is rather stellar. I've had to do so myself, and was quite happy.

It seems pretty obvious by now that you just don't like XFX, and as battleaxe said, they took everything bad about the 200 series and improved it (I own both an XFX DD 290X and a 390X, btw).

If you want the absolute best cooling, get a Tri-X or Devil; if you want the best all-round card, MSI or XFX.


----------



## lightsout

It is true I don't love XFX. But I haven't read an MSI owner say their VRMs were anywhere near 90C, though I haven't done a ton of research.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> The vrm cooling on the DD's is really really good, among the best in the 300 series and yes they still have the potential to hit 90c if your core temps and case airflow are horrible.
> 
> It seem pretty obvious by now that you just don't like XFX and as Battleaxe said they took everything bad about the 200 series and improved it (I own both an XFX DD 290x and 390x btw)
> 
> If you want the absolute best cooling then Tri-X or Devil, if you want the best all round card then MSI or XFX


----------



## Charcharo

I seem to be getting... somewhat disappointing framerates in Far Cry 4.


----------



## SystemTech

So my MSI R9 390X Gaming maxes out around 1175/1600. I can run 3DMark 11, but some glitching occurs; the run still completes fine, so I'd say 1175 for benching purposes and probably around 1170 for normal usage, which is not too shabby.








Scores are P14081 and X5274, running only the benchmark tests at default settings.



Then the ASUS R9 390 DCUII manages an easy 1100/1600, and I think it can be pushed a bit more, as I still have some voltage headroom. Will post results when I get to its max.
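For anyone curious how these numbers compare to stock, here's a quick sketch of the headroom over AMD's reference clocks. The reference figures below (1050 MHz core for the 390X, 1000 MHz for the 390, 1500 MHz memory for both) are the launch specs as I understand them; double-check against your own card's BIOS.

```python
# Overclock headroom over assumed AMD reference clocks.
def oc_headroom(oc_mhz: float, stock_mhz: float) -> float:
    """Return the overclock as a percentage gain over stock."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

print(f"390X core 1175 vs 1050: +{oc_headroom(1175, 1050):.1f}%")  # +11.9%
print(f"390X mem  1600 vs 1500: +{oc_headroom(1600, 1500):.1f}%")  # +6.7%
print(f"390  core 1100 vs 1000: +{oc_headroom(1100, 1000):.1f}%")  # +10.0%
```

So both cards above are sitting roughly 10% over reference on the core, which lines up with the "not too shabby" verdict.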


----------



## Sgt Bilko

Quote:


> Originally Posted by *lightsout*
> 
> It is true I don't love XFX. But I haven't read an msi owner say their vrm was anywhere near 90c. But I haven't done a ton of research.


Just going to leave this here:



Now don't take this as me bashing MSI; I actually quite like them. It's more that you're condemning the XFX cards based on one user's experience with them.

I did a Furmark test a while ago at 1200/1500 with +100mV on my DD 390X... want to see?



I don't know about you, but I can accept it only getting that high with that much extra voltage running through it, especially when you take into account that my case has both front- and top-mounted rads, I live in Australia with no A/C, and this is a dual-fan cooler (if it were a triple-fan, it would most likely beat Sapphire for the cooling crown; same with MSI).
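As a rough illustration of why +100mV hits so hard in a worst-case load like Furmark: dynamic power scales roughly with frequency times voltage squared. The 1.20 V stock voltage below is purely an assumed example figure (real Hawaii/Grenada stock voltages vary per card by ASIC quality); the point is the shape of the scaling, not the exact number.

```python
# Back-of-the-envelope dynamic power scaling: P ~ f * V^2.
def relative_power(f_oc: float, f_stock: float, v_oc: float, v_stock: float) -> float:
    """Ratio of overclocked dynamic power to stock, per P ~ f * V^2."""
    return (f_oc / f_stock) * (v_oc / v_stock) ** 2

stock_v = 1.20            # assumed stock core voltage (V) -- illustrative only
oc_v = stock_v + 0.100    # +100 mV software offset
ratio = relative_power(1200, 1060, oc_v, stock_v)
print(f"~{(ratio - 1) * 100:.0f}% more core power than stock")  # roughly +33%
```

A third or so more heat through the same cooler is why a card that behaves at stock can look scary in Furmark once overvolted.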


----------



## desetnik

Quote:


> Originally Posted by *kizwan*
> 
> Your friend have better & faster single thread performance CPU. He is still at 1080p anyway. If 1440p I would be surprise if he have higher FPS than yours 1080p.


The resolution width difference is still 25%! Dying Light benchmarks show no FPS benefit from a stronger or overclocked CPU above a 4770K. Even Unigine benchmarks with weaker CPUs score about 100 points higher, though that varies a lot. Also, the 4790K gets exactly the same framerates in games, judging from AnandTech. It's bad enough that my card's temperature and fan speed are poor, but underperforming like this on top of that? I feel like I messed something up in my build.


----------



## Charcharo

I really do wonder about getting a 4790k...


----------



## rdr09

Quote:


> Originally Posted by *Charcharo*
> 
> I really do wonder about getting a 4790k...


it seems you found what's causing your issue.


----------



## Charcharo

Quote:


> Originally Posted by *rdr09*
> 
> it seems you found what's causing your issue.


Well... but isn't the 4460 good enough?

I was careful when I chose it.

On the other hand, it seems the only games that run... not that well so far are Far Cry 4 and Fallout 4. Generally, the rest of the games I play, like Metro Redux and The Witcher 3, perform great (aside from the still-in-beta Armored Warfare and WoT, which Wargaming requires an SSD for, for some stupid reason).


----------



## rdr09

Quote:


> Originally Posted by *Charcharo*
> 
> Well... but isnt the 4460 good enough?
> 
> On the other hand, it seems the only games that run ... not that well so far are Far Cry 4 and Fallout 4. Generally the rest of the games I play like Metro Redux and Witcher 3 perform great.


I am not so sure switching to an i7 4790K will make all your games run better. But if it means getting a new board too, then I'd say stay with your current setup.


----------



## Gumbi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Just going to leave this here:
> 
> 
> 
> Now don't take this as me bashing on MSI, I quite like MSI actually but it's more that you are condemning the XFX cards based off one users experience with it.
> 
> I did a Furmark test a while ago with 1200/1500 +100mV on my DD 390x.......want to see?
> 
> 
> 
> Don't know about you but I can accept that with that much extra voltage pushing through it it only got that high especially when you take into account that my case has both a front mounted and top mounted rads, I live in Australia with no A/C and that this is a dual fan cooler (if this was a triple fan then it would most likely beat out Sapphire for the cooling crown, same with MSI)


Dude, of course it's gonna get hot! It's Furmark; it's a power virus. My Vapor-X power throttles at stock instantly in Furmark (and my cooling is very, very good, with relatively low ambients too).


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> ...
> 
> *Dude, of course it's gonna get hot! It's Furmark*. it's a power virus. My VaporX power throttles at stock instantly on Furmark (and my cooling is very, very good, relatively low ambients too).
Click to expand...

That was my point


----------



## Gumbi

Sorry, I see what you meant now. I thought that was the other guy posting an XFX number.


----------



## Charcharo

Quote:


> Originally Posted by *rdr09*
> 
> i am not so sure if switching to an i7 4790K will make all your games run better. but if it means getting a new board too, then i'd say stay with your current setup.


I don't think I need to change my motherboard to use it. I might not be able to OC it by much (only a little), but it should run well.

What other issues may be causing some of these problems in some games?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Sorry, I see what you meant to say now. I thought that was the other guy posting an XFX number


All good man


----------



## Gdourado

Hello,
I have a Sapphire 390X Tri-X.
I set it up for +20% power limit and +100mV.
With those settings I can get a 1160 core clock and 6800 memory speed.
If I go for 1175 core at the same settings, I get artifacts in both Heaven and Valley.

Are these results good?
Should I increase the voltage to try to get more core clock?

Thank you.
Cheers


----------



## tangelo

I have been pretty pleased with my MSI R9 390. I play at 1080p @ 60 FPS. I have been thinking of getting a 1440p screen with FreeSync or 120-144Hz. Would this card be fast enough to run modern games at 1440p above 60 FPS with max/ultra settings, or should I just get a normal 60Hz 1440p panel?


----------



## Gumbi

Quote:


> Originally Posted by *Gdourado*
> 
> Hello,
> I have a sapphire 390x tri-x.
> I set it up for +20% power limit and +100mv.
> With those settings I can get 1160 core clock and 6800 memory speed.
> If I go for 1175 core at the same settings, I get artifacts in both heaven and Valley.
> 
> Are these results good?
> Should I increase the voltage to try and get more core clock?
> 
> Thank you.
> Cheers


How hot are your core/VRMs getting when you bench at those clocks and volts? Generally people stick to +100mV for a 24/7 overclock, but you could put a bit more through it if you have the temps under control.


----------



## CrazyElf

Sigh... what we need is a 3-slot cooler from MSI on a 390X, ideally with a custom PCB. I suppose I am asking for a 390X Lightning.

That would be the ultimate card: good RMA, good cooling, Samsung VRAM, and MSI would likely bin the chips. I seriously wish they had not EOL'ed the Lightning line so early.

Quote:


> Originally Posted by *lightsout*
> 
> XFX's marketing stuff for these 390's says they have totally redesigned their vrm cooling, they now have a plate over the ram and heatsinks on the vrms. Not sure how much is fluff but there was a guy in this thread saying his hit 90c and he had to return. Waiting for an actual owner to say yay or nay on that. Or at least share their experience.
> 
> MSI's really do seem to be good clockers. And yeah I would prefer MSI's RMA, they all have horror stories but like you said MSI seems to be pretty consumer friendly.
> 
> I would also like to know about your ram question. Is anyone in here using more than 4gb's on a single card config?


I've been doing some research about this.

Anyways, here is the best review I've been able to find:
https://techreport.com/blog/28800/how-much-video-memory-is-enough

Conclusions based on the review:

- The gap you see between the 290X and 390X comes down purely to core clocks (the 290X is clocked lower). Clock for clock with the same BIOS, the 290X and 390X appear to be identical. (See here: http://www.overclock.net/t/1565660/lets-settle-this-290-x-vs-390-x-debate-once-and-for-all/0_100.)
- Based on the review, AMD has a higher-bandwidth memory controller (especially with the Fury X), but Nvidia is better at using the bandwidth it has available.
- 4GB is not going to be a problem at 4K resolutions or below, even though AMD uses VRAM less efficiently and even at demanding settings.
- My guess is the frame-time spike on the 290X in Far Cry is due to the BIOS. So if you have a 290X or 290, flashing the 390X or 390 BIOS is worth considering to improve performance.
- Even with Crossfire setups, I don't think it will be a problem. Past 5760x3200 you might have issues, but considering even 290X Crossfire setups aren't playable there, this is a total non-issue. So anyone who is price-sensitive might be able to get a used 290X.

The card appears to be well balanced (i.e. you run out of core before running out of VRAM).
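For reference, the bandwidth side of that comparison can be reproduced straight from the published specs (512-bit bus at 5 GT/s effective on the 290X, 6 GT/s on the 390X, and HBM's 4096-bit bus at 1 GT/s on the Fury X):

```python
# Memory bandwidth from bus width and effective data rate:
# GB/s = (bus width in bits / 8 bits-per-byte) * data rate in GT/s.
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    return bus_width_bits / 8 * data_rate_gtps

print(mem_bandwidth_gbs(512, 5.0))    # R9 290X (GDDR5): 320.0 GB/s
print(mem_bandwidth_gbs(512, 6.0))    # R9 390X (GDDR5): 384.0 GB/s
print(mem_bandwidth_gbs(4096, 1.0))   # Fury X (HBM):    512.0 GB/s
```

Which is also why the 390X's memory overclocks matter: every +100 MHz on the memory clock (+400 MT/s effective) adds another 25.6 GB/s on that 512-bit bus.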

Quote:


> Originally Posted by *battleaxe*
> 
> You're comparing the 290 series with 390 series. Irrelevant. The 390 is completely different. Completely.
> 
> That being said. MSI RMA experience is rather stellar. I've had to do so myself, and was quite happy.


Checking now. +Rep - you are correct.

Hmm you are right about this one. XFX DD 390X does seem to have been redesigned:


So in that regard at least, XFX has gotten better. Good to know.

Out of curiosity, how has your Linux experience been with these GPUs?

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If you want the best clocking cards then get a 290x seeing as they hit higher core clocks easier than what the 300 series can.
> The vrm cooling on the DD's is really really good, among the best in the 300 series and yes they still have the potential to hit 90c if your core temps and case airflow are horrible.
> It seem pretty obvious by now that you just don't like XFX and as Battleaxe said they took everything bad about the 200 series and improved it (I own both an XFX DD 290x and 390x btw)
> 
> If you want the absolute best cooling then Tri-X or Devil, if you want the best all round card then MSI or XFX


Their cards before the 390X DD sucked; that is, to the best of my knowledge, factually accurate. It's not about whether I like or dislike a brand; what matters to me is whether their products are badly designed, and the 290X and 7970 DD were terribly designed. The fact that they have been redesigning their GPUs after release, using cheaper non-reference PCBs, is also factually accurate. They are not the only manufacturer that does it, but it's no crime to call a spade a spade IMO. I, for one, would not recommend buying a used 290/290X DD or a 7970/7950 DD.

That and I have heard some negative experiences before about their RMA. I don't know whether that has changed more recently. It does look like Sapphire has improved considerably.

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If you want the best clocking cards then get a 290x seeing as they hit higher core clocks easier than what the 300 series can.


So are you saying that the 290X seems to OC better? Higher clocks at a lower voltage?

The only other benefit I see with some cards is that they may use Samsung VRAM, which has somewhat tighter timings. At the same VRAM and core clock, I'd expect a GPU with Samsung VRAM to do slightly better.

Quote:


> Originally Posted by *Charcharo*
> 
> Well... but isnt the 4460 good enough?
> 
> On the other hand, it seems the only games that run ... not that well so far are Far Cry 4 and Fallout 4. Generally the rest of the games I play like Metro Redux and Witcher 3 perform great.


I'm not sure in your situation if a 4790K would be the best investment for the money.

The only time I would upgrade a CPU purely for gaming is when you are running into CPU bottlenecks, which, judging by your statement, you don't seem to be. Games like Cities: Skylines are very CPU-intensive (especially with giant cities near the agent limit). Battlefield 4 comes to mind as a CPU-bound game, as does the Total War series.

Most other games are GPU bound. If you have the money, in your position, I'd buy a second 390. That will yield better performance.


----------



## Sgt Bilko

Quote:


> Originally Posted by *CrazyElf*
> 
> ...
> 
> So are you saying that the 290X seems to OC better? Higher clocks at a lower voltage?
> 
> The only other benefit I see with some cards is that they may use Samsung VRAM, which has somewhat tighter timings. At the same VRAM and core clock, I'd expect a GPU with Samsung VRAM to do slightly better.
Click to expand...

Most 290x's I've seen will hit a higher clock speed than what the 390/x's can but that doesn't mean they are faster thanks to to drivers picking up on this.

as for Samsung ram there is no clear cut way to choose them as it seems to be random apart from the lightning and that said it doesn't mean it will automatically be faster, my 390x for example will do 1150/1700 on stock voltage but requires +100mV to hit 1200 on the core.

Overall the 390x is a better card than the 290x (as you'd expect), but if you want a 100% clear-cut answer as to what you should get then I can't give you that......you need to decide for yourself


----------



## lightsout

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> It is true I don't love XFX. But I haven't read an msi owner say their vrm was anywhere near 90c. But I haven't done a ton of research.
> 
> 
> 
> Just going to leave this here:
> 
> 
> 
> Now don't take this as me bashing on MSI, I quite like MSI actually but it's more that you are condemning the XFX cards based off one users experience with it.
> 
> I did a Furmark test a while ago with 1200/1500 +100mV on my DD 390x.......want to see?
> 
> 
> 
> Don't know about you but I can accept that with that much extra voltage pushing through it it only got that high especially when you take into account that my case has both a front mounted and top mounted rads, I live in Australia with no A/C and that this is a dual fan cooler (if this was a triple fan then it would most likely beat out Sapphire for the cooling crown, same with MSI)
Click to expand...

Actually I wasn't trying to condemn XFX at all. I just wanted an actual owner to chime in besides the one that said his hit 90.

Also I have quite a bit of negative history with XFX from their previous cards, so I'm skeptical when a company claims they've fixed the issue.

Can any MSI owners comment on their VRM load temps?


----------



## Gdourado

Quote:


> Originally Posted by *Gumbi*
> 
> How hot are your core/VRMs getting when you bench it at those clocks volts? Generally people stick to 100mv as a 24/7 overclock, but you could put a bit more through it if you have the temps under control.


I am still on the stock automatic fan curve.
Core goes to 80 and vrm a bit higher like 85.

Is 1160 a good core oc for a 390x?
Or do they scale better with voltage?

Cheers


----------



## Gumbi

Quote:


> Originally Posted by *Gdourado*
> 
> I am still on the stock automatic fan curve.
> Core goes to 80 and vrm a bit higher like 85.
> 
> Is 1160 a good core oc for a 390x?
> Or do they scale better with voltage?
> 
> Cheers


1160 is a respectable overclock. How much voltage are you putting through it for that?


----------



## Gdourado

Quote:


> Originally Posted by *Gumbi*
> 
> 1160 is a respectable overclock. How much voltage are you putting through it for that?


+100mv and +20% power limit.


----------



## tangelo

Quote:


> Originally Posted by *lightsout*
> 
> Actually I wasn't trying to condemn XFX at all. I just wanted an actual owner to chime in besides the one that said his hit 90.
> 
> Also I have quite a bit of negative about XFX with the previous cards so I'm skeptical when a company claims they've fixed the issue.
> 
> Can any msi owners comment on their vrm load temps.


MSI R9 390 clocked at 1125/1650 with +50% power limit, running at 100% GPU usage, the VRM temps are around 60C


----------



## Charcharo

Quote:


> Originally Posted by *CrazyElf*
> 
> That and I have heard some negative experiences before about their RMA. I don't know whether that has changed more recently. It does look like Sapphire has improved considerably.
> So are you saying that the 290X seems to OC better? Higher clocks at a lower voltage?
> 
> The only other benefit I see with some cards is that they may use Samsung VRAM, which has somewhat tighter timings. At the same VRAM and core clock, I'd expect a GPU with Samsung VRAM to do slightly better.
> I'm not sure in your situation if a 4790K would be the best investment for the money.
> 
> The only time I would upgrade a CPU purely for gaming is when you are running into CPU bottlenecks, which judging by your statement, you don't seem to be. Games like Cities Skylines are very CPU intensive (especially if you have giant cities near the agent limits). Battlefield 4 comes to mind as well as a CPU bound game as does the Total War series.
> 
> Most other games are GPU bound. If you have the money, in your position, I'd buy a second 390. That will yield better performance.


Hmm. Sorry for being such a noob guys







It's the first PC that is completely mine that I have built in 6 years. And the highest-end one I ever built. And the first one where absolutely every single euro was my own









Just a bit paranoid on these things. I already know I want a good SSD in a month or two and maybe also upgrade to Windows 10 (heard AMD cards perform a bit better then). And a really good monitor too









And a new wallet









Thing is, there is a chance for me to exchange my i5 4460 along with my old PC and to snag a 4790k.


----------



## battleaxe

Quote:


> Originally Posted by *Charcharo*
> 
> Hmm. Sorry for being such a noob guys
> 
> 
> 
> 
> 
> 
> 
> Is the first PC that is completely mine I have built in 6 years. And the highest end one I ever built. And the first one where absolutely every single euro was my own
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just a bit paranoid on these things. I already know I want a good SSD in a month or two and maybe also upgrade to Windows 10 (heard AMD cards perform a bit better then). And a really good monitor too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And a new wallet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thing is, there is a chance for me to exchange my i5 4460 along with my old PC and to snag a 4790k.


Do it!

Edit: wait, can your mobo support OC'ing? If so, then do it. If not, then you'd better figure in the price of a new board too.


----------



## lightsout

Quote:


> Originally Posted by *tangelo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Actually I wasn't trying to condemn XFX at all. I just wanted an actual owner to chime in besides the one that said his hit 90.
> 
> Also I have quite a bit of negative about XFX with the previous cards so I'm skeptical when a company claims they've fixed the issue.
> 
> Can any msi owners comment on their vrm load temps.
> 
> 
> 
> MSI R9 390 clocked at 1125/1650 with +50% power limit running on 100% gpu usage the VRM temps are around 60C
Click to expand...

60, that's it? That's really low. Seems like there are hot spots on the card that the temp sensors don't read. The discrepancy between that and the tests in the Tom's Hardware review is huge.
http://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-11.html


----------



## Charcharo

Quote:


> Originally Posted by *battleaxe*
> 
> Do it!
> 
> Edit: wait, can your mobo support OC'ing? If so, then do it. If not, then you'd better figure in the price of a new board too.


I think it does. Though I will not OC a lot


----------



## tangelo

Quote:


> Originally Posted by *lightsout*
> 
> 60 thats it? Thats really low. Seems like there are hot spots on the card that temp sensors don't read. The discrepancy between that and the tests on the toms review is so huge.
> http://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-11.html


I just ended a 3-hour session of Fallout 4 and the highest temp the VRM reached was 63C. But after running the Heaven benchmark a couple of times, the temp rose to 69C while the fan was running at 59%.

So I guess it varies quite a bit from one application to another, and with the fan curve.


----------



## Dundundata

That sounds about right as far as MSI VRM goes, though I can only go by what the sensor says.


----------



## GorillaSceptre

Just got a MSI 390x.









Haven't played any games yet, just been messing around in some benches checking temps. What so far has been deemed to be "safe" for 24/7 temps? The max I've seen is 83C on the core after about a half hour of Valley with around 80% fans, ambient temps must be around 30C. That's with the stock 1080/1.2v clocks.

Not too impressed with the cooler so far, especially coming from a Vapor-X, but it's not that loud and gets the job done, assuming my temps are safe..


----------



## Strife21

Quote:


> Originally Posted by *lightsout*
> 
> It is true I don't love XFX. But I haven't read an msi owner say their vrm was anywhere near 90c. But I haven't done a ton of research.


I had an XFX 390 with amazing airflow in my case and an additional case fan on the side panel pointed directly at it, and the VRM was hitting 93C at stock speeds. I took it back and got an MSI card, and the VRM doesn't go over 73C overclocked.

It also had horrible coil whine, which I don't have now with the MSI.


----------



## mandrix

This is my best Heaven bench so far with my PCS 390X @ 1190/1610 25/50.
I can do basically the same thing with about 1220/1510 63/50.
VRM gets pretty hot even with the water block, but it cools down fast.


----------



## fat4l

Quote:


> Originally Posted by *Gumbi*
> 
> 1160 is a respectable overclock. How much voltage are you putting through it for that?


Quote:


> Originally Posted by *Gdourado*
> 
> +100mv and +20% power limit.


Well you guys should really understand that every card has a different stock voltage.
So saying +100mV is not really saying anything. When you are clocking your CPU you don't say "I put +150mV through it", you say "I put 1.3V through it".
This is the same with graphics cards. You need to get your stock voltage with the EVV program (located in the Hawaii BIOS editing thread) and then add the +100mV on top of that.
Some cards have 1.25V stock, some have 1.19V stock. Thus the OC will vary.
Hope it makes sense
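A minimal sketch of the point above, with hypothetical stock VIDs (the helper name is just illustrative): the same +100mV slider lands different cards at different absolute voltages.

```python
# The offset dialled into Trixx/Afterburner is added on top of the card's
# stock VID, so the absolute voltage (and thus the OC headroom) differs
# from card to card even with an identical "+100mV".
def effective_voltage(stock_vid_v: float, offset_mv: float) -> float:
    """Voltage requested from the VRM, before any droop."""
    return stock_vid_v + offset_mv / 1000.0

for vid in (1.19, 1.25):
    print(f"VID {vid:.3f} V + 100 mV -> {effective_voltage(vid, 100):.3f} V")
```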


----------



## BlackFox1337

Hello all,

im looking to grab a 390x on Black Friday, but i need one that can take a waterblock. Any recommendations?


----------



## Harry604

Grabbing a Sapphire 390x Tri-X for $400 CAD ($299 USD), brand new in box.

I'll let you know the best overclock it can do.

What should I expect from this card?


----------



## Cannon19932006

With the cold front coming in tonight, I decided to let some of that cool ambient air in: popped off my side panel, set up a big fan, and ran a little 3DMark until I got this bad boy.

http://www.3dmark.com/fs/6552532

Both gpu's were at 1150/1700 with +100mv core, and +25mv aux. Top gpu maxed out at 82c, bottom maxed out at 60c.


----------



## mus1mus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Most 290x's I've seen will hit a higher clock speed than what the 390/x's can, but that doesn't mean they are faster, thanks to drivers picking up on this.
> 
> as for Samsung ram there is no clear cut way to choose them as it seems to be random apart from the lightning and that said it doesn't mean it will automatically be faster, my 390x for example will do 1150/1700 on stock voltage but requires +100mV to hit 1200 on the core.
> 
> overall the 390x is a better card than the 290x (as you'd expect) but if you want a 100% clear cut answer as to what you should get then i can't give you that......you need to decide for yourself


Nice info mate. As I am looking at buying either a 290X or a 390.

390 costs about $20 more for an 8GB version. But if these temps plague the VRM and OC are quite meh compared to the 290X, I might lean back to the 290X. Especially with the WB availability.

I wish XFX 390s were available locally though. That would be considerable for me.

Any of you guys looking at bios editting your 300 series cards?
Quote:


> Originally Posted by *BlackFox1337*
> 
> Hello all,
> 
> im looking to grab a 390x on Black Friday, but i need one that can take a waterblock. Any recommendations?


It's in the OP.


----------



## kizwan

Quote:


> Originally Posted by *Gdourado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> 1160 is a respectable overclock. How much voltage are you putting through it for that?
> 
> 
> 
> +100mv and +20% power limit.
Click to expand...

Always max out the power limit. There's no reason not to, unless you intentionally want to cap the power level. If your GPU core frequency is maintained at 100% all the time, then +20% power limit is fine for you, but setting it at +50% instead of +20% will not increase your card's power draw in your case.


----------



## Harry604

well just setup my 390x trix

1200mhz core and 1600 memory at 100mv 50 powerlimit

and has not crashed yet


----------



## GorillaSceptre

So jealous of you guys playing with overclocks.. My ambient temps don't allow it.









I think i'm going custom water next year when Greenland/Pascal drops. Summer sucks.


----------



## lightsout

Quote:


> Originally Posted by *Strife21*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> It is true I don't love XFX. But I haven't read an msi owner say their vrm was anywhere near 90c. But I haven't done a ton of research.
> 
> 
> 
> I had a xfx 390 with amazing air flow in my case and a additonal case fan on the side panel pointed directly at it and the vrm was hitting 93C at stock speeds. I took it back got an MSI card and the VRM doesnt go over 73C overclocked
> 
> It also had horrible coil whine which I don't have now with the MSI
Click to expand...

Wow thank you I appreciate that info. That's hot.


----------



## Levys

Quote:


> Originally Posted by *Sgt Bilko*
> 
> VRM 1 is for the memory and it's located here:
> 
> 
> 
> It doesn't have a heatsink unlike Vrm 2 (which is for the Core) but even under a decent overclock (1600-1700Mhz) it shouldn't go over 75-80c really


I have mentioned this before, but here it is again










1.5 mm Arctic thermal pads between the VRMs and the backplate, since I had an Arctic Accelero IV.
I used this setup every time with nice results. The Accelero IV came without any other VRM cooling
and did a great job for my reference R9 290, and now on my R9 390x.


----------



## Levys

Quote:


> Quote:
> 
> 
> 
> Originally Posted by *CrazyElf*
> 
> Out of curiosity, has anyone found a use for the extra 4GB of VRAM? What about in Crossfire?
> 
> 
> 
> Yes I did....It's called Black Ops III and it eats memory for breakfast
> 
> 
> 
> 
> 
> 
> 
> ( that's ram and vram )
> I run out of memory after +- 25 min, 16GB RAM and 8GB VRAM ...lols, didn't check for a fix yet. Hope they do. It starts off at 5GB+ VRAM maxed out,
> but runs smooth as butter at 58-62fps with v-sync. Hope this answers your question a bit.
Click to expand...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Levys*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> VRM 1 is for the memory and it's located here:
> 
> 
> 
> It doesn't have a heatsink unlike Vrm 2 (which is for the Core) but even under a decent overclock (1600-1700Mhz) it shouldn't go over 75-80c really
> 
> 
> 
> I have mentioned this before, but here it is again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1,5 mm Arctic thermal pads between the backplate, since I had an Arctic Accelero IV
> I used it every time with nice results. The Aac IV came without any other vrm cooling
> and did a great job for my reference r9 290, and now on my R9 390x.
Click to expand...

That's pretty cool, my 390x doesn't get all that hot but this could work for quite a few people out there


----------



## tangelo

As I didn't receive any input, I'll ask again. Please forgive me









With R9 390, should I go for a 1440p Freesync monitor or just a normal 1440p 60Hz? Is the card powerful enough to run games on maxed settings with 1440p above 60fps to justify the premium price for a Freesync monitor? I have a 1080p now.

Need any input you guys with 1440p panels and R9 390 could provide.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tangelo*
> 
> As I didn't receive any input, I'll ask again. Please forgive me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With R9 390, should I go for a 1440p Freesync monitor or just a normal 1440p 60Hz? Is the card powerful enough to run games on maxed settings with 1440p above 60fps to justify the premium price for a Freesync monitor? I have a 1080p now.
> 
> Need any input you guys with 1440p panels and R9 390 could provide.


For the most part a 390 will run between 60-90fps in the vast majority of games on Ultra.........if it's worth getting a higher hz monitor for that then go for it


----------



## mandrix

So what is it exactly this app is supposed to be telling me about my card?
When I monitor the voltage, I'm not sure how this applies to anything I see. Thanks!


----------



## mus1mus

Quote:


> Originally Posted by *mandrix*
> 
> So what is it exactly this app is supposed to be telling me about my card?
> When I monitor the voltage, I'm not sure how this applies to anything I see. Thanks!


It says your GPU chip requires that voltage at the max frequency it is set to run at that instant.

For non-OC'ed cards, it shows the default voltage the chip needs to run the defined stock clock, i.e. the stock VID.

To effectively get the card's stock VID, you need to reset all OC profiles on the card, do a reboot, and run the app.

It is helpful when modifying the card's BIOS, to get an idea of the voltage you need if you want to make a pre-overclocked BIOS. (The Hawaii BIOS Reader app has that capability.)

Also note that OC software such as Trixx, MSI AB, and HIS iTurbo adds a voltage offset onto that VID when you increase it with the slider. So the voltage applied to the card after overvolting will be VID + offset, assuming no voltage droop occurs.

With your VID, I think it's quite high. What's your ASIC?
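The VID + offset (and droop) bookkeeping above can be put into one toy model; the numbers and the `applied_voltage` helper here are hypothetical, not measured values.

```python
# OC software adds its offset onto the chip's DPM 7 VID; what the chip
# actually sees under load is that target minus whatever Vdroop occurs.
def applied_voltage(vid_v: float, offset_mv: float = 0, droop_mv: float = 0) -> float:
    return vid_v + (offset_mv - droop_mv) / 1000.0

# e.g. a 1.275 V VID card with +63 mV and an assumed ~25 mV of droop:
target = applied_voltage(1.275, offset_mv=63)                    # roughly the max reading
under_load = applied_voltage(1.275, offset_mv=63, droop_mv=25)   # closer to a load reading
print(f"target {target:.3f} V, under load ~{under_load:.3f} V")
```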


----------



## mandrix

Quote:


> Originally Posted by *mus1mus*
> 
> It says, your GPU chip requires that Voltage at the max Frequency it is set to run at that instance.
> 
> For non-OC'ed cards, it says the default Voltage the chip needs to run the defined stock clock. Or Stock VID.
> 
> For you to effectively get the card's stock VID, you need to reset all OC profiles to the card. Do a reboot and run the App.
> 
> It is helpful when modifying the card's BIOS to get the idea of the Voltage you need if you want to do a pre-overclocked BIOS. (Hawaii BIOS Reader app has that capability).


So it's something that is programmed into the BIOS then.
I have checked with this app at stock and OC'd and it always reads the same.

BTW the max voltage I see in AB when the card is overclocked (but no changes to voltage) is about 1.25V. When I add, say, +63mV to the core, AB shows 1.3-ish.
Thanks!


----------



## mus1mus

Quote:


> Originally Posted by *mandrix*
> 
> So it's something that is programmed into the BIOS then.
> I have checked with this app at stock and OC'd and it always reads the same.
> 
> BTW the max voltage I see in AB when the card is overclocked (but no changes to voltage) is about 1.25v. When I add, say, +63 to the core, AB shows 1.3 'ish.
> Thanks!


It is a chip attribute, but OC software and BIOS modding can override that value.

It's called the DPM 7 voltage.

When you check it via GPU-Z, it is already affected by Vdroop, so it is less than the actual voltage the chip is supplied with, especially at load.


----------



## mandrix

Quote:


> Originally Posted by *mus1mus*
> 
> It is a chip attribute. But OC software and BIOS modding can override that value.
> 
> It's called DPM 7 Voltage.
> 
> When you check it via GPU-Z, it is already affected by VDroop so, it is less than the actual Voltage the chip is supplied with. Especially at load.


OK.
Makes sense. So when AB is showing 1.3v under load it's actually taking in substantially more voltage, then.


----------



## mus1mus

Quote:


> Originally Posted by *mandrix*
> 
> OK.
> Makes sense. So when AB is showing 1.3v under load it's actually taking in substantially more voltage, then.


You might need to verify that with other apps (and/or a multimeter).

I have not tested Afterburner much with AMD cards due to their limited voltage range, so maybe try to confirm that with GPU-Z.


----------



## mandrix

Quote:


> Originally Posted by *mus1mus*
> 
> You might need to verify that with other apps. (and/or a Multitester)
> 
> I have not tested Afterburner much with AMD cards due to their limited Voltage range. So maybe try to confirm that with GPU-Z.


AB, gpu-z, HWINFO64 all seem to agree with each other on VDDC with any of the AMD cards I've owned. But I've never checked with DMM and since I run a block and backplate I would not be able to anyway I guess. HWINFO64 does show some current and power figures, though, which might be useful.


----------



## Gdourado

Quote:


> Originally Posted by *Harry604*
> 
> well just setup my 390x trix
> 
> 1200mhz core and 1600 memory at 100mv 50 powerlimit
> 
> and has not crashed yet


When I try 1200 on my trix, I start to get small artifacts in heaven.
Have you tried heaven with those clocks?


----------



## kizwan

Quote:


> Originally Posted by *mandrix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> You might need to verify that with other apps. (and/or a Multitester)
> 
> I have not tested Afterburner much with AMD cards due to their limited Voltage range. So maybe try to confirm that with GPU-Z.
> 
> 
> 
> AB, gpu-z, HWINFO64 all seem to agree with each other on VDDC with any of the AMD cards I've owned. But I've never checked with DMM and since I run a block and backplate I would not be able to anyway I guess. HWINFO64 does show some current and power figures, though, which might be useful.
Click to expand...

Look at the max value. That's the voltage before Vdroop.

Your default DPM 7 is 1.275V; with the +63mV offset that's 1.338V, but your card is not necessarily going to get exactly 1.338V (before Vdroop). It may be less or more; 1.3-something sounds about right to me. I think it's pretty much like a CPU VID, which is the voltage that should be delivered to the chip, while the actual voltage it gets may be more (or less).


----------



## mandrix

Quote:


> Originally Posted by *kizwan*
> 
> Look at max value. That's voltage before Vdroop.
> 
> Your default DPM 7 1.275V + offset +63mV = 1.338V, but your card not necessarily going to get exactly 1.338V (before Vdroop). It may less or more. 1.3 something sound about right to me. It think it's pretty much like CPU VID which is the voltage that should be delivered to the chip but the actual voltage it get may be more (or less).


Thanks, good to know.
Is there any consensus on how high is "safe"? So far it's looking like more than +63mV doesn't bring much to the table for me, but it would be good to know how far I can go when benching the card.


----------



## fat4l

Quote:


> Originally Posted by *mandrix*
> 
> So what is it exactly this app is supposed to be telling me about my card?
> When I monitor the voltage, I'm not sure how this applies to anything I see. Thanks!


This is your 3D voltage. In reality it will not be 1.275V due to vdroop, but about 1.25V. You also need to check the MAX value (in Afterburner or GPU-Z), not the average one, to see the max value after vdroop.

Regarding what I said earlier, this is the reason I said "+100mV in AB" doesn't say anything about the actual voltage.
For example, when I add +100mV, my 3D voltage is then 1.29375V while yours would be 1.375V. See the difference? Thus... a different overclock...

Quote:


> Originally Posted by *mus1mus*
> 
> With your VID, I think it's quite high. What's your ASIC?


You can check his ASIC by using the EVV program.
Lkg = ASIC.
0x31A = 794; 794/1023 = 77.6%.

You are right, it's quite high, or I would say, it's much higher than on the 290X.
I believe this is the reason why 390X cards clock higher/better than the 290X. There's no magic behind it, it's just more core voltage.
Regarding the memory, they use a better memory controller (= higher stable clocks) plus newer/better memory chips (= higher clocks and tighter timings than the 290X, though still not as tight as they could be).
So clock for clock, the 390X is faster (because of the better memory timings).
Overclocking-wise, the 390X clocks higher due to the facts I mentioned above.
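As a quick sanity check on the Lkg-to-ASIC conversion above (a sketch; `asic_quality` is just an illustrative helper name):

```python
# EVV reports leakage (Lkg) as a hex value out of 0x3FF (1023);
# ASIC quality is simply that fraction expressed as a percentage.
def asic_quality(lkg_hex: str) -> float:
    return int(lkg_hex, 16) / 1023 * 100

print(f"{asic_quality('31A'):.1f}%")  # 0x31A = 794 -> 77.6%
```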









For comparison, the 295X2 mostly uses a 1.25V DPM7 (I checked 4 different ASICs: 70.4%, 68.4%, 73.9%, 77.8%; the voltage should vary but it doesn't for some reason).
My cores (290X) are 79.5% at 1.19375V DPM7 and 78.3% at 1.22500V DPM7.

It would be nice to see more people run this EVV program, to see the logic behind the 390X cards.


----------



## Dundundata

So with +50mV on the core I am getting a max of 1.318V in HWiNFO, does that sound about right?

Also what is considered a safe voltage for these cards?


----------



## jazz995756

Hey guys, just wondering: I'm looking into doing Crossfire and Newegg will have a nice deal on the XFX model for Black Friday. Do you guys think it's a good idea, or is it just not worth it?


----------



## hugoolly

Hey guys, just finished my latest upgrades.
Now rocking 2 390 ASUS DC2s in crossfire, 750d case and RM1000W PSU.


----------



## hugoolly

Just got my second 390, as long as you have a good enough PSU it is worth it.
Performance gains are pretty great, just wish Fallout 4 was optimised better for CF


----------



## hugoolly

Quote:


> Originally Posted by *jazz995756*
> 
> Hey guys just wondering looking into doing a crossfire and newegg will have a nice deal on the XFX model for Black Friday do you guys think it's a good idea? Or is it just not even worth it.


It's worth it if you have a high enough wattage PSU. I just added my second 390 today and it's pretty awesome


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> This is your 3D voltage. In reality it will not be 1.275v due to vdroop but about 1.25V. You also need to check MAX value(in afterburner or gpu-z) not average one to see the max value after vdroop.
> 
> Regarding what I said earlier, this is the reason I said "+100mV in AB" doesn't say anything regarding the voltage.
> For example when I add +100mV, my 3d voltage then is 1.29375v while yours would be 1.375v. See the difference ? Thus...different overclock...
> You can check his asic by using evv program.
> Lkg = asic.
> 31A=794/1023=77.6%.
> 
> You are right, it's too high or I would say, it's much higher than on 290X.
> I believe this is the reason why 390X cards clock higher/better than 290X. There's no magic behind it, it's just more core voltage.
> Regarding mems, they use better mem controller=higher stable clocks + they use newer/better mem chips = higher clocks + tighter timings(than 290X, but still not as tight as they could be).
> So clock for clock, 390X is faster(cuz of better mem timings).
> Overclocking wise, 390X clocks higher due to facts I mentioned above.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For comparison 295X2 uses mostly 1.25V DPM7(I checked 4 different asics= 70.4%, 68.4%,73.9%,77.8% asic; the voltage should vary but it doesn't for some reason).
> My cores(290X) are 79.5%-1.19375v DPM7 and 78.3%-1.22500v DPM7.
> 
> It would be nice to see if more ppl run this EVV program to see the logic behind 390X cards.


Doesn't sound right to me.

My 290 has a VID of 1.18ish / ASIC of 81% and clocks to 1350.

Any 390X that can reach that would be a stupendous find.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Doesn't sound right to me.
> 
> My 290 has a VID of 1.18ish / ASIC of 81% and clocks to 1350.
> 
> Any 390X that can reach that will be stupendously a nightmare for me.


You can't look at one card, your card, only. Your card is very good in the silicon lottery.
Look at the 290X in general.
How far can a 290X get on "stock" volts in general vs how far a 390X can get on "stock" volts in general....


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> You cant look at one card, your card, only. Your card is very good @silicon lottery.
> Look at 290X in general.
> How far can 290X get on "stock" volts in general vs How far can 390X get on "stock" volts in general....


Then I think, the OC question is out of the equation.

From what I can gather on this thread, 1200 core for a 390/X is something generally accepted as high. But it also takes a lot of voltage to reach it. (VID + Offset)

And you were wrong about me looking at just one card. I have in fact 3 290/X cards: one 290 Hynix, one 290 Elpida, and a 290X Elpida. None of them has a problem hitting 1200 within a +150mV voltage offset, while all are within a 1.25V VID.

And as you are very aware, stock volts for these cards vary a lot. So your OC-at-stock-voltage logic is quite at odds with your previous posts. What's with the change of heart?

To add, even my poorest-clocking 290 Hynix card does 1200 at +100 in MSI AB. But I don't use MSI AB that much due to it limiting my memory OC to 1625 (before BIOS editing came into the picture). But that card is a hella bencher.


----------



## Dundundata

Quote:


> Originally Posted by *jazz995756*
> 
> Hey guys just wondering looking into doing a crossfire and newegg will have a nice deal on the XFX model for Black Friday do you guys think it's a good idea? Or is it just not even worth it.


That is a really good deal ($260), I hope I don't see the MSI going for that price because I've spent enough


----------



## Dundundata

@hugoolly, nice setup. I wish I had gone for the 1000W, because if I ever Crossfire my PSU will need an upgrade. I also wish I had a full tower case like that!

In fact if ever I do Xfire I will probably have enough parts leftover to build a second rig.


----------



## kizwan

Look at the voltage while the GPU is under load. Don't rely on the DPM 7 voltage. 390 cards may have a high DPM 7 voltage, but if the voltage droop is more than on 290 cards, the 390 cards may actually be operating at a lower voltage.


----------



## genji

I just got an R9 390 2 days ago and I'm liking it so far. I've had some issues with the CPU going to 100% in Star Wars Battlefront in the main menu and at the very beginning of the game, but from what I've read this happened during the beta as well and may be related to AMD drivers. In game, however, it runs flawlessly and looks great.

I did have one issue that just occurred after 2 days. One of the maps was black except for what was right in front of me, and when I went forward you could see it 'light up', but further distances were blacked out until I got to them. I've played that map before and had no issues, so maybe it was a glitch? I asked in the game chat, but no one else was experiencing it. I could see the pause menu and chat window just fine, so I believe it was just a glitch on that map; I'll have to go there again to see if it occurs again. I ran FurMark to verify I wasn't seeing any issues with the card.

I have until January 15th to return/exchange it at Best Buy if anything is wrong with it or if I decide not to keep it, and it also has a lifetime warranty, so I'm not too worried; I just want to make sure it's not going to have any issues.


----------



## Sgt Bilko

Quote:


> Originally Posted by *genji*
> 
> I just got an R9 390 2 days ago. I'm liking it so far. I've had some issues with CPU in Star Wars Battlefront in the main menu and very beginning of game going to 100%, but from what I've read this happened during the beta as well and may be associated to AMD drivers. In game however it runs flawlessly and looks great. I did have one issue though that just occured after 2 days. One of the maps was black except for what was right in front of me and when I went forward you could see it 'light up' but then further distances were blacked out unti I got to them. I've played that map before and had no issues, so maybe it was a glitch? I asked in the game chat, but no one else was experiencing the issue. I could see the pause menu and chat window just fine, so believe it was just a glitch on the map, I'll have to go there again and re-try to see if it occurs again. I ran furmark to verify I wasn't seeing any issues with the card. I have until January 15th to return/exchange it at Best Buy if anything is wrong with it or if I don't decide to keep it and it also has a lifetime warranty, so I'm not too worried, just want to make sure it's not going to have any issues.


the Battlefront dark map glitch is an easy fix (a friend figured this one out).....just turn lighting and ambient occlusion down to low (might need to play around with those settings) and it'll work, I've had the issue myself and by chance is it the death match style Hoth map?

And the CPU usage is normal as well; my 9590 hits 100% in loading screens and main menus, even at 4K.


----------



## Mister300

I have the same issue with maps going black; it seems to happen when you exit out into the menus.


----------



## Mister300

Wait on a patch, it should fix it. I run it on max settings and it does ramp my 5820K up to 100%.


----------



## genji

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The Battlefront dark map glitch is an easy fix (a friend figured this one out): just turn lighting and ambient occlusion down to low (you might need to play around with those settings) and it'll work. I've had the issue myself. By chance, is it the deathmatch-style Hoth map?
> 
> And the CPU usage is normal as well; my 9590 hits 100% in loading screens and main menus, even at 4K.


Yes! It was on the Hoth Ice Caves. I'll turn down those settings. Thanks for the tip!


----------



## genji

Quote:


> Originally Posted by *Mister300*
> 
> Wait on patch should fix it I run it on max settings and it does ramp up my 5820K to 100%.


Good to know I'm not the only one this happens to.


----------



## RWGTROLL

this might help you guys out


----------



## genji

Quote:


> Originally Posted by *RWGTROLL*
> 
> 
> 
> 
> 
> this might help you guys out


That was the exact map and exact issue. Thanks for posting


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Then I think, the OC question is out of the equation.
> 
> From what I can gather on this thread, 1200 core for a 390/X is something generally accepted as high. But it also takes a lot of voltage to reach it. (VID + Offset)
> 
> And you were wrong about me looking at just one card. I have in fact 3 290/X cards. One is a 290 hynix, one 290 Elpida, and a 290X Elpida. None of them has problem hitting 1200 within +150 Voltage offset. While all within 1.25 VID.
> 
> And as you are very aware, stock volts for this cards vary a lot. So your OC at stock voltage logic is very out of the concept of your previous posts. What's with the change of heart?
> 
> To add, even my poorest clocker 290 Hynix card does 1200 at +100 on MSI AB. But I don't use it that much (MSI AB) due to it limiting my memory OC to 1625. (before BIOS editing came into my senses) But that card is a hella bencher.


My point is, with the same asic for 290x and 390x cards, the dpm7 evv voltage should be the same. It's not. 390Xs have higher stock volts, period.


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> My point is, *with the same asic for 290x and 390x cards, the dpm7 evv voltage should be the same.* It's not. *390Xs have higher stock volts*, _period._


1. You were the only one who claimed this.
2. Quite logical for cards that run at much higher Clocks.
3. !


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> 1. You were the only one who claimed this.
> 2. Quite logical for cards that run at much higher Clocks.
> 3. !


1. It's based on EVV reports I saw. If people in this thread ran the EVV program, they would show you.

And yes, I'm the only one, because people in this thread don't usually do any mods.
2. Gupsterg and I proved that the higher the stock clocks, the lower the EVV voltage. Don't ask me why. EVV for these cards has been modified by AMD so they clock better... proving 1.
3. The 290X and 390X both use the same cores.
If I picked 10 290Xs and 10 390Xs and tried to OC them while providing an extra 100mV for the 390Xs, I believe the 390Xs would win. It's logical... but then again, silicon lottery.

PS. People, start using the EVV program. Post results.


----------



## Charcharo

Can someone do a Metro LL Redux Benchmark run for me?

Settings:
1440x900
SSAA On
PhysX On
Very High settings
Very High Tessellation
Vsync Off (duh)
AF 16X
Motion Blur Normal

Just want to compare (will post my score a bit later).


----------



## Levys

Hey,

Can someone provide a link to this EVV program and maybe some basic info on how to use it?
I would like to check this out. Thnx


----------



## mus1mus

Here.

Undo any OC you have. Reboot to a clean GPU state (no OC, pure stock). Then run the app.


----------



## Levys

I'll check it out and post some info later. thnx


----------



## SystemTech

Thanks. Downloaded it, just to run it when i get home later


----------



## Levys

Here is my EVV result.


----------



## Levys

Quote:


> Originally Posted by *mandrix*
> 
> So what is it exactly this app is supposed to be telling me about my card?
> When I monitor the voltage, I'm not sure how this applies to anything I see. Thanks!


Am I getting this wrong, or is the above card using 0,03*** more voltage than mine at a stock speed only 10MHz higher?

And is this partly the result of the ASIC score differential,
or am I just talking nonsense now?


----------



## mus1mus

It's more about chip quality than anything I can explain to you.


----------



## fat4l

Quote:


> Originally Posted by *Levys*
> 
> Am I getting this wrong or is the above card using 0,03*** voltage more than mine at a stock speed of only 10Mhz higher?
> 
> 
> 
> and is this partly the result of the asic score differential.
> or am I just talking nonsence now


It depends on ASIC quality + default MHz + other stuff we do not know but that certainly affects it.
His ASIC is 77.6%.
Your ASIC is 80.0%.

The general rule is: the higher the ASIC, the lower the volts.

As you can see, with your 80% ASIC (390X) your card is using 1.24375V. My core1 (290X) has an ASIC of 79.5% and is using 1.19375V. That's a difference of 50mV while the ASIC quality is almost the same (390X vs 290X voltage "strategy").
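For what it's worth, the arithmetic behind that comparison is just a subtraction; here's a quick sketch using only the voltages quoted above (the "higher ASIC, lower volts" rule is a tendency observed in these readings, not an exact formula):

```python
# Stock VDDC values quoted in the posts above, in volts.
# Note: the "higher ASIC -> lower volts" rule is only a rough tendency.
cards = {
    "390X (80.0% ASIC)": 1.24375,
    "290X (79.5% ASIC)": 1.19375,
}

delta_mv = round((cards["390X (80.0% ASIC)"] - cards["290X (79.5% ASIC)"]) * 1000)
print(f"Difference: {delta_mv} mV")  # 50 mV despite near-identical ASIC quality
```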


----------



## Mister300

Try again with PhysX off (it's useless with AMD hardware) and SSAA off; these are FPS killers. I get 84 FPS at 1080p with everything maxed out except SSAA and PhysX, which are off. XFX 390X no OC, 5820K @ 4.4, 16GB HyperX 2667 RAM.


----------



## Charcharo

Quote:


> Originally Posted by *Mister300*
> 
> Try again with PhysX off (it's useless with AMD hardware) and SSAA off; these are FPS killers. I get 84 FPS at 1080p with everything maxed out except SSAA and PhysX, which are off. XFX 390X no OC, 5820K @ 4.4, 16GB HyperX 2667 RAM.


I don't know, Metro Redux uses a CPU-based version of PhysX, so it runs just fine on my i5.

I get 72 FPS average at 1440x900 with SSAA on and PhysX on.


----------



## lightsout

Delete...


----------



## Wright0Concern

Could anyone here maybe give me some advice on whether EK (or any company, for that matter) is likely to release a full block for the Asus Strix R9 390 DCIIIOC(3)?
I realise that this would be pure speculation, but I'm trying to decide if it's worth holding onto the card to water cool it in the future, or if I should just try to flog it and buy a card that is already supported with a full block.


----------



## mus1mus

Quote:


> Originally Posted by *Wright0Concern*
> 
> Could anyone here maybe give me some advice on whether EK (or any company, for that matter) is likely to release a full block for the Asus Strix R9 390 DCIIIOC(3)?
> I realise that this would be pure speculation, but I'm trying to decide if it's worth holding onto the card to water cool it in the future, or if I should just try to flog it and buy a card that is already supported with a full block.


Visit EK's page.

http://www.overclock.net/t/993624/ek-club/


----------



## TsukikoChan

Finally got around to basic Fire Striking my Sapphire 390X; did a run at my stock 390X speed and got the following score:

(9004 - G score 12441, P score 8621, combined score 2997)

So, I ran the OC that I normally use in games atm (which I'm sure I can push further) of 1131MHz on the core, 1593MHz memory, +50% power limit and +56mV:

(9337 - G score 13177, P score 8656, combined score 3044)

Here's the other detail on my comp as it appears on firestrike (from my stock run):


Edit: forgot to mention, during these tests my VRM temp never went above 65-70°C, the core never went above 60-65°C, and the CPU never went above ~30-35°C. I forgot to take screenshots of the temps. I only heard the GPU fan once in both tests, near the end.

I know I can push the CPU, GPU and memory a bit further (and I will), so any suggestions or thoughts on this will be greatly appreciated!
Is this a nice score for a low/mild OC on a Sapphire 390X?
Should I concentrate more on my CPU and memory now instead? (I want to wait for Zen benchmarks to appear before I make another CPU upgrade.)

Also, can i get added to the club please? :3
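For anyone curious, the relative gains between those two runs work out like this; just a quick sketch using the scores quoted above:

```python
# Fire Strike sub-scores from the stock and OC runs quoted above.
stock = {"overall": 9004, "graphics": 12441, "physics": 8621, "combined": 2997}
oc    = {"overall": 9337, "graphics": 13177, "physics": 8656, "combined": 3044}

# Percentage gain of the OC run over the stock run, per sub-score.
for name in stock:
    gain = (oc[name] - stock[name]) / stock[name] * 100
    print(f"{name:>8}: +{gain:.1f}%")
```

The graphics score moves the most (close to +6%), which is what you'd expect from a core/memory overclock; physics barely moves since it's CPU-bound.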


----------



## GaToMaLaCo

@TsukikoChan

I also have a Sapphire 390X and almost the same score, which is kind of... a little low considering the other brands' scores. Maybe the FX 8350 is bottlenecking? Perhaps. Tonight I will OC the 390X to 1100MHz and test again.


----------



## mus1mus

Quote:


> Originally Posted by *GaToMaLaCo*
> 
> @TsukikoChan
> 
> I also have a Sapphire 390X and also have almost the same score. Which is kind of... little low considering the other brands scores. Maybe the FX8350 is bottlenecking?... perhaps. tonight will oc the 390x to 1100mhz and test again.


Just look at the Graphics score if you're on an AMD CPU. It's purely GPU. You lose a bit due to it being PCIe 2.0, but not in the hundreds.

Physics is purely CPU. (It doesn't flex the FX's power anyway.)

Combined uses half of the FX's thread capability. (Roughly 60% utilization of 4 (83XX) threads, too.) That results in low GPU utilization if you have a high-end GPU.

It's more coding-specific than bottlenecking.


----------



## TsukikoChan

Quote:


> Originally Posted by *GaToMaLaCo*
> 
> @TsukikoChan
> 
> I also have a Sapphire 390X and also have almost the same score. Which is kind of... little low considering the other brands scores. Maybe the FX8350 is bottlenecking?... perhaps. tonight will oc the 390x to 1100mhz and test again.


yea, looks like we have almost the same hardware :O kinda sucky about the fx8350 in these score huh...

Quote:


> Originally Posted by *mus1mus*
> 
> Just look at the Graphics score if you're on an AMD CPU. It's purely GPU. You lose a bit due to it being PCIe 2.0, but not in the hundreds.
> Physics is purely CPU. (It doesn't flex the FX's power anyway.)
> Combined uses half of the FX's thread capability. (Roughly 60% utilization of 4 (83XX) threads, too.) That results in low GPU utilization if you have a high-end GPU.
> It's more coding-specific than bottlenecking.


Aah, so a lil loss from PCIe 2.0 but otherwise a decent graphics score.
Ty for explaining this for me XD I really don't want to upgrade my CPU just yet, until more word about Zen comes out (then I'll get a Zen or shift to Intel for the CPU).
Do we have a better benchmarking tool for purely GPU checking that doesn't care about FX/Intel chips?


----------



## mus1mus

You will feel better about how the FX scores in 3DMark 11 than in FS.

Also try the Ultra and Extreme FS.


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> You will feel better about how the FX scores in 3DMark 11 than in FS.
> 
> Also try the Ultra and Extreme FS.


I may have to try these when I get home 
Though, I'm using the free version; is there a free version of 11/Ultra/Extreme?


----------



## mus1mus

The Performance test (IIRC 720p) is free.

Ultra and Extreme FS unlock with the $20 key, IIRC. I got it cheaper on Steam due to forex rates, though.


----------



## SystemTech

I think it's your CPU that's hurting your score, as @mus1mus mentioned with the core usage.
Here is my run: http://www.3dmark.com/fs/6553339

Graphics Score
13213

Physics Score
12294

Combined Score
5031

Quite a big jump over yours, by almost 2k in the combined test. Look at your CPU score vs mine.








Our GPU scores are close enough, considering I ran mine at a 50MHz higher OC than yours.


----------



## TsukikoChan

Quote:


> Originally Posted by *SystemTech*
> 
> I think its your CPU thats hurting your score as @mus1mus mentioned with the core usage.
> Here is my run ; http://www.3dmark.com/fs/6553339
> 
> Graphics Score
> 13213
> 
> Physics Score
> 12294
> 
> Combined Score
> 5031
> 
> Quite a big jump on yours by almost 2k in the combined test. Look at your CPU score vs mine.
> 
> 
> 
> 
> 
> 
> 
> 
> Our GPU scores are close enough considering i ran mine on a 50mhz oc more than yours.


:< (urge to upgrade cpu asap rising)


----------



## Charcharo

My only urge to upgrade my CPU is the fear that it may actually, at times, bottleneck my 390.

That, and maybe for the peace of mind that I'll be set using the same PC (with at most switching the GPU in time) for at least 6-7 years.


----------



## mus1mus

lol.
The bench is a poor representation of the gaming capabilities of your rig.

It's Intel-biased. (No offense if you have an Intel.)

There are users who can chime in here that the AMD FX doesn't bottleneck their games. (We are now using more than 4 threads in gaming.)

An i5 would be a downgrade, and an i7 quad a side grade.

If you have the urge, and the cash, wait a bit. Next year will be more exciting.

Intel is rumored to release a 10C/20T Broadwell-E. Along with it, the 8-core will be cheaper, and so will the 6C variants. DDR4 will be more mature. And so on.

AMD might also come up big with Zen.

And games will follow suit, leaving the "you don't need more than 4 threads in gaming" stigma behind.

But if your urge denies your logic, go pick up a 5820K. It shouldn't be much more expensive than Skylake.


----------



## SystemTech

Quote:


> Originally Posted by *mus1mus*
> 
> lol.
> The bench is a poor representation of the gaming capabilities of your rig.
> 
> It's Intel-biased. (no offense if you have an Intel)
> 
> There are users that can chime in here that AMD FX doesn't bottleneck their games. (we are now using more than 4 threads in gaming)
> 
> An i5 will be a downgrade. And i7 quad, a side grade.
> 
> If you have the urge, and the cash, wait a bit. Next year will be more exciting.
> 
> Intel is rumored to release a 10C/20T Broadwell-E. Along with it, the 8Core will be cheaper, and so are the 6C variants. DDR4 will be more mature. And so on.
> 
> AMD might also come up big with Zen.
> 
> And games will follow suit, leaving the "you don't need more than 4 threads in gaming" stigma behind.
> 
> But if your urge denies your logic, go pick a 5820K. Shouldn't be much more expensive than Skylake.


I agree on all counts there. Wait for Zen, but in the meantime your FX will hold up great.

And to add, those games that use fewer than 4 cores will still be fine on the FX.


----------



## mus1mus

In the meantime, OC'ing your FX will be a good option.

Buy an AIO and OC it to at least 4.7 and notice your games run smoother.

Btw, not the same card but







http://www.3dmark.com/compare/3dm11/10544667/3dm11/10103639


----------



## mandrix

Some Fire Strike scores from my PCS 390x.

Stock 1060/1500


1200/1670 1.289 core - 1.047 aux (idle voltages)


----------



## TsukikoChan

Ty guys, I was kidding about the urge rising  I will wait on the Zen release and make my transition then. As you say, I'm not hitting any problems with my FX 8350 in games atm (I was getting a solid 55-60 in The Evil Within, 60-80+ in Mordor on ultra, and most of my normal games never drop below 60), so the score isn't weighing that heavily on me (I game at 1080p on a 120Hz monitor atm).

My FX 8350 is sitting at 4.5 atm and the max the temp ever gets to is 30 (the Noctua NH14 is a daaaaaaaamn fine air cooler!!), so I may try to push it up to 4.7+ and let the temps rise a bit to 40 (I mean, it's winter; at best it means I'm a lil more cozy in my art-room/study :3 ). I'm still refining my OC on the 390X atm. Loving this card btw!! <3
Any thoughts on RAM, though? Does it play a big part in these scores or games, really?


----------



## Charcharo

I guess I will probably wait too. Unless I find a cheap 4790K (as in actually cheap), I am better off getting a 1440p monitor and a good SSD now.

BTW:





Yey. Now how do I install this?

Do I have to uninstall CCC?
*Noobish question, I know; this is my first time on a reasonably high-end machine where I was actually hyped about a driver + software update.


----------



## battleaxe

Testing out the new Crimson Drivers: http://support.amd.com/en-us?webSyncID=3e759e45-e9f8-4a20-9e58-8f98d46d37b4&sessionGUID=88e90022-786c-4366-8af9-4c85295b3013


----------



## kizwan

Quote:


> Originally Posted by *TsukikoChan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Finally got around to basic firestriking my Sapphire 390x, did a run on my stock 390x speed and got the following score:
> 
> (9004 - G score 12441, P score 8621, combined score 2997)
> 
> 
> 
> So, ran my OC that i normally use in games atm (which I'm sure i can push further) of 1131mhz on core, 1593mhz memory, +50% limit and +56mV:
> 
> (9337 - G score 13177, P score 8656, combined score 3044)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Here's the other detail on my comp as it appears on firestrike (from my stock run):
> 
> 
> Edit: forgot to mention, during these tests my VRM temp never went above 65-70oC and core never went above 60-65oC and CPU never went above ~30-35oC. I forgot to take screenshots of temps. I only heard the gpu fan once in both tests near the end.
> 
> I know I can push the cpu, gpu and memory a bit further (and i will), so any suggestions or thoughts on this will be greatly appreciated!
> Is this is a nice score for a low/mild OC on sapphire 390x?
> Should i concentrate on more my cpu and memory now instead? (i want to wait for Zen benchmarks to appear before i make another cpu upgrade)
> 
> Also, can i get added to the club please? :3


I prefer to look at the FPS for Graphics Tests 1 & 2. Yours, in my opinion, is spot on. If you compare with an Intel result, you'll see the difference is likely anywhere from less than 2 FPS to less than 3 FPS.


----------



## Vexile

Guys, which tests should I run to check if my R9 390 overclock is stable?


----------



## TsukikoChan

Quote:


> Originally Posted by *Charcharo*
> 
> I guess I will probably wait too. Less I find a cheap 4790k (as in actually cheap) I am better off getting a 1440p monitor and a good SSD now.
> 
> BTW:
> 
> 
> 
> 
> 
> Yey. Now how do I install this?
> 
> Do I have to uninstall CCC?
> *Noobish question, I know; this is my first time on a reasonably high-end machine where I was actually hyped about a driver + software update.


Probably yes, it will need an uninstall, since this is a complete replacement for it 
The question is whether we should do that now, since it's literally just out and might have teething issues for the first few days.
Quote:


> Originally Posted by *battleaxe*
> 
> Testing out the new Crimson Drivers: http://support.amd.com/en-us?webSyncID=3e759e45-e9f8-4a20-9e58-8f98d46d37b4&sessionGUID=88e90022-786c-4366-8af9-4c85295b3013


Gonna get this when I get home; it sounds like they are doing some performance enhancements too.
I might retry my Fire Strike run afterwards and see what difference Crimson makes (if any).


----------



## TsukikoChan

So, here's the new features of crimson (from the link BA put above):
Radeon Settings
New Install UI
Liquid VR
Asynchronous Shaders
Shader Cache
Optimized Flip Queue Size
Freesync™ Enhancements
Custom Resolution Support
Frame Pacing Enhancements
Frame Rate Target Control Enhancements
Updated Video Feature support for 'Carrizo' products
Power Optimization
Directional Scaling
Dynamic Contrast Update
DisplayPort to HDMI 2.0 support

hmm, interesting :-D


----------



## PhillyB

The new drivers are supposedly faster... will give this a shot tonight after I get my 390X installed.

wccftech Analysis


----------



## Charcharo

I installed it and I think it automatically deleted CCC.

Can't find it now.


----------



## kizwan

Quote:


> Originally Posted by *Vexile*
> 
> Guys, which tests should i run to check if my r9 390 overclock is stable?


My personal choice is GTA V if I want it gaming stable.


----------



## Dundundata

So how are these new drivers working for everyone? I will probably give them a try later on; seems like some decent gains in performance.


----------



## battleaxe

Quote:


> Originally Posted by *Dundundata*
> 
> So how are these new drivers working for people. I will probably give them a try later on, seems like some decent gains in performance.


They seem very similar to the 15.11 beta on most tests. Haven't done a ton of in-game testing yet; the interface looks a lot better now too, though. So far, so good.


----------



## tangelo

Looking promising...

http://www.techpowerup.com/reviews/AMD/Radeon_Crimson_Edition_Drivers/


----------



## fat4l

New driver (Crimson): new, improved results.

1200/1700MHz (1500-strap timings)










Spoiler: Warning: 15.11 DRIVERS



*P*


*X*


*U*






Spoiler: Warning: CRIMSON DRIVERS



*P*


*X*


*U*




edit:// reuploaded images.
edit2://added comparison


----------



## SystemTech

Mmm, not bad for a driver update:
http://www.3dmark.com/compare/fs/6579699/fs/6553339
Crimson on the left, 15.11 on the right.


----------



## Slowpoke66

Quote:


> Originally Posted by *SystemTech*
> 
> mmm not bad for a driver update :
> http://www.3dmark.com/compare/fs/6579699/fs/6553339
> Crimson on the left, 15.11 on the right.


Not the same clock speeds; did you link the wrong comparison?

Here's my results with crimson on the left and *15.7.1* on the right: http://www.3dmark.com/compare/fs/6579991/fs/6414821

Not much of an improvement, in my case...


----------



## tangelo

Here is my comparison with same clocks

EDIT: Found some discrepancy between the two runs so I deleted them and need to redo the benchmarks later.


----------



## Gumbi

Quote:


> Originally Posted by *tangelo*
> 
> Here is my comparison with same clocks
> 
> http://www.3dmark.com/compare/fs/6579764/fs/6239693


Sizeable gains!


----------



## Slowpoke66

Quote:


> Originally Posted by *tangelo*
> 
> Here is my comparison with same clocks
> 
> http://www.3dmark.com/compare/fs/6579764/fs/6239693


Ooh! Soo close to 15K GS! Do one more run!!









Nice improvement! ***??


----------



## Gumbi

That's higher than my best score on my 290X, which is 1476 (clocked at 1241/1641MHz).


----------



## Rob27shred

Quote:


> Originally Posted by *lightsout*
> 
> Actually I wasn't trying to condemn XFX at all. I just wanted an actual owner to chime in besides the one that said his hit 90.
> 
> Also I have quite a bit of negative about XFX with the previous cards so I'm skeptical when a company claims they've fixed the issue.
> 
> Can any msi owners comment on their vrm load temps.


I think you are referring to me as the person who stated their main VRM on an XFX DD 390 hit 90°C. I do have to say that I have only seen temps that high on my XFX DD 390 when OCing with lots of extra voltage and in a CrossFire setup. So a single non-OCed (or lightly OCed) XFX DD 390 will never see VRM temps that high unless you have a defective one. XFX actually did a good job with their DD cooler for the 390/X and have brought temps down considerably vs the 290/Xs. I also just got word back from XFX support: the VRMs on the 390/Xs are rated to 120°C, so even at 90 it would not be in danger of hurting your card. The XFX and MSI aftermarket 390/Xs are both quality cards with minimal reported problems, so I doubt you'd have a problem with either.


----------



## lightsout

Quote:


> Originally Posted by *Rob27shred*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Actually I wasn't trying to condemn XFX at all. I just wanted an actual owner to chime in besides the one that said his hit 90.
> 
> Also I have quite a bit of negative about XFX with the previous cards so I'm skeptical when a company claims they've fixed the issue.
> 
> Can any msi owners comment on their vrm load temps.
> 
> 
> 
> I think you are referring to me as the person who stated their main VRM on an XFX DD 390 hit 90°C. I do have to say that I have only seen temps that high on my XFX DD 390 when OCing with lots of extra voltage and in a CrossFire setup. So a single non-OCed (or lightly OCed) XFX DD 390 will never see VRM temps that high unless you have a defective one. XFX actually did a good job with their DD cooler for the 390/X and have brought temps down considerably vs the 290/Xs. I also just got word back from XFX support: the VRMs on the 390/Xs are rated to 120°C, so even at 90 it would not be in danger of hurting your card. The XFX and MSI aftermarket 390/Xs are both quality cards with minimal reported problems, so I doubt you'd have a problem with either.
Click to expand...

Good to know thank you.


----------



## mus1mus

Quote:


> Originally Posted by *tangelo*
> 
> Here is my comparison with same clocks
> 
> http://www.3dmark.com/compare/fs/6579764/fs/6239693


Interesting. I might finally be able to break 16K GS.


----------



## Levys

Well, here is my comparison; it looks somewhat like Slowpoke66's results (I mean the small gains):

http://www.3dmark.com/compare/fs/6581637/fs/6355287


----------



## Levys

Wow, guess everyone is trying out the new Crimsons, eh?

http://www.3dmark.com/proxycon/error/exception


----------



## Rob27shred

Quote:


> Originally Posted by *lightsout*
> 
> Good to know thank you.


No problem, hope you join the club as these cards are pretty nice!


----------



## Darkeylel

Sigh... when the 15.11 beta drivers have fewer bugs than the new full-release drivers, you know there is a problem......


----------



## TsukikoChan

Upped my CPU OC to 4.6GHz and reran Fire Strike last night with the new Crimson drivers; got a nice wee increase.
Old was 9337 (13177-8656-3044)
New is 9488 (13377-8849-3089) ( http://www.3dmark.com/3dm/9413841 )
So, +200 on graphics, +193 on physics and +45 on combined.

Not a bad wee increase from purely a 100MHz bump on the CPU and Crimson 

The only bad thing I can find about Crimson so far is that it didn't detect all my installed games, and it reset my colour calibration on my main monitor (just after I finally got it right for working on art =_= ).

Also, I can see what you guys mean about combined not being AMD-friendly... when Fire Strike was running the combined test I watched the usage on my CPU and GPU. The CPU didn't go above 45-50% usage and the GPU only went up to 700MHz on the core, so combined didn't even push my hardware at all. Weird...


----------



## Charcharo

The good things so far:

- It seems like it removed CCC by itself during installation.
- I REALLY like this frame target thing (I never used it before). Quieter during WoT play sessions, less power used, and it runs at 57-61 fps with the target locked at 60 (which is kind of odd, but the experience is fine to me).
It also seems to have aided my Far Cry 4 gaming!
- The caching thing sounds great, but I don't know how to test whether it works. It supposedly helps CPU load, loading times, minimum frame rates and stutter. Will see how it is in FC4 and WoT later, but so far I have not noticed it.








- The software is much faster than CCC. It seems faster and better done than Nvidia's equivalent (I sometimes check on my brother's GTX 760 PC) too. I actually like how it looks and how easy it is to navigate.

What I dont like (the bad):

- It crashed a few times. Nothing too serious; it just stopped working for an unknown reason and I restarted it, after which it worked as fast and without any problems. But must it always be on for my custom game profiles to work?
- It does not detect all my games















- Some menus are still based on CCC. Will be fixed I guess.

- I am a noob and did not learn how to fiddle with CCC in time, so now I suck (a bit less) with the new thing


----------



## mus1mus

Quote:


> Originally Posted by *TsukikoChan*
> 
> upped my cpu OC to 4.6ghz and reran firestrike last night with new crimson drivers, got a nice wee increase.
> old was 9337 (13177-8656-3044)
> new is 9488 (13377-8849-3089) ( http://www.3dmark.com/3dm/9413841 )
> so, +200 on graphics, +193 on physics and +45 on combined.
> 
> not a bad wee increase from purely a 100mhz increase in cpu/core and Crimson
> 
> only bad thing i can find about crimson so far is that it didn't detect all my installed games and it reset my colour calibration on my main monitor (just after i finally got it right for working on art =_= ).
> 
> also, i can see what you guys mean about combined not being amd friendly.. when firestrike was running the combined test i watched the usage on my cpu and gpu.. cpu didn't go above 45-50% usage and gpu only went up to 700mhz on core.. so, combined didn't even push my hardware at all.. weird....


Weird? Imagine this.

http://www.3dmark.com/compare/fs/5196034/fs/4536219/fs/6324704/fs/5607835

Nuff said.


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> Weird? Imagine this.
> 
> http://www.3dmark.com/compare/fs/5196034/fs/4536219/fs/6324704/fs/5607835
> 
> Nuff said.


You say 'nuff said, but this probably does need something said, as it's 4 results from different hardware (or different OC settings on the same hardware), and I don't even know if any of these are on the new Crimson drivers or not :< There's little to no commonality between the results other than that it was your machine, y'know? Haha :-D


----------



## mus1mus

What it shows is that even a higher-tier card, the 980 Ti, didn't improve the Combined score, since it's on an AMD.

The bench sucks for AMD. Not that AMD sucks on the bench.

Got it?


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> What it shows is that even a higher-tier card, the 980 Ti, didn't improve the Combined score, since it's on an AMD.
> 
> The bench sucks for AMD. Not that AMD sucks on the bench.
> 
> Got it?


yarp XD


----------



## Agent Smith1984

Hey guys (and gals....)

I do my best to make sure everyone who has submitted proper proof gets onto the owners' list, but my emails for new posts don't always come through like they are supposed to.

If anyone has submitted proof and is not on the list but wants to be, please let me know.

I have been moving for two weeks now, and have had very little time to scroll the thread.

Again, my apologies on this. Should have some more time to update and manage in the coming weeks!

Glad to see this thing has grown so much


----------



## TsukikoChan

I need help with 2 things guys if you can help me:
What is the button on the sapphire 390x card for? i read it's for uefi mode, so is it not uefi by default? should i turn it on (i think i'm using an uefi bios).

Does anyone else get random pc freezes on the 390x cards still? ever since i moved to 390x, randomly every other day my pc freezes up when loading/playing a game, sometimes accompanied by a visual and aural blip across all screens. No response, screen is showing last image, it's like a complete freeze. Only thing i can do is a hard shutdown. This has occurred on both Crimson and CCC drivers. Is there a problem with my card do you think or is it still the broken 390/x compatibility i heard about?


----------



## TsukikoChan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hey guys (and gals....)
> 
> I do my best to make sure everyone who has submitted proper proof gets onto the owners' list, but my emails for new posts don't always come through like they are supposed to.
> 
> If anyone has submitted proof and is not on the list but wants to be, please let me know.
> 
> I have been moving for two weeks now, and have had very little time to scroll the thread.
> 
> Again, my apologies on this. Should have some more time to update and manage in the coming weeks!
> 
> Glad to see this thing has grown so much


That's ok dood XD can you add me to the list please?
http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/4300


----------



## diggiddi

Quote:


> Originally Posted by *TsukikoChan*
> 
> I need help with 2 things guys if you can help me:
> What is the button on the sapphire 390x card for? i read it's for uefi mode, so is it not uefi by default? should i turn it on (i think i'm using an uefi bios).
> 
> Does anyone else get random pc freezes on the 390x cards still? ever since i moved to 390x, randomly every other day my pc freezes up when loading/playing a game, sometimes accompanied by a visual and aural blip across all screens. No response, screen is showing last image, it's like a complete freeze. Only thing i can do is a hard shutdown. This has occurred on both Crimson and CCC drivers. Is there a problem with my card do you think or is it still the broken 390/x compatibility i heard about?


Are you running Afterburner?


----------



## TsukikoChan

Quote:


> Originally Posted by *diggiddi*
> 
> Are you running Afterburner?


I'm unsure, I think I did at one point. I typically have Sapphire TriXX open with logging on another monitor. I'll double-check Afterburner when I get home. I do have RadeonPro on it though (FF Type-0 sort of requires it to fix screen tearing).


----------



## Dundundata

FYI, GOG is having a sale on Witcher 3, half off, and you can get every Witcher game plus the W3 DLC pass for just over $60


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> What it shows, even a higher tier card, the 980ti didn't improve Combined score since it's on an AMD.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bench sucks for AMD. Not that AMD sucks on the bench.
> 
> Got it?


Absolutely the truth....

The bench sucks!

Funny how CrossFire does NOT improve the combined score on AMD systems AT ALL, yet it goes through the roof on Intel rigs...

They'd better get it right on the next 3DMark. I flat out only use it to look at the graphics score.

A better complete test (and honestly more enjoyable to look at) is 3DMark 11, in my opinion. It seems to at least give the AMD chips a chance on both sides.

It's just ridiculous that you see graphics scores matching overclocked i5's (the FX's respective competition), the physics score matching overclocked i5's, and then you get to the combined score and it's only 50-70% in some cases.
Strangely enough, the Phenoms do great in the combined test in Fire Strike, as my X6 @ 4GHz was dead even with i5's in the 4.6GHz range... mind you, with six cores, but I digress...

Question, how high you boosting on that Ti?

My little brother got his EVGA Classified up to 1552 boost clock in games.... got it for $480 on craigslist... pretty jelly, even with my Fury


----------



## KNG HOLDY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hey guys (and gals
> 
> 
> 
> 
> 
> 
> 
> ....)
> 
> I do my best to make sure I am getting everyone who has submitted proper proof, to the owner's list, but my emails for new posts don't always come through like they are supposed to.
> 
> If anyone has issued proof, and is not on the list and wants to be, please let me know.
> 
> I have been moving for two weeks now, and have had very little time to scroll the thread.
> 
> Again, my apologies on this. Should have some more time to update and manage in the coming weeks!
> 
> Glad to see this thing has grown so much


Yes, I would like to be added









http://www2.pic-upload.de/img/28953076/prf.png

(core clock 1200, 1700 memory clock, +75 core voltage, +50 power limit, +50 aux voltage)(msi r9 390)

btw, I want to swap from soft tubing to acrylic. Is it hard to bend it right?


----------



## fat4l

Quote:


> Originally Posted by *KNG HOLDY*
> 
> y would like to be added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www2.pic-upload.de/img/28953076/prf.png
> 
> (core clock 1200, 1700 memory clock, +75 core voltage, +50 power limit, +50 aux voltage)(msi r9 390)
> 
> btw i want to swap from softtubing to acryl is it hard to bend it right?


Mind showing an EVV report?









Link: The Stilt's VID APP


----------



## Agent Smith1984

Quote:


> Originally Posted by *KNG HOLDY*
> 
> y would like to be added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www2.pic-upload.de/img/28953076/prf.png
> 
> (core clock 1200, 1700 memory clock, +75 core voltage, +50 power limit, +50 aux voltage)(msi r9 390)
> 
> btw i want to swap from softtubing to acryl is it hard to bend it right?


Nice card you got there, my MSI 390 did the same clocks but needed +100mV









You are added


----------



## Gumbi

If that's stable, that's a very impressive core clock for the voltage.


----------



## fat4l

Quote:


> Originally Posted by *Gumbi*
> 
> If that's stable, that's a very impressive core clock for the voltage.


No offense, but we can't really say that, as +XXmV doesn't tell us anything when we DO NOT KNOW the DPM7 voltage. That's why I asked for an EVV report


----------



## FooSkiii

Hey guys, I'm looking to water cool my XFX 390 and I need help finding the right water block.

I have been looking at the ekwb water blocks -

https://shop.ekwb.com/ek-fc-r9-290x-nickel-rev-2-0

https://shop.ekwb.com/ek-fc-r9-290x-se-acetal-nickel

and this one

http://www.performance-pcs.com/aquacomputer-kryographics-hawaii-for-radeon-r9-290x-and-290-acrylic-glass-nickel-plated.html

if anybody has any info that can help me out would be much appreciated.

GPU info -
XFX r9 390
Model # R9-390A-8DFR


----------



## sportsczy

This new Crimson driver is absolute garbage. All they did was put lipstick on a pig.


----------



## tangelo

Quote:


> Originally Posted by *sportsczy*
> 
> This new Crimson driver is absolute garbage. All they did was put lipstick on a pig.


Some ppl say the driver wasn't even updated. Just the software


----------



## Levys

Quote:


> Originally Posted by *FooSkiii*
> 
> Hey guys I'm am looking to water cool my xfx 390 and I need help looking for the right water block.
> 
> I have been looking at the ekwb water blocks -
> 
> https://shop.ekwb.com/ek-fc-r9-290x-nickel-rev-2-0
> 
> https://shop.ekwb.com/ek-fc-r9-290x-se-acetal-nickel
> 
> and this one
> 
> http://www.performance-pcs.com/aquacomputer-kryographics-hawaii-for-radeon-r9-290x-and-290-acrylic-glass-nickel-plated.html
> 
> if anybody has any info that can help me out would be much appreciated.
> 
> GPU info -
> XFX r9 390
> Model # R9-390A-8DFR


It will fit. I use (and recommend) the Kryographics R9 290/X block + active backplate on my own watercooled XFX R9 390X DD

https://www.caseking.de/en/aqua-computer-kryographics-fuer-r9-290-290x-vernickelt-black-edition-wach-378.html

https://www.caseking.de/en/aqua-computer-backplate-fuer-kryographics-r9-290x-290-aktiv-xcs-wach-373.html

I would buy this again in a heartbeat. Earlier in the thread I mentioned using Arctic 1.5mm thermal pads
between the backplate and PCB to get even better thermal dissipation; this tip can be used with EK backplates too


----------



## Gumbi

Quote:


> Originally Posted by *fat4l*
> 
> No offense but we can't rly say this as +XXmV doesn't say anything as we DO NOT KNOW dpm7 voltage, thus I asked for EVV report


Quote:


> Originally Posted by *tangelo*
> 
> Some ppl say the driver wasn't even updated. Just the software


How would you explain people getting performance gains across the board then?


----------



## Dundundata

The new driver works fine for me, as did the old one. Haven't really looked at the software much yet.


----------



## mandrix

No problem with new driver...better Fire Strike scores anyway.


----------



## Darkeylel

In Battlefront I get black spots on one of the Hoth maps, which is the same problem I was getting on the 15.11.1 driver. I think I will just stay with the 15.11 beta drivers for now until they fix the problem. Also, yes, I know it can be fixed by turning down shadows and ambient occlusion, but if I can run it all at ultra, I want to run it all at ultra haha


----------



## Vexile

For how long should i run Unigine Heaven to see if my 390 is stable? Is 3 loops enough?


----------



## SystemTech

OK, what does this mean in VVM:

Hawaii = 1.275v/1080mhz(DPM7) / 0x30E(Lkg)

ASIC quality of 76.4%... not the greatest, but hey. It's still an MSI 390X









Will check my brother's ASUS R9 390 later or tomorrow.


----------



## zorvalth

What do you think of my latest creation(the backplate)


----------



## SystemTech

VERY NICE, CONGRATZ... But I do have to say, DAMN that card's long


----------



## fat4l

Quote:


> Originally Posted by *SystemTech*
> 
> Ok what does this mean in VVM :
> 
> Hawaii = 1.275v/1080mhz(DPM7) / 0x30E(Lkg)
> 
> ASIC of 76.4%..not the greatest but hey. Its still a MSI 390X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Will check my brothers ASUS 9 390 later or tommoz.


This is your default voltage used in 3D (the DPM7 state).
It is set at 1.275V, but due to vdroop you will see a max of maybe 1.25V under load.
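To illustrate where that gap comes from, here's a minimal loadline sketch. The load current and loadline resistance below are assumed ballpark figures for illustration, not values measured on any card:

```python
# Simple loadline model: V_load = V_set - I_load * R_loadline.
# I_load and R_loadline are assumed illustrative values, not measurements.
v_set = 1.275        # DPM7 set voltage (V), as reported by the EVV tool
i_load = 235         # assumed load current (A) for a ~300W Hawaii core
r_loadline = 0.0001  # assumed effective loadline resistance (ohms)

v_load = v_set - i_load * r_loadline
print(f"approx. voltage under load: {v_load:.2f} V")  # ~1.25 V
```

So the ~25mV drop from set voltage to observed load voltage falls out of Ohm's law once a couple hundred amps flow through the power delivery.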


----------



## tangelo

On my MSI I was able to get 1700 for mem with zero extra volts and +50 power limit when I lowered my core clock a bit.


----------



## mus1mus

That's the reason why I prefer moving the memory slider first then the core.


----------



## battleaxe

Quote:


> Originally Posted by *zorvalth*
> 
> What do you think of my latest creation(the backplate)


Looks awesome. Did you machine that?


----------



## Gumbi

That's what's confusing me







It looks sick and yet the post implied that it was machined by him/her.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> That's what's confusing me
> 
> 
> 
> 
> 
> 
> 
> 
> It looks sick and yet the post implied that it was machined by him/her.


For someone with access to a CNC machine, that would be pretty easy. It's all in writing the program. I used to do this, but unfortunately I don't anymore, nor do I have access to a machine. Otherwise I would always be making cool stuff for my PC too.


----------



## zorvalth

Quote:


> Originally Posted by *battleaxe*
> 
> Looks awesome. Did you machine that?


Yep, laser cut, powder coated, laser engraved









The same as ones I made for MSI 970 Gaming 4G.


----------



## battleaxe

Quote:


> Originally Posted by *zorvalth*
> 
> Yep, laser cut, powder coated, laser engraved
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The same as ones I made for MSI 970 Gaming 4G.


Looks good. I wonder how many you could fit on a sheet, and whether there would be a market worth selling them in? Probably not, as most cards come with one already. Nice job!


----------



## TsukikoChan

Quote:


> Originally Posted by *zorvalth*
> 
> Yep, laser cut, powder coated, laser engraved
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The same as ones I made for MSI 970 Gaming 4G.


You could make a small business off this on the boards here if people provide designs to go on it


----------



## EternalRest

Any thoughts on Asus 390 strix? I think I'm going to get it tonight.


----------



## Gumbi

The Tri-X has far better cooling; same goes for the XFX/PowerColor.


----------



## GorillaSceptre

Quote:


> Originally Posted by *EternalRest*
> 
> Any thoughts on Asus 390 strix? I think I'm going to get it tonight.


Depends on your ambient temps. If you live somewhere with low temps, then the silicon lottery will probably give in before the cooler does. But apples to apples, the Strix isn't among the best.


----------



## Dundundata

Nice backplate!


----------



## Charcharo

I have to say this Frame Target Control thing... I really like it!


----------



## simo292a

Hey guys, I have had my MSI 390X for some time now and I want to test the limits of the card. What would be a good starting overclock, and what have you guys been able to achieve with your 390/390X?


----------



## tangelo

Quote:


> Originally Posted by *simo292a*
> 
> What would be a good starting overclock And what have you guys been able to achieve with your 390/390x???


Dunno about "starting", but it seems most people are pleased if they can achieve 1200/1700 on a 390. It also depends on how comfortable you are with higher temps and what kind of cooling you're using. The max stable I was able to achieve was 1150/1700 with +100mV/+50 power limit, but the temps were around 80C, so I lowered the core to 1125 and kept the mem at 1700 for my 24/7 settings, with just the +50 power limit and no added voltage. I think I could get the core a bit higher, but I just don't bother, as I'm not going to watercool the card and the temps are starting to be too high for my liking.

Depending on the game and usage, the temps on my 390 are usually around 65-72C while the VRM is at 50-60C. And this is with a quiet fan curve: about 20-40% fan speed at 60-90% GPU usage.
With stress testing the core temps easily go over 70C, but I have never seen it run that high while gaming at 1080p


----------



## simo292a

Quote:


> Originally Posted by *tangelo*
> 
> Dunno about "starting" but it seems most ppl are pleased if they can achieve 1200/1700 on 390. But it also depends on how comfortable you are with higher temps and what kind of cooling one is using. The max stable I was able to achieve was 1150/1700 with +100mv/50+ powerlimit, but the temps were around 80C so I lowered the core to 1125 and kept the mem at 1700 for my 27/7 settings with just 50% power limit and no added voltage. I think I could get the core a bit higher but I just don't bother as I'm not going to watercool the card and the temps are starting to be too high for my liking.
> 
> Depending on the game and usage the temps on my 390 usually are around 65-72C while VRM is 50-60C. And this is with a quiet fan curve. About 20-40% while there is 60-90% GPU usage.
> With stress-testing the temps on core go easily over 70C but I have never seen it run that high while gaming on 1080p


What kind of performance improvements were you seeing by overclocking?


----------



## tangelo

Quote:


> Originally Posted by *simo292a*
> 
> Hey guys i have had my MSI 390x for some time now and i want to test the limit og the card. What would be a good starting overclock And what have you guys been able to achieve with your 390/390x???


Quote:


> Originally Posted by *simo292a*
> 
> What kind of performance improvements were you seeing by overclocking?


~7-8% better score in 3DMark Fire Strike. ~5-10fps more depending on the game.
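For what it's worth, that gain is roughly what the clock bump alone predicts. A quick sketch of the arithmetic (the ~1040MHz stock boost clock is an assumed value for illustration; the 1125MHz daily clock is from the overclock mentioned earlier):

```python
# Graphics benchmarks scale close to linearly with core clock on Hawaii,
# so the expected gain is roughly the clock ratio. The stock boost clock
# used here is an assumption, not a measured value.
stock_core = 1040   # MHz, assumed stock boost clock
oc_core = 1125      # MHz, the 24/7 overclock mentioned earlier

gain_pct = (oc_core / stock_core - 1) * 100
print(f"expected core-clock gain: ~{gain_pct:.1f}%")  # ~8.2%, in line with the 7-8% Fire Strike result
```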


----------



## simo292a

Quote:


> Originally Posted by *tangelo*
> 
> ~7-8% better score in 3DMark Fire strike. ~5-10fps more depending on the game.


OK, that's a pretty good increase. I guess I will just try to see how far I can push my 390X without letting the temps get too high.


----------



## kubiks

Massive sale at Performance-PCs. Hit it up if you wanna cool your 390X! Black Ice rads and EK fittings 25% off. Everything in store 10% off. Black Friday only, don't delay!


----------



## kubiks

I'm gonna buy one of the modded MCP 655s and some new fittings. With rads on sale like that (and these 390Xs need rad space!) I may snag another. Can never have too much rad.


----------



## disintegratorx

May I please join in? I just got a Powercolor Devil R9 390X and it rocks.











My GPU-Z ID



My GPU-Z Numbers...



Images from my Windows Phone



And me, heavy metal LOL (for good taste)



This card kicks some serious butt. Way better than my previous 290X. I would recommend this card to anyone, and it's a great fit for the price!


----------



## disintegratorx

Quote:


> Originally Posted by *simo292a*
> 
> Hey guys i have had my MSI 390x for some time now and i want to test the limit og the card. What would be a good starting overclock And what have you guys been able to achieve with your 390/390x???


Just what I posted right up above, simo... I don't know how the temps will do on your card though, so I would recommend being careful. I also have +25mV on the core to achieve this. This setting is absolutely perfect for me. It runs every game as smooth as possible on the settings I like, which are adaptive multisampling and everything else at default. With the newest AMD Crimson drivers, everything is running instantaneously with my SSDs. Hope that helps ya.


----------



## GorillaSceptre

@disintegratorx

Awesome.









I nearly got one for the cooling, but it was too pricey for me.







Good luck with it; make sure to post some results, I haven't seen too many of them out there.


----------



## BakedIkezor

I have a Sapphire R9 390, the old no-backplate version, and I've been having trouble at PC startup.

Whenever I boot up my PC or restart it, it loads the last OC profile I set, with the fan NOT on "auto". It started when I set my fan to 100% with 1125/1625/+81mV for benchmarking, and since then it boots directly into that profile, and I have to go into either MSI AB or Radeon Settings to reset it to stock.

I've tried uninstalling MSI AB, but that didn't solve the problem, so I ran the AMD Clean Uninstall utility, which solved it but left me with no driver; as soon as I reinstalled the driver, the problem started again.

The card will basically boot up with whatever clock profile was last used, without its fan set to "auto". Is this a known issue? Is there a fix for it? I don't mind the card booting up with an OC profile, but I do want my fans on auto: the noise is rather loud at 48% (the max speed it ran at during my stress test), and I don't want it booting at 20% speed either, in case I forget to set it to auto and the card cooks itself during my gaming sessions.

I'm using the latest driver, the radeon crimson, on windows 10 64bit.


----------



## Charcharo

I have it too, except it's at 85% for me (the fan) on my PCS card.

I found out it actually behaves like that for 3-4 minutes and then simply stops acting like a mule. I've also now set MSI Afterburner to run at startup


----------



## doubtfm

Sapphire R9 390 Nitro

Core Clock: 1147 Memory Clock: 1650 Core Voltage: +100 Power Limit: +10

This card is so awesome!


----------



## Vexile

1170 core with 1700 memory at 85°C peak (+100mV, +50% power limit), or 1140 core with 1600 memory at 75°C peak (+81mV, +50% power limit)?


----------



## DarknightOCR

Quote:


> Originally Posted by *BakedIkezor*
> 
> I have a Sapphire R9 390, the old no backplate version, and I've been having trouble at pc startup.
> 
> Whenever I boot up my pc or restart it, it will load the my last OC profile I've set with fan NOT on "auto". It started when I set my fan to 100% with 1125/1625/81mV for benchmarking and since then, It will directly boot up into that profile and had to go into either MSI AB or Radeon setting to reset it to stock settings.
> 
> I've tried uninstalling the MSI AB but that doesn't solve the problem, so I downloaded the "AMD clean uninstall" and that solve the problem but left me with no driver, as soon as I reinstall the driver the problem started again.
> 
> The card will basically boot up with any clock profile without its fan profile set to "auto". Is this a known issue? is there a fix for it? I don't mind the card booting up with a OC clock profile but I do want my fans on auto, the noise is rather loud when set at 48% (the max speed it was running on my stress test) and I don't want it to boot up at 20% speed either just in case I forgot set to auto and card cooks itself up during my gaming sessions.
> 
> I'm using the latest driver, the radeon crimson, on windows 10 64bit.


Mine started doing it yesterday.
I had only set the fan to 50% once, just to see what the noise would be.

Now whenever Windows started, the fan went to 50%.
I had to reset it in AB or Overdrive.

Uninstalled Crimson, reinstalled, and it looks good now.
Overdrive/Crimson bug???


----------



## krnmc

I got the Strix last week and it runs hotter compared to others. My fan makes a random clicking noise here and there, but as of now it's pretty good. It was a bit pricey here in Korea, but oh well. Mine came with 3 years of A/S, so I'm not too worried about it.


----------



## FooSkiii

bump


----------



## FooSkiii

Quote:


> Originally Posted by *Levys*
> 
> It will fit, I use (and recommend ) the Kryographics R9 290/X + active backplate for my own watercooled XFX R9 390X DD
> 
> https://www.caseking.de/en/aqua-computer-kryographics-fuer-r9-290-290x-vernickelt-black-edition-wach-378.html
> 
> https://www.caseking.de/en/aqua-computer-backplate-fuer-kryographics-r9-290x-290-aktiv-xcs-wach-373.html
> 
> I would buy this again in a hartbeat. Earlyer on in the tread I mentioned the use of Arctic 1,5 mm thermal pads
> to put between the backplate and pcb to get even better thermal dissipation, this tip can be used with EK backplates to


That looks epikkkk, I've never seen a setup like that with backplate cooling before!
How are the temps on the GPU and VRMs????


----------



## ManofGod1000

I have a reference R9 290 that I am selling for $150 on another forum. (Sold already, just giving the information here as a reference.) I am considering either the XFX R9 390 or XFX R9 390X at Best Buy, which are $299 and $399. I have a 4K monitor that I do game at, and I am not a 60fps whore.







However, I would like to get the R9 390, but someone on hardforum mentioned that really only the 390X of those two would be good enough for the resolution I am using.

What do you guys think???? Thanks.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ManofGod1000*
> 
> I have an R9 290 reference that I am selling for $150 on another forum. (Sold already, just giving the information here as a reference.) I am considering either the XFX R9 390 or XFX R9 390X at Best Buy which are 299 or 399. I have a 4k monitor that I do game at that resolution and I am not a 60fps whore.
> 
> 
> 
> 
> 
> 
> 
> 
> However, I would like to get the R9 390 but, someone on hardforum mentioned that really, only the 390x would be good enough out of those 2 for the resolution I am using.
> 
> What do you guys think???? Thanks.


Either one can do 4K at reasonable settings. The difference at 4K for most games (at ultra) would be around 5fps average or so, but that can sometimes mean the difference between 40 and 45 fps


----------



## ManofGod1000

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Either one can do 4k at reasonable settings, the difference at 4k for most games (at ultra) would be around 5fps avg or so but that can mean the difference between 40 and 45 fps though sometimes


Cool, thank you. I guess the real question then is: if it were your money, would you spend the extra $100, stick with what you have in the reference R9 290, or get the R9 390 non-X and just overclock?


----------



## Sgt Bilko

Quote:


> Originally Posted by *ManofGod1000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Either one can do 4k at reasonable settings, the difference at 4k for most games (at ultra) would be around 5fps avg or so but that can mean the difference between 40 and 45 fps though sometimes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cool, thank you. I guess the real question then is, if it where your money, would you spend the extra $100, stick with what you have in the R9 290 reference, or get the R9 390 non X and just overclock?

If you are looking to go single-card at 4K, then get the 390X. You have the buffer of the extra VRAM and the additional stream processors, and you can still give it a mild overclock (bear in mind your GPU will run hotter at 4K than at 1440p/1080p).

The 290 and 390 can do it, but honestly I'd rather spend the bit extra and go 390X


----------



## mus1mus

You will benefit from the extra buffer the 8GB gives at 4K. And some driver boost on the 300-series cards.


----------



## 5dragons

Guys, I recently bought my R9 390X and I'm getting high temperatures. I already changed the fan curve (see the images below), but the fan stays at 20%. Can someone help me? What could it be?


----------



## Kayaler

Quote:


> Originally Posted by *5dragons*
> 
> Guys i recent had buy my r9 390x, and i`m getting high temperatures. i already change the fan curve see the images below. And the fan keep on 20%, Can someone help me? what can be?


Click the "Auto" button under Fan Speed (%).

When you are in auto mode, you should no longer be able to adjust the slider.


----------



## 5dragons

ty


----------



## Levys

Quote:


> Originally Posted by *FooSkiii*
> 
> That looks epikkkk I've never seen a setup like that with back plate cooling before!
> how are the temps on the gpu and vrms????


Well, the temps in my loop are just great. My card is in a loop together with an FX-8350 @ 4.4GHz, and my
GPU and CPU never go above 50°C.

As an example: playing Call of Duty Advanced Warfare for about an hour with everything turned up to max (only 2x supersampling),
overclocked to 1200/1600.



If I drop the overclock and/or set most games to ultra, the temps never reach 50°C (Advanced Warfare hits hard).
The backplate cooling is more effective thanks to the thermal pads in between (like I mentioned before);
you must also make sure the heat pipe on top makes good contact with the backplate (thermal paste or some thinner thermal pads).

My VRMs always stay under the core temps, and always very close to each other:
almost always around 46°C core and 39-42°C VRM for most new games on ultra, and that's with an FX-8350 @ 4.4GHz pushing my water temps from around 33°C to at most around 38°C.
Also, my delta never goes above 20°C, perhaps 28°C max in summer.

Hope this answers your question


----------



## Levys

Quote:


> Originally Posted by *ManofGod1000*
> 
> I have an R9 290 reference that I am selling for $150 on another forum. (Sold already, just giving the information here as a reference.) I am considering either the XFX R9 390 or XFX R9 390X at Best Buy which are 299 or 399. I have a 4k monitor that I do game at that resolution and I am not a 60fps whore.
> 
> 
> 
> 
> 
> 
> 
> However, I would like to get the R9 390 but, someone on hardforum mentioned that really, only the 390x would be good enough out of those 2 for the resolution I am using.
> 
> What do you guys think???? Thanks.


My XFX R9 390X rocks; I've pushed it to 1225/1750 already with MSI Afterburner alone. Go for the Black Edition if you can (small premium).
I can't really complain about the card I got. It's solid and easy to put under water;
any reference R9 290/X cooling block will fit.


----------



## mandrix

Quote:


> Originally Posted by *Levys*
> 
> My XFX R9 390X rocks, pushed it to 1225/1750 already with MSI afterburner alone . go for the black edition if you can. (smal premium)
> I forgot ,but I cant really complain with the card I got. Its solid and easy to put under water.
> any R9 290/X reference cooling block will fit.


What kind of Heaven or Fire Strike scores do you get with that 1225/1750 OC?
I can push mine pretty far but at a point I get no higher benchmarks.


----------



## Levys

Quote:


> Originally Posted by *mandrix*
> 
> What kind of Heaven or Fire Strike scores do you get with that 1225/1750 OC?
> I can push mine pretty far but at a point I get no higher benchmarks.


So far I have had mixed results with the unofficial vs. official Crimson drivers.
With the official drivers I get better GPU and CPU scores but a lower combined score. Really frustrating.

http://www.3dmark.com/compare/fs/6631486/fs/6581637

5% less on the combined score is a lot.
But on the other hand, I'm happy to have reached 15036 on the GPU score.
I will try to OC it higher, as I can still use TriXX to put an extra 100mV on the core









PS: my card scales well; I kept getting improved GPU scores all the way up.


----------



## ManofGod1000

Well, thanks for the advice everyone. However, I am going to stick with my R9 290 a little longer. I did buy the XFX R9 390X from Best Buy and gave it a try. Although benchmarks like 3DMark 13, 11, and Vantage saw good improvements, games such as Crysis 3 and Batman: Arkham Knight saw absolutely no difference at 4K resolution. I just figured it was not worth $400 for a few benchmark score improvements. Oh well, that is the way it goes.


----------



## mandrix

Quote:


> Originally Posted by *Levys*
> 
> So far I have had mixed results with the unofficial crimson vs official crimson. drivers
> with the official drivers I get better GPU and CPU scores but lesser combined scores. really frustrating
> 
> http://www.3dmark.com/compare/fs/6631486/fs/6581637
> 
> 5% less on the combined score is much.
> but on the other hand I'm happy to have reached 15036 on the gpu score.
> I will try to oc it higher as I still can use trixx to put an extra 100Mv on the core
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS. my card scales well, all the way I kept getting improved GPU scores going higher.


Nice graphic scores.
Any idea what kind of total voltage you are putting through the card, or at least what the utilities show? I've had mine up around 1.3 VDDC; I'd like to try more, but no one seems to have an idea how much is "safe", or if they do, they won't answer me. lol.

Thanks!


----------



## KNG HOLDY

Has anybody tested the "Alphacool NexXxoS GPX - ATI R9 390 M02"?
If I'm playing GTA without the side panel I get a max of ~75-80°C, but with my side panel installed it reaches 90°C+ within 2 minutes


----------



## Tivan

Quote:


> Originally Posted by *KNG HOLDY*
> 
> smb tested the "Alphacool NexXxoS GPX - ATI R9 390 M02" ?
> if im playing gta w/o side panel i get a max of ~75-80°C but with my sidepanel installed it reach 90°C+ within 2 minutes


What's your airflow like? edit: watercooling is definitely an option either way = D


----------



## KNG HOLDY

Quote:


> Originally Posted by *Tivan*
> 
> What's your airflow like? edit: watercooling is definitely an option either way = D


Because my mobo is broken, the GPU is in the last PCIe slot, ~3.5cm above my HDDs. It's very bad







I've got one 240mm and one 360mm rad installed in my case, currently just for the CPU.

But it feels so strange to buy a 100€ GPU block for a 360€ GPU :c


----------



## Dundundata

That is a serious overclock!


----------



## Levys

Quote:


> Originally Posted by *mandrix*
> 
> Nice graphic scores.
> Any idea what kind of total voltage you are putting through the card, or at least what any utilities show? I've had mine up around 1.3 VDDC, I'd like to try more but no one seems to have an idea how much is "safe", or if they do they won't answer me. lol.
> 
> Thanks!


The max I have seen is 1.32 VDDC, but 1225/1750 isn't a clock speed I feel safe using 24/7 for gaming.
Most of the time it's OC'd to 1200/1600 (= >1.3 VDDC) for demanding games: +69mV core, +50mV aux, +20 power limit.
I think anything close to or under 1.3 is (safe) as long as your temps are fine all over the card,
especially the VRM temps, since they regulate that voltage.
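To put some rough numbers on why the VRM temps matter at those settings, here's a back-of-the-envelope dynamic-power sketch. The stock voltage/clock figures below are my own assumptions for illustration; only the 1.32V / 1225MHz point comes from the post above. Dynamic power scales roughly with V² × f:

```python
# Rough CMOS dynamic-power scaling: P ∝ V^2 * f.
# Stock voltage/clock below are assumed illustrative values, not measurements.
v_stock, f_stock = 1.25, 1040   # assumed stock VDDC (V) and core clock (MHz)
v_oc, f_oc = 1.32, 1225         # the max OC mentioned above

power_ratio = (v_oc / v_stock) ** 2 * (f_oc / f_stock)
print(f"relative core power: ~{power_ratio:.2f}x stock")  # ~1.31x, i.e. roughly +31%
```

Roughly a third more power through the same VRMs is why good temps all over the card are part of "safe".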


----------



## mandrix

Quote:


> Originally Posted by *Levys*
> 
> Max I have seen is 1.32VDDC but 1225/1750 isn't a clock speed I feel safe to use 24/7 for gaming.
> Most of the time its OC'd to 1200/1600 ( = >1.3VDDC ) for demanding games 69Mv core, 50Mv aux, 20+ powerlimit
> I think anything close to or under 1.3 is ( safe) as long as your temps are fine all over the card.
> especially the vrm temps. since they regulate that voltage.


Very good.
As some will say, +69 will give a different total voltage from card to card, which is why I asked for the VDDC. For example, +63 is about 1.305V (at idle) on mine.

But I have a block on my card, so temps are OK .

Thanks for checking!
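The arithmetic behind that is just the per-card base voltage plus the software offset. A hypothetical sketch (the 1.242V base below is an assumed EVV value chosen so the +63 → ~1.305V example works out, not a value read from any card):

```python
# Total idle VDDC = per-card EVV base voltage + software +mV offset.
# The base voltage here is an assumed illustrative value; read your real
# one with a tool like The Stilt's VID app.
evv_base_v = 1.242      # assumed per-card base voltage (V)
offset_mv = 63          # Afterburner/TriXX +mV offset

total_v = evv_base_v + offset_mv / 1000
print(f"total VDDC: {total_v:.3f} V")  # 1.305 V, matching the example above
```

Which is exactly why the same +XX offset lands at a different absolute voltage on every card.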


----------



## TsukikoChan

In case anyone is getting the 20% fan speed lock: there is a huuuuuuuuuuge Reddit thread about AMD users all getting hit by this with the Crimson drivers. It seems to be overwriting fan curves/OverDrive settings and locking the fan speed to 20% (I heard in some extreme cases at 0%). I've been told they are releasing a fix for this soon, but a lot of people are saying they've already burnt out their cards due to Crimson and there's nothing they can do about it.
So, guys, keep an eye on your fan speeds and temps if you're using Crimson. "Auto" fan should keep it stable, I'm told, but this seems like quite an oversight in Crimson :<
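Until the fix lands, a sanity check against your temp/fan logging is easy to script. A rough sketch of the idea: compare the reported fan speed against what your curve would demand and flag a stuck fan. The curve points, the 10-point slack, and the function names are all my own illustrative assumptions, not anything from the driver:

```python
# Hypothetical fan curve as (temp_C, fan_pct) points, linearly interpolated.
# The points and slack below are illustrative, not a recommendation.
CURVE = [(40, 20), (60, 35), (75, 55), (85, 80), (94, 100)]

def fan_pct(temp_c, curve=CURVE):
    """Linearly interpolate the expected fan speed for a core temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

def fan_stuck(temp_c, reported_pct, slack=10):
    """Flag the Crimson-style bug: reported speed far below the curve."""
    return reported_pct < fan_pct(temp_c) - slack

print(fan_stuck(80, 20))  # a 20% reading at 80C is well below any sane curve -> True
print(fan_stuck(45, 22))  # near idle, 22% is fine -> False
```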


----------



## rdr09

Quote:


> Originally Posted by *TsukikoChan*
> 
> In case anyone is getting the 20% fan speed lock: there is a huge Reddit thread about AMD users all getting hit by this due to the Crimson drivers. It seems Crimson is overwriting fan curves/Overdrive settings and locking the fan speed to 20% (I heard in some extreme cases at 0%). I've been told they are releasing a fix for this soon, but a lot of people are saying they've already burnt out their cards due to Crimson and there's nothing they can do about it.
> So, guys, keep an eye on your fan speeds and temps if you're using Crimson. "Auto" fan I'm told should keep it stable, but this seems like quite an oversight in Crimson :<


I hope the fix will increase my Firestrike score.


----------



## mus1mus

Means, I can hope for 17K GS then?


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Means, I can hope for 17K GS then?


Lol. "Fire" striked.

I must say you are very close to my CrossFire 290s score.


----------



## TsukikoChan

I need some help, guys. Ever since I got this 390X (upgraded from a 7870) I've been getting random PC freezes, mostly when playing games, though once on the desktop.
I lowered my OC to default on the card, no diff; I uninstalled Afterburner and turned off Trixx, no diff. It's been a bit more stable ever since I lowered the OC on my CPU (from 4.5/4.6 down to 4.3 on my FX 8350). Any thoughts, guys? When the freeze occurs I hear an audible "zzzzt" in my headphones, the screens all lock on what I see, and the only way out is a hard reset (nothing is accessible or working).

Part of me thinks the card might need to be RMA'd; part of me thinks this card is acting funny with my PSU (Cooler Master 750W GX), which makes sense if a lowered OC works, as the system is drawing less power. But I've no notion and would appreciate any help!!


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Lol. "Fire" striked.
> 
> i must say you are very close to my crossfire 290s score.


But yours should be playable minus a few MHz. Haven't done playing with the thing yet. I might next day though.
Quote:


> Originally Posted by *TsukikoChan*
> 
> I need some help, guys. Ever since I got this 390X (upgraded from a 7870) I've been getting random PC freezes, mostly when playing games, though once on the desktop.
> I lowered my OC to default on the card, no diff; I uninstalled Afterburner and turned off Trixx, no diff. It's been a bit more stable ever since I lowered the OC on my CPU (from 4.5/4.6 down to 4.3 on my FX 8350). Any thoughts, guys? When the freeze occurs I hear an audible "zzzzt" in my headphones, the screens all lock on what I see, and the only way out is a hard reset (nothing is accessible or working).
> 
> Part of me thinks the card might need to be RMA'd; part of me thinks this card is acting funny with my PSU (Cooler Master 750W GX), which makes sense if a lowered OC works, as the system is drawing less power. But I've no notion and would appreciate any help!!


Start clean.

Verify your CPU OC, Do a clean install of VGA Driver.


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> Start clean.
> 
> Verify your CPU OC, Do a clean install of VGA Driver.


will do!  will clean down vga drivers tonight and reinstall crimson and see how it goes from there.


----------



## rdr09

Quote:


> Originally Posted by *TsukikoChan*
> 
> will do!  will clean down vga drivers tonight and reinstall crimson and see how it goes from there.


Well, I just checked the recommended PSU list and the one you have is not there. It could very well be the PSU. It was able to handle the 7870, but it's being asked for twice the power now.


----------



## TsukikoChan

Quote:


> Originally Posted by *rdr09*
> 
> Well, I just checked the recommended PSU list and the one you have is not there. It could very well be the PSU. It was able to handle the 7870, but it's being asked for twice the power now.


Yea, that was one of my thoughts. The PSU has done me well over the last few years; it even managed to run a 7870 alongside a GeForce 460 for some time (hybrid, I wanted PhysX for Alice: MR). But I suppose the 390X draws much more power. How much power does a 390X draw? It's kind of a shock to see a 750W that can't handle an FX 8350 and a 390X together.

Edit: interesting note, typing in the emote ": < " (without spaces) breaks a message from appearing correctly in here ^^;;


----------



## rdr09

Quote:


> Originally Posted by *TsukikoChan*
> 
> Yea, that was one of my thoughts. The PSU has done me well over the last few years; it even managed to run a 7870 alongside a GeForce 460 for some time (hybrid, I wanted PhysX for Alice: MR). But I suppose the 390X draws much more power. How much power does a 390X draw? It's kind of a shock to see a 750W that can't handle an FX 8350 and a 390X together.
> 
> Edit: interesting note, typing in the emote ": < " (without spaces) breaks a message from appearing correctly in here ^^;;


It is not so much the 750W number as the quality, and the age, of the unit. A 390X is like an OC'd 290X: it can pull upwards of 300W, and then you combine that with your CPU.


----------



## mus1mus

Both stock, yes.

No throttling on either, maybe yes.

The OC effect on power is not linear, so don't expect a 10% OC on both CPU and GPU to equate to 20% more power pull.

Plus, both components are power hungry.
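As a back-of-envelope illustration of why the scaling isn't linear: dynamic power goes roughly as frequency times voltage squared, so an OC that also needs a voltage bump costs disproportionately more power. A sketch with made-up ratios, not measured numbers:

```python
# Dynamic power scales roughly with frequency times voltage squared (P ~ f * V^2).
def power_ratio(freq_ratio, volt_ratio):
    return freq_ratio * volt_ratio ** 2

# A hypothetical 10% core OC that also needs ~5% more voltage:
print(round(power_ratio(1.10, 1.05), 3))  # 1.213 -> ~21% more power, not 10%
```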


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Both stock, yes.
> 
> No throttling on either, maybe yes.
> 
> The OC effect on power is not linear, so don't expect a 10% OC on both CPU and GPU to equate to 20% more power pull.
> 
> Plus, both components are power hungry.


I recommend taking a reading of the 12V using an app like HWINFO64. It may not be very accurate, though. Make sure it does not go any lower than 11.4V at load during a game, or even while benching in Firestrike.

GPU-Z will do the same. Set the 12V reading (using the dropdown) to show the Minimum value.
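If you log the sensor readings to a file, the 11.4V rule of thumb is easy to check automatically. A minimal sketch (the sample values are illustrative; 11.4V is simply 5% below nominal, the usual ATX tolerance):

```python
# Flag a sagging 12V rail from a list of logged samples.
def rail_ok(samples_v, nominal=12.0, tolerance=0.05):
    floor = nominal * (1 - tolerance)  # 11.4V for the 12V rail
    return min(samples_v) >= floor

print(rail_ok([12.1, 11.9, 11.78, 11.81]))  # True: worst dip is 11.78V
print(rail_ok([12.0, 11.5, 11.32]))         # False: dipped below 11.4V
```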


----------



## TsukikoChan

Quote:


> Originally Posted by *rdr09*
> 
> I recommend taking a reading of the 12V using an app like HWINFO64. It may not be very accurate, though. Make sure it does not go any lower than 11.4V at load during a game, or even while benching in Firestrike.
> 
> GPU-Z will do the same. Set the 12V reading (using the dropdown) to show the Minimum value.


Ty! Will keep an eye on this in the coming days. What should I be looking for, other than it dropping below 11.4? Does dropping below 11.4 mean that the PSU can't supply enough, or that it's fluctuating too much? Should I increase my OCs back up to try to exacerbate it, to see this better?


----------



## rdr09

Quote:


> Originally Posted by *TsukikoChan*
> 
> Ty! Will keep an eye on this in the coming days. What should I be looking for, other than it dropping below 11.4? Does dropping below 11.4 mean that the PSU can't supply enough, or that it's fluctuating too much? Should I increase my OCs back up to try to exacerbate it, to see this better?


Yes, dropping below 11.4V means the PSU is not strong enough. Even just above, like 11.5V... I would start being concerned and remove all OC. Remember, the apps are not really accurate and the PSU may be supplying even lower.


----------



## PhillyB

I just got an MSI 390X and am running it stock until my water block shows up. Currently using the Crimson drivers.

I ran into a strange problem over the weekend. After dropping out of a game (Fallout 4 at this point), I get horizontal lines bouncing around the screens (3 screens in Eyefinity mode) while the card is at idle. If I reboot, it goes away. If I turn off the monitors and come back a while later, they are gone. If I turn on something that forces the GPU to increase its clock speed, the lines go away as well.

Any ideas?


----------



## mandrix

Quote:


> Originally Posted by *rdr09*
> 
> I recommend taking a reading of the 12V using an app like HWINFO64. may not be very accurate, though. make sure it does not go any lower than 11.4V at load during a game or even benching in Firestrike.
> 
> GPUZ will do the same. Set the reading (using the dropdown) of the 12V to show Minimum value.


No software app comes close to reading the actual 12V rail on my SeaSonic 1000 PSUs....better to use a meter. For example, HWINFO shows 11.6V while in reality it's 12.04V on my bench meter.


----------



## rdr09

Quote:


> Originally Posted by *mandrix*
> 
> No software apps come close to reading actual 12v rail on my SeaSonic 1000 psu's....better to use a meter. For example HWINFO shows 11.6v while in reality it's 12.04v on my bench meter.


I wish I knew exactly how to measure the 12V while the GPU is under load. Is there a link to the procedure? I guess I can always Google it.

I always just bounce the figures between GPU-Z and HWINFO...

But you are correct, a Fluke will be best.


----------



## flopper

Quote:


> Originally Posted by *PhillyB*
> 
> I just got an MSI 390X and am running it stock until my water block shows up. Currently using the Crimson drivers.
> 
> I ran into a strange problem over the weekend. After dropping out of a game (Fallout 4 at this point), I get horizontal lines bouncing around the screens (3 screens in Eyefinity mode) while the card is at idle. If I reboot, it goes away. If I turn off the monitors and come back a while later, they are gone. If I turn on something that forces the GPU to increase its clock speed, the lines go away as well.
> 
> Any ideas?


Might be the memory clocks going up and down a bit.
It's PowerPlay basically adjusting in real time.


----------



## PhillyB

Any way to turn that feature off?


----------



## Charcharo

Interestingly I seem to have some slowdowns in Serious Sam 3 (modified Ultra 1440x900).


----------



## mypickaxe

Quote:


> Originally Posted by *TsukikoChan*
> 
> I need some help guys, ever since i got this 390x (upgraded from 7870) I've been getting random pc freezes, mostly when playing games though once on the desktop.
> I lowered my OC to default on the card, no diff; i uninstalled afterburner and turned off trixx with no diff. It's been a bit more stable ever since i lowered my OC on my cpu (from 4.5/4.6 down to 4.3 on my fx8350). Any thoughts guys? when the freeze occurs i heard an audible "zzzzt" in my headphones and the screens all lock to what i see and the only way out is a hard reset (nothing is accessible or working).
> 
> Part of me thinks the card might need rma-ed, part of me thinks this card is acting funny with my psu (coolermaster 750w gx) (which makes sense if a lowered OC works as the system is drawing less power), but I've no notion and would appreciate any help!!


The same thing can happen in an airflow-restricted situation. Case in point: a Corsair Carbide 330R case with sound-dampening material. Even with two 140mm intake fans and a 140mm exhaust fan, it can get awfully hot inside the case with an open-air card such as an R9 390 Strix (I saw the same thing with an EVGA GTX 780 Classified).

Does your issue happen with the case panel off?


----------



## Dundundata

There's a new driver out now that's supposed to fix the fan issue.


----------



## disintegratorx

Quote:


> Originally Posted by *Dundundata*
> 
> There's a new driver out now that's supposed to fix the fan issue.


Yeah there is; it fixes the issue with the fan being stuck running at 20% speed. Plus it has other stability improvements, like for the game Fallout 4. Here is the DL link for anyone who might want it: http://support.amd.com/en-us/download

I already installed it just in case. I don't have Fallout 4 yet though...


----------



## Kerelm

Picked up my gigabyte G1 390 on Saturday. Big upgrade from my 7950


----------



## Brunestud

I just bought my Gigabyte G1 390 a couple of weeks ago, and I'm using it with an FX 8350..... I'm not really happy. I'm playing Batman: Arkham City and it's so laggy :/ I feel like I wasted 415 dollars.


----------



## jaydude

Quote:


> Originally Posted by *Kerelm*
> 
> Picked up my gigabyte G1 390 on Saturday. Big upgrade from my 7950


I have the same card. I dropped my temps from 78°C max (on a hot 30°C+ day) to 69°C by replacing the thermal pad on the MOSFETs with an Alphacool Eisschicht 1mm 11W/mK thermal pad and some Thermal Grizzly Hydronaut. I know it can get pretty hot out there in the west.

I also upgraded from a 7950. Nice card, but the prices here are ridiculous.


----------



## jaydude

Quote:


> Originally Posted by *Brunestud*
> 
> I just bought my Gigabyte G1 390 a couple of weeks ago, and I'm using it with an FX 8350..... I'm not really happy. I'm playing Batman: Arkham City and it's so laggy :/ I feel like I wasted 415 dollars.


Arkham City, from what I remember, has always been up and down; even on my old 7950 it was laggy. I remember fiddling with some settings a while back and it ran better: I think I turned off PhysX, lowered shadows, and set AA to multisampling instead of supersampling from the control panel, plus maybe some others I can't remember.

Welcome to OCN, btw. You can find out how to fill out your rig and add it to your sig Here; it helps to know what you are running so we can help out faster.


----------



## Carlern

Are there any cooling solutions for the MSI R9 390? I was looking into liquid coolers such as the Kraken G10 and Corsair HG10, but they do not seem to work. However, I came over this waterblock from Alphacool: http://www.alphacool.com/product_info.php/info/p1663_Alphacool-NexXxoS-GPX---ATI-R9-390-M02---incl--backplate---black.html

Even though I don't have any experience with watercooling, I'm interested in giving it a shot. Therefore, I have some questions:

What more do I need other than the waterblock? Pumps, liquid, etc. Are there any kits available?
Does the waterblock cover everything, such as the VRM?
Are there any other solutions available?


----------



## kizwan

Quote:


> Originally Posted by *Carlern*
> 
> Are there any cooling solutions for the MSI R9 390? I was looking into liquid coolers such as the Kraken G10 and Corsair HG10, but they do not seem to work. However, I came over this waterblock from Alphacool: http://www.alphacool.com/product_info.php/info/p1663_Alphacool-NexXxoS-GPX---ATI-R9-390-M02---incl--backplate---black.html
> 
> Even though I don't have any experience with watercooling, I'm interested in giving it a shot. Therefore, I have some questions:
> 
> What more do I need other than the waterblock? Pumps, liquid, etc. Are there any kits available?
> Does the waterblock cover everything, such as the VRM?
> Are there any other solutions available?



Pump, radiator(s), reservoir, tubing (soft or hard), coolant (distilled/premixed/DI), fittings and adapters, fans. You can also get kits that include everything, for example a kit from EKWB.
The VRM will be passively cooled by the block, because it doesn't have a water channel over the VRM area.
Nope. You're pretty much screwed.


----------



## Carlern

Is it possible to connect a Predator 360 to the waterblock?


----------



## kizwan

Quote:


> Originally Posted by *Carlern*
> 
> Is it possible to connect a Predator 360 to the waterblock?


Yes, of course.


----------



## Kerelm

Quote:


> Originally Posted by *jaydude*
> 
> I have the same card. I dropped my temps from 78°C max (on a hot 30°C+ day) to 69°C by replacing the thermal pad on the MOSFETs with an Alphacool Eisschicht 1mm 11W/mK thermal pad and some Thermal Grizzly Hydronaut. I know it can get pretty hot out there in the west.
> 
> I also upgraded from a 7950. Nice card, but the prices here are ridiculous.


Oh thanks man! I'll take a look into that. The card is basically a damn flame thrower at the moment.


----------



## Carlern

Quote:


> Originally Posted by *kizwan*
> 
> Yes, of course.


Oh wow, that's great! Isn't that the cheapest and easiest solution? Do you need any specific tools/equipment?


----------



## kizwan

Quote:


> Originally Posted by *Carlern*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yes, of course.
> 
> 
> 
> Oh wow, that's great! Isn't that the cheapest and easiest solution? Do you need any specific tools/equipment?
Click to expand...

The Predator 360 comes with quick disconnects (QDC), which are useful when you want to add a GPU block to the loop. With QDCs, you can remove any component in the loop without draining it. If you like this feature, you will need a couple of pairs of QDCs. See *this post* for an example. The shiny adapters connected to the GPU block & top radiators are QDCs.


----------



## Carlern

Quote:


> Originally Posted by *kizwan*
> 
> The Predator 360 comes with quick disconnects (QDC), which are useful when you want to add a GPU block to the loop. With QDCs, you can remove any component in the loop without draining it. If you like this feature, you will need a couple of pairs of QDCs. See *this post* for an example. The shiny adapters connected to the GPU block & top radiators are QDCs.


Does everything come with the Predator 360, or do I need to buy them separately?

Edit: I was also thinking about only using the Predator and not including a pump.


----------



## jaydude

Quote:


> Originally Posted by *Kerelm*
> 
> Oh thanks man! I'll take a look into that. The card is basically a damn flame thrower at the moment.


A 1200-1500 RPM fan blowing air underneath your GPU would also help immensely. I just installed one there, and my max temp in Battlefield 4 online, maxed out, was 60°C; Fallout 4 stays under 58°C.

My case fans are louder than my GPU at full load.


----------



## TsukikoChan

The 390X was stable last night (no PC freezes); didn't get to uninstall drivers. Kept an eye on the 12V line through HWMonitor and the lowest it went was 11.78. Hmm.

Question for peeps: in the new Crimson (or on the 390X, I can't remember when it started), my multi-monitor setup sometimes goes awry. Quite often when gaming it will 'turn off' a monitor (typically the one I'm gaming on!) and I need to power the monitor off and back on before my displays are back to normal. If I turn off my middle monitor, the desktop layout changes to accommodate having 2 monitors instead of 3, which plays havoc with the icons on my desktop. My old 7870 didn't do this; it never detected turned-off monitors and decided to change my monitor layout automatically. Can this be turned off, guys? I'm sick of having my monitors change layout or turn off in the middle of playing Dota or online games. Don't know if it's Crimson doing this or the 390X.


----------



## kizwan

Quote:


> Originally Posted by *Carlern*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The Predator 360 comes with quick disconnects (QDC), which are useful when you want to add a GPU block to the loop. With QDCs, you can remove any component in the loop without draining it. If you like this feature, you will need a couple of pairs of QDCs. See *this post* for an example. The shiny adapters connected to the GPU block & top radiators are QDCs.
> 
> 
> 
> Does everything come with the Predator 360, or do I need to buy them separately?
> 
> Edit: I was also thinking about only using the Predator and not including a pump.
Click to expand...

The Predator has a built-in pump. You will need to get fittings (compression or barb) and tubing.


----------



## ChaosAD

Just broke a 14k FS score with 1120/1740 at +0mV/+50 PL. But then, after 3-4 mins idle on the desktop, I got a black screen.

Question: why is my max clock before artifacts 1120 in FS 1080p but 1140 at 1440p, both at +0mV? I thought 1440p was more demanding!


----------



## Charcharo

Are there any Devil 13 and 390X Devil users here?

Even though I don't have one (just a regular R9 390 from PowerColor)... there is something badass and absurd about those GPUs.


----------



## LocoDiceGR

Why no 380/380x official club thread?


----------



## eltano06

Quote:


> Originally Posted by *BALANTAKOS*
> 
> Why no 380/380x official club thread?


It would be nice to have a 285/380/380x owners club.


----------



## Agent Smith1984

Quote:


> Originally Posted by *eltano06*
> 
> It would be nice to have a 285/380/380x owners club.


Someone start one! I think it's a great idea...


----------



## Dundundata

I ran the EVV program.

What does '0x283' mean?

Also, it told me to restore the default SCLK frequency. I ran the program upon reboot with no OC.


----------



## seanpatrick

Anyone else getting the odd horizontal line flash across their screen? It happens every half hour to hour for me, and I can't pinpoint its source. I think it MIGHT have something to do with the GPU clock intermittently (and randomly) dropping for a second, but can't be sure.

Here's a picture of GPU-z when watching a youtube video. GPU load is all over the place:



Though this could just be hardware acceleration in Firefox.

Perhaps AI suite is the problem? (though it didn't cause problems on my old cards)


----------



## tangelo

While playing Just Cause 3 with vsync on I get stutter, lag and 25-30 fps with details on medium-high. But when I disable the vsync I can get 60-90fps with all the details on max...

Any ideas how to remedy this as I don't like the tearing I get when above 60fps...


----------



## Dundundata

Well, I don't think you're the only one. Do you have the latest driver?

http://www.gamepur.com/guide/21345-just-cause-3-pc-error-save-game-location-fix-fps-drop-stuttering-crash.html

One trick I would recommend is a program called D3DOverrider (Direct3D), which lets you enable vsync and triple buffering; it might work better than the in-game vsync.


----------



## ManofGod1000

Ok, serious question. I have just purchased an XFX R9 390 and have cross fired it with a reference R9 290x from XFX. I installed the 390 in the top slot and the 290 in the second slot of my Gigabyte Z170 Gaming 7 board. (I am running an i7-6700k at 4.7GHz.) I have an APC UPS that displays the power draw so I am getting about 665 Watts through 3D Mark 11 and about 785 Watts through playing Crysis 3 at 4k resolutions. The 3D Mark 11 Graphics Score is 30294 so I know crossfire is working correctly.

Now the question: I have a Thermaltake Smart Series M850W which uses a single 12v rail. The thing is, should I worry about my power supply not being enough or am I good since I am not running these games all the time? Truth is, I am not overly concerned but I figured it would be good to ask others and see what they think.


----------



## jon666

I might have made a terrible mistake. Anyone have any pictures of the voltage regulators for 390s?


----------



## LocoDiceGR

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Someone start one! I think it's a great idea...


Yeah, I agree. I've been waiting all this time for a thread.


----------



## ManofGod1000

Quote:


> Originally Posted by *ManofGod1000*
> 
> Ok, serious question. I have just purchased an XFX R9 390 and have cross fired it with a reference R9 290x from XFX. I installed the 390 in the top slot and the 290 in the second slot of my Gigabyte Z170 Gaming 7 board. (I am running an i7-6700k at 4.7GHz.) I have an APC UPS that displays the power draw so I am getting about 665 Watts through 3D Mark 11 and about 785 Watts through playing Crysis 3 at 4k resolutions. The 3D Mark 11 Graphics Score is 30294 so I know crossfire is working correctly.
> 
> Now the question: I have a Thermaltake Smart Series M850W which uses a single 12v rail. The thing is, should I worry about my power supply not being enough or am I good since I am not running these games all the time? Truth is, I am not overly concerned but I figured it would be good to ask others and see what they think.


Well, this is a 390 thread and I own a 390, how about it?

Oh, and I have the 290 reference fan set to 75% so yes, it gets noisy but, I can deal with that for the performance it brings. I just do not want to blow my power supply.


----------



## kizwan

Quote:


> Originally Posted by *ManofGod1000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ManofGod1000*
> 
> Ok, serious question. I have just purchased an XFX R9 390 and have cross fired it with a reference R9 290x from XFX. I installed the 390 in the top slot and the 290 in the second slot of my Gigabyte Z170 Gaming 7 board. (I am running an i7-6700k at 4.7GHz.) I have an APC UPS that displays the power draw so I am getting about 665 Watts through 3D Mark 11 and about 785 Watts through playing Crysis 3 at 4k resolutions. The 3D Mark 11 Graphics Score is 30294 so I know crossfire is working correctly.
> 
> Now the question: I have a Thermaltake Smart Series M850W which uses a single 12v rail. The thing is, should I worry about my power supply not being enough or am I good since I am not running these games all the time? Truth is, I am not overly concerned but I figured it would be good to ask others and see what they think.
> 
> Well, this is a 390 thread and I own a 390, how about it?
> 
> Oh, and I have the 290 reference fan set to 75% so yes, it gets noisy but, I can deal with that for the performance it brings. I just do not want to blow my power supply.
Click to expand...

http://www.hardocp.com/article/2014/06/16/thermaltake_smart_sp850m_850w_power_supply_review/9#.Vl7tq_krIuU

Your PSU is fine & enough. It can handle both of your cards just fine.


----------



## ManofGod1000

Quote:


> Originally Posted by *kizwan*
> 
> http://www.hardocp.com/article/2014/06/16/thermaltake_smart_sp850m_850w_power_supply_review/9#.Vl7tq_krIuU
> 
> Your PSU is fine & enough. It can handle both of your cards just fine.


Cool, thanks.

Sheez, I am a member there and did not even think to look there, doh!


----------



## Agent Smith1984

Just FYI, I appreciate all the cool pics of people getting new cards, but PLEASE be aware that if you want to be on the member list, the pic needs to include something identifying your user ID.


----------



## Levys

Quote:


> Originally Posted by *jon666*
> 
> I might have made a terrible mistake. Any one have any pictures of the voltage regulators for 390's?


Here you go... now what did you do?


----------



## jon666

I had to re-solder a connection because somehow I broke it.


----------



## iscariot

Hey guys, what would you say the idle temps are for a 390X? Mine is idling at 70 Celsius, which seems a bit high. Could be the temps in the house, though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *iscariot*
> 
> Hey guys, what would you say the idle temps are for a 390X? Mine is idling at 70 Celsius, which seems a bit high. Could be the temps in the house, though.


Better check to make sure you don't have that Crimson fan bug, because that is way too high....

Mine idles at 50°C with 20% fan speed running 3 monitors (memory always at 3D clocks).


----------



## iscariot

How do I check that?


----------



## SystemTech

That's definitely too high, although are the fans spinning at all? Have you set any fan profile in Afterburner? Your fans might just be switched off. I know before I set up my profile the fans would stop and she'd idle at about 60°C, so 70 is not much more, especially if you're in a warm climate.
I'm not too sure on the exact steps, but do the Crimson beta update first; then once it's updated, go into the Overdrive settings (or similar) in the AMD control panel and look for fan control.


----------



## TsukikoChan

Quote:


> Originally Posted by *iscariot*
> 
> How do I check that?


Using either Crimson, HWMonitor, or your flavor of Afterburner/Trixx, watch the fan speed % on your GPU during a game or anything that heats it up. Typically the Crimson bug means your fan speed never goes above 20%, ever, even with fan curves set. If your fan goes above 20-30% during play, you don't have the Crimson bug; it might simply be that you need to set a more aggressive fan curve on your GPU.


----------



## rdr09

Quote:


> Originally Posted by *iscariot*
> 
> How do I check that?


Use HWINFO64 or GPU-Z (under the Sensors tab, set the fan speed reading to show Max).


----------



## Carniflex

I upgraded the Sapphire 7950 FleX OC in my sig rig to a Gigabyte 390X G1.

It's a pretty nice upgrade, especially considering my main screen nowadays is a 28'' 4K. However, mine is running very hot. I have seen as high as 90°C in OCCT during a 3-minute stress and stability test, and with only a very minor OC: +3% power, +3.8% core, and +125MHz on the memory bus. Just to round some numbers up a bit.


----------



## DarknightOCR

Yesterday I flashed the 390X Tri-X BIOS onto my 390 Nitro.

All went impeccably; it didn't unlock anything, as I had expected.
However, the clock was 1055 instead of the Nitro's 1040, and the voltage was lower.

MSI AB on the stock 390 Nitro shows +19mV; with the 390X Tri-X BIOS it shows +13mV and the clock at 1055MHz.

Now I'd like it if someone with a Tri-X 390X on BIOS 015.049.xxxxxx could post it.

My 390 Nitro had BIOS 015.049xxx, and TechPowerUp only has the first BIOS, 015.048xxxxx, for both the 390 and 390X.

GPU-Z, AB, Crimson, etc... still report it as an R9 390.


----------



## mus1mus

It will not appear as a 390X due to the number of shaders activated. 2560 vs 2816.

If you want better performance, maybe BIOS Modding will help.


----------



## DarknightOCR

I know.

I just flashed it for fun.

The only things that changed were the clock, 1040 to 1055, and the voltage, +19mV to +13mV.

But now I'd like to flash a more current 390X Tri-X BIOS and keep it that way.

A 15MHz gain.


----------



## mus1mus

Just be wary to keep the original BIOS in case something gets nasty.


----------



## DarknightOCR

Yup
My original BIOS is saved


----------



## fat4l

Quote:


> Originally Posted by *Dundundata*
> 
> I ran the EVV program
> 
> 
> 
> What does '0x283' mean?
> 
> Also, it told me to restore the default SCLK frequency. I ran the program upon reboot with no OC.


It's 2B3, not 283.
It's the ASIC quality.
To calculate ASIC quality, you convert this hex number to decimal,
then divide by 1023, then multiply by 100 to get the % value.

2B3 = 691
691/1023 * 100 = 67.5%

1.275V = 3D voltage (before vdroop)

----------



## Dundundata

Thank you, that matches the score in GPU-Z.

So for volts: stock, my max seems to be ~1.23V, 45mV below the 3D voltage. Is that vdroop?

If I add around 50mV it goes up to about 1.32V, which has become my OC for demanding games. Temps are around 70 at this voltage depending on ambient, and the VRM stays in the low 60s.

If I add 100mV I get around 1.37V; temps a bit higher, maybe up to 75.

From what I understand you don't typically want to add much more than that, although with temps being sub-80 I'm not too concerned. The performance benefit of going that high probably isn't worth it to me at that point anyway.


----------



## fat4l

Quote:


> Originally Posted by *Dundundata*
> 
> Thank you, that matches the score on GPUZ
> 
> So for volts, stock my max seems to be ~1.23V, 45mV below 3d voltage, is that vdroop?
> 
> If I add around 50mV it goes up to about 1.32V, which has become my OC for demanding games. Temps are around 70 at this voltage depending on ambient, and VRM stays low 60s.
> 
> If I add 100mV I get around 1.37V, temps a bit higher maybe up to 75.
> 
> From what I understand you don't typically want to add much more than that, although with temps being sub 80 I'm not too concerned. The performance benefit of going that high probably isn't worth it to me at that point though.


Well, if you want to check the MAX voltage then I would recommend running GPU-Z or Afterburner and checking the MAX value, not the average one.
Run 3DMark Firestrike/Firestrike Extreme and, after the run, check the max value.

Your temps are good though.

I'm using 1200/1700 (1500 timings mod) MHz with 1.387V for GPU1 and 1.425V for GPU2. An extra 12.5mV is already included for stability.


----------



## Gdourado

I currently have a sapphire 390x tri-x.
My psu is a xfx 850w black edition.
Is the psu enough to add a second 390x?
My cpu is a 6600k at 4.8.

Cheers


----------



## iscariot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iscariot*
> 
> Hey guys, what would you say the idle temps are for a 390X? Mine is idling at 70 Celsius, which seems a bit high. Could be the temps in the house, though.
> 
> 
> 
> Better check to make sure you don't have that crimson fan bug because that is way too high....
> 
> Mine idles at 50c with 20% fan speed running 3 monitors (memory always at 3D clocks)
Click to expand...

I'll check it. The drivers updated again last night before I went out, and I haven't had a chance to look again. My fans are not locked at 20% though; they were sitting at about 48%. I did notice that the fan control was set to "On". Every time I changed it to "Off", the word "Automatic" appeared next to the percentage, and after I clicked Apply it reverted back to "On", which I'd imagine locks the fan speed. Is that right, or do I have it backwards?


----------



## Dundundata

Quote:


> Originally Posted by *fat4l*
> 
> Well, if you want to check the MAX voltage, then I would recommend running GPU-Z or Afterburner and checking the MAX value, not the average one.
> 
> Your temps are good tho.
> 
> I'm using 1200/1700 MHz (with the 1500-strap timings mod), with 1.387V for GPU1 and 1.425V for GPU2. An extra 12.5mV for stability is already included in those figures.


Yes, I used HWiNFO's MAX column to get those values. Good to know you are running higher volts through your cards; apparently these cards are designed to take a beating!

My card doesn't like that high a memory OC; it does fine at 1650 and could probably handle 1675. On the core I can do close to, or maybe, 1200. Thing is, I had to increase volts a bit for Witcher 3 after I started seeing occasional artifacts. That game is taxing for sure, but I must say the card runs it beautifully with a moderate OC.

On another note: assuming a quality PSU, what would be a recommended wattage for crossfired 390s with a 4790K? Right now my total system sits around 400W.


----------



## Tobiman

Quote:


> Originally Posted by *Carlern*
> 
> Are there any cooling solutions for the MSI R9 390? I was looking into liquid coolers such as the Kraken G10 and Corsair HG10, but they do not seem to work. However, I came across this waterblock from Alphacool: http://www.alphacool.com/product_info.php/info/p1663_Alphacool-NexXxoS-GPX---ATI-R9-390-M02---incl--backplate---black.html
> 
> Even though I don't have any experience with watercooling, I'm interested in giving it a shot. Therefore, I have some questions:
> 
> What more do I need other than the waterblock? Pumps, liquid, etc. Are there any kits available?
> Does the waterblock cover everything, such as the VRM?
> Are there any other solutions available?


How are you sure the Kraken G10 won't work? I use one on my 290, albeit it's a PowerColor one with a reference board.


----------



## Tobiman

Quote:


> Originally Posted by *Dundundata*
> 
> Yes I used HWinfo MAX to get those values. That is good to know you are running higher volts through your cards. Apparently these cards are designed to take a beating!
> 
> My card doesn't like that high of a memory OC, it does fine at 1650 and could probably handle 1675. Core I can do close to or maybe 1200. Thing is I had to increase volts a bit for Witcher 3 after I started seeing occasional artifacts. That game is taxing for sure. But I must say the card runs it beautifully with a moderate OC.
> 
> On another note, assuming a quality PSU what would be a recommended wattage for Xfire 390's with a 4790k. Right now my total system sits around 400W.


850 watts minimum.


----------



## Tobiman

Quote:


> Originally Posted by *Gdourado*
> 
> I currently have a sapphire 390x tri-x.
> My psu is a xfx 850w black edition.
> Is the psu enough to add a second 390x?
> My cpu is a 6600k at 4.8.
> 
> Cheers


You should be good to go.


----------



## DarknightOCR

Gdourado,

Does your Tri-X 390X have the 015.048.xxxx BIOS or the 015.049.xxxx one?

Could you confirm?

Thanks


----------



## iscariot

It's a little better, idling between 50 and 60 now, although it is hot here right now and my top rad needs a clean. A bit worried though: it hit 92 playing Fallout 4. That can't be good.


----------



## kubiks

Quote:


> Originally Posted by *Gdourado*
> 
> I currently have a sapphire 390x tri-x.
> My psu is a xfx 850w black edition.
> Is the psu enough to add a second 390x?
> My cpu is a 6600k at 4.8.


No! I have a 4790k and two MSI 390Xs, and I peak at 800W non-OC and up to 1000W with CPU and GPU OC. 1000W minimum or disaster. I tried the 850W SuperNOVA and it would shut the computer down in benches from hitting the PSU max.
Damn hard to type on a cellphone lol
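For anyone sanity-checking these PSU numbers, the arithmetic is simple enough to sketch. The board-power figures below are rough assumptions for illustration (a 390X is rated around 275W but can pull more when overclocked), not measurements from anyone's rig:

```python
# Back-of-envelope PSU sizing: sum the big consumers, then leave
# ~30% headroom so benches don't trip the PSU's over-power limit.
def psu_estimate(gpu_w, n_gpus, cpu_w, other_w=100, headroom=1.3):
    peak = gpu_w * n_gpus + cpu_w + other_w
    return peak, round(peak * headroom)

peak, psu = psu_estimate(gpu_w=275, n_gpus=2, cpu_w=88)
print(peak, psu)  # 738 959
```

So two 390Xs plus a 4790K land around 740W at stock, and the 30% margin puts the recommendation near 1000W, which lines up with the 850W unit shutting down once both CPU and GPU are overclocked.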


----------



## Carlern

Quote:


> Originally Posted by *Tobiman*
> 
> How are you sure the Kraken G10 won't work? I use one on my 290 albeit, it's a powercolor one with reference board.


The MSI 3xx models use a custom PCB layout, not the same as the 290 and 290X. However, I found an Alphacool waterblock for the MSI 390, so I'm golden.


----------



## Carniflex

Do any of you know if a full-cover block exists for the Gigabyte 390X G1, or if anyone is planning to make one? The usual suspects (like the EK configurator) do not seem to be of any help, but surely I am not the first one thinking of putting it under water, considering what a howler it has turned out to be. Or is my only option going with a core-only block? In that case, which would you suggest for the VRMs: the usual little aluminum/copper ones that stick to them with double-sided thermal tape?


----------



## bichael

My guess would be that Alphacool will probably release a GPX block for it at some point. You could check with one of the Aquatuning reps to see if they have an idea of when one might be coming. They've been pretty helpful while I've been trying to get hold of just the upgrade-kit part to put one on my 390 PCS+, which hopefully I will have soon.


----------



## Levys

Quote:


> Originally Posted by *jon666*
> 
> I had to re solder a connection because somehow I broke it.


You are joking, right? I did exactly the same.

I broke off a capacitor and an even smaller part I can't recall the name of, but it was DAMN small.
I had to sacrifice an old Nvidia card to get the parts to solder back on.

It took me forever to fix it, and I didn't even notice the smaller part was missing to begin with. So when I put my card back together with my water cooling, the first time a tube leaked where it connected to my GPU (just above it, on the inside), spraying water all over and in between the waterblock and the PCB, together with the awful beep warning me my GPU wasn't working.

I thought it was dead for sure.

So I took it apart, dried it, and took a closer look. I saw the smaller part missing, and used my soldering skills, a magnifying glass, and a needle to hold the damn small thing in place. After about 7 tries I got it right...

Phewww, what a story. And now my card runs perfectly, clocking to and over 1225/1750.


----------



## TsukikoChan

Quote:


> Originally Posted by *Levys*
> 
> You are joking, right? I did exactly the same.
> 
> I broke off a capacitor and an even smaller part I can't recall the name of, but it was DAMN small.
> I had to sacrifice an old Nvidia card to get the parts to solder back on.
> 
> It took me forever to fix it, and I didn't even notice the smaller part was missing to begin with. So when I put my card back together with my water cooling, the first time a tube leaked where it connected to my GPU (just above it, on the inside), spraying water all over and in between the waterblock and the PCB, together with the awful beep warning me my GPU wasn't working.
> 
> I thought it was dead for sure.
> 
> So I took it apart, dried it, and took a closer look. I saw the smaller part missing, and used my soldering skills, a magnifying glass, and a needle to hold the damn small thing in place. After about 7 tries I got it right...
> 
> Phewww, what a story. And now my card runs perfectly, clocking to and over 1225/1750.


did the same to my ps2 some years ago  broke like 3 of those and took me some time to resolder new ones :-D i know your pain.


----------



## KNG HOLDY

I broke my motherboard (Z77 Pro3) a long time ago, and my R9 390 is running on PCIe 2.0 x4.

Would it be worth buying a new motherboard, and maybe getting an even better CPU OC?

Or would you guys wait a few months and upgrade completely to Skylake?


----------



## Sgt Bilko

Quote:


> Originally Posted by *KNG HOLDY*
> 
> i broke my motherboard (z77 pro3) a long time ago and my r9 390 is running on pcie 2.0 x4
> 
> would it be worth to buy a new motherboard and maybe get even better cpu oc?
> 
> or would you guys wait a few month and totally upgrade to skylake?


What CPU you running?

Fill out your sig rig if you can, it'll help out


----------



## KNG HOLDY

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What CPU you running?
> 
> Fill out your sig rig if you can, it'll help out


Will do later :c but I can post it here before, hehe

i5-3570k
MSI R9 390 (on PCIe 2.0 x4)
Asrock Z77 Pro 3
EVGA 750W G2

I'm just running my CPU @ 4.2GHz. I was able to pass all CPU stability tests at 4.4GHz with a vcore of ~1.3V, but it crashed CS:GO, wut?


----------



## Sgt Bilko

Quote:


> Originally Posted by *KNG HOLDY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> What CPU you running?
> 
> Fill out your sig rig if you can, it'll help out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> will do later :c but i can send it here before hehe
> 
> i5-3570k
> MSI R9 390 (on PCIe 2.0 x4)
> Asrock Z77 Pro 3
> EVGA 750w G2
> 
> im just running my cpu @ 4.2GHz, i was able to pass all cpu stability tests at 4.4GHz with a vcore of ~1.3V but it crashed csgo wuut?
Click to expand...

Might be worth holding out and getting a Skylake rig if you'd rather; PCIe x4 won't be a big hit in gaming... maybe 5fps at most, I'd say.


----------



## KNG HOLDY

"Real performance losses only become apparent in x8 1.1 and x4 2.0, where the performance drop becomes noticeable with around 15%. We also tested x4 1.1, though of more academic interest, and saw performance drop by up to 25%, an indicator that PCIe bandwidth can't be constrained indefinitely without a serious loss in performance."
https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/22.html

are there any big cpu releases for '16?


----------



## Sgt Bilko

Quote:


> Originally Posted by *KNG HOLDY*
> 
> "Real performance losses only become apparent in x8 1.1 and x4 2.0, where the performance drop becomes noticeable with around 15%. We also tested x4 1.1, though of more academic interest, and saw performance drop by up to 25%, an indicator that PCIe bandwidth can't be constrained indefinitely without a serious loss in performance."
> https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/22.html
> 
> are there any big cpu releases for '16?


13% at 1920x1080 and 14% at 1600x900, with the gap closing the higher the resolution, which makes sense (fewer fps means less traffic).

I assume you're at least at 1080p, if not 1440p?

'16 should see Kaby Lake and Zen, both releasing around 2H 2016.
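To put rough numbers on that trade-off: the 13% figure is the techpowerup result at 1080p, while the base fps values below are arbitrary examples, not benchmarks.

```python
# fps given up to a percentage bandwidth penalty:
# the absolute cost shrinks as the base frame rate drops.
def fps_lost(base_fps, penalty_pct):
    return base_fps * penalty_pct / 100

print(fps_lost(60, 13))  # 7.8
print(fps_lost(40, 13))  # 5.2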


----------



## KNG HOLDY

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 13% at 1920x1080 and 14% at 1600x900 with the gap closing the higher the resolution which makes sense (less fps means less traffic)
> 
> I assume you're at least at 1080p if not 1440p?
> 
> 16 should see Kaby Lake and Zen releasing both around 2H of 2016


yeah im playing at 1080p


----------



## Sgt Bilko

Quote:


> Originally Posted by *KNG HOLDY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 13% at 1920x1080 and 14% at 1600x900 with the gap closing the higher the resolution which makes sense (less fps means less traffic)
> 
> I assume you're at least at 1080p if not 1440p?
> 
> 16 should see Kaby Lake and Zen releasing both around 2H of 2016
> 
> 
> 
> yeah im playing at 1080p
Click to expand...

Well, you've got a tough choice ahead of you: new board or new rig... personally, I'd opt for the new rig.


----------



## Joe88

Just got myself an MSI R9 390 GAMING 8G








upgraded from a HIS 4870

gpuz - http://www.techpowerup.com/gpuz/details.php?id=n4w2y
cpuz - http://valid.x86.fr/62cs6s


----------



## ManofGod1000

Quote:


> Originally Posted by *Joe88*
> 
> Just got myself an MSI R9 390 GAMING 8G
> 
> 
> 
> 
> 
> 
> 
> 
> upgraded from a his 4870
> 
> gpuz - http://www.techpowerup.com/gpuz/details.php?id=n4w2y
> cpuz - http://valid.x86.fr/62cs6s


Now that is what I call an upgrade.


----------



## Charcharo

Heh. I upgraded from the old ATI 5770









Damn... that was a video card. It still played new games... sure, maybe at low-medium settings, but it did...


----------



## raszh

I need an opinion, which of the following cards would you get:

Asus R9 390 Strix OC - 348 USD,
Powercolor R9 390 PCS+ - 358 USD
MSI R9 390 Gaming - 390 USD

The Strix seems to run hotter than the other two but is the cheapest (at the moment). The MSI seems to be the best for overclocking and has the best temperature but is it worth the extra money?
I don't know much about the Powercolor.

Keep in mind my case is permanently open! So I don't have a perfect air flow and don't want to sit next to a vacuum cleaner while playing. So the card should be somewhat quiet.

Thanks for your answers!


----------



## Carniflex

Quote:


> Originally Posted by *raszh*
> 
> I need an opinion, which of the following cards would you get:
> 
> Asus R9 390 Strix OC - 348 USD,
> Powercolor R9 390 PCS+ - 358 USD
> MSI R9 390 Gaming - 390 USD
> 
> The Strix seems to run hotter than the other two but is the cheapest (at the moment). The MSI seems to be the best for overclocking and has the best temperature but is it worth the extra money?
> I don't know much about the Powercolor.
> 
> Keep in mind my case is permanently open! So I don't have a perfect air flow and don't want to sit next to a vacuum cleaner while playing. So the card should be somewhat quiet.
> 
> Thanks for your answers!


I'm not sure the 390 is a card that could be called somewhat quiet. At least my Gigabyte G1 390X is quite a howler and sits at 60C idle and 90C under full load, with two 140mm fans blowing cool air onto it. So my suggestion would be (1) the one that has an existing waterblock (if you want to put it under water), or (2) the one with the largest cooler and the most fans, i.e., not the MSI one with just 2 fans on it.


----------



## Dundundata

That MSI is at 390X money... I've read good things about the PowerColor.


----------



## Rmosher

Got a good question here. Does anyone know which versions of UVD and VCE are on the 390X? I can't seem to get a good answer anywhere; Wikipedia says it's different from the 290X, but I'm not sure, with the 390X mostly being a rebrand. I apologize if this isn't the right thread...


----------



## LocoDiceGR

Quote:


> Originally Posted by *Rmosher*
> 
> Got a good question here. Does anyone know which version UVD and VCE are on the 390X? Can't seem to get a good answer anywhere, wikipedia says it's different than the 290X but I'm not sure due to the 390X mostly being a rebrand. I apologize if this isn't in the right thread....


It's GCN 1.1, so I think UVD 4.0.

GCN 1.2 has the UVD 5.0 ASIC block,

and Fiji has the UVD 6.0 ASIC block.


----------



## raszh

Quote:


> Originally Posted by *Carniflex*
> 
> I'm not sure the 390 is a card that could be called somewhat quiet. At least my Gigabyte G1 390X is quite a howler and sits at 60C idle and 90C under full load, with two 140mm fans blowing cool air onto it. So my suggestion would be (1) the one that has an existing waterblock (if you want to put it under water), or (2) the one with the largest cooler and the most fans, i.e., not the MSI one with just 2 fans on it.


I had a Palit GTX 770 which no longer works. That card wasn't super quiet but also not really loud.
Quote:


> Originally Posted by *Dundundata*
> 
> That MSI is 390x money... I've read good things about the powercolor


The 390x cards start at 456 USD in my country.


----------



## Joe88

I picked up the MSI for $230 after rebate from newegg's black friday sale
as far as temps with the default fan curve, I have trouble going over 71C at full load, idles around 50C with fans off


----------



## mandrix

PCS 390X 1200/1670. My best Fire Strike so far, 1.305v/1.047v. Crimson beta.


----------



## thanozr

Quote:


> Originally Posted by *Joe88*
> 
> I picked up the MSI for $230 after rebate from newegg's black friday sale
> as far as temps with the default fan curve, I have trouble going over 71C at full load, idles around 50C with fans off


$230? WoW! I got my Sapphire TRI-X OC for $490!


----------



## Devilywan88

Got my 390 Strix last weekend, but the temps are a bit high for my liking.


----------



## FooSkiii

Hey guys, just installed Windows 10 on my system and I'm having a problem resizing my second monitor :/
My 1st monitor is sized perfectly, but my second one is oversized; the Windows taskbar is below the screen.

Here is a pic: http://puu.sh/lLAVA/c41b36708e.jpg

Any help would be greatly appreciated


----------



## Dundundata

Trying to finish my rigbuilder but I cannot find the MSI 390 in product match, anyone know what I can search for to pull it up? Tried a number of things but no dice.


----------



## Carniflex

Have any of you watercooled a Gigabyte 390X or 390 G1 with a core-only block? If so, how are the temps under water, what did you do for the VRMs, and how hot are they getting, if you have taken a look at them?

Just the usual glue-on small aluminum/copper heatsinks and some airflow, or do the VRMs need something fancier?


----------



## kizwan

Quote:


> Originally Posted by *FooSkiii*
> 
> hey guys just installed windows 10 on my system and i'm having a problem resizeing my second monitor :/
> my 1st monitor is sized perfect but my second one is over sized the windows bar is below the screen
> 
> here is a pikk http://puu.sh/lLAVA/c41b36708e.jpg
> 
> any help would be greatly appreciated


Did you try overscan? You should be able to select the monitor you want to underscan/overscan in the Crimson control panel.


----------



## SirBubby

I keep getting a black screen when trying to OC my MSI 390. Wondering if you guys had any ideas...

I've been using the MSI Gaming App in OC mode at 1060/1525 with no issues. I decided I wanted to push it harder this weekend.

I downloaded MSI Afterburner and did some reading. I left the mV and power limit stock and decided to try 1070/1525. As soon as I opened 3DMark I got a flickering black screen. When I launched Fire Strike my screen went black, but I could hear audio. I then tried setting the power limit to +20 and got the same result. I then tried +20mV with the power limit at +20, same result. I tried a bunch of different settings and always got a black screen. I then removed Afterburner and tried the Crimson utility instead, same result. The only thing that seems to work is stock or the MSI Gaming App. Anyone have any ideas?

Setup:
PSU: EVGA 850 g2
CPU: 6700k
RAM: 16gb gskill
GPU: MSI 390
Drivers: Crimson 15.11.1 beta

Thanks


----------



## Tobiman

Quote:


> Originally Posted by *Carniflex*
> 
> Has any of you watercooled a Gigabyte 390X or 390 G1 with a core block - if you have how are the temps under water and what did you do for VRM's and how hot are they getting if you have taken the look at them?
> 
> Just the usual glue-on small aluminum/copper heat-sinks and some airflow or does it need something fancier for VRM's?


If you don't already have a water loop, i'd just go with an AIO plus kraken G10 combo. 65 C max on my overclocked r9 290.


----------



## Tobiman

Quote:


> Originally Posted by *SirBubby*
> 
> I keep getting a black screen when trying to OC my MSI 390. Wondering if you guys had any ideas...
> 
> I've been using the MSI gaming app in OC mode 1060/1525 with no issues. I decided I wanted to push it harder this weekend.
> 
> I downloaded MSI Afterburner and did some reading. I left the mv and power limit stock and decide to try 1070/1525. Soon as I open 3d marks I got a flickering black screen. When I lunched firestrike my screen went black but I could hear audio. I then tried setting the power limit to 20 and got the same result. I then tried mv and 20 and power limit at 20 and got the same result. I tried a bunch of different settings and always got a black screen. I then removed afterburner and used crimson utility and tried that, same result. Only thing that seems to work is stock or msi gaming app. Anyone have any ideas?
> 
> Setup:
> PSU: EVGA 850 g2
> CPU: 6700k
> RAM: 16gb gskill
> GPU: MSI 390
> Drivers: Crimson 15.11.1 beta
> 
> Thanks


It could be that your memory is at its limit, especially if it's Elpida. Most apps don't gain a lot from a memory overclock these days, so I wouldn't be too worried. Try overclocking just the core and see if you still get black screens. Also, use GPU-Z to confirm whether your memory is Elpida, Hynix, or Samsung; the last two are very good overclockers.
It could also mean your card needs more juice. Try +50mV and see if it sticks.


----------



## SirBubby

Quote:


> Originally Posted by *Tobiman*
> 
> It could be that your memory could be at its limit especially if it's elpida. Most apps don't gain a lot from memory overclock these days so I wouldn't be too worried. Try overclocking just the core and see if you still get blackscreens. Also, use GPU-Z to confirm if you your memory is elpida or hynix or samsung. The last two are very good overclockers.
> It could also mean that your card needs more juice, try 50mv and see if it sticks.


I'll check GPU-Z when I get home and try just the core. My ASIC quality is only 64%.

Thanks for the suggestions!


----------



## Dundundata

Well, for starters there's really no reason not to set the power limit to +50. But your card should handle 1070/1525 no problem. I would try uninstalling the drivers and using DDU to completely remove them, and re-seating your video card in the motherboard, because that doesn't sound right.

For memory, he was talking about the video card's VRAM, and it should be able to handle more than that, but I have Hynix so I'm not sure about Elpida.


----------



## Tobiman

Quote:


> Originally Posted by *Devilywan88*
> 
> got my 390 Strix last weekend. but the temp is a bit high for my liking.


Your card is having a hard time getting any air; there's barely any space between it and the PSU cover. I would use something to prop it up on the other side so the card can get air.


----------



## Tobiman

Quote:


> Originally Posted by *raszh*
> 
> I need an opinion, which of the following cards would you get:
> 
> Asus R9 390 Strix OC - 348 USD,
> Powercolor R9 390 PCS+ - 358 USD
> MSI R9 390 Gaming - 390 USD
> 
> The Strix seems to run hotter than the other two but is the cheapest (at the moment). The MSI seems to be the best for overclocking and has the best temperature but is it worth the extra money?
> I don't know much about the Powercolor.
> 
> Keep in mind my case is permanently open! So I don't have a perfect air flow and don't want to sit next to a vacuum cleaner while playing. So the card should be somewhat quiet.
> 
> Thanks for your answers!


PCS+ anyday.


----------



## jodybdesigns

I have been watching and reading and watching and reading, but I think it is time to ask.

Is it truly worth replacing my crossfired 7950s with an R9 390? I see the power consumption is high, but heat isn't a problem, as I plan on putting the 390 under water. My 7950s are NOT under water; constraints in finding blocks and backplates for my PCB meant I couldn't put them under water. But a pro is having only one video card, with no Crossfire headaches. Although, when drivers aren't an issue, Crossfire is amazing.

I will be playing exclusively on 1440p. All modern titles, including first day releases.

Crossfire 7950's?
Or a R9 390?


----------



## Agent Smith1984

Quote:


> Originally Posted by *jodybdesigns*
> 
> I have been watching and reading and watching and reading, but I think it is time to ask.
> 
> Is it truly worth upgrading my Crossfire 7950's for a R9 390? I see the power consumption is high, but heat isn't a problem as I plan on putting the 390 on water. My 7950's are NOT on water. Restraints on finding blocks and back plates for my PCB resulted in not being able to put them on water. But a pro is only having one video card, with no Crossfire headache. Although when drivers aren't an issue, Crossfire is amazing.
> 
> I will be playing exclusively on 1440p. All modern titles, including first day releases.
> 
> Crossfire 7950's?
> Or a R9 390?


Nah man, not unless you really are having issues with Crossfire support.

2x Tahitis still make for a nice setup, especially if they clock decently.


----------



## b0uncyfr0

So are the 390Xs good at overclocking? If they're simply higher-binned 290Xs, do they have that much OC headroom?

The average 290X can hit 1100MHz and quite a few hit 1150, which is good. What are the enthusiast cards like the Tri-X, MSI, and PCS+ getting in the OC department; is 1200MHz common?


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Na man, not unless you really are having issues with Crossfire support.
> 
> 2) Tahiti's still makes for a nice setup, especially if they clock decent.


Fair enough. I may spend the money elsewhere to TRY to get some backplates for my 7950s. I know there has to be somebody around here who can whip them up for me. I have seen the universal blocks made by Swiftech; I hear they are pretty damn good.

And I must admit, AMD has been on their game about releasing drivers more frequently lately. Fallout 4 had drivers quicker than about any title I have seen in a long time. Although this may be because the updates were part of a "package" that happened to include the update for the AMD title Star Wars: Battlefront. Or it may be AMD stepping up to step on Nvidia titles the way Nvidia did with Battlefield 4.

But thanks!

But thanks!


----------



## ZealotKi11er

Quote:


> Originally Posted by *b0uncyfr0*
> 
> So are the 390x's good at overclocking? If they're simply higher binned 290x's, do they have that much OC headroom?
> 
> The average 290x can hit 1100 mhz and quite a few hit 1150, which is good. What are the enthusiast cards like TriX, Msi and PCS getting in the OC department ; 1200mhz common?


My 290X can do 1225MHz, and 1275MHz in benchmarks. The only thing better with the 390X is the memory.


----------



## Devilywan88

Quote:


> Originally Posted by *Tobiman*
> 
> Your card is having a hard time getting any air. There's barely any space between it and the PSU cover. i would use something to prop it up on the other side so the card can get air.


I tried putting the PSU on the other side already; it didn't make any improvement. The temps are much worse than in my Prodigy case, about 8-10C higher on the GPU. I regret buying this case; reviews rarely mention how bad the airflow in it is.

My GPU temp is reaching 94C at 64% fan speed. I can't stand hearing the fan noise when gaming. Thinking of buying a Corsair Air 240 soon.


----------



## iscariot

Quote:


> Originally Posted by *Devilywan88*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tobiman*
> 
> Your card is having a hard time getting any air. There's barely any space between it and the PSU cover. i would use something to prop it up on the other side so the card can get air.
> 
> 
> 
> I tried putting the PSU on the other side already, Didn't make any improvement. The temp is much different from my Prodigy case. About 8-10c different on gpu temp. Regret buying this case. The review rarely mentioned about the airflow is so bad on this case.
> 
> My gpu temp reaching 94c with 64% fan speed. Cant stand hearing the fan noise when gaming. Thinking to buy Corsair Air 240 soon.
Click to expand...

Mine hit 91C at 99% while playing FO4 on Ultra. I don't particularly like it either.


----------



## Carniflex

Quote:


> Originally Posted by *Tobiman*
> 
> If you don't already have a water loop, i'd just go with an AIO plus kraken G10 combo. 65 C max on my overclocked r9 290.


My loop is not assembled atm (I downgraded to air when I decided to make my portable setup 20 kg lighter). I have about 4 GPU core blocks, a pile of tubing and fittings, and 3 really small pumps + their tops lying around, so my only investment, if I can't find a full-cover block for the G1 390X, is a new (smaller) radiator, as my current rad is a Nova 1080, a ~5 kg, 9x 120mm thing. As I would be putting only this one GPU under water this time, I can get away with a much smaller (and substantially lighter) radiator.

65C is pretty good. I have no experience with AIOs, but that seems to be about the same 30-35C temperature drop one gets with a custom loop. What did you do with the VRMs? Did you leave them bare and just point some airflow at the card, use the little additional glue-on heatsinks on them, or was there something specific with the AIO to take care of them? As far as I understand, the 290/290X are basically the same cards... I'm just wondering about the VRMs, considering how hot the card normally runs and how loudly it howls on air (at least in my case).

What I have done myself in the past is use these little glue-on heatsinks with a 120mm or 140mm fan pointed at the graphics cards; where there are no vent mounts on the side panel, I have just used double-sided tape to attach a fan to the side panel above the graphics cards. But my experience in this regard is limited to somewhat weaker cards like the 6770 or 7870.


----------



## 5dragons

Guys, I have a question: with my MSI 390X at 100% usage and 100% fan speed, I'm getting 81°C. What is the safe max temp for the card?


----------



## Joe88

Quote:


> Originally Posted by *jodybdesigns*
> 
> And I must admit. AMD has been on good game about releasing drivers more frequently lately. Fallout 4 had drivers quicker than about any title I have seen in a long time. Although, this may be because the updates were part of a "package" that happened to include the update for the AMD title Star Wars: Battlefront. Or it may be AMD stepping up to step on Nvidia titles the way like Nvidia did with Battlefield 4.


They also had Just Cause 3 graphics patches right before launch.


----------



## seanpatrick

Quote:


> Originally Posted by *Devilywan88*
> 
> I tried putting the PSU on the other side already, Didn't make any improvement. The temp is much different from my Prodigy case. About 8-10c different on gpu temp. Regret buying this case. The review rarely mentioned about the airflow is so bad on this case.
> 
> My gpu temp reaching 94c with 64% fan speed. Cant stand hearing the fan noise when gaming. Thinking to buy Corsair Air 240 soon.


You can't really go wrong if you're air cooling w/ the Air 540. Mine is fantastic and easily surpasses every other case I've owned (15 or so over the years) in terms of temperatures. You can have 2 x 140 or 3 x 120mm case fans blowing directly on your components (with nothing to block the air-flow), it's pretty hard to beat.

Bear in mind that some 390s run hotter than others. The XFX, Sapphire, and PowerColor PCS+ cards run fairly cool (all else being equal), whereas the ASUS and MSI cards run hot; Gigabyte falls somewhere in between.

In any case you'll definitely notice improved temps (unless there's something wrong with your card) with the 540.


----------



## Joe88

My MSI runs pretty well as far as temps go: running Witcher 3 maxed out (even GimpWorks) @ 1080p, it went up to 71C with the stock fan curve.
Might just be the silicon lottery or case airflow; I have the Enthoo Pro with the huge 200mm fan in the front.


----------



## Carniflex

Quote:


> Originally Posted by *5dragons*
> 
> guys i have a question, with my msi 390x at 100% usage and 100% fan speed, i´m getting 81° what is the safe max temp for the card?


It should throttle at 95 C


----------



## Dundundata

Quote:


> Originally Posted by *Joe88*
> 
> My msi runs pretty good as far as temps go, running witcher 3 maxed out (even gimpworks) @ 1080p it went up to 71C with stock fan curve
> might be just silicon lottery or just poor case airflow, I have the enthoo pro with the huge 200mm fan in the front


Same here: Corsair SPEC-03 case with the HDD bay removed and six case fans (probably overkill), and it stays cool. I don't use the stock curve, but it usually gets into the 60% range during hard gaming. I run Witcher 3 maxed as well, minus foliage distance, which I set to high. HairWorks stays on though, because Geralt's hair must look good.









In fact, with the same setup my XFX got hotter than the MSI does; it was still fine though.


----------



## shadowking1711

Hey, can you add me to the club? I have a Sapphire R9 390X.

I also have a question: what does the UEFI button on the Tri-X card do, and what is a good overclock anyone has gotten on this card?

Any tips for a new overclocker starting out?

Thanks


----------



## Carniflex

I have received some information regarding water-cooling options for the Gigabyte 390X G1. I wrote to Alphacool a few days ago asking if they have a suitable block for this card, and apparently they will relatively soon.
Quote:


> thank you for your email, we will publish a matching cooler for your card in the few days.


Probably not a full cover (although one can hope, of course), as they don't do full-cover blocks strictly speaking, but something in their NexXxoS GPX series, which is basically the next best thing: a GPU core block plus a relatively large passive heatsink (+ backplate) for the rest of the components.


----------



## dave2150

I knocked 15°C off the temperatures of my MSI 390X by reapplying the thermal paste. There was hardly any paste on my GPU, and what was there appeared to have migrated around the GPU instead of staying on top of the die.

It used to run at 85-90°C at load; now it maxes out at 75°C. Massive reduction in fan speed and noise also.









It only takes 10 minutes to reapply the thermal paste; 8 screws are all you have to remove. Well worth it.


----------



## Carniflex

Quote:


> Originally Posted by *dave2150*
> 
> I knocked 15°C off the temperatures of my MSI 390X by reapplying the thermal paste. There was hardly any paste on my GPU, and what was there appeared to have migrated around the GPU instead of staying on top of the die.
> 
> It used to run at 85-90°C at load; now it maxes out at 75°C. Massive reduction in fan speed and noise also
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It only takes 10 minutes to reapply the thermal paste, 8 screws is all you have to remove, well worth it.


That's good info. I might have to give it a try, although I've already ordered the bulk of the parts for my GPU loop.


----------



## battleaxe

Quote:


> Originally Posted by *shadowking1711*
> 
> Hey can you add me to the club I have an sapphire R9 390x
> 
> and i have an question what does the UEFI button do on the TriX card do and what is a good overclock that anyone has got on this card
> 
> and any tips to give a new overclocker to start?
> 
> Thanks


Check out the opening post on how to add your cards please.


----------



## Dundundata

Quote:


> Originally Posted by *dave2150*
> 
> I knocked 15°C off the temperatures of my MSI 390X by reapplying the thermal paste. There was hardly any paste on my GPU, and what was there appeared to have migrated around the GPU instead of staying on top of the die.
> 
> It used to run at 85-90°C at load; now it maxes out at 75°C. Massive reduction in fan speed and noise also
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It only takes 10 minutes to reapply the thermal paste, 8 screws is all you have to remove, well worth it.


I did this with an XFX, which had the opposite problem: too much paste. Well worth it if you have a good setup and your card still runs hot. I knocked 10°C off that card, going from about 80°C to 70°C.


----------



## Noviets

So what's the deal with the linear GPU clock now?
It seems to go from the idle state of 300MHz up to the OC amount of 1200MHz, but spends 99% of the time between 600-800MHz.

Wouldn't it be better to have it clocked at its highest and sit at 30% load, rather than sit at 600MHz and 100% load?

Also, has anyone encountered really erratic GPU load from their 390X? I'm wondering if it's to do with the clock going up and down a lot, but even standing still in a game like Fallout the GPU goes from <5% (sometimes hitting zero) to 100%. I'm talking constant bouncing.

It seems to be fine in Firestrike, but every game I have played so far feels very jittery, for lack of a better word.

Are you guys using the Beta drivers?
Would anyone mind posting their GPU load with a 390x from Fallout 4 on Ultra?


----------



## jodybdesigns

Quote:


> Originally Posted by *Noviets*
> 
> So what's the deal with the linear gpu clock now?
> Seems to go from the idle state of 300mhz and to the oc amount of 1200mhz. but spends 99% of the time between 600-800mhz.
> 
> Would it not be better to have it clock as it's highest and be at 30% load, rather than be at 600mhz and be at 100% load?


Are you running multiple monitors? This usually happens with AMD cards when you have multiple monitors; it keeps them from "flickering" because of ULPS.


----------



## Noviets

Quote:


> Originally Posted by *jodybdesigns*
> 
> You running Multi Monitors? This usually happens with AMD cards when you have Multi Monitors. It keeps them from "flickering" because ULPS.


I do have multiple monitors, but I do not use Eyefinity.
I'll try turning off ULPS with MSI Afterburner and see if it makes a difference.

Edit: @jodybdesigns I freaking love you! That solved the issue completely.
So why was it causing that? It was horrible!
Do I need to keep AB running for ULPS to stay disabled, or is it a set-and-forget type of deal?


----------



## iangus

So, has anyone tried the hybrid water block from Alphacool on an MSI 390 yet? I'd like to know how hot the VRMs get under load.


----------



## kizwan

Quote:


> Originally Posted by *Noviets*
> 
> So what's the deal with the linear gpu clock now?
> Seems to go from the idle state of 300mhz and to the oc amount of 1200mhz. but spends 99% of the time between 600-800mhz.
> 
> Would it not be better to have it clock as it's highest and be at 30% load, rather than be at 600mhz and be at 100% load?
> 
> Also has anyone encountered the really eratic GPU load from their 390x? I'm wondering if it's to do with the clock going up and down alot, but it even standing still in a game like fallout the gpu goes from <5% (sometimes hitting zero) to 100% I'm talking constant bouncing.
> 
> It seems to be fine in Firestrike, but every game I have played so far I feel is very jittery, for lack of a better word.
> 
> Are you guys using the Beta drivers?
> Would anyone mind posting their GPU load with a 390x from Fallout 4 on Ultra?


It's normal for Hawaii GPU usage to fluctuate a lot at 1080p, but it shouldn't affect your gaming experience. The fluctuation gets smaller and smaller at higher resolutions. As for the clock, the fluctuation should be minor if any, with the card running at max frequency most of the time. I don't have Fallout, but if I'm not mistaken there are still issues with the drivers, right? The clock fluctuating a lot like that usually happens if the power level exceeds the TDP, or if you have an FPS target set in the CCC/Crimson control panel. Stuttering can be caused by a couple of things, like power saving, the pagefile, or an unstable overclock. Games are going to stress the GPU more than a bench like Firestrike.


----------



## Thenew22

Hey guys, after my Tri-X 290X suffered from bad memory and flickering all over the place, I changed to the R9 390 Strix from Asus. I'm just trying to overclock it. Are there people here who have it and have results for it?

I'm managing 1090 core and 6600 memory without a voltage mod and with +50 power target. After that everything gets a little rougher. Any help on how to start and get the most out of it? I read something about close to 1200MHz GPU clock.


----------



## jodybdesigns

Quote:


> Originally Posted by *Noviets*
> 
> I do have multiple monitors. But I do not use Eyefinity.
> I'll try turning off ULPS with MSI:AB and see if it makes a difference
> 
> Edit: @jodybdesigns I freaking love you! That solved the issue completely.
> So why was it causing it to do that? It was horrible!
> Do I need to keep AB running for ULPS to be disabled or is it a set-and-forget type of deal?


Bad programming is the only thing I can think of. ULPS is the cause of a lot of issues.

I discovered ULPS was a problem for me years ago, when I was overclocking my 4000, 5000, and 6000 series cards; I would hit the same issue as you.

Let me explain how this works. Your card has several BIOS settings telling it how many volts to use, how fast to run, etc. Now, when you overclock your card, some card manufacturers (Sapphire) like to change those stock settings.

Let's say my card's stock clocks are 900/1100. ULPS, when the card is not under load, actually runs 300/600. When I put an overclock on my card, for example 1000/1200, ULPS changes the low-load clocks to 200/500 or 400/600, or however much the manufacturer sets the target at. Sometimes the clocks are too low. It causes the monitors to artifact. It causes fluctuations. You are getting fluctuations because ULPS does not know how to correctly "clock" your card for the current operations being used.

I hate ULPS. Leave it off. Or modify your BIOS manually.

You know when you only see 67% usage in a game? Are you getting a solid 60FPS? Great! This is ULPS in action, locking the card's usage to your Vsync (or 60FPS). If it takes 67% of the card's power to push out a constant 60FPS, that is what you get. Want to see your card's usage go higher? Disable Vsync and watch ULPS push your card to 90% or more.
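For anyone who'd rather not keep a third-party tool around for this: ULPS is just a registry DWORD. A minimal sketch of the usual tweak, with the caveat that the `0000` adapter-instance key below is hypothetical for your system; search the registry for every `EnableUlps` value under the display-adapter class key and zero each one, then reboot:

```
Windows Registry Editor Version 5.00

; {4d36e968-...} is the standard display adapter class key.
; "0000" is one adapter instance - yours may be 0001, 0002, etc.,
; so check each instance for an existing EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

As far as I know, Afterburner's "disable ULPS" option writes this same value, which is why it persists without AB running.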


----------



## kizwan

ULPS only applies to CrossFire, where it puts the secondary (non-primary) cards into a low-power state when idle to reduce power consumption. What you're describing there is PowerPlay.


----------



## Joe88

Hm, I noticed recently, after I hooked up my HDTV over HDMI and cloned the display, that I started getting small graphical flickering when running full screen; it seems to go away in windowed mode.
It's not happening in Just Cause 3, but it is in TF2 and Rayman Legends.
I am not using Eyefinity, just the built-in display settings in Windows 10 to clone the displays.

Would disabling ULPS fix that?


----------



## jodybdesigns

Quote:


> Originally Posted by *Joe88*
> 
> hm I noticed recently after I hooked up my hdtv to hdmi and cloned the display I started getting small graphic flickering when running at full screen, seems to go away with windowed mode
> Its not happening in just cause 3 but it is in tf2 and rayman legends
> I am not using eyefinify just using the built in display settings in windows 10 to clone the displays
> 
> would disabling ULPS fix that?


Try just the HDMI cable without your monitor plugged in (your TV as primary) and see what it does.
Quote:


> Originally Posted by *kizwan*
> 
> ULPS is only applicable to Crossfire where it put the secondary (non-primary) cards to low power state when idle to reduce power consumption. What you're explaining there is PowerPlay.


You're right! Good call! Been a long day.


----------



## seanpatrick

Quote:


> Originally Posted by *Joe88*
> 
> hm I noticed recently after I hooked up my hdtv to hdmi and cloned the display I started getting small graphic flickering when running at full screen, seems to go away with windowed mode
> Its not happening in just cause 3 but it is in tf2 and rayman legends
> I am not using eyefinify just using the built in display settings in windows 10 to clone the displays
> 
> would disabling ULPS fix that?


You're not alone. This was driving me CRAZY with my Sapphire card. I contacted Sapphire and they said they'd never heard of the problem, but it's been commented on in several different threads. My XFX card seems to have less of a problem with it.


----------



## CaveManthe0ne

just got a Sapphire r9 390 Nitro OC w/backplate!
(cooling is the Sapphire cooler- no water or anything.)


----------



## Carniflex

Quote:


> Originally Posted by *iangus*
> 
> So, has anyone tried out this hybrid water block from alphacool on an msi 390 yet. I'd like to know how hot the VRMs get under load.


There is a review of the 290 block (the passive heatsink is the same): http://hardwareoverclock.com/Alphacool-NexXxoS-GPX-R9-290-M01-Wasserkuehlung.htm - it's in German, so Google Translate helps. In a nutshell, the VRMs are supposed to be "OK", although full-cover blocks are of course much better at it.


----------



## Joe88

Quote:


> Originally Posted by *jodybdesigns*
> 
> Try just the HDMI cable without your monitor plugged in (your TV primary) and see what it does.


With just the HDTV plugged in (DVI->HDMI cable) it doesn't do it.
With just the monitor plugged in (DVI cable) it doesn't do it.
When both are plugged in, bars about 1 inch high keep flickering across the screen every so often when playing full-screen games.


----------



## Majentrix

Does anyone know if the 390 Strix will fit in a Silverstone FT02?


----------



## b0uncyfr0

Quote:


> Originally Posted by *Majentrix*
> 
> Does anyone know if the 390 Strix will fit in a Silverstone FT02?


I have a Vapor-X 290X in my FT02. That's almost 12 inches long; as long as yours is under that, you'll be fine.


----------



## jodybdesigns

Quote:


> Originally Posted by *seanpatrick*
> 
> You're not alone. This was driving me CRAZY with my Sapphire card. I contacted Sapphire and they said they'd never heard of the problem, but it's been commented on in several different threads. My XFX card seems to have less of a problem with it.


Quote:


> Originally Posted by *Joe88*
> 
> With just the HDTV plugged (DVI->HDMI cable) in it doesnt do it
> With just the Monitor plugged in (DVI cable) it doesnt do it
> When both are plugged in these about 1 inch high bars keep flickering across the screen every so often when playing full screen games


It's software at this point. It could very well be the HDTV throwing this error back at you. Try extending the displays instead of cloning them and see what happens.


----------



## tomytom99

Does anybody here happen to own a PowerColor 390X Hybrid?
I'm considering getting one to upgrade from a 270X. I need it to improve my Eyefinity performance, and for the fact that it would be easy to replace the radiator's fan if needed (I've had too many cards whose fan bearings died).


----------



## Beavosaur

Picked up an Asus Strix R9 390 open-box for $267 before tax! A pretty nice improvement from my Sapphire HD 7970 OC w/ Boost. The only issue is that it freezes my PC and the screen goes black when I run Battlefield 4 in Mantle mode full screen; it runs just fine in borderless and windowed mode. Not a big deal really. Looking forward to overclocking this bad boy! I'll post results soon.


----------



## seanpatrick

Has anyone tried replacing the thermal paste on their 390? Anything important I should know before attempting it? (I've done it a ton of times on CPUs, but never on a GPU.)


----------



## Dundundata

Quote:


> Originally Posted by *seanpatrick*
> 
> Has anyone tried replacing the thermal paste on their 390? Anything important I should know before attempting? (I've done it a ton of times on CPUs, but never on a Gpu)


Yes, it's basically the same thing; I used the same paste and method as for a CPU.


----------



## Noviets

So am I right in saying there isn't a waterblock for the Gigabyte 390X? This thing is so much louder than my DC2 Top 7970 that I'd like to add it to my loop.


----------



## jaydude

Quote:


> Originally Posted by *Noviets*
> 
> So am I right in saying there isn't a waterblock for the Gigabyte 390x? This thing so so much louder than my DC2 Top 7970 that I'd like to add it to my loop.


Not that I have heard of, unfortunately. I replaced the thermal pads on the MOSFETs with some 1mm 11W/mK Alphacool pads and the paste with some Thermal Grizzly Hydronaut on mine; it dropped temps by a good 10°C+ at full load, and it's nowhere near as loud as it was before on the AB default fan profile or a custom one. If worse comes to worst, this might help as a last resort, other than buying a new card all over again.


----------



## Noviets

Quote:


> Originally Posted by *jaydude*
> 
> Not that I have heard of unfortunately, I replaced the thermal pads on the mosfets with some 1mm 11w/mk alphacool pads and the paste with some thermal grizzly hydronaut on mine, dropped temps by a good 10c+ on full load, no where near as loud as it was before on AB default fan profile and custom, if worse comes to worse this might help as a last resort other then buying a new card all over again


I might do that as well. It's almost summer here now, and it's already pretty loud during gaming.
When it gets to the point where you can hear it over your headphones, it needs to be fixed, lol.

Is Grizzly Hydronaut the best paste, or just what you had on hand?


----------



## Carniflex

Quote:


> Originally Posted by *Noviets*
> 
> So am I right in saying there isn't a waterblock for the Gigabyte 390x? This thing so so much louder than my DC2 Top 7970 that I'd like to add it to my loop.


I have the same problem. I asked various water-cooling manufacturers, and apparently Alphacool is supposed to release one "in a few days", according to the guy I talked to. Their system is a sort of hybrid, with only the core cooled by water and the VRMs/RAM handled by a large passive aluminium heatsink which is card-specific. The alternative is of course using a core-only block and putting some glue-on heatsinks (and a good amount of airflow) on them yourself.

Redoing the thermal paste sounds like a good option as well. After taking a look at mine, there do not seem to be any "warranty void" stickers on the screws at first glance.


----------



## jaydude

Quote:


> Originally Posted by *Noviets*
> 
> I might do that as well. It's almost Summer here now and it's already pretty loud during gaming.
> When it gets to the point you can hear it over your headphones, it needs to be fixed lol.
> 
> Is Grizzly Hydronaut the best paste or just what you had on hand?


Yes, so far it is the best I have used. I also have some Kryonaut that I got first, but it was too hard to spread and the temps did not seem right, so I got the Hydronaut, and so far it is perfect: easy to spread, and temps dropped by a good amount with it and the new pads. I figure the Kryonaut is mainly for CPUs, because my CPU temp dropped a little from what I noticed, but it did nothing for the GPU.

I have used many, many different pastes and the Hydronaut is my new favourite. I love this stuff









Edit: The heatsink on the Gigabyte 390/390X directly cools the MOSFETs and VRMs, so hotter MOSFETs unfortunately mean higher GPU temps. If you can get those MOSFETs cooler, the GPU as a whole will be cooler. I both like and dislike this design: it is a damn fine cooler, but Gigabyte used crap to fill the gaps.


----------



## seanpatrick

Well, I reapplied the paste (it was super easy to do), but it made almost zero difference to temps. They weren't bad to begin with, but I thought I could milk a few degrees out of it. It turns out XFX did a pretty good job with their application:




I noticed a 2-degree temp drop at idle, and no change under load.

I might redo it again a little later tonight just to make sure my application was alright, but I think 78°C under heavy load (with +30mV / +20 power limit / 1150MHz core / 1700MHz memory) is about the best I'm going to do









EDIT: It's so easy that I just re-did it. Same 2-degree drop at idle, but this time I managed a 3-4 degree drop under heavy load









I'd say it was worth it. I used Gelid Extreme FWIW.


----------



## flanders010

Can I be added please?









Bought the 390X to replace my 660 SLI setup; I was a bit nervous about going with AMD, but I think it has paid off.

I game on a 1440p monitor and the 660s just couldn't cope with only 2GB of VRAM.



Ben


----------



## flopper

Quote:


> Originally Posted by *flanders010*
> 
> Can i be added please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bought the 390x to replace my 660sli setup, was a bit nervous about going with AMD but i think it has paid off.
> 
> I game on a 1440p monitor and the 660's just couldn't cope with only 2Gb vram.
> 
> Ben


Well, as they say, the dark side has its moments, but the light side of the Force is indeed a powerful ally.








In each iteration, the dark side power user always loses.
Don't the Sith pay attention?

A single card is the way to go, any brand, btw.


----------



## Beavosaur

This is what I've got so far with a quick 30-minute overclock. Had to do it while my daughter was sleeping, lol.
http://www.3dmark.com/fs/6757262


----------



## SirBubby

Quote:


> Originally Posted by *Tobiman*
> 
> It could be that your memory is at its limit, especially if it's Elpida. Most apps don't gain a lot from a memory overclock these days, so I wouldn't be too worried. Try overclocking just the core and see if you still get black screens. Also, use GPU-Z to confirm whether your memory is Elpida, Hynix or Samsung. The last two are very good overclockers.
> It could also mean that your card needs more juice; try 50mV and see if it sticks.


I checked GPU-Z; it's Hynix, so that's a relief.
Quote:


> Originally Posted by *Dundundata*
> 
> Well for starters there's really no reason not to set power limit to +50. But your card should handle 1070/1525 no problem. I would try uninstalling drivers and using DDU uninstaller to completely remove them, and re-seating your video card in the motherboard, because it doesn't sound right.
> 
> For memory he was talking about the video card but it should be able to handle more than that, but I have hynix so not sure about elpida.


I uninstalled 3DMark, ran DDU, and reinstalled the newest beta drivers. I re-downloaded 3DMark and ran it with no OC on the video card, but it still didn't work. The screen starts to flicker black and then goes black, even if you Esc out of the program. Strange, because it used to work. I can open a game and it works fine. It must be an issue with 3DMark. I guess I need to find another GPU stress tester. Anyone have any recommendations? Unigine Valley?


----------



## Dhoulmagus

Shucks. @Agent Smith1984, not sure if you're still maintaining the OP (or if this has already been covered), but the XFX DD 390/390X has been revised with a taller inductor that makes it incompatible with full-coverage blocks; some Dremel work is needed now if you want to fit a 290X reference block on them.

See this thread: http://www.overclock.net/t/1569032/xfx-r9-390-dd-removed-from-the-ek-fc-r9-290x-v2-ca-cooling-configurator

On that note, I'm trying to pick out a 390 or 390X that has a full-coverage block (or several) available. It seems like my only choices right now are the PowerColor models, which don't seem to be the greatest overclockers. Is this accurate, or are there other reference boards I can still get my hands on (preferably through Newegg)?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Serious_Don*
> 
> Shucks. @Agent Smith1984 not sure if you're still maintaining the OP (or if this has already been covered) but the XFX DD 390/390X has been revised with a taller inductor that makes it incompatible with full coverage blocks, some dremel work is needed now if you want to fit a 290x reference block on them.
> 
> http://www.overclock.net/t/1569032/xfx-r9-390-dd-removed-from-the-ek-fc-r9-290x-v2-ca-cooling-configurator See this thread
> 
> On that note, I'm trying to pick out a 390 OR 390x that has a (or multiple) full coverage blocks available. It seems like my only choice right now are the powercolor models which don't seem to be the greatest overclockers. Is this accurate or are there other reference boards I can still get my hands on (preferably through Newegg)?


I'll get this updated, and thanks for the info.

I am still trying to maintain the thread, but it's been limited lately due to moving, reviews, and testing new hardware.

I normally work on the member list two days a week, but I seldom update the OP, as the information on these cards hasn't changed a ton.

I definitely need to get your info added though


----------



## Dhoulmagus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'll get this updated, and thanks for the info.
> 
> I am still strying to maintain the thread, but it's limited lately due to moving, reviews, and testing new hardware.
> 
> I normally work on the member list 2 days a week, but seldom update the OP as the information on these cards hasn't changed a ton.
> 
> I definitely need to get your info added though


Thanks man. Yeah, I used your OP to figure out which cards I could get and stumbled across that gem (or disappointment). Just don't want anybody to grab one by accident and find out it's not a reference board... This really throws a wrench into my build.


----------



## Carniflex

Quote:


> Originally Posted by *Noviets*
> 
> So am I right in saying there isn't a waterblock for the Gigabyte 390x? This thing so so much louder than my DC2 Top 7970 that I'd like to add it to my loop.


Heard back from Alphacool; apparently this block: http://www.alphacool.com/product_info.php/info/p1682_Alphacool-NexXxoS-GPX is supposed to fit. It seems they have not yet updated their compatibility PDF or online configurator though, so I would give it a couple of days before ordering, to be sure.


----------



## Dhoulmagus

Quote:


> Originally Posted by *Carniflex*
> 
> Heard back from Alphacool - apparently this block: http://www.alphacool.com/product_info.php/info/p1682_Alphacool-NexXxoS-GPX is supposed to fit. Although it seems they have not yet updated their compatibility pdf nor online configurator so I would give it couple of days before ordering to be sure.


So there are options after all outside of PowerColor??


----------



## Charcharo

Quote:


> Originally Posted by *Serious_Don*
> 
> So there are options after all outside of Powercolor??


Why? They are good.


----------



## Dhoulmagus

Quote:


> Originally Posted by *Charcharo*
> 
> Why? They are good.


This purchase isn't specifically for me; I'm just the middle man.

The person in question owns a Tri-X 290, but that board was revised so the block is out. They were looking for another Sapphire product, but the Nitro 390/390X are also not reference boards. I told them I would find out their options, and here I am. Just wanted to know if there were options.


----------



## Carniflex

Quote:


> Originally Posted by *Serious_Don*
> 
> This purchase isn't specifically for me. I'm just the middle man.
> 
> The person in question owns a Tri-X 290 but that board was revised so the block is out. They were looking for another Sapphire product but the Nitro 390/x are also not reference boards, I told them I would find out their options and here I am.. Just wanted to know if there were options.


If you do not have the card yet, then there are a number of them which do have *full cover* blocks. The Alphacool one is a hybrid block, with a large aluminium passive heatsink for the VRMs and RAM and a water-cooled block for the GPU core. For example, here: http://www.ekwb.com/configurator/ you can check whether EK makes a full-cover block for any specific card; the Asus DC2 390X, say, has a full-cover block from them.

If Alphacool is OK, then I believe they also had a version for the 390X Nitro.

Edit: Then again, Alphacool also has a block for the Tri-X 290, so if Alphacool is OK the guy might not have to upgrade at all, as I can't remember off the top of my head anything a 390/390X can do that a 290 cannot, seeing as they are basically the same card.


----------



## yuannan

I'm about to bite the bullet on a 390, but I'm stuck between the MSI, the Tri-X, and the ASUS.

I've posted on LTT, and people there say the ASUS DC3 is very bad due to poor heat dissipation, and loud.
The Tri-X is said to be very quiet, inaudible until 60% fan speed,
and the MSI is said to be very good at OC but on the loud side.

Is there any truth to this?

My current 380 is an MSI; it can be heard from 30% fan speed and heats up to 80+°C (at 30%).
It can keep a cool 65°C at full fan speed, but then it is crazy loud.
Will the 390 be the same, considering it's the same cooler (TF V)?

Can anyone give me info in the format I just used for the ASUS and Tri-X, on whatever other cards you think are worth the money?
And which variant is the "best" one? I mainly want a quiet and cool card, and most likely will not be overclocking.

Thanks in advance.


----------



## blode

Is there a way to keep my clock from jumping? I'm playing Rocket League; it's locked at 60 FPS and isn't hard to run (a lot of people play it on laptops), and it ran fine on my 780 Ti, which died a week ago. With the 390, my clock is waffling between 400 and 900MHz, stuttering and stammering and introducing artifacts on the field as well. The card does the same thing when I play Just Cause 2, but it stays a little higher and I don't actually notice any performance problems there. I have one monitor, so I don't know how ULPS would be to blame, and I've tried +0, +20 and +50 on the Crimson global Overdrive power setting.

Any ideas to stop this dumb power-saving feature, or whatever's going on?


----------



## jaydude

Quote:


> Originally Posted by *yuannan*
> 
> I'm about to bite the bullet on a 390.
> But I'm stuck between the MSI and Tri-X and ASUS.
> 
> I've posted on LTT and the people there says the ASUS DC3 is very bad due to poor heat dissipation and loud.
> The Tri-X is said to very quite and inaudible until 60% fan speed.
> and the MSI is said to be very good at OC but on the loud side.
> 
> Is there any truth to this?
> 
> My current 380 is MSI and can be heard from 30% fan speed and heats up to 80+C (at 30%).
> But can keep a cool 65C at full fan speed. But it is crazy loud.
> Will the the 390 be the same considering it's the same cooler (TF V).
> 
> Can anyone give me info in the format like i just did with the ASUS, TRI-X and other cards you guys might think is worth the money.
> And what variant is the "best" one, mainly want a quiet and cool card. Most likely will not be overclocking.
> 
> Thanks in advance.


From what I have seen, in terms of temps the ranking so far seems to be: 1 - PowerColor, 2 - XFX, 3 - Sapphire (w/ backplate), 4 - MSI, 5 - Gigabyte (volt-locked), 6 - Asus.

This is from memory so I could be off somewhat, but I do know the PowerColor and XFX 390 coolers are excellent at keeping temps low.









Edited


----------



## seanpatrick

Quote:


> Originally Posted by *jaydude*
> 
> From what I have seen in terms of temps it seems to be so far: 1 - XFX, 2 - PowerColor, 3 - Sapphire "w/backplate", 4 - Gigabyte "volt locked", 5 - Asus "not sure about the newer ones" and last MSI.
> 
> This is from memory so I could be off somewhat, but I do know the Powercolor and XFX 390's coolers are excellent at keeping temps low.


I'm almost with you, but I have both the Sapphire and the XFX, and the Sapphire is about 5 degrees cooler under load and at idle.

Otherwise that ranking sounds accurate based on all the threads I've read.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jaydude*
> 
> From what I have seen in terms of temps it seems to be so far: 1 - XFX, 2 - PowerColor, 3 - Sapphire "w/backplate", 4 - Gigabyte "volt locked", 5 - Asus "not sure about the newer ones" and last MSI.
> 
> This is from memory so I could be off somewhat, but I do know the Powercolor and XFX 390's coolers are excellent at keeping temps low.


Ooh no....

The MSI is much better than Asus with temps!


----------



## Joe88

Just avoid the Asus and Gigabyte cards.
XFX had a lot of heat problems with their R9 290 cooler; not sure if that's fixed with the 390.
Sapphire is pretty much top end, but also more expensive than the other brands,
and MSI has been fine with their cooler AFAIK.


----------



## Dundundata

I've had great temps with MSI and good with XFX. It's weird because people with same cards get different results.


----------



## jaydude

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Ooh no....
> 
> The MSI is much better than Asus with temps!


Fixed, maybe I should have rechecked the reviews first haha. I am kicking myself over not waiting a little longer; I would have picked up the Powercolor 390 instead of the Gigabyte.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jaydude*
> 
> Fixed, maybe I should have rechecked the reviews first haha, I am kicking myself over not waiting a little longer, would have picked up the powercolor 390 instead of the gigabyte


One thing is for sure: if you aren't dead set on max OC, the Giga card is great... Just cause it's Giga, ya know?


----------



## Joe88

Quote:


> Originally Posted by *jodybdesigns*
> 
> Its software at this point. Very well could be the HDTV throwing this error back to you. Try extending the displays instead of cloning them.See what happens.


With the displays extended, the flickering is not happening.
Might be just the Crimson software doing it, or Windows 10...


----------



## Dhoulmagus

Quote:


> Originally Posted by *Carniflex*
> 
> If you do not have the card yet then there is a number of them which do have *full cover* blocks. The alphacool one is a hybrid block with large aluminium passive heatsink for VRM's and RAM and a watercooled core block for GPU core. For example here: http://www.ekwb.com/configurator/ you can chek if EK is making a full cover block for any specific card. Say for example Asus DC2 390X has a full cover block from them.
> 
> If Alphacool is ok then I believe they had also a version for 390x Nitro
> 
> Edit: Then again Alphacool has also a block for Tri-X 290 so if Alphacool is ok then the guy might not have to upgrade at all as I don't remeber from top of my head anything 390/390X could do and 290 could not seeing as they are basically the same card.


Thanks for the reply,

I've tried using the configurator from both Alphacool and EKWB for his Tri-X; unfortunately he has the 'New Edition' 11227-13 model with the modified VRM. I have only been able to find GPU-only blocks, which he doesn't want to do for 'looks' reasons and potentially suboptimal component cooling. If somebody could confirm the Alphacool hybrid block would work, we may be able to work with that. Regardless, I offered to buy his 290, so putting the money toward a 390/X wouldn't be a big deal. It was my fault he bought it in the first place because I thought the Tri-X was reference.

I saw that the 390/X DC2 is compatible but can't seem to find one; they stay out of stock on Newegg, and the 390 on Amazon is $435 - that seems steep. Have they been discontinued? The DC2 390X for $450 *may* be an option; it's a bit over budget, but an obvious choice for only $15 more than the 390.

With that said it seems like the Powercolor is the only option left for a full coverage block (minus potentially passive cooling hybrids) but doesn't seem to have great OC results.

I never realized how stressful this can be for somebody who absolutely requires their system fully water cooled. I see many more options with the 980, but that's a big bump in cost.


----------



## Carniflex

Quote:


> Originally Posted by *Serious_Don*
> 
> Thanks for the reply,
> 
> I've tried using the configurator from both Alphacool and EKWB for his Tri-x, unfortunately he has the 'New Edition' 11227-13 model with the modified VRM. I have only been able to find GPU only blocks which he doesn't want to do for 'looks' reasons and potentially unoptimal chipset cooling. If somebody could confirm the alphacool hybrid block would work, we may be able to work with that. Regardless I offered to buy his 290 so using the money toward a 390/x wouldn't be a big deal. It was my fault he bought it in the first place because I thought the Tri-X was reference.
> 
> I saw that the 390/X DC2 is compatible but can't seem to find one, they stay out of stock on Newegg and the 390 on Amazon is $435 - that seems steep. Have they been discontinued? The DC2 390X for $450 *may* be an option but a bit over budget, but an obvious choice for only $15 more than the 390.
> 
> With that said it seems like the Powercolor is the only option left for a full coverage block (minus potentially passive cooling hybrids) but doesn't seem to have great OC results.
> 
> I never realized how stressful this can be for somebody who absolutely requires their system fully water cooled.. I see much more options with the 980 but that's a big bump in cost.


I believe the vast majority of manufacturers (I can't name any doing otherwise off the top of my head) use exactly the same PCB for both the 390 and 390X versions of their cards.

As far as Alphacool goes - they have a program where you can send your card in, they will make a block for it, and then send you a free block if they start manufacturing it ( http://www.alphacool.com/shop_content.php/coID/58/content/Send-it-and-get-one-cooler-for-free/XTCsid/1a27lgvilv28lsu3bo2kd4tul4 ). Their blocks look OK as far as I understand (I have not used them myself in person so far): a single-slot design, although the thermal pads are a bit on the thick side for the VRMs. Reviews seem to suggest the cooling of auxiliary components is "adequate", i.e. somewhere in the ballpark of 65..70C for the VRMs in the 290 review I saw a while ago, and core cooling performance is what you would expect from water, if not slightly better than some full cover blocks. Just a note that if the FULL system is water cooled, the restriction of their block might be noticeable - it's one of the highest-restriction GPU blocks on the market, as it's basically their CPU block with a different mounting bracket. Then again, it should not be a crippling problem for systems with only a single GPU in the loop.

I'm pretty sure EK ships worldwide, although I have no clue what their shipping cost to the US might be - could be an issue if it turns out to be excessive.

Another possibility is looking into an online shop that deals with water cooling and ordering a card with a water block already installed. I know Aquatuning nowadays has a store on the US side as well, and I'm pretty sure there are some originally US-based stores too; I'm just not aware of them off the top of my head.


----------



## battleaxe

Quote:


> Originally Posted by *Joe88*
> 
> just avoid asus and gigabyte cards
> xfx had alot of heat problems with their r9 290 cooler not sure if it's fixed with the 390
> sapphire is pretty much top end but is also more expensive than the other brands
> and msi has been fine with their cooler afaik


XFX fixed all the issues they had on the 290. It's now one of the best cards, alongside the Sapphire and MSI. Those are probably the top three now.


----------



## SirBubby

Quote:


> Originally Posted by *SirBubby*
> 
> Quote:
> Originally Posted by Tobiman View Post
> 
> It could be that your memory could be at its limit especially if it's elpida. Most apps don't gain a lot from memory overclock these days so I wouldn't be too worried. Try overclocking just the core and see if you still get blackscreens. Also, use GPU-Z to confirm if you your memory is elpida or hynix or samsung. The last two are very good overclockers.
> It could also mean that your card needs more juice, try 50mv and see if it sticks.
> 
> I checked GPU-Z, it's hynix so that's a relief.
> Quote:
> Originally Posted by Dundundata View Post
> 
> Well for starters there's really no reason not to set power limit to +50. But your card should handle 1070/1525 no problem. I would try uninstalling drivers and using DDU uninstaller to completely remove them, and re-seating your video card in the motherboard, because it doesn't sound right.
> 
> For memory he was talking about the video card but it should be able to handle more than that, but I have hynix so not sure about elpida.
> 
> I uninstalled 3DMark, ran DDU and reinstalled the newest beta drivers. I redownloaded 3DMark and ran it with no OC on the video card, but it still didn't work. The screen starts to flicker black and then goes black, even if you Esc out of the program. Strange, because it used to work. I can open a game and it works fine. It must be an issue with 3DMark. I guess I need to find another GPU stress tester. Anyone have any recommendations? Unigine Valley?


UPDATE:

I found out it's not the video card but my new Acer XG270HU monitor. I plugged in my old ASUS and had zero problems. For some reason Afterburner and that Acer won't work together, but I can use the MSI Gaming App no problem.


----------



## mandrix

Quote:


> Originally Posted by *Carniflex*
> 
> I believe that vast majority of manufacturers (as I'm not any of them doing otherwise from top of my head) use exactly same PCB for both 390 and 390X versions of their cards.
> 
> As far as Alphacool goes - they have this program where you can send your card in and they will make a block for it and then send you a free block for your card if they start manufacturing it ( http://www.alphacool.com/shop_content.php/coID/58/content/Send-it-and-get-one-cooler-for-free/XTCsid/1a27lgvilv28lsu3bo2kd4tul4 ). Their blocks look ok as far as I understand (have not used them myself in person so far) single slot design although the thermal pads are a bit on the thick side for VRM's but then again reviews seems to suggest the cooling of auxiliary components is "adequate", i.e, somewhere in the ballpark of 65..70C for VRM's on the 290 review I saw a while ago and core cooling performance is what you would expect from water if not even slightly better than some full cover blocks. Just a note that if FULL system is water cooled the restriction of their block might be noticfeable - it's one of the highest restriction GPU blocks out there on the market as it's basically their CPU block with different mounting bracket. Then again it should not be a crippling problem for systems where only a single GPU is in the loop.
> 
> *I'm pretty sure EK ships world wide, although I have no glue what might be their shipping cost to the US* - might be an issue if it turns out to be excessive.
> 
> Another possibility is looking into some online shop that deals with water cooling and order a card with a water block already installed. I know aquatuning has nowadays a store on the US side as well and I'm pretty sure there is some originally US based stores as well I'm just not aware of them from top of my head


I had to order the water block for my PCS 390x straight from EK as no one here carried them.
Block was $126.49 (US)
Shipping to US was $25.52 for a total of $152.01, and arrived in about 3 days.
PPCs here in Florida did have the backplate, though, and I ordered it from them for $34.99 + S&H.


----------



## flopper

Quote:


> Originally Posted by *SirBubby*
> 
> UPDATE:
> 
> I found out its not the video card but my new ACER Acer XG270HU monitor. I plugged in my old ASUS and had zero problems. For some reason afterburner and that acer won't work together, but i can use the MSI gaming app no problem.


did you set msi afterburner to kernel mode?
Might help.


----------



## SirBubby

Quote:


> Originally Posted by *flopper*
> 
> did you set msi afterburner to kernel mode?
> Might help.


I did not, I'll give that a try. Thank you!

EDIT:

Kernel mode didn't work.

Switched from DP to an HDMI cable; that didn't work.

Turned off FreeSync; that didn't work.

Turned the refresh rate down to 60Hz from 144Hz; that didn't work.

I'm out of ideas. My buddy has the same card and monitor and his works fine. Maybe I have a defective monitor?


----------



## yuannan

Nice information.

I'm very likely to get the Sapphire one right now.

What about noise?

Which cooler makes the most and least noise?


----------



## Vexile

I need help and opinions ASAP. I bought the R9 390 Sapphire Nitro w/ Backplate two weeks ago.
I'm playing a lot of games, but mostly World of Warcraft in my free time, and frames get up to 400 in some areas.
I had no problems up until yesterday, when the card started to have huge coil whine whenever the FPS is not limited to 200.
I repeat, I didn't have the problem earlier.

I gave the whole PC to the place where I got it, and they are testing it with Heaven benchmarks, where the card doesn't make a sound since the FPS is a lot lower. They said my card is a lot quieter and runs better than most 390's. When I told them what the problem was, they said all the 390's and 970's have the same problem with high FPS and that Sapphire will probably not replace my card.

Is that really the situation? Should I try for a replacement, or should I limit the frames to 200 and forget about it..?


----------



## Carniflex

Quote:


> Originally Posted by *Vexile*
> 
> I need help and opinions asap. I bought the r9 390 Sapphire Nitro w/ Backplate two weeks ago.
> I'm playing a lot of games, but mostly World of Warcraft in my free time and frames get up to 400 in some areas.
> I had no problems up until yesterday, when the card started to have huge coil whine when the fps is not limited to 200.
> I repeat, i didn't had the problem earlier.
> 
> I gave the whole pc to the place where i got it and they are testing it with Heaven benchmarks, where the card doesn't make a sound, since the fps is a lot lower. They said my card is a lot quieter and runs better than most 390's. When i told them what's the problem they said all the 390's and 970's have the same problem with high fps and that Sapphire will probably not replace my card.
> 
> Is that really the situation? Should i try for a replacement or should i limit the frames to 200 and forget about it..?


Just limit the frames to whatever is your monitor refresh rate and don't worry about it would be my suggestion. As long as it's not whining during your normal gameplay that is.


----------



## battleaxe

Quote:


> Originally Posted by *Vexile*
> 
> I need help and opinions asap. I bought the r9 390 Sapphire Nitro w/ Backplate two weeks ago.
> I'm playing a lot of games, but mostly World of Warcraft in my free time and frames get up to 400 in some areas.
> I had no problems up until yesterday, when the card started to have huge coil whine when the fps is not limited to 200.
> I repeat, i didn't had the problem earlier.
> 
> I gave the whole pc to the place where i got it and they are testing it with Heaven benchmarks, where the card doesn't make a sound, since the fps is a lot lower. They said my card is a lot quieter and runs better than most 390's. When i told them what's the problem they said all the 390's and 970's have the same problem with high fps and that Sapphire will probably not replace my card.
> 
> Is that really the situation? Should i try for a replacement or should i limit the frames to 200 and forget about it..?


Quote:


> Originally Posted by *Carniflex*
> 
> Just limit the frames to whatever is your monitor refresh rate and don't worry about it would be my suggestion. As long as it's not whining during your normal gameplay that is.


Agreed. Just limit the frames. Your monitor can't display 400 FPS anyway, and you are likely getting some tearing as a result.
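The arithmetic behind capping is worth spelling out: any frame rendered above the panel's refresh rate is never displayed; it just loads the card and feeds the whine. A minimal sketch in Python (the 144 Hz refresh rate is just an example):

```python
# Frame-time and wasted-work arithmetic behind FPS capping.
def frame_time_ms(fps: float) -> float:
    """Milliseconds the GPU spends rendering one frame at a given FPS."""
    return 1000.0 / fps

def wasted_frames_per_second(render_fps: float, refresh_hz: float) -> float:
    """Frames rendered per second beyond what the monitor can display."""
    return max(0.0, render_fps - refresh_hz)

# Example: an uncapped 400 FPS load on a 144 Hz panel
for fps in (60, 144, 200, 400):
    print(f"{fps:>4} FPS -> {frame_time_ms(fps):5.2f} ms/frame, "
          f"{wasted_frames_per_second(fps, 144):5.1f} frames/s never displayed")
```

Capping at the refresh rate cuts the workload to what the monitor can actually show, which is why the whine goes away.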


----------



## dave2150

Quote:


> Originally Posted by *Vexile*
> 
> I need help and opinions asap. I bought the r9 390 Sapphire Nitro w/ Backplate two weeks ago.
> I'm playing a lot of games, but mostly World of Warcraft in my free time and frames get up to 400 in some areas.
> I had no problems up until yesterday, when the card started to have huge coil whine when the fps is not limited to 200.
> I repeat, i didn't had the problem earlier.
> 
> I gave the whole pc to the place where i got it and they are testing it with Heaven benchmarks, where the card doesn't make a sound, since the fps is a lot lower. They said my card is a lot quieter and runs better than most 390's. When i told them what's the problem they said all the 390's and 970's have the same problem with high fps and that Sapphire will probably not replace my card.
> 
> Is that really the situation? Should i try for a replacement or should i limit the frames to 200 and forget about it..?


The FPS cap in WoW is 200 - so I'm not sure where you're getting the 400 FPS mark from. Also, that must have been at 800x600 resolution with the details on low or something.

WoW is quite demanding when maxing it out at 1440P or higher, especially with the recent addition of HBAO+ etc.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dave2150*
> 
> The FPS cap in WoW is 200 - so not sure where your getting the 400fps mark from. Also that must have been at 800x600 resolution with the details on low or something.
> 
> WoW is quite demanding when maxing it out at 1440P or higher, especially with the recent addition of HBAO+ etc.


Cap the frames at 200 if you are only getting the noise at FPS over that mark... You don't ever need anything over that anyway.

It is a bit odd to hear so many coil whine claims on these cards though, because neither of the two 390's I had made a peep, and none of the 290's I had made noise either....

Now, the Fury I had did have some noticeable coil whine, though I noticed it slowly got better after benching the card some.

My 980 doesn't make any noise, and I had heard of terrible coil whine with those also.....


----------



## tangelo

My MSI R9 390 is very quiet. My old Sapphire Vapor-X 7950 was way louder.


----------



## Vexile

Quote:


> Originally Posted by *dave2150*
> 
> The FPS cap in WoW is 200 - so not sure where your getting the 400fps mark from. Also that must have been at 800x600 resolution with the details on low or something.
> 
> WoW is quite demanding when maxing it out at 1440P or higher, especially with the recent addition of HBAO+ etc.


I'm playing at 1080p, all on max settings, and I get like 500 FPS in some old raids or when I'm on a flying mount, so yeah, it's possible.
Other than that, the card is very quiet and a good overclocker. With the FPS limit set to 200 in the AMD control panel there is no problem.
The thing is that for the first weeks there was no whine at all, and now suddenly there is.


----------



## tims390x

Got my 390x







<3

Validation
http://www.techpowerup.com/gpuz/details.php?id=58dad

Card
Gigabyte R9 390X G1

Cooling
Stock


----------



## ZealotKi11er

Quote:


> Originally Posted by *blode*
> 
> is there a way to keep my clock from jumping? i'm playing rocket league. its 60 FPS locked and isn't hard to run (a lot of people play on laptops) and it was running fine on my 780ti which died a week ago. with a 390, my clock is waffling between 400 and 900 and stuttering and stammering and introducing artifacts to the field as well. the card does this same thing when i play just cause 2, but stays a little bit higher and i don't actually notice any performance errors. i have one monitor so i don't know how ULPS would be to blame, and i've tried +0, +20 and +50 on AMD Crimson global overdrive power setting
> 
> 
> 
> any ideas to stop this dumb power saving feature or whatever's going on?


Should not be the case. The clocks are low for me too when I play Dota 2. Never had any problems.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *blode*
> 
> is there a way to keep my clock from jumping? i'm playing rocket league. its 60 FPS locked and isn't hard to run (a lot of people play on laptops) and it was running fine on my 780ti which died a week ago. with a 390, my clock is waffling between 400 and 900 and stuttering and stammering and introducing artifacts to the field as well. the card does this same thing when i play just cause 2, but stays a little bit higher and i don't actually notice any performance errors. i have one monitor so i don't know how ULPS would be to blame, and i've tried +0, +20 and +50 on AMD Crimson global overdrive power setting
> 
> 
> 
> any ideas to stop this dumb power saving feature or whatever's going on?
> 
> 
> 
> Should not be the case. The clocks are low for me too when i plat Dota 2. Never had any problems.
Click to expand...

I don't have those issues but to lock the clocks to 3D speeds download MSI Afterburner and in the settings enable unofficial overclocking without PowerPlay support.

That will lock the card to its max clocks at all times, so you will need to bump up the fan speeds to keep the temps down. If that doesn't work for you, then it's not the card that's the problem.
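For reference, the same switch can be flipped directly in Afterburner's config file. A sketch from memory - the key names and accepted values below may differ between Afterburner versions, so treat this as an assumption and prefer the settings UI where possible:

```ini
; MSIAfterburner.cfg (in the Afterburner install folder) - sketch, keys may vary by version
[Settings]
; 0 = disabled, 1 = unofficial overclocking WITH PowerPlay,
; 2 = unofficial overclocking WITHOUT PowerPlay (locks 3D clocks, as described above)
UnofficialOverclockingMode = 2
```

Afterburner also requires accepting its unofficial-overclocking EULA before this takes effect; the settings dialog handles that for you.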


----------



## XxxxVulcanxxxX

I'm running the MSI 390X Gaming 8G card, and I was having a few issues with heat in my Fractal Design Define S even though I have six 140mm fans (4 intake, 2 exhaust). I reapplied the thermal paste, and this dropped me a few degrees at best, but it did nothing to enable any higher clock speeds. 1120MHz core was the best I could get without it sounding so loud that I could hear it over my headset, and raising voltages was almost a no-go with the amount of heat this thing was already pumping out.

So today I managed to get the components together to get this thing silenced, or at least a lot quieter than it was... I'm so glad I have.

I'm only cooling the GPU for now as that was the priority. My parts list:
Alphacool Nexxos 390 M02 block
Magicool MC-DCP450 Pump res combo
XSPC EX280 radiator
EK fittings and thick walled hose
Thermal Grizzly Kryonaut TIM.

I knew how much fun these Alphacool blocks are to fit after fitting one to a 290X already, but after a couple of hours it was all up and running, and now I get amazing performance and it's super quiet in games.

Valley benchmark loop temps were 79C, now down to 55C (24C drop).
Single Firestrike run: I was topping out at 76C, but now it's just 51C (25C drop).
VRM temps have stayed around the same, with most tests lower by 1C or the same as the stock cooler.
Tests run with ambient room temp ~20C.

I have now managed to run the core speed up to 1200MHz without any difficulty, but I'm still playing around with speeds and voltages, so I will have to post back with results once I settle on a final set of numbers. I'm happy now with my setup and have some video footage of the build and plenty more information that I'm tempted to upload to YouTube once I do some editing and voice-over stuff, cos you really don't want to hear me and my mate swearing for an hour.

I ran the OCCT stress test for over an hour at 1100MHz core and managed to warm the card to 70C, but worryingly the VRMs showed 128C. I am certain this would have caused my system to reboot had I left it running that long on the stock cooler. I don't usually run OCCT as it's not a real test of how a graphics card works, but I did it as a comparison for my pal with the 290X with the same block.


----------



## jdorje

Joining the club, as it were.



http://imgur.com/2R8VQjJ


I have the XFX 390 (8256). GPU cooling is fine, but VRM temps aren't great (generally like 25C hotter than the core, so even making an appropriate fan curve is tricky). The backplate is nice, I guess.

On stock voltage I have it at 1090MHz core and 1750MHz VRAM. It might not be 100% stable at this level (particularly the VRAM), but it's pretty close. If I raise the voltage I know I'll be hitting 110C+ on the VRMs... how bad is that?

How can I improve VRM cooling? I've got a screwdriver and some CLU as well as regular TIM lying around.
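One workaround when the fan curve only sees the core sensor is to shift the whole curve by the observed core-to-VRM offset, so the fan effectively tracks the hotter VRMs. A hypothetical sketch - the 25C offset and the curve points are illustrative, not values measured on this card:

```python
# Fan curve keyed off core temp, compensated for VRMs that run a roughly
# fixed offset hotter than the core (~25C reported above; illustrative).
VRM_OFFSET_C = 25

# (core_temp_C, fan_percent) points of a plain core-based curve
CURVE = [(40, 20), (60, 40), (75, 70), (85, 100)]

def fan_percent(core_temp_c: float, curve=CURVE, vrm_offset=VRM_OFFSET_C) -> float:
    """Linearly interpolate fan speed against the *estimated* VRM temp."""
    t = core_temp_c + vrm_offset  # treat the curve as if it targeted VRM temp
    if t <= curve[0][0]:
        return curve[0][1]
    if t >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= t <= t1:
            return f0 + (f1 - f0) * (t - t0) / (t1 - t0)

print(fan_percent(60))  # 60C core implies ~85C VRM -> full fan
```

In practice this just means dragging every point of your Afterburner curve about 25 degrees to the left; the math above is the same shift.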


----------



## afyeung

Add me to the club. Specs: i7 5820K @ 4.5GHz, MSI 390X Gaming + MSI 290 watercooled. The 390X is running hot as hell, btw - probably 90 degrees at stock. I plan to repaste it and add a fan on the back of the radiator on top of the 390X so it moves the air away from it. Any other suggestions? I'll worry about a custom fan curve later because this card is already super loud. I thought it would be cool and quiet.


----------



## jdorje

Quote:


> Originally Posted by *afyeung*
> 
> Add me to the club. Specs: i7 5820k @4.5ghz. MSI 390x gaming + MSI 290 watercooled. 390x is running hot as hell btw. Probably 90 degrees at stock. I plan to repaste it and add a fan on the back of the radiator on top of the 390x so it moves the air away from it. Any other suggestions? I'll worry about custom fan curve later because this card is already super loud. I thought it would be cool and quiet


Put the watercooled card on top.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jdorje*
> 
> Joining the club, as it were.
> 
> 
> 
> http://imgur.com/2R8VQjJ
> 
> 
> I have the XFX 390 (8256). GPU cooling is fine but VRM temps aren't great (generally like 25C hotter than the core, so even making an appropriate fan curve is tricky). The backplate is nice I guess.
> 
> On stock voltage I have it at 1090 mhz and 1750 mhz vram. It might not be 100% stable at this level (particularly the vram) but is pretty close. If I raise voltage I know I'll be hitting 110+ on the VRMS...how bad is that?
> 
> How can I improve VRM cooling? I've got a screwdriver and some CLU as well as regular TIM lying around.


Weird, my 390X's VRMs never get hotter than the core temp. DON'T use CLU on the VRMs, because you will fry the card; you could try some Fujipoly thermal pads though.


----------



## afyeung

Quote:


> Originally Posted by *jdorje*
> 
> Put the watercooled card on top.


The tubes can't bend that far. I just thought the 390X would be significantly easier to cool than the 290X. Idle temps are amazing though, usually around 26 degrees. I might try with just the 390X in and see if it still gets extremely hot.


----------



## ZealotKi11er

Quote:


> Originally Posted by *afyeung*
> 
> The tubes can't bend that far. I just thought the 390x would be significantly easier to cool than the 290x. Idle temps are amazing though. Usually around 26 degrees. Might try with just the 390x in and see if it still gets extremely hot.


Nope. The 390X runs hotter than the 290X. They just used a better cooler on the 390X.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *afyeung*
> 
> The tubes can't bend that far. I just thought the 390x would be significantly easier to cool than the 290x. Idle temps are amazing though. Usually around 26 degrees. Might try with just the 390x in and see if it still gets extremely hot.
> 
> 
> 
> Nope. 390X runs hotter then 290X. They just used better cooler on 390X.
Click to expand...

I will be testing this within the next week or so... stay tuned.


----------



## jdorje

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Weird, my 390x's VRM's never het hotter than the core temp and DON'T use CLU on the VRM's because you will fry the card you could try some Fujipoly thermal pads though


It seems like there must not be any contact between them and the heat sink, right?

When we say the VRMs, we're talking about the chokes, right? On this image, it'd be the 6 rectangles that say R15?

FWIW my mobo VRMs are super hot too. I live at 8,400 feet (~70% of sea-level pressure) and it does some strange things to cooling.

Ah, about the thermal pads. Found this and this. The first one is like a third the price but only 2/3 the cooling power. Is there a better place to get them than Amazon? Are they a little sticky, and able to hold onto some little metal heat sinks? Will they be enough to bridge the gap between my VRMs and my GPU heatsink block? Guess I'll have to open it up to answer that last one.

Thanks for your help.


----------



## afyeung

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. 390X runs hotter then 290X. They just used better cooler on 390X.


Actually, the idle temps on the 390X are amazing, so they did in fact tweak the BIOS. Idle temps are at 25 degrees C. However, it does consume just as much power, if not more. And so far it runs super hot for me under load, even with the case side panel off.


----------



## ZealotKi11er

Quote:


> Originally Posted by *afyeung*
> 
> Actually, the idle temps on the 390x are amazing so they did in fact tweak the bios. Idle temps at 25 degrees c. However it does consume just as much power if not more. And so far it runs super hot for me under load even with the case side panel off.


Idle does not really matter; that's probably the cooler. The 390X has higher clocks on both core and memory and runs higher voltages, hence the higher power consumption.


----------



## jdorje

Quote:


> Originally Posted by *afyeung*
> 
> The tubes can't bend that far. I just thought the 390x would be significantly easier to cool than the 290x. Idle temps are amazing though. Usually around 26 degrees. Might try with just the 390x in and see if it still gets extremely hot.


Cooling CrossFire requires lots of ventilation. Remove the side panel and blow a fan on it and you should see a massive improvement. Without that, the top fan is getting no air movement, as it's just blocked by the bottom card.

No air-cooled GPU will come anywhere near an AIO. The AIO doesn't require ventilation in the same way, since the heat is pumped away. I'd try again, really hard, to get the water-cooled one on top.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jdorje*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Weird, my 390x's VRM's never het hotter than the core temp and DON'T use CLU on the VRM's because you will fry the card you could try some Fujipoly thermal pads though
> 
> 
> 
> It seems like there must not be any contact between them and the heat sink, right?
> 
> When we say the VRMs we're talking about the chokes, right? On this image ( http://cdn.wccftech.com/wp-content/uploads/2015/02/HIS-Radeon-R9-290X-IceQ-X2-Turbo-4GB-GDDR5-H290XQMT4GD-PCB.jpg ), it'd be the 6 rectangles that say r15?
> 
> FWIW my mobo VRMS are super hot too. I live at 8400 feet (~70% sea level pressure) and it does some strange things to cooling.
Click to expand...

I'm on mobile so it's hard to point out, but the VRMs (voltage regulator modules) are next to the chokes; they're shorter and usually metal-capped.

And it's possible that the heatsink isn't making proper contact.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *afyeung*
> 
> Actually, the idle temps on the 390x are amazing so they did in fact tweak the bios. Idle temps at 25 degrees c. However it does consume just as much power if not more. And so far it runs super hot for me under load even with the case side panel off.
> 
> 
> 
> Idle does not really matter. Thats probably the cooler. 390X has higher clock in both core and memory and runs higher voltages hence higher power consumption.
Click to expand...

Actually it's true. My 290's with the 390 BIOS were running cooler than with the 290 BIOS at the same clock.

He actually ran both his 290 & 390 cards at the same clock. I also found that the 390 BIOS has higher vdroop than the 290 BIOS, so even if the 390 is preset to a higher voltage, the actual voltage under load may not be higher than the 290's. The 290s & 390s are pretty much the same card, but the BIOS & memory ICs are different.

But yeah, with CrossFire it doesn't matter how good the top card's cooling is: the top card will run hot, because the radiated heat from the bottom card prevents it from getting cooler air. The watercooled card should be on top.
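The vdroop point reduces to simple arithmetic: the voltage the core actually sees under load is the BIOS-set VID minus the droop. A sketch with made-up numbers - the VIDs and droop values below are illustrative, not measured from either BIOS:

```python
# Effective load voltage = BIOS-set voltage - vdroop under load.
# All numbers are illustrative, not measured values for these BIOSes.
def load_voltage(vid_set: float, vdroop: float) -> float:
    """Voltage actually delivered to the core once droop is subtracted."""
    return vid_set - vdroop

v290 = load_voltage(1.168, 0.020)  # hypothetical 290 BIOS: lower VID, small droop
v390 = load_voltage(1.212, 0.063)  # hypothetical 390 BIOS: higher VID, bigger droop
print(f"290 under load: {v290:.3f} V, 390 under load: {v390:.3f} V")
# Despite the higher set voltage, the 390 can land at nearly the same load voltage.
```

Which is why reading the set voltage alone tells you little about heat; you have to log the voltage under load.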


----------



## navjack27

I did a whole video on this and I might do more... RE: jumping clocks


----------



## rdr09

Quote:


> Originally Posted by *navjack27*
> 
> i did a whole video on this and i might do more... RE: jumping clocks


Untick voltage monitoring. See if it helps.


----------



## navjack27

Oh, I was just posting that because earlier people were talking about how to lock their clocks in 3D mode. My performance is BEAUTIFUL right now.

http://www.3dmark.com/fs/6578175

http://www.passmark.com/baselines/V8/display.php?id=52502541084


----------



## Spartoi

I bought a Sapphire R9 390 from Newegg a few days ago and it was running fine until I actually tried playing games. My setup is one monitor connected via DisplayPort and one very long HDMI cable connected to my TV. For long gaming sessions, I switch over to the TV. I tried that for the first time today, but I've run into a major issue. When playing a game, the TV's screen turns black for about 3 seconds and then goes back to normal. Unfortunately, this repeats the entire time the game is open, so the TV is constantly looping black screens, making everything unplayable.

After some testing and research, I'm fairly certain the problem is the GPU. My previous GTX 980 didn't have any problems with this setup, and if I unplug my monitor and leave the TV as the only device connected to the GPU, the game runs without any black-screen looping. Once I re-add the monitor, the black screen returns. Games run perfectly fine on the monitor, by the way, regardless of whether the TV is connected.

So, is this a hardware problem or a software (driver) problem? Should I RMA this GPU, or can it be fixed with some software setting?


----------



## afyeung

Quote:


> Originally Posted by *kizwan*
> 
> Actually it's true. My 290's with 390 BIOS was running cooler than 290 BIOS at same clock.
> 
> He actually run both his 290 & 390 card at same clock. I also found that 390 BIOS have higher vdroop than 290 BIOS. So even if the 390 pre-set to higher voltage, the actual voltage when underload may not higher than 290. 290s & 390s are pretty much the same card but the BIOS & memory ICs are different.
> 
> But yeah, when crossfire, doesn't matter how good the top card cooling is, top card will running hot because the radiated heat from bottom card will prevent the top card from getting cooler air for cooling. The watercooled card should be on top.


Quote:


> Originally Posted by *jdorje*
> 
> Cooling crossfire requires lots of ventilation. Remove the side panel and blow a fan on it and you should see massive improvement. But without that the top fan is getting no air movement as its just blocked by the bottom.
> 
> No air-cooled GPU will come anywhere near an AIO. The AIO doesn't require ventilation in the same way since the heat is pumped away. I'd try again really hard to get the water-cooled one on top.


Well first, I'd rather run the 390X as the main card because it's more powerful and I've run into fewer issues with it. Also, if the liquid-cooled card on the bottom is barely radiating any heat other than from its VRMs, and I'm still getting throttling on the top card, how do people run air-cooled crossfire and get decent temps? I think the issue might be that the radiator fan on the AIO isn't moving air away from the 390X, so the hot air recirculates. I will add another fan for push/pull on the AIO and repaste the 390X. Hopefully that will work; if it doesn't, I'll test the 390X alone and see if the card itself is the issue. I run my system with the side panel off, btw.


----------



## afyeung

Quote:


> Originally Posted by *kizwan*
> 
> Actually it's true. My 290's with 390 BIOS was running cooler than 290 BIOS at same clock.
> 
> He actually run both his 290 & 390 card at same clock. I also found that 390 BIOS have higher vdroop than 290 BIOS. So even if the 390 pre-set to higher voltage, the actual voltage when underload may not higher than 290. 290s & 390s are pretty much the same card but the BIOS & memory ICs are different.
> 
> But yeah, when crossfire, doesn't matter how good the top card cooling is, top card will running hot because the radiated heat from bottom card will prevent the top card from getting cooler air for cooling. The watercooled card should be on top.


I actually have a fan blowing directly on both GPUs. However, I think this might actually be detrimental to the 390X, because it's being fed extremely hot air heated up by the backplate on the 290, which gets really hot. I've removed the fan and will test later to see if temps improve. I really don't want to have to repaste the 390X, because the idle temps are great, so poor paste probably isn't the issue. If all else fails, I guess I'll just invest in another liquid cooling project.


----------



## Carniflex

Quote:


> Originally Posted by *afyeung*
> 
> I actually have a fan blowing directly on both GPUs. However, I think this might actually be detrimental to the 390x because it's getting fed with extremely hot air which was heated up by the backplate on the 290 which gets real hot. I decided to remove the fan then test later and see if I get an improvement on temps. I really don't want to have to repaste the 390x because the idle temps are great and poor pasting is probably not the issue. If all else fails I guess I'll just invest in another liquid cooling project


I had a similar situation: a side-panel 140mm blowing cold air right onto the 390X, and temps were damn bad. Flipping the fan around (so it pulls air out of the case) did drop the temps by about 3-4°C, but it didn't perform any miracles; I'm still hitting 90°C when playing long enough. I have another 140mm blowing cold air towards the card from the front of the case (~20 cm away). It just seems that the 390X is normally one very, very hot card.

On the other hand, after flipping the side-panel 140mm around to pull air out of the case, my bottom card (not crossfired, just a 7870 with a 40mm gap [two empty PCIe slots] between the cards) runs about 20°C hotter than it did before (mostly at idle, up to 30% load sometimes). The motherboard sensors are now showing up to 70°C as well, which is also approx 20°C hotter than they used to be. Because of all that, I will most likely put a loop on just the 390X. I dropped water a while ago because of the weight (it's a portable system I carry around with me; the last iteration with water was 32 kg, now it's 12 kg, and going back to water for this single card will unfortunately add about 3-4 kg back).


----------



## dave2150

Repasting the MSI 390X reduced my temperatures by 15°C across the board; well worth it.


----------



## XxxxVulcanxxxX

Quote:


> Originally Posted by *dave2150*
> 
> Repasting the MSI 390X reduced my temperatures by 15°C across the board; well worth it.


I had the same with the MSI card; I lost around 7-10°C. The stock paste on mine looked all curdled and horrible, like something really cheap and nasty. Shame on MSI. I also have an issue with the RAM cooling on the shroud: it only covers half the chip on 6 of the chips....



this was the messy horrible thermal paste MSI used on my card from the factory...


----------



## afyeung

Quote:


> Originally Posted by *Carniflex*
> 
> I had similar situation, a sidepanel 140mm blowing cold air right into 390X and temps were damn bad. Flipping the vent around (so it pulls air out of the case did drop the temps by about 3...4C but it did not perform any miracles. I'm still hitting 90C when playing long enough. I have another 140mm blowing cold air towards the card from the front of the case (~20 cm away). It just seems that 390X is one very very hot card normally.
> 
> On the other hand after flipping the sidepanel 140mm around to pull air out of the case my bottom card (not crossfired, just 7870 in there with 40mm gap [two empty pcie slots] between the cards) runs about 20C hotter than it did before (mostly idle, up to 30% load sometimes). Motherboard sensors are showing up to 70C as well now which is also approx 20C hotter than they used to be. I will most likely put a loop on just the 390X because of all that. I dropped water a while ago because of the weight (it's a portable system I carry around with mine - last iteration with water was 32 kg, now it's 12 kg, going back for water for this single card will add about 3..4 kg back probably unfortunately).


Quote:


> Originally Posted by *dave2150*
> 
> Repasting the MSI 390X reduced my temperatures by 15°C across the board; well worth it.


Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> I had same with the MSI card, i lost around 7-10c, The stock paste on mine looked all curdled and horrible and look like really cheap and nasty. Shame on MSI. I also have issue with the ram cooling on the shroud it only covers half the chip on 6 of the chips....
> 
> 
> 
> this was the messy horrible thermal paste MSI used on my card from the factory...


Will definitely get to repasting the 390X. Thanks for the reassurance and info. I tried using the fan to blow heat away, but it made almost no difference. Overall, as of now, I am very dissatisfied with the thermals on the 390X; hopefully repasting will at least lower the temps to the 80s so I can game for longer than 3 minutes before they go through the roof.


----------



## Mysticking32

Is anyone getting the error message "Display driver has stopped responding" on the newest beta drivers? I'm not sure what's causing it. It's not happening during gameplay, only when games start up. I haven't tested many other games (GTA 5 works fine, though), but it's happening with Total War: Rome 2 and Assassin's Creed 4. Again, so far it's only when the games start up that I get the message. A quick reboot fixes it, but I've tried about everything I know to fix it.






I also used DDU to uninstall my drivers, took the GPU out of the PCIe slot, and did a clean install.

I wiped Windows entirely and reinstalled everything.

I also ran Memtest on my RAM. No issues there either.

It might be the power supply, but I'm thinking I might have to RMA the GPU. Hopefully not, but it's possible.


----------



## XxxxVulcanxxxX

Mysticking32, I thought the exact same things, blaming my PSU, doing the driver cleaning, etc. It turned out to be poor thermal performance in my case: I'm now running much bigger overclocks on water cooling and nothing else in my system has changed. Turn the overclocks down a touch and try again; that's all I could do to keep stability. Also, the driver crashes were random: sometimes I'd be in the mid-70s °C, other times over 80°C.


----------



## kizwan

Quote:


> Originally Posted by *Mysticking32*
> 
> Anyone getting the error message Display driver has stopped responding on the newest beta drivers??? I'm not sure what's causing it. It's not happening during gameplay but only when they start up. Haven't tested other games (GTA 5 works fine though) but this is happening with Total war rome 2 and assassin's creed 4. Again so far it's only when the games start up then it gives the message. Quick reboot fixes it but I've tried about everything I know to do to fix it.
> 
> 
> 
> 
> 
> 
> I also used ddu to uninstall my drivers. Took gpu out of pci slot and clean installed.
> 
> Wiped windows entirely and reinstalled everything.
> 
> I also ran memtest on my ram. No issues there either.
> 
> Might be power supply. But I'm thinking I might have to rma the gpu. Hopefully not but it's possible.


Please fill in the rig builder form so we can see what you have there. Anyway, RMA is a bit extreme, don't you think? Did you try the non-beta Crimson drivers?


----------



## Mysticking32

Ah, my mistake. I forgot to do the rig builder on this site.

i5 4670K (tried overclocked and not, same result), OC'd to 4.4GHz
16GB Corsair Dominator 2400MHz RAM
MSI R9 390X (also tried OC and non-OC), OC'd to 1100MHz (which shouldn't be a problem, since MSI says those are the stock settings)
Rosewill Hive 650W Bronze PSU
ASRock Z97 Killer Fatal1ty motherboard

And I'm afraid to try the non-beta drivers due to the fan issue. If need be, I can try them, though; I'll just have to watch the temps, lol.


----------



## ClowReed

Hello, people! I just made an account to post here.

I've searched for answers all over the internet and in this thread... but nothing really solves my problem.

Well... as you may well know, the 390X heats up a LOT. I mean... REALLY. I just came back to AMD with this 390X after long years on Nvidia.

I've already repasted it with Arctic Silver 7 and it helps a little, but it's not a game changer. My 390X is from a local brand called "PCYES". It seems like crap but uses the IceQ X2 cooler from HIS; actually, it's exactly the HIS model, just rebranded. This one: http://www.hisdigital.com/gb/product2-885.shtml

OK... so... this card throttles by default, right? Because if I play Star Citizen or Metro 2033 Redux, it just can't manage the heat or power at default settings. I've even seen some texture artifacts (some black patterns) at stock settings (it's factory overclocked) running 3DMark Fire Strike in a loop and reaching max temps (95°C). I've created a custom fan curve with MSI Afterburner that reaches 70 to 90% fan speed in any gaming situation to keep the temps under 85°C (my new max temp, set in Overdrive). I've raised the power limit by 20% to prevent power throttling. Even with all these tweaks, the card still throttles, because in some situations it can't stay under 85°C.

So... any help is much appreciated. I don't want this card to throttle; if it does, what's the point in it being factory overclocked? And why does AMD set such a high default max temp if the card even artifacts at that temp? That's WAY too high a temp for daily use, and probably destroys the card after running that way for a while.

I think I didn't forget to mention anything... if I did, please point it out for me.

Thanks in advance!!
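For reference, a custom fan curve like the one described (Afterburner-style) is just piecewise-linear interpolation between (temperature, fan %) points. A minimal sketch, with illustrative curve points rather than a recommendation:

```python
# Piecewise-linear fan curve: the same idea MSI Afterburner's curve
# editor implements. Curve points below are made-up examples.

CURVE = [(40, 30), (60, 50), (75, 70), (85, 90), (95, 100)]  # (deg C, fan %)

def fan_speed(temp_c):
    """Interpolate a fan duty cycle (%) from the GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # lands between the 60C and 75C points
```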


----------



## Dundundata

What kind of voltage does the 390x run?


----------



## Carniflex

Quote:


> Originally Posted by *ClowReed*
> 
> Hello, people! I just made an account to post here.
> 
> I tried to search for answers all over the internet and in this post... but nothing really manages to solve my problem.
> 
> Well... as you may well know the 390X heats up a LOT. I mean... REALLY. I just came from Nvidia after long years without giving a chance to AMD and now I'm back with this 390X.
> 
> I've already repasted it with Arctic Silver 7 and it helps a little but it's not a game changer. My 390X is from a local brand called "PCYES". It seems like crap but uses the IceQ x2 from HIS. Actually it's exactly the HIS model just rebranded. This one> http://www.hisdigital.com/gb/product2-885.shtml
> 
> Ok... so... This card throttles by default, right? Cause if I play Star Citizen or Metro 2033 Redux it just can't manage the heat or power at default settings. I've even seen some texture artifacts (some black patterns) with stock settings (it's overclocked by default) running 3DMark Fire Strike in loop and reaching the max temps (95 ºC). I've created custom fan settings with MSI Afterburner and it reaches 70 to 90% of fan speed in any gaming situation to manage the temps under 85 ºC (my new max temp setted at Overdrive settings). I've raised the Power Limit to 20% to prevent power throttle. Even with all these tweaks the card throttles cause in some situations it can't stay under 85 ºC
> 
> So... any help is much appreciated. I don't want this card to throttle cause if so... what's the point in it being overclocked? And why AMD sets such a default max temp if it even gives artifacts at that temp? And that's WAY TOO HIGH TEMP for daily use. Probably destroys the card after some time using it that way.
> 
> I think I didn't forget to mention anything... if so... please point it out for me.
> 
> Thanks in advance!!


Is water cooling an option for you? If it is, get a reasonable starter kit (~125-150 $/€) and a GPU block for it (~50-125 $/€).


----------



## OdinValk

So, after 3 bad 970s, I joined team red today with a Sapphire Tri-X 390.

http://www.techpowerup.com/gpuz/details.php?id=famyn


----------



## kizwan

Quote:


> Originally Posted by *ClowReed*
> 
> Hello, people! I just made an account to post here.
> 
> I tried to search for answers all over the internet and in this post... but nothing really manages to solve my problem.
> 
> Well... as you may well know the 390X heats up a LOT. I mean... REALLY. I just came from Nvidia after long years without giving a chance to AMD and now I'm back with this 390X.
> 
> I've already repasted it with Arctic Silver 7 and it helps a little but it's not a game changer. My 390X is from a local brand called "PCYES". It seems like crap but uses the IceQ x2 from HIS. Actually it's exactly the HIS model just rebranded. This one> http://www.hisdigital.com/gb/product2-885.shtml
> 
> Ok... so... This card throttles by default, right? Cause if I play Star Citizen or Metro 2033 Redux it just can't manage the heat or power at default settings. I've even seen some texture artifacts (some black patterns) with stock settings (it's overclocked by default) running 3DMark Fire Strike in loop and reaching the max temps (95 ºC). I've created custom fan settings with MSI Afterburner and it reaches 70 to 90% of fan speed in any gaming situation to manage the temps under 85 ºC (my new max temp setted at Overdrive settings). I've raised the Power Limit to 20% to prevent power throttle. Even with all these tweaks the card throttles cause in some situations it can't stay under 85 ºC
> 
> So... any help is much appreciated. I don't want this card to throttle cause if so... what's the point in it being overclocked? And why AMD sets such a default max temp if it even gives artifacts at that temp? And that's WAY TOO HIGH TEMP for daily use. Probably destroys the card after some time using it that way.
> 
> I think I didn't forget to mention anything... if so... please point it out for me.
> 
> Thanks in advance!!


It should not artifact at stock (factory overclocked) settings. To test whether your card is faulty, you can either increase the voltage or run the card at its stock clocks (1070/1500). You can also try running it at 1000/1250. If the card still artifacts, I suggest RMAing it.

Hawaii cards are designed to operate at up to 95°C without damage. Once the card reaches 95°C it will throttle, but throttling shouldn't cause artifacting. If your card does artifact, it's most likely faulty.


----------



## dave2150

Quote:


> Originally Posted by *afyeung*
> 
> Will definitely get to repasting the 390x. Thanks for the reassurance and info. I tried using the fan to blow heat away but it made almost no difference. Overall, as of now I am very dissatisfied with the thermals on the 390x and hopefully repasting will at least lower the temps to the 80s so I can start gaming for longer than 3 mins before my temps go through the roof.


Once repasted, you should notice a big reduction in temperatures. The MSI cooler is really excellent once it has properly applied thermal compound.


----------



## The Stilt

Could someone who owns a Gigabyte 390 / 390X card dump their original BIOS and upload it? I'd like to have a look to see what they have done to prevent voltage adjustments.


----------



## jdorje

Quote:


> Originally Posted by *Spartoi*
> 
> I bought a Sapphire R9 390 from newegg a few days ago and it was running fine until I actually tried playing games. My setup is one monitor connected via a Display Port and one very long HDMI cable connected to my TV. When I'm playing games for long sessions, I switch over to my TV. I tried that for the first time today, but I have run into a major issue. When playing a game, my TV's screen will turn black for about 3 seconds and then go back to normal. Unfortunately, this repeats throughout the entire time the game is open, so my TV is constantly looping a black screen making everything unplayable.
> 
> After some testing and researching, I'm fairly certain this problem is caused by the GPU. My previous GTX 980 didn't have any problems with this setup so I'm so sure it's the GPU. I'm also sure it's the GPU because if I unplug my monitor and leave the TV as the only device connected to my GPU, the game runs without any black screen looping. Once I re-add the monitor, the black screen re-occurs. Games run perfectly fine on the monitor btw, regardless if the TV is connected as well or disconnected.
> 
> So, I was wondering if this was a hardware problem or a software (driver) problem? Should I RMA this GPU or can this be fixed with some software setting?


You're losing signal through the HDMI cable. What's your resolution and refresh rate?

Try a shorter cable, a better cable, or a different HDMI port.
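For what it's worth, 1080p60 is nowhere near the HDMI link limit, which is why a marginal cable or port, rather than raw bandwidth, is the usual suspect. A quick back-of-envelope using the standard CEA-861 timing for 1080p60:

```python
# CEA-861 timing for 1080p60: 2200 x 1125 total (including blanking).
h_total, v_total, refresh = 2200, 1125, 60
pixel_clock_mhz = h_total * v_total * refresh / 1e6

print(f"1080p60 pixel clock: {pixel_clock_mhz:.1f} MHz")
# ~148.5 MHz, well under the 340 MHz TMDS clock limit of HDMI 1.3/1.4,
# so intermittent dropouts point at cable quality/length or the port.
```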


----------



## ClowReed

Quote:


> Originally Posted by *Carniflex*
> 
> Is water cooling an option for you? If it is get some reasonable starter kit (~125..150 $/eur) and a gpu block for it (~50 ... 125 $/eur).


I'm from South America. Where could I buy these? That gear is very expensive here, but depending on the results it might be worth trying.

Thanks for the reply!


----------



## ClowReed

Quote:


> Originally Posted by *kizwan*
> 
> It should not artifacting at stock (factory overclocked). To test whether your card is faulty or not, you can either increase voltage or running the card at stock (1070/1500). You also can try running the card at 1000/1250. If the card still artifacting, I suggest RMA the card.
> 
> Hawaii cards are designed to be able to operate up to 95C without damaging the card. Once the card reaches 95C, the card will throttling but it shouldn't cause artifacting. If your card do, it most likely faulty.


Thanks for your reply! Running at the 390X's default clocks (1000/1500) indeed didn't show me any artifacts. However... there's a lot of throttling going on because of the power limit and temperature. With AMD's default fan curve and the card at 92°C, the fans didn't even reach 40%, and the temperature is controlled mostly by throttling. It's sad, because the card is never going to run at its rated clock speeds; I checked, and it's always fluctuating around 950MHz. So... no point in it being overclocked. If it's not because of the temperature, it's because of the power limit. Very sad.

And that's without mentioning that this particular model doesn't run artifact-free at its stock settings and high temperatures (1070/1500 @ +25mV). Yes, the artifacting is very subtle, yet very disturbing when I spot it.

One thing I'm confused about: the card artifacts (minimally, but it does) at 90°C+ and 1070MHz @ +25mV... but at default clocks it doesn't, even at the same temperatures... Could this be fixed by increasing the voltage?

Thanks in advance!


----------



## Guzmanus

Quote:


> Originally Posted by *ClowReed*
> 
> Thanks for your reply! Running at the default clocks of the 390X (1000/1500) didn't showed me any artifacts indeed. However... a lot of throttling going on because of power limit and temperature. With the AMD default fan curve setted and the board at 92 ºC the fans didn't even reach 40% and the temperature control is done based mostly on throttling. It's sad cause the board never gonna run at the default clock speeds. I checked and it's always fluctuating around 950 mhz. So... No point in having it overclocked. If it's not because of the temperature, it's because the Power Limit thing. Very sad.
> 
> This without mentioning that this model specifically doesn't run without some artifacts at it's stock settings and high temperatures (1070/1500 @ +25mV). Yes, the artifacting is very subtle, yet very disturbing when I spot one.
> 
> One thing I'm confused about is that the board artifacts (a minimum, but it does) at +90 ºC and 1070MHz @ +25mV... But at stock speeds it doesn't, even at the same temperatures... Could this be repaired increasing the voltage?
> 
> Thanks in advance!


If you have frame rate control activated, it's normal for the frequency to bounce around: the GPU changes its frequency to hold the framerate exactly at the number you set.
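To illustrate the idea (a toy model, not AMD's actual frame rate control algorithm): a limiter that nudges the clock whenever the measured framerate strays from the target makes the clock hover constantly instead of sitting at its maximum.

```python
# Toy frame-rate limiter: back the clock off when rendering ahead of
# the target, ramp up when falling behind. Step size, clock limits,
# and fps readings below are all hypothetical.

TARGET_FPS = 60

def adjust_clock(clock_mhz, measured_fps, step=10, lo=300, hi=1100):
    if measured_fps > TARGET_FPS:
        clock_mhz -= step   # ahead of target: back off
    elif measured_fps < TARGET_FPS:
        clock_mhz += step   # behind target: ramp up
    return max(lo, min(hi, clock_mhz))

clock = 1100
for fps in [90, 85, 75, 62, 58, 61, 59]:  # hypothetical per-second readings
    clock = adjust_clock(clock, fps)
    print(clock)            # the clock hovers rather than staying pinned
```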


----------



## Carniflex

Quote:


> Originally Posted by *ClowReed*
> 
> I'm from South America. Where could I buy these? It's very expensive here but depending on the results it's worth trying.
> 
> Thanks for the reply!


Hmm, that might indeed make it problematic if no local online store sells these.

As for the results... well, normally it won't do wonders for the overclock as such; it mainly helps with noise and max temperature. As a rough ballpark, the max temp should drop by approx 30°C when going under water.
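That ~30°C ballpark falls out of simple steady-state arithmetic: core temp is roughly ambient plus power times thermal resistance. Both R_th figures below are loose assumptions for illustration, not measured specs:

```python
# Steady-state estimate: core_temp ~= ambient + power * R_th.
# R_th values and the 300 W load figure are assumed, not measured.

def core_temp(ambient_c, power_w, r_th_c_per_w):
    return ambient_c + power_w * r_th_c_per_w

power = 300                          # W, heavy load on an OC'd 390X (assumed)
air   = core_temp(25, power, 0.22)   # decent air cooler (assumed R_th)
water = core_temp(25, power, 0.12)   # full-cover water block (assumed R_th)

print(f"air:   ~{air:.0f} C")
print(f"water: ~{water:.0f} C")
print(f"delta: ~{air - water:.0f} C")
```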


----------



## jdorje

Quote:


> Originally Posted by *ClowReed*
> 
> Well... as you may well know the 390X heats up a LOT. I mean... REALLY. I just came from Nvidia after long years without giving a chance to AMD and now I'm back with this 390X.
> 
> I've already repasted it with Arctic Silver 7 and it helps a little but it's not a game changer. My 390X is from a local brand called "PCYES". It seems like crap but uses the IceQ x2 from HIS. Actually it's exactly the HIS model just rebranded. This one> http://www.hisdigital.com/gb/product2-885.shtml


The 390X does need substantial cooling; it draws something like 75W more than the 390 under load. Putting a third-tier or even second-tier cooler on it is pretty questionable. Add on that you probably have high ambient temps in summer, and it's a recipe for heat.

An alternative to the custom loop suggested earlier is a hybrid AIO. There are a few options for this, but you need a cooler bracket with a fan; the Corsair HG10 and NZXT G10 are the most common, and Corsair also has a newer blower-style one. You then add a compatible AIO cooler for the GPU itself, piping most of the heat away to an exhaust 120mm rad somewhere. Prices vary a lot, but at US sale prices I think it could be done for as little as $50 ($10 for an on-sale G10 and $40 for an on-sale AIO). South American prices I have no idea about, obviously.

There's also the PowerColor Devil hybrid cooler. I wonder if they'd sell just the cooler block without the actual card?

One more option would be lowering the voltage and dropping the clock a bit. Since thermals are your main issue, raising the voltage does not seem advisable.


----------



## Spartoi

Quote:


> Originally Posted by *jdorje*
> 
> You're losing signal through the hdmi cable. What's your resolution and refresh?
> 
> Try a shorter cable, a better cable, or a different hdmi port.


1080p/60Hz. My TV is a Sony KDL50W800B and my monitor is an ASUS VN248H-P.

I tried moving my PC closer to the TV, connected with a different, shorter HDMI cable in a different port on the TV, and the issue remains. I never thought the HDMI cable was the issue, though, because my GTX 980 worked fine in this setup, and removing the DisplayPort-connected monitor and leaving only the TV plugged in via HDMI fixes the issue. But that's not a solution. There is clearly something wrong with my 390's HDMI output; I'm just not sure if it's software or hardware, and whether it's even worth RMAing.


----------



## ClowReed

Quote:


> Originally Posted by *jdorje*
> 
> The 390x does need substantial cooling; it's like 75W more than the 390 under load. Getting a third-tier or even second-tier cooler is pretty questionable. Add on that you probably have high ambients in summer and its a recipe for heat.
> 
> An alternative to the custom loop suggested earlier is to add on a hybrid AIO. There are a few options for htis, but you need a cooler bracket with a fan - the corsair hg10 and nzxt g10 are the most common, but corsair has a new one that's a blower-style cooler as well. You then add on a compatible AIO cooler to cool the GPU itself, piping most of the heat away to an exhaust 120mm rad somewhere. The prices on these vary a lot, but at sale prices in the US I think it could be done for as little as $50 ($10 for an on-sale g10 and $40 for an on-sale AIO). South America prices I have no idea obviously.
> 
> There's also the PC devil hybrid cooler. I wonder if they'd sell just the cooler block without the actual card?
> 
> One more option would be lowering the voltage and dropping the clock a bit. Since thermals are your main issue, raising the voltage does not seem advisable.


Yeah, I think I'll try that. We have water cooling here, but blocks for AMD cards are harder to find. I'll search carefully.

I already tried undervolting, but the temperature gains were so marginal that I don't think it's the answer. If I want this beast to not throttle, I think I'll need to go with water cooling sooner or later.

Anyway, thanks a lot to everyone who replied! At least now I know my GPU isn't flawed.

EDIT: Oh! About water coolers... I've seen some models where the block doesn't cover the memory chips. How do I work around that?


----------



## ClowReed

Quote:


> Originally Posted by *Guzmanus*
> 
> If you have activated the frame rate control is normal to have the frequency throttling around: the GPU changes its frecuency to keep the framerate strictly to the number you set.


Thanks for your reply! But it throttles even with the frames well under the target. Anyway, I tried disabling it, and the throttling indeed kept occurring.


----------



## XxxxVulcanxxxX

So, after fitting the Alphacool block to my MSI 390X, I finally had a good play around with the overclocking settings today and managed 1230MHz on the core, but it was not 100% stable without more than +100mV, which I don't really feel comfortable running 24/7.

I've settled for now at 1200MHz core at +80mV and 1725MHz memory at +50mV. This seems stable so far in 3DMark, Valley, GTA V and Assetto Corsa, but Fallout 4, being the pig it is, does not like the RAM this high, and for this one game alone I run the RAM at 1700MHz with the same +50mV.

I'm quite impressed my old 2500K is still keeping up so well after almost 5 years!


----------



## Dhoulmagus

Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> So after fitting the alphacool block to my MSI 390x I finally had a good play around with the overclocking settings today and managed 1230Mhz on the core but it was not 100% stable without using more than +100mv which i dont really feel comfortable running 24/7.
> 
> Ive settled for now at 1200mhz core +80mv and 1725mhz +50mv. This seems stable so far in 3d mark, valley, GTA V and assetto corsa but fallout 4 being the pig it is does not like the ram this high and for this one game alone I have the ram at 1700 with the same +50mv.
> 
> Im quite impressed my old 2500k is still keeping up so well after almost 5 years!


Hey not bad, is this the hybrid block that was suggested to me a few pages back? Could you link me to the block your using and maybe share a picture of it mounted to your card?

edit: ah sorry, I found your post a few pages back. Is the MSI 390X you have the current model they're selling on Newegg? I've been trying to figure out which card and block combo I can get for a build I'm doing for a relative. He doesn't seem too keen on going with PowerColor because OC results seem to be iffy with them, so this might be a nice idea. I just want to make certain first, because I already had him buy a Tri-X 290 only to find out they revised it with a different PCB that didn't fit the block he was after.


----------



## Majentrix

Coming to a PC near you.


----------



## jodybdesigns

Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> Im quite impressed my old 2500k is still keeping up so well after almost 5 years!


Yeah, I won't be upgrading my 3570K until newer tech comes out. I keep saying this a lot, but I'd like to see the bridge between CPU and RAM eliminated so there's no bottleneck.

But by the time that happens, technology will have evolved again.


----------



## Darkeylel

So did I read a post right that you can put the Gigabyte 390X on water? If so, is anyone able to link me to what I need, and do they deliver to Australia?


----------



## Joe88

first oc http://www.techpowerup.com/gpuz/details.php?id=fe36


----------



## Dhoulmagus

Quote:


> Originally Posted by *Darkeylel*
> 
> So did I read a post right that you can put the gigabyte 390x on water ? If so anyone able to link me what I need and do they deliver to Australia


http://www.alphacool.com/product_info.php/info/p1382_GPX-A-290-M02.html?language=en&XTCsid=6d1vcuguvmi79u2k852c7jei66

That is a hybrid block he is using: a GPU block plus passive cooling for the VRM/chipset. It is advertised to work on the MSI 290/X and 390/X. I'm not sure about getting it to Australia; you'll have to search for the block and see.


----------



## jdorje

So my (XFX 8256) 390 has two VRM sensors. In the HWiNFO block for the GPU core they are listed as "GPU VRM Temperature1" and 2. Temperature1 is the hot one under load, though it's cooler at idle. Interestingly, in the HWiNFO block for the GPU VRAM the two sensors are switched; Temperature1 and Temperature2 are precisely reversed. But anyway.

Which VRM is which? The fact that the two sensors are reversed in the two different blocks confuses me here.

A quick search turns up this image, which is an XFX 280X. By comparison, this is the XFX 390X, which I suspect closely matches my card. Supposedly it's the single-choke VRM on the left whose change led to a great improvement in the XFX VRM temps, along with making the card incompatible with standard 390 waterblocks.

I haven't opened the card up yet, obviously. That'll come soon but I'd like to know as much as possible first.


----------



## Carniflex

Quote:


> Originally Posted by *Serious_Don*
> 
> http://www.alphacool.com/product_info.php/info/p1382_GPX-A-290-M02.html?language=en&XTCsid=6d1vcuguvmi79u2k852c7jei66
> 
> That is a hybrid block he is using, its a GPU block + passive cooling for the vrm/chipset. It is advertised to work on the MSI 290/x and 390/x. I'm not sure about getting it to Australia, you'll have to search for the block and see.


The 390 M04 is supposed to be the Gigabyte 390X G1 block. I would wait a few more days until they update their compatibility PDF list; it was released just a couple of days ago, and their online configurator and compatibility list are not aware of it just yet.


----------



## Carniflex

Quote:


> Originally Posted by *Darkeylel*
> 
> So did I read a post right that you can put the gigabyte 390x on water ? If so anyone able to link me what I need and do they deliver to Australia


They ship internationally. The block for the Gigabyte 390X G1 is supposed to be the 390 M04. Give it a few days before ordering, as their online configurator is not yet aware of the new block.


----------



## Carniflex

Quote:


> Originally Posted by *ClowReed*
> 
> Yeah. I think I'll try that. We have water cooling here, but for AMD cards it's harder to find. I'll search carefully.
> 
> I already did undervoltage, but the temperature gains are so marginal that I don't think it's the case. If I want this beast to not throttle I think I'll need to go with water cooling sooner or later.
> 
> Anyway, thanks a lot for everyone that replied! At least now I know my GPU isn't flawed.
> 
> EDIT: Oh! About water coolers... I've seen some models but the block didn't cover the memories. How to workaround that?


When using a core-only solution, the memory and VRMs are usually cooled by small aluminium/copper heatsinks attached with thermal tape. When using an AIO cooler for the GPU, you will normally end up with a card that takes 4+ slots, something to be aware of if you have multiple cards. For heatsinks, the VRMs are more important than the RAM; RAM is normally fine with just some air passing over it unless exceptionally overclocked.


----------



## jaydude

Quote:


> Originally Posted by *The Stilt*
> 
> Could someone who owns a Gigabyte 390 / 390X card dump their original bios and upload it? I´d like to have a look to see what they have done to prevent voltage adjustments.


Sure here ya go

Gigabyte390G1Bios.zip 100k .zip file


----------



## kizwan

Quote:


> Originally Posted by *jaydude*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Stilt*
> 
> Could someone who owns a Gigabyte 390 / 390X card dump their original bios and upload it? I´d like to have a look to see what they have done to prevent voltage adjustments.
> 
> 
> 
> Sure here ya go
> 
> Gigabyte390G1Bios.zip 100k .zip file

It is much better to look at the voltage controller on the Gigabyte card. That will show whether the voltage is really locked or not.


----------



## Dundundata

Quote:


> Originally Posted by *Joe88*
> 
> first oc http://www.techpowerup.com/gpuz/details.php?id=fe36


Nice, but why leave the power limit at +0?


----------



## navjack27

Quote:


> Originally Posted by *kizwan*
> 
> It should not artifacting at stock (factory overclocked). To test whether your card is faulty or not, you can either increase voltage or running the card at stock (1070/1500). You also can try running the card at 1000/1250. If the card still artifacting, I suggest RMA the card.
> 
> Hawaii cards are designed to be able to operate up to 95C without damaging the card. Once the card reaches 95C, the card will throttling but it shouldn't cause artifacting. If your card do, it most likely faulty.


mine artifacts kinda at anything higher than my 1100/1650 overclock. as i understand it, the memory overclock i have is quite standard, but i would really love to get 1150 or higher on the core. the temps don't really get that high the way i have it now, and i forget what temps i saw at higher clocks than this.

when i overclock i just go through valley benchmark and do a full run with every +10mhz i change on the core. one time i took a day to put it at stock clocks and do my +10mhz steps on the core with stock memory speeds (power saving disabled for no clock fluctuations, 100% fan) until i hit what seemed to be the ceiling. then i put the core back at stock and took the memory up +10mhz each run of valley. then i just tried to find a happy medium with each max clock i found. should i do that again just to gather some data?

EDIT: i might as well say i don't think i touched voltage, but i do set max power limit


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> mine artifacts kinda at anything higher then my 1100/1650 overclock. now as i understand the memory overclock i have is quite standard. but i would really love to get 1150 or higher on the core. the temps don't really get that high up there the way i have it now and i forget what temps i saw with it at higher then this.
> 
> when i overclock i just go thru valley benchmark and do a full run with every +10mhz i change on the core. i think one time i took a day to put it at stock clocks and then do my +10mhz on the clock with stock memory speeds (with power saving disabled for no clock fluctuations and 100% fan) until i hit what seemed to be the ceiling. and then i put core back at stock and took the memory up 10+mhz each run of valley. then i just tried to find a happy medium with each max clock i found. should i do that again just to gather some data?
> 
> EDIT: oooo i might as well say i didn't touch voltage i don't think but i do set max power limit


Artifacts are no good. You need to either lower your overclock or start increasing the voltage at that point.


----------



## KNG HOLDY

i want to watercool my msi r9 390 but the block is nowhere available, not even at alphacool







((


----------



## The Stilt

Quote:


> Originally Posted by *jaydude*
> 
> Sure here ya go
> 
> Gigabyte390G1Bios.zip 100k .zip file


Thanks for the bios









This bios is certainly not voltage locked. There is a voltage limit, but it certainly is *not locked*. The voltage limit is high enough to allow +125mV to +312.5mV voltage offsets, depending on the leakage (stock voltage). Even in the worst case you'll have +125mV of margin.


----------



## navjack27

Quote:


> Originally Posted by *Dundundata*
> 
> Artifacts are no good. You need to either lower your overclock or start increasing the voltage at that point.


i guess later today i'll be doing some testing with max voltages and clocks and such, since temps aren't really an issue. should i just start with the overclock i'm running now and go up from here, adding like +5mv with each run until there are no artifacts? any max voltage numbers i should be looking at? is that VDDC power in/out reading in gpu-z at all accurate? i'm not sure what it is exactly. winter is a great time for overclocking, just open your window and enjoy LOL

night y'all, it's 7am and i need to pass out


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> i guess later today i'll be doing some testing with max voltages and clocks and such since temps aren't an issue really. should i just start with this overclock i'm running now and go up from here while adding, like what, +5mv with each run until there are no artifacts? any max voltage numbers i should be looking at? is that VDDC power in and out in gpu-z anything accurate, i'm not sure what it is exactly. winter is a great time for overclocking, just open ur window and enjoy LOL
> 
> night ya'll its 7am and i need to pass out


VDDC voltage should be fairly accurate as far as I know. My normal overclock is 1.32V, and I've gone up to 1.375V while testing. I think as long as your temps are good it should be OK. Make sure to check the VRM temps as well.


----------



## kizwan

Quote:


> Originally Posted by *ClowReed*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> It should not artifacting at stock (factory overclocked). To test whether your card is faulty or not, you can either increase voltage or running the card at stock (1070/1500). You also can try running the card at 1000/1250. If the card still artifacting, I suggest RMA the card.
> 
> Hawaii cards are designed to be able to operate up to 95C without damaging the card. Once the card reaches 95C, the card will throttling but it shouldn't cause artifacting. If your card do, it most likely faulty.
> 
> 
> 
> 
> 
> 
> 
> Thanks for your reply! Running at the default clocks of the 390X (1000/1500) didn't showed me any artifacts indeed. However... a lot of throttling going on because of power limit and temperature. With the AMD default fan curve setted and the board at 92 ºC the fans didn't even reach 40% and the temperature control is done based mostly on throttling. It's sad cause the board never gonna run at the default clock speeds. I checked and it's always fluctuating around 950 mhz. So... No point in having it overclocked. If it's not because of the temperature, it's because the Power Limit thing. Very sad.
> 
> This without mentioning that this model specifically doesn't run without some artifacts at it's stock settings and high temperatures (1070/1500 @ +25mV). Yes, the artifacting is very subtle, yet very disturbing when I spot one.
> 
> One thing I'm confused about is that the board artifacts (a minimum, but it does) at +90 ºC and 1070MHz @ +25mV... But at stock speeds it doesn't, even at the same temperatures... Could this be repaired increasing the voltage?
> 
> Thanks in advance!

Don't use the default fan profile; always use a custom fan profile. I doubt the core is fluctuating because of the power limit. I've found it's normal for a single Hawaii card to have very minor core fluctuation; if it's fluctuating a lot, then something else is interfering. Did you use MSI AB? If yes, please post a screenshot of your MSI AB settings (General tab). Also try "Unofficial overclocking mode without powerplay".

If it only artifacts when overclocked (even factory overclocked), raising the voltage should fix the artifacting.
Quote:


> Originally Posted by *navjack27*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> It should not artifacting at stock (factory overclocked). To test whether your card is faulty or not, you can either increase voltage or running the card at stock (1070/1500).
> 
> Hawaii cards are designed to be able to operate up to 95C without damaging the card. Once the card reaches 95C, the card will throttling but it shouldn't cause artifacting. If your card do, it most likely faulty.
> 
> 
> 
> 
> 
> 
> 
> mine artifacts kinda at anything higher then my 1100/1650 overclock. now as i understand the memory overclock i have is quite standard. but i would really love to get 1150 or higher on the core. the temps don't really get that high up there the way i have it now and i forget what temps i saw with it at higher then this.
> 
> when i overclock i just go thru valley benchmark and do a full run with every +10mhz i change on the core. i think one time i took a day to put it at stock clocks and then do my +10mhz on the clock with stock memory speeds (with power saving disabled for no clock fluctuations and 100% fan) until i hit what seemed to be the ceiling. and then i put core back at stock and took the memory up 10+mhz each run of valley. then i just tried to find a happy medium with each max clock i found. should i do that again just to gather some data?
> 
> EDIT: oooo i might as well say i didn't touch voltage i don't think but i do set max power limit

No need to do that again; 1100/1650 is your highest overclock without manually adding more voltage. BTW, 1250/1375/1500/1625/1750 are memory straps, and these numbers represent the max clock for each strap. Instead of 1650, I recommend dialing it down to 1625 for optimal performance; 1625 uses tighter timings than 1650.
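The strap behaviour described here can be illustrated with a tiny hypothetical helper (the strap limits are the ones from the post; the 25MHz "barely into the next strap" threshold is an assumption for illustration):

```python
# Each strap carries its own memory timings; the timings loosen when the
# clock crosses into the next strap, so the top of a strap (e.g. 1625)
# can outperform a slightly higher clock in the next strap (e.g. 1650).
STRAPS = [1250, 1375, 1500, 1625, 1750]  # max clock per strap, MHz

def strap_for(mem_clock):
    """Return the strap (its max clock) that a memory clock falls into."""
    for limit in STRAPS:
        if mem_clock <= limit:
            return limit
    return None  # beyond the highest known strap

def best_clock_in_strap(mem_clock):
    """If a clock is barely into a looser strap, suggest dropping back
    to the previous strap's max for tighter timings."""
    s = strap_for(mem_clock)
    if s is None:
        return mem_clock
    idx = STRAPS.index(s)
    if idx > 0 and mem_clock - STRAPS[idx - 1] <= 25:
        return STRAPS[idx - 1]
    return mem_clock
```

For example, 1650 falls in the 1750 strap, and this helper suggests 1625 instead, matching the advice above.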


----------



## Dundundata

Quote:


> Originally Posted by *kizwan*
> 
> Don't use default fan profile. Always use custom fan profile.I doubt the core fluctuatiing because power limit. I found that it's normal for single Hawaii card to have very minor core fluctuating. If it's fluctuating a lot then something else interfering. Did you use MSI AB? If yes, please post screenshot of your MSI AB setting (General tab). Also try "Unofficial overclocking mode without powerplay".
> 
> If it only artifacting when overclock (even with factory overclocked), raising voltage should fixed the artifacting.
> No need to do that again. 1100/1650 is your highest overclock without manually adding more voltage. BTW, 1250/1375/1500/1625/1750 are memory straps & these number represent max clock for each straps. Instead of 1650, I recommend dial it down to 1625 for optimal performance. 1625 use tighter timings than 1650.


Interesting, is that the case for all 390 cards? Also, while you're at it, care to critique my settings? I'm not sure what everything does, and I think I've left most of it at default.


----------



## kizwan

Quote:


> Originally Posted by *Dundundata*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Don't use default fan profile. Always use custom fan profile.I doubt the core fluctuatiing because power limit. I found that it's normal for single Hawaii card to have very minor core fluctuating. If it's fluctuating a lot then something else interfering. Did you use MSI AB? If yes, please post screenshot of your MSI AB setting (General tab). Also try "Unofficial overclocking mode without powerplay".
> 
> If it only artifacting when overclock (even with factory overclocked), raising voltage should fixed the artifacting.
> No need to do that again. 1100/1650 is your highest overclock without manually adding more voltage. BTW, 1250/1375/1500/1625/1750 are memory straps & these number represent max clock for each straps. Instead of 1650, I recommend dial it down to 1625 for optimal performance. 1625 use tighter timings than 1650.
> 
> 
> 
> 
> 
> 
> 
> Interesting is that the case for all 390 cards? Also while you're at it care to critique my settings I'm not sure what everything does and I've left most default I think.

Your settings are OK. Very minor fluctuation now and then is normal. If it happens all the time, either something is interfering or the games you're playing aren't really demanding. In the latter case, your gaming experience should not be affected.


----------



## battleaxe

Quote:


> Originally Posted by *afyeung*
> 
> Will definitely get to repasting the 390x. Thanks for the reassurance and info. I tried using the fan to blow heat away but it made almost no difference. Overall, as of now I am very dissatisfied with the thermals on the 390x and hopefully repasting will at least lower the temps to the 80s so I can start gaming for longer than 3 mins before my temps go through the roof.


There's no reason to be dissatisfied with your cards; they are starving for cool air. I'd get hot too in that situation. You've got to figure out a way to get them cool air. As it is now, there's no chance of them ever running cool, on the air card anyway.

Like you said earlier, it might be better to do an AIO on the top one too. You'd probably drop 20-30C on the top card in that situation.


----------



## afyeung

Quote:


> Originally Posted by *battleaxe*
> 
> There's no reason to be dissatisfied with your cards. They are starving for cool air. I'd get hot too in that situation. You gotta figure outa way to get them cool air. As it is now, there's no chance of them ever running cool, on the air card anyway.
> 
> Like you said earlier. Might be better to do an AIO on the top one too. You'd probably drop 20-30c on the top card in that situation.


Repasting my card right now. But yes, there is a big reason to be dissatisfied with the cooling performance. I had a fan blowing directly on the 390X and the side panel off, and it still hit very high temps. This card is huge and supposed to run cooler than the 290X. Also, I've seen videos of people running crossfire like I am, except both cards were on air; the bottom card wasn't even producing hot air, yet the 390X gets really hot. If repasting doesn't work, then there is something wrong with the cooler.


----------



## Dundundata

Quote:


> Originally Posted by *afyeung*
> 
> Repasting my card right now. But yes, there is a big reason to be dissatisfied with the cooling performance. I had a fan blowing directly on the 390x and the side panel off. It still hit very high temps. This card is huge and supposed to be cooler than the 290x. Also, I've seen videos of people running crossfire which I am, except both cards were on air. The bottom card isn't even producing hot air yet the 390x gets real hot. If repasting doesn't work then there is something wrong with the cooler.


Have you tried running just the 390X? I think the difference with two air-cooled cards is that at least there is airflow coming from the bottom.


----------



## Carniflex

Quote:


> Originally Posted by *battleaxe*
> 
> There's no reason to be dissatisfied with your cards. They are starving for cool air. I'd get hot too in that situation. You gotta figure outa way to get them cool air. As it is now, there's no chance of them ever running cool, on the air card anyway.
> 
> Like you said earlier. Might be better to do an AIO on the top one too. You'd probably drop 20-30c on the top card in that situation.


If 2x 140mm fans blowing cold air at it is "starving for cool air", then I'm not sure what more can be done short of putting Ultra Kazes on it, but those are not suitable for rooms where people intend to live. The 390X is just an exceptionally hot card as far as I can see.


----------



## afyeung

Quote:


> Originally Posted by *Dundundata*
> 
> Have you tried running just the 390x? I think the difference with 2 air cooled cards is at least there is airflow coming from the bottom.


Like I said, I had a 140mm blowing on it with the panel off. I just repasted it. I'll test the temps when I have time. There is absolutely no reason for it to be running that hot.


----------



## jdorje

Quote:


> Originally Posted by *afyeung*
> 
> Like I said, I had a 140mm blowing on it with the panel off. I just repasted it. I'll test the temps when I have time. There is absolutely no reason for it to be running that hot.


What are its temps if you remove the bottom card?


----------



## afyeung

Quote:


> Originally Posted by *jdorje*
> 
> What are its temps if you remove the bottom card?


Haven't tried it yet. I'll try it if the repaste doesn't work. I have a fan blowing directly on it and I can feel a ton of air coming out of the side of the cooler, so it's definitely getting a substantial amount of air.


----------



## battleaxe

Quote:


> Originally Posted by *afyeung*
> 
> Repasting my card right now. But yes, there is a big reason to be dissatisfied with the cooling performance. I had a fan blowing directly on the 390x and the side panel off. It still hit very high temps. This card is huge and supposed to be cooler than the 290x. Also, I've seen videos of people running crossfire which I am, except both cards were on air. The bottom card isn't even producing hot air yet the 390x gets real hot. If repasting doesn't work then there is something wrong with the cooler.


Quote:


> Originally Posted by *Carniflex*
> 
> If 2x 140mm fans blowing cold air into it "starving for cool air" then I'm not sure what can be done to give it any more sort of putting ultra kazes on it but latter ones are not suitable for rooms where peolle intend to live as well. 390X is just excpetionally hot card as far as I can see.


Guys... guys... the point is, there's a 1/8th of an inch between those cards. Take the bottom card out, test the top one for temps, then come back here and tell us you don't have a problem in that scenario. Basic physics, man. Nothing rocket science about this.

Unless of course you use a high-flow air compressor to push cold air between those cards, but I don't see one of those in there. Until then, or until you change the config, you are going to run hotter than you want to.


----------



## Joe88

Quote:


> Originally Posted by *Dundundata*
> 
> Nice but why leave power limit at +0


I looked at different sites, and they all seemed to say to leave it alone because raising it causes OC instability.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Joe88*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dundundata*
> 
> Nice but why leave power limit at +0
> 
> 
> 
> I looked at different sites and they all seemed to say leave it alone and causes oc instability

Whoever said that is an idiot. The power limit just lets the card draw additional power so it can maintain the clock speeds you set for it (to a point, of course).

It's not the same as overvolting.


----------



## Joe88

So should I just max it out then?


----------



## battleaxe

Quote:


> Originally Posted by *Joe88*
> 
> So should I just max it out then?


Yes, this is what we all do; anyone who has been around here a little while, anyway.


----------



## afyeung

MSI 390x still extremely hot and loud even after the thermal paste reapplication. Going to run the card on the bottom and see how it does.


----------



## navjack27

afyeung, what is hot to you for your card? also, a fan blowing on a card with two fans on it with fans on fans on fans on fans... basically what i'm saying is TURBULENCE


----------



## tangelo

Quote:


> Originally Posted by *Joe88*
> 
> So should I just max it out then?


Yes.


----------



## tangelo

Quote:


> Originally Posted by *navjack27*
> 
> afyeung, what is hot to you for your card? also, a fan blowing on a card with two fans on it with fans on fans on fans on fans... basically what i'm saying is TURBULENCE


Indeed. And some (most?) of the time, opening the case and adding fans blowing in just makes the situation worse. He could try closing the case and making sure the ventilation is properly set up: enough air in, enough air out, and nice flow through the case.


----------



## navjack27

only reason my side is open on mine is cuz i have no case fans LOL. the h80i gt on my cpu and the 390x are the only fans. my ****ty thermaltake case has a useless blue led fan in the front that just BARELY causes the hairs on my arm to move, let alone moving actual volumes of air.


----------



## Majentrix

My 390 Strix gets lower benchmark scores whenever I touch anything in Afterburner. What's going on here? Even moving the power and voltage sliders up dropped my Firestrike score by 500 points.


----------



## navjack27

disable all the graphs. pause the graphs and turn off the OSD in rivatuner

also, since i don't have to edit my last post, to emphasize my point about my case being ****ty:


this "machine" gets a 12k 3dmark score LOL


----------



## afyeung

Quote:


> Originally Posted by *navjack27*
> 
> afyeung, what is hot to you for your card? also, a fan blowing on a card with two fans on it with fans on fans on fans on fans... basically what i'm saying is TURBULENCE


Quote:


> Originally Posted by *tangelo*
> 
> Indeed. And some (most?) of the time opening the case and putting fans to blow in just makes the situation worse. He could try to close the case and make sure the ventilation is properly set. Enough air in, enough air out and nice flow thru the case.


If the card is near throttling (95°C) at stock, I consider that hot. Actually, if I play any longer it might throttle. The cooler is moving a ton of hot air out of the case and the chip is still super hot. I thought leaving the side panel off would help temps? And it's not as ghetto as it sounds; I just have a fan pointed at the GPU, mounted on the drive bay, since the front fans don't have enough static pressure to reach the GPU (many cases have this feature). None of this makes any sense. Even when I have the bottom card off, so it literally produces no heat, the 390X still gets super hot. I'll be really pissed if the chip is faulty in a way that gives it a significantly higher TDP. If switching the cards helps for some odd reason I'll be really happy, but if it still runs hot I'll sell the card, move back to Hawaii, and be able to afford a liquid-cooled 290/X. Really disappointed, since this card was supposed to offer great thermals and acoustics while matching or beating the 290(X) in performance.


----------



## afyeung

Quote:


> Originally Posted by *afyeung*
> 
> If the card is near throttling(95) at stock I consider that hot. Actually, if I play any longer it might throttle. The cooler is moving a ton of hot air out of the case and the chip is still super hot. I thought leaving the side panel off would help temps? And it's not as ghetto as it sounds. I just have a fan point at the gpu mounted on the drive bay since the front fans don't have enough static pressure to reach the gpu(many cases have this feature). None of this makes any sense. Even when I have the bottom card off so it literally produces no heat, the 390x still gets super hot. I'll be really pissed if the chip is faulty which causes it to have a significantly higher tdp. If switching the cards helps for some odd reason I'll be really happy. But if it still runs hot, I'll sell the card and move back to Hawaii and be able to afford a liquid cooled 290/x. Really disappointed since this card was supposed to offer great thermals and acoustics yet offer similar or better performance than the 290(x).


I meant switching the cards as in putting the liquid cooled 290 on top as others have suggested.


----------



## navjack27

return that card if it runs that hot at stock


----------



## afyeung

Quote:


> Originally Posted by *navjack27*
> 
> return that card if it rruns that hot stock


I got it from MSI as a replacement for my 290, so I can't really return it. If putting it on the bottom doesn't help, I'll have to sell it.


----------



## Dundundata

Try removing the bottom card and see if that helps.


----------



## Ha-Nocri

Not sure if posted:

Sapphire R9 390 Nitro 8GB (Rev 2 w/ backplate)

The perfect card








Quote:


> ...our thermal imaging camera shows a temperature reduction of almost 30c from the PCB to backplate.


----------



## jon666

Can you buy the back plates anywhere?


----------



## navjack27

ha-nocri, that's pretty impressive for that REV 2 card.


----------



## Joe88

So all that was changed was the back plate?


----------



## jdorje

This picture is completely irrelevant to this thread, but it goes further toward explaining what I want to do than any words can.










A while ago I came up with the above data set for my 4690K, convincing myself that voltage scaling is linear at low clocks, then becomes quadratic, then (maybe) super-quadratic.

Now I'm wondering the same thing about my 390. I've got about 5 data points so far: at +0 mV I can get the clock to 1100 MHz, at +100 mV to about 1130, and at -100 mV to 1040. Is this going to be quadratic all the way? Voltage reduction actually seems pretty attractive; -100 mV results in a 5% loss of performance versus +0 (note: that's still higher than the stock 1015 MHz!), but power use is ~25% lower (not measured exactly yet, but temps are about 25% closer to ambient).

The whole thing takes some time and has to be done all in a row to be sure of the results (otherwise you lose interest, and then a new driver comes out and everything changes). For the CPU test I ran my computer for a whole afternoon, reading a good book while crashing my computer again and again. I ran it in .01V VID increments and 100 MHz increments; that took about 85 15-minute stress tests. It also requires a soft definition of stable; in the above case, the ability to pass one loop of x264.

For the 390 I probably want to make things much shorter. Stability will be defined as passing one Valley benchmark run without artifacting. RAM I'll probably lock at 1625 MHz, which I know is well within the stable range (afaict RAM stability and core stability are independent, but there's no reason to let a high RAM clock influence the test). With a 200 mV range I can do 21 voltage increments (10 mV each) and about 20 clock increments of 5 MHz each (lol). With such small clock increments, though, will the inherent randomness of a limited stability test provide too much noise? That'll take 40-45 runs of the Valley benchmark, which is 4(?) minutes long, a fraction of the time investment the CPU test took.

Any tips would be appreciated. I'm going to wait for a cold day for this.

Edit: there is exactly one successful run for each clock increment and exactly one failed run for each voltage increment.
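The sweep can be planned in a few lines. Everything below is an illustrative sketch: the stock VID and the `is_stable` callback are assumptions (a real run would launch Valley and check for artifacts), and the power figure uses the usual dynamic-power approximation, P scaling roughly as f × V², rather than measurements.

```python
STOCK_CLOCK = 1015      # MHz, stock 390 core clock (from the post)
STOCK_VCORE = 1.20      # V, assumed stock VID for illustration

def rel_power(clock_mhz, offset_mv):
    """Dynamic-power estimate relative to stock, using P ~ f * V^2."""
    v = STOCK_VCORE + offset_mv / 1000.0
    return (clock_mhz / STOCK_CLOCK) * (v / STOCK_VCORE) ** 2

def sweep(clocks, offsets, is_stable):
    """For each clock, record the lowest voltage offset that passes one
    stability run (one Valley pass), or None if no offset works."""
    return {c: next((o for o in offsets if is_stable(c, o)), None)
            for c in clocks}
```

With the three data points from the post (1040 @ -100mV, 1100 @ +0, 1130 @ +100mV), this model also reproduces the attraction of undervolting: 1040 @ -100mV comes out well under stock power while 1130 @ +100mV is well over it.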


----------



## Carniflex

Quote:


> Originally Posted by *Joe88*
> 
> So all that was changed was the back plate?


I'm not familiar with that specific card; however, a heatspreader or backplate with some thermal pads can have a significant effect on temperatures. For example, a Sapphire 7950 FleX I have has a small metal heatspreader bolted to the VRMs and memory modules on the cooler side, and those run a lot cooler than the ones on the opposite side even without much airflow (when the card was water-cooled with a core-only block).


----------



## Majentrix

Quote:


> Originally Posted by *navjack27*
> 
> disable all the graphs. pause the graphs and turn off the OSD in rivatuner


I've done this and there haven't been any changes.

This is at stock:
http://www.3dmark.com/3dm/9721328

And this is with +50mv, +50% power and 1075 on the core:
http://www.3dmark.com/3dm/9721375


----------



## navjack27

don't raise the voltage. you give it too much and performance lowers. i think hardOCP talked about this in their review
http://www.hardocp.com/article/2015/07/06/msi_radeon_r9_390x_gaming_8g_overclocking_review/2#.Vm5dvFUrLmg
Quote:


> We have to be careful raising the Core Voltage. Raising it too high will cause performance to degrade. Many overclockers make the mistake of raising the voltage as high as possible and then try to find the highest stable overclock. While that might seem intuitive, it is not. Raising the voltage quickly pushes the GPU to its TDP limit and once you hit that TDP limit your potential for sustained performance diminishes. If you do that, it is very likely that once you start gaming you will not be enjoying the performance advantage that you think you are set up for. We suggest that you need to gradually bump the voltage until you find the best actual in-game clocks. It is all a big time consuming balancing act.


but with that in mind and the new drivers...

i did some testing

http://www.3dmark.com/3dm/9721883?



and without gpu-z and afterburner doing monitoring and logging
http://www.3dmark.com/fs/6803726

the highest i was able to get without loss in performance was 1150/1750 with +50 power limit and +75 core mv and nothing added to aux mv. i tried 1160 and i artifacted in firestrike then i upped it to +100mv but that just resulted in less fps in the 2nd test when the camera is panning from the side of the figure (usually goes up to like 83fps) with a like 60 something fps at that part so i called it right there.

EDIT: LOL zero performance gain in CSGO's FPS_BENCHMARK map (cpu bound)
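The "did more voltage actually help" check above can be made less eyeball-y by comparing fps samples from the same scene across runs. A hedged sketch (the 5% tolerance and the sample numbers are arbitrary):

```python
# Flag a throttling-induced regression: if the candidate run's median fps at a
# fixed scene is more than `tolerance` below the baseline's, the extra voltage
# is hurting (pushing into the TDP limit), not helping.

from statistics import median

def regressed(baseline_fps, candidate_fps, tolerance=0.05):
    """True if the candidate run is more than `tolerance` slower at the median."""
    return median(candidate_fps) < median(baseline_fps) * (1 - tolerance)

# e.g. the ~83 fps camera pan dropping into the 60s after +100mv:
print(regressed([82, 83, 84], [61, 63, 65]))  # True -> back the voltage off
```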


----------



## Carniflex

Is it only me, or is the 390X more "jittery" in games than the 7950? Or is it perhaps an issue with the new Crimson drivers? The game where I specifically seem to have a problem is Planetside 2 at 4k resolution in potato mode (low preset) - the in-game fps counter shows a consistent 59/60 frames per second, however when I turn quickly I can "feel" the jitter, which I am rather not fond of.

Card is atm at -10% power as it is unusable at 0% because of particularly loud howling and temperatures reaching 90+ C on it. Dunno how reliable the GPU-z readings are for it, but I have never seen that card at its factory clock (1060 MHz for Gigabyte 390X G1) even when moving the power slider to +10%. Even at -10% power it's hitting 80 .. 85 C with one 140mm pushing cold air into it and another 140 mm sucking hot air out from the area it's in.


----------



## Darkeylel

Quote:


> Originally Posted by *Carniflex*
> 
> Is it only me or is 390X more "jittery" in games than 7950? Or perhaps the new crimson drivers issue? The game specifically where I seem to have a problem is Planetside 2 at 4k resolution in potato mode (low preset) - in game fps counter shows consistent 59/60 frames per second, however when I turn quickly I can "feel" the jitter in there which I am rather not fond of.
> 
> Card is atm at -10% power as it is unusable at 0% because of particularly loud howling and temperatures reaching 90+ C on it. Dunno how reliable the GPU-z readings are for it but I have never seen that card at it's factory clock (1060 MHz for Gigabyte 390X G1) even when moving the power slider to +10%. Even at -10% power it's hitting 80 .. 85 C with one 140mm pushing cold air into it and another 140 mm sucking hot air out from the area it's in.


I think it's Crimson because in black ops3 I had a really solid consistent FPS. Since going to these drivers I get random FPS drops.

Not 100% sure if it's the driver or win10 being crappy still


----------



## BakedIkezor

*PSA Announcement for non-backplate Sapphire R9 390 Nitro owners*

Are you an early adopter like me that bought a non-backplate version of the R9 390? Is not having a backplate on your otherwise magnificent R9 390 Nitro bugging you?

If so, contact Sapphire Support and they will sell you a backplate for a small fee.

They charged me 27USD including shipping for the backplate. It doesn't improve your GPU performance ofc, but if you have a windowed case and care about the aesthetics of your system as much as the functionality of it, it's a no brainer.


----------



## navjack27

Quote:


> Originally Posted by *Carniflex*
> 
> Is it only me or is 390X more "jittery" in games than 7950? Or perhaps the new crimson drivers issue? The game specifically where I seem to have a problem is Planetside 2 at 4k resolution in potato mode (low preset) - in game fps counter shows consistent 59/60 frames per second, however when I turn quickly I can "feel" the jitter in there which I am rather not fond of.
> 
> Card is atm at -10% power as it is unusable at 0% because of particularly loud howling and temperatures reaching 90+ C on it. Dunno how reliable the GPU-z readings are for it but I have never seen that card at it's factory clock (1060 MHz for Gigabyte 390X G1) even when moving the power slider to +10%. Even at -10% power it's hitting 80 .. 85 C with one 140mm pushing cold air into it and another 140 mm sucking hot air out from the area it's in.


i have none of those issues with mine. i love my 390x. butter smooth


----------



## Agent Smith1984

Plenty of member list updates added.... OC's, cooling changes, and new members.

Let me know if you see something missing or not right.


----------



## navjack27

mister smith...

navjack27
MSI
Gaming 390X
Air/stock
1100/1525
1150/1650

could be changed to 1150/1750 +75core with my new 3dmark score as validation


----------



## PhillyB

Quote:


> Originally Posted by *KNG HOLDY*
> 
> i want to watercool my msi r9 390 but the block is nowhere available not even at alphacool
> 
> 
> 
> 
> 
> 
> 
> ((


Not sure if anyone responded as I'm still working through the posts.

Alphacool is currently manufacturing more of these. Should be available soon at their website. I have one on backorder.


----------



## battleaxe

Quote:


> Originally Posted by *Carniflex*
> 
> Is it only me or is 390X more "jittery" in games than 7950? Or perhaps the new crimson drivers issue? The game specifically where I seem to have a problem is Planetside 2 at 4k resolution in potato mode (low preset) - in game fps counter shows consistent 59/60 frames per second, however when I turn quickly I can "feel" the jitter in there which I am rather not fond of.
> 
> Card is atm at -10% power as it is unusable at 0% because of particularly loud howling and temperatures reaching 90+ C on it. Dunno how reliable the GPU-z readings are for it but I have never seen that card at it's factory clock (1060 MHz for Gigabyte 390X G1) even when moving the power slider to +10%. Even at -10% power it's hitting 80 .. 85 C with one 140mm pushing cold air into it and another 140 mm sucking hot air out from the area it's in.


Is the AMD Gaming App installed? I noticed this when it was recording video. I didn't even realize it was doing it, but seems it was recording video and was causing the stuttering. Soon as I turned it off, all returned to normal.


----------



## OdinValk

I seem to be getting a rather annoying bottleneck between my Sapphire R9 390 and my lowly AMD FX-6300, which is already overclocked to 4.4GHz. I have pretty decent cooling and could probably push it to 4.5 or 4.6, maybe higher, but would that help to alleviate the bottleneck? Or am I just spinning my wheels and need to upgrade the chip? So far I have only noticed it in Fallout 4, which I know is a fairly CPU intensive game. I will do a little more testing in some other games and see what I get, but the question remains.


----------



## Sgt Bilko

Quote:


> Originally Posted by *OdinValk*
> 
> I seem to be getting a rather annoying bottleneck between my Sapphire R9 390 and my lowly amd fx-6300 which is already overclocked to 4.4ghz I have pretty decent cooling and could probably push it to 4.5 or .6 maybe higher.. but would that help to alleviate the bottleneck? Or am I just spinning my wheels and need to upgrade the chip? so far I have only noticed it in Fallout 4 which i know is a fairly CPU intensive game.. I will do a little more testing in some other games and see what I get, but the question remains


Turn shadows down to medium or low and godrays to medium, that'll help you out


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> don't raise the voltage. you give it too much and performance lowers. i think hardOCP talked about this in their review
> http://www.hardocp.com/article/2015/07/06/msi_radeon_r9_390x_gaming_8g_overclocking_review/2#.Vm5dvFUrLmg
> but with that in mind and the new drivers...
> 
> i did some testing
> 
> http://www.3dmark.com/3dm/9721883?
> 
> 
> 
> and without gpu-z and afterburner doing monitoring and logging
> http://www.3dmark.com/fs/6803726
> 
> the highest i was able to get without loss in performance was 1150/1750 with +50 power limit and +75 core mv and nothing added to aux mv. i tried 1160 and i artifacted in firestrike then i upped it to +100mv but that just resulted in less fps in the 2nd test when the camera is panning from the side of the figure (usually goes up to like 83fps) with a like 60 something fps at that part so i called it right there.
> 
> EDIT: LOL zero performance gain in CSGO's FPS_BENCHMARK map (cpu bound)


Nice score, your memory OC is quite high. If you really want to "go for it", maybe try adding some Aux volts and see if you can't increase the core a bit. But I would be quite happy with where your card is. Curious, what is your actual voltage reading at +75mV?


----------



## navjack27

GPU-ZSensorLog_1150_1750_plus75.txt 404k .txt file


will this help


----------



## OdinValk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Turn shadows down to medium or low and godrays to medium, that'll help you out


Thanks I will give that a try.


----------



## Carniflex

Quote:


> Originally Posted by *battleaxe*
> 
> Is the AMD Gaming App installed? I noticed this when it was recording video. I didn't even realize it was doing it, but seems it was recording video and was causing the stuttering. Soon as I turned it off, all returned to normal.


No I have never bothered to install it.


----------



## battleaxe

Quote:


> Originally Posted by *Carniflex*
> 
> No I have never bothered to install it.


Can't be that then...


----------



## kizwan

Quote:


> Originally Posted by *navjack27*
> 
> GPU-ZSensorLog_1150_1750_plus75.txt 404k .txt file
> 
> 
> will this help


Voltage when under load is 1.227 - 1.25V.
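In case anyone wants to pull that range out of a GPU-Z sensor log themselves, a minimal sketch (the "VDDC [V]" column header and the SAMPLE lines are assumptions; check your own log's header row):

```python
# Extract the min/max core voltage from a GPU-Z sensor log (comma-separated
# text with a header row). SAMPLE stands in for the real log file contents.

import csv
import io

SAMPLE = """Date , GPU Core Clock [MHz] , VDDC [V]
2015-12-20 12:00:01 , 1150.0 , 1.227
2015-12-20 12:00:02 , 1150.0 , 1.250
2015-12-20 12:00:03 , 300.0 , 0.977
"""

def vddc_range(log_text, column="VDDC [V]"):
    """Return (min, max) of the given voltage column."""
    rows = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    volts = [float(row[column].strip()) for row in rows]
    return min(volts), max(volts)

print(vddc_range(SAMPLE))  # (0.977, 1.25)
```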


----------



## navjack27

awesome work on graphing that for me

EDIT: am i the only one who likes the amd gaming evolved program? i use that when i wanna record stuff and have pretty much no impact in my gameplay performance. when i want quality i use OBS with special settings.


----------



## Charcharo

http://wccftech.com/amd-aibs-prepare-4-gb-radeon-r9-390/

How do you guys feel about this?


----------



## battleaxe

Quote:


> Originally Posted by *navjack27*
> 
> awesome work on graphing that for me
> 
> EDIT: am i the only one who likes the amd gaming evolved program? i use that when i wanna record stuff and have pretty much no impact in my gameplay performance. when i want quality i use OBS with special settings.


I've noticed I get stuttering when using it. So only use it when I want to record something.


----------



## Joe88

Quote:


> Originally Posted by *Charcharo*
> 
> http://wccftech.com/amd-aibs-prepare-4-gb-radeon-r9-390/
> 
> How do you guys feel about this?


Interesting, was expecting at least a $50 difference
Not sure how much of a market it will have, the 8GB cards go on sale all the time for below $300, a few times you could have gotten one for $210 with combined promotions, I got mine for $255
its more future proofing than anything else, some game are starting to approach 3.5GB @ 1080p but running 100 skyrim mods will eat vram like crazy way more than 4GB
still 0.5GB better than the competition though


----------



## Dundundata

Quote:


> Originally Posted by *Charcharo*
> 
> http://wccftech.com/amd-aibs-prepare-4-gb-radeon-r9-390/
> 
> How do you guys feel about this?


They should have just made these cards with 6GB to begin with and called it a day


----------



## Charcharo

I guess. With 6 Gigs it would solve any VRAM issues at 1080 and 1440 (and most 4K ones too).

Then again there is no kill like overkill


----------



## surfinchina

Quote:


> Originally Posted by *Charcharo*
> 
> I guess. With 6 Gigs it would solve any VRAM issues at 1080 and 1440 (and most 4K ones too).
> 
> Then again there is no kill like overkill


Overkill for some...
I got this card for Architectural modelling. 6gb can fit a big model in it so I can spin around it without stuttering, 8gb can fit a huge model, with trees, people, complex texture mapping and I can move around it as if it's reality.
Definitely worth it when I'm chasing dollars for my designs...

Not that I design such things, but I can if I want to. Maybe it is overkill









On a side issue, I got a Dell 34" screen (more overkill) recently and was using the DP port. When I started the computer I had to turn the screen off then on again to show anything.
I changed to HDMI 2 port and it's fine.

Stupid computers...

I think it's something to do with lack of signal and sleep mode or something.


----------



## Dundundata

don't get me wrong I am fine with 8gb









just seems weird to then release with 4


----------



## Derek129

Hey guys, I want to get rid of the horrid fans on my Gigabyte 390 if possible. I was originally going to get the Arctic Accelero Xtreme IV like a fellow member here had done with his, but I don't have the case space for that and don't feel like cutting any more than I have already. So I was wondering if a Kraken G10 and a Corsair H55 would work, or the Arctic Hybrid III AIO liquid cooling setup, and which one would work best. Let me know your input! Thanks


----------



## battleaxe

Quote:


> Originally Posted by *Derek129*
> 
> Hey guys I want to get rid of the horrid fans on my gigabyte 390 if possible. I was originally going to get the arctic accelero extreme iv like a fellow member here had done with his but I dont have the case space for that and dont feel like cutting anymore than I have already. So I was wondering if a Kraken g10 and a corsair h55 would work or the arctic hybrid III AIO liquid cooling set up and which one would work best. Let me know your input! Thanks


An AIO will win both for heat and noise. Done it enough myself to know. Air just can't compare. The only issue you face is getting the VRMs colder, but there are ways to win that battle too.


----------



## pillowsack

Well, I got my XFX DD under a universal GPU water block tonight. Short Furmark runs without the VRMs spiking, and I was able to hit 1225/1750 with +90mV and 50% PL.

I copied someone else's overclock off the spreadsheet, and it worked well. The core never went over 45C in 2 hours of GTA 5 online.

I have the Gelid VRM coolers coming in the mail. Even though the card has VRM 1 cooling, it's not adequate, and VRM 2 can hit 75C max. When they're cooled I'll probably open TriXX and aim for some higher numbers with those sweet mV increases (gimme that 200 GTexel/s).


----------



## yuannan

I just received my R9 390X Sapphire Tri-X,
and at idle with MSI Afterburner at 20% fan speed it sits at a very hot 50-55C.

Is this normal?


----------



## Ha-Nocri

Quote:


> Originally Posted by *yuannan*
> 
> I just received my r9 390x Sapphire trix.
> and at idle with MSI afterburn 20% fan speed it sits at a very hot 50-55C
> 
> Is this normal?


I thought fans are not spinning at all at idle?


----------



## yuannan

I use Afterburner to make sure it does; at 20% it is near silent, and I don't like the sudden burst of fans kicking in. Plus lower temps are always good.


----------



## jodybdesigns

Welcome to club AMD. Where we like our GPUs like we like our women. Hot and fast.


----------



## Ha-Nocri

Quote:


> Originally Posted by *yuannan*
> 
> I use afterburn to make sure it does, at 20% it is near silent and I don't like the sudden burst of fans kicking in. Plus lower temps is always good.


Well, if fans are rly spinning, then yes, 50-55c is high. Are you sure they are spinning tho? Physically checked with your fingers?


----------



## yuannan

Yes they are 100% on, What should I do now?


----------



## Ha-Nocri

Quote:


> Originally Posted by *yuannan*
> 
> Yes they are 100% on, What should I do now?


How is air-flow in your case?
What is the temp under load? Like, after 5-10 minutes of running Unigine Heaven?

Also, is the card down-clocking to 300MHz while idling?


----------



## yuannan

Air flow is decent. Below average but good.

1*140mm running at 700rpm+ at 50%
1*120mm running at 600rpm+ at 50%

Fan speed is controlled by my mobo.

The temps are quite high but it's kinda oh-kay considering it's only one fan at 20%.

posted this on LTT: http://linustechtips.com/main/topic/505938-r9-390x-hot-temps/ and people are saying it's normal?

My previous card was a MSI r9 380 and it ran a coolish 40C idle and 70C max load on Furmark (i know it's not good but it works nice for what i need)

this card's load temps are pretty good all considered: 65-70C at only 40-45% fan speed


----------



## XxxxVulcanxxxX

Quote:


> Originally Posted by *yuannan*
> 
> Air flow is decent. Below average but good.
> 
> 1*140mm running at 700rmp+ at 50%
> 1*120mm running at 600rmp+ at 50%
> 
> fan speed is controlled my mobo.
> 
> The temps are quite high but it's kinda oh-kay considering it's only one fan at 20%.
> 
> posted this on LTT: http://linustechtips.com/main/topic/505938-r9-390x-hot-temps/ and people are saying it's normal?
> 
> My previous card was a MSI r9 380 and it ran a coolish 40C idle and 70C max load on Furmark (i know it's not good but it works nice for what i need)
> 
> this card runs at very good considering idle temp at 65-70C at only 40-45%


Before moving to watercooling I had a tough time keeping my MSI 390X cool, and I had 4x140mm 1000rpm fans blowing in and 2 more as exhaust, and even at idle I would see temps in the 40s. Gaming was almost always 70C+ at stock clock speeds. I'm also UK based so it would be similar ambient temps. My room is normally around 18C in the day and up to 20-25C when the wife gets home and puts the heating on, lol.

My personal view is that you might need a bit more cool air in if your case will support it. I dropped around 8-10 C at load by using the side panel fan blowing directly on to the card.


----------



## jdorje

Quote:


> Originally Posted by *yuannan*
> 
> I use afterburn to make sure it does, at 20% it is near silent and I don't like the sudden burst of fans kicking in. Plus lower temps is always good.


I have the same thing with my xfx. The fan speed is too low to drop temps that far, or maybe too low to circulate warm air out of the case. Give it long enough it'll drop to ambient+10 but it's really slow. On the other hand it's basically silent so I agree there's no need for a fanless mode.


----------



## pillowsack

Just put on the Gelid VRM coolers and mounted my stock Intel fan onto my VRMs. Let's see what I can push this mofo to


----------



## Ha-Nocri

Quote:


> Originally Posted by *yuannan*
> 
> Air flow is decent. Below average but good.
> 
> 1*140mm running at 700rmp+ at 50%
> 1*120mm running at 600rmp+ at 50%
> 
> fan speed is controlled my mobo.
> 
> The temps are quite high but it's kinda oh-kay considering it's only one fan at 20%.
> 
> posted this on LTT: http://linustechtips.com/main/topic/505938-r9-390x-hot-temps/ and people are saying it's normal?
> 
> My previous card was a MSI r9 380 and it ran a coolish 40C idle and 70C max load on Furmark (i know it's not good but it works nice for what i need)
> 
> this card runs at very good considering idle temp at 65-70C at only 40-45%


Load temperature is fine; idle is a bit high, but overall you're fine. Not sure tho, how you set only 1 fan to spin @ idle. Can you set all 3 fans to spin @20% @idle?

But if I had sapphire 390 instead of my 290, I would leave fans off. It's a great feature, your card won't get dusty as fast. I would also increase case air-flow with 1 or 2 additional fan(s)


----------



## yuannan

Quote:


> Originally Posted by *Ha-Nocri*
> 
> blah


It's in the software/firmware of my R9 390X's IFC which does that; it is actually advertised as one of the main selling points. But it does not have a switch like the older cards do. I'll look around, but if I can I would change it to 10% fan speed on all 3 so it's actually "30%". Maybe even higher if it's near silent.

My case has all filters so it's fine to have it on all the time.


----------



## Ha-Nocri

Dust always finds a way ;D


----------



## Haenir

Hi, I have the Sapphire 390 and I can get 1100 on the core and 1700 on the memory without touching voltage or power limit. It's stable (Heaven for 20 mins without problems) and the temps are fine (67C at max load), but I just realized that I'm having stuttering, screen tearing and in general graphical problems in quite a few games. Might it have to do with the fact that I didn't touch the power limit, or that the OC is too high? In that case what could I settle on?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Haenir*
> 
> Hi, I have the sapphire 390 and i can get 1100 on the core and 1700 on the memory without touching voltage or power limit. It's stable(heaven for 20 mins without problems) and the temps are fine(67C at max load), but i just realized that I'm having stuttering, screen tearing and in general graphical problems in quite a few games. Might it have to do with the fact that I didn't touch the power limit or that the oc is too high? In that case what could I settle on?


What is your CPU, RAM... and what games?


----------



## Haenir

RAM: 2x4GB C13 Corsair Vengeance, and the CPU is an Intel i5 6500. I'm having problems with Wolfenstein, Project Cars, Serious Sam and even Rocket League.


----------



## Ha-Nocri

CPU and RAM are fine. But I didn't play any of the games so don't know. Screen tearing occurs if you don't have v-sync on. I'm guessing you have a 60Hz monitor? If that is so, the best way to play games is to aim to always have 60+ fps and turn v-sync on.


----------



## Haenir

Okay thanks, the fact is that vsync does remove tearing but adds really annoying stuttering. About the oc should I leave the power limit and the clocks or change something?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Haenir*
> 
> Okay thanks, the fact is that vsync does remove tearing but adds really annoying stuttering. About the oc should I leave the power limit and the clocks or change something?


v-sync shouldn't add stuttering, unless your frame-rate drops below 60.

You can increase power-limit, just in case the card uses more power than what power-limit at 0% allows... it can't hurt.

Also, what PSU do you have?


----------



## Haenir

Okay I'll do that, should i go to +10% or 20? The psu is a 650W xfx gold plus


----------



## Ha-Nocri

You can set +50%, won't change anything. Only will give card more room if it needs, which I doubt.


----------



## Haenir

Thanks


----------



## Dundundata

Try running stock, do you sill get stuttering?


----------



## jdorje

Quote:


> Originally Posted by *pillowsack*
> 
> Just put on the Gelid VRM coolers and mounted my stock intel fan onto my VRM's. Lets see what I can push this mofo to


Got pics?


----------



## navjack27

Quote:


> Originally Posted by *Haenir*
> 
> Hi, I have the sapphire 390 and i can get 1100 on the core and 1700 on the memory without touching voltage or power limit. It's stable(heaven for 20 mins without problems) and the temps are fine(67C at max load), but i just realized that I'm having stuttering, screen tearing and in general graphical problems in quite a few games. Might it have to do with the fact that I didn't touch the power limit or that the oc is too high? In that case what could I settle on?


it really sounds like power throttle. disable powerplay with AB and set power limit to max to let the gpu use all available power it needs at load. no vsync and set driver texture quality to performance and surface optimizations.


----------



## Majentrix

What kind of temps are normal for Firestrike? My 390 Strix is reaching 94c and throttling down to 1000MHz almost immediately.


----------



## Carniflex

Quote:


> Originally Posted by *Haenir*
> 
> Okay thanks, the fact is that vsync does remove tearing but adds really annoying stuttering. About the oc should I leave the power limit and the clocks or change something?


I have a similar issue. In Planetside 2, running on all low settings at Hossin, I get periodic "slowdown" or "jerking" with "smoothing" on (that's their version of v-sync for windowed mode). This is different than tearing and is pretty noticeable. If I turn smoothing/v-sync off it gets worse and the frame rate drops to around 40 from around 60 for some reason. In my case it might be the card throttling, as I have set it to -10% power because of the heat issues; at stock settings it hits 90+ C and howls so bad I can hear it through closed earphones a meter away from a closed PC case, over game sounds and reasonable level music. It also causes the wife to come up from the first floor, from watching TV, to remark that she can hear it from downstairs. So setting it to +50% power is not really an option for me.

I'll add a remark that I have never seen my card run at full stock speed in GPU-z - it seems to max out at around 800 .. 900 MHz and just jerk around there with its clocks even when setting the power limit to +10%.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> v-sync shouldn't add stuttering, unless your frame-rate drops below 60.
> 
> You can increase power-limit, just in case the card uses more power than what power-limit at 0% allows... it can't hurt.
> 
> Also, what PSU do you have?


I have a similar issue, and unfortunately the power limit is not an option. Actually, in my case turning v-sync off makes this "jerking" significantly worse. In the case of Planetside 2, I would go as far as to say that this Gigabyte 390X G1 performs worse for me than the 7950 @ 1 GHz did at 4k resolution. It's a lot louder and hotter and does not seem to be able to offer me greater performance.
Quote:


> Originally Posted by *navjack27*
> 
> it really sounds like power throttle. disable powerplay with AB and set power limit to max to let the gpu use all available power it needs at load. no vsync and set driver texture quality to performance and surface optimizations.


That card will hit the thermal limit in like 0.5 sec if you just push the dials all the way to the right. I would probably need to water cool it to be able to do that without hitting 95 C when just watching youtube or something.

Anyway - if it's a power throttle issue, I would expect AMD to address it as a top priority with their drivers. It's _very_ annoying when it happens, and considering the thermals of the 390 / 390X I do expect it to either power throttle or thermal throttle for the majority of users at least occasionally.


----------



## navjack27

Quote:


> Originally Posted by *Majentrix*
> What kind of temps are normal for Firestrike? My 390 Strix is reaching 94c and throttling down to 1000MHz almost immediately.


MY MSI 390x gets as hot as MAYBE 70-75 (more commonly 65) with that max overclock i found of +75mv core +50 powerlimit and 1150/1750
Quote:


> Originally Posted by *Carniflex*
> 
> I have similar issue. Planetside 2, running on all low settings at Hossin I get periodic "slowdown" or "jerking" with "smoothing" on (that their version of v-sync for windowed mode). This is different than tearing and is pretty noticeable. If I turn smoothing/v-sync off it gets worse and frame rate drops to around 40 from around 60 for some reason. In my case it might be card throwwtling as I have set it to -10% power because of the heat issues, at stock settings it hits 90+ C and howls so bad I can hear it through closed earphones a meter away from closed PC case over game sounds and reasonable level music. It also causes wife to come up from first floor from watching TV and giving me a remark that she can hear it from downstairs. So settings it to +50% power is not really an option for me.
> 
> I'll add a remark that I have never seen my card to run at full stock speed in GPU-z - it seems to max out at around 800 .. 900 MHz and just jerk around there with it's clocks even when setting power limit to +10%.
> I have similar issue and unfortunately power limit is not an option. Actually in my case turning v-sync off makes this "jerking" significantly worse. In case of Planetside 2 I would go as far as to say that this Gigabyte 390X G1 performs for me worse than 7950 @ 1 GHz did at 4k resolution. It's a lot louder and hotter and does not seem to be able to offer me greater performance.
> That card will hit the thermal limit in like 0.5 sec if you just push the dials all the way to the right. I would probably need to water cool it to be able to do that without hitting the 95 C when just watching youtube or something.
> 
> Anyway - if its and power throttle issue I would expect AMD to adress it as a top priority with their drivers. It's _very_ annoying when it happens and considering the thermals of 390 / 390X I do expect it to either power throttle or thermal throttle for majority of the users at least occasionally.


only thing i'm asking is to move the power limit, the card won't do anything insane with that. that isn't what the power limit is for. voltages on the other hand...


----------



## Carniflex

Quote:


> Originally Posted by *navjack27*
> 
> MY MSI 390x gets as hot as MAYBE 70-75 (more commonly 65) with that max overclock i found of +75mv core +50 powerlimit and 1150/1750
> only thing i'm asking is to move the power limit, the card won't do anything insane with that. that isn't what the power limit is for. voltages on the other hand...


Is that under water cooling?

And I'll add a note that in my case increasing power limit makes the card hit 90+C and howl like it's Armageddon if I put it at +10% power limit. I do have ok airflow, one 140 mm pushing into card from front panel and 140mm on the side panel above card sucking hot air out.


----------



## navjack27

its on this



no case fans that do anything. the side always open. custom fan curve


----------



## Haenir

Quote:


> Originally Posted by *navjack27*
> 
> it really sounds like power throttle. disable powerplay with AB and set power limit to max to let the gpu use all available power it needs at load. no vsync and set driver texture quality to performance and surface optimizations.


Okay, I'll try that, how do i change driver texture quality to performance and surface optimizations? Will they impact negatively the quality? I mean shouldn't my gpu be able to run these games smoothly?


----------



## pillowsack

To you guys posting right now: these cards are spicy. They are pretty much the 290 and 290X. The card manufacturers didn't put much love into VRM cooling, let alone heatsinks.

The latest beta AMD driver (crimson) supposedly set fans to 20% as well.

If you're getting bad temps right now I'd suggest putting different thermal paste on (if it doesn't void your warranty, or if it does and you don't care). Otherwise look into Gelid VRM coolers and an AIO water cooling setup. It would run you about $100~ but would be a good investment if you want to add some more beef to your card.







I cut the stock mounting tabs off the intel fan.

What did I gain out of all this hard work?



1200/1700 24/7 stable. It was bios flashed to 1100/1600 cause that was stable on air.

This is with 50mv+ and 50PL.


----------



## kizwan

@pillowsack, nice mod there.


----------



## navjack27

Quote:


> Originally Posted by *Haenir*
> 
> Okay, I'll try that, how do i change driver texture quality to performance and surface optimizations? Will they impact negatively the quality? I mean shouldn't my gpu be able to run these games smoothly?


if you don't know, then it's okay. i'd rather you be running stock for those kinds of things. it would affect quality KINDA. hey, how hot are your houses / the room with the computer? we keep the house under 21c/70f year round. i wonder if that's what is up with your temps, cuz i'm just lost on why i'm getting great fps, great temps, great everything. maybe it's that, maybe it's other tweaks i've done over the lifetime of my pc and habits i have when i play a game or run a benchmark.


----------



## angelgrin

Hi guys,

is anyone using these cards for Autodesk Inventor/Autocad/REVIT?

i just want to know if it is fast and stable with minimum driver issues.

because i am planning on building a work/gaming pc with this card as opposed to the GTX 970.

TIA


----------



## Carniflex

Quote:


> Originally Posted by *navjack27*
> 
> if you don't know, then its okay. i'd rather you be running stock for those kinda things. it would effect quality KINDA. hey how hot are your guys houses / room with computer? we keep the house under 21c 70f year round. i wonder if thats what is up with ur guyses temps. cuz i'm just lost on why i'm getting great fps, great temps, great everything and maybe its that maybe its other tweaks i've done over the lifetime of my pc and habits i have when i play a game or run a benchmark.


Around 20 C. It might be any number of factors, from a bad draw in the silicon lottery all the way to something the card manufacturer has done "wrong" (bad thermal paste, too much or too little of it, too thick or too poor thermal tape on the VRMs) or something "wrong" with the system, like bad airflow, dust, etc.

I'm personally convinced that in my case the issue is something Gigabyte has done with their cooler. It's more or less the same size as the MSI one, also with dual fans, but it seems to hit 90+ C consistently for many people, so it doesn't seem to be only an issue for me. I have seen a few guys mention that replacing the thermal paste with better quality stuff can bring the temps down by 10 C or so, which I have not yet done.


----------



## jdorje

Don't use the power limit to keep your card cool. It will do that, but it's not the right way; an overclocker can do much better by tweaking the overclock to a viable level of power use.

Reduce your voltage, then adjust the clock until it's stable. These cards don't scale that well with voltage anyway. My card may be above average, but at -100 mV it still runs above stock clock (1035) and stays incredibly cool. It draws about 80 W less (at the wall) compared to stock.
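The intuition behind that undervolt can be sketched with the usual rule of thumb that dynamic power scales roughly with frequency times voltage squared. This is only an illustration: the 1.25 V stock figure below is an invented placeholder, not a measured value for any particular 390 (and it ignores static leakage, which is why the at-the-wall savings can be bigger than the formula suggests).

```python
# Rule of thumb: dynamic GPU power ~ frequency * voltage^2.
# All numbers here are illustrative placeholders, not measurements.

def dynamic_power_scale(f_new, v_new, f_ref, v_ref):
    """Return new dynamic power as a fraction of the reference power."""
    return (f_new / f_ref) * (v_new / v_ref) ** 2

# Same clock (1035 MHz), assumed ~1.25 V stock vs. the same card at -100 mV:
scale = dynamic_power_scale(1035, 1.15, 1035, 1.25)
print(f"~{(1 - scale) * 100:.0f}% less dynamic power")  # roughly 15%
```

So even before leakage effects, -100 mV at the same clock is a double-digit-percent power cut, which is why the card runs so much cooler.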


----------



## Darkeylel

Quote:


> Originally Posted by *Carniflex*
> 
> Around 20 C. It might be a number of factors from crappy draw in silicon lottery all the way to something a card manufacturer has done "wrong" like bad thermal paste, too much or too little of it, too thick or too crappy thermal tape on VRM's or something "wrong" with the system like crappy airflow, dust.etc...
> 
> I'm personally convinced in my case the issue is something crappy Gigabyte has done with their cooler. It's more or less same size as the MSI one also with dual fans but seems to hit consistently 90+ C for many people so it seems to not be only an issue for me. I have seen few guys mention that replacing the thermal paste with some better quality stuff can bring down the temps by 10 C or so. Which I have not yet done.


I have a Gigabyte 390X and have yet to hit above 80 C, and that's with the default fan curve as well.


----------



## tangelo

Quote:


> Originally Posted by *Charcharo*
> 
> I guess. With 6 Gigs it would solve any VRAM issues at 1080 and 1440 (and most 4K ones too).
> 
> Then again there is no kill like overkill


With VR, 8 GB is just fine, not overkill.


----------



## XxxxVulcanxxxX

It really seems hit and miss with almost every manufacturer of the 390 series as far as cooling goes... so many different reports on temps from people with the same make/speeds... we can't all have poor airflow...









Does ASIC quality come into play with the temps due to the differing voltage requirements? Or am I reading too much into it? I just can't see why some people report large overclocks on air while others with the same manufacturer struggle to keep it at stock without overheating...

My ASIC quality is 81.6% and I struggled overclocking on air with the MSI Twin Frozr cooler...


----------



## seanpatrick

Anyone know how to get rid of an overclock? I got stuck at 1140 in Afterburner. I tried resetting clocks, but it now shows my overclock as my base clock. I tried uninstalling Afterburner and cleaning the registry, then reinstalling: still stuck. I tried TriXX to reset my clocks: no luck!


----------



## XxxxVulcanxxxX

Quote:


> Originally Posted by *seanpatrick*
> 
> Any know how to get rid of an overcloc? I got stuck at 1140 in afterburner. I try resetting clocks but it now shows my overclock as my base clock. I tried uninstalling afterburner and cleaning the registry, then reinstalling, still stuck. I tried trixx to reset my clocks, no luck!


Have you disabled the PowerPlay setting in Afterburner? I had this same issue when I did that. I had to close all overclocking utilities (I only use Afterburner) and reset the display driver to defaults by going into the preferences at the bottom of the Crimson control panel and resetting factory defaults, then reboot... this fixed it for me.


----------



## seanpatrick

Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> Have you disabled the powerplay setting in afterburner? I had this same issue when i did that. I had to close all overclocking utilities (i only use afterburner) and reset the dislpay driver to defaults by going into the preferences on the bottom of the crimson control panel and reset factory defaults, then reboot... this fixed it for me.


Unfortunately that didn't work, but re-installing the drivers brought it back down. I was messing with the 2D and 3D profiles in Afterburner, trying to get it to actually recognize my settings (in an attempt to combat the issue where a horizontal line flickers across my screen periodically), but so far it doesn't seem to care what I put into the profile settings. Any ideas on that front?


----------



## XxxxVulcanxxxX

Unfortunately I have had no luck in setting the 2D/3D profiles in afterburner myself, some on here have mentioned it, maybe someone else can help us both out on that front...


----------



## Jaffi

Hi guys,

So I am about to upgrade my card to a Sapphire 390(X) Nitro. The X costs only 60 € more, so it's a hard decision. I wouldn't have paid the usual +100 € they charge, but this deal is kinda tempting.

Sooo, what would you recommend? I am currently playing in 1080p but I will most likely upgrade to a higher resolution and freesync next year.


----------



## jdorje

Quote:


> Originally Posted by *seanpatrick*
> 
> Any know how to get rid of an overcloc? I got stuck at 1140 in afterburner. I try resetting clocks but it now shows my overclock as my base clock. I tried uninstalling afterburner and cleaning the registry, then reinstalling, still stuck. I tried trixx to reset my clocks, no luck!


Hwinfo shows you still at the higher clock (and presumably voltage)?


----------



## jdorje

The required voltage does seem to rise quadratically, but along a really gentle curve.

I noticed the possible existence of golden voltages, or maybe golden clocks, where a certain point sits way above the curve compared to adjacent points. However, it would take a lot more testing to see whether such golden points actually exist or are just the result of the loose definition of stability used here.

Another, more significant thing I noticed is that the voltage doesn't correspond linearly to what's set in Afterburner. Logically, the offset voltage system works by having a base voltage, with the current voltage equal to base plus offset. But the core voltage recorded in HWiNFO, though somewhat close to this, is not a linear correlation at all. One explanation is that the base voltage might already be adaptive based on clock, with an additional offset applied on top of that. I didn't explore this much, though, so I don't know.
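The "golden point" check above can be sketched quickly: fit a quadratic voltage-vs-clock curve through a few measured points and see how far other measurements sit from it. All the numbers below are made-up placeholders, not real 390 measurements.

```python
# Sketch: fit a quadratic through three (clock, min-stable-offset) points,
# then compare a fourth measurement against the curve. Placeholder data only.

def quad_through(p1, p2, p3):
    """Return the quadratic f(x) passing exactly through three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    def f(x):  # Lagrange interpolation
        return (y1 * (x - x2) * (x - x3) / ((x1 - x2) * (x1 - x3))
              + y2 * (x - x1) * (x - x3) / ((x2 - x1) * (x2 - x3))
              + y3 * (x - x1) * (x - x2) / ((x3 - x1) * (x3 - x2)))
    return f

# (core MHz, minimum stable mV offset) -- hypothetical example values
curve = quad_through((1050, 0), (1100, 25), (1150, 75))
measured_1125 = 30  # hypothetical point that needed much less than expected
expected = curve(1125)
print(f"expected ~{expected:.0f} mV, measured {measured_1125} mV -> "
      f"{'golden point?' if measured_1125 < expected - 10 else 'on curve'}")
```

If a measured point sits well below the interpolated curve (as in the invented example), that's a candidate "golden clock"; repeated stability runs would be needed to rule out the loose stability criterion.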


----------



## diggiddi

Quote:


> Originally Posted by *Jaffi*
> 
> Hi guys,
> 
> So I am about to upgrade my card to a Sapphire 390(X) Nitro. The X costs only 60 € more and so it's a hard decision. I wouldn't have payed the usual +100 € they charge, but this deal is kinda tempting.
> 
> Sooo, what would you recommend? I am currently playing in 1080p but I will most likely upgrade to a higher resolution and freesync next year.


Get the best card you can afford and don't look back


----------



## Artraxus

New to the forums...
Here is a pic I took with my phone of OHM, GPU-Z, and Corsair Link all showing R9-390



Here is a pic for my case with the side off and the Sapphire Nitro R9-390 installed... (it's also my avatar pic)



Air cooled with the stock Nitro tri-x cooler and stock speed until I am done with my research.


----------



## Darkeylel

Quote:


> Originally Posted by *Artraxus*
> 
> New to the forums...
> Here is a pic I took with my phone of OHM, GPU-Z, and Corsair Link all showing R9-390
> 
> 
> 
> Here is a pic for my case with the side off and the Sapphire Nitro R9-390 installed... (it's also my avatar pic)
> 
> 
> 
> Air cooled with the stock Nitro tri-x cooler and stock speed until I am done with my research.


And here I thought my cable management skills were bad.

Tidy up those SATA cables please haha


----------



## Artraxus

There are three devices on SATA there: optical drive, SSD, and HDD. All cables head straight out the back through the cable management holes in the case. This is a bad angle; it looks like one cable that is too long and poorly managed. That is not the case, my trolly internet friend. There are 3 cables.


----------



## Luckael

any user here of Asus R9 390 Strix? how's the temp while playing? planning to get an asus strix or Sapphire r9 390.


----------



## seanpatrick

Quote:


> Originally Posted by *Luckael*
> 
> any user here of Asus R9 390 Strix? how's the temp while playing? planning to get an asus strix or Sapphire r9 390.


Most of what I've read has the Strix with terrible fans, whereas the Sapphire is excellent.


----------



## Carniflex

Quote:


> Originally Posted by *Carniflex*
> 
> I have similar issue. Planetside 2, running on all low settings at Hossin I get periodic "slowdown" or "jerking" with "smoothing" on (that their version of v-sync for windowed mode). This is different than tearing and is pretty noticeable. If I turn smoothing/v-sync off it gets worse and frame rate drops to around 40 from around 60 for some reason. In my case it might be card throwwtling as I have set it to -10% power because of the heat issues, at stock settings it hits 90+ C and howls so bad I can hear it through closed earphones a meter away from closed PC case over game sounds and reasonable level music. It also causes wife to come up from first floor from watching TV and giving me a remark that she can hear it from downstairs. So settings it to +50% power is not really an option for me.
> 
> I'll add a remark that I have never seen my card to run at full stock speed in GPU-z - it seems to max out at around 800 .. 900 MHz and just jerk around there with it's clocks even when setting power limit to +10%.
> I have similar issue and unfortunately power limit is not an option. Actually in my case turning v-sync off makes this "jerking" significantly worse. In case of Planetside 2 I would go as far as to say that this Gigabyte 390X G1 performs for me worse than 7950 @ 1 GHz did at 4k resolution. It's a lot louder and hotter and does not seem to be able to offer me greater performance.
> That card will hit the thermal limit in like 0.5 sec if you just push the dials all the way to the right. I would probably need to water cool it to be able to do that without hitting the 95 C when just watching youtube or something.
> 
> Anyway - if its and power throttle issue I would expect AMD to adress it as a top priority with their drivers. It's _very_ annoying when it happens and considering the thermals of 390 / 390X I do expect it to either power throttle or thermal throttle for majority of the users at least occasionally.


Seems like some of the issues I'm experiencing might be caused by the "Crimson" drivers.

http://www.overclock.net/t/1584336/tt-amd-looking-into-underclocking-issue-caused-by-crimson-edition-driver/0_50#post_24708218


----------



## navjack27

Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> It really seems hit and miss with almost any manufacturer of the 390 series as far as cooling goes... so many different reports on the temps from people with the same make/speeds... we cant all have poor airflow...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does ASIC quality come into play with the temps due to the differing voltage requirements? or am i thinking too much into it? I just cant see why some are reporting large overclocks on air and others are struggling to keep it at stock without overheating from the same manufacturer...?
> 
> My ASIC quality is 81.6% and i struggled overclocking on air with the msi twin frozr cooler...


see, you might have a point, my ASIC quality is only 76.3%... well, maybe not, i dunno. it's something like lower quality leaks more power into heat while higher quality gets by on less voltage... or something


----------



## Dundundata

My ASIC is 67.5% on an MSI 390 and it runs cool and overclocks well. What is your voltage? At +50 mV I get 1.32 V at 70 C.


----------



## Luckael

Quote:


> Originally Posted by *seanpatrick*
> 
> Most of what I've read has the Strix with terrible fans, whereas the Sapphire is excellent.


thanks for your response


----------



## yuannan

So I've messaged Sapphire support about my r9 390x running hot at idle and they are slow to reply.

So I thought why not try out OC and heaven and see my scores.

Very mild OC at 0 core, 0 power limit, 100% fan speed, 1150/1500

no AA


8xAA


How are my scores?
If they're bad I might return it and buy another card, probably from another brand, as Sapphire is quite loud IMO.


----------



## The Stilt

Quote:


> Originally Posted by *jdorje*
> 
> I noticed the possible existence of golden voltages, or maybe golden clocks, where a certain point would be way above the curve compared to adjacent points. However it'd take a lot more testing to see if such golden points actually exist or are the result of the loose definition of stability used here.
> 
> Another more significant thing I noticed was that voltage doesn't correspond linearly to what's set in afterburner. Logically the offset voltage system works by having a base voltage and having the current voltage equal base plus offset. But the core voltage recorded in hwinfo, though somewhat close to this, is not a linear correlation at all. One explanation is that base voltage might be adaptive based on clock already, and then an additional offset on top of that. I didn't explore this much at all though so I don't know.


All 290 / 390 cards have eight different DPMs, and each DPM has its own voltage (and clock). The offset voltage is added on top of the driven VID command (defined by the DPM). So the offset remains the same, while the base voltage does not (unless the DPM stays static).

The GPU voltage readout from software isn't reliable regardless of what software you are using. The actual voltage is usually at least 50 mV higher. Also, the amount of voltage droop directly correlates with the current draw (i.e. the stress level).
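The DPM/offset interaction can be sketched like this: each DPM state carries its own base VID, and the user offset shifts whichever state is currently driven. The VID table below is invented for illustration, not dumped from a real 290/390 BIOS.

```python
# Sketch of offset-on-top-of-DPM voltage behaviour. The per-state base VIDs
# are hypothetical placeholders, not values from an actual card.

DPM_BASE_MV = [968, 1000, 1050, 1100, 1150, 1200, 1250, 1300]  # DPM0..DPM7

def driven_voltage_mv(dpm_state, offset_mv):
    """Commanded voltage for a DPM state with a user offset applied."""
    return DPM_BASE_MV[dpm_state] + offset_mv

# A +50 mV offset shifts every state, so the "base" you infer from a software
# readout depends on which DPM the card happened to be in when you sampled it:
for state in (1, 5, 7):
    print(f"DPM{state}: {driven_voltage_mv(state, 50)} mV commanded")
```

This is why an offset-vs-readout plot looks non-linear: the card hops between DPM states under load, so the base underneath the constant offset keeps moving.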


----------



## afyeung

Running 1150/1700 with the MSI Twin Frozr at +38mv core and +25mv aux and temps around 73c while gaming. Anything else I should know about oc'ing this card?


----------



## afyeung

@navjack27
How is your graphics score so high???? Teach me please O_O I have the 390x at 1150/1700. Hopefully I can get somewhere around 15000. Which drivers?


----------



## navjack27

LOL. windows 7. crimson beta.


and i use process lasso to set some stuff for 3dmark to make sure its using the cpu correctly.

uhhh besides that... nothing really special.


----------



## afyeung

Im using a 5820k. How do I configure it so it improves my graphics score? That's the highest 390x graphics score I've ever seen.


----------



## navjack27

yeah i know i'm surprised also. as you know i have a 4790s, not even a K... does everyone with a K have a 970 instead or something?

well in my bios i have all power saving features disabled. hyperthreading enabled just for when i run benchmarks, otherwise its off. and all core ratios are at 40. my memory is at 2400... uhhh... bclk is 100:100. so basically my cpu is ALWAYS at 4ghz... my cache is at 40 also...


----------



## Dundundata

I have a 4790K but no overclock, and my RAM is 1600; not sure if I can OC it... will it make a difference in gaming?


----------



## Charcharo

Hello guys!

I seem to be having strange issues now with my AMD R9 390 from PowerColor.

An hour or two ago as I was browsing on how to easily get a few achievements for Doom 3 BFG Edition (yes I know...) , my screen started going black in Windows.

Basically as I was browsing and wanted to enter task manager to see something, it started having problems.

Now here is what happens:
On a normal boot up after the Windows 7 loading screen it goes black. That is all







Sound still plays.

Here is what I tried:
1. It is not the cables, that is for sure.
2. The card is firmly in place in its slot.
3. I did use Display Driver Uninstaller AND then downloaded AMD's latest driver from the page.
A) Once I used the default ones on front page like a dweeb.
B) Will try ones from manual selection (Desktop - R9 - R9 3XX - Win 7 64 bit) and the minimal set up one.

4. I did use System Restore just in case (once helped me with an even worse problem on old PC, so I always use it if problems persist).

Here is what is probably not a problem. If something here is stupid:
1. The card seems to be alive as the display works in Safe Mode (I have not changed cables at all, nor have I used the integrated one).
A) Also works just fine AFTER DDU. In fact I am typing from the PC right now as I will attempt another AMD driver install as soon as I finish this post.
B) It has never been OCed more than its factory overclock of +10 to the core.
C) It has never been REALLY hammered down harshly.
D) GPU-Z displays all info just fine... except it doesn't see it as an R9 390... but all other info is there... if that even means anything.

If someone can help it would be appreciated. Also lol ... this is from latest attempt before I restart

Installingdrivermanually.JPG 46k .JPG file


----------



## Dundundata

You could try an older driver, the ones before Crimson...or give windows 10 a try.


----------



## Charcharo

Quote:


> Originally Posted by *Dundundata*
> 
> You could try an older driver, the ones before Crimson...or give windows 10 a try.


I might try the first. As for Win 10... was hoping to first... get an SSD and then try Win 10...


----------



## Dundundata

Quote:


> Originally Posted by *Charcharo*
> 
> I might try the first. As for Win 10... was hoping to first... get an SSD and then try Win 10...


I have a small 120GB SSD for Windows. Could never get Catalyst drivers to install on Windows 7 for some reason, so gave 10 a try. I rather like it overall.


----------



## Charcharo

Using good old 15.7.1 does not seem to help









As for Windows 10... I still have no SSD. Also am worried about its backwards compatibility


----------



## flopper

Quote:


> Originally Posted by *Charcharo*
> 
> Hello guys!
> 
> I seem to be having strange issues now with my AMD R9 390 from PowerColor.


Seems like the black screen kicks in when the driver activates 3D clocks.
So: check if you can flip the BIOS switch, if the card has one.
Raise the voltage slightly in 3D mode using MSI Afterburner, if possible.
Change the cable and the port connecting your screen.

Basically it might be an RMA case, but troubleshooting is needed first.


----------



## afyeung

Hey guys. I recently benched with Firestrike and I got this http://www.3dmark.com/3dm/9776699?
I thought my graphics score would be at least 14000? Idk.


----------



## kizwan

Quote:


> Originally Posted by *navjack27*
> 
> yeah i know i'm surprised also. as you know i have a 4790s, not even a K... does everyone with a K have a 970 instead or something?
> 
> well in my bios i have all power saving features disabled. hyperthreading enabled just for when i run benchmarks, otherwise its off. and all core ratios are at 40. my memory is at 2400... uhhh... bclk is 100:100. so basically my cpu is ALWAYS at 4ghz... my cache is at 40 also...


No surprise there. You're benching with Tessellation modified which will allow you to score higher than others. Try run again with settings in Crimson & RadeonPro to default.


----------



## LeSwede

Hello,

Can anyone confirm if any blocks fit on the MSI R9 390 8GB Gaming?
Like this for example http://www.alphacool.com/product_info.php/info/p1663_Alphacool-NexXxoS-GPX---ATI-R9-390-M02---mit-Backplate---Schwarz.html

Really don't want to buy anything until I'm 100% sure it fits; show pics if possible, please.







Also do they cool the VRM/VRAM?


----------



## BlackFox1337

Quote:


> Originally Posted by *LeSwede*
> 
> Hello,
> 
> Can anyone confirm if any blocks fit on the MSI R9 390 8GB Gaming?
> Like this for example http://www.alphacool.com/product_info.php/info/p1663_Alphacool-NexXxoS-GPX---ATI-R9-390-M02---mit-Backplate---Schwarz.html
> 
> Really dont want to buy anything until Im 100% sure it fits, show pics if possible please
> 
> 
> 
> 
> 
> 
> 
> Also do they cool the VRM/VRAM?


Not sure if this will help, but this is the M02 block and the MSI 390X which should be the same Reference GPU. Also a couple pics of the finished product once installed in a Corsair Air 240


----------



## LeSwede

Quote:


> Originally Posted by *BlackFox1337*
> 
> Not sure if this will help, but this is the M02 block and the MSI 390X which should be the same Reference GPU. Also a couple pics of the finished product once installed in a Corsair Air 240


Thank you so much







Now I can safely buy 2 blocks for my GPUs







+1 Sir!


----------



## jdorje

Get the November 30 (or is it 29?) Beta hot fix of the Crimson drivers. Use ddu to clean install them.


----------



## navjack27

Quote:


> Originally Posted by *kizwan*
> 
> No surprise there. You're benching with Tessellation modified which will allow you to score higher than others. Try run again with settings in Crimson & RadeonPro to default.


no i don't, i leave it on "let application decide" for tessellation. i call out other people on that. also, 3dmark flags it on the results page if you modify that.

Quote:


> Originally Posted by *afyeung*
> 
> Hey guys. I recently benched with Firestrike and I got this http://www.3dmark.com/3dm/9776699?
> I thought my graphics score would be at least 14000? Idk.


only place i'm better in the score is the graphics


----------



## Charcharo

Quote:


> Originally Posted by *flopper*
> 
> seems the 3D activation with driver kicks in with black screen.
> so, check if you can shift the bios switch if the card has one that is.
> raise voltage slightly in 3D mode using msi afterburner if possible.
> change the cable and port connecting your screen.
> 
> basically it might be a rma issue but trouble shooting is needed first


Egh... how do I do that on a PowerColor card? Does it even have one







?

Also, how do I raise the voltage slightly in 3D mode? It cannot get into 3D mode, I guess (it works only in safe mode or right after reinstalling drivers).

The last one I will try, though... probably I can not (old screen, few port options).

So RMA will mean I can return it?


----------



## jdorje

Firestrike scores are way too cpu-dependent. Go with a pure GPU benchmark like the unigine ones.

Also: what does the aux voltage do? How can I change the memory voltage if it's locked in afterburner?


----------



## Carniflex

I have a note about the Crimson drivers.

*DO NOT USE the initial release version!!! If you still want to use them, use the latest beta with the hotfixes.*

I used the initial release version and it was fine, without the issues reported by others. However, it can apparently develop the reported fan speed issues out of the blue on its own, even when initially fine. Last evening I was running GPU-Z on one of my other screens and, while playing, glanced at the temperatures out of the corner of my eye. They were sitting flat at 96 C; I took a second glance to confirm that what I thought I saw was really there. The fan speed was locked on "manual" at 50% without me setting it there. I guess I was lucky it had defaulted to 50% and not 0%; in the latter case I'm pretty confident it would have fried the card before I really noticed. OK, so I set it to "auto" to make sure it works properly... all is fine, right... I quit the game to verify the fan speed behaves, and bam! This time it was locked on "manual" at 0%. I noticed a few minutes later, when the card was (fortunately) basically idle (just a gazillion web tabs open), that it was again sitting flat at 96 C. It spun the fan up to 100% for a second, making a loud noise, and then shut the fan down again. Good thing it did, because having just "sorted" the issue I probably wouldn't have noticed for a while.

I have now installed the beta Crimson driver and so far it seems to behave, at least as far as fan speed is concerned.
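The failure mode above (hot core, fan stuck low) is easy to sanity-check in a logging script. This is only a sketch of the detection logic; in practice the temperature and fan-duty values would come from whatever GPU-Z / HWiNFO exposes, which is not shown here.

```python
# Tiny watchdog sketch for the stuck-fan failure described above: if the core
# is hot but the reported fan duty is suspiciously low, flag it. Thresholds
# are illustrative assumptions, not values from any driver.

def fan_looks_stuck(temp_c, fan_pct, temp_limit=90, min_fan_at_limit=60):
    """True when the card is hot but the fan is clearly not responding."""
    return temp_c >= temp_limit and fan_pct < min_fan_at_limit

# The reported failure: 96 C with the fan locked at 50% (or 0%).
print(fan_looks_stuck(96, 50))   # True  -> something is wrong
print(fan_looks_stuck(70, 30))   # False -> normal behaviour
```

Polling something like this once a second and beeping on `True` would have caught both of the lock-ups described, including the 0% case.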


----------



## flopper

Quote:


> Originally Posted by *Carniflex*
> 
> I have a note about the Crimson drivers.
> 
> *DO NOT USE the initial release version!!! If you still want to use them use the latest beta with hotfixes.*
> 
> .


The new Crimson has a WHQL driver as of yesterday.


----------



## mandrix

Quote:


> Originally Posted by *Charcharo*
> 
> Egh... how do I do that on a PowerColor card? Does it even have one
> 
> 
> 
> 
> 
> 
> 
> ?
> 
> Also how to raise voltage slightly in 3D mode? It can not get to 3D mode I guess (works only in safe mode or after reinstalled drivers).
> 
> The last one I will try, though... probably I can not (old screen, few port options).
> 
> So RMA will mean I can return it?


The BIOS switch would be about 1 1/2" from the rear of the card. It can select either of the BIOS versions, but PowerColor cards normally have identical BIOS versions for each switch position. At least the 6 PowerColor cards I've owned are like that. Just shut down, flip the switch, and restart the pc.
If you need to RMA then just go to the PowerColor web site and initiate the RMA.
GL.


----------



## Irked

Stock cooling


----------



## Artraxus

Well after some moderate testing, intense testing yet to come, here are my results...

Sapphire Nitro R9 390 overclocked to 1130 MHz on the core and 1660 MHz on the memory.

Furmark (the 1080p benchmark button): score = 5305, peak temp = 72 C.
Valley benchmark, 1080p: score = 3569, peak temp = 65 C.
Heaven, 1080p: score = 2007, peak temp = 65 C.

I am still trying to tweak things, but I start to get some artifacts if I push it further. I think my temps are good, though; I am new at this, so validation would help.


----------



## afyeung

Quote:


> Originally Posted by *Artraxus*
> 
> Well after some moderate testing, intense testing yet to come, here are my results...
> 
> Sapphire Nitro R9 -390 overclocked to 1130MHz on the core and 1660 on the memory.
> 
> Furmark (the 1080p benchmark button) score = 5305 peak temp = 72C.
> Valley Benchmark 1080p score = 3569 peak temp = 65C
> Heaven 1080p score = 2007 peak temp = 65C
> 
> I am still trying to tweak things but I start to get some artifacts if I try to push it further. I think my temps are good though but I am new at this so validation would help.


A little bit of voltage would help. My 390x needed +13mv to get to 1140. It appears that these cards are still crap core clockers overall in terms of stability.


----------



## pillowsack

My 390 hit 1200 stable.


----------



## Pwned24

The most stable overclock that I can use even for long gaming sessions on my Sapphire Nitro R9 390 (it had a backplate, but I removed it because the RAM clips were in the way) is:
Core: 1100
Memory: 1600

Sadly, anything higher causes artifacts after a while







Anyone know why I can't reach clocks as high as you guys, and if possible, how I can improve?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Pwned24*
> 
> My most stable overclock that I can use even for long game sessions on my R9 390 Sapphire Nitro (had a backplate but I removed it because RAM Clips were in the way) were:
> Core: 1100
> Memory: 1600
> 
> Sadly any higher would cause artifacts after a while
> 
> 
> 
> 
> 
> 
> 
> Anyone know why I cant reach as high as you guys and if possible how I can improve it?


Probably has to do with temperature.


----------



## Artraxus

I would say from my limited knowledge that you are not getting enough voltage to your GPU. The sapphire should be able to handle more than that.

What are your Idle and load temps?

What are you using to overclock the card?


----------



## Levys

Quote:


> Originally Posted by *pillowsack*
> 
> Well I got my XFX DD under a universal GPU water block tonight. Short furmark runs without the VRM's spiking and I was able to accomplish 1225/1750 with 90MV+ and 50%PL
> 
> I copied someone elses overclock off the spread sheet, worked well. Core never went over 45C with 2 hours of GTA 5 online.


well you little devil... taking other people's clocks (my clocks)









Glad someone else clocked as high as mine...for now

what kind of 3Dmark score do you get at those clocks?


----------



## Jaffi

I needed to locate a noise, so I stopped the fans of my 390 Nitro one by one with my finger. The fans stop completely, then start to spin again. Could I have damaged something by doing this? So far everything is running fine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jaffi*
> 
> I needed to allocate a noise and stopped the fans of my 390 nitro one by one with my finger. Those fans stop completely, then they start to spin again. Could I have damaged something by doing this? So far everything is running fine again.


Should not be a problem. I have stopped fans with my finger many times.


----------



## Jaffi

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Should not be a problem. I have stopped fans with my finger many times.


Thank you! By the way, I read this review and it says:
Quote:


> By default, the card comes with its legacy BIOS enabled, but pressing the button switches it to the digitally-signed UEFI. In that mode, boot and resume times should be quicker


So do I really need to press that button to make it work with UEFI? I just switched it with my old 280 and UEFI says fastboot and secure boot would still be enabled.


----------



## Levys

Quote:


> Originally Posted by *seanpatrick*
> 
> Any know how to get rid of an overcloc? I got stuck at 1140 in afterburner. I try resetting clocks but it now shows my overclock as my base clock. I tried uninstalling afterburner and cleaning the registry, then reinstalling, still stuck. I tried trixx to reset my clocks, no luck!


The latest Crimson drivers should fix this problem. (I read this on an AMD page somewhere.)


----------



## tangelo

Quote:


> Originally Posted by *Jaffi*
> 
> I needed to locate a noise, so I stopped the fans of my 390 Nitro one by one with my finger. The fans stop completely, then start spinning again. Could I have damaged something by doing this? So far everything is running fine.


I doubt it. I've done the same multiple times in the past and all the fans worked fine after it.


----------



## KNG HOLDY

Which software do you guys use to test your overclocks?

I used the Valley benchmark and got the exact same score at 1200/1600 and 1200/1650, and my score was worse at 1200/1700, even though I didn't see any artifacts!?

My screen just turns black if I try to raise my memory clock a bit more, and I start to get artifacts if I try a bit more core clock.
Did I reach my limit, even though I only hit 70°C after 30 min of Valley?

screen:
http://www2.pic-upload.de/img/29178786/Untitled.png

Edit: I tested with +100 core voltage / +50 power limit / +100 aux voltage.


----------



## tangelo

Quote:


> Originally Posted by *KNG HOLDY*
> 
> Which software do you guys use to test your overclocks?
> 
> I used the Valley benchmark and got the exact same score at 1200/1600 and 1200/1650, and my score was worse at 1200/1700, even though I didn't see any artifacts!?
> 
> My screen just turns black if I try to raise my memory clock a bit more, and I start to get artifacts if I try a bit more core clock.
> Did I reach my limit, even though I only hit 70°C after 30 min of Valley?
> 
> screen:
> http://www2.pic-upload.de/img/29178786/Untitled.png
> 
> Edit: I tested with +100 core voltage / +50 power limit / +100 aux voltage.


Usually Heaven, Valley, Firestrike and normal gaming


----------



## KNG HOLDY

Quote:


> Originally Posted by *tangelo*
> 
> Usually Heaven, Valley, Firestrike and normal gaming


If I can't raise memory and/or core clock without getting issues, but I'm only at 70°C, have I reached my max?

Couldn't I push more voltage into it with other software and get a better result? Or a BIOS hack? Or is that just stuff for experts?


----------



## tangelo

Quote:


> Originally Posted by *KNG HOLDY*
> 
> If I can't raise memory and/or core clock without getting issues, but I'm only at 70°C, have I reached my max?
> 
> Couldn't I push more voltage into it with other software and get a better result? Or a BIOS hack? Or is that just stuff for experts?


I'm no expert, but I would guess you are at your card's limits already. 1200/1700 is very good in my opinion. I had the same problem with my 7950: it ran cool, but when I tried to raise the GPU or memory above a certain point I just got black screens or hardlocks. Nothing I could do about it.

You are already pushing quite a lot of juice into the card; I don't know if adding any more would be beneficial. I would be happy with that OC


----------



## kizwan

Quote:


> Originally Posted by *KNG HOLDY*
> 
> Which software do you guys use to test your overclocks?
> 
> I used the Valley benchmark and got the exact same score at 1200/1600 and 1200/1650, and my score was worse at 1200/1700, even though I didn't see any artifacts!?
> 
> My screen just turns black if I try to raise my memory clock a bit more, and I start to get artifacts if I try a bit more core clock.
> Did I reach my limit, even though I only hit 70°C after 30 min of Valley?
> 
> screen:
> http://www2.pic-upload.de/img/29178786/Untitled.png
> 
> Edit: I tested with +100 core voltage / +50 power limit / +100 aux voltage.


A lower score at a higher memory clock means memory error correction occurred, so your memory overclock is not completely stable.

If you cannot add any more core voltage and aux voltage doesn't help, your card has reached its limit.
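That plateau is easy to spot programmatically. A rough sketch, using made-up benchmark numbers (not from any specific card): keep raising the memory clock until a run no longer beats the previous one by more than run-to-run noise, which suggests error correction is eating the extra bandwidth.

```python
# Sketch: detect where a benchmark score stops scaling with memory clock,
# which suggests EDC error correction is absorbing the extra bandwidth.
# The (clock, score) pairs below are made-up illustration values.

def best_stable_clock(runs, noise=0.01):
    """Return the highest clock whose score still beats the previous
    clock's score by more than `noise` (fractional run-to-run jitter)."""
    runs = sorted(runs)
    best = runs[0][0]
    for (c0, s0), (c1, s1) in zip(runs, runs[1:]):
        if s1 > s0 * (1 + noise):   # real gain, not jitter
            best = c1
        else:
            break                   # scaling stopped: back off to `best`
    return best

runs = [(1500, 61.0), (1600, 64.5), (1650, 66.0), (1700, 66.1)]
print(best_stable_clock(runs))  # 1650: the 1700 run gained nothing
```

In practice you would average several runs per clock before comparing, since single Valley/Heaven runs jitter by a percent or two on their own.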


----------



## fat4l

Quote:


> Originally Posted by *KNG HOLDY*
> 
> Which software do you guys use to test your overclocks?
> 
> I used the Valley benchmark and got the exact same score at 1200/1600 and 1200/1650, and my score was worse at 1200/1700, even though I didn't see any artifacts!?
> 
> My screen just turns black if I try to raise my memory clock a bit more, and I start to get artifacts if I try a bit more core clock.
> Did I reach my limit, even though I only hit 70°C after 30 min of Valley?
> 
> screen:
> http://www2.pic-upload.de/img/29178786/Untitled.png
> 
> Edit: I tested with +100 core voltage / +50 power limit / +100 aux voltage.


This is because of straps and EDC (ECC) errors, which are invisible (no artifacts).
EDC errors are IMC errors, not memory errors.
As for straps, Hawaii/Grenada chips use straps, each with a different set of timings.
To maximize your performance you have to run the memory at the following frequencies, not between them: 1500, 1625, 1750.
The lower the strap, the better the timings.
The question is whether the IMC will handle it. It differs card to card, so you have to try it for yourself. Try one of the frequencies/straps I suggested and see which gives you the max performance.
You can also bump aux voltage to 1050mV, which can help the IMC handle high memory clocks.
Alternatively, you can mod your BIOS to use the timings from a lower strap and boost your performance.
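The strap ranges quoted in this post can be put into a tiny lookup, so it's obvious which timing set a given memory clock actually runs. (The boundaries below are the ones named here; real BIOS strap tables vary per card and vendor.)

```python
# Strap ranges as quoted above: any clock inside a range runs that
# strap's timings, so you want to sit at the TOP of a range, never
# just past it (e.g. 1630 already pays the looser 1750-strap timings).
STRAPS = [(1376, 1500), (1501, 1625), (1626, 1750)]

def strap_for(mem_clock):
    """Return the strap (named after its upper bound) a clock falls in."""
    for lo, hi in STRAPS:
        if lo <= mem_clock <= hi:
            return hi
    raise ValueError("clock outside the quoted strap table")

print(strap_for(1630))  # 1750: barely past the boundary, looser timings
print(strap_for(1625))  # 1625: same timings as 1501MHz, but more MHz
```

This is why 1626MHz can benchmark worse than 1625MHz despite being "faster".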


----------



## Dundundata

That kinda stinks though, because 1750 is pretty high for most cards. But I went from 1650 down to 1625 based on your info.

What about TriXX? You can add more voltage there. How are your VRM temps? What is the actual voltage reading for your card?

Some cards don't like memory OCs as much. I've gotten the black screen as well when I try to go too high, even with good temps like you have.


----------



## pillowsack

So after tinkering with my card some more (jacking up the voltage) I was able to achieve this:



So far so good; I've been running Heaven for the past 3 minutes with no artifacts. FPS in Heaven is probably up a good 10-15 from stock.

Core temp slowly bounces between 47 and 48C, and the VRMs never go above 85C (that's fine, right?)

After reading a bit more here I saw the whole strap mentioning, figured I should aim for 1750 and not 1700. I tested and saw that 1625 and 1700 gave about the same FPS in heaven.

EDIT:

Hrmmmm, my core does not like to stabilize. The vdroop is strong: it drops from 1.328V to about 1.24V under load. On the bright side, my memory is seemingly fine at 1750...
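For reference, the droop in those two readings works out to roughly 6-7%, which is simple arithmetic on the quoted numbers:

```python
# Vdroop from the readings above: 1.328 V set, ~1.24 V under load.
v_set, v_load = 1.328, 1.24
droop_pct = (v_set - v_load) / v_set * 100
print(f"{droop_pct:.1f}% droop")
```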


----------



## jdorje

My understanding of the strap phenomenon is that at each step you are loosening the timings. Just to make up numbers, 1750 might run at CL10 while 1755 runs at CL11. This means 1750 may be better than 1755, but if you can only run 1725 (still CL10) it's not fundamentally worse than 1750.

I get the same performance at 1725 as at 1750. 1775 is much lower performance, and 1800 artifacts. Based on this I think past 1725-ish I start getting some errors. I'll try some aux voltage and push for 1750 though.

Even at 1725, the difference from 1500 is massive. And since the vram overclock doesn't have a high cost in power usage I'd say it's the most important oc.

One other thing I worry about is stutters due to vram errors. Even if overall fps is higher with faster memory, if it causes the 1% to be slower it might not be desirable. I think this could be measured though.


----------



## pillowsack

I get the stutters and artifacts when the core is overclocked alongside my 1750 memory. The core just doesn't like to do 1200-1225, which is annoying. I blame the vdroop.

I have yet to test with TriXX though; I suppose I could push more voltage into it to counter the vdroop


----------



## ZealotKi11er

Quote:


> Originally Posted by *pillowsack*
> 
> I get the stutters and artifacts when the core is overclocked alongside my 1750 memory. The core just doesn't like to do 1200-1225, which is annoying. I blame the vdroop.
> 
> I have yet to test with TriXX though; I suppose I could push more voltage into it to counter the vdroop


The limit of Hawaii is in the 1200MHz range. To get close to 1250MHz stable you will need +200mV and low temps for both core and VRM. I am talking 40C and under for the core and 45C and under for the VRMs. I cannot run 3DMark at 1300MHz on a normal day with the card running ~48C and VRM1 hitting 58C. I need to open a window and let the card drop 10-15C, with 5C ambient temps.


----------



## diggiddi

If you are running memory at 1600MHz, will your bandwidth be 1600 x 4 x 64 / 1000 = 409.6 GB/s??


----------



## Charcharo

Quote:


> Originally Posted by *mandrix*
> 
> The BIOS switch would be about 1 1/2" from the rear of the card. It can select either of the BIOS versions, but PowerColor cards normally have identical BIOS versions for each switch position. At least the 6 PowerColor cards I've owned are like that. Just shut down, flip the switch, and restart the pc.
> If you need to RMA then just go to the PowerColor web site and initiate the RMA.
> GL.


I need to thank you!

I used the Switch, and reinstalled all drivers. And now it works again. Perfectly









I guess something was wrong with the BIOS. No matter, I will use it now and if a problem comes again and I can not fix it, I will RMA/RTV it and get a new one.

I do wonder what went awry last time but it works now. So thank you!


----------



## afyeung

Really done with LTT forums on overclocking, especially since half the people who discuss graphics cards there probably don't even own a decent desktop. I'll be testing the 390X at the different memory straps people have suggested. But if there's no performance difference between 1625 and 1700, might as well run at 1700 if they need the same voltage, right? Again, I'll be testing 1500-1700+. I don't think my 390X can do 1700+ on the memory.


----------



## Carniflex

Quote:


> Originally Posted by *diggiddi*
> 
> If you are running memory at 1600MHz, will your bandwidth be 1600 x 4 x 64 / 1000 = 409.6 GB/s??


Sounds about right
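Spelled out, the constants behind that back-of-envelope formula are the 390's 512-bit bus (64 bytes per transfer) and GDDR5's 4x effective data rate:

```python
# Theoretical GDDR5 bandwidth for a 512-bit (64-byte) memory bus,
# as on the 390/390X. Clock is the base clock tools like Afterburner
# report (e.g. 1500 MHz stock on the 390).
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_bits=512):
    bytes_per_clock = bus_bits / 8       # 512 bits -> 64 bytes
    effective_rate = mem_clock_mhz * 4   # GDDR5 transfers 4x per clock
    return effective_rate * bytes_per_clock / 1000  # MB/s -> GB/s

print(gddr5_bandwidth_gbs(1500))  # 384.0 GB/s (stock 390/390X)
print(gddr5_bandwidth_gbs(1600))  # 409.6 GB/s, matching the post above
```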


----------



## mandrix

Quote:


> Originally Posted by *Charcharo*
> 
> I need to thank you!
> 
> I used the Switch, and reinstalled all drivers. And now it works again. Perfectly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess something was wrong with the BIOS. No matter, I will use it now and if a problem comes again and I can not fix it, I will RMA/RTV it and get a new one.
> 
> I do wonder what went awry last time but it works now. So thank you!


Great!


----------



## Carniflex

I seem to have an intriguing problem. I took this screenshot earlier today and noticed that GPU-Z reports 15.8 installed, while I know for sure I have 15.12 installed. I thought perhaps it was some kind of issue from installing 15.12 over older drivers, so I ran Driver Sweeper in safe mode, rebooted, and reinstalled 15.12 from scratch... and it is still detected as 15.8 by GPU-Z and dxdiag???

The screenshot in question as well.


----------



## jdorje

It seems like AUX voltage is the vram voltage. At least according to hwinfo and some vague mentions up above.

But... bumping the vram voltage doesn't seem to do anything. At +0 I do 1725MHz with no problems, but even at +100 I can't go to 1775MHz memory speed. Maybe there's a slight improvement at 1750MHz, but it doesn't seem statistically significant.

Tried bumping core with raised AUX voltage also. No luck.

Meanwhile, dropping voltage is hilarious. I tried -100 mV and got an immediate black screen and (I deduce) a full system crash. Couldn't start up the system at all. I had to bring out my Win10 thumb drive installer, use the command line to enable legacy boot, use legacy boot to enable safe mode (useful trick, that), then from safe mode find and edit the Afterburner config file to reset the voltage.

These tests are done at low-ish settings for the core voltage. If I bump core voltage to +100 mV I have thermal issues.


----------



## Levys

Quote:


> Originally Posted by *Carniflex*
> 
> I seem to have an intriguing problem. I took this screenshot earlier today and noticed that GPU-Z reports 15.8 installed, while I know for sure I have 15.12 installed. I thought perhaps it was some kind of issue from installing 15.12 over older drivers, so I ran Driver Sweeper in safe mode, rebooted, and reinstalled 15.12 from scratch... and it is still detected as 15.8 by GPU-Z and dxdiag???
> 
> The screenshot in question as well.


Mine says the same, and I also did a clean install + CCleaner registry cleaning.
GPU-Z 0.8.6 will probably be fixed in 0.8.7.

Hell, the last version (0.8.5) even says my card is DX11 and not DX12.


----------



## ZealotKi11er

Quote:


> Originally Posted by *afyeung*
> 
> Really done with LTT forums on overclocking, especially since half the people who discuss graphics cards there probably don't even own a decent desktop. I'll be testing the 390X at the different memory straps people have suggested. But if there's no performance difference between 1625 and 1700, might as well run at 1700 if they need the same voltage, right? Again, I'll be testing 1500-1700+. I don't think my 390X can do 1700+ on the memory.


If there is no performance difference why run 1700MHz?


----------



## Carniflex

Running at 4K resolution with memory at 1625MHz, the memory controller normally shows around 20% load with short spikes into the 40-60% range, according to GPU-Z. Although I'm sure there are cases where higher memory bandwidth is noticeable in some benchmarks.


----------



## ziggystardust

I'm considering buying a new GPU these days, and I'm torn between a Zotac GTX 970 Extreme Core Edition (or Galax EXOC) and an MSI 390X.

I'm on 1080p for now; not sure I'll get a chance to upgrade to 1440p or 4K in the near future.

The 390X is of course a bit more expensive than a GTX 970, but I think it might be a bit more future-proof, especially for DX12.

What do you guys think? Is it worth paying 100-120 bucks more? And what about the MSI 390X: does that card run cool?


----------



## yuannan

Quote:


> Originally Posted by *ziggystardust*
> 
> I'm trying to consider buying a new gpu these days. And I'm kind of between a Zotac GTX 970 Extreme Core Edition (or Galax exoc) and a MSI 390X.
> 
> I'm on a 1080p for now, not sure if I can ever find a chance to upgrade to 1440p or 4K in a near future.
> 
> 390X is of course a bit more expensive than a GTX 970 but I think it might be a bit more future proof especially for DX12.
> 
> What you guys think? Is it worth paying 100-120 bucks more? And what about the MSI 390X? Does that card run cool?


Get whatever is cheaper; by the time DX12 really rolls around you'll probably have the money for a new GPU. Games don't happen overnight. I'm thinking late 2016 to early 2017 before a decent DX12 title comes out.
MSI makes nice coolers for AMD cards, and they look the best IMO. They also OC slightly better, as you can see in the list in the OP. But whichever is cheaper is better.


----------



## ziggystardust

Quote:


> Originally Posted by *yuannan*
> 
> Get whatever is cheaper; by the time DX12 really rolls around you'll probably have the money for a new GPU. Games don't happen overnight. I'm thinking late 2016 to early 2017 before a decent DX12 title comes out.
> MSI makes nice coolers for AMD cards, and they look the best IMO. They also OC slightly better, as you can see in the list in the OP. But whichever is cheaper is better.


At the moment the GTX 970 is about $100 cheaper where I live, but the 390X still feels like a slightly better performer. I might be wrong though. The problem is I will probably use this card for more than a year, considering I have a 3-year-old 7870. I'm a bit slow at upgrading my PC.

But I get your point


----------



## navjack27

dude go with the msi r9 390x gaming 8g like i got. it's perfect in every way. the only card i'd upgrade to from this is a fury x


----------



## kizwan

Quote:


> Originally Posted by *Carniflex*
> 
> I seem to have an intriguing problem. I took this screenshot earlier today and noticed that GPU-Z reports 15.8 installed, while I know for sure I have 15.12 installed. I thought perhaps it was some kind of issue from installing 15.12 over older drivers, so I ran Driver Sweeper in safe mode, rebooted, and reinstalled 15.12 from scratch... and it is still detected as 15.8 by GPU-Z and dxdiag???
> 
> The screenshot in question as well.

It's the Crimson driver's doing. During installation, the Crimson driver registers "Catalyst_Version = 15.8" in the registry. It also registers "RadeonSoftwareVersion = 15.12" there.


----------



## fat4l

Quote:


> Originally Posted by *jdorje*
> 
> My understanding of the strap phenomena is that at each point you are raising the timings. Just to make up numbers, 1750 might run at CL10 while 1755 runs at cl11. This means 1750 may be better than 1755, but if you can only run 1725 (CL10 still) its not fundamentally worse than 1750.
> 
> I get the same performance at 1725 as at 1750. 1775 is much lower performance and 1800 artifacts. Based on this I think past 1725 ish I start getting some errors. I'll try some aux voltage and pushing for 1750 though.
> 
> Even at 1725, the difference from 1500 is massive. And since the vram overclock doesn't have a high cost in power usage I'd say it's the most important oc.
> 
> One other thing I worry about is stutters due to vram errors. Even if overall fps is higher with faster memory, if it causes the 1% to be slower it might not be desirable. I think this could be measured though.


Yeah, you have a point there.
Each strap has its own set of timings.
Now, your memory would maybe be fine with the 1500-strap (1376-1500MHz) timings up to 1750MHz. However, AMD is playing it safe, so they use different timings for each strap.
Another thing is the IMC. Your memory could be working just fine up to 2000MHz, but the IMC may not handle it.
You can see how far you can push your IMC by noticing when you stop getting extra performance while overclocking: EDC errors are produced and corrected by ECC, so even though your memory clocks are higher, there's no performance gain.

Let's say you know your memory and IMC can go up to 1700MHz. You don't want to be using the 1750-strap (1626-1750) timings at 1700MHz, so you can mod your BIOS to use, for example, the timings from the 1500 strap. Let's say you tested it and it's fine. Now you have 1700MHz with 1500-strap timings, which gives you a few extra percent of performance.
Or let's say 1500 is too tight and you start seeing artifacts after 1650MHz. Then you can use the timings from the 1625 strap, so when clocking up, your timings won't switch to the 1750 set at 1626MHz but will stay the same all the way up to 1750MHz. So you get more performance again, since 1700MHz with 1625 timings beats 1700MHz with 1750 timings, which is obvious.

TL;DR: the memory can handle much better timings, and you gain extra performance by using the timings from a lower strap.


----------



## Dundundata

Does the msi 390 have a dual bios switch?


----------



## pillowsack

Would someone be interested in making a strap-modified BIOS for me? I have no idea what the heck to do after looking at that thread. I've read it about 4 times, but it's still going over my head.


----------



## navjack27

i tried messing with the bios editor but i just don't trust myself to set the values. i'd also like a better explanation of what the Hawaii bios editor settings actually do.


----------



## fyzzz

I would love to know what 390(X)s could do with BIOS modding. First off, I suggest looking at this thread: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50. I can help with modding a BIOS when I have time, if someone wants it. My 290 managed to score this: http://www.3dmark.com/fs/6641212 when I put in tighter timings and some other tweaks.


----------



## diggiddi

Quote:


> Originally Posted by *Carniflex*
> 
> Sounds about right


TY, repped up. Now, at 1080p running 2 monitors, does faster VRAM help any?


----------



## jdorje

Quote:


> Originally Posted by *fyzzz*
> 
> I would love to know what 390(x)'s could do with bios modding. First off i suggest go look at this thread: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50. I can help with modding bios, when i have time and if someone wants it. My 290 managed to score this: http://www.3dmark.com/fs/6641212 when i put in tighter timings and some other tweaks.


I read through the thread. Pretty confusing! What can you edit in the BIOS that would improve performance beyond what you can do with Afterburner? It seems like the power limit could increase it slightly, but most 390s are about tapped out by the time they hit their power limit anyway... and are restricted to ~300W from the 8+6-pin cables. Then you can edit the RAM subtimings, which, uh, sounds even more complicated. What else?
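The ~300W ceiling mentioned here is just the nominal PCIe power budget for that connector combo (a sketch of the spec numbers; boards can and do exceed them briefly in practice):

```python
# Nominal PCIe power budget for a card fed by an 8-pin + 6-pin combo,
# per the PCIe CEM spec limits.
PCIE_SLOT = 75    # W available from the x16 slot itself
SIX_PIN   = 75    # W per 6-pin auxiliary connector
EIGHT_PIN = 150   # W per 8-pin auxiliary connector

budget = PCIE_SLOT + SIX_PIN + EIGHT_PIN
print(budget)  # 300 W: raising the BIOS power limit much past this
               # mostly pushes the connectors out of spec
```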


----------



## fat4l

Quote:


> Originally Posted by *jdorje*
> 
> Read through the thread. Pretty confusing! What can you edit in the bios that would improve your performance beyond what you can do with afterburner? Seems like the power limit could increase it slightly, but most 390s are about tapped out by the time they hit their power limit anyway...and are restricted to ~300W from the 8+6 pin cables. Then you can edit the ram subtimings, which uh, sounds even more complicated. What else?


Memory timings are important. You can also mod aux voltage, fan control, TDP, and voltage; you can go above the Afterburner limit.


----------



## BlackFox1337

Anyone running a watercooled 390x? I'm wondering what kind of temps you are seeing.


----------



## pillowsack

I have a universal water block, so my VRMs aren't cold. My core never goes above 47C though. It comes after my CPU in the loop.



This is my new stable 24/7 overclock. It let me play games for 4 hours straight with no crashing (yes, I am that lazy...).

I just really wanna see that core go up a little bit more. When the core goes up, the instability comes, at least at 1200MHz. It takes a good 20-30 minutes before stuff gets unstable at 1200, but right now it doesn't flinch.


----------



## Carniflex

Quote:


> Originally Posted by *diggiddi*
> 
> TY, repped up, Now At 1080p running 2 monitors does faster VRAM help any?


Not really. I'm sure there is a difference somewhere, but I don't believe it's significant enough to notice in actual real-life usage: something on the order of textures loading 5ms faster here or there, or perhaps one more frame per second if you're already running at 200 fps.

A benchmark specifically designed to tax graphics memory bandwidth should show a more or less linear gain in line with memory bandwidth, though.


----------



## navjack27

Quote:


> Originally Posted by *fyzzz*
> 
> I would love to know what 390(x)'s could do with bios modding. First off i suggest go look at this thread: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50. I can help with modding bios, when i have time and if someone wants it. My 290 managed to score this: http://www.3dmark.com/fs/6641212 when i put in tighter timings and some other tweaks.


yeah if i can get tighter timings at my max memory overclock, that'd be neat to test out. i guess maybe a higher power limit too, but after doing the math i'm quite sure it's already set for the max it can pull from all the power sources.


----------



## Carniflex

Is it possible that the TIM Gigabyte used on their 390X G1 has a curing time? The temperatures I'm seeing after a few weeks of using the card seem somewhat tamer than they used to be. While initially I was hitting 90+ on a regular basis even with a lowered power limit, it now seems to have settled at about 80-85C with the power limit at 0%.


----------



## AlC0

I have now tested the Alphacool NexXxoS GPX - ATI R9 390 M03, it fits my Asus 390 Strix Gaming (DC3 cooler) perfectly.

For those interested in performance:

Restriction: high (about twice as restrictive as EK)
Core temps: great
VRM temps: good

Explanation:

In my loop I have 2x D5 pumps, 3 radiators, a flow sensor, and a CPU block. Before I put in the Alphacool block I had 6.5 LPM (1.7 GPM). With the EK block I had on my previous card I got 5.5 LPM (1.4 GPM), and with the Alphacool block I get 4.5 LPM (1.1 GPM). Core temps seem to be a little lower with this block: 30C at idle and 45C during FurMark + Prime95. The VRM/MOSFET/memory temps are a little higher, about 60-68C under full load. For the VRMs/MOSFETs/memory that is no problem: these components shouldn't encounter any issues under 90C, and anything below that point is considered a breeze, as I understand it.

I hope this information will be helpful to others


----------



## mus1mus

Quote:


> Originally Posted by *Carniflex*
> 
> Is it possible that the TIM Gigabyte has used on their 390X G1 has a curing time? The temperatures I'm seeing after few weeks of using the card seem to be somewhat more tame than they used to. While initially I was hitting 90+ on regular basis even with lowered power limit then now it seems to have settled at about ~80 ... 85 C with power limit at 0%.


Impossibrul!









If they even have a cure time, they should have cured well before you received the card.

So who did try the bios mods?


----------



## kizwan

For the TIM to cure, it requires thermal cycles. If Gigabyte used AS5, then yes, it requires a break-in or curing period. Just to point out, though: you won't necessarily get a temp drop after the TIM cures; the temps may not change at all. Cured just means the TIM has reached its optimal thermal conductivity.


----------



## Levys

Quote:


> Originally Posted by *BlackFox1337*
> 
> Anyone running a watercooled 390x? I'm wondering what kind of temps you are seeing.


I will play Black Ops 3 for an hour and post a screenshot with the temps (should be well under 50°C, VRMs too).


----------



## kubiks

Quote:


> Originally Posted by *AlC0*
> 
> I have now tested the Alphacool NexXxoS GPX - ATI R9 390 M03, it fits my Asus 390 Strix Gaming (DC3 cooler) perfectly.
> 
> For those interested in performance:
> 
> Restriction: high (about twice as restrictive as EK)
> Core temps: great
> VRM temps: good
> 
> Explanation:
> 
> In my loop I have 2x D5 pumps, 3 radiators, a flow sensor, and a CPU block. Before I put in the Alphacool block I had 6.5 LPM (1.7 GPM). With the EK block I had on my previous card I got 5.5 LPM (1.4 GPM), and with the Alphacool block I get 4.5 LPM (1.1 GPM). Core temps seem to be a little lower with this block: 30C at idle and 45C during FurMark + Prime95. The VRM/MOSFET/memory temps are a little higher, about 60-68C under full load. For the VRMs/MOSFETs/memory that is no problem: these components shouldn't encounter any issues under 90C, and anything below that point is considered a breeze, as I understand it.
> 
> I hope this information will be helpful to others


I was impressed. Core performance greater than the EK full-cover block, with a lower price tag. I'm running two M02s on my MSI 390X CrossFire setup. Core temps won't break 50C with both cards running FurMark, and that's with the GPU blocks in serial.


----------



## AlC0

Quote:


> Originally Posted by *kubiks*
> 
> I was impressed. Core performance greater than EK fullcover, with a lower price tag. I'm running two m02 on my msi 390x crossfire setup. Core temp won't break 50c with both cards running furmark, and that is running the gpu blocks in serial


I was impressed too. Alphacool also includes a backplate with thermal pads, so you get passive cooling on the back side of the card as a bonus; the pads sit in the same spots as the front pads. On top of that, Alphacool sells upgrade kits with only the VRM/MOSFET/memory cooling parts for new GPUs, so theoretically you should be able to upgrade the block to match your next GPU for only 45€. I must say I am very happy with the way things turned out, and that I went with this block!


----------



## Levys

390x
Quote:


> Originally Posted by *BlackFox1337*
> 
> Anyone running a watercooled 390x? I'm wondering what kind of temps you are seeing.


I didn't get around to playing Black Ops 3; it wasn't pushing the card hard enough.
Now, Crysis 3 and Metro: Last Light, those games will push your hardware, even more than benchmarks, I find.
This is a screenshot of Metro LL maxed out on everything, with a 4.4GHz OC on the FX-8350 and 1150/1625 on the R9 390X. Max temps around 53°C,
VRM1 around 56°C and VRM2 around 48°C (in other games both VRMs stay under the core temp).
Black Ops 3 only got me to around 46°C.


----------



## BlackFox1337

My Setup-

i7 6700k stock clocks
EK Supremacy MX
390x w/ Alphacool M02 block stock clocks
3/8 - 5/8 tubing
120 - 30mm Black Ice GTS Rad
240 - 30mm Black Ice GTS Rad
100 Res w/ D5 Vario
3x Noiseblocker 800Rpm fans
1x Corsair 120 fan
1x Scythe 120x15mm fan

Playing GTA V at 3440x1440 on High settings, I'm seeing temps around 58-60C with fans at full speed, per MSI Afterburner. CPU temps seem to be a little lower. Any other tools I should use? And judging by these temps, will I have any issues overclocking? I think I may need to replace the Noiseblockers with higher-RPM/static-pressure fans.


----------



## Harry604

My 390X Tri-X idles at 40C,

with no overclock or added voltage,

in a 900D case.

It used to idle at 30C.

Is there a bug with the new drivers?


----------



## navjack27

i'm hopefully getting help with the bios mods. i can't for the life of me figure out hex editing like this. i'm looking at numbers and letters... a white background and black text... there are offsets... and things to find... and rows of these are actually memory timings... my head just exploded. why isn't there a program that can just find this **** for us?


----------



## pillowsack

Quote:


> Originally Posted by *navjack27*
> 
> i'm hopefully getting help with the bios mods. i cant for the life of me figure out any hex editing like this. i'm looking at numbers and letters... a white background and black text... there are offsets... and things to find... and rows of these are actually memory timings.... my head just exploded. why isn't there a program that can just find this **** for us?


Skip modding the BIOS and just see what your memory can poop out. Given the straps, 1625 or 1750 is ideal if you can get there: you'll be pushing the MHz, but 1625 runs the strap timings that start at 1501, and 1750 runs the ones that start at 1626.

I got a little bored today and did this:



After about 20 minutes of ear-bleeding VVSHSHSHHHRHRHRRR, I found out my core simply will not go over 1150MHz. My memory runs perfectly fine at 1750; I tried 1775+ but it started artifacting in FurMark.

My VRM1 hit 99C tops, and VRM2 only 60C, with these finger-deadly fans.

Kind of a bummer. I guess I got a poorly binned chip, or the vdroop is killing me. I also modified the BIOS a bit to raise the power limit to 350W, instead of the stock 310W or so...

My chip is set to 1.3V but drops to 1.15-1.2V in FurMark, which is pretty drastic. I guess the VRM temps could play into that? Anyone with a full-cover block want to clarify? Someone earlier said VRM temps have to be below 50C for him or his overclock artifacts. GPU core temp isn't a problem for me.

Anyways, I guess I'm content with 1125/1750 regardless. 1750 should be pushing the timings pretty well, right? This is with just +25mV aux voltage, stock core voltage, and the power limit at about 330W.


----------



## navjack27

yeah i can do 1750, but i don't push games past 1440x1080, 1920x1080 or 1280x960, and i'm thinking tighter timings at a higher clock would be better for that.

anyone else have it where vrm2's temp never fluctuates while monitoring, but it does move over the course of a day...

isn't it great when u run the API overhead test and u hear the coil whine sound like a space ship or a nicely tuned synthesizer?


----------



## battleaxe

Quote:


> Originally Posted by *pillowsack*
> 
> Skip modding the bios and just see what your memory can poop out. According to the timings, if you can get 1625 or 1750 that should be ideal. You'll be pushing the MHz, but you'd be running the 1501-strap timings at 1625 and the 1626-strap timings at 1750.
> 
> I got a little bored today and did this:
> 
> 
> 
> After about 20 minutes of ear bleeding VVSHSHSHHHRHRHRRR, I found out my core will simply not go over 1150mhz. My memory is running perfectly fine at 1750; I tried 1775+ but it started artifacting in Furmark.
> 
> My VRM1 hit 99C tops, and VRM2 only 60C with these finger deadly fans.
> 
> Kind of a bummer, I guess I got a poorly binned chip, or this vdroop is killing me. I also modified the bios a bit to make the power limit hit 350W, instead of the stock 310W or so....
> 
> My chip will be set at 1.3V but in furmark drops to 1.15-1.2v, which is pretty drastic. I guess the VRM temps could play with that? Anyone with a full cover block want to clarify? Someone earlier said that VRM temps have to be below 50C for him otherwise it artifacts with his overclock. GPU Core temp isn't a problem for me.
> 
> Anyways I guess I'm content with 1125/1750 regardless. 1750 should be pushing the timings pretty good right? This is just with the aux voltage set to 25MV, core voltage at stock, and power limit at about 330W.


Get VRM1 cooler, like down into the 60's at least and you will see higher clocks.

Something like this will get you down there. http://www.overclock.net/t/1203636/official-amd-ati-gpu-mod-club-aka-the-red-mod/2900#post_24504814

I have, since this pic was posted, put another section of this sink across the back of both these VRM1 coolers, which essentially binds them together. I was then able to attach an AIO cooler to just the VRM1's, so I have one AIO cooler cooling the VRMs on both 290X cards. Temps hardly reach mid 40's gaming, or about 55c with 200mv added. So it ended up being a hybrid air/water AIO cooler on the VRMs of the cards. Works pretty well, not as perfect as full cover blocks, but I cannot get a block for one of these cards, so that conversation is pointless. But you can do a lot better than those VRM1 coolers you have now. I started out that way too btw. Still have a set of those laying around here.


----------



## navjack27

i'm bouncing off the walls here!!!! i just modded my bios with help from @kizwan and got this new high score in 3dmark firestrike http://www.3dmark.com/fs/6894435


----------



## fyzzz

Quote:


> Originally Posted by *navjack27*
> 
> i'm bouncing off the walls here!!!! i just modded my bios with help from @kizwan and got this new high score in 3dmark firestrike http://www.3dmark.com/fs/6894435


Nice that you've sorted it out. What mod did you do? Just timings? There are 2 hex values inside the bios that do wonders for the score.


----------



## navjack27

i moved the 125000 timings to every other mhz up to 1750. i also found out that it has timings for 2000mhz.. but i haven't touched that yet.
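For anyone wondering what "moving the 125000 timings up" looks like mechanically, here's a toy sketch of the strap-copy idea. The table offset, strap count, field width and blob size below are made-up placeholders, not the real Hawaii VBIOS layout; use Hawaii BIOS Reader or kizwan's tutorial against a real dump.

```python
# Toy model: each memory strap is a little-endian frequency field (in
# 10 kHz units, so 125000 == 1250 MHz) followed by a fixed-size timing
# blob. Layout values here are ASSUMED for illustration only.
import struct

STRAP_SIZE = 4 + 48          # 4-byte frequency field + 48-byte blob (assumed)

def copy_strap_timings(rom: bytearray, table_off: int, n_straps: int,
                       src_mhz: int, up_to_mhz: int) -> int:
    """Copy the timing blob of the src_mhz strap onto every strap above
    it up to up_to_mhz. Returns the number of straps patched."""
    straps = []
    for i in range(n_straps):
        off = table_off + i * STRAP_SIZE
        freq10k, = struct.unpack_from('<I', rom, off)
        straps.append((freq10k // 100, off))   # (MHz, offset)
    src = next(off for mhz, off in straps if mhz == src_mhz)
    patched = 0
    for mhz, off in straps:
        if src_mhz < mhz <= up_to_mhz:
            rom[off + 4:off + STRAP_SIZE] = rom[src + 4:src + STRAP_SIZE]
            patched += 1
    return patched
```

The point is just that each strap is a (frequency limit, timing blob) pair, and the mod copies a tighter blob over the looser high-frequency straps.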


----------



## pillowsack

Quote:


> Originally Posted by *battleaxe*
> 
> Get VRM1 cooler, like down into the 60's at least and you will see higher clocks.
> 
> Something like this will get you down there. http://www.overclock.net/t/1203636/official-amd-ati-gpu-mod-club-aka-the-red-mod/2900#post_24504814
> 
> I have since this pic was posted put another section of this sink across the back of both these VRM1 coolers, which essentially binds them together. I was then able to attach an AIO cooler to just the VRM1's, so I have one AIO cooler cooling the VRM's on both 290X cards. Temps hardly reach mid 40's gaming, or about 55c with 200mv added. So it ended up being a hybrid air/water AIO cooler on the VRM's of the cards. Works pretty good, not as perfect as full cover blocks, but I cannot get a block of one of these cards. So that conversation is pointless. But you can do a lot better than those VRM1 coolers you have now. I started out that way too btw. Still have a set of those laying around here.


I put the gelid kit on this. Holy balls I like the idea though.

Where the heck did you get that heatsink??? I want to buy one right now.

I have the appropriate tools to cut, drill, and sand though. I am VERY interested in doing that, and I'm grateful you showed me this. I wish I hadn't bought the gelid kit now, although it was only $12


----------



## battleaxe

Quote:


> Originally Posted by *pillowsack*
> 
> I put the gelid kit on this. Holy balls I like the idea though.
> 
> Where the heck did you get that heatsink??? I want to buy one right now.
> 
> I have the appropriate tools to cut, drill, and sand though. I am VERY interested in doing that, and I'm grateful you showed me this. I wish I hadn't bought the gelid kit now, although it was only $12


Three feet of it was around $13. I'll find it for you once I get back to my PC.


----------



## pillowsack

Quote:


> Originally Posted by *battleaxe*
> 
> Three feet of it was around $13. I'll find it for you once I get back to my PC.


thank you sir!


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> yeah i can do 1750 but i don't push games past 1440x1080 1920x1080 or 1280x960 and i'm thinking a tighter timing at a higher clock or whatever would be better for that.
> 
> anyone else have it so vrm2 never fluctuates the temp when monitoring but it does move over the day...
> 
> isn't it great when u run the API overhead test and u hear the coil whine sound like a space ship or a nicely tuned synthesizer?


that is normal for the MSI, apparently there is only 1 VRM sensor.


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> i'm bouncing off the walls here!!!! i just modded my bios with help from @kizwan and got this new high score in 3dmark firestrike http://www.3dmark.com/fs/6894435


if you get a chance and wouldn't mind posting up a little tutorial...i'm a little scared to mess with the bios but I'm sitting here at 1625, it would be nice to have lower timing at 1650.


----------



## pillowsack

kizwan posted an easier tutorial at the end of the thread in bios modding for hawaii cards.

Anyone have an XFX 390X bios?


----------



## VBoOmeRanGV

I have a Gigabyte 390X. Just replaced 2x HD 7850s.
This card kicks their ass


----------



## Spartoi

I recently got upgraded to an R9 390X from an R9 390 via an RMA, but I was wondering how I can tell (via software) if I actually have a 390X? Using GPU-Z, it only says that I have a 390 series card, which isn't helpful in my case.


----------



## pillowsack

Quote:


> Originally Posted by *Spartoi*
> 
> I recently got upgraded to an R9 390X from a R9 390 via an RMA, but I was wondering how I can tell (via software) if I actually have a 390X? Using GPU-Z, it only says that I have 390 series card which isn't helpful in my case.


You can only really tell by the shader count in GPU-Z (2816 on the 390X vs 2560 on the 390)

Quote:


> Originally Posted by *VBoOmeRanGV*
> 
> Has a Gigabyte 390X. Just replaced 2X 7850 HD's
> This card kicks their ass


Is it a reference design card? Would you share the bios?


----------



## battleaxe

Quote:


> Originally Posted by *pillowsack*
> 
> thank you sir!


Here it is.

https://www.amazon.com/dp/B00OYEU5AI/ref=cm_sw_r_awd_qjEEwbZ5W2VXG

The shipping is insane though. I think I had this as part of a larger order and paid zero for shipping. There could be other suppliers that offer this too. But this one will fit perfectly between two cards. I threaded the aluminum and used screws from the backside. Just be careful not to over-tighten and warp the PCB, as this will make the vrms in the middle not make good contact.


----------



## VBoOmeRanGV

Quote:


> Originally Posted by *pillowsack*
> 
> Is it a reference design card? Would you share the bios?


http://www.newegg.com/Product/Product.aspx?Item=N82E16814125804&cm_re=gigabyte_R9_390x-_-14-125-804-_-Product

Above is what I have, I do not have bios info


----------



## navjack27

Quote:


> Originally Posted by *Dundundata*
> 
> if you get a chance and wouldn't mind posting up a little tutorial...i'm a little scared to mess with the bios but I'm sitting here at 1625, it would be nice to have lower timing at 1650.


yeah i just followed kizwan's tutorial. it finally made sense when i went slow and did the colors in the hex editor he uses.

and spartoi 

EDIT: i'm trying to figure out how to max out the TDP max and powerlimit settings in the bios editor. it's an 8-pin and a 6-pin connector plus the pci express slot, so that's 300w.. but the default settings have it going to like over 300w... and reviews online say it's a 6+1+1 or just a 6+1 power phase...
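For reference, the spec numbers behind that math: the PCI-SIG ratings are 75 W from the slot, 75 W from a 6-pin and 150 W from an 8-pin, so slot + 6-pin + 8-pin is a nominal 300 W budget. A trivial sanity check; these are spec limits rather than hard physical ones, which is why a default power limit above 300 W can still work.

```python
# Nominal PCI-SIG power ratings per source (watts). Spec limits, not
# physical ones -- cards like the 390 routinely draw past them.
RATINGS_W = {'pcie_slot': 75, '6pin': 75, '8pin': 150}

def board_budget(*connectors: str) -> int:
    """Sum the nominal power budget for a given connector loadout."""
    return sum(RATINGS_W[c] for c in connectors)

print(board_budget('pcie_slot', '6pin', '8pin'))  # 300
```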


----------



## pillowsack

Quote:


> Originally Posted by *navjack27*
> 
> yeah i just followed kizwan's tutorial. it finally made sense when i went slow and did the colors in the hex editor he uses.
> 
> and spartoi
> 
> EDIT: i'm trying to figure out how to max out the TDP max and powerlimit settings in the bios editor. its a 8pin and 6pin connector with the pci express, so thats 300w.. but the default settings have it going to like over 300w... and reading reviews online say its a 6+1+1 or just a 6+1 power phase...


I did it because I feel like it's safe. Do it from the hawaii bios editor tool.
Only a small amount comes from the PCI-E slot as it's limited. Most of the power comes from the PCI-E PSU connectors. As long as they're decent wires, not skinny BS ones, and your PSU can handle it...


----------



## mus1mus

Quote:


> Originally Posted by *navjack27*
> 
> i'm bouncing off the walls here!!!! i just modded my bios with help from @kizwan and got this new high score in 3dmark firestrike http://www.3dmark.com/fs/6894435


Very Pretty!










I guess I need a 300 series card!


----------



## jdorje

http://www.3dmark.com/3dm/9845887?

Got my 390 (stable) to a very auspicious 11,111 firestrike score. The benchmark is incredibly CPU dependent, so of course I had to push the overclock on my cpu to get the last points.

Unfortunately temps are too high to run this 24/7. On the other hand it's only 4% faster than my stock-voltage overclock.


----------



## navjack27

Quote:


> Originally Posted by *pillowsack*
> 
> I did it because I feel like it's safe. Do it from the hawaii bios editor tool.
> Only a small amount comes from the PCI-E slot as it's limited. Most of the power comes from the PCI-E PSU connectors. As long as they're decent wires, not skinny BS ones, and your PSU can handle it...



this is what it looks like without any mods... i'm not really sure how much higher it can go.


----------



## Raylove

I would like to join


----------



## jaydude

Quote:


> Originally Posted by *Carniflex*
> 
> Is it possible that the TIM Gigabyte has used on their 390X G1 has a curing time? The temperatures I'm seeing after few weeks of using the card seem to be somewhat more tame than they used to. While initially I was hitting 90+ on regular basis even with lowered power limit then now it seems to have settled at about ~80 ... 85 C with power limit at 0%.


Change the thermal paste to some Hydronaut or similar, and also replace the thermal pads on the mosfets with some alphacool 1mm 11W/mK pads. You won't regret it, especially in summer


----------



## Carniflex

Quote:


> Originally Posted by *jaydude*
> 
> Change the thermal paste to some Hydronaut or similar, and also replace the thermal pads on the mosfets with some alphacool 1mm 11W/mK pads. You won't regret it, especially in summer


I did consider that possibility for sure. Although as I ordered a "full cover" block for it last night, I think I'll just put it under water. It's not gonna be a true full cover - it's the alphacool GPX block with the core being water cooled and a massive aluminum full cover radiator for the VRMs and RAM chips. It will be a little while before I can do it as I also ordered a new chassis which will need some modding before it's suitable for my purposes.


----------



## jdorje

Quote:


> Originally Posted by *jaydude*
> 
> Change the thermal paste to some Hydronaut or similar, and also replace the thermal pads on the mosfets with some alphacool 1mm 11W/mK pads. You won't regret it, especially in summer


Which pads do you mean? The one on the back or ones on the front?


----------



## jaydude

Quote:


> Originally Posted by *jdorje*
> 
> Which pads do you mean? The one on the back or ones on the front?


I don't think there are pads on the back under the backplate on the gigabyte 390/x, if that's what you mean. I mean here


----------



## tims390x

Quote:


> Originally Posted by *jaydude*
> 
> I don't think there are pads on back under the backplate on the gigabyte 390/x if thats what you mean, I mean here


I've been considering repasting my 390X - Is it really as simple as removing the backplate screws and then pulling the backplate off? I took the screws off the other day but it seemed like an unnatural amount of force was required to take off the back plate, so I just put the screws back in and hoped I didn't break anything xD.

If possible could you (or any 390X Gigabyte owner) show a few pictures of the process (unless it is just simple and requires some force removing that back plate..).

Also - On another note i have been thinking recently with all the bios flashing going on -

Is it possible - to flash the Gigabyte 390X - then remove any voltage lock - to then OC more than factory? (I was thinking once I get custom loop it would be sick to see if you could push these gigabyte models)

Thanks


----------



## Dorland203

Quote:


> Originally Posted by *navjack27*
> 
> i'm bouncing off the walls here!!!! i just modded my bios with help from @kizwan and got this new high score in 3dmark firestrike http://www.3dmark.com/fs/6894435


Almost 16000 graphics score for a 390X is phenomenal. Not to mention the gpu clock is only 1150 - can you push it to 1200 and bench again?


----------



## jaydude

Quote:


> Originally Posted by *tims390x*
> 
> I've been considering repasting my 390X - Is it really that simple as removing backplate screws and then pulling backplate off? I took screws off the other day but it seemed like an unnatural amount of force required to take off back plate so I just put screws back in and hoped I didn't break anything xD.
> 
> If possible could you (or any 390X Gigabyte owner) show a few pictures of the process (unless it is just simple and requires some force removing that back plate..).
> 
> Also - On another note i have been thinking recently with all the bios flashing going on -
> 
> Is it possible - to flash the Gigabyte 390X - then remove any voltage lock - to then OC more than factory? (I was thinking once I get custom loop it would be sick to see if you could push these gigabyte models)
> 
> Thanks


You do not have to remove the backplate at all, only the screws on the backplate.

As it seems at the moment, there is no way to remove the volt lock other than volt modding.


----------



## navjack27

Quote:


> Originally Posted by *Dorland203*
> 
> Almost 16000 Graphics score for a 390x is phenomenal.Not to mention the gpu clock is only 1150,can you push it to 1200 and bench again ?


i don't think i can go higher than 1150 with the stock power tables in the bios. cooling is fine, i'm on air... i really think i'm in a power delivery issue rather than a thermal one. if someone can walk me thru changing those settings i posted earlier... i'd really be in debt.


----------



## mus1mus

Quote:


> Originally Posted by *navjack27*
> 
> i don't think i can go higher then 1150 with the stock power tables in the bios. cooling is fine, i'm on air... i really think i'm in a power delivery issue rather then a thermal one. if someone can walk me thru changing those settings i posted earlier... i'd really be in debt.


Look up TDP Max, TDC Limit and Power Limit; I set them to 999 with no issues. But you can be conservative and set them lower.

Hawaii BIOS Reader allows changing those values. i.e. change 230 to what you want.

Also look up GPU and Memory Frequency Tables.

You can then change the DPM7 Values which correspond to Voltages before droop. 6558?

i.e. 1250 or a higher value to raise the 3D full clock Voltage. Refer to the bios editing thread for the voltage values allowed.

Just remember to put equal Voltage Values to both GPU and Memory Frequency DPM7 Voltages. (else, black screen is inevitable)

Use TriXX or HIS Iturbo for higher Voltages. (just make sure your cooling is capable.)

Save the rom and flash with caution. By caution, I mean not flashing both BIOS positions and saving your default BIOS.
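One extra caution if you hex-edit by hand instead of letting Hawaii BIOS Reader write the file: the ROM checksum has to be corrected after any byte change, or the flash tool can reject the image. The usual AtomBIOS convention is that the whole image sums to 0 mod 256, with the size (in 512-byte blocks) at offset 0x02 and the correction byte at offset 0x21; treat those offsets as assumptions to verify against your own tooling.

```python
# Recompute an AtomBIOS-style checksum after a manual hex edit.
# Offsets are the conventional ones (verify against your tool).
ROM_SIZE_OFFSET = 0x02   # image size in 512-byte blocks
CHECKSUM_OFFSET = 0x21   # correction byte

def fix_atom_checksum(rom: bytearray) -> int:
    """Set the checksum byte so the image sums to 0 mod 256."""
    size = rom[ROM_SIZE_OFFSET] * 512
    rom[CHECKSUM_OFFSET] = 0                       # zero out the old byte
    rom[CHECKSUM_OFFSET] = (-sum(rom[:size])) & 0xFF
    return rom[CHECKSUM_OFFSET]
```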







enjoy!


----------



## LeSwede

Took my MSI R9 390 8GB Gaming apart to get you peeps some pics of the PCB


----------



## jodybdesigns

Jesus 16k on Firestrike with 1 card????? Goodbye crossfire 7950's. You just made up my mind for me.


----------



## ManofGod1000

Quote:


> Originally Posted by *LeSwede*
> 
> Took my MSI R9 390 8GB Gaming apart to get you peeps some pics of the PCB


Sexy!


----------



## Dundundata

Quote:


> Originally Posted by *LeSwede*
> 
> Took my MSI R9 390 8GB Gaming apart to get you peeps some pics of the PCB


Quote:


> Originally Posted by *ManofGod1000*
> 
> Sexy!


Put some clothes on









Is that what it looks like stock


----------



## battleaxe

Quote:


> Originally Posted by *Dundundata*
> 
> Put some clothes on
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that what it looks like stock


Nice job. Covering those chokes will shut them right up. I will be shocked if your card exhibits any coil whine at all like that. I've shut up several cards now doing the same thing.


----------



## fat4l

Quote:


> Originally Posted by *navjack27*
> 
> i'm bouncing off the walls here!!!! i just modded my bios with help from @kizwan and got this new high score in 3dmark firestrike http://www.3dmark.com/fs/6894435


is this with everything set to default in the drivers or with some tweaks (tessellation off etc)?


----------



## Dundundata

Quote:


> Originally Posted by *battleaxe*
> 
> Nice job. Covering those chokes will shut them right up. I will be shocked if your card exhibits any coil whine at all like that. I've shut up several cards now doing the same thing.


What did you use exactly, and will it help with cooling on air?

On another note I stumbled into the bios editing topic...interesting stuff there


----------



## navjack27

Quote:


> Originally Posted by *fat4l*
> 
> is this with everything set to default in the drivers or with some tweaks (tessellation off etc)?





and i use process lasso to set affinity and other process based stuffs. nothing fishy at all.


----------



## navjack27

Quote:


> Originally Posted by *mus1mus*
> 
> Look up TDP Max, TDC Limit and Power Limit; I set them to 999 with no issues. But you can be conservative and set them lower.
> 
> Hawaii BIOS Reader allows changing those values. i.e. change 230 to what you want.


really? i can just up dat shizz? i've read some stuff about it but i just wasn't that clear on how much the card obeys that stuff. so i guess maxxing it out lets the card pull everything it can from all the power sources its connected to, but not an out of spec kinda amount.


----------



## LeSwede

Quote:


> Originally Posted by *Dundundata*
> 
> Put some clothes on
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that what it looks like stock


Yep, I just pulled off the cooler, took a pic and put it back on


----------



## LeSwede

Quote:


> Originally Posted by *battleaxe*
> 
> Nice job. Covering those chokes will shut them right up. I will be shocked if your card exhibits any coil whine at all like that. I've shut up several cards now doing the same thing.


This is just the default configuration, I've added nothing to the card.


----------



## Dundundata

good to know, explains my excellent cooling









Of course always looking for improvement

Now to figure out hex editing and bios modding


----------



## Dundundata

delete


----------



## battleaxe

Quote:


> Originally Posted by *Dundundata*
> 
> What did u use exactly, will it help with cooling on air?
> 
> On another note I stumbled into the bios editing topic...interesting stuff there


Well, then the manufacturer knew what they were doing. They put thermal tape over the chokes to keep them from resonating. That's what causes coil whine. It won't really do anything for cooling, it's just to keep them quiet.


----------



## sportsczy

Isn't radeonpro obsolete now? I'm also very wary of optimizers, including process lasso... open to being proved wrong on both counts however.


----------



## navjack27

It is obsolete but some functions work still. I was asked what I did and I answered. I use process lasso for testing things using only my real cores without disabling hyperthreading. I've compared it to what it's like with HT disabled in BIOS and its within margin of error so I use process lasso when I don't want to reboot. I know radeonpro works in dx 11 mainly the flip queue. In dx 9 mostly everything works without restarting the game/program. You can see this by changing the lod bias and alt tabbing back to the game. The process lasso stuff like I/o priority and memory priority show a measurable difference in the positive in tests like linX and other CPU benchmarks, so I assume that would carry over to other benchmarks in general.
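The "real cores only" part of that Process Lasso setup can be scripted too. A minimal Linux-only sketch using the stdlib; the assumption that even logical IDs map one-per-physical-core holds on typical Intel HT desktops but is not guaranteed, so check your own topology first.

```python
import os

def physical_core_ids(logical_count: int) -> set[int]:
    # Assumption: SMT siblings are paired (0,1), (2,3), ... so taking
    # the even IDs gives one hardware thread per physical core.
    return set(range(0, logical_count, 2))

def pin_to_physical_cores() -> set[int]:
    """Restrict the current process to one thread per physical core."""
    ids = physical_core_ids(os.cpu_count() or 2)
    os.sched_setaffinity(0, ids)   # Linux-only; pid 0 == current process
    return os.sched_setaffinity(0)
```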


----------



## Dundundata

Quote:


> Originally Posted by *battleaxe*
> 
> Well, then the manufacturer knew what they were doing. They put thermal tape over the chokes to keep them from resonating. That's what causes coil whine. It won't really do anything for cooling, it's just to keep them quiet.


I see. Wish I knew this when I had my XFX, it didn't have anything like that and the whine was bad. But I do love the MSI.


----------



## navjack27

i changed up my TDP MAX - POWER LIMIT - TDC LIMIT : all to 300 and set my 24/7 stable overclock into the bios (1100/1625). its obviously slower than if i locked the clocks at the 3d clocks.. that isn't the interesting part, the interesting part is that i still cant get over 1150/1750 even with upping the voltage and power limits. artifacting at the tail end of testing 1160mhz in firestrike. and 1175mhz gets artifacts instantly.

i'm assuming i need to up the voltage... or something. i'm really not sure. i see the voltage drop that happens when it goes under load and i'm trying to keep that dropped voltage up a little higher each time i raise up the clocks.


----------



## navjack27

OKAY... OKAY... i used that HIS iTurbo program... 1175/1750 100% power limit +100mv vddc offset 1000mv vddci
http://www.3dmark.com/fs/6909024

13056 3dmarks
Graphics Score 16146
Physics Score 11278
Combined Score 5938

100% valid. artifacts VERY MINOR... i had gpu-z running to monitor but i forgot to log to file. i'll run again with a higher vddc offset to remove artifacts.
but here is the picture of gpu-z


----------



## mus1mus

Quote:


> Originally Posted by *navjack27*
> 
> i changed up my TDP MAX - POWER LIMIT - TDC LIMIT : all to 300 and set my 24/7 stable overclock into the bios (1100/1625). its obviously slower then if i locked the clocks at the 3d clocks.. that isn't the interesting part, the interesting part is that i still cant get over 1150/1750 even with upping the voltage and power limits. artifacting at the tail end of testing 1160mhz in firestrike. and 1175mhz gets artifacts instantly.
> 
> i'm assuming i need to up the voltage... or something. i'm really not sure. i see the voltage drop that happens when it goes under load and i'm trying to keep that dropped voltage up a little higher each time i raise up the clocks.


Try applying the same Memory timings to the next strap. 1751-1875 etc. And check if you can raise the memory frequency in 12 or 13 MHz increments if not 25 MHz.

Trixx will allow higher Voltages and/or HIS Iturbo.

Edit: Just saw your next post.

Try a higher TDP and TDC.


----------



## sportsczy

Afterburner is a lot better than Trixx imo...

In any case, from my experience with my 390, you can run 1100/1600 without adding any power. Beyond those figures... If you want to increase the core clock, you need to increase the core voltage. To increase the memory clock, I had to increase the aux voltage. I'm running 1150/1750 at +31/+31 and it's completely stable. Since I've installed a Kraken on the card + the Gelid VRM kit, it runs cool enough that I have it on permanently. I could get it up to 1180/1750, but it wasn't worth it as the perf increase was negligible.

I get a score of 11157 on 3DMark with a i5-4690k running at 4.4 and the R9 390 with 1150/1750.

Oh and power is running at +20%


----------



## navjack27

Quote:


> Originally Posted by *mus1mus*
> 
> Try applying the same Memory timings to the next strap. 1751-1875 etc. And check if you can raise the memory frequency in 12 or 13 MHz increments if not 25 MHz.
> 
> Trixx will allow higher Voltages and/or HIS Iturbo.
> 
> Edit: Just saw your next post.
> 
> Try a higher TDP and TDC.


yeah i just tried it with an over 100mv core offset and it ran test 1 in firestrike then blackscreened and i had to reboot. so i'll change that. **** it, i'll go all out on the values in the bios, it seems like the card won't let itself get hurt.


----------



## Dundundata

wow over 16K firestrike, seems you've cracked it!


----------



## pillowsack

So you might think I'm crazy, but after 6 hours woohoo I'm content.

I returned my XFX 390, after I found out their mascot isn't a dog, but instead a dog-man. I'm not into that fursuit stuff. Anyways, I saw microcenter had 2 MSI 390s in stock. My heart was set. After an hour's worth of driving in the rain, they had magically sold both of them ($305).

After whining to my girlfriend a whole bunch she let me get the MSI 390X, they only had 1 left









I haven't done much to the card.... overclocking and bios wise yet...

The problem was that I had a universal GPU waterblock. I wanted the MSI specifically because the ram heatspreader also goes over the VRM2(in GPU-z). Great, but VRM1 has nothing to keep it happy since it's built into the stock MSI cooler. Oh wait... that beefy stock MSI cooler? Huh







I seriously need to not break things the day I buy them









Besides my awesome Purple LED's + red and black theme, it is now MSI motherboard and MSI GPU, which is awesome. Oh yeah, VRM1 never goes above 64C (and the fan doesn't even start, because the GPU core stays under 55C!). Gonna have to make a custom fan curve, since afterburner isn't doing that properly. The MSI logo does still light up on the card. This thing doesn't have nearly as much coil whine as that freakin XFX did either.

The only downside:
This damn card is just half an inch longer than the XFX and I can't have my two fans I had on the back of my 360 radiator. Whatever... I guess... Hopefully I don't have to cut any stupid hole in this corsair 540 case....



Now just for the CLU to arrive on Saturday to bring my delidded processor temps down another 10-15C so I can push my 4690K to 4.8Ghz and call it a day.


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *pillowsack*
> 
> So you might think I'm crazy, but after 6 hours woohoo I'm content.
> 
> I returned my XFX 390, after I found out their mascot isn't a dog, but instead a dog-man. I'm not into that fursuit stuff. Anyways, I saw microcenter had 2 MSI 390s in stock. My heart was set. After an hour's worth of driving in the rain, they had magically sold both of them ($305).
> 
> After whining to my girlfriend a whole bunch she let me get the MSI 390X, they only had 1 left
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't done much to the card.... overclocking and bios wise yet...
> 
> The problem was that I had a universal GPU waterblock. I wanted the MSI specifically because the ram heatspreader also goes over the VRM2(in GPU-z). Great, but VRM1 has nothing to keep it happy since it's built into the stock MSI cooler. Oh wait... that beefy stock MSI cooler? Huh
> 
> 
> 
> 
> 
> 
> 
> I seriously need to not break things the day I buy them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Besides my awesome Purple LED's + red and black theme, it is now MSI motherboard and MSI GPU, which is awesome. Oh yeah, VRM1 never goes above 64C(and the fan doesn't even start because the GPU core stays under 55C!). Gonna have to make a custom fan curve, since afterburner isn't doing that properly. The MSI logo does still light up on the card. This thing does not nearly have as much coil whine as that freakin XFX had either.
> 
> The only downside:
> This damn card is just half an inch longer than the XFX and I can't have my two fans I had on the back of my 360 radiator. Whatever... I guess... Hopefully I don't have to cut any stupid hole in this corsair 540 case....
> 
> 
> 
> Now just for the CLU to arrive on Saturday to bring my delidded processor temps down another 10-15C so I can push my 4690K to 4.8Ghz and call it a day.






Awesome mod. Now that's what I'm talking about. Very nice!!! Only way to beat that is with a full block. Nice work.


----------



## navjack27

yeah dude, mr dogman, lookin awesome


----------



## navjack27

okay stable, no artifacts
http://www.3dmark.com/fs/6910347

1175/1750
100% power limit
125mv offset
1000mv vddci

SCORE
13102 with AMD Radeon R9 390X(1x) and Intel Core i7-4790S Processor
Graphics Score 16143
Physics Score 11424
Combined Score 5977

error on the core temp, highest was like 65-68 or something


----------



## ZealotKi11er

Quote:


> Originally Posted by *navjack27*
> 
> okay stable, no artifacts
> http://www.3dmark.com/fs/6910347
> 
> 1175/1750
> 100% power limit
> 125mv offset
> 1000mv vddci
> 
> SCORE
> 13102 with AMD Radeon R9 390X(1x) and Intel Core i7-4790S Processor
> Graphics Score 16143
> Physics Score 11424
> Combined Score 5977
> 
> error on the core temp, highest was like 65-68 or something


Have you changed Tessellation settings at all?


----------



## jdorje

Quote:


> Originally Posted by *pillowsack*
> 
> So you might think I'm crazy, but after 6 hours woohoo I'm content.


Hilarious mod.

You do that with your hands or a...saw?

Was there liquid inside the heat pipes?

What's cooling the vrm at the rear of the card?


----------



## navjack27

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have you changed Tessellation settings at all?


no, and you're like the 10th person to ask me lol. it wouldn't show up as valid otherwise.

EDIT: memory timing mod and tdc tdp and powerlimit mods in bios with these


----------



## ZealotKi11er

Quote:


> Originally Posted by *navjack27*
> 
> no, and you're like the 10th person to ask me lol. it wouldn't show up as valid otherwise.
> 
> EDIT: memory timing mod and tdc tdp and powerlimit mods in bios with these


What do you score without the mods?


----------



## navjack27

i'm pretty sure this is the result i had before i did stuff to my card. same radeonpro settings tho

http://www.3dmark.com/fs/6803726

and an older one at my previous 24/7 overclock

http://www.3dmark.com/fs/6559582


----------



## pillowsack

Quote:


> Originally Posted by *navjack27*
> 
> i'm pretty sure this is the result i had before i did stuff to my card. same radeonpro settings tho
> 
> http://www.3dmark.com/fs/6803726
> 
> and an older one at my previous 24/7 overclock
> 
> http://www.3dmark.com/fs/6559582


Does our card have a bios switch? I can't see it, and am way too lazy to figure it out. I want to try your bios out but it's like eehhhhh where's the switch at...

What's your current stable 24/7 overclock now? I'm talking 1+ hour gaming sessions in BF4 or anything else demanding


----------



## navjack27

no bios switch. only try my bios if u know u have hynix memory and the SAME EXACT CARD... nothing in my bios will **** up ur card as long as its the same memory chips. everything else isn't modded, besides the tdp tdc n such. voltages are the same. any overclocking u will do will come from u and u alone.


----------



## Joe88

Quote:


> Originally Posted by *battleaxe*
> 
> Nice job. Covering those chokes will shut them right up. I will be shocked if your card exhibits any coil whine at all like that. I've shut up several cards now doing the same thing.


I have the same card and can still get coil whine
during loading screens with vsync off, goes up to like 4500 fps and the coil whine starts


----------



## navjack27

to anyone who wants this done. i'll mod ur bios files for ya. ask for either the memory timing mod or memory timing and TDC/TDP/PL mods along with it for possible better overclocking headroom.

EDIT: is coil whine that bad? i get it like a mofo but as far as i understand it, its not a bad thing its just an annoying thing. it'll happen with most any card over 1000fps really. got it with my gtx 660s earlier this year and i get it with this card. i think i heard of people using hot glue or something to fix it before.


----------



## jdorje

I use RTSS to frame cap at 144 fps. Some games (benchmarks) need a custom profile though. At higher fps coil whine is immediate though minor.
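A frame cap (whether via RTSS or the driver's frame-rate target) works by measuring how long each frame took and sleeping out the rest of that frame's time budget. A minimal sketch of the idea in Python, using the 144 fps target from this post; the no-op render stub is obviously a stand-in:

```python
import time

TARGET_FPS = 144
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~6.94 ms per frame at 144 fps

def limited_loop(render_frame, n_frames):
    """Run n_frames, sleeping away whatever is left of each frame's budget."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

start = time.perf_counter()
limited_loop(lambda: None, 72)  # 72 frames at 144 fps should take about 0.5 s
wall = time.perf_counter() - start
print(round(wall, 1))
```

Real limiters use higher-resolution waits than `time.sleep`, but the pacing logic is the same; that is also why capping below the coil-whine threshold silences it — frames simply never come fast enough.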


----------



## kizwan

Quote:


> Originally Posted by *jdorje*
> 
> I use rss to frame cap at 144 fps. Some games (benchmarks) need a custom profile though. At higher fps coil whine is immediate though minor.


Why didn't you use Frame Rate Target Control?


----------



## jdorje

Quote:


> Originally Posted by *kizwan*
> 
> Why didn't use Frame Rate Target Control?


In the Crimson controls? I haven't experimented with it yet and have read some reports of it being buggy (like so much in crimson...).


----------



## navjack27

o YEAHHHH crimson is so buggy and bad

thats why i use it right now and have no problems at all


----------



## Carniflex

Quote:


> Originally Posted by *pillowsack*
> 
> So you might think I'm crazy, but after 6 hours woohoo I'm content.
> 
> I returned my XFX 390, after I found out their mascot isn't a dog, but instead a dog-man. I'm not into that fursuit stuff. Anyways, I saw microcenter had 2 MSI 390s in stock. My heart was set. After an hours worth of driving in the rain, they had magically sold both of them($305).
> 
> After whining to my girlfriend a whole bunch she let me get the MSI 390X, they only had 1 left
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't done much to the card.... overclocking and bios wise yet...
> 
> The problem was that I had a universal GPU waterblock. I wanted the MSI specifically because the ram heatspreader also goes over the VRM2(in GPU-z). Great, but VRM1 has nothing to keep it happy since it's built into the stock MSI cooler. Oh wait... that beefy stock MSI cooler? Huh
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I seriously need to not break things the day I buy them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Besides my awesome Purple LED's + red and black theme, it is now MSI motherboard and MSI GPU, which is awesome. Oh yeah, VRM1 never goes above 64C(and the fan doesn't even start because the GPU core stays under 55C!). Gonna have to make a custom fan curve, since afterburner isn't doing that properly. The MSI logo does still light up on the card. This thing does not nearly have as much coil whine as that freakin XFX had either.
> 
> The only downside:
> This damn card is just half an inch longer than the XFX and I can't have my two fans I had on the back of my 360 radiator. Whatever... I guess... Hopefully I don't have to cut any stupid hole in this corsair 540 case....
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now just for the CLU to arrive on Saturday to bring my delidded processor temps down another 10-15C so I can push my 4690K to 4.8Ghz and call it a day.


Congrats

As far as half an inch and fans on radiator go - there exist also "slim" fans about half an inch thinner than the normal ones. Might be worth a try.


----------



## jdorje

The original crimson tried to kill my 280. I'm using one of the betas instead but it refuses to show my freesync range the way the non-beta did.

I'd like to overclock with profiles in crimson but it only uses percentages for everything, not numbers. Is there any way to toggle this?

And I have read that the fps control has some issues. You recommend using it though?


----------



## navjack27

only go into the crimson control panel rarely

use msi afterburner or trixx or any 3rd party overclocking software


----------



## pillowsack

Looks like I have 1150/1750 stable after a good run on the ole FurMark. VRM never exceeded 76C this time, but hey, I'll take that over panicking and closing FurMark at 99C like the XFX made me do. Core hit 65C with my water pump running on low. Updated Afterburner to 4.2.0. maaan, that let me do the fan profile (fan is not spinning unless core is above 40C.)

I downloaded that HIS iTurbo app, holy crap.... Who would ever want to shoot 400mv+ into their beloved 390????? I could only imagine the VRMs just going POP unless they're at sub-zero temps....

So my day one warranty voided 390X is ok. I'd rather have this cut in half 390X with reasonable temps at 1150/1750, than a XFX 390 at 1100/1750 with insanely high VRM temps.









On another note, the MSI gaming app is really buggy and likes to make my core and memory run at maximum speed even at idle. It has this really cool LED breathing effect though. I mean I still have the damn LED MSI logo on the card...

Does anyone know how I could make it glow/breathe without that darn app?


----------



## Spartoi

I've been playing The Witcher: Enhanced Edition and there isn't a V-Sync option. When I had a GTX 980, I could use Nvidia control panel to enable V-Sync for the game but I can't seem to do that in Radeon Settings. I have Frame Rate Target Control set to 60FPS and Wait For Vertical Sync set to "Always On" but the frame rate still fluctuates above 60FPS.

Any suggestions?


----------



## Dundundata

Quote:


> Originally Posted by *Spartoi*
> 
> I've been playing The Witcher: Enhanced Edition and there isn't a V-Sync option. When I had a GTX 980, I could use Nvidia control panel to enable V-Sync for the game but I can't seem to do that in Radeon Settings. I have Frame Rate Target Control set to 60FPS and Wait For Vertical Sync set to "Always On" but the frame rate still fluctuates above 60FPS.
> 
> Any suggestions?


Use d3doverrider, u can enable vsync and triple buffering per game. I used it for witcher 1 and deadspace


----------



## Jaffi

I recently installed a 390X in my system, running together with an i5 2500K @4.5 GHz, 8 gigs of ram, game is installed on a SSD. PSU is a "be quiet! Straight Power E9-CM 580W".
Playing GTA 5 I noticed quite a few drops in both GPU clock and load. When these occur, CPU load mostly is >90%, so could this be a CPU limit? I never had such a powerful graphics card hence did I never witness a CPU limit, my old cards were always running at maximum clock speeds.
So I would like to confirm it's this instead of something else. Of course vsync is turned off, temps are fine, also maxed out powerlimit, but that didn't help.
I also think that bumping MSAA to x4 and thus reducing framerate more, the drops became less frequent. Maybe another sign of CPU limit.


----------



## Tivan

Quote:


> Originally Posted by *Jaffi*
> 
> I recently installed a 390X in my system, running together with an i5 2500K @4.5 GHz, 8 gigs of ram, game is installed on a SSD. PSU is a "be quiet! Straight Power E9-CM 580W".
> Playing GTA 5 I noticed quite a few drops in both GPU clock and load. When these occur, CPU load mostly is >90%, so could this be a CPU limit? I never had such a powerful graphics card hence did I never witness a CPU limit, my old cards were always running at maximum clock speeds.
> So I would like to confirm it's this instead of something else. Of course vsync is turned off, temps are fine, also maxed out powerlimit, but that didn't help.
> I also think that bumping MSAA to x4 and thus reducing framerate more, the drops became less frequent. Maybe another sign of CPU limit.


Probably that, and AMD GPU drivers are still a little lacking for DX11 (with regard to utilizing DX11 multi-core CPU features; it's assumed they'll fix that sometime, now that they've dropped support for pre-GCN cards). A stronger CPU would definitely help alleviate the issue, though.


----------



## Jaffi

Is my current PSU (http://www.bequiet.com/en/powersupply/282) sufficient to power the following build:

* Sapphire 390X Nitro
* i5 2500K @4.5 GHz
* 2x Samsung 850 SSD
* 1x HDD 1TB

Also, if it was not, how would I notice? Would the PC crash or would it just run with less power?
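For a rough sanity check on a question like this, you can just sum nameplate draws against the PSU rating. The figures below are approximate public board-power/TDP numbers for these parts, not measurements, so treat the result as a ballpark only:

```python
# Back-of-envelope PSU budget for the build in question.
# Draw figures are rough public board-power / TDP numbers, not measurements.
draws_watts = {
    "R9 390X (board power)": 275,
    "i5 2500K @ 4.5 GHz (TDP + OC margin)": 130,
    "motherboard + RAM": 50,
    "2x SSD + 1x HDD": 20,
    "fans, USB, misc": 25,
}

psu_rating = 580  # be quiet! Straight Power E9-CM 580W
total = sum(draws_watts.values())
headroom = psu_rating - total
print(total, headroom)  # 500 W estimated peak, 80 W of headroom
```

As for how an undersized PSU shows itself: it doesn't quietly "run with less power" — under load you'd typically see crashes, hard shutoffs, or the over-current protection tripping.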


----------



## MtlSpice

aight well since its christmas an all i didnt have time to sit down an OC my CPU .... i did some Futuremark benchmarks tho, here are the results ... let me know what you think.

( 2 GPU - 1440p - CPU 4.0 GHz ) http://www.3dmark.com/fs/6907353

( 1 GPU - 1440p - CPU 4.0 GHz ) http://www.3dmark.com/fs/6906630

( 2 GPU - 1080p - CPU 4.0 GHz ) http://www.3dmark.com/fs/6908006

( 1 GPU - 1080p - CPU 4.0 GHz ) http://www.3dmark.com/fs/6907826

ill be posting more benchmark when i get to OC my CPU ... Merry Christmas everyone


----------



## bichael

Quote:


> Originally Posted by *Jaffi*
> 
> Is my current PSU (http://www.bequiet.com/en/powersupply/282) sufficient to power the following build:
> 
> * Sapphire 390X Nitro
> * i5 2500K @4.5 GHz
> * 2x Samsung 850 SSD
> * 1x HDD 1TB
> 
> Also, if it was not, how would I notice? Would the PC crash or would it just run with less power?


Yes and then some I would say!

Not too familiar with four-rail setups, but I guess you could run each PCI-e cable from a different rail just in case, though in theory I'm not sure that should be necessary.


----------



## adondecoy

my msi r9 390 hits 94c when playing cod advanced warfare even with 100% fan speed, is it normal?


----------



## kizwan

Quote:


> Originally Posted by *Jaffi*
> 
> I recently installed a 390X in my system, running together with an i5 2500K @4.5 GHz, 8 gigs of ram, game is installed on a SSD. PSU is a "be quiet! Straight Power E9-CM 580W".
> Playing GTA 5 I noticed quite a few drops in both GPU clock and load. When these occur, CPU load mostly is >90%, so could this be a CPU limit? I never had such a powerful graphics card hence did I never witness a CPU limit, my old cards were always running at maximum clock speeds.
> So I would like to confirm it's this instead of something else. Of course vsync is turned off, temps are fine, also maxed out powerlimit, but that didn't help.
> I also think that bumping MSAA to x4 and thus reducing framerate more, the drops became less frequent. Maybe another sign of CPU limit.


It is hard to believe your CPU can't handle single 390X. GPU clock drops now and then are normal with single card. If you're playing at 1080p, enable virtual resolution & play at 1440p. 390X can handle 1440p & CPU usage should drop.

BTW, if I remember correctly MSAA doesn't affect FPS. I should test it again. Currently MSAA at x8.
*EDIT:* Oops! My bad. MSAA does drop FPS. With MSAA OFF, GTAV happily runs above 80 FPS, while with MSAA x8 it's 60 to 80 FPS.
Quote:


> Originally Posted by *Jaffi*
> 
> Is my current PSU (http://www.bequiet.com/en/powersupply/282) sufficient to power the following build:
> 
> * Sapphire 390X Nitro
> * i5 2500K @4.5 GHz
> * 2x Samsung 850 SSD
> * 1x HDD 1TB
> 
> Also, if it was not, how would I notice? Would the PC crash or would it just run with less power?


Make sure you connect each PCIe connectors to separate rails.


----------



## pillowsack

Man that was spooky. I had to do a system restore... It was probably that damn MSI gaming app that controlled the LED on the card...

The computer would get to the windows log in screen and then just go black. I'm thinking "well crap there goes a $400 card". Did the restore where it removed afterburner and the stupid bugged out app.


----------



## jdorje

Gta v will drop below 60 fps (16 ms frame time) on a 4.5 ghz sandy I think; however it should be pretty rare and only 20-25 ms frame time tops.

Gta v also does stuff in the background (loading map areas?) that can cause periodic hiccups, especially right after opening the game.

Ah found it.

[GTA V CPU benchmark chart, image no longer available]
That's for a 4690k. A 2500k at 4.5 ghz is probably closer to the 3.7 ghz model.

I dropped my aux voltage to -100 to see what would happen.

What happened is the system immediately crashed. And the profile was saved in afterburner to load automatically so I was boned.

Getting to command line let me set startup to legacy via some command line I forget. This in turn lets me get into safe mode on boot with f8 - a worthy change. From there I edited the afterburner config file to reset the voltage.
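Afterburner profiles are plain INI-style .cfg files, which is why a safe-mode text edit like the one above can rescue the system. A sketch of that reset using Python's `configparser` — note the section and key names here (`Startup`, `AuxVoltageOffset`, `CoreClock`) are illustrative guesses, not Afterburner's documented schema, so check your actual file first:

```python
# Sketch of the "edit the config from safe mode" rescue described above.
# Section/key names are hypothetical stand-ins for whatever your .cfg uses.
import configparser
import io

cfg_text = """[Startup]
Format = 2
AuxVoltageOffset = -100
CoreClock = 1100000
"""

cfg = configparser.ConfigParser()
cfg.optionxform = str  # keep key capitalization as-is when rewriting
cfg.read_string(cfg_text)

# Reset the offending voltage offset so the card comes up at stock voltage.
cfg["Startup"]["AuxVoltageOffset"] = "0"

out = io.StringIO()
cfg.write(out)
fixed = out.getvalue()
print("AuxVoltageOffset = 0" in fixed)  # True
```

In practice you would read and write the real file path instead of a string, but the point stands: because it is plain text, any editor reachable from safe mode can undo a bad saved profile.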


----------



## Dundundata

Quote:


> Originally Posted by *adondecoy*
> 
> my msi r9 390 hits 94c when playing cod advanced warfare even with 100% fan speed, is it normal?


No. Does your rig have good airflow? That is much too high. For comparison my MSI hits 70ish under load overclocked with 60-65% fan speed on the card. If your case has good airflow and your ambient temps aren't real high it could be a TIM issue with the card. Are you using the latest drivers?


----------



## Dundundata

Quote:


> Originally Posted by *pillowsack*
> 
> Man that was spooky. I had to do a system restore... It was probably that damn MSI gaming app that controlled the LED on the card...
> 
> The computer would get to the windows log in screen and then just go black. I'm thinking "well crap there goes a $400 card". Did the restore where it removed afterburner and the stupid bugged out app.


Yeah i never bothered installing that app. I mean the led is on it just don't do anything fancy


----------



## adondecoy

Quote:


> Originally Posted by *Dundundata*
> 
> No. Does your rig have good airflow? That is much too high. For comparison my MSI hits 70ish under load overclocked with 60-65% fan speed on the card. If your case has good airflow and your ambient temps aren't real high it could be a TIM issue with the card. Are you using the latest drivers?


my case has good airflow, I have corsair air 240 with full speed case fan. Ambient temp is 26-27c and yes I'm using the latest amd crimson driver
So what do you think the problem is? Can I RMA this card because of it?


----------



## maximsilentfoot

Hi guys. I've been using an Asus 390x DCUii, running core/mem at stock. I've been randomly getting these weird artifacts which pop on and off the screen every once in a while, stay on for around half a second, and disappear. Here is a sample in GTA V running at 1080p windowed borderless with most settings between high and ultra. Despite changing drivers (via DDU) they haven't gone away, and they show up in GTA V, Dragon Age Inquisition and Dead Space 3. Any ideas?


----------



## jdorje

Quote:


> Originally Posted by *maximsilentfoot*
> 
> Hi guys. I've been using an Asus 390x DCUii, running core/mem on stock. I've been randomly getting these weird artfacts which pop on and off the screen every once in awhile, stays on for around half a second and disappears. Here is a sample in GTA V running at 1080p windowed borderless with most settings between high-ultra. Despite changing drivers (via DDU) they haven't gone away and shows up in GTA V, Dragon Age Inquisition and Dead Space 3. Any ideas?


That style of artifact is common when your core clock is too high. Lower your overclock. If it happens at stock it'd be grounds for rma.

To confirm, go into afterburner and lower your clock by about 50 mhz. Artifacts should go away.
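The "drop ~50 MHz and retest" advice generalizes: once you have a pass/fail stability check, finding the highest stable clock is a binary search over clock speeds. A sketch, with a stand-in `is_stable` predicate in place of an actual stress-test-and-watch-for-artifacts run:

```python
def max_stable_clock(is_stable, lo=1000, hi=1200, step=5):
    """Binary-search the highest clock (MHz) that passes is_stable().
    is_stable is a stand-in for a real stress test + artifact check;
    lo is assumed stable (e.g. the stock clock)."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2 // step * step  # stay on multiples of step
        if is_stable(mid):
            best, lo = mid, mid + step
        else:
            hi = mid - step
    return best

# Pretend this particular chip artifacts above 1150 MHz.
print(max_stable_clock(lambda mhz: mhz <= 1150))  # 1150
```

Each probe halves the remaining range, so a 200 MHz window in 5 MHz steps needs only a handful of stress-test runs instead of forty.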


----------



## adondecoy

Quote:


> Originally Posted by *Dundundata*
> 
> No. Does your rig have good airflow? That is much too high. For comparison my MSI hits 70ish under load overclocked with 60-65% fan speed on the card. If your case has good airflow and your ambient temps aren't real high it could be a TIM issue with the card. Are you using the latest drivers?


I also tested it with furmark but it ended again at 94C after 3 minutes of stress test.
The card is at stock speed and I didn't even touch any setting but fan control.
This is frustrating. I will post my temps again tomorrow. I will try opening the case panel and its windowed side panel too.


----------



## tims390x

Could anybody assist me? I am running 2x 390X Gigabyte in xfire. When I open up CSGO, the cards go HAM, both running at 70C really hard, as if they are trying to run Crysis 3 maxed out.

Now I know the Gigabyte cards are howlers, but surely the cards don't need to be running so hard just for csgo?

I have tried Radeon Software frame limiting; Vsync makes the game's mouse input laggy... I mean, first time in 5 years that I am on a desktop again with such high tier GPUs. Are there special CrossFire settings etc?

Appreciate any help


----------



## jdorje

Quote:


> Originally Posted by *adondecoy*
> 
> I also tested it with furmark but it ended again in 94c after 3 minutes of stress test


How much cooler is it with the case side panel removed?
Quote:


> Originally Posted by *tims390x*
> 
> Could anybody assist me - so I am running 2x 390X Gigabyte in xfire - When I open up CSGO, the cards go HAM, both running at 70'C really hard, as if they are trying to run Cry3 maxed out.


How does it work in other games?

What if you disable crossfire for cs:go?


----------



## tims390x

Quote:


> Originally Posted by *jdorje*
> 
> How much cooler is it with the case side panel removed?
> How does it work in other games?
> 
> What if you disable crossfire for cs:go?


Using 1 GPU or using CrossFire - the card goes fullout - it runs the game beautifully, but for example -

Fallout 4 Ultra after 3 hours of gameplay my temperature sits @ around 65

CSGO sitting in Menu, cards are at 65, once I am in a game they go up to like 75-80'C.

Payday 2 maxed out both run normally 'lower at around 60-65'.


----------



## pillowsack

Quote:


> Originally Posted by *tims390x*
> 
> Using 1 GPU or using CrossFire - the card goes fullout - it runs the game beautifully, but for example -
> 
> Fallout 4 Ultra after 3 hours of gameplay my temperature sits @ around 65
> 
> CSGO sitting in Menu, cards are at 65, once I am in a game they go up to like 75-80'C.
> 
> Payday 2 maxed out both run normally 'lower at around 60-65'.
> 
> Will test some other games now just to see if the same thing occurs, but I think it's related to FPS, or something >.<


CSGO just likes to ride GPUs even at the menu. There's no real explanation but it's always done that to me, even on my 7950s. Your best bet is to just disable crossfire bro. All Steam games do that, even TF2...

So I had to just reinstall windows. Don't install that MSI gaming app, no matter how badly you want to play with your cards LED


----------



## battleaxe

Quote:


> Originally Posted by *pillowsack*
> 
> CSGO just likes to ride GPU's even at the menu. There's no real explanation but it's always done that to me, even on my 7950's. You're best bet is to just disable crossfire bro. All steams do that, even TF2...
> 
> So I had to just reinstall windows. Don't install that MSI gaming app, no matter how badly you want to play with your cards LED


Ouch... that's rough.


----------



## pillowsack

Quote:


> Originally Posted by *battleaxe*
> 
> Ouch... that's rough.


Eh, it's like I install windows every 1-3 months anyways.


----------



## battleaxe

Quote:


> Originally Posted by *pillowsack*
> 
> Eh, it's like I install windows every 1-3 months anyways.


Yeah, me too. Takes a few hours to get everything nailed down the way it was though. I just did it myself last week. I always love how everything runs after the fact though, so much smoother and everything.

I love your handle. Sounds like some kinda weird fetish or something. lol


----------



## jodybdesigns

Quote:


> Originally Posted by *pillowsack*
> 
> Eh, it's like I install windows every 1-3 months anyways.


I wish. Owning 13 servers filled to the brim with websites, I have 6TB filled up on local cloud storage, almost 2TB filled up on my machine, and almost 1TB filled up on a second machine. I cannot afford to lose any of my registry and appdata; backing up those core files is always a huge headache, never restores cleanly, and you still end up reinstalling everything. Downtime costs big money....


----------



## Spartoi

Quote:


> Originally Posted by *Dundundata*
> 
> Use d3doverrider, u can enable vsync and triple buffering per game. I used it for witcher 1 and deadspace


Thanks. This didn't work for me, but I was able to figure out why the configuration wasn't applying in Radeon Settings. I used the scan option to scan my library for games, and The Witcher EE was among them, so I edited those settings to enable V-Sync. What I didn't know before was that a different file was being edited than the one I was using to launch the game. Radeon Settings was modifying the witcher.exe file, but I wasn't launching the game with that. Instead, I would launch the game through Steam, which uses the launcher.exe file. Once I edited that file in Radeon Settings to enable V-Sync, it started working.

EDIT:

Now it doesn't work again.









EDIT 2:

It's working again. It seems I have to enable V-Sync and set Frame Rate Control to 60FPS. I thought enabling V-Sync alone would match my TV's refresh rate (60Hz) but I guess not. I'm happy it's working again though.


----------



## Dundundata

Quote:


> Originally Posted by *Spartoi*
> 
> Thanks. This didn't work for me I was able to figure out why the configuration wasn't applying in Radeon Settings. I used the scan option to scan my library for games and The Witcher EE was among them so I edited those settings to enable V-Sync. What I didn't know before was that a different file was being edited than the one I was using to launch the game. The Radeon Settings was modifiying the witcher.exe file but I wasn't launching the game with that. Instead, I would launch the game through Steam which uses the launcher.exe file. Once I edited that file in Radeon Settings to enable V-Sync, it started working.


you might need Rivatuner for D3D to work, not sure...it comes with Afterburner. Glad you got it working.


----------



## Darkeylel

Quote:


> Originally Posted by *tims390x*
> 
> Using 1 GPU or using CrossFire - the card goes fullout - it runs the game beautifully, but for example -
> 
> Fallout 4 Ultra after 3 hours of gameplay my temperature sits @ around 65
> 
> CSGO sitting in Menu, cards are at 65, once I am in a game they go up to like 75-80'C.
> 
> Payday 2 maxed out both run normally 'lower at around 60-65'.
> 
> Will test some other games now just to see if the same thing occurs, but I think it's related to FPS, or something >.<


Limit your FPS at the menu. I can't remember the command for the life of me, but I had the same problem; once I limited my FPS to 60 it was all good.


----------



## dandy-

I'm having an issue with my r9 390 where the GPU downclocks itself due to low usage in csgo, but right after the downclock -> usage spikes to 99% -> so it starts to clock back up again, resulting in a slight stutter. Which is kind of annoying.

I'm running windows 10. I've tried radeonpro, but the "always use the highest performance clocks while gaming" option can't be ticked, nor can anything in that overdrive tab. The current crimson overdrive doesn't allow me to set a fixed clock rate, so... Anyone got any tips?


----------



## kizwan

Quote:


> Originally Posted by *dandy-*
> 
> Im having an issue with my r9 390 where the GPU downclocks itself due to low usage in csgo, but right after the downclock -> usage spikes to 99% -> so it starts to clock back up again resulting in a slight stutter. Which is kind of anoying.
> 
> Im running windows 10, ive tried radeonpro, but the "always use the highest performance clocks while gaming" cant be ticked, nor can anything in that overdrive-tab. The current crimson overdrive doesnt allow me to set a fixed clockrate, so... Anyone got any tips?


Use MSI AB & enable Unofficial overclocking without PowerPlay. Just overclock +1MHz either core or memory for this to kick in & run at max clocks all the time. After playing games, just press reset button in MSI AB to allow the clocks to downclock again.


----------



## Hardstyler3

I am new to the club

bought a Sapphire R9 390 3 days ago + complete new parts

installed windows 10 clean and crimson 15.12

made power target +50%

set core to 1175mhz and memory to 1600mhz perfectly stable

http://www.3dmark.com/3dm/9879352?

i am happy









ahh temps always stay around 70 degrees very good


----------



## Dundundata

Very nice welcome!


----------



## MtlSpice

Yup ... they are similar to the R9 290x: with the stock cooler they would max out at 95 ... but with a well ventilated case the temp can stay a little lower ..... i have 2 in crossfire ... the top one will go to 83 while the bottom one stays at 75


----------



## derkington

If you are UK based and on the market for a 390x there is currently a deal on over at overclockers for a Sapphire 390x £284.99.
https://www.overclockers.co.uk/sapphire-radeon-r9-nitro-390x-8192mb-gddr5-pci-express-graphics-card-with-backplate-11241-04-20g-gx-376-sp.html


----------



## Joe88

well got sad news, my 390 has died

I was just using my desktop like I usually do just browsing the web and all of the sudden the whole computer just shuts off, I tried to turn it back on but nothing worked, no lights, no sound, just nothing. I feared that the whole pc was fried or something.
So tried to figure out what was wrong, got out my old psu (zumax/topower 550w) and hooked every thing up to it and still nothing. I tried the paper clip test on the tx750 and it was still working so now I removed the gpu (r9 390) and the pc booted on the old psu. So I got my old 4870 out and hooked it up on the old psu first and it posted and booted to windows, everything seemed to be fine, all drives and hardware still working.

Now I hooked everything up to the TX750 with the 4870 installed and it booted up fine, but when I replace the 4870 with the 390 I get the same nothing happens when I push the power button. The 4870 uses a lot more idle power than the 390, so if the 4870 worked fine on the TX750 then the 390 should work fine too at idle.

So next I tried to use 2 psu's, the tx750 to power everything else but the gfx card and the 550w to power the gfx card (using the paper clip trick), the 550w booted first with psu fan running, no difference from what I can detect on the gpu. Next I turned on the system and the gpu light lit up but no display and instead got 5 beeps.

So in short, neither psu can boot the 390. If you attempt to boot the system with it installed and connected to the psu, nothing happens; installed but not connected to the psu, the gpu led logo lights up and the system tries to boot.
With the 4870 installed in the same slot and using the same cables it boots fine on both psu's.

I had the card overclocked for a few days about 2 weeks ago, temps got too high for my liking (94C core, 101C vram) so I reverted it back to stock and temps dont go above 71C core, and 74C vram. I played tf2 for about 2 hours about 5 hours before all this happened but temps were fine and never went above 63C.
The only thing I did different was installed the MSI gaming app to change the card led to have a breathing effect (this was also right before I played tf2).

I tried moving to different pcie slots, installed in an 8x slot with everything including gpu on the tx750, it boots and the card led goes on but gets no signal and got 5 beeps meaning gpu error.

It seems like the psu is fine as the 4870 installed its powering everything fine so I might have to rma the gpu and of course this happened right outside the 30 day replace period from newegg.









I was running it at stock when this happened, pretty po'd as I spent $20 more to get the msi ver over the xfx


----------



## navjack27

i'm so confused. i re-installed my drivers... and i cannot get that 3dmark score again... its gone... i reapplied the radeonmod tweaks i think i had. the bios is the same... i can't break 14k gpu score


----------



## Jaffi

Quote:


> Originally Posted by *kizwan*
> 
> It is hard to believe your CPU can't handle single 390X. GPU clock drops now and then are normal with single card. If you're playing at 1080p, enable virtual resolution & play at 1440p. 390X can handle 1440p & CPU usage should drop.
> 
> BTW, if I remember correctly MSAA doesn't affect FPS. I should test it again. Currently MSAA at x8.
> *EDIT:* Oops! My bad. MSAA does drop FPS. With MSAA OFF, GTAV happily run above 80 FPS while MSAA x8, 60 to 80 FPS.
> Make sure you connect each PCIe connectors to separate rails.


Thanks man! I just connected the card to both rails instead of single one and I bumped MSAA from 2x to 4x. GPU drops are much less frequent and not as big anymore, just tiny jumps of around 20 MHz. Of course FPS dropped, now I am mostly seeing 50-70 fps, sometimes they drop to 40s in grassy areas, but that's normal I guess. I am just asking myself if it would be better to use either MSAA, VSR in driver or resolution scaling ingame. I think they all do basically the same so it doesn't matter?!


----------



## jodybdesigns

Lock your FPS a few FPS higher than the minimum. Smoooooth gaming. Turn off powerplay and disable ULPS. No more spikes


----------



## tangelo

Quote:


> Originally Posted by *jdorje*
> 
> That style of artifact is common when your core clock is too high. Lower your overclock. If it happens at stock it'd be grounds for rma.
> 
> To confirm, go into afterburner and lower your clock by about 50 mhz. Artifacts should go away.


Man, he said he is playing with stock clocks. I would just RMA the card.


----------



## navjack27

after some detective work on trying to recreate my high score... i'm sorry to say that i have to call it invalid at this point. same settings, same drivers, same bios, AND i cannot get that score OR even close to it again.

but i did notice something: before, i was asked if my tessellation had been modified, and i swore it wasn't. i mean, 3dmark calls ur score invalid if u change it... well, to test that, before i would change my values and then run the test... but nothing would actually change... the score was still high with 2x but it never warned me. i just kinda ignored that part thinking that radeonpro or something was wonky... i dunno. just dismissed it. well, i change it now and it warns me at the results page... did i somehow do something with how tessellation is processed? i have no idea what i did honestly. everything else runs as fast as before, but i guess some weird mishap with not using DDU when upgrading my drivers made something stick but not show as a setting or something. editing my signature now... so sad man, so sad.

i'd love for some discussion on this fluke score i got. its going to be eating at my brain for a while lol


----------



## Synntx

Grabbed me a Sapphire 390x Tri-X yesterday, along with a few other goodies such as a couple SSDs, and went home and slapped it all together!

Because I'm weird like that, I did a fresh install on windows after installing the new hardware and reset my BIOS to stock. Here is what I got with everything stock:

http://www.3dmark.com/fs/6951286

And here is what I got with overclock:

http://www.3dmark.com/fs/6953360

My STRIX 380 was running The Witcher 3 on ultra maxed-out settings at around 27-32fps; my 390x runs it at 57-60fps. I was also getting an average FPS of 40 in Valley benchmark on the 380, and get almost 80 with the 390x. Needless to say, I'm happy with my upgrade, and glad I didn't go the 390 route! It feels nice to finally have a high end card.

Now if only my crappy 8320e would cooperate better when I try to overclock it!! 4.7ghz stable with 1.588volts







Killin my physics score!!!


----------



## navjack27

nice @Synntx have fun with it man


----------



## Synntx

Quote:


> Originally Posted by *navjack27*
> 
> nice @Synntx have fun with it man


@navjack27 Thanks man. I'm a little disappointed though that my graphics score increased only ~500 from stock to 1175/1725. A bit disconcerting to say the least. Though I know my pathetic 8320e is my biggest drawback right now.

Yesterday I picked up 2 SSDs, the 390x, and a 2x8GB kit of EVGA DDR3-1866 RAM, and I'm actually considering returning the RAM and one SSD and picking up an 8350/8370 at this point. Previously I had two sets of these: http://www.newegg.com/Product/Product.aspx?Item=N82E16820220570

The new RAM didn't seem to make a difference. I'm ready to beat my CPU with a hammer because it's so crappy.


----------



## navjack27

any reason you don't want to go intel?


----------



## Synntx

Quote:


> Originally Posted by *navjack27*
> 
> any reason you don't want to go intel?


Other than not having the spread right now to pick up a new CPU/mobo combo? No, just that I've always been an AMD fan and want desperately for them to excel when DX12 becomes the norm. I guess I could return my 390x, RAM, and SSDs and grab an intel CPU/mobo......but I really like my 390x









Then again, my 380 would probably perform adequately with my current setup at 1080p if I make the intel jump.........


----------



## Levys

Quote:


> Originally Posted by *navjack27*
> 
> i'm so confused. i re-installed my drivers... and i cannot get that 3dmark score again... its gone... i reapplied the radeonmod tweaks i think i had. the bios is the same... i can't break 14k gpu score


I already thought that 16K score was just a fluke... otherwise me be very jelly. I only managed to get like a 15078 gpu score at 1225/1750.
I am looking into bios editing for memory timings etc., but I don't think it will get me another 1000 points. although it would be nice


----------



## tangelo

Quote:


> Originally Posted by *navjack27*
> 
> before i was asked if my tessellation has been modified, and i swore it wasn't i mean, 3dmark calls ur score invalid if u change it...


I've run Fire Strike with "optimized" tessellation settings and 3DMark said the result was valid.


----------



## navjack27

Thing is, this time and last time radeonpro and the crimson were set to use application settings... So I'm still really unsure how my score inflated. I posted screenshots of my settings and was able to repeat it, until I ran DDU and did a "fresh" driver install.


----------



## tangelo

Quote:


> Originally Posted by *jodybdesigns*
> 
> Lock your FPS a few FPS higher than the minimum. Smoooooth gaming. Turn off powerplay and disable ULPS. No more spikes


I thought ULPS only affects systems with crossfire?


----------



## ZealotKi11er

Quote:


> Originally Posted by *navjack27*
> 
> Thing is. This time and last time radeonpro and the crimson was set to use application settings... So I'm still really unsure how my score inflated. I posted screenshots of my settings and was able to repeat it, until I ran ddu and did a "fresh" driver install.


Your score looks really high for only 1175MHz, that is why I was a bit skeptical. People have to push 1300MHz+ on the core to get over 15.5K even with memory mods. Tessellation was probably your problem. You know software is not always bug-free. To be 100% sure, do a clean install of Windows, set your OC, change nothing else, and see what you score.


----------



## Synntx

Here's my results after all overclocking is done.

http://www.3dmark.com/fs/6967071


----------



## fat4l

Quote:


> Originally Posted by *navjack27*
> 
> Thing is. This time and last time radeonpro and the crimson was set to use application settings... So I'm still really unsure how my score inflated. I posted screenshots of my settings and was able to repeat it, until I ran ddu and did a "fresh" driver install.


Yeah that's why I was confused. I "only" got a 15000+ graphics score on my card with cf disabled and I was like, how come you are getting 1k+ more?







Thus the tesselation question... But have no idea how it happened.


----------



## fat4l

For those who are interested in performance 290X CF vs Fury X, here it is:

*2x 290X* vs *Fury X*
Both setups clocked high: 1200/1700 vs 1180/590 MHz.
2x 290X wins by ~40%

FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS
FSU: http://www.3dmark.com/compare/fs/6812511/fs/6592177# = 37% difference in GS
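Since posts in this thread keep quoting "% difference in GS" figures, here is a quick sketch of how that kind of number falls out of two graphics scores. The score values below are placeholders for illustration, not the actual results from the linked runs:

```python
# How a "% difference in GS" figure is derived from two graphics scores.
# The numbers below are hypothetical placeholders, not the linked runs.
def percent_faster(score_a: float, score_b: float) -> float:
    """Return how much faster score_a is than score_b, in percent."""
    return (score_a / score_b - 1.0) * 100.0

cf_290x_gs = 29000   # hypothetical 2x 290X graphics score
fury_x_gs = 19700    # hypothetical Fury X graphics score

print(f"{percent_faster(cf_290x_gs, fury_x_gs):.0f}% difference in GS")
```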


----------



## navjack27

Quote:


> Originally Posted by *fat4l*
> 
> Yeah that's why I was confused. I "only" got 15000+ graphics score on my card with cf disabled and I was like, how come you are getting 1k+ ?
> 
> 
> 
> 
> 
> 
> 
> Thus the tesselation question... But have no idea how it happened.


But I mess with Tess now and I still don't get close to that score. I tried 2x, AMD optimized... not Off, cuz I know you don't see some geometry with it off. So the Tess values can't actually be the reason


----------



## Jaffi

Quote:


> Originally Posted by *Synntx*
> 
> Grabbed me a Sapphire 390x Tri-X yesterday, along with a few other goodies such as a couple SSDs, and went home and slapped it all together!
> 
> Because I'm weird like that, I did a fresh install on windows after installing the new hardware and reset my BIOS to stock. Here is what I got with everything stock:
> 
> http://www.3dmark.com/fs/6951286
> 
> And here is what I got with overclock:
> 
> http://www.3dmark.com/fs/6953360
> 
> My STRIX 380 was running the Witcher 3 on ultra maxed out setting at around 27-32fps, my 390x runs it at 57-60fps. I was also getting an average FPS of 40 in Valley benchmark on the 380, and get almost 80 with the 390x. Needless to say, I'm happy with my upgrade, and glad I didn't go the 390 route! It feels nice to finally have a high end card.
> 
> Now my if only my crappy 8320e would cooperate better when I try to overclock it!! 4.7ghz stable with 1.588volts
> 
> 
> 
> 
> 
> 
> 
> Killin my physics score!!!


Are you sure that first benchmark was @stock speeds? You were running 1140/1650 there, my 390X Nitro has 1080/1500 stock.


----------



## Synntx

Quote:


> Originally Posted by *Jaffi*
> 
> Are you sure that first benchmark was @stock speeds? You were running 1140/1650 there, my 390X Nitro has 1080/1500 stock.


Huh. Interdasting. I wonder if i uploaded the wrong screeny. I believe my tri-x is factory 1040/1500. I'll rerun the benchmark when i get home tonight. What kinda scores are you getting at stock? Don't suppose you'd like to part with your backplate?


----------



## Jaffi

Quote:


> Originally Posted by *Synntx*
> 
> Huh. Interdasting. I wonder if i uploaded the wrong screeny. I believe my tri-x is factory 1040/1500. I'll rerun the benchmark when i get home tonight. What kinda scores are you getting at stock? Don't suppose you'd like to part with your backplate?


Haha not really, it's the first card with backplate I ever got and we are best friends now







Here is a benchmark @stock, CPU is a 2500K @4.5 GHz: http://www.3dmark.com/3dm/9789390


----------



## Synntx

Quote:


> Originally Posted by *Jaffi*
> 
> Haha not really, it's the first card with backplate I ever got and we are best friends now
> 
> 
> 
> 
> 
> 
> 
> Here is a benchmark @stock, CPU is a 2500K @4.5 GHz: http://www.3dmark.com/3dm/9789390


Ok cool, looks like our systems benchmark relatively the same. I wonder how difficult it would be to fabricate a backplate for mine.


----------



## kizwan

Quote:


> Originally Posted by *navjack27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> Yeah that's why I was confused. I "only" got 15000+ graphics score on my card with cf disabled and I was like, how come you are getting 1k+ ?
> 
> 
> 
> 
> 
> 
> 
> Thus the tesselation question... But have no idea how it happened.
> 
> 
> 
> But mess with Tess now and I still don't get close to that score still. I tried 2x, AMD optimized... Not off cuz I know you don't see some geometry with it off, so actually the Tess values can't be the reason
Click to expand...

Did you mod your memory timings (1250 & above) with the magic hex yet? You know the timings start with something like this: *XX X1*. Just change that to *FF F1*, which is the magic hex I'm talking about.


----------



## Levys

Quote:


> Originally Posted by *kizwan*
> 
> Did you mod your memory timings (1250 & above) with magic hex yet? You know timings starts with something like this; *XX X1*. Just change that to *FF F1* which is the magic hex I'm talking about.


Tell me more pls, magic hex?
Can I find this info in another thread perhaps?


----------



## mus1mus

FF F1 is not magical in an absolute way.









You can substitute them with anything as long as you keep the semantics.

The truly amazing benching HEX is in the proper timings. But I'm not really sure if it would give a boost in gaming.


----------



## Levys

My best 3DMark score yet, and that's with an FX 8350. Possibly the best system score in the world (among similar systems):

http://www.3dmark.com/fs/6969961

That's without any radeonpro settings, just standard clean run.


----------



## Synntx

Quote:


> Originally Posted by *Levys*
> 
> My best 3D MARK score yet and that's with an FX 8350 possibly the best system score in the world ( similar systems )
> 
> http://www.3dmark.com/fs/6969961
> 
> That's without any radeonpro settings, just standard clean run.


Pretty nice run! How did you manage to get a 390 to break the 15k mark?? Better yet, what kinda voltage increase was required for 1225/1750???


----------



## navjack27

Quote:


> Originally Posted by *kizwan*
> 
> Did you mod your memory timings (1250 & above) with magic hex yet? You know timings starts with something like this; *XX X1*. Just change that to *FF F1* which is the magic hex I'm talking about.


I am using the stock bios with nothing changed except for the timings now. I'm done chasing unicorns lol. Changing the TDP and TDC made it unstable during BOINC and Folding@Home and csgo; csgo is embarrassing to crash during a mm game lol.

So yeah I used hex workshop to copy paste the timings from 1250 to everything higher up to 1750.

I'm back to just using a stable overclock of 1100/1750, PL +50, with no voltage change for my 24hr overclock, with power play disabled using AB. I'm not using radeonpro anymore cuz I found that the flip queue and LOD it sets are the wrong numbers. I'd change the values and then open radeonmod and regedit and see what it adds/changes to confirm. So I'm just manually changing stuff now. 15.12 crimson. Win7. I won't reinstall windows or upgrade anytime soon... there is just too much that works as it should for me to go and throw another wrench in the proverbial works.

I can do 1150/1750 or 1150/1625 with added voltage, and that is stable for the most part. I wonder if the issues are related to my psu, an EVGA 850W B2. It's semi-modular... I can use the hard-wired vga power connections, each one a 6 pin with a lil 2 pin thingy to make them 8 pin if need be... The modular side has a 6 pin and another 6 with the lil 2 pin on it, but they run from one cable. It appears to be a single 12v rail with 70a. I see in gpuz the card pulling quite a bit. And my CPU has all its power saving disabled in bios; maybe not a good mix?
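For anyone curious what the "copy the 1250 strap to everything higher" timing mod looks like in practice, here is a rough sketch. The offsets and strap size below are made-up placeholders: the real locations differ per BIOS and are normally found with a dedicated editor (e.g. HawaiiBiosReader), not guessed, and flashing a badly patched ROM can brick the card, so treat this as illustration only.

```python
# Hedged sketch of copying the 1250MHz memory-timing strap over the
# higher-frequency straps in a BIOS dump. All offsets/sizes here are
# hypothetical placeholders, NOT real Hawaii/Grenada ROM locations.
STRAP_SIZE = 0x30            # hypothetical: bytes per timing strap
STRAP_1250_OFFSET = 0x1000   # hypothetical: offset of the 1250MHz strap
HIGHER_STRAP_OFFSETS = [0x1030, 0x1060]  # hypothetical: higher straps

def copy_strap(rom: bytearray) -> bytearray:
    """Overwrite the higher-frequency straps with the 1250MHz timings."""
    strap = rom[STRAP_1250_OFFSET:STRAP_1250_OFFSET + STRAP_SIZE]
    for off in HIGHER_STRAP_OFFSETS:
        rom[off:off + STRAP_SIZE] = strap
    return rom
```

Checksums usually also need fixing after an edit like this, which the BIOS tools handle for you; another reason to let them do the writing.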


----------



## Argorn5757

Not sure if this is the right place, but here goes.

I just bought the Sapphire R9 390 and installed it. Everything (seems to) work fine, except at startup the fancy BIOS screen is entirely screwed up (super pixely/artifacts)

Normally I would suspect a bad GPU, but I have a Gigabyte MOBO and google tells me this is just an issue with the R9s and gigabyte mobos? And turning off the full screen BIOS picture makes the issue go away.

Has anyone else with one of these and a gigabyte mobo had the same issue? I'm running some stress testing right now to make sure that that's the only issue with it.

Thanks!


----------



## Synntx

Quote:


> Originally Posted by *navjack27*
> 
> I am using the stock bios with nothing changed except for the timings now. I'm done chasing unicorns lol. Channing the tdp and tdc made it unstable during boinc and [email protected] and csgo, csgo is embarrassing to crash during a mm game lol.
> 
> So yeah I used hex workshop to copy paste the timings from 1250 to everything higher up to 1750.
> 
> I'm back to just using a stable overclock of 1100/1750 pl50+ with no voltage change for my 24hr overclock with power play disabled using AB. I'm not using radeonpro anymore cuz I found that the flip queue and LOD it sets are the wrong numbers. I'd change the values and then open radeonmod and regedit and see what it adds/changes to confirm. So I'm just manually changing stuff now. 15.12 crimson. Win7. I won't reinstall windows or upgrade anytime soon... There is just too much that works as it should for me to go and throw another wrench in the proverbial works. I can do 1150/1750 or 1150/1625 with added voltage, and that is stable for the most part. I wonder if the issues are related to my psu. Evga 850w b2. It's semi modular... I can use the hard wired vga power connections, each one is a 6 pin with a lil 2 pin thingy to make them 8 pin if need be... The modular has a 6 pin and another 6 that has the lil 2 pin on it, but they run from one cable. It appears to be a single 12v rail with 70a. I see in gpuz the card pulling quite a bit. And my CPU has all its power saving disabled in bios, maybe not a good mix?


Personally I turned my power saving features back on in bios after dialing in a stable CPU overclock. I just don't see the sense in running it at full blast 24/7. My GPU, though, has ULPS disabled and force constant voltage enabled, but I have target frame rate set at 65 to let the gpu be more efficient


----------



## Carniflex

Quote:


> Originally Posted by *Argorn5757*
> 
> Not sure if this is the right place, but here goes.
> 
> I just bought the Sapphire R9 390 and installed it. Everything (seems to) work fine, except at startup the fancy BIOS screen is entirely screwed up (super pixely/artifacts)
> 
> Normally I would suspect a bad GPU, but I have a Gigabyte MOBO and google tells me this is just an issue with the R9s and gigabyte mobos? And turning off the full screen BIOS picture makes the issue go away.
> 
> Has anyone else with one of these and a gigabyte mobo had the same issue? I'm running some stress testing right now to make sure that that's the only issue with it.
> 
> Thanks!


I have a Gigabyte X79 platform motherboard and a Gigabyte 390X card. The second screen has some blue stripes/noise at certain spots during the UEFI splash screen (running over an HDMI-DVI cable); the other screen plugged into that card (4k, running over displayport) is fine.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> FF F1 is not magical in an absolute way.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can substitute them with anything as long as you keep the semantics.
> 
> The truly amazing benching HEX is in the proper timings. But not really sure if it would give some boost in gaming.


I'm more interested with "proper timings" if it's in human readable form.


----------



## Levys

Quote:


> Originally Posted by *Synntx*
> 
> Pretty nice run! How did you manage to get a 390 to break the 15k mark?? Better yet, what kinda voltage increase was required for 1225/1750???


only used MSI +100mv on the core and +90-ish on the memory. no artifacts
+ a great waterblock to keep the temps in check... they hardly break 55°C at those speeds.
guess I won the silicon lottery for once

The force is strong with this one


----------



## Synntx

Quote:


> Originally Posted by *Levys*
> 
> only used MSI +100mv on the core and +90,ish on the memory. no artifact's
> + a great waterblock to keep the temps in check..they hardly break 55°c at those speeds.
> geus I won the silicon lottery for once
> 
> The force is strong with this one


I'd say so


----------



## pillowsack

Figured i'd run 3Dmark since it's $5 on steam right now.

http://www.3dmark.com/3dm/9991969?

SCORE
11728

Graphics Score 14652

But my GPU was at 1175 and 1700. My memory doesn't like to stay stable, so I've got some copper Enzo heatsinks coming to fix that.


----------



## Synntx

Guess you can add me to the list!

Sapphire 390x Tri-X

Stock air cooling

Here is my stock GPU firestrike scores:
http://www.3dmark.com/fs/6978668

My overclock scores:
http://www.3dmark.com/fs/6994354

Love this 390x!


----------



## RagnarDE

You gotta love this GPU ^^


----------



## Spartoi

Quote:


> Originally Posted by *Synntx*
> 
> Guess you can add me to the list!
> 
> Sapphire 390x Tri-X
> 
> Stock air cooling
> 
> Here is my stock GPU firestrike scores:
> http://www.3dmark.com/fs/6978668
> 
> My overclock scores:
> http://www.3dmark.com/fs/6994354
> 
> Love this 390x!


Could you share your core voltage and ASIC quality?


----------



## Charcharo

Guys...
Can someone post 1440x900 benchmarks of their R9 390?
*Hides in low resolution 10-year old monitor cave*


----------



## sportsczy

Is a 3DMark score of 11357 with an R9 390 and an i5 4690K good?


----------



## navjack27

yeah bretty gewd


----------



## m70b1jr

http://www.techpowerup.com/gpuz/details.php?id=v94c5

1115/1500

That's without touching the core voltage. I just bought an Arctic Accelero Hybrid III 140 for $100 off Amazon; I didn't wanna mess with the core voltage on the stock XFX cooler, as my temps were getting into the 75s, and I'm not used to that. I had an R9 270 before this and it NEVER went above 60, I swear. But here are my firestrike scores.

http://www.3dmark.com/fs/7009273

Better than 76%... I'm pretty impressed myself.


----------



## Dundundata

You might need to reapply TIM


----------



## navjack27

look what a fresh windows 10 install nets ya

http://www.3dmark.com/fs/7016767

+125mv on core
nothing on aux

1175/1750

SCORE
13223 with AMD Radeon R9 390X(1x) and Intel Core i7-4790S Processor
Graphics Score 16487
Physics Score 11416
Combined Score 5885


----------



## m70b1jr

Anyone know the max temps for bench marking on this thing?


----------



## m70b1jr

Decided to turn my PC into a lava world

Using the stock XFX cooler, I was able to do the following

1155 / 1500 (haven't touched mem clocks yet) Key word, yet. At 1160 there was SMALL, very SMALL artifacting. Like a small flicker.

Added +100 on voltage and AUX Voltage

Gets up to 90 Celsius bench marking. I can smell my PC burning.


----------



## Stige

Running my ASUS Strix R9 390 DC3 at 1245/1700 now that I got the waterblock for it. +100mv/+50mV.
Couldn't touch voltage controls under stock cooler or VRM would be on fire...

Going over +100mV on the core voltage didn't seem to help much at all. At +100mV/1250 core I get a few small blue/red/green dots on screen occasionally; it is stable, but small artifacts don't cut it. Lowering it to 1245 on the core made them all disappear.

Couldn't even do 1300 with +200mV; it barely reduced artifacts over +100mV.

Stock FPS on valley was 64.2, highest I achieved was 73.7 under 1260/1700.
Couldn't get a 10 FPS increase


----------



## m70b1jr

Quote:


> Originally Posted by *Stige*
> 
> Running my ASUS Strix R9 390 DC3 at 1245/1700 now that I got the waterblock for it. +100mv/+50mV.
> Couldn't touch voltage controls under stock cooler or VRM would be on fire...
> 
> Going over +100mV on the core voltage didn't seen to help much at all, at 100mV/1250 core I get few small blue/red/green dots on screen occasionally, it is stable but small artifacts doesn't cut it. Lowering it to 1245 on Core made them all disappear.
> 
> Couldn't even do 1300 under +200mV, barely reduced artifacts over +100mV.
> 
> Stock FPS on valley was 64.2, highest I achieved was 73.7 under 1260/1700.
> Couldn't get a 10 FPS increase


So maybe it's a bad idea that I'm doing +100mV on the stock XFX cooler?


----------



## Sgt Bilko

Quote:


> Originally Posted by *m70b1jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Stige*
> 
> Running my ASUS Strix R9 390 DC3 at 1245/1700 now that I got the waterblock for it. +100mv/+50mV.
> Couldn't touch voltage controls under stock cooler or VRM would be on fire...
> 
> Going over +100mV on the core voltage didn't seen to help much at all, at 100mV/1250 core I get few small blue/red/green dots on screen occasionally, it is stable but small artifacts doesn't cut it. Lowering it to 1245 on Core made them all disappear.
> 
> Couldn't even do 1300 under +200mV, barely reduced artifacts over +100mV.
> 
> Stock FPS on valley was 64.2, highest I achieved was 73.7 under 1260/1700.
> Couldn't get a 10 FPS increase
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So maybe it's a bad idea i'm doing +100mV on a stock XFX cooler?
Click to expand...

Under 95°C core temps and under 90°C on the VRMs you'll be fine.

I've done +200mV on my 390x (XFX DD) for short benches and its held up well.


----------



## m70b1jr

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 95c for Core temps and under 90c vrms you'll be fine.
> 
> I've done +200mV on my 390x (XFX DD) for short benches and its held up well.


I can't get mine past +100; the slider ends there, even after enabling unofficial overclocking mode


----------



## Sgt Bilko

Quote:


> Originally Posted by *m70b1jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 95c for Core temps and under 90c vrms you'll be fine.
> 
> I've done +200mV on my 390x (XFX DD) for short benches and its held up well.
> 
> 
> 
> I can't get mine past 100, slider ends there, even after enabling unofficial overclock mode
Click to expand...

Use Sapphire Trixx for up to +200mV


----------



## m70b1jr

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Use Sapphire Trixx for up to +200mV


Sweet, will do. I hate the UI on it though.

Here are the newest speeds on the STOCK cooler, AUX and core voltage +100:



This thing is an amazing overclocker. My R9 270 came at 1070MHz stock speed, and I could not overclock that thing AT ALL. So getting another ~140MHz on the core, and 100MHz on the memory, out of this one is pretty cool.

As I said before, I did order an Arctic cooler (the Hybrid III 140), and I'll add 200mV on this to push it even further. Can anyone give me a life expectancy of this thing OC'd this much? Temps will be taken care of.


----------



## Sgt Bilko

Quote:


> Originally Posted by *m70b1jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Use Sapphire Trixx for up to +200mV
> 
> 
> 
> Sweet, will do. I hate the UI on it though.
> 
> Here's newest speeds on STOCK cooler, AUX and Core vM + 100
> 
> 
> 
> This thing is an amazing overclocker. My R9-270 came 1070mhz STOCK speed, and I could not overclock that thing AT ALL. So getting this another +140mhz ish on core, and 100mhz on memory is pretty cool.
> 
> As I said before, I did order an ARTIC Cooler, (140 III) and I'll add 200mV on this, to push it even further. Can anyone give me a life expectancy of this thing OC'd this much? Temps will be taken care off.
Click to expand...

For reference: mine does 1150/1700 on stock voltage.

As for UI, I prefer Afterburner's old UI and I can choose that, but with Trixx to get the old UI you need an older version (1.5 is the latest one with it iirc)


----------



## m70b1jr

Quote:


> Originally Posted by *Sgt Bilko*
> 
> For reference: mine does 1150/1700 on stock voltage.
> 
> As for UI, i prefer Afterburners old UI and i can choose that but with Trixx to get the old UI you need an older version (1.5 is the lastest one with it iirc)


Weird. I get artifacting at 1125MHz stock voltage, so I upped my voltage all the way to get my 1155 stable.. hopefully I can do more when my liquid cooler comes in. Shipping can take up to 11 days apparently, so it could be a while before I'm back on this thread.

Also, can I get a life expectancy on this thing with +200mV on the core, and possibly up to 1250 on core speeds? As long as It lasts me like a year - two years I'll be fine.


----------



## xxxGODxxx

Guess I got a dud, max stable clock at +100mv on both core and aux but I can only get 1100mhz on the core and 1600mhz on the memory. I am running a sapphire 390x tri x.

gpuz.gif 27k .gif file


----------



## Tobiman

Quote:


> Originally Posted by *Levys*
> 
> My best 3D MARK score yet and that's with an FX 8350 possibly the best system score in the world ( similar systems )
> 
> http://www.3dmark.com/fs/6969961
> 
> That's without any radeonpro settings, just standard clean run.


That's a great score. I was able to get my 290 to bench at 1250mhz but it's considerably slower than yours. Maybe the score has improved over time.

http://www.3dmark.com/fs/1942507


----------



## Tobiman

Quote:


> Originally Posted by *m70b1jr*
> 
> Weird. I get artifacting at 1125MHz stock voltage, so I upped my voltage all the way to get my 1155 stable.. hopefully I can do more when my liquid cooler comes in. Shipping can take up to 11 days apparently, so it could be a while before I'm back on this thread.
> 
> Also, can I get a life expectancy on this thing with +200mV on the core, and possibly up to 1250 on core speeds? As long as It lasts me like a year - two years I'll be fine.


You might not need as much voltage if the temps are below 60 degrees. On the other hand, you might not be able to hit such speeds.


----------



## fat4l

Quote:


> Originally Posted by *m70b1jr*
> 
> I can't get mine past 100, slider ends there, even after enabling unofficial overclock mode


Or try HIS iTurbo; it's even better.

Also, stock voltage differs from card to card.


----------



## Synntx

Quote:


> Originally Posted by *Spartoi*
> 
> Could you share your core voltage and ASIC quality?


----------



## jdorje

I've got a major problem.

I have two monitors, both running off my 390. But with both monitors hooked up, the memory never downclocks; it's stuck at 1725 MHz. I guess the lowest power state is never entered.

But it's actually much worse than it sounds. If I turn off my second monitor it downclocks and my system runs at 60 watts at the wall. With the second monitor connected the system runs at around 140W. That's at idle! This is a massive electrical draw, which not only raises temps (not really an issue at idle) but will be really expensive (potentially like $40 a year).
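A quick back-of-envelope check on that "$40 a year" figure (the electricity rate and daily idle hours below are assumptions, not numbers from the post):

```python
# Sanity check on the yearly cost of ~80W of extra idle draw.
# Rate ($/kWh) and hours/day are assumed values, not from the post.
def yearly_cost(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

# 140W - 60W = 80W extra; assume ~12h/day on, $0.12/kWh
print(f"${yearly_cost(80, 12, 0.12):.2f} per year")
```

With those assumptions it lands right around $42/year, so the estimate in the post is in the right ballpark.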

Some research indicates this has been a problem for some time, but nobody else points out the huge power difference in the two states. One suggestion was to mod the bios to lower the clock on the applicable (second-lowest?) power state. I am interested in modding my bios, but I have a single-bios card so there's...how much risk? Another hack I've read is to use profiles and force a super-low-clock profile when on the desktop, but this sounds like a massive pain (you need to make your stock settings into the idle state and have an OC profile for every single game).

Another option I'll probably look at is running the secondary monitor off my igpu. But the igpu is currently disabled and when I've enabled it, it's reduced my CPU overclock's stability (that's on limited data and supposedly shouldn't happen, so I'll give it another look).

What is going on and what should I do about it?


----------



## afyeung

The vram clocks like a dream on my 390x(1750mhz easy with +50mv aux). But getting the core to even 1150 is such a challenge. Any tips? I thought I was running rock stable with 1150mhz and +38mv core. The moment I loaded up Metro LL after vacation I got annoying black artifacts so I had to remove the core overclock. Also, what's the highest people have gotten to on the mem with performance improvements? I know 1750 is pretty high already but maybe people have gotten significantly higher.


----------



## afyeung

Whoops wrong shot


----------



## m70b1jr

Quote:


> Originally Posted by *afyeung*
> 
> The vram clocks like a dream on my 390x(1750mhz easy with +50mv aux). But getting the core to even 1150 is such a challenge. Any tips? I thought I was running rock stable with 1150mhz and +38mv core. The moment I loaded up Metro LL after vacation I got annoying black artifacts so I had to remove the core overclock. Also, what's the highest people have gotten to on the mem with performance improvements? I know 1750 is pretty high already but maybe people have gotten significantly higher.


Just max your voltage lmao. and AUX


----------



## Carniflex

Quote:


> Originally Posted by *jdorje*
> 
> I've got a major problem.
> 
> I have two monitors, both running off my 390. But with both monitors hooked up, the memory never downclocks; it's stuck at 1725 MHz. I guess the lowest power state is never entered.
> 
> But it's actually much worse than it sounds. If I turn off my second monitor it downclocks and my system runs at 60 watts at the wall. With the second monitor connected the system runs at around 140W. That's at idle! This is a massive electrical draw, which not only raises temps (not really an issue on idle) but will be really expensive (potentially like $40 a year).
> 
> Some research indicates this has been a problem for some time, but nobody else points out the huge power difference in the two states. One suggestion was to mod the bios to lower the clock on the applicable (second-lowest?) power state. I am interested in modding my bios, but I have a single-bios card so there's...how much risk? Another hack I've read is to use profiles and force a super-low-clock profile when on the desktop, but this sounds like a massive pain (you need to make your stock settings into the idle state and have an OC profile for every single game).
> 
> Another option I'll probably look at is running the secondary monitor off my igpu. But the igpu is currently disabled and when I've enabled it, it's reduced my CPU overclock's stability (that's on limited data and supposedly shouldn't happen, so I'll give it another look).
> 
> What is going on and what should I do about it?


It's there to avoid screen "flickering" in multi-display mode, if I remember right. Probably some kind of "duct tape" fix in the drivers which they forgot about or have not gotten around to fixing yet. I have no clue how to fix it for you; I remember writing custom power profiles a while back to get the memory to stay at the highest clocks, as this screen flickering thing is pretty annoying when it happens.


----------



## jdorje

Hooking it up to the igpu seems to work. Performance might be a bit lower (but the secondary monitor is desktop only) and power use is about 4w higher than the 390 with a single monitor.










I strongly recommend everyone else take a look at this data for your own card if using two monitors. One of the symptoms here is higher idle GPU temps, both core and VRM. The GPU core is idling at 35C now versus about 60C before. Of course, as I said, the temperatures don't really matter, but nearly 80 watts more idle usage is absolutely insane.


----------



## fat4l

Quote:


> Originally Posted by *Synntx*


1150mV VDDCI (AUX) is waaay too high. It can kill ur card...


----------



## Majentrix

I know the Strix isn't the coolest 390 on the market, but there's no way temps should be this high playing WoW. Does anyone know what could be causing this? Running at stock fan speeds and clocks with two monitors and the frame limiter set to 75FPS.

EDIT: Closed Afterburner and temps have dropped down to something more normal, around 75°C. Wonder what in AB was causing that; I didn't have any settings changed.


----------



## Darkeylel

Quote:


> Originally Posted by *jdorje*
> 
> Hooking it up to the igpu seems to work. Performance might be a bit lower (but the secondary monitor is desktop only) and power use is about 4w higher than the 390 with a single monitor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I strongly recommend everyone else take a look at this data for your own card if using two monitors. One of the symptoms here is higher idle GPU temps, both core and VRM. GPU core is idling at 35C now versus about 60C before. Of course as I said the temperatures don't really matter, but nearly 80 watts more idle usage is absolutely insane.


Curious what are you using to measure what you're pulling off the wall ?


----------



## Stige

Quote:


> Originally Posted by *Darkeylel*
> 
> Curious what are you using to measure what you're pulling off the wall ?


Plenty of cheap plugs out there to measure the power consumption.
Like dis


----------



## afyeung

Happy new year everyone! I've been able to get to 1850mhz on the memory with +75mv aux. Anybody with high mem speeds been able to find the sweet spot? I'll try pushing to 1900mhz and see how it goes.


----------



## Stige

Overclocking memory is pointless anyway; the benefit in games is basically none, and it's like 1 fps per 100MHz in Valley.
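The bandwidth math backs this up: even a big memory overclock only buys a modest bump in theoretical bandwidth, and Hawaii already has plenty. A sketch, assuming the 390's 512-bit bus and GDDR5 moving 4 bits per pin per command-clock cycle (so 1500 MHz is the usual "6 Gbps effective" spec):

```python
# Theoretical memory bandwidth for the R9 390's 512-bit GDDR5 bus.
BUS_WIDTH_BITS = 512

def bandwidth_gbps(mem_clock_mhz):
    """GB/s at a given GDDR5 command clock (4 bits/pin/cycle effective)."""
    return BUS_WIDTH_BITS / 8 * mem_clock_mhz * 4 / 1000

print(bandwidth_gbps(1500))  # stock: 384.0 GB/s
print(bandwidth_gbps(1750))  # overclocked: 448.0 GB/s, ~17% more
```

A ~17% bandwidth increase for ~1 fps suggests the card just isn't bandwidth-bound in most games.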


----------



## afyeung

Yeah. I'm just letting it sit at 1750mhz. Too bad my GPU's core is such a bad clocker. Not that the 390x is a decent overclocker in general, but mine seems particularly bad.


----------



## jdorje

Quote:


> Originally Posted by *Darkeylel*
> 
> Curious what are you using to measure what you're pulling off the wall ?


kill a watt


----------



## Tobiman

Quote:


> Originally Posted by *jdorje*
> 
> I've got a major problem.
> 
> I have two monitors, both running off my 390. But with both monitors hooked up, the memory never downclocks; it's stuck at 1725 MHz. I guess the lowest power state is never entered.
> 
> But it's actually much worse than it sounds. If I turn off my second monitor it downclocks and my system runs at 60 watts at the wall. With the second monitor connected the system runs at around 140W. That's at idle! This is a massive electrical draw, which not only raises temps (not really an issue on idle) but will be really expensive (potentially like $40 a year).
> 
> Some research indicates this has been a problem for some time, but nobody else points out the huge power difference in the two states. One suggestion was to mod the bios to lower the clock on the applicable (second-lowest?) power state. I am interested in modding my bios, but I have a single-bios card so there's...how much risk? Another hack I've read is to use profiles and force a super-low-clock profile when on the desktop, but this sounds like a massive pain (you need to make your stock settings into the idle state and have an OC profile for every single game).
> 
> Another option I'll probably look at is running the secondary monitor off my igpu. But the igpu is currently disabled and when I've enabled it, it's reduced my CPU overclock's stability (that's on limited data and supposedly shouldn't happen, so I'll give it another look).
> 
> What is going on and what should I do about it?


It's the same for both Nvidia and AMD.


----------



## viking21

Hi guys, happy new year everyone!
What's your fan curve in Afterburner? My GPU is an MSI R9 390 and I get 70-75C playing GTA V for an hour; keeping my fan speed at 70-80% it stays around 65-70C, but like that it's too loud.
I tried decreasing it to 40%, which fixed the noise, but it reached 77-78C. Is that too hot and should I improve the temperature, or is it fine? What's a good balance?


----------



## Stige

Worry about VRM temps, not the core temp.


----------



## Dundundata

Quote:


> Originally Posted by *viking21*
> 
> Hi guys, happy new year everyone!
> What's your fan curve in Afterburner? My GPU is an MSI R9 390 and I get 70-75C playing GTA V for an hour; keeping my fan speed at 70-80% it stays around 65-70C, but like that it's too loud.
> I tried decreasing it to 40%, which fixed the noise, but it reached 77-78C. Is that too hot and should I improve the temperature, or is it fine? What's a good balance?




This is my fan curve for the same card. I get about 75C Max, which usually means I need to turn the heat off in my house! Otherwise I'm sitting closer to 70C.
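If anyone wants to reason about a curve before punching it into Afterburner, here's how the piecewise-linear interpolation between curve points works; the points below are an illustrative guess at a sane 390 curve, not anyone's exact settings:

```python
# A piecewise-linear fan curve like the ones set in Afterburner.
# Points are (temp C, fan %) -- hypothetical values for illustration.
CURVE = [(0, 20), (50, 30), (65, 45), (75, 70), (85, 100)]

def fan_percent(temp_c):
    """Linearly interpolate fan speed between adjacent curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last point, pin at max

print(fan_percent(70))  # halfway between 45% and 70% -> 57.5
```

The trade-off discussed above is just how steep you make the 65-85C segment: steeper means cooler but louder.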


----------



## ToadofTerror

I got an MSI 390 and was wondering if my fan speed and temps are OK. When running Witcher 3 with a 1040 core clock and 1500 memory clock I get fan speeds anywhere from 40-60% with a temperature around 70-78 degrees.


----------



## Synntx

Quote:


> Originally Posted by *fat4l*
> 
> 1150mV VDDCI (AUX) waaay too high. It can kill ur card...


Yeaaaaaa I got no explanation for that. Here it is again......


----------



## Dundundata

Quote:


> Originally Posted by *ToadofTerror*
> 
> I got an MSI 390 and was wondering if my fan speed and temps are OK. When running Witcher 3 with a 1040 core clock and 1500 memory clock I get fan speeds anywhere from 40-60% with a temperature around 70-78 degrees.


Is that with the default fan curve? You can set your own fan curve, which could bring it down a bit. Other factors are ambient temps and case setup; otherwise those temps are OK. Check your VRM temp as well.


----------



## jdorje

To make the power usage chart on the last page (which I edited a bit) I removed my 390, did DDU to uninstall all amd drivers, and tested with just integrated graphics.

While I had my 390 (XFX 8256) out I took a look at taking it apart. I ended up wussing out on this rather than removing the "VOID IF REMOVED" stickers. But I did tighten all the screws. The two in particular were the two on the back of the core (holding the heat sink in place - there are 4 such screws but the other two were stickered over) and the two holding the VRM heat sink in place.

Temps are...massively...improved. At my max overclock of +100 mV and 1145 clock, I was getting a core of 80-85 (custom fan curve) with VRMs approaching 120 before I'd shut it down. I didn't feel comfortable running this long term. Now VRMs barely break 100; I'd say their temps have been reduced by around 20C. Core is maybe a little cooler. I've run it at +100 mV/1145/1725 until temps stabilized and they're hitting 80C and 103C on the core VRM.

Interestingly, when I unscrewed the VRM heat sink (I didn't realize what it was at the time!) it came totally loose. There's no thermal padding or TIM of any sort under there.

Next time I take this thing out of the case I'll take it all the way apart. I'll probably throw some thermal padding onto the VRMs and some CLU onto the die. Is CLU acceptable for the GPU heat sink? How can I know?


----------



## Sgt Bilko

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *m70b1jr*
> 
> I can't get mine past 100, slider ends there, even after enabling unofficial overclock mode
> 
> 
> 
> Or try HiS iTurbo. it's even better.
> 
> Also, stock voltage differs from card to card.
Click to expand...

He's on Stock cooling.....there is 0 need for iTurbo at all here


----------



## jdorje

...and yet despite the cooler temps, stability is lower than before. Could it be that some of the VRMs are hotter than previously? There's ~6 core VRMs but only 1 sensor for them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jdorje*
> 
> ...and yet despite the cooler temps, stability is lower than before. Could it be that some of the VRMs are hotter than previously? There's ~6 core VRMs but only 1 sensor for them.


There are two temp sensors for VRMs (VRM1 and VRM2) use GPU-Z and scroll down on the sensor tab to see them


----------



## jdorje

Quote:


> Originally Posted by *Sgt Bilko*
> 
> There are two temp sensors for VRMs (VRM1 and VRM2) use GPU-Z and scroll down on the sensor tab to see them


Yeah. The other vrm sensor though is much lower under load and hotter at idle. I've always assumed it's tied to the vram vrm which is at the other end of the card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jdorje*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> There are two temp sensors for VRMs (VRM1 and VRM2) use GPU-Z and scroll down on the sensor tab to see them
> 
> 
> 
> Yeah. The other vrm sensor though is much lower under load and hotter at idle. I've always assumed it's tied to the vram vrm which is at the other end of the card.
Click to expand...

VRM1 is for Core and VRM2 is for VRAM so yes that makes sense, VRM2 doesn't have a heatsink on it but it doesn't really need one.

In other news:



Going to do a cooler replacement on my 290x with the 390x cooler and see exactly how much of an improvement there is on the cooler alone


----------



## jdorje

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Going to do a cooler replacement on my 290x with the 390x cooler and see exactly how much of an improvement there is on the cooler alone


Dammit you're guilting me for not taking mine apart.


----------



## moey1974

Just wanted to share some info on my overclock of my new R9 390 from Gigabyte. As most of you know, the Gigabyte R9 390's have locked voltages but I was pretty happy with the overclock regardless of it having a locked voltage. I bought this card already knowing of the locked voltages and opted for the G1 series of their R9 390's.

Anyways, I wanted to post this before I go; I won't be around to reply back as I am going out of town. But for anyone interested in what a Gigabyte R9 390 G1 Edition can do even with locked voltages, here you go, though some of it is on a per-card basis and not guaranteed for every card. Also keep in mind, I could have gone further but opted to stop here and not push it any further. In fact I only did this to see what the card could do, as right now I have no desire or reason to keep it overclocked. Maybe down the road I might OC it to get a bit more performance out of it, but for now there's no reason other than to see what I could do.

So I basically went as far as 1125MHz core/1625MHz memory by simply raising the Power Limit 10%, that's all. I left this running on GPU-Z's overclock testing for 30 minutes and played Battlefield 4, Crysis 3 and COD BOPs3 for a few hours non-stop, switching between games while keeping the card running with the OC. It's nice and stable and obviously could do a bit more, but this is about as far as I feel I would want to go in the future.

I suspect I could do 1150MHz tops but I really don't see a reason to push for more. I basically wanted a 10% OC and that is what my card can do, with a bit of headroom left if I do decide to stick with a permanent OC. For now, the default clocks of 1025 core/1500 memory are more than enough, as I use a 1080p Panasonic plasma as my display. I use my main PC in the living room as a Home Theater Media Center for gaming, movie watching, etc. PCs just play HD movies so nicely, better than a high-end Blu-ray player. I also have a PS4, but man, this R9 390 is one nice card for 1080p/1440p gaming. Even using VSR for downsampling on this 1080p plasma is beautiful, to say the least. Also, if I do upgrade to a 4K TV, you can simply use a DisplayPort-to-HDMI cable to get 4K at 60fps. Club3D has a nice cable for this on Amazon, FYI.


----------



## Stige

"I don't see reason pushing for more" What kinda logic is that??!?

You overclock, you go all out! No saying "I could but I won't because reasons and no logic at all".

1240/1800 on my R9 390 right now, 1250 on Core wasn't "stable", or well it was stable but I saw a few weird red/green/blue balls occasionally while playing AW, no other game showed any symptoms.


----------



## mus1mus

Quote:


> Originally Posted by *moey1974*
> 
> Just wanted to share some info on my overclock of my new R9 390 from Gigabyte. As most of you know, the Gigabyte R9 390's have locked voltages but I was pretty happy with the overclock regardless of it having a locked voltage. I bought this card already knowing of the locked voltages and opted for the G1 series of their R9 390's.
> 
> Anyways, I wanted to post this before i go, I won't be around to reply back as i am going out of town but for anyone interested in what a Gigabyte R9 390 G1 Edition can do even with locked voltages here you go, although some of it has to do with a per card basis, its not guaranteed every card. Also Keep in mind, i could of went further but opted to stay here and not push it any further. In fact i only did this to see what the card could do, as right now i have no desire or reason to keep it overclocked, maybe down the road i might OC it to get a bit more performance out of it but for now..no reason other than to see what i could do.
> 
> So i basically went as far as 1125mhz core/1625mhz memory with simply raising the Power Limit 10%, that's all. I left this run on GPU-Z's overclock testing for 30 minutes and played Battlefield 4, Crysis 3 and COD BOPs3 for a few hours non-stop switching between games during these few hours while keeping the card running with the OC. It's nice and stable and obviously could do a bit more but this is about as far as i feel i would want to go in the future.
> 
> I suspect i could do 1150mhz tops but i really don't see a reason to go pushing for more. I basically wanted a 10% OC and that is what my card can do with a bit more headroom for a bit more OC if i do decide to stick to a perm OC but for now, default clocks of 1025core/1500memory is more than enough as i do use a 1080p Panasonic Plasma as my display. I use my main PC in the living room as a Home Theater Media Center for gaming, movie watching,etc. PC's just play HD movies so nicely, better than a high end bluray player. I also have a PS4 but man, this R9 390 is one nice card for 1080p/1440p gaming. Even using VSR for downsampling on this 1080p plasma is beautiful to say the least. Also if i do upgrade to a 4k TV...you can simply use a Displayport to HDMI cable to get that 4k resolution at 60fps. You simply use this card's Displayport to HDMI cable.. Club3D has a nice cable for this on Amazon, FYI.


Using GPU-Z to validate an OC?

Vury Noice!








Quote:


> Originally Posted by *Stige*
> 
> "I don't see reason pushing for more" What kinda logic is that??!?
> 
> You overclock, you go all out! No saying "I could but I won't because reasons and no logic at all".
> 
> 1240/1800 on my R9 390 right now, 1250 on Core wasn't "stable", or well it was stable but I saw a few weird red/green/blue balls occasionally while playing AW, no other game showed any symptoms.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Using GPU-Z to validate an OC?
> 
> Vury Noice!





As requested









Stock Valley


1260/1700 Valley


I will benchmark these current settings in Valley in a bit.


----------



## mus1mus

VDDC at 1.35
VDDCI at 1.211

Looking nice on my end. But be very careful with VRM 1. Otherwise,









ppssst, BIOS Modding will take you further.


----------



## Stige

I'm not sure where that max VDDCI is from because even with +100mV on Aux Voltage, I never see that high voltage









Must be some weird peak voltage.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> VDDC at 1.35
> *VDDCI at 1.211*
> 
> Looking nice on my end. But be very careful with VRM 1. Otherwise,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ppssst, BIOS Modding will take you further.


VDDCI at 1.211
 this is way too high... It can kill the card easily.

Quote:


> Originally Posted by *Stige*
> 
> I'm not sure where that max VDDCI is from because even with +100mV on Aux Voltage, I never see that high voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Must be some weird peak voltage.


Sometimes when you OC and set VDDCI to +100mV and the system restarts/freezes, then after the restart VDDCI is still at +100mV but the OC program doesn't see it, and if you put another +100mV on top of that it will actually be 1.2V... aka +200mV.
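A toy sketch of the stacking being described here; the 1.000 V stock VDDCI is an assumed baseline just to show the arithmetic, since stock voltage varies per card:

```python
# Offset-stacking bug: after a crash the +100 mV offset stays applied in
# hardware while the OC tool's slider resets to 0, so re-applying "+100 mV"
# lands you at +200 mV total.
STOCK_VDDCI = 1.000                # assumed stock VDDCI in volts

offset_stuck_after_crash = 0.100   # the +100 mV that survived the crash
offset_reapplied = 0.100           # the "+100 mV" the user sets again

actual = STOCK_VDDCI + offset_stuck_after_crash + offset_reapplied
print(f"{actual:.3f} V")  # 1.200 V, double what the slider claims
```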


----------



## m70b1jr

I wish overdrive or MSI had native support for adding more voltage.


----------



## viking21

Playing on Gta V (v-sync on, msaa x2 and very high) I get these wavy lines on screen:


Spoiler: Warning: Spoiler!
while watching a video on youtube it seems to be fine.


Spoiler: Warning: Spoiler!
What could be the problem?

I did a clean installation of my drivers but it didn't solve it.


----------



## jdorje

Quote:


> Originally Posted by *viking21*
> 
> What could be the problem?


It's almost like your monitor is in reduced color mode.


----------



## Synntx

Okay, so I tried to push my OC further by using Sapphire TriXX along with AB to push the voltage past +100. Well, any voltage increase beyond +125 causes (I presume) the drivers to fail. Total black screen/system lockup.









I will reset the BIOS back to stock and see if that has anything to do with it!


----------



## jdorje

You can only flash the bios by rebooting with a bootable USB drive?


----------



## Synntx

Quote:


> Originally Posted by *jdorje*
> 
> You can only flash the bios by rebooting with a bootable USB drive?


No. I flashed mine in windows with command prompt


----------



## OneB1t

it's better to flash the BIOS under DOS
but these cards have dual BIOS with a switch, so it's not that bad a situation if a BIOS flash goes wrong
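For reference, a typical ATIFlash sequence looks something like this; the adapter index and flags reflect common ATIFlash usage (and the `-p 0` form used later in this thread), so double-check against your version's built-in help before flashing:

```shell
# Typical ATIFlash sequence (elevated Windows prompt or DOS).
atiflash -i                # list adapters so you know which index is your 390
atiflash -s 0 backup.rom   # save the current BIOS first, always
atiflash -p 0 modded.rom   # program the modded BIOS to adapter 0
```

On a single-BIOS card like the XFX mentioned below, that `backup.rom` is your only safety net, so save it before anything else.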


----------



## tangelo

Quote:


> Originally Posted by *Stige*
> 
> 
> 
> 
> As requested
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock Valley
> 
> 
> 1260/1700 Valley
> 
> 
> I will benchmark these current settings in Valley in a bit.


Holy ****!


----------



## Stige

Quote:


> Originally Posted by *fat4l*
> 
> VDDCI at 1.211
> this is way too high... It can kill the card easily.
> Sometimes when you OC and set VDDCI to +100mV and the system restarts/freezes, then after the restart VDDCI is still at +100mV but the OC program doesn't see it, and if you put another +100mV on top of that it will actually be 1.2V... aka +200mV.


Well that explains.. everything









I was like "how the fak does this 1800MHz cause instant crashing today when it worked for hours of gaming yesterday!?!"
Well that is why then, the obscene AUX voltage was probably due to an earlier crash lol

It was too good to be true








I had to back my memory down to 1750 for now, seems stable so far playing RS:S at least.


----------



## jdorje

Quote:


> Originally Posted by *OneB1t*
> 
> its better to flash bios under DOS
> but these cards have dualbios with switch so not that bad situation when bios flash go wrong


Unfortunately my card has no dual bios (xfx 8256).

I do now have my secondary monitor running off of integrated graphics. Still when the gpu crashes it usually takes the system with it.


----------



## mus1mus

I have a lot of runs with VDDCI past 1.2 volts. No negative effects as far as the card is concerned.









But it also has no positive effect whatsoever.









I do however notice the bug when using HIS iTurbo and Crimson crashes.


----------



## Stige

Well yeah, I saw no negative effect either during the day of gaming at those "mad" voltages.

We seriously need software that allows unlocked voltage control :l


----------



## fat4l

Quote:


> Originally Posted by *Stige*
> 
> Well yeah, I saw no negative effect either during the day of gaming at those "mad" voltages.
> 
> We seriously need a software to allow unlocked voltage control :l


Bios modding ?


----------



## Stige

Quote:


> Originally Posted by *fat4l*
> 
> Bios modding ?


Works? :O


----------



## fat4l

Quote:


> Originally Posted by *Stige*
> 
> Works? :O


lol....of course








http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_30


----------



## Synntx

Well, one of the fans went out on my week-old Sapphire 390X, so I returned it to Microcenter, only they were completely out of 390X models. So I ran over to Fry's and picked up an MSI 390X. Will post results when I'm finished overclocking.


----------



## w1ck3ddd

Here I am, joining the club. Running an R9 390 Nitro with the auto fan curve, overclocked. Are the temps OK?


----------



## jdorje

I flashed my bios.

Used the Hawaii bios reader and didn't change much - just the stock speeds and fan speed.

Couple strange things though...

Voltage as read by hwinfo rose when I upped the stock clock. From ~1.147 at stock up to ~1.19. It's a pretty massive bump. But temps were about what I expected (no huge rise that you'd expect from 50 mV). So what gives? How do you accurately see voltage?

When I had the bios open, there were no power tables in the reader. And the memory timings table was...unintelligible. And the tdp was listed in three different places as 204 watts. Isn't that really low?

Flashed the original back for now but I'll give it some more work today.


----------



## Stige

Can someone just do +100mV to this on both Core and VDDCI? Or too much to ask for?

Hawaii.zip 100k .zip file


Saved with GPU-Z, does that work or do I need to save it some other way?


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Can someone just do +100mV to this on both Core and VDDCI? Or too much to ask for?
> 
> Hawaii.zip 100k .zip file
> 
> 
> Saved with GPU-Z, does that work or do I need to save it some other way?


Show me your Default VDDC.
Don't worry about VDDCI too much.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Show me your Default VDDC.
> Don't worry about VDDCI too much.


I worry! My memory isn't stable at 1800MHz











These? Stock values.


----------



## OneB1t

Quote:


> Originally Posted by *jdorje*
> 
> I flashed my bios.
> 
> Used the Hawaii bios reader and didn't change much - just the stock speeds and fan speed.
> 
> Couple strange things though...
> 
> Voltage as read by hwinfo rose when I upped the stock clock. From ~1.147 at stock up to ~1.19. It's a pretty massive bump. But temps were about what I expected (no huge rise that you'd expect from 50 mV). So what gives? How do you accurately see voltage?
> 
> When I had the bios open, there were no power tables in the reader. And the memory timings table was...unintelligible. And the tdp was listed in three different places as 204 watts. Isn't that really low?
> 
> Flashed the original back for now but I'll give it some more work today.


different BIOSes have different base offsets
the memory timing table is not finished yet


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> I worry! My memory isn't stable at 1800MHz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These? Stock values.


Revert any OC tool to default. NO OC, NO OV.

RUN GPU-Z Stress to load the Card.
And Check for the Max VDDC.

Once you get the Value, Edit the DPM 7 Value for the CORE and Memory via HawaiiBiosReader.

The value (65XXX, in the rightmost column of each table) is clickable and editable.

Put in something like 1300 (1.300) if you want. But I personally set 1250 (1.250) as max for the BIOS as it's enough for my card. And any additional OC and OV is reserved for Trixx and the like.

Remember to put the same Values for both GPU Core and Memory Voltages. Else, it won't boot.

I am not at a PC. And I encourage you to learn the tool OneB1t made.


----------



## Stige

Stock cooler


After putting on the Alphacool full cover block for the card


After tying a GT AP-15 fan on top of the VRM


Starting to get them where I want them to be. Getting better thermal pads for the VRM next week hopefully (Phobya 7 W/mK compared to the Alphacool pads, which are 3 W/mK as far as I know).

Hopefully get that VRM down to ~60C.

I mean it is fine now even at +100mV but need to get it lower to push some more heh


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Revert any OC tool to default. NO OC, NO OV.
> 
> RUN GPU-Z Stress to load the Card.
> And Check for the Max VDDC.
> 
> Once you get the Value, Edit the DPM 7 Value for the CORE and Memory via HawaiiBiosReader.
> 
> The Value - 62XXX on the Rightmost Column of each table is clickable and edittable.
> 
> Put in something like 1300 (1.300) if you want. But I personally set 1250 (1.250) as max for the BIOS as it's enough for my card. And any additional OC and OV is reserved for Trixx and the like.
> 
> Remember to put the same Values for both GPU Core and Memory Voltages. Else, it won't boot.
> 
> I am not in a PC. And I encourage you to learn the tool OneB1t made.


You mean somewhere here?


Or here?


I'm a bit confused as to what the voltages are; they don't make any sense to me at least, the only sensible bit is the speeds.


----------



## OneB1t

voltages are in the GPU freq table and MEM freq table

by default the card is using ASIC-based voltage, that's why you see 65xxx in the voltage column

if you want to change to a custom voltage then rewrite this value to something like 1200 or 1300

you can set all DPM states as you want
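A sketch of how that voltage column seems to decode, going only by the description above; the 0xFF00 (65280) cutoff is my guess for illustration, not a documented constant:

```python
# Interpreting the voltage column in HawaiiBiosReader's frequency tables:
# values in the 65xxx range mean "use the chip's fused ASIC-based voltage",
# while a small number is a fixed value in millivolts.
def describe_dpm_voltage(value):
    if value >= 0xFF00:  # 65280+, e.g. the 65288 seen in stock tables
        return "ASIC-default (chip-dependent)"
    return f"fixed {value} mV ({value / 1000:.3f} V)"

print(describe_dpm_voltage(65288))  # ASIC-default (chip-dependent)
print(describe_dpm_voltage(1300))   # fixed 1300 mV (1.300 V)
```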


----------



## Stige

Quote:


> Originally Posted by *OneB1t*
> 
> voltages are in GPU freq table and MEM freq table
> 
> by default card is using asic based voltage thats why you see 65xxx in voltage column
> 
> if you want to change to custom voltage then rewrite this value to something like 1200 or 1300
> 
> you can set all DPM states as you want


So just to make sure I'm reading you both correctly, I can replace those 65xxx numbers with just 1200 or 1300 (which would be 1.2V or 1.3V, correct?)


----------



## OneB1t

yes


this is how it looks after modification


----------



## mus1mus

GPU Frequency Table and Memory Frequency Table.

Only replace the value currently displayed as 65288 with, say, 1300 or 1350, meaning 1.3 and 1.35 respectively.

65288 and the values above it are BIOS defaults that don't override the default GPU voltages for each DPM state (chip dependent).

Edit: Ninja'd!


----------



## OneB1t

i know it's complicated but BIOS editing is never a straightforward task







and hawaii is a little weird in terms of voltage control and 2D / 3D states
you need to spend some time mastering this piece of brilliant SW


----------



## Stige

I noticed the "same voltage on both or it won't boot" part.

So I have to put memory voltage that high or is there something I'm missing here?


----------



## mus1mus

Yes. Memory and Core Voltages should be equal. They are directly linked.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Yes. Memory and Core Voltages should be equal. They are directly linked.


So if I want 1.3V for the core, I need 1.3V for the memory (VDDCI?) as well?
Huh?


----------



## mus1mus

VDDCI is not the Memory Voltage.


----------



## OneB1t

yep









i was thinking about linking both editable tables so values from the GPU table would be automatically written into the MEM table, but decided that i was too lazy for this operation














so you need to edit both by hand


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> VDDCI is not the Memory Voltage.


Well VDDCI = AUX Voltage on MSI Afterburner.
And it seems to have an effect on how high I can overclock my memory.

So what is it then if not memory voltage?


----------



## OneB1t

there is no setting for aux voltage in HawaiiBiosReader
we never found the offset of this value, so if you want to mess with aux you need to use Afterburner

also, for correct use you should write your voltage settings to all 6 tables in HawaiiBiosReader or the card could be unstable while playing video or doing other stuff while heavily overclocked (because those states use a different voltage than 3D load)


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Well VDDCI = AUX Voltage on MSI Afterburner.
> And it seems to have an effect on how high I can overclock my memory.
> 
> So what is it then if not memory voltage?


Someone said it's the Memory Controller Voltage.

If you are using Trixx, it's out of reach. MSI AB and HIS iTurbo have that option. That's why I said to reserve further messing with those areas for OC tools.


----------



## jdorje

Quote:


> Originally Posted by *OneB1t*
> 
> different bios have different base offset
> memory timing not finished


So just by bumping the clock in the bios (in two different places) the voltage can be raised? Or is the higher voltage a phantom effect due to a higher perceived offset?

Still trying to determine if the higher clock in bios results in higher temps (and therefore voltage) than higher clock in afterburner - but it appears like it does.

Not that bumping the overclock in the bios is all that helpful since that's one of the few things you can do in software. It just seemed like an easy place to start.

EDIT: default voltage is basically adaptive, so yes it raises the voltage.


----------



## Stige

Before flashing:


After flashing:


HawaiiReader:


Something I'm missing here? derp
These values are with the GPU-Z render running.

The flashes went fine and verified fine.


----------



## Stige

Edited the Limit Tables as well, still no change.

I'm flashing with atiflash -p 0 overvolt.rom


----------



## jdorje

Is it possible a driver reinstall is necessary after changing those voltages?


----------



## Stige

I just realized I have been posting in the completely wrong thread as well, oops...


----------



## jdorje

Quote:


> Originally Posted by *Stige*
> 
> I just realized I have been posting in the completely wrong thread aswell oops...


What's the right thread then?


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> What's the right thread then?


This?
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/710#post_24757825


----------



## m70b1jr

Sapphire's TriXX allows +200mV but I hate the software UI in general.


----------



## Stige

And doesn't allow you to change VDDCI so.


----------



## jdorje

I'm stuck with a raised VDDC. Back on the original/stock bios but the VDDC is like +50 mV. Might be I screwed up and flashed a bios when I had afterburner on +44 mV. But even reflashing the stock bios doesn't fix it now. What do I do? DDU driver reinstall?


----------



## pillowsack

What's the alphacool semi-water block that fits the MSI 390X? I think im gonna splooge and buy it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> And doesn't allow you to change VDDCI so.


What does that help with?


----------



## pillowsack

Well I found it. If you wanna get a alphacool block for the MSI 390 it's this guy:

http://www.alphacool.com/product_info.php/info/p1663_Alphacool-NexXxoS-GPX---ATI-R9-390-M02---incl--backplate---black.html

Expect to pay about $120~ in USD for it


----------



## viking21

Quote:


> Originally Posted by *Stige*
> 
> Worry about VRM temps, not the core temp.


I got this:



very high settings, msaa x2, AF x16, post fx very high, population density and variety at maximum.


----------



## Carniflex

Quote:


> Originally Posted by *pillowsack*
> 
> Well I found it. If you wanna get a alphacool block for the MSI 390 it's this guy:
> 
> http://www.alphacool.com/product_info.php/info/p1663_Alphacool-NexXxoS-GPX---ATI-R9-390-M02---incl--backplate---black.html
> 
> Expect to pay about $120~ in USD for it


I just got the M04 version for Gigabyte 390X from them. Not installed it yet but it looks nice. Posted some pictures of it in post #62 in the thread about these blocks: http://www.overclock.net/t/1564888/alphacool-gpx-owners-club-and-information-thread/50_50#post_24755127

Price-wise, yeah, they are only marginally cheaper than true full-cover blocks, but what's going for them is that they also make versions for the less popular non-reference revisions - for example my Gigabyte 390X G1, for which this is basically the only "full cover"-like block in existence.


----------



## kizwan

Quote:


> Originally Posted by *Stige*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Before flashing:
> 
> 
> After flashing:
> 
> 
> HawaiiReader:
> 
> 
> 
> 
> Something I'm missing here? derp
> These values are with the GPU-Z render running.
> 
> Flashers flashed fine and verified fine.


Yeah, the GPU-Z render will not use 1.3V. Try games instead, e.g. GTAV, BF4, etc. You'll see the voltage can go up to 1.3V & even a little bit higher, but remember that most of the time under load the card will run at much lower than that because of Vdroop. Basically, the 1.3V that you set in the BIOS is just the peak voltage. Operating voltage = voltage after Vdroop.

For example, mine has DPM7 set to 1287 mV for both cards, ASIC 70.3% & 77% respectively.

With GTAV:-
70.3% card: Peak: 1.297V, Vdroop: ~1.227V
77% card: Peak: 1.289V, Vdroop: ~1.219V
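To put numbers on the droop, a quick back-of-the-envelope sketch (the voltages are the GTAV measurements quoted above; nothing here is card-specific):

```python
# Quick check of how far the load voltage droops below the peak.
# Figures are the GTAV measurements quoted above (DPM7 = 1287 mV on both cards).
cards = {
    "70.3% ASIC": {"peak": 1.297, "load": 1.227},
    "77% ASIC":   {"peak": 1.289, "load": 1.219},
}

for name, v in cards.items():
    droop_mv = round((v["peak"] - v["load"]) * 1000)
    print(f"{name}: peak {v['peak']:.3f} V, load {v['load']:.3f} V, "
          f"droop ~{droop_mv} mV")
```

Both cards droop about 70 mV, which is why a ~1.3 V BIOS setpoint ends up around 1.22-1.23 V under load.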
Quote:


> Originally Posted by *Stige*
> 
> And doesn't allow you to change VDDCI so.


If you can not raise VDDCI, then you'll just need more core voltage. That's all.

Also, I noticed this behavior, at least with my cards.

With official overclocking with MSI AB, you use either of these to get stable:-
*+X* mv for core & *+Y* mv for vddci
OR
*+(X+Z)* mv & *+0* mv for the vddci

With unofficial overclocking without powerplay with MSI AB, you can get stable (at the same clock as with official overclocking) with:-
*+X* mv for the core & *+0* mv for the vddci

*+(X+Z)* basically means higher core offset voltage.

Also, just a reminder: before switching to official overclocking from unofficial overclocking without powerplay, reset your overclock to stock first. Or else your card will stubbornly run at max clocks all the time, and only uninstalling & reinstalling the drivers will fix it. I think everyone knows this already, but I have noticed people complaining from time to time that their clocks are stuck at max.


----------



## fat4l

Quote:


> Originally Posted by *Stige*
> 
> Well VDDCI = AUX Voltage on MSI Afterburner.
> And it seems to have an effect on how high I can overclock my memory.
> 
> So what is it then if not memory voltage?


This is memory controller voltage.
It can help with mem oc. Usually 1050mV is enough.
You can only modify the VDDCI voltage directly in the BIOS via a hex editor.
I edited mine to 1050mV.
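For the curious, a hex edit like that boils down to overwriting a little-endian 16-bit millivolt field in the BIOS image. The sketch below does it in Python on a dummy buffer; the offset is purely hypothetical (the real VDDCI location varies per BIOS and has to be found with a tool like Hawaii Bios Reader first), and a real flash would also need the checksum fixed:

```python
import struct

def patch_u16_mv(rom: bytearray, offset: int, millivolts: int) -> None:
    """Overwrite a little-endian 16-bit value (a voltage stored in mV)."""
    struct.pack_into("<H", rom, offset, millivolts)

# Demo on an in-memory dummy image -- NOT a real BIOS layout.
rom = bytearray(512)
HYPOTHETICAL_VDDCI_OFFSET = 0x100  # placeholder; locate the real field first
patch_u16_mv(rom, HYPOTHETICAL_VDDCI_OFFSET, 1050)  # 1050 mV

print(struct.unpack_from("<H", rom, HYPOTHETICAL_VDDCI_OFFSET)[0])  # 1050
```

Same idea as the hex editor, just scripted; the two bytes written are 0x1A 0x04 (1050 in little-endian).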


----------



## pillowsack

Would you be interested in helping modify my bios? My XFX card was at 1050 but this MSI is 1000 stock.

I'm afraid to flash because the MSI doesn't have a BIOS switch. I do have a second GPU I can use in case things go wrong, though.

I was hoping for memory strap mods and some voltage changes.


----------



## jdorje

Quote:


> Originally Posted by *kizwan*
> 
> Also, I noticed this behavior, at least with my cards. [...]


Aux voltage can raise core stability?


----------



## kizwan

Quote:


> Originally Posted by *jdorje*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Also, I noticed this behavior, at least with my cards. [...]
> 
> 
> 
> Aux voltage can raise core stability?
Click to expand...

Helps memory overclock.


----------



## mus1mus

A little.


----------



## Stige

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, the GPU-Z render will not use 1.3V. Try games instead, e.g. GTAV, BF4, etc. You'll see the voltage can go up to 1.3V & even a little bit higher, but remember that most of the time under load the card will run at much lower than that because of Vdroop. Basically, the 1.3V that you set in the BIOS is just the peak voltage. Operating voltage = voltage after Vdroop.


I just tried flashing 1.25V and 1.3V and saw no difference in voltage running Valley.


----------



## Synntx

Alright, I got my MSI 390X in and overclocked. 1165/1725 is the highest stable overclock I can manage on this new card, whereas my Sapphire Tri-X overclocked to 1190/1750. The MSI card gets a graphics score of ~14500 and the Sapphire got ~15000. Too bad one of the fans went out; looks like my Sapphire was the better card.


----------



## Stige

Quote:


> Originally Posted by *Carniflex*
> 
> I just got the M04 version for Gigabyte 390X from them. Not installed it yet but it looks nice. Posted some pictures of it in post #62 in the thread about these blocks: http://www.overclock.net/t/1564888/alphacool-gpx-owners-club-and-information-thread/50_50#post_24755127
> 
> Price wise yeah - they are only marginally cheaper than a true full cover blocks - but what is going for them is that they have versions also for not the most popular non reference revisions, like, for example my gigabyte 390X G1 for which it's basically they only "full cover"-like block in existence.


I paid about 95€ for the full-cover M03 for my card, a lot cheaper than any other alternative here at least, not that there are any alternatives.
Most full-cover blocks cost that 95€ plus a backplate; Alphacool has the backplate included.


----------



## pillowsack

Quote:


> Originally Posted by *Synntx*
> 
> Alright, I got my MSI 390x in and overclocked. 1165/1725 is the highest stable overclock I can manage on this new card, whereas my Sapphire Tri-X overclocked 1190/1750. The MSI card gets a graphics score of ~14500 and the Sapphire got ~15000. Too bad one of the fans went out, looks like my Sapphire was a better card


I find the memory doesn't like to overclock that well. I do know that when I open up Trixx and set my vcore to +200 mV, 1225 MHz on the core seems stable for benchmarks, and 1200 MHz is rock solid.

The memory is really touchy though, like I said. Overclocking the memory on these cards doesn't help much anyway, from what I understand, right?


----------



## Stige

From my experience, 100MHz on memory is about 1 FPS in Valley.

I have managed to bench at 1750MHz so far but I need to increase my VDDCI to get it further via bios mods.


----------



## kizwan

Quote:


> Originally Posted by *Stige*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yeah, the GPU-Z render will not use 1.3V. Try games instead, e.g. GTAV, BF4, etc. You'll see the voltage can go up to 1.3V & even a little bit higher, but remember that most of the time under load the card will run at much lower than that because of Vdroop. Basically, the 1.3V that you set in the BIOS is just the peak voltage. Operating voltage = voltage after Vdroop.
> 
> 
> 
> I just tried flashing 1.25V and 1.3V and saw no difference in voltage running Valley.
Click to expand...

Are you looking at the voltage while Valley is running, or at the max reported voltage?


----------



## Stige

Quote:


> Originally Posted by *kizwan*
> 
> Are you looking at the voltage while valley is running or you look at max reported voltage?


Max.


----------



## ultimo1337

Getting pretty good results, 74.1% ASIC quality (for what it's worth).

Too bad they aren't unlockable to a 390X; I'd love the extra shader units.


----------



## viking21

Quote:


> Originally Posted by *jdorje*
> 
> It's almost like your monitor is in reduced color mode.


I had already set it to 32-bit.

Quote:


> Originally Posted by *viking21*
> 
> I got this:
> 
> 
> 
> very high settings, msaa x2, AF x16, post fx very high, population density and variety at maximum.


Anyone?
Is that performance good?


----------



## kizwan

Quote:


> Originally Posted by *viking21*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jdorje*
> 
> It's almost like your monitor is in reduced color mode.
> 
> 
> 
> I had already set to 32 bit.
Click to expand...

I have those wavy lines in the sky too. I didn't notice them until you mentioned it. Is it not normal?

Quote:


> Originally Posted by *viking21*
> 
> Quote:
> 
> 
> 
> Originally Posted by *viking21*
> 
> I got this:
> 
> 
> 
> very high settings, msaa x2, AF x16, post fx very high, population density and variety at maximum.
> 
> 
> 
> anyone?
> are those performance good?
Click to expand...

I can't really tell from the picture because the GPU is idling in the screenshot. Temp-wise it looks good to me.

This is mine with MSAA x8 & high settings. FPS above 50.

Code:




<graphics>
    <Tessellation value="2" />
    <LodScale value="1.000000" />
    <PedLodBias value="0.200000" />
    <VehicleLodBias value="0.000000" />
    <ShadowQuality value="2" />
    <ReflectionQuality value="1" />
    <ReflectionMSAA value="8" />
    <SSAO value="2" />
    <AnisotropicFiltering value="16" />
    <MSAA value="8" />
    <MSAAFragments value="0" />
    <MSAAQuality value="0" />
    <SamplingMode value="0" />
    <TextureQuality value="1" />
    <ParticleQuality value="1" />
    <WaterQuality value="1" />
    <GrassQuality value="0" />
    <ShaderQuality value="1" />
    <Shadow_SoftShadows value="1" />
    <UltraShadows_Enabled value="false" />
    <Shadow_ParticleShadows value="true" />
    <Shadow_Distance value="1.000000" />
    <Shadow_LongShadows value="false" />
    <Shadow_SplitZStart value="0.930000" />
    <Shadow_SplitZEnd value="0.890000" />
    <Shadow_aircraftExpWeight value="0.990000" />
    <Shadow_DisableScreenSizeCheck value="false" />
    <Reflection_MipBlur value="true" />
    <FXAA_Enabled value="true" />
    <TXAA_Enabled value="false" />
    <Lighting_FogVolumes value="true" />
    <Shader_SSA value="true" />
    <DX_Version value="2" />
    <CityDensity value="1.000000" />
    <PedVarietyMultiplier value="1.000000" />
    <VehicleVarietyMultiplier value="1.000000" />
    <PostFX value="1" />
    <DoF value="false" />
    <HdStreamingInFlight value="false" />
    <MaxLodScale value="0.000000" />
    <MotionBlurStrength value="0.000000" />
  </graphics>
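Since that block is just the `<graphics>` element from GTA V's settings.xml, comparing setups is easy to script. A minimal sketch using only the standard library (parsing a trimmed fragment inline here; in practice you'd load your own settings.xml):

```python
import xml.etree.ElementTree as ET

# A trimmed-down stand-in for the <graphics> element shown above.
fragment = """
<graphics>
    <MSAA value="8" />
    <AnisotropicFiltering value="16" />
    <ShadowQuality value="2" />
    <TextureQuality value="1" />
</graphics>
"""

graphics = ET.fromstring(fragment)
for setting in ("MSAA", "AnisotropicFiltering", "ShadowQuality"):
    print(setting, "=", graphics.find(setting).get("value"))
```

Diffing two settings files this way makes it obvious which quality options differ between two people's rigs.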


----------



## viking21

Quote:


> Originally Posted by *kizwan*
> 
> I have those wavy lines on the sky too. I didn't notice it until you mentioned about it. It's not normal?
> I can't really tell from the picture because gpu is idling in the screenshot. Temp wise it is look good to me.
> 
> This is mine with MSAA x8 & high settings. FPS above 50.
> 
> Code:
> 
> 
> 
> 
> <graphics>
> <Tessellation value="2" />
> <LodScale value="1.000000" />
> <PedLodBias value="0.200000" />
> <VehicleLodBias value="0.000000" />
> <ShadowQuality value="2" />
> <ReflectionQuality value="1" />
> <ReflectionMSAA value="8" />
> <SSAO value="2" />
> <AnisotropicFiltering value="16" />
> <MSAA value="8" />
> <MSAAFragments value="0" />
> <MSAAQuality value="0" />
> <SamplingMode value="0" />
> <TextureQuality value="1" />
> <ParticleQuality value="1" />
> <WaterQuality value="1" />
> <GrassQuality value="0" />
> <ShaderQuality value="1" />
> <Shadow_SoftShadows value="1" />
> <UltraShadows_Enabled value="false" />
> <Shadow_ParticleShadows value="true" />
> <Shadow_Distance value="1.000000" />
> <Shadow_LongShadows value="false" />
> <Shadow_SplitZStart value="0.930000" />
> <Shadow_SplitZEnd value="0.890000" />
> <Shadow_aircraftExpWeight value="0.990000" />
> <Shadow_DisableScreenSizeCheck value="false" />
> <Reflection_MipBlur value="true" />
> <FXAA_Enabled value="true" />
> <TXAA_Enabled value="false" />
> <Lighting_FogVolumes value="true" />
> <Shader_SSA value="true" />
> <DX_Version value="2" />
> <CityDensity value="1.000000" />
> <PedVarietyMultiplier value="1.000000" />
> <VehicleVarietyMultiplier value="1.000000" />
> <PostFX value="1" />
> <DoF value="false" />
> <HdStreamingInFlight value="false" />
> <MaxLodScale value="0.000000" />
> <MotionBlurStrength value="0.000000" />
> </graphics>




Don't go by the average; the game was paused a few times.


----------



## Synntx

Quote:


> Originally Posted by *pillowsack*
> 
> I find the memory doesn't like to overclock that well. I do know that when I open up trixx and set my vcore to 200MV though 1225mhz for core seems stable for benchmarks, 1200mhz is rock solid stable.
> 
> The memory is really touchy though like I said. Overclocking the memory on these cards doesn't help much anyways from what I understand right?


It would seem that any voltage increase beyond +125mv causes the benchmark to crash at the end. I also cannot run AB and Trixx at the same time without crashing the drivers. I dunno what to do about that.

Besides, running +200mv on this card sends the temps skyrocketing into the 80s. When I had my Tri-X I could keep them under 75°C. Seems Sapphire just makes a better card for overclocking than MSI does. But I REALLY like the look of the MSI card, plus it has a backplate. I'll just be content with a mediocre overclock. It's still gonna max out gaming at 1080p, so I'm happy.









I'm not sure what other BIOS mods I can do to get better benchmark scores. I've already modded the timings.

I've got a friend who has a STRIX DC03 390x who was eyeballing my MSI and proposed a swap. I may take him up just to test ASUS's setup.


----------



## kizwan

Quote:


> Originally Posted by *viking21*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I have those wavy lines on the sky too. I didn't notice it until you mentioned about it. It's not normal?
> I can't really tell from the picture because gpu is idling in the screenshot. Temp wise it is look good to me.
> 
> This is mine with MSAA x8 & high settings. FPS above 50.
> 
> Code:
> 
> 
> 
> 
> <graphics>
> <Tessellation value="2" />
> <LodScale value="1.000000" />
> <PedLodBias value="0.200000" />
> <VehicleLodBias value="0.000000" />
> <ShadowQuality value="2" />
> <ReflectionQuality value="1" />
> <ReflectionMSAA value="8" />
> <SSAO value="2" />
> <AnisotropicFiltering value="16" />
> <MSAA value="8" />
> <MSAAFragments value="0" />
> <MSAAQuality value="0" />
> <SamplingMode value="0" />
> <TextureQuality value="1" />
> <ParticleQuality value="1" />
> <WaterQuality value="1" />
> <GrassQuality value="0" />
> <ShaderQuality value="1" />
> <Shadow_SoftShadows value="1" />
> <UltraShadows_Enabled value="false" />
> <Shadow_ParticleShadows value="true" />
> <Shadow_Distance value="1.000000" />
> <Shadow_LongShadows value="false" />
> <Shadow_SplitZStart value="0.930000" />
> <Shadow_SplitZEnd value="0.890000" />
> <Shadow_aircraftExpWeight value="0.990000" />
> <Shadow_DisableScreenSizeCheck value="false" />
> <Reflection_MipBlur value="true" />
> <FXAA_Enabled value="true" />
> <TXAA_Enabled value="false" />
> <Lighting_FogVolumes value="true" />
> <Shader_SSA value="true" />
> <DX_Version value="2" />
> <CityDensity value="1.000000" />
> <PedVarietyMultiplier value="1.000000" />
> <VehicleVarietyMultiplier value="1.000000" />
> <PostFX value="1" />
> <DoF value="false" />
> <HdStreamingInFlight value="false" />
> <MaxLodScale value="0.000000" />
> <MotionBlurStrength value="0.000000" />
> </graphics>
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't consider the average, the game had been paused a few times.
Click to expand...

I can see the picture. What I meant is that I cannot tell from the picture whether performance is good or not. I can only tell the temps are good. The most important thing is whether the game runs smooth.


----------



## viking21

Quote:


> Originally Posted by *viking21*
> 
> 
> 
> Don't consider the average, the game had been paused a few times.


Sorry, by "performance" I meant how good those temps are.
I'm not a native speaker, so I may make some mistakes; writing here is also a way to improve my skills.


----------



## pillowsack

Quote:


> Originally Posted by *Synntx*
> 
> It would seem that any voltage increase beyond +125mv causes the benchmark to crash at the end. I also cannot run AB and Trixx at the same time without crashing the drivers. I dunno what to do about that.
> 
> Besides, running +200mv on this card sends the temps skyrocketing into the 80s. When I had my Tri-X i could keep them under 75. Seems Sapphire just makes a better card for overclocking than MSI does. But I REALLY like the look of the MSI card, plus it has a back plate. I'll just be content with a mediocre overclock. It's still gonna max out gaming in 1080p so I'm happy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not sure what other BIOS mods I can do to get better benchmark scores. I've already modded the timings.
> 
> I've got a friend who has a STRIX DC03 390x who was eyeballing my MSI and proposed a swap. I may take him up just to test ASUS's setup.


Yikes. Maybe it's not too stable because of temperature? Maybe you should consider water cooling it, dude.

I will admit the dragon backplate is cool, but it still didn't stop my card from sagging that much.

I really want a modded BIOS with the timings mod, VDDCI set to 1050 stock, and vcore bumped up a little, if anyone wants to help.


----------



## navjack27

I can do a timings mod pretty much by heart at this point, but I won't mess with voltages. I'm done chasing higher overclocks for this card. I really think you get the best REAL WORLD performance by not raising voltages and just going with the highest core and memory clocks you can get stable.


----------



## ZealotKi11er

Quote:


> Originally Posted by *navjack27*
> 
> i can do a timings mod pretty much by heart at this point. i won't mess with voltages tho. i'm done chasing higher overclocks for this card. i really think you get the best REAL WORLD performance without raising voltages and going the highest core and memory clock you can get stable.


Same here. Even under water, if I go for an overclock with extra voltage the power draw is too high; the heat coming off the radiator is as hot as a hairdryer. Also, for some reason even normal overclocks start to fail once the card hits over 60°C.


----------



## LuisFilipepio

Hi guys, I need your input please.

I got an MSI R9 390, and more recently a 144Hz monitor. The problem is my CPU is for sure "bottlenecking" the GPU, since it's an X6 1055T. I can't even hold >144 FPS in CS:GO (it goes from 90 to 150 FPS).

I'd like to hear from you guys which upgrade I should get, spending the least money in the process, without compromising the gaming experience.

Overclock my X6 1055T? An AMD FX-8320 (I read in forums it also bottlenecks the 390)? An i5 4690 or i5 6500?

Thanks in advance!


----------






## OneB1t

Overclock your CPU as high as it will go, or switch to Intel.


----------



## Stige

Quote:


> Originally Posted by *OneB1t*
> 
> overclock your cpu as high as you can go or *switch to intel*


Words of wisdom









Even a Skylake i3 beats the crap out of those AMD CPUs


----------



## mus1mus

lol.

Can someone even justify the upgrade cost just to maintain a solid 144 FPS vs 120 FPS?

While monitoring you can see the difference between 100 and 144, for example, but will you even feel it when you're not looking at the numbers?
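For a sense of scale, the gap being debated is only a couple of milliseconds of frame time:

```python
# Frame-time budget at common frame rates: ms per frame = 1000 / fps.
for fps in (60, 100, 120, 144):
    print(f"{fps:>3} FPS -> {1000 / fps:.2f} ms per frame")
```

Going from 120 to 144 FPS shaves the frame time from 8.33 ms to 6.94 ms, about 1.4 ms per frame; whether that is perceptible is exactly the question.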


----------



## Stige

Everything is smoother!!1111

ALWAYS GO FOR SMOOOOOOOOOOOOTH!


----------



## mus1mus

Greed









Like asking for 1.5V core voltage for the GPU when 1.35V is already enough to burn the house down.









Seriously though, OC'ing his CPU will improve his framerates.

But in your defense, his Phenom really begs for retirement.


----------



## OneB1t

I have an OCed FX-8320, and for the workload I'm doing it's better than an i5-4xxx at half the investment.

So it's not that black and white.

I'm also running games at 4K, so no CPU bottleneck problem.


----------



## Carniflex

Quote:


> Originally Posted by *LuisFilipepio*
> 
> Hi guys, I need your input please.
> 
> I Got a MSI R9 390, and more recently a 144hz monitor. The problem is my CPU is for sure "bottlenecking" the GPU, since its a X6 1055t. I can't even get CS GO >144 fps (goes from 90 to 150fpss).
> 
> I like to hear from you guys which upgrade should i get, spending the less money in the process, without compromising the gaming experience.
> 
> Overclocking my X6 1055t? AMD FX8320 (i read in forums it also bottle necks the 390?)? I5 4690 // i5 6500?
> 
> Thanks in advance!


I would crank the 1055T up to ~3.7-3.8 GHz for a start. It goes up to that relatively easily, often without even needing additional voltage (depending on luck in the silicon lottery). From that point onwards you will be running up a steeper hill, and reaching 4.0 GHz would need a substantial voltage bump and result in a much higher heat load for a relatively meager ~200 MHz gain. It's a decent CPU and has no problems running a single GFX card. If the 1055T @ 3.8 GHz is not doing the trick, then in most games neither will a non-"K" i5. Where there is a difference is in multiple-card setups (SLI/Crossfire), but even there I would say the difference is not overly dramatic. Besides, if multi-card setups are a concern, the correct path would lead to the LGA2011 platform, not to the mainstream segment where the PCIe buses normally don't run at full x16 speed with multiple cards.

I would say there is no real point in "upgrading" from a 1055T to an FX-8320. It's more of a sidegrade.
Quote:


> Originally Posted by *Stige*
> 
> Words of wisdom
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even a Skylake i3 beats the crap out of those AMD CPUs


I assume you are not speaking from personal experience







I went from a 1055T to an i7-3820 a while ago and to be frank, I did not "feel" any difference. Sure, there is some difference in benchmarks, but the 1055T @ 3.8 GHz was gaming just as fine for me as the i7-3820 @ 4.5 GHz does. Granted, my reason for upgrading was not gaming; I needed a feature present on the LGA2011 platform, as the 32 GB of RAM I had on the 1055T was no longer quite cutting it and the only option to go higher at a price point accessible to me at the time was LGA2011. But in everyday gaming, not to mention desktop tasks, one does not feel that approx. 30% performance difference (for the specific load I had in mind when upgrading).


----------



## LuisFilipepio

Quote:


> Originally Posted by *Carniflex*
> 
> I would crank up the 1055T to ~3.7 .. 3.8 GHz for a start. It goes up to that relatively easily often without even needing additional voltage (depending on the luck on silicon lottery) - from that point onwards you will be running already up the steeper hill and reaching 4.0 GHz would need substantial voltage bump and result in much higher heat load for relatively meager ~200 MHz gain. It's a decent CPU and has no problems running a single GFX card. If 1055T @ 3.8 GHz is not doing the trick then in most games neither will non "k" version of i5. Where there is a difference is in multiple card setups (SLI / Crossfire) but even there I would say, the difference is not overly dramatic. Besides, if multi card setups are of concern then the correct path to take would lead to LGA2011 platfrom, not to the mainstream segment where the PCIe buses are not running normally at full x16 speed with multi card setups.
> 
> I would say that there is no real point of "upgrading" from 1055T to FX8320. It's more of a sidegrade.
> I assume you are not speaking from personal experience
> 
> 
> 
> 
> 
> 
> 
> I went from 1055T to i7-3820 a while ago and to be frank - I did not "feel" any difference. Sure, there is some difference in benchmarks but the AMD 1055T @ 3.8 GHz was gaming just as fine for me as i7 - 3820 @ 4.5 GHz does. Granted my reason for upgrade was not gaming but I needed a feature which was present on LGA2011 platform as 32 GB of RAM which I had on 1055T was no longer quite cutting it for me and only option to go higher in the price point accessible to me at the time was LGA2011. But in everyday gaming not to mention in desktop tasks one does not feel that approx 30% performance difference (for the specific load I had in mind when upgrading).


The GPU is the R9 390. From all your answers, I guess I could get good gaming performance just by overclocking to 3.8 GHz. The 8320 would just be a waste of money.

Otherwise, get the i5 4690 or i5 6500, with all the costs that come with it (mobo, new DDR4 RAM...), right?


----------



## Carniflex

Quote:


> Originally Posted by *LuisFilipepio*
> 
> The GPU is the R9 390. I guess for all your answers i could get good gaming performance for just overclocking to 3,8GHz. The 8320 would just be a waste of Money.
> 
> Otherwise get the i5 4690 or i5 6500, with all the costs that come with it (mobo, new DDR4 ram...), right?


Yes.

With a small remark that life is rarely as simple as a single correct yes-or-no answer. Overall, it really depends on the budget and the goals. If you would be changing platform anyway, it might make sense to cough up a few hundred more and go for the LGA2011 platform; it would probably last you half a decade or even more without any need to upgrade. The AMD platform's issue nowadays is not exactly a lack of CPU power, but that the chipsets are basically ancient, especially on AM3+: lack of PCIe 3.0 and so on. Although, to be fair, with a single card it does not make a lot of difference whether it's running on PCIe 2.0 x16 or PCIe 3.0 x16.

Basically, the 1055T you have should last you a few good years more with a modest overclock. Note in addition that there are some games that heavily prefer Intel CPUs, like Skyrim, because of very crappy coding (x87 extension reliance, strictly single-threaded, game physics tied to the frame rate, etc.). The current trend is that newer engines try to make use of at least 4 threads, if not 8. The 1055T has 6 true full cores (both integer and floating-point units) and does pretty well in applications which can make proper use of multiple threads.


----------



## OneB1t

You will see that with DX12 games written from scratch, the FX-8xxx will keep up with the i7-4xxx pretty easily.


----------



## jodybdesigns

Quote:


> Originally Posted by *LuisFilipepio*
> 
> The GPU is the R9 390. I guess for all your answers i could get good gaming performance for just overclocking to 3,8GHz. The 8320 would just be a waste of Money.
> 
> Otherwise get the i5 4690 or i5 6500, with all the costs that come with it (mobo, new DDR4 ram...), right?


I JUST built a Skylake i3-6100 for my boss yesterday. It literally tears my 1045T @ 3.6 GHz in half and slings it all over the room. And it's only 2 cores lol

I would go with a Skylake i3 (or the i5) over anything AMD has to offer right now. And trust me, I am an AMD fanboy. But they just don't have anything to offer for the money. $25 more for an i3 with DDR4 and newer tech? Um, yes please.

*edit* And what is this myth you speak of called DX12? I swear the developers are trolling us like they did in Pokemon when they told us Mew was under the truck. We just kept pushing and pulling lol


----------



## Stige

Quote:


> Originally Posted by *jodybdesigns*
> 
> I JUST built a Skylake i3-6100 for my boss yesterday. It literally tears my 1045T @ 3.6ghz in half and slings it all over the room. And it is only 2 cores lol
> 
> I would go with a Skylake i3 (or the i5) over anything AMD has to offer right now. And trust me, I am an AMD fanboy. But they just don;t have anything to offer for the money. $25 more for an i3 with ddr4 and newer tech? Um, yes please.
> 
> *edit* And what is this myth you speak of called DX12? I swear the developers are trolling us like they did in Pokemon when they told us Mew was under the truck. We just kept pushing and pulling lol


----------



## mus1mus

Quote:


> Originally Posted by *Carniflex*
> 
> Yes.
> 
> With a small remark that life is rarely as simple as single correct yes or no answer. Overall I mean it really depends on budget and what are the goals. If you would be already changing platform then it might make sense to cough up few hundred more and go for LGA2011 platfrom - it would probably last you half a decade or even more without any need to upgrade. AMD platfrom issue nowadays is not exactly the lack of CPU power but that the chipsets tend to be ancient basically. Especially for the AM3+ platfrom. Lack of PCIe 3.0 and so on. Although to be fair with single card it does not make a lot of difference if it's running on PCIe 2.0 x16 or PCIe 3.0 x16.
> 
> Basically the 1055T you have should last you few good years more with modest over-clock. Noting in addition that there are some games that heavily prefer intel cpu's, like, for example, the Skyrim because it's very crappy coding (x87 extension reliance, strictly single threaded, game physics tied to the frame rate, etc). Current trend seems to be that fresher engines try to make use of at least 4 threads if not 8. AMD 1055T has 6 true full cores (both integer and floating point units) and it does pretty well in applications which can make a proper use of multiple threads.


In addition, moving up to an 8320E would be notable: newer instruction sets, 2 more threads, easier to OC.

But it will also need cooling support and preferably a good motherboard, so it barely makes sense to call it an upgrade, as you will be close to calling it quits again very soon. Current AMD pure CPUs are at a dead end.

Now, moving to Intel, I wouldn't call anything an upgrade from your situation other than the i7s. Forget i3s and i5s.

For all the claims of them being better, going from a weak 6-core to a strong quad is still a sidegrade. Pick something you can enjoy longer.

An upgrade to a 5820K is now more affordable than ever, and it will still be competitive at least 2 years from now.

Those who talk about i5s, and especially i3s, are less concerned with the overall significance of a multicore/multithreaded CPU. Fewer and fewer games rely on a single thread, and the notion that "games don't use more than 2 cores" is no longer relevant.
Quote:


> Originally Posted by *jodybdesigns*
> 
> I JUST built a Skylake i3-6100 for my boss yesterday. *It literally tears my 1045T @ 3.6ghz in half and slings it all over the room.* And it is only 2 cores lol
> 
> I would go with a Skylake i3 (or the i5) over anything AMD has to offer right now. And trust me, I am an AMD fanboy. But they just don;t have anything to offer for the money. $25 more for an i3 with ddr4 and newer tech? Um, yes please.
> 
> *edit* And what is this myth you speak of called DX12? I swear the developers are trolling us like they did in Pokemon when they told us Mew was under the truck. We just kept pushing and pulling lol


With what? Excel?









You are living in the past. Enjoy Skyrim as long as you can; every single-threaded app will keep you blind to where development is heading.

DX12 is not a fad. And multithreading is not the future; it's already here.

An i3 belongs to people whose lives revolve around spreadsheets and browsing a very limited spectrum of life's happenings.

Tell me, have you seen the FX fly?


----------



## jodybdesigns

Quote:


> Originally Posted by *mus1mus*
> 
> With what? Excel?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You are living in the past. Enjoy Skyrim as long as you can. And every single threaded app will keep you blinded with the development.
> 
> It's here. DX12 is not a fad. And multithread is not the future.
> 
> An i3 belongs to people whose life revolves into spreadsheets and browsing on a very limited spectrum of life's happenings.
> 
> Tell me, have you seen the FX fly?


Skyrim lol. (Fallout 4 lol too.)

Don't get your feelings hurt because a dual-core proc is 6% faster than a 6-core. That's just the way it is.

The only DX12 anything I have seen is developers showing off their stupid engines. I couldn't care less what your engine looks like. Give me something to play.

The last time I saw an FX fly, it exploded mid-air like a Chinese rocket. That's why I am still rocking a Deneb.

*edit* Yeah, my boss works on spreadsheets, so the i3 was a perfect fit as intended.


----------



## LuisFilipepio

Thanks for the input guys.

As for the LGA2011 platform (5820K), it's a couple hundred euros more than the i5 6500/4690 counterpart, which right now is more than I can spend.

Since you all think the upgrade to the i5 4690 or i5 6500 wouldn't be that beneficial, I think I will just OC my 1055T and wait for better days (€€!)


----------



## jodybdesigns

Quote:


> Originally Posted by *LuisFilipepio*
> 
> Thanks for the input guys.
> 
> As for the LGA2011 platform (5820K), it's a couple hundred euros more than the i5 6500 / 4690 counterparts, which right now is more than I can spend.
> 
> Since you all think the upgrade to the i5 4690 or i5 6500 wouldn't be that beneficial, I think I will OC my 1055T and wait for better days (€€!)


Dude, we played several games on my 1045T OC'd to 3.6 GHz paired with an HD 7870 XT. We had a friggin blast in the living room on Christmas (it's an HTPC). Everybody wanted to play Mortal Kombat X and GTA5. Both games played great @ 1080p. The system didn't hiccup. Just hold on to what you have if you don't have the money to fully upgrade.


----------



## jdorje

Quote:


> Originally Posted by *LuisFilipepio*
> 
> Overclocking my X6 1055t? AMD FX8320 (i read in forums it also bottle necks the 390?)? I5 4690 // i5 6500?
> 
> Thanks in advance!


4690k or better and overclock it.


----------



## Charcharo

How do I use VSR?
I am stumped


----------



## diggiddi

Quote:


> Originally Posted by *Charcharo*
> 
> How do I use VSR?
> I am stumped


Open CCC or Crimson; under Display, enable VSR.


----------



## Charcharo

Quote:


> Originally Posted by *diggiddi*
> 
> Open CCC or Crimson; under Display, enable VSR.


Thing is I see no difference. I am on 1440 x 900 though









Is that the issue? An unsupported resolution? Will it work in all my games by default?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Charcharo*
> 
> Thing is I see no difference. I am on 1440 x 900 though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that the issue? An unsupported resolution? Will it work in all my games by default?


So you have to go to the game you play and set the higher resolutions.


----------



## diggiddi

Well your desktop icons should change size (smaller) indicating higher res


----------



## jdorje

Quote:


> Originally Posted by *diggiddi*
> 
> Well your desktop icons should change size (smaller) indicating higher res


VSR desktop? Is that any good at all?


----------



## diggiddi

Quote:


> Originally Posted by *jdorje*
> 
> VSR desktop? Is that any good at all?


Yes it is, if, like him, you have a sub-1080p monitor with those "ginormous" icons.


----------



## jodybdesigns

Eh, VSR is sketchy. When I boost ANY of my monitors' resolutions, everything becomes a bit aliased (rough edges).

It makes text unbearable IMO. It's like looking at a 60 inch TV @ 1080p. You can't read anything lol


----------



## diggiddi

Quote:


> Originally Posted by *jodybdesigns*
> 
> Eh, VSR is sketchy. When I boost ANY of my monitors' resolutions, everything becomes a bit aliased (rough edges).
> 
> It makes text unbearable IMO. It's like looking at a 60 inch TV @ 1080p. You can't read anything lol


Your normal resolution is 1080 right?


----------



## Charcharo

Well... the desktop does not change and I don't get new options available in the games I play


----------



## jodybdesigns

Quote:


> Originally Posted by *diggiddi*
> 
> Your normal resolution is 1080 right?


Yeah, my Lenovo in the living room is 1080p, and my 144Hz Asus is as well. Both upscale to 1440p with VSR. Text is really hard to read on both, though.
Quote:


> Originally Posted by *Charcharo*
> 
> Well... the desktop does not change and I don't get new options available in the games I play


Check your advanced display settings in Windows once you turn on VSR


----------



## diggiddi

Quote:


> Originally Posted by *jodybdesigns*
> 
> Yeah, my Lenovo in the living room is 1080p, and my 144Hz Asus is as well. Both upscale to 1440p with VSR. Text is really hard to read on both, though.
> Check your advanced display settings in Windows once you turn on VSR


IMO 1080p is the cutoff point; beyond that it's diminishing returns.
You should be able to right-click on the desktop and open Screen Resolution from the pop-up menu.


----------



## Charcharo

Quote:


> Originally Posted by *jodybdesigns*
> 
> Yeah, my Lenovo in the living room is 1080p, and my 144Hz Asus is as well. Both upscale to 1440p with VSR. Text is really hard to read on both, though.
> Check your advanced display settings in Windows once you turn on VSR


Well, I did it. I was too stupid to realize I needed to change it in the resolution settings.
For some reason I thought it would pop up with a tab asking me to change the resolution manually... don't ask me why... so when it didn't pop up I thought it wasn't working.

Anyway... at 2560x1600 my desktop looks funky







Is that due to Win 7?

I benchmarked Metro Redux:

Options: Resolution: *2560 x 1600; Quality: Very High; SSAA: Off; Texture filtering: AF 16X; Motion Blur: Normal; Tesselation: Very High; VSync: Off; Advanced PhysX: Off;*
Average Framerate: *50.16*
Max. Framerate: 146.46 (Frame: 7514)
Min. Framerate: 17.53 (Frame: 8)

Options: Resolution: *2560 x 1600;* Quality: Very High; *SSAA: On*; Texture filtering: AF 16X; Motion Blur: Normal; Tesselation: Very High; VSync: Off; *Advanced PhysX: On;*
Average Framerate: *25.48*
Max. Framerate: 113.73 (Frame: 1946)
Min. Framerate: 6.45 (Frame: 8)

Is this in line, guys?


----------



## Jaffi

Is forcing constant voltage in Afterburner a good idea? I am currently doing some OC and so far the card runs stable @1150/1600 but I had to bump the voltage to +67 to get no artifacts in firestrike scene 1 looping (which I found is the best way to determine if core OC is stable). Games run fine @+63 mV.


----------



## Deymark

I wanna buy an R9 390 from amazon.com (got my money there).
Which manufacturer do you recommend? Should I wait for better prices?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Deymark*
> 
> I wanna buy an R9 390 from amazon.com (got my money there).
> Which manufacturer do you recommend? Should I wait for better prices?


Sapphire.


----------



## Deymark

The Sapphire costs $14 more than the MSI, and the MSI is $14 more than the Giga G1.
Should I wait for better offers, or what should I do?


----------



## Stige

Get a Sapphire regardless, you won't regret it.

I had a HD7950 Vapor-X which was marvelous, now I have ASUS Strix DC3 R9 390 and it's crap, the VRM is on fire with stock cooler...


----------



## TsukikoChan

I need help guys and I'm not sure how to continue with my sapphire 390 (Back from holiday and able to post again):

1) Sometimes my middle monitor (DVI) randomly shows a black screen, and Windows won't correct it until I turn the monitor off and back on. This, however, leads into point 2.

2) When I turn off a monitor (this didn't occur before on my 7870), Windows/Crimson now detects it as a signal loss (i.e. as if I pulled out the cable) and changes my monitor layout, which in turn kills Windows Explorer (it needs restarting) and jumbles my icons and stuff on the desktop. My old 7870 never detected me turning off a monitor and didn't care, so why should the 390x? I don't want my monitor layout to change dynamically. The thing is, it doesn't do this for at least one of the other monitors (can't remember if it's connected via DVI or DisplayPort).

3) My screens 'freeze' and Windows crashes (no visual change), rarely, if I have an OC on my CPU/GPU now. Part of me thinks this is down to my PSU (Cooler Master GX 750W) not supporting an 8350 OC plus the 390x... it should handle it watt-wise though :<

Any ideas, guys? I don't want to upgrade my PSU to fix point 3 only to find that points 1 and 2 are still there. I read that point 1 might be down to not enough power as well. Not sure what/how to fix point 2.

ty! <3


----------



## Deymark

/del


----------



## Deymark

Quote:


> Originally Posted by *Deymark*
> 
> I am talking about the normal Sapphire, not http://www.amazon.com/gp/product/B00ZGL8CYY?keywords=r9%20390&qid=1452090005&ref_=sr_1_8&s=pc&sr=1-8 which is like $60 more expensive.
> 
> http://www.amazon.com/Sapphire-11244-01-20G-Radeon-NITRO-Graphics/dp/B015DXHEAW/ref=sr_1_4?s=electronics&ie=UTF8&qid=1452090527&sr=1-4&keywords=r9+390
> vs
> http://www.amazon.com/MSI-R9-390-GAMING-8G/dp/B00ZGF0UAE/ref=sr_1_1?s=electronics&ie=UTF8&qid=1452090527&sr=1-1&keywords=r9+390


----------



## ZealotKi11er

Quote:


> Originally Posted by *Deymark*


Get the cheaper Sapphire card.


----------



## Synntx

Quote:


> Originally Posted by *Jaffi*
> 
> Is forcing constant voltage in Afterburner a good idea? I am currently doing some OC and so far the card runs stable @1150/1600 but I had to bump the voltage to +67 to get no artifacts in firestrike scene 1 looping (which I found is the best way to determine if core OC is stable). Games run fine @+63 mV.


Yes, force constant voltage. This will help with stability. What card do you have? +67mv seems high for those speeds.
Quote:


> Originally Posted by *Charcharo*
> 
> Well, I did it. I was too stupid to realize I needed to change it in the resolution settings.
> For some reason I thought it would pop up with a tab asking me to change the resolution manually... don't ask me why... so when it didn't pop up I thought it wasn't working.
> 
> Anyway... at 2560x1600 my desktop looks funky
> 
> 
> 
> 
> 
> 
> 
> Is that due to Win 7?
> 
> I benchmarked Metro Redux:
> 
> Options: Resolution: *2560 x 1600; Quality: Very High; SSAA: Off; Texture filtering: AF 16X; Motion Blur: Normal; Tesselation: Very High; VSync: Off; Advanced PhysX: Off;*
> Average Framerate: *50.16*
> Max. Framerate: 146.46 (Frame: 7514)
> Min. Framerate: 17.53 (Frame: 8)
> 
> Options: Resolution: *2560 x 1600;* Quality: Very High; *SSAA: On*; Texture filtering: AF 16X; Motion Blur: Normal; Tesselation: Very High; VSync: Off; *Advanced PhysX: On;*
> Average Framerate: *25.48*
> Max. Framerate: 113.73 (Frame: 1946)
> Min. Framerate: 6.45 (Frame: 8)
> 
> Is this in line, guys?


The only drawback to VSR is you'll have to increase font size and icon size, and recalibrate ClearType in Windows. It looks pretty weird and I never could get it to work just right, to where it doesn't hurt my eyes.
Quote:


> Originally Posted by *Deymark*
> 
> I wanna buy an R9 390 from amazon.com (got my money there).
> Which manufacturer do you recommend? Should I wait for better prices?


Get a Sapphire with a backplate. They seem to manufacture the best Grenada cards.


----------



## jodybdesigns

Quote:


> Originally Posted by *Deymark*
> 
> The Sapphire costs $14 more than the MSI, and the MSI is $14 more than the Giga G1.
> Should I wait for better offers, or what should I do?


Always go with Sapphire or PowerColor when buying AMD GPUs. I owned five 7950 Vapor-Xs in a miner. Took 2 of them out and I am still using them to this day in Crossfire. Overclocked the entire time, even more when mining. I still have 80% ASIC on both cards.

I still own 2 Sapphire 4870's and 1 Sapphire 7770.

I still own 1 Sapphire 6870 that is a 5870 rebadge, which I put in a Mac Pro.

My cousin owns 2 Powercolor 7970's in Crossfire. Overclocked running 25/8 for 3 years.

He also has a Powercolor 3870 in his HTPC that has been going a whopping 6 years constant.

My 2 cents.


----------



## Whippet

Just replaced my R9 280 with the new Sapphire Nitro 390x with backplate.
Love it, temp maxes out at 71 running all 3D Mark primary tests and happy with the performance.
Major downside is the hideous coil whine. Even on game menus, before starting, it's very audible, and in game it's ridiculous.
Luckily I game with closed cup headphones but not sure I can live with it.


----------



## Agent Smith1984

I apologize to anyone not listed on the club list recently.

For some reason I am not getting my new post emails from OC that prompts me to check post, so I am doing some back tracking and will have the list updated by Friday.

Thanks again for all the member support!!


----------



## Kriggs

Guys, my R9 390 Nitro is oscillating its core clock even though the game is only hitting 50fps (Far Cry 4). Is this normal? Can I make it stay at 1040MHz like it does in Witcher 3? Thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kriggs*
> 
> Guys, my R9 390 Nitro is oscillating its core clock even though the game is only hitting 50fps (Far Cry 4). Is this normal? Can I make it stay at 1040MHz like it does in Witcher 3? Thanks.


Try increasing the power limit for the card. I don't think you can force the core to stay at 1040MHz. For me it doesn't drop unless the game isn't using the GPU much, but I don't remember the card fluctuating in frequency in FC4.


----------



## Kriggs

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Try increasing the power limit for the card. I don't think you can force the core to stay at 1040MHz. For me it doesn't drop unless the game isn't using the GPU much, but I don't remember the card fluctuating in frequency in FC4.


It does help with a 10% power boost, not 100% but there's less fluctuation. Why though, if I haven't done any overclocking myself? It's basically all stock, sitting on a 1000W PSU at 60ºC, so throttling isn't an issue.

After:


What are the disadvantages of the power boost? Same as upping the VCORE?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kriggs*
> 
> It does help with a 10% power boost, not 100% but there's less fluctuation. Why though, if I haven't done any overclocking myself? It's basically all stock, sitting on a 1000W PSU at 60ºC, so throttling isn't an issue.
> 
> After:
> 
> 
> What are the disadvantages of the power boost? Same as upping the VCORE?


Well, it will let the card use more power, so it will run hotter. No, it's not the same as vCore; too much vCore can damage the card.


----------



## Whippet

Quote:


> Originally Posted by *Whippet*
> 
> Just replaced my R9 280 with the new Sapphire Nitro 390x with backplate.
> Love it, temp maxes out at 71 running all 3D Mark primary tests and happy with the performance.
> Major downside is the hideous coil whine. Even on game menus, before starting, it's very audible, and in game it's ridiculous.
> Luckily I game with closed cup headphones but not sure I can live with it.


Actually, it turns out to be the PSU after listening up close; I wonder how much whine is misdiagnosed as coming from the GPU.

The PSU is a decent one, an Antec semi-modular 650W that in reviews handled well above 700W.

What do you reckon, live with it or get a new PSU, and if so, which brand?


----------



## pillowsack

Quote:


> Originally Posted by *Whippet*
> 
> Actually, it turns out to be the PSU after listening up close; I wonder how much whine is misdiagnosed as coming from the GPU.
> 
> The PSU is a decent one, an Antec semi-modular 650W that in reviews handled well above 700W.
> 
> What do you reckon, live with it or get a new PSU, and if so, which brand?


I think you should be ok with a 650W. What's the rest of your system like?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Kriggs*
> 
> Guys, my R9 390 Nitro is oscillating its core clock even though the game is only hitting 50fps (Far Cry 4). Is this normal? Can I make it stay at 1040MHz like it does in Witcher 3? Thanks.


What's your CPU? Did you try over-clocking it?


----------



## jdorje

Quote:


> Originally Posted by *Jaffi*
> 
> Is forcing constant voltage in Afterburner a good idea? I am currently doing some OC and so far the card runs stable @1150/1600 but I had to bump the voltage to +67 to get no artifacts in firestrike scene 1 looping (which I found is the best way to determine if core OC is stable). Games run fine @+63 mV.


Could be wrong, but I think the answer varies by card. It presumably means some sort of LLC for less voltage droop. But it'll raise temps, so it's a question of whether LLC is better or whether it's better to just raise the voltage.


----------



## Kriggs

Quote:


> Originally Posted by *Ha-Nocri*
> 
> What's your CPU? Did you try over-clocking it?


i5 [email protected]; it stays at 50~60%, rarely tops that.


----------



## Whippet

Quote:


> Originally Posted by *pillowsack*
> 
> I think you should be ok with a 650W. What's the rest of your system like?


6600K @ 4.4, only running one SSD and one HDD. The only other peripherals are an NZXT Grid+ powered by one Molex running 4 case fans, and my Noctua C15 cooler.
So I'm thinking I'm well within limits. I tried GPU overclocks and voltage changes, still bad. The only way to reduce the PSU noise is to frame-rate limit to 30 FPS.


----------



## Jaffi

I was OC'ing my 390X Nitro, running @1200/1600 with +100 mV. VRM and GPU temps were peaking @ 85/75 degrees, yet the card wasn't maintaining its maximum boost clock. When I bumped the fan speed manually, the GPU temp went to 65°C and it was much more stable. So why would it already thermally throttle @ 75°C? Or was it something else?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jaffi*
> 
> I was OC'ing my 390X Nitro, running @1200/1600 with +100 mV. VRM and GPU temps were peaking @ 85/75 degrees, yet the card wasn't maintaining its maximum boost clock. When I bumped the fan speed manually, the GPU temp went to 65°C and it was much more stable. So why would it already thermally throttle @ 75°C? Or was it something else?


The hotter the card runs, the more power it leaks, so you were probably running into the power limit.


----------



## Jaffi

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The hotter the card runs, the more power it leaks, so you were probably running into the power limit.


Geez, even when I had already set the limit to +50? Well, according to GPU-Z the card was pulling 333 watts at that point, so I guess that could have been it!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jaffi*
> 
> Geez, even when I had already set the limit to +50? Well, according to GPU-Z the card was pulling 333 watts at that point, so I guess that could have been it!


Try going a bit lower. Try +25%. Giving too much power can make the card run hotter.


----------



## pillowsack

Quote:


> Originally Posted by *Whippet*
> 
> 6600K @ 4.4, only running one SSD and one HDD. The only other peripherals are an NZXT Grid+ powered by one Molex running 4 case fans, and my Noctua C15 cooler.
> So I'm thinking I'm well within limits. I tried GPU overclocks and voltage changes, still bad. The only way to reduce the PSU noise is to frame-rate limit to 30 FPS.


The PSU just has coil whine. How old is it? Got the exact model, so I can see if there's a JonnyGuru review?


----------



## jdorje

Made a few more tweaks to my XFX (8256) 390!









I opened the card up and replaced the VRM padding and the TIM. Used some good stuff for the padding but the TIM is just the cheapo ceramique I've had lying around for years. Still, core temps were improved by a couple degrees and VRMS by a fair bit. I used a manual-spreading method with a piece of paper to get a thin cover of TIM over the whole chip. Not sure this is the best for this chip but, unlike a CPU, it's important to get the whole thing covered. Which is hard with the dab method without putting on way too much and having it run everywhere. I may replace it with CLU, but the heat sink appears to be pure copper which is going to absorb the liquid over time; on the other hand just seeing that happen might be worth the effort. It's VRM temps, not core temps, that are the issue though so it's not going to be worth spending money to improve the core cooling alone.

I tweaked my VRAM speed in 5 mhz increments, running 3 valley full-screen benches with everything else shut down so that I'd get pretty consistent numbers. I notice if you tab out of valley and reopen it, scores drop significantly (and match what you get if you run it windowed). It maxed out at 1140 mhz before starting to drop (first slowly, then tremendously by 1160). So I'm on 1140 mhz ram.
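
The incremental search above can be sketched as a small loop; `run_bench` here is a hypothetical stand-in for launching Valley full-screen and reading back a score, not a real API.

```python
# Sketch of the incremental VRAM search described above: step the clock in
# 5 MHz increments, average a few benchmark runs per step, and stop once
# the average score starts dropping. run_bench is a hypothetical stand-in
# for running the actual benchmark at a given clock and returning a score.
def find_peak_clock(clocks, run_bench, runs=3):
    best_clock, best_score = None, float("-inf")
    for clock in clocks:
        avg = sum(run_bench(clock) for _ in range(runs)) / runs
        if avg <= best_score:          # scores started dropping: past the peak
            break
        best_clock, best_score = clock, avg
    return best_clock
```

With a score curve that peaks at 1140, `find_peak_clock(range(1100, 1165, 5), run_bench)` stops on the first drop and returns 1140, mirroring the manual stepping.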

I got into bios rom tweaking, with help from the nice people and the guide over at the bios thread. First I switched from adaptive to a static VID, which is currently at 1200 (that's in millivolts, so 1.2V) to go with my 1090 core speed. I'm not positive this is quite enough voltage; an aida64 dump indicated my stock VID was 1228 but that seemed too high so I dropped it a bit. Might add a few ticks back on though or even go up to 1100 core speed. At this setting temps are well in hand, 75-80C core and 90-95C VRM. I also changed the fan speed in the bios to a custom one (lower at idle, heavier under load), which does contribute to my temps. None of these give a performance boost, they just mean I don't play around in afterburner so much anymore.

The last bios change was moving around ram straps. The hawaii vram goes in straps, so from 1501-1625 mhz it runs at one set of timings but once you go to 1626-1750 mhz it switches to a slower set of timings. The adjustment here is to use a hex editor to open up the rom itself (the hawaii reader frontend can't do this yet) and copy the whole block of strap timings from one set to another. I ended up using the 1101-1250 mhz strap for everything up to 2000 mhz, and it remains stable at the same 1740 mhz but with a 2-3% overall FPS increase.
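
As a rough illustration of that strap-copy edit (the offsets and block length below are invented for the example; real Hawaii ROMs need the locations found with a hex editor or the Hawaii BIOS tools, and the ROM checksum typically has to be fixed up before flashing):

```python
# Toy sketch of copying one memory strap's timing block over another in a
# BIOS ROM dump. SRC_OFF, DST_OFF and BLOCK_LEN are made-up values for
# illustration only; real offsets must be located in the actual ROM.
SRC_OFF = 0x1A00    # assumed: start of the 1101-1250 MHz strap timings
DST_OFF = 0x1A40    # assumed: start of the 1626-1750 MHz strap timings
BLOCK_LEN = 0x40    # assumed length of one strap's timing block

def copy_strap(rom: bytes) -> bytes:
    # Overwrite the destination strap's timings with the source strap's.
    data = bytearray(rom)
    data[DST_OFF:DST_OFF + BLOCK_LEN] = data[SRC_OFF:SRC_OFF + BLOCK_LEN]
    return bytes(data)
```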

Overall, ram tweaking is the most interesting part of the overclock. Raising core to match a set voltage is quite easy, or raising voltage to match a set core, but the tradeoff of voltage per core clock is not a very good one and temps and power use both skyrocket. Overclocking the ram and lowering the timings gives just as large of a performance boost (maybe more) with essentially no increase in power use.

Happy with this for my everyday overclock now. Power use is lower than it was on stock (EDIT: not sure this is strictly true, but, it is quite low - 415W at the wall when running both valley and x264 at the same time). Valley (hd extreme) scores are in the 2890-2895 range. 3dmark graphics score 13440 : http://www.3dmark.com/3dm/10143902

I also did bump the voltage up to 1350 and got clock up to 1175 for one 11.5k 3dmark run: http://www.3dmark.com/3dm/10131521


----------



## Jaffi

How dangerous is 1200/1600 with +100mV for a 390x nitro when temps are below 80/90 degrees for GPU/VRMs? Is it ok to do some benchmark runs with those settings even if you get some artifacts? I think it should be good?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jaffi*
> 
> How dangerous is 1200/1600 with +100mV for a 390x nitro when temps are below 80/90 degrees for GPU/VRMs? Is it ok to do some benchmark runs with those settings even if you get some artifacts? I think it should be good?


Its fine.


----------



## pillowsack

Quote:


> Originally Posted by *jdorje*
> 
> Made a few more tweaks to my XFX (8256) 390!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I opened the card up and replaced the VRM padding and the TIM. Used some good stuff for the padding but the TIM is just the cheapo ceramique I've had lying around for years. Still, core temps were improved by a couple degrees and VRMS by a fair bit. I used a manual-spreading method with a piece of paper to get a thin cover of TIM over the whole chip. Not sure this is the best for this chip but, unlike a CPU, it's important to get the whole thing covered. Which is hard with the dab method without putting on way too much and having it run everywhere. I may replace it with CLU, but the heat sink appears to be pure copper which is going to absorb the liquid over time; on the other hand just seeing that happen might be worth the effort. It's VRM temps, not core temps, that are the issue though so it's not going to be worth spending money to improve the core cooling alone.
> 
> I tweaked my VRAM speed in 5 mhz increments, running 3 valley full-screen benches with everything else shut down so that I'd get pretty consistent numbers. I notice if you tab out of valley and reopen it, scores drop significantly (and match what you get if you run it windowed). It maxed out at 1140 mhz before starting to drop (first slowly, then tremendously by 1160). So I'm on 1140 mhz ram.
> 
> I got into bios rom tweaking, with help from the nice people and the guide over at the bios thread. First I switched from adaptive to a static VID, which is currently at 1200 (that's in millivolts, so 1.2V) to go with my 1090 core speed. I'm not positive this is quite enough voltage; an aida64 dump indicated my stock VID was 1228 but that seemed too high so I dropped it a bit. Might add a few ticks back on though or even go up to 1100 core speed. At this setting temps are well in hand, 75-80C core and 90-95C VRM. I also changed the fan speed in the bios to a custom one (lower at idle, heavier under load), which does contribute to my temps. None of these give a performance boost, they just mean I don't play around in afterburner so much anymore.
> 
> The last bios change was moving around ram straps. The hawaii vram goes in straps, so from 1501-1625 mhz it runs at one set of timings but once you go to 1626-1750 mhz it switches to a slower set of timings. The adjustment here is to use a hex editor to open up the rom itself (the hawaii reader frontend can't do this yet) and copy the whole block of strap timings from one set to another. I ended up using the 1101-1250 mhz strap for everything up to 2000 mhz, and it remains stable at the same 1740 mhz but with a 2-3% overall FPS increase.
> 
> Overall, ram tweaking is the most interesting part of the overclock. Raising core to match a set voltage is quite easy, or raising voltage to match a set core, but the tradeoff of voltage per core clock is not a very good one and temps and power use both skyrocket. Overclocking the ram and lowering the timings gives just as large of a performance boost (maybe more) with essentially no increase in power use.
> 
> Happy with this for my everyday overclock now. Power use is lower than it was on stock (EDIT: not sure this is strictly true, but, it is quite low - 415W at the wall when running both valley and x264 at the same time). Valley (hd extreme) scores are in the 2890-2895 range. 3dmark graphics score 13440 : http://www.3dmark.com/3dm/10143902
> 
> I also did bump the voltage up to 1350 and got clock up to 1175 for one 11.5k 3dmark run: http://www.3dmark.com/3dm/10131521


That sounds awesome! Did you use the fujipoly thermal pads? I think I might end up buying some now.

I bought a temp gun and for some reason the backside of my card gets pretty spicy. The thermal pads they use must really suck.


----------



## jdorje

Quote:


> Originally Posted by *pillowsack*
> 
> That sounds awesome! Did you use the fujipoly thermal pads? I think I might end up buying some now.
> 
> I bought a temp gun and for some reason the backside of my card gets pretty spicy. The thermal pads they use must really suck.


Yeah, the Fujipoly. It took 1/3 of a pad, which is like $5-7 worth... not really worth it, I feel. However, the padding that was in there was pretty bad.

The backplate gets extremely hot, almost painful to the touch. It's not directly touching anything, but metal connects it to the whole rest of the heat sink I guess. I think fitting a heat sink to the backplate and then blowing a fan through it could maybe be beneficial, but...just fitting an AIO on the core is probably a lot better use of time and money.

Unfortunately the VRM cooler on my XFX doesn't really connect to anything. Just a small cooler "inside" the main cooler. So it must have huge ambient yet not that large of a heat sink. With the new thermal pad temps are up to 10C lower at my max overclock, but it's still far from good.


----------



## Whippet

Quote:


> Originally Posted by *pillowsack*
> 
> The PSU just has coil whine. How old is it? Got the exact model, so I can see if there's a JonnyGuru review?


Antec TruePower New Modular 650W '80 Plus Bronze' Power Supply. Bought in Oct 2011.


----------



## pillowsack

Quote:


> Originally Posted by *Whippet*
> 
> Antec TruePower New Modular 650W '80 Plus Bronze' Power Supply. Bought in Oct 2011.


Weeellll, I guess it is 5 years old. Better safe than sorry, especially with how power hungry these GPUs are.

Look up the PSU tier list and look for something cheap, eh? You only need 600W+, I'd think. Maybe get more to be safe for the future? Crossfire, maybe.

Quote:


> Originally Posted by *jdorje*
> 
> Yeah, the fujipoly. It took 1/3 of a pad which is like...$5-7 worth...not really worth it I feel. However the padding in there was pretty bad.
> 
> The backplate gets extremely hot, almost painful to the touch. It's not directly touching anything, but metal connects it to the whole rest of the heat sink I guess. I think fitting a heat sink to the backplate and then blowing a fan through it could maybe be beneficial, but...just fitting an AIO on the core is probably a lot better use of time and money.
> 
> Unfortunately the VRM cooler on my XFX doesn't really connect to anything. Just a small cooler "inside" the main cooler. So it must have huge ambient yet not that large of a heat sink. With the new thermal pad temps are up to 10C lower at my max overclock, but it's still far from good.


I had the XFX 390 for a while but returned it. The VRM cooling was less than perfect: VRM2 wasn't even covered, and the little Gelid heatsink kit you can buy isn't worthwhile. The VRM1 cooler was OK stock; honestly, I didn't get anything different with the Gelid.

I did just buy those Fuji pads though; I got the strip one for $17 on Amazon (better be worth something, at least 10C!). I want my card to run as cool as possible. I'm pretty content with it at stock clocks, since GTA 5 runs great, and maybe it'll help the card live for another year or two. Adding another 390X would be killer IMO.

Also I have some more of these Enzotech copper ram sink things to plaster the card in.





Purple LEDs really playing with the camera lighting here


----------



## jdorje

What's the highest safe VID for a 10-minute benchmark run? Assuming around 80-90C temps. 1300 mV? 1400?


----------



## pillowsack

Quote:


> Originally Posted by *jdorje*
> 
> What's the highest safe VID for a 10 minute benchmark run? Assuming around 80-90C temps. 1300 mv? 1400?


Really depends, man. The more vCore, the hotter the VRMs; you really gotta watch those.

These cards don't like lots of voltage either, IMO.


----------



## Whippet

Quote:


> Originally Posted by *pillowsack*
> 
> Weeellll, I guess it is 5 years old. Better safe than sorry, especially with how power hungry these GPUs are.
> 
> Look up the PSU tier list and look for something cheap, eh? You only need 600W+, I'd think. Maybe get more to be safe for the future? Crossfire, maybe


Yeah, guess so. Better dig into the overdraft some more then : /
Whereabouts is that PSU list you're referring to?


----------



## pillowsack

Quote:


> Originally Posted by *Whippet*
> 
> Yeah guess so better dig into the overdraft some more then : /
> Whereabouts is that PSU list you refer to?


http://i.imgur.com/tgrbCnr.jpg

There's a pretty image of it I found. What's your budget? I could help you look.

From the looks of JonnyGuru's review of the 750W variant of your PSU, it is a good one. Capacitor aging and whatnot is the problem here though, being 5 years old. You probably have a squealing capacitor or coil whine. Replacing it would be better than having it take your whole system out randomly, eh?


----------



## Stige

Antec HCG as Tier 2a, hahaha.
In the same category as the Rosewill Tachyon.

Must be a joke list or something; Antec HCG PSUs are crap. Earthwatts are nothing amazing either.
EDIT: Earthwatts are actually Seasonic stuff, so they are quality.


----------



## pillowsack

Quote:


> Originally Posted by *Stige*
> 
> Antec HCG as Tier 2a, hahaha.
> In the same category as the Rosewill Tachyon.
> 
> Must be a joke list or something; Antec HCG PSUs are crap. Earthwatts are nothing amazing either.


I'd rather believe a list compiled from hundreds of people than some dude in Finland









From what I found, the HCG 620 apparently wasn't that great, but Newegg doesn't sell them anymore. Definitely shouldn't put two GPUs on that thing....


----------



## Stige

Quote:


> Originally Posted by *pillowsack*
> 
> I'd rather believe in a list composed from 100's of people than some dude in Finland
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From what I found, the HCG 620 was apparently not that great, but they don't sell them anymore on newegg. Definitely shouldn't put two GPU's on that thing....


I had 2x HD7950 on Antec HCG-620 huehue


----------



## Dundundata

Quote:


> Originally Posted by *jdorje*
> 
> What's the highest safe VID for a 10 minute benchmark run? Assuming around 80-90C temps. 1300 mv? 1400?


I've run games at 1.375V with decent temps, lower than 80C. I would say keep an eye on gpu/vrm temps and work up slowly. No need to run more volts than needed.

As for PSUs, I've been quite happy with the Corsair RM750i: Japanese capacitors, fully modular.


----------



## jdorje

Quote:


> Originally Posted by *Dundundata*
> 
> I've run games at 1.375V with decent temps, lower than 80C. I would say keep an eye on gpu/vrm temps and work up slowly. No need to run more volts than needed.


It's a strange situation.

I'm trying to bench at high settings, specifically in the 3DMark benches, where I'm within a few points of the highest score for a 390 + 4690K in all three Firestrike tests. But it's challenging to get higher.

When I asked yesterday I thought voltage would be the limit. But it really isn't. In fact I can't go over about 1300 mV core (that's +75 on my stock setting of 1225) with my ~1180 MHz core or I'll start power throttling. I raised the power limit in the BIOS slightly and have +50% in Afterburner on top of that, for about a 357W power limit. But because the clocks are really high (not actually stable), even on "low" voltage I'm hitting that power limit.

Also temps are no problem, because 3dmark runs for like 30 seconds then loads the next test for like 30 seconds. With the side of my case off and fans locked at 100%, core stays in the mid-70s and VRMs mid-80s.

Based on low temps and voltages, I'd think something else was wrong, except that the benchmark scores are in line with what they should be.

So uh, yeah. I suspect the only way to raise performance more would be to raise the power limit, which I don't think I should do. It's a 6+8 pin card.

Valley seems to use a little less power. I can go up to 1200 MHz / 1325 mV there.

3080 in Valley extreme hd preset:


http://imgur.com/j1cXD0p


3224 in 3dmark fire strike ultra (4k) : http://www.3dmark.com/fs/7114496

11533 in 3dmark fire strike (1080) : http://www.3dmark.com/3dm/10131521?

Those aren't stable settings at all, but the BIOS mods mentioned last page have greatly increased my everyday performance as well. Highly recommended. But like I said, the limitation isn't voltage or clock... it's power limit (at least in Firestrike).
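As a rough illustration of the power-limit math above: a BIOS PowerTune limit combined with an Afterburner-style percentage offset multiplies out as below. The 238 W base is an assumption back-calculated so that +50% lands near the ~357 W figure quoted; actual Hawaii BIOS limits vary by card.

```python
# Sketch of how an effective board power limit is derived from a BIOS
# limit plus a percentage offset. The 238 W base is an assumed value,
# not read from any real BIOS.

def effective_power_limit(bios_limit_w: float, offset_pct: float) -> float:
    """Return the board power limit after applying a percentage offset."""
    return bios_limit_w * (1 + offset_pct / 100)

print(effective_power_limit(238, 50))  # 357.0
```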


----------



## navjack27

you have a good graphics score, the lack of extra threads is the issue in 3dmark

EDIT: my vrms don't even get close to that on my msi 390x even with voltages upped


----------



## jdorje

Quote:


> Originally Posted by *navjack27*
> 
> you have a good graphics score, the lack of extra threads is the issue in 3dmark
> 
> EDIT: my vrms don't even get close to that on my msi 390x even with voltages upped


I'm only comparing it to other 4690k scores. Comparing graphics scores with their search tool doesn't seem to be possible or I'd do that.

Doesn't the MSI card have more VRMs/voltage phases, just like the Lightning did? Hard to compete with that. This seems to show 7 instead of the 6 I have.


----------



## Whippet

Quote:


> Originally Posted by *pillowsack*
> 
> http://i.imgur.com/tgrbCnr.jpg
> 
> There's a pretty image of it I found. What's your budget? I could help you look
> 
> From the looks of johnny gurus review of the 750W variant of your PSU, it is a good one. Capacitor aging and whatnot is the problem here though, being 5 years old. You probably have a squealing capacitor or coil whine. Replacing it would be better than it taking your whole system out randomly though eh?


Thanks, appreciate the advice. I'd be looking at around £75. Need modular or semi as my case is a NZXT S340 and its a bit tight on space.


----------



## thebaltar

I will send my 290x Gigabyte to RMA, but i have fear to receive a 390x card.
Anyone know if Gigabyte adopt this politic?

Gigabyte 390x have the voltage hard locked? This is true?


----------



## diggiddi

Quote:


> Originally Posted by *Stige*
> 
> Antec HCG as Tier2a hahaha.
> In same category with Rosewill Tachyon.
> 
> Must be a joke list or something, Antec HCG PSUs are crap. Earthwatts are nothing amazing either.
> EDIT: Earthwatts are actually Seasonic stuff so they are quality.


My HCG 750 is doing very well powering my 290X Lightning XFire setup and my FX830 at 4.8GHz, thank you!


----------



## navjack27

Yeah, my EVGA B2 850W powers my stuff without issue. jonnyGURU reviewed the 750W variant and said it was able to pull silver ratings; I kinda assume that might apply to mine too.


----------



## chubbyfatazn

How'd I do?



Sapphire Nitro R9 390, stock cooler

Stock clocks: 1040/1500
Stock core voltage: 1.173v

Max core: 1170 (+12.5%)
Max memory: 1800 (+20%)

No changes made to voltage. VRM temps are a bit high, might need to look into that later. Otherwise, I'm pretty satisfied.

Oh yeah, ASIC 73.2%, fwiw.
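For reference, the headroom percentages quoted above check out; a minimal sketch using the stock/max clocks from the post:

```python
# Verify the overclock headroom percentages from the post
# (1040 -> 1170 core, 1500 -> 1800 memory).

def oc_gain_pct(stock_mhz: float, oc_mhz: float) -> float:
    """Return the overclock gain as a percentage over stock."""
    return (oc_mhz / stock_mhz - 1) * 100

print(round(oc_gain_pct(1040, 1170), 1))  # 12.5
print(round(oc_gain_pct(1500, 1800), 1))  # 20.0
```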


----------



## pillowsack

Quote:


> Originally Posted by *chubbyfatazn*
> 
> How'd I do?
> 
> 
> 
> Sapphire Nitro R9 390, stock cooler
> 
> Stock clocks: 1040/1500
> Stock core voltage: 1.173v
> 
> Max core: 1170 (+12.5%)
> Max memory: 1800 (+20%)
> 
> No changes made to voltage. VRM temps are a bit high, might need to look into that later. Otherwise, I'm pretty satisfied.
> 
> Oh yeah, ASIC 73.2%, fwiw.


That looks good!









Stability is key with me though









http://www.3dmark.com/fs/7122842

11910 with AMD Radeon R9 390X(1x) and Intel Core i5-4690K

Really wanna break that 15k mark on this though: Graphics Score 14732

I'm liking this memory mod. Highest score I've gotten, and that's only at 1625 memory. I don't think it likes to go much higher









I hold the highest 4690K and 390X benchmark scores at least
















Maybe I should do a suicide run with my CPU at 1.55+ volts to do 5Ghz and drop my memory to cas 10(suicide ram)...

Also I haven't optimized my drivers or really anything(crimson drivers). How the heck do I do all of this?


----------



## Carniflex

3D Mark score - 9392 - http://www.3dmark.com/fs/7124325

This seems to be a bit on the low side? I mean, people are mentioning scores around 11k and above even in this thread. Or are people talking about the graphics score specifically, which is 11536 for this card at stock settings?


----------



## fyzzz

Quote:


> Originally Posted by *Carniflex*
> 
> 3D Mark score - 9392 - http://www.3dmark.com/fs/7124325
> 
> This seems to be a bit on the low side? I mean with people mentioning scores around 11k and above even in this thread. Or are the people talking about graphics score specifically which is 11536 for this card at stock settings?


Your graphics score is okay, but your physics score and combined score are low, hence the much lower overall score.


----------



## kizwan

Quote:


> Originally Posted by *Carniflex*
> 
> 3D Mark score - 9392 - http://www.3dmark.com/fs/7124325
> 
> This seems to be a bit on the low side? I mean with people mentioning scores around 11k and above even in this thread. Or are the people talking about graphics score specifically which is 11536 for this card at stock settings?


Your graphics score is on the low side. It should be higher.

This is comparing yours with my 290 at stock clock with stock BIOS.
http://www.3dmark.com/compare/fs/7124325/fs/5781943#


----------



## ZealotKi11er

Quote:


> Originally Posted by *Carniflex*
> 
> 3D Mark score - 9392 - http://www.3dmark.com/fs/7124325
> 
> This seems to be a bit on the low side? I mean with people mentioning scores around 11k and above even in this thread. Or are the people talking about graphics score specifically which is 11536 for this card at stock settings?


Yes, your GPU score is a bit low. You should be around 13K for that.


----------



## Stige

Finally my temps are starting to look how I like them:



+75mV on Core and +50mV on VDDCI with BIOS mods.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> Finaly my temps are starting to look how I like them:
> 
> 
> 
> +75mV on Core and +50mV on VDDCI with BIOS mods.


Very impressive. I am hitting almost 60C with my cards because I only have a 360 + 240 rad with slow fans.


----------



## TsukikoChan

new crimson version out, hopefully it solves a few issues i've been having with my 390x :S


----------



## mus1mus

Quote:


> Originally Posted by *TsukikoChan*
> 
> new crimson version out, hopefully it solves a few issues i've been having with my 390x :S


You mean 15.12?

Still crap when benching.


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> You mean 15.12?
> 
> Still crap when benching.


nope, 16.1 came out today to my knowledge :O it has a fair amount of resolved issues, one of which looks like the blackscreen issue i've been having.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.1-Hotfix-Release-Notes.aspx?sf18430937=1


----------



## mus1mus

Did you try it yet? DL'ing


----------



## TsukikoChan

Quote:


> Originally Posted by *TsukikoChan*
> 
> nope, 16.1 came out today to my knowledge :O has a fair amount of resolved issues, one of which looks like my blackscreen issue i've been having.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.1-Hotfix-Release-Notes.aspx?sf18430937=1


Quote:


> Originally Posted by *mus1mus*
> 
> Did you try it yet? DL'ing


Nope, at work so i will download and install it when i get home later this evening 
if the blackscreen occurs again i will be submitting a ticket to sapphire to see what is up with the issues i'm seeing on my 390x (random blackscreens, fixable by turning the monitor off and on, & how to turn off detection of the dvi monitor turning off).


----------



## mus1mus

Quote:


> Originally Posted by *TsukikoChan*
> 
> Nope, at work so i will download and install it when i get home later this evening
> if blackscreen occurs again i will be submitting a ticket to sapphire to see what is up with the issues i'm seeing on my 390x (random blackscreens, fixable by turning the monitor off and on, & how to turn off detection of dvi monitor turn off).


hmm.

I can only say, the voltages now apply as Windows starts. So any failed OC resumes to its last state.

Which is NICE.

Big rep for the tip sir
















EDIT:

I can't get into Windows after a Failed OC. BUGGER


----------



## jdorje

I switched back to legacy boot mode so I can enter safe mode easily. Even though it's like 10 seconds longer to boot (not actually measured).


----------



## Stige

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Very impressive. I am hitting almost 60C with my cards because only have 360 + 240 RAD with slow fans.


I got a 360 rad with 3x Gentle Typhoon AP-15s on it, plus the Alphacool "almost-fullcover" block (the VRM doesn't have water on it).

I strapped a GT AP-15 on the block and changed the pads, I have dropped 11-12C from the original temps I had on first install of the block.

And almost 30C drop on VRM1 from the stock cooler. Core dropped like 35C with the block install using Phobya NanoGrease.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> I got a 360 block with 3x Gentle Typhoon AP-15 on it, with the Alphacool "almost-fullcover" block (VRM doesn't have watercooling on it).
> 
> I strapped a GT AP-15 on the block and changed the pads, I have dropped 11-12C from the original temps I had on first install of the block.
> 
> And almost 30C drop on VRM1 from the stock cooler.


Just for GPU or CPU too?


----------



## NeverOCed

There ya go.


----------



## jackblk

http://www.techpowerup.com/gpuz/details.php?id=dcygz

Just bought last week, happy with it =D.

Anyway, my gpu clock is limited in dota 2, which caps me at ~100 fps :/. I have a 144hz monitor so this is a bit frustrating


----------



## Stige

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just for GPU or CPU too?


My motherboard is away under warranty so I'm stuck with a H61M-VG4 right now... So it's only the GPU for the moment; the CPU will be back on the loop once I get my mobo back.

I can't get +10 FPS on Valley no matter what! I have hit the limit with this card








Custom Bios, total of +200mV and +94mV on VDDCI.

Stock Valley:


Overclocked 1268/1730


Maybe with my Z77 OC Formula I can break that +10FPS barrier that haunts me now


----------



## navjack27

1150/1625 +75mv core +0mv aux


----------



## ZealotKi11er

Quote:


> Originally Posted by *jackblk*
> 
> http://www.techpowerup.com/gpuz/details.php?id=dcygz
> 
> Just bought last week, happy with it =D.
> 
> Anyway, i have limited gpu clock on dota 2. It causes me to have ~100 fps only :/. I have 144hz monitor so this is a bit frustrating


Very normal. It's CPU limited. Try changing some settings and see if it helps. I used to get a solid 120 fps before Reborn. Try running the game full screen and see if that helps.


----------



## jackblk

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Very normal. Its CPU limited. Try to change some settings and see if it helps. Used to get solid 120 fps before Reborn. Try running the game Full Screen and see if that helps.


I tried fullscreen, no difference at all. I doubt it's CPU limited though :/. The core only sits at 600 MHz when running Dota


----------



## Stige

Quote:


> Originally Posted by *navjack27*
> 
> 1150/1625 +75mv core +0mv aux


Same clocks, so how.. or why?



EDIT: Oh it's 390X, no wonder...


----------



## navjack27

after some tweaks ^_^



i've confused you even more right?

same clocks same bios.

just changed flip queue to 5
set processor affinity to use cores 0,2,4,6 and set process to high priority

EDIT: well i have an i7 with hyperthreading. i'm @ 4ghz on my cpu with no power saving. i disabled powerplay with afterburner...
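The affinity trick above (pinning to cores 0, 2, 4, 6, i.e. the physical cores of a hyperthreaded i7, skipping the logical siblings) boils down to building a bitmask. A minimal sketch; the hex mask is what you'd pass to something like `start /affinity 55 /high dota2.exe` on Windows (hypothetical command line, not from the post):

```python
# Build a CPU affinity bitmask from a list of core indices.
# Cores 0, 2, 4, 6 -> bits 0, 2, 4, 6 set -> 0b1010101 -> 0x55.

def affinity_mask(cores):
    """Return a bitmask with one bit set per core index."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

print(hex(affinity_mask([0, 2, 4, 6])))  # 0x55
```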


----------



## ZealotKi11er

Quote:


> Originally Posted by *jackblk*
> 
> I tried fullscreen, no difference at all. I doubt that it's the cpu limited :/. It only has 600 mhz when running dota


What is only 600MHz?


----------



## navjack27

i'm sure he means powerplay is limiting his gpu's core clock speed


----------



## ZealotKi11er

Quote:


> Originally Posted by *navjack27*
> 
> i'm sure he means powerplay is limiting his gpus core clock speed


I know in Dota 2 the GPU will not hit full speed. That only happens because the CPU is the limiting factor.


----------



## navjack27

i feel like installing dota 2 just so i can see what i get. is there a benchmark or do you just mean in general? i hate mobas but i'm willing to debunk and help


----------



## ZealotKi11er

Quote:


> Originally Posted by *navjack27*
> 
> i feel like installing dota 2 just so i can see what i get. is there a benchmark or do you just mean in general. i hate mobas but i'm willing to debunk and help


Just join a 10v10 game and see what you get. During a fight it will go lower. If you're getting 70 fps at the start you will feel the lag during fights. You need 100 fps so you don't lag.


----------



## Spartoi

Quote:


> Originally Posted by *navjack27*
> 
> after some tweaks ^_^
> 
> 
> 
> i've confused you even more right?
> 
> same clocks same bios.
> 
> just changed flip queue to 5
> set processor affinity to use cores 0,2,4,6 and set process to high priority
> 
> EDIT: well i have an i7 with hyperthreading. i'm @ 4ghz on my cpu with no power saving. i disabled powerplay with afterburner...


Are you using stock bios or modified? If modified, can you teach me/direct me where to learn how to mod my bios?


----------



## navjack27

Quote:


> Originally Posted by *Spartoi*
> 
> Are you using stock bios or modified? If modified, can you teach me/direct me where to learn how to mod my bios?


only thing i modified is already in the memory timing mod thread

EDIT: and


----------



## Stige

Quote:


> Originally Posted by *navjack27*
> 
> after some tweaks ^_^
> 
> 
> 
> i've confused you even more right?
> 
> same clocks same bios.
> 
> just changed flip queue to 5
> set processor affinity to use cores 0,2,4,6 and set process to high priority
> 
> EDIT: well i have an i7 with hyperthreading. i'm @ 4ghz on my cpu with no power saving. i disabled powerplay with afterburner...


So just tweaks that are "not allowed" in the Valley thread, pretty moot then :l


----------



## navjack27

we aren't in that thread are we?! i'm kinda just showing that the benchmark is, well, not pointless, but maybe not that big of a data point.


----------



## Stige

Quote:


> Originally Posted by *navjack27*
> 
> we aren't in that thread are we?! i'm kinda just showing how, well not pointless, maybe not that big of a data point that benchmark is.


Affirmative, I'll move onto that, hopefully fewer trolls.


----------



## Jaffi

I accidentally let my 390X Nitro run GTA 5 overclocked to 1200/1600 with +100 mV for at least 5 minutes without noticing the artifacting, because I was AFK... I immediately checked temperatures and they didn't exceed 76/87 degrees for GPU/VRMs, yet I am concerned this accident could have damaged my card







I haven't seen any artifacts since, but is there a way to be sure?


----------



## Stige

No it won't cause any damage, +100mV is nothing heh.

My R9 390 is game stable at 1250/1700 now.


----------



## pillowsack

Quote:


> Originally Posted by *Stige*
> 
> No it won't cause any damage, +100mV is nothing heh.
> 
> My R9 390 is game stable at 1250/1700 now.


Nice! That looks like a good setup, the temperatures are well behaved too. What's your 3DMark graphics score? Also, if you think GTA 5 is demanding, Counter Strike motherf@%^%!# GO always loves to test my overclocks in a competitive match. GTA 5 always runs no problem though.


----------



## chubbyfatazn

Quote:


> Originally Posted by *pillowsack*
> 
> That looks good!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stability is key with me though


It appears to be stable. At least, it doesn't crap out when looping Heaven and nothing wrong so far in FO4 or GTA5. I know I just jinxed myself though...

Don't get me wrong, I had my suspicions regarding the memory clocks, that I was testing them incorrectly or something, since I didn't see any results remotely similar to mine on the front page. I did finally get my system to crap itself at 1820MHz though, so iono.


----------



## Stige

Quote:


> Originally Posted by *pillowsack*
> 
> Nice! That looks like a good setup, the temperatures are well behaved too. What's your 3DMark graphics score? Also, if you think GTA 5 is demanding, Counter Strike motherf@%^%!# GO always loves to test my overclocks in a competitive match. GTA 5 always runs no problem though.


I have never even touched GTA V. And never will.

CS:GO I know about, had my CPU crash in the past in it, and only in CS:GO.

I haven't run 3DMark yet cause my mobo is under warranty, don't wanna run it on stock CPU...


----------



## m70b1jr

Can someone look at this?.

http://www.overclock.net/t/1587106/arctic-accelero-hybrid-iii-140-on-xfx-r9-390-need-some-help


----------



## BradleyW

Quote:


> Originally Posted by *m70b1jr*
> 
> Can someone look at this?.
> 
> http://www.overclock.net/t/1587106/arctic-accelero-hybrid-iii-140-on-xfx-r9-390-need-some-help


Posted.


----------



## Darkeylel

So quick question what's 16.1 supposed to fix ? Because it's saying hotfix haha


----------



## Whippet

Quote:


> Originally Posted by *Darkeylel*
> 
> So quick question what's 16.1 supposed to fix ? Because it's saying hotfix haha


http://www.overclock3d.net/articles/gpu_displays/amd_radeon_software_crimson_edition_16_1_whql_driver/1

Won't auto-update from the desktop app though; is it because it's a hotfix and not a full driver update? But it takes the driver version number from 15.X to 16.X...


----------



## pillowsack

Quote:


> Originally Posted by *Whippet*
> 
> http://www.overclock3d.net/articles/gpu_displays/amd_radeon_software_crimson_edition_16_1_whql_driver/1
> 
> Wont auto update from desktop app though, is it because its a hotfix and not a full driver update? But it takes the driver version number from 15.X to 16.X...


It's 2016 now


----------



## kizwan

Quote:


> Originally Posted by *Spartoi*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *navjack27*
> 
> after some tweaks ^_^
> 
> 
> 
> i've confused you even more right?
> 
> same clocks same bios.
> 
> just changed flip queue to 5
> set processor affinity to use cores 0,2,4,6 and set process to high priority
> 
> EDIT: well i have an i7 with hyperthreading. i'm @ 4ghz on my cpu with no power saving. i disabled powerplay with afterburner...
> 
> 
> 
> 
> 
> 
> 
> Are you using stock bios or modified? If modified, can you teach me/direct me where to learn how to mod my bios?
Click to expand...

In addition to the guide at first post In below thread, this may help you too.
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/590#post_24725815
Quote:


> Originally Posted by *Jaffi*
> 
> I accidentally let my 390x nitro run GTA 5 with overclock to 1200/1600 with +100 mV for at least 5 minutes and without noticing the artifacting because I was afk... I immediately checked temperatures and they didn't exceed 76/87 degrees for GPU/VRMs, yet I am concerned this accident could have done damage to my card
> 
> 
> 
> 
> 
> 
> 
> I haven't seen any artifacts since, but is there a way to be sure?


No problem. BTW, if your card can survive GTAV for more than 30 minutes (or at least 1 hour), then your overclock is stable. If your overclock is not stable, it will crash around 15 to 30 minutes into the game.
Quote:


> Originally Posted by *Darkeylel*
> 
> So quick question what's 16.1 supposed to fix ? Because it's saying hotfix haha


http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.1-Hotfix-Release-Notes.aspx


----------



## Darkeylel

Cheers, I was just confused by the title saying hotfix when it's a 300mb download; not the terms I would be using haha


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> only thing i modifed is already in the memory timing mod thread
> 
> EDIT: and


I was able to find my mem timings. So it looks like 1250 is the timing set used for all straps; do you know the reason for this number? Also, are there any harmful effects from using such a low timing set at higher clocks?


----------



## kizwan

With a memory overclock, tighter timings offer a slightly better performance boost than looser timings. The 1250 timings are, to me, just a recommendation. If you cannot get stable with the 1250 memory timings when overclocked, you can try higher (looser) timings, e.g. the 1375 timings. There shouldn't be any harmful effects other than instability. Worst case scenario you get a corrupted/broken Windows because of a crash.

Basically you will need to try out for yourself which timings work best for you. Tighter timings give you a couple FPS boost, which is great if you're benching, but you will not see much difference in games.
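The strap logic being discussed can be sketched as follows. This is a simplified, assumed model of how Hawaii-style BIOSes pick timings: each strap holds the timing set used for memory clocks up to that frequency, so "running 1250 timings" at a higher clock means copying the 1250 MHz strap's timings into the higher straps. The strap frequencies listed are typical values, not read from any specific BIOS.

```python
# Simplified model of GDDR5 timing strap selection: the BIOS uses the
# timing set of the lowest strap whose range covers the memory clock.

STRAPS = [1125, 1250, 1375, 1500, 1625, 1750]  # MHz, typical strap points

def strap_for_clock(mem_clock_mhz):
    """Return the strap frequency whose timings apply at this clock."""
    for strap in STRAPS:
        if mem_clock_mhz <= strap:
            return strap
    return STRAPS[-1]  # above the table: the top strap applies

print(strap_for_clock(1500))  # 1500
print(strap_for_clock(1600))  # 1625
```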
Quote:


> Originally Posted by *Darkeylel*
> 
> Cheers was just confused with the title saying hotfix and it's 300mb to download not the terms I would be using haha


The notes also include installation instructions that involve uninstalling the older Crimson drivers first. So it's not really a hotfix. They probably did this because they don't want users to expect too much from this new driver?


----------



## Spartoi

Quote:


> Originally Posted by *navjack27*
> 
> only thing i modifed is already in the memory timing mod thread
> 
> EDIT: and


Quote:


> Originally Posted by *kizwan*
> 
> In addition to the guide at first post In below thread, this may help you too.
> http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/590#post_24725815


Thanks. I applied the timing mod but my once-stable overclock became unstable. I had to lower my memory speed from 1725 MHz to 1705 MHz.

When I was benchmarking, my VRM temps max at 104C. Is that alright or too high?


----------



## fat4l

Quote:


> Originally Posted by *Spartoi*
> 
> Thanks. I applied the timing mod but once stable overclock became unstable. I had to lower my memory speed to 1705mhz from 1725 mhz.
> 
> When I was benchmarking, my VRM temps max at 104C. Is that alright or too high?


During benchmarking? Ugh. Imagine what hours of normal playing would do (considering the same volts and clocks).
To me that is high and I would repaste/change pads.

Also, yes, changing the timings can change your mem OC. You just check what gives you more performance. You can still apply the higher strap timings and get your max OC easily.


----------



## Synntx

Quote:


> Originally Posted by *Spartoi*
> 
> Thanks. I applied the timing mod but once stable overclock became unstable. I had to lower my memory speed to 1705mhz from 1725 mhz.
> 
> When I was benchmarking, my VRM temps max at 104C. Is that alright or too high?


Holy crap, those VRM temps are wayyyyyy too high.


----------



## Synntx

Quote:


> Originally Posted by *Jaffi*
> 
> I accidentally let my 390x nitro run GTA 5 with overclock to 1200/1600 with +100 mV for at least 5 minutes and without noticing the artifacting because I was afk... I immediately checked temperatures and they didn't exceed 76/87 degrees for GPU/VRMs, yet I am concerned this accident could have done damage to my card
> 
> 
> 
> 
> 
> 
> 
> I haven't seen any artifacts since, but is there a way to be sure?


Not dangerous at all. Those are slightly high operating temps but nowhere near danger levels. You're good. If you can stabilize that clock at those temps I'd say you're doing alright.


----------



## w1ck3ddd

Anybody knows what are the safe temps for a 390 Nitro on the VRMs ?
Getting 75 on VRM1 and 85 on VRM2 with oc 1140/1600 with default voltage.


----------



## Carniflex

Quote:


> Originally Posted by *w1ck3ddd*
> 
> Anybody knows what are the safe temps for a 390 Nitro on the VRMs ?
> Getting 75 on VRM1 and 85 on VRM2 with oc 1140/1600 with default voltage.


That should be safe. Normally VRMs are rated up to 105...125C; however, in my opinion it's best to keep them under 80...90C, assuming one wants the card to last beyond the warranty period. The VRMs might be able to take those kinds of temperatures, but running them that hot also tends to heat up other stuff nearby, which might not be as happy with high temperatures.
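The rules of thumb floating around this thread (ratings of roughly 105-125C, a keep-under target of ~90C for longevity) can be summed up as a quick triage. The thresholds below are the thread's opinions, not manufacturer specs:

```python
# Rough VRM temperature triage based on this thread's rules of thumb.
# Thresholds are opinions from the discussion, not datasheet limits.

def vrm_verdict(temp_c):
    """Classify a VRM temperature reading."""
    if temp_c < 90:
        return "fine"
    if temp_c <= 105:
        return "ok for 24/7 but worth improving"
    return "too hot - repaste/repad or add airflow"

print(vrm_verdict(85))   # fine
print(vrm_verdict(104))  # ok for 24/7 but worth improving
```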


----------



## BradleyW

Quote:


> Originally Posted by *w1ck3ddd*
> 
> Anybody knows what are the safe temps for a 390 Nitro on the VRMs ?
> Getting 75 on VRM1 and 85 on VRM2 with oc 1140/1600 with default voltage.


100c and below for 24/7 use. They can withstand up to 130c before throttling and shut downs.


----------



## Spartoi

Quote:


> Originally Posted by *fat4l*
> 
> During benchmarking? Ugh. Imagine what a normal playing for hours would do(considering the same volts and clocks).
> To me it is high and I would repaste/change pads.
> 
> Also, yes changing of timings can change your name oc. You just check what gives you more performance. You can still apply higher strap timings and get your max OC easy.


If I use a higher (i.e. 1375) strap timing, do I also apply it to the 1250 timings or just the current strap timing (1375) and above?


----------



## fat4l

Quote:


> Originally Posted by *Spartoi*
> 
> If I use a higher (i.e. 1375) strap timing, do I also apply it to the 1250 timings or just the current strap timing (1375) and above?


Well, I would say just the one you are going to test.
If you are aiming for 1700+ then just change the 1750 strap timings, replacing them with the 1375 ones.


----------



## battleaxe

Quote:


> Originally Posted by *BradleyW*
> 
> 100c and below for 24/7 use. They can withstand up to 130c before throttling and shut downs.


That's freaking hot. I guess from a technical standpoint it may be correct, but higher clocks warrant as low a temp as you can manage. It's worth getting these cool. I would wager there's a 30-50MHz difference in clock speed stability between 50C and 120C, maybe more than that. I know I gained at least 10MHz just by going from 70C to 60C. Maybe it's just me, but there's no way I'd want to run that hot.


----------



## xillius200

Anyone in the R9 390/390X owners club have a Gigabyte R9 390X G1 Gaming who could post their BIOS for me? I managed to get a second-hand R9 390X but someone's flashed an MSI BIOS onto it, so the fans aren't working as they should and the card seems to be acting up a bit.


----------



## The Stilt

Quote:


> Originally Posted by *xillius200*
> 
> Anyone in the r9 390/390x owner club have a Gigabyte r9 390x G1 Gaming that could post their bios for me? I managed to get a second hand r9 390x but someones flashed an MSI bios onto it, so the fans aren't working as they should and the card seems to be acting up a bit.


Have you checked both of the bios chips (switch)?


----------



## tangelo

Quote:


> Originally Posted by *Darkeylel*
> 
> So quick question what's 16.1 supposed to fix ? Because it's saying hotfix haha


It fixed horrible framerate issues on Elite: Dangerous


----------



## jdorje

Up to 120C or more is perfectly safe on the vrms.

But...

Lower vrm temps means better voltage stability. So as an overclocker you want them lower.

My VRMs have a large but not well-ventilated heatsink. It takes them 10-20 minutes to max out in temp. The problem is made worse because the "ambient" temperature for the VRM cooling is raised considerably by the heat of the core.

Replacing the thermal padding under the sink dropped my VRM temps by a significant amount, though I didn't measure it accurately. I dropped the core temp at the same time, and since I had a fan curve going, the lower temps came with less airflow. Still, they dropped from 102 to 91 with my everyday overclock.


----------



## OneB1t

if you have such high temperatures then you will also suffer from much increased power usage, as the VRMs hit the efficiency wall much faster at higher temperatures


----------



## Carniflex

Quote:


> Originally Posted by *xillius200*
> 
> Anyone in the r9 390/390x owner club have a Gigabyte r9 390x G1 Gaming that could post their bios for me? I managed to get a second hand r9 390x but someones flashed an MSI bios onto it, so the fans aren't working as they should and the card seems to be acting up a bit.


https://www.dropbox.com/s/xq95edv5yk298gl/Gigabyte390X_G1_Gaming_015.049.000.002.000000.rom?dl=0

There is mine. This is from this card:


----------



## xillius200

Thank you that helped a lot







i've been able to flash it back now







fans working as they should again and no glitches







just wondering, but what kind of temps do you get from your gigabyte g1? mine goes up to 85 when playing games like battlefield 4


----------



## Carniflex

Quote:


> Originally Posted by *xillius200*
> 
> Thank you that helped a lot
> 
> 
> 
> 
> 
> 
> 
> i've been able to flash it back now
> 
> 
> 
> 
> 
> 
> 
> fans working as they should again and no glitches
> 
> 
> 
> 
> 
> 
> 
> just wondering but what kind of temps do you get from your gigabyte g1? mine goes up to 85 when playing games like battlefield 4


That is a good temp; I'm hitting up to ~90C at 4K in more demanding applications with case fans at "low". The first few weeks it was hitting as high as 95C, but after that it seemed to settle a bit lower, around 80...85C normally, with the temperature creeping up slowly as the case interior heats up over ~30 min or so. It's quite a hot and loud card. I have a 1400 rpm 140 mm fan blowing into it from the front of the case and an 1100 rpm 140 mm fan sucking the hot air out on the side panel. It has 2 slots of "breather room" between this card and the second gfx card in PCIe slots 5/6 under it.


----------



## fat4l

I'm testing whether my 1750MHz memory is fully stable, and it seems so... yay.
1500-strap timings applied.
1050 AUX voltage.

1200/1750MHz + Crossfire

*13146 GS in FS X!*

http://www.3dmark.com/fs/7157746










Anyone with 390X CF wanna match ?


----------



## Chaoz

Add me to the group:

I bought a ASUS STRIX R9 390 DC3OC


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> I'm testing whether my 1750MHz memory is fully stable, and it seems so... yay.
> 1500-strap timings applied.
> 1050 AUX voltage.
> 
> 1200/1750MHz + Crossfire
> 
> *13146 GS in FS X!*
> 
> http://www.3dmark.com/fs/7157746
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone with 390X CF wanna match ?


Can beat it with 290X + 290 + 3770K.


----------



## Stige

Quote:


> Originally Posted by *Chaoz*
> 
> Add me to the group:
> 
> I bought a ASUS STRIX R9 390 DC3OC


Probably the worst cooler they make for the R9 390 :l

My condolences, I own one too.


----------



## Mysticking32

Just an update: the newest drivers AMD posted on the 7th increase performance a lot. I'll post pics later. Tests performed in Firestrike.


----------



## OneB1t

16.1 crimson?


----------



## Mysticking32

Quote:


> Originally Posted by *OneB1t*
> 
> 16.1 crimson?


Yep. Was surprised as well lol.

Here are some benchmarks

http://www.3dmark.com/compare/fs/7091387/fs/7119919/fs/7120007#


----------



## Falmatrix2r

Hi, I'm going to be the owner of a Sapphire 390X in a couple of days. My current card is a Sapphire 6950 and it just feels like it's time for an upgrade; recent titles weren't very nice to my card, lol.

I had two questions. The first is about my PC in general: is it going to be a bottleneck for this card? I have an i7 950 at stock, 18GB DDR3, an 850W PSU and an Asus P6T6 WS Revolution motherboard. This graphics card will obviously be the last upgrade on this PC before building a new one, but I've had 5 wonderful years with it until now.

My second question is about the card itself: I'm buying the Tri-X version with a 1055MHz clock and no backplate. I've seen a newer version pop up which is Nitro-branded with 1080MHz and a backplate. Am I missing out on the newer Nitro version? I jumped on the Tri-X one because I found a nice deal online which brings it close to a 390 and a 970 in price.
Thanks


----------



## jdorje

Quote:


> Originally Posted by *Mysticking32*
> 
> Just an update. Newest drivers from AMD posted on the 7th increase performance a lot. I'll post pics later. Tests performed on firestrike.


Hm, I'm using the beta. Do I already have these updates or do I need to downgrade to get them?

God, AMD's driver version numbers are terrible.


----------



## ThatGuy16

Anyone OC with the Crimson software? I just want a mild OC. I have a standard PowerColor R9 390 at 1000MHz... just want to bump it up to 1050-1100, and I'm not sure what to do with the voltage and memory, if anything. Or should I download the MSI program?


----------



## ZealotKi11er

Quote:


> Originally Posted by *ThatGuy16*
> 
> Anyone OC with the Crimson software? I just want a mild OC. I have a standard PowerColor R9 390 at 1000MHz... just want to bump it up to 1050-1100, and I'm not sure what to do with the voltage and memory, if anything. Or should I download the MSI program?


Get MSI AB if you want to OC.


----------



## mus1mus

Just be careful with the latest driver. It's not as forgiving as the previous drivers; any failed OC attempt will greet you with black screens at startup.

Though voltages now stick after a reboot when using Trixx and iTurbo. But when voltages revert back after a restart, hmmm.


----------



## bichael

Quote:


> Originally Posted by *ThatGuy16*
> 
> Anyone OC with the Crimson software? I just want a mild OC. I have a standard PowerColor R9 390 at 1000MHz... just want to bump it up to 1050-1100, and I'm not sure what to do with the voltage and memory, if anything. Or should I download the MSI program?


I used to overclock my 270x pretty well using Catalyst for both core and memory. I believe it's an okay option if you're not looking to touch the voltage; of course, it depends how far you can go without needing a bit more voltage.

I also have the PowerColor 390 and was only able to get to about 1040 using the Crimson OC without artifacts in Firestrike. However, this could be down to my 450W SFX PSU, so I'd be interested to know how you get on (I wasn't really expecting or wanting to OC but couldn't resist having a go).
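Whatever tool you use, the procedure everyone in the thread describes is the same: bump the clock a small step, stress test, and back off at the first artifact. A sketch of that loop, where `is_stable` is just a stand-in for a real check (a Firestrike or Valley run while watching for artifacts); nothing here touches actual hardware:

```python
# Sketch of the step-and-test overclocking loop described above.
# `is_stable` is a placeholder for a real stability check; this code
# only models the search, it does not set any clocks.

def find_max_clock(start_mhz, limit_mhz, step_mhz, is_stable):
    best = None
    clock = start_mhz
    while clock <= limit_mhz:
        if not is_stable(clock):
            break  # first failure: keep the last clock that passed
        best = clock
        clock += step_mhz
    return best

# Fake example: pretend the card artifacts above 1090 MHz.
print(find_max_clock(1000, 1100, 25, lambda mhz: mhz <= 1090))  # prints 1075
```

In practice people run each step for longer than a single benchmark pass, and redo the search after any voltage change, since the stable ceiling moves with voltage and VRM temperature.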


----------



## Sgt Bilko

Alrighty guys, so I replaced the cooler on my XFX DD 290X with the one from the 390X and it dropped the temps like a rock...

More testing is needed (I don't have much time to devote to it), but from what I've seen so far, XFX did a helluva better job than I originally gave them credit for






And part 2:


----------



## ThatGuy16

Quote:


> Originally Posted by *bichael*
> 
> I used to overclock my 270x pretty well using Catalyst for both core and memory. I believe it's an okay option if you're not looking to touch the voltage; of course, it depends how far you can go without needing a bit more voltage.
> 
> I also have the PowerColor 390 and was only able to get to about 1040 using the Crimson OC without artifacts in Firestrike. However, this could be down to my 450W SFX PSU, so I'd be interested to know how you get on (I wasn't really expecting or wanting to OC but couldn't resist having a go).


I just raised mine in MSI Afterburner to 1100 with +60mV and 1600 memory, then played BF4 maxed out getting 50-60 FPS with 100% GPU usage for about 45 mins. Temps on the GPU and VRM were good, so try Afterburner... could be your PSU though, maybe.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Just be careful with the latest driver. It's not as forgiving as the previous drivers; any failed OC attempt will greet you with black screens at startup.
> 
> Though voltages now stick after a reboot when using Trixx and iTurbo. But when voltages revert back after a restart, hmmm.


Why use iTurbo? What does iTurbo have that Trixx doesn't? Did you use both at the same time?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Just be careful with the latest driver. It's not as forgiving as the previous drivers; any failed OC attempt will greet you with black screens at startup.
> 
> Though voltages now stick after a reboot when using Trixx and iTurbo. But when voltages revert back after a restart, hmmm.
> 
> 
> 
> Why use iTurbo? What does iTurbo have that Trixx doesn't? Did you use both at the same time?
Click to expand...

iTurbo allows more voltage than what Trixx gives you (+400mV, iirc).

I normally use AB and Trixx just because I never need more than +200mV... but Mus is a little crazy, and under water as well, I think.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Just be careful with the latest driver. It's not as forgiving as the previous drivers; any failed OC attempt will greet you with black screens at startup.
> 
> Though voltages now stick after a reboot when using Trixx and iTurbo. But when voltages revert back after a restart, hmmm.
> 
> 
> 
> Why use iTurbo? What does iTurbo have that Trixx doesn't? Did you use both at the same time?
> 
> Click to expand...
> 
> iTurbo allows more voltage than what Trixx gives you (+400mV, iirc).
> 
> I normally use AB and Trixx just because I never need more than +200mV... but Mus is a little crazy, and under water as well, I think.
Click to expand...

I see. I use AB & Trixx too when I need more than +100mV, a higher MHz range & AUX voltage at the same time. I want to feed more than +200mV, but even though mine is under water, my ambient is a bit too high at 34°C. Even if I could lower ambient to 27°C with A/C, I still wouldn't feel comfortable. Maybe if I can get a water chiller.


----------



## Carniflex

Quote:


> Originally Posted by *Falmatrix2r*
> 
> Hi, I'm going to be the owner of a Sapphire 390X in a couple of days. My current card is a Sapphire 6950 and it just feels like it's time for an upgrade; recent titles weren't very nice to my card, lol. I had two questions. The first is about my PC in general: is it going to be a bottleneck for this card? I have an i7 950 at stock, 18GB DDR3, an 850W PSU and an Asus P6T6 WS Revolution motherboard. This graphics card will obviously be the last upgrade on this PC before building a new one, but I've had 5 wonderful years with it until now. My second question is about the card itself: I'm buying the Tri-X version with a 1055MHz clock and no backplate. I've seen a newer version pop up which is Nitro-branded with 1080MHz and a backplate. Am I missing out on the newer Nitro version? I jumped on the Tri-X one because I found a nice deal online which brings it close to a 390 and a 970 in price.
> Thanks


Your system should be fine.


----------



## jimmyvizvary

Hi guys, I thought you'd be interested in my results from crossfiring an R9 290 with an R9 390: http://www.3dmark.com/fs/7157434


----------



## kizwan

Quote:


> Originally Posted by *jimmyvizvary*
> 
> Hi guys, I thought you'd be interested in my results from crossfiring an R9 290 with an R9 390: http://www.3dmark.com/fs/7157434


I think the FPS for graphics tests 1 & 2 is lower than it should be. Make sure ULPS is disabled & your cards are not throttling.


----------



## jimmyvizvary

Quote:


> Originally Posted by *kizwan*
> 
> I think the FPS for graphics tests 1 & 2 is lower than it should be. Make sure ULPS is disabled & your cards are not throttling.


The cards are not throttling: they're not exceeding 70 degrees Celsius, and I know they throttle at 95 degrees Celsius. The cards are not overclocked either. I'll try turning off ULPS, see if that changes anything, and post my results shortly. I almost forgot: the R9 390 is underclocked to the R9 290 specs.


----------



## Falmatrix2r

Hey, thanks for your reply!
Hopefully getting the card on Wednesday.
I've been told that my jaw will probably drop coming from a 6950 to a 390X; is that true?


----------



## mus1mus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Why use iTurbo? What does iTurbo have that Trixx doesn't? Did you use both at the same time?
> 
> 
> 
> iTurbo allows more voltage than what Trixx gives you (+400mV, iirc).
> 
> I normally use AB and Trixx just because I never need more than +200mV... but Mus is a little crazy, and under water as well, I think.
Click to expand...

Correct on all accounts.









In addition, iTurbo allows adjustments to VDDCI. But I don't really notice it helping, aside from one instance where I pulled VDDCI down due to some desktop corruption at idle.

I think my mod was amiss on the VDDCI part. Dropping the offset solved my issue, though.

Crazy as I may be, there's just an urge to push past where most mortals revel. Gotta see what's in there.









Getting another card tomorrow.


----------



## Carniflex

Quote:


> Originally Posted by *Falmatrix2r*
> 
> Hey thanks for your reply
> Hopefully getting the card on Wednesday
> I've been told that my jaw will probably drop coming from a 6950 to a 390x, is that true?


Umm... it's still only a graphics card, and a 6950 would still offer quite satisfactory performance at 1080p and adequate at 1440p. So while I'm sure you will notice the difference, it's not something that will knock your socks off. It depends, of course, on various details like the resolution you are running and the games you like to play. If you have been throwing content at your 6950 beyond its ability to swallow, then the jump from ~30 fps to 45-60 fps can be quite noticeable. If you're just ramping settings up from medium or high to "ultra", then in many games you would have to look for it to notice the difference in actual gameplay.


----------



## m70b1jr

Every time I put my voltage past +150mV and click apply, I get a black screen of death and have to reboot.


----------



## Stige

Quote:


> Originally Posted by *Carniflex*
> 
> Umm... it's still only a graphics card, and a 6950 would still offer quite satisfactory performance at 1080p and adequate at 1440p. So while I'm sure you will notice the difference, it's not something that will knock your socks off. It depends, of course, on various details like the resolution you are running and the games you like to play. If you have been throwing content at your 6950 beyond its ability to swallow, then the jump from ~30 fps to 45-60 fps can be quite noticeable. If you're just ramping settings up from medium or high to "ultra", then in many games you would have to look for it to notice the difference in actual gameplay.


The 6950 is old as fek, how is that even remotely adequate for anything these days? A HD 7950 is "adequate" at 1080p if you ask me; it gets about the same framerates at 1080p that I get with the R9 390 at 1440p.
And you claim it's adequate for 1440p, hah.
Even a GTX 970 will have a hard time at 1440p in modern games, hahaha.


----------



## Carniflex

Quote:


> Originally Posted by *Stige*
> 
> The 6950 is old as fek, how is that even remotely adequate for anything these days? A HD 7950 is "adequate" at 1080p if you ask me; it gets about the same framerates at 1080p that I get with the R9 390 at 1440p.
> And you claim it's adequate for 1440p, hah.


It really depends on what you are after. I have a long history of running high resolutions on underpowered graphics cards, so I think I have a pretty good idea how far a card can be stretched with some compromises. Mostly various combinations of 1080p Eyefinity between 3 and 5 screens with cards like the 5770, 6770 crossfire, 7870 and 7950, and more recently with the 390X and a single 4K screen.

Just ramping straight up to "ultra" and finding out that things run badly is no surprise; finding the right settings to drop for good-enough gameplay is the trick with older graphics cards. From what I've seen, it's pretty hard to tell at a glance whether a game is running on medium or high without looking it up in the settings. Some example screenshots I took a while ago for some long-buried thread happen to be in my pictures library and illustrate what I am trying to say:

If I remember correctly, those were DX9 low, DX11 medium and DX11 high settings in Metro 2033, with 5x 1080p portrait Eyefinity running on a 7870. And yeah, DX11 on high was not playable at all







But DX9 on low was a pretty nice experience on that card in that game back in the day.


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> Every time I put my voltage past +150mV and click apply, I get a black screen of death and have to reboot.


It may not be a complete black screen.

Try doing that again, and once you black-screen, pull the monitor cable out for a second and plug it back in.


----------



## Stige

Quote:


> Originally Posted by *Carniflex*
> 
> It really depends on what you are after. I have a long history of running high resolutions on underpowered graphics cards, so I think I have a pretty good idea how far a card can be stretched with some compromises. Mostly various combinations of 1080p Eyefinity between 3 and 5 screens with cards like the 5770, 6770 crossfire, 7870 and 7950, and more recently with the 390X and a single 4K screen.
> 
> Just ramping straight up to "ultra" and finding out that things run badly is no surprise; finding the right settings to drop for good-enough gameplay is the trick with older graphics cards. From what I've seen, it's pretty hard to tell at a glance whether a game is running on medium or high without looking it up in the settings. Some example screenshots I took a while ago for some long-buried thread happen to be in my pictures library and illustrate what I am trying to say:
> 
> If I remember correctly, those were DX9 low, DX11 medium and DX11 high settings in Metro 2033, with 5x 1080p portrait Eyefinity running on a 7870. And yeah, DX11 on high was not playable at all.
> 
> But DX9 on low was a pretty nice experience on that card in that game back in the day.


I don't like playing my games at 30 fps, and I'm not into consoles; maybe that is why.


----------



## Charcharo

Consoles don't do 30. They do 20... to 30. Don't confuse a locked 30 with 20-30.









Also, it is generally quite nice that we have the choice. I think just ramping everything up to Ultra without any thought thrown in is stupid, since some settings affect image quality more and drain less performance than others...


----------



## Stige

Quote:


> Originally Posted by *Charcharo*
> 
> *Consoles don't do 30. They do 20... to 30. Don't confuse a locked 30 with 20-30.
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Also, it is generally quite nice that we have the choice. I think just ramping everything up to Ultra without any thought thrown in is stupid, since some settings affect image quality more and drain less performance than others...


They are all the same to me: unplayable. Anything below 60 is unplayable, and I prefer 100+ most of the time. I'd rather have high fps over anything else; what's the point of a higher refresh rate if you don't use it?
Performance > *


----------



## Charcharo

Quote:


> Originally Posted by *Stige*
> 
> They are all the same to me: unplayable. Anything below 60 is unplayable, and I prefer 100+ most of the time. I'd rather have high fps over anything else; what's the point of a higher refresh rate if you don't use it?
> Performance > *


I am at 1440x900. My issues are CPU limitations and overhead









Not frame rate nor visual quality.

To me, 20-30 is unplayable. 30 is playable depending on the game, for example Witcher 3 or strategy games. 120 is ideal so far (since, when locked, there is no tearing).


----------



## Dundundata

I'll take high fps on ultra please


----------



## BradleyW

Quote:


> Originally Posted by *Stige*
> 
> They are all the same to me: unplayable. Anything below 60 is unplayable, and I prefer 100+ most of the time. I'd rather have high fps over anything else; what's the point of a higher refresh rate if you don't use it?
> Performance > *


This is the biggest mistake people make with high refresh rates. No matter what FPS you run at, a fixed-refresh screen still refreshes that many times per second. So if the FPS is at 30 and the screen is at 144Hz, it still refreshes at 144Hz, giving you significantly smoother and snappier animation despite the low FPS.

30 FPS @ 144Hz feels better than 60 FPS @ 60Hz. I can play any game at 30 FPS and it's perfectly smooth, no input lag and no tearing. The only advantage of high FPS on a high-refresh-rate screen is that it makes things even smoother, with less blur.


----------



## the9quad

Quote:


> Originally Posted by *BradleyW*
> 
> This is the biggest mistake people make with high refresh rates. No matter what FPS you run at, a fixed-refresh screen still refreshes that many times per second. So if the FPS is at 30 and the screen is at 144Hz, it still refreshes at 144Hz, giving you significantly smoother and snappier animation despite the low FPS.
> 
> 30 FPS @ 144Hz feels better than 60 FPS @ 60Hz. I can play any game at 30 FPS and it's perfectly smooth, no input lag and no tearing. The only advantage of high FPS on a high-refresh-rate screen is that it makes things even smoother, with less blur.












I think the best thing about high refresh rates is that with Vsync off I very rarely notice tearing. Sometime next year I will upgrade my video cards and go with a FreeSync or G-Sync monitor, depending on which camp knocks one out of the park, but in the meantime the high refresh rate helps.


----------



## Mysticking32

Quote:


> Originally Posted by *jdorje*
> 
> Hm, I'm using the beta. Do I already have these updates or do I need to downgrade to get them?
> 
> God, AMD's driver version numbers are terrible.


You don't necessarily have to uninstall your current drivers (I didn't, but it's generally recommended). And these are the WHQL drivers, not the beta.

http://support.amd.com/en-us/download/desktop?os=Windows%2010%20-%2064

Make sure you download the version for your OS, though.


----------



## Stige

Quote:


> Originally Posted by *BradleyW*
> 
> This is the biggest mistake people make with high refresh rates. No matter what FPS you run at, a fixed-refresh screen still refreshes that many times per second. So if the FPS is at 30 and the screen is at 144Hz, it still refreshes at 144Hz, giving you significantly smoother and snappier animation despite the low FPS.
> 
> 30 FPS @ 144Hz feels better than 60 FPS @ 60Hz. I can play any game at 30 FPS and it's perfectly smooth, no input lag and no tearing. The only advantage of high FPS on a high-refresh-rate screen is that it makes things even smoother, with less blur.


30 fps feels like crap regardless of refresh rate. You are right that it does make it feel much smoother, but it is still very choppy and unplayable. I just can't stand playing a game that doesn't run smoothly.

The only exception I have found so far is Naval Action, which runs at around ~50 FPS for me at all times, but that is SO slow-paced I can't even feel it.


----------



## BradleyW

Quote:


> Originally Posted by *Stige*
> 
> 30 fps feels like crap regardless of refresh rate. You are right that it does make it feel much smoother, but it is still very choppy and unplayable. I just can't stand playing a game that doesn't run smoothly.
> The only exception I have found so far is Naval Action, which runs at around ~50 FPS for me at all times, but that is SO slow-paced I can't even feel it.


I'm not sure what monitor you've tested, but here on my ASUS VG278HE and BenQ XR3501, 30FPS locked at 144Hz is smoother than 60 FPS @ 60 Hz. Not a single tear or stutter in the slightest, and I'm very sensitive to stuttering. Just ask the9quad, he'll vouch for that!


----------



## OneB1t

Quote:


> Originally Posted by *BradleyW*
> 
> I'm not sure what monitor you've tested, but here on my ASUS VG278HE and BenQ XR3501, 30FPS locked at 144Hz is smoother than 60 FPS @ 60 Hz. Not a single tear or stutter in the slightest, and I'm very sensitive to stuttering. Just ask the9quad, he'll vouch for that!


that's just a lie


----------



## BradleyW

Quote:


> Originally Posted by *OneB1t*
> 
> that's just a lie


No.


----------



## Stige

Quote:


> Originally Posted by *BradleyW*
> 
> I'm not sure what monitor you've tested, but here on my ASUS VG278HE and BenQ XR3501, 30FPS locked at 144Hz is smoother than 60 FPS @ 60 Hz. Not a single tear or stutter in the slightest, and I'm very sensitive to stuttering. Just ask the9quad, he'll vouch for that!


No tearing or stuttering doesn't mean anything if the game still runs at 30 FPS; that is just choppy. It isn't smooth in any way.


----------



## BradleyW

Quote:


> Originally Posted by *Stige*
> 
> No tearing or stuttering doesn't mean anything if the game still runs at 30 FPS, that is just choppy. It isn't smooth in any way.


If a game at 30 FPS moves around perfectly without a single hitch or stutter, has perfect input and hardly any motion blur, how is that not "smooth"?

If 30 FPS @ 144Hz is smoother than the world-renowned perfection of 60 FPS @ 60Hz, how is that not smooth either?

I guess we just have different perceptions of smooth, and I respect that.


----------



## jdorje

30 fps is in no way smoother than 60 fps. If you're saying it can tear less, then sure. A true 60 fps would never tear at 60 Hz, but of course there's no such thing as a true 60 fps.

Use FreeSync if tearing is that bad for you.


----------



## BradleyW

Quote:


> Originally Posted by *jdorje*
> 
> *30 fps is in no way smoother than 60 fps.* If you're saying it can tear less, then sure. A true 60 fps would never tear at 60 Hz, but of course there's no such thing as a true 60 fps.
> Use FreeSync if tearing is that bad for you.


You've taken it out of context by leaving out the critical part, the refresh rate. I will say it again.

30 FPS @ *60 Hz* = crap.
30 FPS @ *144 Hz* = 60 FPS @ *60 Hz* in terms of smoothness.


----------



## OneB1t

Nope, you are wrong.
If you have just 30 fps it will always be crap.
The panel can use some sort of frame motion calculation, but that adds input lag.


----------



## Stige

Quote:


> Originally Posted by *BradleyW*
> 
> You've taken it out of context by leaving out the critical part, the refresh rate. I will say it again.
> 
> 30 FPS @ *60 Hz* = crap.
> 30 FPS @ *144 Hz* = 60 FPS @ *60 Hz* in terms of smoothness.


No, just no. Never, ever.


----------



## BradleyW

Quote:


> Originally Posted by *OneB1t*
> 
> Nope, you are wrong.
> If you have just 30 fps it will always be crap.
> The panel can use some sort of frame motion calculation, but that adds input lag.


That's what makes high-refresh-rate screens so useful: you don't need as much FPS for a smooth and enjoyable experience as you do on a 60Hz screen. I'm more than happy to invite you to my home to show you.









Well, no worries, each to their own. I'm enjoying an amazing 144hz experience here, even at low fps. Lucky me I guess.


----------



## OneB1t

Stop trolling, 30 fps is the same on a 60Hz and a 144Hz panel.

If you want to fix tearing you need FreeSync/G-Sync, and 30 fps will still look worse than 60 fps.


----------



## Tivan

60 fps at 60Hz with stable vsync is as smooth as any other synced experience (though with more motion blur than higher refresh-rate-plus-framerate setups, or old-school CRTs).

30 fps at 144Hz is better than an unstable 45 fps on a 60Hz monitor, but neither is as smooth as a synced setup. Going over ~46 fps gets you the 'fluid' motion effect.

144Hz is less obvious about not being smooth because the intervals at which a frame, rendered for some arbitrary point in time, can be displayed are smaller. Only syncing lets you ensure that frames are shown in strict correlation with the temporal data they were rendered with.

Say you render a frame roughly every 2ms, with the frames representing the state of things roughly 2ms apart. Even a 144Hz monitor will show you each frame between 0ms and 1.999ms too late. (Unless you use vsync; then the frames are rendered for the monitor's intervals. A flawless implementation of framerate limiting might work as well.)

With an unstable 30 fps (a new frame every 33.33ms on average), a 144Hz refresh rate ensures that frames only get shown up to around 6.94ms too late, while on a 60Hz monitor an unstable 30 fps would have more than twice the variance. Vsync'd 30 fps on a 60Hz monitor, if stable, is pretty smooth though (you just don't get the 'fluid' feel until ~46 fps, but you'd need that on the 144Hz screen as well).
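The bounds above work out like this; this is just the arithmetic from the post, nothing measured:

```python
# Worked version of the timing argument above. Without any sync, two
# bounds matter: a finished frame waits up to one refresh interval
# before scan-out, and the newest available frame is up to one render
# interval old when the monitor picks it up.

def lateness_bounds_ms(fps, hz):
    return {
        "max_wait_for_refresh": 1000 / hz,      # finished frame waits at most this long
        "max_frame_age_at_pickup": 1000 / fps,  # staleness of the newest frame
    }

for fps, hz in [(500, 144), (30, 144), (30, 60)]:
    b = lateness_bounds_ms(fps, hz)
    print(f"{fps} fps @ {hz} Hz: wait <= {b['max_wait_for_refresh']:.3f} ms, "
          f"frame age <= {b['max_frame_age_at_pickup']:.2f} ms")
```

The 500 fps case reproduces the "between 0ms and 1.999ms too late" figure, the 30 fps @ 144Hz case the ~6.94ms bound, and 30 fps @ 60Hz comes out at ~16.67ms, more than twice the variance.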


----------



## BradleyW

Quote:


> Originally Posted by *OneB1t*
> 
> 
> 
> 
> 
> 
> 
> 
> Stop trolling, 30 fps is the same on a 60Hz and a 144Hz panel.
> 
> If you want to fix tearing you need FreeSync/G-Sync, and 30 fps will still look worse than 60 fps.


I never said I had a tearing issue.


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> Probably the worst cooler they make for the R9 390 :l
> 
> My condolences, I own one too.


Meh, I quite like it, tbh.


----------



## Stige

Quote:


> Originally Posted by *Chaoz*
> 
> Meh, I quite like it, tbh.


The VRM temps are unacceptable even at stock voltage.


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> The VRM temps are unacceptable even at stock voltage.


What's unacceptable for you? My temps seem fine. I put a custom profile in GPU Tweak so the fans spin at 1200 rpm, and the temps are: core 37°C and VRAM 30°C.


----------



## Stige

Quote:


> Originally Posted by *Chaoz*
> 
> What's unacceptable for you? My temps are fine it seems, i put a custom profile in the GPU Tweak, so the fans would spin at 1200 rpm. And the temps are for my core: 37°C and VRAM: 30°C


No one cares about idle temps; try Armored Warfare or some other GPU-abusive game and your VRM will hit 90°C+.


----------



## Charcharo

Quote:


> Originally Posted by *Stige*
> 
> No one cares about idle temps; try Armored Warfare or some other GPU-abusive game and your VRM will hit 90°C+.


For some reason Armored Warfare both runs and looks... terrible on my system.


----------



## Stige

Quote:


> Originally Posted by *Charcharo*
> 
> For some reason Armored Warfare both runs and looks... terrible on my system.


But idling in the AW hangar shoots my temps up more than anything else.


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> No one cares about idle temps; try Armored Warfare or some other GPU-abusive game and your VRM will hit 90°C+.


Just played a bit of Black Ops 3 maxed out completely and my VRAM temps are around 55°C


----------



## Stige

Quote:


> Originally Posted by *Chaoz*
> 
> Just played a bit of Black Ops 3 maxed out completely and my VRAM temps are around 55°C


VRM, not VRAM.

Use GPU-Z... And screenshot.


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> VRM, not VRAM.
> 
> Use GPU-Z... And screenshot.


Sorry, mistyped that, it does say VRM. Thought it said VRAM









I didn't screenshot it, i logged it.

GPU-ZSensorLog.txt 223k .txt file


----------



## ThatGuy16

What should the VRM temps stay under?


----------



## Stige

Quote:


> Originally Posted by *Chaoz*
> 
> Sorry, mistyped that, it does say VRM. Thought it said VRAM
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I didn't screenshot it, i logged it.
> 
> GPU-ZSensorLog.txt 223k .txt file


That game seems to put almost no load on the GPU. It's barely at 100% load anywhere, and you alt+tabbed a lot during that log or something; it basically jumps between zero and 100.
Menu?
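If you want to sanity-check a GPU-Z sensor log like the ones posted here without eyeballing the whole file, something like this works. Note the column names are assumptions on my part; GPU-Z headers vary by card and version, so match them to your log's header line:

```python
# Quick sanity check for a GPU-Z sensor log: find the max VRM temperature
# and count sudden load drops (heavy load straight down to near idle).
# GPU-Z logs are comma-separated text, but the exact column names
# ("VRM Temperature #1 [C]", "GPU Load [%]") vary by card and GPU-Z
# version; adjust them to match your log's header.
import csv

def summarize_log(path, vrm_col="VRM Temperature #1 [C]", load_col="GPU Load [%]"):
    temps, loads = [], []
    with open(path, newline="", errors="replace") as f:
        for raw in csv.DictReader(f, skipinitialspace=True):
            row = {(k or "").strip(): v for k, v in raw.items()}
            try:
                temps.append(float(row[vrm_col]))
                loads.append(float(row[load_col]))
            except (KeyError, ValueError):
                continue  # skip malformed or partial lines
    drops = sum(1 for a, b in zip(loads, loads[1:]) if a >= 90 and b <= 10)
    return (max(temps) if temps else None), drops

# max_vrm, load_drops = summarize_log("GPU-ZSensorLog.txt")
```

A log from a real gaming session should show very few of those 100-to-0 drops; lots of them usually means menus, loading screens or alt+tabbing rather than sustained load.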


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> That game seems to put almost no load on the GPU. It's barely at 100% load anywhere, and you alt+tabbed a lot during that log or something; it basically jumps between zero and 100.
> Menu?


Hmm, weird.

I didn't alt+tab at all. Maybe when you load into the game or something?


----------



## fat4l

Quote:


> Originally Posted by *m70b1jr*
> 
> Every time I put my voltage past +150mV and click apply, I get a black screen of death and have to reboot.


You are not the only one with that.
Hawaii cards suffer from this if volts/clocks are too high (it differs from card to card) and you are using DisplayPort at either [email protected]+ or [email protected].
This is what limits my OC as well; I'm doing 1200MHz on the core for this reason...


----------



## Stige

Quote:


> Originally Posted by *Chaoz*
> 
> Hmm, weird.
> 
> I didn't alt+tab at all. Maybe when you load into the game or something?


Every 5 seconds the GPU usage basically drops to 0; something is fishy about those logs. Run Valley for a bit or something, those temps seem way too low for a Strix DC3 390.


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> Every 5 seconds the GPU usage basically drops to 0; something is fishy about those logs. Run Valley for a bit or something, those temps seem way too low for a Strix DC3 390.


Well, I ran Valley through a full benchmark at Extreme HD on standard clock speeds.

Here is the log:

GPU-ZSensorLog2.txt 356k .txt file


After I quit the benchmark software I checked the temps, and the max I saw was 63°C on the VRM.

This is my score:


----------



## Stige

Either you got really lucky for some weird reason, or they have done something to those cards, because I wasn't the only one with very high VRM temps before, even with zero overclock and an aggressive fan profile.


----------



## Falmatrix2r

Yeah, it's still just a graphics card that I'll probably have to replace one day like everything else, but it's a nice feeling to not have to worry every time I get a new game and think to myself, "oh boy, will I be able to run this nicely?"

Adequate 1440p for a 6950? I'd say OK if you're playing L4D2, but definitely not a recent title. I'm at 1080p and having a hard time keeping 30 fps in GTA V on medium settings. Then again, I'm still surprised by how I manage decent fps with almost everything maxed out in games like Nazi Zombie Army or Sleeping Dogs; they perform very well while looking very nice. Can't say the same about Watch Dogs on my 6950... that was a pain.


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> Either you got really lucky for some weird reason or they have done something to those cards because I wasn't the only one before with very high VRM temps even with zero overclocks and aggressive fan profiles.


Hmm, beats me. I bought it off Amazon a month ago. Haven't had any problems whatsoever, even with 2000 RPM, which is about what it got up to during the benchmark just now.


----------



## Stige

Quote:


> Originally Posted by *Chaoz*
> 
> Hmm, beats me. I bought it off Amazon a month ago. Haven't had any problems whatsoever, even with 2000 RPM, which is about what it got up to during the benchmark just now.


Got pics of the cooler? I can't really explain why such a massive difference on supposedly the same cooler.
I doubt just new pads or something would make that big difference.


----------



## Chaoz

Quote:


> Originally Posted by *Stige*
> 
> Got pics of the cooler? I can't really explain why such a massive difference on supposedly the same cooler.
> I doubt just new pads or something would make that big difference.


No clue, TBH.

Here's a pic. I wrapped the red part in carbon wrap, because it's ugly and I don't have any red in my case.


----------



## Carniflex

I have a bit surprising hiccup somewhere between Windows 7 Pro, Crimson drivers, MATLAB figures and 390X.

I have a script that draws a bunch of figures in MATLAB from data files piled in a directory. I'm on Windows 7 Pro with a visual theme with Aero enabled, and two GFX cards - a 390X driving the main display array (3x1 portrait Eyefinity with 1080p screens) and a 7870 Eyefinity 6 card driving two auxiliary displays (both 1080p in portrait). So I run the script and it starts producing figures; in GPU-Z, VRAM usage creeps slowly up on the main card (390X) and remains practically flat on the 7870. When dedicated VRAM usage creeps to approx 1.7 GB, Windows starts complaining about "slow performance", and when I hit ~1.8 GB it just switches to the "Basic" color scheme, although I still have 6 GB of VRAM free according to GPU-Z and there is no significant memory load on the 7870 (which does have only 2 GB of VRAM).

Any thoughts? Suggestions? It does seem like Windows issue probably.


----------



## TsukikoChan

Quote:


> Originally Posted by *Carniflex*
> 
> I have a bit surprising hiccup somewhere between Windows 7 Pro, Crimson drivers, MATLAB figures and 390X.
> 
> I have a script that draws a bunch of figures in MATLAB from a data files piled in a directory. I have Windows 7 Pro with a visualization theme with Aero enabled, two GFX cards - 390X as a driver of the main display array (3x1 portrait eyefinity with 1080p screens) and a 7870 Eyefinity 6 card driving two auxiliary displays (both 1080p in portrait). So I run the script and it starts producing figures, in gpu-z vram usage creeps slowly up on the main card (390X) and remains practically flat on the 7870. When dedicated vRAM usage creeps to approx 1.7 GB windows starts complaining about "slow performance" and when I hit ~1.8 GB it just switches to "Basic" color scheme although I still have 6 GB of vRAM free according to GPU-z and there is no significant memory load on a 7870 (which does have only 2 GB of vRAM).
> 
> Any thoughts? Suggestions? It does seem like Windows issue probably.


It's a Windows thing; I've come across it at work with some heavy video applications.
Go to System Properties, click the performance Settings button, and change it from "Let Windows choose what's best for my computer" to either Custom or the best-appearance option. As long as "Enable Aero Peek" stays ticked, it shouldn't switch to Basic when VRAM/memory usage climbs.
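If you'd rather not click through the dialogs, the same switch lives in the registry. A .reg fragment like this should flip it to the Custom setting (my assumption about the value meanings is noted below; you still need "Enable Aero Peek" ticked in the dialog, and a log off/on for it to stick):

```reg
Windows Registry Editor Version 5.00

; Visual-effects mode (assumed meanings on Win7): 0 = let Windows decide,
; 1 = best appearance, 2 = best performance, 3 = custom.
; 3 keeps your individually ticked options such as "Enable Aero Peek".
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\VisualEffects]
"VisualFXSetting"=dword:00000003
```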


----------



## ralph9994

390 user here. any successful overclocks with the MSI version of the card? (Twin Frozr)


----------



## m70b1jr

I wish I could get my card to be stable when at 200mV instead of the black screen of death


----------



## Dundundata

Quote:


> Originally Posted by *ralph9994*
> 
> 390 user here. any successful overclocks with the MSI version of the card? (Twin Frozr)


Yes indeed, they can generally hit 1200MHz.


----------



## navjack27

Instead of going straight into a benchmark run to test stability, I'm just going to play the games I usually do to test a little bump over my previous 24/7 overclock. So I'll be using Afterburner with PowerPlay disabled, clocks of 1120/1625, no voltage adjustment, and a 50% power limit (I still don't think it does anything once you disable PowerPlay, but it won't hurt to raise it). I'm steering clear of 1750 memory clocks with the timing mod; I firmly believe it produces memory errors after a while that the memory controller gets sick of correcting. Eventually that clock speed in some games or tests just starts to crash a little - not full crashes, just little hitches that I don't see when I lower it to 1625.


----------



## ZealotKi11er

Quote:


> Originally Posted by *navjack27*
> 
> instead of going straight into a benchmark run to test stability. i'm just going to play the games i usually do in order to test out a lil bump over my previous 24/7 overclock. so i'll be using afterburner with disabled powerplay with the clocks of 1120/1625 with no voltage adjust and a 50% power limit (i still don't think it does anything once you disable powerplay but it won't hurt to raise it). i'm steering clear of 1750 memory clocks with the timing mod, i firmly believe that it produces mem errors after a while that i guess the memory controller gets sick of correcting. eventually that clock speed in some games or tests just starts to crash a lil bit, not full crashes but little hitches that i just don't see when i lower it to 1625.


I play games for a good 2 hours to determine a stable OC. 290Xs are very picky cards. They will crash an OC based on temperatures.


----------



## Bigeggs83

Hey guys.

I have a VTX R9 390 and am happy to have this card as it works brilliantly.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Correct on all accounts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In addition, iTurbo allows adjustments for VDDCI, but I don't really notice it helping, aside from an instance where I did pull VDDCI down due to some desktop corruption at idle.
> 
> I think my mod was amiss on the VDDCI part. Dropping the offset solved my issue though.
> 
> Crazy as I may have been, there's just an urge to push past where most mortals revel. Gotta see what's in there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Getting another card tomorrow.


Where do you get the most recent version of iTurbo without signing up for some spamware?


----------



## m70b1jr

Anyone got a fix for the +150mV black screen of death?


----------



## jdorje

If I push voltage+clock too high I get black screen; it goes away if I close the program though. I assumed this was a power problem from the power limit. Is that not the case?


----------



## fat4l

No. The digital output gets corrupted. Can't be fixed.


----------



## mus1mus

meh. You do know it happens on the 2XX cards too, right? And you can fix that with the right BIOS.

Sad to say, people on the 3XX cards don't do much BIOS flashing/modding.


----------



## OneB1t

Try the Stilt's MCU BIOS; I think that works best for the digital outputs.


----------



## ZealotKi11er

Never had problem with my cards +200mV via TriXX.


----------



## mus1mus

+400 iTurbo with the right BIOS here.


----------



## jdorje

I read somewhere it's only a problem with displayport, which is what I'm using.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> +400 iTurbo with the right BIOS here.


Where do we get iTurbo? The link on their site is broken. Can someone upload it?


----------



## pillowsack

Quote:


> Originally Posted by *mus1mus*
> 
> meh. You do know it happens on the 2XX cards too right? And you can fix that with the right BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sad to say, people on the 3XX cards don't do much BIOS flashing/modding.


I'm willing to test. Can you help me out?


----------



## ralph9994

Quote:


> Originally Posted by *Dundundata*
> 
> Yes indeed, they can generally hit 1200MHz


Can't seem to get any further than 1150MHz (1150/1625/+75mV).


----------



## fat4l

Quote:


> Originally Posted by *jdorje*
> 
> I read somewhere it's only a problem with displayport, which is what I'm using.


Yes, exactly: a DisplayPort problem.
Go and use 60Hz (if you are running higher) and see what happens. No blackouts.


----------



## mus1mus

Quote:


> Originally Posted by *pillowsack*
> 
> I'm willing to test. Can you help me out?


Try out other BIOSes.

If you don't have a backup card, take note of where and at what level of voltage the black screen happens.

Modding and flashing aren't entirely safe, and success isn't guaranteed.


----------



## jdorje

Quote:


> Originally Posted by *fat4l*
> 
> yes exactly, displayport problem.
> Go and use 60Hz(if you are using more) and see what happens. No blackouts.


But then I wouldn't need to overclock my card!

But okay, for breaking the 4690k+390 firestrike record, I'll try it.

I am using a modified version of my own bios, with memory strap upgrades and changes to all the voltages.


----------



## ThatGuy16

FYI the powercolor cooling system kinda sucks lol. Playing BF4 for a few hours max GPU temp 84c and I think VRM is about the same. 1100mhz /1550 + 38mv and 20% power limit. I feel like I need to buy a block for this thing lol


----------



## jdorje

Quote:


> Originally Posted by *ThatGuy16*
> 
> FYI the powercolor cooling system kinda sucks lol. Playing BF4 for a few hours max GPU temp 84c and I think VRM is about the same. 1100mhz /1550 + 38mv and 20% power limit. I feel like I need to buy a block for this thing lol


Those temps are pretty good...core can probably be dropped by remounting the cooler. VRMs are a lot cooler than mine.


----------



## jdorje

Seems like stability is extremely temperature-sensitive if I measure it with OCCT. If I open my case and raise fans to max I can get 30 mhz higher core clock before errors are reported. That's a 40C difference in VRM temperatures...

Problem is VRM temps don't correlate to core temps (different heat sink), so there's no way to ramp up fans for them. Or is there?


----------



## granadier12

Running COD: Advanced Warfare in the background. What do you guys think, are the temps and OC okay? With this game the card runs hotter than with any other game and I don't know why... with anything else it barely reaches 71°C under load.


----------



## OneB1t

That's simple:

your CPU is bottlenecking this card at 1080p, so while running a less graphics-intensive game it has much less work to do and runs cooler.

If you want to get rid of the CPU bottleneck, go 4K or buy another CPU.


----------



## tolis626

Hey guys, sorry if this has been addressed before, but here goes.

I used to run my CPU at 4.7GHz and all was good. Details don't matter so much at this point. A little while back, I decided to redo my CPU overclock properly, so I managed to get my CPU to 4.8GHz and my RAM to 2400MHz up from 2133MHz. And it's stable, at least during stress testing.

So what's the problem? My GPU would do 1175MHz +80mV on the core and 1675MHz +50mV AUX on the memory game stable (Max temps I'd see were high 70s to low 80s, depending on the game). And by stable I mean hours of BF4 would go by with no instabilities. Nowadays, I can't do that. I get artifacts or my PC downright crashes completely. And it's not only that. I played most games using 1125MHz core and 1600MHz memory with no added voltage for lower power consumption, but that's unstable too. My 1150MHz/1650Mhz profile (+50mV on core and AUX) seems stable, but I'm not even sure at this point.

Taking my CPU back to 4.7GHz seems to alleviate it somewhat, but I haven't had time to test it properly. Sorry that I don't provide more info, and for the messy post, but my brain is in disarray right now. Gotta get away from the screen a bit. If anyone has had similar experiences or has any tips to give me, please do so.

PS : I've also tried 1175MHz with +100mV, same result. I also tried 1175MHz with lower memory clocks. Again, same result. I don't think the CPU is unstable, but it may be causing instabilities...? Damn, my brain hurts...


----------



## OneB1t

Power supply ?


----------



## tolis626

Quote:


> Originally Posted by *OneB1t*
> 
> Power supply ?


I don't think it's the PSU. It's a 1300W EVGA G2 that's been tremendously good so far (Just a little noisy at idle). Although the cabling in my house is crap to be honest. I had to change the outlet I have it plugged in to, because my screen would go on and off when I rolled my window up or down (They share the same cabling it seems). And it was interfering with something in the PC, presumably the GPU, because the screen wouldn't do it when using any other input.

Maybe I should check the cables? What should I even do in that case?


----------



## ZealotKi11er

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, sorry if this has been adressed before, but here goes.
> 
> I used to run my CPU at 4.7GHz and all was good. Details don't matter so much at this point. A little while back, I decided to redo my CPU overclock properly, so I managed to get my CPU to 4.8GHz and my RAM to 2400MHz up from 2133MHz. And it's stable, at least during stress testing.
> 
> So what's the problem? My GPU would do 1175MHz +80mV on the core and 1675MHz +50mV AUX on the memory game stable (Max temps I'd see were high 70s to low 80s, depending on the game). And by stable I mean hours of BF4 would go by with no instabilities. Nowadays, I can't do that. I get artifacts or my PC downright crashes completely. And it's not only that. I played most games using 1125MHz core and 1600MHz memory with no added voltage for lower power consumption, but that's unstable too. My 1150MHz/1650Mhz profile (+50mV on core and AUX) seems stable, but I'm not even sure at this point.
> 
> Taking my CPU back to 4.7GHz seems to kind of alleviate, but I haven't had time to test it properly. Sorry that I don't provide more info and for the messy post, but my brain is in disarray right now. Gotta get away from the screen a bit. If anyone has had similar experiences or has any tips to give me, please do so.
> 
> PS : I've also tried 1175MHz with +100mV, same result. I also tried 1175MHz with lower memory clocks. Again, same result. I don't think the CPU is unstable, but it may be causing instabilities...? Damn, my brain hurts...


Drivers, ULPS are probably causing problems.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Drivers, ULPS are probably causing problems.


Enter the world's best ULPS guide:
http://www.overclock.net/t/1587347/how-to-disable-ulps-cfx-windows-10-crimson-all-versions#post_24790359
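For anyone who can't open the link: as I understand it, the heart of that guide is a single registry value. Something like the fragment below (the `0000` subkey is just an example; on your machine the AMD display adapter may sit under 0001, 0002, etc., so check each numbered subkey for an existing EnableUlps value before touching anything, and back up the registry first):

```reg
Windows Registry Editor Version 5.00

; ULPS off. The "0000" subkey below is an example: find the numbered
; subkey under this class key that already contains EnableUlps,
; set it to 0, then reboot.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```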


----------



## ralph9994

1150/1625/+75mv MSI R9 390

Happy with the result.


----------



## tolis626

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Drivers, ULPS are probably causing problems.


I'm using the latest drivers and ULPS is, as far as I know, disabled. And I don't think it should matter anyway, I'm only using one card.
Quote:


> Originally Posted by *BradleyW*
> 
> Enter the world's best ULPS guide:
> http://www.overclock.net/t/1587347/how-to-disable-ulps-cfx-windows-10-crimson-all-versions#post_24790359


I disabled ULPS in Afterburner before seeing the guide, and when I did what the guide said both values were 0, so ULPS was indeed disabled. But, as I said above, why would it matter? I'm using a single GPU. Isn't it relevant only in multi-GPU systems?

Not only that, but it didn't seem to help. I still got artifacts (colorful flashes all over my screen). I didn't completely crash, which is good, but I didn't play for long (like maybe 10 minutes), so it's probably the same as before.

I'm getting depressed... But thank you both for your help!


----------



## jdorje

Did you try reverting drivers?

Or switching from stable to beta?


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Did you try reverting drivers?
> 
> Or switching from stable to beta?


No, but this has been happening with both 15.12 and 16.1.


----------



## jodybdesigns

Quote:


> Originally Posted by *tolis626*
> 
> I'm using the latest drivers and ULPS is, as far as I know, disabled. And I don't think it should matter anyway, I'm only using one card.
> I disabled ULPS in Afterburner before seeing the guide, and when I did what the guide said both values were 0, so ULPS was indeed disabled. But, as I said above, why would it matter? I'm using a single GPU. Isn't it relevant only in multi-GPU systems?
> 
> Not only that, but it didn't seem to help. I still got artifacts (colorful flashes all over my screen). I didn't completely crash, which is good, but I didn't play for long (like maybe 10 minutes), so it's probably the same as before.
> 
> I'm getting depressed... But thank you both for your help!


I mentioned disabling ULPS for single-card users a while back. Some people (like myself) have had success disabling ULPS on single-card configs and getting the fluctuations to go away. PowerPlay can be an issue as well, since it's what's supposed to handle throttling on these cards, but with tons of testing I've found that ULPS has something to do with it too. Disabling ULPS on my single-card HD4870 config kept my secondary VGA monitors from flickering when I had an overclock (digital signals did not flicker), and I found the same true on my HD6870 / 5870 rebadge.

Now, I did not get any of the throttling issues with my Crossfire 7950s, or on ANY of my 7000 series cards, but I did get SOME throttling on my new 390; disabling ULPS through Sapphire TriXX resulted in no more throttling on my core clocks.

I hate to let reality sink in, but it sounds like the VRAM is bad on the card. I would RMA it if possible...


----------



## Chaoz

Quote:


> Originally Posted by *tolis626*
> 
> No, but this has been happening with both 15.12 and 16.1.


Have you tried installing 15.15? It seems to be the more stable driver for these cards.
I'm still running that driver without problems; I kept BSOD'ing on 16.1.


----------



## tolis626

Quote:


> Originally Posted by *Chaoz*
> 
> Have you tried installed 15.15? Seems this was the more stable driver for these cards.
> I'm still running the driver without problems, I kept BSOD'ing in 16.1.


You mean 15.12, right? If so, I was running that until 2-3 days ago. If not, is there even a 15.15 driver?

@jodybdesigns

I have thought that the card might be faulty, but I highly doubt it. I lost some overclocking potential for some reason; maybe it's CPU related, maybe not. But I don't think I'm going the RMA route. At least I hope not. If I can't pinpoint the issue with your guys' help, I'll conduct some thorough testing in the next few days to see what is happening. I'll try stock clocks on the GPU and lower overclocks on my CPU and RAM and see what happens.


----------



## jodybdesigns

Quote:


> Originally Posted by *tolis626*
> 
> You mean 15.12, right? If so, I was running that until 2-3 days ago. If not, is there even a 15.15 driver?
> 
> @jodybdesigns
> 
> I have thought that the card might be faulty, but I highly doubt it. I lost some overclocking potential for some reason. May be CPU related Maybe not. But I don't think it's going the RMA route. At least I hope so. If I can't pinpoint any issue with your guys' help, I will conduct some thorough testing in the next few days to see what is happening. I will try stock clocks on the GPU, lower overclocks on my CPU and RAM and see what happens.


Eh, if you are having to do all of that to get stable @ stock, I would say something on the hardware side has malfunctioned. It could be software, but these are all known symptoms of a bad card.


----------



## tolis626

Quote:


> Originally Posted by *jodybdesigns*
> 
> Eh, if you are having to do all of that to get stable @ stock, I would say something on the hardware side has malfunctioned. It could be software, but these are all known symptoms of a bad card.


I get what you're saying, but I never said it crashed at stock. It crashed at 1125MHz core / 1600MHz memory at stock VOLTS. I haven't run this card at stock since the first day I got it in September.









I will try to do a lengthy gaming session at 1150/1650MHz +50mV, my second known stable gaming overclock, and see what happens. But, tomorrow. For now, work.


----------



## Chaoz

Quote:


> Originally Posted by *tolis626*
> 
> You mean 15.12, right? If so, I was running that until 2-3 days ago. If not, is there even a 15.15 driver?
> 
> @jodybdesigns
> 
> I have thought that the card might be faulty, but I highly doubt it. I lost some overclocking potential for some reason. May be CPU related Maybe not. But I don't think it's going the RMA route. At least I hope so. If I can't pinpoint any issue with your guys' help, I will conduct some thorough testing in the next few days to see what is happening. I will try stock clocks on the GPU, lower overclocks on my CPU and RAM and see what happens.


No I mean 15.15 driver.
I wouldn't suggest it if it didn't exist.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


----------



## BradleyW

Quote:


> Originally Posted by *Chaoz*
> 
> No I mean 15.15 driver.
> I wouldn't suggest it if it didn't exist.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


I think he thought you made a typo at first.


----------



## Dundundata

Quote:


> Originally Posted by *tolis626*
> 
> I get what you're saying, but I never said it crashed at stock. It crashed at 1125MHz core 1600MHz at stock VOLTS.


That sounds like a high OC for stock volts; maybe it wasn't as stable as it seemed.


----------



## Dundundata

Quote:


> Originally Posted by *ralph9994*
> 
> 1150/1625/+75mv MSI R9 390
> 
> Happy with the result.


Sounds about right. I run the same OC at +50mV, which equates to about 1.32V; GPU temp around 70C and VRM temps in the 60s.


----------



## jdorje

Set the all-time high firestrike score for a 4690k + 390.

http://www.3dmark.com/fs/7199111










Neither CPU nor GPU overclocks are stable - just enough to pass.

I also have this one with a slightly higher graphics score. Couldn't reproduce it when I bumped CPU core though.

http://www.3dmark.com/fs/7198755


----------



## diggiddi

Quote:


> Originally Posted by *tolis626*
> 
> I don't think it's the PSU. It's a 1300W EVGA G2 that's been tremendously good so far (Just a little noisy at idle). Although the cabling in my house is crap to be honest. I had to change the outlet I have it plugged in to, because my screen would go on and off when *I rolled my window up or down* (They share the same cabling it seems). And it was interfering with something in the PC, presumably the GPU, because the screen wouldn't do it when using any other input.
> 
> Maybe I should check the cables? What should I even do in that case?










Your house has power windows or you live in your car








Sounds like a possible PSU issue to me


----------



## tolis626

Quote:


> Originally Posted by *Chaoz*
> 
> No I mean 15.15 driver.
> I wouldn't suggest it if it didn't exist.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


Quote:


> Originally Posted by *BradleyW*
> 
> I think he thought you made a typo at first.


What Bradley said. I thought it was a typo. I never even knew there was a 15.15 driver.









Will try it tomorrow!
Quote:


> Originally Posted by *Dundundata*
> 
> that sounds like a high OC for stock volts, maybe it wasn't as stable as it seemed.


It's quite good, but it was stable. Like, really, that's what I've been using for months. I started experimenting on the higher overclocks only after establishing a stock volt overclock. That's what's so strange.
Quote:


> Originally Posted by *diggiddi*
> 
> 
> 
> 
> 
> 
> 
> 
> Your house has power windows or you live in your car
> 
> 
> 
> 
> 
> 
> 
> 
> Sounds like a possible PSU issue to me


Yeah, bad choice of words on my part.









The shutters on my windows are raised and lowered by a mechanism, not by hand. The shutter itself rolls above the window. That's what I meant. How my brain decided I wanted to write that I roll down my windows, I have no idea. I don't live in my car - hell, I don't even have a car.









So now, how do I check if it's the PSU? I don't have another PSU at hand, and there's no one I could borrow a PSU of at least passable quality from. Most of my friends and family use laptops, and those with desktops use the cheapest PSU that met their needs. Damn... Any other way I might check it? Maybe I should check the cables first...


----------



## diggiddi

Downclock your GPU and/or CPU and see if the issues persist.


----------



## Falmatrix2r

Hey guys, I'm happy to announce that I got my 390X yesterday. The card has some great potential, and the Crimson software looks really nice. I'll try some OC on it later on, though I don't want to get too crazy with it.
I do have something weird going on: when I turn my PC on, the POST messages with all the vital information used to scroll by very fast, but now it's almost lagging and I don't know why, like it went from 60fps to 20. It's strange. Does it have something to do with my motherboard being 5 years old? I mistakenly had the UEFI switch on for the first boot and switched to legacy later on; can that be it?


----------



## Synntx

Okay, so I finally got all set up with my new Nitro 390X. I applied the strap mod from the 1250 strap all the way up to the 1750 strap. I've got +100mV/+25mV running 1200/1750, and the best graphics score I can obtain is, literally, right at 15000... I can't figure out how to push it any further for a higher graphics score. For some reason, if I try to use TriXX to push more voltage, I get major instability at anything more than +125mV.


----------



## ziggystardust

Hello everyone,

I was going to buy the MSI 390X a few weeks ago, but it wasn't in stock where I live, so I decided to wait a bit. Now it's in stock, but there's also the Sapphire 390X Nitro w/ backplate, and it's about 100 bucks cheaper than any other 390X.

Any thoughts on the Sapphire 390X Nitro? How's the core and VRM cooling performance? I won't go for a hardcore overclock, but I will try to push at least 1150 on the core.


----------



## diggiddi

Go for it then, it's a good card.


----------



## kizwan

Quote:


> Originally Posted by *ziggystardust*
> 
> Hello everyone,
> 
> I was going to buy the MSI 390X a few weeks ago, but it wasn't in stocks where I live. I decided to wait a bit. Now it's in stocks but also there is Sapphire 390X Nitro /w backplate and it's about 100 bucks cheaper than any other 390X.
> 
> Any thoughts on Sapphire 390X Nitro? How's the core and vrm cooling performance? I won't go hardcore overclock but i will try to push 1150 on core at least.
> .


EK is going to release a waterblock for the MSI 390/390X, just in case you want to water cool your card. If not, go with the Sapphire.


----------



## twisted46

Hey all,
I switched from team green at the beginning of the year due to game crashes with my 970 that I could not solve for anything. Happy to report that I have yet to have any driver crashes on any game! Also I have not lost any performance at 1440p and the upscaling to 1800p is enjoyable. I will post benchmarks when I'm back home.

However I do have questions and concerns. So first system specs:

Thermaltake 750W bronze psu
MSI z170 m3 mobo
I5 6500 cpu cooled by a H100I
8Gb corsair ddr4
180gb Intel ssd
1.5Tb hdd
Sapphire r9 390

First concern: thermals. The 390 runs A LOT hotter than the 970 or my 270 Xfire setup did. Under load I see the GPU running 77-80C, and VRM1 reaches 95C. The biggest issue I have with this is that my CPU temp rises 10-15C during benchmarks that don't touch the CPU at all. In fact, if I run a stress test with CPU-Z my temps are lower than when I run Heaven... I have two 120mm intakes at the front of the case, a single 120mm exhaust at the rear, and the top is occupied by the H100i. The obvious solution is a Kraken G10, but the VRMs around the GPU will likely roast, and for the price of a loop I may as well just buy a 390X Devil. What about the HG10?

The second concern is overclocking limits. At +100mV and +50% power my max OC is 1125/1650. I know that's close to on par for these cards, but I feel like I should be able to push a little higher. Does the max OC for these go up much with liquid?


----------



## m70b1jr

Quote:


> Originally Posted by *twisted46*
> 
> Hey all,
> I switched from team green at the beginning of the year due to game crashes with my 970 that I could not solve for anything. Happy to report that I have yet to have any driver crashes on any game! Also I have not lost any performance at 1440p and the upscaling to 1800p is enjoyable. I will post benchmarks when I'm back home.
> 
> However I do have questions and concerns. So first system specs:
> 
> Thermaltake 750W bronze psu
> MSI z170 m3 mobo
> I5 6500 cpu cooled by a H100I
> 8Gb corsair ddr4
> 180gb Intel ssd
> 1.5Tb hdd
> Sapphire r9 390
> 
> First concern, thermals. The 390 runs A LOT hotter than the 970 or my 270 Xfire setup ran. Underload I see the GPU running 77-80C and vrm1 temp reaches 95C. The biggest issue I have with this is that my cpu temp raises 10-15C during benchmarks that don't touch the cpu at all. In fact if I run stress test with cpuZ my temps are lower than when I run heaven... I have two 120mm intakes at the front of the case, a single 120.. Exhaust at the rear and then the top is occupied by the H100i. The obvious solution is a Kraken g10 but the vrms around the GPU will likely roast and for the price of a loop I may as well just by a 390x devil. What about the hg10?
> 
> The second concern is overclocking limits. At +100mV and +50% power my max OC is 1125/1650.I know that is close to on par with these cards but I feel like I should be able to push a little higher. Does max OC for these go up much with liquid?


Yes, I have an Arctic accelero hybrid III 140. I have it push air out of my case.


----------



## Spartoi

Quote:


> Originally Posted by *m70b1jr*
> 
> Yes, I have an Arctic accelero hybrid III 140. I have it push air out of my case.


Do you know if I could install this on my Sapphire 390X?


----------



## jdorje

Your temps are a bit high, and based on your CPU temp rising I assume it's from lack of case airflow.

That said, those temps are completely fine. The card throttles at 98C, and even with added voltage it's okay for it to get hot. The VRMs are okay up to 110-130C, though they lose a lot of efficiency as they get hotter.


----------



## m70b1jr

Quote:


> Originally Posted by *Spartoi*
> 
> Do you know if I could install this on my Sapphire 390X?


Probably. The cooler is VERY difficult to install; I needed 2 people to help me, but it's possible with one person, very precise hands, and patience. The cooler will fit almost anything; it's just that the VRAM heatsink might not fit, but there's PLENTY of headroom for installation, so I'm pretty sure it'll work.


----------



## tolis626

Quote:


> Originally Posted by *twisted46*
> 
> Hey all,
> I switched from team green at the beginning of the year due to game crashes with my 970 that I could not solve for anything. Happy to report that I have yet to have any driver crashes on any game! Also I have not lost any performance at 1440p and the upscaling to 1800p is enjoyable. I will post benchmarks when I'm back home.
> 
> However I do have questions and concerns. So first system specs:
> 
> Thermaltake 750W bronze psu
> MSI z170 m3 mobo
> I5 6500 cpu cooled by a H100I
> 8Gb corsair ddr4
> 180gb Intel ssd
> 1.5Tb hdd
> Sapphire r9 390
> 
> First concern, thermals. The 390 runs A LOT hotter than the 970 or my 270 Xfire setup ran. Under load I see the GPU running 77-80C and vrm1 temp reaches 95C. The biggest issue I have with this is that my cpu temp rises 10-15C during benchmarks that don't touch the cpu at all. In fact if I run a stress test with CPU-Z my temps are lower than when I run Heaven... I have two 120mm intakes at the front of the case, a single 120mm exhaust at the rear, and then the top is occupied by the H100i. The obvious solution is a Kraken G10, but the vrms around the GPU will likely roast, and for the price of a loop I may as well just buy a 390X Devil. What about the HG10?
> 
> The second concern is overclocking limits. At +100mV and +50% power my max OC is 1125/1650. I know that is close to on par with these cards, but I feel like I should be able to push a little higher. Does max OC for these go up much with liquid?


Is the H100i an intake or exhaust? This smells like airflow issues from afar. The Sapphire is among the coolest running 390 cards.

And if you think that can't make a big difference, consider this: I have an MSI 390X, and for cooling I have 2 bottom intake fans right below it to supply it with fresh air, two front intake fans, a rear exhaust fan and a top intake fan, along with my Corsair H110. All the fans, including those on the H110, are 140mm Phanteks PH-F140SP. With the same clocks and voltages, if the H110 is an intake there is little airflow and the GPU hits 83-84C, maybe even 85-86C, even though there are so many intakes, because it's an imbalanced setup (7 fans intake and 1 exhaust). If I switch the H110 to exhaust then air flows better from bottom front to top back and the GPU temps drop to 74-75C. So yeah, it matters.
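To put some rough numbers on the imbalance above, here's a minimal sketch of the intake/exhaust balance. The 60 CFM per-fan figure is a made-up placeholder, not a measured PH-F140SP value:

```python
# Rough case-airflow balance: positive result = more intake than exhaust
# (positive pressure). Per-fan CFM values here are hypothetical placeholders.

def net_airflow(intake_cfm, exhaust_cfm):
    """Net case airflow surplus: total intake minus total exhaust."""
    return sum(intake_cfm) - sum(exhaust_cfm)

# H110 as intake: 7 intakes vs 1 exhaust, heavily imbalanced
h110_intake = net_airflow([60] * 7, [60])       # 360 CFM surplus
# H110 flipped to exhaust: 5 intakes vs 3 exhausts, much closer to balanced
h110_exhaust = net_airflow([60] * 5, [60] * 3)  # 120 CFM surplus
```

The point is just that flipping the H110 moves two fans from one side of the balance to the other, which is why it changes airflow so much.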


----------



## jdorje

Oh I misread that! You absolutely need your aio to be on intake if you're overclocking.


----------



## Chaoz

Quote:


> Originally Posted by *twisted46*
> 
> First concern, thermals. The 390 runs A LOT hotter than the 970 or my 270 Xfire setup ran. Under load I see the GPU running 77-80C and vrm1 temp reaches 95C. The biggest issue I have with this is that my cpu temp rises 10-15C during benchmarks that don't touch the cpu at all. In fact if I run a stress test with CPU-Z my temps are lower than when I run Heaven... I have two 120mm intakes at the front of the case, a single 120mm exhaust at the rear, and then the top is occupied by the H100i. The obvious solution is a Kraken G10, but the vrms around the GPU will likely roast, and for the price of a loop I may as well just buy a 390X Devil. What about the HG10?


That's really quite hot, tbh. With my ASUS Strix R9 390 DC3OC the temps barely go over 70°C and the VRMs barely go over 65°C, even while playing games for hours on end or running Valley on extreme settings.

Maybe I got lucky??


----------



## Loosenut

come test your overclocks or just support Team AMD & come help us win the Forum Folding War

http://www.overclock.net/t/1587731/forum-folding-war-2016-team-amd#post_24796394


----------



## ziggystardust

Quote:


> Originally Posted by *kizwan*
> 
> EK going to release waterblock for MSI 390/390X. Just in case you want to water cool your card. If not, go with Sapphire.


Thanks for the heads up. I'm not planning a water cooling setup.

My main concern is the vrm temps though. I noticed some people having kinda high vrm temps on their Sapphire 390s.


----------



## jazz995756

So a few days ago I got 2 ultrawide monitors. When I hooked them up everything was fine until I opened Chrome; I started getting glitches and artifacts everywhere. It wasn't just Chrome, it was any open program, but once I minimized the windows and only my desktop was showing on both monitors the issue stopped. Anyone know anything about that? Running all stock clocks. I did a reboot and haven't noticed it since, but I haven't had time to use my computer at all. Kind of curious how it will handle gaming on the ultrawide though.


----------



## Spartoi

Quote:


> Originally Posted by *m70b1jr*
> 
> Probably. The cooler is VERY difficult to install (I needed 2 people to help me), but it is possible with one person with very precise hands and patience. The cooler will fit anything; it's just that the VRAM heatsink might not fit, but there's PLENTY of headroom for installation, so I'm pretty sure it'll work.


What are the temps of your VRMs and GPU before and after using the cooler? Also, which would you recommend getting? The 290x version or the generic one?


----------



## Carniflex

Quote:


> Originally Posted by *ziggystardust*
> 
> Hello everyone,
> 
> I was going to buy the MSI 390X a few weeks ago, but it wasn't in stock where I live. I decided to wait a bit. Now it's in stock, but there is also the Sapphire 390X Nitro w/ backplate and it's about 100 bucks cheaper than any other 390X.
> 
> Any thoughts on the Sapphire 390X Nitro? How's the core and vrm cooling performance? I won't go hardcore overclock but I will try to push 1150 on core at least.


There should not be major performance differences between different brands of 390X. And as far as waterblocks go, full covers are normally done only for the most popular models. However, Alphacool has a program where they do hybrid "full cover" blocks for pretty much any card out there; if they don't have one already done for a specific card, you can send your card in, they will make a block for it, and then send you a block from the first batch free as a thank-you. Their blocks are core-only but have a rather massive ~700 g aluminium heatsink for the VRMs and RAM, plus a backplate.


----------



## m70b1jr

Quote:


> Originally Posted by *Spartoi*
> 
> What are the temps of your VRMs and GPU before and after using the cooler? Also, which would you recommend getting? The 290x version or the generic one?


290x has the same PCB as the 390x


----------



## twisted46

Quote:


> Originally Posted by *m70b1jr*
> 
> Yes, I have an Arctic accelero hybrid III 140. I have it push air out of my case.


What card do you have and which model did you buy?

Has anyone tried the hg10 or h10? And kept vrms cool?

My H100i is at the top of the case so flipping it to an intake would hardly help get air to the bottom of the card, but would keep my cpu cooler.

I messed around with the fan curve and got temps down to 73 on the gpu and 85 on vrm1. In Valley (where I had the blue checkered artifacting) vrm1 was still reaching 91C, but the good news is that I am at 1150/1600 now with stable benchmarks, and I played FC4 at 3200x1800 for an hour with no issues. I promise benchmarks will come. Right now I'm hitting 1544 in Heaven (1080p Extreme, 8xAA) and 1240s at 1440p Extreme with no AA.
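For anyone else tweaking their curve in software, the kind of temp-to-fan mapping being adjusted here can be sketched as a few points with linear interpolation between them. The points below are made-up illustrative values, not a recommended 390 curve:

```python
# Minimal sketch of a custom GPU fan curve, like the ones set in Afterburner/Trixx:
# a few (temp C, fan duty %) points, linearly interpolated in between.
# Curve points are illustrative placeholders only.

def fan_speed(temp_c, curve=((40, 25), (60, 45), (75, 70), (85, 100))):
    """Return fan duty (%) for a GPU temperature by interpolating the curve."""
    points = sorted(curve)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

The trade-off is the usual one: steeper segments in the 70-85C range knock a few degrees off vrm1 at the cost of noise.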


----------



## kizwan

Quote:


> Originally Posted by *jazz995756*
> 
> So a few days ago I got 2 ultrawide monitors. When I hooked them up everything was fine until I opened Chrome; I started getting glitches and artifacts everywhere. It wasn't just Chrome, it was any open program, but once I minimized the windows and only my desktop was showing on both monitors the issue stopped. Anyone know anything about that? Running all stock clocks. I did a reboot and haven't noticed it since, but I haven't had time to use my computer at all. Kind of curious how it will handle gaming on the ultrawide though.


By glitches did you mean flickering? Try disabling hardware acceleration under Settings >> Advanced settings. I'm pretty sure this is because the gpu memory clock is fluctuating.
Quote:


> Originally Posted by *ziggystardust*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> EK going to release waterblock for MSI 390/390X. Just in case you want to water cool your card. If not, go with Sapphire.
> 
> 
> 
> Thanks for the heads up. I'm not planning a water cooling setup.
> 
> My main concern is the vrm temps though. I noticed some people having kinda high vrm temps on their Sapphire 390s.
Click to expand...

I'm pretty sure Sapphire VRM cooling is better than MSI. If you overclock & push more voltage, VRM can easily get hot though.


----------



## Dundundata

My msi vrm is usually in the 60s tops


----------



## afyeung

Quote:


> Originally Posted by *m70b1jr*
> 
> 290x has the same PCB as the 390x


No it doesn't. The layout is slightly different since there are only aftermarket cards, and the cap sizes are slightly different. That means something like the Corsair HG10, which was designed for the reference 290x, won't fit the 390x. Even though Corsair specified the HG10 to fit my MSI 290, it didn't fit well because the cap sizes are different.


----------



## afyeung

Quote:


> Originally Posted by *jdorje*
> 
> Your temps are a bit high, and based on your cpu temp rising I assume it's from lack of case airflow.
> 
> That said, those temps are completely fine. The card throttles at 98C, and even at stock voltage it's okay for it to get hot. The vrms are okay up to 110-130C, though they drop a lot in efficiency as they get hotter.


That's not true smh. The 290(x) throttled at 95-96 and the 390(x) throttles at 94. The stock fan curve will keep you from ever getting that high anyways. 98c is way too hot.


----------



## afyeung

Quote:


> Originally Posted by *Dundundata*
> 
> Yes indeed, they can generally hit 1200mhz


That's for a very good card. An average card will hit 1150.


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> Okay, so I finally got all setup with my new Nitro 390x. I applied the 1250 strap all the way up to the 1750 strap. I've got +100mv/+25mv running 1200/1750, and the best graphics score I can obtain is like, literally, right at 15000.....I can't figure out how to push it any further for a higher graphics score. For some reason, if I try to use TRIXX to push more voltage, I get major instability for anything more than +125mv.


I wanted to ask but forgot, how high is your actual AUX voltage? I don't mean the offset you have set in Afterburner/Trixx, but the VDDCI value shown in GPU-z. Thanks.


----------



## jdorje

Quote:


> Originally Posted by *Dundundata*
> 
> My msi vrm is usually in the 60s tops


Is the vrm cooled by the same heatsink that cools the core? On my xfx it's a separate heatsink inside the other one, but I feel like it gets a good bit hotter.

And what connects the heat sink to the vrms? Thermal padding?
Quote:


> Originally Posted by *afyeung*
> 
> That's not true smh. The 290(x) throttled at 95-96 and the 390(x) throttles at 94. The stock fan curve will keep you from ever getting that high anyways. 98c is way too hot.


98 is what's set in my 390s bios. I don't disagree it's way too hot. And if you raise voltage you have to keep temperatures lower.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Is the vrm cooled by the same heat sink as cools the core? On my xfx it's a separate heat sink inside the other one but I feel like it gets a good bit hotter.
> 
> And what connects the heat sink to the vrms? Thermal padding?
> 98 is what's set in my 390s bios. I don't disagree it's way too hot. And if you raise voltage you have to keep temperatures lower.


The Arctic Accelero Hybrid III 140 is the only cooler with a VRM heatsink and a dedicated fan for it.


----------



## jdorje

Quote:


> Originally Posted by *m70b1jr*
> 
> The Arctic Accelero Hybrid III 140 is the only cooler with a VRM heatsink and a dedicated fan for it.


Huh? My xfx 390 has a heatsink on the core vrms. Every other cooler has even lower vrm temps so they must have one too.

No dedicated fan though. It's just a small heat sink with fins inside the main heat sink.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Huh? My xfx 390 has a heatsink on the core vrms. Every other cooler has even lower vrm temps so they must have one too.
> 
> No dedicated fan though. It's just a small heat sink with fins inside the main heat sink.


I meant like an all-in-one liquid cooler


----------



## jdorje

Quote:


> Originally Posted by *m70b1jr*
> 
> I meant like a all in one liquid cooler


Ahh. However if you throw an AIO on then it will lower ambient temps significantly by carrying the heat far away from the card (put it on exhaust). Then you can rig up a VRM heat sink easily enough I imagine. Seems like a much more attractive choice than just getting a slightly better open-air cooler.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Ahh. However if you throw an AIO on then it will lower ambient temps significantly by carrying the heat far away from the card (put it on exhaust). Then you can rig up a VRM heat sink easily enough I imagine. Seems like a much more attractive choice than just getting a slightly better open-air cooler.


Dude, what I'm trying to say is that the Arctic Accelero Hybrid III is an AIO liquid cooler that comes with a dedicated VRM cooler.


----------



## jdorje

Quote:


> Originally Posted by *m70b1jr*
> 
> Dude, what I'm trying to say is that the Arctic Accelero Hybrid III is an AIO liquid cooler that comes with a dedicated VRM cooler.


Oh!

How much is it?

Didn't know they made hybrids.

Does it fit all 390/xs? They have slightly different profiles...


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Oh!
> 
> How much is it?
> 
> Didn't know they made hybrids.
> 
> Does it fit all 390/xs? They have slightly different profiles...


The 290x one fits the 390x, and you can get it for $100 on Amazon


----------



## jdorje

Quote:


> Originally Posted by *m70b1jr*
> 
> The 290x one fits the 390x, and you can get it for $100 on Amazon


I understand that but the different manufacturers make 390s (and 290s) with different profiles. A standard 390 waterblock won't fit my 390 for instance.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> I understand that but the different manufacturers make 390s (and 290s) with different profiles. A standard 390 waterblock won't fit my 390 for instance.


The water block will fit. It's the VRM that MIGHT not fit, but it has plenty of headroom.


----------



## Carniflex

Quote:


> Originally Posted by *jdorje*
> 
> Oh!
> 
> How much is it?
> 
> Didn't know they made hybrids.
> 
> Does it fit all 390/xs? They have slightly different profiles...


Alphacool is also releasing a new GPU core block with an integrated pump in a month or so. So if you want, you can get away with just this new core/pump combo and a radiator (with a fan on it, ofc). Standard G1/4'' connections, so you can probably pick the tubing size you like. I suspect they will sell them as kits as well, with 10/13mm tubing like their other kits. Alphacool blocks are also "hybrid" blocks: a massive heatsink for the VRMs and RAM, while the GPU core is cooled by water. There is also some contact between the water block and that passive heatsink, so it's not all passive.

If you are interested in hybrid solutions.


----------



## twisted46

Quote:


> Originally Posted by *m70b1jr*
> 
> The water block will fit. It's the VRM that MIGHT not fit, but it has plenty of headroom.


What card did you fit it to?


----------



## Mazda6i07

Maybe I'm just unlucky or something, but I'm pretty sure my MSI R9 390 has coil whine, a tiny buzzing sound while gaming. As soon as I exit gameplay it goes away. Pretty mad, since I just installed the card and that was the first thing I noticed. Guess I'll have to RMA it through Newegg day one....









----------



## Synntx

Quote:


> Originally Posted by *tolis626*
> 
> I wanted to ask but forgot, how high is your actual AUX voltage? I don't mean the offset you have set in Afterburner/Trixx, but the VDDCI value shown in GPU-z. Thanks.




???


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> 
> 
> ???


Isn't that too much? I think that the stig (???) had said that 1.05V is the max we should be doing.


----------



## pillowsack

Quote:


> Originally Posted by *Synntx*
> 
> 
> 
> ???


what are your clocks at? I've seen some mention of people with 290s doing 1.2V aux, which seems bizarre. I think with the right cooling (water) that's ok though.

I've noticed with my MSI that 1150 is NOT stable (CS:GO stable, the true stress test for some damn reason...) at 1.32 vcore, but when I put aux at 1.063 it is stable. I'm gonna be experimenting more with the AUX here in a bit.


----------



## navjack27

Isn't it nuts how csgo can be the ultimate test for stability! A 45 min match is great for bringing that stuff out, especially when you're running over 400fps for most of it


----------



## EternalRest

Is the Powercolor 390 PCS+ good? I'm thinking about getting it. It's on sale, and with a Newegg gift card I can get it cheaper, plus MIR.


----------



## jazz995756

What waterblock can I use on my xfx 390 I'm wanting a full block but I read that it isn't compatible anymore? Anyone have a link as to where I can get one?

Edit: Will either of these work?
http://www.frozencpu.com/products/24125/ex-blc-1713/EK_MSI_Gigabyte_Radeon_R9-290X_VGA_Liquid_Cooling_Block_Rev_20_-_Acrylic_EK-FC_R9-290X_Rev20.html?tl=g57c599s2078

or this

https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0?SID=drhhkp7jkafgtquojgtr2hggb3

if it helps this is my card R9-390A-8DFR the one with the special vrm cooling and such


----------



## navjack27

Quote:


> Originally Posted by *Synntx*
> 
> Okay, so I finally got all setup with my new Nitro 390x. I applied the 1250 strap all the way up to the 1750 strap. I've got +100mv/+25mv running 1200/1750, and the best graphics score I can obtain is like, literally, right at 15000.....I can't figure out how to push it any further for a higher graphics score. For some reason, if I try to use TRIXX to push more voltage, I get major instability for anything more than +125mv.


i was gonna ask how u got that kinda OC, but you have a dual 8-pin power setup going on up in thar... so that's how, i guess.


----------



## bichael

Quote:


> Originally Posted by *EternalRest*
> 
> Is the Powercolor 390 PCS+ good? I'm thinking about getting. Its on sale and with a NewEgg gift card, I can get it cheaper plus MIR.


Yep. Has a good cooler and backplate. Waterblocks available. Generally may not be the highest clocking but obviously that's still mostly luck of the draw.


----------



## Carniflex

Quote:


> Originally Posted by *jazz995756*
> 
> What waterblock can I use on my xfx 390 I'm wanting a full block but I read that it isn't compatible anymore? Anyone have a link as to where I can get one?
> 
> Edit: Will either of these work?
> http://www.frozencpu.com/products/24125/ex-blc-1713/EK_MSI_Gigabyte_Radeon_R9-290X_VGA_Liquid_Cooling_Block_Rev_20_-_Acrylic_EK-FC_R9-290X_Rev20.html?tl=g57c599s2078
> 
> or this
> 
> https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0?SID=drhhkp7jkafgtquojgtr2hggb3
> 
> if it helps this is my card R9-390A-8DFR the one with the special vrm cooling and such


Probably not. Even Alphacool does not seem to have a block for that particular model atm: http://www.alphacool.com/configurator.php

Then again, that might be an opportunity if you want to go that route. Alphacool has a program where you can contact them about a block for a card model they don't have yet. If you are the first one with that card to contact them, you can mail them the card; they will scan it, make the first prototype, and make sure everything works. Then they will send you a retail version of the block free of charge when it releases a little while later.

Do note that Alphacool blocks are not "true" full cover - they are basically a core block + large aluminium heatsink + backplate. It's more like a hybrid block, as the core block also makes contact with the aluminium heatsink, so it's cooled both passively and by the water flow through the core block. Meaning VRM temps are a bit higher than on a "true" full cover block, while the GPU core is in the same ballpark if not even a little bit better.


----------



## Mazda6i07

I haven't overclocked my card whatsoever, but if steam randomly closes out of a game after 5-10 minutes of gameplay for no reason does that mean something is unstable or is it the drivers or ?


----------



## ziggystardust

Just bought the Sapphire 390X Nitro today. Haven't tried overclocking just yet. I wanted to test it a bit first.

First things first, I'm just a bit unlucky with coil whine it seems. I've tried a Gigabyte 980 G1 before and it had terrible coil whine. Now this card has a slight coil whine, especially at some high frame rates (like the Heaven benchmark or the Witcher 3 menu).

Also, I found the vrm temps a bit high compared to what people are getting with MSI 390s. Max temps I've seen so far are: vrm temp1 81C, vrm temp2 74C, core 70C @ default fan profile. The card idles at 35-38C core, 38-45C vrms while the fans aren't spinning. I have 2 front intake fans, 1 rear exhaust and 2 top exhaust fans, so I think my case airflow is decent. But I'm open to any suggestions.


----------



## Synntx

Quote:


> Originally Posted by *pillowsack*
> 
> what are your clocks at? I've seen some mention of people with 290s doing 1.2v aux which seems bizarre. I think with the right cooling that's ok though(water).
> 
> I've noticed with my MSI that 1150 is NOT stable(CS:GO stable, the true stress test for some damn reason...) at 1.32vcore but when I put aux at 1.063 it is stable. I'm gonna be experimenting more with the AUX here in a bit.


Clocks are 1200/1750
Quote:


> Originally Posted by *navjack27*
> 
> i was gonna ask how u got that kinda OC, but you have an dual 8pin power going on up in thar... so thats how i guess.


Yea but does the score match up with the OC?
Quote:


> Originally Posted by *ziggystardust*
> 
> Just bought Sapphire 390X Nitro today. Haven't tried overcloking just yet. I wanted to test it a bit first.
> 
> First things first I'm just a bit unlucky with coil whine it seems. I've tried a Gigabyte 980 G1 before and it had a terrible coil whine. Now this card has a slight coil whine, especially at some high frames (like Heaven benchmark or Witcher 3 menu).
> 
> Also I found vrm temps a bit high compared what people getting with MSI 390s. Max temps I've seen so far are; vrm temp1: 81C vrm temp2: 74C core 70C @ default fan profile. Card is idling at 35-38C core, 38-45C vrms while fans not spinning.. I have 2 front intake fans, 1 back exhaust and 2 top exhaust fans, so I think my case airflow is decent. But I'm open to any suggestions.


Pull the heatsink, clean the chip, apply new thermal paste (non-metallic), reseat the heatsink, watch temps go down, ??????, profit.

Honestly, those temps seem about right to me.


----------



## ziggystardust

Quote:


> Originally Posted by *Synntx*
> 
> Clocks are 1200/1750
> Yea but does the score match up with the OC?
> Pull the heatsink, clean the chip, apply new thermal paste (non metallic), reseat heatsink, watch temps go down, ??????, profit
> 
> Honestly those Temps seem about right to me.


What's your vrm temps at load?


----------



## Synntx

Quote:


> Originally Posted by *ziggystardust*
> 
> What's your vrm temps at load?


75-80 i believe. I'll run a stress test when i get home today and report back. Sapphire has the best cooling from what I've seen versus ASUS and MSI


----------



## ziggystardust

Quote:


> Originally Posted by *Synntx*
> 
> 75-80 i believe. I'll run a stress test when i get home today and report back. Sapphire has the best cooling from what I've seen versus ASUS and MSI


Thanks man. I'll be waiting for your result.


----------



## navjack27

i'm playing too much maplestory as of late to actually follow thru with any more overclocking. but now i'm just doing 1120/1625 with a +31mv VDDC and a stock 1000mv VDDCI. i've stopped using msi ab and now i'm using clockblocker and iTurbo


----------



## Synntx

Quote:


> Originally Posted by *ziggystardust*
> 
> Thanks man. I'll be waiting for your result.


VRM2 temp is 65 at idle and 72-75 at load. GPU temp is 50 at idle and 65 at load.


----------



## mus1mus

Quote:


> Originally Posted by *navjack27*
> 
> i'm playing too much maplestory as of late to actually follow thru with any more overclocking. but now i'm just doing 1120/1625 with a +31mv VDDC and a stock 1000mv VDDCI. i've stopped using msi ab and now i'm using clockblocker and iTurbo


How does clockblocker improve the card's performance? Like Benching?


----------



## navjack27

i've gone back and forth between using clockblocker and disabling powerplay with msi ab. i'd say clockblocker, at least in the newest version, is equal to locking the clocks at full tilt

EDIT: its so hard to not type clockblocker incorrectly


----------



## ziggystardust

Quote:


> Originally Posted by *Synntx*
> 
> VRM2 temp is 65 at idle and 72-75 at load. GPU temp is 50 at idle and 65 at load.


Thanks a lot for the results. These are for the default voltages, right?

And also I noticed something weird. After I installed Trixx, I noticed the gpu voltage is at +19mV by default. What's that about?

I reset that for now. Is it possible that the gpu voltage increased automatically?


----------



## ZealotKi11er

Quote:


> Originally Posted by *ziggystardust*
> 
> Thanks a lot for the results. These are for the default voltages right?
> 
> And also I noticed something weird. After I installed Trixx, I noticed the gpu voltage is at +19mV default. What's that about?
> 
> I resetted that for now. Is it possible that the gpu voltage increased automatically?


Unless the BIOS has extra voltage.


----------



## mus1mus

Offset in BIOS should do that.


----------



## tolis626

In case anyone was wondering what was happening with the problems I had with my GPU, it seems it's not the GPU after all. A semi-stable CPU does cause instability when the GPU is also at its limits. I just finished playing BF4 at 1150/1650MHz with no problem. If I go to 1175MHz it will most probably crash. I also backed my CPU down to 4.7GHz and it didn't have any problems, just some artifacting at some point because I used semi-stable GPU settings on purpose, but no downright crashing like before. So yeah, there's that. I'm relieved it's not the GPU.

After I finish some exams in the next week or so, I will probably get around to changing the TIM on my card, as it does bottleneck my cooling even with the cold ambients where I live right now. MSI usually honors the warranty even if I change the TIM, right? I mean, as long as I don't damage the card.


----------



## ziggystardust

I've been doing some research to find out why Sapphire 390s have somewhat high vrm temps, and found out they actually used International Rectifier PowIRstages instead of the mosfets of AMD's reference design, which is also used in other cards like MSI and xfx etc...

Could that be the reason for the higher vrm temps, since a power stage design includes high/low-side mosfets as well as drivers, which would tend to run hot? Or is it just the slightly worse vrm cooling design?


----------



## ZealotKi11er

Quote:


> Originally Posted by *ziggystardust*
> 
> I've been doing some researches to find out why Sapphire 390s have a bit high vrm temps and found out they actually used International Rectifier's PowIRStages instead of copper mosfets which is AMD's reference design and also used in other cards like MSI and xfx etc...
> 
> Could that be the reason of those higher vrm temps since power stage design includes hi-low side mosfets as well as drivers which would tend to run hot? Or is it just the slightly worse vrm cooling design?


Reference VRM1 runs very hot. Hotter than anything I have used. You probably need to buy better thermal pads. I only have 11 W/mK ones, but I might have to get some 17 W/mK Extreme ones.


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Reference VRM1 run very hot. Hotter than anything I have used. You probably need to buy better thermal pads. I only have 11W ones but might have to get some 17W Extreme ones.


Heya. If you still have the pad from the reference card, it worked very well. Not breaking 50C now, compared to EK's 70+C.


----------



## Synntx

Quote:


> Originally Posted by *ziggystardust*
> 
> Thanks a lot for the results. These are for the default voltages right?
> 
> And also I noticed something weird. After I installed Trixx, I noticed the gpu voltage is at +19mV default. What's that about?
> 
> I resetted that for now. Is it possible that the gpu voltage increased automatically?


Mine does the same thing. The card comes slightly overclocked from the factory, which is why it shows +19.

Those are my VRM temps under load with +100mv at 1200/1750. Of course these are benching speeds, not gaming speeds.
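Since the +19mV thing keeps coming up: the way these offset tools stack can be sketched like this. The 1200 mV base below is a hypothetical number for illustration, not the Nitro's actual base VDDC:

```python
# Sketch of how offset-based voltage tools stack: the voltage the card actually
# targets is the BIOS base plus the factory offset (e.g. Trixx showing +19 mV
# on a factory-OC'd card) plus whatever offset the user dials in.
# The base value below is a hypothetical placeholder.

def effective_mv(base_mv, factory_offset_mv=0, user_offset_mv=0):
    """Effective core voltage target in millivolts."""
    return base_mv + factory_offset_mv + user_offset_mv

stock = effective_mv(1200, factory_offset_mv=19)          # 1219 mV
overclocked = effective_mv(1200, 19, user_offset_mv=100)  # 1319 mV
```

Which would explain why resetting the slider to 0 actually undoes the factory bump rather than returning the card to "stock".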


----------



## ziggystardust

Quote:


> Originally Posted by *Synntx*
> 
> Mine does the same thing. The card comes slightly overclocked from the factory which is why it shows +19.
> 
> My VRM temps are under load with +100mv at 1200/1750. Of course these are benching speeds, not gaming speeds.


My VRM temps are a bit high then. I'm hitting low 80s max and it averages around 74-75, and that's just with default clocks. I haven't tried overclocking yet.

I know it's still far from the danger zone but I'm still worried a bit.

Do you use custom fan curve? If yes what's your curve like?


----------



## Falmatrix2r

ah, I was wondering the same thing with my Afterburner saying +13 too... and Sapphire needed to do that for a 5MHz stable overclock? oh boy...
And what do you mean by benching speeds and not gaming?


----------



## ziggystardust

Quote:


> Originally Posted by *Falmatrix2r*
> 
> ah I was wondering the same thing with my afterburner saying +13 too...and Sapphire needed to do that for a 5Mhz stable overclock? oh boy...
> And what do you mean by benching speeds and not gaming?


Strangely enough, Afterburner is saying +0 for me.

I guess he meant he's not using these speeds in long gaming sessions, it's just for benchmarking purposes.

What's your vrm temps like by the way?


----------



## Synntx

Quote:


> Originally Posted by *Falmatrix2r*
> 
> ah I was wondering the same thing with my afterburner saying +13 too...and Sapphire needed to do that for a 5Mhz stable overclock? oh boy...
> And what do you mean by benching speeds and not gaming?


I don't max out my card's clock speeds for intensive gaming. I back them down a bit, as each game is optimized differently. What may be stable in Firestrike may not be stable in The Witcher 3...


----------



## Falmatrix2r

Well, the first time I OC'd my 6950 was with FurMark 4 years ago. I had a stable OC, but then I played L4D2 and crashed, pulled back a little on the OC, played GTA4 and crashed, pulled back again... 4-5 games later, I only had a 5MHz overclock left...


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> In case anyone was wondering what was happening with the problems I had with my GPU, it seems it's not the GPU after all. A semi stable CPU does cause instabilities when the GPU is also at its limits. I just finished playing BF4 with 1150/1650MHz no problem. If I go to 1175MHz it will most probaly crash. I also backed my CPU down to 4.7GHz and it didn't have any problems. Just some artifacting at some point because I used semi stable GPU settings on purpose, but no downright crashing like before. So yeah, there's that. I'm relieved it's not the GPU.
> 
> After I finish some exams the next week or so, I will probably get to changing the TIM on my card as it does bottleneck my cooling even with the cold ambients where I live right now. MSI usually does honor the warranty even if I change the TIM, right? I mean, as long as I don't damage the card.


Same thing happens to me on my setup: if I OC my Sapphire 390X with a small OC on my CPU (8350, OC to 4.4GHz+), I get instability with physics in games as well as random black screens on my DVI monitor and sometimes even full system freezes.
Downgrading my CPU OC to 4.3/4.2GHz, the stability returns, even if I push the GPU harder with an OC.
On my previous card (7870) I could get my CPU up to 4.7GHz on air easily with no stability problems, so in my case (maybe yours too?) I think my PSU just can't handle an OC on both the CPU and the 390X (it's an older CM-GX750W). I'm going to get a new PSU (tier 1 or 2) and try again.


----------



## tolis626

Quote:


> Originally Posted by *TsukikoChan*
> 
> Same thing happens to me on my setup, if i OC my sapphire 390x with an small OC on my cpu (8350, OC to 4.4ghz+) i get instability with physics in games as well as random blackscreens on my dvi monitor and even sometimes full system freezes.
> downgrading my OC on my cpu to 4.3/4.2 the stability returns even if i push the GPU harder with an OC.
> On my previous card (7870) i could get my cpu up to 4.7ghz on air easy with no stability problems so for my case (maybe yours too?) i think it's just my psu can't handle both an OC on cpu and on the 390x (it's an older CM-GX750w), so i'm going to get a new psu (tier 1 or 2) and try it again


First of all thanks for the input. Feels good to not be the only one with strange problems.









Secondly, I highly, highly doubt it's my PSU. I could have 3 390x's for all I care, and I could still overclock them along with my CPU without pushing 1300W. My rig's using what? 400-450W under full load? Maybe 500W when stress testing both the CPU and GPU? I don't think I've ever seen 500W on the wall, but I digress.
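For what it's worth, the back-of-the-envelope sum behind that estimate can be sketched like this (all figures here are assumed round numbers for illustration, not measurements of anyone's rig):

```python
# Rough full-load system-draw estimate from assumed per-part figures
parts = {
    "390X, overclocked": 330,            # assumed board power with OC headroom
    "CPU, overclocked": 150,             # assumed package power under load
    "motherboard/RAM/drives/fans": 60,   # assumed everything-else budget
}
total = sum(parts.values())
print(total)  # prints 540
```

Even with generous per-part guesses, the total lands well under a 750W unit's rating, which is the point being made above.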


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> First of all thanks for the input. Feels good to not be the only one with strange problems.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Secondly, I highly, highly doubt it's my PSU. I could have 3 390x's for all I care, and I could still overclock them along with my CPU without pushing 1300W. My rig's using what? 400-450W under full load? Maybe 500W when stress testing both the CPU and GPU? I don't think I've ever seen 500W on the wall, but I digress.


I agree on the power it's drawing. I borrowed a friend's power meter and found it was drawing ~450W when I stress the GPU with a mild CPU OC (4.2-4.3GHz), and it only went up a bit when I upped the CPU OC. My PSU is 750W, so that's well within its limits. I've been told that maybe when I push the PSU that hard there's more pronounced ripple or instability in the supply to the CPU/GPU, because my PSU is <= tier 3; some have even said to get a PSU with two 12V rails and have the two connectors to the GPU come from different rails.
I think it's kinda sucky that I need a new PSU to run this, as I previously ran my 7870 in hybrid mode with a GeForce 460 (for PhysX) without issue. How much more power-hungry is this 390X compared to the 7000 series? 0.o.. hmmm...


----------



## tolis626

Quote:


> Originally Posted by *TsukikoChan*
> 
> I agree on the power it's drawing. I borrowed a friend's monitor and found it was drawing ~450W if i stress the GPU with a mild OC (4.2-4.3 CPU) and only went up a bit when i up the OC on the cpu. My psu is 750W so that's well within my limits, so i've been suggested it's when i push the psu up that high that there's maybe a more pronounced ripple or instability in the supply to the cpu/gpu due to my psu being <= tier3, some have even said to get a psu with 2 12v rails and have the 2 connectors to the gpu come from the different rails.
> i think it's kinda sucky i need to get a new psu to run this as previously i ran my 7870 in hybrid mode with a geforce 460 (for physx) without issue before so how much more powerhungry is this 390x compared to the 7000 series 0.o.. hmmm...


So here's my theory. It may not be the PSU, but the motherboard. A GPU draws power from its PCIe connectors that come from the PSU (an 8-pin and a 6-pin, or two 8-pins, most commonly) and also additional power from the PCIe slot on the motherboard. So, if the GPU draws too much power from the motherboard, it may cause instability in the CPU's power delivery. It's just a theory, but I intend to check it... Somehow.


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> So here's my theory. It may not be the PSU, but the motherboard. A GPU draws power from it's PCI connectors that come from the PSU (An 8-pin and a 6-pin or two 8-pins, most commonly) and also additional power from the PCIe slot on the motherboard. So, if the GPU draws too much power from the motherboard, it may cause instability to the CPU's power delivery. It's just a theory, but I intend to check it... Somehow.


keep me informed on how that goes, i will still be getting a new psu in the coming month or 2 if finances allow :-D


----------



## TsukikoChan

Wee question: does anyone here use a DP->DVI adapter on their 390X? If so, do you notice any latency issues with it, and can it support up to 120/144Hz over DVI at 1080p?

I'm trying to figure out a way to have my 390X stop detecting when my DVI monitor is turned off (Windows decides to rearrange my desktop configuration each time, which is annoying to say the least), and I figured a DP->DVI adapter might do the trick (I have 2 VGA monitors attached via DP adapters and Windows couldn't care less if those disconnect). I want Win7/Crimson to stop detecting my monitor turning off and keep outputting a signal to it regardless of whether it is receiving or not.


----------



## tolis626

Quote:


> Originally Posted by *TsukikoChan*
> 
> wee question, does anyone here use a DP->DVI adaptor on their 390x? if so, do you notice any latency issues with it and can it support up to 120/144hz dvi 1080p resolution?
> 
> I'm trying to figure out a way of having my 390x 'stop' detecting my dvi monitor being turned off (windows decides to rearrange my desktop configuration each time which is annoying to say the least), and i figured an adaptor from dp->dvi might do the trick (as i have 2 vga monitors attached via dp adaptors and windows couldn't care less if those disconnect). I want win7/crimson to stop detecting turning my monitor off and keep outputting signal to it regardless if it is receiving or not


I can't help you solve your problem, but I suggest you try DesktopOK as a stop-gap measure. Among other things it saves your desktop layout and you can restore it at will (You can even set it to automatically restore it at every boot). It's a nice little tool that's saved me quite a lot of time, as I do like my desktop organized in a very specific, symmetrical manner with equal numbers of icons on both sides of the screen (Kind of OCD, I know). Hope it helps.


----------



## jdorje

Highest I've been able to get my 4690k+390 is 525W at the wall, and that's with very heavy overclocks. 430W with stock.


----------



## sgtgates

Hey guys sorry if this has been covered already..

Looking to grab 2 R9 390s for some nice 1440p 144Hz gaming. I currently have a 1080p ultrawide, but they'll dominate that for now.

Decent price on the PowerColor PCS+ right now. Is this the exact EK block that fits them?

http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block-nickel-acetal.html

Card:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131672

I'd rather not wait on availability of the new Alphacool 390 blocks, nor spend the extra $60 total getting the MSI Gaming and waiting on an EK block to be released.

If that's not the right block, please direct me to the correct one. I know there is a block that fits this specific 390 model somewhere.

Edit: Because the Alphacool NexXxoS GPX - ATI R9 290X and 290 M07 block that would fit it according to the Alphacool sheet is currently out of stock


----------



## ZealotKi11er

Quote:


> Originally Posted by *sgtgates*
> 
> Hey guys sorry if this has been covered already..
> 
> Looking to grab 2 r9 390's for some nice 1400p 144htz gaming. Currently have 1080p ultrawide but it'll dominate that for now.
> 
> Decent price on the PowerColors PCS+ right now. Is this the exact EK block that fits them?
> 
> http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block-nickel-acetal.html
> 
> Card:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131672
> 
> I'd rather not wait on availability/use the new alphacool 390 blocks nor spend the extra $60 total getting the msi gaming and wait on a ek block to be released.
> 
> If that's not the right block please direct me to the correct one, I know there is a block that fits this specific 390 model somewhere
> 
> Edit: Because the Alphacool NexXxoS GPX - ATI R9 290X and 290 M07 block that would fit it according to the alphacool sheet is out of stock currently


Do yourself a favor and get a GTX980 Ti. If GTX980 did not exist I would have gotten 390 too.


----------



## m70b1jr

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do yourself a favor and get a GTX980 Ti. If GTX980 did not exist I would have gotten 390 too.


pls no.


----------



## ZealotKi11er

Quote:


> Originally Posted by *m70b1jr*
> 
> pls no.


Explain? CFX is trash. Single GPU is much better. If you look at benchmarks, an aftermarket GTX 980 Ti ~ 295X2 ~ 390 x 2. I would have recommended the Fury X, but the GTX 980 Ti is the better card.


----------



## sgtgates

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do yourself a favor and get a GTX980 Ti. If GTX980 did not exist I would have gotten 390 too.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Explain? CFX is trash. Single GPU is much better. If you look at benchmarks a aftermarket GTX980 Ti ~ 295X2 ~ 390 x 2. I would have recommended Fury X but GTX980 Ti is the better card.


My question didn't call for biased opinion.

No, I do not care for a 980 Ti. I already have rigs with dual GTX 970s, Titan Xs and a 980. My 980 died; pathetic Strix cooler. I'm playing around with Mantle because I want to.
I haven't used AMD since my dual 7970 setup a year+ back, and I'm wanting to flirt with it for some upcoming 2016 titles. I understand both sides, but regarding your opinion, it's irrelevant.

Anyone else got an answer for me?


----------



## mus1mus

Then why bother asking?


----------



## Carniflex

Quote:


> Originally Posted by *mus1mus*
> 
> Then why bother asking?


If you read his question, he did not ask which card he should get. He asked if that specific water block fits that specific card, and got "please get a 980 Ti instead" in return. In an AMD 390/390X thread, on top of that.


----------



## Synntx

So I keep running into the same issue... when I go to run Firestrike, it'll get all the way to the final test, then I'll get a black screen and have to do a hard restart. This seems to happen at various clocks.

Well, today I experienced the same thing and decided to run Valley to see if I could replicate the issue. Valley ran for an hour with no issues. Temps are not the problem, as the card never gets above 75.

Is it possible my PSU is hitting its limit? I figured 750W would be more than enough, even with overclocks.

My card has dual 8-pins running off the same rail, but I'm not sure that's what is causing it. The only thing I can think of is that I need a bigger PSU, but I wanted to see what you guys think.


----------



## mus1mus

Quote:


> Originally Posted by *Carniflex*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Then why bother asking?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you read his question he did not ask which card he should get. He asked if that specific water block fits that specific card. And got in return "please get 980 ti instead". In an AMD 390 / 390X thread on top of that

Well, TBH, the question at hand could be answered with a little diligence on his part, digging into EK's support pages.


----------



## mus1mus

Quote:


> Originally Posted by *Synntx*
> 
> So I keep running into the same issue......when i go to run Firestrike, it'll get through all the way until the final test, then I'll get a black screen and have to do a hard restart. This seems to happen at various clocks.
> 
> Well today i experienced the same thing and decided to run Valley to see our i could replicate the issue. Valley ran for an hour with no issues. Temps ate not the problem as the card never gets above 75.
> 
> Is it possible my PSU is hitting its limit?? I figured 750w would be more than enough, even with overclocks.
> 
> My card has dual 8pins running off the same rail, but I'm not sure that's what is causing it. Only thing i can think of its that i need a bigger PSU but wanted to see what you guys think


Heaven and Firestrike are two different animals. And each reacts to your OC differently.

To really get into your issue, isolate the overclocks, both on GPU and CPU. Slowly upping the OC in tiny increments will give you an idea of what's happening.

You ought to be more concerned about gaming stability, though.
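That "tiny increments" approach is basically a linear search from a known-good clock. A minimal sketch of the idea (pure Python; the `is_stable` check is a stand-in for whatever benchmark loop or game session you actually use to judge stability):

```python
def find_max_stable(known_good, ceiling, step, is_stable):
    """Walk up from a known-good clock in small steps and return the
    highest clock that still passes the stability check."""
    best = known_good
    clock = known_good + step
    while clock <= ceiling and is_stable(clock):
        best = clock
        clock += step
    return best

# Pretend the card starts artifacting above 1180 MHz:
print(find_max_stable(1100, 1250, 10, lambda mhz: mhz <= 1180))  # prints 1180
```

The point of small steps is that when a run finally fails, you know the last passing clock is close to the real wall instead of having to guess how far to back off.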


----------



## Synntx

Quote:


> Originally Posted by *mus1mus*
> 
> Heaven and Firestrike are two different animals. And each reacts to your OC differently.
> 
> To really get into your issue, isolate the overclocks. Both on GPU and CPU. Slowly upping the OC in tiny increments will give you an idea of what;s happening.
> 
> You ought to be concerned more in Gaming though.


Well, I get that they're two different beasts, but this is a sudden crash. I sometimes get the same thing during gaming. And the clocks don't seem to matter much, as sometimes I can get through Firestrike over and over at 1200/1750 and then it'll fail at 1170/1700.

There are no warning signs that a crash is imminent, like artefacts or tearing or pixel corruption; it just happens. The only thing I can think of is that my PSU is being pushed beyond its limits. It's a 750W Gold, but being that it's a Thermaltake, I doubt it can really go that far. Even still, 750W should be enough for a heavily overclocked system with a single GPU... right?


----------



## mus1mus

Quote:


> Originally Posted by *Synntx*
> 
> Well i get that they're two different beasts, but this is a sudden crash. I sometimes get the same thing during gaming. And the clocks don't seem to matter much as sometimes i can get through Firestrike over and over at 1200/1750 and then it'll fail at 1170/1700.
> 
> There are no warning signs about that a crash is imminent like artefacts or tearing or corruption in the pixels, or just happens. The only thing i can think of is that my PSU is being pushed beyond its limits. Its a 750w gold but being that its a thermaltake i doubt it can really go that far. Even still, 750w should be enough for a heavily overclocked system with a single gpu......right?


How much Voltage do you feed the GPU?
Is your CPU OC Stable Enough?

Your issue sounds familiar.







Try to drop the memory to stock and leave the core at 1200.


----------



## Synntx

Quote:


> Originally Posted by *mus1mus*
> 
> How much Voltage do you feed the GPU?
> Is your CPU OC Stable Enough?
> 
> Your issue sounds familiar.
> 
> 
> 
> 
> 
> 
> 
> Try to drop the memory to stock and leave the core at 1200.


Yes, I'm confident my CPU OC is stable. I'm running +100mV, so at load I believe the GPU is running 1.32V. I've tried to push higher, as I've got temperature headroom, but that just makes the crash happen faster, which is why I'm looking at the PSU as the culprit.

I'll try backing down the memory clocks when I get home and see if that makes a difference. IIRC, memory on these cards doesn't make much of a difference anyway.


----------



## Stige

Quote:


> Originally Posted by *sgtgates*
> 
> My question didn't call for a matter of biased opinion.
> 
> No I do not care for a 980ti. I already have rigs with dual gtx 970's, titan x's and a 980. My 980 died, pathetic strix cooler. Playing around with mantle because I want too.
> Haven't used AMD since my dual 7970 set-up a year + back and I'm wanting to flirt with it with some upcoming 2016 titles. I understand both sides, but regarding your opinion, it's irrelevant.
> 
> Anyone else got an answer for me?


To answer: No.

This is the one you want: https://www.ekwb.com/configurator/waterblock/3831109830192
They only seemed to have the backplate for it in the store you linked.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TsukikoChan*
> 
> I agree on the power it's drawing. I borrowed a friend's monitor and found it was drawing ~450W if i stress the GPU with a mild OC (4.2-4.3 CPU) and only went up a bit when i up the OC on the cpu. My psu is 750W so that's well within my limits, so i've been suggested it's when i push the psu up that high that there's maybe a more pronounced ripple or instability in the supply to the cpu/gpu due to my psu being <= tier3, some have even said to get a psu with 2 12v rails and have the 2 connectors to the gpu come from the different rails.
> i think it's kinda sucky i need to get a new psu to run this as previously i ran my 7870 in hybrid mode with a geforce 460 (for physx) without issue before so how much more powerhungry is this 390x compared to the 7000 series 0.o.. hmmm...
> 
> 
> 
> So here's my theory. It may not be the PSU, but the motherboard. A GPU draws power from it's PCI connectors that come from the PSU (An 8-pin and a 6-pin or two 8-pins, most commonly) and also additional power from the PCIe slot on the motherboard. So, if the GPU draws too much power from the motherboard, it may cause instability to the CPU's power delivery. It's just a theory, but I intend to check it... Somehow.

That's where the 6/8-pin PCIe power connectors come into play. The PCIe slot can only deliver a limited amount of power & the rest will be covered by the 6/8-pin PCIe power connectors.
Quote:


> Originally Posted by *sgtgates*
> 
> Hey guys sorry if this has been covered already..
> 
> Looking to grab 2 r9 390's for some nice 1400p 144htz gaming. Currently have 1080p ultrawide but it'll dominate that for now.
> 
> Decent price on the PowerColors PCS+ right now. Is this the exact EK block that fits them?
> 
> http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block-nickel-acetal.html
> 
> Card:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131672
> 
> I'd rather not wait on availability/use the new alphacool 390 blocks nor spend the extra $60 total getting the msi gaming and wait on a ek block to be released.
> 
> If that's not the right block please direct me to the correct one, I know there is a block that fits this specific 390 model somewhere
> 
> Edit: Because the Alphacool NexXxoS GPX - ATI R9 290X and 290 M07 block that would fit it according to the alphacool sheet is out of stock currently


The PowerColor PCS+ uses the *SE* version of the EK-FC waterblock. Look for the *EK-FC 290X SE* waterblock.
Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Carniflex*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Then why bother asking?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you read his question he did not ask which card he should get. He asked if that specific water block fits that specific card. And got in return "please get 980 ti instead". In an AMD 390 / 390X thread on top of that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well TBH, the question at hand can be answered with a little diligence on his part doing a bit of digging into EK's support.

Many people actually believe the PowerColor PCS+ uses the reference design, which is not actually the case. That guy just wanted a little help pointing him in the right direction.








Quote:


> Originally Posted by *Synntx*
> 
> So I keep running into the same issue......when i go to run Firestrike, it'll get through all the way until the final test, then I'll get a black screen and have to do a hard restart. This seems to happen at various clocks.
> 
> Well today i experienced the same thing and decided to run Valley to see our i could replicate the issue. Valley ran for an hour with no issues. Temps ate not the problem as the card never gets above 75.
> 
> Is it possible my PSU is hitting its limit?? I figured 750w would be more than enough, even with overclocks.
> 
> My card has dual 8pins running off the same rail, but I'm not sure that's what is causing it. Only thing i can think of its that i need a bigger PSU but wanted to see what you guys think


Try running Heaven in a loop for 1 hour. See if the graphics turn to wireframe. If they do, then your overclock is not stable & was throwing errors that were being corrected internally.


----------



## Synntx

Id really like to figure this out. It's very irritating!


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> I can't help you solve your problem, but I suggest you try DesktopOK as a stop-gap measure. Among other things it saves your desktop layout and you can restore it at will (You can even set it to automatically restore it at every boot). It's a nice little tool that's saved me quite a lot of time, as I do like my desktop organized in a very specific, symmetrical manner with equal numbers of icons on both sides of the screen (Kind of OCD, I know). Hope it helps.


ty! will install this tonight! hopefully it helps though still somewhat sucky :< i wonder does this happen under windows 10 as well..


----------



## Synntx

So I found my way over to this site: http://outervision.com/power-supply-calculator

I plugged everything in, and it estimates my rig is using 800+ watts... which is what I've been kinda thinking thus far...

Maybe it's time to bite the bullet and grab a new PSU


----------



## rdr09

Quote:


> Originally Posted by *Synntx*
> 
> So I found my way over to this site: http://outervision.com/power-supply-calculator
> 
> I plugged everything in and this thing estimates my rig is using 800+ watts......which is what I've been kinda thinkin thusfar....
> 
> Maybe it's time to bite the bullet and grab a new PSU


My sig rig, at 1100 core on both 290s with no added voltage and my i7 Sandy @ 4.5GHz, spiked to 660W in the combined test of Firestrike.

Read from the UPS.

With a single 290 @ 1200 core with +50mV... it registered 470W at the highest. Less 40W for my monitor, and it only hit 430W.

I have an Alphacool water pump, six fans, and a single HDD as well.


----------



## Synntx

Quote:


> Originally Posted by *rdr09*
> 
> My sig rig at 1100 core on both 290s but no added voltage and my i7 Sandy @ 4.5 GHz spiked to 660W in the combined test of Firestrike.
> 
> Read from the UPS.
> 
> With a single 290 @ 1200 core with +50mV . . . it registered 470W highest. Less 40W for my monitor, then it only hit 430W.
> 
> I have an alphacool waterpump, six fans, and a single HDD as well.


I can't figure out what's causing my issue. I've backed my CPU OC down to ensure that was not the problem, and still get the same issues.


----------



## Synntx

Quote:


> Originally Posted by *Synntx*
> 
> I can't figure out what's causing my issue. I've backed my CPU OC down to ensure that was not the problem, and still get the same issues.


Backing down the GPU clocks gets rid of the issue, naturally, but I want to know what's CAUSING the issue, because higher clocks show no other symptoms (i.e. artefacts, etc.), just a random reboot.


----------



## rdr09

Quote:


> Originally Posted by *Synntx*
> 
> Backing down the GPU clocks get rid of the issue, naturally, but I want to know what's CAUSING the issue, because higher clocks show no other issues (ie artefacts, etc) other than a random reboot.


As a last-ditch effort... try reinstalling FS. If it's still happening, then it could be the PSU. TT is not really known for making good PSUs.

Check whether your brand is in this list...

http://www.overclock.net/t/183810/faq-recommended-power-supplies#user_700-799W

Even if it is, it might be a manufacturing defect or something.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> That's where the 6/9-pin PCIe power connectors come in play. The PCIe slot can only deliver limited of power & the rest will be covered by the 6/9-pin PCIe power connectors.


I don't think that power from the PCIe slot is capped at 75W (or however much the spec is). I think that power draw is dynamic, and on an overclocked system the card can draw more power than the spec allows from both the slot and the PSU connectors. I could be wrong, though. My concern, however, isn't whether the mobo can supply the power, but whether that power draw causes voltage ripple in the CPU's power delivery. Unlikely, but not out of the realm of possibility.







Quote:


> Originally Posted by *kizwan*
> 
> Try run Heaven loop for 1 hour. See if the graphics turn to wireframe. If yes, then your overclock not stable & was throwing errors that being corrected internally.


Do you mean that if Heaven (and Valley too, or not?) encounters errors, it will automatically switch to wireframe? I don't think I've seen that happen, and I've definitely had my fair share of runs with a lot of errors.
Quote:


> Originally Posted by *Synntx*
> 
> So I found my way over to this site: http://outervision.com/power-supply-calculator
> 
> I plugged everything in and this thing estimates my rig is using 800+ watts......which is what I've been kinda thinkin thusfar....
> 
> Maybe it's time to bite the bullet and grab a new PSU


No way your rig needs 800W, dude. Maybe it suggests an 800W PSU, but that's another story. 750W should be plenty, even for Crossfire/SLI.

It seems that in some cases, having your CPU overclocked too much doesn't play well with GPU overclocks. Also, try lowering the memory. I'm having similar issues to you, and memory is the prime suspect.









PS: Is there anyone here who has been running their card at more than 1.05V AUX voltage 24/7 for a while? I'm curious to see what people are using... Do keep in mind that Sapphire cards start at 1.05V, whereas other cards (I think all of them; MSI is for sure like that) are at 1.0V by default, so post your voltages accordingly. Thanks!


----------



## kizwan

There is also the possibility that the GPU just cannot handle the overclock.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> There is also possibility that gpu just cannot handle overclock.


That's right. Some systems are more sensitive to Futuremark, especially 3DMark 11, than to either Valley or Heaven.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> That's where the 6/9-pin PCIe power connectors come in play. The PCIe slot can only deliver limited of power & the rest will be covered by the 6/9-pin PCIe power connectors.
> 
> 
> 
> I don't think that power from the PCIe slot is capped at 75W (or however much the spec is). I think that power draw is dynamic and on an overclocked system it can draw more power than the spec allows from both the slot and the PSU connectors. Could be wrong though. My concern, however, isn't whether the mobo can supply the power, but whether that power draw causes voltage ripples in the CPU's power delivery. Unlikely, but not out of the realm of possibility.

Of course power draw is dynamic, which means power is drawn when needed. I didn't say otherwise. The PCIe specification is not publicly available; the ATX specification, on the other hand, is. So I cannot confirm whether the PCIe slot is capped at 75W or not. However, even if the PCIe slot is capped, it should be no problem, because the GPU can still draw more power through the 6/8-pin PCIe cables when needed.

Both the CPU & GPU draw power from the same +12V rail, so I don't think there will be any voltage ripple issue.
Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Try run Heaven loop for 1 hour. See if the graphics turn to wireframe. If yes, then your overclock not stable & was throwing errors that being corrected internally.
> 
> 
> 
> Do you mean that if Heaven (And Valley or not?) encounter errors they will automatically switch to wireframe? I don't think I've seen that happen and I've definitely had my fair share of runs with a lot of errors.

Heaven. I didn't test with Valley. How long did you run Heaven? One loop is just not enough; you should let it run for multiple loops. One hour should be enough, I think, because I encountered wireframe in Heaven in under one hour.

*Edit:* You misunderstood me. I didn't say that Heaven encounters errors. Did you know, for example, that the memory controller has error detection & correction? This is what I actually meant. It's the kind of error that doesn't show visually because it is being corrected internally.
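On the earlier slot-versus-cables point: the commonly cited nominal budgets add up as below (a sketch using the widely quoted per-connector figures from the PCIe CEM spec; these are nominal ratings, not hard measured limits):

```python
# Nominal power budget per source (commonly cited PCIe CEM figures)
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe auxiliary connector
EIGHT_PIN_W = 150  # 8-pin PCIe auxiliary connector

def board_power_budget(six_pins=0, eight_pins=0):
    """Total nominal board power available from the slot plus cables."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A dual 8-pin card like most 390/390X models:
print(board_power_budget(eight_pins=2))  # prints 375
```

So a dual 8-pin card has roughly 375W of nominal budget, only 75W of which comes through the motherboard slot, which is why the cables carry most of an overclocked load.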


----------



## Synntx

Quote:


> Originally Posted by *tolis626*
> 
> I don't think that power from the PCIe slot is capped at 75W (or however much the spec is). I think that power draw is dynamic and on an overclocked system it can draw more power than the spec allows from both the slot and the PSU connectors. Could be wrong though. My concern, however, isn't whether the mobo can supply the power, but whether that power draw causes voltage ripples in the CPU's power delivery. Unlikely, but not out of the realm of possibility.
> 
> 
> 
> 
> 
> 
> 
> 
> Do you mean that if Heaven (And Valley or not?) encounter errors they will automatically switch to wireframe? I don't think I've seen that happen and I've definitely had my fair share of runs with a lot of errors.
> No way your rig needs 800W dude. Maybe it suggests an 800W PSU, but that's another story. 750W should be plenty enough even for Crossfire/SLI.
> 
> It seems that in some cases, having your CPU overclocked too much doesn't play well with GPU overclocks. Also, try lowering the memory. I'm having similar issues with you and memory is the prime suspect.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS : Is there anyone here that has been running their card with more than 1.05V AUX voltage 24/7 for a while? I'm curious to see what people are using... Do keep in mind that Sapphire cards start with 1.05V, whereas other cards (I think all, MSI is sure like that) are at 1.0V by default, so post your voltages accordingly. Thanks!


Lower DRAM or GPU mem?


----------



## mus1mus

Do you still need to figure out your issue?

It's your OC. Go back to stock and OC the card bit by bit. Brute force will not make it run stable.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> Of course power draw is dynamic which means when needed. I didn't say otherwise. PCIe specification is not publicly available. ATX specification in the other hand available. So I cannot confirm whether PCIe slot is capped at 75W or not. However, let say PCIe slot is capped, it should be no problem because gpu still be able to draw power from 6/8-pin PCIe connectors. GPU can draw more power through 6/8-pin PCIe cables when needed.
> 
> Both CPU & GPU draw power from +12V rail, so I don't think there will be any voltage ripple issue.


Well, you're probably right, but I want to check it at some point, purely out of curiosity.








Quote:


> Originally Posted by *kizwan*
> 
> Heaven. I didn't test with Valley.How long did you run Heaven? One loop just not enough. You should let it run for multiple loop. One hour should be enough I think because I encountered wireframe in Heaven under one hour.


Hmmm... I have run Valley for longer periods of time, but I've never seens wireframe. However, I usually test with games. Battlefield 4 and WItcher 3 will dig out any instabilities in no time. I will try Heaven to see how it goes.

For the record, I'm seeing similar behavior to Synntx, where the whole PC will freeze, throw a red screen (apparently that's a thing when the crash is caused by the GPU) and then reboot itself (or not). I'm pretty sure it's the memory, but I'm also sure it's not that simple. It seems that my GPU can't hold both a high core and memory clock while my CPU is also pushing 4.8GHz. If I back my CPU down a bit, it's more stable (it doesn't downright crash, it will show artifacts first). I'll have to investigate further.
Quote:


> Originally Posted by *Synntx*
> 
> Lower DRAM or GPU mem?


GPU. Although, if you're running very high speed RAM, I'd also try setting it to XMP while testing, just to be sure it's not messing with your stability. Also, what's your AUX voltage (VDDCI in GPU-z)?


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> that's right. some systems are more sensitive to Futuremark, especially 3D Mark 11 than either Valley or Heaven.


Very true. No single test tells the whole story.

Also consider that somehow, these GPUs can have Voltage walls. I have seen this on both my 290Xs.

1300/1625 at 1.35ish or +175 Heaven loop stable.

1310/1625 fails even at +250. In Voltage-speak, around 1.45 Volts.


----------



## Stige

Quote:


> Originally Posted by *Synntx*
> 
> So I found my way over to this site: http://outervision.com/power-supply-calculator
> 
> I plugged everything in and this thing estimates my rig is using 800+ watts......which is what I've been kinda thinkin thusfar....
> 
> Maybe it's time to bite the bullet and grab a new PSU


That site is crap and should never be used to buy a PSU.

It recommends a 698W PSU for me when my setup barely touches 500W at the wall socket, which means something like 450W of real DC load after efficiency.
And the Armored Warfare hangar is the only case where I can even push it above 500W. Nothing else comes close.

My CPU is at 1.52V and GPU is at +200mV.
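The wall-draw arithmetic above, as a quick sketch (the 90% efficiency figure is an assumed round number; your PSU's real efficiency curve varies with load):

```python
# Convert wall-socket draw to the DC load the PSU actually delivers.
# The 90% efficiency figure is an illustrative assumption.
def dc_load_from_wall(wall_watts, efficiency=0.90):
    """DC power delivered for a given wall draw at a given efficiency."""
    return wall_watts * efficiency

# ~500W at the socket with a ~90% efficient PSU is roughly 450W of real load:
print(dc_load_from_wall(500))  # 450.0
```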


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> Well, you're probably right, but I want to check it at some point, purely out of curiosity.
> 
> 
> 
> 
> 
> 
> 
> 
> Hmmm... I have run Valley for longer periods of time, but I've never seens wireframe. However, I usually test with games. Battlefield 4 and WItcher 3 will dig out any instabilities in no time. I will try Heaven to see how it goes.
> 
> For the record, I'm seeing similar behavior as Synntx, where the whole PC will freeze, throw a red screen (aparently that's a thing when the crash is caused by the GPU) and then reboot itself (or not). I'm pretty sure it's the memory, but I'm also sure it's not that simple. It seems that my GPU can't hold both a high core and memory clock, while my CPU is also pushing 4.8GHz. If I back my CPU down a bit, it's more stable (doesn't downright crash, it will show artifacts first). I'll have to investigate further.
> GPU. Although, if you're running very high speed RAM, I'd also try setting it to XMP while testing, just to be sure it's not messing with your stability. Also, what's your AUX voltage (VDDCI in GPU-z)?


I'm interested to see how this goes for those of us with instabilities due to OCs on both CPU and GPU. I may bump up my CPU's OC again and see if I can get it stable. It might be ripple; I wonder if my BIOS allows tighter control on that or similar. I might have to keep a monitor on and see if the volts fall during heavy play... Hmmm...


----------



## jdorje

If you have dual Hawaii cards, 750W might not be enough once you start overclocking.

A single card? I can't break 550W at the wall, and that's with a heavy overclock on both.


----------



## diggiddi

For reference, I have two stock Lightnings and an 8350 @ 4.8GHz on an Antec HCG 750 PSU.
I can overclock the VRAM slightly and still run, but it does crash when I run FireStrike, which it didn't do before.
So 750W with a single 390/X should be fine.


----------



## Synntx

Quote:


> Originally Posted by *mus1mus*
> 
> Do you still need to figure out your issue?
> 
> It's your OC. Go back to stock and OC the card bit by bit. Brute force will not make it run stable.


I may do this.
What do you use to check for a stable overclock? I seem to notice artefacts much earlier in Firestrike vs Heaven or Valley


----------



## ZealotKi11er

Quote:


> Originally Posted by *Synntx*
> 
> I may do this.
> What do you use to check for a stable overclock? I seem to notice artefacts much earlier in Firestrike vs Heaven or Valley


I always start with Firestrike. I run it until I don't see artifacts. Then I play BF4 for at least 2 hours and see if I get an RSOD or crash. After that I continue with normal gaming, then go back to stock. Of all the cards I have had, these cards are so close to the max and so temp-dependent that a change in room temp can break your overclock even if it was stable.


----------



## jdorje

Quote:


> Originally Posted by *Synntx*
> 
> I may do this.
> What do you use to check for a stable overclock? I seem to notice artefacts much earlier in Firestrike vs Heaven or Valley


If you want full stability, use OCCT. It has an error checking feature which will detect even small instability in core overclock.

For gaming, just bump it as far as you think might be stable for a particular game, then back down in ~5 mhz increments if you get any artifacting.


----------



## milan616

Not finding much help in the Hawaii BIOS editing thread, any fellow 390(x) guys have some advice for me on this?

Trying to figure something out with regard to decreasing DPM7 voltage. Right now at stock 1080/1500 or 1100 core OC + 1500 mem I don't go above 1188mV core. If I bump up the memory in Afterburner to 1600 I end up at 1260-1280mV (makes sense since my EV VDD decoder reads 1275mV for DPM7) in anything above idle except for max clock speed where I drop back down to 1160mV or so (-31mV core voltage set in AB). If I manually set DPM7 to 1160 in the GPU Freq and Limit Tables will that prevent me from going above that in the middle power states? And will it scale normally, or will they all go to 1160 all the time? Also, do I need to set anything in the MEM Freq table? Thanks!


----------



## jdorje

Is 1188 your VID or vcore? VID you can find via AIDA64, but vcore will be substantially lower due to droop.

1160 VID is extremely low. Mine is 1228, and that's at a high ASIC level. Sounds like your VID is 1275. Some cards do have a built-in hidden offset, though.

If you change your VID down to 1160, that's like a -115 offset. Vcore will be tiny.

To change the VID you must change it in all 6 tables. That's a topic for the BIOS editing thread, though.

Or maybe I misunderstand your question.
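The VID/offset/vcore relationship described above, as a tiny sketch (the 50mV droop figure is purely a placeholder assumption; real droop varies with load and card):

```python
# Offset needed to move from the stock VID to a target VID, in millivolts.
def offset_mv(stock_vid_mv, target_vid_mv):
    return target_vid_mv - stock_vid_mv

# Vcore under load is roughly VID minus droop (droop value assumed here).
def vcore_under_load(vid_mv, droop_mv=50):
    return vid_mv - droop_mv

print(offset_mv(1275, 1160))   # -115, i.e. "like a -115 offset"
print(vcore_under_load(1160))  # 1110, i.e. vcore ends up tiny
```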


----------



## gsdavid1

MSI R9 390 @ 1150/1700, stock voltages, +50% power

Custom fan curve; temps max 72°C after 30 mins of a Valley loop or Firestrike 1.1. VRM temp 65°C (according to GPU-Z, but it was only a 10 min run).



Firestrike 1.1 Graphics Score 13841

My only question is about MSI AB reporting a max GPU voltage of 1.234V. Is that fine? I haven't touched voltages anywhere, but the MSI Gaming App AFAIK likes to mess with the BIOS, and I did run it several times.

I'm getting very good results compared to what I've seen in this thread if I'm really still on stock voltages, so I'm concerned the voltages are maybe set to auto (like in CPU OC; sorry, complete noob when it comes to GPU OC) and are therefore higher than stock (overvolting due to auto?). I didn't even push the card to its max; I just set these values and it's been stable for weeks in CS/Fallout/Rust. I wonder how far I could go before artifacts/RSOD.

Edit: I've also got a question about the GPU cooler. Since I plan to keep the card for a while longer (like 5 years), should I remove my custom fan curve that gets very aggressive once the GPU hits 70°C? Most people expect the fans to die long before the GPU does due to the higher temps, and AFAIK replacing MSI fans is almost impossible unless you buy one of those almost $100 full cooler kits. With no fan control I get like 75°C at full load, whereas I'm at 69-72°C with 85-100% custom fan speed.

Thanks


----------



## Synntx

Well I dropped my memory overclock down to 1650 and I'm still experiencing the same random black screens.


----------



## kizwan

Quote:


> Originally Posted by *Synntx*
> 
> Well I dropped my memory overclock down to 1650 and I'm still experiencing the same random black screens.


Did you try setting a fixed voltage in AB or Trixx? Did you try unofficial overclocking without PowerPlay?


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> Well I dropped my memory overclock down to 1650 and I'm still experiencing the same random black screens.


With my highest core overclock, I had to lower mine to 1600MHz to avoid problems. Even 1625MHz gives artifacting.


----------



## TsukikoChan

Quote:


> Originally Posted by *kizwan*
> 
> Did you try Unofficial overclocking without powerplay?


unofficial overclocking without powerplay?


----------



## milan616

Quote:


> Originally Posted by *jdorje*
> 
> 1188 is your vid or vcore? Vid you can find via AIDA64 but vcore will be substantially lower due to droop.
> 
> 1160 vid is extremely low. Mine is 1228 and that's at a high asic level. Sounds like your vid is 1275. Some cards do have a builtin hidden offset though.
> 
> If you change your vid down to 1160 that's like -115 offset. Vcore will be tiny.
> 
> To change vid you must change it in all 6 tables. That's a topic for the bios editing thread though.
> 
> Or maybe I misunderstand your question.


No I think you've got my question right. Vid for DPM7 is 1275 in both AIDA and EVV VID checker, and VDDC(Vcore) runs 1227 max when at 1600 on memory, lower when not overclocked. However, I do think I have an offset in the BIOS. My card is an MSI 390x. Looking in a hex editor at 0xAC40 I saw a value of 51h. I only came across the Stilt mentioning that address once in regard to someone else's question, and they had a 65h or so in there that was a 30-something mV offset, so I'm not quite sure how the offset works.

Not overclocked:

Code:


------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  533 MHz, VID = 0.96800 V
DPM2: GPUClock =  788 MHz, VID = 1.01200 V
DPM3: GPUClock =  909 MHz, VID = 1.11200 V
DPM4: GPUClock =  960 MHz, VID = 1.16200 V
DPM5: GPUClock = 1005 MHz, VID = 1.21200 V
DPM6: GPUClock = 1040 MHz, VID = 1.25000 V
DPM7: GPUClock = 1080 MHz, VID = 1.27500 V

Here is what I think I'm trying to get to, but thinking about that offset in the BIOS I'd either need to edit it out or raise my 1160mV up to 1190-1200mV. Basically just trying to get the card to run cooler (in 3d, but especially under media playback situations) since my fans start to get noisy due to my GPU fans being right up against the vent holes of my BitFenix Prodigy.


----------



## kizwan

Quote:


> Originally Posted by *TsukikoChan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Did you try Unofficial overclocking without powerplay?
> 
> 
> 
> unofficial overclocking without powerplay?

Yes in MSI Afterburner. Do you have same problem too?
Quote:


> Originally Posted by *milan616*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jdorje*
> 
> 1188 is your vid or vcore? Vid you can find via AIDA64 but vcore will be substantially lower due to droop.
> 
> 1160 vid is extremely low. Mine is 1228 and that's at a high asic level. Sounds like your vid is 1275. Some cards do have a builtin hidden offset though.
> 
> If you change your vid down to 1160 that's like -115 offset. Vcore will be tiny.
> 
> To change vid you must change it in all 6 tables. That's a topic for the bios editing thread though.
> 
> Or maybe I misunderstand your question.
> 
> 
> 
> No I think you've got my question right. Vid for DPM7 is 1275 in both AIDA and EVV VID checker, and VDDC(Vcore) runs 1227 max when at 1600 on memory, lower when not overclocked. However, I do think I have an offset in the BIOS. My card is an MSI 390x. Looking in a hex editor at 0xAC40 I saw a value of 51h. I only came across the Stilt mentioning that address once in regard to someone else's question, and they had a 65h or so in there that was a 30-something mV offset, so I'm not quite sure how the offset works.
> 
> Not overclocked:
> 
> Code:
> 
> 
> ------[ GPU PStates List ]------
> 
> DPM0: GPUClock =  300 MHz, VID = 0.90000 V
> DPM1: GPUClock =  533 MHz, VID = 0.96800 V
> DPM2: GPUClock =  788 MHz, VID = 1.01200 V
> DPM3: GPUClock =  909 MHz, VID = 1.11200 V
> DPM4: GPUClock =  960 MHz, VID = 1.16200 V
> DPM5: GPUClock = 1005 MHz, VID = 1.21200 V
> DPM6: GPUClock = 1040 MHz, VID = 1.25000 V
> DPM7: GPUClock = 1080 MHz, VID = 1.27500 V
> 
> Here is what I think I'm trying to get to, but thinking about that offset in the BIOS I'd either need to edit it out or raise my 1160mV up to 1190-1200mV. Basically just trying to get the card to run cooler (in 3d, but especially under media playback situations) since my fans start to get noisy due to my GPU fans being right up against the vent holes of my BitFenix Prodigy.

That offset doesn't necessarily apply to your ROM. What does GPU-Z report for the VDDCI voltage? If you want to lower your voltage, that is exactly what you should do, but I think you will need to follow the "SVI 2 Compliant Voltages". So the closest voltage is either 1.11250 or 1.11875. Also change the voltage for the other DPM states too. After you set DPM7 to 1160, what do you see as the value for DPM7 in the GPU PStates List?

*Warning: some voltages shown in the image below are way too high, so
please refer to stock values for what you should use.*


For example:-

Code:


------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.75000 V
DPM1: GPUClock =  533 MHz, VID = 0.81200 V
DPM2: GPUClock =  788 MHz, VID = 0.86200 V
DPM3: GPUClock =  909 MHz, VID = 0.96200 V
DPM4: GPUClock =  960 MHz, VID = 1.01200 V
DPM5: GPUClock = 1005 MHz, VID = 1.06200 V
DPM6: GPUClock = 1040 MHz, VID = 1.10000 V
DPM7: GPUClock = 1080 MHz, VID = 1.12500 V
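The "SVI 2 compliant" values above sit on a fixed grid; assuming the 6.25mV VID step implied by the pairs like 1.11250/1.11875, snapping a requested voltage looks like this (sketch only):

```python
# SVI2-compliant VIDs appear to sit on a 6.25 mV grid (assumption inferred
# from the compliant values quoted in the thread, e.g. 1112.50 / 1118.75 mV).
STEP_MV = 6.25

def svi2_neighbors(mv):
    """Return the grid values bracketing a requested voltage in millivolts."""
    lower = (mv // STEP_MV) * STEP_MV
    upper = lower if lower == mv else lower + STEP_MV
    return lower, upper

# 1.11200 V is not on the grid; the bracketing compliant values are:
print(svi2_neighbors(1112.0))  # (1106.25, 1112.5)
```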


----------



## jdorje

By changing vid from 1275 to 1160 you have massively reduced your voltage.


----------



## tolis626

Man, I really, really want to get into editing and flashing my card's BIOS, but the fact that the MSI 390X only has one BIOS freaks me out too much to even try it.









Anyways... I think I'm onto something here. I think (and really, it's just that for now) that when I increase the AUX voltage on my card and also increase the memory speed (or maybe just the memory speed, I dunno), it causes additional droop on my core voltage, and that's why it crashes. At some point I had it run Valley with 1175MHz core, 1700MHz memory or something (the important thing is that it was high) and +100mV on the core and +75mV AUX, and when I Alt+Tabbed to the desktop I saw that my GPU core voltage was in the 1.24-1.26V range, whereas with a +100mV offset I normally get close to 1.3V under load. I will test more, but I would like to hear what you guys have to say about this.

@kizwan

What does unofficial overclocking without PowerPlay do exactly? The explanation in AB is kind of vague. Is there any chance it will help?


----------



## navjack27

Quote:


> Originally Posted by *Synntx*
> 
> Well I dropped my memory overclock down to 1650 and I'm still experiencing the same random black screens.


You are using 1650, which still falls under the 1750 timing strap. Use 1625 instead.
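The strap logic is that the memory controller applies the timing set for the next boundary at or above your clock. As a hedged sketch (the boundary list below is the commonly cited Hawaii set, an assumption rather than something from a datasheet):

```python
# Commonly cited Hawaii memory timing strap boundaries in MHz (assumed list).
STRAPS_MHZ = [1125, 1250, 1375, 1500, 1625, 1750, 2000]

def strap_for(clock_mhz):
    """Return the strap whose (looser) timings apply at a given clock."""
    for s in STRAPS_MHZ:
        if clock_mhz <= s:
            return s
    return STRAPS_MHZ[-1]

print(strap_for(1650))  # 1750 -> 1650 runs on the looser 1750 timings
print(strap_for(1625))  # 1625 -> stays on the tighter 1625 strap
```

This is why 1625 can be both faster and more stable than 1650: you keep the tighter timings and drop 25MHz you weren't benefiting from anyway.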


----------



## milan616

Quote:


> Originally Posted by *kizwan*
> 
> That offset not necessarily apply to your rom. What GPU-Z report for the VDDCI voltage? If you want to lower your voltage, that is exactly what you should do but I think you will need to follow "SVI 2 Compliant Voltages". So the closest voltage is either 1.11250 or 1.11875. Also change voltage for other DPM states too. After you set DPM7 to 1160, what do you see the value for DPM7 in the GPU PStates List?
> 
> *Warning: Some voltages are shown in image below which are way to high, so
> please ref stock values as to what you should use.*
> _SVI2 chart was here_
> 
> For example:-
> 
> Code:
> 
> 
> ------[ GPU PStates List ]------
> 
> DPM0: GPUClock =  300 MHz, VID = 0.75000 V
> DPM1: GPUClock =  533 MHz, VID = 0.81200 V
> DPM2: GPUClock =  788 MHz, VID = 0.86200 V
> DPM3: GPUClock =  909 MHz, VID = 0.96200 V
> DPM4: GPUClock =  960 MHz, VID = 1.01200 V
> DPM5: GPUClock = 1005 MHz, VID = 1.06200 V
> DPM6: GPUClock = 1040 MHz, VID = 1.10000 V
> DPM7: GPUClock = 1080 MHz, VID = 1.12500 V


I haven't applied a modded BIOS yet so nothing much to say about DPM7 after yet. Since MSI only has a single BIOS I want to make sure I do it right, or have someone else help out. I'm thinking I'll go with the voltage list you provided, but I don't think I'll step down as low on the lower DPM slots unless I've seen someone else running that low.

So you can see here that VDDC is pretty high to start (twitch stream and dota running in the background), and then the Heaven benchmark starts running to put it into full 3d mode bringing it to 1.156-1.172. VDDCI stays constant the whole time.



Quote:


> Originally Posted by *jdorje*
> 
> By changing vid from 1275 to 1160 you have massively reduced your voltage.


Yeah, but I'm not looking to overclock here.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Man, I really, really want to get into editing and flashing my card's BIOS, but the fact that the MSI 390x only has 1 BIOS freaks me out too bad to even try it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways... I think I'm onto something here. I think (And really, it's just that for now) that when I increase the AUX voltage on my card and also increase the memory speed (or maybe just the memory speed, I dunno) it causes addition droop to my core voltage and that's why it crashes. At some point I had it run Valley with 1175MHz core, 1700MHz memory or something (Important thing is that it was high) and +100mV on the core and +75mV AUX and I Alt+Tabbed to desktop and I saw that my GPU core voltage was in the 1.24-1.26V range, whereas with +100mV offset I normally get close to 1.3V under load. I will test more, but I would like to hear what you guys have to say abou this.
> 
> @kizwan
> 
> What does unofficial overclocking without PowerPlay do exactly? The explanation in AB is kind of vague. Is there any chance it will help?


Without PowerPlay, the card runs at max clocks all the time when overclocked. This may improve stability when overclocking.
Quote:


> Originally Posted by *milan616*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> That offset not necessarily apply to your rom. What GPU-Z report for the VDDCI voltage? If you want to lower your voltage, that is exactly what you should do but I think you will need to follow "SVI 2 Compliant Voltages". So the closest voltage is either 1.11250 or 1.11875. Also change voltage for other DPM states too. After you set DPM7 to 1160, what do you see the value for DPM7 in the GPU PStates List?
> 
> *Warning: Some voltages are shown in image below which are way to high, so
> please ref stock values as to what you should use.*
> _SVI2 chart was here_
> 
> For example:-
> 
> Code:
> 
> 
> ------[ GPU PStates List ]------
> 
> DPM0: GPUClock =  300 MHz, VID = 0.75000 V
> DPM1: GPUClock =  533 MHz, VID = 0.81200 V
> DPM2: GPUClock =  788 MHz, VID = 0.86200 V
> DPM3: GPUClock =  909 MHz, VID = 0.96200 V
> DPM4: GPUClock =  960 MHz, VID = 1.01200 V
> DPM5: GPUClock = 1005 MHz, VID = 1.06200 V
> DPM6: GPUClock = 1040 MHz, VID = 1.10000 V
> DPM7: GPUClock = 1080 MHz, VID = 1.12500 V
> 
> 
> 
> 
> 
> I haven't applied a modded BIOS yet so nothing much to say about DPM7 after yet. Since MSI only has a single BIOS I want to make sure I do it right, or have someone else help out. I'm thinking I'll go with the voltage list you provided, but I don't think I'll step down as low on the lower DPM slots unless I've seen someone else running that low.
> 
> So you can see here that VDDC is pretty high to start (twitch stream and dota running in the background), and then the Heaven benchmark starts running to put it into full 3d mode bringing it to 1.156-1.172. VDDCI stays constant the whole time.

I kept reading it as 1116 even though you wrote 1160 in the editor.







You can use it as an example of course.

This is much better, I think:

Code:


------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.80000 V
DPM1: GPUClock =  533 MHz, VID = 0.85600 V
DPM2: GPUClock =  788 MHz, VID = 0.90600 V
DPM3: GPUClock =  909 MHz, VID = 1.00000 V
DPM4: GPUClock =  960 MHz, VID = 1.05000 V
DPM5: GPUClock = 1005 MHz, VID = 1.10000 V
DPM6: GPUClock = 1040 MHz, VID = 1.13700 V
DPM7: GPUClock = 1080 MHz, VID = 1.16200 V

The best course of action is for you to try these voltages using overclocking software like MSI AB. Try setting a negative offset until you can get close to ~1.162V & test this in games. When you know it is stable, then you can proceed with the BIOS mod.

BTW, your VDDCI in GPU-Z is 1.047V & the HawaiiBiosEditor shows 1.050V for VDDCI (AUX), which shows your BIOS doesn't set an offset voltage.


----------



## milan616

Quote:


> Originally Posted by *kizwan*
> 
> The best course of action is for you to try these voltages using overclocking software like MSI AB. Try setting a negative offset until you can get close to ~1.162V & test this in games. When you know it is stable, then you can proceed with the BIOS mod.
> 
> BTW, your VDDCI in GPU-Z is 1.047V & the HawaiiBiosEditor shows 1.050V for VDDCI (AUX), which shows your BIOS doesn't set an offset voltage.


Sorry, I had upped my aux voltage to 1050 in my modded BIOS screenshots, the default was 1000mV. So my thing with Afterburner is that the offset voltage seems to only apply to the full 3D state. I had it set to -31mV core and +50mV aux (I had some very infrequent glitches that I found went away with a bump to aux). If I could figure out a way to get AB to apply the offset voltage the whole way through I'd completely go that route instead of the BIOS mod since I use AB for a custom fan curve as well.


----------



## Synntx

Quote:


> Originally Posted by *navjack27*
> 
> you are using 1650 which is still in the 1750 timing strap. use 1625 instead.


Quote:


> Originally Posted by *kizwan*
> 
> Did you try set fixed voltage in AB or Trixx? Did you try Unofficial overclocking without powerplay?


I'll try both of these things and report back.


----------



## Kerelm

Hey guys,

I was playing Squad last night on my sig rig at ~high settings @ 2K res, and my R9 390 (Gigabyte G1) was hitting 90°C at a forced 80% fan speed.

Are they meant to run this hot, or should I pull my heatsink off and put some aftermarket TIM on?

ty.


----------



## jazz995756

Quote:


> Originally Posted by *Kerelm*
> 
> Hey guys,
> 
> was playing Squad last night on my sig rig at ~high @ 2k res and my r9 390 (gigabyte G1) was hitting 90'c on forced 80% fan speed.
> 
> They right to run this hot or should I pull my heatsink off and put some aftermarket TIM on ?
> 
> ty.


That's too high. I also game at 2K on high settings with an overclock and I hit 70°C max. Has it always been like this, or did it recently start? I would probably go ahead and change the TIM, but that's up to you. I'm not too sure about the warranty; you might void it by taking the heatsink off, but don't quote me on that, I might be wrong. Maybe someone else can chime in.


----------



## Kerelm

Mine always seems to run pretty hot, but this is the hottest I've seen it.

It was getting to ~80°C in Fallout 4.


----------



## m70b1jr

Mine gets to 80°C on an AIO with my overclock; the only advantage over the stock cooler is that my AIO GPU cooler is a lot quieter.


----------



## Kerelm

I would love to put an AIO on my G1, but I'm not sure it's doable.


----------



## jazz995756

Quote:


> Originally Posted by *Kerelm*
> 
> mines always seem to run pretty hot, but this is the hottest i've seen it.
> 
> was getting to ~80'c in fallout 4.


What's the airflow like in your case? Maybe you don't have sufficient airflow and the GPU is suffocating itself.


----------



## Kerelm

It's really good; I can feel bulk hot air pumping out the back of the case. I have 2x 140s in the front blowing over the GPU, with my rear 140 as exhaust. My H110 is top-mounted, pulling air down into the case. My CPU temps are fine.


----------



## jazz995756

Quote:


> Originally Posted by *Kerelm*
> 
> it's real good I can feel bulk hot air pumping out the back of the case. I have 2x 140 in the front blowing over the GPU with my rear 140 as exhaust. My h110 is top mounted pullning air down into the case. My cpu temps are fine.


Then all I can think of is the GPU being dusty or the TIM needing to be replaced


----------



## kizwan

Quote:


> Originally Posted by *milan616*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The best course of action is for you to try these voltage using overclocking software like MSI AB. Try set negative offset until you can get close to ~1.162V & test this in games. When you know it is stable then you can proceed with BIOS mod.
> 
> BTW, your VDDCI in GPU-Z is 1.047V & the HawaiiBiosEditor show 1.050V for VDDCI (AUX) which show your BIOS doesn't set offset voltage.
> 
> 
> 
> Sorry, I had upped my aux voltage to 1050 in my modded BIOS screenshots, the default was 1000mV. So my thing with Afterburner is that the offset voltage seems to only apply to the full 3D state. I had it set to -31mV core and +50mV aux (I had some very infrequent glitches that I found went away with a bump to aux). If I could figure out a way to get AB to apply the offset voltage the whole way through I'd completely go that route instead of the BIOS mod since I use AB for a custom fan curve as well.

Better not to complicate things then; just change the DPM7 voltage. You're on the right track with HawaiiBiosEditor.


----------



## gsdavid1

1170/1725, still on stock volts

75°C GPU / 67°C VRM

I get artifacts in Firestrike, so I would need to touch voltages to make this 24/7 stable :/


----------



## Carniflex

Quote:


> Originally Posted by *Kerelm*
> 
> I would love to put a AIO on my g1 but I'm not sure it's do-able.


Alphacool should be releasing their AIO for graphics cards in approx a month or so. It's basically that "hybrid full cover" block, with the GPX core block "upgraded" to also hold a very slim pump. Standard G1/4'' connectors, if the preliminary teasers are what we'll be getting once it releases. So it should be pretty flexible in its configuration options.


----------



## Stige

And the GPX series coolers work pretty well, as long as you replace the awful thermal pads on the VRM and glue a cooler onto the VRM end.
I can run it at +200mV now with those fixes without the VRM going over 75°C.

Phobya 7W/mK thermal pads and GT AP-15s.


----------



## Kerelm

Do you guys know if replacing the TIM on my G1 will void the warranty? ty


----------



## tolis626

Quote:


> Originally Posted by *Kerelm*
> 
> Do you guys know if me replacing the TIM on my G1 will void warranty? ty


Technically speaking, it should. But some manufacturers will still honor your warranty if all you do is replace the TIM and don't damage the card in any way. I think MSI is one of them. Don't quote me on this, though, wait for others to chime in too.


----------



## gupsterg

Quote:


> Originally Posted by *Dundundata*
> 
> I ran the EVV program
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> What does it mean '0x283'?
> 
> 
> 
> Also it told me to restore the default SCLK frequency. I ran the program upon reboot with no OC.


Thanks for sharing this info; I've been after data on what 390/X owners have as their DPM7 VID. I'd also be interested in your other DPM state VIDs; any chance you can check via AIDA64?


Spoiler: AiDA64 Registers dump











As @fat4l stated, you have low ASIC quality, which = low leakage. From the information The Stilt has posted, this is desirable to have; view here.

What kind of overclocks are you getting? IIRC that's the lowest ASIC quality/leakage I've noted.
Quote:


> Originally Posted by *XxxxVulcanxxxX*
> 
> Does ASIC quality come into play with the temps due to the differing voltage requirements? or am i thinking too much into it? I just cant see why some are reporting large overclocks on air and others are struggling to keep it at stock without overheating from the same manufacturer...?
> 
> My ASIC quality is 81.6% and i struggled overclocking on air with the msi twin frozr cooler...


Besides the "Silicon lottery" factor see linked post above of The Stilt.
Quote:


> Originally Posted by *battleaxe*
> 
> Where do you get the most recent version of iTurbo? without signing up for some spamware?


I think v1.6.6 is still the latest (I googled but couldn't find info). You can find a non-spamware link near the end of the OP of this thread, under the heading Useful Links.
Quote:


> Originally Posted by *milan616*
> 
> However, I do think I have an offset in the BIOS. My card is an MSI 390x.


In MSI AB do you see +xx mV on Core Voltage slider even though you didn't add one?



Pressing "Reset" should revert the information displayed to the card's detected BIOS defaults. If you've been applying another BIOS to the card, then it can be a good idea to check by:-

i) uninstalling OC apps without keeping settings.
ii) uninstall AMD driver.
iii) run DDU.
iv) reinstall everything and check.
Quote:


> Originally Posted by *milan616*
> 
> Looking in a hex editor at 0xAC40 I saw a value of 51h. I only came across the Stilt mentioning that address once in regard to someone else's question, and they had a 65h or so in there that was a 30-something mV offset, so I'm not quite sure how the offset works.


Do not use information about someone's bios regarding offset locations unless you know you have exact same bios.
Quote:


> Originally Posted by *milan616*
> 
> 
> 
> Spoiler: VID for DPMS
> 
> 
> 
> Code:
> 
> 
> ------[ GPU PStates List ]------
> 
> DPM0: GPUClock =  300 MHz, VID = 0.90000 V
> DPM1: GPUClock =  533 MHz, VID = 0.96800 V
> DPM2: GPUClock =  788 MHz, VID = 1.01200 V
> DPM3: GPUClock =  909 MHz, VID = 1.11200 V
> DPM4: GPUClock =  960 MHz, VID = 1.16200 V
> DPM5: GPUClock = 1005 MHz, VID = 1.21200 V
> DPM6: GPUClock = 1040 MHz, VID = 1.25000 V
> DPM7: GPUClock = 1080 MHz, VID = 1.27500 V


Many thanks for posting this info, what is your ASIC quality?


----------



## tolis626

While my overclocking endeavors seem to be going straight to hell, I took the time to get some numbers for my profile that's aimed at low power consumption: 1000MHz core with -100mV, 1500MHz memory and 0% added power limit. Now, I did lose about 10% performance (a little less than 10%, actually) compared to stock, at least in BF4, but it's not like the 390X doesn't have performance to spare most of the time. The most interesting thing is the numbers, though. Under load it uses about 1.05V VDDC and consumes less than 150W, if GPU-Z is to be trusted (it showed an average of about 120W, actually, with the highest value being 143W). Temps didn't break 60°C; they hovered around 55°C most of the time, and the card was inaudible even with my head next to the case. Nice stuff! Turns out Hawaii really is efficient.

On another note, high memory clocks are a no-go on my chip. I can do 1650MHz (maybe a bit higher, I haven't really tested at this point) up to 1150MHz core, but above that even 1625MHz is flaky as all hell and I have to drop to 1600MHz. With 1600MHz I can get 1180MHz on the core, though. I'm going to see if performance is better with the last 25-30MHz of core clock or with the highest memory overclock I can achieve at 1150MHz core. If the performance is the same or greater, keeping the 1150MHz core would be better because of the lower voltage and thermals. Hmmm...
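As a back-of-envelope check on those numbers: dynamic power scales roughly with f·V². The stock baseline figures below (1080MHz, ~1.20V, ~200W board power) are assumptions for illustration, not measurements:

```python
# Back-of-envelope dynamic power scaling, P ~ f * V^2.
def scaled_power(p0, f0, v0, f1, v1):
    """Estimate power at (f1, v1) given baseline p0 at (f0, v0)."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Assumed stock baseline: 1080 MHz at ~1.20 V and ~200 W board power.
# Undervolted profile: 1000 MHz at ~1.05 V.
est = scaled_power(200, 1080, 1.20, 1000, 1.05)
print(round(est))  # 142, in the same ballpark as the GPU-Z reading
```

So a modest clock drop plus a 100mV undervolt plausibly shaving off a quarter or more of the board power matches what GPU-Z reported.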


----------



## Stige

I tried Furmark today.

At my overclocked settings, desktop wattage was about ~130W from the wall, I put furmark on at 1080p and it jumped to 622W lol
Stock clocks/voltages, desktop was at ~128W, so very little change, but starting furmark and it went to 420W!

200W difference







massive.


----------



## jdorje

130 from the wall at idle? You must be doing something wrong.


----------



## battleaxe

Quote:


> Originally Posted by *m70b1jr*
> 
> Mine gets 80c on an AIO with my overclock, only advantage over the stock cooler is that my AIO gpu cooler is a lot quieter.


You should try reseating the block. That's too high even for an AIO. I run an AIO on my old 290 and it never goes over 65C, even when heavily overclocked, like 1220MHz-plus on the core. Still around 63C max. I have seen temps like yours, but only when the block wasn't seated correctly. Try again and see if your luck improves; that seems too high to me.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> I tried Furmark today.
> 
> At my overclocked settings, desktop wattage was about ~130W from the wall, I put furmark on at 1080p and it jumped to 622W lol
> Stock clocks/voltages, desktop was at ~128W, so very little change, but starting furmark and it went to 420W!
> 
> 200W difference
> 
> 
> 
> 
> 
> 
> 
> massive.


How much OC and Volts?


----------



## Synntx

Alright, I eliminated the CPU OC, dropped the memory speed back down to stock, completely wiped the drivers using DDU, reinstalled the drivers, flashed the modded-timings BIOS again, and started reworking my overclock. I am STILL getting the crash/reboot issue..... which takes me back to the PSU. Anyone have any other suggestions?


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> Alright, I eliminated the CPU OC, dropped the memory speed back down to stock, completely wiped the drivers using DDU, reinstalled drivers, flashed modded timings bios again, and started reworking my overclock. I am STILL getting the crash/reboot issue.....which is taking me back to the PSU being the issue. Anyone else have any other suggestions??


Are you sure it's not caused by the modded timings? Memory problems can be a pain in the butt to diagnose and it's not always about speed. I'd flash the original BIOS if I were you.

Other than that... Maybe check your cables. I've seen cases where bad cables caused problems (it's been years since I've seen that happen, though). Also, when you say it crashes, what does it do? BSOD? RSOD? Or just downright reboot? If it just reboots suddenly out of nowhere, it may indeed be the PSU, but because it's dying, not because 750W isn't enough.


----------



## Synntx

Quote:


> Originally Posted by *tolis626*
> 
> Are you sure it's not caused by the modded timings? Memory problems can be a pain in the butt to diagnose and it's not always about speed. I'd flash the original BIOS if I were you.
> 
> Other than that... Maybe check your cables. I've seen cases where bad cables would be causing problems (It's been years since I've seen that happen though). Also, when you say it crashes, what does it do? BSOD? RSOD? Or just downright reboot? If it just reboots suddenly out of nowhere, it may indeed be the PSU but because it's dying, not because 750W isn't enough.


Yeah I thought the same thing and flashed the stock BIOS back and started pushing the clocks higher and higher until I started getting the crash again. What I mean by crash, is the whole system just reboots. Before it would reboot and then fail POST, so I'd have to hard reset it to get it going again. Then I adjusted my CPU OC a bit and it started crashing and rebooting under GPU load.

But, I think it's safe to say I have diagnosed and troubleshot the issue................go ahead...............please.......................SOMEONE............................................................ask me what it was


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> Yeah I thought the same thing and flashed the stock BIOS back and started pushing the clocks higher and higher until I started getting the crash again. What I mean by crash, is the whole system just reboots. Before it would reboot and then fail POST, so I'd have to hard reset it to get it going again. Then I adjusted my CPU OC a bit and it started crashing and rebooting under GPU load.
> 
> But, I think it's safe to say I have diagnosed and troubeshooted the issue................go ahead...............please.......................SOMEONE............................................................ask me what it was


It isn't the PSU, is it?


----------



## Synntx

Quote:


> Originally Posted by *tolis626*
> 
> It isn't the PSU, is it?


Well, to be fair, it DID have something to do with power delivery, so I wasn't far off the beaten path...........

If it is what I think it is, it turns out the PCIe cables were a bit loose at the card. I have sleeved extension cables that didn't seem to be seated ALL the way in, which actually explains perfectly the behavior I was seeing from the computer.

I figured 750W would be more than enough juice, even if I wanted to CrossFire 390Xs, which is why I bought that PSU. Plus, it's Gold rated and semi-modular, so the price was right and I scooped it up. I was just thinking this particular unit may have been bad. I knew it was going to be a hardware issue; I just couldn't figure it out until...............

........well, in a fit of rage I took the window off my case and was ready to rip out that 390X and throw it from the roof of the tallest building I could find, in true Office Space fashion, when out of the corner of my eye I noticed this slight.....imperfection......in the lining up of the PCIe cables. And I was like, no freakin' way. So I pushed them in as hard as I could, ran the benchmarks again, and lo and behold, no crashes


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> Well, to be fair, it DID have something to do with power delivery, so I wasn't far off the beaten path...........
> 
> If it is what I think it is, turns out the PCIE cables were a bit loose at the card. I have sleeved extension cables that didn't seem to be seated ALL the way in, which actually explains perfectly the behavior I was seeing from the computer.
> 
> I figured 750w would be more than enough juice, even if I wanted to crossfire 390x, which is why I bought that PSU. Plus, it's Gold rated and semi modular, so the price was right and I scooped it up. I was just thinking this particular unit may have been bad. I knew it was going to be a hardware issue, I just couldn't figure it out until...............
> 
> ........well in a fit of rage I took the window off my case and was ready to rip out that 390x and throw it from the roof of the tallest building I could find in true Office Space fashion, when out of the corner of my eye i noticed this slight.....imperfection......in the lining up of the PCIE cables. And I was like, no freakin way. So I pushed them in as hard as I could and ran the benchmarks again, and lo and behold, no crashes


Stand up, go to the closest mirror you can find, look yourself in the eye and then slap yourself. That's what I do in these circumstances.









At least everything is alright. Least you can do is post about how far you can push that thing. Now push it. PUSH IIIIIIIIIIT!


----------



## Synntx

Quote:


> Originally Posted by *tolis626*
> 
> Stand up, go to the closest mirror you can find, look yourself in the eye and then slap yourself. That's what I do in these circumstances.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least everything is alright. Least you can do is post about how far you can push that thing. Now push it. PUSH IIIIIIIIIIT!


Oh I plan to. Currently OC'd to 1190 on the core w/ +100mv, temps barely hit 67 in a Firestrike run.


----------



## gupsterg

Quote:


> Originally Posted by *Synntx*
> 
> ........well in a fit of rage I took the window off my case and was ready to rip out that 390x and throw it from the roof of the tallest building I could find in true Office Space fashion


OMG!








Quote:


> Originally Posted by *tolis626*
> 
> Stand up, go to the closest mirror you can find, look yourself in the eye and then slap yourself. That's what I do in these circumstances.


I don't need a mirror







, I have a wife she does it for me!


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> Oh I plan to. Currently OC'd to 1190 on the core w/ +100mv, temps barely hit 67 in a Firestrike run.


Mine can't go that high, sadly. It caps out at 1185MHz, but even then only for benchmarking. I get artifacts in games above 1175MHz. 1175MHz isn't bad, but still, I'd like more. I hope I'll gain a few more MHz by replacing the TIM and improving temps.
Quote:


> Originally Posted by *gupsterg*
> 
> OMG!
> 
> 
> 
> 
> 
> 
> 
> 
> I don't need a mirror
> 
> 
> 
> 
> 
> 
> 
> , I have a wife she does it for me!


Hahahahahah









Man, I can only imagine how much nagging the married among you receive. Women, usually, don't take it TOO well when we spend too much time with our toys. My girlfriend doesn't. I hope she doesn't ever decide to say "Me or the PC". She'll lose.









So, a bit of testing was done and strange things are happening. I was able to take my memory as high as 1725MHz with +50mV AUX when using 1150MHz with +50mV on the core. When the core is at 1175MHz, my highest stable overclock, anything above 1625MHz memory will crash and even 1625MHz is so-so stable. So I did a bit of benchmarking. 1150/1725MHz is just a teeny tiny bit slower than 1175/1625MHz in Firestrike, but I'd say that's within margin of error and couldn't be bothered to watch through the whole test again. In Valley, however, 1150/1725 was a tiny bit faster than 1175/1625 (1909 vs 1900). Sadly, I thought I took screenshots but they weren't saved for some reason. I did save the results, though, but I have to zip them to upload them and am bored, tbh.









Does anyone have any clues why this is happening? First off, I didn't think memory clocks would make much of a difference, as the 390X has bandwidth in spades, but it seems they do. Secondly, and most importantly, what could be the reason my memory refuses to overclock properly at higher core clocks? I'm truly at a loss here. If it were 25MHz or so I wouldn't care, but it's 100MHz less and it costs me performance.


----------



## navjack27

Quote:


> Originally Posted by *Synntx*
> 
> Oh I plan to. Currently OC'd to 1190 on the core w/ +100mv, temps barely hit 67 in a Firestrike run.


lol that's gold man!


----------



## Synntx

Well, looks like I spoke too soon. I'm still getting the same issue.









I guess I'll go grab a new PSU and see if that alleviates it.


----------



## EternalRest

Quote:


> Originally Posted by *bichael*
> 
> Yep. Has a good cooler and backplate. Waterblocks available. Generally may not be the highest clocking but obviously that's still mostly luck of the draw.


Thanks. I bought it this morning. After a Newegg gift card plus MIR, I'll get it for $275. Good deal, and the wife said yes. Now I need to try to talk her into letting me get the BenQ XL2730Z.


----------



## granadier12

I'm planning to get the Alphacool NexXxoS (or something like that) R9 390 M04 to watercool the card (Gigabyte R9 390 G1 Gaming 8GB). What do you guys think? Good idea?

Sent from my VS985 4G using Tapatalk


----------



## Carniflex

Quote:


> Originally Posted by *granadier12*
> 
> Im planning to get the alphacool nexxus(or something like that) r9 390 m04 to watercool the card (gigabyte r9 390 g1 gamming 8gb) what do u guys think?? Good idea?
> 
> Sent from my VS985 4G using Tapatalk


Well, it's the only "full cover" block for that card, so it's not like there are a lot of options for a "full cover" (it's a hybrid block, as you are probably quite aware).


----------



## bichael

Just put the Alphacool GPX block on my Powercolor 390 and got it up and running last night. No time for much testing, but at stock the GPU was around 50C and the VRMs around 60C in a run of Firestrike. That's with a small pump and at 30C+ room temp. (Not worth comparing to my air results, as I had to mount the fans outside the case so they were blowing through a side panel and across a gap; about 80/70 for what it's worth.) I'm definitely happy with it; it gives nearly all the benefits of a full-cover block at less cost while being upgradeable.

Topless Powercolor 390









If this looks cramped imagine what it's like after the psu, hdd's, and blu ray drive go in. The 360 rad lives outside...


----------



## DJNIKEL24

MSI R9 390 crossfire slightly oc'd to 1100 with an fx-8350 oc'd @ 4.9ghz.


----------



## Zyphur

My XFX 390 DD just came in and I'm prepping to set it up tomorrow morning. I'm replacing a 7870 with it. I'm going to uninstall and clean out the old drivers to start fresh, and wanted to pre-download the newest drivers, but every download I click on the AMD site gives me a 'Download Not Complete' page. I get that on all the drivers, Crimson Stable 15.12 and Crimson Beta 16.1 Hotfix. What's up? (Cross-posted to the 390 club and XFX club.)
Edit: Just got them from guru.


----------



## jaimehrubiks

Hi, I'm planning to get the Sapphire R9 390, but I want the future possibility of getting another one, just in case. Do you think this PSU will have enough power to CrossFire R9 390s? http://xtremmedia.com/Corsair_RM850i_850W_80_Plus_Gold_Modular.html

Do you have any other recommendations? I don't plan to get the second one any time soon. Thanks.

Edit: I have an i5 6600K which I'm planning to OC a little as well.


----------



## navjack27

Here is something interesting. Last year it was colder where I am, and the only thing that changed is that this year I have the 390X... I run my computer the same and everything... 7kWh more for the same month.

But we pay about $0.05 per kWh... So it's a non-issue for the most part.


----------



## Zyphur

So I'm trying to game on my brand new XFX 390, and the GPU usage is going crazy. I remember having this issue and fixing it somehow on my 7870, I think. Any ideas for a fix? Dual monitors, 2500K at 4.0.


----------



## m70b1jr

Quote:


> Originally Posted by *Zyphur*
> 
> So trying to game on my brand new XFX 390, and the gpu usage is going crazy. I remember having this issue and fixing it somehow on my 7870 I think. Any ideas for a fix? Dual monitors, 2500k at 4.0, dual monitors.


Newest drivers?


----------



## Zyphur

Quote:


> Originally Posted by *m70b1jr*
> 
> Newest drivers?


Crimson 16.1 Hotfix after installing older drivers.


----------



## Renner

Yeah, I had a similar issue with unusually high GPU activity on the desktop with my 390, for some strange reason during or after midnight. I'd just start hearing its coolers spin up and the temps rising and rising... Can't say it didn't make me panic a bit. A restart would solve it, but I had no explanation or solution. It hasn't happened for the last several months, certainly not on the last Catalyst driver or during any of the Crimson ones.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Zyphur*
> 
> Crimson 16.1 Hotfix after installing older drivers.


Is the FPS doing the same thing? Also, do you have

"Enable unified GPU usage monitoring" enabled in MSI AB Settings, General tab?


----------



## Zyphur

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is the fps doing the same thing. Also do you have
> 
> Enable unified GPU usage monitoring enables in MSI AB Settings/General Tab?


FPS seems lower than earlier; no noticeable strange spikes, though. Edit: I didn't even realize the issue started after installing AB. I enabled that setting and it seems to be good now. Thanks


----------



## granadier12

Quote:


> Originally Posted by *Carniflex*
> 
> Well - it's the only "full cover" block for that card - so it's not like there is a lot of options for a "full cover" (it's a hybrid block, as you are probably quite aware).


Yes I do, that's why I'm planning to get it; I want to fully watercool my PC. The other option for this card is the Arctic Accelero Hybrid III-140. I contacted them and they told me it would work, but I read somewhere else that the VRMs may be left exposed because the plate may not fit.


----------



## Synntx

Just wanted to fill everyone in on my current state of affairs.......here is how my PC sits right now:



I just picked up one of these suckers for only $100!!!!!!!

http://www.newegg.com/Product/Product.aspx?Item=N82E16817371067

Anyhow, I saw it was on the Tier 2 list, so I figured, what the heyhoo.

Well wouldn't ya know it? I've got my card OC'd to 1210/1625 this very moment w/ +150mv.

NO...........FREAKIN........RESTARTS!!!!!!!!!!!

My original hunch seems to be correct, in that my PSU was either defective, dying, or inadequate.

http://www.3dmark.com/fs/7305474


----------



## Zyphur

First day with my new XFX 390 in the books. About 9 hours straight of gaming (GTA V, Blade & Soul, LoL; I'll push it harder tomorrow with more demanding games) and multitasking on dual monitors. The monitoring results seem great to me, but as I'm new to the card I don't know the ideal temps and such, so I'd love second opinions. How do they look?
Here are my HWiNFO, Afterburner monitoring, and CPU HWiNFO results:


Spoiler: Warning: Spoiler!


----------



## jdorje

That's at stock, right?

Those temps are fine, but you're going to run into VRM temp issues if you overclock/overvolt it. Very similar to my XFX 390 (8256).

Edit: also, overclock your CPU more if you want a full 60 FPS in GTA V.


----------



## rdr09

Quote:


> Originally Posted by *Synntx*
> 
> Just wnated to fill everyone in on my current state of affairs.......here is how my PC sits right now:
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I just picked up one of these suckers for only $100!!!!!!!
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817371067
> 
> Anyhow, I saw it was on the Tier 2 list so figure what the heyhoo.
> 
> Well wouldn't ya know it? I've got my card OC'd to 1210/1625 this very moment w/ +150mv.
> 
> NO...........FREAKIN........RESTARTS!!!!!!!!!!!
> 
> My original hunch seems to be correct, in that my PSU was either defective, dying, or inadequate.
> 
> http://www.3dmark.com/fs/7305474


Glad you got it solved. Nice rig. Got a 540 as well and love it.

Quote:


> Originally Posted by *jdorje*
> 
> That's at stock right?
> 
> Those temps are fine but you're going to run into vrm temp issues if you overclock/overvolt it. Very similar to my xfx 390 (8256).
> 
> Edit: also, overclock your cpu more if you want a full 60 fps in gta v.


^This. You need to OC the CPU more, assuming another issue isn't causing the erratic GPU usage. Even with a single 390, a Sandy Bridge i5 needs OC'ing, and even then, with games using more cores, an OC will only do so much good.

Zyphur's cores are hitting 100% spikes.


----------



## ziggystardust

Is anyone else having their GPU core clock fluctuate in certain games?

I have a Sapphire 390X Nitro and the stock clock is 1080MHz (on an i7 4770K + 16GB of RAM). In Witcher 3, Project Cars, Dying Light, Mordor, Tomb Raider, and Talos Principle it seems fine and I'm getting a stable 1080MHz. But in Warhammer Vermintide, Killing Floor 2, and Lords of the Fallen it sometimes fluctuates and even dips down to 600-900, occasionally causing slight slowdowns/stutters. Similar fluctuations happen, slightly and more rarely, in the Heaven and Valley benchmarks as well (like 3-4 times in a run).

I've read that these fluctuations are caused by AMD's new power-saving features in the Crimson drivers not being fully optimized and working as intended yet. But it still seems odd to me. Is anyone else in a similar situation?


----------



## OneB1t

cpu bottleneck


----------



## ziggystardust

Quote:


> Originally Posted by *OneB1t*
> 
> cpu bottleneck


So you're saying my i7 4770K is bottlenecking an R9 390X? It bottlenecks in the crappy Heaven benchmark, where CPU utilization is barely at 10%, or in Lords of the Fallen, but not in Shadow of Mordor or Dying Light, where it really hits 40-50%?

Are you sure?


----------



## AliNT77

Check your VRM temps.

I had the same issue when my VRMs were bare.


----------



## ziggystardust

Quote:


> Originally Posted by *AliNT77*
> 
> Check vrm temps
> 
> I had the same issue when my vrms where naked


It's not a temp issue, I think. My VRM temps max out at 81-82C in some games and mostly average around the 70s (at the stock fan curve, where the fans don't spin until 60C). Moreover, it does this even right at the start, when the VRMs are still in the 50s.

Just checked Metro Last Light. It's a stable 1080MHz; temps max at 67C core, 75-77C VRM. In Killing Floor 2, temps are even lower but the clock still fluctuates.


----------



## tolis626

Quote:


> Originally Posted by *ziggystardust*
> 
> It's not a temp issue I think. My vrm temps are maxing out at 81-82C in some games and mostly averaging around 70s (@stock fan curve where fans not spinning until 60C). Moreover it does that even at the beginning like vrms are still around 50s.
> 
> Just checked Metro Last Light. It's stable 1080 MHz, temps max core 67C, vrm 75-77C. In Killing Floor 2, temps are even lower but still fluctuating.


Just to eliminate this as a possibility, open Radeon Settings, go to the Games tab and then into Global Settings. Check if your frame rate target is low. For some reason mine was stuck at 60FPS and all DX11 games (so everything I play except Battlefield 4) were crawling. I upped it to 200FPS and boom, issues gone. It's probably not it, but it's worth a shot.


----------



## ziggystardust

Quote:


> Originally Posted by *tolis626*
> 
> Just to eliminate this as a possiblity, open Radeon Settings, go to the Games tab and then into Global Settings. Check if your framerate target is low. For some reason mine was stuck at 60FPS and all DX11 games (So everything I play except for Battlefield 4) were crawling. I upped it to 200FPS and boom, issues gone. It's probably not it, but it's worth a shot.


Yes, it's disabled.

The thing is, my friend bought a 380X a few days ago and he just said he's having similar core clock fluctuations in some games.


----------



## OneB1t

Yes, I'm sure.








100/8 = 12.5%. If any single thread from the game reaches this share of total CPU load, the GPU is CPU-bottlenecked.

You say your CPU load is about 10%, so there we go: CPU bottleneck.








You can check this in Process Hacker: http://processhacker.sourceforge.net

Dying Light is a typical single-thread bottleneck: even though you see 40% CPU load, there is one thread loading its core to 100%.

Again, this cannot be seen with the normal Task Manager; you need software like Process Hacker to see the problem.
When your card is not kept fully loaded by your CPU, the GPU clock drops.

If you want to remove this problem, overclock your CPU or go to a higher resolution.







(Or switch to Nvidia, which has less CPU overhead; that's the main reason it performs better at 1080p, while at 4K, where the CPU bottleneck is lifted, a card like the 390X can easily keep up with a 980.)

The other solution is to wait for the DX12 era, as that API will decrease CPU overhead.
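The 100/N rule above can be written down as a tiny check. A sketch of the arithmetic only; the function names are illustrative, and the per-thread percentages would come from whatever monitoring tool (e.g. Process Hacker) you use:

```python
# OneB1t's rule of thumb as code: on a CPU with N logical threads, one
# fully saturated software thread shows up as only 100/N percent of
# *total* CPU load, so an overall figure like Task Manager's hides a
# single-thread bottleneck.

def single_thread_cap(logical_cpus: int) -> float:
    """Max share of total CPU one thread can use, in percent."""
    return 100.0 / logical_cpus

def is_single_thread_bottlenecked(thread_loads, logical_cpus, slack=0.95):
    """True if any one thread is near its per-thread ceiling."""
    cap = single_thread_cap(logical_cpus)
    return any(load >= slack * cap for load in thread_loads)

# 8-thread CPU (e.g. an i7 4770K with HT): the ceiling is 12.5% per thread.
print(single_thread_cap(8))                                  # 12.5
# A game showing 12.19% on one thread is effectively pegged:
print(is_single_thread_bottlenecked([12.19, 3.0, 2.1], 8))   # True
```

So a game that looks like "only 40% CPU" can still be completely gated by one thread sitting at its 12.5% ceiling.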


----------



## ziggystardust

It turned out to be the crappy Crimson drivers, I guess. I went back to Catalyst 15.11 and there is almost no fluctuation in these games now. So it's AMD pushing some kind of clumsy power-saving feature.

However, in benchmarks Crimson 16.1 performs slightly better.

Quote:


> Originally Posted by *OneB1t*
> 
> yes im sure
> 
> 
> 
> 
> 
> 
> 
> 
> 100/8 = 12.5% if any thread from game reach this core load then gpu is cpu bottlenecked
> 
> you say your cpu load is about 10% so here we go cpu bottleneck
> 
> 
> 
> 
> 
> 
> 
> 
> you can check this in process hacker http://processhacker.sourceforge.net
> 
> dying light is typical single thread bottleneck even that you see 40% cpu load there is one thread which loads its core to 100% load
> 
> again this cannot be seen with normal task manager you need software like process hacker to see this problem
> when your card is not fully loaded by your cpu then gpu clock drops
> 
> if you want to remove this problem then overclock your cpu or go higher resolution
> 
> 
> 
> 
> 
> 
> 
> (or switch to nvidia which have less cpu overhead which is main reason why it performns better in 1080p but in 4k where cpu bottleneck is lifted card like 390X can keep easily with 980)
> 
> other solution is to wait for DX12 era as this API will decrease CPU overhead


So you're telling me these crappy Heaven/Valley benchmarks are more CPU-intensive than Shadow of Mordor, Metro Last Light, or Battlefield 4? (I tried BF4 and there's no fluctuation there either.)

Also, I can see all my core/thread loads in MSI AB and have never seen 100% load in Dying Light on any thread. It's mostly perfectly even across all cores/threads, averaging around 40-50%.

I guess you were trying to say that the GPU is bottlenecking the CPU if CPU load is that low, because it makes no sense for the CPU to be bottlenecking the GPU when GPU load is constantly 100%. GPU load never drops (unless I use an FPS limiter); it's just the fluctuating clock speed, and only in certain games, which aren't very CPU-intensive games either. So no, there shouldn't be any kind of CPU bottleneck anywhere. It's not possible. Even 390X crossfire can't be bottlenecked by a 4770K.

PS: It also isn't the CPU feeding/loading the GPU in the first place; it's the CPU answering the GPU's calls. That's why CPU comparisons/benchmarks are always done at lower resolutions with lower graphics settings, to eliminate GPU limits.


----------



## OneB1t

You cannot see 100% core load because the Windows scheduler spreads the load across the whole CPU.

Check the process in Process Hacker; it can show thread load rather than CPU load, and then you will see that one core's worth is 100% loaded by processes like Dying Light or Unigine Heaven.

What I was saying is that you have a CPU bottleneck and don't even know it, because Task Manager and the Windows scheduler are masking it.









Let me explain with an example.

This is an application I'm developing; its OpenGL visualization is strictly single-threaded:
http://postimg.org/image/5zvgg4dqf/

CPU usage is 18%, but thread usage is 12.19% (which is nearly the per-thread maximum of 12.5%, i.e. 100/8, as I have an 8-thread CPU). So even though your CPU is only 18% used, there is a brutal single-thread bottleneck.

When you lock that application to use just one thread, this happens:
http://postimg.org/image/4xmqpazn3/

You can see that in Task Manager one core is at 100% load.

This is how the Windows scheduler works, masking single-thread bottlenecks in apps.

As long as an application has one thread using 12.5% of the CPU (in our case, with 8 threads; 25% if you have a 4-core CPU), you are hitting a CPU bottleneck and your card is not running at 100% clocks.

The other possible problem is the power limit for the card, but I don't think that's your issue.

The Crimson driver also downclocks more aggressively when your card is not fully loaded.

A CPU bottleneck happens when one thread reaches 100% load, even though the whole CPU may show only a few percent load. In a normal game there is more than one thread, but if any one of those threads reaches 12.5% usage, the game cannot run faster even if four cores are not used at all.

This is also the reason the FX-8xxx is bad in certain games like WoW, Guild Wars, Dying Light, Fallout 4, or Skyrim, but does pretty well in well-threaded games like Battlefield 4, Dragon Age, The Witcher 3, etc., and with all DX12 workloads.
Quote:


> Originally Posted by *ziggystardust*
> 
> PS: And it's also isn't the cpu feeding/loading the gpu in the first place. It's the cpu answering gpu's calls. That's why cpu comparisons/benchmarks always made with lower resolutions with lower graphic settings to eliminate the gpu limits.


Nope, it's the CPU feeding the GPU with data. A lower resolution means less work for the GPU, and it also means a bigger CPU bottleneck in games/apps.
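The masking effect described above can be demonstrated with a pure-stdlib sketch (this is not how Process Hacker actually measures things; it is just an illustration). One saturated Python thread burns ~100% of a single core, yet the "overall CPU %" a task manager would report is only about 100/N on an N-thread machine:

```python
# One pegged thread vs. the "overall" CPU figure, using only the stdlib.
# time.thread_time() returns CPU time consumed by the *current* thread,
# so the busy thread can measure its own saturation directly.
import os
import threading
import time

def burn(duration, result):
    """Spin for `duration` wall seconds, recording this thread's CPU time."""
    start_cpu = time.thread_time()
    stop_at = time.monotonic() + duration
    while time.monotonic() < stop_at:
        pass                      # busy-wait: consumes a full core
    result["cpu"] = time.thread_time() - start_cpu

result = {}
t = threading.Thread(target=burn, args=(1.0, result))
wall_start = time.monotonic()
t.start()
t.join()
wall = time.monotonic() - wall_start

thread_pct = 100.0 * result["cpu"] / wall   # close to 100% of one core
overall_pct = thread_pct / os.cpu_count()   # what an overall meter shows
print(f"busy thread: {thread_pct:.0f}% of one core")
print(f"overall CPU: {overall_pct:.1f}% on a {os.cpu_count()}-thread CPU")
```

On an 8-thread machine the second line lands near 12.5%, which is exactly why a game can be hard-bottlenecked while the total CPU load looks idle.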


----------



## ziggystardust

Quote:


> Originally Posted by *OneB1t*
> 
> you cannot see 100% core load as windows scheduler spread that load onto whole cpu
> 
> check that process in process hacker he can show thread load not cpu load then you will see that 1 core is 100% loaded by processes like dying light or unigine heaven
> 
> what i was saying is that you have CPU bottleneck and you even dont know about it as task manager and windows scheduler masking it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> let me explain on example
> 
> this is application im developing which have opengl visualization is strictly singlethreaded
> http://postimg.org/image/5zvgg4dqf/
> 
> cpu usage is 18% but thread usage is 12.19% (which is nearly 100% as 100% is 12.25 (100/8 in my case as i have 8 thread cpu))
> so even your cpu is used only from 18% there is brutal single thread bottleneck
> 
> when you lock that application to use just 1 thread then this happen
> http://postimg.org/image/4xmqpazn3/
> 
> you can see that in task manager 1 core is 100% load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this is how windows scheduler works masking single thread bottleneck in apps
> 
> as long as aplication have one thread which is using 12.25% of cpu (in our case as we have 8 cores or 25% in case you have 4 core) you hitting cpu bottleneck and your card is not running 100% clocks
> 
> other problem can be power limit for card but i dont think that this can be your issue
> 
> crimson driver also have more aggresive downclocking when your card is not fully loaded
> 
> CPU bottleneck is happening when 1 thread is reaching 100% load even that whole cpu can have only few percent load
> in normal game there is more than 1 thread but still if one of these threads reach 12.5% usage then game cannot run faster even if there is 4 cores not used at all


As I said in my previous post, I can see all individual thread loads in real time on screen with MSI AB, and no, none of my threads or cores is hitting anywhere near 100% load in any of the games or benchmarks.

I also mentioned that my GPU load never drops below 99-100%; I can monitor GPU load on screen via MSI AB as well. The problem is that even when the load is 100%, the core speed fluctuates in some games. Not all of them, but some. And there isn't any kind of bottleneck. There wouldn't be even if I had three 390Xs.


----------



## OneB1t

Quote:


> Originally Posted by *ziggystardust*
> 
> PS: And it's also isn't the cpu feeding/loading the gpu in the first place. It's the cpu answering gpu's calls. That's why cpu comparisons/benchmarks always made with lower resolutions with lower graphic settings to eliminate the gpu limits.


You cannot see threads in MSI Afterburner, only per-core load, and that's not the same thing, since the Windows scheduler is messing with you.
If you have GPU load at 100%, then increase the power limit.

The problem is that the GPU load info in Afterburner is misleading in games like Dying Light; try decreasing the polling rate and you will see that 100% is not really 100%.


----------



## jdorje

There shouldn't be a CPU bottleneck in Unigine programs no matter what your CPU is.

Power throttling would be my guess.


----------



## OneB1t

In fact, Unigine Heaven DOES have a single-thread bottleneck in some scenes, which only starts to appear if your card is powerful (390X or better) and running at a low resolution like 1080p without MSAA.


----------



## ziggystardust

Quote:


> Originally Posted by *OneB1t*
> 
> you cannot see threads in msi afterburner... only per core load but its not same thing... as windows scheduler is messing with you
> if you have gpu load at 100% then increase power limit
> 
> problem is that gpu load info in afterburner is malformed try to decrease pooling rate and you will see that 100% is not really 100%
> 
> 
> 
> 
> 
> 
> 
> in games like dying light


I can see 8 thread loads in AB. Are they fabricated loads?

I'm at 100ms polling rate, should I decrease more? I didn't have clock speed problem in Dying Light though. It's just maxing out at 1080 and not moving at all.

Now I'm on Catalyst 15.11 and the problem is mostly gone. Just slight fluctuations in Lords of the Fallen and Warhammer Vermintide, which are actually pretty crappy games in terms of optimization.

My friend is having fluctuations in AC Syndicate and Batman Arkham Knight with his 380X and an i7 on Crimson 16.1. And there are actually lots of threads on other forums about these clock fluctuations.


----------



## OneB1t

you still don't understand which threads i mean..








i'm talking about process threads, and you can't see them in msi afterburner or windows task manager

you need something like Process Hacker to see process threads

i'm running my 290X on Crimson 16.1 and not experiencing any of these, but i also run at 4K resolution


----------



## Sgt Bilko

Quote:


> Originally Posted by *ziggystardust*
> 
> Quote:
> 
> 
> 
> Originally Posted by *OneB1t*
> 
> you cannot see 100% core load as windows scheduler spread that load onto whole cpu
> 
> check that process in process hacker he can show thread load not cpu load then you will see that 1 core is 100% loaded by processes like dying light or unigine heaven
> 
> what i was saying is that you have CPU bottleneck and you even dont know about it as task manager and windows scheduler masking it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> let me explain on example
> 
> this is application im developing which have opengl visualization is strictly singlethreaded
> http://postimg.org/image/5zvgg4dqf/
> 
> cpu usage is 18% but thread usage is 12.19% (which is nearly 100%, as 100% here is 12.5 (100/8 in my case, as i have an 8-thread cpu))
> so even though your cpu is only 18% used, there is a brutal single-thread bottleneck
> 
> when you lock that application to use just 1 thread then this happen
> http://postimg.org/image/4xmqpazn3/
> 
> you can see that in task manager 1 core is 100% load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this is how windows scheduler works masking single thread bottleneck in apps
> 
> as long as the application has one thread using 12.5% of the cpu (in our case, as we have 8 threads; or 25% if you have a 4-core cpu) you are hitting a cpu bottleneck and your card is not running at 100% clocks
> 
> other problem can be power limit for card but i dont think that this can be your issue
> 
> the crimson driver also has more aggressive downclocking when your card is not fully loaded
> 
> A CPU bottleneck happens when 1 thread reaches 100% of its share, even though the whole cpu can show only a few percent load
> in a normal game there is more than 1 thread, but still, if one of those threads reaches 12.5% total usage then the game cannot run faster even if 4 cores are not used at all
> 
> 
> 
> As I said in my previous post I can see all individual thread loads with MSI AB real time on screen while playing games, and no, none of my threads or cores hitting anywhere near 100% load in any of the games or benchmarks.
> 
> I also mentioned my gpu load never drops below 99-100% in my previous post. I can monitor gpu loads on screen via MSI AB again. The problem is even the load is 100% the core speed fluctuates in some games. Not all of them, but some of them. And there isn't any kind of bottleneck. *There won't be even if I have three 390Xs*
Click to expand...

That's debatable but anyways.....

put your powerlimit to +50% and run it again, it might be PowerPlay doing it.
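OneB1t's rule of thumb in the quote above reduces to simple arithmetic: on a CPU with N logical threads, one software thread can never exceed 100/N percent of total CPU usage, so a thread sitting at that ceiling is a bottleneck even when overall usage looks low. A minimal sketch of that check (the 0.95 tolerance is an arbitrary value of mine, not anything from the thread):

```python
def single_thread_cap(logical_cpus):
    """Max share of total CPU one thread can consume, in percent."""
    return 100.0 / logical_cpus

def looks_bottlenecked(thread_pct, logical_cpus, tolerance=0.95):
    """True if the busiest thread is pinned near its per-thread ceiling."""
    return thread_pct >= tolerance * single_thread_cap(logical_cpus)

# OneB1t's example: a thread at 12.19% overall on an 8-thread CPU.
print(single_thread_cap(8))          # 12.5
print(looks_bottlenecked(12.19, 8))  # True  -> single-thread bound
print(looks_bottlenecked(12.19, 4))  # False -> cap is 25%, plenty of headroom
```

This is why Task Manager's total CPU graph hides the problem: 12.19% of the whole CPU looks idle, but it is 97.5% of what one thread can ever get.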


----------



## ziggystardust

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's debatable but anyways.....
> 
> put your powerlimit to +50% and run it again, it might be PowerPlay doing it.


Yeah i was kinda exaggerating









I already tried increasing the power limit, but no change.

Since it's just a few games doing this, I'll call it poor optimization and continue using Catalyst 15.11.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ziggystardust*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That's debatable but anyways.....
> 
> put your powerlimit to +50% and run it again, it might be PowerPlay doing it.
> 
> 
> 
> Yeah i was kinda exaggerating
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I already tried increasing the powerlimit but no change.
> 
> Since it's just few games that doing this, I will call it crap optimization and continue using Catalyst 15.11.
Click to expand...

I had some issues with GTA V earlier and it turned out to be Fraps messing with it; I disabled it and everything was smooth.

The other option is to go into MSI Afterburner's settings and, in the "unofficial overclocking mode" dropdown, change it to "without PowerPlay support". That locks the card to 3D clocks all the time, so heat will be up, but you won't have any clock speed fluctuations anymore.


----------



## OneB1t

maybe the 100ms polling period in Afterburner can also cause that (or in GPU-Z)


----------



## Zyphur

Quote:


> Originally Posted by *jdorje*
> 
> That's at stock right? Those temps are fine but you're going to run into vrm temp issues if you overclock/overvolt it. Very similar to my xfx 390 (8256). Edit: also, overclock your cpu more if you want a full 60 fps in gta v.


Quote:


> Originally Posted by *rdr09*
> 
> Glad you got it solved. Nice rig. Got a 540 as well and love it.
> ^this. Need to oc the cpu more not that another issue is causing the eratic gpu usage. even with a single 390 an i5 sandy needs oc'ing. and even then, with games using much cores . . . an oc will only do so much good. Zyphur's cores are hitting 100% spikes.


Yeah, that's completely stock. I haven't even changed any Radeon global settings except the FPS target to max. The fan also never even went over 50%, so if I overclock or play more demanding stuff I'll of course manually push the clocks up more, especially since the fans are super quiet compared to any other cards I've had. Oh yeah, I was wondering, is there a way to change the minimum auto fan speed from 20%?

I recently got a Hyper 212 so I plan on pushing the CPU more. I just wanted to make sure the temps and voltage are good before going higher. The voltage being 1.361V seems high to me for such a low OC, so I've been wary of possible heating issues going higher. Here are my UEFI settings for it.


Spoiler: Warning: Spoiler!




Had made a thread about it here but no responses http://www.overclock.net/t/1582962/first-time-ocer-how-are-my-initial-results-2500k-air-prime95


----------



## ziggystardust

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I had some issues with GTA V earlier and it turned out to be Fraps messing with it, disabled it and everything was smooth
> 
> other option is to go into MSI Afterburner's settings and with the "unofficial overclocking mode" drop down change it to "without PowerPlay support", that locks the card to 3D clocks all the time so heat will be up but you won't have any clock speed fluctuations anymore


I'm not using Fraps either. Since it's just a few games and the fluctuations in them are minimal now with the latest Catalyst, I will just live with it.

Quote:


> Originally Posted by *OneB1t*
> 
> maybe the 100ms polling period in Afterburner can also cause that (or in GPU-Z)


Yeah, I tried raising it to 1000ms before and it was the same. It's probably that those games are terribly optimized and the Crimson drivers just make it even worse. I just noticed that the fluctuations in Lords of the Fallen don't really affect the fps most of the time.


----------



## shadowking1711



Hey Guys I would like to join ^_^

Here is my build I have a sapphire r9 390x tri-x





----------



## ziggystardust

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I had some issues with GTA V earlier and it turned out to be Fraps messing with it, disabled it and everything was smooth
> 
> other option is to go into MSI Afterburner's settings and with the "unofficial overclocking mode" drop down change it to "without PowerPlay support", that locks the card to 3D clocks all the time so heat will be up but you won't have any clock speed fluctuations anymore


Quote:


> Originally Posted by *OneB1t*
> 
> maybe the 100ms polling period in Afterburner can also cause that (or in GPU-Z)


I found something on the net for Killing Floor 2 that involves changing a texture streaming setting in its .ini file. It looks like the game uses a shabby texture streaming system that's mostly optimized for low VRAM usage, but it turned out people with high-end cards were getting terrible stutters. I disabled that in the ini file, went back to Crimson 16.1, and now the clock speed is not fluctuating like before. No hiccups, etc. Still some very slight fluctuations, though, but very rarely. So it's probably that the texture streaming design is bottlenecking the VRAM, and the GPU thinks there is low load for a moment and downclocks itself. It seems one part of the culprit is those lazy developers. Still, the biggest part is AMD's amazing drivers.
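For reference, the usual UE3-style ini tweak looks something like the fragment below. This is a hedged sketch only: the exact file and key names vary between UE3 games and patches (KF2's configs typically live under Documents\My Games\KillingFloor2\KFGame\Config), so verify against your own ini and back the file up before editing.

```ini
; Illustrative only -- confirm these keys exist in your own config first.
[TextureStreaming]
PoolSize=2048  ; enlarge the texture streaming pool (in MB)
; some guides instead disable streaming outright via a
; bUseTextureStreaming=False style flag under [SystemSettings]
```

Raising the pool size keeps more textures resident so the streamer has less work to do each frame, which matches the VRAM-usage jump reported after the tweak.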


----------



## Mysticking32

Hey guys. Anyone experiencing some weird texture flickering? I don't have any pics right now unfortunately but I will try to post some later.

This is with the msi r9 390x

I was on 16.1 doing fine, and then one day I launched AC Unity, and whenever I go to a place with shadows (for example), they continuously flicker when I look up and down. I can physically see the changes taking place. I'm 90 percent sure this isn't an artifact since it was only the shadows. So I turned off Nvidia soft shadows and put it on regular High. That stopped the issue. The thing is, I put over 100 hours into the game before this even started happening, which is another reason why I'm confused. I still get minor shadow flickering from a distance.

In Battlefield 4, flames and smoke do the flickering. So I tried turning down the settings there too, but no cigar. There are also some blurry textures, particularly in the distance. Like the side of a building will be blurry. Just a random building. Or a TV will have some weird lines until I get close to it.

These issues didn't start until I installed MSI Afterburner and overclocked to 1100/1525 (not that I think that's the cause). It could be that I was just missing these issues before, but I highly doubt it. Voltage was at +20 as I was planning on doing some testing later, but I stopped when I saw these issues. That's nothing dangerous at all, and I've overclocked much higher than this before, so I doubt it's that.

What I've tried so far: a clean install of the latest drivers; uninstalling the drivers normally, then doing it again with DDU, then a fresh install; of course cancelling the overclock; and lowering in-game settings (and as I said, in AC Unity that helped a lot, but after 100 hours it shouldn't have started doing this at all).

Any suggestions at all would be wonderful.

Could it be the drivers??

I'm about to flip the bios switch on the card and test that.


----------



## Chaoz

Quote:


> Originally Posted by *Mysticking32*
> 
> Hey guys. Anyone experiencing some weird texture flickering? I don't have any pics right now unfortunately but I will try to post some later.
> 
> This is with the msi r9 390x
> 
> I was on 16.1 doing fine and then one day I launched AC Unity and whenever I go to a place with like shadows (for example) They continuously flicker when I look up and down. I can physically see the changes taking place. I'm 90 percent sure this isn't an artifact since it was only with shadows. So I turned off nvidia soft shadows and put it on regular high. This stopped that issue. The thing is I put over 100 hours into the game before that even started happening. Which is another reason why I'm confused. I still get minor shadow flickering from a distance.
> 
> On Battlefield 4 flames and smoke do the flickering. So I tried turning down the settings there but no cigar. There are also some blurry textures particularly in the distance. Like a side of a building will be blurry. Just a random building. Or a tv will have some weird lines until I get close to it.
> 
> These issues didn't start until I installed msi afterburner and overclocked to 1100/1525 (I don't think) It could have been I was just missing these issues but I highly doubt it. Voltage was at 20 as I was planning on doing some testing later but stopped when I saw these issues. This is nothing dangerous at all and I've overclocked much higher than this before so I doubt it's that.
> 
> What I've tried so far is a clean install of the latest drivers. Uninstalling the drivers regularly, then doing it again with ddu. Then doing a fresh install. Of course cancelling the overclock. Lowering in game settings. (and as i said in ac unity it helped a lot, but after 100 hours it shouldn't have started doing this at all).
> 
> Any suggestions at all would be wonderful.
> 
> Could it be the drivers??
> 
> I'm about to flip the bios switch on the card and test that.


That's why I'm still on a previous CCC version (15.15).

A friend of mine has the same card and he still uses CCC version 15.7.1. without any issues.


----------



## Mysticking32

Quote:


> Originally Posted by *Chaoz*
> 
> That's why I'm still on a previous CCC version (15.15).
> 
> A friend of mine has the same card and he still uses CCC version 15.7.1. without any issues.


Wait, you're having these issues as well? Okay, I will revert to a previous driver. The thing is, I've been on these drivers for a while now, since release actually, and this problem just started.

But I will definitely try reverting. I'm open to all suggestions right now lol


----------



## OneB1t

Quote:


> Originally Posted by *ziggystardust*
> 
> I found something on the net for Killing Floor 2, that involves changing something about texture streaming in its .ini file. Looks like the game is using a shabby texture streaming which is mostly optimized for less vram usage but it turned up people with high-end cards have terrible stutters. I disabled that in ini file, went back to Crimson 16.1 and now the clock speed is not fluctuating like before. No hiccups etc.. Still some very slight fluctuates though but very rarely. So it's probably that texture streaming design is bottlenecking the vram and the gpu thinks there is low load for a moment and downclocks itself. It seems one part of the culprit is those lazy developers. Still, the biggest part is AMD's amazing drivers.


do you have SSD?


----------



## ziggystardust

Quote:


> Originally Posted by *OneB1t*
> 
> do you have SSD?


Yeah, but KF2 is installed on an HDD. So you think it's caching the textures to the page file instead of RAM or that huge 8GB of VRAM? I guess it's possible. After disabling that texture streaming, VRAM usage jumps from 1.7-1.8 GB to 3-3.5 GB. The hiccups in Lords of the Fallen were probably caused by a similar thing.


----------



## Geoclock

My MSI 390X still can't play Wolfenstein: The Old Blood without problems: picture corruption and lines on screen all the time, even with the new Catalyst.


----------



## Chaoz

Quote:


> Originally Posted by *Mysticking32*
> 
> Wait you're having these issues as well? And okay I will revert back to a previous driver. The thing is I've been on these drivers for a while now. Since release actually and this problem just started.
> 
> But I will definitely try reverting. I'm open to all suggestions right now lol


Yeah, I had a few problems with my drivers as well. When I first bought my R9 390 DC3OC I installed the latest at the time, but they were a bit dodgy, so I uninstalled and installed 15.15 (which was the most stable driver released, without installing the Gaming Evolved app, of course).
Haven't had any issues since. I can play Black Ops 3 maxed out for hours on end without any issues.

Also make sure you don't install the Gaming Evolved app. If you untick that option before you install, it will decrease the chance of problems.


----------



## Mysticking32

Quote:


> Originally Posted by *Chaoz*
> 
> Yeah had a few problems with my drivers aswell. When I first bought my R9 390 DC3OC I installed the lastest at the time, but they were a bit dodgy, so i uninstalled and installed 15.15. (Which was the most stable driver that was released. Without installing Gaming Evold app ofc.)
> Haven't had any issues since. Can play Black Ops 3 maxed out for hours on end, without any issues.
> 
> Also make sure you don't install the gaming evolved app. If you untick that option before you install, it will decrease the chance of problems.


Thank you for the suggestion. Unfortunately it didn't work. I rolled back through three drivers and tested AC Unity with all of them. I did the Crimson from December.

Then I did the Catalysts from the months before, all the way back to 15.11.

This might be something hardware related. God, I would hate to RMA.


----------



## OneB1t

did you try removing the driver with DDU? (and running the game without any OSD?)


----------



## Mysticking32

Quote:


> Originally Posted by *OneB1t*
> 
> did you tryed to remove driver with DDU? (and run game without any OSD?)


Yes to the DDU. And wait, that's a good idea. I did have the Steam overlay enabled all this time. I'll try that and get back to you.

Update: Blah, no luck. I'm going to try a clean Windows install. Maybe some Afterburner settings stuck around or something? At least that's what I'm hoping.

I just don't get how an overclock to 1100/1525 could cause this. Those are literally the advertised MSI settings. Ughhh


----------



## Dundundata

@Mysticking32, I get some weird issues like that in Witcher 3 sometimes; they seem like lighting issues. I figure it's just the game/driver and ignore it, since it's the only game that does this. AC Unity is supposed to be a poorly made game, is it not? It had many issues when it came out.


----------



## Spartoi

Quote:


> Originally Posted by *m70b1jr*
> 
> Yes, I have an Arctic accelero hybrid III 140. I have it push air out of my case.


So I bought this AIO water cooler for my Sapphire R9 390X: the Arctic Accelero Hybrid III-140, 290X variant. Installing it was a pain, but once I got it on, my temps fell drastically. I idle in the mid-to-high 20s, and after an hour of FurMark, temps never exceeded 55C.

Now my problem is that the core temps are good but the VRM temps aren't, since the VRMs aren't watercooled but air cooled. They are higher than they were with the stock cooler. I'm guessing this is because the 290X VRM heatsink didn't perfectly fit my 390X, so I placed it as best as I could on my card. The highest temp I've seen is 97C, which is pretty hot. I've been using stock voltage because I don't want the VRMs to get any hotter, but this inability to raise voltage is killing my OC headroom. So I was wondering if you or anyone else had any advice on lowering the VRM temps? I tried putting another fan under my GPU to help cool the VRMs, but it only decreased the temps by 2C. I wonder if I should take off the 80mm fan on the VRM heatsink and just use the 120mm fan on the bottom of my case to cool the heatsink? Thoughts?


----------



## Stige

Free tip: Don't buy AIO crap. Money wasted every single time, be it CPU or GPU.

You would have been better off with the normal Arctic Accelero cooler.


----------



## Dagamus NM

Quote:


> Originally Posted by *Spartoi*
> 
> So I bought this AIO water cooler for my Sapphire R9 390x. It's the Arctic Accelero Hybrid III 140 290x variant. Installing it was a pain but once I got it on my temps fell drastically. I idle at mid-high 20s and after an hour Furmark, temps never exceeded 55C.
> 
> Now my problem is the core temps are good but the VRMs aren't since they aren't watercooled but air cooled. They are higher than they were with the stock cooler. I'm guessing this is because the 290x VRM heatsink didn't perfectly fit my 390x so I placed it as best as I could on my card. The highest temps I've seen is 97C which is pretty hot. I've been using stock voltage because I don't want the VRMs to get any hotter but this inability to raise voltage is killing my OC ability. So, I was wondering if you or anyone had any advice on lowering the VRM temps? I tried putting another fan under my GPU to help cool the VRMs but it only decreased the temps by 2C. I wonder if I should take off the 80mm fan on the VRM heatsink and just use the 120mm fan on the bottom of my case to cool the heatsink? Thoughts?


Did you put thermal paste on both sides of your thermal pads? It sounds like you don't have proper pressure on your pads.


----------



## Spartoi

Quote:


> Originally Posted by *Stige*
> 
> Free tip: Don't buy AIO crap. Money wasted every single time, be it CPU or GPU.
> 
> You would have been better off with the normal Arctic Accelero cooler.


Why? The core temps are WAY better than they were with the stock cooler. The VRM temps aren't as good, but I think that's because I'm using a cooler that isn't technically supposed to fit my 390X.
Quote:


> Originally Posted by *Dagamus NM*
> 
> Did you put thermal paste on both sides of your thermal pads? It sounds like you don't have proper pressure on your pads.


You're supposed to put thermal paste on the thermal pads?









On the stock cooler there was no thermal paste on the thermal pads, and the manual doesn't mention doing that, so I never thought to put any on them.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> Free tip: Don't buy AIO crap. Money wasted every single time, be it CPU or GPU.
> 
> You would have been better off with the normal Arctic Accelero cooler.


Not true at all. Post some proof of this and I'll post mine that rebuts it.








Quote:


> Originally Posted by *Spartoi*
> 
> So I bought this AIO water cooler for my Sapphire R9 390x. It's the Arctic Accelero Hybrid III 140 290x variant. Installing it was a pain but once I got it on my temps fell drastically. I idle at mid-high 20s and after an hour Furmark, temps never exceeded 55C.
> 
> Now my problem is the core temps are good but the VRMs aren't since they aren't watercooled but air cooled. They are higher than they were with the stock cooler. I'm guessing this is because the 290x VRM heatsink didn't perfectly fit my 390x so I placed it as best as I could on my card. The highest temps I've seen is 97C which is pretty hot. I've been using stock voltage because I don't want the VRMs to get any hotter but this inability to raise voltage is killing my OC ability. So, I was wondering if you or anyone had any advice on lowering the VRM temps? I tried putting another fan under my GPU to help cool the VRMs but it only decreased the temps by 2C. I wonder if I should take off the 80mm fan on the VRM heatsink and just use the 120mm fan on the bottom of my case to cool the heatsink? Thoughts?


You need to make some good heatsinks for your VRMs. It's not hard to do. Nothing on the market is going to do an adequate job of cooling the VRMs. Get some heatsink stock from Amazon and make your own heatsinks. Then use Fujipoly Extreme thermal pads and your temps will be just fine. Mine don't ever go higher than 55-58C anymore, even while pushing 1200MHz-plus on the core. It is possible, you just have to do some research and make a set that works for your setup.


----------



## Mysticking32

Quote:


> Originally Posted by *Dundundata*
> 
> @MySticking, I get some weird issues like that in Witcher 3 sometimes, seems like lighting issues. I figure it's just the game/driver and ignore it since it's the only game that does this. AC:Unity is supposed to be a poorly made game is it not, had many issues when it came out?


I hear you. This shadow issue occurs only in AC Unity, for the most part.

But it also occurs in BF4 with fires and smoke flickering. I mean it's not that big of a deal. It's just hell of annoying lol.

And unity is for sure a poorly made game lol.


----------



## yoyo711

I have an R9 290 and I'd like to upgrade my video card.
390X or 980 Ti?

Let me know.

Thanks


----------



## Zyphur

Validation: http://www.techpowerup.com/gpuz/details.php?id=88e2e | http://valid.x86.fr/grf54w
Testing out a bit of an OC on cpu and gpu today


----------



## ziggystardust

Quote:


> Originally Posted by *yoyo711*
> 
> I have R9 290. I like to upgrade video card.
> 390x or 980 ti.
> 
> Let me know.
> 
> Thanks


I would wait for the new GPUs. Polaris seems promising, and the R9 290 is still a beast.


----------



## yoyo711

Quote:


> Originally Posted by *ziggystardust*
> 
> I would wait for the new gpus. Polaris seems promising and R9 290 is still a beast.


Thanks +REP

How good will Polaris be? 390X +10%? And when are they going to release Polaris?

Thanks


----------



## tolis626

Continuing my "research" into why my card started misbehaving, it seems it has something to do with thermals. If it goes over 75C, I will get some artifacting if I'm running close to max overclocks on the core and memory stability goes down the drain. Like really, with a max of 73C, 1150/1725 works fine. Thing is, all my case fans ramp up and down depending on CPU temps, so in benches like Valley, where the CPU is mostly idle, my case fans don't spin up and my card kind of suffocates (not really, but temps do go up compared to normal gaming) and the same overclock will crash after a while, unless I manually set the fans to higher speeds. It also might have something to do with power delivery because, even at lower temps, mem overclocks suffer as soon as I go higher than +50mV core voltage in AB. Or it may be different temps on different parts of the die, with the IMC getting hotter and thus destabilizing? Maybe... In that case a TIM replacement is a must.

And about that... Those of you with the MSI cards that have taken the cooler off, could you please tell me if the heatsink's base is bare copper or nickel plated? If it's nickel plated, we could be using liquid metal TIMs, like CLU, for the best results. But if that stuff touches copper it gets solid after a while, which is not good.

PS: Now that I think of it, liquid metal TIMs are electrically conductive, right? If so, wouldn't it be catastrophic for the card if some of the TIM trickled down away from the die and onto the PCB? Or am I just too worried about stuff I shouldn't be?
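Since the case fans here track CPU temperature, one workaround is a more aggressive GPU-side fan curve. As a rough sketch of the temp-to-fan mapping a curve editor like Afterburner's applies (the curve points below are invented for illustration, not recommendations):

```python
# Illustrative custom fan curve: (temp C, fan %) points with linear
# interpolation between them, ramping hard before the 75C trouble zone.
CURVE = [(40, 25), (60, 45), (75, 80), (80, 100)]

def fan_speed(temp_c):
    """Return the fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # below the curve: minimum duty
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]         # above the curve: full speed
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:      # interpolate within this segment
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(67.5))  # 62.5
print(fan_speed(80))    # 100
```

A curve like this keeps the card's cooling independent of whatever the CPU-driven case fans are doing, at the cost of noise.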


----------



## Stige

Quote:


> Originally Posted by *Spartoi*
> 
> Why? The core temps are WAY better than there with the stock cooler. The VRM temps aren't as good but I think that's because I'm using a cooler that isn't suppose to technically fit my 390x.
> You're suppose to put thermal paste on the thermal pads?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On the stock cooler, there was no thermal paste on the thermal pads and the manual doesn't mention doing that so I never thought to put any on them.


If you hit over 50C with watercooling then the cooler is bad. And it doesn't matter if your core is lower if your VRM temps are almost 100C. The core on my card barely breaks 40C under water.


----------



## ziggystardust

Quote:


> Originally Posted by *yoyo711*
> 
> Thanks +REP
> 
> How good Polaris 390X +10% ?. And When they are going to release Polaris ?
> 
> Thanks


No one knows yet, but it will probably be more than that. Polaris is a whole new GCN iteration: a die shrink from 28nm to 14nm, more transistors, etc. According to AMD's press materials, the new GPUs will offer 2x performance per watt. I doubt there will be a direct 2x performance difference in enthusiast-level cards, though, but in the worst-case scenario they will be as efficient as the Fury Nano with better performance.

If nothing goes wrong, we'll see the first Polaris GPUs between summer and the back-to-school period. There will probably be an E3 showing.


----------



## ziggystardust

Quote:


> Originally Posted by *Mysticking32*
> 
> I hear you. This shadow issue does occur in AC Unity only for the most part.
> 
> But it also occurs in BF4 with fires and smoke flickering. I mean it's not that big of a deal. It's just hell of annoying lol.
> 
> And unity is for sure a poorly made game lol.


Have you tried an older catalyst version, like 15.7.1?

Crimson drivers are just broken at best, as are some of the latest Catalyst drivers, like 15.11. They are messing up some of the effects in certain games. They try to fix one thing but break another. Killing Floor 2's real-time reflections, for example: they've been broken and pixelated all over the textures for quite some time. I rolled back to 15.7.1 today and now it looks the way it's supposed to. It even took several drivers to fix that friggin' broken compass in Fallout 4.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> If you hit over 50C with watercooling then the cooler is bad. And it doesn't matter if your core is lower if your VRM temps are almost 100C. My core on my card barely break 40C under water.


I break 60C with a custom loop. It's not that hard to hit high temps with these cards.


----------



## TheCowTamer

Hey guys, I purchased an XFX 390X about a month ago. I'm wondering if it's a mislabeled 390. I have GPU-Z and it shows that it has the 2816 shaders, and all the stickers on it say it's a 390X, but when I go to 3DMark it lists it as "390 series". Also, when I overclock it, my 3DMark score goes down. I'm not thermal throttling, because my temps never go above 67C. Is there any other way I can check and make sure it's a 390X and not a 390? I was also planning on putting a full-cover block on it, but from what it looks like, there are no full-cover blocks for the XFX ones. Thanks guys.


----------



## m70b1jr

Quote:


> Originally Posted by *TheCowTamer*
> 
> Hey guys I purchased a xfx 390x about a month ago. I am wondering if its a mislabeled 390. I have gpu-z and it shows that it has the 2816 shaders and all of the stickers on it say it's a 390x but when go to 3dmark is lists it as a 390 series. Also when I overclock it my 3dmark score goes down, I'm not thermal throttling because my temps never go above 67C. Any other way I can check and make sure it's a 390x and not a 390. I was also planning on putting a full cover block but from what it looks like there is no full cover blocks for the xfx ones. Thanks guys.


The XFX is basically the same PCB as a reference 390 / 290.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheCowTamer*
> 
> Hey guys I purchased a xfx 390x about a month ago. I am wondering if its a mislabeled 390. I have gpu-z and it shows that it has the 2816 shaders and all of the stickers on it say it's a 390x but when go to 3dmark is lists it as a 390 series. Also when I overclock it my 3dmark score goes down, I'm not thermal throttling because my temps never go above 67C. Any other way I can check and make sure it's a 390x and not a 390. I was also planning on putting a full cover block but from what it looks like there is no full cover blocks for the xfx ones. Thanks guys.


2816 SP is 390X.


----------



## TheCowTamer

Quote:


> Originally Posted by *m70b1jr*
> 
> The XFX is basically the same PCB as a reference 390 / 290.


I thought I read earlier in this thread that there were no full-cover blocks for the XFX 390X? I guess I will have to go back and check again.


----------



## Mysticking32

Quote:


> Originally Posted by *ziggystardust*
> 
> Have you tried an older catalyst version, like 15.7.1?
> 
> Crimson drivers are just broken at best as well as some latest Catalyst drivers, like 15.11. They are just messing up with some of effects in certain games. They are trying to fix something but break another. Killing Floor 2's real time reflections for example. It's been broken and pixelating all over the textures for quite some time. I rolled back 15.7.1 today and now it's looks it's supposed to be. It even took several drivers to fix that friggin broken compass in Fallout 4.


That's true. The compass in GTA 5 was messed up for a long time. I'll roll back to 15.7.1 and try that. The thing is, these issues just started. I'll get back to you when I revert.


----------



## m70b1jr

Quote:


> Originally Posted by *Mysticking32*
> 
> That's true. the compass in gta 5 was messed up for a long time. I'll roll back to 15.7.1 and try that. The thing is these issues just started. I'll get back to you when I revert


The compass was also broken in Fallout 4, but they fixed it.


----------



## tolis626

Quote:


> Originally Posted by *tolis626*
> 
> Continuing my "research" into why my card started misbehaving, it seems it has something to do with thermals. If it goes over 75C, I will get some artifacting if I'm running close to max overclocks on the core and memory stability goes down the drain. Like really, with a max of 73C, 1150/1725 works fine. Thing is, all my case fans ramp up and down depending on CPU temps, so in benches like Valley, where the CPU is mostly idle, my case fans don't spin up and my card kind of suffocates (not really, but temps do go up compared to normal gaming) and the same overclock will crash after a while, unless I manually set the fans to higher speeds. It also might have something to do with power delivery because, even at lower temps, mem overclocks suffer as soon as I go higher than +50mV core voltage in AB. Or it may be different temps on different parts of the die, with the IMC getting hotter and thus destabilizing? Maybe... In that case a TIM replacement is a must.
> 
> And about that... Those of you with the MSI cards that have taken the cooler off, could you please tell me if the heatsink's base is bare copper or nickel plated? If it's nickel plated, we could be using liquid metal TIMs, like CLU, for the best results. But if that stuff touches copper it gets solid after a while, which is not good.
> 
> PS : Now that I think of it, liquid metal TIMs are conductive, right? If so, wouldn't it be catastrophic for the card if some of the TIM trickled down away from the die and onto the PCB? Or am I just too worried about stuff I shouldn't be?


Kinda ashamed, but bumping this one because...

I just tested. Stability is compromised above 75C, at least on my system. I just put it at 1175/1700MHz with +90/+50mV, locked my GPU fan speed to 100% and had my case fans also at 100%, blasting cold air on the damn thing. Result? Temps were under 70C most of the time while playing BF4 and it was stable. Then I took the fans back to auto (both GPU and case fans) and, sure enough, a few minutes (and I mean a few, like less than 5) later it crashed without my having changed anything else. Could anyone else please check if your card has similar behavior?


----------



## ZealotKi11er

Quote:


> Originally Posted by *tolis626*
> 
> Kinda ashamed, but bumping this one because...
> 
> I just tested. Stability is compromised above 75C, at least on my system. I just put it at 1175/1700MHz with +90/+50mV and locked my GPU fan speed to 100% and had my case fans also at 100%, blasting cold air on the damn thing. Result? Temps were under 70C most of the time while playing BF4 and it was stable. Then I take the fans back to auto (both GPU and case fans) and, surely enough, a few minutes (and I mean a few, like less than 5) later it crashed without having changed anything else. Could anyone else please check if your card has similar behavior?


Yes. Not 75C but 60C+ depending on OC. This is under water.


----------



## Mysticking32

Just an update: I figured out what the problem was with AC Unity, and I feel like an idiot. It turns out the bug has something to do with the soft shadows setting and FXAA. Setting AA to MSAA x2 fixes the problem, just as setting soft shadows to high did.

All the driver installs, Windows reinstall, etc., lol. And it was that.

I'm still not sure how I went the entire game without this problem, but whatever lol. Still working on the BF4 flickers.

Thanks to the people who replied and helped.


----------



## TsukikoChan

Quote:


> Originally Posted by *Mysticking32*
> 
> Just an update I figured out what the problem was with AC Unity. And I feel like an idiot. It turns out the bug has something to do with the soft shadows settings and fxaa. Setting it to msaa x2 fixes the problem. Just as setting the soft shadows to high did.
> 
> All the driver installs, windows reinstall, etc lol. and it was that.
> 
> I'm still not sure how I went the entire game without this problem but whatever lol. Still working on the bf4 flickers.
> 
> Thanks to the people who replied and helped


This is for the texture missing bug people are mentioning? I just wanna pipe in that the same thing has been happening in The Evil Within as well. I know it's an unoptimised game, and for the most part it is very pretty, but there have been a couple of slow texture loads, and Kidman's face never got a full texture (always a low-LOD version). I'm curious as to why it's doing this too. Hmm..


----------



## Stige

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I break 60C with custom loop. Not that hard to hit high temps with these cards.


I still think there has to be something wrong if your temps are that high.

My 390 is at +163mV right now at 1225 on the core and barely breaks 40C there.

The VRM does reach 70-75C but, like I said earlier, VRM temps are pretty much always the ones you have to worry about, never your core temps.

But I would look at fixing your loop if you hit 60C on the core.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> I still think there has to be something wrong if your temps are that high.
> 
> My 390 is at +163mV right now at 1225 on Core and barely breaks that 40C there.
> 
> VRM does reach 70-75C but like I said earlier, VRM is pretty much always the temps you have to worry about, never your core temps.
> 
> But I would look at fixing your loop if you hit 60C on the Core.


It's very normal. When you have a 3770K and 2 x 290s on a 360 + 240 running push/pull with fans @ 7V, you get those temps. That's ~700W of heat getting dumped into the rads, and if you look at any rad data you will see that load needs more rad space and faster fans. Even with a single GPU I have always been ~48C. I have checked water temps before load and after, and the difference was ~18C, which is normal and why the core hits 60C.
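
For what it's worth, that claim survives a back-of-envelope check. A minimal sketch, assuming the common rule of thumb that each 120mm section of radiator sheds very roughly 10 W per degree C of water-over-ambient at low fan speeds (the coefficient is an assumption and varies a lot with fin density and airflow):

```python
# Back-of-envelope estimate of loop water temperature over ambient.
# Assumed rule of thumb: each 120mm of radiator sheds roughly
# 10 W per degree C of water-to-air delta at low fan speeds;
# real figures vary with fin density and airflow.

W_PER_120MM_PER_C = 10.0

def water_delta_c(heat_load_w, rad_120mm_units, coeff=W_PER_120MM_PER_C):
    """Steady-state water temperature above ambient, in C."""
    return heat_load_w / (rad_120mm_units * coeff)

# A 360 + 240 rad is 5 x 120mm sections; ~700 W from a CPU plus two OC'd 290s.
delta = water_delta_c(700, 5)
print(f"water runs ~{delta:.0f}C over ambient")  # prints: water runs ~14C over ambient
```

With ~14C of predicted water rise over ambient, plus the block's own water-to-die delta, a 60C core at low fan speeds is entirely plausible; more rad area or airflow raises the effective coefficient and shrinks the rise.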


----------



## Dundundata

Quote:


> Originally Posted by *ziggystardust*
> 
> Have you tried an older catalyst version, like 15.7.1?
> 
> It even took several drivers to fix that friggin broken compass in Fallout 4.


Ha, I was like, what is wrong with my game! Nexusmods had this fix right away though, I think, which made me chuckle when I saw it in the updated driver log.


----------



## Stige

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Its very normal. When you have 3770K and 2 x 290s with 360 + 240 Running Push/Pull with fans @ 7v you get those temps. Also that's ~ 700W heat getting dumped on the RAD. If you look any RAD data you will see that you need more RAD space and faster fans. Even with single GPU I have always been ~ 48C. I have checked Water Temps before load and after and the difference was ~ 18C which is normal and why the core hits 60C.


Well, my 390 + 3570K with a 360 should be pretty close to yours. Either your pump is not pushing enough water or your fans are just not doing anything at such low voltage.
I just checked: 47C max on the core, 68C on VRM1 and 48C on VRM2 while playing BnS for the past 4-5 hours.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> Well my 390 + 3570K with a 360 should be pretty close to yours. Or your pump is not pushing enough water or your fans are just not doing anything at such low voltage.
> I just checked, 47C max on Core, 68C on VRM1 and 48C on VRM2 while I have played BnS for past 4-5 hours.


What is your ambient temp? Also, 47C is about what I got too with a single GPU. If I ramp up my fans @ 12V my temps drop to ~54C for the hottest card. This is with the case closed; if I open the case the temps drop even more.

Edit: Just tested Witcher 3 (not very CPU heavy) for 15 min with the side panel off. 48C GPU1, 51C GPU2, ambient 20-22C. If I play 2 hours the room probably rises to 25-26C and I hit 55C. I was hitting 60C on GPU2 because I had the side panel on, the heater was on (aka 24C ambient before gaming) and I was playing BF4, which pushes the CPU much harder.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Well my 390 + 3570K with a 360 should be pretty close to yours. Or your pump is not pushing enough water or your fans are just not doing anything at such low voltage.
> I just checked, 47C max on Core, 68C on VRM1 and 48C on VRM2 while I have played BnS for past 4-5 hours.


Hot!

I only get those temps at over 1.45 under load. 290X + 5930, both OC'ed to the edge.

2x 360 with 1850 fans in pull now. But then, my ambient is sub-20C.


----------



## Schwulibertz

Wallpaper anyone?
Got bored and made these, mega link below, enjoy.

ASUS


MSI


SAPPHIRE


https://mega.nz/#F!M5500L4Z!VqCJ9Cbg79UGkML7WPfNyg


----------



## battleaxe

Quote:


> Originally Posted by *Spartoi*
> 
> So I bought this AIO water cooler for my Sapphire R9 390x. It's the Arctic Accelero Hybrid III 140 290x variant. Installing it was a pain but once I got it on my temps fell drastically. I idle at mid-high 20s and after an hour Furmark, temps never exceeded 55C.
> 
> Now my problem is the core temps are good but the VRMs aren't since they aren't watercooled but air cooled. They are higher than they were with the stock cooler. I'm guessing this is because the 290x VRM heatsink didn't perfectly fit my 390x so I placed it as best as I could on my card. The highest temps I've seen is 97C which is pretty hot. I've been using stock voltage because I don't want the VRMs to get any hotter but this inability to raise voltage is killing my OC ability. So, I was wondering if you or anyone had any advice on lowering the VRM temps? I tried putting another fan under my GPU to help cool the VRMs but it only decreased the temps by 2C. I wonder if I should take off the 80mm fan on the VRM heatsink and just use the 120mm fan on the bottom of my case to cool the heatsink? Thoughts?


Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is your Ambient Temp? Also 47C is about what I got too with Single GPU. If I ramp up my fas @ 12v My temps drop ~ 54C for the hottest card. This is with case close. If I open the case the temps drop even more.
> 
> Edit: Just tested Witcher 3 (Not very CPU Heavy) 15 min with Side Panel off. 48C GPU1, 51C GPU2. Ambient 20-22C. If I play 2 hours probably increase room to 25-26C. And hit 55C. I was hitting 60C with GPU2 because I had side panel on, heater was ON aka 24C ambient before gaming and was playing BF4 which pushed CPU much harder.


You just need more RAD space. That's all.


----------



## kizwan

Ambient 30-32C, 360 + 240 + 120 rads, with GTA V @ 1000/1300 my cards can go up into the 60s Celsius. I don't always monitor the VRMs, but both are below the core anyway. Last time I checked, VRM 1 was in the 50s.


----------



## Charcharo

Quote:


> Originally Posted by *Schwulibertz*
> 
> Wallpaper anyone?
> Got bored and made these, mega link below, enjoy.
> 
> ASUS
> 
> 
> MSI
> 
> 
> SAPPHIRE
> 
> 
> https://mega.nz/#F!M5500L4Z!VqCJ9Cbg79UGkML7WPfNyg


Can you make a PowerColor one? The PCS+ that is.


----------



## Chaoz

Quote:


> Originally Posted by *Spartoi*
> 
> So I bought this AIO water cooler for my Sapphire R9 390x. It's the Arctic Accelero Hybrid III 140 290x variant. Installing it was a pain but once I got it on my temps fell drastically. I idle at mid-high 20s and after an hour Furmark, temps never exceeded 55C.
> 
> Now my problem is the core temps are good but the VRMs aren't since they aren't watercooled but air cooled. They are higher than they were with the stock cooler. I'm guessing this is because the 290x VRM heatsink didn't perfectly fit my 390x so I placed it as best as I could on my card. The highest temps I've seen is 97C which is pretty hot. I've been using stock voltage because I don't want the VRMs to get any hotter but this inability to raise voltage is killing my OC ability. So, I was wondering if you or anyone had any advice on lowering the VRM temps? I tried putting another fan under my GPU to help cool the VRMs but it only decreased the temps by 2C. I wonder if I should take off the 80mm fan on the VRM heatsink and just use the 120mm fan on the bottom of my case to cool the heatsink? Thoughts?


I just noticed that your card is sagging hard. Looks like it's a bit too heavy for the PCIe slot. Isn't there a bracket which supports the card so it doesn't bend?


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Hot!
> 
> I only get those temps at over 1.45 under load. 290X + 5930. Both OC'ed to the edge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2x 360 with 1850 fans in Pull now. But then, my ambient is sub 20C.


My cpu is at 1.52V and GPU is at +175mV right now


----------



## tolis626

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is your Ambient Temp? Also 47C is about what I got too with Single GPU. If I ramp up my fas @ 12v My temps drop ~ 54C for the hottest card. This is with case close. If I open the case the temps drop even more.
> 
> Edit: Just tested Witcher 3 (Not very CPU Heavy) 15 min with Side Panel off. 48C GPU1, 51C GPU2. Ambient 20-22C. If I play 2 hours probably increase room to 25-26C. And hit 55C. I was hitting 60C with GPU2 because I had side panel on, heater was ON aka 24C ambient before gaming and was playing BF4 which pushed CPU much harder.


Your temps are not bad at all, but I do have to agree with the guys saying that they should be better. Considering most AIO equipped cards don't break 55C easily, even when heavily overclocked, yours should be cooler than that with all that rad space. Are you sure your pump is functioning properly? Is your flow adequate? I don't think it's the rads at all, rather it looks like flow restriction. But that's just me. If your pump is something like the D5 or DDC, I'd turn it up a notch, see what happens. Most D5 pumps are happy (quiet while providing adequate flow) at 3/5 setting, assuming it's the variant with the speed knob.

PS : Since it seems that the MSI has a nickel plated contact plate, I would suppose liquid metal TIMs are ok. Anyone using CLU or similar on their GPU?


----------



## ZealotKi11er

Quote:


> Originally Posted by *tolis626*
> 
> Your temps are not bad at all, but I do have to agree with the guys saying that they should be better. Considering most AIO equipped cards don't break 55C easily, even when heavily overclocked, yours should be cooler than that with all that rad space. Are you sure your pump is functioning properly? Is your flow adequate? I don't think it's the rads at all, rather it looks like flow restriction. But that's just me. If your pump is something like the D5 or DDC, I'd turn it up a notch, see what happens. Most D5 pumps are happy (quiet while providing adequate flow) at 3/5 setting, assuming it's the variant with the speed knob.
> 
> PS : Since it seems that the MSI has a nickel plated contact plate, I would suppose liquid metal TIMs are ok. Anyone using CLU or similar on their GPU?


The difference is a 120mm AIO only cools the GPU. Also, even though I have 360 + 240 rads, the 240 is pushing air out of the bottom of the case, which is very restrictive, and some of that air comes back in as intake from the front. I tried every possible combination of fan setups and this was the best for temps. The pump is fine. The temps don't go up right away; they increase slowly as the water gets hotter. Again, 2 GPUs are a very different thing to cool than 1 GPU. Also, I am sure people with AIOs are not running their fans at 1000 RPM.


----------



## Stige

What pump is it? The water might be moving, but if it moves too slowly it won't flow through the radiator the right way, causing much higher temps than there should be.

I got the same Phobya 360 Rad as you do myself with Phobya DC12-400 pump.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stige*
> 
> What pump is it?


A D5. I have 3 water-cooled systems with D5 pumps right now, and the GPUs are all above 45C. Got a Phenom II @ 4.2GHz + 7970 GHz with a 360 rad in push; the GPU hits 52C.


----------



## Mysticking32

Quote:


> Originally Posted by *TsukikoChan*
> 
> This is for the texture missing bug people are mentioning? I just wanna pipe in the same thing has been happening in Evil Within as well.. I know it's an unoptimised game but for the most part it is very pretty but there's been a couple of slow texture loads and Kidman's face never got a full texture (always a low-LOD texture version). I'm curious as to why this is doing this too. hmm..


Yeah, it seems to be random games for most people, mostly triple-A titles. I'm not sure what's causing it. But it's interesting you say The Evil Within; I haven't had any problems with that game. I'll try installing it now and running it.


----------



## jodybdesigns

I got a pretty good question. To start off, I want to REALLY water cool my GPU(s) to finish my complete loop. After a year of my OCD seeing this tube run across the center of my video cards (I can't believe I have lasted this long), it's time to finish the loop. But after doing some pricing for 2 blocks, a couple rads, and all the fittings, it is going to run me about $325.

I am gaming (or trying to, grrr) at 1440p, and I would be cooling 2 7950's with 2 EK Thermospheres. Or..

I can get one of the Powercolor PCS+ 390s for $289 ($269 after rebate, but I never get them), sell off the 2 7950's for about $100 each, and buy a block, fittings, and rad for the 390. I would cut power consumption, but is the upgrade from Crossfire 7950's to an R9 390 worth justifying with those upgrades?


----------



## ZealotKi11er

Quote:


> Originally Posted by *jodybdesigns*
> 
> I got a pretty good question. To start off I want to REALLY water cool my GPU(s) to finish my complete loop. After a year of my OCD seeing this tube run across the center of my video cards (I can't believe I have lasted this long), its time to finish the loop. But after doing some pricing for 2 blocks, a couple rads, and all the fittings. It is going to run me about $325.
> 
> I am gaming (or trying to grrr) in 1440p and I am going to be cooling 2 7950's with 2 EK Thermospheres. Or..
> 
> I can get one of the Powercolor PCS+ 390 for $289 ($269 after rebate, but I never get them) - sell off the 2 7950's for about $100 each, and buy a block, fittings, and rad for the 390. I would lose power consumption, but is the upgrade from Crossfire 7950's > R9 390 with the upgrades worth justifying?


Going from 2 x HD 7950 to a 390 will actually be an upgrade. If the HD 7950s scaled perfectly you might see 20% better performance from the pair, but for most games one 390X will be better. A better idea is to look for a 290X or 290 that comes with a water block; you will save a lot of money.
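
The scaling trade-off in the post above is just arithmetic, so here is a minimal sketch of it. Every fps figure below is a made-up placeholder for illustration, not a benchmark:

```python
# Illustrative CrossFire-vs-single-card comparison.
# All fps baselines are hypothetical placeholders, not measured results.

def crossfire_fps(single_fps, scaling):
    """Effective fps from two cards at a given scaling factor
    (0.0 = no CrossFire scaling, 1.0 = perfect doubling)."""
    return single_fps * (1 + scaling)

single_7950 = 40  # hypothetical 1440p baseline for one HD 7950
single_390 = 62   # hypothetical figure for one R9 390

for s in (0.0, 0.5, 0.8):
    pair = crossfire_fps(single_7950, s)
    print(f"CF scaling {s:.0%}: {pair:.0f} fps vs {single_390} fps on one 390")
```

The point being: with these placeholder numbers the pair only pulls ahead of the single card when scaling is high, and in titles with no CrossFire profile (0% scaling) you are left with one 7950's worth of performance.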


----------



## jodybdesigns

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Going from 2 x HD 7950 to 390 will actually be a upgrade. If HD 7950 scale perfectly you might see 20% better fut for most games 1 390X will be better. Better Idea is to look for a 290X or 290 that comes with a Water Block. You will save a lot of money.


I like this guy.

Lol, I get positive scaling in ONE game, and that is GTA 5. Even BF4, a 3-year-old game, plays like crap at 1440p with Crossfire 7950's, in my opinion...

I was actually looking at a 290x with a waterblock. I am gaming at 1440p. Would the 290x or the 390 be better, or would they be one of those neck-and-neck type deals? Because I do like the idea of one of the 290x's with a block, but I also like the idea of that 8GB of VRAM too...


----------



## ZealotKi11er

Quote:


> Originally Posted by *jodybdesigns*
> 
> I like this guy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Lol I get positive scaling in ONE game, and that is GTA5. Even BF4, a 3 year old game, plays like crap on 1440p with Crossfire 7950's in my opinion...
> 
> I was actually looking at a 290x with a waterblock. I am gaming at 1440p. Would the 290x or the 390 be better? Or would they be one of those neck and neck type deals. Because I do like the idea of one of the 290x's with a block, but I also like the idea of that 8gb VRAM too...


The R9 290X will be faster under water. The 390 will probably never use 8GB, especially at 1440p; if games start to use 8GB @ 1440p, the 390 will be too slow by then. I have not tried to play BF4 with an HD 79XX, but 2 x 290 scale amazingly in BF4, even at 4K. It's one of the only games that runs well.
Now, 390 vs 290X is all about cost. I would not spend more than $50 to get a new 390 over a 290X. Also factor in how much a WB costs for the 390.


----------



## Schwulibertz

Quote:


> Originally Posted by *Charcharo*
> 
> Can you make a PowerColor one? The PCS+ that is.


I tried but I can't seem to find a decent picture...


----------



## Spartoi

Quote:


> Originally Posted by *Chaoz*
> 
> I just noticed but your card is leaning down hard. Looks like it's a bit heavy for the pcie slot. Isn't there a bracket which supports the card so it doesn't bend?


If you're referring to the bracket on the PCI-E slot that holds the GPU, that bracket is broken on that slot. I've moved my GPU down to the middle x16 PCI-E slot and the bend is gone.


----------



## jodybdesigns

Quote:


> Originally Posted by *ZealotKi11er*
> 
> R9 290X will be faster under water. 390 will probably never use 8GB ever especially 1440p. If games start to use 8GB @ 1440p then 390 will be too slow. Have not tried to play BF4 with HD 79XX but 2 x 290 scale amazing in BF4 even 4K. Its one of the only games that runs well.
> Now 390 vs 290X is all about cost. I would not spend more then $50 to get a New 390 over 290X. Also Factor in how much WB cost for 390.


Doing some simple research, the 390 is actually cheaper than the 290x brand new (for the moment): a 390 for $289 or a 290x for $305. I can get a used 290x for about $225 (but that's buying off eBay too). I would be selling the 7950's to purchase the GPU block + rad + fittings though, to justify the costs.

I do get what you are saying about the "if it finally uses it, it's probably too slow by then".

I have a couple weeks to look regardless. I do want to dump these 7950's though; they are starting to age. I will keep an eye out for some good-priced 290x's when it comes time to buy. If I happen to find a new 290x for the same price, I will be getting a 290x.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jodybdesigns*
> 
> Doing some simple researching, the 390 is actually cheaper than the 290x brand new (for the moment). 390 for $289 or 290x for $305. I can get some used 290x's for about $225 (but this buying off EBAY too). I would be selling the 7950's to purchase the GPU block + rad + fittings though to justify the costs.
> 
> I do get what you are saying about the "if it finally uses it, its probably too slow by then"
> 
> I have a couple weeks to look regardless. I am though wanting to dump these 7950's. They are starting to age. I will keep an eye out for some good priced 290x's when it comes down time to buy. If I happen to find some new 290x's for the same price, I will be getting a 290x.


If you buy new, get the 390. I would only get a 290X if you can buy it with a block for ~ the same price as a new 290X or 390.


----------



## Stige

Yup, just get a 390 if the 290X is so expensive.

Games will start pushing past that 4GB soon, I reckon, as some games already are, especially at 1440p.
It won't hurt to have more than 4GB. Kinda feel sorry for the poor people who own a 970 lol


----------



## Chaoz

Quote:


> Originally Posted by *Spartoi*
> 
> If you're referring to the bracket on the PCI-E slot that hold the GPU, the bracket is broken on that PCI-E slot. I've moved my GPU down to the middle x16 PCI-E slot and the bend is gone.


No, I mean something like this bracket:

The black one on the side of the GPU. It comes in the Hybrid packs; it does in the Accelero Hybrid II 120 at least, not sure if it's in the others as well.


----------



## jodybdesigns

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If yo buy new get 390. I would only get 290X if you can buy it with a Block ~ same price as new 290X or 390.


Quote:


> Originally Posted by *Stige*
> 
> Yup just get a 390 if 290X is so expensive.
> 
> Games will start pushing that 4GB soon I reckon, as some games already are, especially at 1440p.
> It won't hurt to have more than that 4GB. Kinda feel sorry for the poor people who own a 970 lol


Thanks. I'll just stay focused on the 390 then, unless I happen to get lucky on a 290x. The Powercolor PCS+ is $289.99 with a $20 rebate; $269.99 isn't bad for one of the better-cooled 390's, once the 7950's sell.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jodybdesigns*
> 
> Thanks. I'll just keep focused on the 390 then, unless I happen to get lucky on a 290x. The Powercolor PCS+ for $289.99 with $20 rebate. $269.99 isn't bad for one of the better cooled 390's until the 7950's sell.


Try to get R9 390 Nitro if you can


----------



## jodybdesigns

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Try to get R9 390 Nitro if you can


Yeah, I have always owned Sapphire cards; they have always been top notch, including the 5 7950's in a miner. Man, those things took some SERIOUS abuse. I am still using 2 of them today, they were that good lol

But its time to move on from them...


----------



## Spartoi

Quote:


> Originally Posted by *Chaoz*
> 
> No I mean something like this bracket:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The black on the side of the GPU. It comes in the Hybrid packs.It does in the Accelero Hybrid II - 120 not sure if it's in the others aswell.


Yes, I have that installed already, but it's not that long. This AIO was made for the 290x, so I don't know if the 290x was shorter, but the bracket doesn't cover much area on my 390x.


----------



## Chaoz

Quote:


> Originally Posted by *Spartoi*
> 
> Yes, I have that installed already but it's not that long. This AIO was made for the 290x so I don't know if the 290x was shorter but the bracket doesn't much area on my 390x.


That's not what I meant.

I mean the long black bracket on the side of the graphics card, not the heatsink on top.

This part is what I mean (highlighted in red):


----------



## kizwan

Quote:


> Originally Posted by *jodybdesigns*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Try to get R9 390 Nitro if you can
> 
> 
> 
> Yeah I have always owned Sapphire cards, they have always been top notch. Including the 5 7950's in a miner. Man those things took some SERIOUS abuse. I am still using 2 of them today they were so good at one time lol
> 
> But its time to move on from them...
Click to expand...

If you want to watercool both core & VRMs, then you'll only have two choices: the Powercolor PCS+ 390 or the ASUS DCU2 390 (first make sure the waterblock is available for purchase). Also the MSI 390, once EK releases the block.


----------



## Stige

Quote:


> Originally Posted by *kizwan*
> 
> If you want watercool both core & VRMs, then you'll only have two choices; Powercolor PCS+ 390 or ASUS 390 (firstly make sure waterblock is available for purchase). Also MSI 390 once EK released the block.


And the ASUS card only has the "hybrid" block for it, the Alphacool GPX series, which looks like a full cover block but has only a heatsink on the VRM.

I have one; it works fine after I changed the thermal pads for the VRM and glued a GT AP-15 fan on top of it.
75C max now at +175mV.


----------



## kizwan

EK has a full waterblock for the ASUS DCU2.


----------



## jodybdesigns

Quote:


> Originally Posted by *kizwan*
> 
> If you want watercool both core & VRMs, then you'll only have two choices; Powercolor PCS+ 390 or ASUS DCU2 390 (firstly make sure waterblock is available for purchase). Also MSI 390 once EK released the block.


This is why I am looking at the Powercolor PCS+. The Alphacool NexXxoS GPX R9 290/290x M07 fits both revisions of the PCS+ according to the compatibility chart. And it is the cheapest!


----------



## Spartoi

Quote:


> Originally Posted by *Chaoz*
> 
> That's not what I meant,.
> 
> I mean the long black bracket on the side of the grafics card. Not the heatsink on top.
> 
> This part is what i mean (highlighted in red):


Oh. My AIO didn't come with that.


----------



## kizwan

Quote:


> Originally Posted by *jodybdesigns*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> If you want watercool both core & VRMs, then you'll only have two choices; Powercolor PCS+ 390 or ASUS DCU2 390 (firstly make sure waterblock is available for purchase). Also MSI 390 once EK released the block.
> 
> 
> 
> This is why I am looking at the Powercolor PCS+. The Alphacool NexXxoS GPX R9 290/290x M07 fits both revisions of the PCS+ according to the compatibility chart. And it is the cheapest!
Click to expand...

I was referring to the EK waterblock because it cools both core & VRMs. For the 390/390X, only the Powercolor PCS+ & ASUS DCU2 have compatible waterblocks from EK (the EK SE & EK DCU2 blocks respectively). At least, both are still compatible as far as I know. The new XFX cards with the new inductors are no longer compatible with the EK waterblock.


----------



## ziggystardust

Does anyone know where the VRM 1 and VRM 2 sensors are located on the Sapphire 390X Nitro?


----------



## jazz995756

If anyone is interested, I recently decided to pull the trigger and put a full waterblock on my XFX 390. The only issue, as most of you know, is the new inductor; with it being slightly bigger, the block might need some modification.

That being said, I came across a forum discussing why EK removed the waterblock from their website as being compatible with our cards, and they had some more insight as to why. Basically, you can tell immediately whether the card is compatible by looking at the inductor on the PCB: the old inductor that is compatible with most blocks has a "C" on it, while the new inductor has a "Z" on top. Mine happened to have the old inductor, but I had to remove all the existing heatsinks and the backplate in order for the block to fit.

http://www.frozencpu.com/products/24126/ex-blc-1714/EK_MSI_Gigabyte_Radeon_R9-290X_VGA_Liquid_Cooling_Block_Rev_20_-_Acetal_EK-FC_R9-290X_-_Acetal_Rev20.html?tl=c599s2078b133

I ordered that block and it fit perfectly, and it comes with plenty of thermal pads for everything you need. Hopefully this helps some of you wanting to put a full block on your card and avoid buying an expensive paperweight.


----------



## Synntx

Welp, for those who have paid attention to my posts lately, I believe I've solved my GPU overclock issues.

I went ahead and returned the Antec HPG-850 PSU and swapped it out for an AX860. Everything is working flawlessly. I was able to push the card as far as 1220/1750 on air for benching, resulting in a ~15200 GPU score. I dropped the clocks down to 1170/1750 w/ +111mV on the core and I'm Witcher 3 stable. Pulling 55-60fps minimum at 1440p, ultra settings w/ hairworks disabled. Temps max out at 80C in gaming. I was having some blue screen issues but determined my DRAM OC to be a wee bit too high, so once I fixed that it was smooth sailing.

Needless to say, I'm quite satisfied.

The only disappointing thing I've encountered was that I updated my parts list on PCPartPicker.com to include all current hardware and nearly pooped my pants at the overall cost of the build......

EDIT: I have been paying attention to TRIXX's hardware monitor, and it's showing my VDDC power in / power out peaking at 361W/310W 0_o
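
A note on those TRIXX readings: VDDC power in minus power out is the heat dissipated inside the card's power stages, and the ratio of out to in is the VRM efficiency. A quick sketch of the arithmetic using the peaks from the post (the formula is generic, not TRIXX-specific):

```python
# VRM input vs. output power from a monitoring readout:
# the difference is dissipated as heat in the card's power stages.

def vrm_stats(power_in_w, power_out_w):
    """Return (efficiency, watts dissipated) for a VRM stage."""
    efficiency = power_out_w / power_in_w
    heat_w = power_in_w - power_out_w
    return efficiency, heat_w

eff, heat = vrm_stats(361, 310)  # peak readings reported above
print(f"VRM efficiency ~{eff:.0%}, ~{heat:.0f} W of heat in the VRMs")
```

That works out to roughly 86% efficiency and ~50 W of VRM heat at peak, which is one reason VRM temps climb so fast on heavily overvolted cards.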


----------



## TsukikoChan

Quote:


> Originally Posted by *Synntx*
> 
> Welp, for those who have paid attention to my posts lately, I believe I've solved my GPU overclock issues.
> 
> I went ahead and returned the Antec HPG-850 PSU and swapped it out for an AX860. Everything is working flawlessly. i was able to push the card as far as 1220/1750 on air for benching, resulting in a ~15200 GPU score. I dropped the clocks down to 1170/1750 w/ +111mv on the core and I'm Witcher 3 stable
> 
> 
> 
> 
> 
> 
> 
> . Pulling 55-60fps minimum in 1440p, ultra settings w/ hairworks disabled. Temps max out at 80C in gaming. I was having some blue screen issues but determined my DRAM OC to be a wee bit too high, so once I fixed that it was smooth sailing.
> 
> Needless to say, I'm quite satisfied
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The only disappointing thing I've encountered was that I updated my parts list on PCPartPicker.com to include all current hardware and nearly pooped my pants at the overall cost of the build......
> 
> EDIT: I have been paying attention to TRIXX's hardware monitor, and it's showing my VDDC Power in / Power out peaking at at 361w/310w 0_o


So it was the PSU then? Probably the same problem as mine, but I need to assess my bank balance before I can pull the trigger on a new PSU :-(


----------



## RicoDee




----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> Welp, for those who have paid attention to my posts lately, I believe I've solved my GPU overclock issues.
> 
> I went ahead and returned the Antec HPG-850 PSU and swapped it out for an AX860. Everything is working flawlessly. i was able to push the card as far as 1220/1750 on air for benching, resulting in a ~15200 GPU score. I dropped the clocks down to 1170/1750 w/ +111mv on the core and I'm Witcher 3 stable
> 
> 
> 
> 
> 
> 
> 
> . Pulling 55-60fps minimum in 1440p, ultra settings w/ hairworks disabled. Temps max out at 80C in gaming. I was having some blue screen issues but determined my DRAM OC to be a wee bit too high, so once I fixed that it was smooth sailing.
> 
> Needless to say, I'm quite satisfied
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The only disappointing thing I've encountered was that I updated my parts list on PCPartPicker.com to include all current hardware and nearly pooped my pants at the overall cost of the build......
> 
> EDIT: I have been paying attention to TRIXX's hardware monitor, and it's showing my VDDC Power in / Power out peaking at at 361w/310w 0_o


Well I'll be damned. That PSU must certainly have been dying, else it shouldn't be making a difference. Still, happy to see you solved this, no matter how much I dislike being wrong.









Also damn, that overclock. My card just doesn't want to do more than 1180MHz properly. Sure, with +150mV it will do 1200/1725MHz (maybe more) no problem, but the scores I get in Firestrike are the same or even lower than what I get for my usual overclocks. Throttling? Instability? I have no idea. It's not temps for sure; I have the GPU fans and the adjacent case fans blasting at 100% while benching, and temps peaked at what, 68C? Lower than 70C anyway. I've even tried +175mV (and was standing next to the PC, ready to press the reset button because I was scared it might overheat/die) and scores were even lower (14100 plus change). That's depressing.


----------



## afyeung

Quote:


> Originally Posted by *Synntx*
> 
> Welp, for those who have paid attention to my posts lately, I believe I've solved my GPU overclock issues.
> 
> I went ahead and returned the Antec HPG-850 PSU and swapped it out for an AX860. Everything is working flawlessly. i was able to push the card as far as 1220/1750 on air for benching, resulting in a ~15200 GPU score. I dropped the clocks down to 1170/1750 w/ +111mv on the core and I'm Witcher 3 stable
> 
> 
> 
> 
> 
> 
> 
> . Pulling 55-60fps minimum in 1440p, ultra settings w/ hairworks disabled. Temps max out at 80C in gaming. I was having some blue screen issues but determined my DRAM OC to be a wee bit too high, so once I fixed that it was smooth sailing.
> 
> Needless to say, I'm quite satisfied
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The only disappointing thing I've encountered was that I updated my parts list on PCPartPicker.com to include all current hardware and nearly pooped my pants at the overall cost of the build......
> 
> EDIT: I have been paying attention to TRIXX's hardware monitor, and it's showing my VDDC Power in / Power out peaking at at 361w/310w 0_o


Where are you testing Witcher 3 and what version do you have? I have my 390x at 1160/1750 with +81mv core and +50mv aux. I was seeing fps drops into the low 40s with everything max except hairworks. I'm calling BS because review sites report that even the 980Ti at stock boost gets around 70fps average without hairworks at 1440p. 60fps min is not possible on a 390x.


----------



## Dundundata

Did u turn foliage distance to high? Makes a big difference. I leave HW on though cause...those flowing locks


----------



## Synntx

Quote:


> Originally Posted by *TsukikoChan*
> 
> So it was the psu then? same as me prob but i need to assess my bank balance before i can pull the trigger on a new psu :-(


Quote:


> Originally Posted by *tolis626*
> 
> Well I'll be damned. That PSU must certainly have been dying, else it shouldn't be making a difference. Still, happy to see you solved this, no matter how much I dislike being wrong.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also damn, that overclock. My card just doesn't want to do more than 1180MHz properly. Sure, with +150mV it will do 1200/1725MHz (maybe more) no problem, but the scores I get in Firestrike are the same or even lower than what I get for my usual overclocks. Throttling? Instability? I have no idea. It's not temps fore sure, I have the GPU fans and the adjacent case fans blasting at 100% while benching and temps peaked at what? 68C? Lower than 70C anyway. I've even tried +175mV (And was standing next to the PC, ready to press the reset button because I was scared it may overheat/die
> 
> 
> 
> 
> 
> 
> 
> ) and scores were even lower (14100 plus change). That's depressing.


I just think it was a combination of a lack of power and efficiency. I've got 2 HDDs, 3 SSDs, 8 case fans, 2 VRM fans, an AIO, a fan controller, lighting, a gaming keyboard and mouse, multiple USB devices, plus heavy overclocks on the CPU and GPU. I don't think my old PSU is dying, as it's not very old at all and runs the system fine without overclocks; I just don't think it could keep up with demand. Yes, I can manage to get through Firestrike with some pretty high overclocks, but that's all those clocks are good for: benching. I am excited to be stable at 1170/1750 for gaming.
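Back-of-the-envelope, a component tally like the one above can be summed against the PSU's label rating. A quick sketch (every wattage below is an illustrative guess, not a measurement, except the GPU figure taken from the TriXX reading mentioned earlier):

```python
# Rough power-budget tally for a system like the one described above.
# All component wattages are hypothetical estimates, not measured values.
loads_watts = {
    "GPU (390, heavy OC)": 360,     # TriXX reported ~361W VDDC power in
    "CPU (overclocked)": 180,
    "Motherboard + RAM": 60,
    "Drives (2 HDD + 3 SSD)": 30,
    "Fans, pump, lighting, USB": 40,
}

total = sum(loads_watts.values())
headroom = 850 - total  # against the old 850W unit's label rating

print(f"estimated peak draw: {total}W, headroom: {headroom}W")
```

A paper margin like that sounds fine, but transient spikes and 12V rail quality matter more than the label, which may be why the swap helped.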

Quote:


> Originally Posted by *afyeung*
> 
> Where are you testing Witcher 3 and what version do you have? I have my 390x at 1160/1750 with +81mv core and +50mv aux. I was seeing fps drops into the low 40s with everything max except hairworks. I'm calling BS because review sites report that even the 980Ti at stock boost gets around 70fps average without hairworks at 1440p. 60fps min is not possible on a 390x.


What do you mean where? I'm in my house. Hairworks is disabled completely and foliage distance is set to high. I also disable the blur settings because I can't stand them. Oddly enough, I see a 3-4FPS increase w/ my second monitor disabled while gaming.

Quote:


> Originally Posted by *Dundundata*
> 
> Did u turn foliage distance to high? Makes a big difference. I leave HW on though cause...those flowing locks


Yes, those flowing locks...........


----------



## afyeung

Quote:


> Originally Posted by *Synntx*
> 
> What do you mean where? I'm in my house
> 
> 
> 
> 
> 
> 
> 
> . Hairworks is disabled completely and foilage distance set to high. I also disable the blur settings because I can't stand them. Oddly enough, I see a 3-4FPS increase w/ my second monitor disabled while gaming.
> Yes, those flowing locks...........


I meant which part of the game lol. Still don't buy it. I'll have to test it out myself when I have time


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> I just think it was a combination of a lack of power and efficiency. I've got 2 HDD's, 3 SSD's, 8 Case fans, 2 VRM fans, an AIO, a fan controller, lighting, gaming keyboard and mouse, multiple USB devices, plus heavy overclocks on the CPU and GPU. I don't think my old PSU is dying, as it's not very old at all, and runs the system fine without overclocks, I just don't think it could keep up with demand. Yes I can manage to get through Firestrike with some pretty high overclocks, but that's all those clocks are good for; benching. I am excited to be stable at 1170/1750 for gaming.
> What do you mean where? I'm in my house
> 
> 
> 
> 
> 
> 
> 
> . Hairworks is disabled completely and foilage distance set to high. I also disable the blur settings because I can't stand them. Oddly enough, I see a 3-4FPS increase w/ my second monitor disabled while gaming.


Nah, it's surely not pushing 800W. My system is similar and I've seen about 550W peaks, but never sustained wattage over 500W. I'd say that PSU is done for. Or it's really bad quality and its 12V rail is crapping itself when too much wattage is pulled through it, which amounts to the same thing.

Regarding your FireStrike scores, I think maybe it's because you have a Sapphire card, and those have 2x8-pin power. The MSI with its 8+6-pin may be power-throttling at those higher clocks. Either that, or there's something wrong in my setup. I'd appreciate it if someone with an MSI 390X could post benches at very high overclocks with high (>+100mV) voltages.
Quote:


> Originally Posted by *afyeung*
> 
> I meant which part of the game lol. Still don't buy it. I'll have to test it out myself when I have time


I don't know if the 60FPS claim is true, as I haven't tested myself, but it's not unrealistic either. My rig is pushing 40-50FPS with Hairworks enabled (Albeit at a low preset without much AA for it) at 1440p when in the wilds, out of White Orchard (Or whatever the village in the first quest is called, I haven't really played the game yet). I'd guess disabling Hairworks will land me in the 60s. Foliage distance does have a big impact on performance, but not on image quality.


----------



## diggiddi

I agree, that PSU must've been dying. My HCG 750W was able to run CFX 290Xs until recently.
I need to pick up one of those EVGA 1300 G2s.


----------



## tolis626

Quote:


> Originally Posted by *diggiddi*
> 
> I agree that PSU must've been dying, My HCG 750w was able to run CFX 290x until recently
> I need to pickup one of those EVGA 1300 g2


Well, the 1300G2 is awesome, as long as you don't mind the noise at idle. Really, I've never heard it ramp up or be really noisy, but at idle it's the only thing audible in my system and that sucks. I'd also look at something like the 1200P2, 1000G2 (quieter than the 1300G2 for some reason) or 1000P2. But EVGA all the way for me.

Other than that, I just tested the Witcher a bit. 1440p, Hairworks off, foliage distance at high and I had my card at 1175/1725MHz with +100mV. I was seeing about 55fps average. FRAPS reported an average of 73FPS, but I forgot to turn the benchmark off and it also recorded for a few seconds while I was in the menu, with FPS well in the 2000 range, so I guess these results are useless. What's concerning is that my card hit 83C in no time and it started artifacting like there's no tomorrow. Damn, I gotta change that thermal paste... Maybe I'll just apply some of the MX4 I have lying around and be done with it.


----------



## battleaxe

Quote:


> Originally Posted by *afyeung*
> 
> Where are you testing Witcher 3 and what version do you have? I have my 390x at 1160/1750 with +81mv core and +50mv aux. I was seeing fps drops into the low 40s with everything max except hairworks. I'm calling BS because review sites report that even the 980Ti at stock boost gets around 70fps average without hairworks at 1440p. 60fps min is not possible on a 390x.


You do realize the difference between an OC'd 390X and a 980 Ti isn't very much, right? 75fps on a 980 Ti and 60fps on a 390X actually sounds about right. That's roughly a 20% deficit (not pulling out the calculator for this, sorry).
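For what it's worth, the 20% vs 25% figures in this exchange both come from the same gap; it just depends on which card you use as the baseline. A quick sketch:

```python
# 75fps vs 60fps: the same gap reads as two different percentages
# depending on the reference point.
def pct_faster(fast_fps, slow_fps):
    """How much faster the leading card is, relative to the slower one."""
    return (fast_fps - slow_fps) / slow_fps * 100.0

def pct_slower(fast_fps, slow_fps):
    """How far the slower card trails, relative to the faster one."""
    return (fast_fps - slow_fps) / fast_fps * 100.0

print(pct_faster(75, 60))  # 25.0 -> the 980 Ti leads by 25%
print(pct_slower(75, 60))  # 20.0 -> the 390X trails by 20%
```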


----------



## afyeung

Yeah. 25fps. But even without hairworks Witcher 3 is still a very Nvidia biased game. I am a fan of my 390x lol. It's an excellent card and maybe 40% at most behind the 980Ti in the most Nvidia biased games.




Claiming to get 60fps at 1440p with hairworks off and LOD at high is COMPLETE BS. I know for a fact because I've played the game and know other people who have a 390(x) and also play witcher 3 at 1440p. 1170/1750 on the 390x won't bring it to Fury X performance. You'll get a 10% boost at MOST. Witcher 3 is like Crysis 3. It's very hard to maintain 60fps constant with one card at resolutions higher than 1080p. However, 2 390x's is perfect for maxing 1440p.


----------



## afyeung

Quote:


> Originally Posted by *afyeung*
> 
> Yeah. 25fps. But even without hairworks Witcher 3 is still a very Nvidia biased game. I am a fan of my 390x lol. It's an excellent card and maybe 40% at most behind the 980Ti in the most Nvidia biased games.
> 
> 
> 
> 
> Claiming to get 60fps at 1440p with hairworks off and LOD at high is COMPLETE BS. I know for a fact because I've played the game and know other people who have a 390(x) and also play witcher 3 at 1440p. 1170/1750 on the 390x won't bring it to Fury X performance. You'll get a 10% boost at MOST. Witcher 3 is like Crysis 3. It's very hard to maintain 60fps constant with one card at resolutions higher than 1080p. However, 2 390x's is perfect for maxing 1440p.


*25%


----------



## Synntx

Quote:


> Originally Posted by *tolis626*
> 
> Well, the 1300G2 is awesome, as long as you don't mind the noise at idle. Really, I've never heard it ramp up or be really noisy, but at idle it's the only thing audible in my system and that sucks. I'd also look at something like the 1200P2, 1000G2 (quieter than the 1300G2 for some reason) or 1000P2. But EVGA all the way for me.
> 
> Other than that, I just tested the Witcher a bit. 1440p, Hairworks off, foliage distance at high and I had my card at 1175/1725MHz with +100mV. I was seeing about 55fps average. FRAPS reported an average of 73FPS, but I forgot to turn the benchmark off and it also recorded for a few seconds while I was in the menu, with FPS well in the 2000 range, so I guess these results are useless. What's concerning is that my card hit 83C in no time and it started artifacting like there's no tomorrow. Damn, I gotta change that thermal paste... Maybe I'll just apply some of the MX4 I have lying around and be done with it.


Quote:


> Originally Posted by *tolis626*
> 
> Nah, it's surely not pushing 800W. My system is similar and I've seen about 550W peaks, but never sustained wattage over 500W. I'd say that PSU is done for. Or it's really bad quality and its 12V is crapping itself when too much wattage is pulled through it, which is all the same.
> 
> Regarding your FireStrike scores, I think maybe it's because you have a Sapphire card, and those have 2x8-pin power. The MSI with its 8+6-pin may be power-throttling at those higher clocks. Either that, or there's something wrong in my setup. I'd appreciate if someone with an MSI 390x could post benches at very high overclocks with high (>+100mV) voltages.
> I don't know if the 60FPS claim is true, as I haven't tested myself, but it's not unrealistic either. My rig is pushing 40-50FPS with Hairworks enabled (Albeit at a low preset without much AA for it) at 1440p when in the wilds, out of White Orchard (Or whatever the village in the first quest is called, I haven't really played the game yet). I'd guess disabling Hairworks will land me in the 60s. Foliage distance does have a big impact on performance, but not on image quality.


Gotta love Sapphire.


----------



## Synntx

Quote:


> Originally Posted by *afyeung*
> 
> Yeah. 25fps. But even without hairworks Witcher 3 is still a very Nvidia biased game. I am a fan of my 390x lol. It's an excellent card and maybe 40% at most behind the 980Ti in the most Nvidia biased games.
> 
> 
> 
> 
> Claiming to get 60fps at 1440p with hairworks off and LOD at high is COMPLETE BS. I know for a fact because I've played the game and know other people who have a 390(x) and also play witcher 3 at 1440p. 1170/1750 on the 390x won't bring it to Fury X performance. You'll get a 10% boost at MOST. Witcher 3 is like Crysis 3. It's very hard to maintain 60fps constant with one card at resolutions higher than 1080p. However, 2 390x's is perfect for maxing 1440p.


As a correction to an earlier post, my FPS fluctuates between 50 and 60, sometimes peaking at 70.

I'll run a 15 minute FRAPS benchmark and report results.


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> As a correction to an earlier post, my FPS fluctuate from 50-60, sometimes peaking at 70.


Good thing about the Witcher is that it's properly coded, so it's easily playable even at 40FPS. Sure, it may get a little choppy at times, but it's not lagging like crazy and the controls are fine. Inquisition, on the other hand, would get 10 different kinds of messed up if my framerate wasn't high enough. Gotta love CD Projekt Red.
Quote:


> Originally Posted by *Synntx*
> 
> Gotta love Sapphire.


Damn you.


----------



## xutnubu

Can anybody tell me the VRM and core temps of the PowerColor PCS+ 390? And is it really as bad for OCing as I've heard?


----------



## tbob22

Quote:


> Originally Posted by *xutnubu*
> 
> Can anybody tell me the VRM and Core temps of the PowerColor PCS+ 390. And is it really that bad for OCing as I've heard?


I wouldn't call the OCing bad, just not great. At +100mv I was able to get 1150/1600.

Furmark for 10min at those clocks:
Core: 71c
VRM1: 72c
VRM2: 68c
Fan: 72%

The fan does wind up at those speeds and is pretty loud, but during gaming I can barely hear it; it usually stays at around 40% and temps stay around 60-65c.


----------



## xutnubu

Quote:


> Originally Posted by *tbob22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Can anybody tell me the VRM and Core temps of the PowerColor PCS+ 390. And is it really that bad for OCing as I've heard?
> 
> 
> 
> I wouldn't call the OCing bad, just not great. At +100mv I was able to get 1150/1600.
> 
> Furmark for 10min at those clocks:
> Core: 71c
> VRM1: 72c
> VRM2: 68c
> Fan: 72%
> 
> The fan does wind up at those speeds and is pretty loud, but during gaming I can barely hear it, it usually stays at around 40% and temps stay around 60-65c.
Click to expand...

Thanks for the info. Good temps indeed, especially for Furmark. But I'd feel kind of uncomfortable with +100mV.

I've also heard about the noise complaints, but I don't care much about it.

Do you have The Witcher 3? What kind of fps do you get? I'm hoping for Ultra settings with Shadows and Foliage Distance @ High, and maybe HBAO if there's some performance to spare. HW off, obviously.


----------



## tbob22

Quote:


> Originally Posted by *xutnubu*
> 
> Thanks for the info. Good temps indeed, especially for Furmark. But I'd feel kind of uncomfortable with +100mV.
> 
> I've also heard (
> 
> 
> 
> 
> 
> 
> 
> ) about the noise complaints, but I don't care much about it.
> 
> Do you have The Witcher 3? What kind of fps you get? I'm hoping for Ultra settings with Shadows and Foliage Distance @ High, and maybe HBAO if there's some performance to spare. HW Off obviously.


Yeah, I usually just run stock speeds. I don't have The Witcher 3, haven't really had a chance to game much in general.


----------



## EternalRest

My PowerColor arrived today.


----------



## m70b1jr

Quote:


> Originally Posted by *EternalRest*
> 
> My PowerColor arrived today.


Does your pc levitate when it turns on?


----------



## afyeung

Quote:


> Originally Posted by *Synntx*
> 
> As a correction to an earlier post, my FPS fluctuate from 50-60, sometimes peaking at 70.
> 
> I'll run a 15 minute FRAPS benchmark and report results.


Actually, just take a screen shot with riva tuner running in the corner. That should do it. One screen shot of the actual game and one with settings open in game.


----------



## Synntx

Quote:


> Originally Posted by *afyeung*
> 
> Actually, just take a screen shot with riva tuner running in the corner. That should do it. One screen shot of the actual game and one with settings open in game.


I use Steam's FPS overlay....... stand by


----------



## afyeung

Quote:


> Originally Posted by *Synntx*
> 
> I use Steam's FPS overlay....... stand by


RivaTuner is one of the best tools for GPU overclocking. It can show fps, GPU usage, core and mem clocks, plus frame times and CPU usage. I suggest you get it.


----------



## Synntx

Quote:


> Originally Posted by *afyeung*
> 
> Riva tuner is one of the best tools for GPU overclocking. It can show fps, gpu usage, core and mem clocks, + frame times and CPU usage. I suggest u get it


I'll look into it. In the meantime, here are some screenshots. The last one is inside the Kingfisher in Novigrad, the rest are settings.


----------



## xutnubu

Quote:


> Originally Posted by *Synntx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *afyeung*
> 
> Riva tuner is one of the best tools for GPU overclocking. It can show fps, gpu usage, core and mem clocks, + frame times and CPU usage. I suggest u get it
> 
> 
> 
> I'll look into it. In the meantime, here are some screenshots. The last one is inside the Kingfisher in Novigrad, the rest are settings.
Click to expand...

Hey, can I ask you what FPS you get with those same settings @ 1080p and maybe HBAO+ instead of SSAO?


----------



## battleaxe

I just picked up a 390X and it was totally dead. No video at all; it wouldn't even let the PC POST. Back it goes.


----------



## Synntx

Quote:


> Originally Posted by *xutnubu*
> 
> Hey, can I ask you what FPS you get with those same settings @ 1080p and maybe HBAO+ instead of SSAO?


Around 70fps with HBAO+ and same settings at 1080p


----------



## xutnubu

Quote:


> Originally Posted by *Synntx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Hey, can I ask you what FPS you get with those same settings @ 1080p and maybe HBAO+ instead of SSAO?
> 
> 
> 
> Around 70fps with HBAO+ and same settings at 1080p
Click to expand...


----------



## navjack27

Quote:


> Originally Posted by *afyeung*
> 
> Riva tuner is one of the best tools for GPU overclocking. It can show fps, gpu usage, core and mem clocks, + frame times and CPU usage. I suggest u get it


AND USING IT LOWERS FRAMERATES!


----------



## Dundundata

@1080p with decent OC (1150/1625), settings all Ultra except foliage distance=high, and Hairworks ON I usually run around 60 with dips into the 50s. This is with Vsync on and HBAO+.

But seriously just play Dragon's Dogma you will get really high framerate


----------



## Dundundata

Quote:


> Originally Posted by *navjack27*
> 
> AND USING IT LOWERS FRAMERATES!


Does it really? I run it constantly in every game because I'm neurotic like that and need to know my temps. How much of an fps hit can it cause?


----------



## navjack27

I did a video about it.


----------



## OneB1t

Yes, that's normal; every piece of monitoring software increases frame times.

Also, a decrease from 350 to 300 is nothing to worry about.

Normally there is a 1-2fps decrease in the 50-60fps range, but if your refresh interval in Afterburner is set low it can cause microstutter.
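That 1-2fps overhead is easier to judge as frame time. A quick sketch of the conversion:

```python
# Convert an fps drop (e.g. from running a monitoring overlay)
# into per-frame overhead in milliseconds.
def overlay_overhead_ms(base_fps, fps_lost):
    with_overlay = base_fps - fps_lost
    return 1000.0 / with_overlay - 1000.0 / base_fps

# At 60fps, losing 1-2fps costs roughly a quarter to half a
# millisecond per frame -- small next to a 16.7ms frame budget.
print(round(overlay_overhead_ms(60, 1), 2))  # ~0.28 ms
print(round(overlay_overhead_ms(60, 2), 2))  # ~0.57 ms
```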


----------



## jdorje

I use RTSS with HWiNFO integration to overlay fps, CPU/GPU temps and usage.

For benchmarking I'd run FRAPS and record every frame time. You can then make a histogram of the frame times, which will show you the median fps as well as the longest frame times.
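A minimal sketch of that post-processing, assuming the standard FRAPS frametimes CSV (a header row, then a frame number and a cumulative timestamp in milliseconds); the filename is hypothetical:

```python
import csv
import statistics

def frame_stats(path):
    """Median fps and worst-case frame time from a FRAPS frametimes CSV."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        stamps = [float(row[1]) for row in reader if row]
    # Cumulative timestamps -> per-frame durations in ms
    durations = sorted(b - a for a, b in zip(stamps, stamps[1:]))
    median_ms = statistics.median(durations)
    p99_ms = durations[min(len(durations) - 1, int(len(durations) * 0.99))]
    return 1000.0 / median_ms, p99_ms

# median_fps, p99_frametime = frame_stats("fraps_frametimes.csv")
```

The 99th-percentile frame time captures the long frames that read as stutter even when the average fps looks fine.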


----------



## JreyE30

Hi guys, new to the club here. I just recently sold my R9 270x CFX setup to buy an R9 390 (basically broke even in pricing).

I bought the ASUS R9 390 Strix DC3 since there was no MSI in stock on Amazon, but after checking Newegg it's cheaper there and it was in stock; unfortunately I'd already ordered the card.

I've heard some mixed reviews on this particular card, mostly performance and heat issues, and I've been contemplating returning it, since it could cause a problem in the long run, and just getting the MSI version. I don't know if I should? Thanks all


----------



## Stige

Yes the VRM cooling on the Strix DC3 is awful.


----------



## ziggystardust

Quote:


> Originally Posted by *Stige*
> 
> Yes the VRM cooling on the Strix DC3 is awful.


As far as I know it's not that good on the Nvidia counterparts either. I've seen people with Asus Strix GTX 970s and 980s with VRM temperatures around 95-100C at stock clocks.

I believe when it comes to VRMs it's all about where the sensors are located, to be honest. I just can't understand how MSI R9 390 VRM temperatures stay that low in GPU-Z (around the 60s) while infrared measurements prove the opposite. Tom's Hardware's test shows the MSI 390's VRM pins heating up to 100C under load at stock clocks.

I wouldn't take GPU-Z or any other monitoring software as a very dependable source. At the end of the day it's not hard to mess up those sensors.
Quote:


> Originally Posted by *JreyE30*
> 
> Hi guys, new to the club here, just recently sold my R9 270x CFX setup to buy an R9 390(basically broke even in pricing)
> 
> I bought the ASUS R9 390 Strix DC3 since there was no MSI in stock on Amazon, but after checking newegg its cheaper plus it was in stock but i already ordered the card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've heard some mixed reviews on this particular card, mostly performance and heat issues, been contemplating to return it since it could cause a problem in the long run, and just get the MSI version, idk if i should? thanks all


If you take VRM temperatures into consideration, the difference should be negligible, to be honest. Tom's Hardware's test shows the MSI 390's VRMs run quite hot as well, maxing at 101C. I wouldn't make my call according to the sensor readings; God knows where those guys at MSI placed that sensor to obtain those low readings.

On the other hand, Asus core temperatures are a bit higher than other 390s, which might be a better decision point if you think it would be worth the hassle.


----------



## Stige

So maybe it is just that ASUS is reporting it correctly compared to others?


----------



## OneB1t

Nope, Asus just has bad cooling.

The 970/980 have bad VRM temperatures too, but no VRM monitoring, so all their users are fine.


----------



## Spartoi

So I was trying to adjust the VRM heatsink on my card, whose core is AIO watercooled but whose VRMs are air cooled, because I couldn't believe the temps could remain so high (~100C). Good news and bad news. The bad news is that VRM2 still maxes out around ~100C (the highest I've seen is ~102C so far). The good news is that I was able to lower VRM1 temps significantly; that now maxes at 60C. Such a large discrepancy. I'm going to order the generic VRM heatsinks that Arctic offers and see if I can make those fit on my card.

Oh, and for some reason after doing this my benchmark scores increased by about 3 FPS.


----------



## ziggystardust

Quote:


> Originally Posted by *Stige*
> 
> So maybe it is just that ASUS is reporting it correctly compared to others?


No idea. Like I said, it probably depends on where they put the sensors, and it's clear that the MSI 390's VRM sensor isn't reading the VRM pins or mosfets at all. There simply can't be a 30-40C difference on any part of the mosfets.
Quote:


> Originally Posted by *OneB1t*
> 
> nope asus have bad cooling
> 
> 980/970 have bad temperatures but no VRM monitoring so all their users are fine


Yes, Asus Strix GTX 970 and 980s do have the VRM monitoring unlike some other GTX 970/980s.


----------



## Mellifleur

cans i join plx?

https://www.techpowerup.com/gpuz/details.php?id=vym53

It's not OC'd right now, but it got to 1180 core / 1700 mem when I was playing around with it.

I can run a few benches and submit some screenies with settings if people would like a reference.


----------



## afyeung

Quote:


> Originally Posted by *Synntx*
> 
> I'll look into it. In the meantime, here are some screenshots. The last one is inside the Kingfisher in Novigrad, the rest are settings.


Wait lol. I was playing at the same settings as you but with HBAO+ on and getting 50fps at 1440p. Will turning off HBAO+ bump it up to the 60s? Sorry, my bad, but it's still not "ultra".


----------



## afyeung

Quote:


> Originally Posted by *navjack27*
> 
> AND USING IT LOWERS FRAMERATES!


Maybe in CS:GO, which honestly isn't a good benchmark anyway. Not if you set it up right. In CS:GO I wouldn't even use an FPS counter.


----------



## OneB1t

This is what I think about rebranding/refreshing the 290 to the 390.


----------



## christoph

Quote:


> Originally Posted by *bichael*
> 
> Just put the alpacool GPX block on my Powercolor 390 and got it up and running last night. Not time for much testing but at stock the gpu was around 50oC and VRMs around 60oC on a run of Firestrike. That's with a small pump and at 30oC+ room temp. (Not worth comparing to my air results as I had to mount the fans outside the case so they were blowing through a side panel and across a gap, about 80 / 70 for what it's worth). I'm definitely happy with it, gives nearly all the benefits of a fullcover block at less cost and while being upgradeable.
> 
> Topless Powercolor 390
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If this looks cramped imagine what it's like after the psu, hdd's, and blu ray drive go in. The 360 rad lives outside...


Hey guys,

I have the Sapphire Nitro 390 myself.

Have you guys changed the thermal paste on these video cards? What about the thermal pads on the VRMs and the memory chips? How thick are those, is it 1mm?

It's not that I have problems with temps; it's just so I can buy new thermal pads and have them ready for a possible swap, like these:

http://www.amazon.com/Coollaboratory-Liquid-MetalPad-Thermal-Grease/dp/B001CXNSU8


----------



## Vellinious

Quote:


> Originally Posted by *OneB1t*
> 
> also 970 have like 1300mhz stock and you can overclock it to like 1500mhz thats 15%
> 390 have 1100 stock and 1200 overclock thats 9%
> 
> but 390 have better overclock scaling because there is less bottleneck from memory/cut down units
> 
> so 390OC = 970OC


970 base clocks on the reference cards are 1050 on the core. The factory overclocked cards run anywhere from 1200 to 1370, but.....that's not the base. Just sayin.

They're both really good cards, and trade blows back and forth.

I know it's not a 390, but, I replaced my 970 with an 8GB 290X. Both were underwater and overclocked to within an inch of their life. The 970 with a custom bios...I'm still working on a custom bios for the 290X. Quick comparison using Firestrike.

http://www.3dmark.com/compare/fs/7358506/fs/6557355


----------



## Perplexity

http://www.techpowerup.com/gpuz/details.php?id=7dhve

Core Voltage + 19mV

Currently using this clock, will try and push it a bit more tonight probably.


----------



## outofmyheadyo

Quote:


> Originally Posted by *Perplexity*
> 
> http://www.techpowerup.com/gpuz/details.php?id=7dhve
> 
> + 19mV
> 
> Currently using this clock, will try and push it a bit more tonight probably.


Quote:


> Originally Posted by *Vellinious*
> 
> 970 base clocks on the reference cards are 1050 on the core. The factory overclocked cards run anywhere from 1200 to 1370, but.....that's not the base. Just sayin.
> 
> They're both really good cards, and trade blows back and forth.
> 
> I know it's not a 390, but, I replaced my 970 with an 8GB 290X. Both were underwater and overclocked to within an inch of their life. The 970 with a custom bios...I'm still working on a custom bios for the 290X. Quick comparison using Firestrike.
> 
> http://www.3dmark.com/compare/fs/7358506/fs/6557355


That's a nice comparison. I'd like to see one like that with a 390 vs a 970, and it's still surprising that a 290X beats the 970 in Firestrike.


----------



## fyzzz

You can do way more bios mods to 290(x) and 390(x) than the nvidia counterparts.
http://www.3dmark.com/fs/7254867


----------



## Vellinious

Quote:


> Originally Posted by *fyzzz*
> 
> You can do way more bios mods to 290(x) and 390(x) than the nvidia counterparts.
> http://www.3dmark.com/fs/7254867


Yeah, I'm seeing that. A LOT more moving parts in there... it's MUCH more complicated. On Maxwell, you up the voltage, adjust the TDP and power limits, increase the 6-pin / 8-pin / PCIe slot power draw, smooth out the voltage/clock table, and done...


----------



## lostsurfer

Hey guys, my gf offered to buy me a 390 today. What's the best brand to go with? Leaning towards the MSI due to the cooling?

My options are MSI, Gigabyte, XFX, and ASUS. The Sapphire Nitro is sold out.


----------



## Vellinious

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Thats a nice comparison, I`d like to see a one like that with 390 vs 970 and it`s still surprising a 290x beats the 970 in firestrike,


Clubs it over the head like a baby seal, it does..... That 970 score is one of the highest, too....I think I was 3rd in overall score for all 1 x 970s. I was also #1 in 2 x 970 in FS, FS Extreme and FS Ultra. Those cards were VERY good.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vellinious*
> 
> Clubs it over the head like a baby seal, it does..... That 970 score is one of the highest, too....I think I was 3rd in overall score for all 1 x 970s. I was also #1 in 2 x 970 in FS, FS Extreme and FS Ultra. Those cards were VERY good.


Just so people don't get the wrong idea: most FS scores with a custom BIOS are really only for benchmarks, not actual games. You scored that high with the 290X, but that is not a 24/7 OC, or anywhere close to what most people would get. People would be lucky to score 14K at 24/7 clocks with no custom BIOS.


----------



## Chaoz

Quote:


> Originally Posted by *JreyE30*
> 
> Hi guys, new to the club here, just recently sold my R9 270x CFX setup to buy an R9 390(basically broke even in pricing)
> 
> I bought the ASUS R9 390 Strix DC3 since there was no MSI in stock on Amazon, but after checking newegg its cheaper plus it was in stock but i already ordered the card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've heard some mixed reviews on this particular card, mostly performance and heat issues, been contemplating to return it since it could cause a problem in the long run, and just get the MSI version, idk if i should? thanks all
> 
> snip


I got the exact same card but the DC3OC version and I love it to pieces. My VRM temps aren't high at all. The max they reach is 70°C, so nowhere near the temps everyone is reporting.

When I play CoD BO3 for hours on end my card hits 70°C max on the core and the same for the VRMs. Maybe I got lucky?


----------



## Vellinious

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just so most people dont get the wrong idea most FS score with custom BIOS are really only for benchmarks and not actual games. You score that high 290X but that is not 24/7 OC or anywhere close most people would get. People would be luck to score 14K 24/7 clocks with no custom bios.


I push for peak performance when I run Firestrike. My 970s ran every day at 1581 core and 2000 memory. That score I linked was run at 1633 core and 2175 memory....same exact bios file and same voltage.

Not sure where the 290X will be 24/7 stable yet, I haven't gotten that far.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vellinious*
> 
> I push for peak performance when I run Firestrike. My 970s ran every day at 1581 core and 2000 memory. That score I linked was run at 1633 core and 2175 memory....same exact bios file and same voltage.
> 
> Not sure where the 290X will be 24/7 stable yet, I haven't gotten that far.


I don't see many people here with any Hawaii card over 1.2GHz for a 24/7 OC. For example, I scored 29.3K with a high OC that's not 24/7 stable, and about 26K at my 24/7 OC.


----------



## Vellinious

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Dont see many people here with any Hawaii card over 1.2GHz for 24/7 OC. For example I scored 29.3K with high OC not 24/7 stable and about 26K for 24/7 OC.


Ok.....as I said, they were both pushed to within an inch of their lives. Meaning, they had extremely high overclocks....as a comparison.

29.3k / 26k in what, and with what? lol


----------



## OneB1t

Quote:


> Originally Posted by *Vellinious*
> 
> 970 base clocks on the reference cards are 1050 on the core. The factory overclocked cards run anywhere from 1200 to 1370, but.....that's not the base. Just sayin.


it is base







I calculate with GPU Boost


----------



## Vellinious

Quote:


> Originally Posted by *OneB1t*
> 
> it is base
> 
> 
> 
> 
> 
> 
> 
> i calculate with gpu boost


With boost 2.0, most of the 970s boost to 1440 and higher...some as high as 1500. But that's not base. The base core clock for the GM204 is what's set in the stock reference bios. /wink


----------



## Stige

This is AMD thread, not for 970 fanbois with their inferior cards, get out.


----------



## OneB1t

Quote:


> Originally Posted by *Vellinious*
> 
> With boost 2.0, most of the 970s boost to 1440 and higher...some as high as 1500. But that's not base. The base core clock for the GM204 is what's set in the stock reference bios. /wink


That's the reason there's not much gained from OC.

All the fanboys talking about Nvidia's awesome overclocking are just blind; they don't see that the card is already running at a high frequency, so there is not much left to gain from overclocking.


----------



## tolis626

I thought it was common knowledge by now that Maxwell's greatest strength is perhaps the very high clocks it hits. Like, seriously, clock for clock I don't think even the 980 Ti is really better than a 390X. But a 980 Ti will hit 1500MHz easy, while most 390Xs (and Hawaii cards generally) will do 1150MHz? 1200MHz tops. Each architecture has its strengths and weaknesses, and we've known that for a while.

What would be really interesting is a comparison of a max-overclocked 290X/390X to a max-overclocked 980 (non-Ti). I think they may be going neck and neck...


----------



## OneB1t

The 980 Ti is also a bigger chip without double-precision performance (601mm² vs 438mm² for Hawaii).
The 980 is now close to the 390X even after a full overclock.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> This is AMD thread, not for 970 fanbois with their inferior cards, get out.


I own a 290x lol


----------



## Vellinious

Quote:


> Originally Posted by *tolis626*
> 
> I thought that it was common knowledge by now that Maxwell's greatest strength is perhaps the very high clocks it hits. Like, seriously, clock for clock I don't think even the 980ti is really better than a 390x. But a 980ti will hit 1500MHz easy, while most 390x (And generally Hawaii cards) will do 1150MHz? 1200MHz tops. Each architecture has its strengths and weaknesses and we've known for a while.
> 
> What would be really interesting is a comparison of a max overclocked 290x/390x to a max overclocked 980 (non-ti). I think they may be going neck to neck...


In some game titles, maybe. But in benchmarks? No. Not quite.


----------



## OneB1t

just wait for DX12 + async shaders


----------



## Vellinious

That's possible. But by the time enough new titles are out that actually utilize it, AMD and NVIDIA both will have new gens released. /shrug


----------



## Vellinious

Quote:


> Originally Posted by *OneB1t*
> 
> thats reason why there is not much gained from OC
> 
> so all fanboys talking about nvidia awesome overclock are just blind and they not see that card allready running at high frequency so there is not much to gain from overclocking


I'm not sure about fanboys, but....a highly overclocked 970 will keep right up with a highly overclocked 390. I'm not sure why it's so important for one to be the best, but...whatever. They're both so even it's not even funny. lol


----------



## mus1mus

Quote:


> Originally Posted by *tolis626*
> 
> I thought that it was common knowledge by now that Maxwell's greatest strength is perhaps the very high clocks it hits. Like, seriously, clock for clock I don't think even the 980ti is really better than a 390x. But a 980ti will hit 1500MHz easy, while most 390x (And generally Hawaii cards) will do 1150MHz? 1200MHz tops. Each architecture has its strengths and weaknesses and we've known for a while.
> 
> What would be really interesting is a comparison of a max overclocked 290x/390x to a max overclocked 980 (non-ti). I think they may be going neck to neck...


It's not about clocks. It's about performance when you compare the offerings from each camp.

When you consider overclocking, the topic should shift to frequency/performance scaling.

Nvidia happens to reach higher clocks, but that can be misleading. Remember GPU Boost?

It may just be their term for throttling. When you heat up Maxwell, Boost will not hit its target. The same goes if you happen to reach the card's TDP limit.

Instead of saying the card is designed to clock at 1400 and throttle down when it exceeds defined limits, for example, they say the card is clocked at 1250 and boosts to 1400 if you keep it cool and it isn't hitting its limits.
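The framing above can be put as a toy model: the advertised "boost" is just the clock you get while you stay under the temperature and TDP limits, and falling back toward base when you hit them is throttling by another name. All the limits and clocks below are invented numbers for illustration, not any real card's spec (real boost logic also steps through many intermediate bins):

```python
# Toy model of GPU Boost vs throttling. Numbers are invented for
# illustration; a real card steps through many clock/voltage bins.

def effective_clock(base_mhz, boost_mhz, temp_c, power_w,
                    temp_limit_c=80, tdp_w=180):
    """Run at full boost while cool and under TDP; otherwise fall back
    toward the base clock (i.e. throttle)."""
    if temp_c < temp_limit_c and power_w < tdp_w:
        return boost_mhz
    return base_mhz

print(effective_clock(1050, 1440, temp_c=65, power_w=160))  # cool: 1440
print(effective_clock(1050, 1440, temp_c=84, power_w=160))  # hot: 1050
```

Same behavior either way; the only difference is whether the spec sheet calls the high number "boost" or the low number "throttled".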


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> It's not about clocks. It's about performance when you compare the offerings from each camp.
> 
> When you consider overclocking, the topic should shift to frequency/performance scaling.
> 
> Nvidia happens to reach higher clocks but that can be a misinformation. Remember GPU BOOST?
> 
> It may just be their term for throttling. When you heat up Maxwell, Boost will not hit it's target. So too, if you happen to reach the card's TDP Limit.
> 
> Instead of saying, the card is designed to clock at 1400 and throttle down when exceeds defined limits for example, they say, the card is clocked at 1250 and boost to 1400 if you keep it cool and not hitting it's limits.


Yup. For Maxwell to perform as well as they should, they need a custom bios and a custom fan curve. Water cooling is obviously better, but...that's true of anything. = P


----------



## outofmyheadyo

That's why you disable boost and use static clocks under load.


----------



## Vellinious

Quote:


> Originally Posted by *outofmyheadyo*
> 
> thatswhy you disable boost, and use static clocks for loads.


Most do, with a custom bios...yeah


----------



## JreyE30

Quote:


> Originally Posted by *Chaoz*
> 
> I got the exact same card but the DC3OC version and I love it to pieces. My VRM temps aren't high at all. Max they get at is 70°C so nowhere near the temps everyone is saying.
> 
> When I play CoD BO3 for hours on end my card hits max 70°C on the core and the same for the VRM's. Maybe I got lucky?


Those are good temps, but take into consideration that ambient temps affect internal temps. I'm in the desert, so summer will take a toll on my card unless I run the AC all day, but I have the same card as you do, the OC edition.
Quote:


> Originally Posted by *lostsurfer*
> 
> hey guys, gf offered to buy me a 390 today, what's the best brand to go with? Leaning towards the msi due to the cooling?
> 
> my options are, msi, gigabyte, xfx, asus. the sapphire nitro is sold out.


Don't be like me. I skimped on the research when getting a 390 and ended up with the ASUS card, but I'm on the verge of exchanging it for an MSI, or going back to Sapphire as I have for almost the past 10 years.

XFX's version is solid I hear, plus a lifetime warranty if registered through their site, IIRC.

PowerColor is a good option as well, but Sapphire has been with AMD for a very long time, so they know how to make proper cards, especially with good cooling.

XFX, MSI, Sapphire and PowerColor would be the ones I'd look into.

BTW, of the four I recommend, I believe the Sapphire and PowerColor are the longest cards, so make sure to do some measuring and research to make sure it'll fit.


----------



## jdorje

Summer is the time for undervolting. I can run my 390 at -100mV and 1035MHz and it's a good 50-100W below the stock-voltage overclock. You definitely don't want high wattage when the AC is running; that's electricity you have to pay for twice.
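For a rough sense of why -100mV saves that much: dynamic power scales roughly with frequency × voltage². Plugging in an assumed 275W board power at a 1100MHz / 1.20V stock point (my assumptions for illustration, not measured values) gives a saving in the same ballpark:

```python
# Back-of-envelope undervolting math: dynamic power ~ frequency x voltage^2.
# The 275 W / 1100 MHz / 1.20 V reference point is an assumption for
# illustration, not a measured figure for any particular card.

def scaled_power(p_ref_w, f_ref_mhz, v_ref, f_new_mhz, v_new):
    """Scale a reference power draw to new clock/voltage settings."""
    return p_ref_w * (f_new_mhz / f_ref_mhz) * (v_new / v_ref) ** 2

stock = 275.0
undervolted = scaled_power(stock, 1100, 1.20, 1035, 1.10)
print(f"~{undervolted:.0f} W, saving ~{stock - undervolted:.0f} W")
```

Under those assumptions the drop works out to roughly 55-60W, consistent with the 50-100W range quoted above; real savings depend on the card's actual stock voltage and leakage.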


----------



## Charcharo

Quote:


> Originally Posted by *Vellinious*
> 
> That's possible. But by the time enough new titles are out that actually utilize it, AMD and NVIDIA both will have new gens released. /shrug


Many of us do not upgrade that often








For me it seems to be every 5-6 years.

*That is also why those 8 Gigs are important.


----------



## lostsurfer

Quote:


> Originally Posted by *JreyE30*
> 
> Those are good temps but take into consideration ambient temps effect internal temps, I'm in the desert so summer will take a toll on my card unless I run the AC all day but I have the same card as you do OC edition
> Don't be like me, I skimped on the research to getting a 390 I ended up with the Asus card but I'm on the verge of exchanging it for an MSI or go back to Sapphire as I always had for the past almost 10 years
> 
> XFXs version is solid I hear plus lifetime warranty if registered through their site iirc
> 
> Power color is also a good option as well, but Sapphire has been with AMD for a very long time so they know how to make proper cards especially with good cooling
> 
> XFX, MSI, Sapphire and Power Color would be something I'd look into
> 
> BTW Out of the four I recommend I believe the Sapphire and Power Color are the longest cards so make sure to do some measuring and research to make sure it'll fit


I appreciate it, i'm going to go MSI. I'm RMAing an XFX right now, that was a lemon from day 1. Also they're out of sapphire's and they dont carry any power colors.


----------



## Mazda6i07

I have an MSI, no complaints yet other than some slight coil whine, which I cannot hear over the noise from gaming anyway. But I know it's there, so it bugs me. Great card thus far, the 390.

Now I just need to OC it; not sure it's worth the hassle though for the minimal performance bump, especially since I'm only gaming at 1080p.


----------



## Chaoz

Quote:


> Originally Posted by *JreyE30*
> 
> Those are good temps but take into consideration ambient temps effect internal temps, I'm in the desert so summer will take a toll on my card unless I run the AC all day but I have the same card as you do OC edition


Ah I see. Well, I live where it can get hot in the summer but damn cold in the winter.
If I were you I'd RMA it and go with Sapphire. My previous card (HD 5970) was a Sapphire and I was really happy with it. But the ASUS was cheaper and I'd heard good things about it, so I went with that card.


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> I'm not sure about fanboys, but....I highly overclocked 970 will keep right up with a highly overclocked 390. I'm not sure why it's so important for one to be the best, but...whatever. They're both so even it's not even funny. lol


Your 970 won't come even close to my 390, not even close brah.

It will always be behind.


----------



## ziggystardust

Your 970s and 390s will always be behind my Voodoo 2.


----------



## Vellinious

Quote:


> Originally Posted by *ziggystardust*
> 
> Your 970s and 390s will always be behind my Voodoo 2.


Ah, the good ole voodoo cards....That was my first video card. = )


----------



## outofmyheadyo

I still remember how amazed i was when i first tried the velocity 100


----------



## christoph

Quote:


> Originally Posted by *Charcharo*
> 
> Many of us do not upgrade that often
> 
> 
> 
> 
> 
> 
> 
> 
> For me it seems to be every 5-6 years.
> 
> *That is also why those 8 Gigs are important.


Yeah, my old HD 4890 could still play new games at middle settings with good FPS, but it was seriously limited by the 1GB of VRAM.


----------



## Charcharo

Quote:


> Originally Posted by *christoph*
> 
> yeah, my old HD4890 still could play new games at middle settings with good FPS, but that cuz it was seriously limited by the 1 GB Vram


I remember reading at the time that the 1GB ATI 5770 was somewhat unnecessary and I would be better off getting a 512MB model..
Thank God I did not listen. The 5770 is one hell of a soldier... it still works just fine, I just use my R9 390 now.

That card got so much extra life out of its 1GB of VRAM at the time...

So I think I did well getting an 8GB R9 390.


----------



## Mazda6i07

390 inside the case of course


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Your 970 won't come even close to my 390, not even close brah.
> 
> It will always be behind.


Ok...do have some backup for that? lol Some monster benchmark where your 390 just smashes what the 970 I had could do? I mean, it's entirely possible, the 290X I have runs quite a bit faster than it does, but.....since you're so sure, show me. = )


----------



## kizwan

Quote:


> Originally Posted by *ziggystardust*
> 
> Your 970s and 390s will always be behind my Voodoo 2.


Your Voodoo 2 will always be behind my Voodoo 3 3000.









BTW, regarding FS score, winning is winning, it doesn't matter whether it's gaming stable or not. Go to any competitive benching thread, no one is talking about gaming stable clocks.


----------



## Synntx

Quote:


> Originally Posted by *afyeung*
> 
> Wait lol. I was playing at the same settings as you but with HBAO+ on and getting 50fps at 1440p. Will turning off HBAO+ bump it up to 60s? Sorry my bad, but it's still not "ultra".


I don't see much difference graphically between SSAO and HBAO+. But that's just me. Yes, lowering the ambient occlusion setting will give a moderate FPS increase, maybe 3-5 if you're lucky.

On another note, my VRM1 temps are maxing at 105!!!!!! What the heck! GPU core and VRM2 never see 80 though...


----------



## Vellinious

Quote:


> Originally Posted by *Synntx*
> 
> I don't much see a difference graphically between SSAO and HBAO+. But that's just me. Yes lowering ambient occlusion setting will give a moderate FPS increase, maybe 3-5 if you're lucky.
> 
> On another note, my VRM1 temps are maxing at 105!!!!!! What the heck! GPU core and VRM2 never see 80 though...


Yikes. Custom fan curve?


----------



## Synntx

Quote:


> Originally Posted by *Vellinious*
> 
> Yikes. Custom fan curve?


Yep, tops out at 100% at 85°C. At load it's running about 80-85%. I'm shocked that it peaks so high, and then under moderate use drops down to 70?? ***? Not sure if it's a sensor issue or what....


----------



## christoph

Quote:


> Originally Posted by *Charcharo*
> 
> I remember how at the time I read that the 1 GB ATI 5770 is somewhat not needed and I would be better of getting a 512mb model..
> Thank God I did not listen. The 5770 is one hell of a soldier... still works just fine, I just use my R9 390 now
> 
> 
> 
> 
> 
> 
> 
> 
> That card got so much life due to the 1gb of VRAM at the time...
> 
> So I think I did good with getting an 8gb R9 390


Yeah, that's why I bought the 390: mainly because of the VRAM, and really mainly because it was a lot cheaper than the 970, not to mention the 980...

My HD 4890 is still kicking.


----------



## christoph

Quote:


> Originally Posted by *Synntx*
> 
> Yep, tops out 100% at 85. At load running about 80-85%. Im shocked that it peaks so high, and then under moderate use drops down to 70?? ***? Not sure if it's sensor issue or what....


hows the airflow in your case??


----------



## Synntx

Quote:


> Originally Posted by *christoph*
> 
> hows the airflow in your case??


It's superb. 3x 120mm in the front, 1x 140mm in the rear, 4x 120mm up top in push/pull on the rad....


----------



## battleaxe

Just picked up a pair of XFX 390x Cards. Will be installing soon. I'll get some pics tomorrow in here.


----------



## afyeung

Quote:


> Originally Posted by *Synntx*
> 
> I don't much see a difference graphically between SSAO and HBAO+. But that's just me. Yes lowering ambient occlusion setting will give a moderate FPS increase, maybe 3-5 if you're lucky.
> 
> On another note, my VRM1 temps are maxing at 105!!!!!! What the heck! GPU core and VRM2 never see 80 though...


If it's only a 5fps increase, I don't see how you can maintain a 60fps minimum in The Witcher 3 at 1440p. It's not possible with a 390/X, and I'm saying that as the owner of a 390X at 1170/1625 as my daily driver. You can look at benchmark videos online for The Witcher 3 at 1440p. You need at least a 980 Ti to get solid 60fps gameplay. Lowering shadows and LOD will raise the minimum fps from 35 to 45 at most.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Your 970 won't come even close to my 390, not even close brah.
> 
> It will always be behind.


Aheem. Before you speak like this to someone, make sure you know where you are and where he is.

Go check the FireStrike thread and look up his name.









You are talking BS to the 970 top dawg in there.

Besides, I am pretty sure your 390 has nothing against my 290.








Quote:


> Originally Posted by *Vellinious*
> 
> Ok...do have some backup for that? lol Some monster benchmark where your 390 just smashes what the 970 I had could do? I mean, it's entirely possible, the 290X I have runs quite a bit faster than it does, but.....since you're so sure, show me. = )


Exactly.









BTW, some benches put my 780 crushing 970s too. But I don't wanna say it can crush everything.








Quote:


> Originally Posted by *kizwan*
> 
> Your Voodoo 2 will always behind my Voodoo 3 3000.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW, regarding FS score, winning is winning, it doesn't matter whether it's gaming stable or not. Go to any competitive benching thread, no one is talking about gaming stable clocks.


Nvidia cards are closer to being stable at their max clocks and best scores than AMD cards, I guess. Considering the mods it took to get my card where it is.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Nvidia cards are closer to being stable at their max clocks and best scores than AMD cards. I guess. Considering my mods to get me where the card is.


Having experienced both the 290X and the 970s, that's somewhat true. Kind of...there are some really stupid easy tricks you can do with the NVIDIA GPUs to help keep them more stable at top clocks. EVGA's Precision X has a handy little button called "KBoost", which locks in boost voltage and clocks. Helps to keep things stable when you're riding the edge.

lol, wish I could find something like that for this 290x. = P


----------



## jodybdesigns

Pulled that trigger. Powercolor PCS+ 390 on the way. $299 + $20 mail in rebate. Be here Tuesday. Pretty excited lol
Quote:


> Originally Posted by *kizwan*
> 
> Your Voodoo 2 will always behind my Voodoo 3 3000.


Boo. I am using the Voodoo Banshee - Because it came with Unreal for free bruh! lol


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Having experienced both the 290X and the 970s, that's somewhat true. Kind of...there are some really stupid easy tricks you can do with the NVIDIA GPUs to help keep them more stable at top clocks. EVGA's Precision X has a handy little button called "*KBoost*", which locks in boost voltage and clocks. Helps to keep things stable when you're riding the edge.
> 
> lol, wish I could find something like that for this 290x. = P


One thing to note though: AMD cards can be pushed to their very limit in benchmarks and still give you a score, in a more fathomable way. Artifacts, screen corruption, black screens and the like when running on the ragged edge.

Nvidia cards, at least the 600 to 900 series that I've had my hands on, won't let you run them at their limits. They either run without glitches or simply don't run at all. Black screens and restarts, the driver stopping and crashing, all of that sort. And most of the time it's the driver limiting what you can squeeze, not the card. The card runs fine, but the driver dictates when to stop.

Unless you can trick the driver not to, like this run for example, where I had to adjust the limits of the card to convince the driver I was low enough on the limits by setting them to 250%. Yes, 250%, with the card riding at 180-198% of its TDP and power limit.

http://www.3dmark.com/sd/3747800










Crazy run. But it took home the 2nd spot on the bot for a single 780!

290 for comparison. With tess tweaks it would be an annihilation in the 290's favor.

http://www.3dmark.com/sd/3778481


----------



## matt2dlg

Validate me bruh!


----------



## matt2dlg

Validate me bruh!
https://www.techpowerup.com/gpuz/details.php?id=cus5q
Sapphire Nitro 390
Stock cooling


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> One thing to note though, AMD cards can be pushed into their very limit in Benchmarks and give you a score in a more fathomable way. Artifacts, screen corruption, black screens and stuff whrn running on the rugged edge.
> 
> Nvidia cards, at least from the 600 to 900 series that I have had my hands on, wouldn't allow you to run them at their limits. It's either just run without glitches or simply not run at all. Black screens and restarts, Driver stopping and crashing, all that sort. And most of the time, it's the driver limitting what you can squeeze. Not the card. As the card runs fine but driver dictates it when to stop.
> 
> Unless you can trick the driver not to, like this run for example, where I have to adjust the limits of the card to trick the driver I am low enough on the limits by setting them at 250%. Yes, 250% and the card is riding at 180 - 198% of it's TDP and Power Limit.
> 
> http://www.3dmark.com/sd/3747800
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Crazy run. But took home the 2nd spot on the bot for a single 780!
> 
> 290 for comparison. Tess tweaks will be an annihilation going for the 290.
> 
> http://www.3dmark.com/sd/3778481


I haven't run SkyDiver with the new card yet. But.....that's pretty even. Good run, man.

http://www.3dmark.com/compare/sd/3778481/sd/3475837


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> I haven't run SkyDiver with the new card yet. But.....that's pretty even. Good run, man.
> 
> http://www.3dmark.com/compare/sd/3778481/sd/3475837


I'd be hard pressed to believe that was your best SD run.









I sometimes wonder if the 780 can be made stronger TBH.

BTW, run SD with Tess Off and you'll be amazed how good your card is.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> I'd be hard pressed to believe that was your best SD run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I sometimes wonder if the 780 can be made stronger TBH.
> 
> BTW, run SD with Tess Off and you'll be amazed how good your card is.


I ran it once, just to put my name on the leaderboard, first run put me in the top 20..never ran it again.

Tess off just feels like cheating, though. lol


----------



## mus1mus

Tess off is allowed on the bot though.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Tess off is allowed on the bot though.


Yeah. I signed up for that site, but, never did do anything there about posting scores or anything.

With tess tweaks, are you talking about in the driver settings? Setting it to off, or, AMD optimized....or setting it to a specific level?


----------



## Vellinious

PSU shut down again with stress on the overclocked CPU and GPU. I'm beginning to wonder if some of the voltage drop is due to the PSU starting to go bad.....

RMA requested. Waiting to hear back from EVGA.


----------



## ZealotKi11er

Doesn't MSI have an option to make the voltage stay constant? Also, I think one reason AMD cards have been hard to get stable at their limit since the HD 7970 is that if your temps are low enough, the card will keep working even with artifacts. Even when you are getting no artifacts, you can be stable at 40°C and then at 50°C your drivers will crash. It could be that some part of the card is getting hotter, which even water cooling does not cool when you are running +200mV and up.


----------



## nicocf23

Hi all,

This is my first post here.

This is my rig:

PowerColor Devil R9 390X (watercooled) 1150/1750
Sapphire R9 390 Nitro 1150/1750

FireStrike Extreme: 10526 points
FireStrike: 21020 points


----------



## Vellinious

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Doesn't MSI have one option to make Voltage stay constant? Also I think one reason AMD is hard to get stable at their limit since HD 7970 is that if your temps are low enough the card will work even with artifacts. Even if you are getting no artifacts you can be stable 40C and then 50C your drivers will crash. Could be some part of the card is getting hotter which even Water Cooling does not cool when you are running +200 mV and up.


It does, but it doesn't also lock in the boost clock. Like turning off c-states and energy saving features on CPUs when you're pushing really hard. Helps stability if you can lock the clock down.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vellinious*
> 
> It does, but it doesn't also lock in the boost clock. Like turning off c-states and energy saving features on CPUs when you're pushing really hard. Helps stability if you can lock the clock down.


I think it only locks voltage?


----------



## kizwan

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Doesn't MSI have one option to make Voltage stay constant? Also I think one reason AMD is hard to get stable at their limit since HD 7970 is that if your temps are low enough the card will work even with artifacts. Even if you are getting no artifacts you can be stable 40C and then 50C your drivers will crash. Could be some part of the card is getting hotter which even Water Cooling does not cool when you are running +200 mV and up.
> 
> 
> 
> It does, but it doesn't also lock in the boost clock. Like turning off c-states and energy saving features on CPUs when you're pushing really hard. Helps stability if you can lock the clock down.

I have never been able to lock the voltage with the force constant voltage setting. If you want to lock to the max clock, enable Unofficial Overclocking without PowerPlay. Just press reset afterwards to get the card to run at idle clocks when not gaming or benching.


----------



## Spartoi

Can someone tell me where the VRMs are located on the Sapphire 390x?

I've highlighted where I think they are, but I'm not sure. I'm guessing the reason my VRM2 temps are so high is that the heatsink on my AIO doesn't cover them, but I don't know where they are.


----------



## Vellinious

Quote:


> Originally Posted by *kizwan*
> 
> I have never been able to lock voltage with force constant voltage setting. If you want lock to max clock, set Unoffcial Overclocking without PowerPlay. Just press reset later to get the card to run at idle clocks when not gaming or benching.


There's a setting in AB to force constant voltage, but....I'm not sure if it works with AMD cards.


----------



## Vellinious

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think it only locks voltage?


Yup. Voltage only.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Spartoi*
> 
> Can someone tell me where the VRMs are located on the Sapphire 390x?
> 
> I've highlighted where I think they are but am not sure. I guessing the reason why my VRM2 temps are so high is because the heatsink on my AIO don't cover them but I don't know where they are.


----------



## christoph

Quote:


> Originally Posted by *ZealotKi11er*


this should be the same for the 390, right?


----------



## ZealotKi11er

Quote:


> Originally Posted by *christoph*
> 
> this should be the same for the 390, right?


Yeah for most R9 290 -> 390X.


----------



## Spartoi

Quote:


> Originally Posted by *ZealotKi11er*


Thanks. I placed a fan at the bottom of my case and positioned it near VRM2, and temps dropped ~15°C. I've ordered one of those 290X heatsinks and will place the little cap on my VRM2. Hopefully temps will then be closer to VRM1, which at load is in the 50°C range.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Spartoi*
> 
> Thanks. I placed a fan at the bottom of my case and positioned it near VRM2 and temps dropped ~15C. I've ordered one of these 290x heatsinks and will place that on little cap on my VRM2. Hopefully then temps will be closer to VRM1 which at load is in the 50C range.


VRM2 is technically easier to keep cooler.


----------



## tolis626

So.... Question. Do GPUs need more core voltage when running the memory at higher clocks? Because I think that's what mine is doing. I've had it crash a million times, until I forgot to lower the voltage from +100mV but set the clock to 1160MHz (down from 1175MHz), with memory at 1700MHz and aux at +50mV. And it was stable, whereas before it would reboot at 1160MHz and +60mV, which is stable with lower mem clocks.

Pardon my confused writing, but I was out with friends and I'm quite drunk, so it's hard to describe technical issues with any amount of coherence.


----------



## christoph

Quote:


> Originally Posted by *Spartoi*
> 
> Thanks. I placed a fan at the bottom of my case and positioned it near VRM2 and temps dropped ~15C. I've ordered one of these 290X heatsinks and will place that little cap on my VRM2. Hopefully then temps will be closer to VRM1, which at load is in the 50C range.


Please reply with results once you replace those.

Meanwhile, are those any better than the stock ones that come with the Nitro 390?
What's the thickness of the thermal pad strip? It looks like 3mm.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tolis626*
> 
> So.... Question. Do GPUs need more core voltage when running the memory at higher clocks? Because I think that's what mine is doing. I've had it crash a million times, until I forgot to lower the voltage from +100mV but set the clock to 1160MHz (down from 1175MHz), with memory at 1700MHz and aux at +50mV. And it was stable, whereas before it would reboot at 1160MHz and +60mV, which is stable with lower mem clocks.
>
> Pardon my confused writing, but I was out with friends and I'm quite drunk, so it's hard to describe technical issues with any amount of coherence.


Here's the thing: we have no proper memory voltage control for Hawaii-based cards. The memory on the 290/290X is set at 5GHz but rated for 6GHz. To run it at 6GHz we have to increase the memory controller voltage to get it stable; AMD did this to lower the card's power consumption. Going above 1500MHz comes down to how well the memory clocks at the same voltage as stock.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> So.... Question. Do GPUs need more core voltage when running the memory at higher clocks? Because I think that's what mine is doing. I've had it crash a million times, until I forgot to lower the voltage from +100mV but set the clock to 1160MHz (down from 1175MHz), with memory at 1700MHz and aux at +50mV. And it was stable, whereas before it would reboot at 1160MHz and +60mV, which is stable with lower mem clocks.
>
> Pardon my confused writing, but I was out with friends and I'm quite drunk, so it's hard to describe technical issues with any amount of coherence.


Usually either increasing the AUX voltage (+25 to +50mV) or the core voltage. Basically, let's say you need +50mV core voltage for a 1600MHz memory clock. If you increase the AUX voltage by +25 to +50mV instead (no need to go higher than that, as far as I've found), then you can lower your core voltage. I just want to point out that if your card's memory overclock doesn't respond to an AUX voltage increase, this trick doesn't work.


----------



## ziggystardust

Quote:


> Originally Posted by *ZealotKi11er*
> 
> VRM2 is technically easier to keep cooler.


So is the VRM2 sensor on Sapphire 390s located on the memory MOSFET, and not actively cooled by a heatsink or thermal pad?

Edit: Just watched a teardown video and noticed there's a tiny thermal pad for that MOSFET on a very thin part of the main heat spreader. Is that really a wise design? My VRM2 is maxing out around the mid 80s under heavy load on the stock fan curve. I wonder if that thermal pad is really there...


----------



## ZealotKi11er

Quote:


> Originally Posted by *ziggystardust*
> 
> So is the VRM2 sensor on Sapphire 390s located on the memory MOSFET, and not actively cooled by a heatsink or thermal pad?
>
> Edit: Just watched a teardown video and noticed there's a tiny thermal pad for that MOSFET on a very thin part of the main heat spreader. Is that really a wise design? My VRM2 is maxing out around the mid 80s under heavy load on the stock fan curve. I wonder if that thermal pad is really there...


That's a very good design. Do you have the same card?


----------



## ziggystardust

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's a very good design. Do you have the same card?


I have the 390X Nitro version. But the thing is: VRM2 temp is maxing out around 85C while VRM1 is in the low 70s.

Another weird thing: when the VRMs exceed a certain temperature (usually VRM2 first), only the rightmost fan kicks in, which is actually blowing on VRM1. I'm kind of confused.


----------



## Offender_Mullet

Just bought an XFX 390X from a member on here. So far so good.









I had their 390 before (which I just ended up using for a client's build) and their cooling is the same top-notch for both cards.


----------



## Vellinious

Quote:


> Originally Posted by *Offender_Mullet*
> 
> Just bought an XFX 390X from a member on here. So far so good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had their 390 before (which I just ended up using for a client's build) and their cooling is the same top-notch for both cards.


Would you post a copy of the stock bios from that card? I want to compare it to the stock bios on the XFX 8GB 290X. Please.


----------



## Offender_Mullet

Quote:


> Originally Posted by *Vellinious*
> 
> Would you post a copy of the stock bios from that card? I want to compare it to the stock bios on the XFX 8GB 290X. Please.


I just tried attaching it on here and it said the file wasn't allowed.







There are probably some on the TPU database. You might want to check there.


----------



## Vellinious

Quote:


> Originally Posted by *Offender_Mullet*
> 
> I just tried attaching it on here and it said the file wasn't allowed.
> 
> 
> 
> 
> 
> 
> 
> There are probably some on the TPU database. You might want to check there.


zip it


----------



## Offender_Mullet

Quote:


> Originally Posted by *Vellinious*
> 
> zip it


.Zip is on the no-no list as well. Just tried it. PM me your e-mail if you want and I'll send it that way.


----------



## Vellinious

Hmmm....


----------



## ZealotKi11er

Quote:


> Originally Posted by *ziggystardust*
> 
> I have the 390X Nitro version. But the thing is: VRM2 temp is maxing out around 85C while VRM1 is in the low 70s.
>
> Another weird thing: when the VRMs exceed a certain temperature (usually VRM2 first), only the rightmost fan kicks in, which is actually blowing on VRM1. I'm kind of confused.


All fans should kick in. You might have a dead fan there.


----------



## kizwan

Quote:


> Originally Posted by *Offender_Mullet*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> zip it
> 
> 
> 
> .Zip is on the no-no list as well. Just tried it. PM me your e-mail if you want and I'll send it that way.
Click to expand...

Sure you can.

XFXR9390DDBios.zip 99k .zip file


----------



## Offender_Mullet

Quote:


> Originally Posted by *kizwan*
> 
> Sure you can.
> 
> XFXR9390DDBios.zip 99k .zip file


This is what I get when I try to attach it. .Rar isn't supported?


Regardless, I've already e-mailed it to him.


----------



## mus1mus

Tick the ZIP box when adding it as an archive using WinRAR.


----------



## Offender_Mullet

Quote:


> Originally Posted by *mus1mus*
> 
> Tick the ZIP box when adding it as an archive using WinRAR.


I'm an idiot.


----------



## mus1mus

Quote:


> Originally Posted by *Offender_Mullet*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Tick the ZIP box when adding it as an archive using WinRAR.
> 
> 
> 
> I'm an idiot.
Click to expand...

We all were. At some point.


----------



## ziggystardust

Quote:


> Originally Posted by *ZealotKi11er*
> 
> All fans should kick in. You might have a dead fan there.


No, there's no dead fan. All three fans work independently on the stock fan curve with that new Intelligent Fan Control thing of Sapphire's. Fans kick in just when they need to. But the confusing part is: if VRM2 is the memory-side MOSFET, then there's a problem with the cooling design, because when the VRM2 temp rises and reaches the threshold, only the rightmost fan kicks in, not the leftmost one. The leftmost one kicks in when the GPU temp reaches 60C (which takes some time), and the middle fan follows if the overall temp keeps rising.

The design idea is not bad at heart, but if the VRM2 sensor is on the memory-side MOSFET, this design struggles to keep it cool.


----------



## deskiller

So guys, I've got something to ask.

I have 2 EVGA 780 Classifieds in SLI.

Would an AMD R9 390X be better?

I'm trying to play some games at 4K and hitting memory limits with textures.


----------



## tolis626

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Here's the thing: we have no proper memory voltage control for Hawaii-based cards. The memory on the 290/290X is set at 5GHz but rated for 6GHz. To run it at 6GHz we have to increase the memory controller voltage to get it stable; AMD did this to lower the card's power consumption. Going above 1500MHz comes down to how well the memory clocks at the same voltage as stock.


Well, seeing as I almost bought a 290X last year, I know most of that. I didn't know the same applied to the 390/390X; I thought that allowing control over the memory controller was among the changes in the "Grenada" chip. Guess I was wrong. Thanks!
Quote:


> Originally Posted by *kizwan*
> 
> Usually either increasing the AUX voltage (+25 to +50mV) or the core voltage. Basically, let's say you need +50mV core voltage for a 1600MHz memory clock. If you increase the AUX voltage by +25 to +50mV instead (no need to go higher than that, as far as I've found), then you can lower your core voltage. I just want to point out that if your card's memory overclock doesn't respond to an AUX voltage increase, this trick doesn't work.


To be honest, at the moment I have no idea what my memory responds to. Stability seems kinda random. Like, the other day it passed a little over an hour of Valley at 1170/1700 with +100/+50 with no problems, no artifacts, no nothing. Then, the next day I tried the exact same settings in BF4 and it crashed within 5 minutes. Now, BF4 is quite the stress test for GPUs, and I run it at DSR 1440p with everything maxed out except for MSAA, which is at 2x instead of 4x. But that's just ridiculous, I'd expect it to be at least semi-stable. And it mostly doesn't throw any artifacts or errors, it will just crash. Black screen, all sound becomes a buzzing noise from hell and then RSOD and, if I'm lucky, a reboot. Sometimes it will just stick there showing nothing. And it's the GPU driver that crashes according to BlueScreenView, so it's not something else that's causing problems. It's also not temps, because I have it set up with a very aggressive fan curve that goes to 80% at 65C and 100% at 75C, so it usually tops out at 72-73C.

Maybe I'll completely give up and run it at 1625MHz and be done with it. Maybe not. Thing is, the MSI doesn't seem to scale its performance well due to power limitations. 1150MHz at +50mV and 1175MHz at +100mV are about the same performance wise and going to 1200MHz at +150mV is slower than both of those. So it seems that without a custom BIOS it's not even worth to continue pursuing these higher overclocks. Which is a shame. Unless there is a way to increase the power limits without a custom BIOS. There isn't one though, is there?
Quote:


> Originally Posted by *deskiller*
> 
> so guys, I got something to ask,
> 
> I have 2 evga 780 classifieds in sli.
> 
> would a AMD R9 390x be better?
> 
> trying to play some games at 4k and hitting memory limits with textures.


Well, if we're talking about a single 390X, no. It's a great card but not 4K ready. If you're talking about Crossfire 390Xs, then yes, it's going to be better. Quite a bit better, actually, and not only because of the extra VRAM.

BUT! And there's always a but. We're what? 6 months away from Polaris? A little more? I'd wait. It's not like SLI 780s are slow by any means. Just don't run these ultra high res textures for a bit and buy a 14/16nm card when they come out.


----------



## battleaxe

Quote:


> Originally Posted by *deskiller*
> 
> so guys, I got something to ask,
> 
> I have 2 evga 780 classifieds in sli.
> 
> would a AMD R9 390x be better?
> 
> trying to play some games at 4k and hitting memory limits with textures.


Two 780's are going to be more powerful than one 390x.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ziggystardust*
> 
> No there's no dead fan. All three fans work independently on stock fan curve in that new Intelligent Fan Control thing of Sapphire. Fans kick in just when they need to. But the confusing part is; if vrm2 is the memory side mosfet, then there's a problem with the cooling design, because when vrm2 temp rises and reaches the threshold, only rightmost fan kicks in, not the leftmost one. Leftmost one kicks in when the gpu temp reaches 60C (which takes some time) and the middle fan follows it if the overall temp tends to rise.
> 
> The design idea is not bad at heart, but if the vrm2 sensor is the memory side mosfet, this design struggles to keep it cool.


Is there a way to have all fans kick in at the same time?


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Usually either increasing the AUX voltage (+25 to +50mV) or the core voltage. Basically, let's say you need +50mV core voltage for a 1600MHz memory clock. If you increase the AUX voltage by +25 to +50mV instead (no need to go higher than that, as far as I've found), then you can lower your core voltage. I just want to point out that if your card's memory overclock doesn't respond to an AUX voltage increase, this trick doesn't work.
> 
> 
> 
> To be honest, at the moment I have no idea what my memory responds to. Stability seems kinda random. Like, the other day it passed a little over an hour of Valley at 1170/1700 with +100/+50 with no problems, no artifacts, no nothing. Then, the next day I tried the exact same settings in BF4 and it crashed within 5 minutes. Now, BF4 is quite the stress test for GPUs and I run it at DSR 1440p with everything maxed out except for MSAA that's at 2x instead of 4x. But that's just ridiculous, I'd expect it to be at least semi-stable. And it mostly doesn't throw any artifacts or errors, it will just crash. Black screen, all sound becomes a buzzing noise from hell and then RSOD and, if I'm lucky, a reboot. Sometimes it will just stick there showing nothing. And it's the GPU driver that crashes according to Bluescreenviewer, so it's not something else that's causing problems. It's also not temps, because I have it set up with a very aggressive fan curve that goes to 80% at 65C and 100% at 75C, so it usually tops at 72-73C.
> 
> Maybe I'll completely give up and run it at 1625MHz and be done with it. Maybe not. Thing is, the MSI doesn't seem to scale its performance well due to power limitations. 1150MHz at +50mV and 1175MHz at +100mV are about the same performance wise and going to 1200MHz at +150mV is slower than both of those. So it seems that without a custom BIOS it's not even worth to continue pursuing these higher overclocks. Which is a shame. Unless there is a way to increase the power limits without a custom BIOS. There isn't one though, is there?
Click to expand...

Skip Valley. Use a Heaven loop & Firestrike to test stability. Also, I doubt your card hits the power limit in Valley, Heaven or even Firestrike; the power limit at +50% is over 300W anyway.

Yeah, I think you hit the wall there for the memory. Regarding the power limit, if we can figure out how MSI AB/Trixx sets the power limit, we may be able to increase it by more than 50% without a BIOS mod. In theory.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> Skip Valley. Use a Heaven loop & Firestrike to test stability. Also, I doubt your card hits the power limit in Valley, Heaven or even Firestrike; the power limit at +50% is over 300W anyway.
>
> Yeah, I think you hit the wall there for the memory. Regarding the power limit, if we can figure out how MSI AB/Trixx sets the power limit, we may be able to increase it by more than 50% without a BIOS mod. In theory.


Like you, I don't think I should be hitting any power limits, but I can't find any other logical explanation as to why performance doesn't scale. It's not thermals, that's for sure. Perhaps it's only semi-stable and it causes a performance drop instead of obvious artifacting? That could be a possibility, but I doubt that. It seems way more logical that it throttles due to power limitations. And seeing that some Sapphire cards have no such problems with their dual 8-pin power delivery, I lean towards this as an explanation even more. I could be wrong, but I can't think of anything else at this moment.

Other than that, I will start using Heaven and see where that gets me. Also, I suppose you mean we could set a higher power limit with command prompts or something?

PS : Regarding replacing the TIM on the MSI cards, any suggestions? I have some Arctic MX-4 lying around and was thinking of using that if there isn't a serious reason to buy something else.


----------



## SoccerNinja

I just bought a 390 for my first system. What's a good overclocking guide?


----------



## ziggystardust

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is there a way to have all fans kick in at the same time?


Yeah, using a custom fan curve. I made one before but there wasn't a big difference. I set a maximum of 45% fan speed though. I'm trying to make my case as silent as possible. I guess I will have to change thermal pads at some point.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Skip Valley. Use Heaven loop & Firestrike to test stability. Also I doubt your card hit power limit with Valley, Heaven or even Firestrike. Power limit with +50% is over 300W anyway.
> 
> Yeah, I think you hit the wall there for the memory. Regarding the power limit, if we can figure out how MSI AB/Trixx set power limit, we may be able to increase it more than 50% without BIOS mod. In theory.
> 
> 
> 
> Like you, I don't think I should be hitting any power limits, but I can't find any other logical explanation as to why performance doesn't scale. It's not thermals, that's for sure. Perhaps it's only semi-stable and it causes a performance drop instead of obvious artifacting? That could be a possibility, but I doubt that. It seems way more logical that it throttles due to power limitations. And seeing that some Sapphire cards have no such problems with their dual 8-pin power delivery, I lean towards this as an explanation even more. I could be wrong, but I can't think of anything else at this moment.
> 
> *Other than that, I will start using Heaven and see where that gets me. Also, I suppose you mean we could set a higher power limit with command prompts or something?*
> 
> PS : Regarding replacing the TIM on the MSI cards, any suggestions? I have some Arctic MX-4 lying around and was thinking of using that if there isn't a serious reason to buy something else.
Click to expand...

In theory, yes, like using the command prompt; similar to what we need to do to get an offset voltage higher than +100mV in MSI AB. It's only a theory though, because when I tried comparing i2c dumps between +0% & +50% PL, I couldn't find anything useful.

MX-4 should be OK, but if you don't have a temp issue, better not to. If it ain't broke, don't fix it.
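To illustrate the comparison idea, a rough sketch of diffing two dumps follows. The register addresses and values below are made up purely for illustration; real dumps would come from the i2c dump feature of whatever tool you use.

```python
# Hypothetical sketch: diff two i2c register dumps captured at +0% and
# +50% power limit to spot which register moved. The dump contents here
# are invented placeholders, not real Hawaii register data.
import difflib

dump_pl0 = """0x20: 4B
0x21: 00
0x22: FF"""
dump_pl50 = """0x20: 4B
0x21: 32
0x22: FF"""

# Keep only the added/removed lines from a unified diff
changed = [
    line for line in difflib.unified_diff(
        dump_pl0.splitlines(), dump_pl50.splitlines(), lineterm=""
    )
    if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
]
print(changed)  # the register lines that differ between the two captures
```

If some register consistently tracked the power-limit slider across several settings, that would be the candidate to poke. As said above, no luck finding one so far.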


----------



## JerDerv

Hi guys, just bought a gigabyte r9 390x to replace my crossfire 380s.


----------



## jdorje

Heaven loop? Is that a looping benchmark, or do you mean just leaving the normal animation running?


----------



## kizwan

Quote:


> Originally Posted by *jdorje*
> 
> Heaven loop? Is that a looping benchmark or you mean just leave the normal animation running?


Yeah, just leave it running for 30 minutes or an hour.


----------



## jdorje

I wish there were a way to loop the benchmark and run it ~10 times.


----------



## OneB1t

If you leave it running in normal mode it's the same load as the benchmark, just select a good scene.


----------



## tolis626

I think that if you make a custom run of FireStrike you can also have it run in a loop. That should be stressful enough. But other than that, if you just open Valley or Heaven (they behave the same, really) and let it sit there, it's a stress test alright. My only complaint is that if there's artifacting but it doesn't downright crash, you have to be there to see it, and sitting for an hour in front of your screen doing nothing is boring on top of stupid. If only these tests had built-in error detection, life would be so much easier.
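At minimum you can script the looping and catch hard crashes (a failed process exit), even if visual artifacts still need eyeballs. A sketch only: the benchmark command below is a stub so the script runs as-is; you'd swap in the actual command line for your Heaven/3DMark install.

```python
# Hypothetical sketch: run a benchmark N times and stop on the first run
# that exits with an error (e.g. after a driver crash). BENCH_CMD is a
# placeholder stub; substitute your real benchmark launcher.
import subprocess
import sys
import time

BENCH_CMD = [sys.executable, "-c", "print('simulated benchmark pass')"]  # placeholder
RUNS = 10

for i in range(1, RUNS + 1):
    print(f"Run {i}/{RUNS} starting at {time.strftime('%H:%M:%S')}")
    result = subprocess.run(BENCH_CMD)
    if result.returncode != 0:
        # A non-zero exit usually means the app (or driver) fell over
        print(f"Run {i} failed; clocks are probably not stable")
        sys.exit(1)

print(f"All {RUNS} runs completed without a hard crash")
```

It won't spot subtle artifacting, but it does turn "sit and watch for an hour" into "come back and read the log".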


----------



## Dundundata

Just play some games: fun and testing at the same time.


----------



## jodybdesigns

Tried playing The Witcher 3 a bit. Terrible on my Crossfire 7950's. 1080p (I didn't even try 1440p). Textures on Ultra. Everything else on high. Hairworks off. No AA. 55fps average.

Although my scaling is 99% and GPU usage is 99%. My CPU usage is around 60%.

Can't wait until my 390 gets here...


----------



## ZealotKi11er

Quote:


> Originally Posted by *jodybdesigns*
> 
> Tried playing The Witcher 3 a bit. Terrible on my Crossfire 7950's. 1080p (I didn't even try 1440p). Textures on Ultra. Everything else on high. Hairworks off. No AA. 55fps average.
> 
> Although my scaling is 99% and GPU usage is 99%. My CPU usage is around 60%.
> 
> Can't wait until my 390 gets here...


Yeah, CFX is no good with Witcher 3. I get 40-50 fps at 4K, but from time to time the fps drops to 30 for some unknown reason. Even when GPU usage is 90% there is visible stutter. With a single GPU I get ~30 fps but it's very smooth. Problem is, 30 fps is not fluid enough for my taste.


----------



## jodybdesigns

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah, CFX is no good with Witcher 3. I get 40-50 fps at 4K, but from time to time the fps drops to 30 for some unknown reason. Even when GPU usage is 90% there is visible stutter. With a single GPU I get ~30 fps but it's very smooth. Problem is, 30 fps is not fluid enough for my taste.


Nah, no 30fps gaming for me.

Look, I am going to break some karma here. The day before yesterday I was standing in the video game aisle at Wal-Mart looking at the Xbox Ones. Then I remembered: 30fps, and usually 900p upscaled if I'm lucky.

I came home and ordered my 390...


----------



## christoph

Quote:


> Originally Posted by *ZealotKi11er*
> 
> All fans should kick in. You might have a dead fan there.


No no, it's all good. I have that video card, and the intelligent fan controller does that: one fan stops while the others are running. The temps should be fine like that.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> I'd be hard pressed to believe that was your best SD run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I sometimes wonder if the 780 can be made stronger TBH.
> 
> BTW, run SD with Tess Off and you'll be amazed how good your card is.


This makes me wonder if you are benching and doing those amazing scores with tess off


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> This makes me wonder if you are benching and doing those amazing scores with tess off


Tess Off can be seen in the results. There have been reports of 3DMark not being able to detect tess tweaks down to 2X, but I've had no luck with them.

But since you are asking, here is Tess OFF.







Good for 2nd spot on the bot. Well, that is just due to the top dog using a better CPU.

http://www.3dmark.com/sd/3763946


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Tess Off can be seen in the results. There have been reports of 3DMark not being able to detect tess tweaks down to 2X, but I've had no luck with them.


How? When I turn it off in the drivers, 3DMark says "valid result".


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> How? When I turn it off in the drivers, 3DMark says "valid result".


Lucky you.

But since you doubt my results,







here you go. A very easy one.

http://www.3dmark.com/fs/6643800

And a hard one.
http://www.3dmark.com/3dm11/10588407


----------



## RWGTROLL

I have a question: what is the max safe temp for VRM2 on my R9 390X Devil? I'm sitting at 65C max right now and I want to know how far I can push my AUX voltage.


----------



## Stige

There is very little need to touch AUX voltage on these cards.


----------



## mus1mus

Agree.

Best to keep the VRMs under 80C at all times. It can be argued that they can go past 90C, but things get wonky past 70C on high clocks in my experience.


----------



## Stige

They can do 100C+ but it does create artifacts at high temps, the lower the better obviously.


----------



## TsukikoChan

Quick wee question, one I didn't think much of for ages. I use a custom fan curve and don't mind having the fan at 15-25% when I'm not gaming, as I don't hear it (I wear big padded headphones).
The thing is, I've heard and read that these cards (I have the 390X Sapphire Tri-X) are meant to stay at 0rpm until the temps rise and require cooling. Since I've had this card, I've never seen it go to 0rpm; it never went below 15-20%, even when trying to set it manually via Crimson, fan curve, static, etc. Any thoughts? I don't think it's the temps, as sometimes when my room is super cold and I turn the PC on, I see temps start at 30-40 and only start to rise if I game or my room heats up... hmm...


----------



## mus1mus

It has to be Zero Core that you mean. Did you try enabling ULPS?


----------



## kizwan

Quote:


> Originally Posted by *RWGTROLL*
> 
> I have a question: what is the max safe temp for VRM2 on my R9 390X Devil? I'm sitting at 65C max right now and I want to know how far I can push my AUX voltage.


Even if your memory overclock requires bumping the AUX voltage, the most you need is between +25mV and +50mV. Bumping it more usually doesn't help the memory overclock go higher.


----------



## roflcopter1654

Just built my system with an MSI r9 390...will post proof and whatnot when i get home.


----------



## christoph

Quote:


> Originally Posted by *mus1mus*
> 
> Tess Off can be seen in the results. There have been reports of 3DMark not being able to detect tess tweaks down to 2X, but I've had no luck with them.
> 
> But since you are asking, here is Tess OFF.
> 
> 
> 
> 
> 
> 
> 
> Good for 2nd spot on the bot. Well,that is just due to the top dog using a better CPU.
> 
> http://www.3dmark.com/sd/3763946


That OC is under water? What's the voltage for the core?

Quote:


> Originally Posted by *Stige*
> 
> They can do 100C+ but it does create artifacts at high temps, the lower the better obviously.


Yes, they can go over 100C, but it's much better to keep them under 80, because the card will last much, much longer.

I had an old HD 4890 that I kept under 80 at all times with an OC and it didn't die; I used that card until my 390 arrived less than a month ago.


----------



## Stige

Temps won't break your card. 80-90C is completely fine.


----------



## mus1mus

The one I have is on the link.

Water. 1.36ish at idle and 1.344 at load. 80+ ASIC for the card.


----------



## OneB1t

Do you use LLC, or did you change the VRM frequency? That's a pretty low vdroop.


----------



## viking21

edit.


----------



## Mysticking32

Quick question. What do y'all run your cards on? Like what kind of an overclock? And how are the voltages?

Do you keep the overclock on daily at max or only when you play a game? Same question for voltages? 1150 with 100 mv? 1130 with 50? etc


----------



## Darkeylel

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.1.1-Hotfix-Release-Notes.aspx

Was this driver released yesterday or was it released last month ?


----------



## roflcopter1654

Everything is stock right now. Haven't really pushed it. Idles at about 30 C with a custom fan curve, hit about 53 in Dota 2 on max everything with vsync. It's in an NZXT s340 with a couple extra 140mm intake fans.


----------



## diggiddi

Quote:


> Originally Posted by *Darkeylel*
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.1.1-Hotfix-Release-Notes.aspx
> 
> Was this driver released yesterday or was it released *last month* ?


Bolded


----------



## Spartoi

Quote:


> Originally Posted by *Darkeylel*
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.1.1-Hotfix-Release-Notes.aspx
> 
> Was this driver released yesterday or was it released last month ?


16.1.1 was released yesterday.


----------



## Stige

Quote:


> Originally Posted by *Mysticking32*
> 
> Quick question. What do y'all run your cards on? Like what kind of an overclock? And how are the voltages?
> 
> Do you keep the overclock on daily at max or only when you play a game? Same question for voltages? 1150 with 100 mv? 1130 with 50? etc


+175mV, 1225/1600 24/7.


----------



## Mysticking32

Quote:


> Originally Posted by *Stige*
> 
> +175mV, 1225/1600 24/7.


wait are you trolling or serious? if you're serious then holy crap nice lol


----------



## OneB1t

Why troll? Some cards can do 1250 or even more, stable.


----------



## Stige

I can play at 1250 for days, but I get a small flicker here and there, so I didn't think it was good enough and dropped the clocks to 1225.


----------



## Mysticking32

Oh, I believe the clocks. I'm just talking about running the voltage that high 24/7. That's epic lol


----------



## OneB1t

+175mV is not that much


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Tess Off can be seen in the results. There have been reports of 3DMark not being able to detect tess tweaks down to 2X, but I've had no luck with them.
> 
> But since you are asking, here is Tess OFF.
> 
> 
> 
> 
> 
> 
> 
> Good for 2nd spot on the bot. Well,that is just due to the top dog using a better CPU.
> 
> http://www.3dmark.com/sd/3763946
> 
> 
> Spoiler: Warning: Spoiler!


It can't be seen if it's set up in the drivers. Look:
normal default driver settings, 1030/1250MHz all stock, vs changed driver settings, 1030/1250MHz all stock.
Newest drivers, 16.1.1

http://www.3dmark.com/compare/fs/7418510/fs/7418566

http://www.3dmark.com/fs/7418566
http://www.3dmark.com/fs/7418510



All it says is "graphics driver not approved", as this is a beta driver from today. It says it for both results, however one of them has custom driver settings.
This gives me a 12% Crossfire boost in graphics score at stock (just for curiosity)...








Settings that can be adjusted: Tessellation OFF, Texture filtering quality-Performance, Frame pacing-OFF.

With that said, here is my 290X single, 1250/1750MHz with 1500 timings and with "non-beta" driver.
16.6k graphics score...








You can see all three scores... and all three say "valid result"!











Also, Crimson 16.1.1 vs 16.1: no boost in 3DMark.








http://www.3dmark.com/compare/fs/7418510/fs/7418456#


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> It has to be Zero Core that you mean. Did you try enabling ULPS?


Ah, I disabled ULPS to try to help with some of the stability issues I've been having with my 390X. So you think because of this I will never get the 0rpm mode, regardless of temperature?


----------



## TsukikoChan

Quote:


> Originally Posted by *fat4l*
> 
> Also, Crimson 16.1.1 vs 16.1. No boost in 3d mark
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/compare/fs/7418510/fs/7418456#


I would highly doubt it would :-D It looks like it's primarily bug fixes and a new profile for Tomb Raider.


----------



## RWGTROLL

Quote:


> Originally Posted by *mus1mus*
> 
> Agree.
> 
> Best to keep the VRMs under 80C at all times. It can be argued that they can go past 90C. But things get wonky past 70C on high clocks in my experience.


Thank you for the info, helps a lot.


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> It can't be seen if it's set up in the drivers. Look.
> Normal default driver settings, 1030/1250MHz all stock vs changed driver settings, 1030/1250MHz all stock
> *Newest drivers 16.1.1*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.3dmark.com/compare/fs/7418510/fs/7418566
> 
> http://www.3dmark.com/fs/7418566
> http://www.3dmark.com/fs/7418510
> 
> 
> 
> All it says is "graphics driver not approved", as this is a beta driver from today. It says that for both results, however one of them has custom driver settings.
> This gives me a 12% Crossfire boost in graphics score at stock (just for curiosity)...
> 
> 
> 
> 
> 
> 
> 
> 
> Settings that can be adjusted: Tessellation OFF, Texture filtering quality-Performance, Frame pacing-OFF.
> 
> With that said, here is my 290X single, 1250/1750MHz with 1500 timings and with "non-beta" driver.
> 16.6k graphics score...
> 
> 
> 
> 
> 
> 
> 
> 
> You can see all three scores....and all three show "valid result"!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, Crimson 16.1.1 vs 16.1. No boost in 3d mark
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/compare/fs/7418510/fs/7418456#


Maybe the Drivers.

But you do notice my scores are all pre-Crimson right?


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> It can't be seen if it's set up in the drivers. Look.
> Normal default driver settings, 1030/1250MHz all stock vs changed driver settings, 1030/1250MHz all stock
> *Newest drivers 16.1.1*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.3dmark.com/compare/fs/7418510/fs/7418566
> 
> http://www.3dmark.com/fs/7418566
> http://www.3dmark.com/fs/7418510
> 
> 
> 
> All it says is "graphics driver not approved", as this is a beta driver from today. It says that for both results, however one of them has custom driver settings.
> This gives me a 12% Crossfire boost in graphics score at stock (just for curiosity)...
> 
> 
> 
> 
> 
> 
> 
> 
> Settings that can be adjusted: Tessellation OFF, Texture filtering quality-Performance, Frame pacing-OFF.
> 
> With that said, here is my 290X single, 1250/1750MHz with 1500 timings and with "non-beta" driver.
> 16.6k graphics score...
> 
> 
> 
> 
> 
> 
> 
> 
> You can see all three scores....and all three show "valid result"!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, Crimson 16.1.1 vs 16.1. No boost in 3d mark
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/compare/fs/7418510/fs/7418456#
> 
> 
> 
> 
> 
> 
> Maybe the Drivers.
> 
> But you do notice my scores are all pre-Crimson right?
Click to expand...

@fyzzz said he used Crimson and the tessellation tweak was detected. Futuremark system info went wonky again, I guess. This isn't the first time. I don't see an issue with the legitimacy of the scores though, since many competition sites, including HWBOT, allow tessellation tweaking.


----------



## ziggystardust

Quote:


> Originally Posted by *TsukikoChan*
> 
> Quick wee question, one I didn't think much of for ages. I use a custom fan curve and don't mind having the fan at 15-25% when I'm not gaming, as I don't hear it (I wear big padded headphones).
> The thing is, I've heard and read that these cards (I have the Sapphire Tri-X 390x) are meant to stay at 0rpm until the temps require the fans to spin up. Since I've had this card, I've never seen it go to 0rpm; it never went below 15-20%, even when trying to set it manually via Crimson, fan curve, static, etc. Any thoughts? I don't think it's the temps, as sometimes when my room is super cold and I turn it on, I see temps start at 30-40 and only start to rise if I game or my room heats up.. hmm...


Did you check if the fans are actually spinning? There's some kind of bug in fan speed reporting where it shows 20% even when the fans are not spinning.
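For anyone mapping out their own curve, here's a minimal sketch of how a fan curve with a zero-RPM zone behaves. The breakpoints are made-up examples for illustration, not any vendor's actual firmware table:

```python
# Illustrative fan curve with a zero-RPM zone below a temperature threshold.
# Breakpoints are hypothetical, not any card's real firmware values.
FAN_CURVE = [(50, 0), (60, 25), (75, 55), (90, 100)]  # (temp C, fan %)

def fan_percent(temp_c, curve=FAN_CURVE):
    """Linearly interpolate fan % between breakpoints; 0% below the first."""
    if temp_c <= curve[0][0]:
        return 0  # zero-RPM zone: fans fully stopped
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # clamp at 100% above the last breakpoint
```

With a curve like this, a card idling in the 30-40C range should sit at 0% fan, which is why a constant 15-20% reading suggests either a reporting bug or the zero-RPM mode not engaging.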


----------



## Dundundata

Quote:


> Originally Posted by *jodybdesigns*
> 
> Nah, no 30fps gaming for me.
> 
> Look I am going to break a karma here. Day before yesterday I was standing in the video game isle at Wal-Mart looking at the Xbox Ones. Then I remembered 30fps and usually 900p upscaled if I am lucky.
> 
> I came home and ordered my 390...


Good choice, my PS4 has been collecting dust since I built a PC.


----------



## bbrotha

Hey guys, I'm having some temperature issues with my card and hopefully you can help me.
I usually play FO4 these days and recently installed MSI Afterburner with the RivaTuner OSD monitoring. Once I've been playing for 20 minutes the card gets really hot, pushing 85C, and it has even reached 90C with the fans at 100%.
I'm currently running it in a $40ish case with 4 120mm fans, and the case has an acrylic window. I have to open it in order to drop the temps to around 78C. Do you think something is wrong with my card, or could it be the case? A friend of mine has the same setup except for the case and he's getting below 72C at load.


----------



## Stige

Bad airflow, bad case fans.
Or just a bad case really, cheap ones usually don't have very good airflow.


----------



## JreyE30

Finally received my replacement, the MSI R9 390. I'm comparing both the ASUS and MSI versions (that will be a separate post), but I have to say I'm sold on the MSI card. The ASUS card was just too hot, running an average of 85C in Firestrike, where the MSI measures around the low to mid 70s... again, this will be a separate post.

On to sharing some of my OC results with the MSI card. If I'm doing this right, hopefully it will land me on the owners list.

MSI R9 390

Core: 1150mhz
Memory: 1700mhz
Core voltage: +50mV

Ran Firestrike and it just RARELY scratches 80C; however, the fan sits around 60-70% during benchmarking. That doesn't really bother me much. I'll be doing more OCing in the future, but for now I think I'm settled with this.


----------



## TsukikoChan

Quote:


> Originally Posted by *ziggystardust*
> 
> Did you check if the fans are actually spinning? There's some kind of bug in fan speed reporting where it shows 20% even when the fans are not spinning.


I'll have to open the case and check this sometime soon  I always go by the numbers Trixx gives me ^_^

Can't say I've noticed any difference using 16.1.1 Crimson yet. I have come across an interesting bug, though, in that my FF13-2 profile settings in Crimson never stick over a restart; I have to set them up again each time I want to play the game :< hmm.. (I know, I know, Crimson bug, not particular to the 390x, just griping here haha)

Also, ordering myself a new PSU hopefully by the end of this week. Amazon UK is out of stock of the EVGA SuperNOVA 750W G2, so hopefully that comes back into stock soon. That should let me OC my CPU again and should give me a more stable/overclockable experience with the 390x :3


----------



## ziggystardust

Quote:


> Originally Posted by *TsukikoChan*
> 
> I'll have to open the case and try this then sometime soon  i always go by the numbers trixx gives me back ^_^
> 
> can't say i've noticed any difference yet using 16.1.1 crimson yet, i have come across an interesting bug though in that my ff13-2 profile settings in crimson never stick over a restart, have to set them up again each time i want to play the game :< hmm.. (i know i know, crimson bug, not particular to 390x, just griping here haha)
> 
> also, ordering myself a new psu hopefully end of this week. amazon-uk is out of stock of the evga supernova 750w-g2, so hopefully that comes back into stock soon. that should allow me to oc my cpu again and should give me a more stable/overclockable experience with the 390x :3


Yeah, you will see the fans are not spinning.









I have the 750W SuperNOVA G2. Great PSU. If that doesn't come back into stock you can get the 850W one.

16.1.1 boosted my Rise of the Tomb Raider performance noticeably by the way. Especially the min. fps. It's much more consistent now.


----------



## coffeeplus

Hello! Can any of you please help me and comment on the thread I opened, about AMD cards - related to R9 390 as well - and drivers and their current state?
I have a thread here and I'm looking to centralize opinions, facts and tips about AMD from a stability perspective: http://www.overclock.net/t/1590013/what-is-the-current-state-of-amd-cards-and-drivers-from-a-stability-perspective


----------



## Agent Smith1984

Just an update, all members are caught up to page 555, I will be catching up throughout the rest of the week.
I have had a hectic past couple of weeks, but am settled back in the swing of things now, so will be getting a little more involved here again.
Sorry for the delays in getting this updated recently.

Thanks and GO PANTHERS!!!!!


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just an update, all members are caught up to page 555, I will be catching up throughout the rest of the week.
> I have had a hectic past couple of weeks, but am settled back in the swing of things now, so will be getting a little more involved here again.
> Sorry for the delays in getting this updated recently.
> 
> Thanks and GO PANTHERS!!!!!


I still need to get my submission in. Just can't find the darn time.









Great thread BTW... I'll get it in soon. Got a test to study for now though.


----------



## xutnubu

Is the Gigabyte G1 model, for both the 390 / 390X bad in terms of cooling?


----------



## Intel CPU

Quote:


> Originally Posted by *Dundundata*
> 
> Could it have something to do with the PSU, I have a Corsair as well.


Let me share my Corsair HX Platinum experience. It's a piece of crap. It gives horrible whine with my Palit GTX 970. The whine comes from the Corsair like a freaking bomb or drill about to explode, and the Palit GTX 970 was whining as well. I switched to a top-rated Cooler Master V1000W Gold and the whine from the PSU disappeared. So it's the Corsair HX series screwing up. But the GTX 970 still whines. I finally shut the damn Nvidia whine up by getting a Sapphire Nitro R9 390. Currently the best of the best for a no-whine, cool, and ultra-quiet AMD card.


----------



## Intel CPU

Quote:


> Originally Posted by *xutnubu*
> 
> Is the Gigabyte G1 model, for both the 390 / 390X bad in terms of cooling?


Newegg reviews show pretty cool temps, but the fans might be louder. Sapphire currently has the industry's best triple-fan cooling for the R9 390/390X. Damn quiet AMD card. I owned one.


----------



## yuannan

Is it weird that my mem clock is at max ALL the time?

Makes my idle temps go up to 50-55C with the clock at 1600.


----------



## xutnubu

Quote:


> Originally Posted by *Intel CPU*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Is the Gigabyte G1 model, for both the 390 / 390X bad in terms of cooling?
> 
> 
> 
> Newegg reviews show pretty cool temps, but the fans might be louder. Sapphire currently has the industry's best triple-fan cooling.
Click to expand...

Yeah, I've read a couple where they say it gets hot as well; sadly, no major hardware reviewers have given the G1 a chance.

I have a dilemma now; these G1s are the only 390s I can get in my country. So I'm torn between that and a 970 (3.5GB), and I already sold my 280x.

Not sure what to do. I've read the 390 G1 is also voltage locked; that's a bummer.


----------



## OneB1t

Stay away from a 390 without voltage regulation.







The reason they blocked overvolting is so they can use cheaper parts for the VRM... (also no dual BIOS)
If you can, go for a 290X; it will be cheaper and faster.


----------



## Intel CPU

Quote:


> Originally Posted by *OneB1t*
> 
> stay away from 390 without voltage regulation
> 
> 
> 
> 
> 
> 
> 
> reason why they blocked overvolt is that they can use cheaper parts for VRM... (also no dual bios)
> if you can go for 290X it will be cheaper and faster


How do you monitor VRM temps during gameplay? Please advise.


----------



## OneB1t

You can use HWiNFO64 + RivaTuner Statistics Server (part of MSI Afterburner)
to display VRM temps on screen while playing.


----------



## xutnubu

Quote:


> Originally Posted by *OneB1t*
> 
> stay away from 390 without voltage regulation
> 
> 
> 
> 
> 
> 
> 
> reason why they blocked overvolt is that they can use cheaper parts for VRM... (also no dual bios)
> *if you can go for 290X* it will be cheaper and faster


Sadly I can't, used market is limited here, and expensive.


----------



## ZealotKi11er

Quote:


> Originally Posted by *xutnubu*
> 
> Sadly I can't, used market is limited here, and expensive.


What do you mean expensive? I know people are a bit cheap when buying used stuff. For example, here in Canada, trying to sell an R9 290 + water block for $300 is too much to ask, considering a new 390 costs $440 + 15% tax minimum. Is that fair?


----------



## Intel CPU

Quote:


> Originally Posted by *OneB1t*
> 
> you can use hwinfo64 + rivatuner statistic server (part of msi afterburner)
> to write VRM temps on screen while playing


Btw, initially I increased the power limit to +50%, then I successfully OC'd to hit near R9 390X levels for GPU and mem clocks, texel fillrates etc. Next I finally overvolted by +56mV to hit MSI and Sapphire R9 390X levels and got Firestrike, Valley and Heaven stable. Am I doing the right thing? Which one will increase my power consumption more, the power limiter or the overvolt?


----------



## OneB1t

Both a power limit increase and an overvolt increase consumption.
The power limit only increases consumption if the card was being held back by hitting the limit.
If you have a 290X, it's pretty easy to reach over 390X performance; you just need a good BIOS.
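As a rough rule of thumb (a sketch, not a measurement; real draw also depends on load and leakage), dynamic power scales with clock times voltage squared, which is why an overvolt raises consumption faster than the clock bump alone:

```python
# Dynamic power scaling sketch: P_dyn ~ f * V^2.
# The 1.200 V baseline and the clocks below are illustrative assumptions,
# not measured values from any particular card.
def relative_power(f_new, f_old, v_new, v_old):
    """Ratio of new to old dynamic power after a clock/voltage change."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# e.g. 1000 -> 1100 MHz with a +56 mV overvolt on an assumed 1.200 V stock
print(round(relative_power(1100, 1000, 1.256, 1.200), 2))  # ~1.21
```

So a 10% clock bump plus +56mV works out to roughly 20% more dynamic power under this simplification, which is why the wall meter jumps more than the clock increase alone would suggest.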


----------



## Intel CPU

Quote:


> Originally Posted by *OneB1t*
> 
> both power limit increase and overvolt is increasing consumption
> if you have 290X its pretty easy to reach over 390X performance you just need good bios


How do I measure power consumption for my overclocked Sapphire Nitro R9 390 vs. an MSI or ASUS R9 390X? Would be interesting to know.


----------



## OneB1t

GPU-Z shows power consumption, but it's not very precise.








You need a wall power meter for a good comparison.


----------



## xutnubu

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Sadly I can't, used market is limited here, and expensive.
> 
> 
> 
> What do you mean expensive? I know people are a bit cheap when buy used stuff. For example here in Canada trying to sell R9 290 + Water Block for $300 is too much to ask considering a new 390 cost $440 + 15% Tax min. Is that fair?
Click to expand...

Cheapest 290x I've found is $340, XFX, basically the same price as a new R9 390.


----------



## ziggystardust

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just an update, all members are caught up to page 555, I will be catching up throughout the rest of the week.
> I have had a hectic past couple of weeks, but am settled back in the swing of things now, so will be getting a little more involved here again.
> Sorry for the delays in getting this updated recently.
> 
> Thanks and GO PANTHERS!!!!!


I've been around for a while but forgot to post my stuff. Currently haven't tried any oc yet but will start trying soon.


----------



## OneB1t

Quote:


> Originally Posted by *xutnubu*
> 
> Cheapest 290x I've found is $340, XFX, basically the same price as a new R9 390.


then go for it







it's a better card; you don't really need 8GB of memory for anything


----------



## Spartoi

Quote:


> Originally Posted by *OneB1t*
> 
> then go for it
> 
> 
> 
> 
> 
> 
> 
> its better card you dont really need 8gb memory for anything


In what way is a 290x a better card than a 390x?


----------



## xutnubu

Quote:


> Originally Posted by *OneB1t*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Cheapest 290x I've found is $340, XFX, basically the same price as a new R9 390.
> 
> 
> 
> then go for it
> 
> 
> 
> 
> 
> 
> 
> its better card you dont really need 8gb memory for anything
Click to expand...

Mmm, why? It's basically the same price, with double the VRAM and a local warranty.


----------



## OneB1t

Quote:


> Originally Posted by *Spartoi*
> 
> In what way is a 290x a better card than a 390x?


cheaper?









it's better than the 390 for sure, and the same card as the 390X, only without the 8GB of memory

The 290X is 10% faster clock for clock than the 390 (at the same frequency),
the same speed as a 390X
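That clock-for-clock figure lines up with the stream-processor counts, assuming a shader-bound workload scales linearly (a simplification; memory-bound games won't show the full gap):

```python
# Hawaii/Grenada stream processors: 2816 on the 290X/390X vs 2560 on the
# 290/390. Assumes perfectly shader-bound scaling, which is a simplification.
print(2816 / 2560)  # 1.1 -> ~10% faster at the same clock
```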


----------



## Spartoi

Quote:


> Originally Posted by *OneB1t*
> 
> cheaper?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> its better than 390 for sure and same card as 390X only without 8gb memory
> 
> 290X is 10% faster clock for clock than 390 (on same frequency)
> same speed as 390X


Oh, I thought you were comparing the 290x to the 390x. I'd still choose a new 390 over a used 290x though: 8GB VRAM + warranty.


----------



## OneB1t

Where I live you can still buy the 290X new.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Spartoi*
> 
> Oh, thought you were comparing the 290x to the 390x. I'd still choose a new 390 over a used 290x though. 8GB RAM + Warranty.


Same here. Problem is, a new 390 where I live is basically $200 more than a used 290X. It just doesn't make sense.


----------



## jodybdesigns

Well, the 7950s are gone. The 390 was such a worthy upgrade. Witcher 3, 1440p, no Hairworks, no AA, Ultra/High mix with HBAO+: 60fps for the most part with rare drops. MUCH smoother gameplay than my 7950 Crossfire setup at all-High 1080p.

Also tried a mild overclock. Stable at 1105/1550 +45mV. Also set the fan speed to 60% while I'm gaming.

After about 20 minutes the GPU was at 67C, VRM1 was at 75C, and VRM2 was at 68C. Beautiful. 60% fan speed on the PowerColor is nowhere near as loud as 60% on the 7950s. Also, during the "mute" idle with 0% fan speed, the card idles at 57C, a little high for my taste, so I set it to 20% fan speed and it is idling at 45C. You can't even hear it.

Awesome card IMO. So add me to the list!


Spoiler: Warning: Spoiler!


----------



## xutnubu

Quote:


> Originally Posted by *Spartoi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *OneB1t*
> 
> then go for it
> 
> 
> 
> 
> 
> 
> 
> its better card you dont really need 8gb memory for anything
> 
> 
> 
> In what way is a 290x a better card than a 390x?
Click to expand...


Quote:


> Originally Posted by *jodybdesigns*
> 
> Well the 7950's are gone. The 390 was such a worthy upgrade. Witcher 3, 1440p, No Hairworks, No AA, Ultra/High mix with HBAO+. 60fps for the most part with rare drops. MUCH smoother gameplay than my 7950 Crossfire setup at all High 1080p.
> 
> Also tried a mild overclock. Stable at 1105/1550 +45mv. Also set the fan speed at 60% while I am gaming.
> 
> After about 20 minutes the GPU was at 67C, VRM1 was at 75C, and VRM2 was at 68C. Beautiful. 60% fan speed on the PowerColor is nowhere near as loud as 60% on the 7950s. Also, during the "mute" idle with 0% fan speed, the card idles at 57C, a little high for my taste, so I set it to 20% fan speed and it is idling at 45C. You can't even hear it.
> 
> Awesome card IMO. So add me to the list!


Really nice. I was actually on my way to the store to get that model and I stumbled upon this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131686&cm_re=powercolor_r9_390-_-14-131-686-_-Product

They assured me it was the PCS+ (right...)

It looks like an engineering sample, and I bet the dual-fan cooler doesn't cut it, so I just said nope.

And that's how I'm in between a 970 or the 390 G1.


----------



## jodybdesigns

Okay, so after playing for a while I also started to notice some throttling going on. I tracked it down, like always, to PowerPlay. It seems when I have Vsync on and it's capping the FPS to 60, I get stutter in about every game. Now if I disable Vsync the throttling goes completely away, but I get some tearing.

Is there any way to disable or modify PowerPlay yet?

Looks like it's time to invest in a FreeSync or 144Hz monitor...


----------



## roflcopter1654

Kinda weird, but all of a sudden my temps are much lower than the first day or so I was running my MSI 390. Like, 7-8 degrees lower in the same game with the same settings and ambient temps.


----------



## fat4l

So this is what I'm experiencing:
Flickering with Crossfire + FreeSync enabled.
This is with the 16.1.1 drivers.
For example, when I run World of Warcraft, alt-tab into Windows, then run Skype, the desktop starts flickering, especially when I try to move the Skype window.

The usage and frequency fluctuate in games, mostly older ones (DX9, DX10)...
Here are two vids showing the fluctuations, both clock and usage.
#1 World of Warcraft 3.3.5 - 



#2 Crysis 1 - 




System specs:
Intel i7 4790K @5.1G
Asus M7 Hero
Corsair Dominator Platinum 4*4GB 2666CL10
Asus Ares III (2*290X)
Superflower Leadex 1200W Platinum
BenQ XL2730Z @1440p/144Hz

AMD Drivers: 16.1.1 Crimson, Windows 10 Pro 64 bit.


----------



## Synntx

Quote:


> Originally Posted by *jodybdesigns*
> 
> Okay so after playing for a while I also started to notice some throttle going on. I tracked it down, like always, to Powerplay. Seems when I have Vsync on, and its capping the FPS to 60, I am getting stutter in about every game. Now if I disable Vsync the throttle goes completely away, but I have some tears.
> 
> Is there any way to disable or modify the Powerplay, or have they yet?
> 
> Looks like its time to *invest in a Freesync or 144hz monitor*...


Fixed that for you.

Seriously, I run 2x 27" monitors and only use one for gameplay. Recently picked up an ASUS MG278Q. LOVE IT!

On another note, my VRM1 temps are maxing out at 105-106C. I've reseated all the thermal pads on the GPU, but no improvement. Guess it's time to order some Fujipoly.....or back down my clocks...........nah...

@Agent Smith1984 Could you update my information on the first page? I don't have the MSI 390x anymore; now it's a Sapphire 390x NITRO. Max clock achieved was 1210/1750, stable clocks at 1180/1750.


----------



## navjack27

Are people still ignoring ClockBlocker? If you're throttling from PowerPlay while vsyncing, use ClockBlocker with the ADL method.


----------



## fat4l

Quote:


> Originally Posted by *navjack27*
> 
> are still people ignoring clockblocker? if ur throttling and powerplaying while vsyncing then use clockblocker with ADL method


If that's for me, then yes, I know about it. However, that doesn't fix the usage fluctuations...


----------



## Mysticking32

I'm kind of stuck at a current overclock of 1110 at +31mV and memory at 1700 with +75mV. If I try to go to 1120 with anything short of +100 I'll get artifacts. Normally I wouldn't mind this, but the temps reach the high 80s. Talking about the 89-90 range here.

Any ideas on how to keep the card cooler? My case has good airflow btw. It's just that the cooling on this particular card sucks lol.

MSI R9 390X


----------



## OneB1t

Clock fluctuations with Vsync are OK; dunno why fight them? It's a way to save power when full card speed is not needed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *OneB1t*
> 
> fluctuations of clock with vsync are ok dunno why fight them? its way to save power when full card speed is not needed


I think GPU usage is for that, or has been for me in 95% of games.


----------



## OneB1t

The card downclocks when:
1. hitting the power limit
2. hitting the temperature limit
3. CPU bottlenecked
4. Vsync / framerate limit

It's not that it's a bad thing; I don't think the stutter will go away if you lock the card at maximum frequency.
If a game stutters with Vsync, it's probably a badly implemented Vsync, not a card downclock issue.

It's easy to get frametimes from Fraps and compare whether the downclocking is really that big a problem.

The card also doesn't just downclock, it undervolts too.
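For the frametimes comparison, a few lines of scripting can flag the spikes. This is a sketch only; the two-column layout (frame index, cumulative time in ms) and the file name are assumptions about the Fraps frametimes CSV:

```python
# Sketch: spotting stutter in a Fraps "frametimes" log.
# Assumes a two-column CSV (frame index, cumulative time in ms) with a header;
# the path is a placeholder.
import csv

def frame_times(path):
    """Return per-frame durations (ms) from cumulative timestamps."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(r[1]) for r in rows[1:]]  # skip the header row
    return [b - a for a, b in zip(times, times[1:])]

def stutter_report(durations, spike_ms=33.3):
    """Count frames slower than spike_ms (i.e. dips below ~30 fps)."""
    spikes = sum(1 for d in durations if d > spike_ms)
    return {"frames": len(durations), "spikes": spikes,
            "worst_ms": max(durations) if durations else 0.0}
```

Running this on a capture with Vsync on and another with it off shows whether the downclocking actually produces longer worst-case frame times, rather than arguing from average fps.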


----------



## afyeung

Quote:


> Originally Posted by *Mysticking32*
> 
> I'm kind of stuck at a current overclock of 1110 at 31 mv and memory at 1700 with 75 mv. If I try to go to 1120 with anywhere except up to 100 I'll get artifacts. Normally I wouldn't mind this but the temps reach up to the high 80's. Talking about 89-90 range here.
> 
> Any ideas on how to keep the card cooler? My case has good airflow btw. It's just the heating on this particular card sucks lol.
> 
> Msi r9 390x


I have the same card. It gets really hot. I ditched the stock cooler a while ago and modded it here: https://linustechtips.com/main/topic/528585-msi-r9-390x-dual-120mm-mod/?page=2
Crossfire didn't even work since the 390x was throttling. With just the 390x and dual 120mms, I was getting 85C on the core with +100mV at 1170MHz core. The 390x is a VERY hot chip.


----------



## afyeung

Quote:


> Originally Posted by *jodybdesigns*
> 
> Okay so after playing for a while I also started to notice some throttle going on. I tracked it down, like always, to Powerplay. Seems when I have Vsync on, and its capping the FPS to 60, I am getting stutter in about every game. Now if I disable Vsync the throttle goes completely away, but I have some tears.
> 
> Is there any way to disable or modify the Powerplay, or have they yet?
> 
> Looks like its time to invest in a Freesync or 144hz monitor...


You can disable powerplay. Stuttering is odd though. V sync will cause your card to throttle anyways. Thats how it works.


----------



## fat4l

Quote:


> Originally Posted by *OneB1t*
> 
> card downclock while:
> 1. hitting power limit
> 2. hitting temperature limit
> 3. cpu bottlenecked
> 4. vsync, framerate limit
> 
> its not that its bad thing i dont think that stutter will go away if you lock card at maximum frequency
> if game stutters with vsync is probably badly implemented vsync not card downclock issue
> 
> its easy to get frametimes from fraps and compare if really downclocking is that big problem
> 
> card also not just downclock it also undervolts


I'm not hitting any of these 4....
I have freesync with NO Vsync


----------



## TsukikoChan

@Agent Smith1984 Thanks for the addition on the front page, but the clocks are wrong for my card  I get 1120/1600 on my Sapphire 390x (I use this when playing demanding games like The Evil Within), and I know I can get higher once I get my new PSU next week XD


----------



## OneB1t

Quote:


> Originally Posted by *fat4l*
> 
> I'm not hitting any of these 4....
> I have freesync with NO Vsync


Even a 5.1GHz i7 can hit a CPU limit in some games.


----------



## Mysticking32

Quote:


> Originally Posted by *afyeung*
> 
> I have the same card. It gets real hot. I ditched the stock cooler a while ago and modded it here https://linustechtips.com/main/topic/528585-msi-r9-390x-dual-120mm-mod/?page=2
> Crossfire didn't even work simce the 390x was throttling. With just the 390x and dual 120mms, I was getting 85c on the core with +100mv 1170mhz core. 390x is a VERY hot chip.


I'm hoping not to do all that lol. I think it's an error with the thermal paste application, since my card is reaching 85C at 1110 and +31mV. It reaches the high 80s at +100mV @ 1110, and at higher clocks goes to the low 90s. I've seen 91 so far. If this is normal I won't even bother. But does replacing the thermal paste void the warranty, just in case?


----------



## outofmyheadyo

So my 970 is struggling in Tomb Raider @ 2560x1440 even though it's clocked @ 1480/7600.
I have the chance to sell my 970 for the same amount I could buy an R9390X-DC2-8GD5 from ASUS; would you say the swap is worth it?
How are your 390Xs handling the new Tomb Raider, and can one expect around the same OC from the 390X as from the 390?


----------



## afyeung

Quote:


> Originally Posted by *Mysticking32*
> 
> I'm hoping not to do all that lol. I think it's an error with the thermal paste application since my card is reaching 85 at 1110 and 31 mv. It reaches the high 80's at 100mv @1110 and at higher clocks goes to the low 90's. I've seen 91 so far. If this is normal I won't even bother. But does replacing the thermal paste void the warranty just in case?


Your overclocks are actually really low but using a lot of voltage. I switched out the thermal paste and it only lowered temps by 1C. I'm going Kraken G10 with the X41 so I don't have to worry as much. These cards output a lot of heat. But yeah, try replacing the thermal paste. You do puncture the warranty sticker, but MSI is really chill as long as you return it in original condition.


----------



## afyeung

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So my 970 is struggling in tomb raider @ 2560x1440 eventho it`s clocked @ 1480/7600
> I have the chance to sell my 970 for the same ammount I could buy a R9390X-DC2-8GD5 from ASUS, would you say the swap is worth it ?
> How are your 390x-s handling the new tomb raider, and can one expect around the same OC from the 390X as the 390?


That's a pretty bad cooler. I'd keep the 970. And the 390 and 390X use basically the same chip, so they overclock within the same range. It's all silicon lottery though. You can most likely get 1150 on the core, which is a mere +50MHz from some of the custom 390Xs.


----------



## jdorje

Do not get the dc2 cooler.


----------



## outofmyheadyo

Ok, thanks for the heads up on the cooler issue.
But which would be the best cooler to go for in the case of the 390?

Sapphire Radeon R9 390 Nitro Tri-X OC 8GB GDDR5
MSI R9 390 GAMING 8G 8GB GDDR5

Are any of these any good?


----------



## Mysticking32

Quote:


> Originally Posted by *afyeung*
> 
> Your overclocks are actually really low but using a lot of voltage. I switched out the thermal paste and it only lowered temps by 1c. Im going kraken g10 with the x41 so I dont have to worry as much. These cards output a lot of heat. But yeah, try replacing the thermal paste. You do puncture the warranty sticker, but MSI is really chill as long as u return in original condition.


Cool, thanks. And yeah, the voltages are that high because it artifacts without them. Not sure what that's about.

Thank you though. I'll try replacing the thermal paste.


----------



## navjack27

I'm trying to get a benchmark or two in for the fanboy benchmark competition, and man, it's frustrating to start a bench run and have the drivers crash for no reason. Just doo do dooo, running Firestrike, no artifacts, and then BAM, black screen. How can I boot Windows 10 with minimal background crap but with network and VGA driver support? Yes, I know about advanced startup, but I'd like to know: what do you guys do to optimize your computers for high scores?


----------



## Stige

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Ok thanks for the heads up on the cooler issue.
> But what would be the best cooler to go for incase of 390?
> 
> Sapphire Radeon R9 390 Nitro Tri-X OC 8GB GDDR5
> MSI R9 390 GAMING 8G 8GB GDDR5
> 
> Any of these any good


Sapphire has made the best coolers for Radeons for a long while now.


----------



## jodybdesigns

I have been having so much fun I forgot to post back. My Powercolor PCS+ is simply amazing. I don't have any of the temp issues you guys have. I set up a custom fan curve and I have yet to see my card get hotter than 69C on the VRMs and 65C on the core with 58% fan speed. It's pretty quiet at 58%. I also keep my side panel off. If I put the panel on, the GPU would be inaudible. I am comfortably running 1075/1550 @ +19mv. The card now gets maxed WAY before the CPU ever comes close.

I have disabled Vsync in all my games and set my frames to Unlimited. The card screams and is usually at 90%+ usage. With Vsync on, both my CPU and GPU are around 65% usage. I have been getting zero throttling in The Witcher 3 for the most part and I am getting well over 60fps at all times. After a while the screen tearing seems to go unnoticed. World of Warcraft gets absolutely dominated by the 390. I was getting some terrible stutter in WoW using Crossfire. Seems like all my games are more fluid. I will probably never go Crossfire again on that point, since I had the worst time ever with my 2x 4870s as well...
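For anyone setting one up, a custom fan curve like the one described above is just linear interpolation between (temperature, fan %) points. A minimal Python sketch; the points are made-up examples, not MSI's or Afterburner's defaults:

```python
# Minimal sketch of a piecewise-linear fan curve, like the one you
# drag out in Afterburner's settings. The points are made-up
# examples, not MSI's defaults.
CURVE = [(40, 20), (60, 40), (75, 70), (85, 100)]  # (core temp C, fan %)

def fan_speed(temp_c):
    """Linearly interpolate the fan % for a given core temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # 60.0 -- interpolated between the 60C and 75C points
```

The steeper slope between 60C and 85C is the usual trick: quiet at idle, aggressive once the card is actually working.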


----------



## navjack27

did a clean driver install and now my scores are back up to what i like. YAY!!!!


----------



## Vellinious

Quote:


> Originally Posted by *navjack27*
> 
> i'm trying to get a benchmark or two for the fanboy benchmark competition and man its frustrating to start a bench run and have the drivers crash without a reason. just doo do dooo running firestrike no artifacts and then BAM black screen. how can i boot up windows 10 with minimal background crap but with network and vga driver support? yes i know advanced startup but i'd like to know what you guys do to optimize your computers for high scores?


That looks like fun....is there a team leader that you need to sign up with, or? Just start posting your bench scores?


----------



## navjack27

just start posting scores my dude. the rules are just tweak anything really, you can disable tess entirely!!!! once u get the score load up gpu-z and two instances of cpu-z (one with the main tab and one with the memory tab) and take a screenshot showing ur score (graphics score and main score and all that) and provide a link to ur result


----------



## Vellinious

Quote:


> Originally Posted by *navjack27*
> 
> just start posting scores my dude. the rules are just tweak anything really, you can disable tess entirely!!!! once u get the score load up gpu-z and two instances of cpu-z (one with the main tab and one with the memory tab) and take a screenshot showing ur score (graphics score and main score and all that) and provide a link to ur result


Excellent. How long is it running? I'm waiting for an advanced RMA for a PSU from EVGA that's dying on me. Can't put a lot of pressure on the CPU and GPU without the PSU shutting down. Should be here by mid week.


----------



## navjack27

DATES:
Starts: 2/1/2016 EST
Ends: 3/31/2016 EST


----------



## Vellinious

Quote:


> Originally Posted by *navjack27*
> 
> DATES:
> Starts: 2/1/2016 EST
> Ends: 3/31/2016 EST


Plenty of time....sweeeet. = )

Thanks, man


----------



## navjack27

{FANBOY} LETS SHOW THOSE NVIDIOTS WHOS BOSS {/FANBOY}


----------



## Mysticking32

So I kind of cheated a little. Rather than replace the thermal paste on the GPU, I used a case fan I had recently ordered on Amazon. My case had a spot right in front of / on top of the GPU, so I put the fan there. Temps haven't gone above 78 yet. Overclock is 1120 @ +38mV and mem at 1700 @ +38mV. Very happy.

Here's my firestrike score. Almost hitting 12k now with a graphics score of 14024. Can't wait to go higher.

http://www.3dmark.com/fs/7459730


----------



## navjack27

wait till you go hard in the paint like this http://www.3dmark.com/fs/7457158


----------



## Mysticking32

Quote:


> Originally Posted by *navjack27*
> 
> wait till you go hard in the paint like this http://www.3dmark.com/fs/7457158


I'd be jumping up and down if I ever even got close to that graphics score lmao.

That is amazing! How is it possible!? lol


----------



## Vellinious

Quote:


> Originally Posted by *navjack27*
> 
> wait till you go hard in the paint like this http://www.3dmark.com/fs/7457158


What clocks were you running to get that?


----------



## navjack27

BIOS mods + overclock to 1175/1625 + 2x tess. But it's not much faster than my "use application settings" run. I gotta do another run with that set before I turn in for the night. Take a look at the graphics scores in that benchmark competition thread.


----------



## Vellinious

I ran 15782 graphics on the stock bios with no tweaks. Was just wondering....I'm wanting to break 16k, just not sure how to get there.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> I ran 15782 graphics on the stock bios with no tweaks. Was just wondering....I'm wanting to break 16k, just not sure how to get there.


Try to ask @gupsterg. He is a brilliant fella.

Go for the REDs


----------



## jodybdesigns

I love my Powercolor PCS+ 390 more and more. This thing OCs like a monster without much of a voltage increase. I guess I won the silicon lottery with mine. Look at those temps. Man oh man, that is beautiful.


Spoiler: Warning: Spoiler!


----------



## mus1mus

Nice. But I'd pull the Power Limit to max even if the card just runs stock. It does improve things.

Any chance to try the Memory to 1625 or 1750? If it allows you, it'd be better too.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Try to ask @gupsterg. He is a brilliant fella.
> 
> Go for the REDs


As soon as my new PSU arrives from EVGA, I'll be making another push at it. I'm going to raise the power limits, TDP, and TDC on the stock bios, and not change anything else, see if it doesn't help a little bit. Probably won't, but.....worth a shot, anyway.


----------



## Dundundata

Quote:


> Originally Posted by *Intel CPU*
> 
> Let me share my Corsair HX Platinum experience. It's a piece of crap. It gives a horrible whine with my Palit GTX 970. The whine comes from the Corsair like a bomb or drill about to explode, and the Palit GTX 970 was whining as well. I switched to a top-rated Cooler Master V 1000-watt Gold and the whine from the PSU disappeared. So it's the Corsair HX series screwing up. But the GTX 970 still whines. I finally shut the damn Nvidia whine up by getting a Sapphire Nitro R9 390. Currently the best of the best for a no-whine, cool and ultra-quiet AMD card.


hmmm interesting, maybe you got a dud? i have the RM series and it's been great, don't hear it at all. same with the msi 390, no whine at all

and i've experienced coil whine before, it's terrible!


----------



## Dundundata

Navjack are you running the memory timing modded bios. I made one for my card but haven't flashed it yet. What can I expect in benching and real world performance?

o and did you go with the 1250 strap timings?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dundundata*
> 
> Navjack are you running the memory timing modded bios. I made one for my card but haven't flashed it yet. What can I expect in benching and real world performance?


Probably 1 fps in the real world, but even 1 fps in 3DMark will help scores.


----------



## sil130

Hey guys, tried to OC my msi r9 390 for the first time and here is its benchmark. http://www.3dmark.com/3dm/10653981?
Isnt it kinda too low?
i5-6600k @ 4.6ghz
390 1110 core 1550 mem


----------



## navjack27

a little low i guess

http://www.3dmark.com/compare/fs/7475295/fs/7457158/fs/7446152/fs/7432814

here are a bunch of my test runs and urs


----------



## jodybdesigns

Managed to hit 1200/1650 with +50mv on my Powercolor PCS+. The benchmark ran through several times
Quote:


> Originally Posted by *sil130*
> 
> Hey guys, tried to OC my msi r9 390 for the first time and here is its benchmark. http://www.3dmark.com/3dm/10653981?
> Isnt it kinda too low?
> i5-6600k @ 4.6ghz
> 390 1110 core 1550 mem


Seems fine to me. I am getting 10450 with my Powercolor 390 1100 / 1575. I have a 3570k. The 6600k scores about 300 points higher than a 3570k.


----------



## navjack27

i'll ignore the fact that my broadwell is beating a skylake


----------



## sil130

Quote:


> Originally Posted by *jodybdesigns*
> 
> Managed to hit 1200/1650 with +50mv on my Powercolor PCS+. The benchmark ran through several times
> Seems fine to me. I am getting 10450 with my Powercolor 390 1100 / 1575. I have a 3570k. The 6600k scores about 300 points higher than a 3570k.


Well, it seems okay then. Btw, I have a question: I followed a guide when I OC'ed my CPU. Everything is fine during the stress tests. But last night I got a blue screen with "machine_check_exception" while browsing Steam, and my PC hung once when I was playing AC Syndicate. CPU temps were at 70C when it hung. Is it caused by my overclock?


----------



## mus1mus

Quote:


> Originally Posted by *jodybdesigns*
> 
> Managed to hit 1200/1650 with +50mv on my Powercolor PCS+. The benchmark ran through several times


I'm just chiming in to give some insights.









Memory straps info - these cards have memory straps that you need to target when overclocking.

1375 -
1500 -
1625 -
1750 -
1875 -

With the exception of the XFX DD 290X 8GB, where the BIOS literally doesn't follow such a rule, when you OC the memory, follow this rule:

Shoot for the strap ends. See above. If you are nearing the higher end of a strap, say 1620, it uses the timings from the 1501-1625 strap -- meaning tight timings for the frequency.

Whereas if you are at 1650 - 25MHz past 1625 - you are now using the 1626-1750 timings. Closer to 1625 than to 1750. Meaning loose timings for the frequency.

I am willing to bet 1200/1625 scores better than the 1200/1650 you are at. You can test to confirm. Also, it's not gonna hurt to try 1750, as the 3XX cards seem to do better in memory OC than the 2XX cards.
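The strap rule above can be sketched in a few lines of Python. The boundaries are the ones listed in the post; the function itself is purely illustrative, not anything the driver exposes:

```python
# Sketch of the strap rule: every memory clock up to a listed
# boundary shares that boundary's timing set (assumed from the post).
STRAP_ENDS = [1375, 1500, 1625, 1750, 1875]  # MHz, as shown in Afterburner

def strap_for(mem_clock):
    """Return the strap end whose timings a given memory clock uses."""
    for end in STRAP_ENDS:
        if mem_clock <= end:
            return end
    return None  # beyond the table

# 1620 still sits inside the (tighter) 1501-1625 strap, while 1650
# spills into the looser 1626-1750 strap -- which is why 1625 can
# bench better than the nominally faster 1650.
print(strap_for(1620))  # 1625
print(strap_for(1650))  # 1750
```

In other words: creep up to a strap end, never just past one.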


----------



## sil130

What are the safe temps on msi r9 390? Is 78C max load fine?


----------



## mus1mus

Quote:


> Originally Posted by *sil130*
> 
> What are the safe temps on msi r9 390? Is 78C max load fine?


The cooler the better. That said, 78C is better than the 90C on the reference 290X, which AMD deemed safe.


----------



## battleaxe

Here's my submission. She looks like a pretty good one as far as 390Xs are concerned. Does 1230MHz at +75mv. I'll see what it can really do when I get some time. And yes, that's only the cooler, as I already ripped it off and put a universal block on it.


----------



## Razeraa

Seems like my MSI R9 390 isn't that great an overclocker and doesn't like voltage. I tried getting 1150/1600MHz stable, but after I added some voltage the screen went black, and the only way to get it working again was to boot into safe mode and uninstall the drivers with DDU.

At least I can get 1075/1550MHz stable without adding any voltage. I'm happy with that.


----------



## Dundundata

Quote:


> Originally Posted by *Razeraa*
> 
> Seems like my MSI R9 390 isn't that great overclocker and doesn't like voltage. I tried getting 1150/1600Mhz stable but after I added some voltage screen went black and only way to get it to work again was to boot with safe mode and uninstall drivers with DDU.
> 
> At least I can get 1075 / 1550Mhz stable without adding any voltage. I'm happy with that.


did you increase the power limit as well?


----------



## peejay2104

Hello everyone,
Been following this forum for quite a while now and finally decided to get an account since i have a pretty important question.
i've got an asus dc2 390x card (watercooled - the dc2 cooler is crap; wish i knew that before buying my gpu).
Right now i've got a stable overclock of 1200/1700 at +100mv, so there is plenty of room to tinker with.
However, whenever i try to increase the voltage more, say to +150mv, my pc crashes occasionally. Core temps are fine, as well as vrm temps.
There is also no artifacting when doing this whatsoever.
So, i overclocked to 1250/1775 with +150mv. Ran the Heaven benchmark a couple of times without a hiccup. But as soon as i start to play some games, Black Ops 3 for example, it crashes after a couple of minutes, sometimes half an hour.
Anyone have an idea?
And by crashing i mean black screening.









pc specs
i5 4690k at 4.7ghz (not the issue here; it's perfectly stable after a 24hr stress test)
r9 390x with a kraken x31
coolermaster g750m psu
16gb 1600mhz ballistix
asus h97 plus mobo
evo 500gb ssd
2tb hdd

many thanks!
pj


----------



## sil130

Quote:


> Originally Posted by *mus1mus*
> 
> The cooler the better. 78C is better than 90C from ref 290X DEEMED safe by AMD though.


Is there anything I can do to have lower temp? Add extra intake fans etc?


----------



## THUMPer1

Quote:


> Originally Posted by *mus1mus*
> 
> I'm just chiming in to give some insights.
> 
> Memory straps info - these cards have memory straps that you need to target when overclocking.
> 
> 1375 -
> 1500 -
> 1625 -
> 1750 -
> 1875 -
> 
> With the exception of the XFX DD 290X 8GB, where the BIOS literally doesn't follow such a rule, when you OC the memory, follow this rule:
> 
> Shoot for the strap ends. See above. If you are nearing the higher end of a strap, say 1620, it uses the timings from the 1501-1625 strap -- meaning tight timings for the frequency.
> 
> Whereas if you are at 1650 - 25MHz past 1625 - you are now using the 1626-1750 timings. Closer to 1625 than to 1750. Meaning loose timings for the frequency.
> 
> I am willing to bet 1200/1625 scores better than the 1200/1650 you are at. You can test to confirm. Also, it's not gonna hurt to try 1750, as the 3XX cards seem to do better in memory OC than the 2XX cards.


Trying to figure this out. My MSI 390X at 1700 and 30/35mV is unstable. I tried 1750 and 50mV aux and I get lines flashing sometimes, so that's not stable. It is stable at 1650 and 25mV. I have not added any core voltage. Do I need less or more aux voltage? haha


----------



## peejay2104

I have not yet increased my GPU aux voltage as I'm not sure how this will affect performance and temps and so on.
pj


----------



## peejay2104

I also have not increased memory voltage yet. Is that the reason it black screens?


----------



## THUMPer1

Quote:


> Originally Posted by *peejay2104*
> 
> I also have not increased memory voltage yet. Is that the reason it black screens?


I know if you increase the mem speed, the aux voltage needs to be increased. My 390X seems to be stable at 1650 with +25mV aux. I'm trying to get past that speed though.


----------



## Dundundata

Quote:


> Originally Posted by *THUMPer1*
> 
> Trying to figure this out. my MSI 390x at 1700 and 30/35mV is unstable. I tried 1750 and 50mV aux and i get lines flashing sometimes. So that's not stable. It is stable at 1650 and 25mV. I have not added any core voltage. Do I need less more more Aux voltage? haha


if it was unstable at 1700, why did you bump it up to 1750?

for memory you might be better off with 1625 than 1650, because of how the memory timings in the BIOS are set up. Even though 1650 is a higher clock, it uses looser (worse) timings


----------



## peejay2104

Quote:


> Originally Posted by *THUMPer1*
> 
> I know if you increase mem speed the aux voltage needs increased. My 390x seems to be stable at 1650 speed 25 mV Aux. Im trying to get past that speed though.


Well, she is stable at 1700 without me touching the aux voltage. Even at 1750. But if I can't increase my core clock, then there is no reason to increase the mem speed, since I lose performance :/


----------



## THUMPer1

Quote:


> Originally Posted by *Dundundata*
> 
> if it was unstable at 1700, why did you bump it up to 1750?
> 
> for memory you might be better off with 1625 than 1650, because of how the memory timings in the bios are setup. Even though 1650 is higher it uses higher/looser (worse) timings


Because I want to see what happens at 1750 haha


----------



## sil130

Hey guys, I've OC'ed my MSI R9 390 to 1120 core clock and 1625 mem clock, 50% power limit and +45mV. I tested it with Heaven and FireStrike and both went fine, no artifacts. But when I started playing Witcher 3, I noticed minimal artifacts, so does this mean my OC is not stable? Should I set my core voltage higher? Is it safe to put it at +100?


----------



## tolis626

Quote:


> Originally Posted by *sil130*
> 
> Hey guys, ive OC'ed my msi r9 390 to 1120 core clock and 1625 mem clock. 50% power limit and +45 mV. I tested it with heaven and firemark and both went fine, no artifacts. But when I started playing Witcher 3, ive noticed minimal artifacts, so does this mean my oc is not stable? Should I put my core voltage higher? Is it safe to put it at 100?


Witcher 3 will have your GPU working like a mule. Seriously, mine can be stable in everything else and still give artifacts in Witcher 3. So for me it has become the benchmark for stability testing. 15 minutes is usually all it takes, too. Just watch those temps!

In fact, now that I think of it, do check your temps. Above 80C can cause instability on some GPUs, leading them to require more voltage, which in turn raises temps even further. The MSI in particular benefits greatly from good case airflow. In my case, OK airflow vs. really good airflow is a difference of over 10C.

Other than that, you'll need more core voltage. Although I'd also be looking at increasing aux voltage at this point. Memory overclocking isn't the greatest strength of the MSI cards. Mine barely does 1725MHz half stable. Game stable is another story.


----------



## sil130

Quote:


> Originally Posted by *tolis626*
> 
> Witcher 3 will have your GPU work like a mule. Seriously, mine can be stable at everything else and give artifacts in Witcher 3. So for me it has become the benchmark for stability testing. 15 minutes usually is all it takes too. Just watch those temps!
> 
> In fact, now that I think of it, do check your temps. Above 80C can cause instabilities on some GPUs, leading them to require more voltage, which in turn raises temps even further. The MSI in particular benefits greatly from good case airflow. In my case, ok airflow vs really good airflow is over 10C hotter.
> 
> Other than that, you'll need more core voltage. Although I'd also be looking at increasing aux voltage at this point. Memory overclocking isn't the greatest strength of the MSI cards. Mine barely does 1725MHz half stable. Game stable is another story.


Should I put my voltage at +100? And what should I set my aux voltage to? I haven't read a guide that tweaks aux voltage, so I don't have any idea.


----------



## tolis626

Quote:


> Originally Posted by *sil130*
> 
> Should I put my voltage to 100? And what should I put on my aux voltage? I havent read a guide that tweaks aux voltage so i dont have any idea.


I'll say it again, first check your temps. If you're still using MSI's default fan curve, which you probably are, you are most likely running hot. So in Afterburner go to settings and set a custom fan curve.

Having said that, why would you set your voltage to +100mV if there is no need? Do it in small steps. Increasing voltage a lot will lead to higher temperatures and noise levels and shorter component life. If you want to achieve a certain overclock and you need +100mV then yes, you should set it that high, but I highly doubt 1120MHz needs that much voltage. As I said, I think it's more likely a memory issue. Ignoring the fact that you should overclock your memory and core independently (Which you really should be doing, by the way, to know what exactly is causing issues), try setting aux voltage to +25mV or +50mV. Going above +50mV isn't advised by people who know more than I do, so I follow their recommendations. You may also want to go higher or lower with your memory speed to try the different timing straps, but let's not complicate things further.

I say dial back your memory to 1500MHz, overclock your core and then start messing with the mem clocks. Good luck!
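The "small steps" advice above basically amounts to walking the offset up until the card holds. A toy sketch; `test_stable` is a hypothetical callback, since in practice it's you running Witcher 3 or Fire Strike for ~15 minutes at each offset:

```python
# Toy version of the "increase voltage in small steps" method.
# `test_stable` is hypothetical: in practice it is a manual stress
# test (e.g. 15 minutes of Witcher 3) at each voltage offset.
def find_min_offset(test_stable, max_mv=100, step_mv=25):
    """Walk the voltage offset up until the target clock holds."""
    offset = 0
    while offset <= max_mv:
        if test_stable(offset):
            return offset
        offset += step_mv
    return None  # unstable even at max_mv: back the core clock off instead

# Example: a card that needs at least +39mV for its target clock
# stabilizes at the first tested step at or above that, +50mV.
print(find_min_offset(lambda mv: mv >= 39))  # 50
```

Testing core and memory independently, as recommended above, just means running this search twice with the other clock held at stock.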


----------



## mus1mus

Quote:


> Originally Posted by *peejay2104*
> 
> I also have not increased memory voltage yet. Is that the reason it black screens?


To make things clear: the black screen you are seeing, does the display come back when you unplug and replug the monitor cable?

If it doesn't, it's a card issue. If the display comes back, it's a known issue on these cards. Which BIOS do you use?
Quote:


> Originally Posted by *THUMPer1*
> 
> Trying to figure this out. my MSI 390x at 1700 and 30/35mV is unstable. I tried 1750 and 50mV aux and i get lines flashing sometimes. So that's not stable. It is stable at 1650 and 25mV. I have not added any core voltage. Do I need less more more Aux voltage? haha


Aux voltage has VERY little to do with that. Memory voltage is somewhat linked to the core voltage. Try increasing the core voltage and test again. Else, 1625 is the way to go.

Or BIOS editing, to allow you to use tighter timings at higher memory clocks.

Quote:


> Originally Posted by *THUMPer1*
> 
> I know if you increase mem speed the aux voltage needs increased. My 390x seems to be stable at 1650 speed 25 mV Aux. Im trying to get past that speed though.


Same as above. Try a higher Core Voltage.
Quote:


> Originally Posted by *sil130*
> 
> Hey guys, ive OC'ed my msi r9 390 to 1120 core clock and 1625 mem clock. 50% power limit and +45 mV. I tested it with heaven and firemark and both went fine, no artifacts. But when I started playing Witcher 3, ive noticed minimal artifacts, so does this mean my oc is not stable? Should I put my core voltage higher? Is it safe to put it at 100?


If your cooling allows it...
Quote:


> Originally Posted by *sil130*
> 
> Is there anything I can do to have lower temp? Add extra intake fans etc?


That is for you to figure out. What case, Fans, etc you have.


----------



## peejay2104

Quote:


> Originally Posted by *mus1mus*
> 
> To make things clear, the black screen you are seeing, does the display go back when you unplug - replug the monitor cable?
> 
> If it doesn't, it's a card issue. If the display goes back, it's a known issue on these cards. Which BIOS do you use?


I have not tried that, will try it immediately.
I just use the stock Asus BIOS.

pj

----------



## mus1mus

Quote:


> Originally Posted by *tolis626*
> 
> I'll say it again, first check your temps. If you're still using MSI's default fan curve, which you probably are, you are most likely running hot. So in Afterburner go to settings and set a custom fan curve.
> 
> Having said that, why would you set your voltage to +100mV if there is no need? Do it in small steps. Increasing voltage a lot will lead to higher temperatures and noise levels and shorter component life. If you want to achieve a certain overclock and you need +100mV then yes, you should set it that high, but I highly doubt 1120MHz needs that much voltage. As I said, I think it's more likely a memory issue. Ignoring the fact that you should overclock your memory and core independently (Which you really should be doing, by the way, to know what exactly is causing issues), try setting aux voltage to +25mV or +50mV. Going above +50mV isn't advised by people who know more than I do, so I follow their recommendations. You may also want to go higher or lower with your memory speed to try the different timing straps, but let's not complicate things further.
> 
> I say dial back your memory to 1500MHz, overclock your core and then start messing with the mem clocks. Good luck!


SAGE! Great advice.

Bottom line, it's all about the cooling.


----------



## battleaxe

Quote:


> Originally Posted by *Razeraa*
> 
> Seems like my MSI R9 390 isn't that great overclocker and doesn't like voltage. I tried getting 1150/1600Mhz stable but after I added some voltage screen went black and only way to get it to work again was to boot with safe mode and uninstall drivers with DDU.
> 
> At least I can get 1075 / 1550Mhz stable without adding any voltage. I'm happy with that.


Make sure to uncheck the little light at the bottom of MSI AB so that when it reboots it won't reapply the same settings that made you crash. It will load default settings on startup instead. This keeps you from getting caught with your pants down after a reboot. Just an FYI to help in the future when you are benching.


----------



## peejay2104

@mus1mus, after deleting Asus GPU Tweak 2 it's actually stable now... haven't had the black screens anymore so far. Will keep updating with a proof pic of the card and stuff so I too can join the grid on page 1.








pj


----------



## RicoDee

sorry guys.


----------



## patriotaki

hello guys, i found a great deal on the Powercolor 390 PCS+ for 310EUR. i might be getting it; does anyone of you have it? what's your opinion?

any signs of coil whine? does it have 0dB fans at low temp?


----------



## Charcharo

I have it. No coil whine. Quiet all in all, and completely silent under 61 degrees.

Mine is not a great overclocker, though. Everything else about it is great.


----------



## murakume

It's been a while since I was last active on this thread or the forum, but I've since bought a second 390X for crossfire and did this today


----------



## Charcharo

http://www.3dmark.com/3dm/10690842
I guess this is a bit low guys?

This is stock PCS + 390 . Rest of the system is in the actual run.

Will Windows 10 improve this? Or a better CPU/RAM?


----------



## Vellinious

Quote:


> Originally Posted by *Charcharo*
> 
> http://www.3dmark.com/3dm/10690842
> I guess this is a bit low guys?
> 
> This is stock PCS + 390 . Rest of the system is in the actual run.
> 
> Will Windows 10 improve this? Or a better CPU/RAM?


That's a bit low, but it's not horrible for stock clocks.


----------



## Charcharo

Quote:


> Originally Posted by *Vellinious*
> 
> That's a bit low, but it's not horrible for stock clocks.


Is it out of the ordinary? Because it is strange. The only other benches I compared my card to are Metro Last Light benches (and it does BETTER than the ones posted on review sites, it seems).

Would Windows 10 and an SSD improve this score? I just bought 3DMark and I don't know how things influence the score.


----------



## Vellinious

Quote:


> Originally Posted by *Charcharo*
> 
> Is it out of the ordinary? Because it is strange. The only other Bench I compared my card to are Metro Last Light benches (and it does BETTER than the ones posted in sites it seems).
> 
> Would Windows 10 and an SSD improve this score? I just bought 3DMark and I do not know how things influence the score.


The only thing you should be comparing is the graphics score. The overall score takes the physics and combined scores into account, which are heavily CPU-oriented. For graphics performance, compare the graphics score.
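For the curious: 3DMark's technical guide describes the overall score as a weighted harmonic mean of the sub-scores, which is why a strong graphics score can still be dragged down by a weak CPU. A sketch with placeholder weights (the real weights are published by Futuremark; the ones below are illustrative assumptions):

```python
def overall_score(graphics, physics, combined,
                  w_g=0.75, w_p=0.15, w_c=0.10):
    """Weighted harmonic mean of the three Fire Strike sub-scores.

    The weights here are placeholder assumptions for illustration;
    Futuremark publishes the real ones in the 3DMark technical guide.
    """
    return 1.0 / (w_g / graphics + w_p / physics + w_c / combined)

# A big graphics score still gets pulled down by weaker physics
# (CPU) and combined results, so two identical cards can post very
# different overall numbers on different CPUs.
print(round(overall_score(14000, 9000, 6000)))  # 11507
```

That's also why an overall score can sit below the graphics score, as in the earlier "almost 12k overall, 14024 graphics" posts.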


----------



## TsukikoChan

just a wee heads-up, guys: for those of you without the full 3DMark software (like me), Steam has it on sale at the minute for like 3-4 quid


----------



## Stige

Quote:


> Originally Posted by *TsukikoChan*
> 
> just a wee heads up guys for those of you without the full 3dmark software (like me), steam have it on sale at the minute at like 3-4quid


1,6€ on india steam







must buy I guess.


----------



## battleaxe

Quick little run here. Higher to come. Just checking VRM temps and I can see that I need to reseat the blocks on the 390X pronto. Much hotter than the 290X.


----------



## Charcharo

Quote:


> Originally Posted by *Vellinious*
> 
> The only thing you should be comparing is the graphics score. The overall score, takes the physics and combined score into account, which are heavily CPU oriented. For graphics performance, graphics score.


So this is an OK graphics score for a *stock* R9 390 with such a *system*? There is nothing abnormal, right?

EDIT:

http://www.3dmark.com/3dm/10692140

Yey I guess


----------



## OneB1t

totally normal result


----------



## Stige

Just bought it cause it was cheap and all, http://www.3dmark.com/3dm/10693797?

Mild clocks. This doesn't run as easily as Valley does heh

Top 1 in "similar setups", easily.


----------



## patriotaki

just bought 2 PCS+ 390s, i will test them in crossfire.

waiting for them to arrive.
can i use two different psu's to power the gpus?


----------



## Stige

Yes? I don't see why not, at least? But I wouldn't know.


----------



## pillowsack

You can, but I recommend getting the adapter to do so.

http://www.amazon.com/Vantacor-Dual-Adapter-Cable-2-way/dp/B00DL3L2J6

http://www.amazon.com/Add2PSU-Multiple-Power-Supply-Adapter/dp/B009P98Q8U

Sleeved one $13:

http://www.amazon.com/gp/product/B007OUNHL8/ref=pd_lpo_sbs_dp_ss_1?pf_rd_p=1944687622&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=B009P98Q8U&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0Z8P00377Q3NN4RQEEVM

Both do the same thing


----------



## battleaxe

What kind of temps are any of you getting with a full-cover block? Mine seem a bit high on the VRMs. My core never goes over 44C, but VRM1 is hitting 70C at +100mv. This is after I reseated everything; I only saw an improvement of roughly 2C.

Seems high, doesn't it?


----------



## mus1mus

Please verify the VDDC value you are getting at +100.

Fujipoly pads will be your refuge.


----------



## patriotaki

Quote:


> Originally Posted by *pillowsack*
> 
> You can, but I recommend getting the adapter to do so.
> 
> http://www.amazon.com/Vantacor-Dual-Adapter-Cable-2-way/dp/B00DL3L2J6
> 
> http://www.amazon.com/Add2PSU-Multiple-Power-Supply-Adapter/dp/B009P98Q8U
> 
> Sleeved one $13:
> 
> http://www.amazon.com/gp/product/B007OUNHL8/ref=pd_lpo_sbs_dp_ss_1?pf_rd_p=1944687622&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=B009P98Q8U&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0Z8P00377Q3NN4RQEEVM
> 
> Both do the same thing


is this necessary? can i do it without the adapter? i will power the whole system including 1 gpu and power up the 2nd gpu with another psu


----------



## pillowsack

Yeah, you could just jump the green PS_ON wire from your 24-pin connectors to each other, and also include a ground.


----------



## Stige

Quote:


> Originally Posted by *battleaxe*
> 
> What kind of temps are any of you getting with a full cover block? Mine seem a bit high on VRM's. My core never goes over 44C but the VRM1 is hitting 70C at +100mv. This is after I reseated everthing. I only saw an improvement of 2C roughly.
> 
> Seems high doesn't it?


What block?
Quote:


> Originally Posted by *mus1mus*
> 
> Please verify that VDDC Value you are getting at +100.
> 
> Fujipoly pads will be your refuge.


Even the 7W/mK-rated Phobya pads are way better than any stock pads; I saw a massive improvement of 7-8C on my card with those.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Please verify that VDDC Value you are getting at +100.
> 
> Fujipoly pads will be your refuge.


I'm already using the FujiPoly 17kh Extreme pads.

The voltage is .05mv higher than my 290X's. It's a standard number as I recall, though I cannot remember what. I'll have to check that... This is a modified EK block that was meant for an XFX 290X, remodeled to allow for the taller caps.

Something's weird here for sure...

Okay, here's the screeny. 

And here I bumped it up to 1250MHz just to see if it would hold. I have my doubts that it could make it through a bench run like this, but if I can get the temps down more, I would bet it could. VRM1 is just getting too hot right now.



Edit:

This makes me wonder if there's a small difference in RAM height from the older 290X on this block, or something else that's keeping it from making good contact. I did notice the board seems to be warped ever so slightly, which seems odd. It's the whole thing, backplate included. Hmmm...

Edit2: Well, this was an experiment I wanted to try... but I don't think it's working out. I think I ruined the block. LOL









Edit3: These 390X cards are beasts, but dang do they crank out the voltage. I think it was mentioned somewhere that they run higher voltages to allow the 1500 base clock on the RAM. I may be mistaken, but this really translates into some serious heat. Heat the 290 series was already known for, but the 390 series really takes it to a new level. Interestingly, the core is easy enough to tame; my 390X is running identically to the 290X, so no difference there. The VRMs, though, are hotter, at least on my setup.


----------



## TheCowTamer

Has anyone else had an issue with their 390X showing up as a 390 in 3DMark and other software? I purchased an XFX 390X off of eBay, and when it came it had the 390X box and a 390X warranty card. GPU-Z also shows it has 2816 shaders. Here's the GPU-Z screenshot. Anyone have any idea what's going on? Could it actually be a 390? Should I try and send it back? Thanks guys.


----------



## battleaxe

Quote:


> Originally Posted by *TheCowTamer*
> 
> Has anyone else had an issue with their 390x showing up as a 390 in 3dmark and other software? I purchased a XFX 390x off of ebay and when it came it had the 390x box and a 390x warranty card. Also in GPU-Z it shows that is has 2816 shaders. Here the the GPU-z. Anyone have any idea whats going on? Could it actually be a 390? Should I try and send it back? Thanks guys.


They are all like that. It says 390 series.


----------



## TheCowTamer

OK. Thanks for the info. I had tried to google and figure that out but just wanted to make sure. Thanks for the help


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Please verify that VDDC Value you are getting at +100.
> 
> Fujipoly pads will be your refuge.


Quote:


> Originally Posted by *battleaxe*
> 
> I'm already using the FujiPoly 17kh Extreme pads.
> 
> The voltage is .05mv higher than my 290x. Its a standard number as I recall, though I cannot remember what. I'll have to check that... Modified EK block that was meant for an XFX 290x. Remodeled to allow the taller caps.
> 
> Something's weird here for sure...
> 
> Okay, here's the screeny.
> 
> And here I bumped it up to 1250mhz just to see if it would hold. I have my doubts that it could make it through a bench run on this though. If I can get the temps down more though I would bet it could make it. But VRM1 is just getting too hot right now.
> 
> 
> 
> Edit:
> 
> This makes me wonder if there's a small difference in the RAM height from the older 290x on this block? Or something else that's keeping it from making good contact. I did notice the board seems to be warped ever so slightly. Which seems odd. Its the whole thing, back plate included. Hmmm...
> 
> Edit2: Well, this was an experiment I wanted to try... but I don't think its working out. I think I ruined the block. LOL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit3: These 390X cards are beasts, but dang do they crank out the voltage. I think it was mentioned somewhere that they run higher volts to allow the base clock of 1500 on the RAM. I may be mistaken, but this really translates into some serious heat. Heat the 290 series was already known for, but the 390 series really takes this to a new level. Interestingly the core is easy enough to tame. My 390X is running identical to the 290X, so no difference there. The VRM's though are hotter, at least on my setup they are.


Must be something wrong, considering I'm getting 75C max on VRM1 with an Alphacool GPX block (hybrid: the core is watercooled, the VRM is not). And that is with +175mV.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> Must be something wrong considering I'm getting 75C max temps on a Alphacool GPX block (Hybrid, core is watercooled, VRM is not) on the VRM1. And that is with +175mV.


Dang... yeah, I'd say something is wrong for sure then. Thanks!!









Edit: I might have to get a pair of those blocks then...


----------



## patriotaki

Pff, damn, the dual PSU adapter is expensive and I'm struggling at the moment.
I just want to run 2x 390 on my PC for a few hours to do some benchmarks for my review site and YouTube channel.
Is there any way I can do it without the adapter? I will be running only one GPU and selling the other one afterwards. What do you suggest?


----------



## battleaxe

Quote:


> Originally Posted by *patriotaki*
> 
> Pff damn the dual PSU adapter is expensive and I'm struggling atm.
> I just want to run for a few hours 2x 390 on my PC to do some benchmarks for my review site and YouTube channel..
> Is there any way I can do it without the adapter? I will be running only 1 GPU and I'll sell the other one afterwards.. What do you suggest me?


Use a paper clip to bridge the green wire (PS_ON) and any black (ground) wire on that 24-pin PSU cable. That will turn it on; then you can do what you want.


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> Pff damn the dual PSU adapter is expensive and I'm struggling atm.
> I just want to run for a few hours 2x 390 on my PC to do some benchmarks for my review site and YouTube channel..
> Is there any way I can do it without the adapter? I will be running only 1 GPU and I'll sell the other one afterwards.. What do you suggest me?


Jump the green wire and a black wire on one 24-pin to the other 24-pin. It can be ANY black wire, since the blacks are all grounds; just strip a bit of the wire away. I ran miners with two PSUs wired this way for months.


----------



## Synntx

My latest score. Look at them clocks!

http://www.3dmark.com/3dm/10697504?


----------



## christoph

Quote:


> Originally Posted by *Stige*
> 
> What block?
> Even 7w/Mk rated Phobya pads are way better than any stock pads, I saw a massive improvement of 7-8C on my card with those.


Excuse me, can you link me to the exact thermal pads you bought? What size am I supposed to buy?


----------



## Dundundata

Quote:


> Originally Posted by *Synntx*
> 
> My latest score. Look at them clocks!
> 
> http://www.3dmark.com/3dm/10697504?


nice gfx score


----------



## ZealotKi11er

Quote:


> Originally Posted by *christoph*
> 
> Excuse me, can you link me to the exact thermal pads you've bought?? what size am I suppose to buy??


1mm thickness should be fine, and get 11 W/mK or 17 W/mK pads.


----------



## Stige

Quote:


> Originally Posted by *christoph*
> 
> Excuse me, can you link me to the exact thermal pads you've bought?? what size am I suppose to buy??


http://www.ebay.com/itm/111732983134

This is what I bought.


----------



## jodybdesigns

My latest run. 10535 isn't bad.

http://www.3dmark.com/fs/7505676

If I had an i7 I would be over 11k. The temps are my favorite thing though. 1100/1625 +45mv completely stable. Load temps: Core 65C, VRM1 73C, VRM2 68C. That is the highest I have seen them go. The Powercolor PCS+ rocks.


----------



## Stige

Quote:


> Originally Posted by *jodybdesigns*
> 
> My latest run. 10535 isn't bad.
> 
> http://www.3dmark.com/fs/7505676
> 
> If I had an i7 I would be over 11k. The temps are my favorite thing though. 1100/1625 +45mv completely stable. Load temps: Core 65C, VRM1 73C, VRM2 68C. That is the highest I have seen them go. The Powercolor PCS+ rocks.


1100 score behind mine


----------



## jodybdesigns

Quote:


> Originally Posted by *Stige*
> 
> 1100 score behind mine


Oh yeah, but that's quite a hefty overclock on the CPU. I can't touch 4.4 without going to 1.3v on the core so I leave it at 4.3.

Are you running a custom bios?


----------



## tolis626

Quote:


> Originally Posted by *Synntx*
> 
> My latest score. Look at them clocks!
> 
> http://www.3dmark.com/3dm/10697504?


Holy hell that's fast. What voltages are you using? My card craps out at anything over 1.28V under load.


----------



## kizwan

Quote:


> Originally Posted by *Charcharo*
> 
> http://www.3dmark.com/3dm/10690842
> I guess this is a bit low guys?
> 
> This is stock PCS + 390 . Rest of the system is in the actual run.
> 
> Will Windows 10 improve this? Or a better CPU/RAM?


You should get a ~12k graphics score at stock, which you did. PowerColor 290s tend to score a bit lower in benchmarks compared to the other brands at the same clocks; probably the same with the 390.
Quote:


> Originally Posted by *Charcharo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> That's a bit low, but it's not horrible for stock clocks.
> 
> 
> 
> Is it out of the ordinary? Because it is strange. The only other Bench I compared my card to are Metro Last Light benches (and it does BETTER than the ones posted in sites it seems).
> 
> Would Windows 10 and an SSD improve this score? I just bought 3DMark and I do not know how things influence the score.
Click to expand...

Relax, your score is not out of the ordinary.

I recall many people claiming bench scores in Win 7 are a bit higher than in Win 10. And no, an SSD will not improve the score.
Quote:


> Originally Posted by *TheCowTamer*
> 
> Has anyone else had an issue with their 390x showing up as a 390 in 3dmark and other software? I purchased a XFX 390x off of ebay and when it came it had the 390x box and a 390x warranty card. Also in GPU-Z it shows that is has 2816 shaders. Here the the GPU-z. Anyone have any idea whats going on? Could it actually be a 390? Should I try and send it back? Thanks guys.


It's just a name in the registry, which you can change to anything. GPU-Z is correct in displaying "390 series", and Futuremark System Info can sometimes be unreliable at identifying hardware correctly.
Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Stige*
> 
> Must be something wrong considering I'm getting 75C max temps on a Alphacool GPX block (Hybrid, core is watercooled, VRM is not) on the VRM1. And that is with +175mV.
> 
> 
> 
> Dang... yeah, I'd say something is wrong for sure then. Thanks!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I might have to get a pair of those blocks then...
Click to expand...

Try stacking two pads on the VRMs.


----------



## Vellinious

Has anyone been able to break 16k graphics score without using tess tweaks?


----------



## THUMPer1

Always wondered what my 3dmark score was.
http://www.3dmark.com/3dm/10699984

14k graphics score. nice


----------



## ZealotKi11er

Quote:


> Originally Posted by *THUMPer1*
> 
> Always wondered what my 3dmark score was.
> http://www.3dmark.com/3dm/10699984
> 
> 14k graphics score. nice


That's a very good score. Try running the memory at 1625MHz and see what you score.


----------



## THUMPer1

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's very good score. Try running memory 1625MHz and see what you score.


I tried that with Heaven; it was lower by 3 points. I'll try 3DMark though.


----------



## THUMPer1

Score went up a hair at 1625 mem
http://www.3dmark.com/3dm/10700249


----------



## buttface420

I just got my new MSI Gaming 390 today, and the first thing I did after installing it and the drivers was check the ASIC quality: a very crappy 69.2%. Which really sucks, because I was going to go with a Nitro, but everyone told me the MSI was better binned and had better OC ability.


----------



## mus1mus

High ASIC has nothing to do with Overclockability.

My 290 with 80+ ASIC taps out (heavy artifacts and black screens) at 1350.
My 290Xs, which both have 76 ASIC, do 1350 highly stable, but with a higher voltage requirement.

http://www.3dmark.com/3dm11/10800210


----------



## buttface420

That's the highest graphics score for a single 290X I've ever seen; mine gets around a 12,000 graphics score.

I've also got an MSI 290X with an ASIC quality of 68.3%, and it's a lemon of a clocker: 1100/1375 tops.

Did you ever have the dreaded black screen crash with your 290s?


----------



## Vellinious

Quote:


> Originally Posted by *buttface420*
> 
> thats the highest graphic score for a single 290x i ever seen, my 290x gets like 12,000 graphic score
> 
> i got a msi 290x also that has asic quality of 68.3% and its a lemon of a clocker..i got 1100/1375 tops
> 
> did you ever have the dreaded black screen crash with your 290s?


Is 19k on 3D Mark 11 that tough to hit?

http://www.3dmark.com/3dm11/10910215


----------



## buttface420

Jeez, I guess I need to learn how to OC from you guys. My 290X with an HG10 A1 bracket and H55 AIO couldn't hit anything past the 12,000s.

Do your 290Xs have Elpida or Hynix memory? Because I swear my Elpida doesn't like anything over 1375.


----------



## rdr09

Quote:


> Originally Posted by *Vellinious*
> 
> Is 19k on 3D Mark 11 that tough to hit?
> 
> http://www.3dmark.com/3dm11/10910215


it was for me. a year ago or so . . .

http://www.3dmark.com/3dm11/8776470


----------



## buttface420

You are all making me feel like such a stupid noob. I'm here with my crap score looking like a loser.


----------



## Vellinious

Mus and Fyz were making me feel inferior...I don't feel so bad now. lol


----------



## buttface420

well whats the big secret? i thought i tried everything


----------



## Mysticking32

How are y'all getting such high graphics scores? These are the highest I've ever been able to get, lol. And I couldn't even run 1150 with +100mV because of artifacts, so I'm currently at 1130 MHz with +69mV (the lowest I could get without artifacts). I think it's safe to say I got a horrible card, lol.

http://www.3dmark.com/compare/fs/7485667/fs/7460709/fs/7460620


----------



## buttface420

Quote:


> Originally Posted by *Mysticking32*
> 
> How are y'all getting such high graphics scores? These are the highest I've been able to get like ever lol. And I couldn't even run it at 1150 with 100mv because of artifacts so I'm currently just at the 1130mhz with 69mv. (lowest i could get it without artifacts) I think it's safe to say I got a horrible card lol.
> 
> http://www.3dmark.com/compare/fs/7485667/fs/7460709/fs/7460620


I know, right? I thought if you could hit 1200 you had a golden card, but these guys are hitting 1300 on the core. Really?!


----------



## Vellinious

I used HiS iTurbo and tested in FS graphics test 1 and 2 in order to find clocks that would give the best results. I took that clock, lowered it a tad, and ran the 3D Mark 11 graphics tests to get a graphics score.

I'm a little bit afraid to push super hard until my replacement PSU arrives from EVGA. I'm hoping that once I get the new PSU in, I'll be able to hit 16k graphics score in FS. Hoping...we'll see.

I still need to raise the power limits, TDP and TDC, in the stock BIOS, and try to figure out how to get rid of at least some of the vdroop I'm seeing, which is SIGNIFICANT: dropping from 1.443v to 1.283v under load. 160mv seems excessive.
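For reference, those droop figures work out like this (a trivial check on the numbers quoted above, nothing card-specific):

```python
# Sanity check on the quoted droop: set 1.443 V, read-back under load 1.283 V.
v_set, v_load = 1.443, 1.283
droop_v = v_set - v_load
print(f"droop: {droop_v * 1000:.0f} mV, about {100 * droop_v / v_set:.0f}% of the set voltage")
```

Roughly 11% of the set voltage sagging away under load, which is why it reads as excessive.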


----------



## rdr09

Quote:


> Originally Posted by *Mysticking32*
> 
> How are y'all getting such high graphics scores? These are the highest I've been able to get like ever lol. And I couldn't even run it at 1150 with 100mv because of artifacts so I'm currently just at the 1130mhz with 69mv. (lowest i could get it without artifacts) I think it's safe to say I got a horrible card lol.
> 
> http://www.3dmark.com/compare/fs/7485667/fs/7460709/fs/7460620


Those are 3D Mark 11 benchies.

@buttface, it's winning the lottery, water, and weather. Cool weather.


----------



## Vellinious

Quote:


> Originally Posted by *rdr09*
> 
> Those are 3D Mark 11 benchies.
> 
> @ buttface, it's winning lottery, water, and weather. Cool weather.


lol, that too...I thought they knew those were 3D Mark 11 scores.

Yeah, I can't quite break 16k graphics in firestrike. The FS graphics scores are quite a bit lower than 19k. = P

This is my best Firestrike run without tess tweaks.

http://www.3dmark.com/fs/7358506


----------



## Mysticking32

Quote:


> Originally Posted by *rdr09*
> 
> Those are 3D Mark 11 benchies.
> 
> @ buttface, it's winning lottery, water, and weather. Cool weather.


Oh, I know that, lol. I was talking about the other graphics scores higher than 14k, not in 3DMark 11.

And I wish I could overclock higher; 1140 for me, lol. Is that normal? Anyone?


----------



## Vellinious

Quote:


> Originally Posted by *Mysticking32*
> 
> O i know that lol. I was talking about the other scores of graphics higher than 14k. Not on the 3dmark 11.
> 
> And I wish I could overclock higher. 1140 for me though lol. Is that normal? Anyone?


Seems like most people are in that area. 1150ish to 1250ish.

If you're on air, that's probably a good portion of it.


----------



## Hethrus

Now I have a very important question I can't find an answer to. I'm looking to watercool my XFX R9 390 DD Black Edition (Newegg link below).
I saw it says blocks are no longer compatible, but since I bought mine on 10/8/2015, is it possible it's one of the earlier ones from before the change? How do I know for sure?
And if it is, which full-cover waterblock is known to work? I would really appreciate some help.

Video card as shown in purchase history: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150728&_ga=1.223689563.1158227939.1455084971


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Mus and Fyz were making me feel inferior...I don't feel so bad now. lol


I'm pretty sure it's just Fyzzz


----------



## Stige

Quote:


> Originally Posted by *buttface420*
> 
> i just got my new msi gaming 390 today and first thing i did after installing it and drivers was check asic quality and i got a very crappy 69.2%. which really sucks cause i was going to go with a nitro but everyone told me the msi was better binned and had better oc ability.


Stop believing people who post crap like that and you will be just fine.

There is no such thing as one manufacturer OCing better than another; that's just a load of crap, really. And ASIC doesn't mean anything either.
Quote:


> Originally Posted by *Mysticking32*
> 
> How are y'all getting such high graphics scores? These are the highest I've been able to get like ever lol. And I couldn't even run it at 1150 with 100mv because of artifacts so I'm currently just at the 1130mhz with 69mv. (lowest i could get it without artifacts) I think it's safe to say I got a horrible card lol.
> 
> http://www.3dmark.com/compare/fs/7485667/fs/7460709/fs/7460620


Well, my 390 scores about 300 more on graphics score than your 390X; just luck of the draw, I guess.
And BIOS modding for more voltage.


----------



## rdr09

Quote:


> Originally Posted by *Hethrus*
> 
> Now I have a very important question I can't find an answer for. I'm looking to watercool my XFX r9 390 DD black addition (will include link to newegg)
> I saw it says they're no longer compatible but since I bought mine in 10/8/2015 is it possible its one of the earlier ones before the change? How do I know for sure?
> And if it is which full cover waterblock is known to work? I would really appreciate some help.
> 
> Video card as shown in purchase history: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150728&_ga=1.223689563.1158227939.1455084971


Sgt. might be able to help . . .

http://www.overclock.net/t/1331663/xfx-black-double-dissipation-club

Quote:


> Originally Posted by *Vellinious*
> 
> lol, that too...I thought they knew those were 3D Mark 11 scores.
> 
> Yeah, I can't quite break 16k graphics in firestrike. The FS graphics scores are quite a bit lower than 19k. = P
> 
> This is my best Firestrike run without tess tweaks.
> 
> http://www.3dmark.com/fs/7358506


Quote:


> Originally Posted by *Mysticking32*
> 
> O i know that lol. I was talking about the other scores of graphics higher than 14k. Not on the 3dmark 11.
> 
> And I wish I could overclock higher. 1140 for me though lol. Is that normal? Anyone?


The only time I hit 14K in FS was when I used Win 10 and my 290 was OC'ed to 1320 core or something. It was cold that night.


----------



## jazz995756

Quote:


> Originally Posted by *Hethrus*
> 
> Now I have a very important question I can't find an answer for. I'm looking to watercool my XFX r9 390 DD black addition (will include link to newegg)
> I saw it says they're no longer compatible but since I bought mine in 10/8/2015 is it possible its one of the earlier ones before the change? How do I know for sure?
> And if it is which full cover waterblock is known to work? I would really appreciate some help.
> 
> Video card as shown in purchase history: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150728&_ga=1.223689563.1158227939.1455084971


I ran into this problem recently and found a forum post explaining how to tell. I can't find the link, but if you take your backplate off and post a picture, I can tell you whether it's possible. There is an inductor on the end that was revised from the original reference PCB and made slightly taller, which keeps the waterblock from fitting. That inductor should be marked with either a "C" or a "Z": if it has the "C" it will be compatible with the waterblock; if it has the "Z" it unfortunately won't work unless you modify the block and thin it down to clear the inductor.

This is the WB I used: http://www.frozencpu.com/products/24126/ex-blc-1714/EK_MSI_Gigabyte_Radeon_R9-290X_VGA_Liquid_Cooling_Block_Rev_20_-_Acetal_EK-FC_R9-290X_-_Acetal_Rev20.html

It didn't cause any issues. I just had to remove the mid plate on the GPU and replace the thermal pads the WB came with. It fits well except at the end of the card, where I wasn't able to put in one screw because the hole was slightly off center, but that didn't hurt performance, and it doesn't hit 60C even with a mild overclock.


----------



## Charcharo

Quote:


> Originally Posted by *kizwan*
> 
> You should get 12k for graphics score at stock which you did. Powercolor 290s tend to score a bit lower in bench compare to the other brands at same clock. Probably the same with 390.
> Relax, your score is not out of the ordinary.
> 
> I recalled many people claimed bench score in Win 7 is a bit higher than Win 10. And no, SSD will not improve the score.


Hmm, why do PowerColor R9 290s bench a bit lower than the rest? And even at stock, at that.


----------



## Stige

Quote:


> Originally Posted by *Mysticking32*
> 
> How are y'all getting such high graphics scores? These are the highest I've been able to get like ever lol. And I couldn't even run it at 1150 with 100mv because of artifacts so I'm currently just at the 1130mhz with 69mv. (lowest i could get it without artifacts) I think it's safe to say I got a horrible card lol.
> 
> http://www.3dmark.com/compare/fs/7485667/fs/7460709/fs/7460620


Well, my 390 scores about 300 more on graphics score than your 390X; just luck of the draw, I guess.
Quote:


> Originally Posted by *Charcharo*
> 
> Hmm, why do PowerColor R9 290s bench a bit lower than the rest? And even at stock, at that.


They won't, and they don't. It doesn't make any sense at all; they will/should all bench the same at the same clocks regardless of manufacturer.


----------



## TsukikoChan

Quote:


> Originally Posted by *patriotaki*
> 
> Pff damn the dual PSU adapter is expensive and I'm struggling atm.
> I just want to run for a few hours 2x 390 on my PC to do some benchmarks for my review site and YouTube channel..
> Is there any way I can do it without the adapter? I will be running only 1 GPU and I'll sell the other one afterwards.. What do you suggest me?


I don't know if you've got this working yet, but just thought I'd mention that some of the higher-end PSUs come with a wee adapter for turning on a PSU without a motherboard. At least the EVGA G2 750W does, so I assume most of the EVGA G/P ones do. Hope that helps if you haven't ordered your PSU yet.


----------



## mus1mus

The adapter mentioned previously is for syncing two PSUs to power up at the same time when you press the PC power button.

If just turning on the PSU is what you meant, it would be very simple to do.


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> The adaptor mentioned previously is for synching 2 PSUs to power up at the same time when you press the PC power button.
> 
> If just turning on the PSU is what you meant, it would be very simple to do.


ah, my bad, i didn't read fully then ^_^


----------



## kizwan

Quote:


> Originally Posted by *Charcharo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You should get 12k for graphics score at stock which you did. Powercolor 290s tend to score a bit lower in bench compare to the other brands at same clock. Probably the same with 390.
> Relax, your score is not out of the ordinary.
> 
> I recalled many people claimed bench score in Win 7 is a bit higher than Win 10. And no, SSD will not improve the score.
> 
> 
> 
> Hmm, why do PowerColor R9 290s bench a bit lower than the rest? And even at stock, at that.
Click to expand...

No idea. In the 290's case, PowerColor did release a new BIOS that should have fixed it, but the bench score was still a bit lower.


----------



## Razeraa

Quote:


> Originally Posted by *Dundundata*
> 
> did you increase power limit as well


I did yes, +50


----------



## navjack27

Re-learning the ins and outs of CPU overclocking these past couple of days, and it's interesting how, if you don't have a high enough CPU input voltage, the PCI Express slot craps out when it tries to go full x16 3.0 mode.

Besides that little tidbit, I have a working 4.3GHz overclock with a bit better results now, also with a 4000MHz uncore. I'll probably run another 3DMark pass for the contest to see if I score even a tiny bit better. GO TEAM RED!

Don't forget to get in on this; even if you score low, your result helps us all win!


----------



## Vellinious

I need to get some scores submitted for that.


----------



## sil130

Hey guys, I need help. My MSI R9 390 suddenly became 6-10C hotter without any changes. I haven't changed any voltages, clocks, or fan profiles whatsoever. I just left it downloading overnight, and when I woke up my idle temp was 63C when it's supposed to be 54-56C. Now my max temp in AC:S is 82C; I've already finished the game and my max temp before was only 75C. Has anyone else experienced this?


----------



## DirtyDutch

Today I got my second-hand MSI R9 390X. I was so ******* thrilled, and I was going to post in the thread to join you here.



But when I opened the boxes, I was missing the receipt for the warranty. This is when I started to get nervous. This is what I found in the box.









Take a good look at the picture below: not only is the cable hanging out, look at the steel that is ground away. :O



Then i saw the warranty label was broken.


So I started to assemble the card. *** happened to this beautiful card is a mystery to me???


----------



## DirtyDutch




----------



## Stige

What the actual fak? Scammed pretty much?


----------



## jodybdesigns

What were they trying to do with that? Modify it so they could use a universal block or something? That's awful. What a scam...


----------



## battleaxe

Quote:


> Originally Posted by *DirtyDutch*


wow... That's rough... Who the heck would do such a thing? Shoddy man... Shoddy


----------



## JreyE30

Quote:


> Originally Posted by *sil130*
> 
> Hey guys I need help. My MSI r9 390 suddenly became 6-10C hotter w/o changes.I havent change any voltage,clocks, fan profiles any whatsover. I just left it overnight downloading something and when I woke up my idle temp is 63C, which supposed to be 54-56C. Now my max temp on AC:S is 82C, ive already finished the game and my max temp before was only 75C. Anyone experienced this also?


What are your ambient temps? Sounds like the case it's in isn't getting enough airflow, or something's up if you haven't messed with any settings or voltages.


----------



## sil130

Quote:


> Originally Posted by *JreyE30*
> 
> What is your ambient temps? Sounds like the case it's in isn't getting enough flow or somethings up if you haven't messed with any settings or voltages


I don't have a thermometer here, but I think it's 20-24C. I don't think it's airflow either, because I've been using this for weeks and haven't changed any fan settings, so it's impossible to suddenly have bad airflow. I guess it's a driver problem or the card itself.


----------



## christoph

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1mm should be fine and get 11w or 17w.


Is 17 W/mK supposed to be better?
Quote:


> Originally Posted by *Stige*
> 
> http://www.ebay.com/itm/111732983134
> 
> This is what I bought.


These are 7 W/mK; is that number better as it goes higher or lower?


----------



## JreyE30

Quote:


> Originally Posted by *sil130*
> 
> I dont have a thermostat here but i think its 20-24C. I dont think its airflow either because ive been using this for weeks and i havent changed any settings on the fan, so its impossible to suddenly have a bad airflow. I guess its a driver problem or the card itself..


Interesting. Do you have the latest 16.1.1?


----------



## DirtyDutch

Thank God I could trace the guy on Facebook. I called his mom and she refunded all my money. Pfeeew.


----------



## buttface420

Quote:


> Originally Posted by *DirtyDutch*
> 
> Today I got my second-hand MSI R9 390X. I was so ******* thrilled, and I was going to post in the thread to join you here.
> 
> But when I opened the boxes, I was missing the receipt for the warranty. This is when I started to get nervous. This is what I found in the box.
> 
> Take a good look at the picture below: not only is the cable hanging out, look at the steel that is ground away. :O
> 
> Then I saw the warranty label was broken.
> 
> So I started to assemble the card. *** happened to this beautiful card is a mystery to me???


WHY.... WHYYYYYYYYYY. Why did they hack up such an already beastly card? SMH.


----------



## DirtyDutch

I'm guessing they tried to fit some aftermarket watercooling on it.


----------



## Lixxon

Hey! I'm new here. I've got a Sapphire 390X, and hopefully someone here could help me out.

These are my current stats: http://i.imgur.com/wkPfk1J.png
It's just a bit more than stock speeds, but my question is really how I should go forward.
1: I know that when overclocking I should only increase in steps of 5-20 MHz, checking for stability and artifacts, but before overclocking further, should I max out the power limit % slider first? (+50)
2: Does increasing the memory clock reduce the core clock I can reach? Will I get a higher core clock if I skip the memory clock, or do both need to be increased? (Does core clock give better results than memory?)
3: As far as I've read, as long as I don't touch "core voltage" I'm not risking burning or destroying my card?
So going forward, what do you guys think: is it OK to put my power limit % to the max, +50, and then start raising the core and memory clocks?
Thanks in advance! Hope someone can help me out =)


----------



## Stige

Quote:


> Originally Posted by *christoph*
> 
> is 17w suppose to be better??
> these are 7w, is that number better as it goes higher or lower??


Higher is better, but even these provided a big improvement over the stock pads for me.

Anything above 7 W/mK and the price goes up fast.
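For a rough sense of why the W/mK number matters: under steady-state conduction (Fourier's law), the temperature drop across a pad scales inversely with its conductivity. A minimal sketch with hypothetical numbers (a VRM phase dumping 5 W through a 1 mm pad over 1 cm² of contact; real loads and areas vary by card):

```python
# Temperature drop across a thermal pad via Fourier's law: dT = P * t / (k * A).
# The power, thickness, and contact area below are illustrative assumptions.

def pad_delta_t(power_w, thickness_m, conductivity_w_mk, area_m2):
    """Steady-state temperature drop (in C) across the pad."""
    return power_w * thickness_m / (conductivity_w_mk * area_m2)

area = 1e-4           # 1 cm^2 of contact
thick = 1e-3          # 1 mm pad
for k in (3, 7, 17):  # roughly: stock-ish pad, Phobya, Fujipoly Extreme
    print(f"{k:2d} W/mK -> {pad_delta_t(5, thick, k, area):.1f} C across the pad")
```

Going from ~3 W/mK to 17 W/mK cuts the drop across the pad from roughly 17C to about 3C in this toy case, which is consistent with the 7-8C improvements people report.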


----------



## sil130

Quote:


> Originally Posted by *DirtyDutch*
> 
> Thank God I could retrace the guy on Facebook, I called his mom and she refunded all my money. Pfeeew


Yep. Installed it last night.


----------



## JreyE30

Quote:


> Originally Posted by *DirtyDutch*
> 
> I'm guessing they tryed to fit some aftermarket watercooling on it.


That's funny; I don't think they knew it's not the reference PCB. As far as I know, most places (EK, Aqua, etc.) won't make custom water blocks for custom-made PCBs, and besides, the card's temps are perfectly fine.


----------



## patriotaki

Seven days from now I'll have 2x PCS+ 390s in my hands.

So excited..


----------



## mus1mus

Quote:


> Originally Posted by *Lixxon*
> 
> Hey! Im new here I got sapphire 390x.... Hopefully someone here could help me out:
> 
> This is my current stats: http://i.imgur.com/wkPfk1J.png
> It's just a bit more than stock speeds, but my question is really how I should go forward.
> 1: I know that when overclocking I simply only increase by 5-20MHz steps max to check if it's stable etc., and back off if I see any artifacts, but before overclocking even more should I increase the power limit % slider to the max before starting? (+50)
> 2: Does increasing memory clock reduce the amount of core clock you can do? Will I get a higher core clock if I skip memory clock, or do they both need to be increased? (Does core clock give better results than memory?)
> 3: As far as I have been checking, as long as I don't touch "core voltage" I'm not risking burning or destroying my card?
> So now going forward, what do you guys think/know? I wonder if it's OK to put my power limit % to the max, +50? Then I start adding higher clocks to core clock and memory clock.
> Thanks in advance! Hope someone can help me out =)


1. +50 Power Limit ALWAYS. Even if you are not OC'ing.

2. Find which memory strap the card maxes out at. To do that, add +100 core voltage in MSI AB and try the straps: 1500, 1625, 1750, etc.

If you can make it run at 1750, you're off to a good start. Do a GPU-Z render for a little load to verify the memory clocks are at least stable.

3. Set the core frequency to whatever value produces no artifacts.

4. Adjust the fan speed to your preference.

5. Tune the voltage back down to the lowest that still suits your overclock, for better temps.
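Taken together, those steps are a step-and-test search: raise one knob in small increments, verify stability, back off when it fails. A minimal sketch of that logic in Python (purely illustrative; `is_stable` stands in for a real benchmark/artifact check, since Afterburner isn't scriptable this way):

```python
def find_max_clock(start, step, limit, is_stable):
    """Raise the clock in small steps while each new step passes a stability check."""
    clock = start
    while clock + step <= limit and is_stable(clock + step):
        clock += step
    return clock

# Toy stand-in for "run Heaven/Firestrike and watch for artifacts":
# pretend this particular card artifacts above 1180MHz.
card_ok = lambda mhz: mhz <= 1180

best_core = find_max_clock(start=1040, step=10, limit=1300, is_stable=card_ok)
print(best_core)  # 1180
```

The same loop applies to memory, except you would step between straps (1500, 1625, 1750) rather than in 10MHz increments.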


----------



## mus1mus

Quote:


> Originally Posted by *DirtyDutch*
> 
> Thank God I could retrace the guy on Facebook, I called his mom and she refunded all my money. Pfeeew


That sounds better.

Now, will the card stay?


----------



## DirtyDutch

Quote:


> Originally Posted by *mus1mus*
> 
> That sounds better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now , will the card stay?


Prob go for an MSI R9 390 non-X.


----------



## christoph

Quote:


> Originally Posted by *Stige*
> 
> Higher is better but even these provided a big improvement over the stock pads for me.
> 
> Anything above 7w/mk and the price goes up fast.


Thanks for replying;

and yes, I've noticed that the price goes up as the pads improve.


----------



## DirtyDutch

What brand R9 390 card gets the best OC?


----------



## Stige

Quote:


> Originally Posted by *DirtyDutch*
> 
> What brand R9 390 card gets the best OC?


It doesn't matter what brand...

Buy the Sapphire because it has the best cooler, simple as that.


----------



## DirtyDutch

Quote:


> Originally Posted by *Stige*
> 
> It doesn't matter what brand...
> 
> Buy the Sapphire because it has the best cooler, simple as that.


Yes, the Sapphire was at the top of my list for the cooling. Prob going for the one with the backplate.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DirtyDutch*
> 
> What brand R9 390 card gets the best OC?
> 
> 
> 
> It doesn't matter what brand...
> 
> Buy the Sapphire because it has the best cooler, simple as that.

Good advice. +1

Cooling is better than clocks.

Clocks are all about getting either lemons or golden chips, while cooling is definitely needed either way.


----------



## jdorje

ASIC quality just means the percentile of your stock VID, right? 70% means your stock VID is lower than 70% of other cards in the sample set.

There is a correlation between that and overclocking ability, but it's a fairly loose one.

The MSI 390/X overclocks the best because it has 2 extra VRM phases. Right?


----------



## ZealotKi11er

Quote:


> Originally Posted by *jdorje*
> 
> Asic quality just means the percentile of your stock vid right? 70% means your stock vid is lower than 70% of other cards in the sample set.
> 
> There is a correlation between that and overclock ability but it's a fairly loose one.
> 
> The msi 390/x overclocks the best because it has 2 extra vrms. Right?


Nobody really knows. There is no correlation to better overclocks or anything.


----------



## mus1mus

Quote:


> Originally Posted by *jdorje*
> 
> Asic quality just means the percentile of your stock vid right? 70% means your stock vid is lower than 70% of other cards in the sample set.
> 
> There is a correlation between that and overclock ability but it's a fairly loose one.
> 
> The msi 390/x overclocks the best because it has 2 extra vrms. Right?


1. YES, and a little nope.
2. Nope.
3. Nope as well.

Higher ASIC quality does mean a lower stock VID, and 70% corresponds to a specific VID, but not quite the percentile reading in your second statement.

A higher ASIC might mean you can clock fairly well at lower voltages, *but by how much is another matter*.

MSI may have a better board, but that gives them no control over the chip's OC ability. Unless you buy a guaranteed one, like a LIGHTNING for example. But even then, what you can achieve is still a lottery.
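If it helps, the percentile reading of ASIC quality is easy to picture in code. A hypothetical sketch (the VID sample below is invented for illustration, not AMD's actual binning data):

```python
# Invented sample of stock VIDs (volts) across a batch of Hawaii chips.
sample_vids = [1.30, 1.28, 1.27, 1.25, 1.24, 1.22, 1.21, 1.20, 1.19, 1.17]

def asic_percentile(my_vid, sample):
    """Share of sampled cards whose stock VID is higher (worse) than mine."""
    return 100 * sum(v > my_vid for v in sample) / len(sample)

print(asic_percentile(1.21, sample_vids))  # 60.0: lower stock VID than 60% of cards
```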


----------



## Dundundata

Quote:


> Originally Posted by *buttface420*
> 
> i just got my new msi gaming 390 today and first thing i did after installing it and drivers was check asic quality and i got a very crappy 69.2%. which really sucks cause i was going to go with a nitro but everyone told me the msi was better binned and had better oc ability.


Mine is lower than that and my card OCs rather well.


----------



## Vellinious

ASIC quality doesn't seem to matter as much on the Hawaii architecture as it does on Maxwell. Kepler, and Fermi before it, were pretty random too.

Totally random thought: I have got to find a custom bios that'll help performance on this card....


----------



## battleaxe

So close to 1700, so close. Pushing to 1250MHz core made it artifact a bit; I didn't want it to crash, so I settled on 1245. Before VRM1 got into the 70s C it was fine, but once over 70C I start to see some flashes. Signs she's about to go black. Gotta get it cooler. I've been working on figuring out what to do; aside from putting the PC out in the cold (which I may do), I'm not sure how to get the VRMs colder. Seems it's just the voltage making them so hot. If I run at 1000MHz and -75mV (which is stable), my VRMs barely go over 40C. Overall, very happy with this card though.
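For what it's worth, that matches the usual dynamic-power rule of thumb, P ≈ C·V²·f: raising voltage and clock together grows the load on the VRMs much faster than either alone. A rough illustration (the voltages are ballpark figures for the sake of the math, not measurements from this card):

```python
def relative_power(v, f, v0=1.20, f0=1000):
    """Dynamic power relative to a baseline voltage (V) and clock (MHz): P ~ V^2 * f."""
    return (v / v0) ** 2 * (f / f0)

# ~1.30V @ 1245MHz vs ~1.125V @ 1000MHz (roughly stock -75mV):
ratio = relative_power(1.30, 1245) / relative_power(1.125, 1000)
print(round(ratio, 2))  # 1.66, i.e. about two-thirds more heat for the VRMs to shed
```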


----------



## Mysticking32

Welp guys, I just learned an important lesson. Apparently my case's airflow sucked complete a$$. I added 3 fans since the last time I posted (replaced my old ones), and my god, the temps have been much more reasonable: a 10-12 degree drop in GPU temps. Before, I was hitting 86-87 degrees @ 1130MHz with +69mV. Then I added 2 fans and it dropped down to 81. Today I added the third fan and the temps dropped all the way to 75 at the same settings.

I'm very impressed. Probably going to replace my other fans too; didn't even think about that. Still can't overclock to 1150 past +100mV though (I prefer to stay under +100). Although now I think I might be able to add more voltage since the temps are dropping so drastically. I'll try to wait till I can order at least 2 more fans before going over +100mV. The last time I did that (before I added the 3 fans) I was hitting 89-91, which I was not happy with lol.


----------



## tolis626

Quote:


> Originally Posted by *battleaxe*
> 
> So close to 1700, so close. Pushing to 1250mhz core made it artifact a bit, I didn't want it to crash so settled on 1245. Before VRM1 got into the 70's C it was fine, but once over 70C I start to see some flashes. Signs she's about to go black. Gotta get it cooler. I've been working on figuring out what to do, aside from putting the PC out in the cold (which I may do) I'm not sure how to get the VRM's colder. Seems its just the voltage making them so hot. If I run at 1000mhz and -75mv (which is stable) then my VRM's barely go over 40C. Overall, very happy with this card though.


Dude... I know lower temps are supposed to help stabilize an overclock but that is ridiculous. 1245MHz at 1.24V? Mine won't do 1200MHz at 1.28V. Damn... Share your secrets with me!









EDIT: Seeing as it's a particularly cold day (by Greek standards, at least; it's like 10C outside) and the house is like a fridge because I forgot to turn on the heating, I decided to relieve my frustration by benchmarking stuff, to make the best of the situation. The only thing I managed was even more frustration. Some of my results: 1200/1625MHz at +125/+25mV, 1210/1625MHz at +150/+25mV and, my personal favorite, 1225/1625MHz at +175/+25mV. I was at least happy that it didn't explode; otherwise I was ready to throw the thing out the window. Temps didn't break 66C on the core and 60C on the VRM (10-15C ambient, side panel off and fans blasting at 100% does provide substantial cooling, it seems), so those aren't the issue. I can't for the life of me explain why my scores go down the higher I go. There also was no artifacting as far as I could tell. And then there's this 1185/1725MHz at +100/+50mV run that obliterates (OK, I exaggerate here, but still) all higher overclocks. Also, my memory refuses to do 1750MHz no matter how much voltage I throw at it. Boom, black screen or RSOD. Kind of ridiculous, really. 1725MHz is fine for anything other than prolonged gaming.

So... Any ideas are welcome. Both for overclocking/benchmarking and anger management.

EDIT 2 : Did a suicide run at 1225/1725MHz at +200/+25mV. Still, no go. 14400 plus change. What the...?


----------



## gupsterg

Quote:


> Originally Posted by *jdorje*
> 
> Asic quality just means the percentile of your stock vid right? 70% means your stock vid is lower than 70% of other cards in the sample set.
> 
> There is a correlation between that and overclock ability but it's a fairly loose one.
> 
> The msi 390/x overclocks the best because it has 2 extra vrms. Right?


The Stilt has posted info about "ASIC Quality", this post.

There are also some great explanations / other info regarding say LL effect on litecoin forum by The Stilt, his thread.


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Dude... I know lower temps are supposed to help stabilize an overclock but that is ridiculous. 1245MHz at 1.24V? Mine won't do 1200MHz at 1.28V. Damn... Share your secrets with me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT : Seeing as it's a particularly cold day (By Greek standards, at least. It's like 10C outside) and the house is like a fridge because I forgot to turn on the heating, I decided to relieve my frustration by benchmarking stuff, so that I could make the best of a situation. Only thing I managed to do is cause me even more frustration. Some of my results are the following : 1200/1625MHz at +125/+25mV, 1210/1625MHz at +150/+25mV and, my personal favorite 1225/1625MHz at +175/+25mV. I was at least happy that it didn't explode, because otherwise I'm ready to throw the thing out the window. Temps didn't break 66C on the core and 60C on the VRM (10-15C ambient, side panel off and fans blasting at 100% does provide substantial cooling it seems), so these aren't the issue. I can't for the life of me explain why my scores go further down the higher I go. There also was no artifacting as far as I could tell. And then there's this 1185/1725MHz at +100/+50mV run that obliterates (ok, I exaggerate here but still) all higher overclocks. Also, my memory refuses to do 1750MHz no matter how much voltage I throw at it. Boom, black screen or RSOD. Kind of ridiculous, really. 1725MHz is fine for anything other than prolonged gaming.
> 
> So... Any ideas are welcome. Both for overclocking/benchmarking and anger management.
> 
> EDIT 2 : Did a suicide run at 1225/1725MHz at +200/+25mV. Still, no go. 14400 plus change. What the...?


There's no secret. It's just the lottery. My VRMs are way too hot; otherwise I'm sure it could go higher on the core. I can't figure out how to get them colder. Thinking about buying a brand new block and modifying it so it will fit this card, as it's not supported by any block vendors ATM. Your card really isn't bad at all. Good, even; mine is just a little better than average is all. There are some out there even better than mine, just the way it goes.


----------



## tolis626

Quote:


> Originally Posted by *battleaxe*
> 
> There's no secret. Its just the lottery. My VRM's are way too hot, otherwise I sure it could go higher on the core. I can't figure out how to get them colder. Thinking about buying a brand new block and modifying it so it will fit this card as its not supported by any block vendors ATM. You're card really isn't bad at all. Good even, mine is just a little better than average is all. There are some out there even better than mine, just the way it goes.


I know it's the lottery. By "secrets", I meant to teach me how to win the damn lottery.









On a more serious note, the only thing keeping me from slapping an AIO or something on my card is that its resale value will tank completely. Or at least I think it will. Maybe I'll change my mind about that as summer approaches. Temps do improve things a lot for GPUs, so I do not think 1200MHz is out of the realm of possibility for 24/7 operation under water. For how long that would work... That's another story. I have no idea how fast voltage and current can damage a GPU, but I tend to think that if temps are in check it's all fine, really.

Other than that, do you have any ideas why my scores suck so bad? I mean, they're not bad per se, but I see marginal improvements over 1150MHz and even a regression in performance over 1185MHz. Is it just that these GPUs hate voltage that much? I hoped not as long as temps aren't the issue...


----------



## Lixxon

Quote:


> Originally Posted by *mus1mus*
> 
> 1. + 50 Power Limit ALWAYS. Even if you are not OC'ing.
> 
> 2. Try which memory strap the card maxes out. To do that, ADD +100 Core Voltage on MSI AB, and try the straps as, 1500,1625,1750, etc.
> 
> If you can make it to run at 1750, you are in to a good start. Do a GPU-Z render for a little load to verify Memory Clocks are stable at least.
> 
> 3. Set Core Frequency to whatever value that allows no artifacts.
> 
> 4. Adjust fan speed to your preference.
> 
> 5. Tune down the Voltage for better temps that is suitable for your Overclock.


Oh, any special reason it should be +50 power limit always? Never heard about memory straps; is it a specific speed it has to be? (I can't put 1635, for example?)


----------



## Stige

Quote:


> Originally Posted by *tolis626*
> 
> Dude... I know lower temps are supposed to help stabilize an overclock but that is ridiculous. 1245MHz at 1.24V? Mine won't do 1200MHz at 1.28V. Damn... Share your secrets with me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT : Seeing as it's a particularly cold day (By Greek standards, at least. It's like 10C outside) and the house is like a fridge because I forgot to turn on the heating, I decided to relieve my frustration by benchmarking stuff, so that I could make the best of a situation. Only thing I managed to do is cause me even more frustration. Some of my results are the following : 1200/1625MHz at +125/+25mV, 1210/1625MHz at +150/+25mV and, my personal favorite 1225/1625MHz at +175/+25mV. I was at least happy that it didn't explode, because otherwise I'm ready to throw the thing out the window. Temps didn't break 66C on the core and 60C on the VRM (10-15C ambient, side panel off and fans blasting at 100% does provide substantial cooling it seems), so these aren't the issue. I can't for the life of me explain why my scores go further down the higher I go. There also was no artifacting as far as I could tell. And then there's this 1185/1725MHz at +100/+50mV run that obliterates (ok, I exaggerate here but still) all higher overclocks. Also, my memory refuses to do 1750MHz no matter how much voltage I throw at it. Boom, black screen or RSOD. Kind of ridiculous, really. 1725MHz is fine for anything other than prolonged gaming.
> 
> So... Any ideas are welcome. Both for overclocking/benchmarking and anger management.
> 
> EDIT 2 : Did a suicide run at 1225/1725MHz at +200/+25mV. Still, no go. 14400 plus change. What the...?


I noticed the same thing. I was increasing/lowering voltage and core speeds one at a time and still couldn't figure it out. At some point there seems to be more harm than good from extra voltage; anything past +150mV gives me very little, if any, benefit at all.

Also, if I game for an extended time at like +175mV or +200mV, I eventually just get a black screen and have to reset my BIOS to get a picture out again.
Weird... I'm thinking maybe the PCI-E lane can't handle the juice or something, given what this thing eats at +200mV.

EDIT: Just to clarify, my core stays at 45C or below at all times even at +200mV, and the VRM doesn't go higher than 75C at +200mV.
I don't think improving the temps much further would help my case very much, at least. There isn't much I can do to improve them from this point on anyway.

I just need to open the block up and confirm it's making contact between the GPU block and the heatsink itself, as some users have reported in the Alphacool GPX thread that their blocks are making zero contact with the heatsink! That could explain my high VRM temps before a few fixes as well.


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> I know it's the lottery. By "secrets", I meant to teach me how to win the damn lottery.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a more serious note, the only thing keeping me from slapping an AIO or something on my card is that its resale value will tank completely. Or at least I think it will. Maybe I'll change my mind about that as summer approaches. Temps do improve things a lot for GPUs, so I do not think 1200MHz is out of the realm of possibility for 24/7 operation under water. For how long that would work... That's another story. I have no idea how fast voltage and current can damage a GPU, but I tend to think that if temps are in check it's all fine, really.
> 
> Other than that, do you have any ideas why my scores suck so bad? I mean, they're not bad per se, but I see marginal improvements over 1150MHz and even a regression in performance over 1185MHz. Is it just that these GPUs hate voltage that much? I hoped not as long as temps aren't the issue...


There do seem to be diminishing returns on these, but I think it's temperature related. If you get your card cooler, try running the higher volts and see what happens. That's my theory anyway. As I got the core colder it seemed to increase stability and I could go for higher clocks; that's the only reason it went to 1245MHz, because the core was under 50C. That said, there is going to be a trade-off at some point.


----------



## Vellinious

Quote:


> Originally Posted by *battleaxe*
> 
> There does seem to be something of diminishing returns on these. But I think it is temperature related. If you get your card cooler try running the higher volts and see what happens. That's my theory anyway. As I got the core colder then it seemed to increase stability and I could go for higher clocks, that's the only reason it went to 1245mhz because the core was under 50c. That said, there is going to be a trade-off at some point.


This... temps definitely played a part. In my testing on air before my block arrived, I could get it up to about 1250ish. My cap was more temp related though, as I could tell there was more there. Under water, I can hit as high as 1330 on Firestrike / Heaven runs.


----------



## navjack27

It's not temps; from discoveries in the BIOS editing thread, it's mainly power related.

EDIT: Wait, well yeah... for higher clocks you need a lower nominal temp for stability, lower than the highest you would consider stable at whatever clocks you use without a voltage adjustment. Like, I use 1100/1625 for 24/7 with a custom fan curve.

But when I take it up to 1175/1750 with +125mV on the core, I don't chance anything and just fix the fan at 100%.
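A fan curve like the one described is just linear interpolation between temperature/duty points. A minimal sketch (the points below are made up for illustration, not navjack27's actual curve):

```python
# Hypothetical (temperature C, fan duty %) points, sorted by temperature.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]

def fan_duty(temp):
    """Linearly interpolate the fan % from the curve, clamping at both ends."""
    if temp <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp <= t1:
            return d0 + (d1 - d0) * (temp - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(50))  # 40.0, halfway between the 40C and 60C points
```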


----------



## DirtyDutch

Count me in.


----------



## THUMPer1

Can't get 1200 core on my 390X even at +60mV. I get artifacts, so I gave up. Great card either way; FreeSync is the ****!


----------



## ZealotKi11er

Quote:


> Originally Posted by *THUMPer1*
> 
> Can't get 1200 core on my 390x even at +60 core. I get artifacts, so I gave up. Great card either way, freesync is the ****!


Try +100-150.


----------



## navjack27

Quote:


> Originally Posted by *THUMPer1*
> 
> Can't get 1200 core on my 390x even at +60 core. I get artifacts, so I gave up. Great card either way, freesync is the ****!


Might be talkin' out my ass here, but not many with a 390X go that high... and if they do, it's just for QUICK benchmark runs, and then they douse their cards with water and ice to shock them out of fever and delirium.


----------



## mus1mus

Quote:


> Originally Posted by *Lixxon*
> 
> Oh any special reason it should be +50 power limit always? Never heard aboutt memory straps is that specific speed that has to be? ( I cant put 1635 for example?)


Maxing the power limit allows the card to pull the power it needs, instead of being limited by power starvation.

Unless you modify your BIOS's memory timings, 1625 will be better than 1635. You can just test it.
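The strap effect can be pictured as each timing set covering clocks up to its limit; cross a limit and the card falls into the next, looser set. The limits below are commonly cited Hawaii strap boundaries, used here purely for illustration:

```python
# Illustrative Hawaii memory-strap limits (MHz); each strap holds one timing set
# that applies to clocks up to its limit.
STRAP_LIMITS = [1250, 1375, 1500, 1625, 1750, 2000]

def strap_for(mem_clock):
    """Return the strap (timing set) a given memory clock runs under."""
    return min(s for s in STRAP_LIMITS if s >= mem_clock)

print(strap_for(1625))  # 1625: tight timings right at the limit
print(strap_for(1635))  # 1750: looser timings, which is why 1625 often benches faster
```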


----------



## battleaxe

Quote:


> Originally Posted by *navjack27*
> 
> might be talkin out my ass here but not many with a 390x go that high... and if they do its just for QUICK benchmark runs and then they douse their cards with water and ice to shock it out of fever and delirium.


LOL...


----------



## Vellinious

Did a little testing tonight with core / memory clocks and voltages. It didn't seem to matter what I set the voltage at above +225: stability didn't increase, and max/min voltages stayed the same during the runs, 1.305 / 1.273 respectively. Of course, when it wasn't under load, peak read 1.443v....

Got the best results with 1293 on the core and between 1814 and 1820 on the memory. By the end, when I ran this, the coolant temp was starting to crawl past 5C over ambient, so I think if I go at it fresh, when the coolant temps are lower, it may run a bit better. Thinking about lowering the ambient temp by another 5C or so to see if that doesn't make a difference.

Still.... pleased with the progress. That was tess-off testing; I still couldn't break 16k with tess enabled.


----------



## kizwan

@Vellinious you'll need to remove the voltage limit to push more.


----------



## Vellinious

Quote:


> Originally Posted by *kizwan*
> 
> @Vellinious you'll need to remove the voltage limit to push more.


Yeah, I need to figure out how to do that. Not sure if raising the power limits/TDP would do anything, but I'm going to try that tomorrow as well.


----------



## Synntx

Quote:


> Originally Posted by *tolis626*
> 
> Holy hell that's fast. What voltages are you using? My card craps out at anything over 1.28V under load.


I was pushing +175mv on that test. Keep in mind, this is, in no way, stable. That was just to push it to the max on the bench.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Did a little testing tonight with core / memory clocks and voltages. Didn't seem to matter what I set the voltage at above +225, stability didn't increase, and max/min voltages stayed the same during the runs. 1.305 / 1.273 respectively. Of course, when it wasn't under load, peak read 1.443v....
> 
> Got the best results with 1293 on the core and between 1814 and 1820 on the memory. By the end, when I ran this, the coolant temp was starting to crawl up past 5c over ambient, so, I think if I go at it fresh, when the coolant temps are lower, it may be able to run a bit better. Thinking about lowering the ambient temp by another 5c or so, see if that doesn't make a difference.
> 
> Still....please with the progress. Tess off testing...still couldn't break 16k with tess enabled.


Maybe worth looking at this BIOS, though you have 8GB:

https://www.kingpincooling.com/forum/archive/index.php/t-2637.html

And/or try that GPU Tweak that can push more than 1.6V.

I tested this on the linked BIOS.

Without GPU Tweak, V = 1.367 max.
With GPU Tweak, I won't tell. Might freak some guys out.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Maybe worth looking at this BIOS. You have 8GB Though.
> 
> https://www.kingpincooling.com/forum/archive/index.php/t-2637.html
> 
> And/or try that GPU Tweak that can push more than 1.6V
> 
> I tested this on the linked BIOS.
> 
> Without GPU TWEAK, V=1.367 max.
> With GPU TWEAK, I wont tell. Might freak some guys out.


I'll try the GPU Tweak. I used the Hawaii BIOS editor and raised the power limit / TDC limit and TDP max to 250. Didn't change anything else; I just wanted to try higher power limits. It booted up just fine. I opened HIS iTurbo, started adding the test clocks (1270 core / 1790 memory), pushed the power limit up to 50%, enabled it, and... the screen went fuzzy. Took a reboot to reset it.

It seems like ANY change to the BIOS just causes an inordinate amount of instability. I don't get it.....


----------



## patriotaki

How can I run 2x R9 390s with 2 PSUs? I don't have an adapter. I just want to run them for a few hours... any safe way?


----------



## Stige

Hasn't it been said several times already that you only need to jump the 2 pins on the other PSU to power it on...


----------



## rdr09

Quote:


> Originally Posted by *Stige*
> 
> Hasn't it been said several times already that you only need to jump the 2 pins on the other PSU to power it on...


Which should be turned on first? It would be nice if you could turn them on at exactly the same time, like you do with an adapter.


----------



## Stige

Quote:


> Originally Posted by *rdr09*
> 
> Which should be turned on first? Be nice if you can turn them on exactly same time like you do with an adapter.


It doesn't matter much which one is turned on first. Just run the wires needed from your PC to the second PSU and it should work just fine. And turn them on at the same time.


----------



## THUMPer1

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Try +100-150.


Afterburner only goes up to +100. Even +100 at 1200 doesn't seem to be enough; I still get some artifacting, not as much, but I completed the Heaven bench and only gained 10 points.


----------



## Stige

Quote:


> Originally Posted by *THUMPer1*
> 
> Afterburner goes up to 100. Even +100 on 1200 it doesn't seem to be enough I still get some artifacting, not as much but I completed the heaven bench and only gained 10 points.


BIOS modding is the best way to get extra voltage, or you can use some crappy software like TriXX that allows +200mV.


----------



## THUMPer1

Quote:


> Originally Posted by *Stige*
> 
> BIOS modding is the best way for extra voltage, or you can use some crappy software like Trixxx that allows +200mV.


Eh, I don't really care that much now. I'll sell it once the new cards come out anyway.


----------



## spyshagg

Shameless copy-paste, but it is a call to arms.

Jump on this will ya!!!!!

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/250_50#post_24886638

Let's do it, come on!


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> Which should be turned on first? Be nice if you can turn them on exactly same time like you do with an adapter.


Jump from green on the 24-pin to black on the 24-pin of the same connector. You will need a separate power cord to the extra PSU; when you plug it in, it will power on. I usually power on the auxiliary PSU first, then turn the PC on, so the extra GPU has power when it boots and registers in the BIOS. It should pick it right up; I've never had a problem with this.

You can use something simple like a paper clip to jump between the green and black wires on the connector, super easy.


----------



## TsukikoChan

Quote:


> Originally Posted by *battleaxe*
> 
> You can use something simple like a paper clip to jump between the green and black wires on the connector, super easy.


I'm sure this should work and all but taking a paperclip to a live PSU to turn it on sounds like something that could get someone a darwin award if not done right ^^;;


----------



## battleaxe

Quote:


> Originally Posted by *TsukikoChan*
> 
> I'm sure this should work and all but taking a paperclip to a live PSU to turn it on sounds like something that could get someone a darwin award if not done right ^^;;


This was a common practice among miners. It's only a low-voltage trigger signal, not a powered feed from the wall. It's not dangerous, and it doesn't even do anything if you short it against a wall of the case. I understand the concern, but in this case there's nothing to fear. I've done it probably thirty times in the last three years.


----------



## TsukikoChan

Quote:


> Originally Posted by *battleaxe*
> 
> This was a common practice among miners. It's only a 12v trigger, not a powered feed from the wall. It's not dangerous, nor does it even do anything if you short it against a wall of the case. I understand the concern, but in this case, there's nothing to fear. I've done it probably thirty times in the last three years.


hehe, oh i know it will work fine but i'm more concerned with any individuals who might short the wrong sections of the live psu ^.^ i keep envisioning someone taking a fork to a wallsocket haha.


----------



## battleaxe

Quote:


> Originally Posted by *TsukikoChan*
> 
> hehe, oh i know it will work fine but i'm more concerned with any individuals who might short the wrong sections of the live psu ^.^ i keep envisioning someone taking a fork to a wallsocket haha.


Well, yeah. If they stuck a paper clip into the red wire and then the black and plugged it in, something would likely pop. I'm assuming he's not color blind, haha... Honestly, the green is easy to pick out, and there are black wires on the same side. It's kinda hard to screw up, TBH. Take a look and you'll see what I mean.


----------



## rdr09

Quote:


> Originally Posted by *Stige*
> 
> It doesn't matter which one is turned on first. Just wire the wires needed from your PC to the second PSU and should work just fine. And turn on at the same time.


Quote:


> Originally Posted by *battleaxe*
> 
> Jump from green on the 24pin to black on 24 pin of the same connector. You will need a sep power cord to the extra PSU. When you plug it in it will power on. I usually power on the auxillary PSU first, then turn the PC on and the extra GPU will have power when it boots to register for BIOS. Should pick it right up, I've never had a problem with this.
> 
> You can use something simple like a paper clip to jump between the green and black wires on the connector, super easy.


I know about jumping the green and the black; I use an adapter that came with my EVGA PSU to power my water pump when filling the loop.

So, either turn them on at the same time, or the auxiliary first (the PSU that's only powering the secondary GPU).


----------



## DirtyDutch

Got my Sapphire R9 390 running steady without any artifacts at 1140/1650MHz, +50mV, and temps at 70°C.
Is this any good? Kinda new to overclocking.


----------



## Dundundata

Quote:


> Originally Posted by *DirtyDutch*
> 
> Got my Sapphire R9 390 running steady without any artifacts at 1140/1650MHz, +50mV, and temps at 70°C.
> Is this any good? Kinda new to overclocking.


sounds good


----------



## christoph

Quote:


> Originally Posted by *Stige*
> 
> BIOS modding is the best way for extra voltage, or you can use some crappy software like Trixxx that allows +200mV.


So what's the best software to use for overclocking??


----------



## Derek129

Quote:


> Originally Posted by *christoph*
> 
> so what's the best software to use for overclocking??


Msi Afterburner


----------



## christoph

Quote:


> Originally Posted by *Derek129*
> 
> Msi Afterburner


So still the best option...


----------



## mus1mus

On air, afterburner for normal OC levels.

TriXX and iTurbo if on water. For higher Voltages.


----------



## bill1971

Is it possible to flash a 390X BIOS to an XFX 390 (like 290-to-290X)? A friend of mine has one.


----------



## mus1mus

Quote:


> Originally Posted by *bill1971*
> 
> is possible to flash the 390x bios to xfx 390(like 290-290x) who has a friend of mine?


It is. But note that some motherboards detect a mismatched device ID after BIOS flashing and will stop you from POSTing. The RVE does this, for example. And the device ID of a 290 is different from the 290X's.

Otherwise, the BIOS will just ignore the difference between the GPUs and work as a normal 390 (locked CUs) or, if luck permits, a 390X (unlocked CUs).

Though, don't ever think that you will end up with a perfectly working card after flashing. Differences in clocks, voltages, etc. can pose issues for the card. Most of them can be fixed or ignored though.


----------



## bill1971

Quote:


> Originally Posted by *mus1mus*
> 
> It is. But note that some motherboards detect a mismatched device ID after BIOS flashing and will stop you from POSTing. The RVE does this, for example. And the device ID of a 290 is different from the 290X's.
> 
> Otherwise, the BIOS will just ignore the difference between the GPUs and work as a normal 390 (locked CUs) or, if luck permits, a 390X (unlocked CUs).
> 
> Though, don't ever think that you will end up with a perfectly working card after flashing. Differences in clocks, voltages, etc. can pose issues for the card. Most of them can be fixed or ignored though.


Because I own a 290 card and have already changed its BIOS, I want to know: is the way to flash the BIOS on a 390 the same as on the 290?


----------



## mus1mus

Should be.


----------



## Lixxon

Hey, been running some tests today (Heaven, FurMark, Firestrike) and it seems to be running fine. Although, is there any program or software that would tell me if it's not stable?


----------



## Stige

It seems my VRM starts giving me artifacts around ~74-75C.

I need to test this more: turn my fans around so my rad is blowing air outside the case instead of inside, and see what happens. Hopefully that gets the temps below 75C at all times. Adding 2 extra fans (missing one fan..) to my rad for 5 fans total in push/pull already dropped my temps in the long run by 5C; adding a third fan and pushing the air outside should keep it below 74C or so.


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> It seems my VRM starts giving me artifacts around ~74-75C.
> 
> I need to test this more, turn my fans around so my rad is blowing air outside and not inside the case and see what happens, hopefully get the temps below 75C at all times, adding 2 extra fans (Missing one fan..) to my Rad for 5 Fans total in Push/Pull dropped my temps in the long run by 5C already, adding a third fan and pushing the air outside should keep it below 74C or so.


This has been my experience as well. I want to try adding a fan directly next to the GPU so that it blows air directly into the heatsink, just in case it improves things ever so slightly. I have my two bottom case fans blowing cold air on it already though, so I doubt there's going to be much difference.

My issue with this is that I can't have my fans ramp up and down depending on GPU temps or even depending on whether I'm running a game or not, so I have to do it manually and that plain sucks.

When you say artifacts, I guess they are random flashes of colors, right? That's what I get when temps start being an issue. Else, it's just white or black squares, like checkerboard or something. I also get black screens every now and then when trying to run my memory higher than 1700MHz. God damnit.


----------



## Stige

It's like a small flicker on a texture somewhere every few minutes, it's nothing major. But it doesn't happen at all until the temps ramp up.
Exactly like the checkerboard you described.

So I'm thinking I need to get them to 70C or less in any case. I'm hoping reversing my radiator fans will help a lot, since the rad is no longer blowing its constantly warming air directly at the GPU, which was making the temperature rise sort of exponential (the GPU heats the water, and the warmer radiator air heats the GPU further).

EDIT: I left Armored Warfare running in the hangar when I went to watch Deadpool; temps rose to 47C max on the core and 78C/49C on the VRMs.

Less than the 81C I was experiencing before, at least.


----------



## m70b1jr

I'm trying to get 1200mhz on liquid, but even with all the power + core up, I can't seem to get past 1115mhz on core and 1700mhz on memory.


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> I'm trying to get 1200mhz on liquid, but even with all the power + core up, I can't seem to get past 1115mhz on core and 1700mhz on memory.


Add more Voltage.


----------



## Stige

Card stuck at max clocks at all times all of a sudden? Trying to set lower clocks in MSI AB and it reverts back to what it was.

Weird? Any idea why it happens?

EDIT: Well this is weird, I tried to go and change the refresh rate...


----------



## Stige

Well reboot fixed that but clocks are still stuck at max at all times for some weird reason.

Just bought a digital thermometer to check some temps: ambient is 25.3C and the rear exhaust gives me 31.8C. Heating up the room nicely.

I turned the heating off completely in this room to get the ambient down a bit and get that VRM near 70C or less. I'll also turn the radiator fans to exhaust instead of intake later today.

EDIT: In 10 minutes after turning Armored Warfare off, ambient is now 23.9C and exhaust is 28.1C.
EDIT2: Temps seem to settle at 23.7C / 26.9C now.


----------



## TsukikoChan

woo! new PSU arrived! got myself an EVGA SuperNOVA 850W G2, so i should be installing that tonight. should make my system a bit more stable 
(i was trying to get the 750W G2 but Amazon has been out of stock of those for weeks, and they dropped the price of the 850W over the weekend to a nice round 100-quid mark, so i went and bought it :-D futureproofing myself i hope, considering this thing has a 10-year warranty on it if i recall correctly)


----------



## ziggystardust

Quote:


> Originally Posted by *TsukikoChan*
> 
> woo! new PSU arrived! got myself a evga supernova 850w G2, so i should be installing that tonight, should make my system a bit more stable
> (i was trying to get the 750w g2 but amazon has been out of stock of those for weeks and they dropped the price of the 850w over the weekend to a nice round 100quid mark so i went and bought that :-D futureproofing myself i hope, considering this thing has a 10year warranty on it if i can recall correctly)


Yes, 10 year warranty, but you need to register it on evga website to get that.


----------



## Stige

I wonder if there are any properly good 200mm fans out there that can push air well enough through filters.

If I keep my case open, ambient temps are around 24.1C while exhaust temp is 29.4C from the radiator now that I reversed the fans.
If I close my case, exhaust temp rises to 29.9C.

I did get my VRM1 to drop down from 75-76C to around ~73C. Still could do with a few degrees lower temps.


----------



## Perfect-Anubis

https://pcpartpicker.com/part/xfx-video-card-r9390p8bd6

Anyone see or use this blower style 390 yet? I'm thinking about getting it.


----------



## bill1971

how can I find out if my 390 card is unlock,to flash other bios?


----------



## m70b1jr

Quote:


> Originally Posted by *Perfect-Anubis*
> 
> https://pcpartpicker.com/part/xfx-video-card-r9390p8bd6
> 
> Anyone see or use this blower style 390 yet? I'm thinking about getting it.


As an owner of the XFX Double Dissipation, I'd say go for the DD version. It never went above 85C on a max overclock, though I'm on liquid now.


----------



## buttface420

I'm not a very smart GPU overclocker, but my PC doesn't like raising the power limit at all. So far, playing around, I can get 1160/1600 at +75mV with the power limit at 0 on my MSI Gaming 390. It was stable long enough to run Firestrike:

http://www.3dmark.com/3dm/10739802

this is only my first run at finding its max OC though... will try for higher soon.


----------



## buttface420

double post my bad


----------



## LongRod

Well, don't know if this has been posted yet, but to anyone looking for a full cover waterblock for the MSI variants, EK finally released one. A bit late but, still good that they've done it.

https://www.ekwb.com/news/ek-releases-msi-radeon-r9-390x-gaming-8g-full-cover-water-block/


----------



## buttface420

Quote:


> Originally Posted by *LongRod*
> 
> Well, don't know if this has been posted yet, but to anyone looking for a full cover waterblock for the MSI variants, EK finally released one. A bit late but, still good that they've done it.
> 
> https://www.ekwb.com/news/ek-releases-msi-radeon-r9-390x-gaming-8g-full-cover-water-block/


Awesome!!! I'm so glad they decided to do that. Looks like I'm getting one for my MSI 390.


----------



## Perfect-Anubis

Quote:


> Originally Posted by *m70b1jr*
> 
> As an owner of the XFX Double Dissipation, I'd say go for the DD version. it never went above 85C on a max overclock, but I'm on liquid now.


You think this version would run hotter than the DD version, then?

The case I'm going to use is the GD09 by Silverstone. It's an ATX HTPC case, so I figured blower style would be better for temperatures for the case and the rest of the components.


----------



## m70b1jr

Quote:


> Originally Posted by *Perfect-Anubis*
> 
> You think this version would run hotter than the DD version, then?
> 
> The case I'm going to use is the GD09 by Silverstone. It's an ATX HTPC case, so I figured blower style would be better for temperatures for the case and the rest of the components.


Go for the DD


----------



## christoph

Quote:


> Originally Posted by *ziggystardust*
> 
> Yes, 10 year warranty, but you need to register it on evga website to get that.


what???

Are those PSUs really good? I've been tempted to upgrade mine.


----------



## ziggystardust

Quote:


> Originally Posted by *christoph*
> 
> what???
> 
> Are those PSUs really good? I've been tempted to upgrade mine.


Yeah it's a great unit. One of the best in terms of ripple suppression and performance.


----------



## christoph

Quote:


> Originally Posted by *ziggystardust*
> 
> Yeah it's a great unit. One of the best in terms of ripple suppression and performance.


all right, I'll keep it in mind


----------



## Vellinious

Joined team red on the fan boy battle. 55k was as high as I could get on Vantage....what's the secret with that benchmark? I saw a couple people hit 57k and 58k....


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Joined team red on the fan boy battle. 55k was as high as I could get on Vantage....what's the secret with that benchmark? I saw a couple people hit 57k and 58k....


Heya.

Successively running the app while making tiny increments to the clocks will take you there. By tiny, I mean: keep the memory at a baseline and add 1 or 2 MHz to your previous result. The app is very picky with clocks.
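As a loop, the routine looks something like this (a toy sketch only; `search_core_clock` and the fake benchmark are made up for illustration, since Vantage obviously has to be run and scored by hand):

```python
# Hill-climb sketch of the "tiny increments" routine: memory pinned at a
# baseline, core clock raised a couple of MHz between runs, stopping as
# soon as the score drops off.
def search_core_clock(run_benchmark, start_mhz, step_mhz=2):
    best_mhz, best_score = start_mhz, run_benchmark(start_mhz)
    mhz = start_mhz + step_mhz
    while True:
        score = run_benchmark(mhz)
        if score <= best_score:  # score fell off: the previous clock wins
            return best_mhz, best_score
        best_mhz, best_score = mhz, score
        mhz += step_mhz

# Fake benchmark that peaks at 1150 MHz, purely to exercise the search:
fake_vantage = lambda mhz: 55000 - abs(mhz - 1150) * 10
print(search_core_clock(fake_vantage, 1100))  # -> (1150, 55000)
```

Same idea by hand: run, nudge the core 1-2 MHz, run again, and keep whichever clock scored best.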

BTW, Wanna try the VOLT MOD Again?


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Heya.
> 
> Successively running the app while making tiny increments to the clocks will take you there. By tiny, I mean: keep the memory at a baseline and add 1 or 2 MHz to your previous result. The app is very picky with clocks.
> 
> BTW, Wanna try the VOLT MOD Again?


I tried setting my power limit / TDP and TDC to 250. Booted fine, tried to set the power target slider to 50%, and the screen went funky. I've about given up on getting a modded bios to work on this card.

I guess maybe set the power limit / TDP to 500 or something and then not touch that slider? /shrug

Will probably try that bios you gave me again with the new PSU. Not touch the power limit slider, and see if it'll run without freakin out.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> I tried setting my power limit / TDP and TDC to 250. Booted fine, tried to set the power target slider to 50%, and the screen went funky. I've about given up on getting a modded bios to work on this card.
> 
> I guess maybe set the power limit / TDP to 500 or something and then not touch that slider? /shrug
> 
> Will probably try that bios you gave me again with the new PSU. Not touch the power limit slider, and see if it'll run without freakin out.


I am about to pinpoint the VCore limit HEX in the ROM. Once done, that's the only thing to change. Fewer tweaks, fewer possibilities for error.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> I am about to pinpoint the VCore limit HEX in the ROM. Once done, that's the only thing to change. Fewer tweaks, fewer possibilities for error.


Let me know...I'd love to get a little more out of this card.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Let me know...I'd love to get a little more out of this card.


Testing my 290X now. So far so good. 3 HEXes found. Narrowing them down.

Using those 3 HEXes, I can use my STOCK BIOS to hit 1.45V. Black screens happen past that. Testing the mod with the ASUS BIOS now.








PM SENT.


----------



## bill1971

Quote:


> Originally Posted by *bill1971*
> 
> how can I find out if my 390 card is unlock,to flash other bios?


sorry could you help?


----------



## TsukikoChan

New PSU in the pc. Can't say for sure yet how my OCs will fare as I didn't get it installed until late last night (geez, the SuperNOVA is a beast! much longer than my previous PSU, had to remove and re-set my bottom fan to get it in!). Firestrike ran fine afterwards with no issues on my full 390x OC, and I upped my 8350 to 4.5GHz with no issues 
only prob i have now is that since the new PSU is silent, I now realise my WD Black 1TB is crunching heavily and audibly on any data load/write (it's been getting slow the last few weeks), so methinks it's time to try to RMA it after a data backup.

interestingly though, as you guys said, my 390x fans don't spin when not being pushed (i had the case open) and only spun up when i was running Firestrike. TriXX still showed a fan speed above 0% despite them not spinning :< here's hoping the fan speed reporting gets fixed soon on these cards.

Also interestingly, when i had my case open (i was hunting down the source of a clicking, which turned out to be the bottom fan), all temps dropped on cpu and gpu :< 26°C on idle for my 390x and 6-15°C on the cpu over Firestrike.. is my case airflow really that bad that it adds that many degrees when closed up? hmm...

p.s. evga make some nice PSUs! the package was well presented with a variety of cables and some cable ties! i think i'm in love XD


----------



## mus1mus

Quote:


> Originally Posted by *bill1971*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bill1971*
> 
> how can I find out if my 390 card is unlock,to flash other bios?
> 
> 
> 
> sorry could you help?
Click to expand...







This is your thread.


----------



## Stige

Quote:


> Originally Posted by *Stige*
> 
> It seems my VRM starts giving me artifacts around ~74-75C.
> 
> I need to test this more, turn my fans around so my rad is blowing air outside and not inside the case and see what happens, hopefully get the temps below 75C at all times, adding 2 extra fans (Missing one fan..) to my Rad for 5 Fans total in Push/Pull dropped my temps in the long run by 5C already, adding a third fan and pushing the air outside should keep it below 74C or so.


After adding those two fans, I took apart the GPX block today to find this:

(Yes I took the block apart while it was attached to the loop, was too lazy to remove it lol)

The block was barely making contact with the heatsink... Unacceptable build quality if you ask me...

I tightened the screws inside there, applied Phobya NanoGrease and put it all back together and now my temps are 46C Core, 69C/45C VRM. Another nice ~5C drop in temps, starting to look decent now.


----------



## Tivan

Quote:


> Originally Posted by *TsukikoChan*
> 
> New PSU in pc, can't say for sure yet how my OCs will fare as I didn't get it installed until late last night (geez, the supernova is a beast! much longer than my previous psu, had to remove and re-set my bottom fan to get it in!). Firestrike ran fine afterwards with no issues with my full 390x OC and i upped my 8350 to 4.5ghz with no issues
> only prob i have now is that since the new psu is silent, i now realise my WDBlack 1tb is crunching heavily and audibly on any data load/write (it's been getting slow the last few weeks), so me thinks it's time to try rma it after data backup.
> 
> interestingly though, as you guys said my 390x fans don't spin when not being pushed (i had the case open) and only spun up when i was running firestrike. trixx still showed fanspeed above 0% despite not on :< here's hoping the fan speed reporting gets fixed soon on these cards.
> 
> Also interestingly, when i had my case open (i was hunting down source of clicking which turned out to be bottom fan) all temps dropped on cpu and gpu :< 26oC on idle for my 390x and 6-15oC on cpu over firestrike.. is my case airflow really that bad that it adds that much degrees when closed up? hmm...
> 
> p.s. evga make some nice psu's! the package was well presented with a variety of cables and some cable ties! i think i'm in love XD


The HDD sounds/performance might just be the drive moving its read/write head around a lot due to fragmentation. Not sure if the Windows defragmentation routine is enabled by default, but I'd run it manually and do a full pass. Not sure if that's the issue though! Some HDD models just happen to be not very quiet when moving the head around, so it's a possibility.

As for the temperatures, I'd just make sure you have ample air intake on the bottom half of the case and ample exhaust at the top. Oh, and the air filters might be clogged with dust. That happens faster than one thinks!


----------



## Falmatrix2r

Hey, started some OCing on the core clock of the Sapphire Tri-X 390X. Today I'm at 1090MHz (1055MHz stock) on the core clock, at stock voltage (+13mV), and +20% power limit. It's been stable after 30min in Valley. I'll do some gaming later on, and if everything keeps being stable, I'll bump up 10MHz more.
VRM max temps 79C and 70C
GPU max temp 70C
Room temp is at 22C


----------



## Stige

Always put your power limit at +50%.


----------



## NovaEnid

I'm new to the AMD fanclub: https://www.techpowerup.com/gpuz/details.php?id=w4dw5

Sapphire R9 390 Nitro with stock cooler.

Twiddled a bit with MSI Afterburner:

+19mV (stock) Core Voltage
+50% Power Limit
1100 MHz Core Clock
1625 MHz Memory Clock
Temps are around 80°C (FurMark) with default Fan curve.

Is that a good result? I'd have to increase voltage by quite a lot if I wanted to push Core Clock even more.


----------



## Spartoi

Does memory OC increase performance as much as a core OC in game? I know in benchmarks OC'ing the memory increases performance a bit, but does it actually translate into game performance? Should I prioritize the core clock speed over the memory speed or should there be a balance for the best gaming performance?


----------



## diggiddi

Prioritize core over memory any day. First max out the core, then try to max the memory.


----------



## Falmatrix2r

Quote:


> Originally Posted by *Stige*
> 
> Always put your power limit at +50%.


I've seen some say +20% is better and that +50% is simply leaking watts. I'm not sure what to think about the power limit.
Do you find my temps safe?


----------



## Stige

What kinda crap is "leaking watts", it doesn't affect it in any way...

It will allow your card to work at full speed at all times...


----------



## Vellinious

Quote:


> Originally Posted by *Falmatrix2r*
> 
> I've seen some say +20% is better and 50% is simply leaking watts. I'm not sure what to think about the power limit.
> Do you find my temps safe?


The power limit is simply telling the card that it can pull that many watts if it needs to....it's not "leaking" anything.


----------



## Falmatrix2r

Quote:


> Originally Posted by *Stige*
> 
> What kinda crap is "leaking watts", it doesn't affect it in any way...
> 
> It will allow your card to work at full speed at all times...


Quote:


> Originally Posted by *Vellinious*
> 
> The power limit is simply telling the card that it can pull that many watts if it needs to....it's not "leaking" anything.


Sorry, bad choice of words, I meant to say that I've read on some forums that using +50% will only draw more power without any performance gain or might even result in worse performance + higher temps.


----------



## Vellinious

Quote:


> Originally Posted by *Falmatrix2r*
> 
> Sorry, bad choice of words, I meant to say that I've read on some forums that using +50% will only draw more power without any performance gain or might even result in worse performance + higher temps.


It's not going to result in worse performance....but could result in higher temps. I'd agree with the latter. But only because when I overclock, I'm pushing my hardware for a purpose. Not to gain an extra 2 or 3 fps in some silly shooter..... /shrug


----------



## Falmatrix2r

Quote:


> Originally Posted by *Vellinious*
> 
> It's not going to result in worse performance....but could result in higher temps. I'd agree with the latter. But only because when I overclock, I'm pushing my hardware for a purpose. Not to gain an extra 2 or 3 fps in some silly shooter..... /shrug


So about the temps, what would be the safe line I shouldn't cross?


----------



## Stige

Quote:


> Originally Posted by *Falmatrix2r*
> 
> Sorry, bad choice of words, I meant to say that I've read on some forums that using +50% will only draw more power without any performance gain or might even result in worse performance + higher temps.


It will draw more power if it needs more power...

If it doesn't, nothing will change...


----------



## Vellinious

Quote:


> Originally Posted by *Falmatrix2r*
> 
> So about the temps, what would be the safe line I shouldn't cross?


That's a really good question for someone that's air cooling. I'm not sure what temps start to affect performance and stability on the Hawaii GPUs.


----------



## mus1mus

Stay within 70C for longer life.

80C is the max for the VRMs and core. They can tolerate higher, but that's just me. I haven't hit 80C for quite a while now.


----------



## tolis626

Quote:


> Originally Posted by *Vellinious*
> 
> That's a really good question for someone that's air cooling. I'm not sure what temps start to affect performance and stability on the Hawaii GPUs.


On my card that happens around the 75C mark if I'm pushing really hard; otherwise it's about 80C where stability is compromised. Someone else has said that their card behaves a lot better when under 60C (under water, obviously), so I guess it's a linear thing where the lower you go the better (duh!).

Talking numbers, 1175/1725MHz at +100/+50mV works "fine" until 75C, then I start getting random color flashes all over the place. Or it will downright crash after prolonged periods; lowering the memory to 1700MHz actually works, but it still behaves wonky over 75C. On the other hand (and I can't confirm this because I haven't really tested enough; low power consumption isn't where it's at for me, especially during winter months), the 60C point seems to be where differences start to appear in stability. I used to do 1000/1500MHz at -100mV, but I would get a crash in some games every now and then. Turns out they were games that didn't heat up my CPU, thus didn't cause my case fans to spin up (all are controlled by the CPU temperature until I find a better solution), and combined with a really NOT aggressive GPU fan curve, they let my GPU heat up to over 70C before the fans kicked in and cooled it down. With my case fans running at 1250RPM (about 90%) and an aggressive fan curve on the GPU, temps never pass 60C with that profile and usually hang around the 55-60C range. Limited testing has shown that 1025/1625MHz at -100mV is stable thus far. So... Yeah. Maybe it's a thing? Maybe not? I'm just sharing experiences and other people can also try.

On another note, switching from 1000/1500MHz to 1025/1625MHz with both at -100mV caused my power consumption to go up quite a bit, at least according to GPU-z (Yeah, can't be arsed to find and use my kill-a-watt for this). On the former, max power consumption was about 140-150W and average was a little bit over 120W, while on the latter max was a little under 200W and average was 140-145W. Which is not much, sure, but I don't think there should be any difference, let alone a noticeable one (still, Hawaii can be really efficient after all). From what I could remember, actual voltage used on the core was higher on the latter, something like 1.05V to 1.1V. And I think that this can be attributed mostly to memory overclocking. I'm sorry I can't test it more, but these have been a crazy few days and I'm still too busy to do this stuff. If no one else can test on their hardware, I may be able to follow up on this next week.
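For what it's worth, the jump is in the ballpark of what first-order CMOS dynamic power scaling (P proportional to f x V^2) predicts. A quick sketch, with the voltages being my rough GPU-Z readings rather than measured values:

```python
# First-order dynamic power scaling for a CMOS chip: P ~ f * V^2.
# The proportionality constant cancels when comparing two operating
# points as a ratio, so no card-specific numbers are needed.
def dynamic_power_ratio(f1_mhz, v1, f2_mhz, v2):
    return (f2_mhz * v2 ** 2) / (f1_mhz * v1 ** 2)

# 1000 MHz at ~1.05 V -> 1025 MHz at ~1.10 V (rough GPU-Z readings):
ratio = dynamic_power_ratio(1000, 1.05, 1025, 1.10)
print(f"{(ratio - 1) * 100:.0f}% more dynamic power")  # -> 12% more dynamic power
```

That only covers dynamic power; leakage also rises with temperature, which would push the real-world gap higher still, so the observed bump in averages isn't crazy.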


----------



## Vellinious

Quote:


> Originally Posted by *tolis626*
> 
> On my card that happens around the 75C mark if pushing really hard, else it's about 80C where stability is compromised. Someone else has said that their card behaves a lot better when under 60C (Under water, obviously). So i guess it's a linear thing where the lower you go the better (duh!). Talking numbers, 1175/1725MHz at +100/+50mV works "fine" until 75C, then I start getting random color flashes all over the place. Or it will downright crash after prolonged periods, where lowering the memory to 1700MHz actually works but still behaves wonky at over 75C. On the other hand, and I can't confirm this because I haven't really tested enough (Low power consumption isn't where it's at for me, especially during winter months), but the 60C point seems to be where differences start to appear in stability. I used to do 1000/1500MHz at -100mV, but I would get a crash on some games every now and then. Turns out that they were games that didn't heat up my CPU, thus didn't cause my case fans to spin up (all are controlled by the CPU temperature until I find a better solution) and combined with a really NOT aggresive GPU fan curve they caused my GPU to heat up to over 70C before the fans kicked in and cooled it down. Having my case fans run at 1250RPM (about 90%) and an aggressive fan curve on the GPU, temps never pass 60C with that profile and usually hang around the 55-60C range. Limited testing has showed that 1025/1625MHz at -100mV is stable thus far. So... Yeah. Maybe it's a thing? Maybe not? I'm just sharing experiences and other people can also try.
> 
> On another note, switching from 1000/1500MHz to 1025/1625MHz with both at -100mV caused my power consumption to go up quite a bit, at least according to GPU-z (Yeah, can't be arsed to find and use my kill-a-watt for this). On the former, max power consumption was about 140-150W and average was a little bit over 120W, while on the latter max was a little under 200W and average was 140-145W. Which is not much, sure, but I don't think there should be any difference, let alone a noticeable one (still, Hawaii can be really efficient after all). From what I could remember, actual voltage used on the core was higher on the latter, something like 1.05V to 1.1V. And I think that this can be attributed mostly to memory overclocking. I'm sorry I can't test it more, but these have been a crazy few days and I'm still too busy to do this stuff. If no one else can test on their hardware, I may be able to follow up on this next week.


+1

I've noticed a difference in stability between 17c coolant temps and 22c coolant temps, yeah.... I wasn't sure if the Hawaii GPUs would react the same way to temps as Maxwell does, but....they sure seem to.


----------



## Falmatrix2r

Quote:


> Originally Posted by *mus1mus*
> 
> Stay within 70C for longer life.
> 
> 80C is for VRMs and Core for max. They can allow higher. But that's just me. I have never hit 80C for quite a while now.


If 80C is my limit, then I don't have much room left, since I'm hitting 77C on VRM1.

Quote:


> Originally Posted by *Stige*
> 
> It will draw more power if it needs more power...
> 
> If it doesn't, nothing will change...


Thanks for your answer


----------



## christoph

Quote:


> Originally Posted by *Stige*
> 
> It will draw more power if it needs more power...
> 
> If it doesn't, nothing will change...


Ok, now I got a little confused.

what if I have stock clocks, no OC: do I need to increase the power limit?
does it help anything at all?
or is it that it may or may not help, but definitely doesn't hurt?


----------



## Joe88

would this work in getting analog vga output off a 390?
http://www.newegg.com/Product/Product.aspx?Item=N82E16812423001


----------



## mus1mus

Quote:


> Originally Posted by *christoph*
> 
> Ok, now I got a little confused.
> 
> what if I have stock clocks, no OC: do I need to increase the power limit?
> does it help anything at all?
> or is it that it may or may not help, but definitely doesn't hurt?


In some cases, the default power limit can be maxed out by the card under LOAD even at stock. So yes, it helps.

Those limiters make the card throttle when it exceeds the threshold, to keep the card within the TDP envelope and keep thermals at bay.

Remember the reference models: they have to do that to keep the card cool. But with custom cooling solutions, cards are hitting the limiters before they ever exceed their temperature limits. And it's funny how some custom cards retain the reference card's limit figures.
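To put the idea in rough pseudo-logic (a toy model, NOT AMD's actual PowerTune firmware; the 275 W figure is just an illustrative reference-style cap):

```python
# Toy model of a power limiter: the slider raises the wattage cap, and if
# projected board power exceeds the cap, the clock is scaled down to fit.
def throttled_clock(requested_mhz, board_power_w, stock_limit_w, slider_pct):
    cap_w = stock_limit_w * (1 + slider_pct / 100.0)
    if board_power_w <= cap_w:
        return requested_mhz  # under the cap: full clocks, no throttle
    return requested_mhz * cap_w / board_power_w  # scale clocks down

# A custom-cooled card can hit a reference-style 275 W cap at stock clocks:
print(throttled_clock(1040, 300, 275, 0))   # throttles to ~953 MHz
print(throttled_clock(1040, 300, 275, 50))  # +50% raises the cap: full 1040 MHz
```

Which is why raising the slider on a cool-running card just removes a ceiling; it doesn't "leak" anything.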


----------



## christoph

Quote:


> Originally Posted by *mus1mus*
> 
> I some cases, the Default Power Limit can be maxed out by the card under LOAD at stock. So yes, it helps.
> 
> Those limiters allow the card to throttle when exceeding the threshold to keep the card within the TDP Envelope and keep Thermals at bay.
> 
> Remember the reference models. They have to do that to keep the card cool. But with custom cooling solutions, they are hitting the limiters earlier than exceeding TEMPS. And funny how so,me custom cards retain the reference card's limit figures.


nice, now I understand


----------



## Stige

Quote:


> Originally Posted by *Falmatrix2r*
> 
> If 80c is my limit then I don't have much room left since i'm hitting 77c on vrm 1.


I would say in theory you should be fine with anything under 90C on the core and under 100C on the VRMs. What I mean is that it wouldn't really affect the lifespan of the card in any way that matters.

But in reality, VRM temps that high will make your card require a lot more voltage to stabilize a specific overclock than, say, running it at 70C or less would.

Stock card? Just ignore the temps. Going for max overclocks? Get them as low as possible; core temp is not really important, VRM1 is.
I start seeing artifacts at +125mV and 1200/1600 clocks after my VRM1 gets above ~70C. I have managed to get it to top out at ~71C but still need to improve it.. somehow...


----------



## TsukikoChan

Quote:


> Originally Posted by *Joe88*
> 
> would this work in getting analog vga output off a 390?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16812423001


yes, i use one of these myself on my 390x for one of my side monitors (the other is done via an hdmi->vga adapter, which also works on the 390x), so i can confirm this. I've also got an active dp->dvi adapter working on the 390x as well.

and you know what? regardless of whether it's through dp or dvi, windows still decides to change my ruddy screen layout on turning off my main monitor on the 390x *flips table* i was hoping the active dp->dvi would solve that but it didn't (that app for saving desktop icons is a lifesaver!! <3 )


----------



## Joe88

Quote:


> Originally Posted by *TsukikoChan*
> 
> yes, i use one of these myself on my 390x for one of my side monitors (the other is done via a hdmi->vga connector which also works on the 390x) so i can confirm this. I've also got an active dp->dvi working on the 390x as well.
> 
> and you know what? regardless of through dp or dvi, windows still decides to change my ruddy screen layout on turning off my main monitor on the 390x *flips table* i was hoping the active dp->dvi would solve that but it didn't (that app for saving desktop icons is a lifesaver!! <3 )


Do you have the name of that icon saver?
I looked into it once before and found one, but it wasn't compatible with Win10


----------



## tolis626

Quote:


> Originally Posted by *TsukikoChan*
> 
> t (that app for saving desktop icons is a lifesaver!! <3 )


Well, so much for it not solving your problem. You're welcome.









Seriously though, it's awesome. I have it restore the same layout upon booting, so even if my icon layout changes when playing in 1440p (VSR) it will get automatically restored next time I turn the computer on. Nice.
Quote:


> Originally Posted by *Joe88*
> 
> Do you have the name of that icon saver?
> I looked into it once before and found one but wasnt compatible with win10


The app is called DesktopOK and you can find it here. It's working with Windows 10, I'm using it right now.


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> Well, so much for it not solving your problem. You're welcome.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seriously though, it's awesome. I have it restore the same layout upon booting, so even if my icon layout changes when playing in 1440p (VSR) it will get automatically restored next time I turn the computer on. Nice.


Hehe, sorry sorry, I get lost over who said what in this thread XD ty! <3
It hasn't solved the issue, but it has alleviated the symptoms. Turning off the monitor or a black screen will still cause issues with Explorer from time to time and play havoc with my icons, but the app lets me put things back where they were easily.

Also, I'm probably being stupid here, but the VSR you can turn on in Crimson: it takes whatever display you enable it on (e.g. a 1080p monitor) and treats it in-game and on-screen as a 1440p+ resolution that it automatically downscales to 1080p, which should make things nicer looking? How much of a hit does it have on performance? Do you need to do anything special in games to get it to work?


----------



## NovaEnid

Quote:


> Originally Posted by *TsukikoChan*
> 
> also, i'm being stupid here but the VSR that you can turn on in crimson, that takes whatever display you mount it on (e.g. 1080p monitor) and treats it ingame and onscreen as a 1440p+ resolution that it automatically downscales to 1080p which would make things nicer looking? how much of a hit does it have on performance? do you nee to do anything special in games to get it to work?


Well, your graphics card has to render about 1.8 times as many pixels, so performance drops accordingly. Other than that, I haven't had any issues with VSR (or DSR on Nvidia) yet.
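That 1.8x figure is just the pixel-count ratio between the virtual and native resolutions. A quick sketch of the arithmetic, using the 1440p-on-1080p case from the thread:

```python
# VSR renders at a virtual resolution, so the GPU shades proportionally
# more pixels per frame; this ratio is the rough performance cost.
def pixel_ratio(virtual, native):
    """Ratio of pixels shaded at the virtual resolution vs the native one."""
    vw, vh = virtual
    nw, nh = native
    return (vw * vh) / (nw * nh)

ratio = pixel_ratio((2560, 1440), (1920, 1080))
print(f"1440p VSR on a 1080p panel: {ratio:.2f}x the pixels")  # 1.78x
```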


----------



## jodybdesigns

I woke up this morning and my PowerColor 390 was running at 100% fan speed, and no matter what I set my fan curve to (auto, fixed, or manual) it would not decrease until I rebooted my machine.

What's up with that?

*edit* On a positive note, the card was idling at 31C lol


----------



## tolis626

Quote:


> Originally Posted by *TsukikoChan*
> 
> hehe, sorry sorry, i get lost of who said what in this thread XD ty! <3
> It hasn't solved the issue but it has alleviated the symptoms  turning off the monitor or a black screen will still cause issues with explorer from time to time and plays havoc with my icons but the app lets me put things back to where they were easily
> 
> also, i'm being stupid here but the VSR that you can turn on in crimson, that takes whatever display you mount it on (e.g. 1080p monitor) and treats it ingame and onscreen as a 1440p+ resolution that it automatically downscales to 1080p which would make things nicer looking? how much of a hit does it have on performance? do you nee to do anything special in games to get it to work?


I was obviously joking about the icons part. I'm not here for the thank you's. But still, you're welcome.









About VSR, it's basically what NovaEnid said. With VSR your GPU renders everything at 1440p (you can even run your desktop at 1440p, but it looks crappy and blurry) and then scales the frame down to your screen. It's not a selective trick like SSAA; it's true 1440p rendering on a 1080p screen, and as such it will work with any game as long as it's enabled in the driver. You just have to change the resolution in-game to what you want. Other than that, the performance hit is the same as running a native 1440p monitor, no more, no less.


----------



## TsukikoChan

sounds like a fun thing to try out XD


----------



## peejay2104

Okay guys, I really need some help here. I'm still having the black screen issue. It happens just by adding core voltage: when I increase it to +200mV it crashes. Nothing else was changed; just increased voltage and boom, crash. Tried unplugging the monitor cables, but that did not make the black screen go away :/
Pj


----------



## TsukikoChan

Quote:


> Originally Posted by *peejay2104*
> 
> Okay guys, i really need some help here. I'm still having the black screens issue. It happens just by adding core voltage, when i increase it to 200+MV it crashes. Nothing else was changed, just increased voltage and boom, crash. Tried unplugging the monitor cables but that did not make the blackscreen go away :/
> Pj


I had a similar black screen issue, as did several other members on this board, and it usuaaaaaaaaally is caused by a PSU/power-draw problem. For myself, my CPU OC caused instability when running with my 390x, so lowering the OC stabilised the black screens and crashes. A new PSU should fix that for me (as it did for another member whose name I cannot recall). Check how much power your 390 and CPU are drawing, and maybe lower some OCs to test stability.

On the same note, there are still black screen bugs raised against Crimson, so it could be that as well.
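The power-draw check suggested above can be roughed out on paper. A minimal sketch; all the wattages here are illustrative guesses, not measurements of any particular card or CPU:

```python
# Budget the PSU to a fraction of its rating (a common rule of thumb)
# and see how much headroom is left after the big consumers.
def psu_headroom(psu_watts, draws, margin=0.8):
    """Watts left after holding the PSU to `margin` of its rated output."""
    budget = psu_watts * margin
    return budget - sum(draws.values())

# Illustrative numbers only: a heavily overvolted 390 plus an OC'ed CPU
# can eat most of a 750 W unit's comfortable budget.
draws = {"R9 390, heavy OC": 350, "overclocked CPU": 150, "rest of system": 75}
print(psu_headroom(750, draws))
```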


----------



## Stige

I can also relate to that black screen issue, but for me it also requires a BIOS reset for some unknown reason; if I don't, I won't get a picture out. These issues only showed up at +175mV to +200mV on the card under heavy load.


----------



## tolis626

My black screen issues seem to come exclusively from instabilities in my overclocks. The fact that every game or benchmark reacts differently drives me nuts. I've been playing a lot of Witcher 2 these last few days (to see the whole story before playing 3; I've already played the first Witcher game but had held back from 2) and was getting crashes. After I got fed up, I lowered my memory overclock from 1700MHz to 1625MHz and then got a lot of artifacts. To establish a baseline: I can play BF4 at 1150/1700MHz at +60mV (or maybe +55mV) for hours, no problem. Same for Inquisition, and it was OK with Witcher 3 in my really brief testing. With Witcher 2, boom, black screen in no time. 1150/1625MHz at the same voltages gives me artifacts, and I have to give it like +70 or +75mV for it to be stable. At that point, I think even 1700MHz memory is OK, but I can't say for sure. I can also forget any clocks higher than 1160MHz, as that needs +85-90mV to be stable, and 1165MHz and above show artifacts even at +100mV. What I have noticed is that with higher (and potentially less stable, it seems) memory clocks, instabilities will more often lead to a downright crash (black screen) instead of plain old artifacting. Once the core is stable enough, higher memory clocks will work; that is, up until 1700MHz or maaaaaybe 1725MHz. 1750MHz is out of the question for my card, sadly.

So my question is, is Witcher 2 to blame, or is my card a worse overclocker than I initially thought? The sad thing is that during Witcher 2 in particular, temps get (sort of) out of control after a while when pushing over +70mV (so about 1.25V Vcore), reaching 79C on both the core and VRM1, and that may compromise stability even further. At this point I'm thinking about how I can get a simple 120mm fan to blow air right onto the GPU heatsink while ramping its speed up and down with the GPU temps. I'm even thinking about going water (AIO), but I'm too broke right now to consider it more seriously.


----------



## peejay2104

Quote:


> Originally Posted by *TsukikoChan*
> 
> I had a similar black screen issue, as did several other members on this board, and it usuaaaaaaaaally is caused by a psu/powerdraw problem. For myself, my cpu OC caused instability when running with my 390x so lowering my OC stabilised the black screens and crashes. New Psu should fix that for me (as it did another member whose name i cannot recall). Check how much power your 390 and cpu are drawing and maybe lower some OCs to test stability.
> 
> on the same note, there are still blackscreen bugs still raised in crimson so it could be that as well.


Yes, that's what I thought. But I've got a 750W PSU from Cooler Master, so there should not be a problem :/
I tried to use two separate cables coming from the PSU, which works... sometimes. Super weird :/ Any idea?


----------



## rdr09

Quote:


> Originally Posted by *peejay2104*
> 
> Yes, that's what i thought. I got a 750w psu from coolermaster tho so there should not be a problem :/
> I tried to Use to seperate cables coming from the psu which works, sometimes... Super weird :/ any idea?


Do you mean you get a black screen only when you add +200mV, or when you add any mV? You don't get it if you keep the vcore at stock?


----------



## TsukikoChan

Quote:


> Originally Posted by *peejay2104*
> 
> Yes, that's what i thought. I got a 750w psu from coolermaster tho so there should not be a problem :/
> I tried to Use to seperate cables coming from the psu which works, sometimes... Super weird :/ any idea?


Funny you should mention the Cooler Master 750W, as that's exactly what I just pulled out of my PC when it was giving me the same bother

I've now replaced it with an 850W EVGA SuperNOVA G2 (the 750W was out of stock). I haven't been able to give it a proper test run yet, so I will be doing that over the next few days, but another user here (again, names escape me) moved to an EVGA PSU and the same problem went away 
Edit: though mine was doing it at a way lower OC than yours, so maybe you're pushing the card too hard?
Food for thought at least.


----------



## peejay2104

Quote:


> Originally Posted by *TsukikoChan*
> 
> funny thing you should mention coolermaster 750w as that's what i just pulled out of my pc when it was giving me the same bother
> 
> 
> 
> 
> 
> 
> 
> i've now replaced it with a 850w (750w out of stock) evga supernova g2. i haven't been able to give it a proper test run yet so i will be doing that over the next few days, but another user here (again, names escape me) moved to a evga psu and the same problem went away
> edit: though mine was doing it at way lower OC than yours so maybe you're pushing the card too hard?
> food for thought at least.


Hmm, might have to go for a different PSU. The thing is, right now it is running at +200mV with no problem, everything at full load, CPU too. It just happens on occasion :/


----------



## peejay2104

Quote:


> Originally Posted by *rdr09*
> 
> Do you mean you get a BS only when you add 200+ mv or when you add any mv? You don't get it if you keep the vcore at stock?


Only when I go over like +150mV. The thing is, now it is running fine at +200mV. It just happens on occasion apparently :/ The CPU is under full load too :/


----------



## rdr09

Quote:


> Originally Posted by *peejay2104*
> 
> Only when i go over like 150mv. The thing is, now it is running fine at 200+MV. Just happens on occasion apperently :/ CPU is under full load to :/


+200 is primarily just for benching, imo. You might want to check your temps, especially VRM1 at load.


----------



## peejay2104

Quote:


> Originally Posted by *rdr09*
> 
> 200 is primarily just for benching - imo. you might want to check your temps, especially vrm1 at load.


VRM1 under load never goes above 88°C.
What would be the max mV you'd recommend for daily use?


----------



## rdr09

Quote:


> Originally Posted by *peejay2104*
> 
> Vrm1 under load never goes above 88°c,
> What would be the max MV you recommend die daily Use?


This is just my opinion, based on my knowledge of Hawaii (similar to your GPU)... max mV for daily use will be based on temp; others base it on the type of cooling. On air, +200 is quite high and, again, imo... keep your temps below 80 no matter how accurate or inaccurate the apps are at measuring.

If you can keep your temps below 80 at +150mV, then do so.


----------



## peejay2104

Quote:


> Originally Posted by *rdr09*
> 
> This is just my opinion and based on my knowledge of hawaii (similar to your gpu) . . . max MV for daily use will be based on temp. others, based it on type of cooling. On air, 200 is quite high and, again, imo . . . keep your temps below 80 no matter how accurate or inaccurate the apps are in measuring.


Should've mentioned the card is watercooled








What's your max mV for daily use?


----------



## Stige

I would run my card at +200mV if I could get my VRM temps to acceptable levels with it but that just doesn't happen right now.
Only run +125mV right now because then they stay around ~71C max which is still a tad high but works ok~ish.


----------



## rdr09

Quote:


> Originally Posted by *peejay2104*
> 
> Should've mentioned the card is watercooled
> 
> 
> 
> 
> 
> 
> 
> 
> What's your max mv for daily Use ?


Even at +200 you should not be hitting 88. No way, unless you don't have enough rad space.

I have CrossFire; it's my CPU that needs to be OC'ed.


----------



## peejay2104

Quote:


> Originally Posted by *rdr09*
> 
> even at 200+ you should not be hitting 88. No way unless you don't have enuf rad space.
> 
> i have crossfire. it's my cpu that needs to be oc'ed.


It's my VRM that is at about 88°C, which is sadly still air cooled. Only the GPU core is watercooled, and it never goes above 60 at +200mV








Thinking of getting a new PSU; might get a second GPU too while I'm at it


----------



## rdr09

Quote:


> Originally Posted by *peejay2104*
> 
> It's my vrm that is about 88°c, which is sadly still air cooled. Only core gpu is watercooled which never goes above 60 on 200+MV
> 
> 
> 
> 
> 
> 
> 
> 
> Thinking of ketting a new psu, might get a second gpu to while i'm on it


I see. Take a hint from Stige before getting a new PSU: keep VRM1 under 80.


----------



## peejay2104

Alright, do temps on the VRM affect stability? Because I read that they can easily hit 110°C :/


----------



## peejay2104

Also, this might be a stupid question, but is there a way to see the whole graph on page one with all the overclock statistics people achieved? I can only see up to the voltage column and not what's beyond that


----------



## Stige

Quote:


> Originally Posted by *peejay2104*
> 
> It's my vrm that is about 88°c, which is sadly still air cooled. Only core gpu is watercooled which never goes above 60 on 200+MV
> 
> 
> 
> 
> 
> 
> 
> 
> Thinking of ketting a new psu, might get a second gpu to while i'm on it


Those are pretty high temps for a watercooled core.
Quote:


> Originally Posted by *peejay2104*
> 
> Alright, does temps on the vrm affect stability? Because i read that they van easily hit 110°c :/


Yes, quite badly as well. My clocks can be stable for a period of time until my VRM1 goes past 70C; after that I start to see small artifacts every now and then, as a sign.


----------



## peejay2104

Quote:


> Originally Posted by *Stige*
> 
> Those are pretty high temps for a watercooled core.
> Yes, quite badly aswell. My clocks can be stable for a period of time until my VRM1 goes past 70C, I start to see small artifact every now and then after that as a sign.


Well, the card is cooled by a Kraken X31 with the fans only spinning at 65%.

I didn't know stability was affected that much by VRM temps :/ How can I get them even lower? I'm already using Fujipoly Extreme pads on the heatsink :/


----------



## Dundundata

Quote:


> Originally Posted by *peejay2104*
> 
> also, might be a stupid question. But is there a way to see the whole graph on page one with all the overclock statistics person achieved? I can see only till voltage and not whats behind that


You can open the graph in a new tab


----------



## Stige

Quote:


> Originally Posted by *peejay2104*
> 
> well, card is cooled by a kraken x31 with fans only spinning at 65%.
> 
> Didn't know stability was that much affected by vrm temps :/ how can i get them even lower? already using fujipoly extreme pads on the heatsink :/


I strapped a GT AP-15 fan on top of them, which helped quite a bit. I've only got 7 W/mK thermal pads on the VRM myself.

My max core temps have been like ~45C, usually way less, even at +200mV.


----------



## peejay2104

Quote:


> Originally Posted by *Stige*
> 
> I strapped a GT AP-15 fan on top of them, helped quite a bit. Only got 7w/mk thermal pads on the VRM myself.
> 
> My max core temps have been like ~45C, usually way less, even at +200mV.


and with which cooler is that?
pj


----------



## peejay2104

Also just noticed that in Unigine Heaven I only need +50mV for 1180/1700 to run without artifacts, while in Assetto Corsa I need to bump the voltage to +85 to get no artifacts :/
pj


----------



## spyshagg

guys

guys

guys

Its time!!!




*AMD vs NVIDIA*

> Run three benchmarks, print screen the score with GPU-Z CPU-Z and the custom wallpaper

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50

Put your 3XX to WORK!
Lets do this! come on!


----------



## Falmatrix2r

Quote:


> Originally Posted by *Stige*
> 
> I would say in theory, you should fine for anything under 90C on Core and under 100C on VRM. What I mean is that it wouldn't really affect the lifespan of the card in any way that matters.
> 
> But in reality, that high VRM temps will make your card require a lot more voltage to stablize a specific overclock than say running it at 70C or less would.
> 
> Stock card? Just ignore the temps. Going for max overclocks? Get them as low as possible, core temp is not really important, VRM1 is.
> I start seeing artifacts at +125mV and 1200/1600 clocks after my VRM1 gets above ~70C. I have managed to get it to top out at ~71C but still need to improve it.. somehow...


This isn't very reassuring...VRM1 is already at 76c on stock settings...ehh..


----------



## patriotaki

I'm having some issues with my R9 390 PCS+. I got one BSOD when I tried to maximize the BF4 window (Alt+Enter). Which drivers should I use?


----------



## Stige

Quote:


> Originally Posted by *peejay2104*
> 
> and with which cooler is that?
> pj


Alphacool GPX M03 I think it is.


----------



## Dundundata

That block for the MSI is sexy, it works for both 390/390x? Makes it tempting to go liquid, though I have quite a bit invested in my fan setup.


----------



## peejay2104

Quote:


> Originally Posted by *Stige*
> 
> Alphacool GPX M03 I think it is.


Ooh, a full-cover waterblock. Sadly I don't have the option of installing my own custom water loop


----------



## patriotaki

which drivers should i use? i got the 16.1.1


----------



## patriotaki

my screen gets pinkish-yellowish when pressing alt+enter while gaming


----------



## Joe88

yes
Quote:


> Originally Posted by *Dundundata*
> 
> That block for the MSI is sexy, it works for both 390/390x? Makes it tempting to go liquid, though I have quite a bit invested in my fan setup.


EKWB just released their full-cover blocks for the MSI R9 390/390X as well:
https://www.ekwb.com/news/ek-releases-msi-radeon-r9-390x-gaming-8g-full-cover-water-block/


----------



## Stige

Quote:


> Originally Posted by *peejay2104*
> 
> ooh, fullcover waterblock. Sadly don't have the option of installing my own custom waterloop


It's a hybrid.
Only the core is watercooled, and the block makes contact with the heatsink which covers the card.

That is why my VRM temps are still so high.


----------



## peejay2104

Quote:


> Originally Posted by *Stige*
> 
> It's a hybrid.
> Only the core is watercooled, and the block makes contact with the heatsink which covers the card.
> 
> That is why my VRM temps are still so high.


Which r9 390 do you have?
Interested in eventually making my own custom waterloop


----------



## Stige

Quote:


> Originally Posted by *peejay2104*
> 
> Which r9 390 do you have?
> Interested in eventually making my own custom waterloop


ASUS Strix DC3. You shouldn't buy this one if you plan to watercool, at least; there is no proper full-cover block for it


----------



## peejay2104

Quote:


> Originally Posted by *Stige*
> 
> ASUS Strix DC3. Should not buy this if you plan to watercool atleast, it doesn't have a proper full cover block for it


I have a DC2 from ASUS; I think the EK waterblocks fit the Strix version though?
pj


----------



## patriotaki

can anyone help me out??


----------



## peejay2104

Quote:


> Originally Posted by *patriotaki*
> 
> can anyone help me out??


We need more info: system specs, whether you've OC'ed, how much you've OC'ed, and so on


----------



## patriotaki

Quote:


> Originally Posted by *peejay2104*
> 
> We need more info, system specs, have you oc'ed, how much have you oc'ed and so on


My specs are in the signature below (Black Panther). I have the PCS+ 390; I overclocked it to 1060MHz but had some artifacts, so I lowered it back to 1010.

I'm getting a pinkish-yellowish screen when pressing Alt+Tab or Alt+Enter; otherwise the card performs great. Is it software related?


----------



## Stige

Quote:


> Originally Posted by *peejay2104*
> 
> i have a dc2 from asus, i think the ek waterblocks fit the strix version tho?
> pj


Their site is pretty clear about what fits what in the configurator. Unfortunately none fits the DC3 QQ.

If there is a block for the DC2 though, I wonder how different the PCB is between them


----------



## peejay2104

Quote:


> Originally Posted by *Stige*
> 
> Their site is pretty clear about what fits what on the configurator. Unfortunately none fits the DC3 QQ.
> 
> If there is a block for DC2 though, I wonder how different the PCB is on them


Just a slightly different PCB layout, I think. The good thing about the DC2 is that there are quite a lot of waterblocks available for it.

The stock cooler sucks and it's loud, though


----------



## battleaxe

Well. I've run out of options. I picked up some copper pipe and some copper bar. I'm going to make my own water VRM cooling system.

I can't get full-cover blocks for either my 390X or my 290X, and both are better-than-average cards, so I can't see getting rid of them.

This shouldn't be overly difficult to do, and the side benefit is that it will not be expensive to duplicate when I upgrade to a different set of cards in the future... As I can just build a new set. In theory, cooling should be the same as a full cover block. We shall see though. I also need to get another RAD to increase my RAD space. I have low RPM fans to keep the noise down, but the slow turning fans are letting the cores get close to 50c, which I would like to see a bit lower, like 42-43C maximum. This should be interesting.


----------



## Joe88

Quote:


> Originally Posted by *peejay2104*
> 
> Interested in eventually making my own custom waterloop


Well, if you want the ease of an AIO with the performance and quality parts of a custom loop:

https://www.ekwb.com/shop/ek-xlc-predator-240-incl-qdc
*OR*
https://www.ekwb.com/shop/ek-xlc-predator-360-incl-qdc

*+*

https://www.ekwb.com/shop/ek-fc-r9-390x-tf5-nickel
(select Predator PRE-FILL service)

you simply snap the quick disconnects together and you technically have a custom loop




but like a custom loop, it's not cheap


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Well. I've run out of options. I picked up some copper pipe and some copper bar. I'm going to make my own water VRM cooling system.
> 
> I can't get full cover blocks for either my 390x or my 290x and both are better than normal cards, so I can't see getting rid of them.
> 
> This shouldn't be overly difficult to do, and the side benefit is that it will not be expensive to duplicate when I upgrade to a different set of cards in the future... As I can just build a new set. In theory, cooling should be the same as a full cover block. We shall see though. I also need to get another RAD to increase my RAD space. I have low RPM fans to keep the noise down, but the slow turning fans are letting the cores get close to 50c, which I would like to see a bit lower, like 42-43C maximum. This should be interesting.


Uhmmm. This sounds like jerry-rigged watercooling. Are you gonna solder the pipes through the bars?

Looks like you are onto something.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> uhmmm. This sounds like Jerry-Rig Watercooling. Are you gonna solder the pipes through the bards?
> 
> Looks like you are into something.


I'm going to solder the copper pipe to the copper pads that will contact the VRMs.


----------



## OneB1t

Take care that you make it non-conductive, I think


----------



## Stige

Quote:


> Originally Posted by *OneB1t*
> 
> care that you need to make it nonconductive i think


Why would he have to make it non-conductive? It only transfers heat there; it doesn't touch anything that conducts electricity.


----------



## peejay2104

Quote:


> Originally Posted by *Joe88*
> 
> well if you want the ease of an AOI while the performance and quality parts of a custom loop
> 
> https://www.ekwb.com/shop/ek-xlc-predator-240-incl-qdc
> *OR*
> https://www.ekwb.com/shop/ek-xlc-predator-360-incl-qdc
> 
> *+*
> 
> https://www.ekwb.com/shop/ek-fc-r9-390x-tf5-nickel
> (select Predator PRE-FILL service)
> 
> you simply snap the quick disconnects together and you technically have a custom loop
> 
> 
> 
> 
> but like a custom loop, its not cheap


Yea, been thinking about one of those loops, sad thing is that the rad won't fit my case


----------



## sil130

Quote:


> Originally Posted by *peejay2104*
> 
> well, card is cooled by a kraken x31 with fans only spinning at 65%.
> 
> Didn't know stability was that much affected by vrm temps :/ how can i get them even lower? already using fujipoly extreme pads on the heatsink :/


Isn't the Kraken G10 incompatible with the R9 390?


----------



## peejay2104

Quote:


> Originally Posted by *sil130*
> 
> Isnt that the kraken g10 not compabitle with r9 390?


No idea; it fits perfectly on my DC2. Also fits on a Strix R9.


----------



## simonfredette

Hi, I've had my MSI R9 390 for a few weeks now, so I have started looking into moderate overclocking. It's running the stock air cooler, and I have only changed the fan curve using Afterburner because I did not like it idling at 61C; with the new curve it idles around 30 and usually will not exceed 60 during gaming, and it only hit 69 during 3DMark 11 after overclocking. I ran the clocks up to 1200/1700 with +100mV and it ran stable with no artifacts etc. Like I said, temps pushed 69 with the fan running at only 87% max, obviously at 100% GPU usage. There was a slight coil whine at the beginning of the tests that seemed to go away once the card was warmed up; does this also happen with yours? It pushed my graphics score in 3DMark up to 18275 from the ~15500 I had at stock clocks (proof). Would you leave it there on air or keep pushing to find the max core clock? I don't want to exceed +100mV so I can stick with Afterburner, and I would rather not change aux voltage to increase memory further.
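A custom fan curve like the one described above is essentially piecewise-linear interpolation between (temperature, fan %) points. A small sketch of that idea; the curve points are made-up examples, not Afterburner's actual config format:

```python
# Interpolate a fan duty cycle from a sorted list of (temp C, fan %) points,
# clamping below the first point and above the last one.
def fan_speed(temp_c, curve):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

curve = [(30, 20), (60, 50), (75, 87), (85, 100)]  # example points only
print(fan_speed(69, curve))  # lands between the 60C and 75C points
```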


----------



## mus1mus

Quote:


> Originally Posted by *simonfredette*
> 
> Hi , ive had my MSI R9 390 for a few weeks now so have started looking into moderate overclocking. Its running stock air and I have only changed the fan curve using afterburner because I did not like it idling at 61 C, with the new curve it idles around 30 and usually will not exceed 60 during gaming and only hit 69 during 3d mark 11 after overclocking . I ran the clocks up to 1200/1700 with +100 mV and it ran stable with no atifacts etc. like I said temps pushed 69 with the fan running at only 87% max obviously 100 % gpu usage. There was a slight coil whine at the beginning of the tests that seemed to go away onc the card was warmed up, does this also happen with yours . It pushed my graphics score on 3d mark up to 18275 from the ~15500 I had at stock clocks. proof . Would you leave it there on air or keep pushing to find the max core clk , I dont want to exceed the 100 mV so I can stick with afterburner and would rather not change aux V to increase memory further..


That looks like a very good sample there, sir.

Just a few suggestions for your OC.
1. Power Limit - set this to the max of +50%. Not harmful in any way; it just lets the card draw what it needs instead of being held back by the power, thermal, and TDP limits defined at the factory.

2. Try to get the memory to 1750 if you can, for better results.

Otherwise, the limits of your OC'ing will be up to your willingness. Just be aware that heat is a major factor in your OC headroom. Keep it under 80C for both the core and the VRM and the card will happily give you the boost.

And while you are at it, post your OC results here to help the Team in the current battle. Just head to the OP for the requirements.









http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/400_50#post_24908074
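The raise-test-back-off routine everyone in this thread follows by hand can be framed as a simple search. `run_stress_test` below is a hypothetical stand-in for whatever you actually use to check stability (a Heaven loop, FireStrike, a crash-prone game):

```python
# Step the core clock up in fixed increments until the stress test fails,
# then settle on the last clock that passed.
def find_max_clock(start_mhz, limit_mhz, step_mhz, run_stress_test):
    clock = start_mhz
    while clock + step_mhz <= limit_mhz and run_stress_test(clock + step_mhz):
        clock += step_mhz
    return clock

# Simulate a card that artifacts above 1180 MHz:
print(find_max_clock(1040, 1250, 10, lambda mhz: mhz <= 1180))  # 1180
```

In practice each stress-test call is minutes of load with VRM1 watched in GPU-Z, which is why people do this by hand at each voltage step rather than scripting it.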


----------



## Stige

Quote:


> Originally Posted by *simonfredette*
> 
> Hi , ive had my MSI R9 390 for a few weeks now so have started looking into moderate overclocking. Its running stock air and I have only changed the fan curve using afterburner because I did not like it idling at 61 C, with the new curve it idles around 30 and usually will not exceed 60 during gaming and only hit 69 during 3d mark 11 after overclocking . I ran the clocks up to 1200/1700 with +100 mV and it ran stable with no atifacts etc. like I said temps pushed 69 with the fan running at only 87% max obviously 100 % gpu usage. There was a slight coil whine at the beginning of the tests that seemed to go away onc the card was warmed up, does this also happen with yours . It pushed my graphics score on 3d mark up to 18275 from the ~15500 I had at stock clocks. proof . Would you leave it there on air or keep pushing to find the max core clk , I dont want to exceed the 100 mV so I can stick with afterburner and would rather not change aux V to increase memory further..


Your card is probably throttling a LOT without touching that Power Limit; increase it to +50% and watch it burn








Also, core temp doesn't really matter, because the VRM will always be hotter and that is what you should be looking at instead.

Memory overclocking gives very little benefit, if any at all: only a few points in benchmarks and none in games.


----------



## simonfredette

Quote:


> Originally Posted by *Stige*
> 
> Your card is propably throttling a LOT without touching that Power Limit, increase it to +50% and watch it burn
> 
> 
> 
> 
> 
> 
> 
> 
> Also core temp doesn't really matter because VRM will always be hotter and that is what you should be looking at instead.
> 
> Memory overclock gives very little if any benefit at all, only a few pts in benchmarking and none in games.


The +50% is a good call; I'd forgotten about it. I went up to 1750 on the memory clock and the score went up to 18314; not sure if I'll keep it for that small an increase in performance. I might try to push the clocks up slowly and see where I can get them, but I don't want to damage the card, just find the settings where it likes to sit. I know they underclock cards to make it easier to justify the 390X or Fury etc. What are you using to get VRM temps?


----------



## simonfredette

Quote:


> Originally Posted by *Stige*
> 
> Your card is probably throttling a LOT without touching the Power Limit. Increase it to +50% and watch it burn.
>
> Also, core temp doesn't really matter; the VRM will always be hotter, and that's what you should be watching instead.
>
> Memory overclocking gives very little benefit, if any: a few points in benchmarks and none in games.


Now I'm a little weirded out. When I run Kombustor 3 and turn on GPU monitoring, the clocks during the test still max out at 1080/1525. I've never worked with Radeon Settings, but is it holding my clocks down and preventing Afterburner from applying the OC?


----------



## mus1mus

Open GPU-Z to monitor the clocks.


----------



## Stige

And GPU-Z also monitors your VRM temps.


----------



## patriotaki

What does this mean??

PCS+ 390


----------



## mus1mus

Corruption.


----------



## bichael

Still haven't played with voltage (or even the power limit), but I found it quite interesting that I went from a max OC of 1038 on air to 1067 on water. Temps on air were about 80°C core / 70°C VRM, I think (note I had the fans strapped to the side of my case, so not a fair reflection on the cooler), and on water about 50°C core / 55°C VRM. So while I wouldn't necessarily consider 80°C too hot, it was clearly affecting the point at which artifacts became a problem.

Incidentally, that gave me a Firestrike score of 8788 (12965 graphics), which put me in second place for a G3258+390, so it might be time to play with more voltage.


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> Corruption.


Which means..?
Bad card? Or Is this software related?


----------



## mus1mus

Quote:


> Originally Posted by *patriotaki*
> 
> Which means..?
> Bad card? Or Is this software related?


Do you OC the card?


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> Do you OC the card?


I did push it to 1060-1070MHz for a few minutes and then lowered it back to 1010.


----------



## mus1mus

Revert to stock and try to reproduce the issue.


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> revert top stock, and emulate the issue.


I'm running it at stock now. I tried the Crimson 15 drivers, still the same issue.


----------



## mus1mus

Quote:


> Originally Posted by *patriotaki*
> 
> im running it stock now i tried the crimson 15 drivers still same issue..


Do a DDU run when you change drivers. Beyond that I'm just chatting; I don't play that game. I hope someone will chime in and get you sorted.


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> Do a DDU when you change Drivers. Actually, I'm a chatter. I don't play that game. I hope someone will chime in and give you resolve.


I did. I tried the 16.1.1 driver, still the same issue. I'm downloading the 15.7 CCC driver now, hope it works.


----------



## TsukikoChan

Quote:


> Originally Posted by *patriotaki*
> 
> i did i tried the 16.1.1 driver still same issue im downloading 15.7 CCC driver now..hope it works fine


Have you tried a different game? Does it happen there with the same settings? (Game-specific?)
Have you tried another monitor/TV? Is the monitor overclocked? (Monitor dead?)
Have you tried another cable to the monitor? (Cable dead?)

Lots to check before RMA-ing the card.


----------



## patriotaki

Quote:


> Originally Posted by *TsukikoChan*
> 
> Have you tried a different game? does it happen there on the same corrupt settings? (game specific?)
> Have you tried another monitor/tv with the corrupted settings? Is the monitor overclocked? (monitor dead?)
> Have you tried another cable to connect to the monitors? (cable is deaded?)
> 
> lots yet to do to check before rma-ing card


I changed the color setting to full RGB and everything seems to run flawlessly now. I tried other games like Bad Company 2, no issues there, and Counter-Strike: Global Offensive did the same.

Looks like it's fixed now. I'll test it more later and come back here.


----------



## TsukikoChan

Quote:


> Originally Posted by *patriotaki*
> 
> i did change the colors to full RGB on the settings and everythings seems to be running flawlessly now i tried other games like bad company 2 no issues there. Counter strike Global offensive did the same thing.
> 
> looks like its fixed now..ill try to test it more later and come back here


good to hear it's starting to behave again


----------



## patriotaki

Hah, yeah.

What FPS do you get in GTA V with everything maxed out, advanced graphics on, on a 1080p monitor?


----------



## TsukikoChan

Quote:


> Originally Posted by *patriotaki*
> 
> hah yeah
> 
> 
> 
> 
> 
> 
> 
> 
> what fps do you get on GTA V everything maxed out with advanced graphics on 1080p monitor?


no idea :-D haven't played the game yet


----------



## Agent Smith1984

Just ordered an MSI 390X and a Club3D DP->UHD adapter, so I'm rejoining the club and getting back involved around here again.

I missed my 8GB at 4K too much, so I made a quick $100 and still have a card equivalent to my 980 (minus the coolness factor of the Kingpin thing).


----------



## viking21

Hi guys, these days I've been testing my R9 390 on Ubuntu (15.10) and I'm getting some issues. When I scroll web pages or drag windows around I get screen tearing, or something like it. I suspect there's no proper driver for my card, because the settings GUI labels the driver in use as "R9 290".

If I switch to the open-source drivers it works well, but I can't control my GPU fan speed and the temperature rises.

Should I keep these drivers, or is there something I'm missing?


----------



## Agent Smith1984

FYI, the MSI 390X is $380 on Newegg, with a $20 gaming mouse, the new Hitman, and a $20 rebate card.

You just can't beat that!!!


----------



## simonfredette

Quote:


> Originally Posted by *Agent Smith1984*
> 
> FYI, MSI 390X is $380 on newegg, with a $20 gaming mouse, and the new Hitman, and a $20 rebate card.
> 
> You just can't beat that!!!


I got excited and then remembered you're American. Here that means $523, still not a bad deal since I paid $515 tax-in, shipped, but with no mouse or game. I have mine clocked to 1200 and honestly it's not perfect: I get small green squares a couple of times during benchmarks, though nothing in games, and it gives me a 3DMark 11 graphics score of 20338, which I can definitely live with. I think I could go higher with more volts, but Afterburner limits me to +100mV and I don't mind keeping it that way. I only hit about 59°C on the GPU during benchmarks (haven't checked the VRM temps; where in GPU-Z are you seeing those? I see nothing in Sensors other than core). In any case it's a great-performing card for the price; the only comparable option here is a 970 price-for-price, and I think performance-wise they're about the same for now, but hopefully the extra memory eventually makes a difference. I'm almost exclusively playing Rust right now, which has pretty low GPU requirements, and I find my FPS limited by the game and server long before my card: sometimes over 100 FPS, other days 40. It is what it is... alpha.


----------



## Agent Smith1984

Quote:


> Originally Posted by *simonfredette*
> 
> I got excited and then remembered you are american .. Here that means 523 , still not a bad deal since I paid 515 tax in shipped but no mouse or game. I have it clocked to 1200 and honestly its not perfect during benchmark I get small green squares a couple times during tests , nothing during games though and it gives me a 3d mark 11 graphics score of 20338 which I can definitely live with. I think I could get much higher than that with more volts but with afterburner im limited to 100 mV and dont mind keeping it that way , I only hit about 59C during benchmarks on the GPU ( havnt checked temps on the VRM , where in gpu-z are you seeing those temps I see nothing in sensors for temps other than core.. ) In any case great performing card for the price , only comparable here is a 970 price for price and I think for now performance wise they would be the same but hopefully eventually the extra memory actually makes a difference. Im almost exclusively playing Rust right now which has a pretty low requirement for GPU and I find my fps to be limited by the game and server long before my card, sometimes over 100 fps and other days 40, it is what it is .. alpha


I just sold my 980 Kingpin to go back to the AMD 390X because of the VRAM! My 980's 4GB is pegged out at 4K in GTA V.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I just sold my 980 kingpin to go back to amd 390x because of the vram! My 980 4gb is pegged our on 4k on gta v


Hey Agent add me to the list please sir!!







It's an XFX 390X, obviously. And yes, that's only the air cooler, as I had already stripped it off and added a universal block.



Also: got my copper pipe, copper bar, and Fujipoly Extreme pads in. I'll be building some custom water lines for the VRMs and plumbing them into the loop soon.

I'll try to remember to take some pics so others can see. Hoping for some great results.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> FYI, MSI 390X is $380 on newegg, with a $20 gaming mouse, and the new Hitman, and a $20 rebate card.
> 
> You just can't beat that!!!


That is a good deal man! Do it!!


----------



## pillowsack

I dunno if it's been mentioned, but EKWB has a water block for our beloved MSI 390X cards now.


----------



## patriotaki

Can a core OC cause artifacts even when you don't OC the memory? I see benchmarks where PCS+ 390s reach 1100-1150MHz on the core with some memory OC as well.
I can OC my PCS+ 390 to 1050MHz with no artifacts; anything above that causes artifacts in games.
If I also OC the memory, will they disappear?


----------



## mus1mus

Nope. When artifacts appear, add some voltage. When it heats up, add more fan speed.

Every card is different in its OC ability, but mostly, cooling is the limit on these cards.

Try to OC your memory first: try 1625 or 1750. Add some voltage and verify stability. Then raise the core until just before artifacts appear, and be done with it.
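For anyone who wants to be systematic about it, the step-until-artifacts approach above is basically this loop. A sketch only: `apply_clock` and `is_stable` are hypothetical stand-ins for your tuning tool (Afterburner/Trixx) and an actual stress run, not a real API.

```python
def find_max_clock(apply_clock, is_stable, start, limit, step=25):
    """Raise a clock in small steps until instability, then keep the last good value.

    apply_clock: callable that sets the clock (MHz) on the card
    is_stable:   callable returning True if a stress run passes artifact-free
    """
    clock = start
    best = start
    while clock + step <= limit:
        clock += step
        apply_clock(clock)
        if is_stable(clock):
            best = clock
        else:
            break  # first failure: stop and back off to the last good clock
    apply_clock(best)
    return best

# Example with a stand-in stability model (real testing means running games/benchmarks):
if __name__ == "__main__":
    max_stable = 1150  # pretend this card artifacts above 1150MHz
    result = find_max_clock(lambda c: None, lambda c: c <= max_stable,
                            start=1050, limit=1250, step=25)
    print(result)  # prints 1150
```

In practice you'd run memory first, lock it in, then repeat the same loop for the core, as described above.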


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> Nope. When artifacts appear, add some Voltage. When it heats up, add more fan speed.
> 
> Every card is different in OC abilities. But mostly, cooling is the limit for these cards.
> 
> Try to OC your Memory first. Try 1625 or 1750 for the memory. Add some Voltage and verify stability. Add Core before artifacts appear and be done with it.


I'm using MSI Afterburner. How much voltage should I add each time?


----------



## TheCowTamer

Add me to the list. XFX 390X, 1160 on the core, 1700 on the memory. Once my tax return comes I'll pick up a full-cover block and see how far I can push it after that.


----------



## pillowsack

Nice! The H220 is a good cooler too; if you add the 390X I recommend adding another 2x120mm radiator.


----------



## xer0h0ur

Can someone give me a power supply recommendation for a rig pairing an FX-6300 with a 390X? He was thinking of an EVGA 600B 600W Bronze unit, and I said he should bump that up to a 750W or 800W unit instead. What say you, 390X owners?


----------



## Worldwin

Quote:


> Originally Posted by *xer0h0ur*
> 
> Can someone give me a recommendation on power supply for a rig using an FX6300 for a 390X. He was thinking of using an EVGA 600B 600W Bronze power supply and I said he should bump that up to a 750 or 800W unit instead. What say you 390X owners?


The 600B is one to avoid like the plague. I say spend a little more and grab a better PSU like the B2.


----------



## TheCowTamer

It's actually the H240-X, which has a dual 140mm rad, closer in area to a triple 120mm than a dual 120mm, but I'm going to add another 140mm rad. So I'll have three 140s for a 390X and a 4790K at 4.9GHz; should be more than enough. If needed I'll put another rad in the front. Should be able to expand it soon.


----------



## TheCowTamer

Quote:


> Originally Posted by *xer0h0ur*
> 
> Can someone give me a recommendation on power supply for a rig using an FX6300 for a 390X. He was thinking of using an EVGA 600B 600W Bronze power supply and I said he should bump that up to a 750 or 800W unit instead. What say you 390X owners?


I'd pick up at least a 650W unit from the G2, GS, or P2 series.


----------



## xer0h0ur

Thank you gents. He picked the EVGA SuperNOVA P2 650W PSU.
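For anyone else sizing a PSU for a similar build, the rough arithmetic behind that choice looks something like this. The TDP/board-power figures are the commonly cited ones; the overhead and 30% headroom numbers are my own assumptions, not measurements.

```python
# Back-of-envelope PSU sizing for an FX-6300 + R9 390X build.
loads_w = {
    "FX-6300 CPU (95W TDP, plus OC margin)": 125,
    "R9 390X (typical board power)": 275,
    "motherboard, RAM, drives, fans": 75,
}
peak_w = sum(loads_w.values())       # estimated worst-case draw
recommended_w = peak_w * 1.3         # ~30% headroom for efficiency and ageing
print(peak_w, round(recommended_w))  # prints: 475 618
```

Which lands right around the quality 650W units recommended above.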


----------



## TsukikoChan

Welp, got some more crashes and freezes on my rig over the weekend after I put my CPU back up to 4.7GHz. Decided to go right back to the beginning and re-test my CPU and RAM overclocks (this time using bus-clock increases instead of pure ratio increases) for stability. Found I couldn't get my CPU past 4.5GHz without hitting my heat limit (78°C on the socket for me), but I did get my memory stable at a higher OC (1800MHz instead of 1600) and gaming was pretty smooth afterwards. I guess this is my limit with the NH-14 in my case; I know the CPU can do 4.7 and beyond, but I need better cooling. Kiiiiiiiiinda wondering now whether I even needed to upgrade the PSU if I was still getting crashes with the poor CPU OC... hmm, oh well!

Anyway, long story short, my rig now gets over 9.5K overall in Firestrike (9530 overall - 13510 - 8707 - 3107), and I pushed my 390X a bit harder for it (1138/1602). Quite happy with how the results are coming along; I can probably push the card a bit harder, as the Sapphire cooler is quite good at keeping temps down!
http://www.3dmark.com/3dm/10888676?


----------



## dragzi

*Problems*

Signed up to post on this page.

Got an XFX 390X (first edition) a week after it was released. Regretted the purchase for ages.

Had many problems with it, mostly that it would hit 94°C as soon as anything intensive was fired up. This led to the VRM temp hitting near 110°C and the system shutting down. It would overheat constantly, even in winter with an ambient of 19°C (I live in London, so ambient in summer is only about 23°C). ASIC score was about 78%.

That was all at stock clocks (1050 core, 1600 mem? and 100% power target).

Ended up taking the side of my case off just to get the card to sit at 85°C under gaming load. It really irritated me, because I have a 5820K with a DH15 sitting on top, overclocked to 4.2GHz, and it hits a max of 65°C with the side of the case on (NZXT Noctis 450). I was preparing to spend £500 on watercooling gear just for the sake of this card.

Sent an email to XFX asking if changing the TIM would void my warranty, and was told it would.

*The fix!*

Got frustrated yesterday, fired up FurMark, and did some tweaking. Results are below, in chronological order. All "stable for" results were FurMark runs. Stock voltage is 1.1V.

*Side of case off, 66°C idle*

150% power target, stock voltage - driver reset in under 20 seconds @ 95°C
100% power target, stock voltage - driver reset in 60 seconds @ 95°C
100% power target, -25mV - stable for 10 min, then driver reset @ 95°C
100% power target, -50mV - stable for 30 min, no crash, max temp 84°C (core clock bouncing around 950-980)
150% power target, -50mV - stable for 15 min, then driver reset @ 95°C
150% power target, -100mV (was hopeful) - stable for 30 min, no crash, max temp 72°C (core clock holding 1050 throughout)

Time to switch from Afterburner to Trixx (as it supports undervolts beyond -100mV) and sort out a fan curve (no more 100% hoover for me!). Ended up settling on a steady ramp: 20% below 50°C, up to 100% above 68°C.
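For the curious, a ramp like that is just a linear interpolation between two breakpoints; a sketch of the math the fan software is applying (the breakpoints here are the ones I settled on, nothing official):

```python
def fan_percent(temp_c, low=(50, 20), high=(68, 100)):
    """Fan speed for a two-point linear ramp: (temp °C, fan %)."""
    (t0, f0), (t1, f1) = low, high
    if temp_c <= t0:
        return f0  # floor: quiet below the low breakpoint
    if temp_c >= t1:
        return f1  # ceiling: full blast past the high breakpoint
    # linear interpolation between the two breakpoints
    return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(59))  # prints 60.0 - halfway between the breakpoints
```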

150% power target, -125mV - stable for 20 min, then artifacts
150% power target, -110mV - stable for 50 min, no artifacts, max temp 68°C
150% power target, -116mV - stable for 60 min, no artifacts, max temp 65°C

*Side of case on, 45°C idle*

150% power target, -116mV - stable for 60 min, no artifacts, max temp 71°C

Threw a good 4-5 hours of CS:GO and Rocket League at it and it never hit more than about 65°C, staying rock solid at 1050MHz (where at stock it would throttle down to 900ish). It ended up with a voltage sitting at around 0.984V, occasionally climbing to 1V. From what GPU-Z's sensor readings were telling me, I cut nearly 80W off the peak power usage. Unigine Heaven was giving me higher FPS after the voltage drop, probably down to reduced throttling.
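As a sanity check on that ~80W figure: dynamic power scales roughly with V²·f, so dropping 1.1V to ~0.984V at the same clock predicts about a 20% cut. A crude model that ignores leakage and the throttling the card was doing at stock, so treat the numbers as ballpark only:

```python
def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    """Crude CMOS dynamic power model: P is proportional to V^2 * f."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

ratio = dynamic_power_ratio(0.984, 1.1)
print(round(ratio, 2))  # prints 0.8 -> roughly 20% less dynamic power at the same clock
```

On a card that can pull 275W+ under load, and with the stock run also throttling, a measured drop approaching 80W is plausible.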

TL;DR: if your GPU is a toaster, see if you can undervolt without losing any performance.

Anything else I can do to ensure it's running stable? I was thinking that maybe if the memory fills up there might not be enough power for the memory controller and core, but my logic may be off.


----------



## Stige

Yes, your logic is off.

If it works, don't worry about it.

But I would seriously look into improving your case airflow; that's a massive difference between side panel on and off.


----------



## dragzi

In what way is the difference massive? I'm a bit confused :S

The case airflow isn't such an issue. It has 3 Corsair AF120s in the front as intake and 3 AF120s in the top as exhaust, as well as a rear AF120 exhaust. Negative pressure, excellent temps on my CPU, but I need to check whether there's a hotspot in the GPU area.

The 66°C idle was measured with the side panel off and the GPU at 1.1V;
the 45°C idle was measured with the side panel on and the GPU at 0.94V.

Worth noting that I have no mechanical drives/cages/bays blocking the intake air, and my cable tidying is near perfect, everything behind the mobo or under the PSU cover.

CPU temps rise about 2-3°C with the side panel on.


----------



## TsukikoChan

Seriously dude, what's the matter with your card? 0.o If your case airflow is fine and your ambients are fine, you shouldn't have those temps! You shouldn't need to undervolt a card to get it to work in a well-cooled rig. I'd suggest pushing for an RMA; something sounds busted on that card to me :-S As an example, my Sapphire 390X (fair enough, it has better fans and cooling on it) idles anywhere from 30-45°C and sits at 60-72°C during heavy gaming with a nice OC, +90-100mV, and a conservative fan profile :-(


----------



## dragzi

Quote:


> Originally Posted by *TsukikoChan*
> 
> i would suggest pushing for an rma, something sounds like it's bust on that card to me :-S


I've emailed Scan and asked if there's a possibility of swapping for a GTX 770, or paying the extra for an R9 Nano, but as it's on finance (because my wife holds the purse strings) I'm kinda stuck. Hoping Scan bail me out, or at least give me one with the new VRM design.


----------



## TsukikoChan

Quote:


> Originally Posted by *dragzi*
> 
> Ive emailed scan and asked if there is a possibility of me getting a swap for a gtx 770 or pay the extra for an r9 nano, but as its on finance (because my wife holds the purse strings) Im kinda stuck. hoping scan bail me out, or at least give me the one with the new vrm design


Email XFX, tell them the card sent you an SOS in the form of smoke signals in the dead of winter; that should herald an RMA case if anything ^_^


----------



## dragzi

Quote:


> Originally Posted by *TsukikoChan*
> 
> email xfx, tell them the card sent you an SOS in the form of smoke signals coming out of the card in the dead of winter, that should herald an rma case if anything ^_^


I've emailed XFX twice so far: once when I got the card, and once yesterday (before the undervolting experiment).

Both times I've been told that it's Scan's responsibility, and that crashes at 95°C are within the normal operating range. Still waiting on a response from Scan. Left my phone at home today so I can't give them a call.


----------



## Agent Smith1984

List updated!!! WOOOHOOOO

Just went through 100 pages and added all appropriate submissions!

Congrats to the new members on their cards. I am back home again (ran a Fury for a little bit... then a 980 for a little bit, and now going back to the 390X; going to add a second soon also, lol).

So... EK has a waterblock for the MSI now, eh??? Guess they got more interest than they thought they would!


----------



## tolis626

Been playing more these last few days (I finally have time to finish The Witcher 2. I want to play 3 so badly, but want to finish 2 first) and the only thing I can say is, man, these cards do NOT like voltage or temps. I can run 1100/1625MHz at -40mV no problem, but need around +70mV for 1150MHz to avoid artifacts or driver crashes in Witcher 2. Then there's the fact that, for a while, I can run 1160/1625MHz at +30mV or something stupid like that, and games like BF4 are fine for the most part. Not Witcher 2, though: the driver crashes after a while. Like, seriously, what's up with this game? I'm running it at 1440p VSR, everything maxed, ubersampling off.

Then again, going past 1150MHz on the core (and thus needing more voltage) doesn't improve my benchmarks at all and just adds heat when gaming. My cooling can handle it, but there's the noise. If it were improving performance I wouldn't really mind, but it's not, so... yeah.

And one more thing: how on earth are people getting 1750MHz on their cards, especially MSI ones? Or is mine just a bad overclocker? I'm depressed...


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> Been playing more these last few days (I finally have time to get to finishing WItcher 2. I want to play 3 so bad, but want to finish 2 first.) and the only thing I can say is, man, these cards do NOT like voltage or temps. I can run 1100/1625MHz at -40mV no problem, but need like +70mV for 1150MHz to get no artifacts or driver crashes in Witcher 2. Then there's the fact that, for a while, I can run 1160/1625MHz at +30mV or something stupid like that and games like BF4 are fine, for the most part. Not Witcher 2 though, the driver crashes after a while. Like, seriously, what's up with this game? I'm running it at 1440p VSR, everything max and ubersampling off.
> 
> Then again, going past 1150MHz on the core (and thus needing more voltage) doesn't improve my benchmarks at all and just adds heat when gaming. My cooling can handle it, but there's the noise. If it was improving performance I wouldn't really mind, but it's not, so... Yeah.
> 
> And one more thing. How the hell are people getting 1750MHz on their cards? Especially MSI ones. Or is it mine that's a bad overclocker? I'm depressed...


To get 1750 on an MSI card, you will likely need to add 25-50mV of AUX voltage.


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To get 1750 on MSI card, you will likely need to add 25-50mv of AUX voltage


I've already done that. I should have mentioned it.









Even 1725MHz is only half stable...


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> I've already done that. I should have mentioned it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even 1725MHz is only half stable...


I found 1625 scored higher on my first MSI 390 even though it would run all the way up to 1750. The memory runs tighter timings up to 1625; past that point the strap changes to looser timings. So if you aren't stable at 1700+, you may benefit more from running 1600-1625. Even if the card runs issue-free (no artifacts or crashes) at 1700+ memory, that doesn't guarantee the best performance, because you could be losing performance to ECC retries.


----------



## NovaEnid

Quote:


> Originally Posted by *Agent Smith1984*
> 
> List updated!!! WOOOHOOOO
> 
> Just went through 100 pages and added all appropriate submissions!


What did I do wrong? ;_;
Quote:


> Originally Posted by *NovaEnid*
> 
> I'm new to the AMD fanclub: https://www.techpowerup.com/gpuz/details.php?id=w4dw5
> 
> Sapphire R9 390 Nitro with stock cooler.
> 
> Twiddled a bit with MSI Afterburner:
> 
> +19mV (stock) Core Voltage
> +50% Power Limit
> 1100 MHz Core Clock
> 1625 MHz Memory Clock
> Temps are around 80°C (FurMark) with default Fan curve.
> 
> Is that a good result? I'd have to increase voltage by quite a lot if I wanted to push Core Clock even more.


----------



## Agent Smith1984

Quote:


> Originally Posted by *NovaEnid*
> 
> What did I do wrong? ;_;


Oops, I must have missed you by mistake, pal. Sorry, there were so many to catch up on.

Added!


----------



## jdorje

1.1V is already crazy low; the average 390 is around 1.25V at stock.

Clean out the dust.

Then either RMA it, or remount the cooler and VRM cooler with new TIM and thermal pads.


----------



## Stige

Why are so many people using FurMark here? That is just silly. The program has nothing to do with anything; it's a bad indicator of temps, power consumption, or stability.
There is zero reason for anyone to ever use FurMark. Just uninstall that crap software and use games to test stability/temps.

Also, to whoever has stability issues: look at your VRM temps and you will see why. Anything past 70°C (at least for me) starts to require much more voltage to be stable. I'd imagine getting them lower would improve overclocks even more; these cards are very sensitive to high VRM temps.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Why are so many people using FurMark here? That is just silly. That program has nothing to do with anything at all. It is a bad indicator of temps, power consumption or stability.
> There is zero reason for anyone to ever use FurMark, just uninstall that crap software and use games to test stability/temps.
> 
> Also to whoever has stability issues, just look at your VRM temps and you will see why. Anything past 70C (atleast for me) starts to require way more voltage to be stable. I would imagine getting it lower would improve the overclocks even more, the cards are very sensitive to high VRM temps.


Agreed, FurMark is a death-dealing heat monster!


----------



## patriotaki

How much voltage should I add on the R9 390 PCS+? I don't want to burn it.

I'm trying to get the most out of it.


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> how much mV should i add on the r9 390 pcs+ ? i dont want to burn it
> 
> 
> 
> 
> 
> 
> 
> 
> im trying to get the most out of it


No one can tell you that; each card is an individual.

Experiment and find out. You can't push it too high without BIOS modding, so your only real limitation is VRM temperature.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> No one can tell you that, each card is an individual.
> 
> Experiment and find out, you can't push it too high without BIOS modding so your only limitations are the VRM temperatures.


I tried 1100MHz core / 1700MHz memory at +65mV and it was quite stable.

Is +65mV too much?


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> i tried 1100mhz , 1700mhz +65mV and it was quite stable
> 
> is the +65mV too much?


Like I said, if you don't mod the BIOS, you can't go too high...


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> Why are so many people using FurMark here? That is just silly. That program has nothing to do with anything at all. It is a bad indicator of temps, power consumption or stability.
> There is zero reason for anyone to ever use FurMark, just uninstall that crap software and use games to test stability/temps.
> 
> Also to whoever has stability issues, just look at your VRM temps and you will see why. Anything past 70C (atleast for me) starts to require way more voltage to be stable. I would imagine getting it lower would improve the overclocks even more, the cards are very sensitive to high VRM temps.


This. FurMark is for GPUs what Prime95 small FFTs is for CPUs, only even more pointless. People should avoid it like the plague. Don't want to play games for testing, or don't have time to? Run Valley or Heaven or even Firestrike in a loop. There you go.

As for VRMs (in case you were also talking about me), mine usually don't exceed 70°C below +50mV. But yes, hot VRMs definitely introduce instability: once mine reach 80°C I practically can't overclock any more. Fortunately, for that to happen I need a low fan speed and at least +75mV (usually +100mV, and even then a small fan speed bump makes it OK), so it's not really an issue. Keeping them under 70°C is tricky for prolonged gaming sessions, though.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> I found 1625 scored higher on my first MSI 390 even though it would run all the way up to 1750.... the memory strap will run tighter timings up to 1625, and then the strap changes to looser timings at that point, so if you aren't running 1700+ stable, you may benefit more from running 1600--1625. Even if the card runs issue free (no artifacts, or crashes) at a 1700+ memory speed, it doesn't guarantee the best performance, because you could be losing performance to ECC.


Well, in benchmarks performance increases fairly linearly with memory clock, but going from 1625MHz to 1725MHz is like 100 points in Firestrike or less. I doubt there's any real difference in games worth the hassle. I'm just kind of disappointed with my card's overclocking. It's not the worst by any means, and 1150/1625MHz is more than enough, but seeing people hit 1200+MHz (even just for benchmarking) and 1750MHz memory while I can't do either makes me kinda jealous. Especially the memory thing: with everyone seemingly getting their cards to 1750MHz, it's like they're rubbing it in my face.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> This. Furmark is for GPUs what Prime95 small-FFTs is for CPUs, just even more pointless. People should avoid it like the plague. Don't want to play games for testing or don't have time to do so? Run Valley or Heaven or even Firestrike in a loop. There you go.
> 
> As for VRMs (in case you were also talking about me), mine usually don't exceed 70C below +50mV. But yes, VRMs getting hot introduces instabilities for sure. VRMs reach 80C and I practically can't overclock any more. Fortunately, for this to happen I need a low fan speed and at least +75mV (usually 100mV and even then increasing fan speed a bit makes it ok), so it's not really an issue. Keeping them under 70C though is tricky for prolonged gaming sessions.
> Well, in benchmarks performance increases quite linearly with memory clocks. But comparing 1625MHz to 1725MHz is like 100 points difference in Firestrike or less. I doubt there is any real difference in games to make it worth the hassle. I'm just kind of disappointed with my cards overclocking. I mean, it's not the worst by any means, 1150/1625MHz is more than enough, but seeing people hitting 1200+MHz (even for benchmarking only) and 1750MHz memory while I can't do either of those makes me kinda jealous. Especially the memory thing. With everyone seemingly getting their cards to 1750MHz, it's like they're rubbing it in my face.


I hear you....

I hope my MSI 390X does at least 1175... my last MSI 390 did 1200 core on +87mV; that was its sweet spot. +100mV was just a tad too much for the board's power draw and temps.

It looks like XFX is taking the lead in core clocks now. I suspect XFX is packing old-school Hawaii cores in their cards rather than the more mature "Grenada" silicon; I say that because people are able to push over +100mV and get 1200+ core clocks, a characteristic more common to the older 290s and seldom seen on the 390 samples I've come across.


----------



## battleaxe

Quote:


> Originally Posted by *dragzi*
> 
> *Problems*
> 
> Signed up to post on this page.
> 
> Got an xfx 390x (first edition) a week after it was released. Regretted the purchase for ages.
> 
> Had many problems with it, mostly that it would hit 94'c as soon as anything intensive was fired up. This led to the VRM temp hitting near 110'c and the system shutting down. It would overheat constantly, even in winter when my ambient temp was 19'c (live in London, so ambient in summer is only about 23'c). ASIC score was about 78%
> 
> That was all with stock clocks (1050 core, 1600 mem? and 100% power target).
> 
> Ended up taking the side of my case off to get the card to sit at 85'c under gaming load. Really irritated me, as I have a 5820k with a DH15 sitting on top, overclocked to 4.2GHz, and it hits a max of 65'c with the side of the case on (NZXT Noctis 450). Was preparing to spend £500 on watercooling gear just for the sake of this card.
> 
> Sent an email to XFX asking if changing the TIM would void my warranty, which I was told it would.
> 
> *The fix!*
> 
> Got frustrated yesterday. Fired up FurMark, and did some tweaking. Results below are in chronological order. All "Stable for" results were FurMark runs. Stock voltage is 1.1v
> 
> *Side of case off, 66'C idle*
> 
> 150% power target, stock voltage - Driver reset in under 20 seconds @ 95'c
> 100% power target, stock voltage - Driver reset in 60 seconds @ 95'c
> 100% power target, -25mv - Stable for 10 min, Driver reset @ 95'c
> 100% power target, -50mv - Stable for 30 min, No crash, max temp 84'c (Core clock was bouncing around the 950-980 mark)
> 150% power target, -50mv - Stable for 15 min, Driver reset @ 95'c
> 150% power target, -100mv (was hopeful) - Stable 30 min, No crash, max temp 72'C (Core clock was hitting 1050 throughout)
> 
> Time to switch from Afterburner to Trixx (as it supports > -100mv undervolts), and sort out a fan curve (no more 100% hoover for me!). Ended up settling for a steady ramp up from 20% < 50'c to 100% > 68'c
> 
> 150% power target, -125mv - Stable for 20 min, Then artifacts
> 150% power target, -110mv - Stable for 50 min, No artifacts, Max temp 68'c
> 150% power target, -116mv - Stable for 60 min, No artifacts, Max temp 65'c
> 
> *Side of case on, 45'C idle*
> 
> 150% power target, -116mv - Stable for 60 min, No artifacts, Max temp 71'c
> 
> Threw a good 4-5 hours of CS:GO and Rocket League at it and saw it never hit more than about 65'c, and it kept rock solid at 1050MHz (which at stock it would throttle down to 900ish). Ended up with a voltage sitting at around 0.984v, occasionally climbing to 1v. From what GPU-Z's sensor readings were telling me, I cut nearly 80W off the peak power usage. Unigine Heaven was giving me higher fps after the voltage drop, probably down to throttling.
> 
> TL;DR: If your GPU is a toaster, see if you can undervolt without losing any performance.
> 
> Anything else I can do to ensure that it's running stable? Was thinking maybe if the memory fills up it might not have enough to power the mem controller and core, but my logic may be off.


I have an XFX 390X. I've never seen over 75C when on the stock cooler. Something is wrong with your card. Probably the thermal paste is crud. Either way, the XFX cards are good, maybe not quite as good as the MSI, but good nonetheless... so something is amiss with your specific card.


----------



## battleaxe

Update on custom VRM coolers for my XFX390X.

My attempts at modding an old 290X block have failed, so now I'm back to the universal block and making a custom VRM cooler from copper stock and pipe. Here's the first update and photos of the VRM cooler. The copper rods are soldered to the copper pipe. Very easy process. Just used a torch on a concrete surface so the VRM pads would contact on a flat surface. I will be threading into the solid rod for mounting points and may add some additional places I can thread into and mount it properly.


Spoiler: Warning: Spoiler!


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Update on custom VRM coolers for my XFX390X.
> 
> My attempts at modding an old 290X block have failed, so now I'm back to the universal block and making a custom VRM cooler from copper stock and pipe. Here's the first update and photos of the VRM cooler. The copper rods are soldered to the copper pipe. Very easy process. Just used a torch on a concrete surface so the VRM pads would contact on a flat surface. I will be threading into the solid rod for mounting points and may add some additional places I can thread into and mount it properly.
> 
> 
> Spoiler: Warning: Spoiler!


Function over form, right my friend???


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Function over form, right my friend???


I think it will look decent when I'm done. I know right now it looks pretty sad. But yes, I don't overly care what it looks like. I'm not interested in paying $160 for a full block and backplate, and can't for either of these cards anyway. I like doing things differently when I can and seeing what kind of results I can get for a lot cheaper. So far it seems to be working out well for me overall. I'm hoping to beat my previous best settings/benching at 1245MHz core 1750MHz memory once I get this set up and cooling things down.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I think it will look decent when I'm done. I know right now it looks pretty sad. But yes, I don't overly care what it looks like. I'm not interested in paying $160 for a full block and backplate, and can't for either of these cards anyway. I like doing things differently when I can and seeing what kind of results I can get for a lot cheaper. So far it seems to be working out well for me overall. I'm hoping to beat my previous best settings/benching at 1245MHz core 1750MHz memory once I get this set up and cooling things down.


Awesome.... that core clock is awesome! Is that a suicide run, or do you actually get any kind of stability out of it at that speed? I'd bet you can get some 1250-1275 runs going if you can get the VRM sub-60c....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Awesome.... that core clock is awesome! Is that a suicide run, or do you actually get any kind of stability out of it at that speed? I'd bet you can get some 1250-1275 runs going if you can get the VRM sub-60c....


That's not a suicide run, just benching in Heaven. At 1250 it just starts to artifact. At 1245mhz I saw no artifacts.

I think so too, getting the VRM's under 60c or hopefully even 50c will make a huge difference. I've got my fingers crossed.


----------



## pillowsack

Quote:


> Originally Posted by *battleaxe*
> 
> That's not a suicide run, just benching in Heaven. At 1250 it just starts to artifact. At 1245mhz I saw no artifacts.
> 
> I think so too, getting the VRM's under 60c or hopefully even 50c will make a huge difference. I've got my fingers crossed.


I want that damn 390X EKWB water block!!! Why did they have to release it after I did this:


----------



## battleaxe

Quote:


> Originally Posted by *pillowsack*
> 
> I want that damn 390X EKWB water block!!! Why did they have to release it after I did this:


What are your VRM's like on temp in that setup?


----------



## tbob22

Anyone's 390 sagging? Even though my Powercolor has a backplate it's sagging pretty bad. :/


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> That's not a suicide run, just benching in Heaven. At 1250 it just starts to artifact. At 1245mhz I saw no artifacts.
> 
> I think so too, getting the VRM's under 60c or hopefully even 50c will make a huge difference. I've got my fingers crossed.


Yeah, sub 60 is the sweet spot for these cards' VRMs.....

I did some serious case fan upgrades to get a 59C max on +87mV, and that's how I got 1200 core. If for whatever reason it got too hot inside and broke 60C, the core was no longer stable, lol


----------



## Stige

Quote:


> Originally Posted by *battleaxe*
> 
> Update on custom VRM coolers for my XFX390X.
> 
> My attempts at modding an old 290X block have failed, so now I'm back to the universal block and making a custom VRM cooler from copper stock and pipe. Here's the first update and photos of the VRM cooler. The copper rods are soldered to the copper pipe. Very easy process. Just used a torch on a concrete surface so the VRM pads would contact on a flat surface. I will be threading into the solid rod for mounting points and may add some additional places I can thread into and mount it properly.
> 
> 
> Spoiler: Warning: Spoiler!


Just standard solder there between the pipe and the copper block? I will so frickin do this soon myself, it should be a massive improvement in VRM temps.

Using 13mm copper tubing, I can just tighten the tubing straight into the copper pipe without having to worry about fittings.

Copper should be easy to work with, so I think I can shape the block with just a Dremel for precision and style.

I think you can even slap black paint on it, because it isn't being cooled by air, only through the contact between the pipe and the block, which is obviously covered by the solder.


----------



## specopsFI

Quote:


> Originally Posted by *tolis626*
> 
> Been playing more these last few days (I finally have time to get to finishing Witcher 2. I want to play 3 so bad, but want to finish 2 first.) and the only thing I can say is, man, these cards do NOT like voltage or temps. I can run 1100/1625MHz at -40mV no problem, but need like +70mV for 1150MHz to get no artifacts or driver crashes in Witcher 2. Then there's the fact that, for a while, I can run 1160/1625MHz at +30mV or something stupid like that and games like BF4 are fine, for the most part. Not Witcher 2 though, the driver crashes after a while. Like, seriously, what's up with this game? I'm running it at 1440p VSR, everything max and ubersampling off.
> 
> Then again, going past 1150MHz on the core (and thus needing more voltage) doesn't improve my benchmarks at all and just adds heat when gaming. My cooling can handle it, but there's the noise. If it was improving performance I wouldn't really mind, but it's not, so... Yeah.
> 
> And one more thing. How the hell are people getting 1750MHz on their cards? Especially MSI ones. Or is it mine that's a bad overclocker? I'm depressed...


Witcher 2 is a killer. I had three OC settings for my 290, all tested to be stable through dozens of games and benchmarks, including such favorites as Crysis 3, Sleeping Dogs, looped 3DMark11, FFXIV benchmark... I'm talking about hundreds of hours of usage through a whole bunch of graphics-heavy titles.

Then I finally got to Witcher 2 and it artifacted within a couple of minutes, throwing black texture bugs and finally crashing the driver. With all three "stable" settings. It's possible that it was just the temperatures, since I had ubersampling on and seriously, nothing else that I've run with VSR or resolution scaling up to 2160p has raised the temps like W2 with ubersampling. So clearly W2 is one of the most demanding Hawaii stability tests, in my experience even the most difficult one. Of course, I haven't even tried W3 yet...


----------



## THUMPer1

Replaced the thermal paste on my MSI 390X. Dropped temps by 10C in a 30 min Heaven run at 1150/1650, +35mV, fan 80%. Pretty awesome.


----------



## Worldwin

I did something similar. Put CLU on the die and my temps went from 83->75C and fan speed went from 83->59%. This is also with a MSI 390X. VRM temps went down by 1C. Gotta love liquid metal.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> Just standard solder there between the pipe and the copper block? I will so frickin do this soon myself, it should be a massive improvement in VRM temps.
> 
> Using 13mm copper tubing, I can just tighten the tubing straight into the copper pipe not having to worry about fittings.
> 
> Copper should be easy to work with so I think I can mold the block with just a dremel for precision and style.
> 
> I think you can even slap a black paint on it because it isn't being cooled by air, only the contect between the pipe and the block which is obviously covered by the solder.


Yes, I used plain solder like used in home plumbing.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> List updated!!! WOOOHOOOO
> 
> Just went through 100 pages and added all appropriate submissions!
> 
> Congrats to new members on their cards. I am back home again (ran a Fury for a little bit.... then a 980 for a little bit, and now going back to 390X-going to add a second soon also, lol)
> 
> Sooo..... EK has a waterblock for he MSI now eh??? Guess they got more interest than they thought they would!


prodigal has returned, give him water


----------



## tolis626

Quote:


> Originally Posted by *Worldwin*
> 
> I did something similar. Put CLU on the die and my temps went from 83->75C and fan speed went from 83->59%. This is also with a MSI 390X. VRM temps went down by 1C. Gotta love liquid metal.


So CLU is OK on the MSI? Happy to hear that, I want to get some too but didn't know whether the heatsink contact plate is bare copper or not. If you're using CLU without problems, I guess it's nickel plated, right? Would you recommend it as far as the ease of the process is concerned? How careful would one need to be when applying CLU or similar liquid metal compounds?

Too many questions, I know. Sorry.


----------



## Worldwin

Quote:


> Originally Posted by *THUMPer1*
> 
> Replaced thermal paste on my MSI 390x. Dropped temps by 10c in a 30 min heaven run at 1150/1650 +35Mv fan 80% Pretty awesome.


Quote:


> Originally Posted by *tolis626*
> 
> So CLU is OK on the MSI? Happy to hear that, I want to get some too but didn't know whether the heatsink contact plate is bare copper or not. If you're using CLU without problems, I guess it's nickel plated, right? Would you recommend it as far as the ease of the process is concerned? How careful would one need to be when applying CLU or similar liquid metal compounds?
> 
> Too many questions, I know. Sorry.


You are correct in that it is nickel plated. The thing about using CLU on the TF5 cooler is that if you put it on the die only, the CLU doesn't contact the heatsink. You have to put it on both the cooler and the die for it to make contact. In terms of ease, I would advise against it unless you are comfortable doing so. There is no mark or place that tells you where exactly the heatsink will line up with the die. This means that you won't know exactly where to put the CLU on the heatsink. As for how careful a person needs to be during application: very. Liquid metal is conductive and you can easily kill your chip.

Another note on putting CLU on the heatsink: if you put it on the die only, it might still work, meaning the CLU heats up and then contacts the heatsink. I did this with my 280X and not all of the CLU was contacting the heatsink, giving less than optimal heat transfer.

What I mean by the "mark" on the heatsink: on the MSI 280X cooler there is a bulge that makes it extremely apparent where to put the CLU.
https://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/images/cooler2.jpg

Whereas
https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/images/cooler2.jpg
it is a flat surface. Meaning you have to guess where to apply the CLU once you clean the stock TIM.


----------



## Stige

Quote:


> Originally Posted by *tolis626*
> 
> So CLU is OK on the MSI? Happy to hear that, I want to get some too but didn't know whether the heatsink contact plate is bare copper or not. If you're using CLU without problems, I guess it's nickel plated, right? Would you recommend it as far as the ease of the process is concerned? How careful would one need to be when applying CLU or similar liquid metal compounds?
> 
> Too many questions, I know. Sorry.


I already posted you can use CLU with any GPU pretty much, none of them have aluminum heat sinks as far as I know, not the high end cards anyway cause aluminum is crap.


----------



## THUMPer1

I used thermal grease anyway. Arctic Silver 5 actually. I'm sure there are better alternatives but I don't have any on hand.


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> I already posted you can use CLU with any GPU pretty much, none of them have aluminum heat sinks as far as I know, not the high end cards anyway cause aluminum is crap.


I think it causes problems with bare copper. If I'm not mistaken, there are cases where the TIM dries up completely and those people are left with a crust of material on their chips that's really hard to get off. I was worried about copper, not aluminum.








Quote:


> Originally Posted by *Worldwin*
> 
> You are correct in that it is nickel plated. The thing about using CLU on the TF5 cooler is that if you put it on the die only the CLU doesn't contact with the heatsink. You have to put it on both the cooler and die for it to contact. In terms of ease I would advise against it unless you are comfortable doing so. There is no mark or place that tells you where exactly the heatsink will line up with the die. This means that you wont know exactly where to put the CLU on the heatsink. WRT how careful a person needs to be during application it would be very. Liquid metal is conductive and you can easily kill your chip.
> 
> Another note with putting CLU on the heatsink. If you put it on the die only it might still work meaning that the CLU heats up and then contacts the heatsink. I did this with my 280X and not all of the CLU was contacting the heatsink giving less than optimal heat transfer.
> 
> What i mean by the "mark" on the heatsink. On the MSI 280X cooler there is a bulge that makes it extremely apparent where to put the CLU.
> https://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/images/cooler2.jpg
> 
> Whereas
> https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/images/cooler2.jpg
> it is a flat surface. Meaning you have to guess where to apply the CLU once you clean the stock TIM.


I see... Well, count me disappointed. I had hoped for a "Yes, it's quite easy actually!" kind of answer, and I would have ordered a bottle of CLU (or the allegedly even better Thermal Grizzly Kryonaut) instantly. Well, that plan's going down the drain, it seems.









Damn... Maybe I'll try putting some of the MX4 I have leftover on that GPU and see how it fares. Thanks for your input though!

PS: On second thought, why would it not make contact? That would mean there is a gap between the heatsink and GPU that needs to be filled by the TIM. That by itself means more TIM is needed, which instantly hurts the whole thing's performance because it will act as an insulator. I'd guess the contact plate would be squeezing onto the GPU die. Or shouldn't it? With that said, even the prospect of an overspill that causes a short somewhere sends chills down my spine...


----------



## simonfredette

GPU-Z doesn't give me VRM temps on my MSI. I thought it was just me, but it seems many people install GPU-Z and there is no sensor reading for the VRM?


----------



## Worldwin

Quote:


> Originally Posted by *tolis626*
> 
> I think it causes problems with bare copper. If I'm not mistaken, there are cases where the TIM dries up completely and those people are left with a crust of material on their chips that's really hard to get off. I was worried about copper, not aluminum.
> 
> 
> 
> 
> 
> 
> 
> 
> I see... Well, count me disappointed. I had hoped for a "Yes, it's quite easy actually!" kind of answer, and I would have ordered a bottle of CLU (or the allegedly even better Thermal Grizzly Kryonaut) instantly. Well, that plan's going down the drain, it seems.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn... Maybe I'll try putting some of the MX4 I have leftover on that GPU and see how it fares. Thanks for your input though!
> 
> PS: On second thought, why would it not make contact? That would mean there is a gap between the heatsink and GPU that needs to be filled by the TIM. That by itself means more TIM is needed, which instantly hurts the whole thing's performance because it will act as an insulator. I'd guess the contact plate would be squeezing onto the GPU die. Or shouldn't it? With that said, even the prospect of an overspill that causes a short somewhere sends chills down my spine...


The CLU does not overspill at all. If there were ever to be a mistake, it would definitely be on the user's part. TIM easily fills the gap between the cooler and die, whereas the CLU does not. If the heatsink were squeezing onto the die, we would have an issue with people killing their dies by overtightening, causing the heatsink to fracture the die. Every AIB leaves a gap between the die and heatsink because they use TIM, not liquid metal. If they designed for liquid metal, you would expect a good drop in the distance between the heatsink and die.


----------



## Worldwin

Quote:


> Originally Posted by *simonfredette*
> 
> gpu z doesnt give me vrm temps on my msi, I thought it was just but it seems many people install gpu-z and there is no sensor reading for vrm ?


It should give you temps.


----------



## Stige

Quote:


> Originally Posted by *Worldwin*
> 
> The CLU does not overspill at all. If there were ever to be a mistake, it would definitely be on the user's part. TIM easily fills the gap between the cooler and die, whereas the CLU does not. If the heatsink were squeezing onto the die, we would have an issue with people killing their dies by overtightening, causing the heatsink to fracture the die. Every AIB leaves a gap between the die and heatsink because they use TIM, not liquid metal. If they designed for liquid metal, you would expect a good drop in the distance between the heatsink and die.


Both of my old HD7950s worked fine with CLU applied only on the die, I think you are wrong here.

People need to take their tinfoil hats off with this "shorting" business really, it's nearly impossible to spill this stuff. Just buy it and see for yourself, it doesn't act anything like a regular TIM.


----------



## simonfredette

Quote:


> Originally Posted by *Worldwin*
> 
> It should give you temps.


Yeah, that's the point, it doesn't... Maybe certain cards have the sensors go through the chip and others don't. Anyone with an MSI 390 have it in GPU-Z?


----------



## Worldwin

Quote:


> Originally Posted by *Stige*
> 
> Both of my old HD7950s worked fine with CLU applied only on the die, I think you are wrong here.
> 
> People need to take their tinfoil hats off with this "shorting" business really, it's nearly impossible to spill this stuff. Just buy it and see for yourself, it doesn't act anything like a regular TIM.


Your 7950 does not use the same cooler as my 390X. Also, where did I say it would spill? I said at the very start, "The CLU does not overspill at all." Shorting is a real possibility, which is why I do not recommend it for those who are not confident. Stick with what you know works.
Quote:


> Originally Posted by *simonfredette*
> 
> Yeah, that's the point, it doesn't... Maybe certain cards have the sensors go through the chip and others don't. Anyone with an MSI 390 have it in GPU-Z?


Maybe RMA it? VRM temps are important enough to warrant it. My VRM2 temp is bugged, but VRM2 rarely gets hot, so I couldn't be bothered.


----------



## battleaxe

Quote:


> Originally Posted by *simonfredette*
> 
> GPU-Z doesn't give me VRM temps on my MSI. I thought it was just me, but it seems many people install GPU-Z and there is no sensor reading for the VRM?


Make sure ULPS is disabled on both cards if you have two cards installed. Leaving it on will keep a card from showing its VRM temps.


----------



## simonfredette

OK thanks, I solved it and it was embarrassingly simple: in the properties for GPU-Z you can disable the DPI settings, which lets me make the application window bigger so I can actually see the VRM temps. So finally my overclock on stock air is:
Core clock 1200
Mem clock 1700
+50%
+100mV
Load core temp 69C
VRM 1/2 on load 52C


----------



## kizwan

Quote:


> Originally Posted by *dragzi*
> 
> *Problems*
> 
> Signed up to post on this page.
> 
> Got an xfx 390x (first edition) a week after it was released. Regretted the purchase for ages.
> 
> Had many problems with it, mostly that it would hit 94'c as soon as anything intensive was fired up. This led to the VRM temp hitting near 110'c and the system shutting down. It would overheat constantly, even in winter when my ambient temp was 19'c (live in London, so ambient in summer is only about 23'c). ASIC score was about 78%
> 
> That was all with stock clocks (1050 core, 1600 mem? and 100% power target).
> 
> Ended up taking the side of my case off to get the card to sit at 85'c under gaming load. Really irritated me, as I have a 5820k with a DH15 sitting on top, overclocked to 4.2GHz, and it hits a max of 65'c with the side of the case on (NZXT Noctis 450). Was preparing to spend £500 on watercooling gear just for the sake of this card.
> 
> Sent an email to XFX asking if changing the TIM would void my warranty, which I was told it would.
> 
> *The fix!*
> 
> Got frustrated yesterday. Fired up FurMark, and did some tweaking. Results below are in chronological order. All "Stable for" results were FurMark runs. Stock voltage is 1.1v
> 
> *Side of case off, 66'C idle*
> 
> 150% power target, stock voltage - Driver reset in under 20 seconds @ 95'c
> 100% power target, stock voltage - Driver reset in 60 seconds @ 95'c
> 100% power target, -25mv - Stable for 10 min, Driver reset @ 95'c
> 100% power target, -50mv - Stable for 30 min, No crash, max temp 84'c (Core clock was bouncing around the 950-980 mark)
> 150% power target, -50mv - Stable for 15 min, Driver reset @ 95'c
> 150% power target, -100mv (was hopeful) - Stable 30 min, No crash, max temp 72'C (Core clock was hitting 1050 throughout)
> 
> Time to switch from Afterburner to Trixx (as it supports > -100mv undervolts), and sort out a fan curve (no more 100% hoover for me!). Ended up settling for a steady ramp up from 20% < 50'c to 100% > 68'c
> 
> 150% power target, -125mv - Stable for 20 min, Then artifacts
> 150% power target, -110mv - Stable for 50 min, No artifacts, Max temp 68'c
> 150% power target, -116mv - Stable for 60 min, No artifacts, Max temp 65'c
> 
> *Side of case on, 45'C idle*
> 
> 150% power target, -116mv - Stable for 60 min, No artifacts, Max temp 71'c
> 
> Threw a good 4-5 hours of CS:GO and Rocket League at it and saw it never hit more than about 65'c, and it kept rock solid at 1050MHz (which at stock it would throttle down to 900ish). Ended up with a voltage sitting at around 0.984v, occasionally climbing to 1v. From what GPU-Z's sensor readings were telling me, I cut nearly 80W off the peak power usage. Unigine Heaven was giving me higher fps after the voltage drop, probably down to throttling.
> 
> TL;DR: If your GPU is a toaster, see if you can undervolt without losing any performance.
> 
> Anything else I can do to ensure that it's running stable? Was thinking maybe if the memory fills up it might not have enough to power the mem controller and core, but my logic may be off.


First of all, why is "side of case off" hotter than "side of case on"? I think you got those reversed. Which one is actually with the side panel removed?

Second, assuming you got that reversed and "side of case on" is actually with the side panel removed, then there's nothing wrong with your card.

Third, your case is the problem. The Noctis 450 is pretty restrictive in terms of airflow. Even if you go for watercooling, heat will still be a problem. Find a way to remove the top and front cover.


----------



## wdpir32k3

I would like to join the club. Just got this for my girlfriend: a PowerColor DEVIL Radeon R9 390X


----------



## Lixxon

Quote:


> Originally Posted by *wdpir32k3*
> 
> I would like to join the club. Just got this for my girlfriend: a PowerColor DEVIL Radeon R9 390X


What's the catch with joining the club? Is there something I am missing, or is it just an expression of also having one of these GPUs?


----------



## dragzi

Quote:


> Originally Posted by *kizwan*
> 
> First of all, why side of case off is higher than side of case on? I think you got that reversed. Which one that actually with side panel removed/off?
> 
> Second of all, I'm assuming you got that reversed & the "side of case on" is actually with the side panel removed/off, then there's nothing wrong with your card.
> 
> Third of all, your case is the problem. 450 is pretty restrictive in term of air flow. Even if you go for watercooling, heat will still be the problem. Find a way to remove the top & front cover.


It's chronological over the course of the undervolting experiment. So the 94'c crash with the side of the case off was with no voltage modification, and the side of the case went back on once I'd undervolted it by 116mv. While it was at that level of undervolt, putting the side of the case on did increase the temps by about 3-4'C. The Noctis 450 is OK if you use static pressure fans to get around the airflow restrictions.
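For anyone curious, the "steady ramp" fan curve I settled on (20% below 50'c, 100% above 68'c) is just linear interpolation between two points. A rough Python sketch of the idea; the point values are from my own Trixx setup, so tune them to taste:

```python
# Rough sketch of a two-point fan curve: 20% below 50C, a linear ramp
# between 50C and 68C, and 100% above 68C. Point values are illustrative.
def fan_speed(temp_c, low=(50, 20), high=(68, 100)):
    """Return fan speed (%) for a given GPU temperature (C)."""
    t_lo, s_lo = low
    t_hi, s_hi = high
    if temp_c <= t_lo:
        return s_lo
    if temp_c >= t_hi:
        return s_hi
    # linear interpolation between the low and high points
    return s_lo + (temp_c - t_lo) * (s_hi - s_lo) / (t_hi - t_lo)

print(fan_speed(59))  # halfway through the ramp -> 60.0
```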


----------



## THUMPer1

Stop using FurMark. It's useless.


----------



## mus1mus

Quote:


> Originally Posted by *THUMPer1*
> 
> Stop using FurMark. It's useless.


You know, I used to have a fascination with P95 and IBT AVX above other things a PC can do. Maybe it's Furmark for some people.


----------



## jdorje

So I hear a lot of talk on this thread about furmark. Is this a better stress test than what I've been using? Should I download it?


----------



## patriotaki

I tried to push my PCS+ 390 more today: 1110MHz
+55-60mV (don't remember exactly)
+100MHz core clock
+200MHz mem clock
+25% power limit

I tried going up to 1120MHz with +65mV and +50% power limit.

I got an RSOD.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jdorje*
> 
> So I hear a lot of talk on this thread about furmark. Is this a better stress test than what I've been using? Should I download it?


The majority of the talk about FurMark here is telling people NOT to use it. I highly advise against it, as it is just an unrealistic, torturous load on GPUs.


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The majority of the talk about FurMark here is telling people NOT to use it. I highly advise against it, as it is just an unrealistic, torturous load on GPUs.


This is correct, there is absolutely nothing that FurMark will help you with.


----------



## patriotaki

Sometimes in BF4 I get worse avg fps when I OC the GPU... lol


----------



## navjack27

Maybe it's an unstable memory overclock causing too many memory error corrections. I had that when I was running the Catzilla benchmark last night.


----------



## patriotaki

Quote:


> Originally Posted by *navjack27*
> 
> Maybe it's an unstable memory overclock causing too many memory error corrections. I had that when I was running the Catzilla benchmark last night.


Should I lower the mem?


----------



## navjack27

If you aren't running at a strap memory speed, then yeah.

1750
1625
1500
1375
1250
1125
1000
900
800
400

Those are the memory strap top ends for a 390X; I just assume it's the same for a 390.
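If you want to sanity check a clock against that list, here's a quick (hypothetical) helper that snaps a requested memory clock down to the nearest strap top end:

```python
# Memory strap top ends for a 390X, per the list above (assumed the same
# for a 390). Running at a strap top end avoids sitting in the middle of
# a strap with looser timings than your clock needs.
STRAPS = [400, 800, 900, 1000, 1125, 1250, 1375, 1500, 1625, 1750]

def nearest_strap(mem_clock_mhz):
    """Highest strap top end at or below the requested clock."""
    at_or_below = [s for s in STRAPS if s <= mem_clock_mhz]
    return max(at_or_below) if at_or_below else STRAPS[0]

print(nearest_strap(1700))  # -> 1625
```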


----------



## uszpdoz

I can undervolt my GPU to -100mV at stock speed with the same performance as without undervolting... I can OC the core up to 1120/1600 without any voltage adjustment... but beyond that clock I can't OC even with increased mV... I just wonder why.


----------



## jdorje

You must be mistaken on that, or there's some setting that is off. Clock should scale smoothly with voltage.

The actual scaling is not that good - about 100MHz per 200mV on my chip.

I too can run 1035MHz at 1225-100=1125mV.
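That scaling can be turned into a back-of-envelope estimator. Numbers below are from my chip only; every sample differs, and scaling stops being linear near the limit:

```python
# Back-of-envelope voltage estimate from the ~100 MHz per 200 mV scaling
# observed on one particular chip. Purely illustrative.
MV_PER_MHZ = 200 / 100  # 2 mV of extra voltage per extra MHz

def extra_mv_needed(base_mhz, target_mhz):
    """Estimated additional voltage (mV) for a core clock bump."""
    return (target_mhz - base_mhz) * MV_PER_MHZ

print(extra_mv_needed(1035, 1085))  # -> 100.0 mV for a 50 MHz bump
```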


----------



## uszpdoz

Do you mean that performance at 1040 core clock undervolted by -100mV (using Heaven as the benchmark) should be less than at 1040 core clock with 0mV? I tested it using Heaven and The Witcher 3 and don't see any loss or gain in performance whether undervolted or not (only a temperature drop of around 5-10C when undervolted).
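The temperature drop with no fps change is what you'd expect: at a fixed clock, only power changes, not performance. A ballpark sketch, assuming dynamic power scales with f·V² (idealized; real cards also have static leakage, so treat the number as a rough guide):

```python
# Idealized dynamic-power estimate (P ~ f * V^2). Real GPUs also draw
# static/leakage power, so this is a ballpark only.
def power_ratio(v_new_mv, v_old_mv, f_new=1.0, f_old=1.0):
    """Fraction of original dynamic power after a voltage/clock change."""
    return (f_new / f_old) * (v_new_mv / v_old_mv) ** 2

# e.g. undervolting 1.100V -> 1.000V at the same clock:
print(round(power_ratio(1000, 1100), 3))  # -> 0.826, i.e. roughly 17% less
```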


----------



## lanc3lot

Hey everyone,

1. I have read in many reviews of the 390 that, with proper overclocking, it can produce FPS results like a 390X - is that something you have seen so far, or is it, let's say, an exaggeration?

2. I plan to buy a 390 (either Sapphire or MSI), especially for 1440p gaming. I am a bit concerned that AMD drivers aren't as optimised as Nvidia's (or weren't before Radeon Crimson was released).

Thanks in advance for any replies


----------



## tolis626

Quote:


> Originally Posted by *lanc3lot*
> 
> Hey everyone,
> 
> 1. I have read in many reviews of the 390 that, with proper overclocking, it can produce FPS results like a 390X - is that something you have seen so far, or is it, let's say, an exaggeration?
> 
> 2. I plan to buy a 390 (either Sapphire or MSI), especially for 1440p gaming. I am a bit concerned that AMD drivers aren't as optimised as Nvidia's (or weren't before Radeon Crimson was released).
> 
> Thanks in advance for any replies


When both are running at the same clock speed, the 390X is 5-10% faster than the 390, so they are pretty close anyway, and more so with overclocking. It's not an exaggeration, but it's also not something you should obsess over.

AMD's drivers have honestly been fine (more than fine) for the most part the last few years. The only "issue" I know of is that Crossfire profiles take a while to come out for some games, but you're gonna be running a single card so it won't be a problem. Don't worry about the drivers.


----------



## Stige

Yeah, this driver nonsense about AMD/ATI is from the past, a LONG time ago, and somehow people still bring it up.


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> Yeah, this driver nonsense about AMD/ATI is from the past, a LONG time ago, and somehow people still bring it up.


I know, right? These past few years I've had more trouble with Nvidia drivers than AMD drivers, and I've had next to no issues with Nvidia as well. The only "problem" I've encountered with an AMD driver in recent memory is when I was trying to install Windows 8.1 on my old laptop with a Mobility Radeon HD4650 in it, and it would black screen and never work again. Installing some modded older driver did the trick, though.


----------



## lanc3lot

Quote:


> Originally Posted by *Stige*
> 
> Yeah, this driver nonsense about AMD/ATI is from the past, a LONG time ago, and somehow people still bring it up.


I've never had an ATI/AMD card in my rig. I've always had Nvidia; right now I'm on a GTX 580 and I find it very hard to play games the way I like, especially at 1440p. In the research I've done to choose my next card, the most common thing I read in various forums was that AMD has a problem with drivers, and that their drivers aren't released as often as Nvidia's, which come out on launch day for each major game. That's why I asked here.


----------



## battleaxe

I've had fewer issues with AMD drivers over the last 4 years than I have with Nvidia. It's perspective, I think. If you want to find fault with something, you will. That simple. Who am I, though? Nobody.


----------



## Agent Smith1984

I found this little gem while searching through the 3dmark score lists....

http://www.3dmark.com/fs/6644269

HOLY GRANOLA BARS BATMAN!


----------



## Ha-Nocri

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I found this little gem while searching through the 3dmark score lists....
> 
> http://www.3dmark.com/fs/6644269
> 
> HOLY GRANOLA BARS BATMAN!


How is that even possible @1250MHz?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> How is that even possible @1250MHz?


Not a clue, maybe a driver hack or a setting in Crimson that bypasses the Futuremark SystemInfo service?

OR, maybe they had 1250 set as their OC on the MSI, then used Trixx in the background to push vcore up 200mV, and were really running at like 1300+??

Seems crazy to me....


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I found this little gem while searching through the 3dmark score lists....
> 
> http://www.3dmark.com/fs/6644269
> 
> HOLY GRANOLA BARS BATMAN!


What is so holy granola about this?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> What is so holy granola about this?


16,600 graphics on a single card, and a verified legit run. That is insane


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 16,600 graphics on a single card, and a verified legit run. That is insane


The clocks are not even very high? I got 14,600 on my 390 at very mild clocks (1200/1600). If I can build a separate waterblock for my VRM and get it cooled, I have a feeling I can improve that with that extra 50MHz on core, considering I can run Valley at 1268MHz or something like that.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> The clocks are not even very high? I got 14,600 on my 390 at very mild clocks (1200/1600). If I can build a separate waterblock for my VRM and get it cooled, I have a feeling I can improve that with that extra 50MHz on core, considering I can run Valley at 1268MHz or something like that.


I understand that the clocks aren't high; what's astonishing is the score..... 16,600 at 1250 core seems like a hack to me.


----------



## Stige

Compared to other scores, that one seems about right to me for a 390X at those clocks?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> And other comparison scores, that score seems about right to me on a 390X at those clocks?


Na, the most you will typically see on a 390X is about a 15,200 graphics score....

http://www.3dmark.com/fs/6391953

Here is one at 1225 getting 15,000; there is NO WAY 25MHz is getting another 1,600 points. I want to know what they did in that run to get such a high score...... it's gotta be a tess tweak that they are getting by Futuremark somehow.


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Na, the most you will typically see on a 390X is about a 15,200 graphics score....
> 
> http://www.3dmark.com/fs/6584528
> 
> Here is one at 1225 getting 14,800; there is NO WAY 25MHz is getting another 800 points. I want to know what they did in that run to get such a high score...... it's gotta be a tess tweak that they are getting by Futuremark somehow.


Modified bios for better memory performance maybe.....I can get right at 16k with those clocks...it's only a couple of FPS difference. Looking at the detailed scores, he wasn't using tess tweaks or magic hex....certainly looks legit to me.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I found this little gem while searching through the 3dmark score lists....
> 
> http://www.3dmark.com/fs/6644269
> 
> HOLY GRANOLA BARS BATMAN!


Quote:


> Originally Posted by *Ha-Nocri*
> 
> How is that even possible @1250MHz?


Oh, it's possible. Just wait till I get my VRMs under water.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> Modified bios for better memory performance maybe.....I can get right at 16k with those clocks...it's only a couple of FPS difference. Looking at the detailed scores, he wasn't using tess tweaks or magic hex....certainly looks legit to me.


Yeah, it says it's valid; that's a crazy good score, though.


----------



## Agent Smith1984

So what about this one?

http://www.3dmark.com/fs/7016767

I call BS at 1175...... there is a way to kill tess in Crimson and get around Futuremark, I believe.


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Na, the most you will typically see on a 390X is about a 15,200 graphics score....
> 
> http://www.3dmark.com/fs/6391953
> 
> Here is one at 1225 getting 15,000; there is NO WAY 25MHz is getting another 1,600 points. I want to know what they did in that run to get such a high score...... it's gotta be a tess tweak that they are getting by Futuremark somehow.


This is only 400 higher than my 390 at lower clocks; I would be near that score on my 390 if I ran it at the same clocks. This score is way too low if you ask me.

Also, the website can show wrong clocks, at least it does on my runs: like 800MHz on memory and 1150 on core or something, even though I ran it at 1200/1600.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> This is only 400 higher than my 390 at lower clocks; I would be near that score on my 390 if I ran it at the same clocks. This score is way too low if you ask me.


That score looks congruent with every other in that clock range though.

The 390 is only 3-5% slower at the same clock speeds. What is your best firestrike graphics score so far at max clocks?

I have found several that look like this:

http://www.3dmark.com/fs/6882297

And only two that are breaking 16k.....


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That score looks congruent with every other in that clock range though.
> 
> The 390 is only 3-5% slower at the same clock speeds. What is your best firestrike graphics score so far at max clocks?
> 
> I have found several that look like this:
> 
> http://www.3dmark.com/fs/6882297
> 
> And only two that are breaking 16k.....


I would score higher than that on my 390 at those clocks, lol.

This is my highest so far: http://www.3dmark.com/fs/7503411
Like I said, clocks were 1200/1600. I refunded 3DMark on Steam so I don't have it now; I will run it again when I get my VRM temps toned down so I can actually push the card to 1250+ in 3DMark.
Also, the drivers were 16.1 I think; I didn't have the hotfix installed back then.


----------



## Agent Smith1984

Look at this comp:

http://www.3dmark.com/compare/fs/7016767/fs/6882297

Navjack is actually a member of our club. He is using modified timings..... I am going to get with him and see if that's all there is to it, or if there is some tess mod going on. I'm telling you right now, modified memory can't cover that much ground.

My Fury with HBM at 570 and core at 1100 hit about a 17k graphics score, so there is something that's got that 390X kicking some abnormal amounts of ass.


----------



## Stige

Of course there could be something fishy there, but those few linked runs which you said were "normal scores" are a bit low as well if you ask me.
Dunno what's going on really. I guess those could be a bit high, but I will know for sure once I can run at the same clocks :l


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So what about this one?
> 
> http://www.3dmark.com/fs/7016767
> 
> I call BS at 1175...... there is a way to kill tess in Crimson and get around Futuremark, I believe.


You can turn off tessellation in Crimson, but it detects it.....

With tess tweaks, the scores climb pretty quickly to 17k+.

They're probably watercooled...maybe even a chilled loop. /shrug


----------



## Agent Smith1984

I've contacted the user and asked for some clarification on his score.....

I told him if it was a "secret" that I would understand..... lol


----------



## Vellinious

I ran this a couple of nights ago with clocks that I knew would be stable for an FS run. I need to try the driver version they're using, see if there's a difference there.

http://www.3dmark.com/fs/7657373


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> I ran this a couple of nights ago with clocks that I knew would be stable for an FS run. I need to try the driver version they're using, see if there's a difference there.
> 
> http://www.3dmark.com/fs/7657373


Damn nice Hawaii you got there. Modified BIOS, I assume?

What mV?

These damn 390's will NOT respond to voltage like the 290s did.....

Sure, they OC better with less voltage up to 1150-1200MHz, but then they seem to just crap out. Could be related to a lower board TDP intended to make Grenada more power efficient, though.


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Damn nice Hawaii you got there. Modified BIOS, I assume?
> 
> What mV?
> 
> These damn 390's will NOT respond to voltage like the 290s did.....
> 
> Sure, they OC better with less voltage up to 1150-1200MHz, but then they seem to just crap out. Could be related to a lower board TDP intended to make Grenada more power efficient, though.


Yeah, it's running the 390X memstrap and a voltage mod. With the vdroop that's still VERY noticeable, even after mods, the max I can push the voltage to is +262mv, as anything more than that doesn't yield any more voltage to the core. Still trying to figure that out, and get rid of some more of the vdroop. Watching voltage go from 1.44v to 1.305v is painful.....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah, it's running the 390X memstrap and a voltage mod. With the vdroop that's still VERY noticeable, even after mods, the max I can push the voltage to is +262mv, as anything more than that doesn't yield any more voltage to the core. Still trying to figure that out, and get rid of some more of the vdroop. Watching voltage go from 1.44v to 1.305v is painful.....


Gotcha,

Is this a hard mod?

Oddly enough, with every 290 I have had, and every 390 I have had, I can never get load voltage above 1.262 despite what I have it set to. And on the 390's, I noticed the load voltage was closer to 1.25 most of the time; whether I tried to push 75mV or 175mV, it just seemed to droop to around the same amount, which immediately let me know the cards are TDP limited. Whether that is through BIOS or physical limitations, I'm not sure......

I would have thought that the Sapphire cards would have had the best shot at high clocks when these cards first came out, because they are using 2x 8-pin power delivery, but it appears that their binning (at least initially, on the cards with no backplate) was very poor. That may be a different story now that they have gone from 1010MHz to 1040MHz, but I see pretty much any Hawaii core can do that.

Right now, it appears XFX and MSI are winning in the binning department, maybe with the edge even tilting towards XFX now. I imagine if the XFX sold as well as the MSI, we would see even more 1200+ samples of their cards.


----------



## mus1mus

Arrrh.

Give us your BIOS and we will push it up to 1.5V, as far as you're willing. 1300MHz is not possible under 1.36V at load for most cards.

Some systems can get away with tess OFF, or up to 2X, without being detected.


----------



## Vellinious

lol, I'm pretty satisfied...just wish the vdroop wasn't so game changing.

I also wonder why I never saw this issue with Maxwell.....not running enough volts?


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Arrrh.
> 
> Give us your BIOS and we will see it up to 1.5V up to your willingness. 1300MHz is not possible under 1.36 at load for most cards.
> 
> Some systems can get away with Tess OFF or up to 2X without being detected.


I'm yanking out the GTX 980 KPE tonight and popping in the new MSI 390X.... I will dump BIOS ASAP.

You got any sweet modded BIOS Firestrike runs for me to check out??

I'm definitely looking to push this card as hard as I can on air (I'M NOT SKEERED AT ALL)

My last MSI 390 did 1200/1725 no problem, with 71C core and 59C VRM temps on air.... I'll probably torture this card for a little bit, give it to the wife, and get whatever the next-gen AMD card is (wanting 8GB of HBM2) and pair that up with Zen on an entirely new build.....


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> lol, I'm pretty satisfied...just wish the vdroop wasn't so game changing.
> 
> I also wonder why I never saw this issue with Maxwell.....not running enough volts?


We can still mod that to emulate a full PT3. The mods I did were just VDDC Limit related.

Maxwells are not that power-heavy compared to Hawaiis.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm yanking out the GTX 980 KPE tonight and popping in the new MSI 390X.... I will dump BIOS ASAP.
> 
> You got any sweet modded BIOS Firestrike runs for me to check out??
> 
> I'm definitely looking to push this card as hard as I can on air (I'M NOT SKEERED AT ALL)
> 
> My last MSI 390 did 1200/1725 no problem, with 71C core and 59C VRM temps on air.... I'll probably torture this card for a little bit, give it to the wife, and get whatever the next-gen AMD card is (wanting 8GB of HBM2) and pair that up with Zen on an entirely new build.....


SHOOR. The VDDC Limit mod is guaranteed working. But be aware that black screens come with high voltages.

Maybe we can try memory tweaks for you. Magic Hex too, if allowed.

Sad to say, you are at the very edge of what I can consider safe in terms of temps.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> We can still mod that to emulate a full PT3. The mods I did where just VDDC Limit related.
> 
> Maxwells are not that Power bheavy compared to Hawaiis.


Well, whatever you did took it from 1.285v minimums, to 1.305v minimums. So, it definitely helped.

I may wait until I get my new cables made with the capacitors on them, to smooth out the voltage further before it gets to the GPU. No idea if that'll actually do anything, but......can't hurt either. lol


----------



## tolis626

Meanwhile, I'm just sittin' here trying to figure out how to beat my highest score, which is in the 14,700-and-change range, that I got at 1185/1725MHz. Sad thing is, the same config now gives me around 14,400-14,500, just like any other clock that's 1150MHz or above. I've tried up to 1225MHz, still stuck there. And that guy is pushing 16.6k? What is he cooling with? Steroids?

PS: Please stop this custom BIOS discussion. You're tempting me and I shouldn't think about it. I want to sell the card later.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> SHOOR. VDDC Limit is guaranteed working. But be aware that black screens come with High Voltages.
> 
> Maybe we can try Memory tweaks for you. Magic Hex too if allowed.
> 
> Sad to say, you are within the very edge of what I can consider safe in terms of temps.


Yeah, I'm definitely going to want to delve into the BIOS mod thing.

Those temps were with case fans at 1600rpm (tiny little S340), so I am probably going to set them to a full-tilt 2250 and get that heat out of there a little better.

I don't expect much more than 1200/1700 from the card, but if I can tweak those memory timings to squeeze out some better numbers, that'll be awesome.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Well, whatever you did took it from 1.285v minimums, to 1.305v minimums. So, it definitely helped.
> 
> I may wait until I get my new cables made with the capacitors on them, to smooth out the voltage further before it gets to the GPU. No idea if that'll actually do anything, but......can't hurt either. lol


Caps should help, but how big it is matters. You might get away with a smaller (microfarad, uF) rating, though, because switching PSUs operate at much higher frequencies than the 60Hz AC wall frequency.

Car audio nuts are familiar with caps over 1 Farad. They are rather bulky and impractical for our use.

And given that PCs convert 12V down to smaller voltages, the caps' effect will be minimal, as the voltages fed to components depend on the voltage regulators and circuitry.
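A quick sanity check of the frequency argument above: capacitor impedance falls as 1/(2*pi*f*C), so a microfarad-range cap at a VRM-style switching frequency presents roughly the same impedance as a car-audio farad does at 60Hz. A sketch, with 300kHz as an assumed typical switching frequency (not a figure from this thread):

```python
import math

# Impedance magnitude of an ideal capacitor: Z = 1 / (2*pi*f*C).
# At the wall's 60 Hz you need about a farad to get down to milliohms,
# but at a few hundred kHz a small microfarad-range cap gets there too.
def cap_impedance_ohms(freq_hz, farads):
    return 1.0 / (2 * math.pi * freq_hz * farads)

z_wall = cap_impedance_ohms(60, 1.0)          # 1 F at 60 Hz: ~2.7 milliohms
z_vrm = cap_impedance_ohms(300_000, 100e-6)   # 100 uF at 300 kHz: ~5.3 milliohms
```

Real caps add ESR and ESL on top of this, which is why placement and cap type matter as much as the farad rating.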

BTW, you know we can mod your DPM7 voltage, right? We can set it to 1.287V as the base.

Quote:


> Originally Posted by *tolis626*
> 
> Meanwhile, I'm just sittin' here trying to figure out how to beat my highest score, which is in the 14,700-and-change range, that I got at 1185/1725MHz. Sad thing is, the same config now gives me around 14,400-14,500, just like any other clock that's 1150MHz or above. I've tried up to 1225MHz, still stuck there. And that guy is pushing 16.6k? What is he cooling with? Steroids?
> 
> PS: Please stop this custom BIOS discussion. You're tempting me and I shouldn't think about it. I want to sell the card later.


You can sell them with stock ROMs. No problem.


----------



## tolis626

Quote:


> Originally Posted by *mus1mus*
> 
> You can sell them with stock ROMs. No problem.


Doesn't flashing the BIOS immediately void the warranty? Or is it a "if you return it in stock condition it's OK" kind of thing? If so, then the only thing keeping me from doing it is the fact that the MSI only has one BIOS, but what's the possibility something could go THAT wrong?

Other than that, one could debate whether it's worth it for a stock air-cooled card or not, but that's down to personal preference, I guess. I don't think I could squeeze that much out of it without better cooling. That doesn't take the fun out of it, though.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Caps should help, but how big it is matters. You might get away with a smaller (microfarad, uF) rating, though, because switching PSUs operate at much higher frequencies than the 60Hz AC wall frequency.
> 
> Car audio nuts are familiar with caps over 1 Farad. They are rather bulky and impractical for our use.
> 
> And given that PCs convert 12V down to smaller voltages, the caps' effect will be minimal, as the voltages fed to components depend on the voltage regulators and circuitry.
> 
> BTW, you know we can mod your DPM7 voltage, right? We can set it to 1.287V as the base.


Hmm....that wouldn't be bad for a benchmark only mod. I could always use the stock bios for gaming and such.

Would that allow for higher voltages without vdroop?


----------



## navjack27

here are some quotes from my discussion with mr agent smith
Quote:


> Originally Posted by *Navjack27*
> 1250? i remember running a 1175 run with 1750 and 1650 memory.
> 
> now i'm still to this day confused about the tessellation, but whenever i run it (not for green vs red competitions lol) i make sure to have tess set in the registry to use the app setting.
> 
> lemme look at my 3dmark results to remind myself
> 
> i believe that this run http://www.3dmark.com/fs/7457158 was with tess set to 0x3400 = 4x in registry
> 
> http://www.3dmark.com/fs/7457089 was unmodified
> 
> http://www.3dmark.com/fs/7017421 i'm quite sure was just massive overclocking on the gpu with memory timings (1250 timings artifacts at 1750mhz, did 1625 instead) so i ran it 1175core 1625memory +125mv core and like i said i took the 1250 memory strap and moved it up to every memory timing up to 1750 (my bios says it goes to 2000mhz memory as max but never tried)
> 
> i use radeonmod registry editor to tweak a couple other things like DISABLEDMACOPY 1 and DISABLEBLOCKWRITE 0 i also use clockblocker in ADL mode which i manually set to on after i set my overclock with iTurbo
> 
> i guess i also notice that fresh installs of drivers mean a ton to the system so i wipe the registry of everything related to amd crimson and catalyst and then uninstall and then check again and then run ddu AND THEN install the last newest catalyst and then the newest crimson.
> 
> its been a while since i've been in mega 3dmark firestrike benching mode so i could run some with my new cpu and try to recreate my results or best them.


Quote:


> Originally Posted by *agent smith*
> So what about this one?
> 
> http://www.3dmark.com/fs/7016767
> 
> I call BS at 1175...... there is a way to kill tess in crimson and get around futuremark I believe.


Quote:


> Originally Posted by *navjack27*
> LOL i do know how to get around it from being detected. stumbled upon it by accident honestly.
> 
> i call BS on most of my firestrike scores just due to how convoluted the registry is with the graphical settings and how there are still internal profiles for each app you add PLUS not knowing when crimson profiles override registry settings
> 
> *note not settings i used when i benched, just an example*


Quote:


> Originally Posted by *navjack27*
> one thing tho is that the memory timings DO account for TONS of extra points. maybe 500-1500 or 2000 extra points depending on what's going on in your system.
> 
> the ones you pointed out, i BELIEVE have the drivers using use app setting tess, but i also said i can't account for what the drivers ACTUALLY listen to for the setting which infuriates me because of getting called out. i'd not use those scores as comparisons for my own data if i cheated.
> 
> also i'm sure they are low in other areas due to my old 4790s. maybe... ahhh **** it, i'll do some benching today with my new broadwell.


----------



## Vellinious

Quote:


> Originally Posted by *navjack27*
> 
> here are some quotes from my discussion with mr agent smith


Good lord....whatever happened to just running the freakin benchmark with an overclock and seeing how well you do? This level of modification is just plain shady.....registry files and such? /smh Pretty pathetic, really.


----------



## navjack27

Maybe it's the new way of life? Gotta get every inch of performance out of your system when you hit a thermal or power limit.

Here, I did just that though:
http://www.3dmark.com/fs/7677687

Reset to default driver settings. It's not a fluke.

EDIT: Nvidia users just have an easier way of modifying settings with Nvidia Inspector. You think no one does it?


----------



## Vellinious

Quote:


> Originally Posted by *navjack27*
> 
> maybe its the new way of life? gotta get every inch of performance outta ur system when you hit a thermal or power limit.
> 
> here, i did just that tho
> http://www.3dmark.com/fs/7677687
> 
> reset to default driver settings. its not a fluke.
> 
> EDIT: nvidia users just have an easier way of modifying settings with nvidia inspector, you think no one does it?


Didn't say nobody does....just said it's pretty pathetic to get a higher score. To me, that's the same thing as the d-bags that use aim bots in shooters.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Hmm....that wouldn't be bad for a benchmark only mod. I could always use the stock bios for gaming and such.
> 
> Would that allow for higher voltages without vdroop?


That can give you a higher ceiling.



You can do that with Hawaii Bios Reader.

Notice the black boxes? Those can be set manually. I personally just put the value from the RED box into them and let OC tools do the rest of the overvolting.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> That can give you a higher ceiling.
> 
> 
> 
> You can do that with Hawaii Bios Reader.
> 
> Notice the black boxes? Those can be set manually. I personally just put the value from the RED box into them and let OC tools do the rest of the overvolting.


Hmm. Ok, I'll take a look at that tonight.


----------



## mus1mus

Quote:


> Originally Posted by *tolis626*
> 
> Doesn't flashing the BIOS immediately void the warranty? Or is it a "if you return it at its stock condition it's ok" kind of thing? If so, then the only thing keeping me from doing it is the fact that the MSI only has one BIOS, but what's the possibility something could go THAT wrong?
> 
> Other than that, one could debate whether it's worth it for a stock air-cooled card or not, but that's down to personal preference I guess. I don't think I could squeeze that much out of it without better cooling. That doesn't take the fun out of it though.


You can save the original BIOS through GPU-Z; it's a complete replica. Flash the card, and if things go wrong, flash back the original BIOS.

They can't tell you flashed it.
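One habit worth bolting onto that GPU-Z workflow: hash the dump right after saving it, so before you ever flash the backup back you can verify the file is still byte-identical to what you saved. A sketch; the filename is hypothetical:

```python
import hashlib

def sha256_of(path):
    """SHA-256 of a ROM dump, for verifying a backup before reflashing it."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # read in chunks so even large dumps don't need to fit in one read
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Record this right after saving the dump from GPU-Z, e.g.:
#   print(sha256_of("stock_390_backup.rom"))
# and compare it again before flashing the backup back.
```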


----------



## navjack27

Quote:


> Originally Posted by *Vellinious*
> 
> Didn't say nobody does....just said it's pretty pathetic to get a higher score. To me, that's the same thing as the d-bags that use aim bots in shooters.


the important thing is this, at least to me
http://hwbot.org/news/9039_application_52_rules/


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *navjack27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *navjack27*
> 
> here are some quotes from my discussion with mr agent smith
> 
> 
> 
> Good lord....whatever happened to just running the freakin benchmark with an overclock and seeing how well you do? This level of modification is just plain shady.....registry files and such? /smh Pretty pathetic, really.
> 
> 
> maybe its the new way of life? gotta get every inch of performance outta ur system when you hit a thermal or power limit.
> 
> here, i did just that tho
> http://www.3dmark.com/fs/7677687
> 
> reset to default driver settings. its not a fluke.
> 
> EDIT: nvidia users just have an easier way of modifying settings with nvidia inspector, you think no one does it?
> 
> 
> Didn't say nobody does....just said it's pretty pathetic to get a higher score. To me, that's the same thing as the d-bags that use aim bots in shooters.

I wouldn't call this BS. If it matters in Benches, it will matter in Gaming. Both users can benefit from these mods.

nVidia guys have LOD Tweaks that are also allowed on the bot for competitive purposes. They do the same thing I believe.


----------



## jdorje

Bios editing is pretty cool.

The first step is to make sure you have a backup GPU, as bricking your card is a real possibility and you need another card to reflash from. I keep my iGPU enabled and make sure it's the primary in the BIOS, so that's what I use for DOS flashing.

Bumping memory timings just takes a few minutes in a hex editor (Hex Workshop?) and will get you 2-4% more FPS at no cost whatsoever. Why manufacturers don't fix the straps is beyond me.
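The strap bump jdorje describes is, at its core, copying one strap's timing bytes over the entries above it. A toy sketch on a fake ROM buffer; the 48-byte entry size and the offsets here are invented for illustration, and a real Hawaii BIOS edit needs the actual timing-table offsets (and typically a checksum fix) from the BIOS modding thread:

```python
ENTRY_SIZE = 48  # hypothetical bytes per strap timing entry, NOT the real size

def copy_strap_up(rom: bytearray, table_off: int, src_idx: int, dst_idx: int):
    """Overwrite strap dst_idx's timing bytes with strap src_idx's (tighter) ones."""
    src = table_off + src_idx * ENTRY_SIZE
    dst = table_off + dst_idx * ENTRY_SIZE
    rom[dst:dst + ENTRY_SIZE] = rom[src:src + ENTRY_SIZE]

# Toy example: a fake ROM with 4 strap entries; copy entry 1's timings onto entry 3,
# i.e. make the highest strap run the tighter lower-strap timings.
rom = bytearray(range(4 * ENTRY_SIZE))
copy_strap_up(rom, 0, 1, 3)
```

The real work is finding where the timing table lives in your particular ROM; the copy itself is this trivial, which is why the hex-editor route only takes a few minutes once you know the offsets.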

Editing voltage is the next step. AIDA64 can tell you what your stock voltage states are. You can then edit this by changing all 6 tables (not just the two shown). This is no different from simply bumping voltage in Afterburner, though. It can let you raise the stock voltage, or let you go over +100 mV.

You can also change the default clock and memory. Also no different from using afterburner.

IIRC mine is set to 1740 RAM with 1100 timings, 1225 mV and 1090 MHz on core. It's the same voltage as stock and power use isn't very high, but it gets about 12% more performance. If it gets cold or I want more FPS, then I OC more in Afterburner.

The bios editing thread is cool and full of helpful people but can get a bit daunting with their terminology.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> I wouldn't call this BS. If it matters in Benches, it will matter in Gaming. Both users can benefit from these mods.
> 
> nVidia guys have LOD Tweaks that are also allowed on the bot for competitive purposes. They do the same thing I believe.


LOD tweaks?

HWBot's rules are pretty shady too, imo.... I do runs with tess tweaks on FS for comparison purposes, but I'd never go into the registry or try to hide the fact that a run was done with a tess tweak to pass it off as a "legit score". There's a REASON they call it a "legit score", btw.....lol

I understand wanting to be competitive, but I can't see stooping to that level to get a better score. "Pond scum" comes to mind, almost immediately....


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> LOD tweaks?
> 
> HWBot's rules are pretty shady too, imo.... I do runs with tess tweaks on FS for comparison purposes, but, *I'd never go into the registry or try to hide the fact that the run I had done was with a tess tweak to pass it off as a "legit score". There's a REASON they call it a "legit score"*, btw.....lol
> 
> I understand wanting to be competitive, but, I can't see stooping to that level to get a better score. "Pond scum" comes to mind, almost immediately....


Level of Detail mods.

Same here though. And I just thought about what I said: if it tweaks TESS, just turn TESS off.

Quote:


> Originally Posted by *jdorje*
> 
> Bios editing is pretty cool.
> 
> The first step is to make sure you have a backup GPU, as bricking your card is a near certainty, and you need another card to reflash. I keep my iGPU enabled and make sure it's the primary in the BIOS, so that's what I use for DOS flashing.
> 
> Bumping memory timings just takes a few minutes in a hex editor (Hex Workshop?) and will get you 2-4% more fps at no cost whatsoever. Why manufacturers don't fix the straps is beyond me.
> 
> Editing voltage is the next step. AIDA64 can tell you what your stock voltage states are. You can then edit these by changing all six tables (not just the two shown). This is no different from simply bumping voltage in Afterburner, though: it just lets you raise the stock voltage, or go over +100 mV.
> 
> You can also change the default clock and memory. Also no different from using Afterburner.
> 
> IIRC mine is set to 1740 RAM with 1100 timings, 1225 mV and 1090 MHz on the core. It's the same voltage as stock and power use isn't very high, but it gets about 12% more performance. If it's cold or I want more fps, I OC more in Afterburner.
> 
> *The bios editing thread is cool and full of helpful people but can get a bit daunting with their terminology*.


Damn right.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> That can give you a higher ceiling.
> 
> 
> 
> You can do that with Hawaii Bios Reader.
> 
> Notice the Black boxes? That can be manually set. I personally just put the value on the RED box unto those and let OC tools do the rest of the overvolting.


My PowerPlay tab doesn't look like that. I don't have the voltage table there..... ??


----------



## mus1mus

I am using the old one.

HawaiiBiosReader.zip 32k .zip file


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> I am using the old one.
> 
> HawaiiBiosReader.zip 32k .zip file


Thanks


----------



## Vellinious

Quote:


> Originally Posted by *Vellinious*
> 
> Hmm. Ok, I'll take a look at that tonight.


The red box you show there is already set to 1287mv. ??


----------



## rdr09

Quote:


> Originally Posted by *Vellinious*
> 
> LOD tweaks?
> 
> HW Bot rules are pretty shady too, imo.... I do runs with tess tweaks on FS for comparison purposes, but I'd never go into the registry or try to hide the fact that the run I had done was with a tess tweak to pass it off as a "legit score". There's a REASON they call it a "legit score", btw.....lol
> 
> I understand wanting to be competitive, but I can't see stooping to that level to get a better score. "Pond scum" comes to mind, almost immediately....


I think it's only fair to allow both sides to do tweaks, so long as they are within the rules.


----------



## dagget3450

Quote:


> Originally Posted by *rdr09*
> 
> I think it's only fair to allow both sides to do tweaks, so long as they are within the rules.


I kind of understand both sides; I just still can't wrap my head around some of the software tweaks, i.e. LOD/Tess.
Essentially you're changing the way the benchmark was intended to run. On the overclocking side it makes total sense to tweak hardware settings, since you're tweaking your box.

For me it feels like this: modify your race car for the track (i.e. your rig) and race. Tweaking graphical quality in the benchmark is like modifying the race track to your advantage (defeating the purpose of the race).

Just my personal opinion.

Anyways, it's all fun and games until someone loses an eye. Then it's just fun and games in the dark


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> The red box you show there is already set to 1287mv. ??


Copy the value from the Red box onto the ones I boxed in Black. Both of the black boxes. They must be identical, or else.


----------



## Agent Smith1984

So.... got the 390X fired up last night.

Seems to be a decent card so far.

Not done testing by any means, but initial results are 1180MHz at +60mv / 1750MHz VRAM at +50mv AUX, with a FS graphics score of 14,616..... (that's a 100% no-artifact, game-stable test speed).

All I need to do is put those 1250 memory straps in place and it should be able to get her up to 15k, which is dead even with my 980 KPE at 1526/1947 (15,033).
Now factor in how crippled the memory bus is on the 980 while running at 4K, and you can see how the 390 will definitely pull ahead at high resolution..... YES, even in an overclocked 980 vs overclocked 390 situation, which people on the 980 boards have tried to use against the 390 several times. I can now confirm that even though Maxwell overclocks a lot better, it doesn't seem to matter all that much: I am getting identical performance between the two cards with their respective overclocks.

I fired up some GTA V at the exact same 4K settings I used on my 980 last night, and the performance was actually smoother. The 980 did okay on framerates, but it would get hitches every once in a while from pegging the VRAM out. It consistently used almost all its VRAM at 4K, and that was my primary reason for going back to AMD's 4-year-old space heater, lol.

Don't get me wrong though, I could put the 980 at non-stable clocks and get some nice benches in the 15,600+ range, which is indeed better than what running this 390X at non-stable ~1210/1750+ clocks will get (probably around 15k), but to me, if the clocks aren't stable for daily use, then the benching isn't comparable. I guess those numbers could be indicative of water cooling, but on air, the 390X is not an inferior card to the 980 at all from a performance standpoint. The inferiority comes in the form of power and temps...... oh, and that driver argument everybody keeps making against AMD? Bullcrap! I'm hearing as many driver woes from green owners as I ever have....

One athlete may respond better to roids than another, but both athletes performing to the best of their non-doped abilities may be very similar.....


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So.... got the 390X fired up last night.
> 
> Seems to be a decent card so far.
> 
> Not done testing by any means, but initial results are 1180MHz at +60mv / 1750MHz VRAM at +50mv AUX, with a FS graphics score of 14,616..... (that's a 100% no-artifact, game-stable test speed).
> 
> All I need to do is put those 1250 memory straps in place and it should be able to get her up to 15k, which is dead even with my 980 KPE at 1526/1947 (15,033).
> Now factor in how crippled the memory bus is on the 980 while running at 4K, and you can see how the 390 will definitely pull ahead at high resolution..... YES, even in an overclocked 980 vs overclocked 390 situation, which people on the 980 boards have tried to use against the 390 several times. I can now confirm that even though Maxwell overclocks a lot better, it doesn't seem to matter all that much: I am getting identical performance between the two cards with their respective overclocks.
> 
> I fired up some GTA V at the exact same 4K settings I used on my 980 last night, and the performance was actually smoother. The 980 did okay on framerates, but it would get hitches every once in a while from pegging the VRAM out. It consistently used almost all its VRAM at 4K, and that was my primary reason for going back to AMD's 4-year-old space heater, lol.
> 
> Don't get me wrong though, I could put the 980 at non-stable clocks and get some nice benches in the 15,600+ range, which is indeed better than what running this 390X at non-stable ~1210/1750+ clocks will get (probably around 15k), but to me, if the clocks aren't stable for daily use, then the benching isn't comparable. I guess those numbers could be indicative of water cooling, but on air, the 390X is not an inferior card to the 980 at all from a performance standpoint. The inferiority comes in the form of power and temps...... oh, and that driver argument everybody keeps making against AMD? Bullcrap! I'm hearing as many driver woes from green owners as I ever have....
> 
> One athlete may respond better to roids than another, but both athletes performing to the best of their non-doped abilities may be very similar.....


If you're only getting those clocks out of a GM204, AND a KPE? You should probably try again......


----------



## Agent Smith1984

Being a KPE has no effect on the card clocking any better on air than any other 980..... It's a purpose-built card, and that purpose is LN2 (I only had the card because I got it in a trade)

1526 is not too bad for the core on air at stock voltage, but I was terribly disappointed in the memory. It would bench up to 2,020+ on stock voltage (mind you, these are Samsung chips), but for daily use I couldn't get her past 1950 or so at 4K without corruption or inevitable black screens. The card had an ASIC of 69.7, so it was no golden core or anything. Looking through 980 forums, and having done tons of research during the time I had the card, the 1500-1530 range is very normal for EVGA Classies...


----------



## Agent Smith1984

On another note......

This 390X seems to act exactly like my MSI 390 did in regards to overclocking, except the memory seems to like 1750 much better than that card's did (though it still seems to want AUX voltage to get there).....

When I first got the 390, I couldn't get past 1180 for the first few days; then, after a ton of different settings and MSI AB power-setting tweaks, I was able to find a stable 1200 @ +88mv on the nose, so I hope I can land at 1200 on this one too.
Looking at my temps though (71C full tilt after an hour of GTA V at 4K), I certainly won't be doing much more than +80-100mv for daily use.


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Being a KPE has no effect on the card clocking any better on air than any other 980..... It's a purpose-built card, and that purpose is LN2 (I only had the card because I got it in a trade)
> 
> 1526 is not too bad for the core on air at stock voltage, but I was terribly disappointed in the memory. It would bench up to 2,020+ on stock voltage (mind you, these are Samsung chips), but for daily use I couldn't get her past 1950 or so at 4K without corruption or inevitable black screens. The card had an ASIC of 69.7, so it was no golden core or anything. Looking through 980 forums, and having done tons of research during the time I had the card, the 1500-1530 range is very normal for EVGA Classies...


That's actually pretty poor for a GM204. Most GM204-based GPUs will boost to nearly 1500 with Boost 2.0 just by maxing the power limit and adding a tiny bit of voltage. Most GM204s I've helped set up with custom BIOS files and custom fan curves are well over 1550 on air, some as high as 1600. 69% ASIC for Maxwell is pretty bad, yes....but there's still room there.

I'd get the voltage tool and put some clock on it, before you start making claims that your 390 can best it....thus far, your 980 KPE is only a couple hundred points higher than my best 970 graphics scores. My best was 14880.....

For reference....a decent 980 will pull low 17k graphics scores. Get a good clock on them, and they'll pull high 17k. On air, you should still be able to see mid to high 15k graphics scores.

Sorry for the long post, just seems like you're not really pushing that card very much. /shrug


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> That's actually pretty poor for a GM204. Most GM204-based GPUs will boost to nearly 1500 with Boost 2.0 just by maxing the power limit and adding a tiny bit of voltage. Most GM204s I've helped set up with custom BIOS files and custom fan curves are well over 1550 on air, some as high as 1600. 69% ASIC for Maxwell is pretty bad, yes....but there's still room there.
> 
> I'd get the voltage tool and put some clock on it, before you start making claims that your 390 can best it....thus far, your 980 KPE is only a couple hundred points higher than my best 970 graphics scores. My best was 14880.....


The card wouldn't respond to voltage at all.... anything over +25mv would cause crazy artifacts. It's very common for some 204s not to improve with voltage at all. I was able to add 13mv and boost to around 1540 or so, but it seemed to add a strange amount of heat to the core (something with power delivery changes in P-X not allowing as much vdroop, or something?? I dunno)

And I never said the 390X could best it..... I said it is not an inferior card from a performance standpoint like some make it out to be, especially not at high resolution.
It is however, way more power hungry, and runs a good bit hotter for most people (though my load temps on my 390X with overvoltage are the same as the KPE with 13mv+)

Best FS score on my KPE was 15,780 graphics. That was at around 1540~/2000 I believe... can't really remember. Again I only went to the 390X for the 8GB. My Fury was also VRAM limited, so the only logical thing to do was go back to the 390, and a second one in the next few weeks


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The card wouldn't respond to voltage at all.... anything over +25mv would cause crazy artifacts. It's very common for some 204s not to improve with voltage at all. I was able to add 13mv and boost to around 1540 or so, but it seemed to add a strange amount of heat to the core (something with power delivery changes in P-X not allowing as much vdroop, or something?? I dunno)
> 
> And I never said the 390X could best it..... I said it is not an inferior card from a performance standpoint like some make it out to be, especially not at high resolution.
> It is however, way more power hungry, and runs a good bit hotter for most people (though my load temps on my 390X with overvoltage are the same as the KPE with 13mv+)
> 
> Best FS score on my KPE was 15,780 graphics. That was at around 1540~/2000 I believe... can't really remember. Again I only went to the 390X for the 8GB. My Fury was also VRAM limited, so the only logical thing to do was go back to the 390, and a second one in the next few weeks


I don't doubt at all that the 390 / 390X can beat it with a custom bios and some voltage. The 290X I have is putting up pretty decent 980 type scores....just sayin it's a pretty poor comparison. That's obviously a pretty bad 980.

/shrug


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> I don't doubt at all that the 390 / 390X can beat it with a custom bios and some voltage. The 290X I have is putting up pretty decent 980 type scores....just sayin it's a pretty poor comparison. That's obviously a pretty bad 980.
> 
> /shrug


I wouldn't say bad.... I have gone through months of research, and pages upon pages of 980 Classy threads, and most cards are doing between 1493 and 1560.... at 1526 I was right in the middle. The memory seemed low, though.... it was a card that, in my opinion, needed to be water cooled, and with such a low ASIC it would probably have been a monster on LN2.

ASIC quality seems to mean nothing on GCN-based cores, while Kepler and Maxwell ASIC qualities seem to be very indicative of their OC potential, especially on air.

My brother's 980Ti Classy does 1585MHz on air at stock voltage (as of when I last sat with him; it may be much higher now with some juice), and we've benched over 1620-something, I think. But he got lucky as hell with a 79.6 ASIC quality. That thing is a monster!


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I wouldn't say bad.... I have gone through months of research, and pages upon pages of 980 Classy threads, and most cards are doing between 1493 and 1560.... at 1526 I was right in the middle. The memory seemed low, though.... it was a card that, in my opinion, needed to be water cooled, and with such a low ASIC it would probably have been a monster on LN2.
> 
> ASIC quality seems to mean nothing on GCN-based cores, while Kepler and Maxwell ASIC qualities seem to be very indicative of their OC potential, especially on air.
> 
> My brother's 980Ti Classy does 1585MHz on air, but he got lucky as hell with a 79.6 ASIC quality. That thing is a monster!


The GM200s are different....they usually don't overclock nearly as well as the GM204-based GPUs. Yeah, that 980Ti is a monster. 1585 is VERY good for a GM200.

The Classys and KPEs are really poor performers on air and on water. Most of the G1 980 / 980 FTW / SSC / SC guys I have helped, even on air, are hitting 1550+...most of them are mid-70% ASICs, which is decent, but not golden by any means. My brother's 980 FTW, which has an ASIC quality of 83.4%, hits 1610 in FS on air. I've tried to get him to put it under water, but....he's a wuss....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> The GM200s are different....they usually don't overclock nearly as well as the GM204-based GPUs. Yeah, that 980Ti is a monster. 1585 is VERY good for a GM200.
> 
> The Classys and KPEs are really poor performers on air and on water. Most of the G1 980 / 980 FTW / SSC / SC guys I have helped, even on air, are hitting 1550+...most of them are mid-70% ASICs, which is decent, but not golden by any means. My brother's 980 FTW, which has an ASIC quality of 83.4%, hits 1610 in FS on air. I've tried to get him to put it under water, but....he's a wuss....


Just reminds how much I like tinkering with different cards. I don't really have any brand allegiance, as both have strengths and weaknesses. Usually NVIDIA in the way of performance, and AMD in the way of price.

My 390X was $360 after rebate, and came with a decent gaming mouse, and a Hitman code ($60), so that was a pretty good value for me.

I'm going to dump the BIOS tonight and see what mus comes up with for me


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just reminds how much I like tinkering with different cards. I don't really have any brand allegiance, as both have strengths and weaknesses. Usually NVIDIA in the way of performance, and AMD in the way of price.
> 
> My 390X was $360 after rebate, and came with a decent gaming mouse, and a Hitman code ($60), so that was a pretty good value for me.
> 
> I'm going to dump the BIOS tonight and see what mus comes up with for me


I'm pretty impressed with the 290X I ended up with. Cost about the same as the 970s I had purchased, and performs a LOT better.

Next up: chiller for my loop. lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> I'm pretty impressed with the 290X I ended up with. Cost about the same as the 970s I had purchased, and performs a LOT better.
> 
> Next up: chiller for my loop. lol


Chilled water could have a huge impact on a 290!!! Especially if it's a core that actually responds to juice (because they all respond to low temps)

I'm ready to tinker with this BIOS some.... My hope is that it will still do the 1750 clock speed, even with the tighter timings.
I think I'm going to do a fresh TIM job on it tonight too. I know those factory jobs are usually goobered up pretty badly, and I have a fresh tube of Noctua just sitting around anyway.

I'll be doing a custom loop on my next build (probably a 5820k, or Zen if it's good) and probably use AMD's next GPU offering with 8GB of HBM2.


----------



## Vellinious

Aye...I keep the ambients in my pc room down around 16c usually when I'm doing my oc runs, to keep the coolant temps below 20c. I figure, with as low as the humidity level is in that room, I should be able to chill the loop down to around 5c without seeing much condensation....I need to do some math on it, but....I'd likely insulate the board anyway, just to be sure.

I have a small dorm fridge I've been toying with. I put an internal air temp sensor in it, and can keep the internal temp right at 5c. I have a bunch of soft copper tubing left over from a construction project, so I thought I'd coil that as tightly as I could, in layers, on the interior of the mini-fridge...the copper should transfer heat / cold really well, and with the MO-RA3 removing most of the heat from the water before it goes back to the fridge, it shouldn't have to work very hard to chill the coolant back down. Or....get a bigger refrigerator and put the radiator in it....

Haven't decided on the final design yet, but.....it's in the works. Should probably have something ready by this summer. = D
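The condensation math mentioned above is basically a dew-point check: the coolant is safe as long as it stays a bit above the dew point of the room air. A quick sketch using the Magnus approximation (the constants are one common published fit; the function name is mine):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Estimate the dew point (C) with the Magnus approximation."""
    b, c = 17.62, 243.12  # one common fit, fine for ordinary room conditions
    gamma = math.log(rel_humidity_pct / 100.0) + (b * temp_c) / (c + temp_c)
    return c * gamma / (b - gamma)

# A 16 C room at 30% RH has a dew point around -1.6 C, so a 5 C loop
# should stay dry; at 60% RH the dew point rises to about 8 C and the
# same loop would sweat.
print(round(dew_point_c(16, 30), 1))
print(round(dew_point_c(16, 60), 1))
```

So in a dry, chilly benching room the 5c target looks safe, but it's worth rechecking whenever the humidity climbs.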


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> Aye...I keep the ambients in my pc room down around 16c usually when I'm doing my oc runs, to keep the coolant temps below 20c. I figure, with as low as the humidity level is in that room, I should be able to chill the loop down to around 5c without seeing much condensation....I need to do some math on it, but....I'd likely insulate the board anyway, just to be sure.
> 
> I have a small dorm fridge I've been toying with. I put an internal air temp sensor in it, and can keep the internal temp right at 5c. I have a bunch of soft copper tubing left over from a construction project, so thought I'd coil that as tightly as I could, and in layers, on the interior of the minifridge...the copper should transfer heat / cold really well, and with the MO RA3 removing most of the heat from the water before it goes back to the fridge, it shouldn't have to work very hard to chill the coolant back down. Or....get a bigger refrigerator and put the radiator in it....
> 
> Haven't decided on the final design yet, but.....it's in the works. Should probably have something ready by this summer. = D


When my brother and I were younger, around 2005 (he was I believe 14, and I was around 20), we rigged up a window-unit AC with duct tape and flex pipe to blow right into the 120mm side intake fan on an NZXT Nemesis Elite build we had done, and then had another flex tube coming out the back 120mm, running into a dehumidifier. We were already running a 7950GT with a volt pen mod at GTX+++ clocks, plus some BIOS mods to tighten memory timings, and had the nicest air cooler at the time, a Zalman Fatal1ty something... something..... needless to say, we tortured that card. Some of the funniest ghetto-rigged cooling you've ever seen, for sure!!! haha

Back then custom water cooling could cost you a FORTUNE, and you NEVER saw it on GPUs without custom-made blocks. Crazy how much things have changed!

Sorry for the trip down memory lane; you talking about using a small dorm fridge to cool your loop just took me back!! lol


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> When my brother and I were younger, around 2005 (he was I believe 14, and I was around 20), we rigged up a window-unit AC with duct tape and flex pipe to blow right into the 120mm side intake fan on an NZXT Nemesis Elite build we had done, and then had another flex tube coming out the back 120mm, running into a dehumidifier. We were already running a 7950GT with a volt pen mod at GTX+++ clocks, plus some BIOS mods to tighten memory timings, and had the nicest air cooler at the time, a Zalman Fatal1ty something... something..... needless to say, we tortured that card. Some of the funniest ghetto-rigged cooling you've ever seen, for sure!!! haha
> 
> Back then custom water cooling could cost you a FORTUNE, and you NEVER saw it on GPUs without custom-made blocks. Crazy how much things have changed!
> 
> Sorry for the trip down memory lane; you talking about using a small dorm fridge to cool your loop just took me back!! lol


If you can hit 1200 on air, get that thing under water and really see what it can do. Lowering the VRM temps makes a big difference, from what I've seen myself and on these boards. It just makes the whole hobby more fun. I'm still waiting on some parts to get my loop all done on the 390X. Can't wait to get er' finished and see what it can put out in Heaven. I'm sure it won't break any records, but it's just so much fun seeing how much more over stock these things can do in the right conditions. Chillers are very tempting to me... but I'm not ready yet.


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> When my brother and I were younger, around 2005 (he was I believe 14, and I was around 20), we rigged up a window-unit AC with duct tape and flex pipe to blow right into the 120mm side intake fan on an NZXT Nemesis Elite build we had done, and then had another flex tube coming out the back 120mm, running into a dehumidifier. We were already running a 7950GT with a volt pen mod at GTX+++ clocks, plus some BIOS mods to tighten memory timings, and had the nicest air cooler at the time, a Zalman Fatal1ty something... something..... needless to say, we tortured that card. Some of the funniest ghetto-rigged cooling you've ever seen, for sure!!! haha
> 
> Back then custom water cooling could cost you a FORTUNE, and you NEVER saw it on GPUs without custom-made blocks. Crazy how much things have changed!
> 
> Sorry for the trip down memory lane; you talking about using a small dorm fridge to cool your loop just took me back!! lol


I've already got it fitted to the base for my case. The big silver box in the pic is the perfect size to fit it right inside. = )


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Vellinious*
> 
> I've already got it fitted to the base for my case. The big silver box in the pic is the perfect size to fit it right inside. = )






Two X9's... so tempting. I have only one...


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> If you can hit 1200 on air, get that thing under water and really see what it can do. Lowering the VRM temps makes a big difference, from what I've seen myself and on these boards. It just makes the whole hobby more fun. I'm still waiting on some parts to get my loop all done on the 390X. Can't wait to get er' finished and see what it can put out in Heaven. I'm sure it won't break any records, but it's just so much fun seeing how much more over stock these things can do in the right conditions. Chillers are very tempting to me... but I'm not ready yet.


What's impressive is how well MSI did on the VRM cooling with these on air. 57C under 4K load with +60mv last night...... pretty impressive

I would like to give water a go since EK has blocks for these now, but this thing is at the end of its life cycle for my purposes, and I'm probably 6 months out from a different card (at which point I definitely do want to water cool).


----------



## sil130

Hey guys, what are the safe VRM temps for the MSI R9 390? Mine goes up to 82C in the Heaven benchmark. 1120 core / 1625 mem @ +45 mv. Is that good?


----------



## Agent Smith1984

Quote:


> Originally Posted by *sil130*
> 
> Hey guys, what are the safe VRM temps for the MSI R9 390? Mine goes up to 82C in the Heaven benchmark. 1120 core / 1625 mem @ +45 mv. Is that good?


Not great, but plenty safe.


----------



## Renner

Lol, you should try playing Rise of the Tomb Raider. It's the only game that's ever broken the 90-degree milestone on my MSI 390. At 1080p... Before that, the max I ever saw was like 80-82 in Batman: Arkham Knight while playing it maxed out at 3200x1800. ROTTR runs great despite everything, though, and it doesn't seem to be a problem for my system, although I'm aware it would be if I'd kept my stock CPU cooler.


----------



## CamsX

Hi guys, been a while since I posted. Any recommendations for software-related fan issues? I'm pretty sure it all started with the Crimson drivers. I heard there were issues, but didn't notice any with the first Crimson version.

On 1 out of 3 boots, the fans would stay stuck at 28% speed and wouldn't pick up the custom fan profile in Afterburner. It would hit 80+°C when I wasn't paying attention.

I had to uninstall Afterburner and go with crappy Trixx, but now the fan spikes to 100% according to my sensors. Very weird. Hope there's a fix in the drivers released yesterday.


----------



## Agent Smith1984

Okay guys, here is the MAX flicker/artifact-free, 1-hour game-stable clock I have achieved....

1195/1750 @ +88mv/+50mv

http://www.3dmark.com/3dm/10949988

1200+ will run pretty well, but it gets a little flicker here and there in FS, and going to +100mv makes it worse. I imagine it would do really nicely on water....

The VRAM doesn't like anything over 1750.... going to 1760 causes a 100-point drop from ECC, and going to 1740 was a 20-point or so drop, so I just left it at 1750.

Here is the BIOS for anyone wanting to help with a nice modded BIOS







@mus1mus









MSI390x.zip 97k .zip file


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay guys, here is the MAX flicker/artifact-free, 1-hour game-stable clock I have achieved....
> 
> 1195/1750 @ +88mv/+50mv
> 
> http://www.3dmark.com/3dm/10949988
> 
> 1200+ will run pretty well, but it gets a little flicker here and there in FS, and going to +100mv makes it worse. I imagine it would do really nicely on water....
> 
> The VRAM doesn't like anything over 1750.... going to 1760 causes a 100-point drop from ECC, and going to 1740 was a 20-point or so drop, so I just left it at 1750.
> 
> Here is the BIOS for anyone wanting to help with a nice modded BIOS
> 
> 
> 
> 
> 
> 
> 
> @mus1mus
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI390x.zip 97k .zip file


The reason it does better at 1750 is because of how the timing straps are written. 1750 is the highest you can go on those timings; hitting 1760 puts you into the next strap, with slower timings, so it scores worse even though 10MHz was added. I believe the next strap's timings top out at 1875MHz. Many of us here just run the max clock for each set of timings, so 1500, 1625, 1750, etc. If it can't hit those, you just go down to the next strap and call it a day.
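That "run the top of the strap" rule can be sketched as a simple lookup. The boundary list below covers only the straps mentioned in this thread (a real Hawaii BIOS defines more of them at lower frequencies), so treat it as an illustration, not the full table:

```python
import bisect

# Top frequency (MHz) of each timing strap; only the straps named in this
# thread are listed here.
STRAP_CEILINGS = [1500, 1625, 1750, 1875]

def strap_ceiling(mem_clock_mhz):
    """Return the highest clock that still uses the same timings as mem_clock_mhz.

    Anything past a ceiling falls into the next, looser strap, which is why
    1760 can score worse than 1750 despite the extra 10 MHz.
    """
    i = bisect.bisect_left(STRAP_CEILINGS, mem_clock_mhz)
    if i == len(STRAP_CEILINGS):
        raise ValueError("clock above the highest strap listed here")
    return STRAP_CEILINGS[i]

print(strap_ceiling(1750))  # 1750: last MHz on its strap
print(strap_ceiling(1760))  # 1875: next strap, looser timings
```

If your card can't hold a ceiling, dropping to the previous one (e.g. 1750 down to 1625) usually scores better than a middle-of-the-strap clock.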


----------



## TheCowTamer

Here's my best score with my XFX 390X at 1200 on the core and 1700 on the memory. Just slight artifacting, with Sapphire TriXX running a 50% power limit and +75mv extra. Max temp in the benchmark was 79 on the core and 71 on the VRMs.
http://www.3dmark.com/fs/7690886
Seems low compared to others' scores; what's your guys' opinion?


----------



## m70b1jr

Can anyone link me some info on BIOS editing my XFX DD R9 390? It's liquid cooled btw.


----------



## Vellinious

I can't make it go any higher... Fyzzz's 290 is just all balls.



EDIT: I lied...I made it go higher


----------



## battleaxe

Quote:


> Originally Posted by *m70b1jr*
> 
> Can anyone link me some info on BIOS editing my XFX DD R9 390? It's liquid cooled btw.


Here's the one I used: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x

I would suggest using AMDflash instead of Winflash, though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> The reason it does better at 1750 is because of how the timing straps are written. 1750 is the highest you can go on those timings, hitting 1760 puts into the next strap and also at slower timings and thus scores worse as only 10mhz is added. I believe the next max timings are at 1875mhz. Many of us here just run the max strap for set timings. So 1500, 1625, 1750 etc. If it can't hit those you just go down to the next strap and call it a day.


Now I want to try some tighter timings


----------



## Vellinious

Mine doesn't run all that well at 1750.....I have to go higher to get good bench scores


----------



## battleaxe

Quote:


> Originally Posted by *Vellinious*
> 
> Mine doesn't run all that well at 1750.....I have to go higher to get good bench scores


That's weird, because 1750 is the last MHz on that strap. What do you run at?


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Vellinious*
> 
> I've already got it fitted to the base for my case. The big silver box in the pic is the perfect size to fit it right inside. = )






So you've got the dirty box in the basement and clean box up top. Nice. Dirty bugger.


----------



## Vellinious

Quote:


> Originally Posted by *battleaxe*
> 
> 
> So you've got the dirty box in the basement and clean box up top. Nice. Dirty bugger.


It's cooler down here. If I turn the space heaters off while I'm at work, it usually cools down to around 14c in here.....good for benching. It was 13.3c when I got home today....so I put the boots to it.








With tess tweaks, but.....I'll take it

http://www.3dmark.com/fs/7691621


----------



## roflcopter1654

This seems... a little low. I'm on stock clocks with an MSI 390 and the Crimson 16.1.1 hotfix, but that graphics score is way below what the 390 is supposed to pull in 3DMark 11. Not sure what's going on here, especially since I haven't noticed any notable performance issues in games. Is that just Crimson ******* up everything?


----------



## roflcopter1654

Well, hell. Somehow Crimson was capping my clock at 520MHz. Rolled back to Catalyst and this is what I got.


----------



## tolis626

Quote:


> Originally Posted by *roflcopter1654*
> 
> Well, hell. Somehow Crimson was capping my clock at 520MHz. Rolled back to catalyst and this is what i got.


It's most probably not capping your clocks. I've noticed similar behavior on mine, and it turns out Crimson just caps my framerate to 55 FPS every time I fire up 3DMark (and only 3DMark; fortunately it doesn't do it with games). If I go in and disable frame rate control, it goes back to normal. So I say reinstall Crimson and see if that's your problem. Unless you don't want to deal with that either, in which case stick to whatever works for you.


----------



## TsukikoChan

Quote:


> Originally Posted by *tolis626*
> 
> It's most probably not capping your clocks. I've noticed similar behavior on mine, and it turns out Crimson just caps my framerate to 55 FPS every time I fire up 3DMark (and only 3DMark; fortunately it doesn't do it with games). If I go in and disable frame rate control, it goes back to normal. So I say reinstall Crimson and see if that's your problem. Unless you don't want to deal with that either, in which case stick to whatever works for you.


Yeah, I recently upgraded to Win10 and reinstalled Crimson, and it started to apply a framerate cap to everything even though I only set it for one game :< Crimson gets confused sometimes.

Quote:


> Originally Posted by *CamsX*
> 
> Hi guys, been a while since I posted. Any recommendations for software-related fan issues? I'm pretty sure it all started with the Crimson drivers. I heard there were issues, but didn't notice any with the first Crimson version.
> 
> On 1 out of 3 boots, the fans would stay stuck at 28% speed and wouldn't pick up the custom fan profile in Afterburner. It would hit 80+°C when I didn't notice.
> 
> I had to uninstall Afterburner and go with crappy TriXX, but now the fan spikes to 100% according to my sensors. Very weird. Hope there's a fix in the drivers released yesterday.


Use DDU to wipe all AMD drivers and reinstall the latest Crimson; it could be a Catalyst leftover causing interference.
I use TriXX myself and I sometimes still see the 100% spikes. It's basically Crimson and TriXX interfering with each other: if you watch temps, you'll see it report something like 800°C for a millisecond, which makes the system freak out and set 100% fan for a few seconds before settling back into the curve.
I've found that if I don't even touch the Overdrive section of Crimson (touching it enables it, to my knowledge, essentially creating both Crimson Overdrive and TriXX Overdrive settings) and use TriXX solely for my OC, I don't see the 100% fan spikes.

One thing to watch out for: Crimson loads before TriXX (in Win7 at least), so if you leave a TriXX OC profile running on shutdown (for me, 1130/1600), on restart I would get either 1050/1600 or 1130/1600 (depending on whether Crimson woke up on the wrong side of the bed), and this would play havoc with fan and OC settings. I highly recommend resetting your core/mem speeds before any system restart so Crimson behaves, and only loading and playing with profiles in TriXX after your system has fully restarted.

Crimson is nice, but to my knowledge it still doesn't play well with TriXX or Afterburner for Overdrive/OC.
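The glitch-driven fan spike described above (a one-sample 800°C reading slamming the fan to 100%) can be sidestepped in software by discarding implausible sensor samples before they reach the fan curve. A minimal sketch, with a made-up plausibility threshold and a toy linear fan curve; this is not any real TriXX or Crimson API:

```python
def filtered_fan_duty(readings, max_plausible_c=120):
    """Map the latest plausible temperature sample to a fan duty (%).

    Samples above max_plausible_c are treated as sensor glitches and
    skipped, falling back to the previous good reading.
    """
    last_good = None
    for t in readings:
        if t <= max_plausible_c:
            last_good = t
    if last_good is None:
        return 100  # no trustworthy sample: fail safe, full fan

    # Toy linear curve: 30% duty at 40°C ramping to 100% at 90°C.
    duty = 30 + (last_good - 40) * (70 / 50)
    return max(30, min(100, round(duty)))

print(filtered_fan_duty([62, 65, 800, 66]))  # the 800°C glitch is ignored
```

The key point is that a single absurd sample never reaches the curve, so the fan ramps on real temperature trends only, while a stream of nothing but bad samples still fails safe at 100%.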


----------



## Lixxon

These are considered artifacts, yes?


http://imgur.com/cg7YRkL


It only happened once; I ran the GTA 5 benchmark a few times with no other issues.

Can it be considered just a bug, or...?


----------



## Stige

Artifacts. Could be VRM temps climbing slowly like mine do.

I get zero artifacts until my VRM temps hit around 75°C; then I see a flicker like that once every few minutes.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Artifacts. Could be VRM temps climbing slowly like mine do.
> 
> I get zero artifacts until my VRM temps hit around 75°C; then I see a flicker like that once every few minutes.


Realized the same thing last night.

Instead of 45 minutes to an hour of GTA V, I stuck my son on the box to test it out for several hours, and eventually it gets to cooking pretty good.

Core maxes out at around 78°C after three hours, and VRM is around 72°C @ 1175/1750 (60mV/50mV aux), which is probably what I will run daily.... going up to 88mV for 20MHz isn't worth the power and heat.

That EK block is seeming more and more tempting, especially with my consideration for crossfire


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Realized the same thing last night.
> 
> Instead of 45 minutes to an hour of GTA V, I stuck my son on the box to test it out for several hours, and eventually it gets to cooking pretty good.
> 
> Core maxes out at around 78°C after three hours, and VRM is around 72°C @ 1175/1750 (60mV/50mV aux), which is probably what I will run daily.... going up to 88mV for 20MHz isn't worth the power and heat.
> 
> That EK block is seeming more and more tempting, especially with my consideration for crossfire


If Aquacomputer makes a block for it....I'd highly recommend them, with a backplate. They are absolutely amazing.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> If Aquacomputer makes a block for it....I'd highly recommend them, with a backplate. They are absolutely amazing.


The only thing available as far as I know, is the Alphacool hybrid water block, and now apparently, EK has a block (great job EK, you heard the MSI owner's cries!!).

I am thinking about getting an open loop AIO 120mm, and slapping them together. Wonder if I can still use MY backplate with the EK block?

I know so little about custom water cooling. I don't even know where to start, really, in regards to what barbs and fittings to use, etc.... but I have been reading some good information on the EK website.

I am only considering an AIO because I have very little room for a pump and res in my small S340 case, though I have seen some make it work..... that's why I think I want to wait until the next build before going custom water. That machine will be a full tower.


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The only thing available as far as I know, is the Alphacool hybrid water block, and now apparently, EK has a block (great job EK, you heard the MSI owner's cries!!).
> 
> I am thinking about getting an open loop AIO 120mm, and slapping them together. Wonder if I can still use MY backplate with the EK block?
> 
> I know so little about custom water cooling. I don't even know where to start, really, in regards to what barbs and fittings to use, etc.... but I have been reading some good information on the EK website.
> 
> I am only considering an AIO because I have very little room for a pump and res in my small S340 case, though I have seen some make it work..... that's why I think I want to wait until the next build before going custom water. That machine will be a full tower.


The EK kits at least have a decent pump. The DDC has great head pressure, but they don't offer the flow that the D5s have. Still, they're a decent little pump, and those expandable AIOs aren't terrible.

If it were me though, I'd wait.....if your next build is going to be any time soon, and you want to do a full custom loop, you'll regret the AIO purchase.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> The EK kits at least have a decent pump. The DDC has great head pressure, but they don't offer the flow that the D5s have. Still, they're a decent little pump, and those expandable AIOs aren't terrible.
> 
> If it were me though, I'd wait.....if your next build is going to be any time soon, and you want to do a full custom loop, you'll regret the AIO purchase.


Good to know!! I may roll with the L120 kit, since it comes with their approved fasteners and has an external pump/res, but one small enough to fit in my case.

EK seems to be really doing a lot in the way of educating people on water cooling now, versus a more passive, "here are our blocks and products, order if you know what you need".....type of approach.

Probably going to delve deep into the S340 thread and see just how far I can go. I want to work the CPU and (2) GPU's into a 3 rad setup to be honest. (1) 240mm, and (2) 120mm..... all operating out of this tiny little case, but that may be a while.. lol


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good to know!! I may roll with the L120 kit, since it comes with their approved fasteners and has an external pump/res, but one small enough to fit in my case.
> 
> EK seems to be really doing a lot in the way of educating people on water cooling now, versus a more passive, "here are our blocks and products, order if you know what you need".....type of approach.
> 
> Probably going to delve deep into the S340 thread and see just how far I can go. I want to work the CPU and (2) GPU's into a 3 rad setup to be honest. (1) 240mm, and (2) 120mm..... all operating out of this tiny little case, but that may be a while.. lol


The DDC would be a good pump for that kind of setup. You're looking at quite a bit of restriction with 2 GPU / 1 CPU and 3 rads. That's where the DDC shines. That and it's physically smaller than the D5. I prefer D5s myself, but....for your setup the DDC would be great. I tend to lean toward a dual D5 setup for highly restrictive loops, but.....


----------



## rdr09

Quote:


> Originally Posted by *Vellinious*
> 
> The DDC would be a good pump for that kind of setup. You're looking at quite a bit of restriction with 2 GPU / 1 CPU and 3 rads. That's where the DDC shines. That and it's physically smaller than the D5. I prefer D5s myself, but....for your setup the DDC would be great. I tend to lean toward a dual D5 setup for highly restrictive loops, but.....


I saw the 18K GScore. Great job!


----------



## mus1mus

18K is already around the stock 980TI territory.

Very sweet.

Agent, I won't be back til Monday your time. But will do it first thing then.


----------



## roflcopter1654

Quote:


> Originally Posted by *tolis626*
> 
> It's not capping your clocks, most probably. I've noticed similar behavior on mine and turns out that Crimson just caps my framerate to 55 FPS every time I fire up 3dMark (and only 3dMark, it doesn't do it with games fortunately) . If I go in and disable framerate control again it's also normal. So I say reinstall Crimson and see if that's your problem. Unless you don't want to deal with that either, in which case stick to whatever works for you.


No, it was legitimately capping my clock across all games. Afterburner was listing it at 520MHz. I didn't notice because I haven't played anything lately that wasn't either capped at 60 FPS already or that I run with vsync.

Side note: whoever optimized Metal Gear Rising: Revengeance is a literal god.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> 18K is already around the stock 980TI territory.
> 
> Very sweet.
> 
> Agent, I won't be back til Monday your time. But will do it first thing then.


Sounds good man, thanks much!!


----------



## Vellinious

Quote:


> Originally Posted by *rdr09*
> 
> I saw the 18K GScore. Great job!


I couldn't believe it actually did it....I ran it 3 more times just to be sure. lol


----------



## m70b1jr

What should be an average firestrike score with an overclocked R9 390, and an overclocked FX-8350?


----------



## Agent Smith1984

Quote:


> Originally Posted by *m70b1jr*
> 
> What should be an average firestrike score with an overclocked R9 390, and an overclocked FX-8350?


Between 10,000 and 10,800 probably.....

How high of overclocks are we talking?

My FX @ 5GHz full tilt gets a physics score of around 9650 with AMD cards (for some reason that jumps to 9800 when using NVIDIA 980)

The graphics score for a NICELY overclocked 390 will be around 14,200-14,400 points. That's with clocks in the 1200/1750 range and STOCK BIOS on crimson driver.


----------



## m70b1jr

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Between 10,000 and 10,800 probably.....
> 
> How high of overclocks are we talking?
> 
> My FX @ 5GHz full tilt gets a physics score of around 9650 with AMD cards (for some reason that jumps to 9800 when using NVIDIA 980)
> 
> The graphics score for a NICELY overclocked 390 will be around 14,200-14,400 points. That's with clocks in the 1200/1750 range and STOCK BIOS on crimson driver.


Mine is like 9,800 on average... I've never broken 10,000...


----------



## Stige

My 3570K and R9 390 gets ~11600 with my daily clocks, can't run firestrike with the clocks I ran valley with (1268 or something on Core).


----------



## Agent Smith1984

Quote:


> Originally Posted by *m70b1jr*
> 
> Mine is like 9,800 on average... I've never broken 10,000...


I can bench at 5.2, which puts me in the 10,100 or so range, I believe? Can't remember.

May do a suicide run this evening on the CPU and the card and see how she does.

With the massive gimping that Firestrike does to AMD CPUs in the combined test, it doesn't matter how well you do on the GPU/CPU tests, because that one test ruins the whole overall score.....


----------



## Agent Smith1984

Here's what I mean..... look at my run on the left trouncing the 4.6GHz i5 system not only in graphics score (obviously, with my higher overclock), but also beating its CPU score by 500 points, yet the overall scores are identical!!

http://www.3dmark.com/compare/fs/7690020/fs/5548393

The combined test unfairly handicaps FX CPUs by only using 4 cores.... but whatever, I just sound like a bitter red teamer I guess


----------



## Stige

Well AMD CPUs are inferior, everyone knows that really.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Well AMD CPUs are inferior, everyone knows that really.


The differences in real-world use are minuscule when pitting an overclocked FX against an overclocked i5, though. Tons of stuff to back that statement up, but I won't have that discussion here, lol. Sure, Intel IPC has been much better than AMD's for years now, but overall performance of competing chips (FX-9590 vs i5-4690K, for example) is very close in most scenarios.


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The differences in real-world use are minuscule when pitting an overclocked FX against an overclocked i5, though. Tons of stuff to back that statement up, but I won't have that discussion here, lol. Sure, Intel IPC has been much better than AMD's for years now, but overall performance of competing chips (FX-9590 vs i5-4690K, for example) is very close in most scenarios.


The 9590 is clearly superior to any Intel CPU if you're living in a cold climate. Can any Intel CPU be used for space heating? No. The 9590 can do it. So you've got that going for you, which is nice.









All kidding aside, when using all 8 threads it's a more than capable CPU. If it had a more modern feature set it would be holding up quite well, really, especially for the price.

Other than that... I tried reading a bit into BIOS modding for these cards and, god damn, there's a lot to read into. I don't know, I hoped there would be something like "here is the modded BIOS for your MSI 390X with tighter timings that allows higher voltage, flash it and you're ready to go", but it's never that simple, is it?


----------



## jdorje

Quote:


> Originally Posted by *Lixxon*
> 
> This is considered artifacts yes?
> 
> 
> http://imgur.com/cg7YRkL
> 
> 
> It only happens once, was running a few gta 5 benchmarks few times no issues
> 
> can it be considered a just a bug or..?


Those are artifacts caused by an unstable core. You'll see them in GTA V eventually, but Firestrike is a little more stressful.


----------



## Vellinious

Quote:


> Originally Posted by *tolis626*
> 
> The 9590 is clearly superior to any Intel CPU if you're living in a cold climate. Can any Intel CPU be used for space heating? No. The 9590 can do it. So you've got that going for you, which is nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All kidding aside, when using all 8 threads it's a more than capable CPU. If it had a more modern feature set it would be holding up quite well, really, especially for the price.
> 
> Other than that... I tried reading a bit into BIOS modding for these cards and, god damn, there's a lot to read into. I don't know, I hoped there would be something like "here is the modded BIOS for your MSI 390x with tighter timings that allows higher voltage, flash it and you're ready to go", but it's never that simple, is it?


There's PT1 and PT2, but I'm not sure if they have memory mods or not.....I think they're just voltage. You'd have to ask someone who knows more than I do. I know I had one made for me with better voltage and tighter timings... still working on it to reduce vdroop, but it's gotten SO much better.


----------



## tolis626

Quote:


> Originally Posted by *Vellinious*
> 
> There's PT1 and PT2, but I'm not sure if they have memory mods or not.....I think they're just voltage. Would have to ask someone that knows more than I. I know I had one made for me with better voltage and tighter timings...still working on it, to reduce vdroop, but, it's gotten SO much better.


Well, I will read into it more. Even if I have something made for me by someone else, I want to know what it does and how it works. I'm just a wee bit afraid to do stuff myself at this point. I guess you understand.

Did you see any real-world performance benefit from the modded BIOS? I'm kinda curious how mine will behave, as 1750MHz is unattainable on my memory, so I was thinking of applying the same tight timings across all straps and then taking it as high as it will go. Right now it craps out at 1725MHz half-stable. 1700MHz is doable for gaming, it seems.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> Well, I will read into it more. Even if I have something made for me by someone else, I want to know what it does and how it works. I'm just a wee bit afraid to do stuff myself at this point. I guess you understand.
> 
> Did you see any real-world performance benefit from the modded BIOS? I'm kinda curious how mine will behave, as 1750MHz is unattainable on my memory, so I was thinking of applying the same tight timings across all straps and then taking it as high as it will go. Right now it craps out at 1725MHz half-stable. 1700MHz is doable for gaming, it seems.


Have you benched at 1625 and 1700 to see if the lower strap scores higher?


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The differences in real-world use are minuscule when pitting an overclocked FX against an overclocked i5, though. Tons of stuff to back that statement up, but I won't have that discussion here, lol. Sure, Intel IPC has been much better than AMD's for years now, but overall performance of competing chips (FX-9590 vs i5-4690K, for example) is very close in most scenarios.


Even a Skylake i3 is faster in games than any AMD CPU lol, now that is VALUE FOR MONEY.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Even a Skylake i3 is faster in games than any AMD CPU lol, now that is VALUE FOR MONEY.


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Have you benched at 1625 and 1700 to see if the lower strap scores higher?


Yup. Here is a 1175/1625MHz run and here is a 1175/1700MHz run. As usual, higher memory clocks score 50-150 points more for me. What's strange is that I got some slight artifacting in both of these tests, but they ran fine. My card seems to be losing OC potential over time, damn it.









Max temps were 70C, so it might improve with lower temps, but it's not bad.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> Yup. Here is a 1175/1625MHz run and here is a 1175/1700MHz run. As usual, higher memory clocks score 50-150 points more for me. What's strange is that I got some slight artifacting in both of these tests, but they ran fine. My card seems to be losing OC potential over time, damn it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Max temps were 70C, so it might improve with lower temps, but it's not bad.


What kind of voltage offset are you using for core/AUX? What are your VRM temps?


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What kind of voltage offset are you using for core/AUX? What are your VRM temps?


You're catching me off guard, I didn't even open GPU-z.









I used +100/+50mV. VRM temps are on average 5-7°C below the core, and that holds pretty consistently when benchmarking; in prolonged gaming sessions they can climb a bit higher than that.

Strange thing is I could do 1175MHz just fine. Why it would artifact now is beyond me. Damn it. I'll also try less voltage, but I'm kind of giving up on this chip.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> You're catching me off guard, I didn't even open GPU-z.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used +100/+50mV. VRM temps are on average 5-7°C below the core, and that holds pretty consistently when benchmarking; in prolonged gaming sessions they can climb a bit higher than that.
> 
> Strange thing is I could do 1175MHz just fine. Why it would artifact now is beyond me. Damn it. I'll also try less voltage, but I'm kind of giving up on this chip.


I went through that with my previous MSI 390 (and maybe with this new 390x)

I ran at 1200 core, +90mV for about a week, then I'm playing Crysis 3 one day and start getting some artifacts. I saw that my temps were 3°C higher than normal, then I looked at the thermostat on the wall and saw my living room was about 78°F (I normally keep an indoor temp of around 72°F). I cut the AC and tried again 10 minutes later, after the room had settled to a more comfy 74°F, and the card did fine. When running Hawaii on air, the littlest amounts can count. That's the beauty of water cooling, I guess.


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I went through that with my previous MSI 390 (and maybe with this new 390x)
> 
> I ran at 1200 core, +90mV for about a week, then I'm playing Crysis 3 one day and start getting some artifacts. I saw that my temps were 3°C higher than normal, then I looked at the thermostat on the wall and saw my living room was about 78°F (I normally keep an indoor temp of around 72°F). I cut the AC and tried again 10 minutes later, after the room had settled to a more comfy 74°F, and the card did fine. When running Hawaii on air, the littlest amounts can count. That's the beauty of water cooling, I guess.


Although it's certainly a possibility, I don't think it's temps in my case. I mean, when I got this card and started overclocking it, it was the end of September or so. The weather stayed kind of hot until November (like 25-30°C ambients or more), but I had no problem overclocking. Sure, it would climb to over 80°C most of the time, but it worked. At some point it stopped doing so and I don't know why. Drivers? Something I did? I have no idea. And right now I'm saying I want to upgrade to Polaris come fall, but I don't even know if I'll have the money, so I may be stuck with this card for a while. Fixing its quirks is important, but it's not doing me any favors.


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> Mines is like 9,800 on average.. I've never broken 10,000....


You will need 5GHz+ on the CPU and 1100+ on the GPU to get to 10K.

An i3 beating an FX 8-core is unsubstantiated hearsay.


----------



## lanc3lot

A little help about Sapphire.

At the moment, my PSU is the following: http://archive.benchmarkreviews.com/index.php?option=com_content&task=view&id=26&Itemid=1&limit=1&limitstart=1 - Apparently I can't power the Sapphire with its connectors. If I use something like a PCIe 6-pin to 8-pin adapter cable, do you believe I'll be OK? I mean, will I be able to use the card at its full 100%, or will I have losses?

Thanks in advance. I assume the best way would be to also change my PSU, but at the moment, with the budget I have, that will be a problem. So the question is whether it will be OK or, if not, I'll go for the MSI.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> You will need 5GHz + on the CPU and 1100+ on the GPU to get to 10K.
> 
> An i3 beating an FX 8 Core is an unsolicited hersay.


Here you go fanboi 




AMD CPUs are simply crap with the new Skylakes out.
The FX really gets demolished by the i3, which costs less.


----------



## CamsX

Quote:


> Originally Posted by *TsukikoChan*
> 
> Yeah, I recently upgraded to Win10 and reinstalled Crimson, and it started to apply a framerate cap to everything even though I only set it for one game :< Crimson gets confused sometimes.
> Use DDU to wipe all AMD drivers and reinstall the latest Crimson; it could be a Catalyst leftover causing interference.
> I use TriXX myself and I sometimes still see the 100% spikes. It's basically Crimson and TriXX interfering with each other: if you watch temps, you'll see it report something like 800°C for a millisecond, which makes the system freak out and set 100% fan for a few seconds before settling back into the curve.
> I've found that if I don't even touch the Overdrive section of Crimson (touching it enables it, to my knowledge, essentially creating both Crimson Overdrive and TriXX Overdrive settings) and use TriXX solely for my OC, I don't see the 100% fan spikes.
> 
> One thing to watch out for: Crimson loads before TriXX (in Win7 at least), so if you leave a TriXX OC profile running on shutdown (for me, 1130/1600), on restart I would get either 1050/1600 or 1130/1600 (depending on whether Crimson woke up on the wrong side of the bed), and this would play havoc with fan and OC settings. I highly recommend resetting your core/mem speeds before any system restart so Crimson behaves, and only loading and playing with profiles in TriXX after your system has fully restarted.
> 
> Crimson is nice, but to my knowledge it still doesn't play well with TriXX or Afterburner for Overdrive/OC.


Thanks for the overview. I'll keep an eye on things from now on.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Here you go fanboi
> 
> 
> 
> 
> AMD CPUs are crap with the new Skylakes out, quite simply.
> The FX gets demolished by the i3 really, and costs less.


Benchmark-specific, and games that are poorly coded.

I can throw you my numbers; try to catch them with i3s.


----------



## kizwan

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tolis626*
> 
> The 9590 is clearly superior to any Intel CPU if you're living in a cold climate. Can any Intel CPU be used for space heating? No. The 9590 can do it. So you've got that going for you, which is nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All kidding aside, when using all 8 threads it's a more than capable CPU. If it had a more modern feature set it would be holding up quite well, really, especially for the price.
> 
> Other than that... I tried reading a bit into BIOS modding for these cards and, god damn, there's a lot to read into. I don't know, I hoped there would be something like "here is the modded BIOS for your MSI 390x with tighter timings that allows higher voltage, flash it and you're ready to go", but it's never that simple, is it?
> 
> 
> 
> There's PT1 and PT2 PT3, but I'm not sure if they have memory mods or not.....I think they're just voltage. Would have to ask someone that knows more than I. I know I had one made for me with better voltage and tighter timings...still working on it, to reduce vdroop, but, it's gotten SO much better.

Both PT1 & PT3 are voltage unlocked, meaning no limit in the BIOS. PT1 has vdroop while PT3 doesn't.


----------



## battleaxe

Picked up a 480 rad to add to the loop today. Can't wait to get this thing built. Just need to find the time.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Benchmark specific. And games that are poorly coded.
> 
> I can throw you my numbers and try to catch them with i3s.


I kinda wish I could afford a Skylake i3 for my second build, but I have to stick with the LGA1155 mobo I have on the shelf


----------



## simonfredette

I finally have GPU-Z working properly and displaying VRM temps; I was told they're the overclocking ceiling for these cards (MSI 390 on the stock air cooler). My overclock is 1200 core, 1700 mem at +100mV and +50% power limit, and it runs super clean through 3DMark and gaming. Temps max at 60°C on the core, and the VRMs max at 52°C and 55°C. This seems pretty low and would indicate a bunch of headroom left for overclocking, so what's next? An aux voltage bump and pushing the memory? How hot are we comfortable going?


----------



## Stige

Quote:


> Originally Posted by *simonfredette*
> 
> I finally have GPU-Z working properly and displaying VRM temps; I was told they're the overclocking ceiling for these cards (MSI 390 on the stock air cooler). My overclock is 1200 core, 1700 mem at +100mV and +50% power limit, and it runs super clean through 3DMark and gaming. Temps max at 60°C on the core, and the VRMs max at 52°C and 55°C. This seems pretty low and would indicate a bunch of headroom left for overclocking, so what's next? An aux voltage bump and pushing the memory? How hot are we comfortable going?


I call hax on those VRM temps, too good to be true! :l

BIOS mod for more voltage on the core; aux voltage is mostly useless, and memory overclocking gives no real benefit.


----------



## jodybdesigns

If I dump +100mV into my PowerColor PCS+ 390, I black screen instantly and have to hard shut down. I'm afraid to BIOS mod that high...


----------



## ROTOR

Can anyone upload the ASUS R9 390 DC2 BIOS?

Thanks!


----------



## tolis626

So, I decided to go play some BF4 (old loves die hard) and, it being the worst-behaved game I have when black screens are a problem, I gave my higher overclocks a shot. I tried to stick to 1170/1700MHz. I started off with +80/+25mV. I got some flickering after 5 minutes, so I increased aux voltage to +50mV and got a black screen within another 5 minutes. I then tried +90/+50mV and the same thing happened. Max temps through all of this were 75°C on the core and 73°C on the VRM. Then I tried +80mV on the core with no additional aux voltage, and it seems to work. I played through a whole round of BF4, about 15 minutes, with no problems that I noticed (although with BF4 needing my undivided attention to do well, it's quite possible I missed some artifacts). The strange thing was that my temps were 77°C max on the core and 75°C max on the VRM, so I went to check my actual voltage, and I think it stays high more consistently. It still bounces between 1.23V and 1.26V (plus change), but it rarely dipped to 1.23V and usually stayed in the 1.25V range with many spikes to 1.26V, whereas before it would mostly chill around 1.23-1.24V and spike to 1.25-1.26V.

So, nothing conclusive here, but I may be onto something.


----------



## Stige

Same results as the rest of us, you are getting crashes/artifacts because of temps, not because of lack of voltage. The lower your temps, the lower you can get your voltage. And higher temps = more voltage = more temps, a never ending cycle really.


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> Same results as the rest of us, you are getting crashes/artifacts because of temps, not because of lack of voltage. The lower your temps, the lower you can get your voltage. And higher temps = more voltage = more temps, a never ending cycle really.


Without the additional AUX voltage I got higher actual core voltage and higher temps, both on the core and on the VRM, so it wasn't temps. It wasn't much higher, just 2C, but it was higher nonetheless. That's what's strange.


----------



## simonfredette

I finally have GPU-Z working properly and displaying VRM temps; as I was told, they are the overclocking ceiling for these cards (MSI 390 on stock air cooler).
Quote:


> Originally Posted by *Stige*
> 
> I call hax vrm temps, too good to be true! :l
> 
> BIOS mod for more voltage on core, aux voltage is mostly useless and memory overclocking gives no real benefit.


Well, I'll run a benchmark, or at least Kombustor, and show you, but it is what it shows: a full run of 3DMark 11 with the VRMs only at 51-54C running 1200/1700. I'm not going to BIOS mod for sure, and if aux voltage and further memory overclocking aren't worth it, then I'm happy with the results I have; I just wanted to see if I could inch it a bit more. It's perfectly stable in games and 3DMark, artifacts a bit in Heaven, which is surprising, and the temps are more than reasonable. I think I'll work on overclocking the CPU and RAM a bit more and leave the GPU alone. It's a great card.


----------



## m70b1jr

Can someone tell me how to fix the table not displaying fully? It cuts off horizontally in Chrome and I can't see the rest.


----------



## m70b1jr

Hey guys, does anyone here have a recommendation of what BIOS I should flash on my XFX DD R9 390?

Also, could I flash an overclocked BIOS to my card and then still get MSI Afterburner to add another +100mV on top? Could I do that, or is it a bad idea?

I'm spending all of tonight trying to break the 10,000 mark in firestrike.


----------



## jdorje

Quote:


> Originally Posted by *m70b1jr*
> 
> Hey guys, does anyone here have a recommendation of what BIOS I should flash on my XFX DD R9 390?
> 
> Also, could I flash an overclocked BIOS to my card, then trick MSI to allow me to add another +100mv on the card? Could I do that or is it a bad idea?
> 
> I'm spending all of tonight trying to break the 10,000 mark in firestrike.


I can give you my modded bios in about 5 hours.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> I can give you my modded bios in about 5 hours.


Inb4 my pc explodes


----------



## jdorje

Well, I use stock 1225 mV core voltage. Raising voltage at all isn't that helpful and past the +100 limit of afterburner even less so. I do think my stock voltage is lower than the norm though.

The real benefit comes from changing the memory straps.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Well, I use stock 1225 mV core voltage. Raising voltage at all isn't that helpful and past the +100 limit of afterburner even less so. I do think my stock voltage is lower than the norm though.
> 
> The real benefit comes from changing the memory straps.


Can I get a brief explanation of memory straps?


----------



## jdorje

Memory timings (like CAS for DRAM) are controlled in the BIOS and can't be changed in Afterburner. As you bump the RAM speed past certain cutoffs, looser timings kick in; if you benchmark, you can see RAM at 1624 is faster than at 1626.

So you just use a hex editor to copy-paste the timings from the 1101-1225 range up to all your higher speed ranges. You can find videos on this easily, and there's not much customization since everyone ends up using that same strap.

It's a 2-3% performance boost basically for free.
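The strap-copying edit described above can be sketched in a few lines. This is a hedged illustration, not a real flashing tool: the offsets, strap length, and data below are made-up placeholders, and on a real Hawaii/Grenada BIOS you would first locate the strap table with a hex editor (and let the flashing tool fix the checksum).

```python
# Toy sketch of the "copy a tighter strap over a looser one" BIOS edit.
# All offsets and lengths here are HYPOTHETICAL placeholders; a real BIOS
# requires finding the actual strap table with a hex editor first.

def copy_strap(bios: bytearray, src_off: int, dst_off: int, length: int) -> None:
    """Overwrite the timing strap at dst_off with the bytes of the strap at src_off."""
    bios[dst_off:dst_off + length] = bios[src_off:src_off + length]

# Fake 32-byte "BIOS": strap A (0xAA) at offset 16, strap B (0xBB) at offset 24.
image = bytearray(b"\x00" * 16 + b"\xAA" * 8 + b"\xBB" * 8)
copy_strap(image, src_off=16, dst_off=24, length=8)  # reuse the tighter strap
assert image[24:32] == b"\xAA" * 8  # the higher-speed range now uses strap A
```

The one-liner slice assignment is the whole trick; everything hard about the real mod is in finding the right offsets.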


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Memory timings (like cas for dram) are controlled in the bios and can't be changed in afterburner. If you bump ram speed at certain cutoffs the timings will be raised. If you benchmark you can see ram at 1624 is faster than 1626.
> 
> So you just use a hex editor to copy paste the timings from 1101-1225 up to all your higher speed ranges. You can find videos on this easily but there's not much customization since everyone ends up using that same strap.
> 
> It's a 2-3% performance boost basically for free.


I'm trying to do some heavy overclocking on my R9 390 to break the 10,000 mark in Firestrike. My FX-8350 is at 4.5GHz. I've also checked whether some of my CUs could be unlocked, but sadly they can't. Also, my aftermarket AIO liquid cooler doesn't let me run clock speeds as high as my stock cooler did, not because of temps, but because of artifacting.


----------



## jdorje

You can't really compare 3dmark scores across different cpus. Better to just look at the graphics score. Even that might be cpu bound for all I know.

http://www.3dmark.com/fs/7199111

And one last thing: if you need more voltage with the aio cooler it's probably vrm temps. Try to get a decent heat sink on them.


----------



## m70b1jr

Well, I found the cause of my instabilities. I had heard that too much pressure on the PCB could cause artifacting, so I kept my screws REALLY loose. I tightened them down as much as I safely could and was able to get a stable 1150MHz overclock (+100mV) on the core. Not the 1200MHz I'd like to hit, though. However, my temps did not go above 65C on the core.

VRM temps were 80 and 89C, though...


----------



## m70b1jr

Well, I'm going to flash a BIOS onto my GPU, so hopefully I can add some extra voltage to it...

I really hope I don't brick my GPU. I'm hoping my GPU has a dual BIOS (the switch), or maybe the switch is just for unlocking BIOS flashing.


----------



## RicoDee

Just got done trying out my card on water: 1190/1700 at +50mV. I will play more, I'm sure of it.


----------



## rdr09

Quote:


> Originally Posted by *RicoDee*
> 
> Just got done trying out my card on water. 1190/1700 50mv. I will play more, Im sure of it .


Is that with tess off or on? I have to OC my 290 to 1300 core to get that graphics score. Nice.


----------



## m70b1jr

Quote:


> Originally Posted by *RicoDee*
> 
> 
> Just got done trying out my card on water. 1190/1700 50mv. I will play more, Im sure of it .


HOW?
I barely get 1150 on 100mv.


----------



## Irked

Quote:


> Originally Posted by *m70b1jr*
> 
> HOW?
> I barely get 1150 on 100mv.


On the 16.2 drivers I can't get more than 1125/1700 without black screening; it crashes as soon as I add power. Up until the newest drivers I got 1190/1700 at +80mV.


----------



## ROTOR

Quote:


> Originally Posted by *ROTOR*
> 
> Anyone can upload the ASUS R9 390 DC2 BIOS?
> 
> Thanks!


Please


----------



## jdorje

https://www.dropbox.com/sh/tdqjx4cxqaqjj29/AADExR9yNf2nU_KILOaXf4aGa?dl=0


----------



## Nameless1988

Hi everyone! I purchased my XFX 390 Black Edition a few days ago and I am very happy with this card! Good temperatures, great performance; my 390 runs at 1050MHz on the core. I tried OCing and got a stable 1120MHz (core) without any extra voltage, just +50% power limit. It currently runs with the new Crimson driver (16.2) and everything is fine. Dying Light, GTA V, MGS V and Fallout 4 all run very smoothly (I come from an Asus 770 DC2 with only 2GB, so it's a good upgrade).
My specs: i7-3770K (stock), Asus Z77-A, 16GB DDR3 (2400MHz), Samsung Evo 240 120GB, Cooler Master B600 (600W), Win10 Pro x64.
Here I post a benchmark, Unigine Heaven, with my card at stock clocks, only +50% power limit.

Edit: I am on my smartphone and am not able to upload the screenshot (don't know why). In the Heaven benchmark I got 58.4 fps average, which seems like a good score at stock clocks! Great card!

Sorry for my bad English, I am Italian.


----------



## m70b1jr

jdorje, can you tell me which of your BIOSes has the MOST voltage modification on it? I flashed your current.rom and was able to get 1160MHz stable, up from 1150MHz. I do want to say flashing this was extremely scary. I found it hard to breathe when I rebooted my PC, and even now my heart is still beating fast. But if you could tell me which BIOS I should use to get 1200MHz, that would be amazing!


----------



## m70b1jr

Sorry for the spam, but will raising this:


help allow a more stable core overclock? If so, what's the max I can safely raise it to?


----------



## RicoDee

Quote:


> Originally Posted by *rdr09*
> 
> Is that with tess off or on? i have to oc my 290 to 1300 core to get that graphics score. nice.


Yes, it was on. Max Tess Detail 10, Max Tessellation Factor 24. Thanks.


----------



## jdorje

Quote:


> Originally Posted by *m70b1jr*
> 
> jdorje, can you tell me which of your BIOS's has the MOST voltage modification on it? I flashed your current.rom, and I was able to get 1160mhz stable up from 1150mhz. I do want to say Flashing this was extremely scary.. I found it hard to breath when I rebooted my PC, and even know my heart is still beating fast. But if you could tell me which bios I should use to get 1200mhz, that would be amazing!


First of all, flashing shouldn't be scary. You should go into it expecting to brick your card (memory timings are maybe the easiest way to do this) and knowing how to unbrick it by flashing a working BIOS back. The only problem there is that you won't be able to use your GPU to do it. I have a secondary monitor running on my iGPU, and when I flash I always set that to be the initial output in my mobo's UEFI so that I'm using it for the DOS boot. If you have a dual-BIOS GPU that helps too, but you have to know how to use it. It's almost certainly even possible to do it without a display if you memorize the sequence of key presses. Just think about this in advance, since you don't want to be stuck with a non-working GPU and no way to get a display.

All of the original BIOSes I had were on either adaptive or 1225mV core voltage. But I've now uploaded one, max_oc, with 1350mV core, equivalent to +125mV for me. This is the one I used (with a small offset, I forget what) to get my 11.8k Firestrike score.

None of the changes should have allowed you to get higher clocks at the same voltage. However, if your stock voltage was lower than 1225mV, then using the 1225mV one would obviously be raising the voltage. Some cards have a hidden offset that's applied on top of the voltage. Watch what your actual voltage is in HWiNFO (with vdroop you'll get greatly diminishing returns in actual voltage compared to VID). I don't really know the safe voltage limit, but my card black screens before 1400mV VID anyway.


----------



## RicoDee

Quote:


> Originally Posted by *RicoDee*
> 
> 
> Just got done trying out my card on water. 1190/1700 50mv. I will play more, Im sure of it .


I managed to OC my CPU to 4.3 for starters, and upped my GPU to 1195/1750. Thanks pplz.


----------



## m70b1jr

Quote:


> Originally Posted by *RicoDee*
> 
> I managed to OC my cpu to 4.3 for starters. And upped my gpu to 1195/1750 . thanks pplz.


Which card? Can you send me your BIOS?


----------



## m70b1jr

Ya boii


----------



## ROTOR

Quote:


> Originally Posted by *jdorje*
> 
> https://www.dropbox.com/sh/tdqjx4cxqaqjj29/AADExR9yNf2nU_KILOaXf4aGa?dl=0


THanksssssss!!!!!!!!!


----------



## m70b1jr

Maybe overclocking does make a difference

Stock


OC'd


----------



## Vellinious

Quote:


> Originally Posted by *m70b1jr*
> 
> Maybe overclocking does make a difference
> 
> Stock
> 
> 
> OC'd


The only people who'll tell you overclocking isn't worth it are the ones that aren't doing it right.....


----------



## RicoDee

Quote:


> Originally Posted by *m70b1jr*
> 
> Which card? Can you send me your BIOS?


Bro, I have the MSI R9 390, and if you teach me how to get to my BIOS, I won't mind sending it to you.


----------



## Vellinious

Quote:


> Originally Posted by *RicoDee*
> 
> Bro I have the MSI R9 390. and if you teach me how to get to my bios , I won't mind sending it to you


Open up GPU-Z. There's a little arrow button beneath the AMD logo. Push it and save the BIOS to your desktop. Put it in a zip file and post it here.


----------



## m70b1jr

Quote:


> Originally Posted by *RicoDee*
> 
> Bro I have the MSI R9 390. and if you teach me how to get to my bios , I won't mind sending it to you


Download GPU-Z, open it up, and where it says BIOS Version, there should be an icon to the right of it that gives the option to save the BIOS.


----------



## RicoDee

I did some more reading and yes, I can send it to you. How would you like me to send it?


----------



## RicoDee

Im on it brb


----------



## m70b1jr

Dropbox or mega.co.nz, doesn't matter.


----------



## RicoDee

Hawaii.zip 99k .zip file


Here you go bro .


----------



## tolis626

So, let me get this straight. I just downloaded Hawaii BIOS Reader. I can just go in there, load up my current BIOS dump, change voltages, timings etc and then flash it to my card and... That's it? Seems almost too simple to be honest... Especially after reading a bit into the Hawaii BIOS editing thread here.


----------



## m70b1jr

Quote:


> Originally Posted by *tolis626*
> 
> So, let me get this straight. I just downloaded Hawaii BIOS Reader. I can just go in there, load up my current BIOS dump, change voltages, timings etc and then flash it to my card and... That's it? Seems almost too simple to be honest... Especially after reading a bit into the Hawaii BIOS editing thread here.


Simpler than it looks. DO NOT flash an R9 390X BIOS onto your card. It's VERY scary to flash BIOSes, but as long as your motherboard has an iGPU and you have a phone on hand to watch tutorials, you can unbrick your card if it comes to that. My heart rate went up to 92 beats per minute (measured on my Galaxy S6 Edge Plus) while flashing this. (I'm 16 years old.) So yes, it gets very stressful.


----------



## RicoDee

Wow, I'm 46 years old and I would not try flashing at this time. I'm too skerd.


----------



## jdorje

Quote:


> Originally Posted by *tolis626*
> 
> So, let me get this straight. I just downloaded Hawaii BIOS Reader. I can just go in there, load up my current BIOS dump, change voltages, timings etc and then flash it to my card and... That's it? Seems almost too simple to be honest... Especially after reading a bit into the Hawaii BIOS editing thread here.


In fairness, OneB1t's editor is fairly new. Before that it was all hex editing, which is still, AFAIK, the easiest way to edit memory timings, which are the most important BIOS mod. Additionally, some things in the BIOS must match up, like the voltages in all six tables, or the clocks in the tables and the master list.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> In fairness, OneB1t's editor is fairly new. Before that it was all hex editing, which is still afaik the easiest way to edit memory timings which are the most important bios mod. Additionally, some things in the bios must match up - like the voltages in all six tables, or the clocks on the tables and the master list.




If I raise those values, do you think I'll have enough headroom for a 1200MHz+ OC?


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> 
> 
> If i raise those values, do you think I'll have high enough headroom for a 1200mhz + OC?


It doesn't hurt to leave those values at stock. The +50% power limit slider can already allow you to go further.

It will take a hefty number of tests to get this dialed in and see the effect.


----------



## Nameless1988

Do you have coil whine on your 390s? my XFX 390 B.E. has coil whine (not too much).


----------



## jdorje

DO NOT RAISE THOSE VALUES.

I may have seriously misled you, since I forgot I had already raised them (default is 208). I'll go lower them all in the morning. Raising the power limit beyond the ~300W that the 8+6 pins can provide is one of the few ways to damage your card. And since you can add +50% in software, that 238 can already go up to around 360W of draw. The 238 is copied from Sapphire's BIOS, but they have 8+8.

Personally I found no benefit to raising the power limit anyway.

And finally that has no effect on stability. It just throttles if you exceed the limit.

To see how much voltage you need for 1200MHz, find out how much you need at lower clocks. Put it in a spreadsheet and chart it: clock on the horizontal axis, voltage on the vertical. Then add a quadratic fit curve. It should be very accurate.
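The spreadsheet-and-trendline idea can be sketched directly: fit a quadratic through three measured (clock, voltage) points and extrapolate. The sample points below are invented numbers for illustration, not measurements from any particular card.

```python
# Quadratic extrapolation of required voltage from three measured points,
# via Lagrange interpolation. The data points are MADE UP for illustration.

def quad_fit_eval(points, x):
    """Evaluate the quadratic passing through three (x, y) points at x."""
    (x0, y0), (x1, y1), (x2, y2) = points
    return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
          + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
          + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

# Hypothetical stability results: (core MHz, VID in mV)
measured = [(1000, 1150), (1100, 1200), (1150, 1240)]
volts = quad_fit_eval(measured, 1200)
print(round(volts, 1))  # -> 1290.0, the extrapolated mV for 1200 MHz
```

As mus1mus points out below a quadratic trend is only a rough guide; past a card's wall the real curve climbs much faster than the fit predicts.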


----------



## mus1mus

QUADRATIC WHUT?


----------



## kizwan

Quote:


> Originally Posted by *jdorje*
> 
> DO NOT RAISE THOSE VALUES.
> 
> I may have seriously misled you since I forgot I already raised them (default is 208). I'll go lower them all in the morning. Raising the power limit beyond the 300 ish that the 8+6 pins can provide is one of the few ways to damage your card. And since you can add +50% in software that 238 can go up to around 360w draw already. The 238 is copied from sapphires bios but they have 8+8.
> 
> Personally I found no benefit to raising the power limit anyway.
> 
> And finally that has no effect on stability. It just throttles if you exceed the limit.
> 
> To see how much voltage you need for 1200 mhz go find out how much you need for lower clock. Put it in a spreadsheet and chart it - clock on the horizontal voltage on the vertical. Then add a quadratic fit curve. It should be very accurate.


Firstly, it's not going to damage the card. Worst case scenario, your card will be power starved.

Secondly, 8+6 pin can deliver more than 300W.

Thirdly, Hawaii/Grenada BIOSes have a voltage limit plus vdroop. The voltage limit was pretty low last time I checked, so no problem here.

Fourthly, voltage can kill this card if you're using a BIOS without a voltage limit, like PT1/PT3.


----------



## Carniflex

Quote:


> Originally Posted by *mus1mus*
> 
> QUADRATIC WHUT?


It means fitting with an x^2 kind of polynomial. Or in simpler terms, "that's what's written on the tickbox!" You will get an extrapolated curve; following it should give you roughly the volts needed.


----------



## Vellinious

Quote:


> Originally Posted by *jdorje*
> 
> DO NOT RAISE THOSE VALUES.
> 
> I may have seriously misled you since I forgot I already raised them (default is 208). I'll go lower them all in the morning. Raising the power limit beyond the 300 ish that the 8+6 pins can provide is one of the few ways to damage your card. And since you can add +50% in software that 238 can go up to around 360w draw already. The 238 is copied from sapphires bios but they have 8+8.
> 
> Personally I found no benefit to raising the power limit anyway.
> 
> And finally that has no effect on stability. It just throttles if you exceed the limit.
> 
> To see how much voltage you need for 1200 mhz go find out how much you need for lower clock. Put it in a spreadsheet and chart it - clock on the horizontal voltage on the vertical. Then add a quadratic fit curve. It should be very accurate.


About 80 watts from the PCIe slot, 150ish from the 6-pin and 175 from the 8-pin. You could push over 400 watts if you wanted to.
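For reference, the arithmetic in this exchange can be checked against the official PCIe spec ratings (75W slot, 75W 6-pin, 150W 8-pin); as the posters note, real connectors can deliver more than spec in practice:

```python
# Spec-rated power budget for an 8+6-pin card, plus the +50% slider math.
# Values are PCIe CEM spec ratings; real hardware can exceed them.
PCIE_SLOT_W = 75    # x16 slot rating
SIX_PIN_W   = 75    # 6-pin PCIe connector rating
EIGHT_PIN_W = 150   # 8-pin PCIe connector rating

spec_budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(spec_budget)      # 300, the "~300W" figure mentioned above

# A 238 W BIOS power limit with the +50% software slider on top:
print(int(238 * 1.5))   # 357, roughly the ~360 W draw quoted above
```

The gap between the 300W spec figure and the 400W+ real-world figure is exactly what the disagreement above is about.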


----------



## mus1mus

Quote:


> Originally Posted by *Carniflex*
> 
> It means fitting with x^2 kind of polynomial. Or in simpler terms "thats whats written on the tickbox!" - you will get some kind of extrapolated curve - following it should give you *roughly* the volts needed.


No pun intended. I know a bit of math.

You said it better than the one who mentioned it, though: it's only true up to a certain point, so it's not a certainty. Overclocking is not a guaranteed figure; people need to steer clear of treating it as one.


----------



## Agent Smith1984

I'm finding more and more that it's only worth running my card around 1150MHz with low voltage.

My nephews came over and played GTA V with my son for about 5 hours Friday night, and that joker was cooking at 81C with 73C VRM by the end of the whole thing.....

Benching is a different story, but actual 4K gaming (which sends the core temps up a whole 5C over 1080p gaming) is just too much to overvolt for on air.

I'm happy with the card nonetheless, but I don't think CrossFire is in the cards now. There's no way the top card would stay cool enough with another one underneath it.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm finding more and more, that's it's only worth running my card around 1150mhz with low voltage.
> 
> My nephews came over and played GTA V with my son for about 5 hours Friday night and that joker was cooking at 81C with 73c VRM by the end of the whole thing.....
> 
> Benching is a different story, but actual 4K gaming (which sends the core temps up a whole 5C over 1080P gaming) is just too much to overvolt on air.
> 
> I'm happy with the card none the less though, but I don't think crossfire will be in the cards now. There's no way the top card will stay cool enough with another one underneath it.


Polaris soon, all new tech to play with.


----------



## jodybdesigns

Quote:


> Originally Posted by *flopper*
> 
> Polaris soon,all new tech to play with


And brand new broken drivers! Yaaay


----------



## Agent Smith1984

So it looks like initial Doom 2016 results show AMD TROUNCING NVIDIA.... I'm sure it's just a driver fix for them, but who's to say we don't have driver improvements coming for it too.
I'll be curious to see how the beta does.

A 7970 performing the same as a 980 is just ridiculous.... tells me this game is going to be VERY VRAM dependent (not capacity, but bandwidth).

http://wccftech.com/doom-2016-alpha-benchmarks-geforce-amd-radeon/

Thinking a little more about it.... it could be that this alpha port just isn't optimized for PC yet, so by default it runs really well on very console-like hardware?


----------



## THUMPer1

lol 60 FPS. How insulting.


----------



## patriotaki

when is dx 12 coming out


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> when is dx 12 coming out


The question is when DX12 will be implemented.... it's already out.


----------



## X6SweexLV

Hi, I want to buy a new video card; I'm coming from an NVIDIA GTX 960.
My question is which of the R9 390 cards has a good, quiet cooler? Personally I've been looking at http://www.asus.com/Graphics-Cards/STRIXR9390DC3OC8GD5GAMING/overview/ ...
Also, are 8GB of RAM and an i7 4771 enough to enjoy an R9 390, with a 630W Gold PSU?
This will be my first ATI/AMD card.
I hate Nvidia's drivers and the high price.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So looks like initial Doom 2016 results are showing AMD TROUNCE NVIDIA.... I'm sure it's just a driver fix for them, but who's to say we don't have driver improvements coming for it too.
> I'll be curious to see how the beta does.
> 
> A 7970 performing the same as a 980 is just ridiculous.... tells me this game is going to be VERY VRAM dependent (not space, but bandwidth)
> 
> http://wccftech.com/doom-2016-alpha-benchmarks-geforce-amd-radeon/
> 
> Thinking a little more about it.... it could be that this alphaport is just not optimized for PC yet, so by default it's running really good on very similar-to-console hardware???


I think this is pretty awesome. But if anyone remembers, the id engine has always run awesome on AMD hardware. I remember my 4850 killed Quake 4.


----------



## Stige

Quote:


> Originally Posted by *X6SweexLV*
> 
> Hi, I want to buy a new video card, come from the NVIDIA GTX 960 ...
> Then I have a question which of R9 390 cards have a good, quiet, personally I watched http://www.asus.com/Graphics-Cards/STRIXR9390DC3OC8GD5GAMING/overview/ ...
> And I have a 8gb and I7 4771 or enough pleasure to R9 390, 630W psu gold
> And this will be my first ATI / AMD card
> Hate Nvidia drivers and the high price


I have that GPU and the VRM cooling on it sucks. It's "ok" if you don't plan on overclocking, but I would look at another manufacturer for an R9 390 with a better cooler, like Sapphire.


----------



## Agent Smith1984

I used an Asus Strix 390 for two weeks and it is a nightmare to keep cool. The VRM cooling is terrible.

At first I gave it the benefit of the doubt, thinking it had gotten a bad rap, and I did get 1180 on the core, but man, it sure got hot after extended gaming sessions.

The VRMs would get to 88-90C and the core ran around 75-80C, depending on what I was doing and for how long.


----------



## X6SweexLV

Good, but in my country only the following are available: Sapphire R9 390 8G D5 (no backplate)
GIGABYTE GV-R939G1 GAMING-8GD
XFX R9-390P-8DF6
VTX3D VXR9 390 8GBD5-DHE
ASUS STRIX-R9390-DC3OC-8GD5-GAMING


----------



## Agent Smith1984

Quote:


> Originally Posted by *X6SweexLV*
> 
> good, but my country is available only Sapphire R9 390 8G D5 no back plate
> GIGABYTE GV-R939G1 GAMING-8GD
> XFX R9-390P-8DF6
> VTX3D VXR9 390 8GBD5-DHE
> ASUS STRIX-R9390-DC3OC-8GD5-GAMING


The XFX is the best of those in my opinion.


----------



## X6SweexLV

Ok
Ty for help









Edit: will a 630W PSU be good for this card?


----------



## m70b1jr

Quote:


> Originally Posted by *X6SweexLV*
> 
> Ok
> Ty for help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: then 630 PSU will be good this card ?


I have a 630W and it works fine. I'd say go for either the XFX or the Sapphire, but since the XFX has a backplate, go for it.


----------



## m70b1jr

Question: does anyone have any hard mods or software mods that can fix the black screening when adding too much voltage to the card? I know there's a 0.95V rail volt mod, but I'm not sure how comfortable I feel doing that. I will do it if I need to. But what about software? Anything in the card's BIOS?


----------



## simonfredette

There were some doubts about my VRM temps being surprisingly low. I've recently added the glass top to my antique sewing machine build, and that does create higher ambient temps, although the temps are still respectable.
Core clock: 1200
Memory: 1700
+100mV
+50% power limit
I took a screenshot with Kombustor so we could see real-time readings with a moderate load on the card, and got these results:
GPU temp: 63C
VRMs: 58 and 46C
Load: 100%
Fan speed: 100% (2300rpm)
130 FPS in Kombustor; my 3DMark 11 graphics score at these settings is 20338.


----------



## m70b1jr

Quote:


> Originally Posted by *simonfredette*
> 
> There were some doubts about my VRM temps being surprisingly low , ive recently added the glass top to my antique sewing machine build and that does create higher ambient temps although the temps are still respectable.
> Core clock 1200
> Memory 1700
> +100mV
> +50%
> I took a screen shot with kombustor just so we could see real time with a moderate load on the card and got these results
> Gpu temp63C
> VRMS 58 and 46 C
> Load 100%
> fan speed 100% 2300rpm
> 130 FPS on kombustor, my 3D mark 11 graphics score at these settings is 20338


So lucky you can get 1200mhz on a +100mv increase..


----------



## Worldwin

Quote:


> Originally Posted by *m70b1jr*
> 
> So lucky you can get 1200mhz on a +1200mv increase..


Sweet jesus. Was he using liquid helium to cool that chip? Also at that level of voltage isn't 1200mhz pretty low?


----------



## m70b1jr

Quote:


> Originally Posted by *Worldwin*
> 
> Sweet jesus. Was he using liquid helium to cool that chip? Also at that level of voltage isn't 1200mhz pretty low?


JESUS CHRIST. I meant +100mv. LIQUID HELIUM *****.


----------



## mus1mus

Quote:


> Originally Posted by *simonfredette*
> 
> There were some doubts about my VRM temps being surprisingly low , ive recently added the glass top to my antique sewing machine build and that does create higher ambient temps although the temps are still respectable.
> Core clock 1200
> Memory 1700
> +100mV
> +50%
> I took a screen shot with kombustor just so we could see real time with a moderate load on the card and got these results
> Gpu temp63C
> VRMS 58 and 46 C
> Load 100%
> fan speed 100% 2300rpm
> 130 FPS on kombustor, my 3D mark 11 graphics score at these settings is 20338


Which card? Looks like you got a winner!

Try using Trixx for more. Looking at your SS, I think you can still squeeze a lot more from the temp headroom.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Which card? Looks like you got a winner!
> 
> Try using Trixx for more. Looking at your SS, I think you can still squeeze a lot more from the temp headroom.


Better to mod the BIOS and keep using MSI AB, Trixx is just... meh. MSI AB is so much better.


----------



## simonfredette

Quote:


> Originally Posted by *mus1mus*
> 
> Which card? Looks like you got a winner!
> 
> Try using Trixx for more. Looking at your SS, I think you can still squeeze a lot more from the temp headroom.


Temp-wise I think I could. It's an MSI 390; they were said to have the best VRM contact for cooling, and seeing it now I don't doubt it. I changed the fan curve because they had it set up so the fans don't even turn on until 60C, which makes it idle right around there. Mine is typically spinning at 40%, which is quiet enough and gives me an idle of 30C and around 50C gaming. It does want to artifact in Heaven, which is weird; it's perfectly fine in games, 3DMark and Kombustor. And yes, it's obviously not at +1200mV; it's a sewing machine, not a spaceship...


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Better to mod the BIOS and keep using MSI AB, Trixx is just... meh. MSI AB is so much better.


Last I knew, AB only allows for +200mv.... I wouldn't use it.


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> Last I knew, AB only allows for +200mv.... I wouldn't use it.


Trixx allows +200mV, AB allows +100mV.

That's why I've modded my BIOS to give +100mV, so I can get +200mV total with MSI AB; it's just superior to Trixx in every way.


----------



## Jagerstriker

Hello everyone. Earlier this week I bought an MSI R9 390 Gaming 8G card. So far it's been running great except for one issue I can't seem to figure out. Whenever I'm benchmarking or gaming my core clock is all over the place. It regularly dips to 500-600 MHz during The Witcher 3 as well as the Valley benchmark, and during AC:U it's just all over the place.

The following screenshots should illustrate what I mean.

During Valley benchmark:









During AC:U









Does anyone have any idea why this occurs and how I could fix it? It sucks during games like TW3 when my core clock drops to 500MHz for maybe half a second and as a result my FPS drops to around 30 for a second; it's very jarring not having a steady framerate. Whenever the core clock stays at 1150MHz I get a steady 60 fps.


----------



## m70b1jr

Quote:


> Originally Posted by *Jagerstriker*
> 
> Hello everyone. Earlier this week I bought an MSI R9 390 Gaming 8G card. So far it's been running great except for one issue I can't seem to figure out. Whenever I'm benchmarking or gaming my core clock is all over the place. It regularly dips to 500-600 MHz during The Witcher 3 as well as the Valley benchmark, and during AC:U it's just all over the place.
> 
> The following screenshots should illustrate what I mean.
> 
> During Valley benchmark:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> During AC:U
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does anyone have any idea why this occurs and how I could fix it? It sucks during games like TW3 when my core clock drops to 500MHz for maybe half a second and as a result my FPS drops to around 30 for a second, it's very disturbing not having a steady framerate. Whenever the core clock runs at 1150MHz I can get a steady 60 fps.


Make sure you're not thermal throttling.


----------



## Stige

ClockBlocker helps. It seems to be common for these cards for some reason, and that fixes it; it forces the clocks to stay at max during 3D loads and such.


----------



## Irked

Sounds like AMD's power saving is trying to downclock even under load.


----------



## Jagerstriker

Quote:


> Originally Posted by *m70b1jr*
> 
> Make sure you're not thermal throttling.


I'm pretty sure it's not thermal throttling, since max temps did not exceed 80°C under load, though I'm not certain since this is the first time I'm using an AMD card.
Quote:


> Originally Posted by *Stige*
> 
> ClockBlocker helps. It seems to be common for these cards for some reason, and that fixes it; it forces the clocks to stay at max during 3D loads and such.


Thanks, I'll check it out!
Quote:


> Originally Posted by *Irked*
> 
> Sounds like AMD's power saving is trying to down clock even under load


How do I disable this? I have no experience with AMD cards or their software.


----------



## Irked

Try ClockBlocker like Stige said. There was a post by an AMD rep saying they are looking to add this in a later update: http://www.overclock.net/t/1592691/change-org-petition-to-add-an-option-to-disable-power-saving-in-gpu-settings/10#post_24944142


----------



## Jagerstriker

Quote:


> Originally Posted by *Irked*
> 
> Try ClockBlocker like Stige said. There was a post by an AMD rep saying they are looking to add this in a later update: http://www.overclock.net/t/1592691/change-org-petition-to-add-an-option-to-disable-power-saving-in-gpu-settings/10#post_24944142


Okay, thanks a lot! I'll try it out and report back later.


----------



## trotter

I finally got mine. R9 390 XFX Double Dissipation Black!


----------



## battleaxe

Quote:


> Originally Posted by *trotter*
> 
> 
> 
> I finally got mine. R9 390 XFX Double Dissipation Black!


Be sure to let us know how your card does. I have one of these too. Mine was very fussy about the power supply I used with it, but once that was sorted out, all has been great. Curious to see how someone else's card overclocks besides mine.


----------



## NoNameGodLike

Any useful tips for http://www.techpowerup.com/gpudb/b3393/asus-strix-r9-390x-directcu-iii-oc.html ?








I'm using ASUS GPU Tweak II for overclocking.
I used benchmarks to find this card's maximum.
I can hear my card throttling sometimes, as if it won't let me use everything it has to offer. I know it's for my card's safety, but I don't push it far enough to damage the card, and it still occurs even at lower settings.


----------



## Stige

Because the cooling on the Strix is awful, that's why. You can't give it extra voltage without the VRM hitting 100C+.


----------



## Agent Smith1984

Quote:


> Originally Posted by *NoNameGodLike*
> 
> Any useful tips for http://www.techpowerup.com/gpudb/b3393/asus-strix-r9-390x-directcu-iii-oc.html ?
> 
> 
> 
> 
> 
> 
> 
> 
> I'm using ASUS GPU Tweak II for overclocking.
> I used benchmarks to find this card's maximum.
> I can hear my card throttling sometimes, as if it won't let me use everything it has to offer. I know it's for my card's safety, but I don't push it far enough to damage the card, and it still occurs even at lower settings.


A few other things also.....

The Crimson driver without the hotfix can cause core clock throttling. Also make sure your power limit is at +50%, regardless of whether you are overclocking/overvolting or not.

I've had several hawaii/grenada cards that would throttle in the 930-1000 range if power limit was not increased.

If it still does it after that, try jacking up fans and taking your case door off to see if the cooler temps stop it from throttling. What exactly are your temps?

I just can't believe Asus failed on cooling two rounds in a row, and with a second cooler design. They keep slapping their NVIDIA coolers on AMD cards despite the fact that the AMD cards need good cooling more than the NVIDIA cards do! You'd think if they were recycling coolers, they'd go the other way around.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> A few other things also.....
> 
> The Crimson driver without the hotfix can cause core clock throttling. Also make sure your power limit is at +50%, regardless of whether you are overclocking/overvolting or not.
> 
> I've had several hawaii/grenada cards that would throttle in the 930-1000 range if power limit was not increased.
> 
> If it still does it after that, try jacking up fans and taking your case door off to see if the cooler temps stop it from throttling. What exactly are your temps?
> 
> I just can't believe Asus failed on cooling two rounds in a row, and with a second cooler design. They keep slapping their NVIDIA coolers on AMD cards despite the fact that the AMD cards need good cooling more than the NVIDIA cards do! You'd think if they were recycling coolers, they'd go the other way around.


I don't think they care about showing AMD any love. Maybe they want to make AMD look bad? But all they really accomplish is making themselves look like a cheap crappy brand IMO. They've become well known to offer crappy GPU's now if you ask me. Good motherboards and monitors, and that's about it. MSI has them beat solidly on the GPU front now.


----------



## Vellinious

Quote:


> Originally Posted by *battleaxe*
> 
> I don't think they care about showing AMD any love. Maybe they want to make AMD look bad? But all they really accomplish is making themselves look like a cheap crappy brand IMO. They've become well known to offer crappy GPU's now if you ask me. Good motherboards and monitors, and that's about it. MSI has them beat solidly on the GPU front now.


With the 970 STRIX, ASUS fell so hard on the fail button that they bounced and hit it again. All STRIX Maxwell cards are voltage locked at 1.212v and can only be unlocked with a special voltage tool (I think it's a modified ASUS GPU Tweak), but it doesn't work on the 970s. On top of that, they gave the STRIX 970 a single 6-pin power connector, SEVERELY limiting its overclocking potential. ASUS is known for having issues with customer service, so, seeing what they did to the STRIX Maxwells and having experience with their RMA process, I'll never even consider them for a GPU in the future. They're just horrible.


----------



## tolis626

Quote:


> Originally Posted by *Vellinious*
> 
> With the 970 STRIX, ASUS fell so hard on the fail button that they bounced and hit it again. All STRIX Maxwell cards are voltage locked at 1.212v and can only be unlocked with a special voltage tool (I think it's a modified ASUS GPU Tweak), but it doesn't work on the 970s. On top of that, they gave the STRIX 970 a single 6-pin power connector, SEVERELY limiting its overclocking potential. ASUS is known for having issues with customer service, so, seeing what they did to the STRIX Maxwells and having experience with their RMA process, I'll never even consider them for a GPU in the future. They're just horrible.


This. As much as I love all other Asus products (especially mobos and monitors), their GPU department is... bad. There's no other way to put it. A friend had a DirectCU II R9 290X and it was horrible: 90+°C at all times, no overclocking whatsoever, and it was loud if it was to run at reasonable (<90°C) temps. Then one of the fans just stopped working and he couldn't get anywhere with the RMA process, so he got fed up, bought a water block for the card and made a small custom loop for it (with some used parts it cost him a bit north of 200€, which is really good IMO). Then it was decent. Their PCBs are robust and all, but they miss the mark on almost everything else. I haven't had experience with their NVIDIA-based GPUs, but their AMD lineup is pure garbage as a whole package. Even the damn Matrix 290X wasn't well cooled, for Christ's sake! A shame, because Asus is an otherwise excellent company with some pretty good engineers. If they work a bit on their GPU lineup and fix that terrible customer support (really, there are countless horror stories about it on the web), they'll be hands down the best, IMO. Meanwhile, I'm not at all saddened to see MSI kicking butt. I've always had a special place in my heart for MSI, as my first computer ever had an MSI mobo. I kind of inherited that from my father.


----------



## NovaEnid

I loved my old Asus GeForce GTX 660 Ti DirectCU II TOP. Clocked quite nicely for me, was really quiet and ran all my games until last month (~mid quality settings).

But now I'm really happy with my Sapphire Radeon R9 390 Nitro.


----------



## Nameless1988

Hi mates!









My Heaven score, 1080p, maxed out.



About core throttling:

On my XFX 390 Black Edition I have set Afterburner to custom overclock mode without PowerPlay, so my GPU core stays rock solid at 1050MHz (stock clock), even with vsync on and even at less than 100% GPU usage.

This is the trick to avoid clock throttling (if you guys know another or better technique to avoid core throttling, let me know!).

Another question: have you ever heard coil whine on your 390s? Mine has a bit of coil whine, even at 60 fps, but not that much. The card runs smooth with good temperatures.


----------



## simonfredette

Quote:


> Originally Posted by *Nameless1988*
> 
> Hi mates!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Heaven score , 1080p, maxed out.
> 
> 
> 
> About core throttling:
> 
> On my XFX 390 Black Edition I have set Afterburner to custom overclock mode without PowerPlay, so my GPU core stays rock solid at 1050MHz (stock clock), even with vsync on and even at less than 100% GPU usage.
> 
> This is the trick to avoid clock throttling (if you guys know another or better technique to avoid core throttling, let me know!).
> 
> Another question: have you ever heard coil whine on your 390s? Mine has a bit of coil whine, even at 60 fps, but not that much. The card runs smooth with good temperatures.


That seems almost low. I had trouble running Heaven without artifacts at my usual overclock (which was stable on everything else), so I dialed the core clock down from 1200 to 1160 and it ran clean, but my score was 2566. I do get some coil whine during benching, not a ton but some for sure, and what's nice is that with the lower core clock I was able to lower the voltage to +80mV.



I'll re-run with tessellation set to extreme and see what difference it makes. I also use ClockBlocker to ensure my clocks remain stable during benches or games.

I ran with tessellation set to max and it does hurt some, but not quite as much as it did for you. What is your overclock on GPU and CPU?


----------



## battleaxe

Quote:


> Originally Posted by *Vellinious*
> 
> With the 970 STRIX, ASUS fell so hard on the fail button that they bounced and hit it again. All STRIX Maxwell cards are voltage locked at 1.212v and can only be unlocked with a special voltage tool (I think it's a modified ASUS GPU Tweak), but it doesn't work on the 970s. On top of that, they gave the STRIX 970 a single 6-pin power connector, SEVERELY limiting its overclocking potential. ASUS is known for having issues with customer service, so, seeing what they did to the STRIX Maxwells and having experience with their RMA process, I'll never even consider them for a GPU in the future. They're just horrible.


Yeah, I agree. They've done a great job of destroying their reputation on GPU's as far as I'm concerned. Too bad. They of all companies have the means to do a lot better. Guess they just got too greedy. Who else have we seen do this a lot lately? Hmmm...


----------



## simonfredette

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, I agree. They've done a great job of destroying their reputation on GPU's as far as I'm concerned. Too bad. They of all companies have the means to do a lot better. Guess they just got too greedy. Who else have we seen do this a lot lately? Hmmm...


Yeah, but isn't the DirectCU II about the only one that has a full-cover block? Is that BECAUSE the stock cooling is garbage, or just a really awesome deal with Asus? One of the reasons I bought the MSI is that it has a great stock cooler with plenty of VRM cooling, and it should ("fingers crossed") get an EK or Heatkiller full-cover block in the near future.


----------



## Nameless1988

Quote:


> Originally Posted by *simonfredette*
> 
> That seems almost low. I had trouble running Heaven without artifacts at my usual overclock (which was stable on everything else), so I dialed the core clock down from 1200 to 1160 and it ran clean, but my score was 2566. I do get some coil whine during benching, not a ton but some for sure, and what's nice is that with the lower core clock I was able to lower the voltage to +80mV.
> 
> 
> 
> I'll re-run with tessellation set to extreme and see what difference it makes. I also use ClockBlocker to ensure my clocks remain stable during benches or games.
> 
> I ran with tessellation set to max and it does hurt some, but not quite as much as it did for you. What is your overclock on GPU and CPU?


You did it wrong.









You missed AntiAliasing (my Heaven benchmark is with 8x AA)

The right way to run Heaven is on MAX SETTINGS, even AA.









Re-do it and put AntiAliasing on 8x!


----------



## simonfredette

Yeah, I saw that. With 8x AA my results were essentially the same as yours. When gaming I turn that off and set tessellation to medium; it looks great and greatly increases fps. I cringed when I saw the fps with that stuff on.


----------



## Stige

Why would anyone run Heaven on anything but Extreme preset anyway? lol at "Custom" preset.


----------



## NoNameGodLike

Quote:


> Originally Posted by *Agent Smith1984*
> 
> A few other things also.....
> 
> The Crimson driver without the hotfix can cause core clock throttling. Also make sure your power limit is at +50%, regardless of whether you are overclocking/overvolting or not.
> 
> I've had several hawaii/grenada cards that would throttle in the 930-1000 range if power limit was not increased.
> 
> If it still does it after that, try jacking up fans and taking your case door off to see if the cooler temps stop it from throttling. What exactly are your temps?
> 
> I just can't believe Asus failed on cooling two rounds in a row, and with a second cooler design. They keep slapping their NVIDIA coolers on AMD cards despite the fact that the AMD cards need good cooling more than the NVIDIA cards do! You'd think if they were recycling coolers, they'd go the other way around.


I'm staying under 70°C at 1175/1520, mostly 63-67°C when I open the window and the room is cold; otherwise 70-75°C.
Power target 120% (increased by 20%).
GPU voltage 1325mV (increased by 75mV).
Fan speed 50-60%, mostly 60%.
I never tried increasing the memory clock because people on other forums said it's a waste of time on these cards, since GDDR5 is already so fast that a small change makes no noticeable difference.


----------



## Stige

Quote:


> Originally Posted by *NoNameGodLike*
> 
> I'm staying under 70°C at 1175/1520, mostly 63-67°C when I open the window and the room is cold; otherwise 70-75°C.
> Power target 120% (increased by 20%).
> GPU voltage 1325mV (increased by 75mV).
> Fan speed 50-60%, mostly 60%.
> I never tried increasing the memory clock because people on other forums said it's a waste of time on these cards, since GDDR5 is already so fast that a small change makes no noticeable difference.


You should always have your power limit at 150%.


----------



## tolis626

Quote:


> Originally Posted by *NoNameGodLike*
> 
> I'm staying under 70°C at 1175/1520, mostly 63-67°C when I open the window and the room is cold; otherwise 70-75°C.
> Power target 120% (increased by 20%).
> GPU voltage 1325mV (increased by 75mV).
> Fan speed 50-60%, mostly 60%.
> I never tried increasing the memory clock because people on other forums said it's a waste of time on these cards, since GDDR5 is already so fast that a small change makes no noticeable difference.


Quote:


> Originally Posted by *Stige*
> 
> You should always have your power limit at 150%.


First off, what Stige said. The power limit doesn't tell your card to consume more power, it just tells it "yo, don't hold back" so that IF the card needs that power, it's on tap. No reason whatsoever not to max it out, unless you're using a small form factor case with a barely adequate PSU. And even then, I doubt there's any reason to leave it lower than 150%.

Other than that, I would suggest either leaving your memory at stock (1500MHz) or upping it to the next strap end (1625MHz or 1750MHz) to get the tightest timings possible. 1520MHz has the same, looser timings as 1625MHz whereas 1500MHz has tighter timings. So both 1625MHz (higher bandwidth) and 1500MHz (lower latency) should (will?) perform better than 1520MHz.
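To make the strap logic above concrete, here's a toy sketch (mine, not from the thread) of how the strap bins work, assuming the commonly quoted Hawaii/Grenada strap ends:

```python
# Memory timings are selected per strap "bin": a clock uses the timing set
# of the lowest strap end at or above it. The strap ends below are the ones
# commonly quoted for Hawaii/Grenada (assumption: your BIOS matches them).
STRAP_ENDS = [400, 800, 900, 1000, 1125, 1250, 1375, 1500, 1625, 1750]

def strap_for(mem_clock_mhz: int) -> int:
    """Return the strap end whose timing set a given memory clock uses."""
    for end in STRAP_ENDS:
        if mem_clock_mhz <= end:
            return end
    raise ValueError("clock above the highest strap")

print(strap_for(1500))  # 1500 -> tight 1500-strap timings
print(strap_for(1520))  # 1625 -> looser 1625-strap timings for only +20MHz
```

So a 1520MHz clock pays the looser 1625-strap timings while gaining only 20MHz of bandwidth over 1500MHz, which is why the round strap-end numbers are the interesting targets.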


----------



## THUMPer1

Quote:


> Originally Posted by *Nameless1988*
> 
> Hi mates!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Heaven score , 1080p, maxed out.


Mine's 1599 at the same settings, 1150/1650. For anyone who thinks the RAM runs better at 1625: it doesn't in Heaven for me. 1625 results in a lower score.


----------



## patriotaki

I got two new driver updates today when I upgraded to Win 10 Pro.

One is WHQL and one is beta. Should I download them?


----------



## battleaxe

Quote:


> Originally Posted by *simonfredette*
> 
> Yeah, but isn't the DirectCU II about the only one that has a full-cover block? Is that BECAUSE the stock cooling is garbage, or just a really awesome deal with Asus? One of the reasons I bought the MSI is that it has a great stock cooler with plenty of VRM cooling, and it should ("fingers crossed") get an EK or Heatkiller full-cover block in the near future.


It probably has more to do with the volume of units sold than anything else. But if the block makers had been paying attention, they would have realized not many people will buy the ASUS because of its poor cooler; then again, maybe that does translate into getting blocks for most of them. IDK. It seems like a risky proposition from the block makers' standpoint, though. The MSI seems a better bet to sell more blocks, but again, my bet is it comes down to sales volume in the end. I would hope they did their homework before manufacturing blocks only for the ASUS. I'm not sure though; I'm no expert on these sorts of things, but I did take a few economics classes in my time.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Why would anyone run Heaven on anything but Extreme preset anyway? lol at "Custom" preset.


You have to run Heaven in "CUSTOM" to run it in 1080.....


----------



## simonfredette

Right, and full screen. Plus I prefer to run it the way I run my games, so I get a more accurate representation of the temps and fps I'll have during gaming. I turn tessellation and AA off while I game; I'd rather have 80-100 fps without them than 50-70 with them, as I find the fps makes a more noticeable difference than the settings do. That being said, like Nameless1988 said, I was comparing apples to oranges when I didn't notice his settings were different; we obviously can't compare scores unless we run the same settings.


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> You have to run Heaven in "CUSTOM" to run it in 1080.....


Extreme preset is 1080p? No?
Or was it Extreme HD? That's what people use for comparison purposes anyway; I haven't seen anyone use anything else yet, apart from higher resolutions of course.

EDIT: Oh wait Heaven, not Valley, derp.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Extreme preset is 1080p? No?
> Or was it Extreme HD? What people use at all times anyway for comparison purposes, haven't seen anyone use anything else yet apart from higher resolutions ofcourse.
> 
> EDIT: Oh wait Heaven, not Valley, derp.


^^Got it


----------



## Nameless1988

Quote:


> Originally Posted by *THUMPer1*
> 
> Mine's 1599 at the same settings, 1150/1650. For anyone who thinks the RAM runs better at 1625: it doesn't in Heaven for me. 1625 results in a lower score.


Great score, mate! That score of mine (58.4 fps avg) is at stock clocks. My XFX won't run at 1150MHz on stock voltage; I need to increase the voltage for 1150MHz.


----------



## THUMPer1

Quote:


> Originally Posted by *Nameless1988*
> 
> Great score, mate! That score of mine (58.4 fps avg) is at stock clocks. My XFX won't run at 1150MHz on stock voltage; I need to increase the voltage for 1150MHz.


Me too, but I only add +30mV core voltage. It's not much.


----------



## diggiddi

Quote:


> Originally Posted by *patriotaki*
> 
> I got two new driver updates today when I upgraded to Win 10 Pro.
> 
> One is WHQL and one is beta. Should I download them?


Yes WHQL if you're not comfortable with Beta


----------



## Regnitto

I don't have the required proof to join the club at the moment (no internet other than phone data atm, and the phone doesn't like to post pics on here), but I picked up an ASUS Strix 390X a few weeks ago.

I'm making this post to sub to the thread so that I can find it easier when I can post my proof.

Also, does anyone have issues with the PC crashing when running Ashes of the Singularity in DX12 while simultaneously running VLC media player on a second monitor? I had this problem yesterday.


----------



## m70b1jr

http://www.overclock.net/t/1593185/xfx-r9-390-voltage-mod


----------



## momo254

First-timer here. Just got a pair of XFX 390Xs and I must say I'm impressed. Here are my results in Heaven at stock settings. For some reason the score seems low for this crossfire setup.


----------



## Stige

Why are people running Heaven here instead of Valley?


----------



## mus1mus

Does it matter?


----------



## Nameless1988

Quote:


> Originally Posted by *Stige*
> 
> Why are people running Heaven here instead of Valley?


Because Heaven is heavier than Valley.


----------



## patriotaki

How much OC can you get with no extra voltage on the R9 390? (Please include which R9 390 you have.)
I can get 1050-1060MHz on my PCS+ with no extra voltage. That's not so good, is it?


----------



## Nameless1988

Quote:


> Originally Posted by *patriotaki*
> 
> How much OC can you get with no extra voltage on the R9 390? (Please include which R9 390 you have.)
> I can get 1050-1060MHz on my PCS+ with no extra voltage. That's not so good, is it?


As you can see, I have an XFX Black Edition (runs at 1050MHz, stock clock).

Running Unigine Heaven I can get 1120MHz on the core (stable, no artifacts) without overvolting.

Usually I do not overclock the VRAM because there is no gain in it (unlike overclocking the VRAM on Maxwell).

----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> How much OC can you get with no extra voltage on the R9 390? (Please include which R9 390 you have.)
> I can get 1050-1060MHz on my PCS+ with no extra voltage. That's not so good, is it?


Actually, that's really good. I have a PCS+ and I have to pump +44mV to get 1085/1625, +63mV to get 1110/1625, and +100mV to get 1150/1625.

You got a better one than me, sir.
Quote:


> Originally Posted by *Nameless1988*
> 
> Usually I do not overclock the VRAM because there is no gain in it (unlike overclocking the VRAM on Maxwell).


Actually, there are benefits to overclocking the memory on the Hawaii chips. You just have to land in the correct memory timing strap.


----------



## patriotaki

I can get 1100MHz / 1700MHz with +56mV with no artifacts.

Is it better to have a lower memory clock?


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> I can get 1100MHz / 1700MHz with +56mV with no artifacts.
> 
> Is it better to have a lower memory clock?


Here are your timings:

Strap end 400MHz (40 9C 00) , Range = 150-400MHz
Strap end 800MHz (80 38 01) , Range = 401-800MHz
Strap end 900MHz (90 5F 01) , Range = 801-900MHz
Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
Strap end 1125MHz (74 B7 01) , Range = 1001-1125MHz
Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
Strap end 1750MHZ (98 AB 02) , Range = 1626-1750MHz

You want to have 1625 or 1750, don't run anything in between, it's useless. Try running your memory at 1625 and I bet you will get a better result. Set it to 1750 and you will get an even better result.
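As an aside, the three-byte markers decode consistently if you read them as little-endian integers in units of 10 kHz. A quick sketch to check (the unit interpretation is my inference from the values above, not from an official BIOS spec):

```python
# Decode the 3-byte strap-end markers from the table above.
# Assumption: each marker is a little-endian integer in 10 kHz units.
def strap_mhz(marker_hex: str) -> float:
    raw = bytes.fromhex(marker_hex.replace(" ", ""))
    return int.from_bytes(raw, "little") / 100  # 10 kHz units -> MHz

for marker in ("40 9C 00", "F0 49 02", "C4 7A 02", "98 AB 02"):
    print(marker, "->", strap_mhz(marker), "MHz")
# 40 9C 00 -> 400.0 MHz, F0 49 02 -> 1500.0 MHz,
# C4 7A 02 -> 1625.0 MHz, 98 AB 02 -> 1750.0 MHz
```

Handy if you want to sanity-check a strap marker you spot in a BIOS dump before touching anything.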


----------



## Slowpoke66

Quote:


> Originally Posted by *patriotaki*
> 
> How much OC can you get with no extra voltage on the R9 390? (Please include which R9 390 you have.)
> I can get 1050-1060MHz on my PCS+ with no extra voltage. That's not so good, is it?


1100-1625 w/ Voltage 0+mV, Power Limit 50+, bench- & BF4-stable (MSI 390 Gaming in CF)


----------



## jdorje

1090MHz, XFX 8256 390. Stock is 1225mV.

Or you could just look at the spreadsheet in the first post.


----------



## TsukikoChan

Quote:


> Originally Posted by *m70b1jr*
> 
> Maybe overclocking does make a difference
> 
> Stock
> 
> 
> OC'd


I'm being dumb here, but what all did you OC there to get your 8350 above that? I've played with my 8350 and can only get a stable 4.5GHz without going over the upper temp limit (a Noctua NH14 can only go so far ;_; ).


----------



## mus1mus

Get a 1429, 1432, or 1433 batch FX. They OC a lot on low voltages. See mine in my sig; it's a batch 1432PGY.


----------



## patriotaki

Damn. On my PCS+ 390 I get some artifacts at 1080MHz, +23mV, 1625MHz memory and power limit +50%.

Also, in GTA V I get low fps: 40-50 fps with everything maxed out on ultra.


----------



## TsukikoChan

Quote:


> Originally Posted by *mus1mus*
> 
> Get a 1429, 1432, or 1433 batch FX. They OC a lot on low voltages. See mine in my sig; it's a batch 1432PGY.


How do I find out what batch I got? I bought this over a year ago, and I won't be changing it until the new Zens come out, so if it's a bad batch there's not much I can do about it :-(


----------



## jdorje

If "all ultra" includes MSAA, well, disable that. I get 60-80 fps at 1440p on roughly very high settings. But you should be benchmarking with Unigine Valley.


----------



## TsukikoChan

Also, a question: is upping the aux voltage that important for OCing? I mainly use Trixx for OC (which doesn't have aux on it) and I can get 1620MHz on my memory without needing extra aux. Am I missing something? Is aux necessary to push past the 1625 strap? Does it help VRM or core speeds/stability?


----------



## jdorje

Aux (system agent) voltage does nothing for me.


----------



## CommunistSquare

Got my MSI R9 390X a few days ago (RMA'd a Lightning 290X, which was barely stable at stock clocks)...




I think I have a pretty damn good overclocker on my hands: no additional voltage, +50% power limit, got to 1150/1700, Firestrike stable. Haven't tested any games yet, but hopefully the stability persists.
Temperatures are fantastic; haven't broken ~74°C yet on default fan settings.


----------



## legendary2020

Hi

count me in



http://www.3dmark.com/3dm/11038385


----------



## Agent Smith1984

Quote:


> Originally Posted by *legendary2020*
> 
> Hi
> 
> count me in
> 
> 
> 
> http://www.3dmark.com/3dm/11038385


Please use a GPU-Z validation link with your name, or even your FireStrike with username listed, and I will get you added.

You can also post a pic of card with your username somewhere in the picture.

Thanks


----------



## CommunistSquare

http://www.3dmark.com/fs/7753865

Please add


----------



## legendary2020

Sorry, I didn't know. Here is my validation link:

http://www.techpowerup.com/gpuz/details.php?id=guymn


----------



## Agent Smith1984

BOTH ADDED!!


----------



## fenixfox

Here's my MSI R9 390 overclock:

1150MHz core, 1700MHz memory, all on +38mV core and +50% power limit.


----------



## lyricyst2000

Quote:


> Originally Posted by *Regnitto*
> 
> I don't have the required proof to join the club at the moment (no internet other than phone data atm, and the phone doesn't like to post pics on here), but I picked up an ASUS Strix 390X a few weeks ago.
> 
> I'm making this post to sub to the thread so that I can find it easier when I can post my proof.
> 
> Also, does anyone have issues with the PC crashing when running Ashes of the Singularity in DX12 while simultaneously running VLC media player on a second monitor? I had this problem yesterday.


I had that exact issue with Fallout 4, so I doubt it's specific to DX12 games. Probably an issue with VLC itself.


----------



## unkletom

Getting 160 points more on graphics in Firestrike Extreme after doing the memory timings mod in a hex editor. Not bad for a 10-minute job.









Here is the video on how to do it: 




Download Hex Workshop, then mark the straps like the guy in the video did. The hex numbers should be the same for everyone. Then copy-paste your own timings from the lower strap into the higher straps.
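For the curious, the copy-paste step amounts to something like the hedged sketch below. The strap markers are the hex triplets quoted earlier in the thread; the idea that a fixed-length timing block immediately follows each marker, and the 48-byte length, are my assumptions for illustration only, so verify the layout against your own dump (and keep a backup of the original ROM) before flashing anything:

```python
# Hedged sketch of the "copy lower-strap timings into a higher strap" mod.
# ASSUMPTIONS (not verified against a real VBIOS): the timing block directly
# follows the 3-byte strap marker, and is TIMING_LEN bytes long.
TIMING_LEN = 48  # hypothetical length; check against your own BIOS dump

def copy_strap_timings(rom: bytearray, src_marker: bytes,
                       dst_marker: bytes, timing_len: int = TIMING_LEN) -> None:
    """Copy the timing bytes after src_marker over those after dst_marker."""
    src = rom.find(src_marker)
    dst = rom.find(dst_marker)
    if src < 0 or dst < 0:
        raise ValueError("strap marker not found in ROM image")
    s = src + len(src_marker)
    d = dst + len(dst_marker)
    rom[d:d + timing_len] = rom[s:s + timing_len]

# Example: put the 1500MHz strap timings onto the 1625MHz strap:
# rom = bytearray(open("bios.rom", "rb").read())
# copy_strap_timings(rom, bytes.fromhex("F04902"), bytes.fromhex("C47A02"))
```

After editing you would still need to take care of the BIOS checksum and flash the ROM back, which is outside this sketch.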


----------



## patriotaki

Guys, I think my OC gets worse and worse as time passes.
A week ago I could go up to 1100MHz / 1700MHz at +56mV with no artifacts;
now with the same settings I get artifacts.

Is it possible that I could have broken my card?

I get 90-110 fps in BF4, and sometimes it drops to 75-77 fps.

My friend with a GTX 970 gets way better fps than me.

The 390 should be better, am I right?


----------



## Regnitto

Quote:


> Originally Posted by *lyricyst2000*
> 
> I had that exact issue with Fallout 4, so I doubt it's specific to DX12 games. Probably an issue with VLC itself.


Thanks. I haven't had the problem with fallout or any other game, but I'll keep an eye out for it to happen with any other games


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> Guys, I think my OC gets worse and worse as time passes.
> A week ago I could go up to 1100MHz / 1700MHz at +56mV with no artifacts;
> now with the same settings I get artifacts.
> 
> Is it possible that I could have broken my card?
> 
> I get 90-110 fps in BF4, and sometimes it drops to 75-77 fps.
> 
> My friend with a GTX 970 gets way better fps than me.
> 
> The 390 should be better, am I right?


High VRM temps?


----------



## patriotaki

I ran FurMark, and with +40MHz on the core clock I was seeing black artifacts... terrible OC.


----------



## tolis626

Quote:


> Originally Posted by *patriotaki*
> 
> I ran FurMark, and with +40MHz on the core clock I was seeing black artifacts... terrible OC.


The offset applied in AB means nothing. What really matters is the actual voltage (VDDC) applied to your card under load. If your card is one of those with low DPM7 voltage, then you're going to need higher offsets for overclocking. Your OC isn't worse, you just need to apply a larger offset to reach the same voltage as someone with a higher voltage. For example, my card has a DPM7 voltage of 1.275V, but there are cards that have 1.225V. Maybe your card is among the latter (Check with The Stilt's EVV tool). No need to panic.

Also, artifacts could have always been there with these same settings without you noticing them. Happens to me too.


----------



## legendary2020

Coming from a dead GTX 980 to the R9 390X, I would say there's no difference in fps in many games. The best OC I managed to get is 1160/1625 at +50, no problems at all. Also, you guys need to try the new 16.2.1 driver update; it improved my GTA V fps.


----------



## Irked

I can't even boot on 16.2.1, instant black screen.


----------



## rdr09

Quote:


> Originally Posted by *Irked*
> 
> I can't even boot on 16.2.1, instant black screen.


I'll be irk too.

http://www.overclock.net/t/988215/how-to-remove-your-amd-gpu-drivers-new-2016

Skip Evolved for now and, if you can, do not let Radeon Settings be part of startup programs in Windows. I go to msconfig prior to reboot and uncheck it.


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> I ran *furmark* I gave to my card +40Mhz core clock
> 
> And I was seeing black artifacts..terrible OC


Stop using FurMark, it doesn't do you any good, it's a useless piece of software...


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> Stop using FurMark, it doesn't do you any good, it's useless piece of software...


OK, which one should I use?


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> OK, which one should I use?


Games that you actually play? I guess FireStrike/Heaven are OK for quick testing, but use games in the long run.


----------



## Irked

Quote:


> Originally Posted by *rdr09*
> 
> I'll be irk too.
> 
> http://www.overclock.net/t/988215/how-to-remove-your-amd-gpu-drivers-new-2016
> 
> Skip Evolved for now and, if you can, do not let Radeon Settings be part of startup programs in Windows. I go to msconfig prior to reboot and uncheck it.


I'm running 16.1.1, which works fine for now. Looking at the AMD forums, I'm not the only one with the issue.


----------



## legendary2020

Hmm, interesting, I have 16.2.1 with no problems at all... it's all improvements, more FPS on my MSI R9 390X.


----------



## Stige

No issues here either with the 16.2.1.


----------



## TsukikoChan

gonna install it tonight


----------



## Stige

Fooled around with FireStrike yesterday again, improved my score a bit: http://www.3dmark.com/3dm/11042772
This was at 1225/1650.

Previous score was 11683.


----------



## patriotaki

I get artifacts in most games when I don't add voltage... the highest I can go with no extra voltage is 1050MHz with the power limit at +50%.
Maybe my power supply? It's an old Super Flower 650W... maybe with a better PSU I could get a higher OC?


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> I get artifacts on most games when I don't use voltage ..the highest I can go with no voltage is 1050mhz and power limit +50%
> Maybe my power supply? It's an old super flower 650watt..maybe with a better PSU I can get higher OC?


Which card and what are your VRM temps?


----------



## uszpdoz

Quote:


> Originally Posted by *patriotaki*
> 
> I get artifacts on most games when I don't use voltage ..the highest I can go with no voltage is 1050mhz and power limit +50%
> Maybe my power supply? It's an old super flower 650watt..maybe with a better PSU I can get higher OC?


I'm currently using a Corsair HX650W for my MSI R9 390. The most stable OC I can get without increasing voltage or power limit is 1120/1650, but beyond that anything from +10 to +100mV still gives me artifacts... it must be my card's limit or the PSU (VRM and other temps are around 75-77C).


----------



## tolis626

Quote:


> Originally Posted by *patriotaki*
> 
> I get artifacts on most games when I don't use voltage ..the highest I can go with no voltage is 1050mhz and power limit +50%
> Maybe my power supply? It's an old super flower 650watt..maybe with a better PSU I can get higher OC?


Dude, you've been asking over, and over, and over, and over, and over again while ignoring most advice given to you. Take a step back and listen to what others have to say before trying things randomly without actually knowing what you're doing.

First off, uninstall Furmark. It's useless and only stresses your card for no reason. Test for stability with games (The harder they are to run, the better. Witcher 3, I've found, is the best for that)

Secondly, undo any overclocking (Just press the reset button in Afterburner or whatever) and disable any overclocking software (don't let them run at boot), then reboot and run The Stilt's VID app and post what it reports as your DPM7 voltage (Or even better, post a screenshot of it). If we don't know that, any offset you apply is useless, really.

Third, install GPU-z and give us your VRM temps and actual VDDC under load. You could also have it record everything (In the bottom of the sensors tab - also have it refresh the sensors every second instead of 2.5s or so that it is by default) and then zip and upload the log here so we can see what's going on. Or, again, even better would be to post screenshots.

I'm willing to bet your card is a low DPM7 one and you just need a bigger offset. But you can't just try random things and expect it to work. Help us help you.
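If anyone wants to skim one of those GPU-Z logs without scrolling through thousands of rows, here's a rough sketch in Python of pulling the peak VRM temperature and VDDC out of one. The column headers (`VDDC [V]`, `VRM Temperature [°C]`) are assumptions; GPU-Z's exact names vary by card and version, so adjust the matching to whatever your log actually contains.

```python
import csv
import io

def summarize_log(text):
    """Return (max VRM temp, max VDDC) seen in a GPU-Z sensor log.

    GPU-Z writes comma-separated values with padded spaces; the
    column names matched here ("VRM", "VDDC") are assumptions.
    """
    reader = csv.reader(io.StringIO(text))
    header = [h.strip() for h in next(reader)]
    vrm_i = next(i for i, h in enumerate(header) if "VRM" in h)
    vddc_i = next(i for i, h in enumerate(header) if "VDDC" in h)
    vrm_max, vddc_max = float("-inf"), float("-inf")
    for row in reader:
        if len(row) <= max(vrm_i, vddc_i):
            continue  # skip truncated lines at the end of the log
        try:
            vrm_max = max(vrm_max, float(row[vrm_i]))
            vddc_max = max(vddc_max, float(row[vddc_i]))
        except ValueError:
            continue  # skip repeated header lines or blank fields
    return vrm_max, vddc_max

# Tiny fabricated log just to show the shape of the output:
sample = (
    "Date , GPU Clock [MHz] , VDDC [V] , VRM Temperature [°C] ,\n"
    "2016-03-01 12:00:00 , 1040 , 1.211 , 71 ,\n"
    "2016-03-01 12:00:01 , 1040 , 1.219 , 74 ,\n"
)
print(summarize_log(sample))  # -> (74.0, 1.219)
```

Screenshots are still easier for everyone else to read, but this at least tells you instantly whether the VRMs ever got anywhere near throttling range.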


----------



## legendary2020

http://www.3dmark.com/3dm/11038385

I couldn't go higher than 11230 with my R9 390X... The CPU should be powerful enough, as it's the highest CPU on LGA 1155 (Xeon E3-1290 v2 @ 4.10GHz), and yeah, it's faster than a 3770, plus it's overclocked.


----------



## rdr09

Quote:


> Originally Posted by *legendary2020*
> 
> http://www.3dmark.com/3dm/11038385
> 
> I couldn't go higher than 11230 with my R9 390X ...Cpu should be powerful enough as it's the highest cpu on LGA 1155 ( xeon E3 1290 v2 4.10Ghz )and yah its faster than 3770 plus its overclocked


I think you're running your RAM in single channel with that 2GB stick.


----------



## legendary2020

The 4 and 2 are actually in dual channel... I don't know if that extra 2GB is affecting the performance of the 8GB.


----------



## rdr09

Quote:


> Originally Posted by *legendary2020*
> 
> The 4x2 actually in dual channel ...I don't know if that extra 2GB is effecting the performance of the 8GB


It might not be a big deal, but I think so. Try removing the 2GB and installing the 4GB in its place without messing up your system.









Run FS again using the same settings and see if your graphics score goes up. If you care. If not, just ignore.


----------



## patriotaki

Quote:


> Originally Posted by *tolis626*
> 
> Dude, you've been asking over, and over, and over, and over, and over again while ignoring most advice given to you. Take a step back and listen to what others have to say before trying things randomly without actually knowing what you're doing.
> 
> First off, uninstall Furmark. It's useless and only stresses your card for no reason. Test for stability with games (The harder they are to run, the better. Witcher 3, I've found, is the best for that)
> 
> Secondly, undo any overclocking (Just press the reset button in Afterburner or whatever) and disable any overclocking software (don't let them run at boot), then reboot and run The Stilt's VID app and post what it reports as your DPM7 voltage (Or even better, post a screenshot of it). If we don't know that, any offset you apply is useless, really.
> 
> Third, install GPU-z and give us your VRM temps and actual VDDC under load. You could also have it record everything (In the bottom of the sensors tab - also have it refresh the sensors every second instead of 2.5s or so that it is by default) and then zip and upload the log here so we can see what's going on. Or, again, even better would be to post screenshots.
> 
> I'm willing to bet your card is a low DPM7 one and you just need a bigger offset. But you can't just try random things and expect it to work. Help us help you.






GPU-ZSensorLog.txt 236k .txt file


----------



## tolis626

Quote:


> Originally Posted by *patriotaki*
> 
> 
> 
> 
> 
> GPU-ZSensorLog.txt 236k .txt file


See? That wasn't so hard. AND I was right. My card is a 1.275V DPM7 chip. What that means is that when we both apply the same offset, say +50mV in Afterburner, I'm going to get a higher voltage (not a full 50mV more, because of vdroop, but higher nonetheless). In fact, your card's voltage at +100mV is going to be closer to mine at +50mV. So just push some more volts in there, just watch out for the temps, especially the VRMs.

You might also want to make a custom, more aggressive fan curve to keep things nice and cool, but I digress. I would have it top out at something like 50-60% with the PCS, but that's just me.

EDIT : Now that I'm thinking about it, under load I'm a full 100mV higher than you when at +50mv. So I say go straight to +100mV and don't look back. And certainly don't be afraid. Also, don't forget to set your power limit to +50%.
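For what it's worth, a custom fan curve like that is just piecewise-linear interpolation between (temperature, fan %) points. A minimal sketch of the math an Afterburner-style curve editor applies; the breakpoints here are made up for illustration, not a tuned recommendation:

```python
def fan_speed(temp_c, curve=((40, 20), (60, 35), (75, 55), (90, 100))):
    """Piecewise-linear fan curve: returns fan duty % for a GPU temp.

    `curve` is a sorted sequence of (temperature C, fan %) breakpoints;
    the values above are illustrative placeholders.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(40))    # -> 20
print(fan_speed(67.5))  # -> 45.0
```

The point of making the curve steeper in the 60-75C range is exactly what was described above: more airflow before the VRMs get hot, without the fans screaming at idle.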


----------



## patriotaki

So I'm basically safe with +100mV? I have a custom fan curve and I can control the fans in my case.
How should I overclock it now? 1100MHz core, 1725MHz memory and +100mV, will that be okay?


----------



## tolis626

Quote:


> Originally Posted by *patriotaki*
> 
> so im basically safe with +100mV? i have a custom fan curve and i can control the fans on my case.
> how can i overclock it now? 1100Mhz ,1725MHz memory and +100mV will it be okay?


Woah, slow down there. Leave the memory at stock for now, you have to finish overclocking the core first. Try increasing in 25MHz steps at first. Then, when you start getting artifacts, go back to your previously stable overclock and add 5-10MHz at a time. See where that gets you. Then, when you're all done with the core, try overclocking the memory. There's a good chance you might need to increase AUX voltage when pushing the memory farther, but don't go over +50mV on it to be totally safe.

I'd expect you'd be able to hit at least 1125MHz, maybe even higher. But take your time to do it and do it right. Just watch and write down your VRM temps and post back when you have results. Good luck!
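The coarse-then-fine search above amounts to a simple loop. As a sketch only: `is_stable` here is a stand-in for a real test pass (a demanding game plus watching for artifacts), which obviously can't be reduced to a lambda in practice.

```python
def find_max_clock(start_mhz, is_stable, coarse=25, fine=5, limit=1500):
    """Coarse-then-fine search for the highest stable core clock.

    Walks up in `coarse` MHz steps until `is_stable` fails, backs off
    to the last good clock, then creeps up in `fine` MHz steps.
    `is_stable` is a placeholder for a real stability test run.
    """
    clock = start_mhz
    # coarse pass: big steps until the first failure
    while clock + coarse <= limit and is_stable(clock + coarse):
        clock += coarse
    # fine pass: small steps from the last known-good clock
    while clock + fine <= limit and is_stable(clock + fine):
        clock += fine
    return clock

# Pretend this particular card starts artifacting above 1137 MHz:
print(find_max_clock(1040, lambda mhz: mhz <= 1137))  # -> 1135
```

Same idea applies to the memory afterwards; the only difference is what "unstable" looks like (artifacts and score drops rather than outright crashes).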


----------



## patriotaki

Quote:


> Originally Posted by *tolis626*
> 
> Woah, slow down there. Leave the memory at stock for now, you have to finish overclocking the core first. Try increasing in 25MHz steps at first. Then, when you start getting artifacts, go back to your previously stable overclock and add 5-10MHz at a time. See where that gets you. Then, when you're all done with the core, try overclocking the memory. There's a good chance you might need to increase AUX voltage when pushing the memory farther, but don't go over +50mV on it to be totally safe.
> 
> I'd expect you'd be able to hit at least 1125MHz, maybe even higher. But take your time to do it and do it right. Just watch and write down your VRM temps and post back when you have results. Good luck!


How much voltage should I add? Anything above +60-70mV and my PC crashes... RSOD


----------



## Nameless1988

There is no point in overclocking the VRAM: useless. VRAM at 1650/1700MHz would decrease general performance and framerate in your benchmark.
Avoid overclocking the VRAM; focus on the core clock and overclock that for real gains in performance/framerate!!


----------



## mus1mus

Quote:


> Originally Posted by *Nameless1988*
> 
> There is no point into Overclocking vram: useless. Vram at 1650 / 1700 mhz would decrease general performance and framerate in your benchmark.
> Avoid overclock on Vram, focuse yourself on the core clock and overclock it for real gain in performance/framerate!!


are you sure?

coz I can show you the difference.









Maybe you have not seen me clock the core. And score less than the other guys.


----------



## m70b1jr

Quote:


> Originally Posted by *Nameless1988*
> 
> There is no point into Overclocking vram: useless. Vram at 1650 / 1700 mhz would decrease general performance and framerate in your benchmark.
> Avoid overclock on Vram, focuse yourself on the core clock and overclock it for real gain in performance/framerate!!


Any reason why it would decrease? I'm running 1750 on my mem speeds.


----------



## tolis626

Quote:


> Originally Posted by *patriotaki*
> 
> how much voltage to add? anything above 60-70 my pc crashes... RSOD


At the same clocks, it crashes just from adding voltage? If so, check your VRM temps as it happens. I don't think it can crash at those voltages though, it seems highly unlikely. Some of us are pushing +200mV without that happening.

I cannot stress enough, however, that you should put your memory at stock in the beginning.
Quote:


> Originally Posted by *Nameless1988*
> 
> There is no point into Overclocking vram: useless. Vram at 1650 / 1700 mhz would decrease general performance and framerate in your benchmark.
> Avoid overclock on Vram, focuse yourself on the core clock and overclock it for real gain in performance/framerate!!


I'm sorry, but that's complete BS. Sure, core overclocking yields better results, but memory overclocking has a very real effect on performance too. Not as much as it does on Maxwell cards, but enough. 1650MHz uses the 1750MHz timings but performs close to 1625MHz (timings are tighter at or below 1625MHz). But even 1700MHz improves performance. By how much, and whether it's worth it, is another story, but the results don't lie.


----------



## patriotaki

In BF4, 1500MHz mem and 1090MHz core clock with +50mV is the highest I can get,

with some rare artifacts.

1080MHz core clock is stable with +50mV.


----------



## Stige

Quote:


> Originally Posted by *tolis626*
> 
> At the same clocks it crashes just from adding voltage? If so, check your VRM temps as it happens. I don't think it can crash at those voltage though, seems highly unlikely. Some of us are pushing +200mV when that happens.
> 
> I cannot stress enough, however, that you should put your memory at stock in the beginning.
> I'm sorry, but that's complete BS. Sure, core overclocking yields better results, but memory overclocking has a very real effect on performance too. Not as much as it does on Maxwell cards, but enough. 1650MHz uses 1750MHz timings but is close to 1625MHz (Tighter timings at < or = 1625MHz). But even 1700MHz improves performance. By how much and is it worth it is another story, but the results don't lie.


Memory overclocking offers no real benefit and very little benefit even in benchmarks, so you are the one who is wrong here claiming this.

Best boost you can get out of memory is to use tighter timings, that's it really.
Push your core to max you can and then just put your memory to whatever, with tightened timings.

VDDCI has no impact on anything; I run my VDDCI at -75mV and _MAYBE_ noticed slightly fewer artifacts in FireStrike when I ran the card at its limits.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Memory overclocking offers no real benefit and very little benefit even in benchmarks, so you are the one who is wrong here claiming this.
> 
> Best boost you can get out of memory is to use tighter timings, that's it really.
> Push your core to max you can and then just put your memory to whatever, with tightened timings.
> 
> VDDCI has no impact on anything, I run my VDDCI at -75mV and _MAYBE_ noticed a little less artifacts in Firestrike when I ran the card at it's limits.


Memory clock can make a pretty big difference. My scores climb from a starting point of 1750 all the way to 1836. 1792 is the sweet spot that gives me the best scores with no artifacts, but I can push it higher, and still gain frames.

I don't know where you got the idea that memory clocks don't need to be adjusted, just set the timings tighter, cause.....someone that's setting their memory higher, is gonna beat your score. lol


----------



## patriotaki

1120MHz core clock
1600MHz mem
+100mV
+50% power limit
Very rare artifacts


----------



## Agent Smith1984

Actually it depends.....

I have seen very nice improvements when overclocking VRAM heavily at 4k, but at 1080/1440 it made much less of an impact.


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> Memory clock can make a pretty big difference. My scores climb from a starting point of 1750 all the way to 1836. 1792 is the sweet spot that gives me the best scores with no artifacts, but I can push it higher, and still gain frames.
> 
> I don't know where you got the idea that memory clocks don't need to be adjusted, just set the timings tighter, cause.....someone that's setting their memory higher, is gonna beat your score. lol


It makes zero difference in games and very little in benchmarks. For gaming purposes, there is no reason to touch it, just put it at whatever is stable and leave it there, any card should run 1600-1650 fine.


----------



## mus1mus

Eiii. Okay, so explain this: http://www.3dmark.com/fs/7738856


----------



## m70b1jr

Personally I've noticed higher scores from raising my VRAM clocks... It helped me break the 10,000 mark in FireStrike.
1180/1750, but when I get home I'm gonna try 1775 or 1800.


----------



## Agent Smith1984

To say "zero" difference is not true..... it does make a difference, especially at high resolution.... it's just that, it doesn't make as big of a difference as core clocking.

You can clock the core and VRAM 10% over and get a 6-9% performance gain...... you can clock the core only 10%, and get 4-6% gain, while just OCing the memory would only net around 2-3%

You can also do something like clock the core 10% over and the memory 15-20% over, and still only get a ~10% gain, so there is a range in which it helps some, but after that it has little to no impact at all.
I've tested it quite a bit with various 290's and 390's.

It's most effective in benchmarks, or at high resolution (especially as you pack on the settings); however, a single Hawaii's raw horsepower isn't enough to ever saturate the bandwidth these cards offer. Once you add a second core in CrossFire, you can really make use of that stout frame buffer.

Fiji, in its own right, isn't the monster I was hoping for either, but when using mine, I still saw performance increase at 4K when overclocking the HBM, and the bus width is overkill on that thing, so it can help some.
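As a back-of-envelope way to read those numbers: roughly half of a core overclock percentage and a quarter of a memory overclock percentage show up as FPS. The scale factors below are loosely fitted to the figures in this post, are purely illustrative, and stop holding once the core can no longer use the extra bandwidth.

```python
def expected_gain(core_pct, mem_pct, core_scale=0.5, mem_scale=0.25):
    """Rough FPS gain estimate from core/memory overclock percentages.

    The scale factors are a loose fit to the 290/390 observations
    quoted above (10% core -> ~5%, 10% memory -> ~2.5%), not a
    general law; real gains flatten past the bandwidth sweet spot.
    """
    return core_pct * core_scale + mem_pct * mem_scale

print(expected_gain(10, 10))  # -> 7.5 (%), inside the observed 6-9% band
print(expected_gain(10, 0))   # -> 5.0 (%), core-only overclock
```

Which is the same "core first, memory second" rule in numeric form: the core term dominates, but the memory term is not zero.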


----------



## Vellinious

I've done a lot of testing using graphics tests 1 and 2 in FireStrike. I started at 1625 and did 20MHz bumps each run, and each run the score got a little better until I maxed out at 1836.

Tess tweaks high:
http://www.3dmark.com/fs/7691621

No tess tweaks high:
http://www.3dmark.com/fs/7701597

The scores get increasingly lower the lower I set the memory. Maybe my timings aren't tight enough, but... saying memory clocks make zero difference for gaming and very little difference for benchmarks is just kinda silly.


----------



## Stige

You haven't posted any comparisons with only the memory clocks changed. What has tessellation cheating got to do with this?


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> You haven't posted any comparisons with only memory clocks changed. What has tesselation cheating got anything to do with this?


People use both with tess and without. Just adding them for comparison purposes to other's scores.....have you hit graphics scores that high with your memory at 1600-1650? I don't know many that have....even on Fyzzz's fastest runs, he's pushing above 1750....

I'll dig through and try to find some scores with the memory at 1750 and core clocks the same, but I usually just erase the test runs. If you want, I can run them again tonight to show you.....


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> People use both with tess and without. Just adding them for comparison purposes to other's scores.....have you hit graphics scores that high with your memory at 1600-1650? I don't know many that have....even on Fyzzz's fastest runs, he's pushing above 1750....
> 
> I'll dig through and try to find some scores with the memory at 1750 and core clocks the same, but I usually just erase the test runs. If you want, I can run them again tonight to show you.....


Do that.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Do that.


lol, ok. I'll run some comparisons tonight.

But, what I'm hearing from you is, "mine doesn't get any gains after XXXX clock, so it must be true for everyone".


----------



## Stige

It has applied to all Radeons for a long while; they are not crap like NVidia cards are with their limited memory bandwidth, thus memory overclocking offers very little benefit.


----------



## tolis626

Quote:


> Originally Posted by *Vellinious*
> 
> People use both with tess and without. Just adding them for comparison purposes to other's scores.....have you hit graphics scores that high with your memory at 1600-1650? I don't know many that have....even on Fyzzz's fastest runs, he's pushing above 1750....
> 
> I'll dig through and try to find some scores with the memory at 1750 and core clocks the same, but I usually just erase the test runs. If you want, I can run them again tonight to show you.....


I was meaning to ask, what settings (voltages, BIOS mods, etc.) are you using to get these insane clocks? I mean both core and memory, but I'm mostly interested in the memory aspect. Is your card just a golden sample or what? Also, what settings do you use daily?

Thanks!
Quote:


> Originally Posted by *Stige*
> 
> It has applied to all Radeons for a long while, they are not crap like NVidia cards are with limited memory bandwidth, thus the memory overclocking offers very little benefits.


And those "crap" NVidia cards are just as fast or faster while consuming less power. So they can also be used in laptops without causing a meltdown. Guess whose strategy worked out best.

Anyway, I do love AMD GPUs and they work exceptionally well on the desktop, but that's about it. And part of the problem is those wide memory buses. I hope HBM solves both the power consumption and bandwidth problems in one swoop. If so, I'll be happy.

Also, yes, NVidia GPUs tend to show larger performance benefits from memory overclocking, but saying that Radeons don't benefit from it is far from the truth. True, they don't need it so much. True, it never makes the difference between playable and unplayable. But whether it increases performance or not isn't debated. It does, no matter if we like it or not. I sure would like to believe that memory clocks don't matter because my card's memory doesn't clock well, but I'd be fooling myself.


----------



## Agent Smith1984

I overclocked the VRAM on my wife's 7870 from 1200 to 1500 (a 25% increase) and the overall performance increase is around 5-7%; that is from the VRAM alone.
Then adding 200MHz to the core (a 20% increase), the total performance boost jumps to 14-16% depending on the title/bench.
Overclocking only the core yields around a 10% improvement. So it all scales pretty linearly on that card.

Mind you, the 7870 has a much more limited memory bus than our cards, but the memory OC makes a difference nonetheless.

The same ol' rule will always apply. Core first, memory second..... but to say overclocking the memory has NO impact is not true.


----------



## Stige

All that matters to me is price to performance, and Nvidia doesn't offer anything in the price range of the R9 390 to match it. Power consumption is irrelevant. So are laptops; they are not meant for gaming.

And I still stand by the claim that memory overclocks on Radeons won't offer any real benefit in gaming worth mentioning; you might get 1% more performance from +200MHz, but that's it really.

EDIT: As a reply to the above, yes, of course it will make a bigger difference on bad cards compared to higher-end ones that don't have memory issues.


----------



## r_aquarii

Just bought a Gigabyte R9 390 G1. Has anyone been able to remove the cooler and apply fresh thermal paste?


----------



## Stige

+150MHz on memory increases my GPU score by about 340 in FireStrike.

EDIT: +50MHz on the core is worth 540 points, and probably a lot more performance in games as well.


----------



## Vellinious

Quote:


> Originally Posted by *tolis626*
> 
> I was meaning to ask, what settings (voltages, BIOS mods, etc) are you using to get these insane clocks? I mean both core and memory, but I'm mostly interested in the memory aspeft. Is your card just a golden sample or what? Also, what settings do you use daily?
> 
> Thanks!


Using Mus1Mus's voltage mod, and the memory was set to the 8GB 390X settings. Nothing other than that. I run the core at 1300-1330 for benchmarks...the lower the ambient temps, the higher I can run the core and have it make a difference. If my coolant goes above 21c, I have to back off the core a bit to keep it running well. I don't need to add any to the VDDCI for the memory, but adding some seems to help stability a bit.

For daily clocks I can run 1250 / 1750 and it's rock solid stable....though, most of the time, I don't mess with it, and just run it at stock clocks.

I haven't seen anything from the 390 that impresses me. All the good scores on benchmarks are coming from the 290 and 290X. /shrug


----------



## jdorje

I get a good bit higher score in valley with faster vram. To the point where I can see exactly where it starts getting errors and score drops. I haven't really looked at gaming fps...just assumed it would follow. But I do use 1440.

One thing though, with two monitors connected your vram doesn't lower its clock at idle. If I run two monitors with 1740 vram it's 70w higher at idle. So that alone is reason not to clock it high.

When you say -75 mV on vddci does that mean aux voltage?


----------



## Vellinious

Quote:


> Originally Posted by *jdorje*
> 
> I get a good bit higher score in valley with faster vram. To the point where I can see exactly where it starts getting errors and score drops. I haven't really looked at gaming fps...just assumed it would follow. But I do use 1440.
> 
> One thing though, with two monitors connected your vram doesn't lower its clock at idle. If I run two monitors with 1740 vram it's 70w higher at idle. So that alone is reason not to clock it high.
> 
> When you say -75 mV on vddci does that mean aux voltage?


The Unigine benchmarks have always LOVED memory clock...the higher the better.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> Using Mus1Mus's voltage mod, and the memory was set to the 8GB 390X settings. Nothing other than that. I run the core at 1300-1330 for benchmarks...the lower the ambient temps, the higher I can run the core and have it make a difference. If my coolant goes above 21c, I have to back off the core a bit to keep it running well. I don't need to add any to the VDDCI for the memory, but adding some seems to help stability a bit.
> 
> For daily clocks I can run 1250 / 1750 and it's rock solid stable....though, most of the time, I don't mess with it, and just run it at stock clocks.
> 
> I haven't seen anything from the 390 that impresses me. All the good scores on benchmarks are coming from the 290 and 290X. /shrug


You are right there.....

These "Grenada" cores just won't keep clocking and scaling with voltage like the original Hawaiis would.

AMD matured the manufacturing process much like they did on their CPUs, and this is the result:

old Hawaii... a good core will keep scaling with voltage and water cooling on up into the 1300s or more

new Hawaii... a good core will clock up to around 1200-1250MHz, and all the voltage in the world won't get it any further!

To compare:

old Vishera... will do over 5GHz with good cooling and 1.55V

new Vishera... will generally do 4.7-4.9GHz on 1.45V, and after that, getting further can be like pulling teeth!!!


----------



## Stige

I get less than 2 FPS more in Armored Warfare hangar with the +150MHz on Memory, goes from ~72.4 to ~74.1.

EDIT: I get more FPS at 1650 with the tighter timings than I do at 1700.


----------



## dopeonarope

MSI 390X Air/Stock


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> I get less than 2 FPS more in Armored Warfare hangar with the +150MHz on Memory, goes from ~72.4 to ~74.1.


Doesn't seem like much, but I looked at it this way.....
After overclocking my memory from 1500 to 1750, my average FPS in Crysis 3 (4K high/very high custom settings, no AA) went from around 56.7 to 58.5, and then after pushing the core to 1175, it hit a tad over 60.5.

Most of the FPS are in the mid to upper 60s, with some dips into the high 40s, but very few.

So no, it didn't make a huge difference, but it did shift my average FPS experience from below 60 to just above 60, which felt huge during actual gameplay.


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Doesn't seem like much, but I looked at it this way.....
> After overclocking my memory from 1500, to 1750 my average FPS in Crysis 3 (4k high/very high custom settings- No AA) went from around 56.7 FPS to 58.5, and then after pushing the core to 1175, that hit a tad over 60.5.
> 
> Most of the FPS are in the mid to upper 60's, with some dips in the high 40's, but very few.
> 
> So no, it didn't make a huge difference, but it did shift my average FPS experience from below 60, to just above 60, which felt huge during actual gameplay.


+1


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> +1


----------



## legendary2020

Guys, I have a problem with my card. It touches 88C with an OC and 83C without, and that's with 100% fan speed... MSI R9 390X.


----------



## Agent Smith1984

Quote:


> Originally Posted by *legendary2020*
> 
> Guys I have a problem with my card it touch 88c with OC and 83c without OC and that with 100% fan speed ....Msi R9 390x


Sounds like bad case flow to me......

Try taking the door off and see if that changes anything.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> I get less than 2 FPS more in Armored Warfare hangar with the +150MHz on Memory, goes from ~72.4 to ~74.1.
> 
> EDIT: I get more FPS at 1650 with the tighter timings than I do at 1700.


Eiii. I thought you said it doesn't make a difference at all?

In case you didn't realize, tightening timings for the memory is a form of OVERCLOCKING!


----------



## legendary2020

Well, the case is open and there are stock fans on it, plus the cables are all managed. My old GTX 980 never went above 78C, but this one scares me; in fact it hit 88C just now without an OC... I'll try to change the card position and see if I can get any improvement.


----------



## Agent Smith1984

Quote:


> Originally Posted by *legendary2020*
> 
> Well the case is opened and there's stock fans on it plus the cables all managed my old GTX 980 never went above 78c but this one make me scare infact it.hit 88c just now without OC.... I'll try to change the card position and see if I can get any improvements


Well, a 980 should never hit 78c, so that is definitely telling me there is an airflow problem in there.

My 980 @ 1526 never broke 70C, and my MSI 390X never breaks 77C with overvoltage and overclocks. And I don't use 100% fan speed.

Do you have a picture of your rig? Preferably one that shows fan directions....

Also, what are your case fans rated at, CFM-wise?


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, a 980 should never hit 78c, so that is definitely telling me there is an airflow problem in there.
> 
> My 980 @ 1526 never broke 70c, and my MSI 390x never breaks 77c with over voltage and overclocks. And I don't use 100% fan speed.
> 
> Do you have a picture of your rig? Preferably one that shows fan directions....
> 
> Also, what are you case fans rated at CFM wise?


Eh, that would depend on the overclock and the cooler type. Some of the stock air coolers weren't very good....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> Eh, that would depend on the overclock and the cooler type. Some of the stock air coolers weren't very good....


Ahh, gotcha, yeah mine actually had a good air cooler on it, so if it was reference that would make sense.

Still, 88C on the 390X, though not dangerous, seems high for the MSI version. That cooler generally works very well.


----------



## Vellinious

First thing I'd do is get some high quality pads for the memory and some good thermal paste and remount the cooler. If that doesn't fix it, then it's a problem with the card / cooler / air flow.


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> First thing I'd do is get some high quality pads for the memory and some good thermal paste and remount the cooler. If that doesn't fix it, then it's a problem with the card / cooler / air flow.


Memory doesn't really get hot at all, you need to put those thermal pads on the VRM instead of Memory.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Memory doesn't really get hot at all, you need to put those thermal pads on the VRM instead of Memory.


VRM too, yes...but memory gets hot, and the warmer it runs, the worse it runs. Keep it cool, and it'll overclock further.

You have this unnatural hate for memory lol...


----------



## tolis626

On that subject, I was thinking about buying Thermal Grizzly Kryonaut (regarded by some as the best non-conductive TIM) for the core and some Fujipoly pads for the VRMs and memory. It's gonna be a costly few degrees, it seems...


----------



## Vellinious

Quote:


> Originally Posted by *tolis626*
> 
> On that subject, I was thinking about buying Thermal Grizzly Kryonaut (regarded by some as the best non-conductive TIM) for the core and some Fujipoly pads for the VRMs and memory. It's gonna be a costly few degrees, it seems...


I use Kryonaut....awesome stuff. I highly recommend it.


----------



## tolis626

Quote:


> Originally Posted by *Vellinious*
> 
> I use Kryonaut....awesome stuff. I highly recommend it.


Awesome, yes. Also pricey. It's not available in Greece, and so far I can get it from eBay for like 20€ for 5 grams or so, or from PerformancePCs for $20 if they don't rip me off with shipping and it doesn't take forever to arrive. But it seems to be worth it. CLU performance without the dangers, and with the ease of normal TIMs? Yes please.

I'm more concerned about the Fujipoly pads though. Seems I can only find the 14W/mK ones on eBay and, of course, not locally. Sigh...


----------



## jdorje

There are 16 memory modules. Putting high-quality pads on each would cost like $30. Or, where can you get good pads cheap?


----------



## Vellinious

The comparison, per request. All done with the core at a very stable 1250 overclock. First 1625, then 1750 and then 1821 (the clocks read right). The loop was a bit warm tonight for 1836 or higher, so I didn't mess with it. But....still, gains there. Graphics test 2 responds more to memory overclocks....that's where Mus and Fyzzz had been kickin my butt.

http://www.3dmark.com/compare/fs/7769346/fs/7769361/fs/7769411#


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> There's 16 memory modules. Putting high quality pads on each would cost like $30. Or, where can you get good pads cheap?


I was thinking of getting the really good pads only for the VRMs. Sure, memory does produce heat, but not THAT much; I don't think it would make a difference anyway. What I was thinking of doing is getting the best I can for the VRMs and then some good but cheap 7 or 11W/mK pads for the memory, which I can get locally.

I suppose I need the thinnest ones, right?


----------



## jdorje

What happens if you use regular thermal paste on vrms or memory modules? To connect to a heat sink, that is.


----------



## mus1mus

Quote:


> Originally Posted by *jdorje*
> 
> What happens if you use regular thermal paste on vrms or memory modules? To connect to a heat sink, that is.


It depends on the gap between your cooler and the components. It's safe to assume that 5mm gaps cannot be bridged with paste.


----------



## diggiddi

Quote:


> Originally Posted by *tolis626*
> 
> Dude, you've been asking over, and over, and over, and over, and over again while ignoring most advice given to you. Take a step back and listen to what others have to say before trying things randomly without actually knowing what you're doing.
> 
> First off, uninstall Furmark. It's useless and only stresses your card for no reason. Test for stability with games (The harder they are to run, the better. Witcher 3, I've found, is the best for that)
> 
> Secondly, undo any overclocking (Just press the reset button in Afterburner or whatever) and disable any overclocking software (don't let them run at boot), then reboot and run The Stilt's VID app and post what it reports as your DPM7 voltage (Or even better, post a screenshot of it). If we don't know that, any offset you apply is useless, really.
> 
> Third, install GPU-z and give us your VRM temps and actual VDDC under load. You could also have it record everything (In the bottom of the sensors tab - also have it refresh the sensors every second instead of 2.5s or so that it is by default) and then zip and upload the log here so we can see what's going on. Or, again, even better would be to post screenshots.
> 
> I'm willing to bet your card is a low DPM7 one and you just need a bigger offset. But you can't just try random things and expect it to work. Help us help you.


I just checked my VIDs with the program. What do my results mean?
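Side note for anyone following the GPU-Z logging advice quoted above: once you have a sensor log, a few lines of Python can pull out the peaks instead of eyeballing thousands of rows. Rough sketch only; the column names below are assumptions, since GPU-Z's labels vary by card and version, so check the header row of your own log.

```python
import csv

# Sketch: scan a GPU-Z sensor log (CSV-style) for the peak value of each
# sensor we care about. Column names are ASSUMED -- match them to your log.
def peak_readings(csv_lines,
                  columns=("GPU Temperature [C]", "VRM Temperature [C]", "VDDC [V]")):
    peaks = {}
    for row in csv.DictReader(csv_lines):
        for col in columns:
            cell = (row.get(col) or "").strip()
            try:
                value = float(cell)
            except ValueError:
                continue  # blank or non-numeric cell, skip it
            if col not in peaks or value > peaks[col]:
                peaks[col] = value
    return peaks

# usage (file name is whatever GPU-Z saved):
# with open("GPU-Z Sensor Log.txt") as f:
#     print(peak_readings(f))
```

Posting the peaks it prints is a lot easier than zipping the whole log.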


----------



## jdorje

Two Hawaii gpus with stock voltage 1.1875V and 1.25V. That first is way below average. Asic quality really high?

Personally I'd put them in one at a time and clock them to 1050 mhz. Then edit the bios to have your desired fan curve and the minimum voltage needed for true stability (OCCT stability plus 6 mv). Then they'll both run at 1050 mhz with minimum power.
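For what it's worth, the "minimum voltage for true stability plus a margin" hunt above is basically a binary search. A toy sketch of the idea; the `is_stable` callback stands in for an actual OCCT run, and the voltage range and step are made-up illustrative numbers:

```python
def min_stable_voltage(is_stable, lo_mv=1000, hi_mv=1300, step_mv=6, margin_mv=6):
    """Binary-search the lowest voltage (mV) that passes a stability test,
    then add a small safety margin, per the post above.
    Assumes lo_mv is unstable and hi_mv is stable to begin with."""
    while hi_mv - lo_mv > step_mv:
        mid = (lo_mv + hi_mv) // 2
        if is_stable(mid):
            hi_mv = mid   # stable: the floor is at or below mid
        else:
            lo_mv = mid   # unstable: the floor is above mid
    return hi_mv + margin_mv
```

In practice each `is_stable` call is an hour of OCCT, so the search saves a lot of reboots compared to walking down one step at a time.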


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jdorje*
> 
> There's 16 memory modules. Putting high quality pads on each would cost like $30. Or, where can you get good pads cheap?
> 
> 
> 
> I was thinking of getting the really good pads only for the VRMs. Sure, memory does produce heat, but not THAT much. I don't think it would make a difference anyway. What I was thinking about doing is get the best I can for the VRMs and then get some good, but cheap 7 or 11W/mK pads for the memory, but those I can get locally.
> 
> I suppose I need the thinnest ones, right?
Click to expand...

11W/mK keeps my VRM cool even when overclocking. And 11 or even 7W/mK can certainly handle memory well. Memory doesn't produce a lot of heat, but it is sensitive to heat, so give a lot of love to the memory too.


----------



## legendary2020

Hmm, I tried new Cooler Master thermal paste with no improvement at all. I also tried running the card outside the case on a test bench for better airflow... guess what, it hit 91c. Keep in mind this is a used card, though the guy who sold it to me said it was only used for a short time. Anyway, he offered to refund me, and I'm not sure if I should buy another R9 390X or go with the GTX 980.


----------



## dopemoney

PowerColor PC R9 390. I join club now? Me be Glorious PC Master Race now?


----------



## diggiddi

Quote:


> Originally Posted by *jdorje*
> 
> Two Hawaii gpus with stock voltage 1.1875V and 1.25V. That first is way below average. Asic quality really high?
> 
> Personally I'd put them in one at a time and clock them to 1050 mhz. Then edit the bios to have your desired fan curve and the minimum voltage needed for true stability (OCCT stability plus 6 mv). Then they'll both run at 1050 mhz with minimum power.


Huh, interesting analysis. The first one is the better overclocker of the two:
ASIC 78.9%, was able to get 1230/1620 IIRC.
Not so much the second, which was only able to get to around 1180/1600 IIRC:
ASIC 76.4%.


----------



## legendary2020

Ok, after thinking about it I still want to be on team Red, so I've decided to buy a new PowerColor R9 390X Devil. I did my research and this card really shows low temperatures: it never passes 52c at stock, and with an OC the max is 58c. I'll order it today through Amazon after I get my refund. What do you guys think? I would like to know your opinion.


----------



## dopemoney

Legendary2020, it's a good buy. I bought the PowerColor 390 and I love it; cooling and performance on PowerColor cards seem to be above par. If I could do it again, I'd get the same card you are looking at getting.


----------



## Carniflex

Quote:


> Originally Posted by *legendary2020*
> 
> Ok after I thought about it I still want to be in team Red so I decide to buy a new R9 390x powercolor devil .. I did my research and this card really shows low temperatures never passes the 52c on stock and with OC the max is 58c I'll order it today through Amazon after I get my refund what do you guys think I would like to know your opinion ?


It really depends on the price, I'd say. I would compare it against the cheapest 390X plus an Alphacool GPX for that card, plus pump, rad, tubing and fittings. If the second option is cheaper or the same price, get the second option; if not, get this one. The Devil should probably come out cheaper, but if something else is on sale somewhere, the gap might be close enough to be worth jumping to a custom loop instead of the AIO.


----------



## Minusorange

A question for you guys, should I try to save my 290 by installing a new cooler which will probably cost me around £130 to £160 or should I just spend an extra £120 to £150 to buy a 390, or the little extra for a 390x

I'm not going to be bothering with overclocking it so what would you guys do in similar situation ? Ideally the less money spent the better, but it's not a major issue


----------



## Carniflex

Quote:


> Originally Posted by *Minusorange*
> 
> A question for you guys, should I try to save my 290 by installing a new cooler which will probably cost me around £130 to £160 or should I just spend an extra £120 to £150 to buy a 390, or the little extra for a 390x
> 
> I'm not going to be bothering with overclocking it so what would you guys do in similar situation ? Ideally the less money spent the better, but it's not a major issue


To be honest I would not bother "upgrading" to 390/390X if you have already 290 as they are in essence the same card as far as I understand. Or has the cooling on your 290 failed and you need to "save" it for that reason? If the latter then ~130 .. 160 £ should be cheaper than, in essence, buying the same card again.


----------



## dopemoney

Minusorange, gonna be more than a "little" extra to get 390x, if looking at bang for buck, get 390, don't spend money trying to save old hardware, get new


----------



## dopemoney

All: For what it's worth, don't let anyone tell you the 390 is "rebadge" of another card, it's not, just read up on it. AMD doesn't rename an outdated product and then sell it as a new one.


----------



## legendary2020

Quote:


> Originally Posted by *Carniflex*
> 
> It really depends on the price I would say. I would compare it against the cheapest 390X + Alphacool GPX for that card + pump, rad, tubing and fittings. If the second option is cheaper or the same price get the second option, if not then get this one. Probably the Devil should come out cheaper but if something else is on sale somewhere it might be close enough gap to be worth it to make a jump to the custom loop instead of the AIO.


Actually, the MSI R9 390X is priced at $446.49 on Amazon and the PowerColor Devil at $449, so not a big difference. I'd rather go with the cooler solution.


----------



## Carniflex

Quote:


> Originally Posted by *dopemoney*
> 
> All: For what it's worth, don't let anyone tell you the 390 is "rebadge" of another card, it's not, just read up on it. AMD doesn't rename an outdated product and then sell it as a new one.


Yes, it does. As an example, I have myself flashed a 5770 to be a 6770. Even without flashing, the 5770 and 6770 could be crossfired back when CrossFire was essentially reserved for identical cards.


----------



## Minusorange

Quote:


> Originally Posted by *Carniflex*
> 
> To be honest I would not bother "upgrading" to 390/390X if you have already 290 as they are in essence the same card as far as I understand. Or has the cooling on your 290 failed and you need to "save" it for that reason? If the latter then ~130 .. 160 £ should be cheaper than, in essence, buying the same card again.


Yeah, the cooling failed; one of the fans on my Tri-X literally fell off. Seems like the magnets have lost some of their strength, so it doesn't stay in place and flies off whenever the fans ramp up. I'm running it at the moment with the case open and a huge fan blowing on it so I can use the rig, since the only spare card I have that I know works is an old AGP 7800GT lol

Looks like I may not have to pay though, just phoned up the place I bought it from and turns out it's still under Sapphires warranty so I've arranged to send it back on Monday.

Not sure whether sapphire will just replace the broken fan or send me a new 390, hoping it's a new 390 but I know it'll just be a fan lol


----------



## b0uncyfr0

How does the Vapor-X 290x compare to the 390x G1 and MSI's Gaming versions? I know the aftermarket 390x's range from 1070-1100 core but do they have much headroom to push even further?

I can barely get 1100 on my 290X Vapor-X with +90. Not intending to push any further, just wanted to see how it compares.


----------



## flopper

Quote:


> Originally Posted by *b0uncyfr0*
> 
> How does the Vapor-X 290x compare to the 390x G1 and MSI's Gaming versions? I know the aftermarket 390x's range from 1070-1100 core but do they have much headroom to push even further?
> 
> I can barely get 1100 on my 290X Vapor-X with +90. Not intending to push any further, just wanted to see how it compares.


Not much headroom.
The 390 generation does 1100-1250MHz or so, and more cards stay under 1200 than above it.


----------



## Gdourado

Can anyone confirm whether the Prolimatech MK-26 fits a Sapphire 390X Tri-X?

Thanks.


----------



## Gdourado

This is the reference 290X PCB:


And the Sapphire 390X PCB:


If an aftermarket aircooler like the MK-26 is compatible with the reference 290X, will it also be with the 390X?

Cheers!


----------



## battleaxe

Quote:


> Originally Posted by *Gdourado*
> 
> This is the reference 290X PCB:
> 
> 
> And the Sapphire 390X PCB:
> 
> 
> If an aftermarket aircooler like the MK-26 is compatible with the reference 290X, will it also be with the 390X?
> 
> Cheers!


If you have a Sapphire 390x and plan to run on air you won't do much better than the Tri-x cooler. Plus the VRM's are different, so the MK-26 would do nothing to cool those anyway. You will have to add aftermarket heat sinks on the VRM's. The 390x Sapphire VRM's get plenty warm and need to be cooled well. The stock cooler does a decent job.

I would say unless you plan to go with a full cover block on the Sapphire 390x (which I do not think one is available) you are wasting your time. Keep the stock cooler, for air. It does a good job.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dopemoney*
> 
> 
> 
> PowerColor PC R9 390. I join club now? Me be Glorious PC Master Race now?


You need to post a GPU-Z validation with your user name, or a screen shot showing GPU-Z open and your OC login screen, or something of that nature, and I will get you added! Thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *dopemoney*
> 
> All: For what it's worth, don't let anyone tell you the 390 is "rebadge" of another card, it's not, just read up on it. AMD doesn't rename an outdated product and then sell it as a new one.


Just to make a correction to your statement here..... the exact opposite is true.

That is EXACTLY what AMD does, and has done for two generations of Tahiti, three generations of Pitcairn, and now two generations of Hawaii and Tonga.... it's a testament to the longevity of their flagship GPUs, and I look at it as a positive more than a negative, really...


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just to make a correction to your statement here..... the exact opposite is true.
> 
> That is EXACTLY what AMD does, and did so for two generations of Tahiti, three generations of Pitcairn, and now 2 generations of Hawaii.... it's a testament to the longevity of their flagship GPU's, and I look at it as a positive, more so than a negative really...


It is far from just a rebadge/rebrand; the correct term would probably be "refresh", considering they upgraded parts instead of just reusing the exact same card again.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> It is far from just a rebadge/rebrand; the correct term would probably be "refresh", considering they upgraded parts instead of just reusing the exact same card again.


I agree with the refresh sentiment to a degree, but let's not forget that the first teardown of a 390X was the XFX from Best Buy, and it was an exact match for the 290X 8GB XFX card, less a different BIOS......

The 280X was a 7970 GHz Edition all day long..... you can even flash the cards back and forth in many cases, so there is more rebranding than new innovation there, with the latter coming through software and driver improvements rather than hardware changes (outside of the better IMCs used on the 390 cards when going to 8GB), and the expanded offering of 4GB versions of Tonga (now Antigua) and Pitcairn (now Trinidad)......


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> 11W/mK keeps my VRM cool even when overclocking. Also for sure 11 or even 7W/mK can handle memory well. Memory doesn't produce a lot of heat but it is sensitive to heat. So give a lot of love to memory too.


Sweet! So I guess any 7 or 11W/mK pad will do for the memory, right?

One thing I haven't quite understood is what thickness I need. I can find 0.5mm, 1mm, 1.5mm and even thicker. My gut tells me to go with the thinnest option, but I don't really know.


----------



## Lixxon

Woah.... really? I upgraded from Win 7 to Win 10 and now I get 500-600 more points in Firestrike (using a 390X).


----------



## Agent Smith1984

Quote:


> Originally Posted by *Lixxon*
> 
> Woah.... really? I upgraded from Win 7 to Win 10 and now I get 500-600 more points in Firestrike (using a 390X).


Yeah, a little "thank you" from Microsoft, lol


----------



## Lixxon

Had a 390X for some time but never posted to join da club. Here's me.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Lixxon*
> 
> Had 390x for some time but never posted to join da club heres me.


Added!


----------



## Agent Smith1984

So I got the 1250 timing strap BIOS for 390X from fyzz and will be trying it out sometime this weekend. Hoping to see the memory still clock well, even with the tighter timings. Should be good for another 150-250 points in Firestrike, so a legit 15k graphics score should be possible. I'll probably try a no tess run too...


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So I got the 1250 timing strap BIOS for 390X from fyzz and will be trying it out sometime this weekend. Hoping to see the memory still clock well, even with the tighter timings. Should be good for another 150-250 points in Firestrike, so a legit 15k graphics score should be possible. I'll probably try a no tess run too...


edit: herpderp I was looking at my 3dm11 results..

14.8k is the highest for me in Firestrike


----------



## m70b1jr

Still looking to hardmod my xfx r9 390 to fix black screens at +130mv.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> edit: herpderp I was looking at my 3dm11 results..
> 
> 14.8k is the highest for me in Firestrike


What clocks?

Also, do you have any of the following games:

GTA V
Crysis 3
BF 4
Dirt Fallout
Hitman (the latest one)

??

And what res do you run?

I'd like to share Fraps results between your overclocked intel and my vishera, just for the hell of it.

I run 4k most of the time, but would love to see some 1440p comparisons with GPU's at same clocks, and at max overclocks.

BTW, 5.1 on 3570k is really good, we COULD NOT break 4.9 on my brother's 3570k, but he normally ran 4.8, cause it was like 500mv difference in voltage to get there, and a 5-10c difference under load (obviously limited cooling).


----------



## dopemoney

Yes. I will own my mistakes when I misspeak and upon much reading, AMD has "rebadged" products in the past. I am still working through how exactly that is a good thing. However, I did find this article which at least made me feel better. http://wccftech.com/amd-radeon-r9-390-390x-not-rebadges-power-optimization/

Also, when I get to the heezy I'll post screen shot with username and gpu-z open. Many thanks to you.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dopemoney*
> 
> Yes. I will own my mistakes when I misspeak and upon much reading, AMD has "rebadged" products in the past. I am still working through how exactly that is a good thing. However, I did find this article which at least made me feel better. http://wccftech.com/amd-radeon-r9-390-390x-not-rebadges-power-optimization/
> 
> Also, when I get to the heezy I'll post screen shot with username and gpu-z open. Many thanks to you.


It's good because AMD gets time to fully develop the platform.....

Hawaii started out as a good card, and would score around 9,500-10,000 graphics points in Firestrike at launch with stock 1000/1250 clocks. Now a 390X, even clocked to 1000/1250, will drop around 12,000 points!!! That is driver optimization alone..... Now figure they went in and tweaked the BIOS to use less power, and added new higher-clocking IMCs, and you are seeing people get legitimate 390X scores in the 13,800 range right out of the box...... With even further overclocking, we are getting 15k graphics scores!!!!

So from square one with Hawaii, you are talking a 25-30+% improvement through cooling, drivers, BIOS refinement, and updated memory chips....

That's pretty staggering.

Same thing with 7900 series... I've watched my son's 7950 evolve from a great 1080p card, to an even better one than it was when launched....

Even my wife's 7870 is running Crysis 3 and GTA V right at 1080P with very high settings at over 60FPS (2xAA / no AA).

Maturation is better than innovation sometimes.
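Quick sanity check on those percentages, using the rough Firestrike graphics scores from this post (not new measurements):

```python
def gain(old, new):
    """Percentage improvement from an old score to a new one."""
    return (new - old) / old * 100

# Launch-era Hawaii vs. the figures quoted above
same_clocks = gain(9500, 12000)    # drivers alone, same 1000/1250 clocks: ~26.3%
out_of_box = gain(10000, 13800)    # stock 390X out of the box: ~38.0%

print(round(same_clocks, 1), round(out_of_box, 1))
```

So "25-30+%" is, if anything, on the conservative side once you count the factory clock bumps.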


----------



## jdorje

Well, it's good because it's cheaper, and leads to longer lifespans of older cards since they receive support for longer. Compare the amd 290 to the nvidia 780, for instance...or the 7950 to the 670. Nvidia cards may hold value better because of reasons, but amd cards hold performance far better.


----------



## fyzzz

I found an old Firestrike result from my previous 290 when it was clocked to 1265/1500, on the 14.12 Omega drivers I think; it got a GPU score of 13794. Today with my current 290 (modded 390 BIOS, Crimson drivers), I get a GPU score of 13614 at 1100/1625.


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's good because AMD gets time to fully develop the platform.....
> 
> Hawaii started out as a good card, and would score around 9,500-10,000 graphics points in Firestrike at launch with stock 1000/1250 clocks. Now a 390X, even clocked to 1000/1250, will drop around 12,000 points!!! That is driver optimization alone..... Now figure they went in and tweaked the BIOS to use less power, and added new higher-clocking IMCs, and you are seeing people get legitimate 390X scores in the 13,800 range right out of the box...... With even further overclocking, we are getting 15k graphics scores!!!!
> 
> So from square one with Hawaii, you are talking a 25-30+% improvement through cooling, drivers, BIOS refinement, and updated memory chips....
> 
> That's pretty staggering.
> 
> Same thing with 7900 series... I've watched my son's 7950 evolve from a great 1080p card, to an even better one than it was when launched....
> 
> Even my wife's 7870 is running Crysis 3 and GTA V right at 1080P with very high settings at over 60FPS (2xAA / no AA).
> 
> Maturation is better than innovation sometimes.


I wouldn't call it maturation so much as not being hobbled....you shouldn't have to wait until the tech is 5 years old to get the best out of it. rofl


----------



## HZCH

Quote:


> Originally Posted by *Vellinious*
> 
> I wouldn't call it maturation so much as not being hobbled....you shouldn't have to wait until the tech is 5 years old to get the best out of it. rofl


Well, I've got mixed feelings about it.
IMHO, and as you said, waiting 5 years or so for a product to "mature" is insane, at least for the Glorious PC Master Race (look at AMD's financial results and market share)...
On the other hand, looking at the latest results under DX12 and comparing them to the GTX 980 (I own one)... I mean... now THAT'S a pretty big step up. An improvement big enough that I started wondering why I had paid for a 980 (before remembering it was a better GPU than the 390 when I bought my 980).

Aside from the bigger power draw shown by Tom's Hardware's DX12 reviews of the 390 (can't find the link, erh, too lazy), I think AMD has shown it can kick ass when working seriously. And it gives me the itch to try a 390 on Win 10.


----------



## christoph

Quote:


> Originally Posted by *tolis626*
> 
> Sweet! So I guess any 7 or 11W/mK pad will do for the memory, right?
> 
> One thing I haven't quite understood is what thickness I need. I can find 0.5mm, 1mm, 1.5mm and even thicker. My gut tells me to go with the thinnest option, but I don't really know.


we need to find out for sure what thickness of pads to use. I was told to buy 1.5mm, but that was just one guy who told me that, and I'm about to order some pads while still unsure what thickness to order.

And another question: is one strip of pad enough to cover all the memory chips?

They are 16 chips, right?


----------



## christoph

Quote:


> Originally Posted by *HZCH*
> 
> Well, I've got mixed feelings about it.
> IMHO, and as you said, waiting 5 years or so for a product to "mature" is insane, at least for the Glorious PC Master Race (look at AMD's financial results and market share)...
> On the other hand, looking at the latest results under DX12 and comparing them to the GTX 980 (I own one)... I mean... now THAT'S a pretty big step up. An improvement big enough that I started wondering why I had paid for a 980 (before remembering it was a better GPU than the 390 when I bought my 980).
> 
> Aside from the bigger power draw shown by Tom's Hardware's DX12 reviews of the 390 (can't find the link, erh, too lazy), I think AMD has shown it can kick ass when working seriously. And it gives me the itch to try a 390 on Win 10.


AMD has been right most of the time; it's a separate matter that the software has always been written for Intel/Nvidia.

And that's why AMD came up with their own software to use with their tech.


----------



## Vellinious

Quote:


> Originally Posted by *HZCH*
> 
> Well, I've got mixed feelings about it.
> IMHO, and as you said, waiting 5 years or so for a product to "mature" is insane, at least for the Glorious PC Master Race (look at AMD's financial results and market share)...
> On the other hand, looking at the latest results under DX12 and comparing them to the GTX 980 (I own one)... I mean... now THAT'S a pretty big step up. An improvement big enough that I started wondering why I had paid for a 980 (before remembering it was a better GPU than the 390 when I bought my 980).
> 
> Aside from the bigger power draw shown by Tom's Hardware's DX12 reviews of the 390 (can't find the link, erh, too lazy), I think AMD has shown it can kick ass when working seriously. And it gives me the itch to try a 390 on Win 10.


Yes, but looking at DX12 results is looking into the near future. Before it's used regularly, we're talking about a year from now and another architecture release.... Who cares if the 980 doesn't perform as well RIGHT NOW as the 390 does in some preliminary benchmarks, for something that won't be fully utilized for a while yet? By the time we see the first FULL RELEASE game using DX12, Deus Ex, we'll be staring down the barrel of the new GPU releases, and you can bet NVIDIA will have addressed it.

I love what my 290X has given me since I bought it a couple of months ago, but.....it's already "old", and just now starting to perform like it should. 290X scores from a year ago were barely beating my 970 scores. That's pathetic.....


----------



## jdorje

Quote:


> Originally Posted by *christoph*
> 
> we need to find out for sure what thickness of pads to use. I was told to buy 1.5mm, but that was just one guy who told me that, and I'm about to order some pads while still unsure what thickness to order.
> 
> And another question: is one strip of pad enough to cover all the memory chips?
> 
> They are 16 chips, right?


The memory is huge - 16 modules each of which is about 1/3 inch(??? just made that up) on a side. I think you could use TIM there as you can get the heat sink resting right up against it. For the VRMs you need pads since they aren't flat.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> Yes, but looking at DX12 results is looking into the near future. Before it's used regularly, we're talking about a year from now and another architecture release.... Who cares if the 980 doesn't perform as well RIGHT NOW as the 390 does in some preliminary benchmarks, for something that won't be fully utilized for a while yet? By the time we see the first FULL RELEASE game using DX12, Deus Ex, we'll be staring down the barrel of the new GPU releases, and you can bet NVIDIA will have addressed it.
> 
> I love what my 290X has given me since I bought it a couple of months ago, but.....it's already "old", and just now starting to perform like it should. 290X scores from a year ago were barely beating my 970 scores. That's pathetic.....


That's a lot of back-pocket nvidia lobbying too, though.....

You can't deny the whole GameWorks crap, I don't care who you are... I'm not an AMD-only person by any means, but nvidia's deception is ridiculous..... Add in the whole 3.5GB scandal, and the fact that nvidia refuses to keep improving support for cards even as old (but not really old) as the 700 series, and you can quickly build a case not to support them. I loved my 980 personally, but the 4GB limitation killed me at 4K, so I went back.. Mind you, I even tried a Fury for a bit, and it did great, but it still had the 4GB limitation.... My next upgrade will be a total ground-up build, and will consist of whichever CPU offering is best (waiting to see how Zen does) and either a Pascal or Polaris GPU... whichever is the best value.....


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's a lot of back pocket nvidia lobbying too though.....
> 
> You can't deny the whole gameworks crap, i don't care who you are... I'm not an AMD only person by any means, but nvidia's deception is ridiculous..... Add in the whole 3.5gb scandal, and the fact that nvidia refuses to keep improving support for cards even as old (but not really old) as the 700 series, and you can quickly build a case not to support them. I loved my 980 personally, but the 4gb limitation killed me at 4k, so i went back.. Mind you i even tried fury for a bit, and it did great, but it still had 4gb limitations.... My next upgrade will be a total ground up build, and will consist of whichever cpu offering is best (waiting to see how zen does) and either Pascal or Polaris GPU... whichever is best value.....


I deal in reality....not rainbows and unicorns. DX12 is still rainbows and unicorns. Will be for a while yet.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's a lot of back pocket nvidia lobbying too though.....
> 
> You can't deny the whole gameworks crap, i don't care who you are... I'm not an AMD only person by any means, but nvidia's deception is ridiculous..... Add in the whole 3.5gb scandal, and the fact that nvidia refuses to keep improving support for cards even as old (but not really old) as the 700 series, and you can quickly build a case not to support them. I loved my 980 personally, but the 4gb limitation killed me at 4k, so i went back.. Mind you i even tried fury for a bit, and it did great, but it still had 4gb limitations.... My next upgrade will be a total ground up build, and will consist of whichever cpu offering is best (waiting to see how zen does) and either Pascal or Polaris GPU... whichever is best value.....


i think, just thinking, the support for kepler came back as a result of complaints from owners, and once enough Maxwell sales were made.


----------



## christoph

Quote:


> Originally Posted by *jdorje*
> 
> The memory is huge - 16 modules each of which is about 1/3 inch(??? just made that up) on a side. I think you could use TIM there as you can get the heat sink resting right up against it. For the VRMs you need pads since they aren't flat.


really? those chips are making direct contact with the heatsink??

no, no, wait, they have thermal pads on them, so the gap between the heatsink and the chips should be big enough that you need thermal pads instead of thermal paste. Unless the pads are only 0.5 mm thick, and considering that the heatsink compresses the pads against the chips, that would leave a gap of what, 0.25 mm? In which case we could get away with (non-conductive) thermal paste, which should be better performance-wise.

right?
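That gap arithmetic can be written down as a tiny sketch. Everything here is illustrative: the `compressed_gap_mm` helper, the 0.5 mm pad, and the 50% compression figure are assumptions, not measurements from any actual card.

```python
# Hypothetical sketch of the pad-vs-paste gap reasoning; numbers are
# illustrative, not measured from a real 390.
def compressed_gap_mm(pad_thickness_mm: float, compression_fraction: float) -> float:
    """Approximate chip-to-heatsink gap once the pad is squeezed down."""
    return pad_thickness_mm * (1.0 - compression_fraction)

# A 0.5 mm pad compressed by half leaves about 0.25 mm of gap. Thermal
# paste is generally only effective at much thinner bond lines (~0.1 mm),
# so whether paste "gets away with it" depends on the real, measured gap.
gap = compressed_gap_mm(0.5, 0.5)
print(gap)  # prints 0.25
```

Measuring the actual compressed pad height (as suggested later in the thread) beats guessing either number.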


----------






## dopemoney

Nvidia, for all their deception, does make a great high-end card. My friend built his rig with a 5960x and new Titan X, and a completely unnecessary 64gb of ram. I have witnessed that there is not yet a game that he cannot max out. However, and of some import, he coughed up over $2800 for his build, excluding monitor. Mine, a 7600k and a 390, with end-to-end water-cooling, 16gb ram, including 2k monitor, was only $1650. I get _competitive_ results with him, so it's difficult to justify spending that kind of dough. By competitive, I mean for example: on Ultra settings in Fallout 4, I can play at 60-65fps without overclocking. He runs around 85-100fps without overclocking.


----------



## jaydude

For anyone using a Gigabyte R9 390: a new BIOS has been released, "F91".

Release for SK Hynix memory
ATI source BIOS version: 015.049.000.011
Core clock / memory clock: 1025/1500 MHz
Modifies the fan duty in MS-DOS mode

I have not tried it yet; the lack of a dual BIOS makes me wary of flashing, and I'm not really having any issues since I upgraded all my cooling.


----------



## rdr09

Quote:


> Originally Posted by *jaydude*
> 
> For anyone using a Gigabyte R9 390, A new Bios has been released "F91"
> 
> Release for SKHYNIX Memory
> ATI Source BIOS Version: 015.049.000.011
> Core Clock/ Memory Clock：1025/1500 MHz
> Modify the fan duty in MS-DOS mode
> 
> I have not tried them yet, the lack of dual bios worries me about flashing and I am not really having any issues since I upgraded all my cooling


Also, use ATIWinflash, not the Gigabyte tool, if that still exists.
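For anyone going the command-line route, the usual sequence is back up first, then program. This is only a sketch of the atiflash commands as I understand them (the adapter index `0` and the filenames `backup_390.rom` / `F91.rom` are placeholders); it is written as a dry run that just echoes the commands rather than touching hardware, since a real flash needs admin rights and carries the usual bricking risk.

```shell
# Dry run only: echo the intended atiflash commands instead of executing
# them. Adapter index 0 and the .rom filenames are placeholders.
backup_cmd="atiflash -s 0 backup_390.rom"  # -s saves the current BIOS to a file
flash_cmd="atiflash -p 0 F91.rom"          # -p programs the new BIOS image
echo "$backup_cmd"
echo "$flash_cmd"
```

On a real run you would drop the echoes, verify the backup file was actually written, and reboot after the flash; on a single-BIOS card like the Gigabyte, that backup is your only safety net.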


----------



## legendary2020

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's a lot of back pocket nvidia lobbying too though.....
> 
> You can't deny the whole gameworks crap, i don't care who you are... I'm not an AMD only person by any means, but nvidia's deception is ridiculous..... Add in the whole 3.5gb scandal, and the fact that nvidia refuses to keep improving support for cards even as old (but not really old) as the 700 series, and you can quickly build a case not to support them. I loved my 980 personally, but the 4gb limitation killed me at 4k, so i went back.. Mind you i even tried fury for a bit, and it did great, but it still had 4gb limitations.... My next upgrade will be a total ground up build, and will consist of whichever cpu offering is best (waiting to see how zen does) and either Pascal or Polaris GPU... whichever is best value.....


Well, I get your point.

It looks like nvidia is trying every trick just to slow AMD cards down, and this whole GameWorks thing is affecting not only AMD but nvidia's old cards too... in some cases we can see the GTX 960 beating the GTX 780 and 770!!

I know the GTX 960 is a newer GPU with a different architecture, but that still should not happen.

Here we go again: the GTX 960 is faster than the 770 and only 6fps slower than the GTX 780.

Do I really need to say anything?

The bottom line is nvidia is pushing GameWorks too far. It's like "do anything to hold AMD cards back... even if it affects our own old cards too!! Well, f**** it, just do it anyway."

Now, I'm not an AMD fanboy. I pay for the best value for my money, whether it's nvidia or AMD, hell, even Intel if they come up with something other than Intel HD. But man, you know a game is messed up when the recommended GPUs jump from a GTX 550 straight to a 7870.

Oh well, I guess they will learn soon... hopefully.


----------



## Charcharo

Reality is a double-edged sword... especially for us enthusiasts.

We are both completely irrelevant to the big picture yet too talkative to ignore.


----------



## Irked

Nvidia GameWorks just needs to stop. It's not good for any card's performance to run 64x tessellation like The Witcher 3 does with HairWorks. I could rant about this for a while, but I'll just leave a link to a video I found explaining what seems to be going on with GameWorks.


----------



## jdorje

Quote:


> Originally Posted by *christoph*
> 
> really? those chips are making direct contact with the heatsink??
> 
> no, no, wait, they have thermal pads on them, so the gap between the heatsink and the chips should be big enough to use thermal pads instead of thermal paste, unless the thermal pads are the 0.5 mm thick and considering that the heatsink is compressing the pads against the chips so that'll leave a gap like what? 0.25 mm, in which case we could use thermal paste (non-conductive) and get away with it, which should be better (performance wise)
> 
> right?


Honestly, I don't remember for sure from when I opened my card up. I don't think there were pads or paste on the memory; I think the heatsink had bare metal that screwed right up against the VRAM modules.

I would guess this differs between cards.


----------



## m70b1jr

Wow. Even with Virtual Super Resolution in Black Ops 3 (1440p) AND all the AA I can add, I still get above 60FPS.
Quick question: do you guys think Treyarch will add DX12 in a future DLC?


----------



## r_aquarii

i need a little help here.
i have a gigabyte 390 and want to remove the stock cooler to install the Arctic Cooling cooler.
however, i can't seem to remove these 4 little "screws". anyone have any idea how to remove them?


----------



## OneB1t

from other side?


----------



## m70b1jr

Quote:


> Originally Posted by *r_aquarii*
> 
> i need a little help here.
> i have a gigabyte 390 and wish to remove the stock cooler and install the arctic cooling cooler
> however, i cant seem to remove this 4 little "screw". anyone had any idea how to remove it?


The circled screws get removed from the top. If you're using the liquid Arctic cooler, I recommend you get a second pair of hands to install it, because it's VERY hard to do.


----------



## r_aquarii

Quote:


> Originally Posted by *OneB1t*
> 
> from other side?


the heatsink is blocking the screw. no way i could remove it. i don't even know how they installed it
Quote:


> Originally Posted by *m70b1jr*
> 
> The circles screws get removed from the top. If you're using the liquid arctic cooler, I recommend you get a 2nd hand to install it because it's VERY hard to do.


what do you mean by the top? from the backplate?
im using Accelero Xtreme IV


----------



## OneB1t

oh god..









maybe then remove heatsink first?


----------



## m70b1jr

Quote:


> Originally Posted by *OneB1t*
> 
> oh god..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> maybe then remove heatsink first?


Yea, pretty sure the screws you circled aren't required to take the heatsink / cooler off.


----------



## r_aquarii

those 4 screws are holding on to the
Quote:


> Originally Posted by *OneB1t*
> 
> oh god..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> maybe then remove heatsink first?


Quote:


> Originally Posted by *m70b1jr*
> 
> Yea, pretty sure the screws you circles aren't required to take the heatsink / cooler off.


yes, you are right. i feel so stupid now...
it takes a little bit of force to pull the heatsink off due to the thermal tape.


----------



## patriotaki

i got a stable overclock with my PCS+ R9 390:
+100mV
+50% power limit
1120MHz core clock
1600MHz memory clock

in GTA V i only see 5-7fps more than at stock settings.

shouldn't i see more?


----------



## OneB1t

Quote:


> Originally Posted by *r_aquarii*
> 
> yes, you are right. i feel so stupid now...


it's fine, everyone gets brain freezes


----------



## christoph

Quote:


> Originally Posted by *jdorje*
> 
> Honestly I don't remember for sure from when I opened my card up. I don't think three were pads or paste on the memory. I think the heat sink had metal that screwed up against the vram modules.
> 
> I would guess cards are different in this.


then I guess I'll have to open the video card up before I order the pads and some extra paste for it


----------



## legendary2020

Cant wait until I get my new card to join again


----------



## Nameless1988

Quote:


> Originally Posted by *patriotaki*
> 
> i got a stable overclock with my PCS+ r9 390
> +100mV
> +50% power limit
> 1120MHz core clock
> 1600MHz memory clock
> 
> on GTA V i only see 5-7fps more from the stock settings
> 
> shouldnt i see more?


Nope.


----------



## legendary2020

Quote:


> Originally Posted by *patriotaki*
> 
> i got a stable overclock with my PCS+ r9 390
> +100mV
> +50% power limit
> 1120MHz core clock
> 1600MHz memory clock
> 
> on GTA V i only see 5-7fps more from the stock settings
> 
> shouldnt i see more?


Overclocking the card will not affect every game the same way. You may get 10 fps in one game but 2 fps in another, and in some you will not get anything. The same goes for stability: some games will be stable at certain OC settings while others are unstable at those same settings. The thing is, overclocking your card will not turn it into an R9 490X, so don't expect a big jump.
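A rough way to see why the gain is small: a core overclock only speeds up the GPU-core-bound share of each frame. The sketch below is back-of-the-envelope with assumed numbers; the `fps_after_core_oc` helper, the ~1010 MHz stock clock, and the 70% GPU-bound share are illustrative assumptions, not benchmarks.

```python
# Amdahl-style estimate: only the GPU-core-bound fraction of frame time
# scales with a core overclock; the rest (CPU, memory, engine) does not.
# All numbers are illustrative assumptions, not benchmarks.
def fps_after_core_oc(base_fps: float, old_mhz: float, new_mhz: float,
                      gpu_bound_fraction: float) -> float:
    frame_time = 1.0 / base_fps
    gpu_part = frame_time * gpu_bound_fraction  # portion that speeds up
    other_part = frame_time - gpu_part          # portion that does not
    new_frame_time = gpu_part * (old_mhz / new_mhz) + other_part
    return 1.0 / new_frame_time

# ~1010 MHz stock -> 1120 MHz (about +11%) at 60 FPS, assumed 70% GPU-bound:
print(round(fps_after_core_oc(60, 1010, 1120, 0.70), 1))  # prints 64.4
```

So a 5-7 FPS bump at those clocks is roughly what the arithmetic predicts, unless the game is almost entirely GPU-bound.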


----------



## 2jzom

hi

I have a problem, see here: https://drive.google.com/file/d/0B90UB4FppWF-dkJGeGxOLTdhZDg/view
I tried uninstalling and reinstalling the driver, but the problem didn't go away.

Sapphire r9 390
crossover monitor


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> i got a stable overclock with my PCS+ r9 390
> +100mV
> +50% power limit
> 1120MHz core clock
> 1600MHz memory clock
> 
> on GTA V i only see 5-7fps more from the stock settings
> 
> shouldnt i see more?


Go to 1625 on memory. I don't get a huge increase in GTA5 when overclocking either, but it runs great at stock settings too.


----------



## jodybdesigns

Quote:


> Originally Posted by *2jzom*
> 
> hi
> 
> I have problem , see here https://drive.google.com/file/d/0B90UB4FppWF-dkJGeGxOLTdhZDg/view
> i try to uninstall the driver and reinstall but problem didn't gone
> 
> Sapphire r9 390
> crossover monitor


Could be bad RAM on the GPU, but try another cable or monitor first. These Chinese monitors are funny anyways.

Sorry for double post. Windows phone....


----------



## r_aquarii

test each graphics card alone and see if the artifacts are still there.
most likely it's an issue with the card itself


----------



## kizwan

@2jzom, a bad cable or a bad port (HDMI/DVI/DP) can cause the same issue too. When plugging the cable into the graphics card, try not plugging it in all the way.


----------



## ThatRagingDude

Can I use an NZXT G10 with an H55 on this card?


----------



## m70b1jr

Hey, for those who have the Arctic Accelero Hybrid: you know the heatsink that takes the place of a backplate? Could I swap that heatsink out for my backplate?


----------



## Spartoi

For those who have managed to overclock to 1200Mhz or higher, what temperature does the card run at during load/stress test and how much voltage did you add?


----------



## Stige

Mine takes +125mV for 1200 on the core. Core temps are around 40C because it's under water, but the VRM gets to around ~70C, which could be lower. Lower temps would mean less voltage required.


----------



## m70b1jr

On liquid, my VRM1 is hitting 75C and VRM2 113C. Can someone show me where the VRMs are on an XFX R9 390?


----------



## Stige

Quote:


> Originally Posted by *m70b1jr*
> 
> On liquid my VRM1 is getting 75C and VRM2 at 113C. Can someone show me where the VRM's are at? XFX R9 390?


My VRM2 never exceeds my core temp, so around 40C. Are you sure that thing is making contact?


----------



## m70b1jr

Quote:


> Originally Posted by *Stige*
> 
> My VRM2 never exceeds my core temp so like 40C, You sure that thing is making contact?


Well, it's not a full water block. I'll see what I can do to get those temps under control.
EDIT: My VRM1 came with a heatsink and fan; that's why it stays so much cooler. VRM2 has no cooling whatsoever, and the inside of my PC case is an inferno right now.


----------



## m70b1jr

Need your opinion, guys.

The top-left radiator in the case is for the CPU, with a fan on each side as intake.

The top of the case is exhaust.

Top right is the GPU radiator, as intake.

Bottom right is a normal case fan, as intake.


----------



## jdorje

I need around 1400 mV for 1200. It's too hot and certainly not worth it. I believe my 390 is above average, and outside of MSI cards with their 8(?) VRM phases you probably won't be running 1200 on air.

The VRM on the left in the pic is for the VRAM; it doesn't get hot. The 6 VRMs on the right are for the core, and that's what needs cooling.


----------



## battleaxe

Well. I got the loop finished. Initial impressions are that I have a problem somewhere: the first GPU's VRMs (the slot-0 GPU) are significantly hotter than the second's, so I think I have a contact issue to resolve. But I do think the second GPU is showing what, or close to what, is possible with this hybrid setup. In short, I'm happy with how it turned out. I now have only the slow-turning 1450rpm fans in the rig, I added 480mm of rad space and a new pump, plus the EKWB universal blocks. The only thing really left to figure out is the first card's high (but now lower) VRM1 temps. My core temps only hit the low 40's, which is nice...

Lots of pics to come.


Spoiler: Warning: Spoiler!
Edit:

Just did a quick run at 1270mhz core. I didn't take a lot of time here, but previously I was getting flashing and artifacts as soon as I went to 1250mhz, and temps at those settings were over 75c on VRM1 and over 50c on the core.

Even though something is still not ideal (the first card's VRM1), the most I saw here with higher voltage and core clock was 72c on VRM1, where before I would be at 75c even with lower voltage and core.... I still believe temps will come down more once I sort out the contact issue that the first card's VRM1 seems to have.

The core is definitely cooler here, and that is probably what got me the extra mhz to 1270mhz. I'll have to see if this will be stable, but the short version is that it's not flashing where 1250 was before. A good indication it's close, at least. Once I figure out why this VRM1 is not the same as the other card's, I'll post more screenies to show what has happened since doing this hybrid VRM water-cooling solution.


----------



## mus1mus

Looks to me like it's not worth the hassle you have been through.

A full water block is also available from EK.

You also should've had better luck with these.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Looks to me like it's not worth the hassle you have been through.
> 
> A full water block is also available from EK.
> 
> You also should've had better luck with these.


No full cover block for either of my cards.

You might be right about the connections though. Could have worked better. IDK. I still have some tinkering to do with it.

Edit: it was actually my plan to use that type of pipe initially, but I decided to go with the soft tubing. I think I may try this after all. It wasn't very hard anyway; the hardest part is really draining the loop and checking it for leaks. It would probably look nicer too.


----------



## tolis626

Hey guys, does anyone know how thick the VRM pads are on the MSI 390x? I'm thinking of following the instructions fat4l posted here, but it would suck if I order a pack of expensive and useless pads.









I'd guess 1mm would be about right, but whatever. Knowing is way better than guessing.


----------



## Stige

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, does anyone know how thick the VRM pads are on the MSI 390x? I'm thinking of following the instructions fat4l posted here, but it would suck if I order a pack of expensive and useless pads.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd guess 1mm would be about right, but whatever. Knowing is way better than guessing.


I posted this in the other thread aswell:

I bought 1.5mm pads for the VRM on my 390; the memory pads are the same thickness, I THINK.

Usually the stock pads are big enough that you can measure them yourself: they extend past the VRM/memory chips, so you can measure them at the edges. Or you could measure the compressed part and buy something just a bit thicker than that.


----------



## mus1mus

Or just buy the EK block. And base the thickness from the kit.


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> I posted this in the other thread aswell:
> 
> I bought 1.5mm pads for VRM for my 390, memory is the same I THINK.
> 
> Usually the stock pads are big enough so you can measure them yourself even, they come over the VRM/Memory so you can measure them on the edges. Or you could even measure the compressed part and buy something just thicker than that.


Meh, I don't think I'll get Fujipolys for the memory. Too much expensive padding for nothing. I may get some Phobya 7W/mK pads that I can find locally for cheap.

As for the 1.5mm pads, the only "issue" I can see is, as someone else said in the other thread, that the Fujipoly pads aren't very compressible. I don't think 1mm or 1.5mm pads will cause any issues though. The only ones I could see being problematic are the 0.5mm pads, in case they don't even fill the gaps. I would just prefer to avoid having to disassemble my card twice; that's why I'm asking. If anyone in the meantime can confirm this for the MSI, I'll be grateful. If not, I'm gonna get the 1.5mm ones and hope for the best.

Anyway! Thanks Stige, you're the man!








Quote:


> Originally Posted by *mus1mus*
> 
> Or just buy the EK block. And base the thickness from the kit.


What? Why would I get a 100+€ water block just to see that? I'm not planning to water cool yet. I probably won't WC the 390x at all. If I have the budget on my next upgrade, then maybe.


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Meh, I don't think I'll get Fujipolys for the memory. Too much expensive padding for nothing. I may get some Phobya 7W/mK pads that I can find locally for cheap.
> 
> As for the 1.5mm pads, the only "issue" I can see is, as someone else said in the other thread, the Fujipoly pads aren't very compresssible. I don't think 1mm or 1.5mm pads will cause any issues though. The only ones I could see being problematic are the 0.5mm pads in case they don't even fill the gaps. I would just prefer to avoid having to disassemble my card twice, so that's why I'm asking. If anyone in the meantime can confirm this for the MSI, I'll be grateful. If not, I'm gonna get the 1.5mm ones and hope for the best.
> 
> Anyway! Thanks Stige, you're the man!
> 
> 
> 
> 
> 
> 
> 
> 
> What? Why would I get a 100+€ water block just to see that? I'm not planning to water cool yet. I probably won't WC the 390x at all. If I have the budget on my next upgrade, then maybe.


My experience is most of them are 1mm. I'm almost positive the ones MSI uses are, as are XFX and Sapphire, at least on the cards I have bought, which is about 5 of the 290x/390x variety.


----------



## Stige

Quote:


> Originally Posted by *tolis626*
> 
> Meh, I don't think I'll get Fujipolys for the memory. Too much expensive padding for nothing. I may get some Phobya 7W/mK pads that I can find locally for cheap.
> 
> As for the 1.5mm pads, the only "issue" I can see is, as someone else said in the other thread, the Fujipoly pads aren't very compresssible. I don't think 1mm or 1.5mm pads will cause any issues though. The only ones I could see being problematic are the 0.5mm pads in case they don't even fill the gaps. I would just prefer to avoid having to disassemble my card twice, so that's why I'm asking. If anyone in the meantime can confirm this for the MSI, I'll be grateful. If not, I'm gonna get the 1.5mm ones and hope for the best.
> 
> Anyway! Thanks Stige, you're the man!
> 
> 
> 
> 
> 
> 
> 
> 
> What? Why would I get a 100+€ water block just to see that? I'm not planning to water cool yet. I probably won't WC the 390x at all. If I have the budget on my next upgrade, then maybe.


Yup, I've got those Phobya pads on the VRM right now as well. I'll probably ask my friend in the States to buy some Fujipoly for me and send it over; it costs way less that way than from eBay.


----------



## tolis626

Quote:


> Originally Posted by *battleaxe*
> 
> My experience is most of them are 1mm. I'm almost positive the ones MSI uses are, as are XFX and Sapphire, at least on the cards I have bought, which is about 5 of the 290x/390x variety.


Cool! Then 1mm it is!









Now I'm just waiting until PerformancePCs has them in stock and I hope they will ship to Greece without problems. I also hope I won't have to pay a quadrillion $ in shipping.








Quote:


> Originally Posted by *Stige*
> 
> Yup I got those Phobya pads on VRM right now aswell, will propably ask my friend in states to buy some Fujipoly for me and send it over, cost way less that way than from Ebay.


As above, I think PPCs ships to Europe. And for such low cost items, I don't think there will be any customs charges either. Also, all I can find on eBay are some overpriced 11 and 14W/mK pads, so that's a no from me. I prefer to get the real deal for that kind of money.

Now that I'm thinking of it, I have to see if one piece will be enough... I guess it will, but gotta make sure.


----------



## Stige

This isn't that expensive: http://www.ebay.com/itm/Fujipoly-Extreme-1-5mm-11W-mK-Thermal-Pad-100mm-x-15mm-Sarcon-XR-e-Unit-1-/181701216935?hash=item2a4e3c8aa7:g:z~UAAOSw9r1V-Ai-


----------



## mus1mus

Quote:


> Originally Posted by *tolis626*
> What? Why would I get a 100+€ water block just to see that? I'm not planning to water cool yet. I probably won't WC the 390x at all. If I have the budget on my next upgrade, then maybe.


So you can get into watercooling of course!


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Cool! Then 1mm it is!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I'm just waiting until PerformancePCs has them in stock and I hope they will ship to Greece without problems. I also hope I won't have to pay a quadrillion $ in shipping.
> 
> 
> 
> 
> 
> 
> 
> 
> As above, I think PPCs ships to Europe. And for such low cost items, I don't think there will be any customs charges either. Also, all I can find on eBay are some overpriced 11 and 14W/mK pads, so that's a no from me. I prefer to get the real deal for that kind of money.
> 
> Now that I'm thinking of it, I have to see if one piece will be enough... I guess it will, but gotta make sure.


Can you order from Amazon? They have them fairly cheap I think. I get all my fujipoly extremes from Amazon now.
Quote:


> Originally Posted by *tolis626*
> 
> Cool! Then 1mm it is!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I'm just waiting until PerformancePCs has them in stock and I hope they will ship to Greece without problems. I also hope I won't have to pay a quadrillion $ in shipping.
> 
> 
> 
> 
> 
> 
> 
> 
> As above, I think PPCs ships to Europe. And for such low cost items, I don't think there will be any customs charges either. Also, all I can find on eBay are some overpriced 11 and 14W/mK pads, so that's a no from me. I prefer to get the real deal for that kind of money.
> 
> Now that I'm thinking of it, I have to see if one piece will be enough... I guess it will, but gotta make sure.


I always get the 50x60mm pads. That's enough to do the VRM sections of at least 3-4 cards; I just did two GPUs and have more than half the sheet left over.


----------



## Spartoi

What are the best thermal pads with double-sided adhesive? I have VRM heatsinks, but no secure way to mount them.


----------



## 2jzom

anyone


----------



## 2jzom

Quote:


> Originally Posted by *2jzom*
> 
> hi
> 
> I have problem , see here https://drive.google.com/file/d/0B90UB4FppWF-dkJGeGxOLTdhZDg/view
> i try to uninstall the driver and reinstall but problem didn't gone
> 
> Sapphire r9 390
> crossover monitor


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> This isn't that expensive: http://www.ebay.com/itm/Fujipoly-Extreme-1-5mm-11W-mK-Thermal-Pad-100mm-x-15mm-Sarcon-XR-e-Unit-1-/181701216935?hash=item2a4e3c8aa7:g:z~UAAOSw9r1V-Ai-


I never said they were that expensive per se. It's just that I can get the better ones for about the same price, which, in my eyes, makes those a bad buy. I will get something like that if I can't get my hands on the exact ones I want, though. It's not like they won't get the job done.
Quote:


> Originally Posted by *mus1mus*
> 
> So you can get into watercooling of course!


Believe me when I say I want to do it. I want to build a custom loop, delid my 4790k, water cool my card and all that cool stuff. But money's a bit tight and I have to replace my dying Galaxy S3 and I'm as picky with my smartphones as I am with my PC components, so I'm eyeing something like the Galaxy S7 or LG G5 or something similar. So that's where money's gonna go now. Then some vacations and then it's going to be Polaris time, so the 390x probably won't see the watery love I want to show it.









Although I've been thinking that, since I already have an H110 for the CPU, I could build half a loop, so to speak, include only the GPU and then at a later date expand it and include the CPU. That's still unlikely to happen with the 390x though.
Quote:


> Originally Posted by *battleaxe*
> 
> Can you order from Amazon? They have them fairly cheap I think. I get all my fujipoly extremes from Amazon now.
> I always get the 50x60mm pads. Its enough to do at least 3-4 cards VRM's sections at least. I just did two GPUs and have more than half the sheet left over.


What Amazon? I usually order from Amazon.de, but I guess you mean from the US. I guess I can order, though, sure. Last time I checked Amazon.de though, they didn't have Fujipoly pads.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dopemoney*


Wait, do you have a reference card???


----------



## tolis626

One thing I've been forgetting to ask. Is there any chance that my black screen issues (at high memory clocks, not stock, so the card isn't bad) could be related to the fact that I'm using HDMI? I'm kinda forced to use a TV as a monitor for now. Would there be any point in getting a DP->HDMI adapter?


----------



## dopemoney

This is my card:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131686


----------



## Tobiman

Quote:


> Originally Posted by *ThatRagingDude*
> 
> 
> 
> Can I use a NZXT G10 With A H55 On This Card?


Yes.


----------



## xxrafael95xx

Hi guys, I'm new with the card and planning to water-cool it. I think I only have two options (correct me if I'm wrong): Alphacool and EKWB. I will go with the Alphacool because of the price. Does anyone here have it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *xxrafael95xx*
> 
> 
> 
> Hi guys im new with the card, im planing to watercool it, i think i only have two options (correct me if im wrong) alphacool and ekwb, i will go with the alphaccol because of the price. Does someone here has it?


I'll get you added ASAP.

Be aware that the Alphacool block DOES NOT water-cool the VRMs and memory..... it uses a passive heatsink over those areas. It's a good solution in a custom loop, but will not be quite as effective on the VRMs.


----------



## mus1mus

Quote:


> Originally Posted by *tolis626*
> 
> One thing I've been forgetting to ask. Is there any chance that my black screen issues (at high memory clocks, not stock, so the card isn't bad) could be related to the fact that I'm using HDMI? I'm kinda forced to use a TV as a monitor for now. Would there be any point in getting a DP->HDMI adapter?


Nope. If it's due to a memory OC, just back off. It's an instability you cannot fix by changing the output cable.


----------



## xxrafael95xx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'll get you added ASAP.
> 
> Be aware that the alphacool block DOES NOT water cool the VRM's and memory..... it uses a passive heatsink over those areas. It's a good solution with a custom loop, but will not be quite as effective on the VRM's.


Thanks for adding me and for your answer. So if you had to choose, you'd say get the EKWB then? Because I see people using the Kraken G10 or Arctic Accelero, but it seems like my MSI isn't compatible with them.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xxrafael95xx*
> 
> THnkas for adding me and for your answer, so would if you have to choose you would said get the ekwb, then? because i see people using kraken g10 or artic accelero, but seems like my msi isnt compatible with them


Exactly, don't try your hand with the Kraken. It may work, it may not.... I'm sure it can be MADE to work one way or another though.

The Alphacool differs from those by at least placing a heatsink on those other vital parts, but it does not flow water through that area. It should perform somewhere between a G10 with an AIO and a full EK block in a custom loop.

Not sure what the actual results would be. It really just depends on your budget. The Alphacool is still a great cooling solution though.


----------



## xxrafael95xx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Exactly, don't try your hand with the kraken. It may work, it may not.... I'm sure it can be MADE to work though either way.
> 
> The alphacool differs from those, but at least placing a heat sink on those other vital parts, but it does flow water through that area. It will work somewhere between a G10 and an AIO, and a an EK and 120mm water loop.
> 
> Not sure what actual results would be. It really just depends on your budget. The alpha is still a great cooling solution though.


Well, TBH I don't want to spend a lot of money, but if I have to spend more to get a better waterblock, I will. I live in a very hot place and can't turn on the AC every time I want to play, so I'm worried about my temps. I get 80C under load, and those temps might be normal, but I don't want to risk my card. That's why I was thinking of the Alphacool, but now that you say it won't water-cool the VRM, it's got me thinking, because I believe the EKWB doesn't cool the RAM either.


----------



## mus1mus

If an EK block is available and within reach, just go for it. I'm looking at MSI 390s now, TBH, only because EK supports them now.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xxrafael95xx*
> 
> well tbh i dont want to spend a lot of money, but if i have to spend more in buying a better waterblock i will, because i live in a very hot place and i cant turn on the ac everytime i wanna play, so im worry about my temps, i get 80 on load and those temps might be normal but i dont want to risk my card, that is why im thinking on alphacool but know that you say that it wont cool the vrm got me thinking, because i think the ekwb doesnt cool the ram neither.


EK does make a full-cover block for the MSI now. I need to update the OP to include it; I've just been busy....

To clarify though.... 80c is STILL 14c cooler than the 94c at which the card will throttle. You are not in any kind of danger zone or anything, though it never hurts to run cooler, especially when you want to push those clocks!!


----------



## xxrafael95xx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> EK does make a full cover for the MSI now. I need to update the OP to include, just been busy....
> 
> To clarify though.... 80c is STILL 14c cooler than the 94c that the card will throttle at. You are not in any kind of danger area or anything, though it's never bad to get it cooler, especially when wanting to push those clocks!!


Yeah, I've read that it can take 94°C at full load, but as I said I don't feel comfortable with those temps, I don't know why. And as you say, if I want to push those clocks I'd be able to. Thanks, guys. I'll do everything possible to buy the EKWB block, which IMO looks prettier than the Alphacool.


----------



## Agent Smith1984

Got the wife's rig up and running full blast, and I gotta say, I sure wouldn't mind putting two 390's in her rig!! She has a huge 200mm fan blowing over the GPU area, and 3 120mm's sucking air out. Her case flow is unreal









My S340 is not quite as good, lol


----------



## xxrafael95xx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Got the wife's rig up and running full blast, and I gotta say, I sure wouldn't mind putting two 390's in her rig!! She has a huge 200mm fan blowing over the GPU area, and 3 120mm's sucking air out. Her case flow is unreal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My S340 is not quite as good, lol


You know, I regret buying the waterblock, but it was too late. I thought I'd just buy two high-pressure fans and that's it, but I had bought a gift card at PC Performances and it can't be refunded, so I just bought the EKWB block, lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *xxrafael95xx*
> 
> You know, I regret buying the waterblock, but it was too late. I thought I'd just buy two high-pressure fans and that's it, but I had bought a gift card at PC Performances and it can't be refunded, so I just bought the EKWB block, lol


Do you already have the water loop in place to add the card to?


----------



## xxrafael95xx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Do you already have the water loop in place to add the card to?


Nope, that's why I was regretting it, because I'll have to spend a lot of money. But what can I do?


----------



## Agent Smith1984

Quote:


> Originally Posted by *xxrafael95xx*
> 
> Nope, that's why I was regretting it, because I'll have to spend a lot of money. But what can I do?


I'd suggest leaving yourself some room to loop your CPU in later..... start with a 240mm rad minimum, and go from there. I'm no expert in the water department, but there is lots of help elsewhere on this site!


----------



## xxrafael95xx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd suggest leaving yourself some room to loop your CPU in later..... start with a 240mm rad minimum, and go from there. I'm no expert in the water department, but there is lots of help elsewhere on this site!


Well, someone is selling me a Thermaltake BigWater 760 for $85, so I'll buy that, because it has everything to start with and I can add more radiators later. And I'll go for it because I have a mid-tower case.


----------



## christoph

Quote:


> Originally Posted by *Stige*
> 
> This isn't that expensive: http://www.ebay.com/itm/Fujipoly-Extreme-1-5mm-11W-mK-Thermal-Pad-100mm-x-15mm-Sarcon-XR-e-Unit-1-/181701216935?hash=item2a4e3c8aa7:g:z~UAAOSw9r1V-Ai-


Would those 11 W/mK pads be good enough?? The 17 W/mK ones should be better, shouldn't they??


----------



## Stige

Quote:


> Originally Posted by *christoph*
> 
> Would those 11 W/mK pads be good enough?? The 17 W/mK ones should be better, shouldn't they??


But the 17 W/mK ones cost a fortune compared to these. I think these should be enough, or the 14 W/mK ones.


----------



## christoph

Quote:


> Originally Posted by *Stige*
> 
> But the 17 W/mK ones cost a fortune compared to these. I think these should be enough, or the 14 W/mK ones.


Yes, exactly, they are way too expensive.

Then I'll try to order those. I've been having problems finding anyone who can ship them to my country; that's why I haven't ordered any.

Does anyone know, or can guess, the W/mK of the stock thermal pads on these cards?


----------



## christoph

Should I order the 14 W/mK 1.0 mm or the 11 W/mK 1.5 mm??


----------



## Stige

Quote:


> Originally Posted by *christoph*
> 
> Yes, exactly, they are way too expensive.
> 
> Then I'll try to order those. I've been having problems finding anyone who can ship them to my country; that's why I haven't ordered any.
> 
> Does anyone know, or can guess, the W/mK of the stock thermal pads on these cards?


Not more than 3 W/mK, that's for sure. Alphacool waterblocks come with 3.5 W/mK pads, I think, so I would guess the stock ones are even less than that.

Quote:


> Originally Posted by *christoph*
> 
> Should I order the 14 W/mK 1.0 mm or the 11 W/mK 1.5 mm??


I think you need 1.5mm for the VRM on these cards; not sure if 1mm will be enough. I ordered 1.5mm, at least.
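For a feel of what those W/mK numbers mean in practice, here's a quick sketch of one-dimensional conduction through a pad (Q = k·A·ΔT/t). The pad size and the 10°C drop are made-up illustration values, not measurements from any card:

```python
# Rough 1-D conduction through a thermal pad: Q = k * A * dT / t.
# Pad dimensions and temperature drop below are illustrative assumptions only.

def pad_watts(k_w_per_mk, area_mm2, thickness_mm, delta_t_c):
    """Heat (W) conducted through a pad for a given temperature drop."""
    area_m2 = area_mm2 * 1e-6          # mm^2 -> m^2
    thickness_m = thickness_mm * 1e-3  # mm -> m
    return k_w_per_mk * area_m2 * delta_t_c / thickness_m

# Same 100 mm x 15 mm strip, 1.5 mm thick, 10 C across the pad:
for k in (3.0, 11.0, 17.0):  # ~stock pad, 11 W/mK pad, 17 W/mK pad
    print(f"{k:4.1f} W/mK -> {pad_watts(k, 100 * 15, 1.5, 10):.0f} W")
```

So going from ~3 W/mK stock pads to 11 W/mK roughly triples how much heat the pad moves for the same temperature drop, and thickness matters the same way: a thinner pad of the same material conducts proportionally more.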


----------



## christoph

Quote:


> Originally Posted by *Stige*
> 
> Not more than 3 W/mK, that's for sure. Alphacool waterblocks come with 3.5 W/mK pads, I think, so I would guess the stock ones are even less than that.
> I think you need 1.5mm for the VRM on these cards; not sure if 1mm will be enough. I ordered 1.5mm, at least.


Alright, then 1.5mm it is, thanks.

But should it be the same 1.5mm for the memory? I want to replace the VRM and memory pads too.


----------



## Stige

Quote:


> Originally Posted by *christoph*
> 
> Alright, then 1.5mm it is, thanks.
> 
> But should it be the same 1.5mm for the memory? I want to replace the VRM and memory pads too.


Yes.


----------



## Spartoi

Quote:


> Originally Posted by *Spartoi*
> 
> What are the best thermal pads with double-sided adhesive? I have VRM heatsinks, but no secure way to mount them.


Anyone?


----------



## mus1mus

Use zip ties.


----------



## Slowpoke66

Quote:


> Originally Posted by *Spartoi*
> 
> What are the best thermal pads with double-sided adhesive? I have VRM heatsinks, but no secure way to mount them.


I'm using Akasa AK-TT12-80. Don't know how good they are, but they keep my heatsinks in place. At least, atm...


----------



## patriotaki

Why does the GTX 970 get better FPS in most games than the 390?


----------



## mus1mus

Title dependent. And 970s can reach higher clocks due to boost. When both are overclocked to a considerable degree, the difference shows favoring the 390. Add in resolutions higher than 1440p and bye bye 3.5GB VRAM.


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> Title dependent. And 970s can reach higher clocks due to boost. When both are overclocked to a considerable degree, the difference shows favoring the 390. Add in resolutions higher than 1440p and bye bye 3.5GB VRAM.


Agreed... I have 2x PCS+ R9 390s and both cards gain 7-10 FPS more when OCed.


----------



## m70b1jr

The only real advantages the 970 has over the 390 are the clock speed and lower TDP.


----------



## mus1mus

Speaking of clock speed.









http://www.3dmark.com/3dm11/11033686

What a lazy card!


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> Speaking of clock speed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/11033686
> 
> What a lazy card!


How the **** did you get that 1500MHz clock speed?


----------



## patriotaki

+ Add me to the list !!









http://www.techpowerup.com/gpuz/details.php?id=add7y


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> How the **** did you get that 1500mhz clock speed?


Brute force! I need to continue fighting Fyzzz and Vellinious! At all costs!








Quote:


> Originally Posted by *patriotaki*
> 
> + Add me to the list !!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=add7y


Nice OC there! Can you join the cause and fight for the RED Team here?

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50

Show them 970 and 980 fanboys where they belong. You can easily add 200K XFire points in there!


----------



## m70b1jr

I could probably get a higher MHz overclock, since when I added more voltage I didn't black screen.


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> Why does the GTX 970 get better FPS in most games than the 390?


It does not.


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> I could probably get a higher MHz overclock, since when I added more voltage I didn't black screen.


Yeah, it varies really. I used to get some issues at 1300! Then at 1400. Now 1525 is where the card goes amiss! Go slow on the voltages.


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> Brute force! I need to continue fighting Fyzzz and Vellinious! At all cost!
> 
> 
> 
> 
> 
> 
> 
> 
> Nice OC there! Can you join the cause and fight for the RED Team here?
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50
> 
> Show them 970 and 980 fanboys where they belong. You can easily add 200K XFire points in there!


Whoa, I'll try to do some benches later today and post back.. I see the green team has a better score


----------



## mus1mus

They should. Or else.....


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> They should. Or else.....


Haha, I don't think we will get close to them







I'll post my results later


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *m70b1jr*
> 
> I could probably get a higher MHz overclock, since when I added more voltage I didn't black screen.
> 
> 
> 
> Yeah, it varies really. I used to get some issues at 1300! Then at 1400. Now 1525 is where the card goes amiss! Go slow on the voltages.
Click to expand...

You did the volt mod?


----------



## m70b1jr

Quote:


> Originally Posted by *kizwan*
> 
> You did the volt mod?


There's a volt mod? Link it, please.


----------



## tolis626

What the actual hell @mus1mus? How did that happen? My card refuses to work properly at over 1175MHz and you hit 1500+? That was like ***** slapping me in the face while pinching my nipples. It hurts and it's ridiculous.









Seriously, awesome stuff man. How much voltage did that take? And under what kind of cooling? Water? LN2? DICE? Phase change? Liquid helium? Neutron star infusions? What? I don't think I've ever seen a Hawaii card clock that high.









Also, come on, throw us a bone and give us a Firestrike score. That should be fun to watch.


----------



## mus1mus

There's a voltage-limit-OFF mod. It emulates the PT-based BIOS. The actual MAX voltage will still be a card attribute though.

Don't be mad at me, guys. I am just trying to catch up with Vell and Fyzzz. Will download Firestrike later.

The card is still on water. I didn't look at the temps, but the core seems to be sitting in the 50s.

I need more fan speed, and fans for the 2 360s. And moar powah than a 1250W Seasonic can give. But I have 2. I just need to wire them together for tri-fire action.


----------



## patriotaki

No OC


OC


that's bad, isn't it


----------



## mus1mus

I forgot. Turn tessellation off in CCC or Crimson.


----------



## m70b1jr

Can I have the tutorial for this voltage/bios mod? Will it work with an r9 390?


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> I forgot. Turn tessellation off in CCC or Crimson.


How? I can't find it under settings. I've got 16.2 Crimson


----------



## patriotaki

What score are you getting in FireStrike? I think my score is low.. running at 1080p, default settings


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> Can I have the tutorial for this voltage/bios mod? Will it work with an r9 390?


I believe it should. It's 11PM round here and I don't have a PC with me. I'll get back to you asking for your BIOS so I can have a look. I'll mod it if allowed.

There is only one hex value to change to do that.
Quote:


> Originally Posted by *patriotaki*
> 
> How? I can't find it under settings. I've got 16.2 Crimson


Open Crimson > Gaming > Tessellation > Override Settings > Choose OFF.
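Since mus1mus says the no-limit mod is a single hex value change, here is what such a byte patch looks like mechanically. This is purely illustrative: the offset and value below are placeholders, not the real Hawaii voltage-limit location, and real ROM editors also fix up the BIOS checksum afterwards, so never flash anything produced this way.

```python
# Illustrative single-byte ROM patch. The offset and value used below are
# placeholders, not the actual Hawaii voltage-limit byte; do not flash this.

def patch_byte(rom: bytes, offset: int, new_value: int) -> bytes:
    """Return a copy of the ROM image with one byte replaced."""
    if not 0 <= new_value <= 0xFF:
        raise ValueError("new_value must fit in one byte")
    if not 0 <= offset < len(rom):
        raise IndexError("offset outside ROM image")
    patched = bytearray(rom)
    patched[offset] = new_value
    return bytes(patched)

dummy_rom = bytes.fromhex("55aa00112233")   # toy 6-byte "ROM", not a real BIOS
print(patch_byte(dummy_rom, 4, 0xFF).hex()) # 55aa0011ff33
```

In practice a tool like a Hawaii BIOS editor does this for you at the correct location and recomputes the checksum, which is why hand-hex-editing a ROM you intend to flash is a bad idea.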


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> I believe it should. It's 11PM round here and I don't have a PC with me. I'll be back at you asking for your BIOS so I can have a look. I'll mod that if allowed.
> 
> There is only one Hex value to change to do that.
> Open Crimson > Gaming >Tesellation > Override Settings > Choose OFF.


Did it.. same result. How much are you getting?


----------



## kizwan

Quote:


> Originally Posted by *m70b1jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You did the volt mod?
> 
> 
> 
> Theres a volt mod? Link it please.
Click to expand...

When I said volt mod, I was referring to a hardware mod. This mod is useful for cards that experience black screens when overvolting. Download the doc file from the first link in the thread below.
https://www.kingpincooling.com/forum/showthread.php?t=2473

Example:-
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1990#post_24936560

The mod was done on a reference card. To do the same thing on a 390, you will first need to identify the correct location of the 0.95V rail. You will need to use a DMM.


----------



## m70b1jr

Quote:


> Originally Posted by *kizwan*
> 
> When I said volt mod, I was referring to a hardware mod. This mod is useful for cards that experience black screens when overvolting. Download the doc file from the first link in the thread below.
> https://www.kingpincooling.com/forum/showthread.php?t=2473
> 
> Example:-
> http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1990#post_24936560
> 
> The mod was done on a reference card. To do the same thing on a 390, you will first need to identify the correct location of the 0.95V rail. You will need to use a DMM.


Wanna help me out with that? Would it work on the R9 390? Feel free to shoot me a PM if you have the time and you want to.


----------



## kurodax

So I got this problem when OCing my R9 390

Stock speed



OC



This has only happened with the Valley benchmark (so far). I've tried the OC with games and the Steam VR bench and I don't see this problem. So is this my card or the benchmark?


----------



## christoph

Quote:


> Originally Posted by *mus1mus*
> 
> Title dependent. And 970s can reach higher clocks due to boost. When both are overclocked to a considerable degree, the difference shows favoring the 390. Add in resolutions higher than 1440p and bye bye 3.5GB VRAM.


add that where I live, the 390 is a little bit cheaper than the 970


----------



## jdorje

Quote:


> Originally Posted by *kurodax*
> 
> So I got this problem when OCing my R9 390


That's an unstable overclock. Lower core clock or raise voltage.


----------



## kurodax

Quote:


> Originally Posted by *jdorje*
> 
> That's an unstable overclock. Lower core clock or raise voltage.


I raised the voltage, set the power limit to maximum, aux voltage +31, core clock at 1150 and memory clock at 1645MHz, and this still happened (not as much as in the picture, but it still happened). Does that mean I have a bad card?


----------



## tolis626

So today I realized that I hadn't updated Afterburner to 4.2.0 and I just downloaded and installed it. Just for the lulz (Not really, it was just to cause me even more frustration, I'm masochistic like that







) I decided to give it a go at high overclocks again. So, 1175/1750MHz at +90/+50mV was where it's at. Note that I hadn't been able to get a proper benchmark run out of my GPU with the memory at 1750MHz. Most of the time it would black screen on me during the benchmark, or other times it would cause flickering, artifacts or just low scores (Probably due to ECC going all crazy). So today this happened. I don't know, or care, about how and why it worked, it just worked and that's what matters. 14707 graphics score. About damn time I got unstuck from the 14500 range!









Now, I'm not saying Afterburner's update had anything to do with it. I just think it was coincidence. I don't want anyone quoting me as saying that "AARRRRRHH! Afterburner 4.2.0 improved my benchmarkzzzz! ARHHHH". Still, it was awesome.








Quote:


> Originally Posted by *kurodax*
> 
> I raised the voltage, set the power limit to maximum, aux voltage +31, core clock at 1150 and memory clock at 1645MHz, and this still happened (not as much as in the picture, but it still happened). Does that mean I have a bad card?


A bad card would be a card that can't even work correctly at stock. Your card's fine; maybe it just isn't a great overclocker, like most cards. How much voltage have you given the core? Aux voltage doesn't really matter that much most of the time. Also, while figuring out stability for the core, leave the memory at stock and, when you're done with the core, overclock the memory too. One step at a time will get you there.









EDIT : Erm, scratch that 14707. I just got a 14801 at 1185MHz and +100mV, otherwise the same. God damn I shouldn't get so excited about a 14800, but I'm sissygirling right now.


----------



## mus1mus

Soooo.

Who wants a NO ROM SET VOLTAGE Limit?









Make yourselves known, heathens.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Soooo.
> 
> Who wants a NO ROM SET VOLTAGE Limit?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Make yourselves known, heathens.


Without black screens?


----------



## mus1mus

Nope.







Just a no limit mod.

BTW I am using the 15.201.1151.1004 driver and it doesn't black screen on any of my cards at very high voltages.







That gave me the 1500MHz


----------



## Vellinious

Ok, I'll try it.

Put it on this one.

Anything I need to watch for? Settings I should use / avoid?

OV3MH.zip 98k .zip file


----------



## m70b1jr

I would like to say, I did test out his unlock. I did not black screen with like +200mV on it. Glad I don't have to resort to hard modding. For now. Currently 16 and working minimum wage; I would hate to make a $330 paperweight.


----------



## mus1mus

Go run some benches.









The mod I did actually just removed the ROM-implemented voltage limit, while maintaining the stock ROM's voltage info. The card can still clock down and downvolt like a stock card.

+200 is still safe as long as you keep it cool. I do +385







and living on the edge.


----------



## m70b1jr

Is there a voltage at which, even with adequate cooling, the card will begin to take damage? And just as a recap, what's the temperature where damage starts for the R9 390?

mus1mus is cooking me up a BIOS right now. I'll see what I can push mine to.


----------



## Vellinious

Can't.stop.laughing....

https://linustechtips.com/main/topic/561041-980ti-darwin-awards-help/?page=1


----------



## m70b1jr

Quote:


> Originally Posted by *Vellinious*
> 
> Can't.stop.laughing....
> 
> https://linustechtips.com/main/topic/561041-980ti-darwin-awards-help/?page=1


NOOOOOOOO. I hope he works for NASA.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Ok, I'll try it.
> 
> Put it on this one.
> 
> Anything I need to watch for? Settings I should use / avoid?
> 
> OV3MH.zip 98k .zip file


Vell, try this. This is just a swap so I'm not sure if it will work.

OV3MTEST.zip 99k .zip file


Edit:

@m70b1jr

Try this. I kinda shy away from offset myself, so no-limit will do for now.

HawaiiOV.zip 99k .zip file


----------



## Spartoi

Quote:


> Originally Posted by *mus1mus*
> 
> Use zip ties.


There's nowhere to securely tie the heatsinks onto the card.
Quote:


> Originally Posted by *Slowpoke66*
> 
> I'm using Akasa AK-TT12-80. Don't know how good they are, but they keep my heatsinks in place. At least, atm...


Where can I buy this? I wanted to buy from FrozenCPU but they aren't taking orders now. Or is there an alternative thermal tape that is more accessible to purchase?


----------



## kurodax

Anybody got an unstable core clock like me? And this is at the stock speed, not OCed. And my friend also said that my 12V has some problem


----------



## Stige

Quote:


> Originally Posted by *kurodax*
> 
> Anybody got an unstable core clock like me? And this is at the stock speed, not OCed. And my friend also said that my 12V has some problem


Don't believe what GPU-Z says for the 12V line until you have actually measured it with a DMM yourself.

Mine says as low as 11.5V sometimes but the actual line when measured with DMM never drops below 12V.


----------



## patriotaki

Even with tessellation turned off in AMD settings, the option still shows as enabled in 3DMark.. why do I get such low scores?


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> It doesn't matter leaving those values at stock. The +50% power limit can allow you to go further.
> 
> It will take a hefty number of tests to get this dialed in and see the effect.


I will update OP concerning PowerLimit in bios mod thread (was updated recently as well)







.

This is the process a user must do to get correct in ROM.

Let's say a ROM has 208W / 208W / 200A; you OC your card and it throttles, not due to temp but PL. You then up the PL in Overdrive/MSI AB; say you only need 10% added to get clocks stable. The safest procedure is to add 10% to the stock TDP & MPDL (the wattage figures). Then flash the ROM and retest the OC without adding PL in Windows; if the GPU clock doesn't throttle, all is good, otherwise add 10% to the TDC.

Using the % figure from MSI AB, I have made many ROMs for others and it is an exact match for what needs to be added to the ROM. Early on, when I gave others a ROM, I'd use 8% if they said 10% in MSI AB, and the card would throttle.

Even if we raise some values in PowerPlay for, say, amps, the VRM controller in stock ROMs (/MTP) has an OCP limit programmed into it. It also monitors VRM temp and lowers output if VRHOT occurs.

We don't have access to the IR3567B datasheet, but the IR3565B is very similar; you can see the pinouts for the IR3567B (as they are in its 2-page datasheet) and compare with the IR3565B. On page 45 of the IR3565B datasheet you will see info on VRHOT







.

As long as members are taking into account the PCI-E connectors and VRM spec, it's all OK IMO. All you are doing is setting the PL in the ROM; it is as safe as whacking +50% PL in software, which most people do and say "it's safe".

Personally, on my Tri-X 290 (8+6) ref PCB I use 225 / 225 / 216; this is what Sapphire's official updated OC ROM has. It has been great and has not throttled clocks even above 1140/1495 (which I run 24/7). On my Vapor-X 290X (8+8) custom PCB, 238 / 238 / 229 has been good all the way to 1150/1575 (I have tested a little beyond those clocks).

If people keep the PL in the ROM increased only as much as they require, and a future OC they test throttles, adding a little PL via software to test is still fine; then just readjust the ROM later as required.
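The scaling described above is just percentage arithmetic; here is a sketch using the 208W / 208W / 200A example (rounding to whole watts/amps is my own assumption about how the values are stored, and the TDC only gets bumped as a second step if the card still throttles after reflashing):

```python
# Bake the MSI Afterburner power-limit % into ROM figures, per the
# procedure above: raise TDP and MPDL (watts) first; raise TDC (amps)
# only if the card still throttles after the reflash.

def scale_power_limits(tdp_w, mpdl_w, tdc_a, ab_percent, bump_tdc=False):
    """Return (TDP, MPDL, TDC) with the AB % folded in."""
    factor = 1 + ab_percent / 100
    new_tdc = round(tdc_a * factor) if bump_tdc else tdc_a
    return round(tdp_w * factor), round(mpdl_w * factor), new_tdc

print(scale_power_limits(208, 208, 200, 10))                 # (229, 229, 200)
print(scale_power_limits(208, 208, 200, 10, bump_tdc=True))  # (229, 229, 220)
```

The point of the two-step flow is that the wattage limits usually bind first, so you only touch the current limit when the wattage bump alone doesn't stop the throttling.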

Quoting you was in no way meant as info aimed at you; I was just using your post as a starting point for mine.


----------



## kizwan

I have my 290 Tri-X modded BIOS with TDP/TDC/PL set to 999, and during benching I added +50% PL.


----------



## patriotaki

I get a high-pitched noise when opening Witcher 3.
In GTA V it's very, very slight compared to Witcher 3; I can't tell for sure if it's whining, but I think yes.

Should I return it?


----------



## patriotaki

Quote:


> Originally Posted by *patriotaki*
> 
> I get a high-pitched noise when opening Witcher 3.
> In GTA V it's very, very slight compared to Witcher 3; I can't tell for sure if it's whining, but I think yes.
> 
> Should I return it?


Or is it my PSU? It's an old 650-watt from Super Flower, lol


----------



## battleaxe

Quote:


> Originally Posted by *patriotaki*
> 
> I get a high-pitched noise when opening Witcher 3.
> In GTA V it's very, very slight compared to Witcher 3; I can't tell for sure if it's whining, but I think yes.
> 
> Should I return it?


How long have you had it?

Can you duplicate with a bench? If so let it run a few hours while running something like Heaven (if Heaven also causes it) and see if it improves.


----------



## patriotaki

Quote:


> Originally Posted by *battleaxe*
> 
> How long have you had it?
> 
> Can you duplicate with a bench? If so let it run a few hours while running something like Heaven (if Heaven also causes it) and see if it improves.


I've had the card 3-4 weeks.. how many hours should I run Heaven? I'm downloading it now


----------



## battleaxe

Quote:


> Originally Posted by *patriotaki*
> 
> I've had the card 3-4 weeks.. how many hours should I run Heaven? I'm downloading it now


Run it overnight and see if it improves at all by morning.


----------



## patriotaki

Quote:


> Originally Posted by *battleaxe*
> 
> run it overnight and see if it improves at all by morning.


Running Heaven also makes the whining noise... bad card?


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> Running Heaven also makes the whining noise... bad card?


It's the high frame rate. It's fine. Mine does it too when exiting Heaven, and sometimes in other games when I hit 2k FPS in the menus.

970s are the worst for it.

Edit: I will add that my HX1050 crapped out 2 days ago and I swapped in a Cooler Master V700, and my system hasn't been this stable in a year. And my coil whine isn't so bad. Might be something to check into.


----------



## jdorje

Coil whine. My card used to do that at 500+ fps, but I haven't noticed it recently. The overnight trick is to try to break it in.


----------



## patriotaki

Quote:


> Originally Posted by *jdorje*
> 
> Coil whine. My card used to do that at 500+ fps, but I haven't noticed it recently. The overnight trick is to try to break it in.


Does it matter if I run it in windowed mode?

I've been running it now for a few hours... the coil whine still exists, although I have my case open; with the case closed I don't think I can hear it


----------



## patriotaki

My R9 390 PCS+ coil whines a lot.. how can I tell if it's my PSU's fault or the GPU's?


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> My R9 390 PCS+ coil whines a lot.. how can I tell if it's my PSU's fault or the GPU's?


You can't without testing another PSU.....


----------



## jodybdesigns

Those power supplies aren't the greatest in the world. What is the efficiency rating on that PSU? 650W might be pushing it, depending on a number of factors. But it could be the card too.


----------



## patriotaki




----------



## m70b1jr

Can someone show me how to edit my BIOS's voltage?


----------



## Worldwin

Quote:


> Originally Posted by *m70b1jr*
> 
> Can someone show me how to edit my BIOS's voltage?


Go to http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x and read.


----------



## m70b1jr

Quote:


> Originally Posted by *Worldwin*
> 
> Go to http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x and read.


I did, and it's way too complicated for me.


----------



## Stige

Quote:


> Originally Posted by *jodybdesigns*
> 
> Those power supplies aren't the greatest in the world. What is the efficiency on that PSU? 650 might be pushing it depending on a number of factors. But could be the card too.


650W is plenty for any single-card setup; you can't push past it with a 390 for sure. I can run this at 1.52V on the CPU and +200mV on the GPU without any issues.


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Those power supplies aren't the greatest in the world. What is the efficiency on that PSU? 650 might be pushing it depending on a number of factors. But could be the card too.


http://www.super-flower.com.tw/products_detail.php?class=2&sn=6&ID=23&lang=en

this is my psu


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> http://www.super-flower.com.tw/products_detail.php?class=2&sn=6&ID=23&lang=en
> 
> this is my psu


Your PSU is fine really.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> Your PSU is fine really.


Sure? @shilka told me it is not so good


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> sure? @shilka told me that is not so good


Wow.... 5 +12V rails?? I suspect the PSU right now. 5 rails is usually a crap design.

You need to get a nice PSU with 1 strong rail.


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Wow....5 +12v rails?? I suspect PSU right now. 5 rails is usually a crap design.
> 
> You need to get a nice PSU with 1 strong rail.


Hmm, so that leads me to a new PSU? That will fix my issues, I think..
I get a lot of coil whine


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> Hmm, so that leads me to a new PSU? That will fix my issues, I think..
> I get a lot of coil whine


I would try to borrow one first. But more than one rail is usually crap. Look at any of the really high-end PSUs: they usually all have one rail with a ton of amps on it, instead of splitting the amps across a bunch of rails.


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> I would try to borrow one first. But more than one rail is usually crap. Look at any of the really high-end PSUs: they usually all have one rail with a ton of amps on it, instead of splitting the amps across a bunch of rails.


Agreed.. getting a better PSU should solve the coil whine, and maybe get a better OC?


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> agree.. getting a better psu will solve the coil whine and maybe get a better OC?


My Corsair HX1050 died over the weekend. I had a brand new Cooler Master V700 Gold and my computer has never been happier. I mean it is full of happy. I am now able to overclock my CPU to 4.2GHz @ 1.4v, which is an undervolt. I could not do this before. Also, I am now able to run my PCS+ @ 1100/1625 using only +38mV now. I had to use +63mV before. I am also able to overclock my RAM to 1866, which I have NEVER been able to do. I think the HX1050 had been going bad for a year. Somewhere in November/December I started getting Kernel-Power 41 errors and the computer would just black screen - the notorious issue with Corsair power supplies going bad. Kinda like how Ford brakes squeak when the pads are bad. It's a telltale sign.

The 12v rail was dropping down to 10.68v and it would stay there, then suddenly drop to 10.38v and black screen. My V700 hovers at 11.89v - and SOMETIMES drops to 11.77v when I push the card to 1175/1625 +53mv. I lost some wattage, but I am more than happy with the results.


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> My Corsair HX1050 died over the weekend. I had a brand new Cooler Master V700 Gold and my computer hasn't been more happy. I mean it is full of happy. I am now able to overclock my CPU to 4.2ghz @ 1.4v - which is an undervolt. I could not do this before. Also, I am now able to run my PCS+ @ 1100/1625 only using +38mv now. I had to use +63mv before. I am also able to overclock my ram to 1866 which I have NEVER been able to do. I think the HX1050 has been going bad for a year now. Somewhere in November/December I started getting Kernel-Power 41 errors and the computer would just black screen - the notorious issue with the Corsair power supplies going bad. Kinda like how Ford brakes squeak when the pads are bad. It's a call sign.
> 
> The 12v rail was dropping down to 10.68v and it would stay there, then suddenly drop to 10.38v and black screen. My V700 hovers at 11.89v - and SOMETIMES drops to 11.77v when I push the card to 1175/1625 +53mv. I lost some wattage, but I am more than happy with the results.


Thank you, sir!!!! That's what I'm going to do.

I found the Zalman 750-watt 80+ Gold for €95.

What do you think of that PSU??


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> Thank you sir!!!! That's what I'm going to do
> 
> I found the zalman 750watt 80+ gold for 95euro
> 
> What do you think of that psu??


Eh, too many rails. Try to find a single rail unit.

If you can find one of these, the Deepcool DQ750ST: it isn't modular, but it is Gold rated, proven quality in tons of reviews, and the biggest bang for your buck.

The XFX TS or XTR Series are good. The Seasonic G series are really good. FSP Group makes some good power supplies as well.

You will have to do a bit of research though. Lots of brands are simply rebadges of quality units like Seasonic and other companies. Just find something with 55+amps on a single rail.

*edit* I would head over to the Power Supply forum. There are a couple of members there who specialize in that kind of thing. I just like to read what's going on all over the site and try to keep up with the latest "trends" lol


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Eh, too many rails. Try to find a single rail unit.
> 
> It's not modular, but if you can find one of these: Deepcool DQ750ST - They aren't modular, but they are Gold rated, they are proven quality by tons of reviews. They are the biggest bang for your buck.
> 
> The XFX TS or XTR Series are good. The Seasonic G series are really good. FSP Group makes some good power supplies as well.
> 
> You will have to do a bit of research though. Lots of brands are simply rebadges of quality units like Seasonic and other companies. Just find something with 55+amps on a single rail.
> 
> *edit* I would head over to the Power Supply forum. There is a couple of members there who specialize in that kind of thing. I just like to read what's going on all over the site and try to keep up with the latest "trends" lol


I was talking about this one..
http://www.zalman.com/global/product/Product_Read.php?Idx=646


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> I was talking about this one..
> http://www.zalman.com/global/product/Product_Read.php?Idx=646


Ahh yeah, the Goldrocks are made by Enhance, a QUALITY company. I thought you were looking at one of the HP series (those are terrible). Not a bad price in euros either.

I would still try another PSU first. Just take my advice with a grain of salt. Unless you are looking to upgrade anyway; then I would pull the trigger on it. Because you never know, it could very well be the card. I only experienced high-pitched coil whine when I hit 2k frames because of the huge load. Now I am honestly not getting any coil whine at all. I have tested 6 or 7 times, just entering and exiting, in the past few minutes and I am not getting anything now. But that could have been my issue and not yours. So remember, just take MY advice with a grain of salt.


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Ahh yeah the Goldrocks are made by Enhance, a QUALITY company. I thought you were looking at one of the HP series (they are terrible). Not a bad price in euro's either.
> 
> I would still try another PSU. Just take my advice with a grain of salt. Unless you are looking to upgrade anyways, then I would pull the trigger on it. Because you never know, it could very well be the card. I only experienced high pitched coil whine when I hit 2k frames because of the huge load. Now I am not getting any coil whine at all honestly. I have tested 6 or 7 times just entering and exiting in the past few minutes and I am not getting anything now. But that could have been my issue and not yours. So remember, just take MY advice with a grain of salt.


Yes, but I think my PSU is the issue. I bought 2x R9 390 from the same store: one can hit up to 1070MHz on a 750-watt Cooler Master without extra voltage, while the other, on my Super Flower PSU, can only hit 1030-1040 with no extra voltage.

The xfx ts 750 outputs 62amps while the zalman 65amps.. What's better?


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> Yes ..but I think my PSU is the issue.. Bought 2x r9 390 from same store one can hit up to 1070Mhz in a 750watt form cooler master without voltage
> .. The other one on my PSU from superflowrr can hit 1030-1040 with no voltage.
> 
> The xfx ts 750 outputs 62amps while the zalman 65amps.. What's better?


Either will be fine. I assume you are putting this into Black Panther? Those 6600K's are pretty efficient... even when overclocked. My i5 requires me to jump from 1.24 to 1.28 to get to 4.3, and 1.31 for 4.5. So I decided to run undervolted @ 4.2GHz. Nice and smooth.

Your system will probably draw 600W from the wall at the maximum (mine hits about 540 and I am on water). At 90% efficiency, roughly 60W of that is dumped out as heat inside the PSU, so your components are only actually getting about 540W. That's how the efficiency works, and why people want high efficiency, especially where power isn't cheap like it is in the USA.
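The efficiency arithmetic above is easy to sketch out. A minimal illustration (the 600W and 90% figures are just the round numbers from this discussion, not measurements):

```python
def dc_output_watts(wall_draw_w: float, efficiency: float) -> float:
    """DC power actually delivered to the components for a given wall (AC) draw.

    Efficiency is defined as DC output / AC input, so the components always
    receive less than what the wall meter shows.
    """
    return wall_draw_w * efficiency


def wall_draw_watts(dc_load_w: float, efficiency: float) -> float:
    """Wall (AC) draw needed to supply a given DC load."""
    return dc_load_w / efficiency


# 600W measured at the wall on a ~90% efficient (80+ Gold-ish) unit:
print(dc_output_watts(600, 0.90))        # 540.0 watts reaching the components
print(600 - dc_output_watts(600, 0.90))  # 60.0 watts lost as heat in the PSU
```

The same functions also show why a PSU never "loses" watts in your favor: supplying a 600W DC load at 90% efficiency would pull about 667W from the wall, never less than the load itself.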


----------



## PCMADD0CT0R

Add me please, thanks.

1. PROOF- http://www.techpowerup.com/gpuz/mkrq6/

2.Crossfired - ASUS AMD Radeon (TM) R9 390 Series DirectX 12 STRIX-R9390-DC3OC-8GD5-GAMING 8GB 512-Bit GDDR5 PCI Express 3.0 HDCP Ready Video Card

3. Cooling - Stock


----------



## m70b1jr

Can someone upload HIS iTurbo for me? I can't find any downloads for it online.


----------



## Vellinious

HIS iTurbo

https://drive.google.com/open?id=0B6zqzZ0qTCB5TkRDNWgwcVd1Z1k


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> sure? @shilka told me that is not so good


Quote:


> Originally Posted by *jodybdesigns*
> 
> Wow....5 +12v rails?? I suspect PSU right now. 5 rails is usually a crap design.
> 
> You need to get a nice PSU with 1 strong rail.


I stand corrected; I didn't think they still manufactured crap with multiple 12V lines.







The PSU itself might be OK but multiple 12V lines are not.
Quote:


> Originally Posted by *patriotaki*
> 
> hmm that leads me to a new psu? that will fix my issues i think..
> i get alot of coil whine


Quote:


> Originally Posted by *jodybdesigns*
> 
> I would try to borrow one first. But more than 1 rail is usually a crap design. Look at any of the really high end PSUs: they usually have 1 rail with a ton of amps on it, instead of splitting the amps across a bunch of rails.


And this is correct, you don't want more than one 12V rail.


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Either will be fine. I assume you are putting this into Black Panther? Those 6600K's are pretty efficient...even when overclocked. My i5 requires me to jump from 1.24 to 1.28 to get to 4.3 and 1.31 for 4.5. So I decided to run undervolted @ 4.2ghz. Nice and smooth.
> 
> Your system will probably draw 600W from the wall at the maximum (mine hits about 540 and I am on water). At 90% efficiency, roughly 60W of that is dumped out as heat inside the PSU, so your components are only actually getting about 540W. That's how the efficiency works, and why people want high efficiency, especially where power isn't cheap like it is in the USA.


So I couldn't find any reviews on the Zalman XG-750 (I bought one for my father's PC, "red panther", but I didn't test it, so I can't tell if it's good).
Shilka told me to stay away from it because there are no reviews.

My other option is the xfx ts 750 gold
What about this one?
Any other recommendations?

My only complaint is that the xfx is not modular or even semi-modular


----------



## TsukikoChan

Quote:


> Originally Posted by *patriotaki*
> 
> So I couldn't find any reviews on the zalman xg-750 ( I have bought it on my father's PC "red panther" but I didn't test it so I can't tell if it's good.
> Shilka told me to stay away from it cause there are no reviews.
> 
> My other option is the xfx ts 750 gold
> What about this one?
> Any other recommendations?
> 
> My only complaint is that the xfx is not modular or even semi-modular


finally, my time to return the favor and shine XD
http://www.tomshardware.co.uk/forum/id-2547993/psu-tier-list.html

I'm unsure if this is still up to date, but for a 390/390x you want a tier 1 or a tier 2 psu 
Personally, when I replaced my PSU a month or 2 ago, I went for the EVGA SuperNova G2 (I ended up with the 850W but was aiming for at least 750W), which is modular and a high tier PSU. It is sooooooo silent and my gcard has behaved better since.


----------



## Stige

Quote:


> Originally Posted by *TsukikoChan*
> 
> finally, my time to return the favor and shine XD
> http://www.tomshardware.co.uk/forum/id-2547993/psu-tier-list.html
> 
> I'm unsure if this is still up to date, but for a 390/390x you want a tier 1 or a tier 2 psu
> Personally when i replaced my psu a month or 2 ago i went for the EVGA SuperNova G2 750w (i ended up with 850 but was aiming to get at least 750w) which is modular and a high tier psu. it is sooooooo silent and my gcard has behaved better since


I would say, for anything you use yourself, why save money on the PSU? If you buy a quality one, you won't have to replace it for at least 5 years. The PSU is probably the last place you should save money; there are other places to cut costs if you need to, but don't buy a cheap PSU, you will regret it one day.

Also, unless you plan to Crossfire or something silly like that in the future, you will never need more than 600W, probably not even that much even if you do heavy overclocking like I do. I can't push my system past ~500W at the wall outlet, and that includes my water pumps + a dozen fans. If I built it without the watercooling + a "minimal" number of fans, I doubt it would take more than 420-430W at the wall.

Also, I think that tier list for PSUs is just BS. Rosewill Tachyons at Tier 2? Why exactly? Those things are up there at the top with all the quality PSUs.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> I would say for anything you use yourself, why would you save money on a PSU when if you buy a quality one, you won't have to replace it for atleast 5 years? PSU is propably the last place one should save money on, there are other places you can save money on if you need to but don't buy a cheap PSU, you will regret it one day.
> 
> Also unless you plan to Crossfire or something silly like that in the future, you will never need more than 600W, propably not even that much even if you do heavy overclocking like I do. I can't push my system past ~500W on the wall outlet and that includes my water pumps + dozen fans, if I were to build it without the watercooling + "minimal" amount of fans, I doubt it would take more than 420-430W on the wall outlet.
> 
> Also I think that tier list for PSUs is just bs, Rosewill Tachyons at Tier 2? Why exactly? These things are up there at the top with all the quality PSUs.


OK, so which one do you recommend?


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> OK, so which one do you recommend?


The Goldrock looks to be made by Enhance; they make some of the top tier Zalmans. The XFX TS series are really good PSUs.


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> OK, so which one do you recommend?


The XFX is built with Seasonic parts so you can't go wrong with that for sure.


----------



## patriotaki

I guess I'll have to live with the non modular way







.. Xfx it is..


----------



## Agent Smith1984

Just be aware of this: it's never a bad idea to upgrade to a higher quality PSU, but don't be shocked when you hook everything up and still hear the crickets chirping.
When I had my Fury, it was quite the squealer, and I tried two different PSUs with it. My 390's have never made a peep, but I do hear from quite a few people who say theirs do. Most of the issues come up in situations where the FPS is really high (either easy-to-run/low-res gaming, or menu screens within a game).

No matter what, you can't go wrong with the XFX units made by Seasonic, and anything in the EVGA G or P series is also top notch.









Honestly, I have a Raidmax AE850 and a Rosewill Hive 850, and both have supported highly overclocked Crossfire 290/390 setups with 200W+ overclocked CPUs without a flinch...... The PSU is a good component not to skimp on, but you don't have to spend a fortune to get good results either.

The 4 year old Raidmax is still pushing an overclocked X6 and 7870 on my wife's rig.... I figured I'd let it spend its last years with an easier workload, lol

I've put a thrashing on this Rosewill HIVE and the thing is rock solid.... one thing, though: if there is EVER any type of power surge or outage, you have to unplug it, switch it off, and then plug it back in to reset the breaker inside. It's very protective of the hardware (a good thing), and I got it for like $80 on sale.


----------



## Stige

My 390 makes this "clicking" sort of "whine" if I have something 3D running and alt-tab so it's in the background. I'm pretty sure my HD7950 had the same "problem" as well.


----------



## patriotaki

thank you all for your answers you really helped me









just one last question... there aren't any PSUs in the price range of the XFX TS 750 which are modular or semi-modular, except for the Zalman XG 750, right?







I can't find anything


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> thank you all for your answers you really helped me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just one last question...there arent any PSU in the price range of XFX TS 750 which are modular or semi modular except for the zalman xg 750..right?
> 
> 
> 
> 
> 
> 
> 
> i cant find anything


The XFX TS 650G 80+ Gold is 110€ in Finland, so the 750 should be slightly higher.

At that price range there are plenty of good modular alternatives, in Finland at least,
like the 650W EVGA SuperNOVA G2 at 104€, so even cheaper than that, and 100% modular.
The 750W G2 is 112,90€.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> XFX TS 650G 80+ Gold is 110€ in Finland, so 750 should be slightly higher.
> 
> At that price range there are plenty of good modular alternatives in Finland atleast.
> like 650W EVGA SuperNOVA G2 is 104€ so even cheaper than that, and 100% modular.
> 750W G2 is 112,90€.


evga 750 g2 is 140euro here


----------



## patriotaki

what about the XFX ProSeries 750W XXX Edition Semi-Modular?
or this

http://www.skytechshop.gr/hardware/power-supplies/thermaltake-toughpower-750w-atx-80plus-gold


----------



## kizwan

Quote:


> Originally Posted by *patriotaki*
> 
> what about the XFX ProSeries 750W XXX Edition Semi-Modular?
> or this
> 
> http://www.skytechshop.gr/hardware/power-supplies/thermaltake-toughpower-750w-atx-80plus-gold


What is wrong with your current PSU? Which Super Flower is that? A Leadex? Those have good reviews.


----------



## patriotaki

Quote:


> Originally Posted by *kizwan*
> 
> What is wrong with your current PSU? Which Super Flower is that? Leadex? It have good review.


Not even close... it's this one:

http://www.super-flower.com.tw/products_detail.php?class=2&sn=6&ID=23&lang=en


----------



## kizwan

Quote:


> Originally Posted by *patriotaki*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> What is wrong with your current PSU? Which Super Flower is that? Leadex? It have good review.
> 
> 
> 
> not even close...its this one
> 
> http://www.super-flower.com.tw/products_detail.php?class=2&sn=6&ID=23&lang=en

I managed to trace your posts. Can you link me the post where Shilka said the PSU is not good? Generally, Super Flower PSUs always get good reviews. If the coil whine is from the GPU, the "problem" will be the GPU, not the PSU.


----------



## patriotaki

Quote:


> Originally Posted by *kizwan*
> 
> I managed to trace your posts. Can you link me the post where Shilka said the PSU is not good? Generally, Super Flower PSUs always get good reviews. If the coil whine is from the GPU, the "problem" will be the GPU, not the PSU.


i have asked him in PM


----------



## Stige

Quote:


> Originally Posted by *kizwan*
> 
> I managed to trace your posts. Can you link me the post where Shilka said the PSU is not good? Generally, Super Flower PSUs always get good reviews. If the coil whine is from the GPU, the "problem" will be the GPU, not the PSU.


It is bad because it has _FIVE_ different 12V lines and the max it can supply on one line is slightly above 300W.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> It is bad because it has _FIVE_ different 12V lines and the max it can supply on one line is slightly above 300W.


http://www.skytechshop.gr/hardware/power-supplies/thermaltake-toughpower-750w-atx-80plus-gold

what about this one?


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> http://www.skytechshop.gr/hardware/power-supplies/thermaltake-toughpower-750w-atx-80plus-gold
> 
> what about this one?


Have you had your card too long to exchange it with the vendor?

You CAN exchange a GPU over coil whine. I would try that first, since it's practically free, instead of dumping money on a PSU you may not need... but that's just my opinion.
Good luck!


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> http://www.skytechshop.gr/hardware/power-supplies/thermaltake-toughpower-750w-atx-80plus-gold
> 
> what about this one?


It's a very old model and not a very good one either; the XFX is way better.


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> http://www.skytechshop.gr/hardware/power-supplies/thermaltake-toughpower-750w-atx-80plus-gold
> 
> what about this one?


Now, Thermaltakes are hit and miss. I have a Toughpower 675-watter still going after 5 years.

I also have a Rosewill HIVE 650 in my wife's machine. I did have a fan controller kill the first one, but Rosewill customer support is top notch and it was replaced in 2 weeks, there and back. I always look at Rosewill's offerings; they make good stuff.


----------



## patriotaki

I think I need to change my PSU... my CPU OC is also not so stable.
my options then are limited to these:

http://t-support.gr/index.php?route=product/product&product_id=1154386
http://www.skytechshop.gr/hardware/power-supplies/xfx-ts-series-750w-atx-80plus-gold


----------



## kizwan

Regarding single vs. multi +12V rails, please read this article. Multiple +12V rails are not necessarily bad.

http://www.overclock.net/power-supplies/761202-single-rail-vs-multi-rail-explained.html


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> i think i need to change PSU... my CPU OC is also not so stable..
> my options then are limited to these:
> 
> http://t-support.gr/index.php?route=product/product&product_id=1154386
> http://www.skytechshop.gr/hardware/power-supplies/xfx-ts-series-750w-atx-80plus-gold


Why only those? If you want modular and quality from that store, you could get this: http://www.skytechshop.gr/hardware/power-supplies/super-flower-leadex-gold-650w-sf-650f14mg-black?limit=100

If it's only those two, get the XFX TS 750W.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> Why only those? If you want modular and quality from that store, you could get this: http://www.skytechshop.gr/hardware/power-supplies/super-flower-leadex-gold-650w-sf-650f14mg-black?limit=100


It's a bit expensive, 130EUR + 10EUR shipping... plus it's not in stock either.


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> is a bit expensive 130EUR+10EUR shipping.. plus its not in stock either


Yeah I missed that, frickin weird languages!


----------



## 66racer

Hi guys,

I wanted to ask: for the purpose of mining, I'm planning on running 4 390X cards. My question is, can I use a motherboard that does not support 4-way CF? I'm split between the EVGA Z170 FTW and the ASRock Z170 OC Formula. The ASRock supports 4-way, but the EVGA has each PCI-E x16 slot pinned out as a full x16 slot. Judging by other mining rigs it seems the EVGA board will be OK, but I wanted to double check before buying. It will have a 6700K for purposes other than mining, and it will not be overclocked.

Also, the plan is 4 PowerColor Devil 390X water cooled cards, likely with the shrouds removed since they're over two slots..... UNLESS you guys think there is an air cooler that can handle that kind of setup under 24hr use.

Thanks guys!


----------



## Stige

Quote:


> Originally Posted by *66racer*
> 
> Hi guys,
> 
> I wanted to ask, for the purpose of mining Im planning on running 4 390x cards, my question is can I use a motherboard that does not support 4 way CF? Im split between the evga z170 ftw or the asrock z170 oc formula. The asrock supports 4 way but the evga has each pci-e x16 slot pinned out as a full x16 slot. Judging by other mining rigs it seems the evga board will be ok but wanted to double check before buying. It will have a 6700k for other purposes than mining and it will not be overclocked.
> 
> Also the plan is 4 powercolor devil 390x water cooled cards, likely with the shrouds removed since its over two slots.....UNLESS you guys think there is an air cooler that can handle that kind of setup under 24hr use.
> 
> Thanks guys!


Hasn't mining been a waste of time for a while now? At least for Bitcoin, anyway?


----------



## patriotaki

Has anyone used the 16.2.1 hotfix drivers?


----------



## patriotaki

just ordered the XFX TS Series 750W ATX 80Plus Gold P1-750G-TS3X

thank you all for your help! I hope I'll see some improvement!

have a nice day everyone thanks again

ps: sorry for turning this thread into a psu thread


----------



## Agent Smith1984

Quote:


> Originally Posted by *66racer*
> 
> Hi guys,
> 
> I wanted to ask, for the purpose of mining Im planning on running 4 390x cards, my question is can I use a motherboard that does not support 4 way CF? Im split between the evga z170 ftw or the asrock z170 oc formula. The asrock supports 4 way but the evga has each pci-e x16 slot pinned out as a full x16 slot. Judging by other mining rigs it seems the evga board will be ok but wanted to double check before buying. It will have a 6700k for other purposes than mining and it will not be overclocked.
> 
> Also the plan is 4 powercolor devil 390x water cooled cards, likely with the shrouds removed since its over two slots.....UNLESS you guys think there is an air cooler that can handle that kind of setup under 24hr use.
> 
> Thanks guys!


Not sure about the 4x CF thing, but in regards to cooling.....

If you are SURE you have the hose length and fan mounts for the Devils, then they are great cards, though I'm not sure removing the shroud will still buy the space you need..... In regards to potential air cooling, the only 2-slot cards that may withstand a 4x CF setup would be the Sapphires, in my opinion.... possibly the PowerColor or XFX also.... You could really try your hand at undervolting on the XFX cards, because the cores seem to be binned well, and they have drastically improved their VRM cooling on the latest versions. I have seen some people run -50 and even close to -100mv at 1,000MHz core. Undervolting is a really interesting approach to maintaining temps at stock speeds.


----------



## DarX098

Hi guys, what's a good Heaven engine score for a 390X? And the max score, overclocked on the stock cooler? Thanks


----------



## 66racer

Quote:


> Originally Posted by *Stige*
> 
> Hasn't mining been a waste of time for a while now? Atleast for Bitcoin anyway?


Yeah, Bitcoin needs special mining hardware from what I understand. This isn't my personal build; I was recommended to this person by a friend. There are other currencies that do well, though.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not sure about the 4x CF thing, but in regards to cooling.....
> 
> If you are SURE you have the hose length and fans mounts for the devils, then they are great cards, though I'm not sure the shroud being off will still buy the space you need..... In regards to potential air cooling... the only 2 slot cards than may withstand a 4x CF setup would be the Sapphire's in my opinion.... possibly the powercolor or XFX also.... You could really try your hand with undervolting on the XFX cards, because the cores seemed to be binned well, and they have drastically improved their VRM cooling on the latest versions. I have seem some -50 and even close to -100mv 1,000mhz core clocks for some people. Undervolting is a really interesting approach to maintaining temps at stock speeds.


The challenge on this build was that I needed to make it fit in a normal case; otherwise I would have loved a test bench like most mining builds use. The plan is a 750D; I was considering the Define XL but went with the 750D in the end.

Undervolt and underclock is something I will be looking at as well, thanks for pointing that out.

I think I will order a single Devil card first, before ordering all 4, to be sure. I just don't see any other option. I'm planning on mounting two on top and one as the rear exhaust, and the lowest card will use either the case floor or front intake. Planning on 2 other intake fans and possibly a custom bracket to make sure airflow is getting over the VRM heatsinks if I don't think they are getting enough air.

I have run my GTX 770 at 1.39v for long gaming sessions with a 120mm fan directed over the ASUS VRM heatsink with good results, so I figure at stock clocks, maybe with an undervolt, I should be OK. I will be using an IR temp sensor to get heatsink temps (yeah, not the most accurate, but better than nothing).

Will be using the EVGA P2 1600watt to power it all.

If I am missing anything let me know







That's why I'm here; any tips, I'm all ears. I probably won't order cards till tomorrow or Saturday.


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not sure about the 4x CF thing, but in regards to cooling.....
> 
> If you are SURE you have the hose length and fans mounts for the devils, then they are great cards, though I'm not sure the shroud being off will still buy the space you need..... In regards to potential air cooling... the only 2 slot cards than may withstand a 4x CF setup would be the Sapphire's in my opinion.... possibly the powercolor or XFX also.... You could really try your hand with undervolting on the XFX cards, because the cores seemed to be binned well, and they have drastically improved their VRM cooling on the latest versions. I have seem some -50 and even close to -100mv 1,000mhz core clocks for some people. Undervolting is a really interesting approach to maintaining temps at stock speeds.


XFX sells both the 390 and 390X with blowers now. The 390X blower, according to the XFX rep I talked to, is apparently pretty good. It may be loud, but it's probably a good choice for air-cooled 4-way cards. I plan on picking one of the 390X's up; it looks like the Fury X cooler and may have a slightly different heatsink, supposedly a very limited run of cards. The 390 has the standard 290X-style cooler on it.


----------



## patriotaki

anyone tried the 16.2.1 hotfix driver?


----------



## Irked

I'm running the 16.3 hotfix drivers, no issues yet.


----------



## Agent Smith1984

Quote:


> Originally Posted by *66racer*
> 
> YEah bitcoin needs special mining hardware from what I understand. This isnt my personal build since I was recommended to this person from a friend. There are other currencies that do well though.
> The challenge on this build was I needed to make it fit a normal case, man otherwise I would have loved a test bench like most mining builds use. The plan is a 750d, I was thinking the Define XL but went with the 750d in the end.
> 
> Undervolt and underclock is something I will be looking at as well, thanks for pointing that out.
> 
> I think I will order a single devil card first before ordering all 4 to be sure. I just dont see any other option. Im planning on mounting two on top and one as the rear exhaust and the lowest card will be either the case floor or front intake. Planning on 2 other intake fans and possibly a custom bracket to make sure airflow is getting over the vrm heatsinks if I dont think they are getting enough air.
> 
> I have run my gtx770 at 1.39v for long gaming sessions with a 120mm fan directed over the asus vrm heatsink with good results so I figure stock clocks and maybe with an undervolt I should be ok. I will be using an IR temp sensor to get heatsink temps (yeah not most accurate but better than nothing).
> 
> Will be using the EVGA P2 1600watt to power it all.
> 
> If I am missing anything let me know
> 
> 
> 
> 
> 
> 
> 
> Thats why Im here, any tips Im all ears. I probably wont order cards till tomorrow or saturday.


Have you considered going with the cheapest reference 290's you can find, and getting blocks for all of them? I mean, considering this is a mining rig, going with the highest end 390's won't benefit you over any of those cards, and the price difference will allow you to do custom water cooling.......


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> anyone tried the 16.2.1 hotfix driver?


I am running 16.3 and it's never been better. It has a new Power Efficiency setting you can disable so PowerPlay stops throttling your clocks. It's working great for me.


----------



## tolis626

Hey guys, there are a few things I've been meaning to ask regarding (mostly) memory overclocking.

First off, does anyone know why The Stilt recommends no higher than 1.05V AUX voltage? I seem to remember an explanation about the IMC being the hottest part of the chip or something. Does that mean that if temps are in check, we can push further? Or is it dangerous anyway and shouldn't be done for longer periods? (He said that 1.1V is OK for benchmarking.)

Secondly, is VRM2 on these cards the memory VRM? If so, wouldn't lowering its temps improve memory overclocking similar to how VRM1 temps affect core overclocking? Also, what temperatures does it usually operate at? On the MSI card there is no sensor for it.

Third, do any of you have any idea why memory stability on my card is hit or miss? It seems random. Like, one time I can play normally at 1725MHz and then, after a reboot or the next day, I can't even play at 1700MHz without crashing. What is up with that?

Thanks in advance!


----------



## patriotaki

What's Vulkan?
I read that it's an API like Mantle?

Will any games use Vulkan, or will it be a failure like Mantle?
Does Nvidia support Vulkan?


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, there are a few things I've been meaning to ask regarding (mostly) memory overclocking.
> 
> First off, does anyone know why The Stilt recommends no higher that 1.05V AUX voltage? I seem to remember an explanation about the IMC being the hottest part of the chip or something. Does that mean that if temps are in check, we can push further? Or is it dangerous anyway and shouldn't be done for longer periods (He said that 1.1V is ok for benchmarking).
> 
> Secondly, is VRM2 on these cards the memory VRM? If so, wouldn't lowering its temps improve memory overclocking similar to how VRM1 temps affect core overclocking? Also, what temperatures does it usually operate at? On the MSI card there is no sensor for it.
> 
> Third, do any of you have any idea why memory stability on my card is hit or miss? Seems like it's random. Like, one time I can play normally at 1725MHz and then, after a reboot or the next day or something I can't even play at 1700MHz without crashing. What is up with that?
> 
> Thanks in advance!


Actually, no more than 1.1V if I'm not mistaken. From the Hawaii BIOS mod thread:
Quote:


> From talks with The Stilt via PM, going too heavy-handed on upping this is not a good idea. A little increase can help, and an excessive one can do the opposite (i.e. not help). The MC is sensitive to voltage, so use caution. Its effects come down to the individual user's experience / card.


With the 290 ROM I don't need to add AUX voltage, but with the 390 ROM I need to increase it by at least +50mV for a memory overclock. Once you've found how much you need for your memory overclock, that's all you need; adding more doesn't increase your memory overclock headroom. Of course, core voltage can still help memory overclocking. At least that was my experience. Different cards react differently, or it has no effect at all.


----------



## 66racer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Have you considered going with the cheapest reference 290's you can fine, and getting blocks for all of them? I mean, considering this is a mining rig, going with the highest end 390's won't benefit you over any of those cards, and the price difference will allow you to do custom water cooling.......


That would probably be the best thing for temps, but I didn't want to suggest it from a downtime standpoint. If a card failed under the rigors of mining 24/7, it would be a nightmare warranty repair.

I just think the PowerColor Devil 390X missed the point of 4-way CF with their cooling solution; if they had used a blower fan it would have been perfect. Oh well.


----------



## jdorje

I read on the Internet just the other day that for a good PSU, more rails are better. The reasoning is that each rail will have separate circuit protection and overvoltage protection. The example given was a particular high-end model with eight 40A 12V rails.

Is that logic wrong? I understand it doesn't automatically apply to low-end units that have multiple low-amperage rails just because it's cheaper, and that you can't run a 390 very well on a 300W rail.


----------



## Agent Smith1984

Quote:


> Originally Posted by *66racer*
> 
> That would for sure be probably the best thing for temps but I didnt want to suggest that from a downtime standpoint. If a card failed under the rigors of mining 24/7 it would be a nightmare warranty repair.
> 
> I just think the powercolor devil 390x missed the point of 4 way CF with their solution, thinking if it was a blower fan they used it would have been perfect, oh well.


I think the need for reference "blower style" 390's went out the door with the death of the mining industry..... It was a huge craze on the 7900/290 series because mining was in its prime.

Now there's no need for the reference design, because gamers are the ones buying cards again. From a gamer's perspective, the last blower cards had terrible temp results, and you don't need the space savings they offer, because no one is using more than 2 or 3 cards together in a rig anymore.

I know that doesn't stand as 100% fact for everyone, but in most cases people want aftermarket cooling that works better, so they can just run 1 or 2 cards..... The Devil, to me, is a nice concept, but they should have offered a reference equivalent of the Fury X or something like that.....

Have you looked at some 295X2's at all?

Two of those would give you 4-way for what you want!


----------



## dopemoney

http://www.techpowerup.com/gpuz/details.php?id=54g48


----------



## Agent Smith1984

Quote:


> Originally Posted by *dopemoney*
> 
> http://www.techpowerup.com/gpuz/details.php?id=54g48


I have you added already; are you trying to show us something I've not seen? I did have a question for you..... is your card a reference blower?? You'd be one of the VERY few... as in, the ONLY owner on here of one of those.

Let us know how the OC goes and I will update the doc.


----------



## Noirgheos

So I recently got my hands on a 390X Nitro, yes, the X variant.

I pitted it against my R9 Fury.

390X: 1200/+200Mem.

Fury: 1050/+0Mem

Tests done at 1080p.

In Shadow of Mordor, Ultra preset:

R9 Fury: 108FPS

R9 390X: 102FPS

Witcher 3 HBAO+, Ultra (except for Shadow and Foliage Distance, those are high)(Intro battle after talking to Vesemir in White Orchard):

R9 Fury: 64-72FPS

R9 390X: 63-70FPS

Why are they so damn close? I really wasted my money on the Fury. I'm sure at 1440p and 4K the Fury will pull ahead by a larger margin, but a ~5-10FPS difference at 1080p with both overclocked... Not worth it. Gonna sell the Fury and get a 1080p FreeSync monitor with it. Thinking Nixeus.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> So I recently got my hands on a 390X Nitro, yes, the X variant.
> 
> I pitted it against my R9 Fury.
> 
> 390X: 1200/+200Mem.
> 
> Fury: 1050/+0Mem
> 
> Tests done at 1080p.
> 
> In Shadow of Mordor, Ultra preset:
> 
> R9 Fury: 108FPS
> 
> R9 390X: 102FPS
> 
> Witcher 3 HBAO+, Ultra (except for Shadow and Foliage Distance, those are high)(Intro battle after talking to Vesemir in White Orchard):
> 
> R9 Fury: 64-72FPS
> 
> R9 390X: 63-70FPS
> 
> Why are they so damn close? I really wasted my money on the Fury. I'm sure at 1440p and 4K the Fury will pull ahead by a larger margin, but a ~5-10FPS difference at 1080p with both overclocked... Not worth it. Gonna sell the Fury and get a 1080p Freesync with it. Thinking Nixeus.


Now you know why I went from Fury back to 390X.... put $150 back in my pocket, and got rid of that god awful 4GB limitation I was having in GTA V at 4k....

Do note though, that there is more of a spread in performance at 4k because of the memory bandwidth difference, but the additional shader count seems to have very little impact, probably due to the continued use of 64 ROP's....


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I think the need for reference "blower style" 390's went out the door with the death of the mining industry..... It was a huge craze on the 7900/290 series because mining was in it's prime.
> 
> Now there's no need for reference design, because gamers are the ones buying cards again. From a gamer's perspective, the last blower cards had terrible temp results, and you don't need the space savings they offer, because no one is using more than 2 or 3 cards together in a rig anymore.


You can still get them:

http://www.amazon.com/XFX-1000MHz-Graphics-Cards-R9-390P-8BD6/dp/B01AWGY6II/ref=sr_1_2?ie=UTF8&qid=1457640879&sr=8-2&keywords=xfx+390

http://www.amazon.com/XFX-1050MHz-Graphics-Cards-R9-390X-8BD6/dp/B01AWGY2SW/ref=sr_1_3?ie=UTF8&qid=1457640887&sr=8-3&keywords=xfx+390x


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Now you know why I went from Fury back to 390X.... put $150 back in my pocket, and got rid of that god awful 4GB limitation I was having in GTA V at 4k....
> 
> Do note though, that there is more of a spread in performance at 4k because of the memory bandwidth difference, but the additional shader count seems to have very little impact, probably due to the continued use of 64 ROP's....


Still, a 390X is basically the same thing when overclocked, and with Freesync, it should last even longer.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Still, a 390X is basically the same thing when overclocked, and with Freesync, it should last even longer.


Oh I agree...... trust me, I found out first hand.....

I sold my MSI 390 and upgraded to a Fury...... ended up doing some experimenting with a 980 KPE after that, ultimately got rid of that and went to the MSI 390X, and now I feel like I am back where I am supposed to be... lol

Hawaii may be old, but it's a very mature and familiar platform for me, and it performs very well for the money, even now.

I got my 390X for $350 after rebate, and it came with a free mouse and a Hitman code... that's an awesome value....


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh I agree...... trust me, I found out first hand.....
> 
> I sold my MSI 390, and upgraded to Fury...... ended up doing some experimenting with a 980 KPE after that, and ultimately got rid of that and went to the MSI 390X, and now I feel like I am back where I am supposed to be... lol
> 
> Hawaii, may be old, but it's a very matured, and familiar platform for me, and t performs very well for the money, even now.
> 
> I got my 390x for $350 after rebate, and it came with a free mouse, and a Hitman code... that's an awesome value....


Damn, I wish I was in the USA. Canada misses out on all this stuff... If I get the Hitman code, I might sell it and put it toward a new SSD. Gonna aim for a 500GB one, get rid of my HDD once and for all. So loud.


----------



## 66racer

Quote:


> Originally Posted by *Noirgheos*
> 
> So I recently got my hands on a 390X Nitro, yes, the X variant.
> 
> I pitted it against my R9 Fury.
> 
> 390X: 1200/+200Mem.
> 
> Fury: 1050/+0Mem
> 
> Tests done at 1080p.
> 
> In Shadow of Mordor, Ultra preset:
> 
> R9 Fury: 108FPS
> 
> R9 390X: 102FPS
> 
> Witcher 3 HBAO+, Ultra (except for Shadow and Foliage Distance, those are high)(Intro battle after talking to Vesemir in White Orchard):
> 
> R9 Fury: 64-72FPS
> 
> R9 390X: 63-70FPS
> 
> Why are they so damn close? I really wasted my money on the Fury. I'm sure at 1440p and 4K the Fury will pull ahead by a larger margin, but a ~5-10FPS difference at 1080p with both overclocked... Not worth it. Gonna sell the Fury and get a 1080p Freesync with it. Thinking Nixeus.


Sorry, I'm on mobile right now so I don't see your system specs, but at 1080p the CPU and its overclock will matter more with either of those cards than they would with something less powerful.


----------



## PCMADD0CT0R

Agent are you using any custom modular cables with your Rosewill or the factory supplied ones? I have a Rosewill Photon 1050 and want to see if I can find a compatible setup for Rosewill PSU's in general. I submitted a request to CableMod support awaiting their response, but just wanted to know if you had any suggestions off-hand. Thanks in advance.


----------



## Noirgheos

Quote:


> Originally Posted by *66racer*
> 
> Sorry I'm on mobile right now so dony see your system specs but the cpu and overclock will matter more in your case at 1080p with both of those cards over something less powerful.


i7 4790K, 16GB of DDR3-2400, Maximus VII Hero. I've pretty much eliminated any other bottlenecks.


----------



## battleaxe

I redesigned the VRM coolers once again. Here's the results. I'm quite happy with how this worked out.









52C max on VRM1, less than 40C on VRM2... works for me.

1200mhz core, 1500mhz mem, 1.211v max during load.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I redesigned the VRM coolers once again. Here's the results. I'm quite happy with how this worked out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 52C max on VRM1, less than 40C on VRM2... works for me.
> 
> 1200mhz core, 1500mhz mem, 1.211v max during load.


Ooohhh, you need dat volt mod yo!!!!

Man, 1.35v, and you'd be looking at 1260-1300 core probably


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Ooohhh, you need dat volt mod yo!!!!
> 
> Man, 1.35v, and you'd be looking at 1260-1300 core probably


I can get 1270 for sure without it. Already tried. I was just showing the temps, and how they are a ton better now. I plan to put another rad between the two cards' VRM sections tomorrow so the heat from card one isn't transferred to card two's VRMs. Should help card two a bit.

I'm pretty stoked with this little project. Took a while to build and put together, but totally worth it.


----------



## m70b1jr

Glad to know.


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> I redesigned the VRM coolers once again. Here's the results. I'm quite happy with how this worked out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 52C max on VRM1, less than 40C on VRM2... works for me.
> 
> 1200mhz core, 1500mhz mem, 1.211v max during load.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


That is a nice mod. What was your water temp during that run? Or ambient temp?


----------



## dopemoney

Thanks for adding me. I didn't realize you had already done it; I guess I was waiting for some sort of confirmation. As for overclocking the card, I guess it wasn't horrible, but it wasn't great either. Using +50% power, I was only able to get it to 1060MHz on the core clock and 1565MHz on the memory clock. Anything higher on either clock resulted in artifacts or a black screen. I think my Heaven score is acceptable compared to other 390 variants, even though they overclock better.

And like I said, this is my card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131686
I don't think it's a reference card, because my understanding was that AMD did not make reference 390's. Or if they did, it was only for OEMs? Did they only make reference cards for the Fury X? In truth though, the card does not say PowerColor anywhere on it; rather it says VXR9 390 8GBD5-DHE and then there's a serial number, but the Newegg link I posted here is where the card came from. Well shoot, I just found this, but this is not my card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814105039


----------



## jodybdesigns

Quote:


> Originally Posted by *m70b1jr*
> 
> 
> Glad to know.


But are you Wallet Ready? Lord Gaben said I'm not


----------



## Stige

Quote:


> Originally Posted by *battleaxe*
> 
> I can get 1270 for sure without it. Already tried. I was just showing the temps, and how they are a ton better now. I plan to put another RAD between the two cards VRM's sections tomorrow so the heat from card one isn't transferred to card two VRM's. Should help card two a bit.
> 
> I'm pretty stoked with this little project. Tood a while to build and put together, but totally worth it.


How hot are the RAM modules now, with heatsinks like that? Can you keep your finger on them?


----------



## diggiddi

Quote:


> Originally Posted by *patriotaki*
> 
> whats the vulkan?
> i read that is an API like mantle?
> 
> will any games use vulkan or will it be a failure like the mantle?
> does nvidia use vulkan?


Vulkan is essentially Mantle. Also, it was not a failure at all; it effectively accelerated DX12 development.


----------



## Stige

Quote:


> Originally Posted by *diggiddi*
> 
> Vulkan is essentially Mantle Also it was not a failure at all, it effectively accelerated Dx12 development


Isn't Vulkan the next-gen OpenGL/Mantle replacement?


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> That is nice mod. What is your water temp during that run? Or ambient temp?


I don't know what the water temp was; I didn't check. Our house is kept at 70°F. So whatever that is...? Not sure what that is in Celsius.
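(For reference, the conversion is C = (F - 32) × 5/9, so 70°F works out to about 21°C. A quick sketch, helper name made up here:)

```python
def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(70), 1))  # → 21.1
```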
Quote:


> Originally Posted by *Stige*
> 
> How hot is the RAM modules now with heatsinks like that? Can you keep your finger on them?


Yes, they don't get very hot at all. But I don't think they do normally either?


----------



## Stige

Quote:


> Originally Posted by *battleaxe*
> 
> Yes, they don't get very hot at all. But I don't think they do normally either?


That is what I thought as well; there was a debate about this earlier in another thread, I think. Good to know I was right.


----------



## diggiddi

Quote:


> Originally Posted by *Stige*
> 
> Isn't Vulkan the next-gen OpenGL/Mantle replacement?


Nope, AMD gave the code to Khronos; there's a YouTube video where they stated this.

There's actually a discussion going on in here: http://www.overclock.net/t/1592431/anand-ashes-of-the-singularity-revisited-a-beta-look-at-directx-12-asynchronous-shading/920


----------



## patriotaki

Will Vulkan at least be available in most games?


----------



## gupsterg

Quote:


> Originally Posted by *Noirgheos*
> 
> So I recently got my hands on a 390X Nitro, yes, the X variant.
> 
> I pitted it against my R9 Fury.
> 
> 
> Spoiler: test results
> 
> 
> 
> 390X: 1200/+200Mem.
> 
> Fury: 1050/+0Mem
> 
> Tests done at 1080p.
> 
> In Shadow of Mordor, Ultra preset:
> 
> R9 Fury: 108FPS
> 
> R9 390X: 102FPS
> 
> Witcher 3 HBAO+, Ultra (except for Shadow and Foliage Distance, those are high)(Intro battle after talking to Vesemir in White Orchard):
> 
> R9 Fury: 64-72FPS
> 
> R9 390X: 63-70FPS
> 
> 
> 
> Why are they so damn close? I really wasted my money on the Fury. I'm sure at 1440p and 4K the Fury will pull ahead by a larger margin, but a ~5-10FPS difference at 1080p with both overclocked... Not worth it. Gonna sell the Fury and get a 1080p Freesync with it. Thinking Nixeus.


+rep for info.

IMO you're pitting a very good clocking 390X against a below-average Fury. Then also, IMO, the 390/X benefits from mature drivers / ROM, as essentially it's a Hawaii core.

On Hawaii, when a 390/X's memory controller timings are added to a 290/X ROM we see a slight increase; then when we tighten the RAM timings (390/X have slightly tighter timings from the factory) we see another slight increase. *But* regardless, a 390/X ROM flashed to a 290/X still gives slightly better performance (in most cases). Even if we flash a 290/X to a 390/X, due to the "fused ID" on the core a 290/X will not use the same driver path. This paragraph is to show that the 390/X benefits from a lot of maturity in drivers / ROM.

If you have time, I'd be interested to see a retest of the 390X @ 1050MHz vs the Fury @ 1050MHz.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Now you know why I went from Fury back to 390X.... put $150 back in my pocket, and got rid of that god awful 4GB limitation I was having in GTA V at 4k....


I've been a Custom PC reader since 2004; I highly regard their reviews, and many of my purchase decisions have been based on them. As some of the reviewers IIRC also work on Bit-Tech, it sometimes has the same reviews as that UK PC mag.

In Shadow of Mordor there's approx a 5% difference, and in Witcher III approx 10%, between the Fury Tri-X OC (1050/500) and the MSI 390X Gaming (1100/1525). In GTA V @ 4K there is a 20% min FPS & avg FPS difference; I'm not seeing a 4GB HBM limit?

In Crysis 3 I can see a near 20% min FPS increase and ~15% avg FPS. BF4 is approx 10%, nearer ~15% @ 4K. Alien Isolation is approx 6% in min FPS and 5% avg FPS @ 1080p; as we rise in resolution we get close to 11%.

Now, the 390X has 2816 SPs and the Fury has 3584, a 27% increase in SPs. Then take the 290 vs 290X (or 390 vs 390X): 2816 vs 2560 = a 10% difference in SPs. Once a 290 was at the same clock as a 290X, the gap was generally small; see the Bit-Tech review of the Asus DCUII 290 OC (1000/1260) vs the 290X (1000/1250).

So would you not say that, if the 390X and Fury were at the same clocks, the approximate FPS gap would still be there, but bigger than the 290 vs 290X clock-for-clock gap, since the SP % difference is greater between the 390X and the Fury?
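The SP percentages quoted above can be sanity-checked in a couple of lines (shader counts taken from the post itself; helper name made up here):

```python
def sp_increase_pct(base, bigger):
    """Percentage increase in shader processor count from base to bigger."""
    return (bigger - base) / base * 100

print(round(sp_increase_pct(2816, 3584)))  # 390X -> Fury: prints 27
print(round(sp_increase_pct(2560, 2816)))  # 290 -> 290X (or 390 -> 390X): prints 10
```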

Currently I've only done some quick tests, here is my 3DM FS results.

Link:- Fury X 1090 (genuine, not an unlocked card) vs Fury 1090 (3840SP unlock) vs Vapor-X 290X 1100/1525 (heavy BIOS mod)

The above were not on 16.3 drivers; some owners of Fiji, IIRC, are reporting better results with the 16.3 drivers.

Now, the new 16.3 drivers add a new option in the control panel, "Power Efficiency".

Same ROM / RIG, etc, view what happens to GPU clock in Heaven test on the Fury X I currently have in rig.



Spoiler: PE : Off









Spoiler: PE : On







HML files.

Power_eff_check.zip 10k .zip file


All drivers prior to 16.3 had PE hidden and always On. I haven't had a chance to see how it affects FPS, so I'm currently only assuming it will have some positive impact with PE : Off.

When I was upping the "PowerLimit" prior to 16.3's PE : Off option, I'd see "PowerTune" get more aggressive in downclocking the GPU in Heaven even if the GPU temp was very cool (approx. <55C). I had to find a balancing point where the PL was enough to get the clocks flatter without PT affecting them as much, but they never became truly flat until 16.3 with PE : Off.

Taking other aspects of the Fury Tri-X into context, I like it now vs my initial impressions. It can maintain <=55C very quietly on air, compared with Hawaii, where the best I could get was <75C. Also, dunno what magic they did with the VRM, but the 6-phase rear VRM is as cool as the Vapor-X 290X's 10-phase rear VRM. The other VRM temps are phenomenal on air too; if I compare my reference-PCB Tri-X 290 temps vs the Fury Tri-X, there is a world of difference.


----------



## coffeeplus

Hello!

Nice clock graph you posted there.
I was testing with the 16.3 driver myself yesterday, but didn't get exactly the same results.

Haven't tried Unigine yet, but fiddled a bit with Witcher.
Already posted my experience in another thread, here, but will quote below.
Original post: http://www.overclock.net/t/1594219/16-3-drivers-installed-but-clock-memory-not-working-correctly#post_24978940
Quote:


> But I also have a dilemma whether it's working as intended.
> With Power efficiency turned OFF, I observed the following:
> 1) Playing The Witcher 3, limited with MSI Afterburner at 50 fps, fullscreen and a settings combination so that I would expect that card is not at full load but still has room, even in the most demanding game areas - clock is not standing at 1015 Mhz, but has small fluctuations. However, D3D usage tends to stay at maximum, it's like instead of keeping constant 1015 Mhz and depending on the way those 50 FPS are achieved, the Usage would fluctuate, it lowers the clock so that the usage will be as close to maximum as possible.
> 2) While watching a Youtube video, clock is sitting at 1015 Mhz, nice smooth graph in MSI Afterburner.
> 3) Idle at 300 Mhz.
> 
> I will retest point 1) and maybe come back with some graph screenshots just to confirm exact behavior in the above described scenarios.
> Anyway, does anyone know if this is how it should actually behave?
> 
> While running 1) the frametime graph looked better than with 16.2 driver, but still had some small spikes from time to time. Although I think they were not so noticeable to the eye (my eyes, at least) during gameplay.


So, at least in Witcher 3, I didn't manage to get a smooth clock - but I did get one while watching YouTube in Firefox - isn't that a bit strange?!

Also, I haven't fiddled with the Power Limit at all! In MSI Afterburner I have it set at 0. Do you have any indication or tip on whether I should raise it (I have no plans to overclock), if it influences the clock speed for the better without any other drawbacks?


----------



## gupsterg

Regarding the Witcher III clock fluctuation: increase the PL with PE : Off and re-test.

Regarding your "While watching a Youtube video, clock is sitting at 1015 Mhz, nice smooth graph in MSI Afterburner": if you are using Firefox, switch off hardware acceleration.

Link to video on method :-


----------



## coffeeplus

Thank you for bothering with that video!
I am not particularly annoyed that the card is used while browsing in Firefox, although the temps get higher more often like this; maybe I can offload it to the CPU like you recommend.

I will test increasing the Power Limit when I get home and post results.

But I am a bit _wary_ of fiddling with the PL at the moment and would greatly benefit from some guidance:
1) Should this regularly be done if there's no intention to overclock?
2) Any drawbacks/dangers of setting a higher power limit margin? Like instability, more heat etc., apart from the extra watts of freedom for the card to draw.
3) Where should I do it from? MSI Afterburner? Or other software? I heard that the Crimson driver has the feature in the Overdrive tab (but I haven't touched it yet, out of the same wariness; I heard that Crimson settings might break the good working of your card, and that it's best to leave the settings at their defaults and adjust as little as possible).
4) What value should I increase it to? The maximum is 20% (as far as I heard). Do I need to increase it that much, or will a small increment suffice, like 5%?
5) Is there any documented or empirical claim that a 0% Power Limit makes the clock fluctuate, or could my case be an isolated one?


----------



## Ron Soak

The last 48 hours have been a roller coaster...

Had a 970, wanted to upgrade, read all these amazing reviews about the R9 390X, saw they were going cheaper than 980's and picked one up. (With Back-plate, no orange 1180 model)

Uninstalled Nvidia drivers (the wrong way I later learned) and freshly installed this card along with the 16.3 hot-fix and everything it wanted to install.

That Raptr gaming app is a POS BTW.

My monitor flickered, BLOPS 3 looked dark and fuzzy, and GTA V didn't work; I couldn't keep 30 fps even after turning things down to normal. I hate that there are game settings + Catalyst settings + Raptr settings.

After hours of mucking around (do you know how long it takes to close GTA, change 1 setting, and restart GTA? UGH) I found the proper way to uninstall NVIDIA drivers. Did that.

Thought the problem was fixed, BLOPS worked amazingly, however GTA just would not work for me in any capacity. Still getting 23fps.

Watched what it did in MSI Afterburner (un-overclocked) and the usage was going 100-0-100-0-100, insane. Read that turning a setting off in Crimson helped with that; tried it, it kinda helped, and I went to bed feeling a little better that I hadn't wasted my money.

After work, I tried to jump into GTA and it was stuttering, not keeping 30fps even in the slightest. Went back to the internet thinking there must be something I missed on a webpage somewhere.

"Disable ULPS"

The most magical words to ever grace my eyes.

Reset GTA settings, disabled ULPS, jumped in: 50-60fps constant with most things on high. SUCCESS!

So with that out of the way, I went about trying to do what I had intended to do 48 hours prior: OC.

With "Extend official overclocking limits" checked, along with "Disable ULPS" and "PowerPlay support" enabled, plus a decent OC, I have every single GTA setting at max, which was getting me 43 fps, so I have locked it at 30. Never been happier.

Such a long road!

It's also a really pretty card and complements my setup well!
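For anyone landing here with the same problem: the usual way to do this is the "Disable ULPS" checkbox in MSI Afterburner's AMD compatibility settings. As I understand it, that just flips the EnableUlps registry value to 0 under each display-adapter key, so a manual sketch would look something like the below (back up your registry first; the path and value name are from common community reports, not official AMD documentation):

```shell
:: Save as a .bat file and run it elevated. The GUID is the standard
:: Windows display-adapter class key; 0000 is the first adapter instance.
:: Repeat for 0001, 0002, etc. if you have more than one GPU.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableUlps /t REG_DWORD /d 0 /f
```

Reboot afterwards so the driver picks up the change.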


----------



## patriotaki

Quote:


> Originally Posted by *Ron Soak*
> 
> The last 48 hours have been a roller coaster...
> 
> Had a 970, wanted to upgrade, read all these amazing reviews about the R9 390X, saw they were going cheaper than 980's and picked one up. (With Back-plate, no orange 1180 model)
> 
> Uninstalled Nvidia drivers (the wrong way I later learned) and freshly installed this card along with the 16.3 hot-fix and everything it wanted to install.
> 
> That Raptr gaming app is a POS BTW.
> 
> My monitor flickered, BLOPS 3 looked dark and fuzzy while GTA V didn't work, couldn't keep 30 fps even after turning things down to normal. I hate that their are game settings + catalyst settings + raptr settings.
> 
> After hours of mucking around (do you know how long it take to close GTA change 1 setting, restart GTA UGH) I found the proper way to uninstall your NVIDIA drivers. Did that.
> 
> Thought the problem was fixed, BLOPS worked amazingly, however GTA just would not work for me in any capacity. Still getting 23fps.
> 
> Watched what it did on MSI Afterburner (un-overclocked) and the usage was going 100-0-100-0-100 insane. Read that turning a setting off in crimson, helped with that, tried it, kinda helped went to bed feeling a little bit better that I hadn't wasted my money.
> 
> After work, tried to jump onto GTA and it was stuttering and not keeping 30fps even in the slightest. Went back to the internet thinking their must be something I missed on a webpage somewhere.
> 
> "Disable ULPS"
> 
> The most magical words to ever grace my eyes.
> 
> Reset GTA settings, disabled ULPS jumped in, 50-60fps constant with most thing on high, SUCCESS!
> 
> So with that out the way, I went about trying to do what I had intended to do 48 hours prior. OC.
> 
> With Extended official overclocking limits checked, along with disable ULPS, PowerPlay support enabled, plus a decent OC I have every single GTA setting at max, which was getting me 43 fps so I have locked it at 30. Never been happier.
> 
> Such a long road!
> 
> Its also a really pretty card and compliments my set up well!


Are you playing at 1080p? Do you have any advanced graphics options on? Are you using AMD CHS? 8x AA? Everything maxed out?
What FPS do you get with the same settings before and after you disabled ULPS?


----------



## gupsterg

@coffeeplus

Answers:-

1) It can be needed, many owners up PL when throttle occurs.

2) You shouldn't see an increase in temps or be at risk by upping PL using your factory ROM.

3) MSI AB

4) Many will suggest go to 50%, I don't. I only adjust as much as is need for all games, etc to be stable. I never run things like Furmark / Kombuster / OCCT on GPU either.

5) Same answer as 1.

You'll note AMD drivers allow PL limit manipulation but not voltage control, so manufacturer deem it "Safe" to increase or perhaps needed for a user.

How are your GPU temps? Besides the PL causing throttling, GPU temps reaching a certain point can cause throttling too.


----------



## coffeeplus

Quote:


> Originally Posted by *gupsterg*
> 
> @coffeeplus
> 
> Answers:-
> 
> 1) It can be needed, many owners up PL when throttle occurs.
> 
> 2) You shouldn't see an increase in temps or be at risk by upping PL using your factory ROM.
> 
> 3) MSI AB
> 
> 4) Many will suggest go to 50%, I don't. I only adjust as much as is need for all games, etc to be stable. I never run things like Furmark / Kombuster / OCCT on GPU either.
> 
> 5) Same answer as 1.
> 
> You'll note AMD drivers allow PL limit manipulation but not voltage control, so manufacturer deem it "Safe" to increase or perhaps needed for a user.
> 
> How are you're GPU temps? as besides PL causing throttle, GPU temps reaching x point can cause throttle.


4) Do you think Unigine is in the same category as Furmark etc.? I wouldn't run Furmark either.

About my temps:
- Unigine Heaven with FPS unlocked (& no vsync) gets my card to around 78 degrees Celsius after 7 minutes. Maybe it gets higher over time, although I haven't tested; it seems to stabilize around that value.
- Witcher 3 capped at 60 FPS (although my card can't get much higher than that, and it even drops below sometimes, into the 50s) gets me around 73.
- Witcher 3 capped at 46/50 FPS keeps my card under 70, about 68-69.
(tested with Witcher 3 in Novigrad)

Are these temperatures close to the throttling zone? I thought not.
But my clock fluctuates. And I noticed a perceived improvement in smoothness when upgrading from 16.2 to 16.3, although the frametimes are still not entirely smooth in W3, and neither is the clock; it still has some small dips, but I think fewer and of lesser magnitude than with 16.2.


----------



## Stige

Quote:


> Originally Posted by *Ron Soak*
> 
> The last 48 hours have been a roller coaster...
> 
> Had a 970, wanted to upgrade, read all these amazing reviews about the R9 390X, saw they were going cheaper than 980's and picked one up. (With Back-plate, no orange 1180 model)
> 
> Uninstalled Nvidia drivers (the wrong way I later learned) and freshly installed this card along with the 16.3 hot-fix and everything it wanted to install.
> 
> That Raptr gaming app is a POS BTW.
> 
> My monitor flickered, BLOPS 3 looked dark and fuzzy while GTA V didn't work, couldn't keep 30 fps even after turning things down to normal. I hate that their are game settings + catalyst settings + raptr settings.
> 
> After hours of mucking around (do you know how long it take to close GTA change 1 setting, restart GTA UGH) I found the proper way to uninstall your NVIDIA drivers. Did that.
> 
> Thought the problem was fixed, BLOPS worked amazingly, however GTA just would not work for me in any capacity. Still getting 23fps.
> 
> Watched what it did on MSI Afterburner (un-overclocked) and the usage was going 100-0-100-0-100 insane. Read that turning a setting off in crimson, helped with that, tried it, kinda helped went to bed feeling a little bit better that I hadn't wasted my money.
> 
> After work, tried to jump onto GTA and it was stuttering and not keeping 30fps even in the slightest. Went back to the internet thinking their must be something I missed on a webpage somewhere.
> 
> "Disable ULPS"
> 
> The most magical words to ever grace my eyes.
> 
> Reset GTA settings, disabled ULPS, jumped in, 50-60 fps constant with most things on high. SUCCESS!
> 
> So with that out the way, I went about trying to do what I had intended to do 48 hours prior. OC.
> 
> With Extended official overclocking limits checked, along with disable ULPS, PowerPlay support enabled, plus a decent OC I have every single GTA setting at max, which was getting me 43 fps so I have locked it at 30. Never been happier.
> 
> Such a long road!
> 
> It's also a really pretty card and complements my setup well!


You play games on a PC and cap your FPS at something like 30? Why the heck would you ever do stupid stuff like that? You might as well be playing on inferior consoles then.

Quote:


> Originally Posted by *coffeeplus*
> 
> 4) Do you think Unigine is in the same category as Furmark etc.? I wouldn't run Furmark either.
> 
> About my temps:
> - Unigine Heaven with fps unlocked (& no vsync) gets my card around 78 degrees Celsius after 7 minutes. Maybe gets higher in time - although haven't tested? It seems to get pretty stable around that value.
> - Witcher 3 capped at 60 FPS (my card can't get much higher than that anyway, and it sometimes drops into the 50s) gets me around 73°C.
> - Witcher 3 capped at 46/50 FPS keeps my card under 70°C, about 68-69.
> (tested with Witcher 3 in Novigrad)
> 
> Are these temperatures close to the throttling zone? I thought not.
> But my clock fluctuates. And I noticed a perceived improvement in smoothness when upgrading from 16.2 to 16.3, although the frametimes in W3 still aren't completely smooth, and neither is the clock - it still has some small dips, but I think fewer and of lesser magnitude than with 16.2.


FurMark is crap, that's obvious. Why you would lump the Unigine benchmarks in with FurMark is beyond me though; they have nothing to do with each other and work nothing alike.

If you need to cap your FPS to have acceptable temperatures, I think you are looking at it completely wrong and need to find another way to fix those high temps. Capping FPS is just silly, especially at low numbers like that.

Also always put your power limit at +50%, no reason to have it any lower, ever. It will only help you if your card draws enough power to allow it to draw some more if/when needed.


----------



## gupsterg

@coffeeplus
Quote:


> Do you think Unigine is in the same category as Furmark etc.?


No, I'm happy to run Heaven all day long and use it as part of artifact testing when OC'ing a GPU.
Quote:


> Are these temperatures close to throttling zone? I thought not.


I agree - most ROMs throttle at 95°C; in Overdrive you can see the throttle temp as "Target GPU Temperature".
Quote:


> But my clock fluctuates.


Try a PL increase.


----------



## coffeeplus

Quote:


> Originally Posted by *Stige*
> 
> FurMark is crap, that's obvious. Why would you classify Unigine benchmarks with FurMark is beyond me though, they have nothing to do with eachother and work nothing alike.
> 
> If you need to cap your FPS to have acceptable temperatures, I think you are looking at it completely wrong and need to find another way to fix those high temps. Capping FPS is just silly, especially at low numbers like that.
> 
> Also always put your power limit at +50%, no reason to have it any lower, ever. It will only help you if your card draws enough power to allow it to draw some more if/when needed.


I wasn't proposing a classification - it was only a question, as I haven't read up on how relevant Unigine is.

I don't cap FPS for temperature's sake, as I don't think my current temperatures - the ones I have posted - are alarming. Do you think they are, or that I should have lower temps? (Maybe I have a problem with case cooling?) I capped FPS in search of a smooth frametime graph!
And I think a constant, horizontal frametime graph is worth more than the raw FPS number any day! Even if you go close to 30 FPS







(I previously owned a GTX 960 before going AMD, and I capped frames at 40 (or was it 38?) in The Witcher - to get a smooth frametime in Novigrad - although with the settings I was playing at I could have gone as high as 70 FPS in Velen). It was much better than having the FPS jump all around from 40 to 55!









I will play with the PL as soon as possible and let you know what works for me. However, I will take a conservative approach and increase it only in small increments. I don't overclock, so I guess I won't need many extra watts for my GPU! I'm thinking of starting with 10% to see if it flattens my clock graph at 1015 MHz.


----------



## Stige

Quote:


> Originally Posted by *coffeeplus*
> 
> I wasn't proposing a classification - it was only a question, as I haven't read up on how relevant Unigine is.
> 
> I don't cap FPS for temperature's sake, as I don't think my current temperatures - the ones I have posted - are alarming. Do you think they are, or that I should have lower temps? (Maybe I have a problem with case cooling?) I capped FPS in search of a smooth frametime graph!
> And I think a constant, horizontal frametime graph is worth more than the raw FPS number any day! Even if you go close to 30 FPS
> 
> 
> 
> 
> 
> 
> 
> (I previously owned a GTX 960 before going AMD, and I capped frames at 40 (or was it 38?) in The Witcher - to get a smooth frametime in Novigrad - although with the settings I was playing at I could have gone as high as 70 FPS in Velen). It was much better than having the FPS jump all around from 40 to 55!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will play with the PL as soon as possible and let you know what works for me. However, I will take a conservative approach and increase it only in small increments. I don't overclock, so I guess I won't need many extra watts for my GPU! I'm thinking of starting with 10% to see if it flattens my clock graph at 1015 MHz.


Isn't 1015MHz less than stock clocks? Why do you run it so low? At least my card is 1050MHz stock.


----------



## Carniflex

Quote:


> Originally Posted by *xxrafael95xx*
> 
> Nope that is why i was regretting because i have to spend lot of money, but what can i do.


While not exactly cheap, water cooling can actually be started "on a budget" as well. You just have to keep it reasonably simple and not max everything out right away.

As you already have a GPU block ordered, one cheap possibility (if you want to do the CPU as well) is to get one of the manufacturers' watercooling starter kits. They normally start somewhere around 120 EUR/USD for a 2x 120mm rad, a pump/res combo, fans, fittings, tubing and of course a CPU block. You would just need 2 extra fittings for a CPU + GPU loop (make sure they are the right size for the tubing used in the kit), or if you only want a GPU loop you can use the fittings intended for the CPU and leave the CPU block unused, or resell it on eBay or something to recover some of the cost.

If you opt to buy the parts separately you will end up roughly in the same ballpark as a premade kit, or a bit above:
2x 120 mm rad ~50 eur
pump ~50 eur
fittings / tubes ~15..20 eur
cpu block ~50 eur
reservoir ~15 .. 50 eur

Also, if your system is stationary you are not limited to a 2x 120mm rad or whatever fits inside the case; if you are willing to cough up marginally more money you can get a lot more radiator space with a cheaper-end monster radiator. For example, a Nova 1080 (9x 120mm) is normally about 120 EUR.

That is a lot more economical than getting similar radiator space from several separate radiators. And even in a tight case, the normal 2x 120mm rad that comes in a kit can be mounted externally by drilling a couple of 3mm holes in a side panel and running the hosing out of the case through whatever opening you have available.

All that radiator space basically determines is the temperature difference between air and water. So in that regard even a single 2x 120mm rad can be adequate for cooling an overclocked CPU putting out 150W and an overclocked GPU putting about 300W of heat into the loop. It would just mean a temperature delta of about 30°C (at 20°C room temp the loop would be approx. 50°C) and you would still get about 60-70°C component temperatures.
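That back-of-envelope estimate can be sketched in a few lines. This is a toy model only: the ~0.066 °C/W radiator thermal-resistance figure is an assumed value chosen to match the ballpark above, not a measured spec for any real radiator.

```python
# Rough loop-temperature model: radiator area sets the water-to-air delta.
# The thermal resistance below is an illustrative assumption, not a spec.

def loop_temps(heat_w, ambient_c, rad_resistance_c_per_w):
    """Return (water_temp_c, delta_c) for a given heat load and radiator."""
    delta = heat_w * rad_resistance_c_per_w  # water rises this far above ambient
    return ambient_c + delta, delta

# 150W CPU + 300W GPU into a single 2x 120mm rad (~0.066 C/W assumed)
water, delta = loop_temps(150 + 300, ambient_c=20, rad_resistance_c_per_w=0.066)
print(f"delta ~{delta:.0f}C, water ~{water:.0f}C")  # matches the ~30C / ~50C estimate
```

A bigger radiator just means a smaller assumed resistance, which is why a 9x 120mm monster rad drops the water so much closer to ambient.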


----------



## coffeeplus

Quote:


> Originally Posted by *Stige*
> 
> Isn't 1015MHz less than stock clocks? Why do you run it so low? Atleast my card is 1050MHz stock.


No. I haven't fiddled with the clock speed at all. This is stock speed for my model. I saw they have multiple models at different clocks, XFX.
Here they have one at 1000 Mhz: http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-390-double-dissipation-r9-390a-8dfr
And here is my model: http://xfxforce.com/de/products/amd-radeon-r9-300-series/amd-radeon-r9-390-double-dissipation-r9-390p-8df6


----------



## Stige

Quote:


> Originally Posted by *Carniflex*
> 
> While not exactly cheap, water cooling can actually be started "on a budget" as well. You just have to keep it reasonably simple and not max everything out right away.
> 
> As you already have a GPU block ordered, one cheap possibility (if you want to do the CPU as well) is to get one of the manufacturers' watercooling starter kits. They normally start somewhere around 120 EUR/USD for a 2x 120mm rad, a pump/res combo, fans, fittings, tubing and of course a CPU block. You would just need 2 extra fittings for a CPU + GPU loop (make sure they are the right size for the tubing used in the kit), or if you only want a GPU loop you can use the fittings intended for the CPU and leave the CPU block unused, or resell it on eBay or something to recover some of the cost.
> 
> If you opt to buy the parts separately you will end up roughly in the same ballpark as a premade kit, or a bit above:
> 2x 120 mm rad ~50 eur
> pump ~50 eur
> fittings / tubes ~15..20 eur
> cpu block ~50 eur
> reservoir ~15 .. 50 eur
> 
> Also, if your system is stationary you are not limited to a 2x 120mm rad or whatever fits inside the case; if you are willing to cough up marginally more money you can get a lot more radiator space with a cheaper-end monster radiator. For example, a Nova 1080 (9x 120mm) is normally about 120 EUR.
> 
> That is a lot more economical than getting similar radiator space from several separate radiators. And even in a tight case, the normal 2x 120mm rad that comes in a kit can be mounted externally by drilling a couple of 3mm holes in a side panel and running the hosing out of the case through whatever opening you have available.
> 
> All that radiator space basically determines is the temperature difference between air and water. So in that regard even a single 2x 120mm rad can be adequate for cooling an overclocked CPU putting out 150W and an overclocked GPU putting about 300W of heat into the loop. It would just mean a temperature delta of about 30°C (at 20°C room temp the loop would be approx. 50°C) and you would still get about 60-70°C component temperatures.


Now you made me want to buy a monster rad as well and mount it outside the case... DON'T DO THIS TO MEEEEEEEEE!








Can a single pump run a rad like that?


----------



## mus1mus

A single DDC or a D5 will do. But that will depend on the amount of blocks too.


----------



## Carniflex

Quote:


> Originally Posted by *Stige*
> 
> Now you made me want to buy a monster rad aswell and mount it outside the case... DONT DO THIS TO MEEEEEEEEE!
> 
> 
> 
> 
> 
> 
> 
> 
> Can a single pump run a rad like that?


A monster radiator's flow restriction is basically in the same ballpark as any other single radiator's (or even a bit lower, depending on internal construction). Overall I do not think radiators have a particularly significant impact on flow compared to the amount of restriction CPU and GPU blocks have. The one in the picture was running fine with 1x CPU block, 3x GPU core blocks and 2x DC-LT pumps at approx. 9V. The water was still flowing with only a single pump at 12V when I tried it, but temperatures were about 1-2°C worse compared to 2x DC-LT at 9V. The DC-LT is a _very_ small pump - about two or three of them should be equivalent to a D5 or any other "proper" pump, I think? They are approx. 4W a piece at 12V.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Now you made me want to buy a monster rad aswell and mount it outside the case... DONT DO THIS TO MEEEEEEEEE!
> 
> 
> 
> 
> 
> 
> 
> 
> Can a single pump run a rad like that?


I run an MO RA3 420 and use 2 x D5s. I wouldn't run my system with anything less. They're not as restrictive as a CPU / GPU block, but they're pretty restrictive compared to other rads, simply because of the massive size. In order to keep flow rates up where I wanted them, 2 x D5's were going to be necessary.


----------



## mus1mus

Actually, it also depends on the block's channel cross-section as much as on sheer channel length over the core.

I have 2 GTX 480 blocks that can actually hinder flow quite a lot due to the smaller crevices over the core.


----------



## Charcharo

Quote:


> Originally Posted by *Stige*
> 
> You play games on a PC and cap your FPS at something like 30? Why the heck would you ever do stupid stuff like that? You might aswell be playing on inferior consoles then.


Performance and graphics are actually PC gaming's least Ace against consoles.

Modding, backwards compatibility (real backwards compatibility), more games, emulation, cheaper long-term prices... the fact that it is not a toy... all of those are very valid reasons to go for a PC. Technically, those are the absolute biggest pros of PC gaming. Graphics and frames are nice, easy-to-market icing on top of the cake. Nothing more, though.

As for the R9 390... since seeing DX12 performance and new drivers from AMD, I gotta say I am even happier with my card. I made the right choice with it!


----------



## Stige

Quote:


> Originally Posted by *Charcharo*
> 
> *Performance and graphics are actually PC gaming's least Ace against consoles.*
> 
> Modding, backwards compatibility (real backwards compatibility), more games, emulation, cheaper long-term prices... the fact that it is not a toy... all of those are very valid reasons to go for a PC. Technically, those are the absolute biggest pros of PC gaming. Graphics and frames are nice, easy-to-market icing on top of the cake. Nothing more, though.
> 
> As for the R9 390... since seeing DX12 performance and new drivers from AMD, I gotta say I am even happier with my card. I made the right choice with it!


Did you really just say that out loud? Really? REALLY?
Worst troll ever.

Consoles have nothing over PC, nothing. ESPECIALLY NOT performance or graphics, they are just awful on consoles.


----------



## Charcharo

Quote:


> Originally Posted by *Stige*
> 
> Did you really just say that out loud? Really? REALLY?
> Worst troll ever.
> 
> Consoles have nothing over PC, nothing. ESPECIALLY NOT performance or graphics, they are just awful on consoles.


This is not trolling....

Least greatest Ace means that it is an Ace, but there are... bigger and better things on PC. Modding. Emulation. Long-term costs... need I go on?

Least =/= not. "Least" means I am comparing something. In this case I acknowledge the obvious graphical and performance superiority of PC and say it carries some importance. It's just that there are other, much bigger and much more important things.

That is also the reason I would take a PC weaker than a console over a console...


----------



## patriotaki

There is no doubt... PC > consoles.

But I have to say I enjoyed BF4 more on the PS4 than on PC.
Maybe it's because I was playing on a 55-inch FHD LED TV, and there was more teamplay because I always played with a squad.


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> There is no doubt... PC > consoles.
> 
> But I have to say I enjoyed BF4 more on the PS4 than on PC.
> Maybe it's because I was playing on a 55-inch FHD LED TV, and there was more teamplay because I always played with a squad.


This is why I have my computer hooked up to my 55" 4K TV










I run NO AA at all, and it allows me to get playable framerates on a single overclocked 390X with high-to-very-high settings in every game I have.

How well Crysis 3 and GTA V run for me at 4K with the settings I am using is pretty awesome. I mean, yeah, it's 45-70 FPS, so it's not staying above 60, but my average is around 60 FPS in Crysis 3 at 4K high settings, and around 57 FPS in GTA V with very high custom settings.

I don't see anything wrong with locking 30 FPS if the cinematic effect doesn't bother you... it actually works really well in GTA, but for shooters I recommend getting 60+ if you can.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> This is why I have my computer hooked up to my 55" 4K TV
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I run NO AA at all, and it allows me to get playable framerates on a single overclocked 390X with high-to-very-high settings in every game I have.
> 
> How well Crysis 3 and GTA V run for me at 4K with the settings I am using is pretty awesome. I mean, yeah, it's 45-70 FPS, so it's not staying above 60, but my average is around 60 FPS in Crysis 3 at 4K high settings, and around 57 FPS in GTA V with very high custom settings.
> 
> I don't see anything wrong with locking 30 FPS if the cinematic effect doesn't bother you... it actually works really well in GTA, but for shooters I recommend getting 60+ if you can.


Yes, I was also going to buy a 4K TV, but at the last moment I changed my mind... I decided to get a simple FHD 3D TV (Samsung F8000), because most of the time we just watch TV, and in my area we have very, very few channels that stream in FHD, so 4K wasn't worth it for me.


----------



## jdorje

Unigine is as hot as anything that's worth running.

OCCT may be hotter, but with its error testing you only need to run it for a couple of minutes and you'll probably test stability very well.

There's no reason not to set the power limit to +50%. Throttling is not your friend. If you're too hot, lower the voltage. That said, unless you are raising voltage significantly this doesn't seem to make a difference.

GTA V is not very GPU intensive. I have no trouble maintaining 65 fps at 1440p on ~very high. MSAA is disabled but most other settings are maxed. This takes a powerful CPU though - overclocking my 4690K is the difference between 58 and 65.


----------



## Vellinious

The Unigine benchmarks will NOT stress a GPU as hard as Firestrike Ultra graphics tests 1 and 2, and when stress testing with the FS graphics tests the CPU won't play a part in GPU usage, whereas in Valley it does. In Heaven not so much, but it does come into play. I NEVER use the Unigine benchmarks for stress testing game clocks... they're too easy on the GPU.


----------



## m70b1jr

I have to say Unigine is pretty weak; however, it points out tons of artifacting around scenes 22-26.

Also, I didn't realize it, but my XFX R9 390 has dual BIOS, so if I mess up a BIOS I have a second one to rely on. I'm also having weird issues with the "voltage unlock" BIOS mod.

I can get a stable overclock of 1200MHz, then Firestrike will randomly black screen, forcing me to reboot. When I reboot and revert back to stock speeds, my graphics driver keeps not responding and I have to switch BIOSes. I would like to know what voltages you guys are using at 1200MHz+, please.


----------



## Vellinious

I can run stock voltage at 1200....I get a few artifacts, but if I add a little bit of offset, it cleans up quite nicely. I haven't nailed down an exact number, but I'd guess +30mv to +50mv...in that neighborhood.


----------



## m70b1jr

Quote:


> Originally Posted by *Vellinious*
> 
> I can run stock voltage at 1200....I get a few artifacts, but if I add a little bit of offset, it cleans up quite nicely. I haven't nailed down an exact number, but I'd guess +30mv to +50mv...in that neighborhood.


WHAT
LOL I have to add like +250mv


----------



## tolis626

Quote:


> Originally Posted by *m70b1jr*
> 
> WHAT
> LOL I have to add like +250mv


One thing I learned the hard way. Don't ask Vellinious or some of the other guys (mus1mus and fyzz come to mind) here about their cards. You're going to leave really disappointed about your card.


----------



## THUMPer1

Anyone here with Hitman, 390/390x and win 10?


----------



## Stige

Quote:


> Originally Posted by *THUMPer1*
> 
> Anyone here with Hitman, 390/390x and win 10?


Yup.

DX12, All on maximum except:
- Both shadow settings medium
- AA off

Runs at steady 65+ FPS at all times.


----------



## patriotaki

where did you get DX 12 and Hitman?


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> where did you get DX 12 and Hitman?


Steam?


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> Steam?


what about DX12?


----------



## Spartoi

Quote:


> Originally Posted by *patriotaki*
> 
> what about DX12?


Bundled with / built into Windows 10.


----------



## Noirgheos

Looking at the Hitman and Division benchmarks... man, I wasted the cash on the Fury. I'm 1-6 frames ahead of a stock 390X in both games.


----------



## Stige

Quote:


> Originally Posted by *Noirgheos*
> 
> Looking at the Hitman and Division benchmarks... man, I wasted the cash on the Fury. I'm 1-6 frames ahead of a stock 390X in both games.


With that mindset, you could say the 390X is a waste of money because it's "only a few FPS faster than a 390".


----------



## Noirgheos

Quote:


> Originally Posted by *Stige*
> 
> With that mindset, you could say the 390X is a waste of money because it's "only a few FPS faster than a 390".


Yeah, but at least it can overclock. Anyway, I'm selling the Fury for a 390X plus $150, gonna put that towards FreeSync.


----------



## Stige

Quote:


> Originally Posted by *Noirgheos*
> 
> Yeah, but at least it can overclock. Anyway, I'm selling the Fury for a 390X plus $150, gonna put that towards FreeSync.


Silicon lottery really.

Like when I had my two HD7950, one did 1200 on Core easily, the second one barely managed 1100 with max effort.


----------



## Karoths

I'm torn between this and the 970. I know already that I'll be told to get the 390, seeing as that's what's here, but will a 600W EVGA 600B PSU power an R9 390? Some say yes while others say no, it's mixed. And please, in advance, don't say "upgrade your PSU". That would be a big help.


----------



## Noirgheos

Quote:


> Originally Posted by *Stige*
> 
> Silicon lottery really.
> 
> Like when I had my two HD7950, one did 1200 on Core easily, the second one barely managed 1100 with max effort.


Except adding voltage to the Fury lowers FPS... so what you can do at stock is what you're limited to. Don't even get me started on the instability of OCing HBM...


----------



## Dundundata

Quote:


> Originally Posted by *Karoths*
> 
> I'm torn between this and the 970. I know already that I'll be told to get the 390, seeing as that's what's here, but will a 600W EVGA 600B PSU power an R9 390? Some say yes while others say no, it's mixed. And please, in advance, don't say "upgrade your PSU". That would be a big help.


what cpu do you have?


----------



## diggiddi

Quote:


> Originally Posted by *Karoths*
> 
> I'm torn between this and the 970. I know already that I'll be told to get the 390, seeing as that's what's here, but will a 600W EVGA 600B PSU power an R9 390? Some say yes while others say no, it's mixed. And please, in advance, don't say "upgrade your PSU". That would be a big help.


It will run but most likely there will be no overclocking room


----------



## Karoths

Quote:


> Originally Posted by *Dundundata*
> 
> what cpu do you have?


I have an i5 4690K.


----------



## Karoths

Quote:


> Originally Posted by *diggiddi*
> 
> It will run but most likely there will be no overclocking room


I don't mind if it won't really be able to overclock; I just want to know if it can game at 1080p 60Hz with no major performance hits.


----------



## diggiddi

Should be ok


----------



## Karoths

ok thanks for the help


----------



## kizwan

Quote:


> Originally Posted by *Karoths*
> 
> I'm torn between this and the 970. I know already that I'll be told to get the 390, seeing as that's what's here, but will a 600W EVGA 600B PSU power an R9 390? Some say yes while others say no, it's mixed. And please, in advance, don't say "upgrade your PSU". That would be a big help.


Yes & overclocking too.


----------



## Karoths

Quote:


> Originally Posted by *kizwan*
> 
> Yes & overclocking too.


Wait, so even with this I'd be able to overclock the 390?


----------



## Tobiman

Quote:


> Originally Posted by *Karoths*
> 
> Wait, so even with this I'd be able to overclock the 390?


Yes. You are probably using 500 watts max with a balls to the wall overclock.


----------



## kizwan

Quote:


> Originally Posted by *Karoths*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yes & overclocking too.
> 
> 
> 
> > Wait, so even with this I'd be able to overclock the 390?
Click to expand...

Yes. I checked out the PSU. You have plenty for a 390.
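For anyone wanting to sanity-check that headroom themselves, a rough power budget is enough. Every component figure below is a ballpark assumption pulled from typical review numbers, not a measurement of this exact build:

```python
# Back-of-envelope PSU headroom check (illustrative numbers, not measurements).

parts_w = {
    "i5 4690K (stock-ish load)": 90,
    "R9 390 (heavy gaming load)": 275,
    "motherboard/RAM/SSD/fans": 60,
}

psu_w = 600
total = sum(parts_w.values())
headroom = psu_w - total
print(f"estimated draw ~{total}W, headroom ~{headroom}W on a {psu_w}W unit")
```

Even with a generous overclocking margin on top of the GPU figure, the estimate stays comfortably under 600W, which is the gist of the advice in this thread.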


----------



## Karoths

Quote:


> Originally Posted by *Tobiman*
> 
> Yes. You are probably using 500 watts max with a balls to the wall overclock.


Currently I wouldn't be surprised if I am already, but you never know. I'd rather be safe than sorry, thanks.


----------



## Karoths

Quote:


> Originally Posted by *kizwan*
> 
> Yes. I checked out the PSU. You have plenty for a 390.


awesome thanks for the help


----------



## diggiddi

I don't know about overclocking with that little reserve power.


----------



## Karoths

Quote:


> Originally Posted by *diggiddi*
> 
> I don't know about overclocking with that little reserve power.


Well, I don't know how far it can overclock with stock capabilities and no BIOS mod. I often mod my Nvidia BIOS when I get a new GTX card - the last one I used was a 960.


----------



## Tobiman

By balls to the wall, I mean LN2 overclocking, so you are completely fine with 600 watts.


----------



## Karoths

Thanks again. I guess I was a bit confused there, but thanks. So I can run it with no back-breaking drawbacks or the like?


----------



## uszpdoz

Quote:


> Originally Posted by *Stige*
> 
> With that mindset, you could say the 390X is a waste of money because it's "only a few FPS faster than a 390".


True indeed, but with the price gap between the 390 and 390X it's still okay - not so between the Fury X and 390X (at least at the prices where I live, anyway).


----------



## Agent Smith1984

So..... I got Hitman for free with my 390X. I had no idea it was DX12. I haven't even installed it yet... Guess what I'm doing tomorrow?!

Can't wait to see DX12 on this card at 4K in a new game!


----------



## Carniflex

Quote:


> Originally Posted by *Charcharo*
> 
> This is not trolling....
> 
> Least greatest Ace means that it is an Ace but there are... bigger and better things on PC. Modding. Emulation. Long term costs... need I go on?
> 
> Least =/= not. Least means I am comparing something. In this case I acknowledge the obvious graphical and performance superiority of PC and say it carries some importance. Just that there are other, much bigger and much more important things.
> 
> That is also the reason I would take a weaker than a console PC over a console...


Well, the console launch was not particularly impressive hardware-wise, to be frank. Even at launch it was possible, in essence, to build a PC for a similar cost to the console launch price with off-the-shelf parts that slightly outperformed the console (see, for example, http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc from the time the consoles launched). Time has passed since then and PCs have moved on, while the consoles have remained as they were at launch.

Then again, a typical console user does not know the difference between RAM and a hard disk, and they do not care to know it either. All they care about is whether the specific game they want is on the platform or not (this is of course true for the PC as a platform as well) - they all have their exclusives. Hell, there are whole genres on PC that are not present on consoles.


----------



## coffeeplus

Quote:


> Originally Posted by *gupsterg*
> 
> @coffeeplus
> 
> Answers:-
> 
> 1) It can be needed, many owners up PL when throttle occurs.
> 
> 2) You shouldn't see an increase in temps or be at risk by upping PL using your factory ROM.
> 
> 3) MSI AB
> 
> 4) Many will suggest go to 50%, I don't. I only adjust as much as is need for all games, etc to be stable. I never run things like Furmark / Kombuster / OCCT on GPU either.
> 
> 5) Same answer as 1.
> 
> You'll note AMD drivers allow PL manipulation but not voltage control, so the manufacturer deems it "safe" to increase, or perhaps needed for some users.
> 
> How are your GPU temps? Besides PL causing throttle, GPU temps reaching a certain point can cause throttle.


Quote:


> Originally Posted by *gupsterg*
> 
> @coffeeplus
> No, I'm happy to run Heaven all day long and use it as part of artifact testing when OC'ing a GPU.
> I agree - most ROMs throttle at 95°C; in Overdrive you can see the throttle temp as "Target GPU Temperature".
> Try a PL increase.


Quote:


> Originally Posted by *Stige*
> 
> If you need to cap your FPS to have acceptable temperatures, I think you are looking at it completely wrong and need to find another way to fix those high temps. Capping FPS is just silly, especially at low numbers like that.
> 
> Also always put your power limit at +50%, no reason to have it any lower, ever. It will only help you if your card draws enough power to allow it to draw some more if/when needed.


Hello!

I'm back with some results of my GPU clock analysis.
I did the following tests:

1. Unigine test - no v-sync, no framecap, Power Efficiency OFF, no PL increase in AB.
Results: Positive - the clock stays mostly at maximum!
Temperature: maybe not the best - but I also fiddled with my case setup a bit during the following graph.
Visualisation:


2. Witcher 3 tests - no v-sync, framecap 46, Power Efficiency OFF, PL increased in multiple steps: 10%, then 20%, then 30% from AB (increase and hit Apply while in-game).
Results: PL doesn't seem to influence clock frequency; the clock frequency is not as consistent as with Unigine.



3. Witcher 3 test - no v-sync, framecap 46, Power Efficiency OFF, no PL increase, ClockBlocker ON.
Results: clock fluctuations reduced, as expected. Also, look at the frametime graph - although there's not a great difference between 2) and 3), I think the graph in this test is slightly smoother. I couldn't tell visually though; I'm content with the smoothness in 2) as well.


Conclusions:
a) The clock seems to stay pinned at maximum either with ClockBlocker or when the GPU is 100% utilised, as with Unigine uncapped.
b) When using RTSS to cap frames so that usage is lower, the clock will fluctuate a bit. However, the fluctuations do not impact smoothness in a noticeable fashion.
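If you want to put a number on conclusion b), a few lines over a logged clock trace will do. The sample values below are invented for illustration; substitute a column of core-clock readings from your own Afterburner/HWiNFO log:

```python
# Quantifying clock fluctuation from a logged list of core-clock samples (MHz).
# The trace below is made-up example data, not a real log.
from statistics import mean, pstdev

def clock_stats(samples_mhz):
    """Return (average, std-dev, % of samples at the peak clock)."""
    peak = max(samples_mhz)
    at_peak = sum(1 for s in samples_mhz if s == peak) / len(samples_mhz)
    return mean(samples_mhz), pstdev(samples_mhz), at_peak * 100

trace = [1015, 1015, 998, 1015, 1015, 980, 1015, 1015, 1015, 1002]
avg, spread, pct = clock_stats(trace)
print(f"avg {avg:.1f} MHz, std-dev {spread:.1f}, {pct:.0f}% of samples at max")
```

A high percentage at max with a small std-dev matches the "small dips, no visible impact" behaviour described above; a low percentage would point at a PL or temperature throttle instead.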


----------



## Charcharo

Quote:


> Originally Posted by *Carniflex*
> 
> Well the console launch was not particularly impressive hardware wise to be frank. even at the launch it was possible, in essence, to build a PC for the similar cost of the console launch price with off the shelf parts that was slightly outperforming the console (see, for example, http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc from the time when consoles launched). About a year has passed since then and PC's have moved on while consoles have remained as they were at launch.
> 
> Then again a typical console user does not know the difference between RAM and hard disk and they do not care to know it either. All they care for is if the specific game they want is on the platform or not (this is ofc true for PC as a platform as well) - they all have their exclusives. Hell there are whole genres on PC that are not present on consoles.


I know that. I am just pointing out that Graphics and Performance are *LESSER* than Real Backwards compatibility, lower long term costs, Modding and Emulation.

That is why I would choose my old ATI 5770 PC over a new PS4.

Even if the PS4 had Quad Titan Fury X or something insane like that, I would still not care. The platform is by default limited and terrible.

As for the "make a PC for the same cash as a console"... not in all countries. It's not always possible.


----------



## Stige

Quote:


> Originally Posted by *Noirgheos*
> 
> Except adding voltage to the Fury lowers FPS... So what you can on stock is what you're limited to. Don't even get me started on the instability of OCing HBM...


Then your VRM is too hot and it's throttling the card or something. If more voltage lowers your FPS then there is something wrong, probably with temperatures, not with the card per se.


----------



## gupsterg

Quote:


> Originally Posted by *Stige*
> 
> Then your VRM is too hot and it's throttling the card or something. If more voltage lowers your FPS then there is something wrong, probably with temperatures, not with the card per se.


He is correct.

Buildzoid confirmed it a while back, link to post.

But it seems also my bios modding endeavors are helping







, link to buildzoids blog.

Now if we go back to Buildzoid's original posted data on FPS drop we see a correlation with the 1.300V manual VID limit we are seeing via bios mod.

PE on / off is having a big effect on Fiji, don't know about you Grenada guys, so on the whole I think drivers are working to limit Fury for power, etc.

Anyhow I think that is enough info here and will post more in relevant threads.


----------



## Noirgheos

Quote:


> Originally Posted by *Stige*
> 
> Then your VRM is too hot and it's throttling the card or something. If more voltage lowers your FPS then there is something wrong, probably with temperatures, not with the card per se.


Quote:


> Originally Posted by *gupsterg*
> 
> He is correct.
> 
> Buildzoid confirmed it a while back, link to post.
> 
> But it seems also my bios modding endeavors are helping
> 
> 
> 
> 
> 
> 
> 
> , link to buildzoids blog.
> 
> Now if we go back to Buildzoid's original posted data on FPS drop we see a correlation with the 1.300V manual VID limit we are seeing via bios mod.
> 
> PE on / off is having a big effect on Fiji, don't know about you Grenada guys, so on the whole I think drivers are working to limit Fury for power, etc.
> 
> Anyhow I think that is enough info here and will post more in relevant threads.


My VRMs are not getting too hot, as Gup mentions, it is a Fury issue.


----------



## Stige

Why are we discussing Fury related issues in this thread to begin with?


----------



## jodybdesigns

Quote:


> Originally Posted by *Stige*
> 
> Why are we discussing Fury related issues in this thread to begin with?


Because the anger is strong with Fury users. But we all know the 390(x) has been neck and neck with Fury since release. $350 extra dollars for 5 or 6 frames? Nah. No thanks.


----------



## Noirgheos

Quote:


> Originally Posted by *Stige*
> 
> Why are we discussing Fury related issues in this thread to begin with?


Because the Fury line is worthless garbage. Such a price demand for minimal frame gain.


----------



## gupsterg

LOL ....


----------



## Noirgheos

Quote:


> Originally Posted by *gupsterg*
> 
> LOL ....


I know I feel like a ******.


----------



## jodybdesigns

Well the 390 and an Ivy Bridge is ready for VR. I loved how they said 2 weeks ago you would need Haswell or better. The benchmark got smashed.


----------



## Vellinious

Quote:


> Originally Posted by *jodybdesigns*
> 
> Well the 390 and an Ivy Bridge is ready for VR. I loved how they said 2 weeks ago you would need Haswell or better. The benchmark got smashed.


I really don't think the Steam VR benchmark is going to mean anything when it comes right down to it. It's not nearly stressful enough on the GPU / CPU to really tell anything.

When a company that actually manufactures the equipment for VR comes out and says, "minimum requirements", they probably mean it....does that mean you can't get by with something a little less? Not likely, but....it's not going to "perform as advertised".


----------



## m70b1jr

Does anyone know how to save my overclock profile into a BIOS? If that makes any sense.. Or could someone do it for me? If I give someone my BIOS here, could they show me how I could (By default) add +200mv, and 1200mhz default clock speed?


----------



## Noirgheos

Quote:


> Originally Posted by *jodybdesigns*
> 
> Well the 390 and an Ivy Bridge is ready for VR. I loved how they said 2 weeks ago you would need Haswell or better. The benchmark got smashed.


Yep, I do have a 4790K, and I did run my 390X at 1200MHz. I got an 8.8


----------



## Vellinious

I hit 10 with a 290X @ 1250 / 1750....like I said, I don't think this bench is going to even be close to what reality is....


----------



## OneB1t

Funny thing is that this benchmark loves the memory settings from the 390X; if I swap the memory table from a 390X into my 290X, the score will be like +1.


----------



## spyshagg

9.7 with 290x @1170/1600

own two of them, so I'm set for vr


----------



## Noirgheos

Quote:


> Originally Posted by *spyshagg*
> 
> 9.7 with 290x @1170/1600
> 
> own two of them, so I'm set for vr


Really? Damn. Why did I score so low?


----------



## jdorje

VR is going to be very cpu-intensive compared to normal gaming. When your FPS drops, it's usually due to a CPU bottleneck, and having FPS drop under (?)90 in VR is a no-go. There's games (gta v) where no sub-4-ghz CPU can even maintain a steady 60 FPS. And drops are terrible.

But if you have a 3570k at 4.5 ghz or something that's better than a stock skylake i5, so...

But it's very game dependent. Other games don't need CPU power at all.


----------



## OneB1t

Vr will be dx12 so nearly all cpus will be fine


----------



## BradleyW

Hello 390/X users. Just looking for some feedback on the Hitman D3D12 test below:

HITMAN BENCHMARK 1080P
Quote:


> : 290 DX11 min fps 52
> : 290 DX12 min fps 53
> 
> : 390 DX11 min fps 52
> : 390 DX12 min fps 59


Source

Notice the R9 290 series doesn't really seem to benefit from DX12, but the 390 series does, yet they are more or less the same card.
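For what it's worth, the relative uplift in those minimum-fps figures works out like this (numbers taken straight from the quote above; this is just arithmetic, nothing newly measured):

```python
# Percentage uplift from DX11 to DX12 minimum fps, per the quoted numbers.
results = {
    "R9 290": (52, 53),  # (DX11 min fps, DX12 min fps)
    "R9 390": (52, 59),
}

for card, (dx11, dx12) in results.items():
    uplift = (dx12 / dx11 - 1) * 100
    print(f"{card}: {dx11} -> {dx12} min fps ({uplift:+.1f}%)")
# R9 290: 52 -> 53 min fps (+1.9%)
# R9 390: 52 -> 59 min fps (+13.5%)
```

About +2% for the 290 vs. +13.5% for the 390, which is the gap being discussed.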


----------



## Noirgheos

Quote:


> Originally Posted by *BradleyW*
> 
> Hello 390/X users. Just looking for some feedback on the Hitman D3D12 test below:
> 
> HITMAN BENCHMARK 1080P
> Source
> 
> Notice the R9 290 series doesn't really seem to benefit from DX12, but the 390 series does, yet they are more or less the same card.


People say the 290/290X are the same as 390/390X... they're not. The BIOS for the 390/390X is a lot better made and better utilized.

Try flashing a 390 BIOS to your 290; you should see an improvement.


----------



## mus1mus

It's the drivers. Not the BIOS.

You can flash a modded 390/X BIOS into a 290/X and it will still lag behind a 390/X clock for clock due to driver enhancements.


----------



## Noirgheos

Quote:


> Originally Posted by *mus1mus*
> 
> It's the drivers. Not the BIOS.
> 
> You can flash a modded 390/X BIOS into a 290/X and it will still lag behind a 390/X clock for clock due to driver enhancements.


Now that doesn't make sense. We already know the hardware is next to no different, what else can they be optimizing for?


----------



## mus1mus

The 300 series benefits from driver optimisation.

Even when you mod a 200 series card, it doesn't get the same driver path as the 300 series cards. It's AMD's way to differentiate each series.

Now, you might say 200 series cards beat 300 cards in certain benchmarks for certain users. But that has to be a combination of clocks, memory optimisations and other mods. Stock everything, 300 cards win.


----------



## jdorje

Quote:


> Originally Posted by *mus1mus*
> 
> The 300 series benefits from driver optimisation.
> 
> Even when you mod a 200 series card, it doesn't get the same driver path as the 300 series cards. It's AMD's way to differentiate each series.
> 
> Now, you might say 200 series cards beat 300 cards in certain benchmarks for certain users. But that has to be a combination of clocks, memory optimisations and other mods. Stock everything, 300 cards win.


Should be possible to hack the drivers somehow to get it to recognize the 290/x as a 390/x.


----------



## mus1mus

True. If that can only be done though.


----------



## Ron Soak

Quote:


> Originally Posted by *Noirgheos*
> 
> Now that doesn't make sense. We already know the hardware is next to no different, what else can they be optimizing for?


It's called binning: the components in the 390 will be from a better bin than the 290's, allowing them to seemingly take the same parts and make them work better.


----------



## jdorje

Quote:


> Originally Posted by *Ron Soak*
> 
> It's called binning: the components in the 390 will be from a better bin than the 290's, allowing them to seemingly take the same parts and make them work better.


Dubious. The 290 isn't made anymore, so they can't be binning them separately from 390s. Presumably the 390 and 390X are binned from the same production, just like the 970 and 980.

But binning mostly is done by the core quality, aka, a better binned chip will have more cores or clock higher. That doesn't explain why you'd do better at the same clocks.


----------



## Vellinious

From what I've seen, the 3xx series falls behind the 2xx series when you start putting some clock on them. I haven't seen a benchmark from a 3xx card yet that has impressed me.


----------



## OneB1t

In fact it's the same,
just the card lottery all over again.


----------



## patriotaki

whats the minimum wattage you need for 2x 390s ?

will my xfx ts 750w gold be enough?


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> whats the minimum wattage you need for 2x 390s ?
> 
> will my xfx ts 750w gold be enough?


If you don't overclock with BIOS mods and stuff, easily.


----------



## Noirgheos

Quote:


> Originally Posted by *Vellinious*
> 
> From what I've seen, the 3xx series fall behind the 2xx series when you start putting some clock on them. I haven't seen a benchmark by a 3xx card yet, that has impressed me.


Yeah I've noticed that the 3xx series don't OC as well. I got my 390X to 1200 though, so I'm happy.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Quote:
> 
> 
> 
> Originally Posted by *patriotaki*
> 
> whats the minimum wattage you need for 2x 390s ?
> 
> will my xfx ts 750w gold be enough?
> 
> 
> 
> If you don't overclock with BIOS mods and stuff, easily.
Click to expand...

Quite interesting how easily you can reach a conclusion without even considering his CPU.









1250W Gold trips at +200 with my Overclocked 5930K.


----------



## Noirgheos

Quote:


> Originally Posted by *mus1mus*
> 
> Quite interesting how easy you can make a conclusion without even considering his CPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1250W Gold trips at +200 with my Overclocked 5930K.


Well he has the Skylake thread in his signature. So... safe to assume 6700K or lower. He should be fine, but 390 CrossfireX can use up to 800W from what I've heard...


----------



## patriotaki

i got the 6600K, i have 2x R9 390s but i want to crossfire them only for benchmarks

ill downclock my 6600k to 3.9ghz to save some power


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> i got the 6600K, i have 2x R9 390s but i want to crossfire them only for benchmarks
> 
> ill downclock my 6600k to 3.9ghz to save some power


Don't even try. I have an EVGA 850 and I tried +100 on my 290s and it quit. They both can do 1290 at +200; that was when I had an EVGA 1300 for an X99 setup.


----------



## patriotaki

hmm then i need a bigger psu..


----------



## patriotaki

What if I don't OC the GPUs and keep my CPU at stock clocks?


----------



## mus1mus

It depends on your card's ASIC. If you have higher than 70%, it may become an issue. Lower, and you might get away with it.

For reference, both my 290Xs with 76% ASIC can run Firestrike longer before tripping OCP than if I mix my 290 with one of those 290Xs. The 290 has an ASIC of 83ish.

Which brings ASIC quality and power consumption into play: a higher ASIC requires lower voltage to clock, but consumes more power.


----------



## patriotaki

btw i have not been added to the list yet


----------



## diggiddi

You could run them at stock, but not for long though; I can't even overclock on a single 290X now,
my 750 just can't take it anymore.
Disclaimer! I do have an FX 8350 @ 4.8GHz


----------



## Stige

I ran a 2500K at 5GHz @ 1.52V and two HD7950s on a 620W PSU.

They have 200W TDP I think, 390 is 275W TDP.

750W should be sufficient without any extreme overclocks. His CPU is only 4.5GHz, which probably means very low voltage already. I doubt he would have any issues with the 750W PSU.
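That TDP arithmetic can be put into a back-of-envelope load estimate. Note the CPU and rest-of-system wattages below are my own rough assumptions, not measurements, and board TDP is a nominal figure that overclocking and raised power limits easily exceed:

```python
# Back-of-envelope load estimate for 2x R9 390 plus the rest of the system.
# All wattage figures here are nominal assumptions, not measured draw.
GPU_TDP_W = 275   # R9 390 nominal board power
NUM_GPUS = 2
CPU_W = 100       # rough guess for a mildly overclocked i5
REST_W = 50       # motherboard, RAM, drives, fans (assumed)

total_w = GPU_TDP_W * NUM_GPUS + CPU_W + REST_W
psu_w = 750
print(f"Estimated load: {total_w} W, {total_w / psu_w:.0%} of a {psu_w} W PSU")
# Estimated load: 700 W, 93% of a 750 W PSU
```

So at stock there is a little headroom; pushing power limits or BIOS-modded voltage eats through it quickly, which matches the OCP trip reports above.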


----------



## mus1mus

275 with Power Limit Untouched.


----------



## patriotaki

I could downclock my CPU to 4.0GHz, maybe at 1.2V?
I will leave the GPUs at stock.

or maybe i can use 2 psus... hmmmmm


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> i could downclock my cpu to 4.0GHz maaybe at 1.2v?
> i will leave the gpus at stock.
> 
> or maybe i can use 2 psus... hmmmmm


With both GPUs at stock . . . it might work. With 2 290s at stock I don't see 700W read from the UPS at load. But these are 290s, and 390Xs consume more for sure.

For benchmarking, minimum of 1000W i would say.

EDIT: I've seen some undervolt at stock to lower power use and heat.


----------



## jdorje

Quote:


> Originally Posted by *patriotaki*
> 
> whats the minimum wattage you need for 2x 390s ?
> 
> will my xfx ts 750w gold be enough?


I could certainly run two of my 390s with a 4690k on 750w at around stock power level. I could run two of them on 650w with the cpu overclocked and the 390s at 1125 mV and ~1000 mhz. When I do the above setup with a single 390 it takes under 400W at the wall. Of course that wouldn't be exciting for benching.

Get a kill a watt and check out power usage for yourself.


----------



## patriotaki

check this out

http://outervision.com/b/MGCOKY

For 4 hours of use: 684W


----------



## patriotaki

Quote:


> Originally Posted by *patriotaki*
> 
> check this out
> 
> http://outervision.com/b/MGCOKY
> 
> for 4hours of use : 684Watt


i can save even more wattage if i downclock the i5 to 3.5GHz


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> i can save even more wattage if i downclock the i5 to 3.5GHz


Keep 4GHz on the CPU and keep the GPUs at stock. Don't be running a useless app like FurMark.


----------



## patriotaki

Quote:


> Originally Posted by *rdr09*
> 
> keep 4GHz on cpu and keep the gpus at stock. don't be running useless app like furmark.


Nope.. I will run only 3DMark, GTA V, BF4, Witcher 3, Crysis 3, Unigine Heaven..

will it be ok?


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> Nope.. I will run only 3DMark, GTA V, BF4, Witcher 3, Crysis 3, Unigine Heaven..
> 
> will it be ok?


It will run for sure. If you plan on keeping to air cooling, I'm not sure you can ever clock them high enough to have any issues with the PSU.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> It will run for sure. If you plan on keeping to air cooling, I'm not sure you can ever clock them high enough to have any issues with the PSU.


yes air cooling it is..

what is the worst case scenario though?

If I have the 390s at stock, for example, and try to overclock them a bit, what will happen if the PSU can't handle it?
Will I damage any parts?


----------



## Stige

No. It will simply turn off.


----------



## spyshagg

Quote:


> Originally Posted by *Noirgheos*
> 
> Really? Damn. Why did I score so low?


Your score is great. It's just that I am running much tighter timings @ 1600MHz (BIOS mod).


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> btw i have not been added to the list yet


I'll update now, I must have missed you.... What are your final overclock numbers and voltages?


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'll update now, I must have missed you.... What are your final overclock numbers and voltages?


Don't update it yet.. I'll report in a few days; maybe I can get a better OC with the new PSU.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'll update now, I must have missed you.... What are your final overclock numbers and voltages?


I wasn't added either, sir.


----------



## Agent Smith1984

Yeah, what's weird is that I used to rely on my new email notifications to see when new posts were made, and use those to get the proofs of ownership, but for whatever reason, they aren't all going to my email anymore. Not sure why???


----------



## patriotaki

Maybe we can make a form to fill out that gets sent to you.. it would be easier.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, what's weird is that I used to rely on my new email notifications to see when new posts were made, and use those to get the proofs of ownership, but for whatever reason, they aren't all going to my email anymore. Not sure why???


I have been having issues getting emails from threads I have never even visited. Probably something the devs are still working on after the major revamp of the site.


----------



## jdorje

The worst case scenario overdrawing your psu depends on the unit. It'll either let you overdraw it and work fine, shut down, or catch on fire and explode.


----------



## patriotaki

Quote:


> Originally Posted by *jdorje*
> 
> The worst case scenario overdrawing your psu depends on the unit. It'll either let you overdraw it and work fine, shut down, or catch on fire and explode.


..lol

Since it's a Seasonic unit (XFX TS 750W Gold).. I don't think it'll blow up.. (fingers crossed)


----------



## jdorje

Upgrade to a diablotek if you want fireworks.


----------



## patriotaki

Quote:


> Originally Posted by *jdorje*
> 
> Upgrade to a diablotek if you want fireworks.


nah im good


----------



## Agent Smith1984

A 750W XFX should be fine. I ran (2) 290's with overclocks of 1150/1400 at +75mv on an 850w RAIDMAX with an overclocked FX-8300 probably pulling 200w all by itself.


----------



## Agent Smith1984

I know everybody is tightening timings for better performance..... but has anybody tried going the other direction, to get the 2.5GHz these ICs are rated up to?

https://www.skhynix.com/products.view.do?vseq=953&cseq=81

That's the exact chip on my MSI 390X (The t2 version)......

Seems like you could easily get into Fury's bandwidth territory with some loosened timings and extended VRAM clock allowance in the BIOS.

Not sure it would matter, but I am just curious.....
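The bandwidth side of that idea can be sketched with simple GDDR5 arithmetic (a rough estimate only: GDDR5 transfers 4 bits per pin per memory clock, Grenada has a 512-bit bus, and Fury X's HBM is rated around 512 GB/s):

```python
# Rough GDDR5 bandwidth on a 512-bit bus: bus width x effective rate / 8 bits.
BUS_BITS = 512  # Hawaii/Grenada memory bus width

def bandwidth_gb_s(mem_clock_mhz: int) -> float:
    """Theoretical GB/s for a given GDDR5 memory clock (effective rate = clock x 4)."""
    effective_mbps_per_pin = mem_clock_mhz * 4
    return BUS_BITS * effective_mbps_per_pin / 8 / 1000

for mhz in (1500, 1750, 2000):
    print(f"{mhz} MHz -> {bandwidth_gb_s(mhz):.0f} GB/s")
# 1500 MHz -> 384 GB/s
# 1750 MHz -> 448 GB/s
# 2000 MHz -> 512 GB/s
```

So on paper, 2000 MHz on the 512-bit bus would match Fury X's rated bandwidth; whether the memory controller and board would actually run there is another question.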


----------



## jdorje

Would be worth trying, but odds are low. I get the same 1740 mhz on both 1740 timings and 1225 timings. Seems unlikely that ~2000 timings would let me go much higher.

And in most cases reduced latency seems to be a bigger benefit than increased bandwidth.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I know everybody is tightening timings for better performance..... but has anybody tried going the other direction to get the 2.5GHz out of these IC's that they are rated up to??
> 
> https://www.skhynix.com/products.view.do?vseq=953&cseq=81
> 
> That's the exact chip on my MSI 390X (The t2 version)......
> 
> Seems like you could easily get into Fury's bandwidth territory with some loosened timings and extended VRAM clock allowance in the BIOS.
> 
> Not sure it would matter, but I am just curious.....


Something is holding back memory clocks on these cards. And the max you can get at stock timings isn't too far from what you can get when you tighten them down.

1675 is best I can muster on my Elpidas. Even when using the timings for 1725 or 1250.


----------



## Ron Soak

Hey

Is CrossFire legitimately worth it? Is the performance increase worth it?


----------



## rdr09

Quote:


> Originally Posted by *Ron Soak*
> 
> Hey
> 
> Is CrossFire legitimately worth it? Is the performance increase worth it?


Depends on the game you're playing. For 4K it is legit and worth it, unless you have a FreeSync monitor . . .

http://www.overclock.net/t/1590354/viewsonic-xg2700-4k-ips-adaptive-sync-thread


----------



## jdorje

Unless? Does freesync not work with crossfire?


----------



## rdr09

Quote:


> Originally Posted by *jdorje*
> 
> Unless? Does freesync not work with crossfire?


Not sure. I don't own a FreeSync monitor. But with a single GPU like a 390X, it would be easier for FreeSync to manage dips while trying to stay within the sync range.


----------



## jdorje

Off topic, but I've tweaked my xg270hu freesync range a few times. It defaults to 40-144. I changed it to 30-144 or even 2-144, but at lower frame rates it flickers. In fact it flickers below about 50 hz. Should I consider rma, or hack it again for 50-144 freesync?


----------



## rdr09

Quote:


> Originally Posted by *jdorje*
> 
> Off topic, but I've tweaked my xg270hu freesync range a few times. It defaults to 40-144. I changed it to 30-144 or even 2-144, but at lower frame rates it flickers. In fact it flickers below about 50 hz. Should I consider rma, or hack it again for 50-144 freesync?


I suggest you ask the op of the link i posted earlier - Arizona. He is a moderator and very helpful.


----------



## 66racer

Well, I was asking if the PowerColor Devil 390X would be a two-slot card after removing the shroud; I can confirm it is! All 4 came in today, so I quickly removed one shroud and was able to relax afterwards, haha. Later on I removed the ASRock I/O trim panel since it was touching the 390X backplate. It might have been OK, but function over form on this build especially.


----------



## jodybdesigns

Quote:


> Originally Posted by *66racer*
> 
> Well, I was asking if the PowerColor Devil 390X would be a two-slot card after removing the shroud; I can confirm it is! All 4 came in today, so I quickly removed one shroud and was able to relax afterwards, haha. Later on I removed the ASRock I/O trim panel since it was touching the 390X backplate. It might have been OK, but function over form on this build especially.


Well that's going to be a hefty monster LOL


----------



## jodybdesigns

Quote:


> Originally Posted by *jdorje*
> 
> Unless? Does freesync not work with crossfire?


Crossfire isn't recommended IMO. DX12 does NOT currently support multi-GPU setups. Also, the game has to have a correct driver profile. And if it's a GameWorks title you can forget getting any good performance from it. I waited 6 months on a proper Dying Light profile. I'll never wait 2 or 3 months on a driver again. Single-card solutions for me from now on.


----------



## Nameless1988

Single card is the way! Never going sli/crossfire: too many problems...


----------



## Agent Smith1984

Quote:


> Originally Posted by *jodybdesigns*
> 
> Crossfire isn't recommended IMO. DX12 does NOT currently support multi-GPU setups. Also, the game has to have a correct driver profile. And if it's a GameWorks title you can forget getting any good performance from it. I waited 6 months on a proper Dying Light profile. I'll never wait 2 or 3 months on a driver again. Single-card solutions for me from now on.


Wait, who told you DX12 doesn't support multiple GPU's??

DX12 is going to open up doors for multiple GPU's we've never seen before.

I've even seen Fury's and 980ti's running together in perfect dual GPU harmony. Funny thing is, a Fury + a 980ti actually performs better than a Fury + Fury, and a 980ti + 980ti.... crazy huh?

CRAZINESS!!!
http://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview/4


----------



## spyshagg

Quote:


> Originally Posted by *jdorje*
> 
> Unless? Does freesync not work with crossfire?


It does work. However, you need to enable the "frame pacing" option or else it will not work.

But I still find a good amount of stutter that FreeSync seems unable to correct when using CrossFire. BF4 is especially susceptible to stutter under CrossFire + FreeSync. Drop CrossFire and it's smooth.


----------



## rdr09

Quote:


> Originally Posted by *spyshagg*
> 
> It does work. However, you need to enable the "frame pacing" option or else it will not work.
> 
> But I still find a good amount of stutter that FreeSync seems unable to correct when using CrossFire. BF4 is especially susceptible to stutter under CrossFire + FreeSync. Drop CrossFire and it's smooth.


If I had FreeSync and were playing BF4 in 4K . . . I would drop FreeSync first. There is no tearing in BF4 at all.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Wait, who told you DX12 doesn't support multiple GPU's??
> 
> DX12 is going to open up doors for multiple GPU's we've never seen before.
> 
> I've even seen Fury's and 980ti's running together in perfect dual GPU harmony. Funny thing is, a Fury + a 980ti actually performs better than a Fury + Fury, and a 980ti + 980ti.... crazy huh?
> 
> CRAZYNESS!!!
> http://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview/4


I eat my words. This is nice. Do you know why the harmony? Async computing in the Fury - Physics in the 980. That's beautiful.

I have been seeing a lot that multi-GPU isn't supported. I am going to go back to my "DX12 is a myth" idea until it is finally here lol


----------



## patriotaki

My new psu xfx ts 750w arrived today








Maybe I will connect it later tonight or in the next few days.. from next week I will try the crossfire too.


----------



## Agent Smith1984

Well, now we see why AMD is happily bundling Hitman with their cards:
http://www.overclock.net/t/1594186/various-hitman-2016-pc-directx-11-vs-directx-12-performance

@ 4k, the 390x is 33% faster than the 980.... overclock your 980 all you want to, you aren't covering that ground!!!


----------



## m70b1jr

What else uses async compute, software-wise? To my knowledge, only DX12 is using async compute. Why would AMD include async compute if nothing used it? Unless it was a way of future-proofing their cards, and AMD knew DX12 would use it.


----------



## TsukikoChan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, now we see why AMD is happily bundling Hitman with their cards:
> http://www.overclock.net/t/1594186/various-hitman-2016-pc-directx-11-vs-directx-12-performance
> 
> @ 4k, the 390x is 33% faster than the 980.... overclock your 980 all you want to, you aren't covering that ground!!!


*sees results*
*rubs eyes*
*sees results again*
*rubs eyes*
is..is that a 390x outperforming a 980ti in most of the hitman benchmarks??! 0.o
I hope a similar thing shows up for Rise of the Tomb Raider, since it just got its DX12 patch a few days ago; been holding off on that game for some time for the DX12 patch.


----------



## jdorje

The same thing happens in ashes. However dx12 always crashes for me.

The 390 and x are bundled with hitman, but if you bought it recently newegg will probably give you a copy of the game. I got it on black friday and had no problems.


----------



## TsukikoChan

Quote:


> Originally Posted by *m70b1jr*
> 
> What else uses async compute, software-wise? To my knowledge, only DX12 is using async compute. Why would AMD include async compute if nothing used it? Unless it was a way of future-proofing their cards, and AMD knew DX12 would use it.


To my understanding, AMD knew async was the way forward but DirectX couldn't support it, so they started to create their own API (Mantle), which made good use of the GCN architecture, but only a few games picked it up. They offered their Mantle findings to the Vulkan group for the OpenGL upgrade, and the DirectX rewrite (11 -> 12) was, you could say, inspired by this. AMD always seems to be very forward-minded, but it often doesn't pay off until much later (HBM, async, unified memory, lots of cores for CPUs, etc).
i could be completely wrong in what i just wrote haha ^^;;


----------



## Agent Smith1984

I'm just excited that my 2015 GPU, based on 2012 tech, will still be kicking in 2017 and beyond..... 5+ years out of a GPU, and getting better performance as it goes along instead of seemingly getting slower, is just, well....... AWESOME!


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, now we see why AMD is happily bundling Hitman with their cards:
> http://www.overclock.net/t/1594186/various-hitman-2016-pc-directx-11-vs-directx-12-performance
> 
> @ 4k, the 390x is 33% faster than the 980.... overclock your 980 all you want to, you aren't covering that ground!!!


Actually that's a 390, not an X. The X is even faster.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, now we see why AMD is happily bundling Hitman with their cards:
> http://www.overclock.net/t/1594186/various-hitman-2016-pc-directx-11-vs-directx-12-performance
> 
> @ 4k, the 390x is 33% faster than the 980.... overclock your 980 all you want to, you aren't covering that ground!!!


Yes.. that's true, but we can't be happy from only 1 game.. we need to see more.

Don't forget that Hitman is an AMD title; if the 390X is better than a 980 in an NVIDIA game, then yes, AMD has done it!! Won the game.


----------



## jdorje

Quote:


> Originally Posted by *patriotaki*
> 
> Yes.. that's true, but we can't be happy from only 1 game.. we need to see more.
> 
> Don't forget that Hitman is an AMD title; if the 390X is better than a 980 in an NVIDIA game, then yes, AMD has done it!! Won the game.


Hitman and ashes are both amd biased, and you see that in comparisons under dx11.

But if you look at the boost a card gets from moving to DX12, that probably is not biased. I haven't worked out exact numbers, but it's something like 15% across the board for AMD, and nothing for NVIDIA. That's enough to push a 390 to match the 980, for instance.

It is unlikely that any nvidia game will support dx12 until they release cards that benefit from it. Then they all will, of course. So for a while it's a safe bet every dx12 game will continue to be amd biased.
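That "enough to push a 390 to the 980" reasoning is just multiplication; the fps values below are placeholders I made up purely to show the arithmetic, not benchmark data:

```python
# Hypothetical fps values purely to illustrate the ~15% DX12 uplift claim.
r390_dx11_fps = 52.0   # placeholder 390 DX11 result (not real data)
gtx980_fps = 60.0      # placeholder 980 result, assumed flat across APIs
UPLIFT = 0.15          # the rough AMD-only DX12 gain quoted above

r390_dx12_fps = r390_dx11_fps * (1 + UPLIFT)
print(f"390 DX12 estimate: {r390_dx12_fps:.1f} fps vs 980 at {gtx980_fps:.1f} fps")
# 390 DX12 estimate: 59.8 fps vs 980 at 60.0 fps
```

With those placeholder numbers, a ~15% one-sided gain is roughly the size of the 390-to-980 gap, which is the point being made.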


----------



## tolis626

A little help here guys. I've taken my MSI apart to replace the TIM and thermal pads and it was a mess. I can't figure out what pads to replace though. There is a green strip right on the VRMs and there is another one above it that's grey and stuck to the heatsink. Which one do I remove? Or do I remove both and install my own pads? Thanks!


----------



## jdorje

Quote:


> Originally Posted by *tolis626*
> 
> A little help here guys. I've taken my MSI apart to replace the TIM and thermal pads and it was a mess. I can't figure out what pads to replace though. There is a green strip right on the VRMs and there is another one above it that's grey and stuck to the heatsink. Which one do I remove? Or do I remove both and install my own pads? Thanks!


Pics? 390 or x?

The msi I believe has 8 vrms. The last two are snuck in above the regular 6. So I'd guess those pads are the cooling for both the 6 and the 2 and you should replace both.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Pics? 390 or x?
> 
> The msi I believe has 8 vrms. The last two are snuck in above the regular 6. So I'd guess those pads are the cooling for both the 6 and the 2 and you should replace both.


First off, thank you!

Now, pics :


You can see the cooler and the card in this photo, along with the pads I'm talking about.

Also, it's a 390x. Not sure if it matters though.


----------



## jdorje

The grey padding in the bottom left is the main vrm pad. You can see indentation of the 6 core vrms. Those same vrms are in the top right, exposed. The other 2 core vrms should be right above those, off the top of the picture. I think. Only msi has those extra vrms.

What's the green stuff in the top right? If that's more thermal pad it looks like it's on top of the vrm chokes. Usually those aren't cooled at all; no need to replace.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> The Grey padding in the bottom left is the main vrm pad. You can see indentation of the 6 core vrms. Those same vrms are in the top right, exposed. The other 2 core vrms should be right above those, off the top of the picture. I think. Only msi has those extra vrms.
> 
> What's the green stuff in the top right? If that's more thermal pad it looks like it's on top of the vrm chokes. Usually those aren't cooled at all; no need to replace.


I've got no idea! That's what I'm asking about actually. I removed it and it turned to crumbs mostly. The grey pad is intact. For now, I've replaced the grey one with my own Phobya pad and put some TIM on the caps for good measure.


----------



## jdorje

TIM will only help if the parts are in contact with the heat sink. That's what thermal pads are for: to bridge the gap of an uneven surface.

I assume the green stuff is cheaper thermal pad for the chokes. No need to replace but if you tore some off it can't hurt.

Are there 2 more vrms above the main 6? Get some padding to connect that to the main heat sink.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Tim will only help if they are in contact with the heat sink. That's what thermal pads are for - to bridge the gap of an uneven surface.
> 
> I assume the green stuff is cheaper thermal pad for the chokes. No need to replace but if you tore some off it can't hurt.
> 
> Are there 2 more vrms above the main 6? Get some padding to connect that to the main heat sink.


I replaced all pads I could find with the Phobya ones. Turns out they might be too thin or I messed up, because my VRM reaches 85C after a Valley benchmark run at +80/+50mV. Also, the core doesn't seem to be cooler at all, but I can't be sure. Before, it had enough TIM for two cards or more, it was a mess. I replaced it with some MX-4 at least, but it didn't seem to do much.

But hey, the upside is that it works!









Imma get some 1.5mm pads tomorrow. Maybe I should also replace the green stuff on the chokes? No idea...

Also, @Stige, you were right damn it. I probably do need 1.5mm pads.


----------



## jdorje

I got a 5c drop on core and 10c on vrms by remounting my xfx 390 8256.

For core, I suspect it's important to cover the entire chip with tim, unlike on a cpu. I used some old ceramique and spread it for full coverage with some plastic. If I were to remount it again I'd probably use clu.

For vrms, the 8256 has a separate vrm heat sink that screws down tightly. So there is good contact but limited cooling.

Make sure all 8 vrms are padded on the msi.

Though I guess the whole thing could also be explained if you didn't screw the cooler on tightly enough.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> I got a 5c drop on core and 10c on vrms by remounting my xfx 390 8256.
> 
> For core, I suspect it's important to cover the entire chip with tim, unlike on a cpu. I used some old ceramique and spread it for full coverage with some plastic. If I were to remount it again I'd probably use clu.
> 
> For vrms, the 8256 has a separate vrm heat sink that screws down tightly. So there is good contact but limited cooling.
> 
> Make sure all 8 vrms are padded on the msi.
> 
> Though I guess the whole thing could also be explained if you didn't screw the cooler on tightly enough.


Hmm... So you're saying it's worth a shot to try and tighten the screws some more? That'd be easy, I guess. More in-and-outs, but whatever works.

On the above image, in the red circle ("Circle", whatever) are the VRMs, right? If so, there's 7 of them. Maybe the 8th is in the green circle? Also, do the chokes (in the blue circle) need that green stuff? I put it back, but in pieces, because I didn't have enough padding left. Maybe I should get some padding for that too? Or leave them bare, or with some TIM?

I did spread the TIM, too. Maybe I put too much on it, but I doubt it was THAT much, especially compared to the stock application. Needing tightening could explain it too... Maybe I'll try later. For now, I'm too stressed over it. Like really, my hands were shaking as if I had Parkinson's. Damn...

Thanks for all your help jdorje!


----------



## christoph

Leave the chokes; no need for cooling. But leave the green padding, and maybe put a tiny drop of TIM on each. It won't hurt to keep them making contact with the heatsink, but there's no need to replace the pads on them, so just focus on the VRMs and the GPU.


----------



## mus1mus

I'd remove the pads on the chokes, as they may be hitting the heatsink and creating a gap at the VRMs and the GPU die.

They don't get hot.


----------



## tolis626

Quote:


> Originally Posted by *christoph*
> 
> Leave the chokes; no need for cooling. But leave the green padding, and maybe put a tiny drop of TIM on each. It won't hurt to keep them making contact with the heatsink, but there's no need to replace the pads on them, so just focus on the VRMs and the GPU.


Quote:


> Originally Posted by *mus1mus*
> 
> I'd remove the pads on the chokes, as they may be hitting the heatsink and creating a gap at the VRMs and the GPU die.
> 
> They don't get hot.


First off, thank you guys too.

Now, I went to the gym to blow off some steam, came back, and took the thing apart again. First thing I noticed: I had put too much paste on it, so that could explain the core temps. Then I proceeded to remove that green gummy thing from the chokes and left them bare. I reassembled the card (becomes easier every time, really; I went from a 7 on the Richter scale to a 5) and made sure every screw was tight enough.

Still, though, I ran Valley and got similar results. Core temps seemed kinda better (maybe?), but VRMs not so much. I think they took longer to heat up, but eventually they did hit 80C and I stopped the test. I guess I'll need 1.5mm pads and I'll need to take the thing apart once more, at least. Sigh... I'm getting depressed. I had to mess with it, didn't I?


----------



## jdorje

Maybe the msi only has 7 core vrms?

My xfx and afaik all the other cards have 6.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Maybe the msi only has 7 core vrms?
> 
> My xfx and afaik all the other cards have 6.


I dunno man, that's what it seems like. I guessed 8 meant 6 for the core and 2 for the memory. I can clearly see 7 though. So... Maybe 7+1? I really have no idea.

You can see it more clearly in this PCB shot from techpowerup.


There's also some other small pads scattered around, whose purpose escapes me, but I replaced those anyway.


----------



## jdorje

Pretty sure they all have 1 for the memory, off on the other side of the card.

That extra vrm on the msi is theoretically why it overclocks better.


----------



## patriotaki

I have upgraded my PSU to the XFX TS 750W Gold, but my GPU still coil whines.

Not as much as with the previous PSU, but still a little bit. Should I RMA it?


----------



## hearburnron

I read a 390X Crossfire review the other day that recorded spikes of 805W maximum draw benchmarking Shadow of Mordor, but usually it was closer to 700W. Sorry I don't have a link, as all I've done for the past 4 weeks is read and watch reviews of the 390X and I've no idea which one it was. I just got my new rig a week ago; I'm running 2 390Xs in Crossfire. My PSU is an XTR 850W. What happens if you need more power than you've got, anyway? Consequences?


----------



## hearburnron

Quote:


> Originally Posted by *jodybdesigns*
> 
> Crossfire isn't recommended IMO. Dx12 does NOT currently support multi GPU setups. Also, the game has to have a correct driver profile. And if it's a game works title you can forget getting any good performance from it. I waited 6 months on a proper Dying Light profile. I'll never wait 2 or 3 months on a driver again. Single card solutions for me from now on.


The latest patch release notes from Rise of the Tomb Raider suggest a reason to be excited to have Crossfire with Dx12:

_And there is a never before seen level of control over NVIDIA SLI and AMD CrossFireX configurations, which means that as a developer we can take full control over those systems and ensure users get a great experience with them_


https://140859222830%2Fdev-blog-bringing-directx-12-to-rise-of-the

Rejoice! DirectX12 is here and it's going to give everyone a free £150 upgrade to their cards!


----------



## patriotaki

Can anyone help me out?

I'm getting a high-pitched noise while playing some games, or when the card is under load. I noticed that if I use VSYNC the noise mostly disappears. Downclocking the card to 900MHz also makes the noise disappear.

Is my card defective?


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Pretty sure they all have 1 for the memory, off on the other side of the card.
> 
> That extra vrm on the msi is theoretically why it overclocks better.


Really? I thought most cards had 2 of them. Guess I was wrong.









I did notice that that VRM is bare, with no thermal pads or anything. Why would they do that?
Quote:


> Originally Posted by *patriotaki*
> 
> i have upgraded my PSU to the XFX TS 750W gold.. but my gpu still coil whines.
> 
> not as much as with the previous psu but still a little bit..should i RMA it?


Quote:


> Originally Posted by *hearburnron*
> 
> I read a 390X Crossfire review the other day that recorded spikes of 805W maximum draw benchmarking Shadow of Mordor, but usually it was closer to 700W. Sorry I don't have a link, as all I've done for the past 4 weeks is read and watch reviews of the 390X and I've no idea which one it was. I just got my new rig a week ago; I'm running 2 390Xs in Crossfire. My PSU is an XTR 850W. What happens if you need more power than you've got, anyway? Consequences?


First off, I would try undervolting. For not so demanding games, I run my card at 1040/1625MHz at -100mV and it runs just fine. With +50% Power Limit its maximum draw is close to 200W, but the average is nowhere near that, it hovers around 140W or something stupid like that. It's also dead silent with those settings.

Now, some cards might not do so well with undervolting (And I would prefer a card that overclocks well instead of undervolting well), but most will do at least something like 1000MHz at -50mV. Power consumption goes down a lot and a second card has much more performance to offer than an overclock. Just forget about breaking any records in benchmarks.
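
As a rough sanity check on why undervolting cuts power so much: dynamic power scales roughly with frequency times voltage squared (the standard CMOS rule of thumb). A minimal sketch, where every wattage, clock, and voltage is an illustrative assumption rather than a measurement:

```python
# Back-of-envelope GPU power estimate under an undervolt.
# Approximation: dynamic power P ~ f * V^2 (CMOS rule of thumb);
# leakage and board power are ignored, so treat this as a rough guide.

def scaled_power(p_base_w, f_base_mhz, v_base, f_new_mhz, v_new):
    """Scale a baseline power figure to a new clock/voltage pair."""
    return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

# Hypothetical baseline: 275 W at 1100 MHz and 1.25 V under load.
# Undervolted profile: 1040 MHz at roughly -100 mV (1.15 V).
estimate = scaled_power(275.0, 1100, 1.25, 1040, 1.15)
print(f"Estimated draw: {estimate:.0f} W")  # prints "Estimated draw: 220 W"
```

Real draw depends on load and leakage too, so measured averages (like the ~140W figure above) can land well below the scaling estimate; the point is just that a small voltage cut pays off quadratically.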









Also, coil whine happens. Some retailers/companies will RMA your card for coil whine, some won't. I say, if it doesn't bother you, ignore it. My card whines too, but it's inaudible over fan noise, game sounds, etc. It also isn't really audible unless I remove the side panel, so there's that. I wouldn't even try to RMA a card for coil whine unless it's downright awful and distracts you.









PS : Coil whine is just annoying. It's not a defect in the sense that your card isn't working properly. It's fine, it won't break or anything. It just makes some noise.


----------



## Vellinious

I've noticed that on the 970 / 980 / 290X that the coil whine starts to lessen as time goes on. All 3 of them I've had have whined like a banshee at first, but after running them really hard for a while, they eventually start to quiet down. I ran the 970s so hard, it was hardly noticeable at all after a while.


----------



## patriotaki

How did you run them so hard?


----------



## Vellinious

Overclocked them as hard as I could, and ran benchmarks.....


----------



## christoph

My friend's 960 has this coil whine, and my 390 has it too. I don't think it's defective; most 970s and 980s have the same coil whine.


----------



## Ron Soak

Coil whine can also be influenced by the PSU.
For example, it's commonly attributed to too much voltage inside of a capacitor, which results in noise of a certain frequency being generated.

PSUs have a lot of power going through them and may also be generating a certain frequency, normally inaudible, but if the GPU hits the same frequency it will be amplified.


----------



## mus1mus

Quote:


> Originally Posted by *Ron Soak*
> 
> Coil whine can also be influenced by the PSU
> For example its commonly *attributed to too much voltage inside of a capacitor* which has the result of noise of a certain frequency being generated.
> 
> PSU's have a lot of power going through them and may also be causing a certain frequency, inaudible normally but if the GPU hits the same frequency it will be amplified.


Keyword, COIL.
COIL - Inductor
INDUCTOR = Chokes

If a capacitor is making a sound, throw it away. Right away.


----------



## diggiddi

Quote:


> Originally Posted by *hearburnron*
> 
> I read a 390X Crossfire review the other day that recorded spikes of 805W maximum draw benchmarking Shadow of Mordor, but usually it was closer to 700W. Sorry I don't have a link, as all I've done for the past 4 weeks is read and watch reviews of the 390X and I've no idea which one it was. I just got my new rig a week ago; I'm running 2 390Xs in Crossfire. My PSU is an XTR 850W. What happens if you need more power than you've got, anyway? Consequences?


Your PSU will shut off to protect your system and prevent damage if it can't handle the amount of current being drawn. At least, a good PSU should.
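
For two overclocked 390Xs on an 850W unit, a quick headroom sum shows how close the spikes quoted above can get to the point where that protection trips. Every wattage below is an illustrative assumption; check your own parts' specs:

```python
# Quick PSU headroom check: sum estimated peak draws against the rating.
# Every wattage here is an illustrative assumption, not a measurement.

parts_w = {
    "R9 390X #1 (overclocked peak)": 300,
    "R9 390X #2 (overclocked peak)": 300,
    "CPU (overclocked peak)": 150,
    "Motherboard, RAM, drives, fans": 75,
}

psu_rating_w = 850
total_w = sum(parts_w.values())
headroom_w = psu_rating_w - total_w
print(f"Estimated peak draw: {total_w} W, headroom: {headroom_w} W")
# prints "Estimated peak draw: 825 W, headroom: 25 W"
```

With 805W spikes on the table, roughly 25W of headroom is why a borderline build can hard-shutdown under a combined CPU+GPU load even though the average draw looks fine.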


----------



## Slowpoke66

Quote:


> Originally Posted by *jdorje*
> 
> Pretty sure they all have 1 for the memory, off on the other side of the card.
> 
> That extra vrm on the msi is theoretically why it overclocks better.


Quote:


> Originally Posted by *tolis626*
> 
> Really? I thought most cards had 2 of them. Guess I was wrong.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did notice that that VRM is bare, with no thermal pads or anything. Why would they do that?


At least the MSI cards (390/390X) have two VRMs for the VRAM. They're positioned on the upper left side, under the midplate, just above the R12 choke.


----------



## mus1mus

Hawaii has that in every card iteration. IIRC.


----------



## patriotaki

My GPU coil whines at stock settings too


----------



## patriotaki

The coil whine is quite audible when VSYNC is off, and when I OC the card the noise is stronger. Should I run FurMark or Heaven for a few hours? I've had this card for more than 2 weeks.

EDIT: increasing the power limit increases the sound


----------



## spyshagg

my ref 290x has loud coil whine even after 6 months overclocked.

my DCU II 290x is dead silent, even with +230mv on it. DIGI VRM or something is written on the box.


----------



## patriotaki

i think ill RMA it


----------



## spyshagg

I sincerely classify coil whine in the same territory of "silicon lottery". It is a lottery.

Most of them will coil whine. The only reason my DCU II won't coil whine is because it has a different type of chokes than reference cards, IIRC.


----------



## Stige

Quote:


> Originally Posted by *spyshagg*
> 
> I sincerely classify coil whine in the same territory of "silicon lottery". It is a lottery.
> 
> Most of them will coil whine. The only reason my DCU II won't coil whine is because it has a different type of chokes than reference cards, IIRC.


This is correct. Coil whine can happen on any card, and these days it's very common as well. You are lucky if you have a card without any sort of whine.
RMAing because of coil whine is a bit silly.


----------



## mus1mus

Yep. It's as silly as RMAing because the card won't clock to 1100/1500 at stock VID.

Coil whine is normal for any inductor dealing with high switching frequency.


----------



## m70b1jr

Hey guys, probably not the right place, but imma try here first.

I was running prime 95, and valley bench mark to test my temps. Then my PC Shut off. Temps were fine.
CPU was around 67 (A bit high, but safe)
GPU Was about 55.

And it cut off. I tried turning it on, and no luck. So I unplugged it for about 30 minutes, and still wouldn't turn on, however the green light on the motherboard turns on. When attempting to turn it on, my powerbutton on my case flashes, for maybe half a second, and does nothing. No fans spin or anything. So I disconnected my powersupply from my PC, and going to put it back in when I get home from school. If that doesn't work, do you guys have any tips for me?


----------



## Vellinious

Quote:


> Originally Posted by *m70b1jr*
> 
> Hey guys, probably not the right place, but imma try here first.
> 
> I was running prime 95, and valley bench mark to test my temps. Then my PC Shut off. Temps were fine.
> CPU was around 67 (A bit high, but safe)
> GPU Was about 55.
> 
> And it cut off. I tried turning it on, and no luck. So I unplugged it for about 30 minutes, and still wouldn't turn on, however the green light on the motherboard turns on. When attempting to turn it on, my powerbutton on my case flashes, for maybe half a second, and does nothing. No fans spin or anything. So I disconnected my powersupply from my PC, and going to put it back in when I get home from school. If that doesn't work, do you guys have any tips for me?


Do you have another PSU to try in it? Sounds like the PSU popped to me.


----------



## m70b1jr

Quote:


> Originally Posted by *Vellinious*
> 
> Do you have another PSU to try in it? Sounds like the PSU popped to me.


Nope, and I can't afford another one for a while, because I just got my first job (I'm 16) And i'm saving for a car.


----------



## mus1mus

Or along the way, something was damaged from the dying PSU surge.

Valley and Prime to blame.


----------



## TsukikoChan

Had a similar thing happen to me in the past. Open the case and inspect the fans and motherboard for scorch marks.
I once had a case fan short (using a motherboard connector) which popped a diode on the mboard, so power was getting to the mboard but the (obvious to it, but not to me) short circuit stopped it from turning on. I blew up a different part of the mboard testing out the case fans (I had a feeling, but wasn't certain), so it's possible that if it's not the gcard or PSU, your mboard or a case fan might've popped.
I learned a new moral from that story: don't run Fractal fans straight from the motherboard ^^;;
The solution is to switch out parts, but that is dangerous, I suppose. Say it is the PSU: you put in a new gcard or motherboard and, bam, another one bites the dust. Or the motherboard is dead and you switch out the PSU, and bam, another one bites the dust (maybe). You're in a tricky predicament ^^;;


----------



## m70b1jr

Quote:


> Originally Posted by *TsukikoChan*
> 
> Had a similar thing happen to me in the past. Open the case and inspect the fans and motherboard for scorch marks.
> I once had a case fan short (using a motherboard connector) which popped a diode on the mboard, so power was getting to the mboard but the (obvious to it, but not to me) short circuit stopped it from turning on. I blew up a different part of the mboard testing out the case fans (I had a feeling, but wasn't certain), so it's possible that if it's not the gcard or PSU, your mboard or a case fan might've popped.
> I learned a new moral from that story: don't run Fractal fans straight from the motherboard ^^;;
> The solution is to switch out parts, but that is dangerous, I suppose. Say it is the PSU: you put in a new gcard or motherboard and, bam, another one bites the dust. Or the motherboard is dead and you switch out the PSU, and bam, another one bites the dust (maybe). You're in a tricky predicament ^^;;


I did try to take out my GCard to power it on, but that did nothing. I'll check the fan plugs when I get home and look for other shorts. If nothing, I'll probably get a new PSU (Does anyone here have a 650W+ they'd sell to me for cheap? <3 ) If a new PSU doesn't work, I'll probably end up killing myself (RIP PC 2k16)


----------



## Vellinious

Quote:


> Originally Posted by *m70b1jr*
> 
> I did try to take out my GCard to power it on, but that did nothing. I'll check the fan plugs when I get home and look for other shorts. If nothing, I'll probably get a new PSU (Does anyone here have a 650W+ they'd sell to me for cheap? <3 ) If a new PSU doesn't work, I'll probably end up killing myself (RIP PC 2k16)


Uh...that's probably a bit extreme...


----------



## m70b1jr

Quote:


> Originally Posted by *Vellinious*
> 
> Uh...that's probably a bit extreme...


lol, nah, I won't kill myself, it just would suck to see years of saved-up components get fried in one swoop.


----------



## Tobiman

Quote:


> Originally Posted by *Carniflex*
> 
> Well the console launch was not particularly impressive hardware wise to be frank. even at the launch it was possible, in essence, to build a PC for the similar cost of the console launch price with off the shelf parts that was slightly outperforming the console (see, for example, http://www.eurogamer.net/articles/digitalfoundry-2014-the-next-gen-digital-foundry-pc from the time when consoles launched). About a year has passed since then and PC's have moved on while consoles have remained as they were at launch.
> 
> Then again a typical console user does not know the difference between RAM and hard disk and they do not care to know it either. All they care for is if the specific game they want is on the platform or not (this is ofc true for PC as a platform as well) - they all have their exclusives. Hell there are whole genres on PC that are not present on consoles.


Sorry, but a good amount of this is bollocks. There's no way a $400 or $500 PC will match consoles, especially in AAA titles. Even an XBONE lays waste to an i3/750Ti combo in ALL the latest games.


----------



## Stige

Grab your consoles and get out of this thread with your inferior systems, no one cares about those. They suck and that's it really. Must be hard to play with aimbots and crap...


----------



## jodybdesigns

Quote:


> Originally Posted by *Stige*
> 
> Grab your consoles and get out of this thread with your inferior systems, no one cares about those. They suck and that's it really. Must be hard to play with aimbots and crap...


Lol

YEAH!


----------



## jdorje

I have a 750w g1 to get rid of...


----------



## patriotaki

Don't know if I'm getting used to the sound or if the coil whine is going away a bit. Will test it for a few more days and see.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> I have a 750w g1 to get rid of...


How much you trying to sell it for? <3


----------



## jdorje

Quote:


> Originally Posted by *m70b1jr*
> 
> How much you trying to sell it for? <3


Probably like $40.

But back to topic, what was your old psu, gpu, and cpu? How far overclocked?


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Probably like $40.
> 
> But back to topic, what was your old psu, gpu, and cpu? How far overclocked?


FX-8350 at 4.5GHz, stock voltage.
The R9 390 was OC'd to 1160MHz, +100mV.

I was running Valley and Prime95 when it happened, which I've done before.

My power supply was a cheap Rosewill 630W PSU. Newegg doesn't even offer it anymore.
http://www.newegg.com/Product/Product.aspx?Item=N82E16817182200


----------



## Agent Smith1984

GRRRRRRR

Just updated GTA V and now I am getting core clock fluctuations and stutters in the game.... wth?


----------



## jdorje

Quote:


> Originally Posted by *Agent Smith1984*
> 
> GRRRRRRR
> 
> Just updated GTA V and now I am getting core clock fluctuations and stutters in the game.... wth?


My 390 crushes gta v at 1440.

I would guess cpu bottleneck (stutters can happen on an i5 if you run anything in the background) or running out of ram (causes fps to drop to 20 for half a second every few seconds).

But wait, you mean updated, as in you were running it before? Could be game issue.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> GRRRRRRR
> 
> Just updated GTA V and now I am getting core clock fluctuations and stutters in the game.... wth?


I super smash GTA5 @ 1440p. I see some stutter, but my fps will just jump from 60 to 59.8 when it happens. My GPU load is a nice 100%, with my i5 3570k @ 4.2GHz topping out at 78%, so I'm not sure why the random stutters.


----------



## tolis626

Hey guys, regarding my issues with the TIM and thermal pad replacement.

Today I got a 1.5mm Phobya pad and a tube of Gelid GC Extreme. I installed them and, while core temps did improve a bit, the VRM temps are still high. Like, 81C after 5 minutes of Valley high. Still running at 1170/1675MHz at +80/+50mV. Really, I'm getting worried here. I have no idea what I might be doing wrong.


----------



## jdorje

I'd hit 100-110C vrms at 1305 mV...


----------



## Agent Smith1984

Quote:


> Originally Posted by *jodybdesigns*
> 
> I super smash GTA5 @ 1440p. I see some stutter but my fps will jump from to 60 to 59.8 when it happens. My GPU load is a nice 100% with my i5 3570k @ 4.2ghz topping at 78% so I'm not sure why the random stutters.


I run gta v at 4k with no issues at all.... What I'm saying is, it updated this morning and now I'm getting clock fluctuations and stutters.... Also, some people's skin is blue?


----------



## Agent Smith1984

Delete


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, regarding my issues with the TIM and thermal pad replacement.
> 
> Today I got a 1.5mm Phobya pad and a tube of Gelid GC Extreme. I installed them and, while core temps did improve a bit, the VRM temps are still high. Like, 81C after 5 minutes of Valley high. Still running at 1170/1675MHz at +80/+50mV. Really, I'm getting worried here. I have no idea what I might be doing wrong.


That's about right for your voltage.... That aux voltage raises vrm temps a lot btw..


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I run gta v at 4k with no issues at all.... What I'm saying is, it updated this morning and now I'm getting clock fluctuations and stutters....


Ahhh I understand now. I have been playing the last few days, but I haven't played since I took out Xfire. Of course the game runs 10x better, but I am noticing the occasional stutters. But my clocks fluctuate because of Vsync I think. Come to think of it, I don't get any stutter in Rise with Vsync though... Maybe because of the horsepower required to push Rise? I dunno. I will do more testing later and try to post some results when I am finally finished with this stupid review plugin I am working on.

*edit* Blue skin? Try verifying the game cache on Steam. Would be terrible to have to reinstall the game; it's the biggest game on my SSD...


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> I'd hit 100-110C vrms at 1305 mV...


Well, first off, I'm not running that high a voltage. +80mV on my card means something like 1.25V under load. Secondly, I'm comparing to my previous VRM temps. They were almost always a bit below core temps. Now they will happily and quickly jump way over core temps and keep climbing. I ran a benchmark in Valley at 1175/1700MHz at +90mV/+20mV and it hit 88C by the time the benchmark ended. I'm scared I messed up something pretty bad.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's about right for your voltage.... That aux voltage raises vrm temps a lot btw..


Agent, you have an MSI too, no? Does yours go this high? Mine didn't use to do so.

Also, the above run was with not so much added AUX voltage. Something's wrong here.


----------



## jdorje

My stock vid is 1225 mV. Most are higher, 1250 is average I think. There is substantial droop before the final voltage.

You're comparing your vrm temps now to your core temps before? That makes no sense. If you improved core cooling I'd expect vrm temps to rise because the fan isn't as high. But what were your vrm temps before?

I've found no advantage to raising aux voltage btw. I've heard rumors of being able to raise memory voltage but if so it's a deep secret.


----------



## m70b1jr

lol
http://outervision.com/b/UP3h89


----------



## Agent Smith1984

Honestly, I'm really disappointed with my current MSI.... My original 390 would hit 77C/75C at 50mV/50mV, 1150/1650 daily clocks, but this 390X hits around 81C/80C at the same settings (except 1750 memory)

Same case, fans, and ambient temp too. I'm probably going to pad it and put some new TIM on this weekend.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> My stock vid is 1225 mV. Most are higher, 1250 is average I think. There is substantial droop before the final voltage.
> 
> You're comparing your vrm temps now to your core temps before? That makes no sense. If you improved core cooling I'd expect vrm temps to rise because the fan isn't as high. But what were your vrm temps before?
> 
> I've found no advantage to raising aux voltage btw. I've heard rumors of being able to raise memory voltage but if so it's a deep secret.


You misunderstood. I just said that my VRM temps were always just below my core temps. And for the sake of proper testing, I do all my testing with my case fans at 1200RPM and the GPU fans at 80% fixed speed. Really, my testing methodology is sound as far as I'm concerned. At the same (or, similar, really) settings, I used to get around 75C for the core and 70C for the VRM.

Just now I tried 1175/1625MHz at +90mV and no AUX voltage. Still the same results.

However, what's concerning me is that not even my core temps improved. I'd expect the move from copious amounts of crap TIM to a proper application of GC Extreme to help, but temps are about the same, if not identical. Which makes me think: maybe my screwdriver is the wrong size and I'm not tightening enough? I don't have one with smaller heads, though, and I doubt that's it anyway. Sigh...
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, I'm really disappointed with my current msi.... My original 390 would hit 77c/75c at 50mv/50mv 1150/1650 daily clocks, but this 390x hits around 81c/80c at the same settings (except 1750 memory)
> 
> Same case, fans, and ambient temp too. I'm probably going to pad it and put some new TIM on this weekend.


I'm disappointed with my stupidity. It used to work fine before, but I had to be stupid and mess with it.









Do inform me how that goes man.


----------



## jdorje

Okay that makes more sense.

When I replaced both the core TIM and the VRM padding on my XFX, I got a 5C drop in core temp. This led to a lower fan speed on the cooler, since I didn't use a fixed speed for the comparison (lesson learned there: always use a fixed fan speed for before/after comparisons). But despite that, I also got a 10C drop in VRM temps.

If you pinch the cooler together do temps drop? That might be an indication it's not on tightly enough.

There's no way you messed anything up though. No matter how good the cooling was before you can get it better with a personal touch.


----------



## Agent Smith1984

Quote:


> Originally Posted by *m70b1jr*
> 
> lol
> http://outervision.com/b/UP3h89


Lol, is it saying that cause you listed 50% overvoltage?


----------



## Noirgheos

OK, I was at school today testing their 390X Nitro. Got it to 1200MHz. The beast scored 12.6K in Firestrike, basically matching my Fury. It got to 80C though...


----------



## m70b1jr

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Lol, is it saying that cause you listed 50% overvoltage?


Maybe. It said I needed an arc reactor, but they're all out of stock ATM.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Okay that makes more sense.
> 
> When I replaced both core tim and vrm paddings on my xfx, I got a 5c drop in core temp. This lead to lower fan speed on the cooler since I didn't have a fixed speed for the comparison (lesson learned there - always use fixed fan speed for before after comparisons). But despite that I also got a 10c drop on vrm temps.
> 
> If you pinch the cooler together do temps drop? That might be an indication it's not on tightly enough.
> 
> There's no way you messed anything up though. No matter how good the cooling was before you can get it better with a personal touch.


I tried to pinch the cooler, but no luck. I don't think I did much though, as all I could do was bend the plastic shroud. At some point the temperature even went up? I dunno, but it didn't work for sure.

For now, I do hope that maybe, just maybe, these Phobya pads (being quite a bit firmer than the stock ones) just need some time and heat to mold themselves onto the VRMs and allow better cooling, as the ones I removed didn't even have the indentations from the VRMs that were clearly visible on the stock pads. So I think I'm going to leave it alone for now, game at stock voltage and 1125MHz for a bit, and look at it again with a clearer mind. Maybe I'll wait for Agent's results if he decides to go ahead and do the same to his card.

A crazy thought that did cross my mind is that MSI could be using awesome pads on their card all along and I actually just downgraded. That'd be like... Daaaaaamn...


----------



## ogow89

So I RMA'd my R9 290 PCS+ due to fan issues, and they just gave me store credit because there are no more R9 290 PCS+ cards to replace it with. I tried to convince them to give me an R9 390 as a replacement, but they wouldn't budge. So now my only option is to get a new card. Any suggestions? I really want to upgrade and not sidegrade, so no R9 390; R9 390X and up. I also overclock and would like to know which brands are best for reaching high overclocks.


----------



## patriotaki

Quote:


> Originally Posted by *ogow89*
> 
> So i rma'd my r9 290 pcs+ due to fan issues and they just gave me store credits because there are no more r9 290 pcs+ to replace it with. I tried to convince them to give me an r9 390 as a replacement but they wouldn't budge. And now my only option is to get a new card. Any suggestions? I really want to upgrade and not side grade so no r9 390. So r9 390x and up. I also overclock and would like to know which brands are best for reaching high overclocks.


sapphire or msi?


----------



## ogow89

Quote:


> Originally Posted by *patriotaki*
> 
> sapphire or msi?


Well, that is why I am asking. I wouldn't mind either brand. My old card was PowerColor.


----------



## jdorje

MSI overclocks the best, with that extra VRM phase. The rest are all about equal, though the Nitro and then the PCS+ are considered the best coolers. Avoid Gigabyte and Asus.

What store though?


----------



## ogow89

Mindfactory germany.


----------



## jdorje

Quote:


> Originally Posted by *ogow89*
> 
> Mindfactory germany.


Edit: mobile misclick.

390x devil seems to be super cheap there. Easy choice. Though it's 3 slots wide.

Otherwise the xfx at 370, the nitro at 400, or the msi at 420 are all tempting.

390 or 390x is definitely the way to go. The 390 is a slight upgrade from the average 290, but the 390x seems to do amazingly in dx12.


----------



## ogow89

Quote:


> Originally Posted by *jdorje*
> 
> Well


Well ...?

Is it a bad handler?


----------



## ogow89

Nowhere to mount the cooling. I already have a CPU water cooling kit from Cooler Master mounted on the back. The front would be too much of a hassle, and I am not sure it would fit.


----------



## m70b1jr

I guess my power supply blew. I learned how to tie a noose today. I could put that talent to work.

Kidding btw.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Vell. Try this. This is just a swap so Im not sure if it will work.
> 
> OV3MTEST.zip 99k .zip file
> 
> 
> Edit:
> 
> @m70b1jr
> 
> Try this. I kinda sway away from Offset myself. So No limit will do for now.
> 
> HawaiiOV.zip 99k .zip file


Just got around to trying this tonight. It doesn't work. Won't even boot up. = (


----------



## m70b1jr

Quote:


> Originally Posted by *Vellinious*
> 
> Just got around to trying this tonight. It doesn't work. Won't even boot up. = (


You did something wrong and bricked your card. Does it have a 2nd BIOS?


----------



## Worldwin

The msi uses a 6+1 phase VRM for core and 1(+1?) phase for memory.
http://www.realhardwarereviews.com/msi-r9-390x-gaming-8g/4/


----------



## Vellinious

Quote:


> Originally Posted by *m70b1jr*
> 
> You did something wrong and bricked your card. Does it have a 2nd bios


No, it's not bricked. I just reflashed to an older version. lol


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, regarding my issues with the TIM and thermal pad replacement.
> 
> Today I got a 1.5mm Phobya pad and a tube of Gelid GC Extreme. I installed them and, while core temps did improve a bit, the VRM temps are still high. Like, 81C after 5 minutes of Valley high. Still running at 1170/1675MHz at +80/+50mV. Really, I'm getting worried here. I have no idea what I might be doing wrong.


Well, it seems MSI is already using a very good quality pad. It's a long shot, but try putting a thermal pad back on the chokes. You never know, life is full of surprises.
Quote:


> Originally Posted by *jdorje*
> 
> I've found no advantage to raising aux voltage btw. I've heard rumors of being able to raise memory voltage but if so it's a deep secret.


It depends on the card. Some cards need a slight increase in aux voltage to get stable, especially when overclocking the memory. It doesn't increase the memory voltage; it's the voltage for the memory controller.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, I'm really disappointed with my current msi.... My original 390 would hit 77c/75c at 50mv/50mv 1150/1650 daily clocks, but this 390x hits around 81c/80c at the same settings (except 1750 memory)
> 
> Same case, fans, and ambient temp too. I'm probably going to pad it and pot some new tim on this weekend.


Makes sense for the 390X to run slightly hotter than the 390 at the same settings.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> Well, it seems msi already using a very good quality pad. It is a long shot, try put thermal pad back on the chokes. you'll never know, life is full of surprises. .


I may try to reuse the 1mm pads I initially bought. I've also kept most of the padding from the VRMs, so maybe I could try putting that back? I doubt it's better than 7W/mK though. Still, that used to work, so... Yeah, maybe I'll do that. Thanks!

PS : Do I actually need to reapply the TIM every time I take the card apart? Not only is it a tedious process, but this Gelid stuff ain't cheap!

Also, I cannot get it off the capacitors next to the core. The stock TIM was smeared all over those and there is still enough of it there to cover another core. Maybe I need some ArctiClean or something?


----------



## m70b1jr

After doing the paper clip test I've come to the conclusion my power supply is D.E.A.D.

Looking to buy a cheap 650W+ modular PSU. PM me if you have any.


----------



## jdorje

Opened up GTA V, switched to Franklin, took out my fastest car, drove around the island. There were definitely no slowdowns. I used fraps to record the frame times and came up with the below, basically in line with what I have recorded before.










Afterwards I opened up graphics settings to take note of what the settings were, but was distracted when I saw that the game had somehow reset itself to dx10. So, yeah, dunno what's up with that.
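For anyone repeating this with Fraps: its frametimes CSV logs a cumulative timestamp per frame, so difference consecutive rows to get per-frame durations first. Here's a rough sketch of the usual summary stats over those durations (average FPS and the "1% low") — the function and the example numbers are mine, not anything from Fraps or the run above:

```python
def frame_stats(frametimes_ms):
    """Average FPS and 1% low FPS from a list of per-frame durations in ms."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # 1% low: FPS computed over the slowest 1% of frames (at least one frame)
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# A steady 16.7 ms (60 FPS) run with a couple of 33 ms stutters shows a
# healthy average but a much lower 1% figure -- that's the "slowdowns" number.
durations = [16.7] * 98 + [33.3, 33.3]
avg, low = frame_stats(durations)
print(f"avg {avg:.1f} FPS, 1% low {low:.1f} FPS")  # -> avg 58.7 FPS, 1% low 30.0 FPS
```

A big gap between the two numbers is what you feel as stutter even when the average looks fine.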


----------



## christoph

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, regarding my issues with the TIM and thermal pad replacement.
> 
> Today I got a 1.5mm Phobya pad and a tube of Gelid GC Extreme. I installed them and, while core temps did improve a bit, the VRM temps are still high. Like, 81C after 5 minutes of Valley high. Still running at 1170/1675MHz at +80/+50mV. Really, I'm getting worried here. I have no idea what I might be doing wrong.


at what fan speed?


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Well, it seems msi already using a very good quality pad. It is a long shot, try put thermal pad back on the chokes. you'll never know, life is full of surprises. .
> 
> 
> 
> I may try to reuse the 1mm pads I initially bought. I've also kept most of the padding from the VRMs, so maybe I could try putting that back? I doubt it's better than 7W/mK though. Still, that used to work, so... Yeah, maybe I'll do that. Thanks!
> 
> PS : Do I actually need to reapply the TIM every time I take the card apart? Not only is it a tedious process, but this Gelid stuff ain't cheap!
> 
> Also, I cannot get it off the capacitors next to the core. The stock TIM was smeared all over those and there is still enough of it there to cover another core. Maybe I need some ArctiClean or something?

Best practice is to re-apply thermal paste every time you remove the cooler.

Just clean it the best you can; it doesn't need to be sparkling clean. Sounds like you applied too much. Put a small X or + in the middle of the GPU; it'll help the TIM spread properly.


----------



## Vellinious

Q-tips and rubbing alcohol work really well for cleaning off old paste.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> Best practice is to re-apply thermal paste everytime you remove the cooler.
> 
> Just clean the best you can. It doesn't need to be sparkly clean. Sound like you apply too much. Put small X or + in the middle of the gpu. It'll help TIM spread properly.


The problem isn't the TIM that I put on, it is the stock TIM residue that's on the surrounding capacitors.

My application is OK, from what I can tell. I spread it on the die using the little plastic spatula that came with the TIM. Core temps are fine. I think they're getting better, too. Maybe it's curing? I dunno.
Quote:


> Originally Posted by *christoph*
> 
> at what fan speed?


80% for testing always.
Quote:


> Originally Posted by *Vellinious*
> 
> Q-tips and rubbing alcohol work really good for cleaning off old paste.


I did use those, but I can't get all of that goo. It's there for good it seems. Oh well...


----------



## patriotaki

I also got my PCS+ R9 390 from Mindfactory in Germany... there's too much process for RMA though.


----------



## FlickyBeans

Hey guys, so I'm running Crossfire R9 390x Double Dissipation (Xfx, Radeon)
I should be getting 60 fps in 1920x1080 in far cry primal, correct? I get 30 to 50. And when I turn off crossfire it's about the same, if not BETTER









My CPU is i7-4790k,
1050W psu
16g ram

I noticed people were getting 80fps on furmark while I get about 38 average.. do I have bad cards or something?
ALL my drivers are up to date, even my motherboard bios. Thanks, and I'm glad to be joining in on the AMD club!


----------



## Agent Smith1984

Quote:


> Originally Posted by *FlickyBeans*
> 
> Hey guys, so I'm running Crossfire R9 390x Double Dissipation (Xfx, Radeon)
> I should be getting 60 fps in 1920x1080 in far cry primal, correct? I get 30 to 50. And when I turn off crossfire it's about the same, if not BETTER
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My CPU is i7-4790k,
> 1050W psu
> 16g ram
> 
> I noticed people were getting 80fps on furmark while I get about 38 average.. do I have bad cards or something?
> ALL my drivers are up to date, even my motherboard bios. Thanks, and I'm glad to be joining in on the AMD club!


You need to run gpu-z in the background and see if the primary card is throttling from heat. Cf is a ***** when it comes to temps!
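If you want more than eyeballing the log, that throttle check can be scripted. A rough sketch, not from this thread — the column names here are assumptions, and GPU-Z's CSV header varies between versions, so match them to your own log before trusting the output:

```python
import csv

def find_throttle_events(log_path, target_mhz,
                         clock_col="GPU Core Clock [MHz]",
                         temp_col="GPU Temperature [°C]"):
    """Scan a GPU-Z sensor log (CSV) and return (timestamp, clock, temp) rows
    where the core clock fell below 95% of the target clock -- the usual sign
    of thermal or power throttling. Column names vary between GPU-Z versions,
    so check them against your own log's header line."""
    events = []
    with open(log_path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f):
            try:
                clock = float(row[clock_col].strip())
                temp = float(row[temp_col].strip())
            except (KeyError, ValueError, AttributeError):
                continue  # skip malformed or partial rows
            if clock < 0.95 * target_mhz:
                events.append((row.get("Date", "?").strip(), clock, temp))
    return events
```

If the events cluster at the highest temperatures in the log, it's the primary card heat-throttling; if the clock drops while temps are modest, look at power limits instead.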


----------



## FlickyBeans

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You need to run gpu-z in the background and see if the primary card is throttling from heat. Cf is a ***** when it comes to temps!


I'll attach the saved log. GPU load seems to be all over the place. Maybe that's how it should be...
For most of this log I am running around the wilderness, where I sit in the 40-55 fps range. Back at the village, where my GPU starts getting hot, is when it drops into the 30s, even the low 30s.

overclocknet.txt 207k .txt file


EDIT: Maybe it's a CPU issue; I need a benchmark to test for errors. A CPU burner sounds... dangerous and hot. I don't like the sound of that. What do you guys think?
Also, I turned off Intel graphics in case it was trying to use that instead of both cards at full... let's see.
Edit edit: No change when turning off the integrated graphics.


----------



## tolis626

So, after everything I've done wrong to my GPU these days, I finally decided to relax and play some Witcher 3. At first I was playing with my stock-voltage overclock of 1125/1625MHz and temps were decent. But after a while I wanted to see how far I can push this thing now before it starts to melt down. Then comes the strange part. I set the memory clock to 1675MHz with +50mV AUX and left it there for everything else (higher may or may not crash). I then started upping the core clock. I could get 1170MHz stable at +50mV, whereas previously I'd need at least +80mV. I thought at first it was a mistake, or that it wasn't really running at that speed; it should be artifacting. Then I pushed 5MHz more and I had some artifacting here and there, but very rarely (like 2 or 3 times in a 30-minute session). For 1175MHz I used to need +100mV, and even that was only half-stable; I used to get artifacting in Witcher 3 even with +100mV at 1175MHz, and even so, that was the highest I'd been able to game at until now. Going from +50mV to +60mV fixed those artifacts too. Right now I'm running 1185MHz and have been playing for like half an hour, plus more than an hour before I hit the gym earlier. I do need +75mV for 1185MHz, but it's the first time I've been able to game at such a clock.

Which makes me wonder: what the hell is going on? Is there something I'm doing wrong now and it's not really stable? Was there something wrong with the card before that I fixed? Maybe the sloppy TIM application was to blame? Thing is, through all this, and using a very aggressive fan curve, the max I've gotten on the core is 73C, with it hanging under 70C most of the time. The VRM is another story. It'll quickly climb to 75-80C and usually stays there, although now it did reach 84C during gameplay. The house is also really warm right now, though.
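Side note for anyone building an "aggressive fan curve" by hand: the custom curve in tools like MSI Afterburner is just piecewise-linear interpolation between (temperature, duty%) points. A minimal sketch, with made-up breakpoints (pick your own around where your card's load temps actually sit):

```python
def fan_speed(temp_c, curve=((40, 30), (60, 50), (75, 80), (85, 100))):
    """Piecewise-linear fan curve: interpolate duty cycle (%) between
    (temperature C, speed %) points, clamping below the first point and
    above the last one."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_speed(70))  # -> 70.0
```

The steeper the segment through your typical load-temperature range, the more "aggressive" the curve: fan speed ramps hard exactly where it matters.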


----------



## jdorje

You can't compare stability between games. But witcher seems one of the most stressful games.

You remounted cooler and repadded the vrms right? That can certainly improve stability.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> You can't compare stability between games. But witcher seems one of the most stressful games.
> 
> You remounted cooler and repadded the vrms right? That can certainly improve stability.


I didn't compare across games. Before I started playing Witcher 3 more (Had to finish 2 first), I only played it as a stress test. Most of my near-stable overclocks were stable otherwise, but failed when playing Witcher 3. That's what I'm saying.

I moved to 1190MHz and +80mV. I MAY have seen an artifact, but I'm not sure as it was during a dialog and I wasn't focused. It's playing quite fine though for the time being. Damn...

Also, I marginally improved core temps and worsened (like 10C more) VRM temps. If doing those helped stability, then there was something wrong before.


----------



## m70b1jr

Went to Best Buy and picked up a new power supply. I got a CMX 750 for $89 with tax. PC works now. So happy.


----------



## jdorje

My guess would be you actually improved vrm temps, but wherever the sensor happens to be ended up hotter.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> My guess would be you actually improved vrm temps, but wherever the sensor happens to be ended up hotter.


To be honest, I thought of that too. It would make sense. At least, more sense than most of the alternatives. The strange thing is that, would it be the same in two different applications? Hmm...


----------



## Agent Smith1984

The MSI cards that I have had both ran VRM temps 2-5C cooler than the core, especially below +100mV. I can't even tinker with +80mV right now without hitting 86C on the core. 1080p only hits 80C though. You really want to stress a card? Run at 4K. It's a whole different ballgame. I refuse to go below 4K now... I actually enjoy the resolution and visuals of 4K with no AA, even at mid-40s framerates in newer games. And in the case of shooters, my two faves, BF4 and Crysis 3, both run at 60. So I'm good!


----------



## jdorje

1440 is where a single 390/x really shines imo.


----------



## FlickyBeans

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You need to run gpu-z in the background and see if the primary card is throttling from heat. Cf is a ***** when it comes to temps!


Can you help with the info I provided? I just want this tower to shine


----------



## m70b1jr

How's the FPS with a single R9 390 (OC'd) and an FX-8350 at 4K, in a game like The Division?


----------



## tolis626

So, higher VRM temps and all, I decided to really test whether overclocking really has improved. So, Firestrike gave me an answer and it's a yes! Highest I'd gotten before was 14800-ish with 1185/1750MHz at +100/+50mV and the card was barely holding on for dear stability.

Now? This beauty happened. Admittedly at +125mV for the core, but who cares? I'm so near 15000 I can almost taste it. Now I gotta figure out how to get 40 more points. Yee-huh!








Quote:


> Originally Posted by *Agent Smith1984*
> 
> The MSI cards that I have had both ran VRM temps 2-5C cooler than the core, especially below +100mV. I can't even tinker with +80mV right now without hitting 86C on the core. 1080p only hits 80C though. You really want to stress a card? Run at 4K. It's a whole different ballgame. I refuse to go below 4K now... I actually enjoy the resolution and visuals of 4K with no AA, even at mid-40s framerates in newer games. And in the case of shooters, my two faves, BF4 and Crysis 3, both run at 60. So I'm good!


That has been my experience too, on both fronts. For anything other than FPS games, I think 4K is approaching faster than we think. 2x390X can push pretty decent framerates, and even more so with DX12 titles. Only time will tell though...


----------



## Vellinious

Good run


----------



## Malinkadink

Between the Nitro and MSI R9 390 which is the better buy? Nitro is $315, MSI is $340


----------



## Ha-Nocri

Nitro. Runs cooler and quieter


----------



## Zhilin

Hey people.
I sincerely apologize if my question has already been answered on any of the previous 734 pages.
I'm planning on using dual R9 Nanos in my build, but AFAIK Fiji GPUs don't have HDMI 2.0 or HDCP 2.2 outputs, which are crucial for me as I'm making a consumer product specifically designed for living room 4K@60 gaming/Netflix and chilling. I've investigated that unfortunate issue and found some kind of solution: make the Skylake GPU (IGFX) primary in the BIOS (the GA Gaming 7 allows that) and use it as the output.
Can anybody please confirm or deny it? Will it cause any trouble for a regular consumer? Thank you.


----------



## tolis626

Quote:


> Originally Posted by *Vellinious*
> 
> Good run


Thanks man! Now share with me your secrets master! I want dat 15.000!








Quote:


> Originally Posted by *Zhilin*
> 
> Hey people.
> I sincerely apologize if my queston has already been answered on any of the previous 734 pages.
> I'm planning on using dual R9 Nanos in my build but afaik Fiji GPUs don't have HDMI 2.0 or HDCP 2.2 outputs which are crucial for me as I'm making a consumer product specifically designed for living room [email protected] gaming/Netflix and chilling. I've investigated that unfortunate issue and found some kind of solution: to make Skylake GPU (IGFX) primary in BIOS (GA Gaming 7 allows that) and use it as an output.
> Can anybody please confirm or deny it? Will it cause any troubles for a regular consumer? Thank you.


Making the iGPU the primary will actually use the iGPU. I don't know if I'm understanding you correctly, but you want to use the Nanos for processing and the iGPU as an output? Like a passthrough? If so, not gonna happen (unless I've got something terribly wrong myself, that is). What you can do is get a DP1.2-to-HDMI2.0 adapter. These came out recently, though, so they may not be available everywhere, or cheaply.

Now, if what you want to do is use the Nanos with a normal monitor and use the iGPU for watching stuff on that 4K TV, then yeah, that's totally doable. But you don't make the iGPU your primary; you enable iGPU multi-monitor support, so your dGPUs keep working normally and you can also plug a monitor/TV into the iGPU and have them work at the same time.


----------



## Zhilin

Quote:


> Originally Posted by *tolis626*
> 
> What you can do is get a DP1.2->HDMI2.0 adapter. .


The problem is, that adapter will fix the framerate issue but won't fix the lack of HDCP 2.2 support, which is required for Netflix 4K streaming. My build is designed to be used primarily with a TV, so in most cases there won't be any other monitor. So, if I set the Nanos as the primary GPU, leave the iGPU enabled in the BIOS, and plug the TV into the iGPU's HDMI, will it automatically switch rendering to the iGPU? Sorry if I'm asking ridiculous questions.
Thanks.


----------



## Paulvin

Hello, this is my first post. I've been planning on buying a 390 for months now.

What brand can you recommend? I don't plan on overclocking because the climate here is very hot.


----------



## m70b1jr

Quote:


> Originally Posted by *Paulvin*
> 
> Hello this is my first post because im planning on buying a 390 for months now
> 
> what brand can you recommend? i dont plan on overclocking because the climate here is very hot


Sapphire Nitro or the XFX one.


----------



## Stige

Quote:


> Originally Posted by *m70b1jr*
> 
> Sapphire Nitro or the XFX one.


And stay far away from ASUS cards for sure.


----------



## jodybdesigns

Quote:


> Originally Posted by *Paulvin*
> 
> Hello this is my first post because im planning on buying a 390 for months now
> 
> what brand can you recommend? i dont plan on overclocking because the climate here is very hot


Rocking a PowerColor PCS+. Since I started using my air conditioning recently, I have yet to see my VRMs break 70C in the last 4 days. Matter of fact, my temps are something to be kinda jealous of: 1100/1625 at +38mV, 32C idle at 20% fan speed, 64C across the board for all load temps in Rise, Hitman, GTA5, and a few others at 60% fan speed. I can't make this thing break into the high 60s. I love it.


----------



## HyeVltg3

Not sure if right place.
http://www.3dmark.com/fs/7921091

Can anyone explain why my core and memory clocks are so different from everyone else's? (Just realized I was looking at OC'd results, but still, my core is way lower than even stock.)

I'm using latest 16.3 Crimson drivers. http://support.amd.com/en-us/kb-articles/Pages/AMD_Radeon_Software_Crimson_Edition_16.3.1.aspx

Just upgraded from Z77 to Z97 (3570K to 4790K) and ran 3Dmark last night.


----------



## m70b1jr

Quote:


> Originally Posted by *HyeVltg3*
> 
> Not sure if right place.
> http://www.3dmark.com/fs/7921091
> 
> Can anyone explain why my Core and Memory(just realized I was looking at OC'd marks, still my core is hella lower than stock even) Clocks are so different from everyone else.
> 
> I'm using latest 16.3 Crimson drivers. http://support.amd.com/en-us/kb-articles/Pages/AMD_Radeon_Software_Crimson_Edition_16.3.1.aspx
> 
> Just upgraded from Z77 to Z97 (3570K to 4790K) and ran 3Dmark last night.


Check to make sure power saving is disabled in Crimson, and make sure you're not thermal throttling.


----------



## HyeVltg3

Quote:


> Originally Posted by *m70b1jr*
> 
> Check to make sure powersavings is disabled in crimson, and make sure you're not thermal throttling


Any idea where power management is in Crimson? I really hate the new interface; I miss Catalyst.
I'll also check whether Windows 10's default power saving kicked in. I had just finally gotten everything up and running before trying 3DMark; I just wanted to see how big an improvement going from the 3570K to the 4790K was =D
Really hoping it's not thermal throttling.


----------



## rdr09

Quote:


> Originally Posted by *HyeVltg3*
> 
> Not sure if right place.
> http://www.3dmark.com/fs/7921091
> 
> Can anyone explain why my Core and Memory(just realized I was looking at OC'd marks, still my core is hella lower than stock even) Clocks are so different from everyone else.
> 
> I'm using latest 16.3 Crimson drivers. http://support.amd.com/en-us/kb-articles/Pages/AMD_Radeon_Software_Crimson_Edition_16.3.1.aspx
> 
> Just upgraded from Z77 to Z97 (3570K to 4790K) and ran 3Dmark last night.


Could be the integrated GPU messing up the reading. Disable it in the BIOS.

See the secondary card?


----------



## battleaxe

So I have the rig all running now... well sorta. Had some serious issues the last few days getting the full loop running. I have no idea what has happened or why but here's what I have gone through. Major pain in the neck.



*Got random freezes:* any time I ran stress tests or tried to game...

- assumed it was motherboard because I seemed to be having trouble with the second GPU getting disconnected. Hoped it wasn't the GPU.... (changed motherboard)
- had to reinstall Win7 then Win10 in order to get new motherboard to work with Win10
- experienced same freezing as with old board. (bashed head into wall 10x)
- replaced TIM on GPU2, reseated block, inspected entire board, etc...
- freezes again, this time will run sometimes, crash sometimes
- started suspecting something else besides GPU
- went through systematic process of writing down stress test results and trying different things, memtest, removing drives, removing RAM, etc.
- found bad sector on one HD (bashed head into wall again 10x)
- went through all drives searching for and fixing bad sectors with disk check
- found nothing
- PC seems to work now. Maybe... fingers crossed. Three stress tests and no crashes or freezes. What the heck man?


----------



## HyeVltg3

Quote:


> Originally Posted by *rdr09*
> 
> Could be the integrated gpu messing up the reading. disable it in bios.
> 
> see the secondary card?


Wow, didn't even know it would get listed; so used to the BIOS auto-disabling the iGPU when a discrete card is found.
Will try disabling it when I get home and re-run the test.
Didn't think anything would be listed in that drop-down, seeing as I only had 1 of my 390s installed.


----------



## legendary2020

back again with my new R9 390X Powercolor devil

http://www.techpowerup.com/gpuz/details.php?id=57c6m

http://www.3dmark.com/3dm/11277397


----------



## dagget3450

Does anyone know if these 2 cards are straight up 290/290x ref style pcb?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150767

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150759

They appear to have reference coolers in the pictures, but who knows if it's the correct photo?

I actually like the newer looking one, that resembles fury x kind of.


----------



## Master0fBlunt

Not pleased with my card's performance (see the 390X overclocking thread), but I may go CrossFire with another 390X. My variant is the Strix.


----------



## jdorje

Quote:


> Originally Posted by *Malinkadink*
> 
> Between the Nitro and MSI R9 390 which is the better buy? Nitro is $315, MSI is $340


Xfx 390x is $340.

Edit: sale seems to have ended, it was at newegg.


----------



## m70b1jr

Quote:


> Originally Posted by *jdorje*
> 
> Xfx 390x is $340.


You can get the XFX on Newegg for $309.


----------



## Vellinious

The price on the 8GB 290X keeps going up. Now $360 on Amazon. Glad I bought mine when I did. lol


----------



## HyeVltg3

There we go: 3dmark.com/3dm/11278519
Looks like it was just a "Details" error; it didn't really affect the score. Oh well. Just annoyed that my Physics and Graphics scores are ~12k but Combined is only 4k.
Going to test later today with both 390s installed, hoping to go from 10k to at least 14k. I can deal with that.


----------



## battleaxe

Quote:


> Originally Posted by *battleaxe*
> 
> So I have the rig all running now... well sorta. Had some serious issues the last few days getting the full loop running. I have no idea what has happened or why but here's what I have gone through. Major pain in the neck.
> 
> 
> 
> *Got random freezes:* anytime I ran stress tests or try to game...
> 
> - assumed it was motherboard because I seemed to be having trouble with the second GPU getting disconnected. Hoped it wasn't the GPU.... (changed motherboard)
> - had to reinstall Win7 then Win10 in order to get new motherboard to work with Win10
> - experienced same freezing as with old board. (bashed head into wall 10x)
> - replaced TIM on GPU2, reseated block, inspected entire board, etc...
> - freezes again, this time will run sometimes, crash sometimes
> - started suspecting something else besides GPU
> - went through systematic process of writing down stress test results and trying different things, memtest, removing drives, removing RAM, etc.
> - found bad sector on one HD (bashed head into wall again 10x)
> - went through all drives searching for a fixing bad sectors with disc check
> - found nothing
> - PC seems to work now. Maybe... fingers crossed. Three stress tests and no crashes or freezes. What the heck man?


Well, crap. Appears it's the GPU that's acting up. Now I have to tear the loop down and see if it does that on its own.

Does anyone know if I can test the second GPU alone without tearing anything apart?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Well, crap. Appears its the GPU that's acting up. Now I have to tear the loop down and see if it does that on its own.
> 
> Does anyone know if I can test the second GPU alone without tearing anything apart?


Yes: disable CrossFire if you can, shut down the system, and unplug power from the GPU not being tested. Hook up the GPU being tested to the monitor and turn the system on.


----------



## Vellinious

Only if it'll run from the 2nd slot as the primary card, that is. My 970s wouldn't on an X99 board. Worth a shot though.


----------



## HyeVltg3

Quote:


> Originally Posted by *HyeVltg3*
> 
> There we go: 3dmark.com/3dm/11278519
> looks like it was just a "Details" error, didnt really affect the score. oh well. just annoyed my Physcis and Graphics score is in ~12k but Combined its only 4k.
> Going to test later today with both 390s installed. hoping for at least 10k >>> 14k, I can deal with that.


Two cards installed. http://www.3dmark.com/3dm/11281123
Aimed for 14k, surpassed expectations.

On to the next problem:
Slot 1 - Club3D R9 390
Slot 2 - Gigabyte G1 R9 390

Why do the 3DMark results say "Generic VGA" for the secondary GPU? Is this just how 3DMark does it, or is something wrong with my system?
I have not touched SLI/CF since 2012; I forget if there's anything you need to turn on/off, install, or update when you move from 1 GPU to 2.


----------



## Noirgheos

Quote:


> Originally Posted by *Vellinious*
> 
> If it'll run from the 2nd slot as the primary card. My 970s wouldn't on an X99 board. Worth a shot though.


You said on LTT that your card can hit 1200 with +50 right? Will try next chance I have...


----------



## dagget3450

Quote:


> Originally Posted by *HyeVltg3*
> 
> Two cards installed. http://www.3dmark.com/3dm/11281123
> Aimed for 14k, surpassed expectations.
> 
> On to the next problem:
> Slot 1 - Club3D R9 390
> Slot 2 - Gigabyte G1 R9 390
> 
> Why in 3DMark results does it say "Generic VGA" in Secondary GPU ? is this just how 3DMark does it, or something wrong with my system.
> I have not touched SLI/CF since 2012, I forget if theres anything you need to turn on/off, install or update when you move from 1 GPU to 2?


It's done it to me a few times, especially when overclocking with multiple GPUs.


----------



## Vellinious

Quote:


> Originally Posted by *Noirgheos*
> 
> You said on LTT that your card can hit 1200 with +50 right? Will try next chance I have...


It'll bench there, yeah. Probably not game stable. Haven't messed with it much.


----------



## Foresight

So, has the problem of Gigabyte R9 390s being voltage-locked been fixed?


----------



## patriotaki

I'm not so happy with my R9 390.
My friend with his GTX 970 G1 gets about 10 FPS more than me in most games at the same settings. He overclocks his 970 while I run at stock, but I can't gain more than 7-8 FPS when I OC, and the card gets pretty loud and hot, so I usually run stock settings.


----------



## Malinkadink

Quote:


> Originally Posted by *jdorje*
> 
> Xfx 390x is $340.
> 
> Edit: sale seems to have ended, it was at newegg.


Where does the XFX fall between the Nitro and the MSI? From what I gather, the MSI is the best overclocker, able to hit 1200 on the core; the Nitro can get to around 1100 fairly comfortably and is the cooler/quieter one. What about the XFX?


----------



## Master0fBlunt

Quote:


> Originally Posted by *Foresight*
> 
> So, has the problem of gigabyte r9 390s being volt locked been fixed?


I own the ASUS Strix Radeon R9 390X and I appear to be able to change my voltage using MSI Afterburner, after allowing the program to do so through its advanced settings. The one you speak of, is it not an "X"-designated card?


----------



## uszpdoz

Quote:


> Originally Posted by *patriotaki*
> 
> I'm not so happy with my r9 390..
> My friend with his gtx 970 g1 gets 10frames more than me on most games with same settings.. He OC his 970 and I'm running at stock but I can't get more than 7-8fps when I OC it..and it gets pretty loud and hot so I usually run at stock settings


From what I've heard, the GTX 970 G1 Gaming variant is the best one in terms of performance; it gets 5~10 FPS more than other GTX 970 variants.


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> So I have the rig all running now... well sorta. Had some serious issues the last few days getting the full loop running. I have no idea what has happened or why but here's what I have gone through. Major pain in the neck.
> 
> 
> 
> *Got random freezes:* anytime I ran stress tests or try to game...
> 
> - assumed it was motherboard because I seemed to be having trouble with the second GPU getting disconnected. Hoped it wasn't the GPU.... (changed motherboard)
> - had to reinstall Win7 then Win10 in order to get new motherboard to work with Win10
> - experienced same freezing as with old board. (bashed head into wall 10x)
> - replaced TIM on GPU2, reseated block, inspected entire board, etc...
> - freezes again, this time will run sometimes, crash sometimes
> - started suspecting something else besides GPU
> - went through systematic process of writing down stress test results and trying different things, memtest, removing drives, removing RAM, etc.
> - found bad sector on one HD (bashed head into wall again 10x)
> - went through all drives searching for a fixing bad sectors with disc check
> - found nothing
> - PC seems to work now. Maybe... fingers crossed. Three stress tests and no crashes or freezes. What the heck man?
> 
> 
> 
> Well, crap. Appears its the GPU that's acting up. Now I have to tear the loop down and see if it does that on its own.
> 
> Does anyone know if I can test the second GPU alone without tearing anything apart?
Click to expand...

Did your secondary card disappear after the GPU-related crash? Try pressing the power button (on the case) to power it off instead of pressing the reset button.

Btw, did you bleed the loop properly? Is the temperature difference between the GPUs small?


----------



## Paulvin

Quote:


> Originally Posted by *m70b1jr*
> 
> Sapphire Nitro or the XFX one.


Quote:


> Originally Posted by *Stige*
> 
> And stay far away from ASUS cards for sure.


Any reason why I should stay away from MSI cards?


----------



## Worldwin

Quote:


> Originally Posted by *Paulvin*
> 
> any reason why should i stay away from MSI cards?


It only has one BIOS. It also takes up 2.5 slots, making it impractical for most ITX cases. Other than that, I'm loving mine.


----------



## jdorje

Quote:


> Originally Posted by *Malinkadink*
> 
> Where does the xfx fall into all this between the nitro and msi? From what i gather the MSI is the best overclocker able to hit 1200 on the core, the Nitro can get somewhere around 1100 fairly comfortably and is the cooler/quieter one, what about the xfx?


The OP gives a rundown, but I'll put it in my own words.

MSI is the best overclocker hands down because it has one extra VRM phase (7 instead of 6) and great VRM cooling. It has a good cooler with a fanless mode. AFAIK there's only the one MSI card.

The Nitro probably has the best cooler. It has a 1-fan mode that's supposedly effective, and good VRM cooling. It has 8+8 pin connectors, which theoretically allows a higher power limit, but it still doesn't overclock any better than the other cards. AFAIK this is the only Sapphire card.

The PCS+ is a close third for cooling. Don't know how the VRM cooling is. It's often a fair bit cheaper.

XFX makes two cards. I have the 8256, which has often been the cheapest 390. The cooling is decent, but the VRM cooling is mediocre. Some claim the XFX cards are binned better (XFX is exclusive to AMD... but then so are Sapphire and PowerColor, so this doesn't make a whole lot of sense). Still, my card is a good one (1090 MHz at 1225 mV), so who knows. XFX's warranty is fairly mod-friendly, but all the companies above have good customer service too.

PowerColor also makes a Devil hybrid cooler for the 390X. Don't know much about it, but it sounds tempting, and the price is decent. It takes up 3 slots, but supposedly you can remove a cover and get it down to 2.

I've heard nothing good about either Gigabyte (voltage-locked) or ASUS (either model) cards. Or any other brand. But Europe has some brands we don't have in the US.


----------



## FlickyBeans

Do these results look alright?
Radeon R9 390X XFX Double Dissipation, two in CrossFire. i7-4790K. Valley benchmark.


I get down to 30 FPS in Far Cry Primal in the "hot" part, my village. Do I actually need to turn down the graphics in that game from Ultra??
Cards get outdated so fast...


----------



## jdorje

Edit: this line was supposed to say "We probably all have iGpus...having 2 bios's doesn't do anything since you can use the igpu to reflash bios." Was off topic though.
Quote:


> Originally Posted by *FlickyBeans*
> 
> Do these results look alright?
> Radeon R9 390x XFX double dissipation, TWO in Crossfire. i7-4790k. Valley benchmark
> 
> I get down to 30fps in farcry primal in the "hot" part, my village. Do I actually need to down the graphics on that game from ultra??
> Cards get out dated so fast...


4600 is a good score but I feel like it should be higher. I get 2900 with a single 390. And 30 min fps? That's got to be a crossfire issue right?

Have you tried the cards in dx12?


----------



## FlickyBeans

Quote:


> Originally Posted by *jdorje*
> 
> We probably all have iGpus...having 2 bios's doesn't do anyt
> 4600 is a good score but I feel like it should be higher. I get 2900 with a single 390. And 30 min fps? That's got to be a crossfire issue right?
> 
> Have you tried the cards in dx12?


Where might I do that?
I have jumpy FPS, and I was getting 32 in FurMark. Also, what are some ways I can troubleshoot to see if it's the card or the CPU, software or hardware, etc.?


----------



## Malinkadink

Quote:


> Originally Posted by *uszpdoz*
> 
> what i heard those gtx 970 G1 gaming variant is the best one in term of performances and this gtx 970 G1 gaming gpu get 5~10 fps more if comparing with other gtx 970 variant.


Yep, I have my 970 G1 clocked at 1500/8000 without even touching the voltage, and it's perfectly stable with amazing performance. The reason I'm considering AMD is that I may get a FreeSync monitor and want to make use of VRR.


----------



## kizwan

Quote:


> Originally Posted by *FlickyBeans*
> 
> Do these results look alright?
> Radeon R9 390x XFX double dissipation, TWO in Crossfire. i7-4790k. Valley benchmark
> 
> 
> I get down to 30fps in farcry primal in the "hot" part, my village. Do I actually need to down the graphics on that game from ultra??
> Cards get out dated so fast...


The scores look alright. About Far Cry: is it exactly 30 FPS minimum, or higher? I've heard CrossFire support is still buggy for this game. Try setting AMD CrossFire Mode to Optimized 1x1 in the Radeon Settings window.


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> I'm not so happy with my r9 390..
> My friend with his gtx 970 g1 gets 10frames more than me on most games with same settings.. He OC his 970 and I'm running at stock but I can't get more than 7-8fps when I OC it..and it gets pretty loud and hot so I usually run at stock settings


Then get a 980 Ti or wait for the next generation of cards. Buying a 970 would be a sidestep. I think you're disappointed in the coil whine and trying to find a reason to justify a bad purchase.


----------



## Vellinious

Quote:


> Originally Posted by *FlickyBeans*
> 
> Do these results look alright?
> Radeon R9 390x XFX double dissipation, TWO in Crossfire. i7-4790k. Valley benchmark
> 
> 
> I get down to 30fps in farcry primal in the "hot" part, my village. Do I actually need to down the graphics on that game from ultra??
> Cards get out dated so fast...


Valley is pretty CPU bound. Bump the processor up. If you want a really good score, disable hyperthreading and all but 2 cores and get the 2 remaining cores running at 5ghz or above. 2 x 970s score 5k pretty easy, so yeah....


----------



## kizwan

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FlickyBeans*
> 
> Do these results look alright?
> Radeon R9 390x XFX double dissipation, TWO in Crossfire. i7-4790k. Valley benchmark
> 
> 
> I get down to 30fps in farcry primal in the "hot" part, my village. Do I actually need to down the graphics on that game from ultra??
> Cards get out dated so fast...
> 
> 
> 
> Valley is pretty CPU bound. Bump the processor up. If you want a really good score, disable hyperthreading and all but 2 cores and get the 2 remaining cores running at 5ghz or above. 2 x 970s score 5k pretty easy, so yeah....
Click to expand...

I'm pretty sure he is asking about stock score, not highest score he can get.


----------



## Vellinious

Guess I assumed that he'd be overclocking....being an overclocking forum and such.....


----------



## coffeeplus

Hello guys!
I am in search of some tips/advice on the following:

Tested my XFX R9 390 with Witcher 3 capped in-game to 60, as that's my monitor's maximum refresh rate as well.
Crimson 16.3 with PE disabled.
Stock clock of 1015 MHz, running on Win 8.1.

Here are some temperature graphs, using different case fan configurations, for the GPU and my CPU.


1. Do my temperatures look fine, reaching 74-75°C in Witcher 3 with an ambient room temp of approx. 24°C? Should I make an effort to lower them a bit, or leave them as they are?

2. As you can see, I get roughly 2°C lower GPU temperatures using two side intakes @1350 RPM (tested with Arctic Cooling F12 120mm fans). My CPU seems to be the main beneficiary of the two side intakes, running nearly 10°C lower!
However, I'd like a similar effect for the GPU, as it gets hotter than my CPU and I'm more concerned about it. Do you have any tips or recommendations on how to configure my case fans?
I have no front intakes at the moment, but I doubt those would benefit the GPU more than the side intakes, as they'd draw through a thick front filter; I doubt they'd bring in as much air as the side panel does, which only has a metallic grille. As for exhaust, I have a single 120mm rear fan spinning at 1000 RPM.


----------



## tolis626

Last night was a particularly cold night (by Greek standards; don't make fun of me for calling 5-10 degrees Celsius cold), so I decided it was benchmarking time. And then it happened. I finally got over 15,000 in FireStrike. Look at it. LOOK AT IT!

All trolling aside, I'm really excited I FINALLY made it this high. The above is the second run I did, since I wanted a complete benchmark and confirmation it wasn't a fluke or a lucky one. The first run was even a bit higher at 15,071 ( http://www.3dmark.com/3dm/11284483?), but it was a graphics-only run. Strange thing is, the first run with the (marginally) higher score was at 1215/1700MHz at +140/+25mV, while the second was at 1220/1725MHz at +150/+25mV. I also think increasing AUX voltage past +25mV leads to worse scores, but I have yet to prove that, so don't quote me on it.

Now, I can finally fry my card happily...

PS: Another strange thing is that before, adding voltage and clocks past 1180MHz at +100mV actually resulted in lower scores, while now it scales somewhat. At 1200MHz with +100mV and +125mV I got about 14,900 plus change. Last time I tried going that high I landed right around 14,500.

Also, I tried running Valley at 1200/1675MHz at +100mV (no AUX increase) and, apart from getting low scores (probably needed that AUX voltage), my temps reached 77°C on the core and 89°C on the VRM. Which makes me wonder: since when does Valley run hotter than Witcher 3? Could anyone who has both run them and see if it's like that for them too?


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> yes, disable crossfire if you can. shut down system and unplug power from gpu not being tested. hook up gpu being tested to monitor and turn on system.


This worked +1. Thank you.

It appears the second card is just fine. Not an issue at all. I would have been surprised if it were dead, because I didn't think I had done anything to it.

The only thing left then is the PSU, so I'll be swapping that out, maybe today, to see if it's dead. Gonna pick up an extra and just have two PSUs. Might get two of the same kind too. Why not.

Thank you.


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> Did your secondary card disappeared after gpu related crash? Try press the power button (on the case) to power it off instead of pressing the reset button.
> 
> Btw, did you bleed the loop properly? Temp wise difference between the gpus not big?


The second card runs about 2°C higher than the first, which it always has regardless of how the loop is set up.

Both are well under 50°C. I used rdr09's suggestion and was able to test the second card; it's fine. Since I've tested everything else, I'm fairly certain it's the PSU now, or maybe a flaky hard drive, but most likely the PSU.


----------



## FlickyBeans

Quote:


> Originally Posted by *Vellinious*
> 
> Guess I assumed that he'd be overclocking....being an overclocking forum and such.....


How does disabling threads help it get a better score?
Also, I'd love to OC, but I need to make sure they're operating correctly at STOCK first. In games like Far Cry Primal the main card hits 80°C and the second hits 70°C (probably due to their positioning near each other, airflow, etc.). So should I only OC the second?


----------



## jdorje

Quote:


> Originally Posted by *FlickyBeans*
> 
> How does disabling threads help it get a better score?
> Also, I would love to OC however I need to make sure they're operating correctly STOCK first. In games like FarCry Primal the main card hits 80 and the second hits 70 (Probably due to their positioning near eachother, airflow etc.) So should I only OC the second?


You can overclock significantly on most cards without raising voltage, or even while lowering voltage. If you've got the time it's certainly worthwhile.


----------



## FlickyBeans

Quote:


> Originally Posted by *jdorje*
> 
> You can overclock significantly on most cards without raising voltage, or even while lowering voltage. If you've got the time it's certainly worthwhile.


Are there any clock recommendations for the XFX R9 390X without raising voltage? 1650/1100 or what? With which program? One thing I'm curious about is what program you guys use to set fan speeds. Using HWiNFO, I noticed my fan was only at about 75% when the card hit 80°C+. I'd like it running at 90% at that point; I'm not worried about power draw. 1050W PSU.
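For what it's worth, the custom fan curves in tools like MSI Afterburner or Sapphire TRIXX are just piecewise-linear maps from temperature to fan duty. A minimal sketch of how such a curve behaves; the curve points here are made-up illustrations, not recommended values:

```python
# Piecewise-linear fan curve, like the custom curves in Afterburner/TRIXX.
# The (temp_C, fan_percent) points below are illustrative, not recommendations.
CURVE = [(40, 30), (60, 50), (75, 75), (80, 90), (90, 100)]

def fan_percent(temp_c):
    """Interpolate fan duty (%) for a given GPU temperature (deg C)."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    # Walk the segments and linearly interpolate inside the matching one.
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(80))  # with these points, the curve reaches 90% at 80C
```

The software polls the sensor every second or so and applies `fan_percent(current_temp)`; adding a steeper segment near 80°C is all a "90% at 80°C" profile amounts to.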


----------



## Vellinious

Quote:


> Originally Posted by *FlickyBeans*
> 
> How does disabling threads help it get a better score?
> Also, I would love to OC however I need to make sure they're operating correctly STOCK first. In games like FarCry Primal the main card hits 80 and the second hits 70 (Probably due to their positioning near eachother, airflow etc.) So should I only OC the second?


Hyperthreading and more cores = more heat on the CPU. Without the additional heat, you can achieve a higher clock on the remaining 2 cores. And since Valley only uses 2 cores and is relatively CPU bound, increasing the core clock on the CPU will increase the score.


----------



## Noirgheos

Quote:


> Originally Posted by *Vellinious*
> 
> Hyperthreading and more cores = more heat on the CPU. Without the additional heat, you can achieve a higher clock on the remaining 2 cores. And since Valley only uses 2 cores and is relatively CPU bound, increasing the core clock on the CPU will increase the score.


Speaking of CPU temps, I have my 4790K at 4.4GHz with a Noctua NH-U12S in an H440. I reach 76°C max during FireStrike physics, and a max of 84°C in Intel XTU and AIDA64. I expected more out of this thing...

Heck, during Witcher 3 I get up to 71°C...


----------



## FlickyBeans

Quote:


> Originally Posted by *Vellinious*
> 
> Hyperthreading and more cores = more heat on the CPU. Without the additional heat, you can achieve a higher clock on the remaining 2 cores. And since Valley only uses 2 cores and is relatively CPU bound, increasing the core clock on the CPU will increase the score.


I only hit 60°C tops on my cores (I have a Corsair H100i GTX liquid cooler), but I see what you're saying; it still applies.
I notice you have a 1000W PSU for only one card. I ran a PSU calculator and it looks like I need to go from 1050W to 1300W for crossfired R9 390Xs, along with the CPU liquid cooling and 4 case fans. What do you think?


----------



## Vellinious

Quote:


> Originally Posted by *FlickyBeans*
> 
> I only get to 60 degrees tops on my cores.. I have h100i gtx corsair liquid cool, but I see what you're saying. It still applies.
> I notice you have a 1000w psu for only one card, I did a psu calculator and it looks like I need to up from 1050w to 1300w (r9 390x crossfired) along with the liquid cooling on cpu.. 4 fans in the case.. what do you think?


I was running 2 cards, and with the fans and pumps I have for my loop, along with the lighting, and overclocks, 1000 watts was...cutting it close.

For 2 x 390X I'd look to 1000w bare minimum.
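As a sanity check on what those PSU calculators spit out, the estimate is just worst-case component draw summed up, plus headroom so the unit isn't running near its limit. The per-component wattages below are assumed round numbers for illustration, not measurements:

```python
# Back-of-envelope PSU sizing. Every wattage here is an assumption for
# illustration; overclocked 390Xs can pull well past their 275W board power.
components = {
    "R9 390X #1 (OC'd)": 350,
    "R9 390X #2 (OC'd)": 350,
    "i7-4790K (OC'd)": 130,
    "Motherboard/RAM/drives": 75,
    "Fans/pump/lighting": 45,
}

total = sum(components.values())
headroom = 1.3  # ~30% margin keeps the PSU in its efficient, quiet range

print(f"Estimated load: {total} W -> suggested PSU: {total * headroom:.0f} W")
```

With these guesses the suggestion lands in the 1200W+ range for an overclocked 390X CrossFire rig, which is why "1000W bare minimum" is cutting it close once overclocks and pumps are added.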


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Then get a 980Ti or wait until the next gen of cards. Buying a 970 would be sidestepping. I think your disappointed in the coil whine and trying to find a reason to justify a bad purchase.


exactly


----------



## battleaxe

Found the culprit of my freezing and crashing... a bad hard drive. At least SSDs are cheap now.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Found the culprit of my freezing crashing... bad hard drive. At least SSD's are cheap now.


Are you going to reinstall everything? I image all my HDDs using the free version of either EaseUS or Macrium. Less downtime.


----------



## zorbyss

Hi guys, I'm new here and I've built myself a new gaming PC. It's my first time building, so please forgive me if I say anything stupid. I'm having issues with CS:GO: the FPS hovers around 30-70 and I can't keep it steady above 60 (with or without V-sync). I tried messing around with the graphics settings, but it's the same even when I set everything to the lowest. To my understanding, CS:GO is very easy to max out, and I can get 80+ FPS in Crysis 2 and similar framerates in Far Cry 4, just not CS:GO. I'm currently on Crimson 16.3. Specs are as follows, if it helps.



Motherboard : ASUS Z170 Pro Gaming
RAM : Corsair DDR4 2400mhz 8GB x1

Thank you all in advance. Sorry for my bad english.

Edit: add on motherboard and ram


----------



## jodybdesigns

Quote:


> Originally Posted by *rdr09*
> 
> Are you going to reinstall everything? I image all my HDDs using free version of either Easeus or Macrium. Less downtime.


I always used Acronis. Are these better?
Quote:


> Originally Posted by *zorbyss*
> 
> Hi guys, I'm new here and I've built myself a new gaming pc. It's my first time building so please forgive me if I said anything stupid. I've having issues with CS : GO, the fps hovers around 30-70 and couldn't keep it steady above 60 (with and without V-sync). I tried messing around with the graphic settings but still the same even I set the graphic settings to the lowest. To my understanding, CS:GO's graphic requirement is very easy to max out and I was able to achieve above 80+ with Crysis 2 and similar framerate with Far Cry 4 but just not CS:GO. I'm currently on 16.3 Crimson. Specs are as follow if it helps.
> 
> 
> 
> Motherboard : ASUS Z170 Pro Gaming
> RAM : Corsair DDR4 2400mhz 8GB x1
> 
> Thank you all in advance. Sorry for my bad english.
> 
> Edit: add on motherboard and ram


While you're in game, turn off Full Screen and then turn it back on.


----------



## Master0fBlunt

Quote:


> Originally Posted by *Paulvin*
> 
> any reason why should i stay away from MSI cards?


I keep hearing negative things about my ASUS Strix 390X. Are they really inferior performance-wise? My point being: should I get another for my X-fire setup so they match aesthetically, or should I stay away from them and X-fire with, say, a PowerColor?


----------



## kizwan

Quote:


> Originally Posted by *Master0fBlunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paulvin*
> 
> any reason why should i stay away from MSI cards?
> 
> 
> 
> I keep hearing negative things about my Asus Strix 390x, are they really inferior performance-wise? My point being, should I get another for my X-fire setup for aesthetic reasons so they match, or should I stay away from them and X-fire w/ say a PowerColour?
Click to expand...

ASUS gets a bad rap because its cooler can't provide adequate cooling, especially when overclocking, but if I'm not mistaken the Strix has a slightly better cooler than other ASUS cards.


----------



## jodybdesigns

Quote:


> Originally Posted by *Master0fBlunt*
> 
> I keep hearing negative things about my Asus Strix 390x, are they really inferior performance-wise? My point being, should I get another for my X-fire setup for aesthetic reasons so they match, or should I stay away from them and X-fire w/ say a PowerColour?


Well, the PowerColors aren't the greatest overclockers in the world, but if you're looking for cooling, they have it. Mine hasn't broken 75°C on the core, VRM1, or VRM2 at 1100/1625 with +38mV and +20 Aux. We're talking hours-long sessions with nice full curves. I leave my fans at 60% during gaming. The room is about 74°F.


----------



## Stige

Quote:


> Originally Posted by *kizwan*
> 
> ASUS get a bad rap because of the cooler that unable to provide adequate cooling especially when overclock but if I'm not mistaken Strix have slightly better cooler than other ASUS cards.


The Strix cooler is just as bad as the rest. I have the Strix DC3 R9 390 and the stock cooler is garbage; the VRM hits 90°C+ at stock clocks.


----------



## Noirgheos

This is why we get Sapphire


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Well the Powercolors aren't the greatest over clockers in the world, but if you are looking for cooling they have it. Mine hasn't broke 75c on core, VRM1 or VRM2 1100/1625 +38mv +20Aux. We are talking hours of sessions will nice full curves. I leave my fans at 60% during gaming. The room is about 74F.


can you show me your custom fan profile?


----------



## bluej511

Noob here (well, noob to the site, haha). I have my Sapphire R9 390 Nitro watercooled, so I thought I'd chime in, with pics. I absolutely loved it air-cooled, but anything above 60% fan speed is LOUD. It wouldn't break 72°C too often, and the VRMs stayed below 70°C in an Antec 302 case with 2 intake fans.

I now have the Core X5 and the Alphacool GPX R9 390 M01 on it. I've not seen many builds with this GPU block, as it's quite restrictive, but the temps are INSANE; I might be the only one in this thread with it. My core sits around 38-41°C (yes, that low; I was shocked when I installed it, and installation with all the pads front and back was a pain, but it works). The VRMs again stay under 70°C, which is impressive for a card that's now passively cooled with no fans on it. I have a fan above it as an exhaust and 2 intake fans that are quite far from the card; I'm hoping to add more intakes somewhere. I went from a 5770 > 7850 > R9 390. Absolutely love the card; it plays everything I throw at it.

Right now, just running the GPU-Z render test, VRM1 is 47°C and VRM2 is 62°C. I did overclock it to 1100/1600 at factory voltage, and in-game it looked fine, no artifacts or anything to speak of; it seems to run OK.

Here she is in the case. The backplate is included; it's heavy, but the card is incredibly stiff now.

And here are the fins. I stuck my phone in the case, excuse the quality.


----------



## Vellinious

How are the VRM temps? That's the problem with the Alphacool GPX blocks: they don't actively cool the VRM and VRAM. They're decent blocks and do a fine job with good airflow, and they're especially a lifesaver when no full-coverage blocks are available.

I believe they still run a program where, if they don't already make a block for your GPU, you can send them your card and they'll design a block and give you the first one for free.


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> How are the VRM temps? That's the problem with the Alphacool GPX blocks, they don't actively cool the VRM and VRAM.. They're decent blocks, and do a fine job with good air flow. Especially life saving if there are no full coverage blocks available.
> 
> I believe they still have their program going where if they don't already make a block for your GPU, if you send them your card, they'll make a block, and give you the first one for free.


VRAM doesn't need any real cooling, so that's irrelevant; you can strap small individual heatsinks on each VRAM chip and they still won't get hot. The VRM does get hot, though, if you run the stock thermal pads that come with the block without a fan attached to the heatsink end where the VRM is.


----------



## bluej511

Yeah, I thought about that, but unless they make a copper version I wouldn't want it; I'm not interested in nickel. They make one for the MSI card now, so we might see a Sapphire one soon enough.

Right now, just running the GPU-Z render test, VRM1 is 47°C and VRM2 is 62°C. It's about the same after a long gaming session. I've got no fan attached to it either, but Alphacool does make a bracket for an 80mm fan. I used the pads that come with it and I've had no problems. While gaming, VRM1 gets close to 60°C and VRM2 stays around the low-to-mid 60s. I don't see that as an issue; plenty cool.

I also mounted a temp sensor taped to the backplate right by the core, and that only got to about 42°C, so it seems to be working great, VRM and core alike.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> VRAM doesn't need any real cooling so that is irrelevant, you can strap small individual heat sinks on each VRAM chip and they still won't get hot. VRM does get hot though if you run with stock thermal pads that come with the block without a fan attached to the heatsink end where the VRM is.


VRAM doesn't need cooling.....lol, ok


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> VRAM doesn't need cooling.....lol, ok


"you can strap small individual heat sinks on each VRAM chip and they still won't get hot"

Read it again.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> "VRAM doesn't need any real cooling so that is irrelevant"


ok


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> ok


You can put this on the VRAM and they will have enough cooling.



Here is the proof you are wrong:
Quote:


> Originally Posted by *battleaxe*
> 
> I redesigned the VRM coolers once again. Here's the results. I'm quite happy with how this worked out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 52C max on VRM1, less than 40C on VRM2... works for me.
> 
> 1200mhz core, 1500mhz mem, 1.211v max during load.


You can keep your finger on the VRAM all day long if you wish, which means it doesn't get hot at all.


----------



## Vellinious

That's proof that I'm wrong, that VRAM doesn't need cooling? Man... you've got a lot to learn about the scientific method. lol

1500MHz memory is pretty weak...


----------



## bluej511

So this is after about 20 minutes of AC Syndicate at full tilt; the water temp has leveled off to about a 9-10°C delta. I'm only measuring the outlet of the 360 rad; before, I had the 240 and 360 one right after the other, which might have made a bit of a difference.

P.S. I had heard that too, that VRAM gets nowhere near as hot as the VRM does. Don't forget the VRM (voltage regulator module) handles far more power than the VRAM does, hence way more heat.
As you can see, no issues with VRM cooling, and this was OC'd at 1100/1200 at max voltage; the average was a bit lower than that.
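A side note on loop math: the 9-10°C figure is water-vs-ambient, which radiator capacity governs, while the rise from loop inlet to outlet is usually much smaller and follows Q = m_dot * c_p * dT. A quick sketch with assumed numbers; the heat load and flow rate are guesses for illustration, not measured values:

```python
# Coolant temperature rise across a loop: dT = Q / (m_dot * c_p).
# Q (heat load) and flow rate below are assumed example values.
Q_watts = 450          # CPU + GPU heat dumped into the water (assumed)
flow_l_per_min = 4.0   # ballpark flow for a typical D5/DDC loop (assumed)
c_p = 4186             # specific heat of water, J/(kg*K)

m_dot = flow_l_per_min / 60.0  # kg/s, since 1 L of water is ~1 kg
delta_t = Q_watts / (m_dot * c_p)

print(f"Inlet-to-outlet rise: {delta_t:.1f} C")
```

Even at several hundred watts, the inlet-to-outlet rise is only a degree or two, which is why measuring at only one point in the loop (like the 360 rad outlet) gives a reasonable picture of overall water temperature.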


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> That's proof that I'm wrong, that VRAM doesn't need cooling? Man....you've got a lot to learn about scientific method. lol
> 
> 1500mhz memory is pretty weak...


You are wrong and VRAM doesn't get so hot that you can't even touch it. If you strap heatsinks like that on it, you could call it a day, no matter what your clocks are.

Come on, prove me wrong kid.


----------



## Vellinious

Quote:


> Originally Posted by *bluej511*
> 
> So this is after about 20mins of AC Syndicate on full tilt, water temp has leveled off to about 9-10C Delta. Im only measuring the outlet of the 360 rad, before i had the 240 and 360 right after the other, this might have made a bit of a difference.
> 
> P.S. I had heard that too that VRAM gets no where near as hot as the VRM does. Don't forget VRM mean voltage regulator module gets much more voltages then the VRAM does hence way more heat.
> As you can see no issues with VRM cooling and this was oced at 1100/1200 and thats max voltage, avg was a bit lower then that.


That's better VRM temps than I thought they'd have. Not bad at all.


----------



## chiknnwatrmln

Question for somebody running a CF 390/x setup.. Does Fallout 4 Crossfire work properly for you?


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> You are wrong and VRAM doesn't get so hot that you can't even touch it. If you strap heatsinks like that on it, you could call it a day, no matter what your clocks are.
> 
> Come on, prove me wrong kid. But you can't cause I'm right.


Actually, the VRAM with my card on air at 1750 would start hitting artifacts there. With the Kyrographics block I can push it up to 1800 without any artifacts. The cooler you keep it, the better it'll run. It's been that way forever.

I push my hardware....I know what it takes to make them run better. Cooler is always better. If you're looking for mild overclocks and a little performance boost, then the passive heatsinks are probably just fine for you. It's been the same for memory since I really started hitting the overclocks hard back on Fermi....


----------



## bluej511

Quote:


> Originally Posted by *Vellinious*
> 
> That's better VRM temps than I thought they'd have. Not bad at all.


Yea, they have a bad rep because extremerigs (or whatever the site is called) did a review and got crazy high VRM temps, I believe above 90. I think there were some quality issues, or maybe just install issues, who knows. For me it's dead on or even a bit lower than the Nitro cooler did with THREE fans blowing over it. This only has 2 intake case fans. I might strap the 140 in front of the fins to see if it makes any difference, but the hoses are more in the way than I thought. Otherwise, can't fault a core of 42°C coming from 72°C; that's a 30°C drop on the core.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> So this is after about 20mins of AC Syndicate on full tilt, water temp has leveled off to about 9-10C Delta. Im only measuring the outlet of the 360 rad, before i had the 240 and 360 right after the other, this might have made a bit of a difference.
> 
> P.S. I had heard that too that VRAM gets no where near as hot as the VRM does. Don't forget VRM mean voltage regulator module gets much more voltages then the VRAM does hence way more heat.
> As you can see no issues with VRM cooling and this was oced at 1100/1200 and thats max voltage, avg was a bit lower then that.


Wonder why your VRM2 is hotter than mine. At +125mV I get ~44C max on core, 59C on VRM1 and 45C on VRM2 with the GPX block. Phobya thermal pads and a GT AP15 glued on top of the VRM part of the heatsink.
Without these mods, the VRM temps were unacceptable, really. Max voltage says 1.305V, but that doesn't mean much and doesn't tell you how much it is under load.

My room temp is ~25C and the exhaust temp from the radiator is ~30C.
After a night with the PC turned off, the room is around ~22C in the morning; it heats up pretty well


----------



## Vellinious

Quote:


> Originally Posted by *bluej511*
> 
> Yea they have a bad rep because extremerigs or wtv the site is called did a review and got crazy high VRM temps i believe above 90. I think there was some quality issues or maybe just install issues who knows. For me its dead on or even a bit lower then the Nitro cooler did with THREE fans blowing over it. This only has 2 intake case fans. I might strap the 140 in front of the fins see if it makes any difference but the hoses are more in the way then i thought. Otherwise can't fault a core of 42°C coming from 72°C its a 30°C drop on the core.


The core temps have always been pretty decent with the GPX blocks. They're great if there are no full coverage blocks available.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> Wonder why your VRM2 is hotter than mine. At +125mV I get ~44C max on Core, 59C on VRM1 and 45C on VRM2 with GPX block. Phobya thermal pads and GT AP15 glued on top of the VRM part of the heatsink. Without these mods, the VRM temps were unacceptable really. Max voltage says 1.305V but that really doesn't mean much or doesn't tell how much it is under load.


Well, you do have way better thermal pads and a fan blowing over it. Are you running the Sapphire as well, or another brand? I do believe they all use different VRMs, which could be why. I think VRM1 on the Sapphire is only 3 phases while VRM2 is at least 7-8.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> You are wrong and VRAM doesn't get so hot that you can't even touch it. If you strap heatsinks like that on it, you could call it a day, no matter what your clocks are.
> 
> Come on, prove me wrong kid.


Okay.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Okay.


I got an especially good laugh out of "kid". lol


----------



## bluej511

Guys, don't make me stick a "probe" in between the VRAM and the thermal pad to see what temps they get haha. Nah, I wouldn't do that just to prove a point anyway. Google gives you 0 results for VRAM temps and automatically posts results for VRM. My guess is that because GDDR5, let's admit, is quite low voltage AND low current, it gets less hot. End the argument there.


----------



## PCMADD0CT0R

I have 2 Strix 390's because Micro Center had them on sale; I couldn't pass them up, upgrading from my Vapor-X Trixx 280X's. Although you have the 390X, I don't see any issues, as mine work just fine in Crossfire from the branding perspective.


----------



## mus1mus

Ha. Yeah boy!

Who am I to call you "kid" when I am eating your dust in benches? Hmmm

--was meant to quote Vel..... But mobile data sux.


----------



## bluej511

Guys should def quote i am utterly lost right now. Btw EK came with so much Gelid Extreme its what i used on the GPU block during install.


----------



## PCMADD0CT0R

Quote:


> Originally Posted by *jodybdesigns*
> 
> Well the Powercolors aren't the greatest over clockers in the world, but if you are looking for cooling they have it. Mine hasn't broke 75c on core, VRM1 or VRM2 1100/1625 +38mv +20Aux. We are talking hours of sessions will nice full curves. I leave my fans at 60% during gaming. The room is about 74F.


(Sorry, missed the quote button after the fact.) I have 2 Strix 390's because Micro Center had them on sale; I couldn't pass them up, upgrading from my Vapor-X Trixx 280X's. Although you have the 390X, I don't see any issues, as mine work just fine in Crossfire from the branding perspective.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Ha. Yeah boy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Who am I to call you "kid" when I am eating your dust in benches? hmmm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> --was meant to quote Vel..... But mobile data sux.


And probably older....I've been building my own rigs since K6 = P


----------



## bluej511

Quote:


> Originally Posted by *Vellinious*
> 
> And probably older....I've been building my own rigs since K6 = P


Pft i started using AOL wtv version and windows 95 lol.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Guys don;t make me stick a "probe" in between the VRAM and the thermal pad to see what temps they get haha. Nah i wouldn't do that to prove a point anyways. Google gives you 0 results for VRAM temps and automatically posts results for VRM. My guess is because DDR5 lets admit is quite low voltage AND low amps it gets less hot. End the argument there.


I would love to see this please, if you just could.


----------



## Vellinious

Quote:


> Originally Posted by *bluej511*
> 
> Pft i started using AOL wtv version and windows 95 lol.


AOL. The free disks in the mail once a week....I must have had a stack of those things 3ft high. I used them as coasters. rofl


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> can you show me your custom fan profile?


Well, it's pretty much a 1% fan speed increase for every degree over 65C - so 65C / 65% fan speed, 70C / 70% fan speed, etc, etc.

But lately I haven't really been using the curve. I just set it at 60% and game on. I just watch the temps on my second monitor and adjust if needed. I should set up a new curve but I am just too lazy.

Also, my fans are on at all times; 40% is completely inaudible. My pump can be louder. And my push/pull @ 900rpm can be heard a little bit, but it is also outside of the case.
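A curve like the one described (fan % tracking temperature one-for-one above 65C) is simple enough to sketch; the 40% idle floor and the 100% cap here are my assumptions for illustration, not values from the post:

```python
# Sketch of a "1% per degree over 65C" fan curve: 65C -> 65%,
# 70C -> 70%, and so on. The 40% idle floor and 100% cap are
# assumed values for illustration.

def fan_speed_pct(temp_c: float, floor: int = 40) -> int:
    """Target fan duty cycle (%) for a given GPU temperature."""
    if temp_c < 65:
        return floor
    return min(int(temp_c), 100)

print(fan_speed_pct(70))  # 70
```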


----------



## m70b1jr

Hey guys, are the EKWB Vardar fans all they're hyped up to be? I'm trying to find some REALLY silent fans for my H80i GT.


----------



## Chaoz

Go with the Corsair SP120Q PWM; in combination with the Link they're really silent. I have 4 of them in push/pull constantly spinning at 70% (±1000rpm) and they're barely noticeable above my case fans (be quiet! Pure Wings 2 120mm)


----------



## jdorje

I have phanteks mp's on my h80i. Very good fans and 20% off at newegg, but I think there are quieter fans out there (not anything from corsair though).


----------



## bluej511

From what I've read, idk why people keep buying the Corsair fans; I've been told they're not quiet at all.


----------



## patriotaki

I used two different PCI-E cables to power up my R9 390 PCS+, and I have to say the coil whine is reduced.
I use 1 cable for the 8-pin and a different cable for the 6-pin.
Strange, isn't it?


----------



## Chaoz

Quote:


> Originally Posted by *jdorje*
> 
> I have phanteks mp's on my h80i. Very good fans and 20% off at newegg, but I think there are quieter fans out there (not anything from corsair though).


Quote:


> Originally Posted by *bluej511*
> 
> From what ive read idk why people keep buying the corsair fans ive been told they're not quiet at all.


And you guys know this because you have tried them or just because people tell you they're not? And you believe them?

Cuz that's such BS, imho. The SP120Q PWM's are really silent.


----------



## jdorje

Who makes corsair fans?

I was not bashing them, just saying they aren't going to be better than phanteks. If anything I understated how happy I am with my mps.

But when fans ramp up the h80i rad makes noise as air moves through it. Fans can't stop that.


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> i used two diffrent pci-e cables to power up my r9 390 pcs+ , and i have to say the coil whine is reduced..
> i use 1 cable for the 8pin and a different cable for the 6pin..
> strange isnt it?


I did that to my sapphire to begin with, no daisy chain im using 2 separate 8pin connections.


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> And you guys know this because you have tried them or just because people tell you they're not? And you believe them?
> 
> Cuz that's such BS, imho. The SP120Q PWM's are really silent.


I haven't tried em yet, but their dB rating is 23 while, let's say, a Noctua is rated at 18 with more static pressure and more CFM. I have all my fans at 1200rpm and I'm emitting about 18dB of noise from a meter away. It's quiet.

And btw Chaoz, I'm talking at full speed, maxed out, not just on PWM. And yes, I understand the difference between air noise and actual fan noise.
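Worth noting when comparing fan specs: dB figures from multiple fans don't add linearly; sound levels combine logarithmically, so doubling identical sources only adds about 3 dB. A quick sketch (the 18 dB per-fan figure is just the example from above):

```python
import math

# Combine sound pressure levels in dB: convert each level to linear
# power, sum, convert back. Two identical sources gain ~3 dB, not 2x.

def combine_db(levels):
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels))

# Two fans at 18 dB each land around 21 dB, not 36 dB.
print(round(combine_db([18, 18]), 1))  # 21.0
```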


----------



## jdorje

Yeah, I'm using two different power cables with my 390 also, even though both of them have a 6 and 8 pin connector on them. No reason to use just one cable.


----------



## patriotaki

Quote:


> Originally Posted by *bluej511*
> 
> I did that to my sapphire to begin with, no daisy chain im using 2 separate 8pin connections.


Me too.. using two different 8-pin cables.
I was impressed.. didn't expect it to reduce the coil whine


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> Are you going to reinstall everything? I image all my HDDs using free version of either Easeus or Macrium. Less downtime.


IDK. Turns out it wasn't the hard drive after all. The problem is so inconsistent, so annoying. Very hard to figure out, usually I get these things figured out pretty fast. This one has me stumped.


----------



## m70b1jr

In all honesty, Corsair's link program sucks, and I'm stuck using it. I have a custom fan profile, and it never works. Never. It's broken software and needs to be updated.


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> Hey guys, is the EKWB Vargar fans all that it's hyped up to be? I'm trying to find some REALLY silent fans for my H80i GT.


They are. Vardars are really great.
I believe dazmode still sells GTs too.









Quote:


> Originally Posted by *Chaoz*
> 
> Go with Corsair SP120Q PWM, in combination with the Link they're really silent. Have 4 of them in push/pull constantly spinning at 70% (±1000rpm) and they're barely noticable above my casefans (be quiet! Pure Wings 2 120mm)


meh. Try to grab some 1300 RPM Cougar HDBs and hear that Corsair being a ton louder.








Quote:


> Originally Posted by *battleaxe*
> 
> IDK. Turns out it wasn't the hard drive after all. The problem is so inconsistent, so annoying. Very hard to figure out, usually I get these things figured out pretty fast. This one has me stumped.


What about your PSU? I know for sure my X-1250W can't handle 2 cards when benching.


----------



## jdorje

Link software is awful. I have it installed but never run it. But it seems if I open link and change the fan profile then that's persistent even with the software closed. Must be stored in the firmware somehow? I've tried out all four profiles and come back to "default" as the one I use, so that's pretty boring I guess.

But the link hardware can be read by hwinfo. So that's incredibly helpful. Reading the water temp is extremely useful as its far more accurate than other temp measurements.


----------



## Chaoz

Quote:


> Originally Posted by *bluej511*
> 
> I havent tried em yet but they have their db rated at 23 while lets say a noctua is rating at 18 with more static pressure and more cfm. I have all my fans at 1200rpms and im emitting about 18db of noise from a meter away. Its quiet.
> 
> And btw Chaos im talking at full speed maxed out not just on PWM. And yes i understand the difference between air noise and actual fan noise.


True, but you never said at full speed in your previous post. Just that Corsair fans aren't any good.


----------



## patriotaki

I ordered the Corsair SP120 LED edition two days ago to put on my CPU cooler, and I'll use the Cooler Master fan that came with the cooler as pull.

I'm also thinking of taking a spare fan I have and putting it under the R9 390 at an angle so I can get better temps.


----------



## jodybdesigns

Quote:


> Originally Posted by *Chaoz*
> 
> Go with Corsair SP120Q PWM, in combination with the Link they're really silent. Have 4 of them in push/pull constantly spinning at 70% (±1000rpm) and they're barely noticable above my casefans (be quiet! Pure Wings 2 120mm)


I have 2 of them right now. They are on the inside. You can't hear them at all no matter the speed. Period. Nada. But I have a couple of Corsair fans that came with my H100 I am using in pull. I have to turn them WAY down, they are still audible @ 900RPM. Need to get some more SP120 PWMs
Quote:


> Originally Posted by *patriotaki*
> 
> me2.. using two different 8pin cables..
> i was impressed .. didnt expect to reduce the coil whine


I'm using 2 cables as well. Each cable has 2 plugs, but I am only using 1 of each.


----------



## jodybdesigns

*remove ugh FF!*


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> True, but you never said at full speed in your previous post. Just that Corsair fans aren't any good.


Yea, I should have specified. The Noctuas get loud at full 12v, but that's mostly the amount of air you hear passing through the rads. I have them at 10.5v and they're already running at 1200rpm. My Enermax are just as silent, though they're so cheap compared to the Noctuas that the blades don't even spin straight haha.


----------



## bluej511

Quote:


> Originally Posted by *jodybdesigns*
> 
> I have 2 of them right now. They are on the inside. You can't hear them at all no matter the speed. Period. Nada. But I have a couple of Corsair fans that came with my H100 I am using in pull. I have to turn them WAY down, they are still audible @ 900RPM. Need to get some more SP120 PWMs
> I'm using 2 cables as well. Each cable has 2 plugs, but I am only using 1 of each.


Hey just curious since i already forgot what the stock cooler runs VRM temps at. I want to see how close mine are on the Alpha block compared to the nitro cooler.


----------



## Tobiman

Best way to get below 70 Celsius on these cards without the card getting too loud is to invest in a Kraken G10 + Kraken X40 combo, or something a bit thicker. You can also use the combo on future cards, so no loss there.


----------



## jodybdesigns

Quote:


> Originally Posted by *bluej511*
> 
> Hey just curious since i already forgot what the stock cooler runs VRM temps at. I want to see how close mine are on the Alpha block compared to the nitro cooler.


Stock settings besides Fan Curve. Fan is currently at 33%. VRAM won't downclock because of multi monitor. My temps are great on my Powercolor PCS+. If I manually set my 750mhz VRAM profile in afterburner, my temps drop about 3C across the board.



*edit* It is also a bit warmer in my room today. It is cooler outside so inside temps are 77F. Beginning of Spring in Tennessee so temps outside are funny. I think we are in Blackberry winter. Middle of the week I will stop running my heat again, internal temps will be about 73F inside. I love watching the idle temp of my i5 3570K @ 4.2ghz drop down into the 20's. And stay in the 40's during gaming.


----------



## bluej511

Quote:


> Originally Posted by *jodybdesigns*
> 
> Stock settings besides Fan Curve. Fan is currently at 33%. VRAM won't downclock because of multi monitor. My temps are great on my Powercolor PCS+. If I manually set my 750mhz VRAM profile in afterburner, my temps drop about 3C across the board.
> 
> 
> 
> *edit* It is also a bit warmer in my room today. It is cooler outside so inside temps are 77F. Beginning of Spring in Tennessee so temps outside are funny. I think we are in Blackberry winter. Middle of the week I will stop running my heat again, internal temps will be about 73F inside. I love watching the idle temp of my i5 3570K @ 4.2ghz drop down into the 20's. And stay in the 40's during gaming.


Ah, those are def not loaded temps, are they haha. VRMs under 60, damn, I'd love that. I've added more intake fans to my case, literally right above the block, and it seems like with better-moving fans it will get even cooler. Gonna add 3 intake fans up top.


----------



## jodybdesigns

Quote:


> Originally Posted by *bluej511*
> 
> Ah those are def not loaded temps are they haha. VRMs under 60 damn id love that. Ive added more intake fans to my case literally right above the block and seems like with better moving fans it will get even cooler. Gonna add 3 intake fans up top.


I modded the crap out of my Phantom 410. No panels, custom side window. I got 4 Bitfenix Spectres for $3.30/lot. They are wonderful. I usually run them @ 7v, but I've been using them at 12v @ 2000rpm and they are practically silent (why buy something and not use it, right?). I have 2 in the front, 1 in the bottom, 1 in the back. 2x SP120 PWMs up top as push; the radiator is outside the frame, with 2x H100 fans in pull (these are the loudest of them all, budget bleh).

*Potato pic incoming*


Pretty happy with my setup.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> They are. Vardars are really great.
> I believe dazmode still sells GTs too.
> 
> 
> 
> 
> 
> 
> 
> 
> meh. Try to grab some 1300 RPM Cougar HDBs and hear that Corsair being a ton louder.
> 
> 
> 
> 
> 
> 
> 
> 
> What about your PSU? I know for sure my X-1250W can't handle 2 cards when benching.


I put a new PSU on just the GPU's and same thing. So, unless the existing PSU is flaking even when only on the CPU, then that's not it. Think that's possible? It works just fine when only one card is plugged in. As soon as I add the second card, I get freezing and such. Checked the motherboard too. Same thing. Checked unplugging all hard drives, same thing.


----------



## bluej511

I find it hard to believe a 1250w psu can't handle 2 r9 390s and a cpu.


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> I find it hard to believe a 1250w psu can't handle 2 r9 390s and a cpu.


His cards hit over 1300mhz on core. He's probably pulling well over 400-450 watts per card. Think again.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> I put a new PSU on just the GPU's and same thing. So, unless the existing PSU is flaking even when only on the CPU, then that's not it. Think that's possible? It works just fine when only one card is plugged in. As soon as I add the second card, I get freezing and such. Checked the motherboard too. Same thing. Checked unplugging all hard drives, same thing.


Clean Drivers as well? OS?

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluej511*
> 
> I find it hard to believe a 1250w psu can't handle 2 r9 390s and a cpu.
> 
> 
> 
> His cards hit over 1300mhz on core. He's probably pulling well over 400-450 watts per card. Think again.
Click to expand...

http://www.3dmark.com/3dm11/11075753









Any 850W will fail at this level.


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> His cards hit over 1300mhz on core. He's probably pulling well over 400-450 watts per card. Think again.


Even if it does 900, there's no way his CPU is pulling 300w or so. Even with HDDs, fans, pumps, etc., that prob doesn't pull 100w, even with RAM. I find it hard to believe, but who knows.
Quote:


> Originally Posted by *mus1mus*
> 
> Clean Drivers as well? OS?
> http://www.3dmark.com/3dm11/11075753
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any 850W will fail at this level.


Impressive, but can it run Minecraft haha. I guess it's possible, but it seems insane for a CPU to pull 250+ w.


----------



## NastyAIDS

Ok, so I have an Asus R9 390 Strix and I was wondering if anyone has their Strix watercooled. I know that EK has an R9 290 DirectCU II waterblock and was wondering if that would fit.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Clean Drivers as well? OS?
> http://www.3dmark.com/3dm11/11075753
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any 850W will fail at this level.


Quote:


> Originally Posted by *NastyAIDS*
> 
> Ok so i have a Asus R9 390 Strix and i was wondering if anyone has their strix watercooled. I know that EK has a R9 290 Directcu ii waterblock and was wondering if that would fit.


The EK one fits any reference-PCB-design R9 290/X/390/X. They just recently released one for the MSI Gaming 390/X. Wish they'd release one for the Sapphire too haha.


----------



## NastyAIDS

Ya, I wish they made full blocks for pretty much every R9 390 vendor. My 390 hits 81c with an OC of 1110 and my god the fans are loud.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Even if it does 900 theres no way his CPU is pulling 300w or so. Even with HDDs, fans, pumps and etch that prob doesnt pull 100w even with RAM. I find it hard to believe but who knows.


You need to rethink what you believe rather than bashing what's happening.

Mind you, I am only benching. Not gaming. Gaming BF4 may even need my clocks lowered further than I would like.
And OCP trips happen only in Combined Tests. CPU at 4.7/1.33 + 2 290Xs at 1.45ish load. If I lower my CPU clock to 4.6/1.26 it can pass a test. And that gives me more wiggle room for another voltage bump on the GPUs.

*Voltage and clocks increase power consumption exponentially. It's not as simple as 450 + 450 + 300 + 100 W.*
Quote:


> Impressive, but can it run minecraft haha. I guess its possible, but that seems insane for a cpu to pull 250+ w.


Nope. Minesweeper crashes at 200000FPS.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> You need to rethink what you believe than bashing what's happening.
> 
> 
> 
> 
> 
> 
> 
> 
> Mind you, I am only benching. Not Gaming. Gaming BF4 may even need my clocks lowered further than I would like to.
> And OCP trips happen only in Combined Tests. CPU at 4.7/1.33 + 2 290Xs at 1.45ish load. If I lower my CPU clock to 4.6/1.26 it can pass a test. And gives me more wiggle room for another Voltage bump on the GPUs.
> 
> *Voltage and Clocks increase Power Consumption exponentially. It's not as simple as 450 + 450 + 300 + 100 W.
> 
> 
> 
> 
> 
> 
> 
> *
> Nope. Minesweeper crashes at 200000FPS.


No bashing, just hard to believe. Let's say max each card pulls 450w; two of them would equal 900w. Nothing exponential if that's the maximum, hence the term maximum. That would theoretically mean your CPU is pulling, leaving headroom for other components, 200-250w. Never said it was impossible, just hard to believe.

Edit: Never mind, just realized you have an AMD CPU, should be pulling 200w on stock clocks muahhahahaha


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> No bashing just hard to believe. Lets say max each card pulls 450w. Two of them would equal 900w, nothing exponential if thats the maximum hence the term maximum. Would mean theoretically that your CPU is pulling lets say for headroom of other components, 200-250w. Never said it was impossible, just hard to believe.


Safe to say my system is pulling 1400 W before tripping OCP, buddy.

Cards are probably pulling north of 550W, or even more, if allowed.

Take this for example to show the relationship between stock and an OC + OV scenario:

Stock clock / stock voltage = 100%
OC at 50% would pull 150% power.
OV at 50% would also pull 150% power.

OC 50% + OV 50% = ~225% power, to keep it simple.
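For reference, the usual first-order model for CMOS dynamic power is linear in clock and quadratic in voltage (P ≈ C·f·V²); the 150%/150% figures above are a rougher linear-in-voltage simplification, so the standard model comes out even worse for combined OC + OV. A sketch of the standard model, not the post's exact numbers:

```python
# First-order CMOS dynamic power model: power scales linearly with
# clock and with the SQUARE of voltage. Ratios are relative to stock
# (1.0, 1.0), so the return value is relative power draw.

def relative_power(clock_ratio: float, voltage_ratio: float) -> float:
    return clock_ratio * voltage_ratio ** 2

print(relative_power(1.5, 1.0))  # 1.5   (+50% clock alone)
print(relative_power(1.0, 1.5))  # 2.25  (+50% voltage alone)
print(relative_power(1.5, 1.5))  # 3.375 (both together)
```

Either way, the point stands: you can't just sum the stock TDPs of the parts when everything is overclocked and overvolted.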


----------



## Chaoz

Quote:


> Originally Posted by *NastyAIDS*
> 
> Ok so i have a Asus R9 390 Strix and i was wondering if anyone has their strix watercooled. I know that EK has a R9 290 Directcu ii waterblock and was wondering if that would fit.


Alphacool has made full waterblock for the R9 series (M03 version for the STRIX cards):
http://www.alphacool.com/shop/gpu-cooler/ati-fullsize/?p=1&o=5&n=12&f=652|77|76


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Safe to say my system is pulling 1400 W before tripping OCP, buddy.
> 
> Cards are probably pulling north or even more than 550W if allowed.
> 
> Take this for example to show the relationship between stock and OC + OV scenario:
> 
> Stock Clock / Stock Voltage = 100%
> OC at 50 % would pull 150% Power.
> OV at 50 % would also pull 150% power
> 
> OC 50% + OV 50% = ~225% power to keep it simple.


I got it no need to call me buddy. Didn't notice your sig i didn't pay attention.


----------



## Nedooo

Ok, time to wake up from the fairy-tale: you read 1200w and you believe it will provide 1200w...muahahahahaha... so the GPU and CPU are the only power eaters...muahahahahaha...
First, let us start from the power source, the PSU... its fan, well, that is a little power-eating gremlin... now just follow the power cords and try to spot a few other power-eating gremlins...


----------



## bluej511

Someone is taking something they shouldnt be taking huh?


----------



## Chaoz

Quote:


> Originally Posted by *bluej511*
> 
> Hey just curious since i already forgot what the stock cooler runs VRM temps at. I want to see how close mine are on the Alpha block compared to the nitro cooler.


My VRM temps are really good on my STRIX R9 390 DC3OC. Barely go over 70°C when I'm playing BO3 or other stuff. Same when I ran Valley.


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> My VRM temps are really good on my STRIX R9 390 DC3OC. Barely go over 70°C when I'm playing BO3 or other stuff.


Alright, so the Alphacool block does slightly better than the Strix; that's good to know. I forgot which card it was whose VRMs were through the roof, I think the MSI cooler maybe.


----------



## Chaoz

Quote:


> Originally Posted by *bluej511*
> 
> Alright so the alphacool block does slightly better then the Strix thats good to know. I forgot which card it was that VRMs were thru the roof, i think the MSI cooler maybe.


Some claim that their Strix cards VRM's are getting way too hot, no problems here, tbh.


----------



## mus1mus

Sig is not there to put you off.

If you want a detailed run through my explanation, just say so. I would be glad to. But it seemed you didn't appreciate a friendly talk from the start.

@Nedooo

Don't worry man, PSU quality was not wrongly considered coming into this kind of set-up.... QUAD Hawaiis


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Sig is not there to put you off.
> 
> 
> 
> 
> 
> 
> 
> If you want a detailed run at my explanation, just say so. I would be glad to. But it seemed you don't appreciate a friendly talk from the start.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Nedooo
> 
> Don't worry man, PSU quality was not wrongly considered coming into this kind of set-up.... QUAD Hawaiis


I appreciate the friendly talk; what I don't appreciate is you assuming I'm bashing. If that were the case I would say AMD CPUs suck, and that would be bashing haha (btw I've only used AMD GPUs and refuse to ever buy Nvidia, hence the joke). No harm done. I understand how electricity and electrical systems work, A x V = W and all that jazz. If I had noticed right away that you were using an AMD CPU that, OCed, pulls close to 200W if not over, then I would have believed it, simple as that.


----------



## NastyAIDS

Just what i needed to know! thanks


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> I appreciate the friendly talk, what i dont appreciate is you assuming im bashing. If that was the case i would say AMD cpus suck. That would be bashing haha (btw ive only used amd gpus and refuse to ever buy nvidia hence the joke). No harm done i understand how electricity and electrical systems work, AxV=W and all that jazz. If i had noticed right away that you were using an AMD cpu that OCed pulls close to 200W if not over then i would have believed it, simple as that.


Pretty sure he's using an Intel CPU, I thought.


----------



## NastyAIDS

Like how hot do the vrms get?


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Clean Drivers as well? OS?
> http://www.3dmark.com/3dm11/11075753
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any 850W will fail at this level.


Used DDU to uninstall, then reinstalled and tried various drivers. Nothing works so far; just tried Win7 too, and got a blue screen on that OS. Win10 just freezes once it's loaded. But sometimes it will game and bench fine. Totally intermittent and annoying as ****.

I'm back to suspecting the motherboard, CPU, or the second GPU (or maybe even the first GPU) causing said issues in Xfire.


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> Used DDU to uninstall and then reinstalled and tried various drivers. Nothing works so far, just tried on Win7 too, got blue screen on that OS. Win10 just freezes when its loaded. But sometimes it will game and bench fine. Totally intermittent and annoying as ****.
> 
> I'm back to suspecting the motherboard, CPU, or the second GPU or even the first GPU maybe causing said issues in Xfire.


I've had some issues sometimes updating/installing Crimson. It has to do with the anti-virus; unplugging the ethernet and then installing fixes that issue. AMD has been great lately, just a few niggles. For some reason in my Crimson the GPU clock is shown in % instead of MHz.


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> Ive had some issues sometimes updating/installing crimson. Has to do with the anti-virus, unplugging the ethernet and the installing it fixes that issue. AMD has been great lately just a few niggles. For some reason in my crimson my gpu clock is in % instead of mhz.


I'll give it a shot, but I doubt it's the issue, as I haven't even installed antivirus software yet since the fresh installation of Win10. It's only days old now.


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> I'll give it a shot, but I doubt its the issue as I haven't even installed an antivirus software yet since fresh installation of Win10. Its only days old now.


I mean for the W10 freezing issue it's worth a shot. I've had a few install issues with AMD and W10 but it's been ok now. I'm not crossfired though, and it's very odd it's not even installing. Have you tried just leaving in one GPU, or is it watercooled and that's difficult?


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> I appreciate the friendly talk, what i dont appreciate is you assuming im bashing. If that was the case i would say AMD cpus suck. That would be bashing haha (btw ive only used amd gpus and refuse to ever buy nvidia hence the joke). No harm done i understand how electricity and electrical systems work, AxV=W and all that jazz. If i had noticed right away that you were using an AMD cpu that OCed pulls close to 200W if not over then i would have believed it, simple as that.


AMD FX rig on the sig is not what I have at the moment. But that is beside the point.

5930K at 4.7 sucks more power than that FX.


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Sig is not there to put you off.
> 
> 
> 
> 
> 
> 
> 
> If you want a detailed run at my explanation, just say so. I would be glad to. But it seemed you don't appreciate a friendly talk from the start.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Nedooo
> 
> Don't worry man, PSU quality was not wrongly considered coming into this kind of set-up.... QUAD Hawaiis


What PSU are you using?


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> They are. Vardars are really great.
> I believe dazmode still sells GTs too.
> 
> 
> 
> 
> 
> 
> 
> 
> meh. Try to grab some 1300 RPM Cougar HDBs and hear that Corsair being a ton louder.
> 
> 
> 
> 
> 
> 
> 
> 
> What about your PSU? I know for sure my X-1250W can't handle 2 cards when benching.
> 
> 
> 
> I put a new PSU on just the GPU's and same thing. So, unless the existing PSU is flaking even when only on the CPU, then that's not it. Think that's possible? It works just fine when only one card is plugged in. As soon as I add the second card, I get freezing and such. Checked the motherboard too. Same thing. Checked unplugging all hard drives, same thing.
Click to expand...

Still have the problem huh?







If it's any consolation to you, I'm also having an intermittent issue with my crossfire when gaming. Freezing/crash/BSOD at one time, no problem at all at another. I already re-verified my CPU overclock a couple of times to rule it out, but the problem is still happening intermittently. I don't know which to blame, CPU or GPU or both.
Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluej511*
> 
> I find it hard to believe a 1250w psu can't handle 2 r9 390s and a cpu.
> 
> 
> 
> His cards hit over 1300mhz on core. He's probably pulling well over 400-450 watts per card. Think again.
Click to expand...

If I'm not mistaken, he's using a custom ROM with the voltage limit removed & Vdroop disabled.


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> What PSU are you using?


Seasonic X-1250 * 2







I can still add another 1000 to the mix. Cocktails








Quote:


> Originally Posted by *kizwan*
> 
> If I'm not mistaken using custom ROM with voltage limit removed & Vdroop disabled.


Vdroop is ON.







Just the limit removed.


----------



## Kalistoval

Is anyone else overclocking an Asus 390 DCUII? I have tried MSI Afterburner and am currently using the newest Trixx, but I can't seem to get it past 1120 core at +100mV.


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Seasonic X-1250 * 2
> 
> 
> 
> 
> 
> 
> 
> I can still add another 1000 to the mix. Cocktails
> 
> 
> 
> 
> 
> 
> 
> 
> Vdroop is ON.
> 
> 
> 
> 
> 
> 
> 
> Just the limit removed.


2x? And it's not enough???


----------



## mus1mus

When pushing *chasing* fyzzz and velinious


----------



## kizwan

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Seasonic X-1250 * 2
> 
> 
> 
> 
> 
> 
> 
> I can still add another 1000 to the mix. Cocktails
> 
> 
> 
> 
> 
> 
> 
> 
> Vdroop is ON.
> 
> 
> 
> 
> 
> 
> 
> Just the limit removed.
> 
> 
> 
> 2x? and its not enough???
Click to expand...

He, fyzzz & velinious are benching with a lot of voltage pushed through the card(s).


----------



## Vellinious

Not as much as I'd like.....I can only get to 1.46v, and with vdroop it ends up 1.305v. It's holding me back.....
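For anyone wanting to sanity-check those numbers, here is a minimal droop sketch in Python using the figures quoted above (the proportional load-line model is an assumption; real VRM behavior varies):

```python
# Vdroop estimate from the quoted figures: 1.46 V set, 1.305 V under load.
# The proportional (load-line) model below is a simplification.
v_set = 1.46    # voltage requested in software (V)
v_load = 1.305  # voltage observed under load (V)

droop = v_set - v_load
droop_pct = droop / v_set * 100
print(f"vdroop: {droop * 1000:.0f} mV ({droop_pct:.1f}% of the set voltage)")

# Assuming droop stays proportional, the set-point needed to hold a
# target load voltage is v_target / (1 - droop_fraction).
v_target = 1.35
needed = v_target / (1 - droop / v_set)
print(f"to hold {v_target} V under load, set ~{needed:.3f} V")
```

With these numbers the droop is ~155 mV (about 10.6%), which is why a 1.46 V set-point lands near 1.305 V under load.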


----------



## diggiddi

So I'm looking to add a 1000-1300W PSU in tandem with my 750W, good idea? The idea is to get a 9590 and maybe a 390/X for trifire
(+ 2x Lightnings) & run the main system off the 1000 and plug the cards' PCIe into the 750


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> So I'm looking to add a 1000-1300w psu in tandem to my 750w, good idea? The idea is to get a 9590 and maybe a 390/x for trifire
> (+ 2x lightnings) & Run the main system off the 1000 and plug the cards Pcie into the 750


You can run the CPU + Main GPU on the 750. 2 Cards + Fans + all other components on the 1000.
Quote:


> Originally Posted by *Nedooo*
> 
> Ok time to wake up from fairy-tale, you read 1200w and you believe it will provide 1200w...muahahahahaha...so gpu and cpu are the only power eaters...muahahahahaha...
> First let us start from the power source PSU...fan well that is little power eating gremlin... now just follow the power cords and try to spot few other power eating gremlins...





Spoiler: Fans are not power eaters? :thinking: How much do you reckon 10 of these will eat?
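To put rough numbers on that split (CPU + main GPU on the 750, everything else on the 1000), here is a quick budget check; every wattage is a ballpark assumption for illustration, not a measurement:

```python
# Ballpark dual-PSU load split; all wattages are illustrative guesses.
loads_750 = {
    "FX-9590, overclocked": 220,
    "main GPU, OC'd 390": 400,
}
loads_1000 = {
    "GPU 2, OC'd 390": 400,
    "GPU 3, OC'd 390": 400,
    "board + RAM + drives": 80,
    "10 case fans (~4 W each)": 40,
}

def headroom(loads, capacity_w):
    """Return (total draw, remaining headroom) for one PSU."""
    total = sum(loads.values())
    return total, capacity_w - total

for name, loads, cap in [("750W unit", loads_750, 750),
                         ("1000W unit", loads_1000, 1000)]:
    total, spare = headroom(loads, cap)
    print(f"{name}: {total} W drawn, {spare} W headroom")
```

Even with generous units, the headroom gets thin once two overclocked Hawaii cards land on one PSU, which is why the 750 only carries the CPU and a single card here.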


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> You can run the CPU + Main GPU on the 750. 2 Cards + Fans + all other components on the 1000.
> 
> 
> Spoiler: Fans are not power eaters? :thinking: How much do you reckon 10 of these will eat?


I can't even overclock 1 GPU on it before it trips current protection


----------



## mus1mus

In that sense, look for no less than a 1000W.


----------



## Kalistoval

I'm making some kinda progress mus lol


----------



## mus1mus

Quote:


> Originally Posted by *Kalistoval*
> 
> I making some kinda progress mus lol


You lost me there buddy.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> You lost me there buddy.


mus1mus, are you building a PC or a rocket ship with those fans haha. I just switched my rad from the top to the front to see if it drops my temps more. Thank god I didn't have to drain it fully; all I did was put the case on its front and removed the rad, barely dripping water. Then I booted and got a shell message. Lo and behold, yanking on SATA cables BAD IDEA, pulled right out of my SSD.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Master0fBlunt*
> 
> I keep hearing negative things about my Asus Strix 390x, are they really inferior performance-wise? My point being, should I get another for my X-fire setup for aesthetic reasons so they match, or should I stay away from them and X-fire w/ say a PowerColour?


2) Asus cards in CF are a bad idea; the top card will likely throttle from high temps..... The cooling is okay for one (mine would run in the high 70's stock, but the VRMs got really hot, I believe in the high 80's)....

Quote:


> Originally Posted by *Kalistoval*
> 
> Is anyone else overclocking an Asus 390 DCUII? I have tried MSI Afterburner and am currently using the newest Trixx, but I can't seem to get it past 1120 core at +100mV.


The asus card will benefit more from overclocking on 50mv or less because of the VRM temps..... I found it better to run the one I had at 25mv and it hit around 1150 core.

Meanwhile.... this thread sure has taken off, it's everything I can do to keep up with you guys!!


----------



## Stige

The ASUS cards have just plain crap cooling for the VRM; it doesn't really exist. Even at stock you are looking at 90C+. Increasing voltage will not do you any good on those cards because you will get even more unstable clocks because of the high VRM temps.

Your best bet to succeed with ASUS cards is an aftermarket cooler, no other choice really.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> The ASUS cards have just plain crap cooling for the VRM; it doesn't really exist. Even at stock you are looking at 90C+. Increasing voltage will not do you any good on those cards because you will get even more unstable clocks because of the high VRM temps.
> 
> Your best bet to succeed with ASUS cards is an aftermarket cooler, no other choice really.


Is your card on water now? I notice you have 1200/1600 listed as your clocks.... I couldn't bench the Asus I tested past 1180 with 75mv and it wasn't anywhere near game stable because the VRM was hitting 95C. The core was not too bad at 85c, but still kinda high.


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Is your card on water now? I notice you have 1200/1600 listed as your clocks.... I couldn't bench the Asus I tested past 1180 with 75mv and it wasn't anywhere near game stable because the VRM was hitting 95C. The core was not too bad at 85c, but still kinda high.


Hey Agent, did you bother changing the TIM and pads on your MSI? If not, I recommend you do. Even though mine seemed messed up at first (with VRM temps easily 10C higher than before), it now overclocks A LOT better. Like, seriously, I can get 1195MHz game stable at +100mV (although I do give up those last 5MHz and settle for 1190MHz at +90mV). Playing Witcher 3 and BF4 atm and core temps have been stellar with Gelid GC Extreme. It never passes 75C (usually hovers just below 70C), albeit with an aggressive fan profile. All that remains is to get my act together, get some Fujipolys (Phobya pads didn't work out at all), get VRM temps down and finally be able to get 1200MHz game stable. I'm even debating making a custom BIOS that has a slightly higher DPM7 voltage if +100mV on the stock BIOS doesn't cut it for 1200MHz. I can already game at 1200MHz and +100mV, I just get SOME artifacting here and there, but nothing major. Strange thing is, it mostly happens when VRM temps rise rapidly, like when Alt+Tabbing back into a game. Oh well.









Only thing that still doesn't want to behave is the memory. Like seriously, how do you guys get 1750MHz stable? How crappy is my memory? Rhetorical questions and all, but I'm really wondering if there's anything more I can do to improve the situation. No particular reason performance-wise to have over the 1625-1675MHz I can get stable, it's just annoying that I seem to be among the few that can't get my memory really high.

PS : That, and benchmarks do love memory clocks.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Hey Agent, did you bother changing the TIM and pads on your MSI? If not, I recommend you do. Even though mine seemed messed up at first (With VRM temps easily 10C higher than before), it now overclocks A LOT better. Like, seriously, I can get 1195MHz game stable at +100mV (Although I do give up those last 5MHz and settle for 1190MHz at +90mV). Playing Witcher 3 and BF4 atm and core temps have been stellar with Gelid GC Extreme. It never passes 75C (usually hovers just below 70C), albeit with an aggressive fan profile. All that remains is to get my act together, get some Fujipolys (Phobya pads didn't work out at all), get VRM temps down and finally be able to get 1200MHz game stable. I'm even debating making a custom BIOS that has a slightly higher DPM7 voltage if +100mV on the stock BIOS doesn't cut it for 1200MHz. I can already game at 1200MHz and +100MHz, I just get SOME artifacting here and there, but nothing major. Strange thing is, it mostly happens when VRM temps rise rapidly, like when Alt+Tabing back into a game. Oh well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only thing that still doesn't want to behave is the memory. Like seriously, how do you guys get 1750MHz stable? How crappy is my memory? Rhetorical questions and all, but I'm really wondering if there's anything more I can do to improve the situation. No particular reason performance-wise to have over the 1625-1675MHz I can get stable, it's just annoying that I seem to be among the few that can't get my memory really high.
> 
> PS : That, and benchmarks do love memory clocks.


I use Gelid Extreme on my Alphacool GPU block; the Supremacy EVO came with so much I was able to do my GPU and CPU block 3-4x.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> Hey Agent, did you bother changing the TIM and pads on your MSI? If not, I recommend you do. Even though mine seemed messed up at first (With VRM temps easily 10C higher than before), it now overclocks A LOT better. Like, seriously, I can get 1195MHz game stable at +100mV (Although I do give up those last 5MHz and settle for 1190MHz at +90mV). Playing Witcher 3 and BF4 atm and core temps have been stellar with Gelid GC Extreme. It never passes 75C (usually hovers just below 70C), albeit with an aggressive fan profile. All that remains is to get my act together, get some Fujipolys (Phobya pads didn't work out at all), get VRM temps down and finally be able to get 1200MHz game stable. I'm even debating making a custom BIOS that has a slightly higher DPM7 voltage if +100mV on the stock BIOS doesn't cut it for 1200MHz. I can already game at 1200MHz and +100MHz, I just get SOME artifacting here and there, but nothing major. Strange thing is, it mostly happens when VRM temps rise rapidly, like when Alt+Tabing back into a game. Oh well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only thing that still doesn't want to behave is the memory. Like seriously, how do you guys get 1750MHz stable? How crappy is my memory? Rhetorical questions and all, but I'm really wondering if there's anything more I can do to improve the situation. No particular reason performance-wise to have over the 1625-1675MHz I can get stable, it's just annoying that I seem to be among the few that can't get my memory really high.
> 
> PS : That, and benchmarks do love memory clocks.


I gotta order some pads before I change out the TIM so it's all done in one shot.

I am thinking the 1mm Fujipoly, but not sure if that will be the correct thickness or not.....


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I gotta order some pads before I change out the TIM so it's all done in one shot.
> 
> I am thinking the 1mm Fujipoly, but not sure if that will be the correct thickness or not.....


Well, EK's site says that 1mm pads are used on the MSI 390x. I didn't get good results with the 1mm Phobya pads and thought it was the thickness, but 1.5mm didn't work out either. I am going to get 1mm Fujipolys though as that seems to be the correct size. Worst case scenario, I have to use a bit more TIM between the pad and cooler.

That got me thinking, could my memory woes (The fact that my max memory overclock is inconsistent is what's strange to me) be related to a high VRAM VRM temperature? I could always put some thermal paste on it to see... Most probably, though, I think I'll just have to accept that that's what I got.


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Is your card on water now? I notice you have 1200/1600 listed as your clocks.... I couldn't bench the Asus I tested past 1180 with 75mv and it wasn't anywhere near game stable because the VRM was hitting 95C. The core was not too bad at 85c, but still kinda high.


Yeah, I got an Alphacool GPX block on it now. They aren't very good either, but with tweaks and better thermal pads I'll manage..


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> Yeah I got Alphacool GPX block on it now, they aren't very good either but with tweaks and better thermal pads, I'll manage..


Eh, I love mine, couldn't be happier. I mean come on, 40°C core temp, who wouldn't be haha. I'd love Fuji pads but jesus they're expensive; I'd only put 'em on the VRMs though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Yeah I got Alphacool GPX block on it now, they aren't very good either but with tweaks and better thermal pads, I'll manage..


What voltage are you running for 1200? What's the core and VRM hit with the alphacool block?


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What voltage are you running for 1200? What's the core and VRM hit with the alphacool block?


+125mV on Core and -25mV on AUX Voltage.

Core/VRM2 are pretty much the same at all times, ~45C tops. VRM1 can get up to 71C under more demanding games like Armored Warfare. Still a bit too high; once it reaches that ~70-71C mark, I start to see small artifacts every few minutes, and anything below the 70C mark means zero artifacts.

I have Phobya 7w/mk thermal pads and a GT AP-15 glued on top of the VRM section to improve the VRM cooling and they still run that hot. Once I've got some spare cash, I'll order some 14w/mk thermal pads and hope they provide a massive improvement..
Quote:


> Originally Posted by *bluej511*
> 
> Eh i love mine couldnt be happier i mean come on 40°C core temp who wouldnt be haha. Id love fuji pads but jesus theyre expensive, id only put em on the vrms though.


Core temp is kinda irrelevant though, as it goes nowhere near as high as VRM1 does; VRM cooling is what matters on these cards really.

Out of the box, the GPX Alphacool block has mediocre VRM cooling at best, far from acceptable if you ask me. With better thermal pads and a fan added, they start to be "ok".
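The pad-conductivity argument can be put in rough numbers. Conduction through a pad follows R = t / (k·A), so going from 7 to 14 W/mK halves the temperature drop across the pad. A minimal sketch, with the pad area, thickness, and VRM power all assumed for illustration:

```python
# dT across a thermal pad: R = t / (k * A), dT = P * R.
# Area, thickness, and power are assumed values, not measurements.
def pad_delta_t(power_w, thickness_m, k_w_mk, area_m2):
    """Temperature drop (K) across a pad of conductivity k (W/m*K)."""
    resistance = thickness_m / (k_w_mk * area_m2)  # thermal resistance, K/W
    return power_w * resistance

area = 0.015 * 0.050  # assume a 15 mm x 50 mm VRM contact strip
thickness = 0.001     # 1 mm pad
power = 30            # assume ~30 W dissipated by the VRM stage

for k in (7, 14):
    dt = pad_delta_t(power, thickness, k, area)
    print(f"{k} W/mK pad: ~{dt:.1f} K drop across the pad")
```

Mounting pressure and contact resistance are ignored here, which is partly why real-world gains from better pads end up smaller than the ideal halving.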


----------



## Agent Smith1984

How much Fujipoly will I need to do my one 390X's VRM and VRAM??

I am getting the 17w/mk Fuji Ultra material. They sell it in 100x15 and 60x50 pieces on Amazon.....

I am thinking that should yield me a good VRM drop. Also putting Noctua NT-H1 on the core.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> +125mV on Core and -25mV on AUX Voltage.
> 
> Core/VRM2 are pretty much the same at all times, ~45C tops. VRM1 can get upto 71C under more demanding games like Armored Warfare. Still a bit too high, once it reaches around that ~70-71C mark, I start to see small artifact every few minutes, anything below that 70C mark and zero artifacts until then.
> 
> I have Phobya 7w/mk thermal pads and GT AP-15 glued on top of the VRM bit to improve the VRM cooling and they still run that hot, once I got some spare cash, I'll order some 14w/mk thermal pads and hope they provide a massive improvement..
> Core temp is kinda irrelevant though as it goes nowhere near as high as VRM1 does, VRM cooling is what matters for these cards really.
> 
> Out of the box, the GPX Alphacool block has mediocre VRM cooling at best, far from acceptable if you ask me. With better thermal pads and fan added, they start to be "ok".


Stige, my out-of-the-box VRM temps are identical to what my Nitro did, mid 60s. Considering custom coolers do that with 3 fans and the GPX does it with none, sounds good to me. My VRMs have never reached 70C.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> How much fujipoly will I need to do my one 390x VRM and VRAM??
> 
> I am getting the 17w/mk Fuji ultra material. They sell it in 100x15 and 60x50 pieces on amazon.....
> 
> I am thinking that should yield me a good VRM drop. Also putting noctua NT-H1 on the core.


The 60x50 sheet is enough to do about 4 cards' VRMs.

I wouldn't waste this good stuff on the RAM. Use something cheaper. Just my opinion.
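The "about 4 cards" estimate checks out with simple tiling arithmetic, if you assume roughly one 15 x 50 mm strip covers a card's VRM row (an assumed footprint, not a measured one):

```python
# How many 15 x 50 mm VRM strips cut cleanly from a 60 x 50 mm sheet.
# The strip size is an assumed per-card VRM footprint.
sheet_w, sheet_h = 60, 50  # mm
strip_w, strip_h = 15, 50  # mm

per_sheet = (sheet_w // strip_w) * (sheet_h // strip_h)
print(f"{per_sheet} strips per sheet")  # one strip per card's VRM row
```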


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Stige, my out-of-the-box VRM temps are identical to what my Nitro did, mid 60s. Considering custom coolers do that with 3 fans and the GPX does it with none, sounds good to me. My VRMs have never reached 70C.


So yeah, zero benefit from the GPX block then







Sure, core temps are lower, but it doesn't affect much if anything at all. But you can't expect much more from a cheap "hybrid" block anyway.

Not very good, like I said; workable with some improvements.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> So yeah, zero benefit from the GPX block then
> 
> 
> 
> 
> 
> 
> 
> Sure core temps are lower but it doesn't affect much if any at all. But can't expect much more from a cheap "hybrid" block anyway.
> 
> Not very good like I said, workable with some improvements.


Yea, pretty much, but 60s VRM is perfectly fine, and that was at 1100/1600 for me, fully stable. Might get some Alphacool 11w pads; they're made by Fujipoly as well.


----------



## Stige

If they shipped the Alphacool blocks with better thermal pads to begin with, I think the added cost would be negligible while getting much better reviews; the thermal pad swap alone can make quite a difference on VRM temps. Better temps = better reviews = more sales.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> If they shipped the Alphacool blocks with better thermalpads to begin with, I think the cost for them would be negligible while getting much better reviews, the thermal pad swap alone can make quite a difference on VRM temps. Better temps = better reviews = more sales.


Was just thinking that myself. If they shipped with 11 or 14w pads it would be amazing. The 11s aren't that expensive either. I might pick some up from the UK in 1.5mm; can't find any in 3mm for the backplate.


----------



## ronaldoz

Well, I just ordered a new Sapphire Radeon NITRO R9 390 8G. Is this a good R9 390? I was thinking about the PowerColor PCS+ too; I've read both have nice temperatures, but Sapphire has better service. Oh, and the Gigabyte now only has 2 fans, so... Also, I got a better feeling for the R9 390 than the GTX 970, which effectively has 3.5GB of memory, and the R9 390 scores really nicely in benchmarks. I sold my PowerColor R9 290 PCS+ and am actually happy to get an AMD again.


----------



## jodybdesigns

Quote:


> Originally Posted by *ronaldoz*
> 
> Well, I just ordered a new Sapphire Radeon NITRO R9 390 8G. Is this a good R9 390? I was thinking about the Powercolor PSC+ too, but I've read both have nice temperatures, but Sapphire got a better service. Oh, and the Gigabyte now only got 2 fans, so... Also, I got a better feeling for the r9 390, then the GTX970. It has 3,5GB memory and the R9 390 scores really nice in benchmarks. I sold my Powercolor R9 290 PSC+ and actually happy to get a AMD again.


Both are awesome. The Sapphire is a better overclocker though. Sapphire always delivers. ALWAYS.


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> Both are awesome. The Sapphire is a better overclocker though. Sapphire always delivers. ALWAYS.


Thanks for the info! That's great to hear, because I like to overclock and could not really find information about the Sapphire yet. For this I have to be sure Afterburner won't lower the voltage back to default, as it did on my R9 290. On this forum I've read it has happened since the Crimson drivers, so Afterburner was not prepared I guess. Trixx did a great job, but in the end my experience with overclocking was not good, and it feels like it kinda damaged the card, so it could not overclock stably anymore. Do you have any suggestions about this?


----------



## jodybdesigns

Quote:


> Originally Posted by *ronaldoz*
> 
> Thanks for the info! That's great to hear, because I like to overclock and could not really find information about the Sapphire yet. For this I have to be sure, Afterburner won't lower the voltage to default, as it did to my R9 290. On this forum I've read, it happened since the Crimson drivers. So Afterburner was not prepared I guess. Trixx did a great job, but in the end my experience with overclocking was not good and feels like it kinda damaged the card, so it could not overclock stable anymore. Do you have any suggestions about this?


I always used Trixx with all my Sapphire cards and I never had a problem. I have owned all Sapphire cards up until this point. I have run multi-monitor for as long as I can remember, and my VRAM has always run at full speed while the core drops to idle. The same thing happens with my PowerColor PCS+. Doesn't bother me. I can manually set my profile to drop the VRAM to 750MHz and I get a 3C difference. Not worth the hassle, so I leave it alone. For once, though, I am using Afterburner because it is a bit more friendly with my PowerColor than Trixx.

But lots of people here have the Nitro. They usually drop in, show off the card, and 90% of them are never heard from again. Why? Because they are busy enjoying their games.

Enjoy your Nitro man.


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> I always used Trixx with all my Sapphire cards and I never had a problem. I have owned all Sapphire cards up until this point. I have ran multi monitor for as long as I can remember, and my vram has always ran full speed where the core will drop to idle. Same thing happens with my Powercolor PCS+. Doesn't bother me. I can manually set my profile to set the vram to 750MHz and I get a 3c difference. Not worth the hassle, so I leave it alone. I am though for once, using Afterburner because it is a bit more friendly with my Powercolor than Trixx.
> 
> But lots of people here have the Nitro. They usually drop in, show off the card, and 90% of them are never heard from. Why? Because they are busy enjoying their games.
> 
> Enjoy your Nitro man.


Thanks, sounds great! I guess I'll be back to tell my experience with the overclock (using Trixx) and replacing the thermal paste. I'd like to put GC Extreme on the GPU chip, but I'm not sure if I'm gonna do that soon or later. Well, I hope it's in the PC tomorrow.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> The 60x50 sheet is enough to do about 4 cards VRM's.
> 
> I wouldn't waste this good stuff on the RAM. Use something cheaper. Just my OP


Awesome!

Think I'll do my 390x, wife's 7870, and son's 7950 all at once!


----------



## bluej511

My Sapphire came with Elpida memory instead of Hynix, kinda disappointing


----------



## ronaldoz

Quote:


> Originally Posted by *bluej511*
> 
> My Sapphire came with Elpida memory instead of Hynix, kinda disappointing


Do you have a Nitro, and have you overclocked the card? Ah, I see, 1100/1600? Are you using this 24/7?


----------



## bluej511

Quote:


> Originally Posted by *ronaldoz*
> 
> Do you have a Nitro, and have you overclocked the card? Ah, I see, 1100/1600? Are you using this 24/7?


Nah, I just set it to that to see if it could run; it seemed pretty stable while gaming so I can't complain. I got the Nitro with the backplate, so it came clocked at 1040/1500 from the factory instead of the 1010/1500 without the backplate. I do believe that now they only come with the backplate; Sapphire no longer sells it without.


----------



## tolis626

Hey guys, been forgetting to ask since I started messing with my card and taking it apart. My backplate has a sort of plastic film on the underside. Is that normal?



Please excuse the piss poor quality. The photos were taken with my Galaxy S3 as I don't have a proper camera since I broke my Nikon. Damn... I shouldn't remember that...


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, been forgetting to ask since I started messing with my card and taking it apart. My backplate has a sort of plastic film on the underside. Is that normal?


Normal, mine does as well; the Sapphire backplate does too.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Normal, mine does as well; the Sapphire backplate does too.


Hmmm... I guessed so. I figure that, if it's a purely cosmetic add-on and doesn't provide any cooling, then making sure it doesn't cause any short circuits is essential.

Anyway, thanks man!


----------



## bluej511

Yea, my Sapphire one actually had some thick thermal pad on the backplate for the VRM; it just wasn't actually under the VRM, which was weird. When installing the Alphacool one I got it dead under the VRM; the Sapphire one was slightly off. And people complaining about the pad thickness on the back of the Alphacool at 3mm: the Sapphire pad on the backplate was the same thickness.


----------



## christoph

yeah my Sapphire has the pad between the backplate and the back side of the VRMs, but I didn't notice a plastic film


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> yeah my Sapphire has the Pad between the backplate and the counterside of the VRMs, but I didn't notice a plastic film


I think it depends who put it on haha; mine had a few air bubbles in it so it was easy to notice. I honestly have no idea what the film even does.


----------



## Kalistoval

1120/1600 at +70mV core voltage, +50mV aux, +50% power limit. Temps are cool. I'm upping the clock and voltage a tiny bit at a time, but the aux voltage is not doing anything. I think the stable max I can get from this card will be 1135 core.


----------



## Stige

Quote:


> Originally Posted by *jodybdesigns*
> 
> Both are awesome. The Sapphire is a better overclocker though. Sapphire always delivers. ALWAYS.


You are wrong. You can't say one brand is a better overclocker than another... It is just pure luck.


----------



## mus1mus

There are companies that do their own pre-binning and put those tested chips onto high-end cards. EVGA is one, and so are the Matrix cards. So he is not wrong; you just don't agree with him.

Get that attitude outta here.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> There are companies that do their own pre-binning and put those tested chips onto high-end cards. EVGA is one, and so are the Matrix cards. So he is not wrong; you just don't agree with him.
> 
> Get that attitude outta here.


Mmm, well, considering this is an AMD R9 390/X thread he's actually not wrong; EVGA doesn't make AMD cards, correct? As far as the R9 390s are concerned, he's right that no aftermarket manufacturer is clearly better at OCing. Hell, the Sapphire has 2x8-pin power and a fully custom PCB, and it's by far not the best overclocking card.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> There are companies that do their own pre-binning and put those tested chips onto high-end cards. EVGA is one, and so are the Matrix cards. So he is not wrong; you just don't agree with him.
> 
> Get that attitude outta here.


And your proof is where exactly? Nowhere; it's not true. Claiming one brand is better than another is BS, when it comes to overclocking that is. Obviously some have much better cooling than others. But in the end, it is just the silicon lottery whether your card will OC or not.


----------



## bluej511

Even Jayztwocents got one of those crazy high-end EVGA Kingpin cards and he was so disappointed in the OC on it; it really is luck of the draw.


----------



## Master0fBlunt

Have you all not heard of "The Silicon Lottery"....


----------



## mus1mus

Well, I don't need to give you proof. But I can point out the bad in you. Your attitude sux. You know it all!

Your card will overclock up to your willingness.







I am a testament.


----------



## Stige

Quote:


> Originally Posted by *mus1mus*
> 
> Well, I don't need to give you proof. But I can point out the bad in you. Your attitude sux. You know it all!
> 
> Your card will overclock up to your willingness.
> 
> 
> 
> 
> 
> 
> 
> I am a testament.


I know I know it all. Just facts really, no point denying them.

But really, the only reason you should buy a specific brand of card is 1. an awesome stock cooler, 2. aftermarket coolers. That's it.
You don't buy a brand because someone has a card from them that overclocks nicely; it's just luck...


----------



## mus1mus

It's more than just luck. It's the idea that, because certain cards did well for others, your odds with them can be better.

Gathering data before jumping to a conclusion is the way to go.

Now, If Sapphire has been having the issues that ASUS has, I believe people will also try to persuade people to stay away from them.

But then again, you know it all.


----------



## Stige

Sapphire cards have the best stock cooler so that is a reason alone to buy them. ASUS is garbage.


----------



## ronaldoz

Quote:


> Originally Posted by *mus1mus*
> 
> It's more than just luck. It's getting the idea that these certain cards did well for others that your odds can be better.
> 
> Gathering data before jumping into a conclusion is the way to go.
> 
> Now, If Sapphire has been having the issues that ASUS has, I believe people will also try to persuade people to stay away from them.
> 
> But then again, you know it all.


Hm, I chose the Sapphire Nitro because I noticed nice temps in the reviews/tests, same for the PowerColor. I've read about 'bad' temps on the MSI, and there is not much choice when getting an R9 390. The Gigabyte only has 2 fans now.


----------



## jodybdesigns

I didn't mean to get the hurt butt going. But per reviews, the Sapphires are excellent at cooling and pretty good overclockers. I haven't seen a user yet with a Nitro who can't hit 1100/1600 with practically stock volts. Sapphire is AMD's baby brand, so of course it is going to get the tender loving care that the other brands don't get. That's what happens when your name is ASUS: you mostly cater to Nvidia, and you expect those same coolers to cool off an R9 390. I have owned Sapphires since my 2600XT, owned two 4870's, then got two Sapphire 6870's that were 5870 rebadges, then I bought 5x Sapphire 7950's. I never had one fail, and I never had one that didn't overclock like crazy. Simply put, Sapphire delivers.

So whatever again..


----------



## Vellinious

Quote:


> Originally Posted by *bluej511*
> 
> Even Jayztwocents got one of those crazy kingpin high end EVGA cards and he was so disappointed in the OC on it, it really is luck of the draw.


Jay's a tool. I've never seen him hit an overclock that made me even think, "huh... that's an OK overclock." He took a KPE, a card meant for LN2 cooling, left it on air, and then tried to overclock it. He's an idiot, along with anyone that would buy that card and leave it on air. Maxwell runs better when it's cooler: the lower the temps, the higher the card will boost.

The ASIC quality of a card is a reading of the voltage leakage in the core. It IS the silicon lottery. It doesn't, however, account for the rest of the components on the card. The VRM quality, PCB quality, chokes, caps, memory type, power delivery, and about a hundred other things will play into how well a card will overclock.

With the Maxwell architecture, the ASIC quality will tell you quite a bit... especially since they don't really respond well to additional voltage.

EVGA bins the KPEs... obviously. They have to, or they wouldn't be able to sell specific ASIC-quality cards.


----------



## zorbyss

Quote:


> Originally Posted by *jodybdesigns*
> 
> I always used Acronis. Are these better?
> While you are in game. Turn off Full Screen and turn it back on.


Sorry, but that doesn't work. I tried updating the drivers to 16.3.1, but still no help. This thing is driving me crazy.


----------



## Stige

Quote:


> Originally Posted by *jodybdesigns*
> 
> I didn't mean to get the hurt butt going. But per reviews, the Sapphires are excellent at cooling, and pretty good overclockers. I haven't seen a user yet with a Nitro who can't hit 1100/1600 with practically stock volts. Sapphire is AMD's baby brand, so of course it is going to get the tender loving care that the other brands don't get. That's what happens when your name is ASUS, you mostly cater to Nvidia, and you expect those same coolers to cool off a R9 390. I have owned Sapphires since my 2600XT, owned 2 4870's, then got 2 Sapphire 6870's that were 5870 rebadges, then I bought 5x Sapphire 7950's. I never had one fail, I never had one that didn't overclock like crazy. Simply put, Sapphire delivers.
> 
> So whatever again..


That is just a load of crap. To begin with, any card will hit 1100/1600 at stock voltage anyway, so your point is irrelevant; 1100/1600 isn't much when 1050/1500 is stock. Stop spreading false information here, it doesn't do anyone any good.
Sapphire coolers are probably the best there are for AMD cards and have been for a while; I did own an HD7950 Vapor-X myself.

How well any card overclocks is just pure silicon lottery though.


----------



## ronaldoz

The card just arrived. Now I hope it has the Hynix memory. Sigh... They should just put one type in there.









About installation: I removed the AMD drivers in Device Manager, because I only did a minimal install. Do I need to delete more, and do I also need to reinstall Afterburner and Trixx before installing the R9 390?

I love it compared to my R9 290. That's a beast too. Having 3 DisplayPorts looks great to me, for example, and the card actually fits the white/black look of my PC.

*Update*
After the regular deletion, I've used the driver store in CMD to delete all AMD drivers manually.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> That is just a load of crap. To begin with, any card will hit 1100/1600 at stock voltage anyway so your point is irrelevant, 1100/1600 isn't much when 1050/1500 is stock. Stop spreading false information here, it doesn't do anyone any good.
> Sapphire coolers are propably the best there are for AMD cards and have been for a while, I did own a HD7950 Vapor-X myself.
> 
> How well any card overclocks is just pure silicon lottery though.


Silicon lottery is a word for losers.









It's a question of the user's willingness.

http://www.3dmark.com/fs/7925843

http://www.3dmark.com/fs/7964918

http://www.3dmark.com/fs/7836616


----------



## ronaldoz

Memory Info tells me my card is locked, but I got Hynix memory and 69,6% ASIC quality


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Silicon lottery is a word for losers.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a users' willingness question.
> 
> http://www.3dmark.com/fs/7925843
> 
> http://www.3dmark.com/fs/7964918
> 
> http://www.3dmark.com/fs/7836616


WOW WOW!!

Boy, that CPU really lets 2 or more cards fly!!

Single-card graphics score will look the same on about any system, but once you get 2 cards going and get into the 25k+ graphics score range, the Intel just lets 'em boogie!

I really can't wait to see how Zen does so I can decide on my next rig, because it may very well be an X99, or some other form of Intel, if Zen fails to impress.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> WOW WOW!!
> 
> Boy, that CPU really lets 2 or more cards fly!!
> 
> Single card graphics score will look the same on about any system, but once you get 2 cards going, and get into the 25k + graphics score range, and the Intel just lets em boogie!
> 
> I really can't wait to see how Zen does so I can determine my next rig, cause it may very well be an x99, or some other form of Intel if it fails to impress.


Yes, I want Zen to be spectacular too. Broadwell-E is coming with 10C/20T as well, so it's either of the two when I have the chance, or go for a (hopefully) cheaper Haswell octa-core.


----------



## ronaldoz

Btw, after a restart, Trixx shows +19mV. Even after a reset it shows +19mV.

https://www.reddit.com/r/3w8au6/got_a_sapphire_nitro_390_everything_runs_like_a/
popped up after searching Google. It makes me wonder if I should try to overclock later, because this weird stuff did damage to my old R9 290. The voltage / clock speeds need to stay as set.


----------



## jodybdesigns

Quote:


> Originally Posted by *ronaldoz*
> 
> Btw, after restart, Trixx show +19mV. After reset it shows +19mV.
> 
> __
> https://www.reddit.com/r/3w8au6/got_a_sapphire_nitro_390_everything_runs_like_a/
> popped up after searching Google. Makes me wonder if I should try to overclock later. Because this weird stuff did damage to my old R9 290. The voltage / clock speeds need to stay as set.


Are you overclocking in Crimson AND Trixx?


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> Are you overclocking in Crimson AND Trixx?


No, I only installed the drivers; I'm not using the AMD software, just Trixx. I'm using the Crimson drivers, yes.

Should you see some little red dots on the water drops in the Valley benchmark? At stock I see them in some water drops. I also noticed this on my R9 290. The clock speeds remain at max, not as that other Reddit post is telling. But that guy also got +19mV, as some others did as well.

Around 3:00 the red dots pop up in the rain:


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Yes, I want ZEN to be spectacular too. Broadwell E is coming with 10C/20T too. So it's either of the two when I have the chance. Or go for a cheaper (hopefully) Haswell Octa.


All I can say is.... buy AMD stock NOW. It's gone up 50% since February... lol They control 85% of the VR market share, and IF Zen is what we hope it is, we can expect things in the enthusiast DIY PC world to get VERY interesting!!!









I'm super excited.

Not to mention, DX12 is already showing very well for AMD, even in early implementations. I'm not sure why so many people are still calling it a "fairy tale" and whatnot. There are already titles using it, and there will be many more to come. I think people forget that NO ONE is counting on DX12 more than the console industry, and that whole market has the devs in its back pocket, so I am of the mindset that we will see DX12 sooner rather than later.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> All I can say is.... buy AMD stock NOW. It's gone up 50% since February... lol They control 85% of the VR market share, and IF Zen is what we hope it is, we can expect things in the enthusiast DIY PC world to get VERY interesting!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm super excited.
> 
> Not to mention, DX 12 is already showing very well for AMD, even in early implementation. I'm not sure why so many people are still calling it a "fairy tale" and what not. There are already titles using it, and there will be many more to come. I think people forget that NO ONE is counting on DX12 more than the console industry, and that whole market has the devs in their back pocket, so I am of the mindset that we will see DX12 sooner than later.


I said DX12 was a myth for a while. But that was a few months ago too; it took them forever to get it here. But so far, I am impressed. I tested Rise on my Thuban @ 3.6GHz with a 7870XT, 1080p, High with FXAA: I was getting 55fps max and 32fps minimum. That's hella impressive. 32fps min on a Phenom II?? Get out of here, that's amazing.
Quote:


> Originally Posted by *ronaldoz*
> 
> No, I only installed the drivers, and not using the AMD software, just Trixx. I'm using Crimson Drivers, yes.
> 
> Should you see some little red dots on the waterdrops in the Valley benchmark? At stock I see them in some waterdrops. I also noticed this on my R9 290. Clock speed remain max, not as that other Reddit post is telling. But that guy also got +19mV, as some other got as well.


Run Heaven, it's more stressful on the system. Everyone here will tell you that too.


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> I said DX12 was a myth for a while. But that was a few months ago too. It took them forever to get it here. But so far, I am impressed. I tested Rise on my Thuban @ 3.6ghz. With a 7870XT, 1080p - High with FXAA, I was getting 55fps max, and 32fps minimum. That's hella impressive. 32fps min on a Phenom II?? Get out of here, that's amazing.
> Run Heaven, it's more stressful on the system. Everyone here will tell you that too.


Thanks, Heaven runs fine.







Also, the temps are awesome. They are lower than on my R9 290 with Liquid Ultra on the chip, and those temps were already very low and awesome for the 290.

*Update*
Whatever I do, it's locking the voltage at +19mV. So I can change the clocks, but not the voltage. I hope that it's not really locked; otherwise I gotta return it, I guess, and get another.


----------



## spyshagg

Quote:


> Originally Posted by *mus1mus*
> 
> Silicon lottery is a word for losers.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a users' willingness question.
> 
> http://www.3dmark.com/fs/7925843
> 
> http://www.3dmark.com/fs/7964918
> 
> http://www.3dmark.com/fs/7836616


Hey, what's wrong with your X2 score?

http://www.3dmark.com/compare/fs/7964918/fs/7865388

check the clocks

cheers


----------



## jodybdesigns

Quote:


> Originally Posted by *ronaldoz*
> 
> No, I only installed the drivers, and not using the AMD software, just Trixx. I'm using Crimson Drivers, yes.
> 
> Should you see some little red dots on the waterdrops in the Valley benchmark? At stock I see them in some waterdrops. I also noticed this on my R9 290. Clock speed remain max, not as that other Reddit post is telling. But that guy also got +19mV, as some other got as well.
> 
> Around 3:00 the red dot pop up at the rain:


Quote:


> Originally Posted by *ronaldoz*
> 
> Thanks, Heaven runs fine.
> 
> 
> 
> 
> 
> 
> 
> Also the temps are awesome. They are lower then my R9 with Liquid Ultra on the chip, and those temps were already very low and awesome for the R9.
> 
> *Update*
> Whatever I do, it's locking the voltage at +19mV. So I can change the clocks, but not the voltage. I hope that it's not really locked, then I gotta return it I guess, and take another.


Doesn't that card have a dual BIOS? Have you checked that, if it does?

Also, try Afterburner for giggles. Also try DDU to uninstall everything completely, then reinstall again.


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> Doesn't that card have a dual bios? Have you checked that if it does.
> 
> Also, Try Afterburner for giggles. Also try DDU and uninstall everything completely, and reinstall again.


Thanks for your help. I've tried using AB and saving the clocks as a profile too. That helped. However, Trixx voltage changes will not stick, so I gotta do the overclock with AB. Very happy that this worked out tho. GPU-Z shows 1 BIOS version. I'm not sure how to check if there is another one.

PS. I deleted every AMD driver from the driver store in Windows, using pnputil.exe -e | more to list them, or pnputil.exe -f -d oemxx.inf for specific drivers.


----------



## Agent Smith1984

Are you making sure you have that power saving feature in 16.3 turned off?


----------



## jodybdesigns

Quote:


> Originally Posted by *ronaldoz*
> 
> Thanks for you help. I've tried using AB, and saving the clock as profile too. That helped. However, Trixx voltage changes will not remain. So I gotta do the overclock with AB. Very happy that this worked out tho. GPU-Z shows 1 bios version. I'm not sure how to check if there is another one.
> 
> PS. I deleted every AMD driver from the driver strore in Windows. Using pnputil.exe -e | more or pnputil.exe -f -d oemxx.inf for specific drivers.


Sounds like a bug in Trixx. Try using a different version. It could be holding the registry values for your R9 290. Try uninstalling Trixx, removing the registry values, and then doing a fresh install of Trixx.


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> Sounds like a bug in Trixx. Try using a different version. It could be holding the registry values for your R9 290. Try uninstalling Trixx, and removing the registry values. And then fresh install Trixx.


Alright, I will try that in a bit. I'm not an expert at editing the registry, though. Once I've done that, I will try the newest version again. If that doesn't fix the issue, I'll try an older version.








Quote:


> Originally Posted by *Agent Smith1984*
> 
> Are you making sure you have that power saving feature in 16.3 turned off?


No, I'm only using the driver, not the AMD software. So maybe the driver will force that? There are more people who got +19mV as stock voltage and can't clock it lower/higher. Now it's 90% fixed, because it's still happening in Trixx (but now it changes to the Afterburner voltage).


----------



## kizwan

Quote:


> Originally Posted by *ronaldoz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jodybdesigns*
> 
> Sounds like a bug in Trixx. Try using a different version. It could be holding the registry values for your R9 290. Try uninstalling Trixx, and removing the registry values. And then fresh install Trixx.
> 
> 
> 
> Allright, I will try that in a bit. I'm not a expert in removing the registry as well tho. When I did that, I will try the newest version again. If that not fixed the issue, I will try a older version.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Are you making sure you have that power saving feature in 16.3 turned off?
> 
> Click to expand...
> 
> No, I'm only using the driver, but not the AMD software. So maybe the driver will force that? There a more people who got +19mV as stock voltage, and can't clock it lower / higher. Now it's fixed for 90%, because it's still happening in Trixx. (but now, it change to the Afterburner voltage)
Click to expand...

If you have MSI AB installed, run these command in elevated command prompt.

Code:

cd "\Program Files (x86)\MSI Afterburner"
MSIAfterburner.exe /i2cd

Wait until you get confirmation dialog box. Then post here the *i2cdump.txt* file.

*Edit:* Much better way is dump your card BIOS using GPU-Z, then post the bios here.


----------



## Vellinious

Quote:


> Originally Posted by *spyshagg*
> 
> hey, whats wrong with your X2 score?
> 
> http://www.3dmark.com/compare/fs/7964918/fs/7865388
> 
> check the clocks
> 
> cheers


Driver versions could account for some of that.


----------



## ronaldoz

Quote:


> Originally Posted by *kizwan*
> 
> If you have MSI AB installed, run these command in elevated command prompt.
> 
> Code:
> 
> cd "\Program Files (x86)\MSI Afterburner"
> MSIAfterburner.exe /i2cd
> 
> Wait until you get confirmation dialog box. Then post here the *i2cdump.txt* file.
> 
> *Edit:* Much better way is dump your card BIOS using GPU-Z, then post the bios here.


Alright, this is the BIOS dump:

Hawaii.zip 100k .zip file


----------



## kizwan

Quote:


> Originally Posted by *ronaldoz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> If you have MSI AB installed, run these command in elevated command prompt.
> 
> Code:
> 
> cd "\Program Files (x86)\MSI Afterburner"
> MSIAfterburner.exe /i2cd
> 
> Wait until you get confirmation dialog box. Then post here the *i2cdump.txt* file.
> 
> *Edit:* Much better way is dump your card BIOS using GPU-Z, then post the bios here.
> 
> 
> 
> Allright, this is the BIOS dump
> 
> Hawaii.zip 100k .zip file
Click to expand...

Your BIOS by default has +18.75mV (~+19mV) set. That's why Trixx is showing +19mV.
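Side note on where the odd 18.75 figure comes from: it's three steps of a 6.25mV granularity, which is my assumption about the voltage controller's step size on these cards, not something confirmed in this dump. A quick sketch of how a requested offset would snap to such steps:

```python
# Hypothetical sketch: tools expose voltage offsets only in multiples of the
# VRM controller's step size (assumed 6.25 mV here). A "+19 mV" reading is
# then really the nearest step, +18.75 mV.
STEP_MV = 6.25  # assumed controller step size

def snap_offset(requested_mv: float) -> float:
    """Round a requested voltage offset to the nearest controller step."""
    return round(requested_mv / STEP_MV) * STEP_MV

print(snap_offset(19))   # 18.75 -> displayed as ~+19 mV
print(snap_offset(100))  # 100.0 -> already a multiple of the step
```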


----------



## battleaxe

Well, now I think it's my 290X that has flaked out, causing all my issues after all. I was able to make it freeze by plugging into just that card. It worked fine when I tried this before, but last night I tried again, and freeze city. So likely that's the culprit. It's under warranty, but I have not heard good things about Sapphire's RMA process.

Any advice before I try to oven-bake this thing? Or should I just RMA it and take a gamble on Sapphire? I'm not sure what to do.

Edit: it's a sad day when I'm more afraid of losing my card to a manufacturer's RMA process than I am of baking it in an oven. This is pathetic really.


----------



## ronaldoz

Quote:


> Originally Posted by *kizwan*
> 
> Your BIOS by default have +18.75mV (~+19mV). So that's why Trixx is showing +19mV.


Thanks for checking it out! So nothing weird going on here.







But I will use AB for now, because Trixx will not remember the changed voltage after a restart. When using AB for this, it is remembered and also shown in Trixx.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ronaldoz*
> 
> Thanks for checking it out! So nothing weird going on here.
> 
> 
> 
> 
> 
> 
> 
> But I will use AB for now, because Trixx will not remember the changed voltage at a restart. When using AB for this, it will be remembered, and also shown in Trixx.


I always found using Trixx for a startup OC is bad news... just asking for black screens, and there's also no AUX voltage control for memory clocking.


----------



## ronaldoz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I always found use trixx for startup OC is bad news..... just asking for black screens, and also no AUX voltage control for memory clocking.


Alright, I started on my 290 with AB, but it reset the voltages after restarting the system. That has happened since the new Crimson drivers. That's why I switched to Trixx, which probably did much damage to my card too. I mean, the software should remember the settings after a restart, absolutely not change voltages or anything, and let the other settings remain overclocked.


----------



## patriotaki

How much more FPS do you get from an OC on the R9 390?


----------



## ronaldoz

Well, I could go up to 1170MHz at stock voltage without artifacts. It already clocked higher than my R9 290 did at max. 1170/1500 was the max, but I could not get it stable (the card was acting strange). Max temp = 54C, max voltage = 1,273V. Because the 390's ASIC quality is 69,6%, the card may need more voltage than the 290 I used, which had 78% ASIC quality. The max power in = 277,5W. That happened in Firestrike. I have not raised the power limit yet.

Do core clock artifacts hurt more than memory artifacts, or is there really no difference? Gonna try to push the memory clock now.








Quote:


> Originally Posted by *patriotaki*
> 
> how much fps more do you get from an OC on the r9 390?


This review might give an idea. It's an overclock from 1010/1500 to 1150/1700.


----------



## patriotaki

Quote:


> Originally Posted by *ronaldoz*
> 
> Well, I could go up to 1170Mhz at stock without artifacts. It already clocked higher then my R9 290 on max. 1170 / 1500 was the max, but I could not get it stable (the card was acting strange). Max temp = 54C Max voltage = 1,273V. Because the 390's ASIC quality is 69,6%, the card maybe need more voltage then the 290 I used. It had 78% ASIC quality. The max Power In = 277,5w. That happened in Firestrike. I did not raise the power limit yet.
> 
> Does core clock artifacts hurt more, then memory, or isn't there really difference? Gonna try to push the memory clock now.
> 
> 
> 
> 
> 
> 
> 
> 
> This review might give a idea. It's a overclock from 1010 / 1500 to 1150 / 1700.


WHICH 390 DO YOU HAVE?


----------



## ronaldoz

Quote:


> Originally Posted by *patriotaki*
> 
> WHICH 390 YOU HAVE?


It's a Sapphire Nitro. I've put the core clock back to 1150MHz, because 1170MHz will artifact in Firestrike. It did not in Heaven, so I'll continue testing with Firestrike only now. Something odd happened: when I was testing 1520MHz, the drivers crashed, but there were no artifacts.


----------



## mus1mus

Quote:


> Originally Posted by *spyshagg*
> 
> hey, whats wrong with your X2 score?
> 
> http://www.3dmark.com/compare/fs/7964918/fs/7865388
> 
> check the clocks
> 
> cheers


I believe it's down to the modded BIOS and the drivers, and that my system is acting up.

GS2 in FS reacts to a modded BIOS quite well.

Same goes for 3DM11 GS4.


----------



## NastyAIDS

http://www.frozencpu.com/products/24242/ex-blc-1735/EK_Thermosphere_Universal_GPU_Waterblock_-_Acetal_Nickel_EK-Thermosphere_-_AcetalNickel.html?id=srocbJFh&mv_pc=350

OK, I got another question. Will this fit on my R9 390? I'm worried that the 1/4" threads won't be usable from the top, since the Strix PCB is larger at the top and may block them.


----------



## bluej511

Quote:


> Originally Posted by *NastyAIDS*
> 
> http://www.frozencpu.com/products/24242/ex-blc-1735/EK_Thermosphere_Universal_GPU_Waterblock_-_Acetal_Nickel_EK-Thermosphere_-_AcetalNickel.html?id=srocbJFh&mv_pc=350
> 
> Ok i got another question. Will this fit on my R9 390? Im worried that the 1/4 threads wont be usable from the top since the Strix pcb is larger from the top and may block them.


I may get that and mod my Alphacool block and put that on. I think someone tried this on their MSI board and had issues with it being blocked or something. You can always just add two 90° fittings and have the hoses come back up on top.


----------



## battleaxe

Any advice before I try to bake my failing 290X in the oven?

Or should I just RMA it and take a gamble on Sapphire? I'm not sure what to do.

I'm honestly more afraid to RMA than I am to bake it. Seems a larger risk to me.
Quote:


> Originally Posted by *NastyAIDS*
> 
> http://www.frozencpu.com/products/24242/ex-blc-1735/EK_Thermosphere_Universal_GPU_Waterblock_-_Acetal_Nickel_EK-Thermosphere_-_AcetalNickel.html?id=srocbJFh&mv_pc=350
> 
> Ok i got another question. Will this fit on my R9 390? Im worried that the 1/4 threads wont be usable from the top since the Strix pcb is larger from the top and may block them.


You can enter from the front or the back.









So, it will fit either way, unless you have two in Xfire, that is. Then you're in a pickle. I have those blocks BTW. Very nice.


----------



## NastyAIDS

OK, thanks. Gonna get one soon.


----------



## NastyAIDS

Oh yeah, I saw that forum. He wanted to keep his backplate on, and he needed to keep the metal support holding it on, I suppose, but he had to cut it. If that's the case, then I suppose I will just get two 90-degree fittings. Thanks.


----------



## battleaxe

Quote:


> Originally Posted by *NastyAIDS*
> 
> Oh yeah i saw that fourm. He wanted to keep his baackplate on and he needed to keep the metal support holding it on i suppose but he had to cut it. But if thats the case then i suppose i will just get 2 90 degree fittings. Thanks


You will need the optional mounting bracket for the 290/390 series though. It's like $7 extra.

Edit: this one, I believe.


----------



## ronaldoz

After some testing, my card could do 1150/1500 or 1140/1580 at stock voltage. But the latter does not reach the same FPS or benchmark scores. I've tested Heaven + Firestrike. I also noticed a 2 fps difference in Just Cause 2 (DX9, maybe). It's not much, but it looks like raising the core clock is much, much more effective. Is this normal?

1150 / 1500:
* Heaven 1600*900 avg *79,6 fps* 2004 score
* Firestrike 11819 (overall) *13742* 13533 5278

1140 / 1580:
* Heaven 1600*900 avg *78,4 fps* 1974 score
* Firestrike 11626 (overall) *13489* 13487 5184


----------



## jodybdesigns

Quote:


> Originally Posted by *ronaldoz*
> 
> After some testing, my card could get 1150 / 1500 or 1140 / 1580 at stock. But the last one, will not yet have the same fps or benchmark scores. I've tested Heaven + Firestrike. I also noticed a 2 fps difference in Just Cause 2 (DX9 maybe). It's not much, but it looks like raising core clock is much / much more effective. Is this normal?
> 
> 1150 / 1500:
> * Heaven 1600*900 avg *79,6 fps* 2004 score
> * Firestrike 11819 (overall) *13742* 13533 5278
> 
> 1140 / 1580:
> * Heaven 1600*900 avg *78,4 fps* 1974 score
> * Firestrike 11626 (overall) *13489* 13487 5184


Try to stay within your timing straps.

Strap end 400MHz (40 9C 00) , Range = 150-400MHz
Strap end 800MHz (80 38 01) , Range = 401-800MHz
Strap end 900MHz (90 5F 01) , Range = 801-900MHz
Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
Strap end 1125MHz (74 B7 01) , Range = 1001-1125MHz
Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
Strap end 1750MHz (98 AB 02) , Range = 1626-1750MHz

Either run 1500Mhz or 1625Mhz. Everything in between is pointless.
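For anyone who wants to sanity-check a memory clock against the strap table above, here's a minimal sketch using only the range upper bounds listed in this post:

```python
# Memory timing straps from the table above: a clock anywhere inside a
# range runs the timings belonging to that range's upper bound.
STRAPS = [400, 800, 900, 1000, 1125, 1250, 1375, 1500, 1625, 1750]

def strap_for(mem_clock_mhz: int) -> int:
    """Return the strap (range upper bound) whose timings a clock uses."""
    for upper in STRAPS:
        if mem_clock_mhz <= upper:
            return upper
    raise ValueError("clock is above the last strap in the table")

# 1520 MHz falls in the 1501-1625 range, so it runs the looser 1625 timings
# while adding barely any bandwidth over 1500 MHz -- the "pointless" zone.
print(strap_for(1500))  # 1500
print(strap_for(1520))  # 1625
```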


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> Try to stay within your timing straps.
> 
> Strap end 400MHz (40 9C 00) , Range = 150-400MHz
> Strap end 800MHz (80 38 01) , Range = 401-800MHz
> Strap end 900MHz (90 5F 01) , Range = 801-900MHz
> Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
> Strap end 1125MHz (74 B7 01) , Range = 1001-1125MHz
> Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
> Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
> Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
> Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
> Strap end 1750MHZ (98 AB 02) , Range = 1626-1750MHz
> 
> Either run 1500Mhz or 1625Mhz. Everything in between is pointless.


Wow, thanks. So we should use only these strap-end MHz values for the memory clocks? Doesn't the performance increase linearly from 1500 to 1625, for example? Or is this about something else?


----------



## tolis626

Quote:


> Originally Posted by *ronaldoz*
> 
> Wow, thanks. So we have to use only these strap end Mhz's for the memory clocks? Isn't the performance increasing linear from 1500 to 1625 for example? Or is this about something else?


Imagine it like RAM clockspeed and CAS latency on the CPU side. Say your RAM can do 2133MHz at CL9 and it needs CL10 for anything above it until 2400MHz, and then it needs CL11 until 2666MHz etc. If you run it at 2200MHz and CL10, you'll end up slower than both 2133MHz CL9 and 2400MHz CL10.

You simply want to run at the highest clockspeed possible with the lowest latency possible. If you're running 1520MHz, you're near 1500MHz's speed but using the timings for 1625MHz, which are looser. Hence, it'll be slower. There is some debate whether going to any speed over 1625MHz will improve performance regardless of timing straps, but now is not the time and I am not the person to solve this. Some say 1700MHz will be slower than 1625MHz, some say that even 1650MHz is faster than 1625MHz. Trying for yourself is your best bet, although it won't hurt to try 1625MHz and 1750MHz first and see what happens. If you can run 1750MHz right off the bat with no problems, then what's the point of wasting your time? Unless your card's memory is also a waste of time like mine.









Oh, and another thing. If your memory overclock is half-stable, you might actually lose performance as error correction kicks in. I've had that happen twice, where after 5-10 minutes or so my FPS started getting lower. But usually my drivers just crash and I get a nice black screen of death. Sigh... And no amount of aux voltage seems to help above 1675MHz. I hope you fare better.
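The RAM analogy above can be put into numbers with standard CAS-latency arithmetic (nothing card-specific, just tolis626's example figures):

```python
# Absolute CAS latency in nanoseconds for DDR memory:
# CL cycles * 2000 / data rate in MT/s.
def cas_ns(cl: int, rate_mts: int) -> float:
    return cl * 2000 / rate_mts

# The example: 2200 MT/s at CL10 has worse latency than 2133 CL9 and less
# bandwidth than 2400 CL10 -- the worst of both, just like an in-between
# memory clock that crosses into a looser timing strap.
print(round(cas_ns(9, 2133), 2))   # 8.44 ns
print(round(cas_ns(10, 2400), 2))  # 8.33 ns
print(round(cas_ns(10, 2200), 2))  # 9.09 ns
```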









PS : Anyone who may know, would memory overclocking benefit more from better cooling on the RAM VRMs or the RAM chips themselves? Currently debating if I should "upgrade" those too.


----------



## jodybdesigns

Quote:


> Originally Posted by *tolis626*
> 
> Imagine it like RAM clockspeed and CAS latency on the CPU side. Say your RAM can do 2133MHz at CL9 and it needs CL10 for anything above it until 2400MHz, and then it needs CL11 until 2666MHz etc. If you run it at 2200MHz and CL10, you'll end up slower than both 2133MHz CL9 and 2400MHz CL10.
> 
> You simply want to run at the highest clockspeed possible with the lowest latency possible. If you're running 1520MHz you're near 1500MHz's speed, but use the timings for 1625MHz, which are looser. Hence, it'll be slower. There is some debate whether going to any speed over 1625MHz will improve performance regardless of timings straps, but now is not the time and I am not the person to solve this. Some say 1700MHz will be slower than 1625MHz, some say that even 1650MHz is faster than 1625MHz. Trying for yourself is your best bet. Although it won't hurt to try 1625MHz and 1750MHz first and see what happens. If you can run 1750MHz right off the bat with no problems, then what's the point of wasting your time? Unless your card's memory is also a waste of time like mine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, and another thing. If your memory overclock is half-stable, you might actually lose performance as error correction kicks in. I've had that happen twice, where after 5-10 minutes or so my FPS started getting lower. But usually my drivers just crash and I get a nice black screen of death. Sigh... And no amount of aux voltage seems to help above 1675MHz. I hope you fare better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS : Anyone who may know, would memory overclocking benefit more from better cooling on the RAM VRMs or the RAM chips themselves? Currently debating if I should "upgrade" those too.


Could not have been explained better


----------



## diggiddi

Quote:


> Originally Posted by *ronaldoz*
> 
> After some testing, my card could get 1150 / 1500 or 1140 / 1580 at stock. But the last one, will not yet have the same fps or benchmark scores. I've tested Heaven + Firestrike. I also noticed a 2 fps difference in Just Cause 2 (DX9 maybe). It's not much, *but it looks like raising core clock is much / much more effective. Is this normal?
> 
> *1150 / 1500:
> * Heaven 1600*900 avg *79,6 fps* 2004 score
> * Firestrike 11819 (overall) *13742* 13533 5278
> 
> 1140 / 1580:
> * Heaven 1600*900 avg *78,4 fps* 1974 score
> * Firestrike 11626 (overall) *13489* 13487 5184


Yes, it is normal to see more gains from upping the core speed when memory speed is not the limiting factor, as with a 512-bit wide bus.
If the memory bus were 128 or even 256 bit, then greater gains would be had from overclocking the memory, since it would be the bottleneck.
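To put rough numbers on that, a quick back-of-the-envelope sketch, assuming the clock shown by overclocking tools is the quad-pumped GDDR5 command clock (as it is on these cards):

```python
# Theoretical GDDR5 bandwidth: (bus width in bits / 8) bytes per transfer,
# quad-pumped, at the tool-reported memory clock.
def bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * mem_clock_mhz * 4 / 1000  # GB/s

print(bandwidth_gbs(512, 1500))  # stock 390: 384.0 GB/s
print(bandwidth_gbs(256, 1750))  # a 256-bit card needs far higher clocks
```

With 384 GB/s on tap at stock, the 512-bit cards rarely starve for bandwidth, which is why core clock gains dominate.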


----------



## ronaldoz

Quote:


> Originally Posted by *tolis626*
> 
> Imagine it like RAM clockspeed and CAS latency on the CPU side. Say your RAM can do 2133MHz at CL9 and it needs CL10 for anything above it until 2400MHz, and then it needs CL11 until 2666MHz etc. If you run it at 2200MHz and CL10, you'll end up slower than both 2133MHz CL9 and 2400MHz CL10.
> 
> You simply want to run at the highest clockspeed possible with the lowest latency possible. If you're running 1520MHz you're near 1500MHz's speed, but use the timings for 1625MHz, which are looser. Hence, it'll be slower. There is some debate whether going to any speed over 1625MHz will improve performance regardless of timings straps, but now is not the time and I am not the person to solve this. Some say 1700MHz will be slower than 1625MHz, some say that even 1650MHz is faster than 1625MHz. Trying for yourself is your best bet. Although it won't hurt to try 1625MHz and 1750MHz first and see what happens. If you can run 1750MHz right off the bat with no problems, then what's the point of wasting your time? Unless your card's memory is also a waste of time like mine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, and another thing. If your memory overclock is half-stable, you might actually lose performance as error correction kicks in. I've had that happen twice, where after 5-10 minutes or so my FPS started getting lower. But usually my drivers just crash and I get a nice black screen of death. Sigh... And no amount of aux voltage seems to help above 1675MHz. I hope you fare better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS : Anyone who may know, would memory overclocking benefit more from better cooling on the RAM VRMs or the RAM chips themselves? Currently debating if I should "upgrade" those too.


Quote:


> Originally Posted by *diggiddi*
> 
> Yes, it is normal to see more gains from upping the core speed when memory speed is not the limiting factor, as with a 512-bit wide bus.
> If the memory bus were 128 or even 256 bit, then greater gains would be had from overclocking the memory, since it would be the bottleneck.


Thanks a lot guys. You really helped me get a better perspective on this. The card is running at 1130 / 1625 at the moment and probably needs a voltage increase now to overclock it more.







If the ratio stays like this, the card needs around 1220MHz on the core clock. 1220 / 1750 sounds awesome, but I'm not sure if that's realistic.
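The extrapolation above is plain ratio arithmetic; a quick check in Python (keeping a fixed core/memory ratio is just a rule of thumb, not a stability guarantee):

```python
# Keep the current stable core/memory ratio and scale it to the target
# memory clock (rule of thumb only, not a stability guarantee).
ratio = 1130 / 1625      # current stable combo
target_mem = 1750
print(round(ratio * target_mem))  # -> 1217, i.e. roughly 1220 core
```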


----------



## Kalistoval

Could someone read the info from my card's BIOS and report it back to me please?

Hawaii.zip 100k .zip file


----------



## Agent Smith1984

Quote:


> Originally Posted by *ronaldoz*
> 
> Thanks a lot guys. You really helped me get a better perspective on this. The card is running at 1130 / 1625 at the moment and probably needs a voltage increase now to overclock it more.
> 
> 
> 
> 
> 
> 
> 
> If the ratio will be like this, the card needs around 1220Mhz on the core clock. 1220 / 1750 sounds awesome, but not sure if that's realistic.


It's not very common to go beyond 1200 on a 300 series card. 290s on water will do 1250, even 1300+ in some cases, but we have yet to see anyone with a 390 on water that will do much over 1220 or so.....

They do seem to love 1150-1200 on regular voltage with air cooling though, which is not commonly seen on the 290 series.... It's just a maturation of the silicon manufacturing process. AMD did the same thing with their CPU's.....

1150/1625 is normal, 1175/1700+ is good, and 1200/1750 is great on these cards!

I've personally tested 4 different 390's, and only my first MSI 390 would do 1200 core; however, its memory quit at 1650. All the others would do around 1180 core with 1750 memory. Mind you, of the 4 cards, only my current card was an "X".

Mind you, I'm talking daily clocks, not benching. My 390X will bench at 1205 at +100mV in Firestrike.


----------



## m70b1jr

If I could stop black screening when adding voltage, I'm pretty sure I could hit above 1200mhz


----------



## Agent Smith1984

Okay, so here it is, 1700 flat in Heaven!!! NO ARTIFACTS WHATSOEVER, ON AIR!!!


----------



## ronaldoz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, so here it is, 1700 flat in Heaven!!! NO ARTIFACTS WHATSOEVER, ON AIR!!!


You got a nice card man! You are on water, right? At +38mV I got 1160 / 1660 for the best results, but I'm trying to get the memory as high as possible at 1140 core. So it will take a while before I could even get to 1200. I'll need to be lucky.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ronaldoz*
> 
> You got a nice card man! You are on water, right? At +38mV I got 1160 / 1660 for the best results, but I'm trying to get the memory as high as possible at 1140 core. So it will take a while before I could even get to 1200. I'll need to be lucky.


Na, I'm on air.... It's 1080p stable at 1200/+100mV with 100% fan, but 4K gives me another 6C and artifacts, so at 4K I'm only good for around 1175/+60mV.

1080 is rock solid though... Would love to see how water does, but this card is a placeholder until Polaris for me, no point in buying a block right now... For me personally anyways.


----------



## m70b1jr

Does anyone here think installing an NZXT G10 would be worth it? I'm on liquid, and one thing I hate about the AIO I got for it is its terrible appearance. Maybe a G10 will spice things up with better VRM cooling and better looks?


----------



## ronaldoz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Na, I'm on air.... It's 1080p stable at 1200/+100mV with 100% fan, but 4K gives me another 6C and artifacts, so at 4K I'm only good for around 1175/+60mV.
> 
> 1080 is rock solid though... Would love to see how water does, but this card is a placeholder until Polaris for me, no point in buying a block right now... For me personally anyways.


Ah, I see. But it's an awesome overclock! What memory clock are you using for 4K? Over here it's the same situation. My 290 was acting weird, so I wanted a decent one that would overclock, and I ended up with a 390. It's doing awesome compared to the 290, though. I could only dream about these clocks on that card. Also, I noticed artifacts could happen at 1440p while being stable at 1080p, so I decided to test only at 1440p for now.


----------



## Kalistoval

Quote:


> Originally Posted by *m70b1jr*
> 
> Does anyone here think installing an NZXT G10 would be worth it? I'm on liquid, and one thing I hate about the AIO I got for it is its terrible appearance. Maybe a G10 will spice things up with better VRM cooling and better looks?


I have one and an X40 to go with it. It's just a bracket; I did change out the fan that comes with the G10 for a better fan for my VRMs.


----------



## kizwan

Quote:


> Originally Posted by *Kalistoval*
> 
> Could some one read the info of my cards bios and report it back to me please.
> 
> Hawaii.zip 100k .zip file


Nothing interesting unless you have specific things you're looking for.


----------



## m70b1jr

Quote:


> Originally Posted by *Kalistoval*
> 
> I have one and a X40 to go with it. Its just a bracket, I did change out the fan that comes with the g10 with a better fan for my vrms.


Mine has a heatsink with a small fan (40mm maybe?) and it's ugly. Maybe the G10 can spice things up. The $70 on Newegg for the white one is pretty ridiculous though.


----------



## patriotaki

Is the only way to test if the GPU is stable at a specific OC to run games or benchmarks?


----------



## Noirgheos

Quote:


> Originally Posted by *patriotaki*
> 
> the only way to test if the gpu is stable in a specific OC is to run games or benchmarks?


Yep, running Valley or Heaven for a while and looking for artifacts is the way to go. Max out the settings and look closely for artifacts. If nothing pops up, it should be stable in most, if not all, games.


----------



## Stige

Quote:


> Originally Posted by *Noirgheos*
> 
> Yep, running Valley or Heaven for a while and looking for artifacts is the way to go. Max out the settings and look closely for artifacts. If nothing pops up, it should be stable in most, if not all, games.


Valley is bad for checking stability; I can run way higher clocks in Valley than in anything else. Heaven is apparently more taxing, and Firestrike is certainly way more taxing than Valley.

Best way to check stability is obviously to just play games.


----------



## Agent Smith1984

Yeah, Firestrike has my vote for most taxing bench to check artifacts, though it may not reveal actual instabilities.

I actually run Firestrike graphics test 2 in windowed mode @ 3200x1800 with AB off to the right monitoring temps when I check for actual artifact free clocks.

1080p doesn't stress my card enough to indicate how it may actually act during 4K gaming.

If you really want to test stability of both the CPU/GPU overclocked and the PSU, then run IBT on high while running a high res window of firestrike graphics test 1 or 2 loop.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, Firestrike has my vote for most taxing bench to check artifacts, though it may not reveal actual instabilities.
> 
> I actually run Firestrike graphics test 2 in windowed mode @ 3200x1800 with AB off to the right monitoring temps when I check for actual artifact free clocks.
> 
> 1080p doesn't stress my card enough to indicate how it may actually act during 4K gaming.
> 
> If you really want to test stability of both the CPU/GPU overclocked and the PSU, then run IBT on high while running a high res window of firestrike graphics test 1 or 2 loop.


Smitty, care to run some benches for us on Reds? We need you there. X2 should really do a lot.

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1100_50#post_25013225


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Smitty, care to run some benches for us on Reds? We need you there. X2 should really do a lot.
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1100_50#post_25013225


I'd love to!

Will probably work on it over the weekend (took Monday off for a 4-dayer)

Mean time, I'll be reading through the rules and such....

X2?


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd love to!
> 
> Will probably work on it over the weekend (took Monday off for a 4-dayer
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Mean time, I'll be reading through the rules and such....
> 
> X2?


2 Cards?

X1 will do too. Tessellation off for better scores, W8 for Fire Strike.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> 2 Cards?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> X1 will do too. Tesselation off for better scores, W8 for Fire Strike.


Okay, so we can run tess off?? Sweet. Wish I had two cards, I'd definitely add some help


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, so we can run tess off?? Sweet. Wish I had two cards, I'd definitely add some help


It's fine. X1 scores matter as much as X4s. Especially when we are catching up.


----------



## ronaldoz

Done a lot of benching today, and got 5 profiles to continue with. The max I can get with +100mV / +50% power limit is 1185 / 1750. Before using more than +100mV, I would like to test games and see what different overclocks will do for the framerate, temps and power usage.


----------



## battleaxe

I just picked up two more XFX 390x today from MC and another power supply. So sick of this problem. Hope this fixes the issue.

Whatever isn't running right is going back.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I just picked up two more XFX 390x today from MC and another power supply. So sick of this problem. Hope this fixes the issue.
> 
> Whatever isn't running right is going back.


So you are running a total of three, or are you testing two more and taking all but one back?

(2) 390X's is a wicked setup wit dat 8GB VRAM!!!


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So you are running a total of three, or are you testing two more and taking all but one back?
> 
> 2) 390X is a wicked setup wit dat 8GB VRAM!!!


I'm going to install the two new ones and see if my problem persists. If not, then I know it's one of the old cards. Then I'll tear my blocks apart, etc... and figure out which one. Then RMA or return whatever is acting up. I got the extra PSU just to give my 1000 watt PSU a break, as I'm sure pushing two of these in Xfire at 1200+ is not very nice to it long term. Gonna add an AX860 for the GPUs to give it a break.









In the end I'll have 2 390X for sure. One way or another.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I'm going to install the two new ones and see if my problem persists. If not, then I know it's one of the old cards. Then I'll tear my blocks apart, etc... and figure out which one. Then RMA or return whatever is acting up. I got the extra PSU just to give my 1000 watt PSU a break, as I'm sure pushing two of these in Xfire at 1200+ is not very nice to it long term. Gonna add an AX860 for the GPUs to give it a break.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In the end I'll have 2 390X for sure. One way or another.


Sorry if I missed it, but what is the problem you're getting? Stutters or clock drops?


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sorry if I missed it, but what was the problem you are getting? Stutters or clock drops?


Neither. Freezing and crashing while in Xfire. Only in Xfire. I suspect my 290x is toast though. We will see.

If it is, should I oven bake it or RMA? Always wanted to try to oven bake a card to bring it back to life. Heard of one guy who has done that over ten times and each time was successful. Sounds kinda fun.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Neither. Freezing and crashing while in Xfire. Only in Xfire. I suspect my 290x is toast though. We will see.
> 
> If it is, should I oven bake it or RMA? Always wanted to try to oven bake a card to bring it back to life. Heard of one guy who has done that over ten times and each time was successful. Sounds kinda fun.


You can try the oven bake. My brother did it to an old GTX280 one time and it worked I believe. I tried it years ago on a 7800 GTX and had no luck... not had a dead card since then to try it again (cross fingers I won't have to either).

I advise a simple RMA though, because you stand a better chance of getting a 390 in its place! Plus, it seems that once you go oven bake, you stay oven bake... as in, you may have to do it every 3 months to a year from what I've read.


----------



## Vellinious

I wouldn't want to trade a 290X for a 390 in an RMA process....I'd feel pretty ripped off.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> I wouldn't want to trade a 290X for a 390 in an RMA process....I'd feel pretty ripped off.


Not for a crossfire setup though..... he is running an 8GB 390X now, with a 4GB 290X..... he'd see comparable performance with a 390X + 390 as he would with a 390X + 290X (maybe miss out on 2-5% performance if the 290x is a great overclocker), but he will have a full 8GB frame buffer to use. Not that you need 8GB for everything, but with that much core power, you can certainly crank things up high enough to break 4GB at high resolutions.....

I've tested mixed and matched combinations of both cards, and to me, the best results were with 390X + 390 (I didn't have another 390X at the time to test with) over (2) 290X's, but some of that has to do with the fact I was on air cooling and limited to 1150-1200MHz overclocks. It'd really be nice if they could get him a 390X in its place instead though.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not for a crossfire setup though..... he is running an 8GB 390X now, with a 4GB 290X..... he'd see comparable performance with a 390X + 390 as he would with a 390X + 290X (maybe miss out on 2-5% performance if the 290x is a great overclocker), but he will have a full 8GB frame buffer to use. Not that you need 8GB for everything, but with that much core power, you can certainly crank things up high enough to break 4GB at high resolutions.....
> 
> I've tested mix and matched combinations of both cards, and to me, the best results were with 390X + 390 (didn't have another 390x at the time to test with), over (2) 290x's, but some of that has to do with the fact I was on air cooling, and limited to 1150-1200MHz overclocks. It'd really be nice if they could get him a 390x in it's place instead though.


Amen to that brother...









But honestly I have a beast (not a Mus1 kind of beast, mind you) 290X. It does over 1270/1750 for benching.

So I'd love to save it if I could. Even if to put in my rig that has a 290 in it. And sell the old 290.


----------



## Scorpion49

Well, I couldn't stand it any more. I had to get this card and see what's up. Just like I presumed, it is identical to a 290X in every way except the memory ICs (this is literally an AMD reference card, presumably as sold to OEMs like Dell or HP; it even has the bracket they use on those included in the box). ASIC is OK at 72.1%. I got this one mainly because I hate the looks of all of the "aftermarket" coolers, and in my experience with both the XFX DD edition and the MSI Gaming, they are just as loud as the reference blower anyways.

So I got the card and ordered a Raijintek Morpheus cooler that is on sale and will match my black/silver theme while being much cooler and quieter. I'll add pics of the PCB when the cooler gets here tomorrow and I take it apart.


----------



## Noirgheos

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I couldn't stand it any more. I had to get this card and see what's up. Just like I presumed, it is identical to a 290X in every way except the memory ICs (this is literally an AMD reference card, presumably as sold to OEMs like Dell or HP; it even has the bracket they use on those included in the box). ASIC is OK at 72.1%. I got this one mainly because I hate the looks of all of the "aftermarket" coolers, and in my experience with both the XFX DD edition and the MSI Gaming, they are just as loud as the reference blower anyways.
> 
> So I got the card and ordered a Raijintek Morpheus cooler that is on sale and will match my black/silver theme while being much cooler and quieter. I'll add pics of the PCB when the cooler gets here tomorrow and I take it apart.


My Nitro 390X is pretty damn quiet.

Also, guys... NO GIMPWORKS IN DARK SOULS 3!!!! Only benchmark out though is with a guy who has a 980. At 1080p maxed out he almost never goes over 60% usage... Feeling good for my 390X.


----------



## Scorpion49

Quote:


> Originally Posted by *Noirgheos*
> 
> My Nitro 390X is pretty damn quiet.
> 
> Also, guys... NO GIMPWORKS IN DARK SOULS 3!!!! Only benchmark out though is with a guy who has a 980. At 1080p maxed out he almost never goes over 60% usage... Feeling good for my 390X.


Nitro is too long, can't fit it. Also, I don't like sapphire after the massive hassle they gave me over RMA'ing an R9 Fury.


----------



## jdorje

Why is there no hybrid 390? And only one 390x...


----------



## yuannan

Is this a good OC?

1200/1750 @ +75mv and +50%
Sapphire TriX 390x, Stock cooler on full blast.

Card hit 68C max under firestrike at nearly 1.3V.

Is 1.3V safe, or should I lower it?

Also please add me to members list.


----------



## jdorje

A 1.3V VID is quite safe, barely higher than many cards' stock.

68C is really low, but then firestrike doesn't get hot.


----------



## yuannan

Quote:


> Originally Posted by *jdorje*
> 
> 1.3V vid is quite safe, barely higher than many cards stock.
> 
> 68C is really low, but then firestrike doesn't get hot.


What's the max I should go up to?

For 24/7 use and just for a benchmark? Thinking of going past 13k once I get the time.


----------



## Noirgheos

Quote:


> Originally Posted by *Scorpion49*
> 
> Nitro is too long, can't fit it. Also, I don't like sapphire after the massive hassle they gave me over RMA'ing an R9 Fury.


Yeah their warranty is ****, but their coolers are the best.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> 1.3V vid is quite safe, barely higher than many cards stock.
> 
> 68C is really low, but then firestrike doesn't get hot.


Well, most cards at stock range from 1.225V to 1.275V as far as I know. Thing is, that means squat, as it's just the DPM7 voltage. If you account for vdroop you end up quite a bit lower. On my card, a 1.275V part mind you, at stock under load it draws about 1.2V, maybe a tad higher. To actually use 1.3V under load I have to go over +100mV using TriXX (not much higher, but that's beside the point). This also seems to be affected by VRM temps, but that's a whole different discussion.

Now, I know you said VID, but I don't know what yuannan meant. If his card is a 1.225V part then +75mV lands him right at 1.3V VID. Under load, however, I'd expect him to hit about 1.25V. If his card can do 1200/1750MHz at 1.25V, color me impressed. Mine needs 1.3V real voltage. Maybe a wee bit lower, like 1.285-1.29V, but whatever.









Another thing that came to my attention while searching about memory overclocking is that some people around the web claim that Hawaii responds really well to overclocking when following a certain "golden ratio" (which is 1.25-1.4 I believe, depending on the particular GPU) between core and VRAM clocks. Is that actually a thing with the 300 series too? Was it ever a thing or is it just a bunch of bull excrement?
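The VID-plus-offset-minus-vdroop arithmetic above can be sketched like this; the 50mV droop figure is an illustrative assumption, not a measurement from any particular card:

```python
# Load voltage = DPM7 VID + software offset - vdroop (all illustrative).
def load_voltage(vid_v, offset_mv, droop_mv):
    return vid_v + offset_mv / 1000 - droop_mv / 1000

# A 1.225V part at +75mV lands at a 1.3V VID...
print(round(1.225 + 0.075, 3))                 # -> 1.3
# ...but with ~50mV of droop the card only sees about 1.25V under load:
print(round(load_voltage(1.225, 75, 50), 3))   # -> 1.25
```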


----------



## bluej511

I don't mind Sapphire; given that my 7850 has had 0 issues after 2 years, I got their Nitro R9 390. MSI on the other hand... my 5770 has a fan that crapped out.


----------



## jdorje

I know VID technically means the DPM table, but I just meant what he had the voltage set to. The screenshot showed +75mV, so unless there are BIOS mods it should be around 1.3V. And 68C on stock cooling can't be much higher than that, while 68C on water/AIO is probably impossible.


----------



## ronaldoz

Does anyone know if the Sapphire Nitro's VRMs are cooled by the heatsink, or could they be improved? I'm having super nice temps on the GPU, but the VRM is like +10C more.


----------



## bluej511

The Nitro heatsink touches the VRMs via thermal pads. I think mine used to stay in the 60C range for both VRMs.


----------



## ronaldoz

Quote:


> Originally Posted by *bluej511*
> 
> The Nitro heatsink touches the VRM and thermal pads. I think mine used to stay in the 60C for both VRMs.


Hm, alright. The GPU chip is around 55C, for example, but VRM 2 is at 65C. Maybe I used a bit too much +Aux voltage for one overclock before. It's at +0mV at the moment (more Aux voltage made the VRM hotter).


----------



## Agent Smith1984

Reference card is beautiful in my opinion, want to see two in cf..


----------



## Agent Smith1984

Quote:


> Originally Posted by *yuannan*
> 
> 
> 
> is this a good OC?
> 
> 1200/1750 @ +75mv and +50%
> Sapphire TriX 390x, Stock cooler on full blast.
> 
> Card hit 68C max under firestrike at nearly 1.3V.
> 
> Is 1.3V safe? or should i lower it?
> 
> Also please add me to members list.


Awesome clock for 390 series, I'll add you tomorrow

Everything is safe at 68!


----------



## diggiddi

Quote:


> Originally Posted by *Scorpion49*
> 
> Nitro is too long, can't fit it. Also, I don't like sapphire after the massive hassle they gave me over RMA'ing an R9 Fury.


Nice looking card, how cool is it and what's the width?


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Reference card is beautiful in my opinion, want to see two in cf..


Yeah, but it's definitely reference card cooling. 94C all day long; surprisingly it's not as noisy as a 290X, and the VRM cooling is different so they don't get so hot (this is no louder than my old MSI 390 Gaming 8G card). It stays at the 1050MHz frequency all the time, no throttling, and the fan speed hasn't gone above 34%.

Quote:


> Originally Posted by *diggiddi*
> 
> Nice looking card how cool is it an what's the width?


It's exactly the same as a reference 290X with a different shroud. The heatsink is slightly different as well, but I'll have to take it apart to see by how much.


----------



## diggiddi

Quote:


> Originally Posted by *tolis626*
> 
> Well, most cards at stock range from 1.225V to 1.275V as far as I know. Thing is, that means squat, as it's just the DPM7 voltage. If you account for vdroop you end up quite a bit lower. On my card, a 1.275V part mind you, at stock under load it draws about 1.2V, maybe a tad higher. To actually use 1.3V under load I have to go over +100mV using TriXX (not much higher, but that's beside the point). This also seems to be affected by VRM temps, but that's a whole different discussion.
> 
> Now, I know you said VID, but I don't know what yuannan meant. If his card is a 1.225V part then +75mV lands him right at 1.3V VID. Under load, however, I'd expect him to hit about 1.25V. If his card can do 1200/1750MHz at 1.25V, color me impressed. Mine needs 1.3V real voltage. Maybe a wee bit lower, like 1.285-1.29V, but whatever.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another thing that came to my attention while searching about memory overclocking is that some people around the web claim that Hawaii responds really well to overclocking when following a certain "golden ratio" (which is 1.25-1.4 I believe, depending on the particular GPU) between core and VRAM clocks. Is that actually a thing with the 300 series too? Was it ever a thing or is it just a bunch of bull excrement?


I think this was just the general OC ratio of core to memory that most 290/X cards fell between.
It will be much higher for Grenada than Hawaii due to higher memory clocks (1700MHz+).


----------



## christoph

Quote:


> Originally Posted by *ronaldoz*
> 
> Hm, alright. The GPU chip is around 55C, for example, but VRM 2 is at 65C. Maybe I used a bit too much +Aux voltage for one overclock before. It's at +0mV at the moment (more Aux voltage made the VRM hotter).


Those are the same readings I get from mine; the VRM 2 is like 10C hotter than the GPU, and that's why I'm going to replace the thermal pads with Fujipoly ones.


----------



## Agent Smith1984

Okay, so take overclocking out of the equation and you end up with a nice looking reference blower card. What I'm getting at is that they're two-slot cards, they fit together well, and these cards can run in Crossfire all day long in a dual GPU configuration; even if they run at 90C you're good.

People forget that these cards are okay at 90C. How does yours do at stock? Is it really 94C? Does it have the "uber" switch on it?


----------



## ronaldoz

Quote:


> Originally Posted by *christoph*
> 
> that's the same readings I get from mine, the VRM 2 is like 10C hotter than the GPU, and that's why I'm going to replace the thermal pads with Fujipoly thermal pads


Sounds good! What kind of Fujipoly will you use? Do you think you could still get warranty coverage after replacing them? I would like to replace the thermal paste as well, even though the temperatures are great. Still curious if it helps a bit.


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, so take overclocking out of the equation and you end up with a nice looking reference blower card but what I'm getting at is they're two slot cards, they fit together well and these cards can run in Crossfire all day long in a dual gpu configuration even if they run at 90 Celsius and you're good.
> 
> People forget that these cards are okay at 90c, how does yours do at stock? Is it really 94c? Does it have the "uber" switch on it?


It looks like it does have a switch; not sure if it's the normal/uber one or just a dual BIOS. It maintains clocks without going over 40% fan speed anyways, so I think it might just be dual BIOS. IIRC the uber switch on the original 290X just allowed the fan cap to move up to 55%. And yeah, these would be good for CF; you could even do 4-way with them if you wanted. My cooler will be in sometime today, so I'll post pics of what the heatsink on this thing looks like; looking inside the shroud I can tell it's a little different.


----------



## jodybdesigns

Quote:


> Originally Posted by *ronaldoz*
> 
> Sounds good! What kind of Fujipoly will you use? Do you think you could still get warranty coverage after replacing them? I would like to replace the thermal paste as well, even though the temperatures are great. Still curious if it helps a bit.


Told you they were awesome. You don't see people talking about the Nitro here because they're too busy gaming, not here complaining and begging for help. People can hate me all day; I stand by Sapphire, but the $$$ of the PowerColor got me. But I also got my PCS+ 390 for $275. The Nitro was $329.


----------



## bluej511

My Alphacool keeps my VRMs at 59/61C; adding a couple of 140mm intake fans will drop it even more. My Nitro was nice but noisy. I went from 60 or so dB to low 40s dB with TEN fans lol.


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> Told you they were awesome. You don't see people talking about the Nitro here because they are too busy gaming and not here complaining begging for help. People can hate me all day. I stand by Sapphire, but the $$$ of the Powercolor got me. But I also got my PCS+ 390 for $275. The Nitro was $329.


Ye, it's behaving pretty awesome. I like how cool the temps are, and how quiet it can be. Having the fans at 60% results in around 55C in game, and I can't hear them because of the game sound.


----------



## rdr09

Quote:


> Originally Posted by *bluej511*
> 
> My alphacool keeps my vrms at 59/61 will be adding a couple of 1400mm intake fans drop it even more. My nitro was nice but noisy. I went from 60 or so dB to low 40s dB with TEN fans lol.


It does. You think it can help beat my 290 in this thread?

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd

EDIT:

https://www.techpowerup.com/downloads/Benchmarking/Futuremark/

Pick Setup . . .

http://www.cpuid.com/softwares/cpu-z.html

Use this as a background through the duration . . .


----------



## Agent Smith1984

Quote:


> Originally Posted by *jodybdesigns*
> 
> Told you they were awesome. You don't see people talking about the Nitro here because they are too busy gaming, not here complaining and begging for help. People can hate me all day. I stand by Sapphire, but the $$$ of the PowerColor got me. But I also got my PCS+ 390 for $275. The Nitro was $329.


I'm a big fan of Sapphire also. NEVER had a bad experience with any Sapphire product, EVER....

The first Tri-X cooler on the 290's, and even the latest version on the Nitro/Tri-X 390 series, is superb. PowerColor I can't speak for, but I've read a hundred accounts of happy PCS+ users on the 290 series and have no reason to believe the newer one is any different.

I think any of the 390's are fine EXCEPT Asus and Gigabyte. I say Asus because of their crap coolers, and Gigabyte because of their locked voltage. Other than those two, everyone seems to be fine with their cards (though some users get high temps on the MSI cards, and some don't).


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> Well I couldn't stand it any more. I had to get this card and see what's up; just like I presumed, it is identical to a 290X in every way except the memory IC's (this is literally an AMD reference card, presumably as sold to OEM's like Dell or HP; it even has the bracket they use on those included in the box). ASIC is ok at 72.1%. I got this one mainly because I hate the looks of all of the "aftermarket" coolers, and in my experience with both the XFX DD edition and the MSI Gaming, they are just as loud as the reference blower anyways.
> 
> So I got the card and ordered a Raijintek Morpheus cooler that is on sale and will match my black/silver theme while being much cooler and quieter. I'll add pics of the PCB when the cooler gets here tomorrow and I take it apart.


That's a really nice and clean looking card IMO.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not for a crossfire setup though..... he is running an 8GB 390X now, with a 4GB 290X..... he'd see comparable performance with a 390X + 390 as he would with a 390X + 290X (maybe miss out on 2-5% performance if the 290x is a great overclocker), but he will have a full 8GB frame buffer to use. Not that you need 8GB for everything, but with that much core power, you can certainly crank things up high enough to break 4GB at high resolutions.....
> 
> I've tested mix and matched combinations of both cards, and to me, the best results were with 390X + 390 (didn't have another 390x at the time to test with), over (2) 290x's, but some of that has to do with the fact I was on air cooling, and limited to 1150-1200MHz overclocks. It'd really be nice if they could get him a 390x in it's place instead though.


Edit:

Okay, so it was in fact my 290x that has died. Bummer. It was a nice one too. Would hit just shy of 1300core and over 1700 on RAM. Sucks to see her go. Sucks so much. She was stronger than every 390X I have seen so far. Did well over 1730 on Heaven with no tess mods. I haven't seen any 390x beat that. But it was limited by 4GB of RAM, so I guess I will live with the new 390x.

BTW: tested the new XFX 390x that I bought (I picked up two of them and plan to return one)

Both of them did over 1220mhz before artifacts on air. Both hit 1750 mem clocks. 1225mhz and I started to see a few artifacts.

So XFX is really doing a nice job (are they binning?). That, or all the chips are now getting a lot better. The last three cards I have bought have all been XFX 390x and they all do almost exactly the same around 1220mhz core 1750RAM before the artifacts start.

I've got them under water now, and they are humming along.

I'll get some updates on how high they can go once I get some time. My old XFX390x was able to hit 1270/1750 (on water and low ambient) before artifacts at only 100mv extra. I plan to try a bit more volts on Trixx to see if that helps at all. Will post some screenies of results along the way.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> That's a really nice and clean looking card IMO.
> Edit:
> 
> Okay, so it was in fact my 290x that has died. Bummer. It was a nice one too. Would hit just shy of 1300core and over 1700 on RAM. Sucks to see her go. Sucks so much. She was stronger than every 390X I have seen so far. Did well over 1730 on Heaven with no tess mods. I haven't seen any 390x beat that. But it was limited by 4GB of RAM, so I guess I will live with the new 390x.
> 
> BTW: tested the new XFX 390x that I bought (I picked up two of them and plan to return one)
> 
> Both of them did over 1220mhz before artifacts on air. Both hit 1750 mem clocks. 1225mhz and I started to see a few artifacts.
> 
> So XFX is really doing a nice job (are they binning?). That, or all the chips are now getting a lot better. The last three cards I have bought have all been XFX 390x and they all do almost exactly the same around 1220mhz core 1750RAM before the artifacts start.
> 
> I've got them under water now, and they are humming along.
> 
> I'll get some updates on how high they can go once I get some time. My old XFX390x was able to hit 1270/1750 (on water and low ambient) before artifacts at only 100mv extra. I plan to try a bit more volts on Trixx to see if that helps at all. Will post some screenies of results along the way.


I am almost positive now that XFX is either binning better than all other brands, OR they are using old stock Hawaii cores in place of the newer "Grenada" chips.....

I will NEVER use Trixx again..... I experimented with some 150mv+ overclocking last night, and one thing led to another: I got a black screen and could not boot into Windows anymore. I had to go into safe mode, wipe the driver completely, wipe AB (I had a Windows startup profile on it), and DDU everything before it would ever boot again. I've beat the hell out of this card with Afterburner at 100mv or less, but using over 100mv on these seems to have absolutely no positive impact on clocking. I think with 50mv I got around 10mhz more on the core and the score stayed the same. Water may respond differently, but I can clearly tell these Grenada cores do NOT like voltage the way their predecessors do......

Good to see XFX breaking 1200 on most cards.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am almost positive now that XFX is either binning better than all other brands, OR they are using old stock Hawaii cores in place of the newer "Grenada" chips.....
> 
> I will NEVER use Trixx again..... I experimented with some 150mv+ overclocking last night, and one thing led to another: I got a black screen and could not boot into Windows anymore. I had to go into safe mode, wipe the driver completely, wipe AB (I had a Windows startup profile on it), and DDU everything before it would ever boot again. I've beat the hell out of this card with Afterburner at 100mv or less, but using over 100mv on these seems to have absolutely no positive impact on clocking. I think with 50mv I got around 10mhz more on the core and the score stayed the same. Water may respond differently, but I can clearly tell these Grenada cores do NOT like voltage the way their predecessors do......
> 
> Good to see XFX breaking 1200 on most cards.


Yeah, I don't know if I'm just getting lucky or not, but I'll take it. Three cards now that do over 1200 is a happy time for me.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, I don't know if I'm just getting lucky or not, but I'll take it. Three cards now that do over 1200 is a happy time for me.


You're not the first either..... I have read of a few others getting 1200 on air also.

Bilko was getting 1250 on water I believe, maybe benching higher.

I swear they must be using old cores, why would XFX spend so much time/money to bin better than the other manufacturers??
Who knows though, lol

I just know everyone on PC, MSI, Sapp, and Asus is getting in the 1150-1180 range 90% of the time, with a handful hitting 1200mhz, while I see several examples of XFX getting 1200+.

I'd take an older "juice-happy" core on a 390x board with the 8GB IC's over the newer cores any day....


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm a big fan of Sapphire also. NEVER had a bad experience with any Sapphire product, EVER....
> 
> The the first tri-x cooler on the 290's, and even the latest versuion on the Nitro/Tri-X 390 series is superb. Powercolor I can't speak for, but I've read a hundred accounts of happy PCS users on the 290 series and have no reason to believe the newer one is any different.
> 
> I think any of the 390's are fine EXCEPT Asus and Gigabyte. I say Asus because of their crap coolers, and Gigabyte because of their locked voltage. Other than those two, everyone seems to be fine with their cards (though some users get high temps on the MSI cards, and some don't).


I have purchased a total of 11 Sapphire branded cards. Not one has failed on me. One of my 4870's is still trucking along in my nephew's super budget Minecraft rig. I have had it since release day. It has the Hynix memory and is a fine overclocker.


I bought 5 7950's for a mining rig. It didn't pay off (imagine that!). So we split the system up between a few of us. All of them are still going. I just sold my 2x 7950's for $259 on eBay. I paid for my R9 390 with that.



Now on to my Powercolor PCS+. It is absolutely amazing. It's not the greatest overclocker in the world..but it was all about the cooling+semi ehhh quiet. Those 7950's @ 70% could be heard outside. The Powercolor @ 70% is at LEAST 4x as quiet. I have preached in here enough, so there isn't much more I can say.


*edit* I will say that I have had 2x EVGA branded 8800GT's fail, and 1x MSI 9800GTS fail on me. Those were my last Nvidia purchases..

*potato pic incoming*


----------



## SLK

@Scorpion49

The fan is definitely different if you look at the curves on the blades. Looks like the Delta fans Nvidia uses. Probably explains why it is quieter. AMD is finally taking pages from Nvidia's book.


----------



## Scorpion49

Quote:


> Originally Posted by *jodybdesigns*
> 
> I have purchased a total of 11 Sapphire branded cards. Not one has failed on me. One of my 4870's is still trucking along in my nephew's super budget Minecraft rig. I have had it since release day. It has the Hynix memory and is a fine overclocker.


It's all a matter of perspective. I've purchased many Sapphire cards in the last year and a good half of them failed or had issues. Especially the Fury cards, holy crap.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> It's all a matter of perspective. I've purchased many Sapphire cards in the last year and a good half of them failed or had issues. Especially the Fury cards, holy crap.


Don't count Fury, lol, I am pretty sure those things are 50% crap anyways!!


----------



## Scorpion49

Haha yeah I noticed that, trust me. I was actually really close to getting the 390X Nitro; it was actually cheaper than the reference card. But it is simply too long for my case, and I wanted to see the reference card anyhow. Only cost me $49 for the Raijintek Morpheus, so I'm happy with that. Probably going to have to make some kind of prop for it though, to keep the card from sagging.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> Haha yeah I noticed that, trust me. I was actually really close to getting the 390X Nitro; it was actually cheaper than the reference card. But it is simply too long for my case, and I wanted to see the reference card anyhow. Only cost me $49 for the Raijintek Morpheus, so I'm happy with that. Probably going to have to make some kind of prop for it though, to keep the card from sagging.


Did you back off from Fury because of all the issues? I backed out cause I put over $150 in my pocket, and got my damn 8GB of VRAM back.... the performance difference is so marginal in actual gaming.....

The real value, in my opinion, is to run (2) 390's in crossfire, but since I got the 390X with Hitman and a mouse for $360 after rebate, it was worth the $30 more to go X.

I plan on running two of these bad boys full on, but not in the budget right now, I got some tickets a while back I gotta pay that are killing me









I'm kind of hanging around also to see Polaris..... if it is as strong as two 390's, I may go that route, but if not, I'll capitalize on lower 390 prices and get a second one.

At $700, you can't get much more performance per dollar than what (2) 390X's do, unless you score (2) used 290's for around $400 (which I had done in the past, and it was awesome at the time), but even then you are 4GB bound at 4k.


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Did you back off from Fury because of all the issues? I backed out cause I put over $150 in my pocket, and got my damn 8GB of VRAM back.... the performance difference is so marginal in actual gaming.....
> 
> The real value, in my opinion, is to run (2) 390's in crossfire, but since I got the 390X with Hitman and a mouse for $360 after rebate, it was worth the $30 more to go X.
> 
> I plan on running two of these bad boys full on, but not in the budget right now, I got some tickets a while back I gotta pay that are killing me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm kind of hanging around also to see Polaris..... if it is as strong as two 390's, I may go that route, but if not, I'll capitalize on lower 390 prices and get a second one.
> 
> At $700, you can't get much more performance per dollar than what (2) 390X's do, unless you score (2) used 290's for around $400 (which I had done in the past, and it was awesome at the time), but even then you are 4GB bound at 4k.


Yeah, mostly. That and the driver problems with FO4 with earlier driver revisions. When my spare GTX 950 was outperforming an R9 Fury because of the stupid power states enabling in-game, I'm done. To be clear, I had 6 Fury cards and the ONLY one without a defect was the last one, the Nitro OC+ but the drivers finally pushed me over the edge back to a 970.

Recently I picked up a Qnix QX2710 locally for very, very cheap. I actually got a smoking deal: I ended up with an FX8320 system with 16GB RAM, a Gigabyte 990FXA-UD3 R5, a GTX 960 ACX SSC, and a 500GB HDD for $225 with the monitor included. The problem was the case and cooling were ancient; it was a Thermaltake Armor with a built-in liquid cooling system from the early 2000's, and it was overheating badly (the case was clearly in a smoker's home for many years, the rad packed with dirt and pet fur solidified into cement with cigarette tar; I didn't even take it into my house. Disassembled it outside and left it by the road for the garbage truck, disgusting).

I sold all of the FX stuff, so I ended up making money on the deal, but the 970 wasn't really great at 1440p 120hz. So I had to decide between another Fury or a 390X, and I figured the 390X is the best bet for the money. Fury was the only other choice because it has a DVI port; these Korean screens don't work with adapters. Wasn't about to spend $600+ on a 980ti either.


----------



## bluej511

Quote:


> Originally Posted by *rdr09*
> 
> It does. You think it can help beat my 290 in this thread?
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd
> 
> EDIT:
> 
> https://www.techpowerup.com/downloads/Benchmarking/Futuremark/
> 
> Pick Setup . . .
> 
> http://www.cpuid.com/softwares/cpu-z.html
> 
> Use this as a background through the duration . . .


Idk, maybe. I already have CPU-Z and the 3DMark demo on Steam.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, mostly. That and the driver problems with FO4 with earlier driver revisions. When my spare GTX 950 was outperforming an R9 Fury because of the stupid power states enabling in-game, I'm done. To be clear, I had 6 Fury cards and the ONLY one without a defect was the last one, the Nitro OC+ but the drivers finally pushed me over the edge back to a 970.
> 
> Recently I picked up a Qnix QX2710 locally for very, very cheap. I actually got a smoking deal: I ended up with an FX8320 system with 16GB RAM, a Gigabyte 990FXA-UD3 R5, a GTX 960 ACX SSC, and a 500GB HDD for $225 with the monitor included. The problem was the case and cooling were ancient; it was a Thermaltake Armor with a built-in liquid cooling system from the early 2000's, and it was overheating badly (the case was clearly in a smoker's home for many years, the rad packed with dirt and pet fur solidified into cement with cigarette tar; I didn't even take it into my house. Disassembled it outside and left it by the road for the garbage truck, disgusting).
> 
> I sold all of the FX stuff, so I ended up making money on the deal, but the 970 wasn't really great at 1440p 120hz. So I had to decide between another Fury or a 390X, and I figured the 390X is the best bet for the money. Fury was the only other choice because it has a DVI port; these Korean screens don't work with adapters. Wasn't about to spend $600+ on a 980ti either.


Yeah, I think a lot of people fall in that 1080-1440p range, and can't quite get what they need from a 970, don't want to spend $460+ on a 980, and certainly don't want to spend $600+ on Fury X and 980TI, so the 390's fall in the perfect pricing bracket.

I ran a 980 KPE for a bit at 1516/7900 after I sold off my Fury, and though it was a good card 90% of the time, I would get stutters at 4K from VRAM limitation, which is why I finally went back to a 390....

Even with that 980 at 1516/7900, my 390x at daily clocks of 1150/1750 beat it at 4k. 1080P was a different story, that little 980 would ZOOM at 1080, but 4k, the bandwidth of the AMD just takes over.


----------



## david279

Hi guys..I'm new here. Got a R9 390 and loving it. Gave it a slight overclock, I'm new to GPU overclocking so I'm taking it slow.


----------



## Agent Smith1984

Quote:


> Originally Posted by *david279*
> 
> Hi guys..I'm new here. Got a R9 390 and loving it. Gave it a slight overclock, I'm new to GPU overclocking so I'm taking it slow.


Welcome!

Please post the same shot, but with Notepad or Word open displaying your username on OCN.

I'll get you added to the list!

Thanks


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Welcome!
> 
> Please post same shot but with Note or Word open displaying your username on OCN.
> 
> I'll get you added to the list!
> 
> Thanks


Don't think I've even done that myself yet, whoops.


----------



## christoph

Quote:


> Originally Posted by *ronaldoz*
> 
> Sounds good! What kinda Fujipoly will you use? Do you think you could still get warranty when recplaing them? I would like to replace the thermal paste as well, though the temperatures are great. Still curious if it help a bit.


I could only order the 11 W/mK ones, 1.5 mm thick, and of course I will redo the thermal paste as well, though I don't think I'll gain much from the GPU paste alone. I doubt the warranty still applies after opening the video card, but I really don't care much about warranty since it's a Sapphire card, and I have never had any problems with this brand...

Any video card (any component, actually) will see an increased lifespan IF you keep it at lower temps than the stock cooler's original design allows. The Sapphire coolers are among the best, and if you OC but can keep the temps low somehow, the card's lifespan won't be compromised.


----------



## christoph

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I think a lot of people fall in that 1080-1440p range, and can't quite get what they need from a 970, don't want to spend $460+ on a 980, and certainly don't want to spend $600+ on Fury X and 980TI, so the 390's fall in the perfect pricing bracket.
> 
> I ran a 980 KPE for a bit at 1516/7900 after I sold off my Fury, and though it was a good card 90% of the time, I would get stutters at 4K from VRAM limitation, which is why I finally went back to a 390....
> 
> Even with that 980 at 1516/7900, my 390x at daily clocks of 1150/1750 beat it at 4k. 1080P was a different story, that little 980 would ZOOM at 1080, but 4k, the bandwidth of the AMD just takes over.


I have a friend with a TITAN and 32 GB of RAM, and he gets nothing but stutters from that card. Why? I don't know really, and I don't have time to fix a 1000-buck video card's stuttering. I mean, that should not be happening to begin with..


----------



## david279

There ya go....


----------



## Vellinious

Quote:


> Originally Posted by *christoph*
> 
> I have a friend with a TITAN and 32 GB of RAM, and he gets nothing but stutters from that card. Why? I don't know really, and I don't have time to fix a 1000-buck video card's stuttering. I mean, that should not be happening to begin with..


Some of it could be drivers, could be power delivery, could be settings in the control panel....could be a lot of things. 90% of the time, the thing that it ISN'T, is the hardware. = )


----------



## Minusorange

I'll be joining the club on Saturday. Just ordered an Asus 390 Strix after sorting out a partial refund for my broken Tri-X; it effectively cost me £140 once I add in the refund, so not a bad deal considering I can't wait around for the new AMD line to be released. I get 3 games with it too, so I could potentially knock another £50 off the price if I sell them. So £90 for a new card and a new 3-year warranty (the only reason I went with the Strix, as all other manufacturers only offer 2 years apart from MSI, who was more expensive).


----------



## christoph

Quote:


> Originally Posted by *Vellinious*
> 
> Some of it could be drivers, could be power delivery, could be settings in the control panel....could be a lot of things. 90% of the time, the thing that it ISN'T, is the hardware. = )


Weren't you the one complaining about the 390?? Anyway.

I know he tried everything to fix it. Maybe; I don't know really, I'm not there, and I don't want to waste time fixing a 1000-buck card. The Fury X was way, way cheaper where we live; hell, the 980 Ti Superclocked (or whatever that series was) was like 950 bucks. I mean, what the heck are those prices? Nvidia doesn't even know how to price their products...

I can play any game with my 390. I may get some frame drops, but so what?? I can almost double the performance in Xfire for less than 1000 bucks. I paid 319 for my 390 with backplate...


----------



## ronaldoz

Does anyone know how to get more than 100mV in Afterburner? I would like to have a slider that could go up to +200mV.


----------



## Stige

BIOS modding.


----------



## Vellinious

Quote:


> Originally Posted by *ronaldoz*
> 
> Does anyone know how to get more than 100mV in Afterburner? I would like to have a slider that could go up to +200mV.


Use Trixx or HIS iTurbo

https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view


----------



## kizwan

Quote:


> Originally Posted by *ronaldoz*
> 
> Does anyone know how to get more than 100mV in Afterburner? I would like to have a slider that could go up to +200mV.


Code:


cd "C:\Program Files (x86)\MSI Afterburner"
MsiAfterburner.exe /wi6,30,8d,14
MsiAfterburner.exe

Note:-
Example above : 14 (HEX) = 20 (DEC) : 20 x 6.25mV = +125mV

The command above auto-sets the slider. Don't move or touch the slider, because doing so will reset it back to the +100mV max.
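As a sanity check on the hex math in the note above, here's a small Python sketch that converts a desired offset in mV into the hex step count the `/wi` command's last argument expects. The function name `ab_offset_arg` is mine, for illustration only; the 6.25mV step size comes from the note.

```python
# Helper for the MSI Afterburner /wi command shown above.
# One slider step is 6.25 mV; the command takes the step count in hex.
# ab_offset_arg is a made-up name for illustration, not part of Afterburner.

def ab_offset_arg(extra_mv: float) -> str:
    """Convert a core-voltage offset in mV to the hex step count."""
    steps = round(extra_mv / 6.25)   # e.g. 125 mV -> 20 steps
    return format(steps, "x")        # Afterburner expects hex, e.g. "14"

print(ab_offset_arg(125))  # -> 14, matching the /wi6,30,8d,14 example
print(f"MsiAfterburner.exe /wi6,30,8d,{ab_offset_arg(125)}")
```

So a +200mV offset would be 32 steps, i.e. `/wi6,30,8d,20`.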


----------



## Agent Smith1984

Is anybody actually getting gains on these cards going past 100mv though? I ask because I do not get anything out of it at all. I think these cards have a limited TDP compared to 290, on top of having different core characteristics than the older Hawaii chips.


----------



## patriotaki

what do you think?
r9 390 pcs+ at stock
i5 6600k 4.4ghz
ddr4 2400mhz


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> what do you think?
> r9 390 pcs+ at stock
> i5 6600k 4.4ghz
> ddr4 2400mhz


Looks right to me









LMK what you end up with for clocks and voltage so I can get you added (you had told me to hold off until you had your OC figured out I believe).


----------



## ronaldoz

Quote:


> Originally Posted by *Vellinious*
> 
> Use Trixxx or HiS iTurbo
> 
> https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view


Thanks, will try HIS iTurbo, or at least check it out. Trixx won't work for me; it resets the voltage to stock after restarting. I've contacted Sapphire, and they told me they will release a new Trixx version, so it's possible this will change. Not sure why it's working for others.

*Update* HIS acts the same as Trixx does, so after a restart the voltage is back to stock.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks right to me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> LMK what you end up with for clocks and voltage so I can get you added (you had told me to hold off until you had your OC figured out I believe).


this is +100mV
+50% power limit
1100 Core Clock
1700 Mem clock

No artifacts here.. I think I can go up to 1120.. will try later.
Sign me up


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> this is +100mV
> +50% power limit
> 1100 Core Clock
> 1700 Mem clock
> 
> No artifacts here.. I think I can go up to 1120.. will try later.
> Sign me up


You shouldn't need anywhere near 100 to get 1100.

You should see something like 1150MHz on around 25-60mv.

I will get you added.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You shouldn't need anywhere near 100 to get 1100.
> 
> You should see something like 1150MHz on around 25-60mv.
> 
> I will get you added.


Not even close... I get huge artifacts at a 1150 clock with +100mv.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You shouldn't need anywhere near 100 to get 1100.
> 
> You should see something like 1150MHz on around 25-60mv.
> 
> I will get you added.


Not sure if I got added or not.

Powercolor PCS+ 1100/1625 +38mv


----------



## patriotaki

mine is a terrible overclocker!


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> Not even close... I get huge artifacts at a 1150 clock with +100mv.


Well, based on your and jody's numbers, it looks like the PowerColors are coming in on the low side... Good information, and still good cards, even at 1100MHz.

It's a shame to not at least get 1150 out of her though.....


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, based on your and jody's numbers, it looks like the PowerColors are coming in on the low side... Good information, and still good cards, even at 1100MHz.
> 
> It's a shame to not at least get 1150 out of her though.....


Yea.. terrible OC. I get artifacts at 1100mhz with +63mv.
That means I'd need +100mv, which is too much, so I prefer to run it at stock.. I will just play 7-10fps lower.


----------



## ronaldoz

Quote:


> Originally Posted by *kizwan*
> 
> Code:
> 
> 
> cd "C:\Program Files (x86)\MSI Afterburner"
> MsiAfterburner.exe /wi6,30,8d,14
> MsiAfterburner.exe
> 
> Note:-
> Example above : 14 (HEX) = 20 (DEC) : 20 x 6.25mV = +125mV
> 
> The command above auto-sets the slider. Don't move or touch the slider, because doing so will reset it back to the +100mV max.


Thanks, so Afterburner will not show the actual voltage next to the slider? So it has to be set manually for every voltage. Do I need to place that command in Notepad and save it as a .bat? And should it be deleted if using a lower voltage? Just to be sure, because it sounds a bit tricky. Oh, and will it be remembered after a PC restart?


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> yea..terrible OC i get artifacts on 1100mhz with +63mv
> that means i need +100mv which is too much..and i prefer to run it at stock.. i will just play 7-10fps lower


Wow, that's crazy.... 100mv for 1100 may be about the worst result I have seen for these cards.

What does the core voltage report as under 100% load in GPU-Z??

I wonder if PowerColor is using a lower stock voltage than the others, and your voltage isn't actually that high....


----------



## patriotaki

Sometimes when I open MSI AB my PC just freezes.. anyone know why? I have uninstalled it, reinstalled, cleaned cache/tmp files, etc. It still happens SOMETIMES.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Wow, that's crazy.... 100mv for 1100 may be about the worst result I have seen for these cards.
> 
> What does the core voltage report as under 100% load in GPU-Z??
> 
> I wonder if PowerColor is using a lower stock voltage than the others, and your voltage isn't actually that high....


check this out


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> Sometimes when I open MSI AB my PC just freezes.. anyone know why? I have uninstalled it, reinstalled, cleaned cache/tmp files, etc. It still happens SOMETIMES.


Quote:


> Originally Posted by *patriotaki*
> 
> check this out


You leave your gpu oc'ed?

Wait, is that Firemark? Not many here on OCN recommend using it.


----------



## patriotaki

Quote:


> Originally Posted by *rdr09*
> 
> You leave your gpu oc'ed?


never


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> never


Does it happen when the gpu is at stock? Try it. It could be Overdrive (if you accepted it) and AB conflicting with one another.

Or AB and another open app conflicting. I recently oc'ed my gpu using Trixx and forgot to reset it to stock. Turned on my system, used AB to monitor usage, and the game kept freezing.

EDIT: See post # 42 . . .

http://www.overclock.net/t/1594105/amd-16-3-drivers/40#post_25014533


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> check this out


Bro bro bro









Do NOT use that evil program of death on your card. PLEASE









BTW..... It's just as I thought with your card: your load voltage is only 1.125v...... Mine is around 1.235v at +100mv.

You simply need more voltage to OC more. The PCS+ obviously runs a much lower core voltage.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Bro bro bro
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do NOT use that evil program of death on your card. PLEASE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW..... It's just as I thought with your card: your load voltage is only 1.125v...... Mine is around 1.235v at +100mv.
> 
> You simply need more voltage to OC more. The PCS+ obviously runs a much lower core voltage.


Haha sorry, I just ran it once quickly to show you the screenshot, because other programs are in fullscreen.









How can I increase voltage? Using Sapphire Trixx?


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> haha sorry i just run it once quickly to show you the screenshot because other programs are in fullscreen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> how can i increase voltage? using sapphire trixx?


Yep, I want to see if the card responds to 200mv and gets you into the 1.2xx+ range.... that's where you'll find your 1170-1200 core clocks.
Try Heaven or FireStrike or something to check it though. Firemark = death to yur curds


----------



## ronaldoz

Quote:


> Originally Posted by *patriotaki*
> 
> haha sorry i just run it once quickly to show you the screenshot because other programs are in fullscreen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> how can i increase voltage? using sapphire trixx?


You could use the Heaven benchmark in windowed mode, if you'd rather not post fullscreen shots.


----------



## patriotaki

with heaven i managed to get +200mv and 1160MHz with 1650-1665MHz mem clock.. i get BSOD if im anywhere near 1700 mem clock..

the VRM1 goes up to 92C


----------



## jdorje

Lol firemark.

Another easy way to increase voltage is to edit it within the bios. If you look up your adaptive voltage with aida you can get a base. Then just edit all 6 tables with hawaiibiosreader to change the base from adaptive to, well, whatever you want it to be.

This way is harder than just using trixx, sure. But it'll give you confidence to go in and edit your ram timings which is productive.


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> with heaven i managed to get +200mv and 1160Mhz with 1650-1665mhz mem clock.. i get BSOD if im anywhere near 1700mem clock..
> 
> the vrm1 goes up to 92celcius


That's because PowerColor's cooler doesn't even touch the VRM's....

I wasn't aware of that until looking at the pics in this review:
http://www.fudzilla.com/component/k2/38040-powercolor-r9-390-pcs-8gb-reviewed?showall=1

VRM's breaking 70c will hinder OC ability quite a bit.


----------



## bluej511

Wow yea they are indeed somehow passively cooled. It's just going to blow hot air over the heatsink anyways. I was able to get 1125/1625 without power limit or increased voltage.


----------



## ronaldoz

Quote:


> Originally Posted by *jdorje*
> 
> Lol firemark.
> 
> Another easy way to increase voltage is to edit it within the bios. If you look up your adaptive voltage with aida you can get a base. Then just edit all 6 tables with hawaiibiosreader to change the base from adaptive to, well, whatever you want it to be.
> 
> This way is harder than just using trixx, sure. But it'll give you confidence to go in and edit your ram timings which is productive.


It would be nice to do so. But will you use more power all the time? I OC'd my CPU and only changed the Vcore to manual. I don't know anything about adaptive. But when using Prime or LinX for example, the Vcore is a bit more than I've set.

Somehow I managed to use 1200 / 1700 with +100mV and +50% power limit. Yesterday I could not get it over 1185MHz, so that is confusing me a bit. After Fire Strike, I did a full Heaven benchmark. I did not raise the core anymore, and used /wi6,30,8d,13 to get 118mV in Afterburner.


----------



## Agent Smith1984

Then MSI goes and cools the damn VRM's, but leaves the RAM open.....
http://www.vortez.net/articles_pages/msi_r9_390x_gaming_8g_review,5.html

Sapphire took care of BOTH!
http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/4/

Obviously they have the best air cooling solution for these, and the power delivery (8+8), to technically be the best choice of any 390; however, the OC results tell a different story...

Anyone with Nitro/Tri-X got a GPU-Z screenie to show load voltages? Seems like if any card would benefit from going 150mv+ and using Trixx, it would be the Sappy


----------



## patriotaki

so....what do i do now?


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Then MSI goes and cools the damn VRM's, but leaves the RAM open.....
> http://www.vortez.net/articles_pages/msi_r9_390x_gaming_8g_review,5.html
> 
> Sapphire took care of BOTH!
> http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/4/
> 
> Obviously they have the best air cooling solution for these, and the power delivery (8+8) to technically be the best choice of any 390, however the OC results tell a different story...
> 
> Anyone with Nitro/Tri-X got a GPU-Z screenie to show load voltages? Seems like if any card would benefit from going 150mv+ and using Trixx, it would be the Sappy


Here you go agent smith just took it now doing Heaven, no artifacts and OCed. I didn't add any power limit or voltage.


----------



## ronaldoz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Then MSI goes and cools the damn VRM's, but leaves the RAM open.....
> http://www.vortez.net/articles_pages/msi_r9_390x_gaming_8g_review,5.html
> 
> Sapphire took care of BOTH!
> http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/4/
> 
> Obviously they have the best air cooling solution for these, and the power delivery (8+8) to technically be the best choice of any 390, however the OC results tell a different story...
> 
> Anyone with Nitro/Tri-X got a GPU-Z screenie to show load voltages? Seems like if any card would benefit from going 150mv+ and using Trixx, it would be the Sappy


Gonna try to raise voltages slowly, and I guess the GPU got a good day.. It's on 118mV if that is correct (/wi6,30,8d,13) using 350W max with Fire Strike Extreme! But I was able to get the card on 1220 / 1700. So that's 20Mhz more then +100mV.


----------



## Scorpion49

So I swapped the cooler out just now. The reference 290X has a much different cooler than this 390X. Check it out:

All ready to go!



Reference 290X heatsink:



390X heatsink:









I first did the small heatsinks as the instructions said, but I added a few extras in some spots since I had them, but then I had to change out some of the ones on the VRAM because they interfered with the heat pipes.





Topped it all off with a pair of AP-45's, running at 7V with my fan controller they move more air than any 3rd party graphics card I have ever encountered and nearly silent (can only hear them with the case open).



Surprisingly it didn't sag much, but I put a little prop in there anyways just for safety. Going to start testing temps now.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Here you go agent smith just took it now doing Heaven, no artifacts and OCed. I didn't add any power limit or voltage.


I see max at 1.25, which is normal, but what is the avg during the load itself with vdroop?

That's what really matters with core clocks on these.

What's crazy is, the friggin PowerColor cards are still able to do 1100MHz with a measly 1.14~ load voltage..... Makes me want to just run at 1100MHz all the time, and see how much I can undervolt, lol


----------



## bluej511

Quote:


> Originally Posted by *Scorpion49*
> 
> So I swapped the cooler out just now. The reference 290X has a much different cooler than this 390X. Check it out:
> 
> All ready to go!
> 
> 
> 
> Reference 290X heatsink:
> 
> 
> 
> 390X heatsink:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I first did the small heatsinks as the instructions said, but I added a few extras in some spots since I had them, but then I had to change out some of the ones on the VRAM because they interfered with the heat pipes.
> 
> 
> 
> 
> 
> Topped it all off with a pair of AP-45's, running at 7V with my fan controller they move more air than any 3rd party graphics card I have ever encountered and nearly silent (can only hear them with the case open).
> 
> 
> 
> Surprisingly it didn't sag much, but I put a little prop in there anyways just for safety. Going to start testing temps now.


Curious to see what the VRMs get under load. I was thinking of doing the same thing but buying an EK Thermosphere. I might mod the Alphacool block to accept the Thermosphere since it's much less restrictive, just not sure if it will fit.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I see max at 1.25, which is normal, but what is the avg during the load itself with vdroop?
> 
> That's what really matters with core clocks on these.
> 
> What's crazy is, the friggin PowerColor cards at still able to do 1100MHz with a measly 1.14~ load voltage..... Makes me want to just run at 1100MHz all the time, and see how much I can undervolt, lol


Already closed it but ill do an avg of one run of heaven with the factory clocks see what i get.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> So I swapped the cooler out just now. The reference 290X has a much different cooler than this 390X. Check it out:
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> All ready to go!
> 
> 
> 
> Reference 290X heatsink:
> 
> 
> 
> 390X heatsink:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I first did the small heatsinks as the instructions said, but I added a few extras in some spots since I had them, but then I had to change out some of the ones on the VRAM because they interfered with the heat pipes.
> 
> 
> 
> 
> 
> Topped it all off with a pair of AP-45's, running at 7V with my fan controller they move more air than any 3rd party graphics card I have ever encountered and nearly silent (can only hear them with the case open).
> 
> 
> 
> Surprisingly it didn't sag much, but I put a little prop in there anyways just for safety. Going to start testing temps now.


Looks awesome man. VERY curious to see load temps for core and VRM's..... Let us know what kind of clocks you achieve.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Already closed it but ill do an avg of one run of heaven with the factory clocks see what i get.


Thanks man, good info to have around here.







+1


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Then MSI goes and cools the damn VRM's, but leaves the RAM open.....
> http://www.vortez.net/articles_pages/msi_r9_390x_gaming_8g_review,5.html
> 
> Sapphire took care of BOTH!
> http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/4/
> 
> Obviously they have the best air cooling solution for these, and the power delivery (8+8) to technically be the best choice of any 390, however the OC results tell a different story...
> 
> Anyone with Nitro/Tri-X got a GPU-Z screenie to show load voltages? Seems like if any card would benefit from going 150mv+ and using Trixx, it would be the Sappy


That VRAM cooling is more than sufficient on the MSI... They don't get hot, like has been said about a million times....


----------



## jdorje

Quote:


> Originally Posted by *ronaldoz*
> 
> It would be nice to do so. But will you use more power all the time? I OC's my CPU and only changed the Vcore to manual. I don't know anything about addaptive. But when using Prime or Linx for example, the Vcore is a bit more then I've set.
> 
> Somehow I managed to use 1200 / 1700 with +100mV and +50% power limit. Yesterday I could not get it over 1185Mhz, so that is confusing me a bit. After Fire Strike, I did a full Heaven benchmark. I did not raise the core anymore, and did use this /wi6,30,8d,13 to use 118mV in Afterburner.


Just to be clear, I'm talking about the GPU BIOS and GPU overclock.

Hawaii has 8 power states. If you edit the bios and change only the highest state, the other 7 are unaffected.

If you do +100 mV in afterburner does it add that only to the highest state? Or to all?
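As a toy illustration of that question (made-up voltages, not real Hawaii DPM tables), bumping only the top state in the BIOS versus applying a global slider offset looks like:

```python
# Toy model of Hawaii's 8 DPM power states (voltages in mV are invented,
# NOT real BIOS values) to show the difference between editing only the
# highest state in the BIOS vs. a global +100 mV offset.
stock_mv = [968, 1000, 1044, 1088, 1131, 1175, 1219, 1250]

# BIOS-style edit: only the top state changes, idle states untouched
top_only = stock_mv[:-1] + [stock_mv[-1] + 100]

# Global offset: every state shifts, so low-power states draw more too
global_offset = [v + 100 for v in stock_mv]

print(top_only)       # only the last entry moved
print(global_offset)  # every entry moved
```

If the slider behaves like the second list, idle power rises along with the load voltage, which is one reason people prefer the BIOS edit.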
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Obviously they have the best air cooling solution for these, and the power delivery (8+8) to technically be the best choice of any 390, however the OC results tell a different story...


MSI has an extra VRM for steadier power. It's as simple as that. And while you can push an extra 75W on the Sapphire without catching fire, scaling at that point is terrible anyway (plus the possible black screen issue).

My XFX has a RAM heatsink. I'm at 1740MHz on it. Now that I think about it, MSI people are reporting a lot less on the RAM. Whether that matters I guess is situational.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> That VRAM cooling is more than sufficient on MSI... They don't get hot like has been said about million times....


I don't think it's that drastic either, BUT, it could explain why both of the MSI cards I have tested needed exactly 50mv more AUX voltage to get the RAM to 1750, while all of the others making direct contact with the VRAM have not....

Could be coincidence too though.... Of course, when you get into the situation of having to add voltage to something to make it more stable because it is running too hot, it becomes a vicious cycle and you will always have to "settle" for something.... lol


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks man, good info to have around here.
> 
> 
> 
> 
> 
> 
> 
> +1


So avg after a Heaven run seems to be 1.151v. Took the reading immediately after closing heaven.


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I don't think it's that drastic either, BUT, it could explain why both of the MSI cards I have tested needed exactly 50mv more AUX voltage to get the RAM to 1750, while all of the others making direct contact with the VRAM have not....
> 
> Could be coincidence too though.... Of course, when you get into the situation of having to add voltage to something to make it more stable because it is running too hot, it becomes a vicious cycle and you will always have to "settle" for something.... lol


Well I run my AUX Voltage at -50mV so dunno, haven't seen it having any effect at all after I got my waterblock so why run it at more if I can run it at less?


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks awesome man. VERY curious to see load temps for core and VRM's..... Let us know what kind of clocks you achieve.


I've just run Valley looped for about 10 minutes with the fans on minimum, my temps are as follows:

- GPU core 54C
- VRM1 64C
- VRM2 61C

Pretty good for the fans on minimum speed. I could make it tornado mode if I wanted, the fans do go up to 2150RPM haha.


----------



## ronaldoz

Quote:


> Originally Posted by *jdorje*
> 
> Just to be clear, in talking about the gpu bios and gpu overclock.
> 
> Hawaii has 8 power states. If you edit the bios and change only the highest state, the other 7 are unaffected.
> 
> If you do +100 mV in afterburner does it add that only to the highest state? Or to all?
> Msi has an extra vrm for more steady power. It's as simple as that. And while you can push an extra 75 W on the sapphire without catching on fire, scaling at that point is terrible anyway (plus possible black screen issue).
> 
> My xfx has a ram heatsink. I'm at 1740 mhz on it. I guess now I think about it msi people are reporting a lot less on the ram. Whether that matters I guess is situational.


Uhm, I don't know. How could I check values? I notice the voltage is a bit more when using /wi6,30,8d,13.


----------



## bluej511

Quote:


> Originally Posted by *Scorpion49*
> 
> I've just run Valley looped for about 10 minutes with the fans on minimum, my temps are as follows:
> 
> - GPU core 54C
> - VRM1 64C
> - VRM2 61C
> 
> Pretty good for the fans on minimum speed. I could make it tornado mode if I wanted, the fans do go up to 2150RPM haha.


Damn that's not bad. Well then I'm actually shocked how well my Alphacool GPX does with absolutely no fans on it and only 2 intake fans (for now; will be adding 2-3 more NF-A14s into the case). I was up till 6am last night; I moved my rad from the bottom of the case to the side and have it in push/pull taking case air, and also switched my front fans from intake to exhaust pull for the 240 rad. Runs cooler than before, now I just need more cool air in.


----------



## Stige

Quote:


> Originally Posted by *Scorpion49*
> 
> I've just run Valley looped for about 10 minutes with the fans on minimum, my temps are as follows:
> 
> - GPU core 54C
> - VRM1 64C
> - VRM2 61C
> 
> Pretty good for the fans on minimum speed. I could make it tornado mode if I wanted, the fans do go up to 2150RPM haha.


Valley is very lightweight and doesn't really represent anything. You are better off running Heaven/Firestrike, which are both way more taxing than Valley. And you can run Heaven without breaks, unlike Firestrike.


----------



## Agent Smith1984

Ugggg, I want water so bad, but don't really feel like spending money on a card I will only have for another 6-8 months... lol


----------



## yuannan

1.287V if my screen shot from before is correct, might be different...


----------



## Agent Smith1984

Quote:


> Originally Posted by *yuannan*
> 
> 1.287V if my screen shot from before is correct, might be different...


LOAD????

That looks like idle voltage to me...

The vdroop on these cards is pretty substantial....


----------



## Stige

I get about ~1.25V under load at +125mV on my card.
1.305V at no load.
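From those two readings the droop can be estimated directly; a quick sketch (the "implied base" is just inferred from the numbers above, not a measured stock VID):

```python
# Estimate vdroop from the two GPU-Z style readings in the post above.
# The implied base is idle voltage minus the slider offset -- an
# inference for illustration, not a measured stock VID.
offset_v = 0.125   # +125 mV slider
idle_v = 1.305     # reported at no load
load_v = 1.25      # reported under load

vdroop_mv = round((idle_v - load_v) * 1000)  # sag under load, in mV
base_v = idle_v - offset_v                   # implied stock voltage

print(f"vdroop ~{vdroop_mv} mV, implied base ~{base_v:.3f} V")
```

That works out to roughly 55 mV of sag, which matches the "substantial vdroop" people keep mentioning on these cards.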


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> I get about ~1.25V under load at +125mV on my card.
> 1.305V at no load.


That sounds about right for 125+...

I get around 1.21-1.23 at 100mv+

Wish there was some LLC to play with on these cards, lol.


----------



## diggiddi

Quote:


> Originally Posted by *patriotaki*
> 
> sometimes when i open msi AB my pc just freezes..anyone knows why? i have unistalled it, reinstalled , cleaned cache tmp files etc..still happens SOMETIMES


I have the problem too; it started with the latest version of Afterburner, i.e. 4.2


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> I have the problem too it started with the latest version of Afterburner ie 4.2


Strange, cause I just installed it last night and it did okay, but then this morning my computer was frozen??? I just thought the baby slapped it.. he does that A LOT, like, it's driving me nuts.
He slaps the crap out of my rig, and it hard locks, lol


----------



## kizwan

Quote:


> Originally Posted by *ronaldoz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> Use Trixxx or HiS iTurbo
> 
> https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view
> 
> 
> 
> Thanks, will try HiS iTurbo, or at least checking it out. Trixx won't work for me. It will reset the voltage to stock after restarting. I've contacted Sapphire, and they told they will release a new Trixx version, so it's possible to change this. Not sure why it's working for others.
> 
> *Update* His act as Trixx does, so after a restart, the voltage is on stock.
Click to expand...

Did you mean even when you set MSI AB or Trixx to auto-start at startup, voltage get reset?


----------



## ronaldoz

Quote:


> Originally Posted by *kizwan*
> 
> Did you mean even when you set MSI AB or Trixx to auto-start at startup, voltage get reset?


MSI AB does not reset the voltage. Trixx and HiS do. They both reset it after restarting Windows, even with the 'Startup with Windows' setting enabled. But when I set voltage with Afterburner, both will show the correct voltage. I just can't use them as my default overclock software, since they reset voltage.


----------



## Scorpion49

Quote:


> Originally Posted by *Stige*
> 
> Valley is very light weight and doesn't really represent anything. You are better off running Heaven/Firestrike which are both way more taxing than Valley is. And you can run Heaven without breaks unlike Firestrike.


Running AW for the last hour I'm seeing 55C on the core and 71/64 on the VRM's, still minimum fan speed.


----------



## bluej511

My idle voltage is like 1.07v or something, why do you guys have such high idle voltage?


----------



## yuannan

Just tested it with FurMark 720p at 4xAA,

dropped a near perfect 0.1V, to 1.178.

That's quite a heavy drop.

While on the subject of voltage, whats the highest I should go up to?

Thinking of taking some hwbot top spots, currently at #11 globally for the 390x firestrike normal, 2k away from #1.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> My idle voltage is like 1.07v or something why do you guys have high idle voltage?


different vendors have different stock volts. I'm currently running an underclocked 390X Tri-X at 1000/1500 @ -56mv/-36% power, never goes over 60C under load at 35% fan speed, this card is quiet AND cool, i'm really surprised. usually bring this up when nVidia people start ****ting on me on YouTube.

One reason I can think of for different stock voltages is semi-biased reviews. If one vendor starts at 1V and another at 1.3V, the 1.3V card will appear to overclock A LOT better, but it really doesn't.

Another reason is that vendors tune voltage to fit their target audience: MSI for gamers so extra voltage, Asus for quiet so lower voltage, etc...


----------



## david279

I just bought a LG ultrawide(34UM67-P), anyone use a ultrawide with the r9 390? It's a 2k monitor.

Sent from my Nexus 6P using Tapatalk


----------



## yuannan

Quote:


> Originally Posted by *david279*
> 
> I just bought a LG ultrawide(34UM67-P), anyone use a ultrawide with the r9 390? It's a 2k monitor.
> 
> Sent from my Nexus 6P using Tapatalk


using a 29" samsung, got the LG one but I hated it, controls were bad and the structure of the pixels pissed me off.


----------



## bluej511

Quote:


> Originally Posted by *david279*
> 
> I just bought a LG ultrawide(34UM67-P), anyone use a ultrawide with the r9 390? It's a 2k monitor.
> 
> Sent from my Nexus 6P using Tapatalk


1920x1080 is already basically 2k; the ultrawides just have more horizontal pixels (2560 instead of 1920), call it 2.5k lol. Im looking into ultrawide next but waiting for the 29uc88 curved freesync to see the price. R9 390 should run it np.


----------



## jdorje

Is 2560x1080 more or less demanding than 2560x1440? Seems like the wider field of view could cause more objects to be drawn - and raise cpu usage - even though the number of pixels is smaller.

My 390 oc has no trouble with 2560x1440...


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Bro bro bro
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do NOT use that evil program of death on your card. PLEASE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW..... It's just as I thought with your card, your load voltage is only 1.125...... Mine is around 1.235 at 100mv+
> 
> You simply need to get more voltage to OC more. The PC obviously run a much lower core voltage.


I have my slider set to +100mv and I am topping out at 1.211v

How do I fix this. I have always thought my voltage has never been high enough. I don't want to use Trixx, it sucks.


----------



## Stige

Quote:


> Originally Posted by *Scorpion49*
> 
> Running AW for the last hour I'm seeing 55C on the core and 71/64 on the VRM's, still minimum fan speed.


AW is the heaviest thing I have on my PC as well, that thing just puts unbelievable stress on the GPU, nothing else comes even close.
Even sitting idle in the hangar.


----------



## 66racer

Hey guys,

I mentioned 390X quadfire not long ago. I got the system built and thought I would share the 3DMark result of 25303, with no overclocking on anything other than the PowerColor Devil 390X being at 1100MHz stock. The 6700K was not overclocked either. Max wattage I saw from the wall was 1230W. The owner was happy; right out of the gate she was doing 125-130MH/s mining too.

http://www.3dmark.com/fs/7956973

How she came out:


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> AW is the heaviest thing I have on my PC aswell, that thing just puts unbeliavable stress on the GPU, nothing else comes even close.
> Even sitting idle in hangar.


Sorry to be a noob, but what does AW mean?


----------



## Scorpion49

Quote:


> Originally Posted by *yuannan*
> 
> Sorry to be a noob, but what does AW mean?


Armored Warfare.


----------



## jodybdesigns

Wow this thread is moving fast! I decided to give my computer some juice to kick it in its @$$. So my voltage at +100mv is topping out at 1.211v but my GPU usage is 100%. And I got my best benchmark yet. No artifacts! But if I push this past 1700MHz I black screen. I also have +55 Aux voltage. And look at those VRM temps! So wonderful! Just the cooling in general makes me happy about mine. Even if the VRMs are passively cooled (which has me thinking how to fix this if possible..)

*edit* 1150/1700 +100mv. And I can't get it to top out higher than 1.211v, so I can't push the core higher...


----------



## bluej511

Quote:


> Originally Posted by *jodybdesigns*
> 
> Wow this thread is moving fast! I decided to give my computer some juice to kick it in it's @$$. So my voltage at +100mv is topping out at 1.211 but my GPU usage is 100%. And I got my best benchmark yet. No artifacts! But if I push this past 1700Mhz I black screen. I also have +55 Aux voltage. And look at those VRM temps! So wonderful! Just the cooling in general makes me happy about mine. Even if the VRM's are passively cooled (which has me thinking how to fix this if possible..)
> 
> *edit* 1150/1700 +100mv. And I cant get it to top out higher than 1.211v, so I can't push the core higher...


Thats about what i got at 1125 and 1650 with 0 mv added, guess i won the silicon lottery. As far as the VRMs, if you can just add a fan blowing across the card, it helps a ton.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Is 2560x1080 more or less demanding than 2560x1440? Seems like the wider field of view could cause more objects to be drawn - and raise cpu usage - even though the number of pixels is smaller.
> 
> My 390 oc has no trouble with 2560x1440...


It's a bit less demanding than 1440p: 2560x1440 = 3,686,400 pixels vs 2560x1080 = 2,764,800, so about a million fewer pixels.
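That arithmetic can be sketched in a couple of lines (raw pixel counts only; as noted above, a wider field of view can still add CPU/draw-call work even at fewer pixels):

```python
# Raw pixel-count comparison between the two resolutions discussed above.
# Pixel count is only part of the GPU-load story, but it sets the baseline.
def pixels(width, height):
    return width * height

qhd = pixels(2560, 1440)    # 3,686,400
uwfhd = pixels(2560, 1080)  # 2,764,800

diff = qhd - uwfhd
print(diff, f"{diff / qhd:.0%} fewer pixels")  # 921600, 25% fewer
```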


----------



## jodybdesigns

Quote:


> Originally Posted by *bluej511*
> 
> Thats about what i got at 1125 and 1650 with 0 mv added, guess i got a good silicone lottery. As far as the VRMs, if you can just add a fan blowing accross the card, helps a ton.


These PowerColors actually like a fan at the back of the card. If you feel these cards, they push a ton of air out of the side, which causes a restriction. I get 4C cooler temps with a fan in the back than at the side.

I want to put this PowerColor on water. Ugh, just need these 2 jobs and it's mine. EK has a block for this, but I can't find a backplate that fits the specific block they have.


----------



## bluej511

Quote:


> Originally Posted by *jodybdesigns*
> 
> These Powercolors actually like a fan at the back of the card. If you feel these cards they push a ton of air out of the side of them. It causes a restriction. I get -4C cooler temps with a fan in the back than the side.
> 
> I want to put this Powercolor on water. Ugh just need these 2 jobs and it's mine. EK has a block for this, but I can't find a backplate with the specific block they have.


I dont see why not, you SHOULD be able to use the factory backplate on the EK; i know for a fact you can do that with their MSI block. The backplate has lil rubber spacers anyways, you just prob have to use either the factory screws or the EK ones, whichever seem like they work better. I wanted to keep my Sapphire one on the Alphacool but it didnt work.


----------



## renoy

no one using a HIS IceQ X2 R9 390 series?

I am using it now, and I think it's pretty good


----------



## bluej511

Quote:


> Originally Posted by *renoy*
> 
> no one use HIS IceQ x2 R9 390 Series?
> 
> I am using it know, n I thing its pretty good


Gotta max it out: turn on 8x MSAA and extreme tessellation, then check the results. That's a proper full-load test.


----------



## jodybdesigns

Quote:


> Originally Posted by *bluej511*
> 
> I dont see why not but you SHOULD be able to use the factory backplate onto the ek, i know for a fact you can do that with their msi block. The backplate has lil rubber spacers anyways you just prob have to either use the factory screws or the ek ones whichever ones seem like they work better. I wanted to keep my sapphire one on the alphacool but didnt work.


Are you using the NexXxos?


----------



## renoy

Quote:


> Originally Posted by *renoy*
> 
> no one use HIS IceQ x2 R9 390 Series?
> 
> I am using it know, n I thing its pretty good


Quote:


> Originally Posted by *bluej511*
> 
> Gotta max it out, turn on 8x msaa and tesselation then check the results, thats a full on full load test this way.


I think I already maxed it out... or did I set it up wrong?


----------



## bluej511

Quote:


> Originally Posted by *jodybdesigns*
> 
> Are you using the NexXxos?


Yea it came with a pretty beefy backplate. I was hesitant to buy it because there are pretty much only bad reviews about high VRM temps and whatnot, but mine runs lower than most of you guys' and as of right now i don't even have great case airflow. Ill be adding 2 NF-A14s up top as intakes getting cool air for the rads to exhaust out, but im in the low to mid 60s on the VRMs without a fan blowing on it. Once i get a Noctua on top of it ill see if it even makes a difference.


----------



## bluej511

Quote:


> Originally Posted by *renoy*
> 
> I think I am already maxed it out...or I made wrong setup?


Turn anti-aliasing to 8x and tessellation to extreme; that's maxing out the card.


----------



## battleaxe

Here's my proof for dual 390x Xfire - Admin


----------



## renoy

Quote:


> Originally Posted by *bluej511*
> 
> Turn anti-aliasing to 8x and tessellation to extreme thats maxing out the card.




like this


----------



## bluej511

Quote:


> Originally Posted by *renoy*
> 
> 
> 
> like this


Yup exactly.


----------



## bluej511

So does anyone else have Elpida memory on their 390/390x?


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> )
> 
> *edit* 1150/1700 +100mv. And I cant get it to top out higher than 1.211v, so I can't push the core higher...


What do you mean by can't do more than 1.211v? Which voltage value is that in GPU-Z? VDDC?


----------



## Scorpion49

Does anyone know how to check the display chain for HDCP compliance with AMD? I'm trying to watch a movie on Amazon and its telling me I can only have SD which it did not before with the only difference being an Nvidia card instead of AMD prior to switching to this 390X.


----------



## renoy

Quote:


> Originally Posted by *bluej511*
> 
> Thats about what i got at 1125 and 1650 with 0 mv added, guess i got a good silicone lottery. As far as the VRMs, if you can just add a fan blowing accross the card, helps a ton.


Quote:


> Originally Posted by *bluej511*
> 
> Yup exactly.


and yeah this is the result


----------



## Vellinious

Quote:


> Originally Posted by *renoy*
> 
> n yeah this the result


Uh....It's my benchmark.....lol As in, it came from my card back in January. Same day I posted it to the Heaven thread in the benchmarks forum.


----------



## jodybdesigns

Quote:


> Originally Posted by *Vellinious*
> 
> Uh....Its my benchmark.....lol As in, came from my card back in January. Same day I posted it to the Heaven thread in the benchmarks forum.


Wooooooooow lol


----------



## Vellinious

Quote:


> Originally Posted by *jodybdesigns*
> 
> Wooooooooow lol


My thoughts exactly.... Huge stones....HUGE


----------



## Agent Smith1984

This thread is flying man, sure glad i created it, cause in reality, these cards really are different. They have some advantages and some disadvantages versus the 290 series, and it's good to have a place to go in-depth on just these cards.

I'll get you added tomorrow axe!

Congrats on cf 390x, it's beastly!


----------



## renoy

Quote:


> Originally Posted by *bluej511*
> 
> Thats about what i got at 1125 and 1650 with 0 mv added, guess i got a good silicone lottery. As far as the VRMs, if you can just add a fan blowing accross the card, helps a ton.


Quote:


> Originally Posted by *Vellinious*
> 
> Uh....Its my benchmark.....lol As in, came from my card back in January. Same day I posted it to the Heaven thread in the benchmarks forum.


wrong picture sorry



this one is true


----------



## Vellinious

Quote:


> Originally Posted by *renoy*
> 
> wrong picture sorry
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this one is true


Better....now run it with 8xAA lol


----------



## kizwan

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> Originally Posted by *renoy*
> 
> n yeah this the result
> 
> Uh....Its my benchmark.....lol As in, came from my card back in January. Same day I posted it to the Heaven thread in the benchmarks forum.

What? Why is he using your screenshot?
Quote:


> Originally Posted by *renoy*
> 
> Quote:
> 
> Originally Posted by *bluej511*
> 
> Thats about what i got at 1125 and 1650 with 0 mv added, guess i got a good silicone lottery. As far as the VRMs, if you can just add a fan blowing accross the card, helps a ton.
> 
> Quote:
> 
> Originally Posted by *Vellinious*
> 
> Uh....Its my benchmark.....lol As in, came from my card back in January. Same day I posted it to the Heaven thread in the benchmarks forum.
> 
> wrong picture sorry
> 
> this one is true

You forgot 8xAA.


----------



## renoy

Quote:


> Originally Posted by *Vellinious*
> 
> Better....now run it with 8xAA lol


----------



## Stige

Quote:


> Originally Posted by *renoy*
> 
> wrong picture sorry
> 
> this one is true


You haven't heard of the screenshot button, or puush, on your PC in this day and age?


----------



## yuannan

1210/1750 @ +100mv and +50% limit

Also please change my name to yuannan (all lower)


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's because PowerColor's cooler doesn't even touch the VRM's....
> 
> I wasn't aware of that until looking at the pics in this review:
> http://www.fudzilla.com/component/k2/38040-powercolor-r9-390-pcs-8gb-reviewed?showall=1
> 
> VRM's breaking 70c will hinder OC ability quite a bit.


So what do I do now, Agent?

Should I keep Trixx installed and run +150mV? Would that be safe?
BTW, today I installed a PWM fan right under the GPU, and it's constantly blowing air on the card.


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> so what do i do now agent?
> 
> should i have trixx installed and have +150mV? would this be safe?
> BTW today i installed a pwm fan right under the gpu , and its constantly blowing air on the gpu


Try putting the fan at the back of the card and watch the magic happen. That's where I got the best cooling results.


----------



## patriotaki

Quote:


> Originally Posted by *jodybdesigns*
> 
> Try putting the fan at the back of the card and watch the magic happen. That's where I got the best cooling results.


on the right hand side?


----------



## bluej511

So I just added a small Enermax T.B. Silence (doesn't blow much air) over the Alphacool block, and after 15-20 mins of Syndicate my VRMs are at VRM1 52°C and VRM2 60°C. Not sure if I'll leave it in or remove it. It did drop VRM1 by 8°C or so, and VRM2 by a couple.


----------



## ronaldoz

Quote:


> Originally Posted by *jodybdesigns*
> 
> Try putting the fan at the back of the card and watch the magic happen. That's where I got the best cooling results.


Over here I have three 120 mm fans in the front of the case. Should that be fine? The temps are great, but any improvement is welcome.

Quote:


> Originally Posted by *yuannan*
> 
> 
> 
> 1210/1750 @ +100mv and +50% limit
> 
> Also please change my name to yuannan (all lower)


Awesome overclock! I suggest testing a full-HD benchmark in Heaven too. I've done some successful tests with Firestrike that artifacted later in Heaven. A 1440p resolution can also artifact when 1080p is fine, so I've started testing at 1440p only.


----------



## jodybdesigns

Quote:


> Originally Posted by *patriotaki*
> 
> on the right hand side?


In the back where it looks like a black Lamborghini exhaust. These coolers seem to blow air out of the side and bottom of the card, and seem to suck air in at the back of the card where you plug in the PCIe cables.


----------



## yuannan

Quote:


> Originally Posted by *ronaldoz*
> 
> Over here there are 3 120 mm fans in the front of the case. Should that be fine? The temps are great, but any improvement is great.
> Awesome overclock! I suggest to test a full hd benchmark in Heaven too. I've done some succesfull tests with Fire strike, but could be artifacting later in Heaven. Also 1440p resolution could artifact here, when 1980p is fine. So I started to test with 1440p only.


I never liked Heaven because it takes so long to run; 30 tests doesn't exactly take 5 minutes. It does, however, stress the GPU harder.

I get small artifacting on the particles, such as specks and sparks, and bumping the voltage to +150mv in Sapphire Trixx doesn't remove it either, so I'm just going to deal with it. I'm OCing for fun; I don't ever plan to run the OC while gaming, as it's very noisy. 1000/1500 @ -56mv -24% @ 35-40% fans. Near silent.

Currently #7 on HWBot for the 390X, trying to push for #2 with just 700 more points. I have my 6600K @ 4.6; if I OC it I can gain a few more points.

I also have the highest OC on HWBot as well, but the top few have 5960Xs, which I can't afford...

http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?hardwareTypeId=videocard_2471&cores=1#start=0#interval=20


----------



## patriotaki




----------



## bluej511

Quote:


> Originally Posted by *patriotaki*


So ghetto, I love it haha. I screwed mine into a bracket. I just ordered 2 NF-14s to add as intake fans at the top of my case; I'll take more pics when I get them, showing off my rear intake fan for my Alphacool block.


----------



## patriotaki

Quote:


> Originally Posted by *bluej511*
> 
> So ghetto i love it haha. I screwed mine into a bracket. I just ordered 2 NF-14s to add as intake fans to the top of my case, ill take more pics when i get em showing off my rear intake fan for my alphacool block.


haha yeahh

both fans do a good job... 38°C idle VRM temp according to Trixx


----------



## battleaxe

Never mind the dust as I'm too lazy to worry about such things.

XFX 390X Crossfire. Custom VRM coolers. Cores under 45°C. VRMs under 52°C max (for normal runs and gameplay).

Both cards will do 1245MHz minimum. The first card will hit 1270MHz for a single run (on suicide runs VRM1 will get into the low 70s°C).

I suspect my VRM1 is not making good contact, as these temps were lower before I redid the Crossfire and VRMs when adding the new card. So I think I need to get some new VRM pads from Fujipoly; I might have moved them around too much. Still not bad results for not having a dedicated full-cover block, as I can't get them for these cards.

Not bad at all.


----------



## Scorpion49

Quote:


> Originally Posted by *yuannan*
> 
> 
> 
> 1210/1750 @ +100mv and +50% limit
> 
> Also please change my name to yuannan (all lower)


I gotta ask, is there something wrong with the AMD cards in Firestrike? You've got modded tessellation and a 1210MHz overclock and got a 16596 graphics score; my bone-stock out-of-the-box GTX 970 got a 16117 graphics score. It seems like there should be a big gap between the two cards.


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> I gotta ask, is there something wrong with the AMD cards in Firestrike? You've got modded tesselation and 1210mhz overclock and got a 16596 graphic score, my bone stock out of the box GTX 970 got 16117 graphics score. It seems like there should be a big gap between the two cards.


Firestrike isn't a great test for AMD cards. That's why we use Heaven most of the time.


----------



## yuannan

Quote:


> Originally Posted by *Scorpion49*
> 
> I gotta ask, is there something wrong with the AMD cards in Firestrike? You've got modded tesselation and 1210mhz overclock and got a 16596 graphic score, my bone stock out of the box GTX 970 got 16117 graphics score. It seems like there should be a big gap between the two cards.


Modded tessellation really didn't make that much of a difference, about 500 extra, as you can see from here:

#1 has modded tessellation, #2 and #3 are normal Firestrike with Crimson 16.2.1
http://www.3dmark.com/compare/fs/7993469/fs/7993265/fs/7992832

As for the graphics score I've got no idea. I OC for fun and, like I said before, I don't even OC for gaming. Purely for fun and seeing what is possible.


----------



## Scorpion49

Quote:


> Originally Posted by *yuannan*
> 
> Mod tessalation really didn't make that ,much of a different about 500 extra as you can see from here:
> 
> http://www.3dmark.com/compare/fs/7993469/fs/7993265#
> 
> As for the graphics score I got no idea, I OC for fun and like I said before I don't even OC for gaming. Purely fun and seeing what is possible.


NEVERMIND! I'm an idiot who just woke up after a night at the bar; I was looking at 3DMark11! lol I knew something was weird. I thought 16k graphics in Firestrike was really good when I first looked at it, so I was super confused. My card gets read as a 290X when I run it now.

Here's mine that I just ran, compared to the old card (CPU clocks are lower this time): http://www.3dmark.com/compare/fs/7997713/fs/7898467


----------



## ronaldoz

Mmm, I just ordered a Nitro 390X and will bring the 390 back. There's not really much difference in performance, just slightly higher clocks and more shaders. But it didn't cost much more, and I didn't know that when ordering the 390.


----------



## Noirgheos

Quote:


> Originally Posted by *ronaldoz*
> 
> Mmm, I just ordered a Nitro 390X, and bring the 390 back. I think it's not really a difference in performance. Just a bit higher clocks and shader. But it did not really cost much more and I did not know that when ordering a 390.


Big enough difference to match a Fury non-x with a +100MHz OC.


----------



## mus1mus

Quote:


> Originally Posted by *yuannan*
> 
> Mod tessalation really didn't make that ,much of a different about 500 extra as you can see from here:
> 
> #1 has mod tessellation, #2 and #3 is normal firestrike with Crimson 16.2.1
> http://www.3dmark.com/compare/fs/7993469/fs/7993265/fs/7992832
> 
> As for the graphics score I got no idea, I OC for fun and like I said before I don't even OC for gaming. Purely fun and seeing what is possible.


http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1150_50#post_25019282

Get in there. We need your help.

Same goes for you @ronaldoz


----------



## yuannan

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1150_50#post_25019282
> 
> Get in there. We need your help.
> 
> Same goes for you @ronaldoz


I'll read over the rules and submit one tonight.

We are losing by quite a large margin, can we even pull this back?


----------



## mus1mus

Quote:


> Originally Posted by *yuannan*
> 
> I'll read over the rules and submit one tonight.
> 
> We are losing by quite a large margin, can we even pull this back?


Yes.

With a single card, try to contribute to every bench and expect around 70K total to add to the tally.

My quads are still in the works, but they can easily add 200K in there.


----------



## bluej511

I'm downloading 3DMark11 right now, will just have to download Vantage later. Got a 10,694 in Firestrike with my R9 390 at 1100/1600.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Im downloading 3dmark11 right now wil just have to download vantage later. Got a 10,694 on firestrike with my r9 390 1100/1600.


We're not after the overall score for each benchmark, only the graphics score. A Hawaii/Grenada card can do at least 14K in Fire Strike (score multiplied by 2 for the tally, FYI) and around 18K in 3DM11. Vantage is a little picky, but expect around 50K (divided by two, though).
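
To make the weighting concrete, here's a tiny sketch of the tally rule as described above. The helper name and the example figures are my own illustration, not anything official from the competition thread: Fire Strike graphics counts double, 3DM11 counts as-is, Vantage counts at half.

```python
def tally_points(firestrike=0, dm11=0, vantage=0):
    """Combine graphics scores into competition points:
    Fire Strike x2, 3DMark11 as-is, Vantage halved."""
    return firestrike * 2 + dm11 + vantage // 2

# The ballpark figures quoted for a Hawaii/Grenada card:
print(tally_points(firestrike=14_000, dm11=18_000, vantage=50_000))  # 71000
```

Which lines up with the "around 70K total" per single card mentioned in the thread.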


----------



## Scorpion49

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1150_50#post_25019282
> 
> Get in there. We need your help.
> 
> Same goes for you @ronaldoz


I'm working on my overclock to get in on it as well, I have all of the benches installed already.


----------



## mus1mus

Quote:


> Originally Posted by *Scorpion49*
> 
> I'm working on my overclock to get in on it as well, I have all of the benches installed already.


Nice.

Remember, just joining matters more than shooting for high scores.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> We're not after the Overall score for each benchmark. Only the Graphics score. A Hawaii/Grenada card can at least do 14K in Fire Strike (score multipled by 2 for the tally FYI) and around 18K in 3DM11. Vantage is a little picky but expect around 50K (divided by two though).


Well then, 13,405 OCed in Firestrike. In 3DM11 at factory clocks I'm getting 17,794.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Well then 13405 OCed on firestrike. 3dm11 factory clocks im getting 17,794.


Something may be holding back that Overclock. Check Driver settings. Performance mode maybe.

But like I said, simply joining matters more than another 1 or 2K more score in there.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Something may be holding back that Overclock. Check Driver settings. Performance mode maybe.
> 
> But like I said, simply joining matters more than another 1 or 2K more score in there.


Well, gotta do the tests before joining haha. I have no idea if this card even has a performance mode; I've got the efficiency setting turned off already.


----------



## Vellinious

Did you turn tess off in the driver control panel?


----------



## jodybdesigns

I need to get in on that. I'm too lazy to find a "free" version of Vantage. Can't bring myself to pay for it for a one-time use, because I hate Vantage. I guess I'll take one for the team...


----------



## bluej511

Quote:


> Originally Posted by *Vellinious*
> 
> Did you turn tess off in the driver control panel?


Is that what kills Firestrike haha? I have it set to AMD optimized; the other options are override or use application settings. I don't have an off.


----------



## mus1mus

Quote:


> Originally Posted by *jodybdesigns*
> 
> I need to get in on that. I'm too lazy to find a "free" version on Vantage. Can't find myself to pay for it for a one time use because I hate vantage. I guess I'll take one for the team...


You don't need to pay for the app. 3DM11 and Vantage allow skipping the demo, or the full test runs, so they're good for quicker runs. The FS demo is just annoyingly long for me, so I bought the full version during a Steam promo.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> You don't need to pay for the app. 3DM11 and Vantage allow skipping the demo or full test runs. So good for quicker runs. FS Demo is just annoyingly long for me so I bought the full version during a Steam Promo.


Exact same thing I did!

So no tess is okay for this, right?


----------



## rdr09

delete


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Exact same thing i did!
> 
> So no tess is okay for this right?


Yes, no tess is fine for this.


----------



## bluej511

I'll give it a shot, see what I get. Setting it to use application settings made it worse haha.


----------



## jodybdesigns

Quote:


> Originally Posted by *rdr09*
> 
> Here jody . . .
> 
> https://www.techpowerup.com/downloads/Benchmarking/Futuremark/
> 
> EDIT: In 3DMark11 . . . make sure you pick Stretched.


Thank youuuu!

I will get these downloaded and everything benched. Will probably run that 1150/1700 I had going last night. I got some pretty good numbers with those.


----------



## Scorpion49

Is this score OK? The card doesn't seem to be getting a lot of voltage; +100mv is ~1.242V at load with droop. But it did 10 loops of Heaven just fine at 1200/1600. I turned my fans up for benching and my max core temp was 50°C (61/66 on the VRMs) after all of the loops, so this setup is cooling very nicely. Should I push the memory higher?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> Is this score OK? Card doesn't seem to be getting a lot of voltage, +100mv is ~1.242V at load with droop. But it did 10 loops of Heaven just fine at 1200/1600. I turned my fans up for benching and my max core temp was 50C (61/66 on VRM) after all of the loops, this setup is cooling very nicely. Should I push the memory higher?


Yes, try and get memory to 1750 and you will hit 1700 points. Might need aux voltage to do it...


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yes, try and get memory to 1750 and you will hit 1700 points. Might need aux voltage to do it...


Haha, just barely missed it.


----------



## jodybdesigns

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yes, try and get memory to 1750 and you will hit 1700 points. Might need aux voltage to do it...


Does the CPU have a factor in the Heaven bench? If so, looks like I am going to have to dump some voltage into this i5 to make it hit 4.5GHz. It's unstable with offset and I have to use a fixed voltage when I break 4.4GHz.


----------



## Vellinious

Quote:


> Originally Posted by *jodybdesigns*
> 
> Does CPU have a factor in the Heaven bench? If so, Looks like I am going to have to dump some voltage into this i5 to make it hit 4.5ghz. It's unstable with Offset and I have to used a fixed voltage when I break 4.4ghz


Very little. Valley is CPU bound all to hell, though.


----------



## jodybdesigns

Quote:


> Originally Posted by *Vellinious*
> 
> Very little. Valley is CPU bound to beat hell, though.


Okay cool cool. Will keep that in mind. Thanks


----------



## Vellinious

Quote:


> Originally Posted by *Scorpion49*
> 
> Is this score OK? Card doesn't seem to be getting a lot of voltage, +100mv is ~1.242V at load with droop. But it did 10 loops of Heaven just fine at 1200/1600. I turned my fans up for benching and my max core temp was 50C (61/66 on VRM) after all of the loops, this setup is cooling very nicely. Should I push the memory higher?


The Unigine benchmarks LOVE high memory clocks. Push it up as high as you can.


----------



## Scorpion49

I thought I might have killed my card running Firestrike, turns out that 3Dmark does NOT like overclocked Korean monitors. I had to set it back to 60hz to even get the run to start.


----------



## Stige

Quote:


> Originally Posted by *Scorpion49*
> 
> I thought I might have killed my card running Firestrike, turns out that 3Dmark does NOT like overclocked Korean monitors. I had to set it back to 60hz to even get the run to start.


Mine runs fine with 120Hz, not an issue with the monitor but something else.
QNIX QX2710


----------



## Scorpion49

Quote:


> Originally Posted by *Stige*
> 
> Mine runs fine with 120Hz, not an issue with the monitor but something else.
> QNIX QX2710


Definitely the monitor. I've had issues with it in some games as well. Only with the AMD card; it doesn't seem to happen with Nvidia, but you don't have to use CRU for those.


----------



## Stige

Quote:


> Originally Posted by *Scorpion49*
> 
> Definitely the monitor. I've had issues with it on some games as well. Only with the AMD card, doesn't seem to happen with Nivida but you don't have to use CRU for those.


Like what, for example? I have never had any issues, and I always use AMD cards with my QNIX.

The only issue I have ever had is that the Final Fantasy 7 launcher required 60Hz for some reason, and only gave an error when you tried to start it at 120Hz.
But even that was not monitor specific, just a refresh rate issue for some reason.


----------



## Scorpion49

Quote:


> Originally Posted by *Stige*
> 
> Like what for example? I have never had any issues and always use AMD cards with my QNIX.
> 
> Only thing I have ever had an issues is that Final Fantasy 7 Launcher required 60Hz for some reason and only gave an error when you tried to start it with 120Hz.
> But even that was not monitor specific, just refresh rate issue for some reason.


Yeah, that's what I mean: it's not really the monitor's fault, it's just that my EDID override doesn't agree with some programs. I don't usually have an issue at 96Hz, but at 110/120Hz some programs don't like it. 3DMark11 and Vantage both ran fine, but Firestrike was crashing immediately and the monitor would lose signal; setting it back to 60Hz fixed it.


----------



## Stige

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah thats what I mean, its not really the monitors fault its just that my EDID override doesn't agree with some programs. I don't usually have an issue at 96hz but at 110/120hz some programs don't like it. 3Dmark11 and Vantage both ran fine but firestrike was crashing immediately and the monitor would lose signal, setting it back to 60hz fixed it.


Never had issues with Firestrike so it has to be something else.


----------



## Scorpion49

Quote:


> Originally Posted by *Stige*
> 
> Never had issues with Firestrike so it has to be something else.


It doesn't matter to me what you did or didn't have issues with. We are not using the same system with the same settings.


----------



## mus1mus

Scorpion, he's just a know-it-all.

I am looking at your subs to the competition and I think you don't have tessellation off. Pretty scores, though!


----------



## battleaxe

I just added an AX860 PSU to power only my 390X's.

Nothing like excess. Plan to add a few more rads next.

I have to stop messing with this thing. Seriously.


----------



## Scorpion49

Quote:


> Originally Posted by *mus1mus*
> 
> Scorpion, he's just a know-it-all.
> 
> I am looking at your subs to the competition and I think you don't have Tesselation Off. Pretty scores though!


Yeah, I don't care to be shutting off tess. It always seemed stupid to me that it's even allowed. The hardware is going to do what it's going to do; artificially changing it to make one or the other look better is silly.


----------



## legendary2020

The max clock I got on my GPU is 1170 core and 1680 memory with +100 mV; anything higher shows flickering all over the place.

Here is my result:

Fire Strike: 11690 to 11740

I saw other people pushing the same GPU to 1215, and some even got 1225 core and 1675 memory, but for some reason mine can't go there.


----------



## battleaxe

I just added an AX860 PSU to power only my 390X's.
Quote:


> Originally Posted by *mus1mus*
> 
> Scorpion, he's just a know-it-all.
> 
> I am looking at your subs to the competition and I think you don't have Tesselation Off. Pretty scores though!


BTW Mus... you can see I took your suggestions and made a rigid-pipe VRM solution. It did work better, probably just because of the mass of cold water behind the metal if nothing else, I imagine. Temps are around 50°C or less at 1100/1500 during gaming with both cards driving. So pretty decent, and about 15°C better than my previous best.

+1 for that BTW.


----------



## Agent Smith1984

So.... I got my fanboy comp scores up, gave us about 80k points.

Just initial runs. I may do some suicide testing later, lol.

BTW... we are really getting our asses whooped over there.... if just 10 people would run these benches and get close to what I did, we'd be back in it!!!


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So.... I got my fanboy comp scores up, gave us about 80k points
> 
> Just initial runs. I may do some suicide testing later, lol
> 
> BTW... we are really getting our asses whooped over there.... if just 10 people would run these benches and get close to what I did we'd be back in it!!!


Those are some pretty scores!
Quote:


> Originally Posted by *battleaxe*
> 
> I just added an AX860 PSU to power only my 390X's.
> BTW Mus... you can see I took your suggestions and made a rigid pipe VRM solution. It did work better. Probably just because of the mass of cold water behind the metal if nothing else I imagine. Temps are around 50c or less at 1100/1500 during gaming both cards driving. So pretty decent. And about 15c better than what my best was previously.
> 
> +1 for that BTW.


Yep. Saw that one previously, and you did a pretty good job in there. I am thinking of doing the same, integrated onto a reference blower.


----------



## jdorje

Smith, a couple pages back you linked the PCS+ cooler, showing a separate small VRM cooler inside the main cooler.

The XFX is just like that too. Its VRM heat sink is a bit bigger than the PCS+ one, I think, and the VRMs are thermally linked to the backplate, which gets really hot. But it's still impossible to keep the VRMs near 70°C under any kind of overclock.

I was wondering about better ways to cool those suckers. If the VRM heat sink could be linked to the main heat sink with some thermal padding, that might help. Another idea would be blowing air at the backplate; it gets up to 60-70°C (!!!), so it would be a great candidate for moving heat. Would I want to attach some small heat sinks to it? Obviously not viable in dual-card setups... except for the top card, which is the one that needs it.


----------



## jodybdesigns

Well I got my results posted for Red Team. I got some really good scores today benching this thing to the limits lol

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1170#post_25020906


----------



## Vellinious

Quote:


> Originally Posted by *jodybdesigns*
> 
> Well I got my results posted for Red Team. I got some really good scores today benching this thing to the limits lol
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1170#post_25020906


Looks like you ran with tessellation on.


----------



## yuannan

Damn, we've only got a few days left.

Better push my rig right to the max tomorrow. Gonna get up extra early (10am) and OC it till about 12.

Benches should be up by 1pm.

Wish me luck...


----------



## battleaxe

Too bad my PC won't run FS at all. Bummer.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> To bad my PC won't run FS at all. Bummer.


Oh man, no FS?

Mine won't run 11 unless I get it through Steam; the other two work as they should through a regular Windows installation.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh man, no fs?
> 
> Mine won't run 11 unless i get it through steam, the other two work as they should through regular Windows installation.


Yeah, I'm gonna download it and try again. The last few times I tried, it wouldn't even launch. I'm on a fresh install though, so fingers crossed. Hope to help out the red team.

Can we add multiple rigs to the stats? Or only one rig? I have several computers around here with AMD cards in them.


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh man, no fs?
> 
> Mine won't run 11 unless i get it through steam, the other two work as they should through regular Windows installation.


We need one more person, now we're only trailing by 21k points!


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> We need one more person, now we're only trailing by 21k points!


Hopefully I can get mine in there.

Trying. We shall see. I'll try the other machine if this one fails, but it only has one 290 in it.


----------



## jodybdesigns

Quote:


> Originally Posted by *Vellinious*
> 
> Looks like you ran with tessellation on.


Can you run without? I was in such a hurry (and anticipation) I thought the system was barely going to make it through (I had to reboot once). Should I run Firestrike without tess? It's the only suite I've purchased where I can tweak the settings.


----------



## jodybdesigns

Quote:


> Originally Posted by *battleaxe*
> 
> Hopefully I can get mine in there.
> 
> Trying. We shall see. I'll try the other machine if this one fails. But it only has one 290 in it.


Do a bone stock run! That will put us up 50k points minimum!


----------



## tolis626

When is the competition closing? I haven't been home since yesterday, and I will be returning either tomorrow or Sunday. I hope I can get my MSI to get its crap together and pull 1200+ for me and for the team. I don't think it'll do us the favor of going high on the memory though, but what can I do? Better than nothing, I guess.


----------



## jodybdesigns

Quote:


> Originally Posted by *tolis626*
> 
> When is the competition closing? I'm not at home since yesterday and I will be returning either tomorrow or Sunday. I hope I can get my MSI to get its crap together and pull a 1200+ for me and for the team. I don't think it'll do us the favor of going high on the memory though, but what can I do? Better than nothing I guess.


It ends March 31st. So you have some time


----------



## yuannan

All done, lads.

OCed my card and CPU to the max I can before getting into BCLK, going up in single digits, and using a custom BIOS.

Gallery of my hard work:


http://imgur.com/NDDVt


http://www.3dmark.com/fs/8003239
http://www.3dmark.com/3dm11/11107634
http://www.3dmark.com/3dmv/5434031

Might as well update my OC as well.

1255/1750 @ +200mv +50% power, fan maxed


----------



## battleaxe

Quote:


> Originally Posted by *jodybdesigns*
> 
> Do a bone stock run! That will put us up 50k points minimum!


It worked. Must have been the fresh install. I ran with tess on, as I wasn't sure how to turn it off and don't have time to figure it out. But it's a score, so there it is.


----------



## jodybdesigns

Quote:


> Originally Posted by *yuannan*
> 
> All done lads,
> 
> OCed my card and CPU to the max I can before I get into the BLK and going up in single digits and using a custom BIOs.
> 
> Gallery of my hard work:
> 
> 
> http://imgur.com/NDDVt
> 
> 
> http://www.3dmark.com/fs/8003239
> http://www.3dmark.com/3dm11/11107634
> http://www.3dmark.com/3dmv/5434031
> 
> Might as well update my OC as well.
> 
> 1255/1750 @ +200mv + 50% power fan maxed


Quote:


> Originally Posted by *battleaxe*
> 
> It worked. Must have been the fresh install. I ran with Tess on as I wasn't sure how to turn it off, and don't have time to figure it out. But its score. So there it is.


Well we took the lead with those! Lol


----------



## yuannan

Good job guys, we're in the lead by 200k!

Let's hope team green has no more people left to submit.


----------



## battleaxe

Just added Vantage x2 as well. I had forgotten how stupid Vantage is. LOL

Her knees are locked and cannot move, but the boobs sure jiggle all over the place. So stupid. LOL

Edit: I'm gonna sit on my single Vantage results until the last minute, just in case it's very close at the end.


----------



## jodybdesigns

GO TEAM RED!

Leading by 209,327


----------



## mus1mus

Wow. You guys!


----------



## jodybdesigns

Quote:


> Originally Posted by *mus1mus*
> 
> Wow. You guys!


Lol Do you know how hard it was to find the time to do the benchmarks I did? I am just glad to help.


----------



## mus1mus

Quote:


> Originally Posted by *jodybdesigns*
> 
> Lol Do you know how hard it was to find the time to do the benchmarks I did? I am just glad to help.


I know. I too may not be able to bench for another couple of days due to Easter holidays. I may not have enough time to put up quads. We'll see.

Still, our lead over the greens is small. We need more people joining.


----------



## navjack27

We need more and more quad-Crossfire benches; that's the place Nvidia can't even touch.

On topic for the 390X... I still can't figure out why my new 980 Ti can't even TOUCH my 390X in CS:GO. 390X > 980 Ti in DX9?


----------



## jodybdesigns

If I only still had this...


----------



## jdorje

Quote:


> Originally Posted by *navjack27*
> 
> on topic to a 390x... i still can't figure out why my new 980 ti can't even TOUCH my 390x in csgo. 390x > 980 ti in dx9 ?


CS:GO has to be CPU bottlenecked on those cards, right? Which would just mean more efficient drivers for that game.


----------



## jdorje

Sorry for the double post, but: what exactly should I turn off/down in the various 3DMark benches for this test? Tessellation? I see several tess options though.

http://www.3dmark.com/3dm/11385205?

Not doing a custom BIOS or Trixx, so I'm limited to 1325 mV.


----------



## fyzzz

Just turn it off completely like this:


----------



## jdorje

http://imgur.com/aBjKkvu


Here's my XFX VRM cooler. It actually splits, so there's a second similar part behind the power cable.

VRMs get up to 20°C higher than the core under load.

It really seems like a block of metal in there to connect the VRMs to the main heatsink would be better. If I remove the VRM heatsink, can I continue to use the Fujipoly thermal pads under it?

What kind of metal block should I try out there?

Is my heatsink nickel or aluminum? I know the base is copper. The fins must be aluminum, right?

Edit: just realized it's only fins there on the heatsink. Dammit. That pic also shows the screw connecting the heatsink to the backplate. I don't know if that's the only reason, but... the backplate gets incredibly hot. But without much surface area, just blowing a fan on it doesn't seem to make a big difference in cooling.


----------



## bluej511

Ill spend some time today and see what i can get. Without voltage im at 1125/1600 fully stable, i wonder if i can get 1200/1700. You guys think 25mv would be enough for that?


----------



## jdorje

Definitely not. Expect about 2 mV per MHz.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Definitely not. Expect 2 mV per mhz.


1200/1700 with +100mv and 50% power artifacts right away; 1200/1650 with +50mv and 50% seems totally stable. Going to do a run of Firestrike, no tess, and see what I end up with.


----------



## mus1mus

Quote:


> Originally Posted by *jdorje*
> 
> Definitely not. Expect 2 mV per mhz.


How did you come up with such a value?
Quote:


> Originally Posted by *bluej511*
> 
> 1200/1700 with 100mv and 50% power artifacts right away, 1200/1650 with 50mv and 50% seems totally stable, going to do a run of firestrike no tess see what i end up with.


So you're seeing artifacts with Memory Clock? If you have the headroom for more Voltage, the core may even go higher.

Stay at 1625 memory and clock the core higher.


----------



## jdorje

Quote:


> Originally Posted by *mus1mus*
> 
> How did you come up with such a value? .


A sample size of 1. I get 1035 MHz at 1125 mV (-100), 1090 MHz at 1225 mV (stock), and 1140 MHz at 1335 mV (+100). At one point I had a really nice graph of this showing very accurate quadratic scaling, but I must have destroyed it. Do you have data that disagrees substantially?

Also, I did all 3 tests: http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1230#post_25022024
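Since the voltage-versus-clock question keeps coming up, here's a tiny sketch built only from the three data points in the post above. The points and the ~2 mV/MHz rule of thumb come from the thread; treating the last segment's slope as a predictor is purely my own illustration, not a stability guarantee for any particular card.

```python
# Rough voltage/frequency scaling sketch from the three (MHz, mV) points above.
points = [(1035, 1125), (1090, 1225), (1140, 1335)]

# Slope (mV per MHz) between consecutive points. The rising slope is what
# suggests the super-linear (roughly quadratic) scaling mentioned in the post.
slopes = [(v2 - v1) / (f2 - f1)
          for (f1, v1), (f2, v2) in zip(points, points[1:])]
print(slopes)  # ~1.82 mV/MHz low in the range, 2.2 mV/MHz at the top

def mv_needed(target_mhz):
    """Linear extrapolation from the last segment -- a lower bound on the
    real requirement if scaling is actually quadratic. Ballpark only."""
    (f1, v1), (f2, v2) = points[-2], points[-1]
    slope = (v2 - v1) / (f2 - f1)
    return v2 + slope * (target_mhz - f2)

print(round(mv_needed(1200)))  # ~1467 mV -- why 1200 MHz is a stretch on this sample
```

Which also answers the earlier "+25mv for +75 MHz?" question: at ~2 mV/MHz you'd expect to need on the order of 150 mV, not 25.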


----------



## mus1mus

Quote:


> Originally Posted by *jdorje*
> 
> A sample size of 1. I get 1035 mhz at 1125 mV (-100), 1090 mhz at 1225 mV (stock), and 1140 mhz at 1335 mV (+100). At one point I had a really nice graph of this showing very accurate quadratic scaling, but I must have destroyed it. *Do you have data that disagrees substantially?
> *
> Also, I did all 3 tests: http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1230#post_25022024


Nope. I just pump it til it stabilizes. By that, I mean - getting it stable enough for some runs.









Nice scores there.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> How did you come up with such a value?
> So you're seeing artifacts with Memory Clock? If you have the headroom for more Voltage, the core may even go higher.
> 
> Stay at 1625 memory and clock the core higher.


So 1200/1650 at +100mv and 50% was totally stable in Firestrike; Heaven was stable even higher, oh well.

Here's the first screenshot. Even my VRMs stayed at 67/61 haha. Blowing a fan across it helped a lot. Can't wait to get my extra Noctuas as intakes on Monday; I'll be running a dozen fans, but silent. Let me know if it's a good enough screenshot for the competition.


----------



## mus1mus

I can't see the scores. But you only need the competition wallpaper and you're all good.

But since your artifact issue comes with memory, try to squeeze more core MHz. Another 50MHz maybe.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> I cant see the scores. But you only need the competition wallpaper and you're all good.


15,640 graphics score. I should prob import the image, might make it more readable.

It's prob the Elpida memory holding back the memory OC.


----------



## bluej511

Ok, here we go, that's better. One of three to come.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Ok here we go thats better, one of three to come.


Better.









Are you willing to try something?

I have been having some success with this on my cards. I am getting better scores when my clocks end with 47.

Can you try 1247/1647?


----------



## jdorje

Quote:


> Originally Posted by *mus1mus*
> 
> Nope. I just pump it til it stabilizes. By that, I mean - getting it stable enough for some runs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice scores there.


Some people have really high clocks that might indicate better than 1:2 scaling. But it might also just indicate a better starting point. On my card I know that 200 mV only gets me 105 MHz. Kinda makes that -100 mV profile really tempting (and if I did crossfire it actually would be).

My card does have a low stock voltage of 1225 mV, on which it goes up to 1090 MHz, which is above average I think. But it's hard to stabilize at higher voltage as the VRMs get too hot (the XFX cooler has segregated VRM cooling). I can at least pass Firestrike at 1325 mV and 1200 MHz. RAM is very nice + 1125 straps on it. Interestingly, Vantage and 3DMark11 both crashed at 1200 MHz, so I just got lazy and dropped those to 1150. God, those benches take forever to run.


----------



## Stige

Trailing by 45k pts, I was about to run my second PC but it says only one PC per GPU class








sadface.

I did do those 3DMark runs with daily clocks so I could improve a bit maybe but...


----------



## mus1mus

Vantage is picky. 3DM11 is more forgiving.

If the crash happens at the very start loading the bench on 1st Graphics test, just restart it and don't close the app. It will run and accept higher clocks.


----------



## bluej511

Here's my 3DM11. I'd better download Vantage fast and get my scores in haha.


----------



## ronaldoz

Is there a method to know what your max voltage is? I noticed some cutting sounds and a black screen, or stripes going from left to right. I noticed this happens at a specific offset voltage, and I don't like going there since it might be bad. But how bad is it? And does the effect I'm describing sound legit, or is your experience different?


----------



## bluej511

Ok, did a bit of cleaning up of wires and whatnot today, much cleaner. I'm waiting for Thermaltake to make their half window available; it would hide the PSU and everything else below the bottom. Might get one for each side, but not sure yet. If they don't make it available, I might just spray the plexi black on the bottom half.

Here's the back of the 390, and the rad side.


Here's the fins and the Enermax fan blowing through it. Needs some red cabling on that side for sure.


And here it is windowed. The thing sits on the floor hidden away anyways, so.


----------



## Noirgheos

Quote:


> Originally Posted by *mus1mus*
> 
> Better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are you willing to try something?
> 
> I have been having some success with this on my cards. I am getting better scores when my clocks end with 47.
> 
> Can you try 1247/1647?


I honestly feel like if someone does Firestrike without an i7 it's unfair. I'm sure he'd be in the higher 12xxx if he had one.


----------



## bluej511

Quote:


> Originally Posted by *Noirgheos*
> 
> I honestly feel like if someone does Firestrike without an i7 it's unfair. I'm sure he'd be in the higher 12xxx if he had one.


Yea, prob would. I'm OC'ed to 4.2 on all cores, could prob go even higher, but for the contest they're only taking graphics scores and mine are pretty high so it's all good.


----------



## patriotaki

anyone downloaded 16.3.1 beta driver?


----------



## Noirgheos

Quote:


> Originally Posted by *bluej511*
> 
> Yea prob would, im oced to 4.2 on all cores, could prob go even higher but for the contest they're only taking graphics scores and mine are pretty high so its all good.


Yeah, with my 4790K @ 4.4GHz, my 1200MHz 390X gets in the high 12000s. Most of the time above 12800. I pushed the core to 1240MHz and it completed the run with a score of 13xxx, but the artifacting...


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> anyone downloaded 16.3.1 beta driver?


Yup, no issues. Funny thing is, if I reset an Overdrive profile, Radeon stops responding lol. If I right click and open it back up again it's perfectly fine.


----------



## bluej511

Oh wait, a question, since you guys are all Hawaii gurus. I did the benchmarks by turning tessellation off since it's an AMD weakness :sniper: but do you guys also turn it off while gaming to get better fps, or do the games totally look like crap? I have it set to AMD optimized, and idk if for some of 'em in game I left it on or off.


----------



## Noirgheos

Quote:


> Originally Posted by *bluej511*
> 
> Oh wait so question since you guys are all Hawaii gurus. I did the benchmarks by turning tessellation off since its AMD weakness :sniper:but do you guys also turn it off while gaming to get better fps or the games totally look like crap? I have it set to amd optimized and idk if some of em in game i left em on or off.


AMD optimized. Some games need certain amounts so the geometry won't bug out. Witcher 3 needs a minimum of 16x, and honestly, 16x looks no different from 64x. AMD Optimized sets it to 16x for most games, not sure about when it sets it to 8x.

I just leave it at AMD Optimized in Global Settings.


----------



## mus1mus

For comparison and gaming, it's with Tess On. Even when benchmarking.

It's just this fanboy and Hwbot that I do Tess Off.


----------



## bluej511

Alright cool, good to know. My VRMs and core didn't get much hotter when running +100mv; they still stayed well under 70°C. Gonna play some AC Unity (hey, I am French after all) with it OC'ed and volted up and see what I can get for temps.
Quote:


> Originally Posted by *mus1mus*
> 
> For comparison and gaming, it's with Tess On. Even when benchmarking.
> 
> It's just this fanboy and Hwbot that I do Tess Off.


Tess on or amd optimized?


----------



## tolis626

So, I have been wondering... What if I put a piece of thermal pad on the back of the PCB, behind the VRMs, to try and connect them to the backplate? I suppose that would somewhat improve temps, although I believe I would need some extra thick padding, as that gap looks at least 3-4mm wide.

Thing is, there's this plastic film on the backplate, so I guess thermal transfer would be limited. Hmm...


----------



## mus1mus

AMD Optimized is AMD's own settings. It gives a bit of a performance boost over Use Application Settings.


----------



## Noirgheos

Quote:


> Originally Posted by *mus1mus*
> 
> AMD optimised is AMD's own settings. It has a bit of performance boost from Using Application Settings.


And usually looks no different.


----------



## ronaldoz

Quote:


> Originally Posted by *Noirgheos*
> 
> I honestly feel like if someone does Firestrike without an i7 it's unfair. I'm sure he'd be in the higher 12xxx if he had one.


Why would it not be fair? It's about the system, not only about the graphics.








Quote:


> Originally Posted by *bluej511*
> 
> Yea prob would, im oced to 4.2 on all cores, could prob go even higher but for the contest they're only taking graphics scores and mine are pretty high so its all good.


Your system looks nice. And you got a nice overclock! How many mV did you use in your overclocking utility? I would like to overclock again, but I don't feel confident about higher voltages yet. My 290 and 390 made my screen / speakers produce some weird 'cutting' noises at some point, resulting in some aggressive vertical stripes or a black screen. And I never see anyone mention such an 'artifact'.

In the end I overclock really carefully. Step by step > +10Mhz > full test. But this 'artifact' happens without any other artifact issues, so I won't overclock for now. My feeling is that it could result in damaging the card, and could even cause artifacting at stock speeds. That's just a thought, not sure about that.

Well, I just posted in the fanboy thread and now gonna play some games with my new R9 390X.


----------



## Noirgheos

Quote:


> Originally Posted by *ronaldoz*
> 
> Why would it not be fair? It's about the system, not only about the graphics.
> 
> 
> 
> 
> 
> 
> 
> 
> Your system looks nice. And you got a nice overclock! How much mV did you have use at your overclock utility? I would like to overclock again, but I don't feel confident about higher votages yet. My 290 and 390 made my screen / soundboxes make some weird 'cutting' noises at some point. Resulting in some agressive vertical stripes or black screen. And I never see anyone tell about such a 'artifact'
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In the end I overclock really carefully. Step by step > +10Mhz > full test. But this 'artifact' happen without having other artifact issues, so I won't overclock for now. For my feeling, it could result in damaging the card, and could even make artifacting at stock speeds. That's just a thought, not sure about that.
> 
> Well, I just posted in the fanboy thread and now gonna play some games with my new R9 390X.


It's not really unfair, just some people may jump to conclusions on the total score for comparisons...


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> For comparison and gaming, it's with Tess On. Even when benchmarking.
> 
> It's just this fanboy and Hwbot that I do Tess Off.


Quote:


> Originally Posted by *tolis626*
> 
> So, I have been wondering... What if I put a piece of thermal pad on the back of the PCB, behind the VRMs, to try and connect them to the backplate? I suppose that would somewhat improve temps. Although I believe I would need some extra thick padding as that gap looks at least 3-4mm wide.
> 
> Thing is, there's this plastic film on the backplate, so i guess thermal transfer would be limited. Hmm...


My Alphacool has that. And running at an average of 1.221V while OC'ed hardcore, it kept my VRM temps to 74 and 68. I didn't OC this much with the stock cooler, but I'd imagine it would be the same if not higher. Now granted, my case only has one intake fan until my Noctuas come in; then it should be quite a bit lower.


----------



## bluej511

Quote:


> Originally Posted by *ronaldoz*
> 
> Why would it not be fair? It's about the system, not only about the graphics.
> 
> 
> 
> 
> 
> 
> 
> 
> Your system looks nice. And you got a nice overclock! How much mV did you have use at your overclock utility? I would like to overclock again, but I don't feel confident about higher votages yet. My 290 and 390 made my screen / soundboxes make some weird 'cutting' noises at some point. Resulting in some agressive vertical stripes or black screen. And I never see anyone tell about such a 'artifact'
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In the end I overclock really carefully. Step by step > +10Mhz > full test. But this 'artifact' happen without having other artifact issues, so I won't overclock for now. For my feeling, it could result in damaging the card, and could even make artifacting at stock speeds. That's just a thought, not sure about that.
> 
> Well, I just posted in the fanboy thread and now gonna play some games with my new R9 390X.


I went straight to +100mv and 50% power limit, which netted me 1200/1650. I just played AC Unity and it was totally stable, no artifacts, no nothing. I won't run it again till my fans come in, as this was the hottest I've ever seen my VRMs, even though 74 for some isn't that hot. The fan I've got blowing on it is only 900rpm and isn't high flowing. It's ghetto rigged but it works; I can imagine with some Fujipolys and a stronger fan I could prob get below 60.


----------



## ronaldoz

Quote:


> Originally Posted by *bluej511*
> 
> I went straight +100mv and 50% power limit, netted me 1200/1650. i just played AC Unity and it was totally stable, no artificats no nothing. I wont run it again till my fans come in as this was the hottest ive ever seen my VRMs even though 74 for some isnt that hot. The fan ive got blowing on it is only 900rpms and isnt high flowing. Its ghetto rigged but it works, i can imagine with some Fujipolys and a stronger fan i could prob get below 60.


1200 is really good for this card I guess, so that's nice man. Will you replace the thermal pads for Fujipoly ones?


----------



## bluej511

Quote:


> Originally Posted by *ronaldoz*
> 
> 1200 is really good for this card I guess, so that's nice man. Will you replace the thermal pads for Fujipoly ones?


Eventually I will, but they are quite expensive. I might just go with the 11 W/mK ones and just put them over the VRMs. The back is 3mm and I can't find Fujis that thick, so they will only have to go on the PCB heatsink. I don't feel like taking it apart just yet; I could prob do it without draining the system and just leaving the heatsink in the case, but who knows. Gonna wait and see till I get my new fans. Right now I've only got one Noctua NF-14 blowing into the case. I will be adding 2 more and see what temps I end up with; if need be, I'll do 4 up top. Right now I've got 10 fans, and with 2 more it will be 12, still totally silent. Might add 2 more decent ones as push for my front rad, as it's only in pull right now. My fan controller can take 6 channels at 30W a piece, so no problem controlling 'em.


----------



## ronaldoz

Quote:


> Originally Posted by *bluej511*
> 
> Eventually i will but they are quite expensive. I might just go with the 11mw ones and just put it over the VRMs. The back is 3mm and can't find fujis that thick so will only have to go on the pcb heatsink. I dont feel like taking it apart just yet, i could prob do it without draining the system and just leaving the heatsink in the case but who knows. Gonna wait to see till i get my new fans. Right now ive only got one Noctua NF-14 blowing into the case. I will be adding 2 more and see what temps i end up with, if need be will do 4 up top. Right now ive got 10fans with 2 more wil be 12 and still totally silent. Might add 2 more decent ones as a push for my front rad as its only in pull right now. My fan controller can take 6 channels 30W a piece so no problem controlling em.


Sound good! I'm curious what results you will get by that.

And uhm, I gotta bring the 390X back, because 1 fan is making an annoying sound. But I will first order a new one, and then send this one back after the new one is here. No big deal, it can happen.


----------



## Minusorange

Guess I can join the club now as my 390 came today

Really impressed with the fancy packaging from Asus



The build quality of the card itself looks pretty decent too





Those heatsink pipes are huge

Definitely happy with my "up(Side?)"grade from my 290 Tri-x

Don't have GPU-Z so hopefully you'll allow this as my verification


----------



## Agent Smith1984

Quote:


> Originally Posted by *Minusorange*
> 
> Guess I can join the club now as my 390 came today
> 
> Really impressed with the fancy packaging from Asus
> 
> 
> 
> The build quality of the card itself looks pretty decent too
> 
> 
> 
> 
> 
> Those heatsink pipes are huge
> 
> Definitely happy with my "up(Side?)"grade from my 290 Tri-x
> 
> Don't have GPU-Z so hopefully you'll allow this as my verification


I'll be honest with you man. You're going to miss that tri-x cooling. All of the Asus cards run really hot....

Still a good card, just won't run very cool, and may not clock well because of that.

I had one at one point, and it certainly looks good!


----------



## battleaxe

I have to say I am super happy with these XFX 390X's.

I just did the Fanboy comparo and was able to bench at 1220 core on both cards at the same time. (I know, not as good as some 290's.) I didn't bother trying for more, so I can imagine they would do better. Zero artifacts too.

Very happy. My first card will do 1270 core/1750 RAM, but the second will not, so I didn't bother to try harder on them.

Finally, got Tess turned off too. But strangely the score went down on 3DMark11. Anyone know why?

And Mus1, I beat your FS score!!









But you killed me on the other ones.










Spoiler: Warning: Spoiler!


----------



## navjack27

Quote:


> Originally Posted by *jdorje*
> 
> cs:go has to be cpu bottlenecked on those cards right? Which just means more efficient drivers for that game.


but i didn't change cpus. only the gpu. i'm still running strong with my 5775c which is overclocked. now i see why /globaloffensive/ has threads with 980 ti owners complaining. its not BAD but its not what you'd expect from such a beast.


----------



## jodybdesigns

Had a chance this morning to update my Firestrike and 3Dmark11 scores with Tess off. Pretty good jump there. If I have time I will run Vantage but it just takes too long.


----------



## yuannan

In the case that I messed up my GPU (pumped too much juice through it and clocked it too high), is "requiring more voltage for the same old clock" a symptom of that?


----------



## ronaldoz

Quote:


> Originally Posted by *yuannan*
> 
> In the case that I messed up my GPU, that I pumped too much juice through it and I clocked it too high,
> 
> is "requiring more voltage for the same old clock" a symptom of such?


Did you get specific issues at the high overclock, or did you not experience any of that?


----------



## Minusorange

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'll be honest with you man. You're going to miss that tri-x cooling. All of the Asus cards run really hot....


It was the cooling that failed on my 290; the fan over the VRMs near the video ports literally fell off one day, which is why I've had to get the 390.


----------



## ronaldoz

PS. I would like to peel the sticker off so I could oil the bearing, but I'm like 100% sure it will void the warranty, or something like that. Or I wouldn't get 100% of my money back. But I really wanna do it, just to fix the bearing sounds.


----------



## yuannan

Quote:


> Originally Posted by *ronaldoz*
> 
> Did you got specific issues at the high overclock or did you not experienced any of that?


Artifacts as it normally does, nothing really unusual; the biggest one was just a system freeze, and one time the screen split in two and went green and stuff.
Nothing a reboot didn't fix. Today when I stressed it, the game was stuttering and artifacting, then my PC froze.

Upon reboot I bumped the voltage back up to stock (as I underclock for noise); it is fine as far as I can tell.


----------



## bluej511

Damn boys, we're leading by 200,000 now.


----------



## danielwhitt

I've just set up my Skylake rig with two MSI R9 390s in crossfire, the 8G Gaming cards. Holy crap, you need some good ventilation on those, but holy crap they are surprisingly quick in crossfire.


----------



## bluej511

Quote:


> Originally Posted by *danielwhitt*
> 
> I've just setup IP my skylake rig with two MSI r9 390s in crossfire, 8g gaming cards, holy crap you need some good ventilation on those, but holy crap they are surprisingly quick in crossfire.


Yea, the MSIs don't cool very well. Time to watercool 'em, EK makes waterblocks haha.


----------



## navjack27

thing is, you still have heat in your computer room.


----------



## danielwhitt




----------



## bluej511

Yup, you'll always have heat. Linus did a test of watercooled and non-watercooled and both rooms were the same temp haha. I find that kinda hard to believe, as an air cooled card will blow air out at around 70C and a radiator will only blow air out that's 35C, but go figure. Me personally, I've noticed it hotter sitting next to my case when it was air cooled.


----------



## danielwhitt

Quote:


> Originally Posted by *bluej511*
> 
> Yea the msi don't cool very well. Time to watercool em ek makes waterblocks haha.


The Mrs would kill me at the moment with how much I've spent on the PC, so they will have to wait for a bit. Not gonna overclock them til I get a 4K monitor; a tad overkill for 1080p lol.


----------



## bluej511

Haha i totalled mine up its a tiny bit over 2000€. That doesn't include the keyboard and mouse either. Planning on getting a 21:9 screen just waiting for the new LG 29" curved one to come out to see the price.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> Haha i totalled mine up its a tiny bit over 2000€. That doesn't include the keyboard and mouse either. Planning on getting a 21:9 screen just waiting for the new LG 29" curved one to come out to see the price.


The new LG ones only have FreeSync from 55-75, a very small range that is near negligible. I got a 29" Samsung one after I returned my LG one and I'm very happy with it; one complaint is the motion blur on hard black blocks, but that is it.

You should wait it out after computex to get a monitor as they drop good stuff most of the time.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> The new LG ones only have freesync form 55-75, a very small range that is near negligible. I got a 29" samsung one after I returned my LG one and I'm very happy with it, one complaint is the motion blur on hard black blocks, but that is it.
> 
> You should wait it out after computex to get a monitor as they drop good stuff most of the time.


The LG 29um67 that ships now has a range of 40-75, and the new 29uc88 curved has the same range as well. The 55-75 you're thinking of is if you use HDMI, and that's the v-sync range, not FreeSync. A simple driver update from LG lowers the FreeSync floor from 48 to 40. There's a few .ini files as well that can drop it to 32-25 without issues.
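On why pushing the floor of the FreeSync range down matters: AMD's Low Framerate Compensation (frame doubling below the range) is commonly said to require the panel's max refresh to be roughly 2.5x the min. A quick sketch using the ranges mentioned above; the 2.5x factor is the commonly cited figure rather than something from this thread, and using 32 Hz as the tweaked floor is my reading of the post:

```python
# Check which FreeSync ranges would qualify for Low Framerate Compensation
# (LFC), assuming the commonly cited rule: max refresh >= ~2.5x min refresh.
LFC_FACTOR = 2.5

def lfc_capable(lo_hz, hi_hz):
    """True if the variable refresh range is wide enough for LFC."""
    return hi_hz >= LFC_FACTOR * lo_hz

def floor_needed(hi_hz):
    """Highest min-refresh floor that still allows LFC for a given max."""
    return hi_hz / LFC_FACTOR

for name, (lo, hi) in {"DP stock 40-75": (40, 75),
                       "HDMI 55-75": (55, 75),
                       "tweaked 32-75": (32, 75)}.items():
    print(name, "LFC:", lfc_capable(lo, hi))

print(floor_needed(75))  # 30.0 -- a 75 Hz panel's floor must reach 30 Hz
```

So even the tweaked 32 Hz floor would fall just short of LFC on a 75 Hz panel under this rule; below the floor you'd still get plain v-sync behavior.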


----------



## n3o611

Hey guys,

today I took more time to OC my card even further, and I found that a 1140 core clock and 1700 memory clock is the sweet spot for my card, where it performs perfectly without any additional core voltage.

Overclock1140.jpg 241k .jpg file


Here my 3d Mark result: http://www.3dmark.com/3dm/11392149 (1135 Core Clock, was lazy to do another one)



http://imgur.com/zkoSZjX


Is it worth giving the card more voltage to go even further? With +0mV, a 1140 core clock is my current maximum; if I just go to 1150 it starts to artifact. I didn't try much more with the memory clock, since I haven't seen much improvement from that.

Also: how does my card perform? I don't see many Sapphire R9 390s being overclocked to compare against.

Thanks!


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> The lg 29um67 that ships now has a range of 40-75, the new 29uc88 curved has the same range as well. The 55-75 your thinking of is if you use hdmi and thats the v sync range not freesync. A simple driver update from lg lowers the freesync from 48 to 40. Theres a few .ini files as well that can drop it to 32-25 without issues.


Hmmm, might sell my monitor and pick the LG one up if I have some extra £££ lying around.


----------



## mus1mus

Quote:


> Originally Posted by *danielwhitt*
> 
> The Mrs would kill me at the moment with how much I've spent on the pc, si they will have to wait for a bit, not gonna over clock them til I get a 4k monitor, a tad overkill for 1080p lol


Please. Help us beat those green blooded rich kids.









http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1250_50#post_25023024

We need your dual cards scores. Please


----------



## jodybdesigns

Quote:


> Originally Posted by *mus1mus*
> 
> Please. Help us beat those green blooded rich kids.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1250_50#post_25023024
> 
> We need your dual cards scores. Please


They are down 232k points. I could sign my wife up for Overclock.net and bench my wife's computer for maybe another 40k, but I feel like I would be cheating so bad lol

The guys in there with the 980 Ti's have already pushed them practically as far as they can (they are talking about putting them on water and buying more cards and returning them, for God's sake lol). They MIGHT get a 5k increase on their total score each if they are lucky. And those are only the active ones. I think we might have them on this one.


----------



## mus1mus

Nope. Not til it's over. We even need some quad guys to chime in. We need all members we can gather to pound them for good.









Yep, I saw them talk of that too. Silly boys.


----------



## Noirgheos

Quote:


> Originally Posted by *mus1mus*
> 
> Nope. Not til it's over. We even need some quad guys to chime in. We need all members we can gather to pound them for good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yep, I saw them talk of that too. Silly boys.


It really is an issue isn't it? I just had to educate my older sister. "AMD gets hot and they have ****ty drivers."

Will it ever end?


----------



## ronaldoz

Got this on stock (no overclock) with the R9 390X. Not bad I guess. I remember having 1545 with 1140 / 1580 with the R9 390.


----------



## mus1mus

BTW, good job @battleaxe. But my Firestrike scores suck to begin with.









I may give them another spin. But priority now is to improve 3X and start doing the 4X.


----------



## Minusorange

70 degrees with 60% fan speed on my Strix

My system has always been warm because my room always is, so I'm okay with this. 60% seems to be the sweet spot for cooling vs noise, as it gets rather loud past 60%. Definitely quieter than my Tri-X was at 60%, but about 5 degrees hotter; I can live with it. It just means I can't really attempt any OC'ing past the stock OC, as I'm not a fan of going into the 80 degree range, even though I know the cards are rated to go that hot.


----------



## jdorje

Quote:


> Originally Posted by *Minusorange*
> 
> 70 degrees with 60% fan speed on my Strix
> 
> My system has always been warm because my room is always so I'm okay with this, 60% seems to be the sweetspot for cooling vs noise as it gets rather loud post 60%. Definitely more quiet than my Tri-x was at 60% but about 5 degrees hotter but I can live with it, just means I can't really attempt any OC'ing past the stock OC as I'm not of fan of going into the 80 degree range even though I know the cards are rated to go that hot


Vrm temps are far more limiting on most of these cards. Core is easy to keep under 80 but vrms may still be at 100.


----------



## Noirgheos

Quote:


> Originally Posted by *ronaldoz*
> 
> Got this on stock (no overclock) with the R9 390X. Not bad I guess. I remember having 1545 with 1140 / 1580 with the R9 390.


Run it fullscreen. I got 64FPS with my Fury with the exact same settings...


----------



## gupsterg

*1x special offer*

Whomever submits a new entry to 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course) I will offer ROM mod service over and above what HawaiiReader does (ie multi state VDDCI / VDDC Limit / LLC / fSW) aka "The Works".

*Only 2 conditions:-*

i) do an entry.

ii) ROM mod done after entry, within my own time constraints (which usually is not a long wait).


----------



## jodybdesigns

Quote:


> Originally Posted by *gupsterg*
>
> *1x special offer*
>
> Whomever submits a new entry to the 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course) I will offer a ROM mod service over and above what HawaiiReader does (i.e. multi-state VDDCI / VDDC Limit / LLC / fSW), aka "The Works".
>
> *Only 2 conditions:-*
>
> i) Do an entry.
>
> ii) The ROM mod is done after the entry, within my own time constraints (which is usually not a long wait).


I should have waited. I would love to see what this PCS+ can do, and I'm too paranoid to do any BIOS tweaking lol


----------



## gupsterg

No worries, I will make a special exception for a participant.

PM me for what you require.


----------



## n3o611

Quote:


> Originally Posted by *gupsterg*
>
> *1x special offer*
>
> Whomever submits a new entry to the 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course) I will offer a ROM mod service over and above what HawaiiReader does (i.e. multi-state VDDCI / VDDC Limit / LLC / fSW), aka "The Works".
>
> *Only 2 conditions:-*
>
> i) Do an entry.
>
> ii) The ROM mod is done after the entry, within my own time constraints (which is usually not a long wait).


I'm not sure what it is, but it sounds nice!

About the competition: I've got a Sapphire R9 390 Nitro OC at 1140/1700... I hope that helps. Does the CPU matter much in the competition? I can get my 6600K to 4.9GHz for that time.


----------



## mus1mus

Nope, purely GPU. But Vantage seems to scale with CPU speed. So yeah.


----------



## gupsterg

Quote:


> Originally Posted by *n3o611*
>
> I'm not sure what it is, but it sounds nice!
>
> About the competition: I've got a Sapphire R9 390 Nitro OC at 1140/1700... I hope that helps. Does the CPU matter much in the competition? I can get my 6600K to 4.9GHz for that time.


That's gonna be good; you do an entry and my time is yours.


----------



## ronaldoz

Quote:


> Originally Posted by *Noirgheos*
> 
> Run it fullscreen. I got 64FPS with my Fury with the exact same settings...


Nice, it helps a bit:


----------



## Noirgheos

Quote:


> Originally Posted by *ronaldoz*
> 
> Nice, it helps a bit:


That is ****ed up. I'm so happy I went to a 390X and pocketed the cash.


----------



## ronaldoz

Quote:


> Originally Posted by *Noirgheos*
> 
> That is ****ed up. I'm so happy I went to a 390X and pocketed the cash.


What do you mean? Did you sell your Fury or something?


----------



## christoph

Sorry, can't help you guys; I'm limited by my CPU, and besides, I haven't tried to OC my GPU yet.


----------



## jdorje

Quote:


> Originally Posted by *christoph*
>
> Sorry, can't help you guys; I'm limited by my CPU, and besides, I haven't tried to OC my GPU yet.


Overclocking and CPU power make very little difference to your contribution. Just run the 3 benchmarks and screenshot them with CPU-Z/GPU-Z open and your desktop background showing.

Edit: just to be clear, yes, overclocking will raise your score. But an OC is only 10-15% more, so a stock score still counts nearly as much. The competition is more of a head count than anything else. Though obviously a 390 will count roughly double what a 270 does.


----------



## ronaldoz

Quote:


> Originally Posted by *christoph*
>
> Sorry, can't help you guys; I'm limited by my CPU, and besides, I haven't tried to OC my GPU yet.


For that competition every score counts. It doesn't matter how high it is; they're all added together, so it's not about the best average score or anything.


----------



## jdorje

What's with the strange custom runs of Heaven? It's just the preset, only changed to 1080p? But what about us 1440p users?

Anyway, this is with a 390 at +100mV (1325 mV), a gaming-stable overclock: 1150 core, 1740 RAM, 1125 timing straps (highly recommended).

The first time I ran it I apparently still had tessellation off and got 80 fps with no complaints from Heaven. So I guess it's easy to game the system?


----------



## Minusorange

Quote:


> Originally Posted by *jdorje*
> 
> Vrm temps are far more limiting on most of these cards. Core is easy to keep under 80 but vrms may still be at 100.


I've been monitoring them under load and they're around 65 at the moment; they peaked at 82 while the core was at 83, before I set my fan profile up. Surprised they're actually running that cool after what you've said.


----------



## n3o611

Might be a dumb question, but for the competition i need the advanced version, right?


----------



## dagget3450

Quote:


> Originally Posted by *n3o611*
> 
> Might be a dumb question, but for the competition i need the advanced version, right?


You can use the demo on all of them, just make sure to link the url to your score


----------



## ronaldoz

Quote:


> Originally Posted by *n3o611*
> 
> Might be a dumb question, but for the competition i need the advanced version, right?


No, the basic version works. Advanced just allows you to skip the demo or run specific tests only.


----------



## bluej511

Quote:


> Originally Posted by *Minusorange*
> 
> I've been monitoring them under load and they're around 65 at the moment, peaked at 82 while core was at 83 before I set my fan profile up, surprised they're actually running that cool after what you've said


Yeah, mine stay around the 60s, a little above or below. Can't wait to redo the card with better thermal pads at some point. Thinking of delidding my 4690K as well; EK makes special shorter screws for like 4€, worth a shot. I've heard of people dropping up to 20°C from doing that.


----------



## Noirgheos

Quote:


> Originally Posted by *ronaldoz*
> 
> What do you mean? Did you sell your Fury or something?


Yeah. Sold it and grabbed a 390X Nitro.


----------



## ronaldoz

Quote:


> Originally Posted by *Noirgheos*
> 
> Yeah. Sold it and grabbed a 390X Nitro.


Ah cool, got the same.

How's your overclocking going? Something different: I tested what speeds you need to get the same scores.

1185 / 1750 with *390*

1160 / 1625 with *390X*


----------



## jdorje

I wish to challenge any single-390 owner to beat my fanboy score of 77,280. I did it with a lowly XFX 8256 model, the cheapest possible 390, so it shouldn't be hard to top with an MSI or Nitro. That's 49531/2 + 22340 + 15087*2. The formula is half your Vantage, plus your 3DMark11, plus twice your Fire Strike score; graphics scores only.

Tips:

- Mod your BIOS with lower timing straps.
- Push up voltage, obviously. I used 1225+100. Higher voltage helps, up to a point.
- If you have a 144 Hz monitor, set it to 60 Hz.
- You can probably pass Fire Strike (at least) at 50 MHz higher than is actually stable. As long as there's no black screen, artifacts won't hurt anything. I passed Fire Strike at 1200, but the other two crashed, so I just dropped it back to 1150 for them.
- Disable tessellation in your AMD settings (open Crimson, gaming settings, just override it).
- Lock fans at 100% and remove your side case panel. Fire Strike is very stressful but doesn't get that hot, since there's so much downtime during loading, but you don't want the fans to drop in speed during the load screens while your VRMs are probably still hot.
- I used a 4690K at my everyday 4.7 GHz overclock. CPU single-core speed might make a little bit of difference for these.

If anyone does break it, I'd obviously like a heads up so I can retest. I have a bit more voltage headroom and another 100 MHz on my CPU.

Edit: you're welcome to try out my modded BIOS, linked here. The max_oc BIOS includes a 1350 mV base, ~100 higher than stock voltages, so it's a good base for benching runs. It includes a 1160 base clock and 1740 memory, which could cause problems for some cards but is easy to edit. It also has 1126-1250 straps and a 238W base power limit. Just make sure you are prepared to unbrick the card by using another GPU (iGPU or dual BIOS) to reflash your original BIOS.

Another edit: the highest 390x score is 86,600, by yuannan. Aim for that if you've got a 390x. The highest 290/290x scores are higher and will be pretty hard to beat.
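For anyone tallying their own entry, the scoring arithmetic quoted above (half Vantage, plus 3DMark11, plus twice Fire Strike, graphics scores only) can be sketched in a few lines of Python; the function name here is just illustrative:

```python
# Sketch of the fanboy-score formula described above:
# half the Vantage graphics score, plus the 3DMark11 graphics score,
# plus twice the Fire Strike graphics score.
def fanboy_score(vantage_gfx, dmark11_gfx, firestrike_gfx):
    return vantage_gfx / 2 + dmark11_gfx + 2 * firestrike_gfx

# The quoted run above: 49531/2 + 22340 + 15087*2
print(round(fanboy_score(49531, 22340, 15087)))  # → 77280
```

Plugging your three graphics scores in gives the number to post in the competition thread.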


----------



## kizwan

Quote:


> Originally Posted by *bluej511*
>
> Yup, you'll always have heat. Linus did a test of watercooled and non-watercooled rigs and both rooms were the same temp, haha. I find that kinda hard to believe, as an air-cooled card will blow out air at around 70°C and a radiator will only blow out air that's 35°C, but go figure. Personally I've noticed it's hotter sitting next to my case when it was air cooled.


The card's power consumption doesn't change when you watercool, so the amount of heat generated is almost the same. A watercooled card generally runs at lower temps, and the VRM especially works more efficiently when running cooler, so slightly less heat is generated, but the reduction is too small for you to see any difference in room temp. That's why room temp doesn't change.

Regarding feeling hotter when air cooled: it depends on your proximity to the location where the heat is dumped. An air-cooled card either dumps heat out the back of the case (blower-type cooler) or inside the case. Almost all 390/390X cards, except the ones with blower-type coolers, dump heat into the case. That's why you feel hotter sitting near the case when it was air cooled. In my case the back of the case points toward the wall, and when my card used the stock blower cooler I felt hotter too, because I sit near that spot.
Quote:


> Originally Posted by *jdorje*
> 
> 
> 
> What's with the strange custom runs of heaven? It's just like the preset only changed to 1080? But what about us 1440 users?
> 
> Anyway this is with a 390 +100 (1325 mV) gaming-stable overclock. 1150 core, 1740 ram, 1125 timing straps (highly recommended).
> 
> The first time I ran it I apparently had tesselation still off and got 80 fps with no complaints from heaven. So I guess it's easy to game the system?


With Heaven you'll need to run it with custom settings. Basically: your resolution, 8xAA, Preset: Custom, Quality: Ultra, Tessellation: Extreme. Except for 4K; there's a thread here at OCN that only requires 4xAA there. Wait... that's the Valley thread. I'm assuming the same for Heaven... too lazy to check.


----------



## bluej511

Quote:


> Originally Posted by *kizwan*
>
> The card's power consumption doesn't change when you watercool, so the amount of heat generated is almost the same. A watercooled card generally runs at lower temps, and the VRM especially works more efficiently when running cooler, so slightly less heat is generated, but the reduction is too small for you to see any difference in room temp. That's why room temp doesn't change.
>
> Regarding feeling hotter when air cooled: it depends on your proximity to the location where the heat is dumped. An air-cooled card either dumps heat out the back of the case (blower-type cooler) or inside the case. Almost all 390/390X cards, except the ones with blower-type coolers, dump heat into the case. That's why you feel hotter sitting near the case when it was air cooled. In my case the back of the case points toward the wall, and when my card used the stock blower cooler I felt hotter too, because I sit near that spot.
> With Heaven you'll need to run it with custom settings. Basically: your resolution, 8xAA, Preset: Custom, Quality: Ultra, Tessellation: Extreme. Except for 4K; there's a thread here at OCN that only requires 4xAA there. Wait... that's the Valley thread. I'm assuming the same for Heaven... too lazy to check.


Power consumption doesn't change much; any difference is minimal as the card runs cooler. I lock my fps anyway so it doesn't draw much. GTA V, AC Syndicate and Unity won't even reach 60fps maxed out, so they're the power guzzlers. Little teaser of Dirt Rally: open it in original size then press F11; maxed out it barely stresses the R9 390.


----------



## christoph

Quote:


> Originally Posted by *jdorje*
> 
> Overclocking and cpu power make very little difference to your contribution. Just run the 3 benchmarks and screenshot them with cpuz/cpuz/gpuz and using the desktop background.
> 
> Edit: just to be clear, yeah overclocking will raise your score. But an OC is only 10-15% more, so a stock score will still count nearly as much. The competition is more like a head count than anything else. Though obviously a 390 will count ~double what a 270 does.


Quote:


> Originally Posted by *ronaldoz*
> 
> For that competition every score will count. It does not matter how much it is. It will be counted together, so it's not what the best avarage score is or something.


Ohhhhhhhhh, I thought it was an overclocking competition.

Alright, then what's the link to it? Which bench should I run?


----------



## n3o611

This is the Competition: http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd

Can anybody help me? Why does my 3DMark Vantage result say "TIME MEASURING INACCURATE."? It's only on Vantage.
Link: http://www.3dmark.com/3dmv/5434518
Screenshot:


----------



## jdorje

Mine says that too on Vantage. And the moderator asked another guy from here to redo it because of it. Don't know why though. Will it say that if tessellation is enabled?


----------



## Mister300

Here is my link to benchmark with my gamer tag.

http://www.3dmark.com/fs/8011832


----------



## n3o611

Quote:


> Originally Posted by *jdorje*
> 
> Mine says that too on Vantage. And the moderator asked another guy from here to redo it because of it. Don't know why though. Will it say that if tessellation is enabled?


I have redone the test, and now it's gone. Weird bug -> http://www.3dmark.com/3dmv/5434536


----------



## jdorje

So you beat my 390 score eh? That didn't take long.

Scores to beat now are 390X - 86,600 and 390 - 78,425.

I think the 6600k is the preferred CPU for this thing.


----------



## diggiddi

Quote:


> Originally Posted by *n3o611*
> 
> I have redone the test, and now its gone. weird bug -> http://www.3dmark.com/3dmv/5434536


Try syncing your clock with internet time whenever you see that


----------



## christoph

should I run the 3 benches? or just one?


----------



## diggiddi

All 3, the points add up


----------



## christoph

ok, downloading right now


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> I wish to challenge any single-390 owner to beat my fanboy score of 77,280. I did it with a lowly XFX 8256 model, the cheapest possible 390, so it shouldn't be hard to top with an MSI or Nitro. That's 49531/2+22340+15087*2. The formula is half your vantage plus your 3dmark11 plus twice your firestrike score - graphics scores only.
> 
> Tips:
> 
> Mod your bios with lower timing straps.
> Push up voltage obviously. I used 1225+100. Higher voltage helps, up to a point.
> If you have a 144 hz monitor, set it to 60 hz.
> You can probably pass firestrike (at least) with 50 mhz higher than is actually stable. As long as there's no blackscreen, artifacts won't hurt anything. I passed firestrike at 1200, but the other two crashed so I just dropped it back to 1150 for them.
> Disable tessellation in your AMD settings (open crimson, gaming settings, just override it).
> Lock fans at 100% and remove your side case panel. Firestrike is very stressful but doesn't get that hot - since there's so much downtime during loading - but you don't want fans to drop in speed during the load screens since your VRMs are probably still hot.
> I used a 4690k at my everyday 4.7 ghz overclock. CPU single-score speed might make a little bit of difference for these.
> If anyone does break it, I'd obviously like a heads up so I can retest. I have a bit more voltage headroom and another 100 mhz on my CPU.
> 
> Edit: you're welcome to try out my mod'd bios, linked here. The max_oc bios includes a 1350 mV base, ~100 higher than stock voltages, so is a good base for benching runs. Includes 1160 base clock, 1740 memory, which could provide problems for some but is easy to edit. Also has 1126-1250 straps and a 238W base power limit. Just make sure you are prepared to unbrick the card by using another gpu (igpu or dual bios) to reflash your original bios.
> 
> Another edit: the highest 390x score is 86,600, by yuannan. Aim for that if you've got a 390x. The highest 290/290x scores are higher and will be pretty hard to beat.


Only 15k in Fire Strike, even with no tessellation? I got 14,800 with tessellation and no real tweaks.


----------



## ronaldoz

Quote:


> Originally Posted by *Stige*
>
> Only 15k in Fire Strike, even with no tessellation? I got 14,800 with tessellation and no real tweaks.


If you're using the default Fire Strike settings, you've probably got your core clock above 1200 (maybe 1250+) and/or have a 390X?


----------



## Stige

Quote:


> Originally Posted by *ronaldoz*
>
> If you're using the default Fire Strike settings, you've probably got your core clock above 1200 (maybe 1250+) and/or have a 390X?


http://www.3dmark.com/fs/7757771 Just a regular 390.

The core/memory clocks are off for some reason, maybe due to BIOS modding? Same reason it says invalid driver, I think. The clocks were probably 1225/1650 or something. I have tightened the timings a notch in the BIOS, but I should tighten them some more; most people seem to run the tightest timings on their cards without issues.


----------



## ronaldoz

Quote:


> Originally Posted by *Stige*
> 
> http://www.3dmark.com/fs/7757771 Just a regular 390.
> 
> The core/memory clocks are off for some reason, maybe due to bios modding? Same reason it says invalid driver I think. I think the clocks might have been 1225/1650 or something, I have tightened the timings a notch in BIOS but should tighten them some more I think as most people seem to run the most tight timings on their cards without issues.


Ah, not bad. I've seen some 390s at 15,000, but they modified the test. At 1225 and a bit tighter timings you might get there.


----------



## Noirgheos

Quote:


> Originally Posted by *ronaldoz*
> 
> Ah cool. Got the same.
> 
> 
> 
> 
> 
> 
> 
> How are you doing at overclocking? Something different, I tested what speeds you need to get the same scores.
> 
> 1185 / 1750 with *390*
> 
> 
> 1160 / 1625 with *390X*


Yeah I'm running 1200MHz and currently beating my Fury by a bit.


----------



## patriotaki

why does heaven say the 390 has 4gb of vram?


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> http://www.3dmark.com/fs/7757771 Just a regular 390.
> 
> The core/memory clocks are off for some reason, maybe due to bios modding? Same reason it says invalid driver I think. I think the clocks might have been 1225/1650 or something, I have tightened the timings a notch in BIOS but should tighten them some more I think as most people seem to run the most tight timings on their cards without issues.


Futuremark will only validate WHQL-approved drivers. The last WHQL-approved driver for AMD was 15.12, I think. None of the recent ones I've seen are.


----------



## Minusorange

Quote:


> Originally Posted by *bluej511*
> 
> Yea mine stay around the 60s lil above lil below. Can;t wait to redo the card with better thermal pads at some point. Thinking of delidding my 4690k as well ek makes special shorter screws for like 4€ worth a shot. Ive heard people dropping up to 20°C from doing that.


All going okay so far apart from this

Surely it's just HWiNFO reporting incorrectly?

171 thousand volts is impossible, right?

Maybe I should update HWiNFO, as I'm running an old version.


----------



## tolis626

Quote:


> Originally Posted by *Minusorange*
> 
> All going okay so far apart from this
> 
> Surely it's just HWinfo reporting incorrectly ?
> 
> 
> 
> 171 thousand volts is impossible right ?
> 
> Maybe I should update Hwinfo as I'm running an old version


Yup, that's a glitch. Maybe you were running another monitoring app at the same time and they messed with each other.

If you wanna laugh about wrongly reported values, just enable monitoring in Valley. Mine usually reports a temperature anywhere from 50000C to 250000C. These PCBs must be really tough to not melt under such high temps. I may try to overclock some more and achieve nuclear fusion in the future.


----------



## Minusorange

Quote:


> Originally Posted by *tolis626*
> 
> Yup, that's a glitch. Maybe you were running another monitoring app at the same time and they messed with each other.
> 
> If you wanna laugh about wrongly reported values, just enable monitoring in Valley. Mine usually reports a temperature anywhere from 50000C to 250000C. These PCBs must be really tough to not melt under such high temps. I may try to overclock some more and achieve nuclear fusion in the future.


Haha, thought as much. Just updated HWiNFO and the readings are more realistic now.


----------



## Vellinious

Like this. lol


----------



## tolis626

Quote:


> Originally Posted by *Vellinious*
> 
> Like this. lol


Damn, nuclear fusion right there.









I started messing around with benchmarking for the competition. Is there any point in pushing the last 100MHz on my 4790k or will 4.8GHz not make that much of a difference over 4.7GHz? I'd guess it's gonna be fine either way as we're taking only graphics scores into account.


----------



## Vellinious

Quote:


> Originally Posted by *tolis626*
> 
> Damn, nuclear fusion right there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I started messing around with benchmarking for the competition. Is there any point in pushing the last 100MHz on my 4790k or will 4.8GHz not make that much of a difference over 4.7GHz? I'd guess it's gonna be fine either way as we're taking only graphics scores into account.


It might make a difference in Vantage, but...since graphics scores are the only thing that counts, turn hyperthreading off...save ya some heat, maybe get a bit more clock out of it.


----------



## ronaldoz

Quote:


> Originally Posted by *tolis626*
> 
> Damn, nuclear fusion right there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I started messing around with benchmarking for the competition. Is there any point in pushing the last 100MHz on my 4790k or will 4.8GHz not make that much of a difference over 4.7GHz? I'd guess it's gonna be fine either way as we're taking only graphics scores into account.


It gives me 300 more points, for physics only; for the graphics score in Fire Strike it helps 0%.


----------



## Minusorange

Quote:


> Originally Posted by *Vellinious*
> 
> Like this. lol


Impressive, your card was half as hot as the sun!!


----------



## tolis626

So I entered. My card barely held out until it could complete Vantage and 3DMark11 with the memory at 1725MHz. And I'm still wondering why they decided to include Vantage. That was an eyesore!

Anyway, let's see how this pans out for us.


----------



## pojoFX

First post here! Not entering any competitions or anything, just joining the 390 owners' club: XFX Radeon R9 390. Link to GPU-Z validation below. My card has subpar headroom for overclocking compared to the numbers others are posting. Max stable clock is around 1080 or so; 1100 results in artifacts no matter the voltage, although I try not to push too much juice through it since it gets pretty hot under load. Still quite happy with this card's performance. It obviously annihilates my old CrossFire 6850s from my previous AMD Phenom II build, and it really came to life when I carried it over into my new i5 6600K build.

http://www.techpowerup.com/gpuz/details.php?id=bmnwd


----------



## yuannan

Quote:


> Originally Posted by *pojoFX*
> 
> First post here! Not entering any competitions or anything, just joining the 390 owner's club. XFX Radeon R9 390. Link to GPU-Z validation below. My card has subpar headroom for overclocking compared to numbers others are posting. Max stable clock is around 1080 or so, 1100 results in artifacts no matter what voltage although I try not to push too much juice through it since it gets pretty hot under load. Still quite happy with this cards performance. Obviously it annihilates my old crossfire 6850's from my previous AMD Phenom II build and really came to life when I carried it over into my new i5 6600k build.
> 
> http://www.techpowerup.com/gpuz/details.php?id=bmnwd


Come and join the competition, it really helps you feel part of the community:
http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/

It'll take about 3 hours in total to set up and run; if you've already got the OC dialed in, 20 mins is plenty. All we need is 3 benchmark runs, and even if you don't OC that far it's still very good. A 390 can easily pull 60k+ for the team. My 390X pulled nearly 90k with the heaviest OC, without BIOS mods.

If you're not scared of high voltages and loud noise, get Sapphire TriXX, pump in +200mV and clock it to 1200/1700.

Help us beat those green rich kids!


----------



## jdorje

Quote:


> Originally Posted by *Stige*
>
> Only 15k in Fire Strike, even with no tessellation? I got 14,800 with tessellation and no real tweaks.


Then why haven't you beaten my score yet?


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> Then why haven't you beaten my score yet?


Because I haven't run 3DMark for months? I don't care that much for it; maybe once I swap the thermal pads on my GPU so I can push it to 1250+.


----------



## jdorje

Yeah, that's what I'm complaining about. Go join the fanboy competition already and set a good example for the young'uns.

Vantage and 3DMark11 are way more annoying than Fire Strike, though.


----------



## Stige

I already ran them for the competition? A long time ago, only daily clocks, but cba trying too hard.


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> I have already ran them for the competition? A long time ago, only daily clocks but cba trying too hard.


If we lose, it's because of you


----------



## jodybdesigns

Quote:


> Originally Posted by *Stige*
>
> Because I haven't run 3DMark for months? I don't care that much for it; maybe once I swap the thermal pads on my GPU so I can push it to 1250+.


I agree with this lol. I have 48 hours on my 3DMark in Steam


----------



## ronaldoz

Quote:


> Originally Posted by *yuannan*
> 
> Come and join the competition, it really helps you feel part of the community:
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/
>
> It'll take about 3 hours in total to set up and run; if you've already got the OC dialed in, 20 mins is plenty. All we need is 3 benchmark runs, and even if you don't OC that far it's still very good. A 390 can easily pull 60k+ for the team. My 390X pulled nearly 90k with the heaviest OC, without BIOS mods.
>
> If you're not scared of high voltages and loud noise, get Sapphire TriXX, pump in +200mV and clock it to 1200/1700.
>
> Help us beat those green rich kids!


What kind of artifacts do you get with your card, btw (after that big overclock)? Like flickering objects? I think I got that kind of effect on an older card before.


----------



## yuannan

Quote:


> Originally Posted by *ronaldoz*
> 
> What kinda artifacts do you have with your card btw (after that big overclock)? Like flickering objects? I think I got that kinda effect before on a older card.


The classic "I'm not getting enough juice" ones: bad textures and driver crashes. The card now just requires more voltage than it did before to run at the same clock; not much, about 20mV, but it's still something.

Basically, don't max out Sapphire TriXX at +200mV.

The card is perfectly fine now, nothing bad as far as I can tell. I just added more voltage and it's good as new.


----------



## Stige

Quote:


> Originally Posted by *jodybdesigns*
> 
> I agree with this lol. I have 48 hours on my 3DMark in Steam


Only 2 hours for me.

It doesn't take 48 hours to get the max score.


----------



## bluej511

So weird. When mine was artifacting it was just giving me flickers of light on the screen, like a sunbeam flicker. It was totally stable at 1200/1650; that's about it. 1700 on Elpida memory was just bad. I have no idea why mine isn't Hynix memory.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> So weird. When mine was artifacting it was just giving me flickers of light on the screen, like a sunbeam flicker. It was totally stable at 1200/1650 thats about it. 1700 on Elpida memory was just bad. I have no idea why mine isnt Hynix memory.


Artifacts are very random; they are mostly green and purple, for some weird reason.

What brand are you running?


----------



## jodybdesigns

Quote:


> Originally Posted by *Stige*
> 
> Only 2 hours for me
> 
> 
> 
> 
> 
> 
> 
> 
> It doesn't take 48 hours to get the max score


Eh, I have been using mine for a while.


----------



## ronaldoz

Quote:


> Originally Posted by *yuannan*
>
> The classic "I'm not getting enough juice" ones: bad textures and driver crashes. The card now just requires more voltage than it did before to run at the same clock; not much, about 20mV, but it's still something.
>
> Basically, don't max out Sapphire TriXX at +200mV.
>
> The card is perfectly fine now, nothing bad as far as I can tell. I just added more voltage and it's good as new.


Nice to hear. I noticed artifacts on an older card at stock speeds/voltages after trying to overclock at higher voltages, so it feels kinda tricky.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> Artifacts are very random; they are mostly green and purple, for some weird reason.
> 
> What brand are you running?


Got a Sapphire Nitro. Absolutely all the other ones I've seen run Hynix memory. Maybe mine was built using an older board, who knows.

P.S. This is what I used to game on when I had room; might have to set this up at some point this year.


----------



## Stige

Fek it, now you guys made me want to improve my score for the competition, I had a look and my best run is +600 on Firestrike compared to what I submitted there.

Does the amount of RAM you have affect 3DMark scores?


----------



## mus1mus

Try to appreciate the card as it is. Set your memory to 1625 and gradually add 13 MHz until it crashes. Go back to the previous step, then run it while adding 1 MHz at a time until you find the best-scoring memory OC.

Mod the BIOS to use lower straps for the final memory OC. Done. Profit.
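The coarse-then-fine search above can be sketched as a tiny loop. This is just an illustration: `is_stable` stands in for "set the clock and run your benchmark", the step sizes are the ones from the post, and the 1800 MHz ceiling is an arbitrary safety cap.

```python
# Sketch of the coarse-then-fine memory OC search described above.
# is_stable(mhz) stands in for "set the clock and run the benchmark".
def find_max_mem_clock(is_stable, start=1625, coarse=13, fine=1, ceiling=1800):
    clock = start
    # Coarse pass: keep adding 13 MHz until the next step would crash.
    while clock + coarse <= ceiling and is_stable(clock + coarse):
        clock += coarse
    # Fine pass: creep up 1 MHz at a time from the last good coarse step.
    while clock + fine <= ceiling and is_stable(clock + fine):
        clock += fine
    return clock

# Toy stand-in for a real stability test: pretend the card artifacts above 1703 MHz.
print(find_max_mem_clock(lambda mhz: mhz <= 1703))  # → 1703
```

In practice each `is_stable` call is a manual benchmark run, so the loop is really you with TriXX/Afterburner and a notepad.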


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Try to appreciate the card as it is. Set your memory to 1625 and gradually add 13 MHz until it crashes. Go back to the previous step, then run it while adding 1 MHz at a time until you find the best-scoring memory OC.
>
> Mod the BIOS to use lower straps for the final memory OC. Done. Profit.


Eh, scores are already in. Who cares, we're always gonna be ahead anyway, lol.

I don't even keep it OC'ed, honestly; it has plenty of oomph for all the games I play. Can't wait to get my 5.1/7.1 system running so I can get surround sound on all my PC games again.


----------



## yuannan

Quote:


> Originally Posted by *ronaldoz*
> 
> Nice to hear. I noticed on a older card artifacts on stock speed / voltages after trying to overclock at higher voltages, but it feels kinda tricky.


It's like CPU degradation on watercooled rigs: the temps are fine, but the high voltages slowly rip the traces apart until it dies.









Quote:


> Originally Posted by *bluej511*
> 
> Got a sapphire nitro. Absolutely all the other ones ive seen run hynix memory. Maybe mine was built using an older board who knows.
> 
> P.S. This is what i used to game on when i had room, might have to set this up at some point this year.


It's a legit card afaik, you just got the bad end of the deal.

"bad" memory
https://www.techpowerup.com/vgabios/176894/sapphire-r9390-8192-150825.html

"normal" memory
https://www.techpowerup.com/vgabios/173990/sapphire-r9390-8192-150527-1.html
https://www.techpowerup.com/vgabios/173311/sapphire-r9390-8192-150527.html

For some reason the one with the "bad" memory is clocked higher.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> it's like CPU degradation on water cooled rigs, the temps are fine but the high voltages slowly rip the tracks apart until it dies
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a legit card afaik, you just got the bad end of the deal.
> 
> "bad" memory
> https://www.techpowerup.com/vgabios/176894/sapphire-r9390-8192-150825.html
> 
> "normal" memory
> https://www.techpowerup.com/vgabios/173990/sapphire-r9390-8192-150527-1.html
> https://www.techpowerup.com/vgabios/173311/sapphire-r9390-8192-150527.html
> 
> some reason the one with "bad" memory is clocked higher


Yeah, the new one with the higher core clock seems to use Elpida; they give you a backplate but cheaper memory haha.

And what's CPU degradation on water-cooled rigs? My CPU temps are far hotter than my GPU's lol.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> Yea the new one with a higher core seems to use Elpida, they give you a backplate but cheaper memory haha.
> 
> And whats cpu degradation for water cooled rigs? My cpu temps are far hotter then my GPU lol.


When you normally overclock, you add clock, then add voltage until you hit 80C. People on water can push the voltage a lot higher than "normal" people. You see posts of people saying "oh, my CPU is now unstable at X and Y when it was fine a year ago"; this is because the temps are 100% fine and the cooler can handle it fine, but the capacitors and wiring in the CPU are not designed to run at those high voltages. Eventually the tracks in the CPU get so worn out that the extra voltage cannot bridge the gaps anymore and your CPU just dies.









Also, memory on AMD cards doesn't matter as much as on Nvidia; the 512-bit bus is massive and arguably unneeded. I would take a backplate + a better-binned chip over good memory clocks any day.


----------



## bluej511

Eh, I'm only running 1.08v on my 4690k so I'm not worried about that yet. I could go to 1.1v, that's about it; might net me 4.4-4.5. But I'm running 4.2 on all cores.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> Eh im only running 1.08v on my 4690k so im not worried about that yet. I could go 1.1v thats about it, might net me 4.4-4.5. But im running 4.2 on all cores.


You can volt up wayyyy higher.

What temps and cooler?


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> You can volt up wayyyy, higher.
> 
> What temps and cooler?


Supremacy EVO, and it's running at 50°C during gaming; the Alphacool block is super restrictive but whatever, might have some more pieces of junk in the jet plate too. It's totally stable at 1.088 and 4.2GHz with everything on auto, running that as a boost clock. Pretty sure I don't need 100W and 4.2GHz while surfing in Chrome lol. I might actually get the screws from EK and run it delidded to get lower temps; I would never run the voltage over 1.3v anyway.


----------



## jdorje

Weird man. I'm at 1.335V with an aio+delid and I barely break 50 in gaming.


----------



## ronaldoz

That voltage is totally fine indeed. I'm using 1.25V for 4.7GHz, with 50C in-game.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Weird man. I'm at 1.335V with an aio+delid and I barely break 50 in gaming.


Well, that's the thing, you're delidded. If I delid and run Liquid Ultra or something I'd be below 40°C more than likely. Like I said, the GPU block is restrictive; the water spends more time in the CPU block and therefore runs hotter. I've thought about delidding and running Noctua TIM on the die; my guess is I'd probably be at 38-40C while gaming. Liquid Ultra might get me even lower, who knows.


----------



## pojoFX

Quote:


> Originally Posted by *yuannan*
> 
> Come and join the competiton, It really helps you feel as part of the comunity,
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/
> 
> It'll take about 3 hours in total to setup and run, if you already got the OC then 20mins is plenty. All we need is 3 benchmarks runs, even if you don't OC that far it's still very good. a 390 can easily pull 60k+ for the team. My 390x pulled nearly 90k with the heaviest OC with out BIOs mods.
> 
> If you are not scared of high voltages and loud noise, get Sapphire Tri-X pump in +200mv and clock it to 1200/1700.
> 
> Help us beat those green rich kids!


I've got nothing better to do than spend my Sunday afternoon toying around with the new hardware anyway, so I guess there's no good reason not to!

I just finished running all the benchmarks at the demo version's default settings. If I'm understanding the rules correctly, we're just using the graphics scores: the Vantage score is divided by 2, the Firestrike score is multiplied by 2, and then everything is added up for the final score. Does that sound right? Your predictions and my results seem to line up.

When doing so I got the following:

Vantage: 47584 (/2 = 23792)
http://www.3dmark.com/3dmv/5434928

Mark11: 17877
http://www.3dmark.com/3dm11/11114719

Firestrike: 12571 (*2 = 25142)
http://www.3dmark.com/3dm/11407136?

Total: 66811

If anyone can confirm I didn't screw it up I'll go ahead and make a proper entry.
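For anyone double-checking the arithmetic above, the weighting works out. A trivial Python sketch (the three inputs are the graphics scores posted above; the scoring rule itself is as described, Vantage halved and Firestrike doubled):

```python
def fanboy_total(vantage, mark11, firestrike):
    """Competition total: Vantage / 2 + 3DMark11 + Firestrike * 2."""
    return vantage // 2 + mark11 + firestrike * 2

total = fanboy_total(vantage=47584, mark11=17877, firestrike=12571)
print(total)  # prints 66811 (23792 + 17877 + 25142)
```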


----------



## jdorje

You need screenshots showing the competition background, the CPU-Z CPU and memory tabs, and GPU-Z along with the score. Everyone seems to just be using the website score page.

You can raise scores a bit by disabling tessellation in the AMD settings panel. Not worth running again for, though.

Of course, now's a good time to overclock, or to fix your timing straps in the BIOS. Because I'm sure you'd get quick help with any questions, lol.


----------



## tolis626

So, I've been thinking. What would it take to brick a card by BIOS flashing? And by brick, I mean permanently. I understand that, as long as there is a backup GPU, you can reflash the BIOS no problem. Am I correct?

I'm actually thinking about taking gupsterg up on his offer.









Tight memory timings, a 25mV higher DPM7 voltage (so 1.3V instead of 1.275V that it is at stock) and whatever else could be done to improve performance. I would surely like higher memory clocks , but that doesn't seem to be happening with my card, so yeah.


----------



## yuannan

Quote:


> Originally Posted by *tolis626*
> 
> So, I've been thinking. What would it take to brick a card by BIOS flashing? And by brick, I mean permanently. I understand that, as long as there is a backup GPU, you can reflash the BIOS no problem. Am I correct?
> 
> I'm actually thinking about taking gupsterg up on his offer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tight memory timings, a 25mV higher DPM7 voltage (so 1.3V instead of 1.275V that it is at stock) and whatever else could be done to improve performance. I would surely like higher memory clocks , but that doesn't seem to be happening with my card, so yeah.


You would be correct. You don't even need a backup GPU if you have a dual BIOS: boot into the first BIOS, push the button to switch over to the bad one, then flash while the button is pushed. Worked on a 390X Tri-X anyway.


----------



## tolis626

Quote:


> Originally Posted by *yuannan*
> 
> You will be correct, you don't even need a backup GPU if you have a dual bios, boot into first bios, push the button to go into the bad one, then flash while the button is pushed. worked on a 390x tri-X anyway.


The MSI doesn't have a dual BIOS sadly.


----------



## yuannan

Quote:


> Originally Posted by *tolis626*
> 
> The MSI doesn't have a dual BIOS sadly.


It should be the same process, you just need an external GPU or integrated graphics.
But there is definitely a higher risk than with a dual BIOS, since it could be a permanent brick.
If you plan to OC, I wouldn't flash just to get a higher voltage; anything higher than +200mv will seriously damage your GPU very quickly.


----------



## Stige

Quote:


> Originally Posted by *yuannan*
> 
> It should be the same process, you just need a external GPU or the integrated GPU.
> But there is definitely a higher risk than a dual bios as it could be a permanent brick.
> If you plan to OC I wouldn't do it to get a higher voltage, anything higher than 200mv will seriously damage your GPU very quickly.


And where is the proof of this damage? Stop spreading rumors if you can't back them up, these cards will run fine at +200mV.


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> And where is the proof of this damage? Stop spreading rumors if you can't back them up, these cards will run fine at +200mV.


My card?
Or did I just get unlucky?


----------



## tolis626

Quote:


> Originally Posted by *yuannan*
> 
> It should be the same process, you just need a external GPU or the integrated GPU.
> But there is definitely a higher risk than a dual bios as it could be a permanent brick.
> If you plan to OC I wouldn't do it to get a higher voltage, anything higher than 200mv will seriously damage your GPU very quickly.


Nah, that's not it. And seeing as I don't plan to move to water cooling any time soon, I'm stuck with stock cooling. I just prefer Afterburner over Trixx and, if I'm to flash a custom BIOS anyway, I might as well get rid of the need for Trixx altogether. +125mV lands me at 1200-1210MHz now, so that's more than enough. I'm already pushing 1.3V under load at that offset, so I wouldn't want to go higher. It's already too high compared to other people's cards, especially the 1.225V DPM7 ones.


----------



## Stige

Quote:


> Originally Posted by *yuannan*
> 
> My card?
> Or did I just get unlucky?


How? When did this happen? Under what circumstances? What Voltage?


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> How? When did this happen? Under what circumstances? What Voltage?


http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/7980#post_25025370

Sapphire tri-x 390x, 1255/1750 +200mv, idle voltage about 1.45 load I'm not sure.


----------



## Stige

+200mV only gets me ~1.375v at idle, which should be about right for most cards out there; 1.45 is way more than just +200mV.
At load it drops to around ~1.3V, which is more than safe.

Something is not right if you're getting 1.45V even at idle.


----------



## bluej511

Score mine is only 1.008v at idle lol.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Score mine is only 1.008v at idle lol.


Well it is not true idle as it doesn't downclock with my current 120Hz settings, it's more like "light load", maybe playing a video or something?

I get about 1.258V with GPU-Z render test, that should be easy for anyone to compare with. This is at +125mV.


----------



## bluej511

If I go up 100mv it goes up to 1.086. Sometimes mine even dips to a flat 1.000v, weird eh?


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> +200mV only gets me ~1.375v at Idle which should be about right for most cards out there, 1.45 is way more than just +200mV.
> At load it drops to around ~1.3V which is more than safe.
> 
> Something is not right if you get 1.45V even at idle.


I only ran it for about an hour to bench all the scores, then swapped back to my normal underclock. After that I posted about my problems on here.

But is 1.4V 100% safe? Any sources to back that up? I searched all over the internet before I did my +200mv, and after most people said "yeah, it'll be fine as long as you cool it" I said F** it and pumped it all the way up in TriXX. Not sure if it was worth it, but hey, I'm the "best" 390X in the competition so far and I even beat out a few Furys.


----------



## Vellinious

I would just about bet that 1.3v is perfectly safe for an everyday clock... and that's probably a little bit conservative.


----------



## bluej511

Considering a stock one averages about 1.18 or so, I don't see why a 1.3 average wouldn't be safe. The issue would be the max voltage and how long it stays there.


----------



## tolis626

@yuannan
Can you run the Stilt's EVV app and tell us what your DPM7 voltage is? Or even post a BIOS dump or something. Just make sure you're at stock settings when you do either of these things.
Quote:


> Originally Posted by *Stige*
> 
> Well it is not true idle as it doesn't downclock with my current 120Hz settings, it's more like "light load", maybe playing a video or something?
> 
> I get about 1.258V with GPU-Z render test, that should be easy for anyone to compare with. This is at +125mV.


Woah, your card goes to 1.375V while idle (and using 3D clocks, whatever we should call it) and drops all the way to 1.258V under load? That's insane. What's your DPM7? My card will go to 1.375V at +100mV but under load it will hover around 1.28V.
Quote:


> Originally Posted by *Vellinious*
> 
> I would just about bet that 1.3v is perfectly safe for an every day clock....and that's probably a little bit conservative.


Just to make sure, we're talking about 1.3V under load, right? Because as soon as I get my hands on some Fujipoly pads, I'm gonna push this baby harder. Running 1185MHz at +90mV now and gaming happily. The VRMs will sometimes go as high as 85C, but they mostly hang around 75C, which is fine, I guess.

I'm wondering if improving the cooling on VRM2 will make any difference. What do you say?


----------



## yuannan

Quote:


> Originally Posted by *tolis626*
> 
> @yuannan
> Can you run the Stilt's EVV app and tell us what your DPM7 voltage is? Or even post a BIOS dump or something. Just make sure you're at stock settings when you do either of these things.
> Woah, your card goes to 1.375V while idle (and using 3D clocks, whatever we should call it) and drops all the way to 1.258V under load? That's insane. What's your DPM7? My card will go to 1.375V at +100mV but under load it will hover around 1.28V.
> Just to make sure, we're talking about 1.3V under load, right? Because as soon as I get my hands on some Fujipoly pads, I'm gonna push this baby harder. Running 1185MHz at +90mV now and I'm gaming happily. VRMs will sometime go as high as 85C, but they mostly hang around 75C, which is fine, I guess.
> 
> I'm wondering if improving cooling for VRM2 will make any difference. What'd you say?



That's my GPU-Z; no idea what EVV is.

This is with +150mv, as I don't wanna load 200 into it again.


----------



## ronaldoz

Quote:


> Originally Posted by *Stige*
> 
> Well it is not true idle as it doesn't downclock with my current 120Hz settings, it's more like "light load", maybe playing a video or something?
> 
> I get about 1.258V with GPU-Z render test, that should be easy for anyone to compare with. This is at +125mV.


Well, I'm seeing 1.226V VDDC max at +25mV.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> +200mV only gets me ~1.375v at Idle which should be about right for most cards out there, 1.45 is way more than just +200mV.
> At load it drops to around ~1.3V which is more than safe.
> 
> Something is not right if you get 1.45V even at idle.


Why isn't your card downclocking though?

Am I missing something here?

Are you forcing constant voltage? If so, why? Forgive me if I'm missing the boat or coming to the party late.


----------



## Ron Soak

While messing around with MSI Afterburner you can accidentally force the card to stay at max clock. I accidentally did this the other week.


----------



## christoph

Hi guys, I just uploaded my scores to the fanboy competition; hope everything is OK and my scores get added.


----------



## rdr09

Quote:


> Originally Posted by *pojoFX*
> 
> I've got nothing better to do then spend my Sunday afternoon toying around with the new hardware anyway, so I guess there's no good reason not to!
> 
> I just finished running all the benchmarks at default demo version settings. If I'm understanding the rules correctly we're just using the graphics scores. The Vantage score is divided by 2 and the Firestrike score is multiplied by 2. Then add them all up and that's the final score. Does that sound right? Your predictions and my results seem to line up.
> 
> When doing so I got the following:
> 
> Vantage: 47584 (/2 = 23792)
> http://www.3dmark.com/3dmv/5434928
> 
> Mark11: 17877
> http://www.3dmark.com/3dm11/11114719
> 
> Firestrike: 12571 (*2 = 25142)
> http://www.3dmark.com/3dm/11407136?
> 
> Total: 66811
> 
> If anyone can confirm I didn't screw it up I'll go ahead and make a proper entry.


Do you mind updating them with tessellation off?

Set Tessellation Mode to "Override application settings" and Maximum Tessellation Level to Off. Thanks.


----------



## Stige

Quote:


> Originally Posted by *battleaxe*
> 
> Why isn't your card downclocking though?
> 
> Am I missing something here?
> 
> Are you forcing constant voltage? If so, why? Forgive me if I'm missing the boat or coming to the party late.


QNIX QX2710. I'm using CRU for a custom refresh rate, with tighter timings instead of the ones that allow downclocking. There are custom-resolution timings that let the card downclock, but I prefer the tighter ones, and with those the GPU doesn't fully downclock.


----------



## bluej511

Oh shoot guys, what did i do?


----------



## Scorpion49

Does anyone run a custom gamma curve with the radeon software? I tweaked mine using the "additional settings" menu that brings up the old CCC looking thing, and it seems to be bugged out. It does not apply when I boot up the computer, only after I open the menu again does it show up.


----------



## bluej511

Gamma? I seem to have installed Radeon 16.3.1 corruptly last time. I'd start my PC and get a green screen with a loading wheel; it works after shutting it off and restarting, so I re-installed Crimson just in case, to see if that fixes it. Windows 10 btw.


----------



## Scorpion49

Quote:


> Originally Posted by *bluej511*
> 
> Gamma? I see to have installed 16.3.1 radeon corruptly last time. Id start my pc and have a green screen with a load wheel, works after shutting it off and restarting so i re-installed Crimson just in case to see if it fixes it, Windows 10 btw.


Yeah, I have to run one for my old 120Hz monitor; it's CCFL-backlit and getting very old, and I can't tell the contrast between dark gray and black without a gamma boost. The Windows calibrator can't do it properly, so I use this.

But when I reboot, the original gamma stays applied until I open the menu again. I don't actually need to change anything; just opening the menu instantly switches it to my custom setting. The menu also doesn't remember the setting: it shows the default 1.05 value, and if I click "reset" it will actually change back to that. Right now it's at about 1.25, but the slider doesn't remember it.


----------



## bluej511

The only thing I can think of would be to run Radeon Settings as administrator (under Properties) and see if that helps.


----------



## ronaldoz

Quote:


> Originally Posted by *bluej511*
> 
> Oh shoot guys, what did i do?


That's looking good! I bet your temps will drop nicely.


----------



## bluej511

Quote:


> Originally Posted by *ronaldoz*
> 
> That's looking good! I bet your temps will drop nicely.


I delidded the ghetto way, but it worked. It didn't drop as much as I thought, honestly. During the Intel stress test it dropped 7-8°C; while gaming it was closer to 5. On the i5s it doesn't seem to make a huge difference. I ordered the screws from EK and am gonna run it fully delidded (it has the IHS on now, I just wanted to see what it would do). I'll use Noctua TIM just for the sake of consistency and efficiency. It seems to make a bigger difference with a bigger OC, so once I run it bare-die I'll try a higher OC and see what temps I get. I think the very restrictive GPU block affects the CPU quite a bit; the water stays in there longer. Might run the pump a little slower, less pressure.


----------



## bluej511

Btw, does anyone know what those little switches are on the back of the GPU PCB? My backplate covers them up now, but just curious. To delid, I held the IHS against the edge of a marble countertop, put a thin piece of cheap wood against the edge of the CPU PCB, and while holding it gave it a few whacks and poof; it keeps it from falling on the ground too.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> I delided the ghetto way but it worked. It didnt drop as much as i thought honestly. Doing intel stress test it dropped like 7-8°C, while gaming it was closer to 5. On the i5s it doesnt seem to do a huge difference. Ordered the screws from EK gonna run it delided fully (it has the IHS now just wanted to see what it would do). Ill use Noctua TIM just for the sake of consistency and efficiency. Seems like it makes a bigger difference with a bigger OC. Once i run it bare die ill try a higher OC see what temps i get. I think the very restrictive gpu block affects the CPU quite a bit, water stays in there longer. Might run the pump a lil slower less psi pressure.


Put CLU between the IHS and die and you will see 15-20C temp drops; I got 17-18C drops myself. Using cheap TIM between the IHS and die won't be any better than the toothpaste Intel uses.


----------



## bluej511

Well, I saw an article showing there was no difference between the Intel TIM and Noctua TIM; it's the glue that keeps the IHS too far above the die for good contact. I hear CLU eats away at water blocks, so I might stay away. Going to run it bare-die anyway, so no point in CLU between the IHS and die. I already ordered the screws for it.


----------



## ronaldoz

Quote:


> Originally Posted by *bluej511*
> 
> I delided the ghetto way but it worked. It didnt drop as much as i thought honestly. Doing intel stress test it dropped like 7-8°C, while gaming it was closer to 5. On the i5s it doesnt seem to do a huge difference. Ordered the screws from EK gonna run it delided fully (it has the IHS now just wanted to see what it would do). Ill use Noctua TIM just for the sake of consistency and efficiency. Seems like it makes a bigger difference with a bigger OC. Once i run it bare die ill try a higher OC see what temps i get. I think the very restrictive gpu block affects the CPU quite a bit, water stays in there longer. Might run the pump a lil slower less psi pressure.


Alright, you did clean the CPU nicely, so that was great. I used Liquid Ultra, but I will only use that for the CPU though. I used it on the GPU chip of my R9 290 and it made a HUGE difference: the PowerColor was at 70C in-game at stock voltage, and it dropped to 60C. The temps also rose very slowly after adding voltage, which is what made the big difference. For example: before 85C, after 63C. And even more extreme: before 105C, after 67C. But I will use GC Extreme now, because Liquid Ultra is kinda 'aggressive'.

For your card, I'm really curious what it does to your temps! Even 5C would be awesome.


----------



## bluej511

Quote:


> Originally Posted by *ronaldoz*
> 
> Allright, you did clean the CPU nicely, so that was great. I used Liquid Ultra, but I will only use that for the CPU tho. I've used it on the GPU chip from the R9 290 and it made a HUGE difference. The Powercolor was at stock voltage 70C ingame and it dropped to 60C. But the temps were raising very slowly after adding voltages, that made the big difference. For example: Before 85C, after 63C. And also more, like before 105C, after 67C. But I will use GC Extreme now, because Liquid Ultra is kinda 'aggresive'.
> 
> For you card, I'm really curious what it do with your temps! Even 5C would be awesome.


It already stays under 45°C water-cooled, so I don't care about the GPU core; not worth getting CLU for that. I hear Liquid Ultra eats away at the water block, so I might stay away. I think running it with the IHS off will make the biggest difference. I'm sure if I put it on the die and then the IHS over it I'd get even lower temps, but it seems like running bare-die gets crazy low temps.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Welli saw an article that there was no difference between intel tim and noctua tim, its the fact that the glue keeps the tim far too high for good contact. I hear clu eats away at water blocks so i might stay away. Going to run it bare die anyways so no point in clu between the ihs and die. I already ordered the screws for it.


Then you have been very misinformed and should stop spreading this false information around.

CLU will not do anything to your waterblock, IHS, or die, and will offer way superior temps to anything else you can find (apart from other liquid metals). CLU only eats away ALUMINUM.

Running direct-to-die offers very little if any improvement over running with the IHS on, and you actually have a chance of cracking the die with direct-to-die contact. I have tried it in the past myself as well, but it didn't offer any benefits; just the hassle of installation, having to remove the IHS retention bracket from the mobo and then tighten the block hard enough to hold the CPU in place.


----------



## bluej511

Well, the EK kit comes with new screws that drop the waterblock a tiny amount. This is what's in CLU; quite a few corrosive materials. That's what I read in the delidding section, that CLU is quite abrasive:

"Alloy of the metal components gallium, indium, rhodium, silver, zinc, tin and bismuth; suspended in a graphite-copper matrix."

Actually, one less layer of thick copper to go through should improve temperatures. It seems to work much better with water than with air; that might be why.


----------



## bluej511

This is what I mean, it ends up curing itself onto copper. http://www.overclockers.com/forums/showthread.php/759641-Liquid-Ultra-and-copper-blocks

And honestly, a 1.5°C difference between what I use now and that makes it pointless to buy some.
http://www.tomshardware.com/reviews/thermal-paste-performance-benchmark,3616-17.html


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> QNIX QX2710. Using CRU for custom refresh rate and I use tighter timings instead of the ones that can downclock. There are timings for custom resolution that allow downclocking but I prefer to use the ones with tighter timings but then the GPU doesn't downclock fully.


Seems like a fast way to wear out a GPU. I could be mistaken though. But then again, if you plan to upgrade soon and don't care about the money then who cares? Go for it. To each his own I say.


----------



## Stige

Quote:


> Originally Posted by *battleaxe*
> 
> Seems like a fast way to wear out a GPU. I could be mistaken though. But then again, if you plan to upgrade soon and don't care about the money then who cares? Go for it. To each his own I say.


It won't do any harm to the card. Had my QNIX for 3 years now I think.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> This is what i mean, it ends up curing itself onto copper. http://www.overclockers.com/forums/showthread.php/759641-Liquid-Ultra-and-copper-blocks
> 
> And honestly a 1.5°C difference between what i use now and that, pointless to buy some.
> http://www.tomshardware.com/reviews/thermal-paste-performance-benchmark,3616-17.html


At such low temps, why would it make a difference? Once you start pushing real heat through it, that's when you'll see a big difference. That test is irrelevant.

Also, CLU does nothing between the IHS and die; it doesn't even harden. It does harden between the IHS and the CPU block, but nothing a little lapping and cleaning won't fix.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> It won't do any harm to the card. Had my QNIX for 3 years now I think.


And the same GPU? Wow. Well I guess it doesn't really do anything then. Hmmm... I would have thought it would degrade the core some running constant voltage all the time. But maybe it just uses more energy is all.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> At so low temps, why would it make a difference? Once you start pushing heat through, that is when you will see a big difference. That test is irrelevant.
> 
> Also CLU does nothing between the IHS and Die, it doesn't even harden. It does harden between IHS and the CPU block but nothing a little lapping and cleaning wont fix.


Interesting, because from what I've read it does harden between the IHS and die, because that's what it's supposed to do: melt once it's under load, then once it cools off it bonds to the IHS and die almost like solder. This is what I've seen on every forum about CLU and CLP. I've also read that CLP is much less harsh on heatsinks/water blocks than CLU. Hey, I'm just going by what I've read.

Like I said, this is pretty much for testing purposes. Once I run it without the IHS, if my temps get even lower I'll consider getting CLU/P, but if it only makes a 2-3° difference for me it's pointless. Gonna try to run it at 4.5 now and see what temps I come up with.


----------



## Stige

No, I'm 110% serious. Between the IHS and die it NEVER hardens, ever. I have used it for at least 3 years now and it only hardens between the IHS and CPU block; I can keep the same paste between the IHS and die for 3 years without it hardening. It must be something to do with the material of the die itself.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> No I'm 110% serious. Between the IHS and Die it NEVER harderns, ever. I have used it for 3 years now atleast and it only hardens between the IHS and cpu block, I can keep the same paste for 3 years between the IHS and die without it hardening. It must be something to do with the material of the DIE itself.


So odd, I wonder why some posts have it hardening under the IHS. So I'm trying to OC to 4.5 and having to set my voltages manually haha. Windows is giving me a WHEA uncorrectable error lol.


----------



## bluej511

So, 4.5GHz at 1.15v (verified with my Fluke meter), stable after 15 mins (it was crashing after 5 before); the Intel stress test got her up to 52°C. So only a difference of 5°C between 1.088 or so and 1.15.


----------



## jdorje

You won't see a 20C temp difference when your temp starts from 50C. Assuming an ambient of 20C, you've gone from +30C to +25C, which is a 17% improvement in your cooling. At a guess you'd double that with liquid metal.

The i5 does benefit less from delid than an i7 or g3258, but it's still massive.

Delidding from a delta of 30c is pretty pointless. But still fun! Clu is fun too so use it!

The main problem with every non-lm paste is that it will pump out and you'll end up with a dry die and awful temps.
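The 17% figure above comes from comparing deltas over ambient rather than absolute readings, which is the right way to judge a cooling change. A quick sketch of that arithmetic (the 20C ambient is the assumption stated above):

```python
def cooling_improvement(ambient, temp_before, temp_after):
    """Fractional improvement in the temperature delta over ambient."""
    delta_before = temp_before - ambient  # e.g. 50 - 20 = 30
    delta_after = temp_after - ambient    # e.g. 45 - 20 = 25
    return (delta_before - delta_after) / delta_before

# bluej511's 50C -> 45C at an assumed 20C ambient:
print(round(cooling_improvement(20, 50, 45) * 100, 1))  # prints 16.7
```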


----------



## ronaldoz

Quote:


> Originally Posted by *jdorje*
> 
> You won't see 20C temp difference when your temp is starting from 50C. Assuming ambient of 20C, you've gone from +30c to +25C which is a 17% improvement to your cooling. At a guess you'd double that with liquid metal.
> 
> The i5 does benefit less from delid than an i7 or g3258, but it's still massive.
> 
> Delidding from a delta of 30c is pretty pointless. But still fun! Clu is fun too so use it!
> 
> The main problem with every non-lm paste is that it will pump out and you'll end up with a dry die and awful temps.


This happened with my 4790K after the delid. It was on air cooling.


----------



## jdorje

The type of cooling doesn't matter that much. But you'll see a bigger difference at higher wattage, which of course requires better cooling.

I had a 9C improvement on my 4690k+h80i and gained 2 multipliers as a result. 13C on my g3258+212.

If I were you bluej511 I wouldn't bother with a naked mount. But do put on lm.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> You won't see 20C temp difference when your temp is starting from 50C. Assuming ambient of 20C, you've gone from +30c to +25C which is a 17% improvement to your cooling. At a guess you'd double that with liquid metal.
> 
> The i5 does benefit less from delid than an i7 or g3258, but it's still massive.
> 
> Delidding from a delta of 30c is pretty pointless. But still fun! Clu is fun too so use it!
> 
> The main problem with every non-lm paste is that it will pump out and you'll end up with a dry die and awful temps.


No no, I've gone from 50 to about 45-46 lol. Not much at 1.088, but at 1.15 it only went to 51-52, which is where I was before but with more voltage. I'm hoping to get under or at 40C with no IHS. Hey, I'm bored, gotta do something lol. My water delta on CPU only is probably less than 5C; add the GPU and it's closer to 10C. I'll try it with Noctua TIM first for consistency. If it goes from 50 to 38 I'll be happy.

Hey, all I did was hold the IHS against a marble edge and it popped off. No vise or razor needed, cost nothing.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> What type of cooling doesn't matter that much. But you'll just see a bigger difference if you have higher wattage which of course requires higher cooling.
> 
> I had a 9C improvement on my 4690k+h80i and gained 2 multipliers as a result. 13C on my g3258+212.
> 
> If I were you bluej511 I wouldn't bother with a naked mount. But do put on lm.


Honestly I'm just doing it for tests; the screws were only $8 lol. The less material there is to dissipate heat through, the better, but hey, we'll see. I love tinkering and testing. Hell, I mounted my 360 rad 2 different ways before I was happy.


----------



## thegamehhh

Can someone tell me how to turn on vsync in 2D/desktop mode? I was looking for the 3D settings but I cannot find them in the 15.30 drivers.
I mean, I am getting screen tearing in desktop mode and I don't know how to fix it.


----------



## bluej511

Gotta go under gaming global settings and turn on frame rate target control. Otherwise vsync is set in games.


----------



## patriotaki

Should I give up my PCS+ 390 and get a Sapphire 390?


----------



## yuannan

Quote:


> Originally Posted by *patriotaki*
> 
> should i give my pcs+ 390 and get a sapphire 390?


If it's for free or a very small fee I would


----------



## patriotaki

The point is that it would be a used card.


----------



## jdorje

No. Just put an aio on your 390+.


----------



## tolis626

So, I'm in kind of a pickle. Either Witcher 3 isn't nearly as stressful as I think it is, or it's VSR that's causing it. Earlier today, I played like 2 hours of Witcher 3 at 1200/1700MHz at +115/+50mV. Max GPU temp was, I dunno, like 72C or thereabouts, and max VRM temp was 80C. Both hovered lower most of the time (maybe mainly because entering the map, inventory, etc screens leads to the card idling). About an hour ago, I decided to play some BF4. Conquest Large on Siege of Shanghai, so about as stressful as it can get. Thing is, like Valley, I run BF4 at 1440p VSR. At the same clocks and voltages, after about half an hour my GPU had hit 77C and the VRM had gone up to 90C. And same thing happens with Valley, it will heat up the GPU more (quite a bit too) than Witcher 3 and, IMO, that shouldn't be the case.

"Well, try without VSR, duh!" you'll say. I don't have time today, will do it tomorrow. I just wanted to see if anyone had any idea. Is it purely the higher resolution that's the problem? Could be, but I doubt 1440p would run so much hotter than 1080p. That's more like the difference I'd expect going to 4K or something.

PS : Fortunately, BF4 ran fine with no flickering or artifacting or anything, even when it was toasting itself. At least I got that going for me.


----------



## BIGBANGBOSSSS

Hello everyone...
I need help. I just bought an Asus Strix R9 390, and when connecting the power pins I noticed that only the right white LED is working and not the left. When I remove the power pin, both red lights are working. I am afraid that the card is not working correctly, or that I should do something with it. BTW, my previous card was a Sapphire R9 290, and I was playing The Witcher in full HD ultra with no problem. Now with the R9 390 I feel like the mouse pointer is not OK; I feel some glitches and I don't know, something is wrong.
I would appreciate it if someone would assist me ASAP.


----------



## jdorje

Quote:


> Originally Posted by *tolis626*
> 
> So, I'm in kind of a pickle. Either Witcher 3 isn't nearly as stressful as I think it is, or it's VSR that's causing it. Earlier today, I played like 2 hours of Witcher 3 at 1200/1700MHz at +115/+50mV. Max GPU temp was, I dunno, like 72C or thereabouts, and max VRM temp was 80C. Both hovered lower most of the time (maybe mainly because entering the map, inventory, etc screens leads to the card idling). About an hour ago, I decided to play some BF4. Conquest Large on Siege of Shanghai, so about as stressful as it can get. Thing is, like Valley, I run BF4 at 1440p VSR. At the same clocks and voltages, after about half an hour my GPU had hit 77C and the VRM had gone up to 90C. And same thing happens with Valley, it will heat up the GPU more (quite a bit too) than Witcher 3 and, IMO, that shouldn't be the case.
> 
> "Well, try without VSR, duh!" you'll say. I don't have time today, will do it tomorrow. I just wanted to see if anyone had any idea. Is it purely the higher resolution that's the problem? Could be, but I doubt 1440p would run so much hotter than 1080p. That's more like the difference I'd expect going to 4K or something.
> 
> PS : Fortunately, BF4 ran fine with no flickering or artifacting or anything, even when it was toasting itself. At least I got that going for me.


Those temps sound completely normal for non msi stock cooling.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Those temps sound completely normal for non msi stock cooling.


What do you mean non MSI stock cooling? Only thing I've changed is the TIM and thermal pads. Otherwise it's the stock MSI cooler.


----------



## christoph

Quote:


> Originally Posted by *BIGBANGBOSSSS*
> 
> hello everyone...
> i need help, i just bought asus strix r9 390 and when connecting the power pins i noticed that only right white led is working and not the left. when i remove the power pin both red lights are working. i am afraid that the card is not working correct or i should do something with it. btw my previous card is sapphire r9 290 and i was playing the witcher in full hd ultra with no problem. and now with r9 390 i feel like the mouse pointer is not ok , feel some glitches and i dont know, something is wrong.
> i would appreciate if some on would assist me asap.


There are only a few users with that card; we'll have to wait for someone who owns one to come along and tell us about it.


----------



## christoph

Quote:


> Originally Posted by *tolis626*
> 
> What do you mean non MSI stock cooling? Only thing I've changed is the TIM and thermal pads. Otherwise it's the stock MSI cooler.


But either way the temps seem to be OK.

What ambient temps do you get over there?


----------



## tolis626

Quote:


> Originally Posted by *christoph*
> 
> but either way the temps seems to be ok
> 
> what ambient temps you get over there??


It's not the temps themselves, it's the difference between titles. I did try a bit of BF4 (goodbye sleep, sigh...) at 1080p with VSR disabled, albeit in the test range, and I got to 73C on the core and 84C on the VRMs, so there definitely seems to be a connection to resolution. Also, 90C sits a bit wrong with me. I know these VRMs are supposed to handle up to 115C or whatever, but still...

My ambients are about 20C in this room. Funny thing is, when I started playing, this room was cold. After a few hours of gaming, it's comfy and warm. Ah... I'm gonna suffer in the summer...


----------



## jdorje

Huge differences in VRM temps between loads are normal.

Your VRM temps would be normal for a non-MSI cooler, but I feel like the MSI's should be lower.


----------



## kubiks

Never got a chance to post this, but here goes.



I'm not 100% on the tubing. Still coming up with a final layout.

ps. i fixed the alphacool cpu block too


----------



## mus1mus

Quote:


> Originally Posted by *kubiks*
> 
> Never got a chance to post this, but here goes.
> 
> 
> 
> I'm not 100% on the tubing. Still coming up with a final layout.
> 
> ps. i fixed the alphacool cpu block too


pssst..

We need you here. That lead of theirs can easily be equalled by your XFire if you submit.

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50


----------



## kubiks

Omg , what is that?


----------



## kubiks

There isn't enough info. Should I run Firestrike 1080p, or what?

Also, I'm running Windows 10, and the HWBot rules state it must be Windows 7 or earlier.


----------



## christoph

Quote:


> Originally Posted by *kubiks*
> 
> There isnt enough info. Should I run firestrike 1080p? or what?
> 
> Also Im runnign windows 10, the HWbot rule state it must be like windows 7 and earlier


we need you there...


----------



## jodybdesigns

Quote:


> Originally Posted by *kubiks*
> 
> There isnt enough info. Should I run firestrike 1080p? or what?
> 
> Also Im runnign windows 10, the HWbot rule state it must be like windows 7 and earlier


Win10 is fine. Also stock clocks on all programs is fine. Get the basic downloads from here: http://www.futuremark.com/support/downloads

I see you're on water so I assume you are overclocking. Would be nice for you to do if you are interested in feeling like a part of the community.


----------



## kizwan

Quote:


> Originally Posted by *jdorje*
> 
> Huge differences in vrm between loads is normal.
> 
> Your vrm temps would be normal for a non msi cooler. But I feel like the msi should be lower.


Last time I checked, the MSI 390/390X cooler is not bad but not the best for VRMs. The Sapphire Nitro cooler seems the best all-around for both core & VRMs.


----------



## Stige

Quote:


> Originally Posted by *BIGBANGBOSSSS*
> 
> hello everyone...
> i need help, i just bought asus strix r9 390 and when connecting the power pins i noticed that only right white led is working and not the left. when i remove the power pin both red lights are working. i am afraid that the card is not working correct or i should do something with it. btw my previous card is sapphire r9 290 and i was playing the witcher in full hd ultra with no problem. and now with r9 390 i feel like the mouse pointer is not ok , feel some glitches and i dont know, something is wrong.
> i would appreciate if some on would assist me asap.


But the card works fine? If so, I wouldn't worry about it. It is, after all, just one LED.


----------



## rdr09

nvm. The 3D Fanboy Competition turned into . . .


----------



## jdorje

Like me, you made sure the red flame was prominent on your screenshots. This is very important.
Quote:


> Originally Posted by *kizwan*
> 
> Last time I checked MSI 390/390X cooler not bad but not the best for VRMs. Sapphire Nitro cooler seems the best all around for both core & VRMs.


The Nitro and MSI both have the VRMs cooled by the main heatsink. The PCS+ and XFX both have segregated coolers "inside" the heatsink.

But the numbers people have posted here make me think the Sapphire doesn't really have better VRM temps than the tier-2 cards. Dunno though; sometimes people just post wildly different numbers, so it's hard to be sure.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *christoph*
> 
> but either way the temps seems to be ok
> 
> what ambient temps you get over there??
> 
> 
> 
> It's not the temps themselves, it's the difference between titles. I did try a bit of BF4 (goodbye sleep, sigh...) at 1080p with VSR disabled, albeit in the test range, and I got to 73C on the core and 84C on the VRMs, so there definitely seems to be a connection to resolution. Also, 90C sits a bit wrong with me. I know these VRMs are supposed to handle up to 115C or whatever, but still...
> 
> My ambients are about 20C in this room. Funny thing is, when I started playing, this room was cold. After a few hours of gaming, it's comfy and warm. Ah... I'm gonna suffer in the summer...

38°C outdoor & 36°C indoor are very fun & comfy.


----------



## kubiks

I will do it tomorrow evening after work. I have to download everything and my internet is slow.

I will definitely put some numbers down for team red.


----------



## yuannan

There are new crimson drivers, are they worth retesting for?


----------



## Ron Soak

Quote:


> Originally Posted by *yuannan*
> 
> There are new crimson drivers, are they worth retesting for?


a) New drivers are ALWAYS worth retesting.

b) The feedback on this one is already pretty positive.


----------



## fyzzz

I got about 150 points more in firestrike switching from Crimson 16.3.1 to 16.3.2. I have a 290 with a very modified 390 bios.


----------



## yuannan

Take that as a no then; 150 points isn't worth the time. Sorry if we lose out by 150 points btw.


----------



## bluej511

Jesus Christ, I just updated to 16.3.1 a couple days ago haha. I'm loving that their driver division is finally on point. They're getting more drivers out than ngreedia.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Huge differences in vrm between loads is normal.
> 
> Your vrm temps would be normal for a non msi cooler. But I feel like the msi should be lower.


See, what's strange is what I'd been grumbling about when changing the thermal pads on my GPU. Reported VRM temps are worse than before. No other way to put it. Before messing with the card I had never seen above 82C. Ever. Now that's a common temp at high-ish overclocks. This could very well be the first time I've seen 90C. And that would be fine if the results reflected it. Worse cooling should lead to lower overclocks. Thing is, the opposite is true. I was getting a max of a semi-stable 1175MHz at +100mV before, and now I comfortably do 1195MHz at the same voltage and it's rock solid. All with my VRMs at higher temperatures. The "what the hell?" is strong with this card...

As I've said before, it could be something else that improved my overclocks. It could also be that temps are being misreported now, or that they were being misreported before. To make sure, I'll order Fujipoly 14W/mK pads tomorrow and see what happens with the lower temps. I will also put a 1mm Phobya pad on the caps just for good measure, and I'll replace the padding on VRM2 in the hopes that it improves my memory overclocking. Not likely to happen, I know, but it's worth a try I guess.
Quote:


> Originally Posted by *kizwan*
> 
> 38°C outdoor & 36°C indoor are very fun & comfy.


What the hell are you talking about, man? I've had to endure these temperatures every summer here and they are hell on Earth. Sweating a gallon of water per minute when sleeping is no fun, and that's exactly what happens sometimes. I don't like sleeping with the AC turned on either, so I'm in a pickle every summer. A very hot, sweaty and smelly pickle.

My only hope is that losing weight will mean I don't suffer so much. I've gone from 100kg (220lbs) to 75kg (165lbs) since last year and I feel the cold quite a bit more than before. I really, really hope I don't feel the heat as much too, or else I'm screwed.

And that's as far off topic as it goes, I think.


----------



## Transmaniacon

Hey guys, currently have a MSI HD7950 and it's been serving me well but I can't max AAA titles at 1080P anymore and am likely due for an upgrade. I see some killer deals on the R9 series, but I know the new generation is also coming out this summer. Newegg has a MSI R9 390X for $379, I haven't seen one this low before, think it's worth jumping on that deal or waiting for the new cards to come out? I likely won't be someone who buys the top end cards, so I would probably need to wait longer for the non flagship cards to show up.


----------



## yuannan

Quote:


> Originally Posted by *Transmaniacon*
> 
> Hey guys, currently have a MSI HD7950 and it's been serving me well but I can't max AAA titles at 1080P anymore and am likely due for an upgrade. I see some killer deals on the R9 series, but I know the new generation is also coming out this summer. Newegg has a MSI R9 390X for $379, I haven't seen one this low before, think it's worth jumping on that deal or waiting for the new cards to come out? I likely won't be someone who buys the top end cards, so I would probably need to wait longer for the non flagship cards to show up.


This one is on sale in the UK:

http://www.amazon.co.uk/dp/B014F7A65W/ref=dra_a_sm_mr_ho_xx_P3033_200?tag=dradisplay0bb-21&ascsubtag=8b9c3c00a17f3c33a0bafa96be8b22eb_S

Water cooled and all, check the US amazon, maybe it's also on sale.


----------



## Transmaniacon

Quote:


> Originally Posted by *yuannan*
> 
> This is one sale in the UK.
> 
> http://www.amazon.co.uk/dp/B014F7A65W/ref=dra_a_sm_mr_ho_xx_P3033_200?tag=dradisplay0bb-21&ascsubtag=8b9c3c00a17f3c33a0bafa96be8b22eb_S
> 
> Water cooled and all, check the UK amazon, maybe it's also on sale.


That one is $450 in the US, unfortunately. I haven't seen one this low yet; most are right around $400.


----------



## yuannan

Quote:


> Originally Posted by *Transmaniacon*
> 
> That one is $450 in the US unfortunately, I haven't seen one this low yet, most are right around $400.


I meant US Amazon in the last post btw. If you are gonna get a 390, I would get a Sapphire one. It's very quiet and OCs very well, for me anyway.


----------



## Transmaniacon

Quote:


> Originally Posted by *yuannan*
> 
> I meant US amazon in the last post btw, if you are gonna get a 390 I would get a sapphire one. It's very quiet and OCs very well, for me anyway.


Yeah, I would either go MSI or Sapphire for the R9 series. I have had good luck with my MSI HD7950 though, and it seems to be the better overclocker of the two.


----------



## yuannan

Quote:


> Originally Posted by *Transmaniacon*
> 
> Yeah I would either go MSI or Sapphire for the R9 series, I have had good luck with my MSI HD7950 though and it seems to be the better overclocker of the two.


MSI does seem to OC better, but I prefer the Sapphire one as it's quieter. 50MHz is not that much for me.


----------



## Transmaniacon

Quote:


> Originally Posted by *yuannan*
> 
> MSI does seem to OC better, but I prefer the sapphire one as it's quieter. 50mhz is not not that much for me.


Yeah, they do have a very good cooler. I, however, am planning a move to ITX, and that Sapphire card limits my case options. Still trying to decide if I want to wait for the new series, but the 7950 is starting to show its age.


----------



## yuannan

Quote:


> Originally Posted by *Transmaniacon*
> 
> Yeah they do have a very good cooler, I however am planning a move to ITX and that Sapphire card limits my case options. Still trying to decide if I want to wait for the new series, but the 7950 is starting to show it's age.


A Nano?
The next best card would be a 380. Or a 970 if you are evil.

Talking about these 2 cards:

http://www.ebuyer.com/714716-sapphire-r9-380-itx-compact-oc-2gb-gddr5-dvi-i-hdmi-dual-mini-11242-00-20g?gclid=Cj0KEQjwz-i3BRDtn53Z5Z7t4PUBEiQA23q2ADpj8z-rDKabmwt6el5bG3uXTuazmBfMImITCbYUXX8aAvDD8P8HAQ#fo_c=951&fo_k=fb82c729ab3b809181759ce019fdc31d&fo_s=gplauk?mkwid=sxcObZ2Xs_dc&pcrid=51482416379&pkw=&pmt=

http://www.novatech.co.uk/products/components/nvidiageforcegraphicscards/nvidiagtx970series/Gigabyte/GV-N970IXOC-4GD.html?gclid=Cj0KEQjwz-i3BRDtn53Z5Z7t4PUBEiQA23q2ADobxJdrHyYG0bGRRUQjyXNpp7VpJlNG2LLsUB211RsaAl6N8P8HAQ#utm_source=google&utm_medium=base&utm_campaign=products


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> MSI does seem to OC better, but I prefer the sapphire one as it's quieter. 50mhz is not not that much for me.


Matter of opinion and fan curve. Above 60% they are ridiculously loud; as a matter of fact, I changed my fan curve so it never exceeds 60%. Now watercooled, all my fans are running at 1200rpm and my card stays in the low 40s°C. Quiet and cool.

P.S. As of right now I've got 10 fans with 2 more on the way.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> Matter of opinion and fan curve. After 60% they are ridiculously loud, as a matter of fact i changed my fan curve as to never exceed 60%. Now watercooled all my fans are running at 1200rpm and my card stays in the low 40°C. Quiet and cool.
> 
> P.S. As of right now ive got 10fans with 2 more on the way.


Where did you get the blocks? ek or universal ones?


----------



## Transmaniacon

Quote:


> Originally Posted by *yuannan*
> 
> A nano?
> The next best card will be a 380. Or a 970 if you are evil
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Talking about these 2 cards:
> 
> http://www.ebuyer.com/714716-sapphire-r9-380-itx-compact-oc-2gb-gddr5-dvi-i-hdmi-dual-mini-11242-00-20g?gclid=Cj0KEQjwz-i3BRDtn53Z5Z7t4PUBEiQA23q2ADpj8z-rDKabmwt6el5bG3uXTuazmBfMImITCbYUXX8aAvDD8P8HAQ#fo_c=951&fo_k=fb82c729ab3b809181759ce019fdc31d&fo_s=gplauk?mkwid=sxcObZ2Xs_dc&pcrid=51482416379&pkw=&pmt=
> 
> http://www.novatech.co.uk/products/components/nvidiageforcegraphicscards/nvidiagtx970series/Gigabyte/GV-N970IXOC-4GD.html?gclid=Cj0KEQjwz-i3BRDtn53Z5Z7t4PUBEiQA23q2ADobxJdrHyYG0bGRRUQjyXNpp7VpJlNG2LLsUB211RsaAl6N8P8HAQ#utm_source=google&utm_medium=base&utm_campaign=products


Well, not that compact. I have been liking the new Fractal Nano S, which can fit most full-sized cards; the Sapphire Nitro might be a bit too big though.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> Where did you get the blocks? ek or universal ones?


It's the Alphacool one; it cools the VRMs passively. Mine stay around the low 60s°C with only ONE intake fan (yes, I know, that's why I'm adding more this week lol), so I'm hoping to get even less. I wish EK or Aquacomputer made one for the Sapphire Nitro; I would have gotten it, as they'd cool the VRMs with water as well. Stupid custom PCB. But I'm hoping that once I get new thermal pads (the 11 W/mK for me, as the 14-17 are ludicrously expensive) I might get even lower temps.


----------



## jodybdesigns

Quote:


> Originally Posted by *Transmaniacon*
> 
> Hey guys, currently have a MSI HD7950 and it's been serving me well but I can't max AAA titles at 1080P anymore and am likely due for an upgrade. I see some killer deals on the R9 series, but I know the new generation is also coming out this summer. Newegg has a MSI R9 390X for $379, I haven't seen one this low before, think it's worth jumping on that deal or waiting for the new cards to come out? I likely won't be someone who buys the top end cards, so I would probably need to wait longer for the non flagship cards to show up.


I jumped from Crossfire 7950's to a single R9 390. It was worth it. Feels good to play all my games in 1440p without Crossfire, or the stutters associated with it. CFX is in a really bad place right now. I have my R9 390 OC'd and I get 390X performance.

The BEST deal you COULD do is find a used 290 or 290X. It's the same card, and I see a lot of 290X's smoking a 390X.


----------



## Transmaniacon

Quote:


> Originally Posted by *jodybdesigns*
> 
> I jumped from Crossfire 7950's to a single R9 390. It was worthy. Feels good to play all my games in 1440p without Crossfire, or the stutters associated with Crossfire. CFX is in a really bad place right now. I have my R9 390 OC'd and I get 390X performance.
> 
> The BEST deal you COULD do, is find a used 290 or 290x. It's the same card, and I see a lot of 290x's smoking a 390x.


Nice to know you got a good performance boost. I have never really been interested in dual-GPU setups; I don't play at the resolutions to justify them. I think I would rather go with a new GPU. I kind of feel like buying a used GPU is like buying a used sports car: you know it's been wrung out and driven hard.

I was planning on the 390, but for like $30 more I can get the 390X with that sale going on. Is the plan usually to release the high-end cards and then have the mid-range ones follow after? I suppose I would be in the market for an R9 490 essentially, just not sure I want to wait until the fall.


----------



## GorillaSceptre

Anyone else have extreme temps in game menus? It's been very hot where I live this summer (yesterday was 35C), so I'm used to my card hitting low-80s when playing games, but yesterday I paused 2016 Tomb Raider to get something to eat, and came back to hearing my fans pegged at 100% and temps of 94C reached... Nothing has ever made my card get that hot, not even torture tests. What the hell is happening there? Lol.

Is there something i can do in the drivers or something to stop that from happening? I'm on 16.1.1 if that helps, thanks.


----------



## yuannan

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Anyone else have extreme temps in game menus? It's been very hot where i live this summer (yesterday was 35C), so I'm used to my card hitting low-80s when playing games, but yesterday i paused 2016 Tomb Raider to get something to eat, and came back to hearing my fans pegged at 100% and temps of 94C reached... Nothing has ever made my card get that hot, not even torture tests.. What the hell is happening there? Lol.
> 
> Is there something i can do in the drivers or something to stop that from happening? I'm on 16.1.1 if that helps, thanks.


Lemme guess, ASUS?


----------



## christoph

hey guys

One question about the drivers: do I need to download the chipset drivers too, or do the main Crimson drivers come with EVERYTHING included, i.e. the chipset drivers, the RAID driver and everything?


----------



## yuannan

Quote:


> Originally Posted by *christoph*
> 
> hey guys
> 
> one question about the drivers, do I need to download the chipset drivers too? or the main crimson drivers comes with EVERYTHING includeed? the chipset drivers, the raid driver and everrything?


It includes everything; just make sure to untick the AMD Gaming App, as no one uses that.


----------



## bluej511

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Anyone else have extreme temps in game menus? It's been very hot where i live this summer (yesterday was 35C), so I'm used to my card hitting low-80s when playing games, but yesterday i paused 2016 Tomb Raider to get something to eat, and came back to hearing my fans pegged at 100% and temps of 94C reached... Nothing has ever made my card get that hot, not even torture tests.. What the hell is happening there? Lol.
> 
> Is there something i can do in the drivers or something to stop that from happening? I'm on 16.1.1 if that helps, thanks.


Yeah, I would update to the latest; not sure when it was that AMD had that fan curve issue. Depends on the card, but yeah, Asus (which I think is ridiculously overpriced in everything) took their god-awful R9 290 coolers and just stuck 'em on the R9 390 without doing any work. Pretty sure 94°C is when the cards start throttling. If you're not using Afterburner, it could be that Crimson just isn't ramping the fan speed enough under load to cool your card. I remember without a fan curve I'd hit mid 80s even on the Nitro; with a fan curve, closer to 70-71.
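For reference, a custom fan curve like the ones being discussed is just a set of (temp, fan%) points with linear interpolation in between and a hard cap at the top. A rough sketch of the logic (the points are made up for illustration, not anyone's actual profile):

```python
# Piecewise-linear fan curve with a 60% cap, like the curves
# discussed above. The points are illustrative, not a recommendation.
CURVE = [(40, 20), (60, 40), (75, 60)]  # (temp in C, fan speed in %)

def fan_speed(temp):
    """Return fan % for a given temp, interpolating between points."""
    if temp <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp <= t1:
            # Linear interpolation between the two surrounding points.
            return s0 + (s1 - s0) * (temp - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last point, hold the 60% cap

print(fan_speed(50))  # prints 30.0
print(fan_speed(90))  # prints 60 (capped)
```

Afterburner and Crimson both apply essentially this shape; the "ramping" complaint above is about the stock points sitting too far right, so the fan only spins up once the card is already hot.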


----------



## christoph

Quote:


> Originally Posted by *yuannan*
> 
> It includes everything, make sure to untick AMD gaming app as no one uses that.


Oh yeah, I don't have that app; can't find a use for it. Thanks.


----------



## GorillaSceptre

Quote:


> Originally Posted by *yuannan*
> 
> Lemme guess, ASUS?


Quote:


> Originally Posted by *bluej511*
> 
> Yea i would update to the latest, not sure when it was that AMD had that fan curve issue. Depends on the card but yea Asus (which i think are ridiculously overpriced in everything) took their god awful r9 290 coolers and just stuck em on the r9 390 without doing any work. Pretty sure 94°C is when the cards start throttling. If you're not using afterburner it could be that crimson just isn't ramping the fan speed enough during load to cool your card. I remember without a fan curve id hit mid 80s even on the Nitro, with a fan curve closer to 70-71.


It's an MSI. The weird thing is it's fine during gameplay; it's only in menus that the temps shoot through the roof.

I'll update crimson and see how it goes.


----------



## bluej511

Quote:


> Originally Posted by *GorillaSceptre*
> 
> It's an MSI, weird thing is it's fine during gameplay, it's only in menus that the temps shoot through the roof..
> 
> I'll update crimson and see how it goes.


Could set vsync on or frame rate target control. It is weird its only doing it in menus though.


----------



## Agent Smith1984

It's doing it in menus because menus tend to run at an insanely high framerate, sometimes over 200FPS... Just set a cap of 120, or wherever your refresh rate tops out, and it will help some. I've seen this happen to a few people before.
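For anyone curious why a cap helps: an uncapped menu renders frames as fast as the GPU can, so a limiter simply sleeps away the rest of each frame's time budget. A minimal sketch of the idea (illustrative only, not how the driver's actual FRTC is implemented):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame at the cap

def limited_loop(render_frame, frames):
    """Render `frames` frames, sleeping out each frame's leftover budget."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # a cheap menu frame finishes almost instantly
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # Without this sleep the GPU starts the next frame right away
            # (menus can hit 200+ FPS); capping the rate cuts power and heat.
            time.sleep(FRAME_BUDGET - elapsed)
```

Under load the render itself eats the whole budget and the sleep never fires, which is why the cap costs nothing during actual gameplay.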


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's doing it in menus because menus tend to run an insanely high framerate Sometimes over 200FPS.... just set a cap of 120 or whatever your refresh rate tops out at and it will help some. I've seen this happen to a few people before.


My thought exactly, but if it's doing it in menus he's got vsync off, meaning his card should be running flat out in games too and heating up the same way regardless of fps.


----------



## GorillaSceptre

Quote:


> Originally Posted by *bluej511*
> 
> Could set vsync on or frame rate target control. It is weird its only doing it in menus though.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's doing it in menus because menus tend to run an insanely high framerate Sometimes over 200FPS.... just set a cap of 120 or whatever your refresh rate tops out at and it will help some. I've seen this happen to a few people before.


Appreciate the help, gents.

Okay, I'll set up frame rate target control. I have a 60Hz panel, so I may as well set it to 60? I assumed it would be because of the high fps in menus, but I didn't think it would heat my card up more than Kombuster...

I remember there were temp issues with Crimson in conjunction with AB. Is that still a problem, maybe?


----------



## bluej511

Pretty sure that's been fixed; it worked for me a month ago. The Crimson fan curve used to suck. Set it to either 59 or 60. I have mine at 60, but sometimes it goes to 60.5 and causes tearing. Can't wait for my FreeSync ultrawide.


----------



## yuannan

Has anyone experienced damage after overclocking a monitor? I managed to bump my S29E790C to a whopping 85Hz; there seems to be no frame skipping, and it's fine so far.


----------



## christoph

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's doing it in menus because menus tend to run an insanely high framerate Sometimes over 200FPS.... just set a cap of 120 or whatever your refresh rate tops out at and it will help some. I've seen this happen to a few people before.


good thinking


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> Has anyone experienced damage after overclocking monitor? I managed to bump my s29e790c to a whooping 85hz, there seems to no frame skipping and it's fine so far.


Always a possibility as with any overclock.


----------



## jodybdesigns

Quote:


> Originally Posted by *bluej511*
> 
> My thought exactly but if its doing it in menus hes got vsync off meaning his card should be running full in games too and heating up the same way irregardless of fps.


Depends though; he could be hitting 1K+ FPS. That will cause it to skyrocket over the normal load from gameplay.


----------



## ronaldoz

Tomorrow I'm getting another 390X, because I brought the other one back (the bearing on its fan makes a bit of extra noise), so maybe I could run them in crossfire for the contest. Not sure if it's that easy. And I've got 750W... I guess it won't really work, yeah?


----------



## jdorje

I only get coil whine in loading screens, particularly in valley/heaven during the closing credits when it hits 4000 fps. The load is lower then, but something about high fps itself causes the problem.


----------



## tolis626

Quote:


> Originally Posted by *ronaldoz*
> 
> Tomorrow I'm getting another 390X, because I brought the other one back (the bearing on its fan makes a bit of extra noise), so maybe I could run them in crossfire for the contest. Not sure if it's that easy. And I've got 750W... I guess it won't really work, yeah?


Meh, undervolt them as far as they'll go at stock speeds and 750W is plenty. Mine uses like less than 150W on average at 1040MHz and -100mV (or under 1.1V under load). Do it, [Gollum voice] WE NEEDS IT! [/Gollum voice]


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> I only get coil whine in loading screens, particularly in valley/heaven during the closing credits when it hits 4000 fps. The load is lower then, but something about high fps itself causes the problem.


Honestly I don't think I've ever heard coil whine, and I'm so picky about that stuff. My fans aren't even that loud, so I'd hear it, but I've never had it.


----------



## ronaldoz

Quote:


> Originally Posted by *tolis626*
> 
> Meh, undervolt them as far as they'll go at stock speeds and 750W is plenty. Mine uses like less than 150W on average at 1040MHz and -100mV (or under 1.1V under load). Do it, [Gollum voice] WE NEEDS IT! [/Gollum voice]


Haha, well, I might give it a chance and see if it works out.


----------



## Metalbeard

Quote:


> Originally Posted by *jodybdesigns*
> 
> I jumped from Crossfire 7950's to a single R9 390. It was worthy. Feels good to play all my games in 1440p without Crossfire, or the stutters associated with Crossfire. CFX is in a really bad place right now. I have my R9 390 OC'd and I get 390X performance.
> 
> The BEST deal you COULD do, is find a used 290 or 290x. It's the same card, and I see a lot of 290x's smoking a 390x.


Similar situation for me. I upgraded to a 390 from two 6970's. I didn't think I noticed the microstutter people always talk about but the single card is noticeably smoother. I run with vsync on so the frames are the same. So far very happy with the purchase.


----------



## jdorje

Quote:


> Originally Posted by *ronaldoz*
> 
> Tomorrow I'm getting another 390X, because I brought the other one back (the bearing on its fan makes a bit of extra noise), so maybe I could run them in crossfire for the contest. Not sure if it's that easy. And I've got 750W... I guess it won't really work, yeah?


If I clock my 4690k at [email protected] and my 390 at [email protected] it's about 400W wall draw under x264+valley. I'm rather sure I could run a second 390 on 650W.


----------



## ronaldoz

Quote:


> Originally Posted by *jdorje*
> 
> If I clock my 4690k at [email protected] and my 390 at [email protected] it's about 400W wall draw under x264+valley. I'm rather sure I could run a second 390 on 650W.


Sounds nice. So tomorrow I'm gonna try it!


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> If I clock my 4690k at [email protected] and my 390 at [email protected] it's about 400W wall draw under x264+valley. I'm rather sure I could run a second 390 on 650W.


Nice, I tried 4.5 on mine yesterday, got her stable at 1.155 for 15 mins. I measured with my Fluke and it was closer to 1.149 or so I believe; it's weird, whatever I put in the BIOS turns out to be a bit lower. My temps didn't change that much either. My delid is running at the same temps as before though, guessing there might be something going on. Once I get my new screws I'm gonna take apart the CPU block AGAIN, I think there's something clogging it up.


----------



## jdorje

Quote:


> Originally Posted by *ronaldoz*
> 
> Sounds nice. So tomorrow I'm gonna try it!


Note that 1.125 is well below typical stock voltages!


----------



## ronaldoz

Quote:


> Originally Posted by *jdorje*
> 
> Note that 1.125 is well below typical stock voltages!


Yes, I have to find out what clocks the card needs when lowering voltages.


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> If I clock my 4690k at [email protected] and my 390 at [email protected] it's about 400W wall draw under x264+valley. I'm rather sure I could run a second 390 on 650W.


And I get about 500W max from the wall with my 3570K at 1.5V and my 390 at +125mV. And these are high voltages for any normal user; 650W is plenty for anyone that won't push every single part in their system to its limits.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> And I get about 500W max from the wall with my 3570K at 1.5V and my 390 at +125mV. And these are high voltages for any normal user; 650W is plenty for anyone that won't push every single part in their system to its limits.


True, but PSUs aren't very efficient at higher wattages. Might sound insane, but if I'm around 350-400W my RM1000 is at peak efficiency.


----------



## pojoFX

Quote:


> Originally Posted by *rdr09*
> 
> Do you mind updating them with tessellation off . . .
> 
> 
> 
> Set Tess Mode to Override Application Settings and Maximum tess level to off. Thanks.


The actual entry I submitted is with tess off. I think I messed with the clocks a bit but the results were trivial...my 390 just isn't that great of an overclocker. Firestrike saw a bit of a gain with tess turned off though. Here's the link to my entry:

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1360#post_25026619


----------



## jodybdesigns

Quote:


> Originally Posted by *Stige*
> 
> And I get about 500W max from the wall with my 3570K at 1.5V and my 390 at +125mV. And these are high voltages for any normal user; 650W is plenty for anyone that won't push every single part in their system to its limits.


1.5V, holy crap lol. What clocks are you pushing when you run that many volts? I have my chip at 4.2GHz 1.24V. Too huge a jump for me to hit 4.3GHz (1.28), 4.4 (1.31), 4.5 (1.36). I am currently underclocked. I got extremely lucky up to 4.2GHz, but my chip falls flat on its face over 4.2.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> True but psus arent very efficient at higher wattages. Might sound insane but if im around 350-400w my rm1000 is at peak effiency.



90%+ in any case really; at ~500W it's over 91%. That is for my PSU.
I would call that "efficient".

Any quality PSU will hold good efficiency at any load.
Quote:


> Originally Posted by *jodybdesigns*
> 
> 1.5V, holy crap lol. What clocks are you pushing when you run that many volts? I have my chip at 4.2GHz 1.24V. Too huge a jump for me to hit 4.3GHz (1.28), 4.4 (1.31), 4.5 (1.36). I am currently underclocked. I got extremely lucky up to 4.2GHz, but my chip falls flat on its face over 4.2.


It used to do 5.1GHz but dropped it to 4.7GHz now at 1.4V due to degradation.


----------



## bluej511

Quote:


> Originally Posted by *jodybdesigns*
> 
> 1.5V, holy crap lol. What clocks are you pushing when you run that many volts? I have my chip at 4.2GHz 1.24V. Too huge a jump for me to hit 4.3GHz (1.28), 4.4 (1.31), 4.5 (1.36). I am currently underclocked. I got extremely lucky up to 4.2GHz, but my chip falls flat on its face over 4.2.


Ivy Bridge is hungry, damn lol. I'm at 1.079-1.088 for 4.2.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> 
> 90%+ in any case really. At ~500W it's over 91%. That is for my PSU.
> I would call that "efficient"
> 
> Any quality PSU will do good efficiency at any usage.
> It used to do 5.1GHz but dropped it to 4.7GHz now at 1.4V due to degradation.


Right, but at 50% load you'd be at 93%. Idk how much a kWh costs in Finland, but every % helps.


----------



## jodybdesigns

Quote:


> Originally Posted by *Stige*
> 
> 
> It used to do 5.1GHz but dropped it to 4.7GHz now at 1.4V due to degradation.


Lol can you really tell a real world difference that high? Or was it strictly for benching? Or do you 24/7 that thing like a Las Vegas Hooker? lol
Quote:


> Originally Posted by *bluej511*
> 
> Ivy bridge hungry damn lol. Im at 1.079-1.088 for 4.2


Yes they are. And hot. Forgot all about hot.


----------



## Stige

Quote:


> Originally Posted by *jodybdesigns*
> 
> Lol can you really tell a real world difference that high? Or was it strictly for benching? Or do you 24/7 that thing like a Las Vegas Hooker? lol
> Yes they are. And hot. Forgot all about hot.


It ran fine for about 3 years at 1.52V at 5.1GHz, started to get crashes after. I mean temps never went above ~65C so why not?


----------



## bluej511

Since I love math, I'll do a lil math haha. At 90% efficiency, a 400W load pulls about 444W from the wall; at 93% efficiency it's about 430W, so a difference of roughly 14W. At $0.15/kWh, gaming 4 hrs a day, you'll save about $3 a year. Doesn't sound like much, right haha? BUT if you live where it's $0.35/kWh you save about $7 a year. Yea, it doesn't sound like much I know, but this is just one example; the difference could be more extreme, could be 88% vs 93%. Just a thought: this is pretty much free savings by picking the right PSU in the first place. And the less load on a PSU, the longer it lasts.
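The estimate above is easy to reproduce. Here's a quick sketch of the same calculation using exact division (so the results may differ slightly from any rounded wall-draw figures); the 400W load, 4 hrs/day, and the two rates are the values assumed in the post:

```python
# Estimate yearly savings from a more efficient PSU.
# Wall draw = DC load / efficiency, so a higher-efficiency unit
# pulls fewer watts from the outlet for the same load.

def wall_draw_w(dc_load_w, efficiency):
    """Watts pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

def yearly_savings(dc_load_w, eff_low, eff_high, hours_per_day, rate_per_kwh):
    """Money saved per year by moving from eff_low to eff_high."""
    delta_w = wall_draw_w(dc_load_w, eff_low) - wall_draw_w(dc_load_w, eff_high)
    kwh_per_year = delta_w * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(round(yearly_savings(400, 0.90, 0.93, 4, 0.15), 2))  # ~3.14 at $0.15/kWh
print(round(yearly_savings(400, 0.90, 0.93, 4, 0.35), 2))  # ~7.33 at $0.35/kWh
```

Small numbers per year, but the gap widens with higher rates, more hours, or a bigger efficiency spread.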


----------



## Stige

Well if you do account efficiency, my system doesn't take much over 450W from the PSU itself under any circumstances. Heck I would even dare to claim that any quality 500W PSU will run a single card setup without any issues even at high overclocks.


----------



## jdorje

I can break 550W with my 4690k drawing 200W and my 390 drawing 325W. Takes some work though!


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> I can break 550w with my 4690k drawing 200 and my 390 325. Takes some work though!


Are you using a plug on the wall outlet to measure total consumption? You shouldn't believe what GPU-Z says for power usage.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> Well if you do account efficiency, my system doesn't take much over 450W from the PSU itself under any circumstances. Heck I would even dare to claim that any quality 500W PSU will run a single card setup without any issues even at high overclocks.


Right, but it also depends. My calculation was only for gaming 4 hrs a day, not accounting for other tasks. For example, at 400W the difference between Bronze and Gold, at 4 hrs a day, is only about a $3 saving here in France, lol; in Germany it's more like $12. Worth it to get a Gold PSU. Double that if you run at full power 8 hrs a day, gaming or editing or whatever it is.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Right, but it also depends. My calculation was only for gaming 4 hrs a day, not accounting for other tasks. For example, at 400W the difference between Bronze and Gold, at 4 hrs a day, is only about a $3 saving here in France, lol; in Germany it's more like $12. Worth it to get a Gold PSU. Double that if you run at full power 8 hrs a day, gaming or editing or whatever it is.


I reckon I save a few bucks owning a Platinum PSU compared to having something like Bronze considering my PC is on 24/7 and I game A LOT.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> I reckon I save a few bucks owning a Platinum PSU compared to having something like Bronze considering my PC is on 24/7 and I game A LOT.


Yea for sure, if they made a 1000W Platinum I'd love it, or Titanium, MMMM. What people don't get is it's FREE savings: you buy a Gold PSU instead of Bronze, usually on sale or whatever, and you save. Mine was 165€ for an RM1000, usually 200€.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Yea for sure, if they made a 1000w platinum id love it, or titanium MMMM. What people dont get is its FREE savings, you buy a gold psu instead of bronze, usually on sale or wtv and u save. Mine was 165€ for an rm1000 usually 200€


Or the fact that you don't have to buy a PSU every 3 years because you decided to save money on the wrong part in your PC...


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> Or the fact that you don't have to buy a PSU every 3 years because you decided to save money on the wrong part in your PC...


True, but PSUs do age. I don't remember when I bought my TX650, but after 5 years it started to smell. Sent it in for warranty, got a new TX650 V2, ran that for a couple of years, and I still have it now as a spare for testing and whatnot; good for watercooling. Modular makes it so easy to keep tidy too. Now I just need red extensions for my R9 390.


----------



## jdorje

1050w gs was on sale last summer for $90. I couldn't resist and am very happy even though I currently use like 475w of it.

Every few months it seems like there's a great psu sale.


----------



## jdorje

Quote:


> Originally Posted by *Stige*
> 
> Are you using a plug on the wall outlet to measure total consumption? You shouldn't believe what GPU-Z says for power usage.


Yes, kill-a-watt. Breaking it down into cpu and gpu usage is from hwinfo data but takes some approximation.

200w on the 4690k is a lot of work and only viable for a few seconds from a cold loop. Pushing the 390 above 300W is easy by comparison.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> 1050w gs was on sale last summer for $90. I couldn't resist and am very happy even though I currently use like 475w of it.
> 
> Every few months it seems like there's a great psu sale.


Damn, they have a 1000W TITANIUM for $280. I need to find this in Europe somewhere. EVGA is scarce here, idk why.


----------



## yuannan

I'm at 280 watts max load (FireStrike Extreme combined on loop); it peaks to 290 sometimes, but it sits at 280 and idles at 120-150W with light tasks like YouTube.


----------



## jdorje

Quote:


> Originally Posted by *bluej511*
> 
> Damn they have a 1000w TITANIUM for 280$, i need to find this in europe somewhere. EVGA is scarce here idk why.


Because it's a California company?

EVGA doesn't make any of these units though. GS is Seasonic, but the low-wattage GS units are pretty mediocre. G2 is the Super Flower Leadex and probably a little better. GQ is decent and G1 is bad.

And there are Bronze and Platinum versions of most. I guess you're looking at the T2?

My GS is supposed to run at near-Platinum efficiency under 100W load. That was the final selling point for me.
Quote:


> Originally Posted by *yuannan*
> 
> i'm at 280 watts max load (firestrike extreme combined on loop), it peaks to 290 some time but it sits at 280 and idles at 120-150w with light tasks like youtube.


Youtube isn't idle; the gpu does a fair amount of work. My system is around 61W idle.

To get an upper limit on realistic load run x264 and valley at the same time.


----------



## bluej511

Under-100W load only happens if you OC using turbo with energy savings (like any sane person should do haha). Seasonic is all over the place here; there's even a French shop, ldlc.com, that has Seasonic make their PSUs. Platinum 1000W for 200€: http://www.ldlc.com/fiche/PB00181085.html


----------



## yuannan

Quote:


> Originally Posted by *jdorje*
> 
> Because it's a California company?
> 
> Evga doesn't make any of these units though. Gs is seasonic, but the low wattage gs units are pretty mediocre. G2 is the super flower leadex and probably a little better. Gq is decent and g1 is bad.
> 
> And there's bronze and platinum versions of most. I guess you're looking at the t2?
> 
> My gs is supposed to run at near platinum level under 100w load. That was the final selling point for me.
> Youtube isn't idle; the gpu does a fair amount of work. My system is around 61W idle.
> 
> To get an upper limit on realistic load run x264 and valley at the same time.


That's as low as it goes for me; my mobo doesn't boot with adaptive voltages, so I have to pick a manual voltage. My GPU also doesn't downclock its RAM at idle, so there is little if any difference between 0% and 20% load.


----------



## jdorje

120W idle is crazy. That would cost me $120 a year to idle full time.

You can't run 2 monitors on the 390. Put the second on the iGPU or even a cheap old card.

You can't use C-states? Those have far more effect than adaptive voltage.
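The "$120 a year" figure above is easy to sanity-check. A quick sketch; the ~$0.11/kWh rate is my assumption, since the post doesn't state one:

```python
# Yearly electricity cost of a constant load (e.g. a 120 W idle draw).
def yearly_cost_usd(watts, rate_per_kwh, hours_per_day=24):
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(round(yearly_cost_usd(120, 0.11), 2))  # ~115.63, i.e. roughly $120/yr
```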


----------



## yuannan

Quote:


> Originally Posted by *jdorje*
> 
> 120 idle is crazy. That would cost me $120 a year to idle full time.
> 
> You can't run 2 monitors on the 390. Put the second on igpu or even a cheap old card.
> 
> You can't use cstates? Those have far more effect than adaptive voltage.


C-states are enabled, so is SpeedStep. I also run only one monitor (s29e790c, 2560*[email protected]). It's not that demanding, yet the GPU refuses to downclock. I think it's a BIOS issue and user @gupsterg has offered to make a custom one. I would say 60% of that power is due to the high mem clock; VRM2 at idle is at 60-65C, so the GPU is pulling some heavy juice.


----------



## jdorje

If I run two monitors with my 1740 memory my card takes 80w more at idle. It's an insane bug.


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> 120 idle is crazy. That would cost me $120 a year to idle full time.
> 
> You can't run 2 monitors on the 390. Put the second on igpu or even a cheap old card.
> 
> You can't use cstates? Those have far more effect than adaptive voltage.


I get about 200W on desktop at idle.


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> I get about 200W on desktop at idle.


Your profile picture is perfect for this


----------



## bluej511

It's a known issue with Hawaii running dual monitors. Yea, 200W at idle is INSANE. Yea, YouTube or Chrome, if you have hardware acceleration enabled, will use the GPU more. DirectX baby.


----------



## Stige

It's the monitor overclock that does it. If you use tight timings for the 120Hz then it cannot downclock fully at idle; there are timings that do downclock, but mine doesn't do 120Hz without errors if I don't run tightened timings.


----------



## diggiddi

Quote:


> Originally Posted by *tolis626*
> 
> Meh, undervolt them as far as they'll go at stock speeds and 750W is plenty. Mine uses like less than 150W on average at 1040MHz and -100mV (or under 1.1V under load). Do it, [Gollum voice] WE NEEDS IT! [/Gollum voice]


It depends heavily on the age and quality of the PSU. My Antec 750 HCG used to be able to handle crossfired Lightning 290Xs, but not any more.
I trip OCP on just one stock GPU in FireStrike. I have since halved the core/mem clocks and they run much cooler too; until a new PSU arrives, then back to ludicrous mode.


----------



## PhillyB

Sorry to change the topic... but I finally got an Alphacool waterblock fitted onto my MSI 390X and overclocked it pretty quickly. Set the power limit to +30% and have the clocks at 1180/1650. Kombuster seems stable and temps are good.

I ran FireStrike and got 12453 overall, which seems good. Not sure.

Graphics: 14538
Physics: 13979
Combined: 5561

The oddity is that after checking the single-GPU top 30 FireStrike, that would put me in 3rd, which seems odd for a non-extreme overclock compared to the ones on the list.


----------



## bluej511

Quote:


> Originally Posted by *PhillyB*
> 
> Sorry to change the topic...but i finally got a Alphacool waterblock fitted onto my MSI 390x and overclocked it pretty quickly. Set power limit to 30% and have the clocks at 1180/1650. Kombuster seems stable and temps are good.
> 
> I Ran Firestrike and got 12453 overall which seems good. not sure.
> 
> Graphics: 14538
> Physics 13979
> combined 5561
> 
> the oddity is that after checking the single gpu top 30 firestrike, that would put me into 3rd, which seems odd for a non-extreme overclock like the ones on the list.


What are your temps? Should have gotten an EKWB, it cools the VRMs as well.


----------



## jdorje

14500 isn't very high; you can do better. The full firestrike score is extremely cpu dependent. Pretty terrible as a gpu benchmark.

Push your power limit at least to maximum, for starters. Then consider editing your vram timing straps.

And go register here: http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


----------



## Noirgheos

Might not be the place to ask this...

But should I install chipset drivers and the likes? SATA, USB3, etc?


----------



## christoph

Quote:


> Originally Posted by *Noirgheos*
> 
> Might not be the place to ask this...
> 
> But should I install chipset drivers and the likes? SATA, USB3, etc?


I was told that the main Crimson driver, the whole package (the one that's like 300 MB), installs everything you need.


----------



## diggiddi

Yeah, what he said. Crimson installs USB, audio (not sound card), SATA, etc. drivers.


----------



## Noirgheos

Quote:


> Originally Posted by *diggiddi*
> 
> Yeah what he said, Crimson installs Usb, audio(not soundcard), sata etc drivers


I'm talking about motherboard drivers for my Intel chipset. Should I download them from the ASUS site?


----------



## Chaoz

Yeah, definitely not the correct thread. As for your question: yes, get the latest from the ASUS site. For further questions refer to the Intel motherboard section.


----------



## bluej511

Yea, Crimson only installs Crimson drivers haha. It definitely doesn't do SATA, USB, etc. It does HDMI audio, as that's part of the card. As far as mobo drivers, the only ones I usually update are my audio drivers.


----------



## ronaldoz

For that contest, I've got 2 390Xs in my system, but the second one is idle. Crossfire is enabled in the AMD software. Do I need a cable to connect both cards maybe?

*Update* It's working in-game, lol, what a boost.


----------



## Slowpoke66

Quote:


> Originally Posted by *ronaldoz*
> 
> For that contest, I got 2 390x's in my system, but the second one is idle. Crossfire is enabled in AMD software. Do I need a cable to connect both cards maybe?
> 
> *Update* It's working in-game, lol, what a boost.


ULPS disabled?


----------



## ronaldoz

Quote:


> Originally Posted by *Slowpoke66*
> 
> ULPS disabled?


It's not disabled; however, I could run the tests in crossfire, and it gave a nice boost. I underclocked to 1000/1450, but it might be possible to run at higher clocks. The GPUs are both using 225W at the moment. Gave 50,000 extra score for the contest.


----------



## Gdourado

Anyone has two 390x in crossfire?
How is the performance with the latest Crimson drivers?
Are there still any problems or major issues with crossfire setups that still make it preferable to have a single card setup?

Cheers!


----------



## PhillyB

Quote:


> Originally Posted by *bluej511*
> 
> Wat r ur temps? Should have gotten an ekwb cools the vrms as well


Ran Kombuster at 1280x720 (default size) until temps evened out (~10 min).
Kombuster (stock): 133fps
Kombuster (OC): 144fps
CPU: 39C
GPU: 55C
GPU VRM: 87C
Delta T: ~7C

Ran a 15-min test on CPU and GPU with the AIDA64 stress test.
CPU: 58C
GPU: 47C
GPU VRM: 58C
Delta T: ~5C

I ordered the block in November, the day after it went on backorder... it got sent in Feb, then was stuck in customs for a month and a half. So when I ordered it, the EK block wasn't known to be released.
Quote:


> Originally Posted by *jdorje*
> 
> 14500 isn't very high; you can do better. The full firestrike score is extremely cpu dependent. Pretty terrible as a gpu benchmark.
> 
> Push your power limit at least to maximum, for starters. Then consider editing your vram timing straps.
> 
> And go register here: http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


I'm not really one to push my overclocks. Just something stable that I can leave on 24/7.


----------



## battleaxe

Quote:


> Originally Posted by *Gdourado*
> 
> Anyone has two 390x in crossfire?
> How is the performance with the latest Crimson drivers?
> Are there still any problems or major issues with crossfire setups that still make it preferable to have a single card setup?
> 
> Cheers!


I have a pair of them.

I haven't noticed anything different from one card, except the new Hitman game doesn't have an Xfire profile yet in DX12. I'm sure it's coming though, as they just added Xfire for DX11.

When I first loaded up the latest BETA Crimson I got a Windows Freeze on both Win7 and Win10 the first time I loaded in.

Not sure why, but since then it has been completely fine. No issues and seems a slight bump over last Beta driver.

I played a game of BF3 the other night in Xfire and it was so smooth I couldn't believe it. The new drivers seem to have smoothed things out even more than before. I really can't tell any difference between Xfire and a single card anymore, plus the added frames are nice. I think overall the improved frame pacing is overshadowing any stuttering issues. So at lower framerates, such as when running a single card, you get higher frame-time latency, whereas in Xfire there may be slightly more stuttering, but since the frame times are quicker it seems about the same. Either way, to me, they both seem near identical for gameplay, but with more power available for higher graphics. This is only my experience though; I'm sure some here will argue the other way.


----------



## jodybdesigns

Quote:


> Originally Posted by *Gdourado*
> 
> Anyone has two 390x in crossfire?
> How is the performance with the latest Crimson drivers?
> Are there still any problems or major issues with crossfire setups that still make it preferable to have a single card setup?
> 
> Cheers!


SLI and Xfire are both in a bad place. Actually, I don't think it has ever been this bad. Horrible support from both camps right now. But results may vary, so...


----------



## bluej511

Quote:


> Originally Posted by *PhillyB*
> 
> Ran Kombuster at 1280x720(default size) until temps evened out (~10 min).
> Kombuster(stock): 133fps
> Kobbuster(oc): 144fps
> CPU: 39c
> GPU: 55c
> GPUVrm: 87c:
> DeltaT:~7c
> 
> Ran a 15min test on CPU and GPU with the AIDA64 stress test.
> CPU: 58c
> GPU: 47c
> GPUVrm: 58c
> DeltaT: ~5c
> 
> I ordered the block in november the day after it went on backorder...got sent in Feb, then was stuck in customs for a month and a half. So when i ordered it, the EK block wasn't known to be released.
> im not really one to push my overclocks. Just something stable that I can leave on 24/7.


Nice delta; a lil weird that your VRMs are so hot though. I have the same block, my delta is about 10°C or so, my GPU only reaches 41-42°C, and the VRMs have never hit 70°C. I hope you didn't use the thermal paste that came with it; from the reviews I've read it's absolute garbage. I used GC Extreme.


----------



## battleaxe

Quote:


> Originally Posted by *jodybdesigns*
> 
> SLI and Xfire both are in a bad place. Actually, I don't think it has ever been this bad. Horrible support from both camps right now. But, results may vary so..


I suppose it depends on the games you play. For me, its been great. I play BF3 BF4 and Hitman. That's about it. That's all I have time for really. On those games other than Hitman in DX12 (for now) Xfire is awesome in my experience. I have had zero issues other than one of my cards dying which had nothing to do with Xfire.


----------



## PhillyB

Quote:


> Originally Posted by *bluej511*
> 
> Nice delta, a lil weird that your vrms are so hot though. I have the same block my delta is about 10°C or so and my gpu only reaches 41-42°C and VRMs have never hit 70°C. I hope you didn't use the thermal paste that came with it from the reviews ive read its absolute garbage. I used gc extreme.


I used MX4 on the GPU.
There's not a lot of airflow in the case. I only run my fans at 1250rpm, so the airflow over the passive block isn't very high. Could be the culprit for the higher VRM temps.
What's your overclock at? Mine's at 1180/1650.

edit: the deltas are so low because of the 2x 360s I'm using. Totally overkill.


----------



## patriotaki

Any alternative to MSI AB that can show hardware info while gaming?


----------



## bluej511

Quote:


> Originally Posted by *PhillyB*
> 
> I used MX4 on the GPU.
> There's not a lot of air flow in the case. I only run my fans at 1250rpm, so the air flow into the passive block isnt very high. Could be the culprit for the higher vrm temps.
> Whats your overclock at? mines at 1180/1650?
> 
> edit: the deltas are so low because of the 2x 360s im using. totally overkill.


I'm only using a 360 and a 240. I tightened mine down well though, looked with a flashlight and made sure it was flush. I've only got one intake fan and am getting low VRM temps for a GPX block. Even at 1200/1650 it didn't change much. Keeps my core at 43 and my 4690k at about 50.


----------



## PhillyB

hmm, interesting. Which stress tests have you run?


----------



## jdorje

Quote:


> Originally Posted by *patriotaki*
> 
> any alternative to the msi AB that can show hardware info while gaming?


RTSS with HWiNFO can show any sensors.

When exactly does the fanboy competition end? I'll probably resubmit and try to get the highest 390 score again, and fix the timing error that the scorer didn't notice when accepting my score.


----------



## bluej511

Quote:


> Originally Posted by *PhillyB*
> 
> hmm, interesting. Which stress tests have you run?


Honestly I use AC Syndicate, as stressful on CPU and GPU as I can find; gets the water temps right up there. I was shocked when I got it that I dropped 30°C from the Nitro cooler. I don't use FurMark as that messes with the voltage. Plus it stays at 100% at all times in Syndicate haha. I'll be adding more intake fans this week, one just isn't enough; hoping to get cooler air to my rads.

My core is about 41-42°C with a water temp of about 32-33°C; the CPU runs at about 50°C even delidded. Going to run it bare die this week as well and clean it up, I think there's something in there clogging it. My VRMs have always stayed under 70C.


----------



## PhillyB

I will try some gaming tonight and see what happens. Kombuster temps are usually extremely high in comparison to gaming temps.


----------



## kizwan

Quote:


> Originally Posted by *ronaldoz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slowpoke66*
> 
> ULPS disabled?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's not disabled, however, I could run the tests in crossfire, and it gave a nice boost. I underclocked to 1000 / 1450, but it might be possible to run at higher clocks. The GPU's are using both 225W at the moment. Gave 50.000 extra score for the contest.

You want to disable ULPS because it can prevent the secondary card from overvolting properly.
Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jodybdesigns*
> 
> SLI and Xfire both are in a bad place. Actually, I don't think it has ever been this bad. Horrible support from both camps right now. But, results may vary so..
> 
> 
> 
> I suppose it depends on the games you play. For me, its been great. I play BF3 BF4 and Hitman. That's about it. That's all I have time for really. On those games other than Hitman in DX12 (for now) Xfire is awesome in my experience. I have had zero issues other than one of my cards dying which had nothing to do with Xfire.

Crossfire (with supported games) has been great and the scaling is pretty good too. I have no bad experience with Crossfire except in games that don't support it.


----------



## bluej511

Quote:


> Originally Posted by *PhillyB*
> 
> i will try some gaming tonight and see what happens. Kombuster temps are usually extremely high in comparison to gaming temps.


It's why I don't use it; stress tests are only good for finding out if an OC is stable. My CPU gets just as hot in Syndicate as it does in the Intel stress test, so I know it's working properly.


----------



## patriotaki

Quote:


> Originally Posted by *jdorje*
> 
> Rtss with hwinfo can show any sensors.
> 
> When exactly does the fan boy competition end? I'll probably resubmit and try to get the highest 390 score again and fix the timing error that the scorer didn't notice in accepting my score.


yeah tried that...its difficult to set it up...
also the labels take soooooooooooooooo much space........


----------



## ronaldoz

Quote:


> Originally Posted by *patriotaki*
> 
> yeah tried that...its difficult to set it up...
> also the labels take soooooooooooooooo much space........


Not sure why, but it was easy. Just 3 x CPU-Z and 1 x GPU-Z (and the contest background). Run the test at default. Should not be that hard, but it can be annoying if you get errors, of course.
Quote:


> Originally Posted by *kizwan*
> 
> You want to disable ULPS because it can prevent the secondary card from overvolting properly.
> Crossfire (with supported games) has been great & scaling is pretty good too. I have no bad experience with Crossfire except with the games that don't support it.


Will disabling ULPS also help for 1 card? And how will it help?


----------



## jdorje

I think you must be doing it wrong. I don't have any labels, just a bunch of numbers. I have it show my gpu core and vrm temps, cpu temp, gpu usage, cpu max and average core usage, and fps.

Also it's really easy to set up. Just go into hwinfo sensors, settings, rtss tab, and add the values.


----------



## Agent Smith1984

I think I knocked all my testing and screenies out in 30 minutes or so... about the easiest forum comp I've ever entered.

I used to compete in "Forum Warz" at PC Pitstop and The Raptor Pit, and those things took some serious time to enter.....

I actually made the theme song for the video, lol






It's some OLD SCHOOL stuff in that video my friends!!


----------



## patriotaki

Quote:


> Originally Posted by *jdorje*
> 
> I think you must be doing it wrong. I don't have any labels, just a bunch of numbers. I have it show my gpu core and vrm temps, cpu temp, gpu usage, cpu max and average core usage, and fps.
> 
> Also it's really easy to set up. Just go into hwinfo sensors, settings, rtss tab, and add the values.


Yes, that's what I did, but I also checked the 2nd box, which shows the label of each item you want to show.
It takes too much space. For example, if I want to show the CPU temp, CPU speed, and CPU utilization, it will be shown as follows:

CPU Temperature, CPU Speed, CPU Utilization, 50°C, 4400MHz, 60%
which is annoying. I'll have to mess with it more.


----------



## jdorje

Well you can edit the label to make it short. But yeah if you're trying to fit a lot of data it can add up.


----------



## kizwan

Quote:


> Originally Posted by *patriotaki*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jdorje*
> 
> Rtss with hwinfo can show any sensors.
> 
> When exactly does the fan boy competition end? I'll probably resubmit and try to get the highest 390 score again and fix the timing error that the scorer didn't notice in accepting my score.
> 
> 
> 
> yeah tried that...its difficult to set it up...
> also the labels take soooooooooooooooo much space........
Click to expand...

Just choose the most important sensors you want to monitor. It has a limit on how many you can display, though.

I use AIDA64 >> RTSS

Example:-


Quote:


> Originally Posted by *ronaldoz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You want to disable ULPS because it can prevent the secondary card from overvolting properly.
> Crossfire (with supported games) has been great & scaling is pretty good too. I have no bad experience with Crossfire except with the games that don't support it.
> 
> 
> 
> Will disabling ULPS also help for 1 card? And how will it help?
Click to expand...

ULPS only works in Crossfire; it turns off the secondary GPU(s) when idle. So disabling ULPS on a single-GPU setup shouldn't have any effect or benefit, but you never know. Once or twice I did read that disabling ULPS improved GPU usage.
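For anyone looking for the actual switch: ULPS is normally toggled via the driver's registry key. A hedged sketch of the usual tweak: the `{4d36e968-...}` class key is the standard display-adapter class, but the four-digit subkey (`0000` here) varies per machine, so check which numbered subkey actually holds your Radeon before merging anything like this.

```reg
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) in the AMD display driver.
; "0000" is an example subkey -- use whichever numbered subkey under this
; class key belongs to your Radeon, and repeat for each card in Crossfire.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A driver reinstall typically resets this value, so re-check it after updating Catalyst/Crimson.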


----------



## ronaldoz

Quote:


> Originally Posted by *kizwan*
> 
> Just choose the most important sensors you want to monitor. It has a limit on how many you can display, though.
> 
> I use AIDA64 >> RTSS
> 
> Example:-
> 
> ULPS only works in Crossfire; it turns off the secondary GPU(s) when idle. So disabling ULPS on a single-GPU setup shouldn't have any effect or benefit, but you never know. Once or twice I did read that disabling ULPS improved GPU usage.


Thank you! And I also like the Aida > Rivatuner stats a lot! Can't get VRM temps set with Afterburner, so I will surely use Aida!


----------



## bluej511

Quote:


> Originally Posted by *ronaldoz*
> 
> Thank you! And I also like the Aida > Rivatuner stats a lot! Can't get VRM temps set with Afterburner, so I will surely use Aida!


True, but Afterburner and GPU-Z are free. That's how I monitor anyway; I just leave GPU-Z in the background, then check once finished.


----------



## patriotaki

is there any FPS option on Aida?


----------



## Gdourado

Quote:


> Originally Posted by *jodybdesigns*
> 
> SLI and Xfire both are in a bad place. Actually, I don't think it has ever been this bad. Horrible support from both camps right now. But, results may vary so..


Can you please elaborate a bit on why you say so?


----------



## ronaldoz

Quote:


> Originally Posted by *bluej511*
> 
> True, but Afterburner and GPU-Z are free. That's how I monitor anyway; I just leave GPU-Z in the background, then check once finished.


Indeed, but if you like, you could use the trial version and add the VRM temps only, like this:


----------



## jdorje

Just use hwinfo. It's free and has ~every sensor.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Just use hwinfo. It's free and has ~every sensor.


It doesn't have VRM temps, at least not on the version I'm on lol.


----------



## patriotaki

Quote:


> Originally Posted by *bluej511*
> 
> It doesn't have VRM temps, at least not on the version I'm on lol.


AIDA64 seems more user-friendly to me and I like it... but no FPS







... fraps it is then?


----------



## ronaldoz

Quote:


> Originally Posted by *patriotaki*
> 
> AIDA64 seems more user-friendly to me and I like it... but no FPS
> 
> 
> 
> 
> 
> 
> 
> ... fraps it is then?


MSI Afterburner


----------



## jdorje

Jesus you guys are making this too hard.

All the osd tools (except Fraps) use rtss for display afaik. Rtss will display the fps itself, set that in the rtss panel. It gets installed with afterburner which you probably want anyway.

Then just go into hwinfo - or your preferred rtss-compatible program - and set it to what you want it to display. In hwinfo you set a row and column for each element so I have like one row for fps, one for gpu/vrm temps, one for cpu temps and use.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Jesus you guys are making this too hard.
> 
> All the osd tools (except Fraps) use rtss for display afaik. Rtss will display the fps itself, set that in the rtss panel. It gets installed with afterburner which you probably want anyway.
> 
> Then just go into hwinfo - or your preferred rtss-compatible program - and set it to what you want it to display. In hwinfo you set a row and column for each element so I have like one row for fps, one for gpu/vrm temps, one for cpu temps and use.


Is hwinfo the same as hwmonitor?


----------



## jdorje

Quote:


> Originally Posted by *bluej511*
> 
> Is hwinfo the same as hwmonitor?


"No"


----------



## bluej511

Yeah, I just tried it; the amount of information is ridiculous. Can't get VRMs to show up in RTSS, but I'll try again later. I have it set to yes in the sensor settings, but nada.


----------



## patriotaki

Hmm, with AIDA64 I can't play at fullscreen while using RTSS.


----------



## diggiddi

Quote:


> Originally Posted by *patriotaki*
> 
> any alternative to the msi AB that can show hardware info while gaming?


trixx, steam overlay

Quote:


> Originally Posted by *Gdourado*
> 
> Can you please elaborate a bit on why you say so?


Crossfire issues are heavily dependent on the game/engine, so find out if the games you play have issues with it and decide from there.
For me it works fine in all my games so far, e.g. BF3/4, Crysis 1/2/3, pCARS, F1, etc.


----------



## ronaldoz

Quote:


> Originally Posted by *patriotaki*
> 
> Hmm, with AIDA64 I can't play at fullscreen while using RTSS.


Not sure if you noticed, but you got 2 methods to monitor in Aida:
- desktop
- ingame
For ingame: don't use OSD, but 'external applications' at the preference screen. That might help.


----------



## jodybdesigns

Quote:


> Originally Posted by *Gdourado*
> 
> Can you please elaborate a bit on why you say so?


You will have to go pages back for all that. I steadily read the streams rolling through and have for months; I don't need to say anything. Also, results may vary. I have had Xfire twice. Once with 4870's, and that was absolutely disgusting; the overhead was awful. Then they worked out the 7900 series about halfway through its life, and Xfire was amazing throughout the 200 series lifespan. Then, here we go again: all of a sudden the 300 series and Fury came out, and everything was back to a stuttery, jittery mess. I blame the Fury line of cards for the bad driver development for Xfire.

Please take the time to read the entire thread here. Reviews are completely mixed, but mostly negative.

Xfire is simply numbers on a page. If you're about pressing numbers and benching hard, get Xfire. If you want a smooth experience at 2K or higher, get a single-card solution: a 390X, or bite the bullet and drop the cash for a 980 Ti.


----------



## PhillyB

Quote:


> Originally Posted by *bluej511*
> 
> That's why I don't use it; stress tests are only good for finding out if an OC is stable. My CPU gets just as hot in Syndicate as it does in the Intel stress test, so I know it's working properly.


Found a nice taxing location in FO4: GPU 47°C, VRM 58°C. Was going to fire up Star Citizen, but apparently they did a big update.


----------



## christoph

Yeah, I'm really impressed with the cooling. I've used Sapphire since 2007.


----------



## bluej511

Quote:


> Originally Posted by *PhillyB*
> 
> Found a nice taxing location in FO4: GPU 47°C, VRM 58°C. Was going to fire up Star Citizen, but apparently they did a big update.


That's better. You are in TX too; ambient is probably around 25°C right now. I'd be shocked if going from MX-4 to GC-Extreme gets it from 47 to 42; my delta is even higher than yours lol. My water temp is probably 32-34°C when my GPU is at 42.


----------



## christoph

GC extreme is better than MX4?


----------



## battleaxe

Quote:


> Originally Posted by *christoph*
> 
> Yeah, I'm really impressed with the cooling. I've used Sapphire since 2007.


Just hope you don't have to RMA with them any time soon. As I am finding out the hard way right now. Opened a ticket earlier this week and they still haven't responded to me.


----------



## christoph

Quote:


> Originally Posted by *battleaxe*
> 
> Just hope you don't have to RMA with them any time soon. As I am finding out the hard way right now. Opened a ticket earlier this week and they still haven't responded to me.


really?

I've never had any problem with Sapphire cards, and I was told that Sapphire has one of the best RMA departments.


----------



## jdorje

Today the 31st is the last day for the fanboy competition.

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd

We're in first with a small buffer. But anyone who hasn't submitted yet, now's a good time. If you want some advice on overclocking (memory timings mods!) and are new, now's a good time. If you want to show the superiority of your card and overclocking skills, now's a good time.

The current highest 390 score is 79,187 by forisp2400. The highest 390X is yuannan with 86,600.

To enter you need the background, you need 2 cpu-z and 1 gpu-z open, and you need to run and screenshot firestrike, 3dmark11, and/or vantage. Only graphics score matters. Post the screenshots and the 3dmark links over there.

Also helpful are the newest drivers from a couple days ago: http://support.amd.com/en-us/download/desktop?os=Windows%2010%20-%2064


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> GC extreme is better than MX4?


Yeah, that and Thermal Grizzly Kryonaut seem to be the best.

Honestly, the 2 companies I've ever had to call for issues have been amazing. Corsair RMA is great. Antec customer service is just heavenly. I put an HP prebuilt into the Antec 302 and couldn't get the USB 3.0 front header to work. They sent me a new header not once, not twice, but 3 times. Turns out the USB on HP mobos is proprietary. If Antec had a nice cube case I would have bought it.


----------



## Chaoz

Quote:


> Originally Posted by *bluej511*
> 
> Yeah, that and Thermal Grizzly Kryonaut seem to be the best.


I'm using Antec Nano Diamond Formula 7. Quite happy with it; it cost a lot of money, but it's a 4-gram tube. I'm using it on my CPU and GPU. It's supposedly better than Arctic Silver 5.
GPU temps dropped 3-4°C at idle (even when just browsing the forums, with my fans around 1200 RPM, so quite silent) and the same for load temps.
For the CPU I've no clue.


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> I'm using Antec Nano Diamond Formula 7. I'm using it on my CPU and GPU. GPU temps dropped 3-4°C at idle (even when just browsing the forums, with my fans around 1200 RPM, so quite silent) and the same for load temps.
> For the CPU I've no clue.


I used it on my GPU and it runs at 41-42°C, while the other guy on MX-4 runs at 48-ish, so GC-Extreme works great. They're usually all very close anyway until you get into liquid metal.

I used Noctua on my delid since I have a lot of it; it seems to make no difference, but I could have moved the IHS around too much, or the CPU block is clogged a little, who knows lol.


----------



## Chaoz

Quote:


> Originally Posted by *bluej511*
> 
> I used it on my GPU and it runs at 41-42°C, while the other guy on MX-4 runs at 48-ish, so GC-Extreme works great. They're usually all very close anyway until you get into liquid metal.
> 
> I used Noctua on my delid since I have a lot of it; it seems to make no difference, but I could have moved the IHS around too much, or the CPU block is clogged a little, who knows lol.


My GPU at idle (just browsing forums and such) runs at 34-35°C; with the standard TIM on my card it was 39°C.

Haven't delidded my CPU, don't feel like doing it. If I screw it up I'm screwed


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> My GPU in idle (and when browsing) runs at 34-35°C and with the standard TIM on my card it was 39°C, with just some browsing forums and such.
> 
> Haven't delidded my CPU, don't feel like doing it. If I screw it up I'm screwed


It was easier than I thought; I didn't even use a vice or razor blade, just the edge of a marble desk and a hammer, and it popped right off. Idle temps won't matter much; they're usually always close, even when watercooled.


----------



## jdorje

Please stop wasting your delid with noctua paste. And please stop saying it or others might do the same. It makes me hurt inside every time I read it.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Please stop wasting your delid with noctua paste. And please stop saying it or others might do the same. It makes me hurt inside every time I read it.


Boohoo, it will be fixed shortly. I don't feel like using aggressive rhodium liquid metal. I might lap the CPU block anyway.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Boohoo, it will be fixed shortly. I don't feel like using aggressive rhodium liquid metal. I might lap the CPU block anyway.


Then you are delidding completely wrong and should stop spreading this information to people. Every single person should use CLU or equivalent between the IHS and die, every single time.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> Then you are delidding completely wrong and should stop spreading this information to people. Every single person should use CLU or equivalent between the IHS and die, every single time.


You do know that using 86 W/mK liquid metal and then a TIM that's 10 W/mK defeats the purpose, right? You should be using CLU on both. Hence why I'm removing the IHS and going direct die: less material to transfer heat through. Why do you think GPUs are bare die, honestly? Think about it.


----------



## Vellinious

Quote:


> Originally Posted by *bluej511*
> 
> You do know that using 86 W/mK liquid metal and then a TIM that's 10 W/mK defeats the purpose, right? You should be using CLU on both. Hence why I'm removing the IHS and going direct die: less material to transfer heat through. Why do you think GPUs are bare die, honestly? Think about it.


You're using the naked mount from EK? Yeah...good to use paste there. If you were putting the IHS back on, that's where CLU would be good.

As for using paste between the IHS and block... it works just fine, and doesn't defeat the purpose of anything. Works brilliantly, actually, as replacing the crap Intel uses between the die and the IHS has already increased heat transfer by a ton. Get that heat to the IHS so your block can do its job. Hell, you'd notice a decrease in temps even if you used peanut butter on the block mount.


----------



## bluej511

Quote:


> Originally Posted by *Vellinious*
> 
> You're using the naked mount from EK? Yeah...good to use paste there. If you were putting the IHS back on, that's where CLU would be good.
> 
> As for using paste between the IHS and block... it works just fine, and doesn't defeat the purpose of anything. Works brilliantly, actually, as replacing the crap Intel uses between the die and the IHS has already increased heat transfer by a ton. Get that heat to the IHS so your block can do its job. Hell, you'd notice a decrease in temps even if you used peanut butter on the block mount.


I gotta find that article, but the issue isn't the Intel TIM, it's the glue that keeps the IHS from actually making good contact with the die, especially with the poor tolerances on the IHS. Someone did a test with a shim between the IHS and die versus no shim; temps dropped 20°C without the shim. I'll be getting the screws in a few days, will clean the block, put TIM on the die, then see the temps. If I don't like it I could use CLP or Thermal Grizzly, which is rated at 86 W/mK compared to CLP's 32. Again, as I said, this is for testing purposes; it's why I didn't go bare die right away.

My temps are higher now, so I'm guessing I screwed up putting TIM somewhere. In the delid thread the difference between bare die and IHS is 5°C+, so it's worth getting the 4 screws. That's at the same overclock on a 4790K; my guess is my voltage just isn't high enough to see huge differences, but it will be.


----------



## bluej511

Found it lol. This is Ivy Bridge though, which runs hotter, but there you go.

http://forums.anandtech.com/showpost.php?p=34053183


----------



## Vellinious

I know a couple of guys that have used the naked mount from EK and love it. Don't know that I'd do it, but.....eh


----------



## bluej511

Quote:


> Originally Posted by *Vellinious*
> 
> I know a couple of guys that have used the naked mount from EK and love it. Don't know that I'd do it, but.....eh


Pretty much remove the Torx screws, put in 4 new ones, and that's it, so simple. The screws are 4€; the problem is, no matter where you ship from, that's where they get you. No one in France stocks them; the computer market here is minimal, and for water cooling even less so. Monsoon stuff and whatnot has to come from Germany.


----------



## Stige

Direct-to-die compared to running CLU between the IHS and die, the difference is non-existent. Not worth the hassle of running direct-to-die in my opinion; I tried it and it didn't offer any real benefit at all.
If you delid and use some crappy TIM, then you completely defeat the purpose of delidding.


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> Direct-to-die compared to running CLU between the IHS and die, the difference is non-existent. Not worth the hassle of running direct-to-die in my opinion; I tried it and it didn't offer any real benefit at all.
> If you delid and use some crappy TIM, then you completely defeat the purpose of delidding.


@bluej511

This. You gotta look at the physics behind it really. The die is a small (Like what? 170-180mm^2? And that includes the iGPU that's dormant), heat-dense surface. Dissipating heat from a tiny surface is a much bigger challenge than dissipating it from the IHS's large one. So putting the most thermally conductive material you can on the die is of utmost importance so that you can get heat away from it as fast as possible. Now, the IHS, as a solid block of metal, will get saturated with heat quite quickly and it will conduct it quite efficiently. It's also a much bigger surface that's coming in contact with your heatsink or water block, so the thermal conductivity of the TIM used is of secondary importance.

Putting a silicone based TIM, even the best ones like GC Extreme and Kryonaut, on the die itself is effectively insulating it compared to a liquid metal solution. Having direct-to-die mounting isn't going to change much either, as the IHS is not the bottleneck. And really, you want your thermal conductivity to be as high as possible. If you can get better than CLU, do it. Thermal Grizzly Conductonaut and Phobya LM are both better, I think.

Now, doing a direct mount for the fun of it is fine with me. You like tinkering, I get that. But insisting on putting that Noctua TIM on it is wrong indeed for the above mentioned reasons (and some more). Hope I convinced you.

PS : Also, the reason GPUs don't use an IHS generally is because the hot, power hungry ones use bigger dies that don't need an IHS to effectively increase the surface area, and the smaller die GPUs that would benefit from one aren't power hungry or hot enough.
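To put rough numbers on the area argument above, here is a minimal 1-D conduction sketch in Python. The layer thickness, conductivity figures, and the IHS contact area are illustrative assumptions, not measurements.

```python
# dT across a TIM layer, treated as 1-D conduction: dT = q * t / (k * A).
# All numbers below are assumed for illustration only.

def tim_delta_t(power_w, thickness_m, k_w_mk, area_m2):
    """Temperature drop (degrees C) across a single TIM layer."""
    return power_w * thickness_m / (k_w_mk * area_m2)

power = 100.0        # W dumped by the CPU (assumed)
t_layer = 50e-6      # ~50 micron TIM joint (assumed)
die_area = 180e-6    # m^2, the ~180 mm^2 die mentioned above
ihs_area = 1000e-6   # m^2, ~1000 mm^2 IHS contact patch (assumed)
k_paste = 8.5        # W/mK, a good paste (vendor-claimed ballpark)
k_lm = 38.0          # W/mK, a liquid metal (vendor-claimed ballpark)

# On the tiny die, paste vs liquid metal is a few degrees:
print(round(tim_delta_t(power, t_layer, k_paste, die_area), 1))  # 3.3
print(round(tim_delta_t(power, t_layer, k_lm, die_area), 1))     # 0.7
# On the large IHS, even paste only costs a fraction of a degree:
print(round(tim_delta_t(power, t_layer, k_paste, ihs_area), 1))  # 0.6
```

Real joints are thinner or thicker than 50 µm, and spreading resistance matters, so treat these as order-of-magnitude figures only; the point is the 1/A factor, which is why the die-side interface dominates.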


----------



## bluej511

Guys, like I said, testing purposes lol. If I put CLP directly on the die and mount the block directly and see a change, I won't know if it's from the CLP or from going direct to die. Like I said before, between direct and non-direct, same CPU, same clock, there was a 6°C difference, i.e. the no-IHS ones ran cooler.

You guys understand that any extra layer, no matter what material, is still an extra layer; why do you think people insulate homes?

Stige, check the link I posted: the Intel TIM IS NOT the issue. Hell, the Intel TIM performed better than the Noctua TIM, which is always ranked top 5 in TIMs; it's far from crappy. Yes, I understand that a liquid metal at 32 W/mK will perform better than a paste at 12 W/mK. You guys missed what Vellinious posted as well.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Guys, like I said, testing purposes lol. If I put CLP directly on the die and mount the block directly and see a change, I won't know if it's from the CLP or from going direct to die. Like I said before, between direct and non-direct, same CPU, same clock, there was a 6°C difference, i.e. the no-IHS ones ran cooler.
> 
> You guys understand that any extra layer, no matter what material, is still an extra layer; why do you think people insulate homes?
> 
> Stige, check the link I posted: the Intel TIM IS NOT the issue. Hell, the Intel TIM performed better than the Noctua TIM, which is always ranked top 5 in TIMs; it's far from crappy. Yes, I understand that a liquid metal at 32 W/mK will perform better than a paste at 12 W/mK. You guys missed what Vellinious posted as well.


Well, you're definitely right there. And I understand why you're doing this. I just had to point out that there's a very valid reason that everyone who delids uses liquid metals.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Well, you're definitely right there. And I understand why you're doing this. I just had to point out that there's a very valid reason that everyone who delids uses liquid metals.


Right, I get it; paste TIM is far less thermally efficient. But most people who delid put the IHS back on lol. Stige keeps pointing out there's no difference, but on the delid thread, with 2 identical chips at the same OC speed, the CLU one (no IHS) had a temp drop of 25°C, and the IHS with CLU plus TX-4 or MX-4 was 15-18°C, so there is a difference. Now, this is also for the 4790K and I'm using a 4690K, so less TDP. Also, Ivy Bridge seems to get way more ridiculous temp drops than anything else.

Now, for me it takes about 2 minutes to reapply TIM if need be. The reason I'm not going direct CLU/CLP is to see if there is indeed no difference; if there is, I'll CLU and put the IHS back on. If the difference is massive, the IHS will stay off and I will either CLU or CLP (I hear that CLP is much easier to clean).

P.S. We're crushing the fanboy competition haha.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Right, I get it; paste TIM is far less thermally efficient. But most people who delid put the IHS back on lol. Stige keeps pointing out there's no difference, but on the delid thread, with 2 identical chips at the same OC speed, the CLU one (no IHS) had a temp drop of 25°C, and the IHS with CLU plus TX-4 or MX-4 was 15-18°C, so there is a difference. Now, this is also for the 4790K and I'm using a 4690K, so less TDP. Also, Ivy Bridge seems to get way more ridiculous temp drops than anything else.
> 
> Now, for me it takes about 2 minutes to reapply TIM if need be. The reason I'm not going direct CLU/CLP is to see if there is indeed no difference; if there is, I'll CLU and put the IHS back on. If the difference is massive, the IHS will stay off and I will either CLU or CLP (I hear that CLP is much easier to clean).
> 
> P.S. We're crushing the fanboy competition haha.


I wasn't debating IHS on versus IHS off. Although the truth is you get 80-90% of the benefits of delidding by scraping off the glue and putting a liquid metal on the die, then reusing the IHS. Having the IHS there is easier (for mounting) and safer, and you don't run the risk of cracking the die. I'd be willing to give up 5°C or so for added safety down the road, but that's just me. If you want the absolute best performance, then indeed you want direct mounting and LM.









About the fanboy competition, oh yeah.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> I wasn't debating IHS on versus IHS off. Although truth is you get 80-90% of the benefits of delidding by scraping off the glue and putting a liquid metal on the die, then reusing the IHS. Having the IHS there is easier (for mounting) and safer and you don't run the risk of cracking the die. I'd be willing to give up 5C or so for added safety down the road, but that's just me. If you want the absolute best performance, then indeed you want direct mounting and LM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> About the fanboy competition, oh yeah.


Well, I'm using the EKWB Naked Ivy screws. You pretty much replace the factory mounting screws with these and they drop the block down the correct amount (whatever that may be, an extra 2-3mm or so; I'll measure both screw spacers once I get them).

I did scrape off all the glue, even in the corners, and cleaned it up real nice. Again, that's the main culprit, not the Intel TIM, which is silver-based and so a good conductor. The IHS is made of copper anyway, it's just plated (btw, bare copper is absolutely the best; it's why Alphacool barely paints the fins on their radiators). Any time copper is painted, the paint acts as an insulator.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Right, I get it; paste TIM is far less thermally efficient. But most people who delid put the IHS back on lol. Stige keeps pointing out there's no difference, *but on the delid thread, with 2 identical chips at the same OC speed, the CLU one (no IHS) had a temp drop of 25°C, and the IHS with CLU plus TX-4 or MX-4 was 15-18°C, so there is a difference. Now, this is also for the 4790K and I'm using a 4690K, so less TDP.* Also, Ivy Bridge seems to get way more ridiculous temp drops than anything else.
> 
> Now, for me it takes about 2 minutes to reapply TIM if need be. The reason I'm not going direct CLU/CLP is to see if there is indeed no difference; if there is, I'll CLU and put the IHS back on. If the difference is massive, the IHS will stay off and I will either CLU or CLP (I hear that CLP is much easier to clean).
> 
> P.S. We're crushing the fanboy competition haha.


The only reason for the temp difference there is the crappy paste used between the IHS and block; that's the reason, not the IHS or anything else. Use CLU on both sides of the IHS and the difference won't even be 2°C. Like I said, I did try direct-to-die, but the difference was within the margin of error, and it was not worth the trouble of having to tighten the CPU block down really hard so the CPU made proper contact.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> The only reason for the temp difference there is the crappy paste used between the IHS and block; that's the reason, not the IHS or anything else. Use CLU on both sides of the IHS and the difference won't even be 2°C. Like I said, I did try direct-to-die, but the difference was within the margin of error, and it was not worth the trouble of having to tighten the CPU block down really hard so the CPU made proper contact.


Right, and hence why I'm testing lol. But again, the TIM is not the issue; for god's sake, read the link I posted. To you every TIM is crappy lol: Noctua, MX-4, TX-4, Kryonaut. Read it and you'll see the IHS glue is the issue, not the TIM.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Right, and hence why I'm testing lol. But again, the TIM is not the issue; for god's sake, read the link I posted. To you every TIM is crappy lol: Noctua, MX-4, TX-4, Kryonaut. Read it and you'll see the IHS glue is the issue, not the TIM.


Quote:


> but on the delid thread, with 2 identical chips at the same OC speed, the CLU one (no IHS) had a temp drop of 25°C, and the IHS with CLU plus TX-4 or MX-4 was 15-18°C, so there is a difference. Now, this is also for the 4790K and I'm using a 4690K, so less TDP.


You are comparing direct-to-die with CLU against an IHS that has CLU between the IHS and die and some Noctua paste between the IHS and block. That paste will hinder the performance a LOT. Put CLU on both sides of the IHS and then you can compare them.


----------



## bluej511

So you're saying his testing was wrong, that the Noctua did worse than the Intel with the glue removed? According to you, both TIMs are awful lol. Again, the TIM isn't the issue when delidding; it's the glue keeping the IHS in bad contact with the die. Of course CLU will work better; it's like solder, you keep saying it. We're talking about why delidding makes a difference. Your testing is your testing; your voltages, CPU, and PSU are different than mine or whoever else delids. Silicon lottery.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> So you're saying his testing was wrong, that the Noctua did worse than the Intel with the glue removed? According to you, both TIMs are awful lol. Again, the TIM isn't the issue when delidding; it's the glue keeping the IHS in bad contact with the die. Of course CLU will work better; it's like solder, you keep saying it. We're talking about why delidding makes a difference. Your testing is your testing; your voltages, CPU, and PSU are different than mine or whoever else delids. Silicon lottery.


You don't seem to understand that your comparison is irrelevant because a completely different TIM was used there. If only CLU had been used in all cases, THEN it would be right, but it was not. You can't compare temps like that and then say direct-to-die is much better when you used a completely different TIM.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> You don't seem to understand that your comparison is irrelevant because of using a completely different TIM there. If only CLU was used in all cases THEN it would be right but it was not. You can't compare temps like that then say direct-to-die is much better when you used completely different TIM.


Ding ding ding, now you're getting it. If I tested with NT-H1 then CLP wouldn't be accurate. If I go bare die with Noctua and get better temps, then clearly bare die works, right? See, you got it. I'll be trying at [email protected] and [email protected]


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Ding ding ding, now you're getting it. If I tested with NT-H1 then CLP wouldn't be accurate. If I go bare die with Noctua and get better temps, then clearly bare die works, right? See, you got it. I'll be trying at [email protected] and [email protected]


I was saying that same thing all along







You just didn't get it!


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> You are comparing direct-to-die with CLU against IHS that has CLU between IHS/Die and some Noctua paste between IHS/Block. That paste will hinder the performance a LOT. Put CLU on both sides of the IHS and then you can compare them.


You don't need CLU on both sides. Between the die and the IHS is plenty. A good quality paste between the IHS and the block works perfectly. The difference would be very small. The biggest problem is, and has always been, getting the heat from the die to the IHS. Once it's there, a good block can remove the heat without the need for CLU. In fact, I'd never recommend using CLU between an IHS and a block. Never.

The direct die is always going to be better, btw....I don't care what TIM you're using between the IHS and the block.


----------



## bluej511

Quote:


> Originally Posted by *Vellinious*
> 
> You don't need CLU on both sides. Between the die and the IHS is plenty. A good quality paste between the IHS and the block works perfectly. The difference would be very small. The biggest problem is, and has always been, getting the heat from the die to the IHS. Once it's there, a good block can remove the heat without the need for CLU. In fact, I'd never recommend using CLU between an IHS and a block. Never.
> 
> The direct die is always going to be better, btw....I don't care what TIM you're using between the IHS and the block.


+REP


----------



## jdorje

We're getting way off topic here. But...

LM between die and IHS gives a much larger boost than between IHS and heatsink. At least double the change in delta T, and maybe much more. The reason is that the area of the die is much smaller. The IHS is actually really good at its job.
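The area point can be put in rough numbers with the usual interface-resistance formula R = t / (k·A). Everything below (bond-line thickness, conductivities, contact areas) is an assumed illustration, not measured data:

```python
# Rough interface model: R = t / (k * A); all numbers assumed for illustration.
#   die contact area ~1 cm^2, IHS top contact ~9 cm^2
#   decent paste k ~ 8 W/mK, liquid metal k ~ 40 W/mK, ~50 um bond line

def interface_r(thickness_m, k_w_per_mk, area_m2):
    """Thermal resistance of a flat TIM layer, in K/W."""
    return thickness_m / (k_w_per_mk * area_m2)

t = 50e-6                        # 50 micron layer
die_area, ihs_area = 1e-4, 9e-4  # m^2

print(f"paste on die : {interface_r(t, 8, die_area):.4f} K/W")
print(f"LM on die    : {interface_r(t, 40, die_area):.4f} K/W")
print(f"paste on IHS : {interface_r(t, 8, ihs_area):.4f} K/W")
```

With ~150 W flowing, the paste layer on the tiny die costs several times more kelvin than the same paste spread over the big IHS face, which is why swapping in liquid metal matters on the die side and barely matters on the block side.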


----------



## Vellinious

^^This


----------



## Stige

The conversation ended already and you completely missed the point in it.

EDIT:
Quote:


> Originally Posted by *Vellinious*
> 
> The direct die is always going to be better, btw....I don't care what TIM you're using between the IHS and the block.


The difference will be insignificant. I wish someone would test it with only liquid metal in both cases, I cba taking the mobo sockets off and try to get it to boot direct-to-die again, was such a hassle.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> The conversation ended already and you completely missed the point in it.
> 
> EDIT:
> The difference will be insignificant. I wish someone would test it with only liquid metal in both cases, I cba taking the mobo sockets off and try to get it to boot direct-to-die again, was such a hassle.


Hence why I'm testing it with Noctua, both with the IHS and without; that will give me a definitive answer. Yes, I know with CLU the drop would be greater, but I'm not after greater, I'm after the differential.


----------



## Vellinious

Any time you have to go through another layer, it's going to reduce efficiency. By adding 2 distinct layers (IHS and another layer of TIM), you reduce efficiency. The naked mount will always be more efficient because it doesn't have those extra layers.

I don't think I'd ever use the naked mount myself, but if you're one to push your CPU clocks as high as possible...the naked mount would be the way to go.

I don't know why any of us argue with Stige....he's ALWAYS right. roflmao


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> Any time you have to go through another layer, it's going to reduce efficiency. By adding 2 distinct layers (IHS and another layer of TIM), you reduce efficiency. The naked mount will always be more efficient because it doesn't have those extra layers.
> 
> *I don't think I'd ever use the naked mount myself, but if you're one to push your CPU clocks as high as possible...the naked mount would be the way to go.*
> 
> I don't know why any of us argue with Stige....he's ALWAYS right. roflmao


Well it doesn't look like people want this to stop so:

Like I have said a million times now, if you only use CLU then the difference is non-existent and within margin of error whether you have the IHS on or not.
Also on the bolded part: if you have a custom loop, you can't get your CPU temps so high that you would even need direct-to-die to get them lower.

My CPU does about 65C at 1.52-1.55V, which is probably more than 99.9% of people ever run at. Sure, some CPUs do run hotter (like AMD, but they are just crap anyway) but I think you can never hit max temps on a custom loop if you have done things properly, just not possible really.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> Well it doesn't look like people want this to stop so:
> 
> Like I have said a million times now, if you only use CLU then the difference is non-existent and within margin of error whether you have the IHS on or not.
> Also on the bolded part: if you have a custom loop, you can't get your CPU temps so high that you would even need direct-to-die to get them lower.
> 
> My CPU does about 65C at 1.52-1.55V, which is probably more than 99.9% of people ever run at. Sure, some CPUs do run hotter (like AMD, but they are just crap anyway) but I think you can never hit max temps on a custom loop if you have done things properly, just not possible really.


Yes, but you live in Finland. Someone with an ambient of 25C+ won't have water that's 25C, think about that as well. In southern France in the summer this place hits 28C or so inside, no AC; my water will be 38C and my cpu would prob run 58C at its OC of 1.088v right now.


----------



## jdorje

Yeah but if you're running 1.5V for any length of time the difference between 50C and 60C might matter.

I've only taken my 4690k up to 1.45V...it can't quite pass benches at 4.9 ghz. I imagine at 1.5V I could boot 5.0. But temps just aren't an issue even on my h80i unless I run something hot for long enough to heat up the loop.
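The "heat up the loop" effect is easy to ballpark: coolant warms at P_net / (m·c_p), where P_net is heat going in faster than the radiator sheds it. A sketch with assumed coolant masses (a small AIO holds very little water; a big custom loop holds a couple of kilograms):

```python
# Transient coolant warm-up: dT/dt = P_net / (m * c_p).
# Coolant masses below are assumptions for illustration, not specs.

C_P_WATER = 4186.0  # J/(kg*K)

def heatup_rate_k_per_min(p_net_w, coolant_kg):
    """How fast the coolant warms, in K per minute, given a net heat surplus."""
    return p_net_w / (coolant_kg * C_P_WATER) * 60.0

print(f"small AIO (~0.3 kg): {heatup_rate_k_per_min(100, 0.3):.1f} K/min")
print(f"big loop  (~2.0 kg): {heatup_rate_k_per_min(100, 2.0):.1f} K/min")
```

So a small AIO saturates within minutes of a heavy load, while a large loop takes far longer to settle, which matches the "run something hot for long enough" experience.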


----------



## jodybdesigns

Off topic, but can you run a block directly on the die? Like, without the lid? And if you did, would you use TIM or paste?


----------



## jdorje

Yeah, you can run it naked with no ihs. The difference is fairly small (0-5 degrees) and it's riskier than actually delidding though. You have to get the pressure just right and even or you risk damaging the die. You certainly want LM on the die for it.

Why is it a small difference? Most heat sinks and blocks are designed to work with the ihs, which spreads the heat over a larger area. The die itself is like 1 cm squared and that's where all the heat will be. So some heat pipes or some of the water might be doing a lot less with the more concentrated heat.

Relatedly, the larger die size is why gpu water cooling is absurdly effective. Even if the water gets hot the core is only a few degrees hotter. Even delidded I can get my 4690k 50C hotter than my water (with 200 watts...which is hard...and would send my h80i water temp 25-35C above ambient if I let it run).


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Yes, but you live in Finland. Someone with an ambient of 25C+ won't have water that's 25C, think about that as well. In southern France in the summer this place hits 28C or so inside, no AC; my water will be 38C and my cpu would prob run 58C at its OC of 1.088v right now.


My room temp hits 30C in the summer so.. yeah







It's already 25C in winter if my PC is on.

EDIT: 24.8C right now.
My radiator exhaust temps while playing World of Tanks is 30.7C atm so +6C to room temp.


----------



## jdorje

Damn that makes me feel cold. Ambient is 17C right here and I'm in short sleeves.

Any time you compare temps it should be in delta. But obviously if your ambient rises 20C that is going to have a 20C impact on final temp.
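To make the "compare in delta" point concrete, here is a trivial sketch; the temperatures are illustrative, loosely echoing numbers quoted in-thread:

```python
# Absolute temps don't transfer between rooms; the rise over ambient does.

def delta_t(component_c, ambient_c):
    """Temperature rise over ambient, in kelvin."""
    return component_c - ambient_c

# 65C in a ~25C room vs 55C in a 32C room:
print(delta_t(65, 25))  # 40 K over ambient
print(delta_t(55, 32))  # 23 K over ambient -- the cooler result, by delta
```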


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Well it doesn't look like people want this to stop so:
> 
> Like I have said a million times now, if you only use CLU then the difference is non-existent and within margin of error whether you have the IHS on or not.
> Also on the bolded part: if you have a custom loop, you can't get your CPU temps so high that you would even need direct-to-die to get them lower.
> 
> My CPU does about 65C at 1.52-1.55V, which is probably more than 99.9% of people ever run at. Sure, some CPUs do run hotter (like AMD, but they are just crap anyway) but I think you can never hit max temps on a custom loop if you have done things properly, just not possible really.


It's not possible to hit max temps on a CPU with a custom loop? Man....I wish I knew what fairy tale world you live in, because....this is just categorically untrue. Seriously....that made my eye twitch just reading it. My goodness....

And memory doesn't really need cooling either. /boggle Where do you get this stuff? lol


----------



## jdorje

It's pretty hard for me to break 80C on my 4690k delid with an h80i. Not impossible obviously (p95 small ffts, or 1.45V+ would do it). If I had a custom loop I'd be fully voltage limited. Of course if I had ambients 20C over what I have now, I'd have 1/3 less thermal headroom and it'd be easy.

I think Stige was assuming a delid.

If you had a custom loop with not enough rads and a gpu or more in it, that'd also make it hotter.


----------



## Vellinious

Obviously, the guys that run chilled loops, phase change, DICE and LN2 are all doing it wrong......cause, all they really need is a good custom loop. lol, wow....the level of dumb in that is staggering.


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> It's pretty hard for me to break 80C on my 4690k delid with an h80i. Not impossible obviously (p95 small ffts, or 1.45V+ would do it). If I had a custom loop I'd be fully voltage limited. Of course if I had ambients 20C over what I have now, I'd have 1/3 less thermal headroom and it'd be easy.
> 
> I think Stige was assuming a delid.
> 
> If you had a custom loop with not enough rads and a gpu or more in it, that'd also make it hotter.


Well H80i is not a good cooler by any means so.. yeah. Like you said, with a custom loop your only limit is voltage, nothing else really. I burned one 2500K trying to boot at 1.7V lol. 1.6V was not enough ;D
And like I said, a properly built custom loop will handle any CPU you throw at it within daily use.
Quote:


> Originally Posted by *Vellinious*
> 
> It's not possible to hit max temps on a CPU with a custom loop? Man....I wish I knew what fairy tale world you live in, because....this is just categorically untrue. Seriously....that made my eye twitch just reading it. My goodness....
> 
> And memory doesn't really need cooling either. /boggle Where do you get this stuff? lol


Quote:


> Originally Posted by *Vellinious*
> 
> Obviously, the guys that run chilled loops, phase change, DICE and LN2 are all doing it wrong......cause, all they really need is a good custom loop. lol, wow....the level of dumb in that is staggering.


You are just... wow... New levels of dumb or ignorant.

Tell me the people here who run at 1.52V or higher 24/7 like I do? Have fun running your CPU with your dice ln2 or whatever 24/7. There is no way that a custom loop will even hit 80C at 1.52V if it's properly built. That is as far as I would go for daily use and I doubt there are many who would even do that.


----------



## Vellinious

Quote:


> Originally Posted by *Stige*
> 
> Well H80i is not a good cooler by any means so.. yeah. Like you said, with a custom loop your only limit is voltage, nothing else really. I burned one 2500K trying to boot at 1.7V lol. 1.6V was not enough ;D
> And like I said, a properly built custom loop will handle any CPU you throw at it within daily use.
> 
> You are just... wow... New levels of dumb or ignorant.
> 
> Tell me the people here who run at 1.52V or higher 24/7 like I do? Have fun running your CPU with your dice ln2 or whatever 24/7. There is no way that a custom loop will even hit 80C at 1.52V if it's properly built. That is as far as I would go for daily use and I doubt there are many who would even do that.


Your quote:

"but I think you can never hit max temps on a custom loop if you have done things properly, just not possible really."

Between this and the whole "memory doesn't really need cooling" thing, I'm just not sure which one is worse. lol


----------



## Stige

Quote:


> Originally Posted by *Vellinious*
> 
> Your quote:
> 
> "but I think you can never hit max temps on a custom loop if you have done things properly, just not possible really."
> 
> Between this and the whole "memory doesn't really need cooling" thing*, I'm just not sure which one is worse.* lol


You. I have posted proof TWICE already on top of the original post here that the VRAM doesn't get hot on these so you can even use tiny heatsinks on them and be fine. You are just being ignorant or dumb or both...
And I'm correct that a custom loop is voltage limited more than anything else when it comes to CPUs if properly built.


----------



## jdorje

Overclockers use ln2/subzero cooling so that they can raise their voltage limit for short periods, not for running an everyday overclock (obviously). For an everyday overclock cooling a delidded i5 (3570k/4670k/4690k/6600k, or 2500k soldered) is not hard even without a custom loop and you will certainly hit the voltage limit before the thermal limit. It's just hard to stress test because that's hotter than everyday use.

As for vram, MSI vram doesn't even have a heat sink (?) and it still does fine. Though my xfx vram cooling gives me ~100 mhz more than the typical msi, which probably lets me crush you all in the single-390 score on the Fanboy competition (along with my 17C ambients, which are my true secret weapon). Which brings us back on topic! The highest 390x is still 86,600 and the 390 79,2xx. You have 6 hours to break those scores.


----------



## bluej511

Quote:


> Originally Posted by *Vellinious*
> 
> Your quote:
> 
> "but I think you can never hit max temps on a custom loop if you have done things properly, just not possible really."
> 
> Between this and the whole "memory doesn't really need cooling" thing, I'm just not sure which one is worse. lol


Don't bother. I lost interest when he said Finland gets hotter than the south of France lol.


----------



## bluej511

Interesting, so it's 7C outside yet 25C in your room. That PC must be on 24/7.


----------



## jdorje

Quote:


> Originally Posted by *bluej511*
> 
> Interesting, so it's 7C outside yet 25C in your room. That PC must be on 24/7.


Uh he did say his PC was ~200W idle. Which remains insane.


----------



## bluej511

Mine prob pushes 350w while gaming and my output air temp is closer to low-mid 30s, and my room doesn't get to 30C, but it's also 15C here at night, 20 in the day. Yes, in March-April lol


----------



## christoph

what you guys are talking about?

I've run 1.52v 24/7 almost since I bought my x6 core, and have never hit 60 degrees, 55 at most, and that's at 32 celsius ambient


----------



## yuannan

Quote:


> Originally Posted by *christoph*
> 
> what you guys are talking about?
> 
> I've run 1.52v 24/7 almost since I bought my x6 core, and have never hit 60 degrees, 55 at most, and that's at 32 celsius ambient


You either use water or live in the UK.

Seriously, my room is freezing all the time. It's only after gaming for a few hours that my room warms up.


----------



## christoph

nop, I'm on air


----------



## yuannan

Quote:


> Originally Posted by *christoph*
> 
> nop, I'm on air


Which cooler, fan speed and ambient?


----------



## christoph

ambient 32, but right now it's 29, and I'm using the TRUE 120 Black in push-pull configuration, 2000 rpm fans.

But I lapped my heatsink, only the heatsink, for which I saw an improvement of 3 degrees, and I'm using the (forgot the name) IC Diamond 7 Carat, something like that, for which I saw an improvement of maybe 1 degree over the MX4


----------



## jdorje

Quote:


> Originally Posted by *christoph*
> 
> ambient 32, but right now it's 29, and I'm using the TRUE 120 Black in push-pull configuration, 2000 rpm fans.
> 
> But I lapped my heatsink, only the heatsink, for which I saw an improvement of 3 degrees, and I'm using the (forgot the name) IC Diamond 7 Carat, something like that, for which I saw an improvement of maybe 1 degree over the MX4


Did we ask you which CPU?


----------



## christoph

Quote:


> Originally Posted by *jdorje*
> 
> Did we ask you which CPU?


you guys didn't, but you can see it in my rig tab; it's an AMD 1055T 6-core


----------



## mus1mus

Congrats EVERYONE..

Solid TEAM WORK!


----------



## christoph

yes we did it


----------



## jodybdesigns

Figured I would drop a few DOOM Beta screens here. Don't look if you don't like spoilers. The game is a lot like the Alpha with tons of re-textures, plus a TON of character/weapon customization in the Beta. Still feels like Quake. The game plays AMAZING on my 390.


Spoiler: Warning: Spoiler!


----------



## mus1mus

Just a heads up.

CATALYST 15.10 is soooooooooooooooooooo KEWL!


----------



## jdorje

What to download for 15.10?


----------



## bluej511

So much rep for a noob haha.


----------



## mus1mus

Quote:


> Originally Posted by *jdorje*
> 
> What to download for 15.10?


You mean where?

http://www.guru3d.com/files-details/amd-catalyst-15-10-beta-driver-download.html

http://forums.guru3d.com/showthread.php?t=403142

NEW HIGH SCORE

Quote:


> Originally Posted by *bluej511*
> 
> So much rep for a noob haha.


Me, NOOB?


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> You mean where?
> 
> http://www.guru3d.com/files-details/amd-catalyst-15-10-beta-driver-download.html
> 
> http://forums.guru3d.com/showthread.php?t=403142
> 
> NEW HIGH SCORE
> Me, NOOB?


No, referring to me, the noob haha.

Back on topic: I added 2 more intake fans on top of my case; my VRM 1 is now 51°C and VRM 2 is 59°C, dropped a few degrees on each. Still as quiet as before haha, now with a dozen fans.


----------



## mus1mus

That's the same temps as my reference blower. With more noise as well.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> That's the same temps as my reference blower. With more noise as well.


I'm around 18-20db with 12 fans at 1200rpm; I'd say that's pretty quiet.


----------



## Amaregaz

Hi guys, I'm new on overclock.net. I just bought an Asus Strix DirectCU III R9 390X and I noticed that it runs pretty hot when watching a movie because the memory is at max speed, so I was wondering: is there a way to create a profile for desktop and one for games? I like that my fans are not running when I'm on the desktop, but I don't like having such high temperatures when watching a movie. I've noticed some weird spikes, as you can see in the chart; why is it spiking to max frequency when I'm only on the desktop and not playing anything?


----------



## Transmaniacon

I wouldn't say 70 degrees is very hot for the GPU, the Strix cooler isn't all that great on the R9 300 series though.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> I'm around 18-20db with 12 fans at 1200rpm; I'd say that's pretty quiet.


18-20 dB is superficial







Considering you have 12.

That is according to this:


----------



## mus1mus

Quote:


> Originally Posted by *Amaregaz*
> 
> Hi guys, I'm new on overclock.net. I just bought an Asus Strix DirectCU III R9 390X and I noticed that it runs pretty hot when watching a movie because the memory is at max speed, so I was wondering: is there a way to create a profile for desktop and one for games? I like that my fans are not running when I'm on the desktop, but I don't like having such high temperatures when watching a movie. I've noticed some weird spikes, as you can see in the chart; why is it spiking to max frequency when I'm only on the desktop and not playing anything?


That temperature might be due to a driver issue or your fan profile.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> 18-20 dB is superficial
> 
> 
> 
> 
> 
> 
> 
> Considering you have 12.
> 
> That is according to this:


You wish lol. Your system will be as loud as your loudest fan. You can have 20 Enermax T.B.Silence fans and not hear a thing. 7 of mine are Noctuas. 3am, dead silent room: 23db; system on: around 41-43db from about a foot away, or half a meter.
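For what it's worth, incoherent noise sources add on a log scale: N identical fans sit about 10·log10(N) dB above one fan, so "as loud as your loudest fan" only holds when one source clearly dominates. A quick sketch (fan levels assumed for illustration):

```python
import math

def combined_spl_db(levels_db):
    """Sum sound pressure levels of incoherent sources, in dB."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

print(f"twelve 18 dB fans : {combined_spl_db([18.0] * 12):.1f} dB")
print(f"those + one 35 dB : {combined_spl_db([18.0] * 12 + [35.0]):.1f} dB")
```

Twelve 18 dB fans land near 29 dB together, but adding a single 35 dB source pulls the total to roughly the loud source alone, which is the grain of truth in the "loudest fan" rule.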


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> You wish lol. Your system will be as loud as your loudest fan. You can have 20 Enermax T.B.Silence fans and not hear a thing. 7 of mine are Noctuas. 3am, dead silent room: 23db; system on: around 41-43db from about a foot away, or half a meter.


How do you know I have LOUD fans?









A single fan of mine can wake neighbors from a mile.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> How do you know I have LOUD fans?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A single fan of mine can wake neighbors from a mile.


Most people have loud fans lol. Hell, before I remounted my stock Thermaltake fans they hummed like crazy. I gotta add more screws though, some of 'em hum/vibrate a bit. I ordered 50 stainless steel M3x30s, just waiting for 'em.


----------



## Amaregaz

Is there any way to limit the power consumption of the GPU?


----------



## bluej511

Quote:


> Originally Posted by *Amaregaz*
> 
> Is there any way to limit the power consumption of the GPU?


Vsync or frame rate target control.


----------



## Amaregaz

What about that power limit option?


----------



## Transmaniacon

Quote:


> Originally Posted by *Amaregaz*
> 
> What about that power limit option?


Why do you need to reduce the power consumption? That is definitely not one of AMD's strong points.


----------



## HeXBLiTz

Hey guys just picked up a gigabyte gaming G1 390x.

Have it in an NZXT Phantom with two 140s in at front, a 120 out at back and the huge intake on the side panel blowing towards the card, but at stock I have to run the GPU fans between 80%-100% just to keep the card at about 85c, and as you'd think it's VERY loud.

Are there any water blocks that fit my card, or aftermarket hybrid coolers? I've done some Googling, and I'm new to AMD cards so any help would be nice.


----------



## Amaregaz

Quote:


> Originally Posted by *Transmaniacon*
> 
> Why do you need to reduce the power consumption? That is definitely not one of AMD's strong points.


I don't have a good psu for the moment, my psu is a Sirtec High Power Element Bronze 600W II, so I'm trying to limit power consumption until I get a new PSU


----------



## battleaxe

Quote:


> Originally Posted by *Amaregaz*
> 
> I don't have a good psu for the moment, my psu is a Sirtec High Power Element Bronze 600W II, so I'm trying to limit power consumption until I get a new PSU


Yeah. Just clock it down. Put the ram at 1250mhz and the core at 900mhz. That will def pull a lot less power. You should also be able to then put the voltage at -50 at least, maybe even less than that, -100 possibly. At that point I would bet the card will only pull about 170 watts max or thereabouts.


----------



## tolis626

Quote:


> Originally Posted by *Amaregaz*
> 
> I don't have a good psu for the moment, my psu is a Sirtec High Power Element Bronze 600W II, so I'm trying to limit power consumption until I get a new PSU


Quote:


> Originally Posted by *battleaxe*
> 
> Yeah. Just clock it down. Put the ram at 1250mhz and the core at 900mhz. That will def pull a lot less power. You should also be able to then put the voltage at -50 at least, maybe even less than that, -100 possibly. At that point I would bet the card will only pull about 170 watts max or thereabouts.


Lol no. I sometimes run my card undervolted and I can run 1040/1625MHz at -100mV no problem. Power consumption is stupidly low, like 140W. No need to slow it down so much. 1000MHz is attainable most probably. Although I wouldn't worry that much if it's 600W. I'd just run it at stock and call it a day.


----------



## bluej511

Quote:


> Originally Posted by *Amaregaz*
> 
> Is there any way to limit the power consumption of the GPU?


Quote:


> Originally Posted by *HeXBLiTz*
> 
> Hey guys just picked up a gigabyte gaming G1 390x.
> 
> Have it in an NZXT Phantom with two 140s in at front, a 120 out at back and the huge intake on the side panel blowing towards the card, but at stock I have to run the GPU fans between 80%-100% just to keep the card at about 85c, and as you'd think it's VERY loud.
> 
> Are there any water blocks that fit my card, or aftermarket hybrid coolers? I've done some Googling, and I'm new to AMD cards so any help would be nice.


If it's a reference design there are lots of choices; if not, there's the Alphacool or EKWB Thermosphere plus heatsinks for the VRM/VRAM.


----------



## ronaldoz

Quote:


> Originally Posted by *Amaregaz*
> 
> I don't have a good psu for the moment, my psu is a Sirtec High Power Element Bronze 600W II, so I'm trying to limit power consumption until I get a new PSU


You might be fine with it. I could run 2 x 390X (with the voltage turned down a bit, > 2 x 225W) on a 750W Gold PSU. I expected that one 390X alone would almost be the limit of the PSU (when overclocking it). I'm also using an overclocked CPU.
Quote:


> Originally Posted by *HeXBLiTz*
> 
> Hey guys just picked up a gigabyte gaming G1 390x.
> 
> Have it in an NZXT Phantom with two 140s in at front, a 120 out at back and the huge intake on the side panel blowing towards the card, but at stock I have to run the GPU fans between 80%-100% just to keep the card at about 85c, and as you'd think it's VERY loud.
> 
> Are there any water blocks that fit my card, or aftermarket hybrid coolers? I've done some Googling, and I'm new to AMD cards so any help would be nice.


Did you buy this new?


----------



## HeXBLiTz

Quote:


> Originally Posted by *bluej511*
> 
> If it's a reference design there are lots of choices; if not, there's the Alphacool or EKWB Thermosphere plus heatsinks for the VRM/VRAM.


Ok thanks, I've no idea if it's reference or not; is there a way to tell?

Also, wouldn't the VRAM/VRMs need a fan?

And no, it was used, but not for long; it looked clean and in good condition when it arrived, like new


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Lol no. I sometimes run my card undervolted and I can run 1040/1625MHz at -100mV no problem. Power consumption is stupidly low, like 140W. No need to slow it down so much. 1000MHz is attainable most probably. Although I wouldn't worry that much if it's 600W. I'd just run it at stock and call it a day.


No need to be rude. I was only trying to help him out. I was guessing off the top of my head too. Have you put a DMM on it?

The original 290's were clocked at 950 core, and they still hit around 250 watts right?

So how am I wrong? What are you using for your theory? The voltage readings on GPUZ? Not accurate if so.

Plus, I was thinking about max load while gaming on something like Far Cry or something like that. So it seems feasible to me even now that 900mhz could pull 170 watts under full load on something demanding. Even still, if all he is running is 1080p then I doubt he even needs more than 900mhz to begin with.

I could be wrong, but somehow I doubt I'm that far off.


----------



## bluej511

Quote:


> Originally Posted by *HeXBLiTz*
> 
> Ok thanks, I've no idea if it's reference or not; is there a way to tell?
> 
> Also, wouldn't the VRAM/VRMs need a fan?
> 
> And no, it was used, but not for long; it looked clean and in good condition when it arrived, like new


Check the EKWB configurator.

And everything on a GPU needs cooling; that's why it's there in the first place.


----------



## ronaldoz

Quote:


> Originally Posted by *HeXBLiTz*
> 
> Ok thanks, I've no idea if it's reference or not; is there a way to tell?
> 
> Also, wouldn't the VRAM/VRMs need a fan?
> 
> And no, it was used, but not for long; it looked clean and in good condition when it arrived, like new


You got a Gigabyte G1 Gaming, and that's not a reference card. Your temperature is kinda hot. Mine does 66C in-game, but I got a different cooler with 3 fans. Gigabyte has also used coolers with 3 fans before.

Yeah, getting other cooling on the card would be great. You bought a nice card (390X).


----------



## GorillaSceptre

Quote:


> Originally Posted by *HeXBLiTz*
> 
> Hey guys just picked up a gigabyte gaming G1 390x.
> 
> Have it in an NZXT Phantom with two 140s in at front, a 120 out at back and the huge intake on the side panel blowing towards the card, but at stock I have to run the GPU fans between 80%-100% just to keep the card at about 85c, and as you'd think it's VERY loud.
> 
> Are there any water blocks that fit my card, or aftermarket hybrid coolers? I've done some Googling, and I'm new to AMD cards so any help would be nice.


What are your ambient temps? If they are low you may need to re-apply the thermal paste; I think I have to do the same with my MSI, I'm just waiting for winter temps to see if my card lines up with other users' first.

Just FYI, my temps and fan speed are similar to yours with around 35C ambients.


----------



## HeXBLiTz

Quote:


> Originally Posted by *GorillaSceptre*
> 
> What are your ambient temps? If they are low you may need to re-apply the thermal paste; I think I have to do the same with my MSI, I'm just waiting for winter temps to see if my card lines up with other users' first.
> 
> Just FYI, my temps and fan speed are similar to yours with around 35C ambients.


My room temperature probably never goes above 25c atm. I've been looking and searching for alternative cooling solutions, but because it's not reference there's no guarantee they'll work.

Like the NZXT G10, the EKWB universal block or the Arctic Accelero.

I'm going to strip it down later tonight and redo the thermal paste, see if that changes anything


----------



## ronaldoz

Quote:


> Originally Posted by *HeXBLiTz*
> 
> My room temperature probably never goes above 25c atm. I've been looking and searching for alternative cooling solutions, but because it's not reference there's no guarantee they'll work.
> 
> Like the NZXT G10, the EKWB universal block or the Arctic Accelero.
> 
> I'm going to strip it down later tonight and redo the thermal paste, see if that changes anything


I think there is a guarantee, because the chip is the same as on other 390X's. Some coolers just run hotter than others.


----------



## HeXBLiTz

Quote:


> Originally Posted by *ronaldoz*
> 
> I think there is a guarantee, because the chip is the same as on other 390X's. Some coolers just run hotter than others.


The PCB design isn't reference, so there isn't a guarantee they'll even fit on the card


----------



## ronaldoz

Quote:


> Originally Posted by *HeXBLiTz*
> 
> The PCB design isn't reference, so there isn't a guarantee they'll even fit on the card


I see, yes, that might be a difficulty. I hope some guys over here could help you out.


----------



## tolis626

Quote:


> Originally Posted by *battleaxe*
> 
> No need to be rude. I was only trying to help him out. I was guessing off the top of my head too. Have you put a DMM on it?
> 
> The original 290's were clocked at 950 core, and they still hit around 250 watts right?
> 
> So how am I wrong? What are you using for your theory? The voltage readings on GPUZ? Not accurate if so.
> 
> Plus, I was thinking about max load while gaming on something like Far Cry or something like that. So it seems feasible to me even now that 900mhz could pull 170 watts under full load on something demanding. Even still, if all he is running is 1080p then I doubt he even needs more than 900mhz to begin with.
> 
> I could be wrong, but somehow I doubt I'm that far off.


Rude? Not my intention, really. Sorry it came out like that man.









The original 290s were quite stupidly "overvolted", from what I remember. As in, they could undervolt all the way all day with no problems. When undervolted far enough, they sip power. Like really, they consume really little. The 390/390x are, AFAIK, better in that regard due to maturation of the process etc.

And no, I didn't use a multi-meter, but I did use a kill-a-watt. Don't remember the exact number, but it was somewhere in the neighbourhood of 300-350W during BF4 if I remember correctly. Couple that with a 4.8GHz 4790k at 1.325V (VID, actual core voltage was something like 1.35V), 8 140mm Phanteks fans, 3 LED strips, overclocked RAM etc, and you got an approximation of how much the card consumes. It also seemed to agree with GPU-z, strangely.

The "lol no" part, however, was mostly about needing to go down to 900/1250MHz. I'd say leave the memory at 1500MHz, put the core voltage at -100mV and see how far core clocks will go. I'd guess it's gonna be in the 980-1020MHz range most likely, but that's just a hunch.

Again, sorry for sounding like an arse. I tend to do this sometimes. Dunno why, it just comes naturally.









PS : Also, keeping the core and VRMs cool goes a long way to decreasing power consumption. When undervolted, that's especially easy to do, so that's a thing to keep in mind too.
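PS 2: If anyone wants to replicate the kill-a-watt estimate, here's the rough arithmetic as a sketch. All the constants (90% PSU efficiency, ~150W for the rest of the system) are assumed example values for illustration, not measurements:

```python
# Rough GPU board-power estimate from a wall-meter (kill-a-watt) reading.
# Every constant here is an assumed example value, not a measurement.
def estimate_gpu_power(wall_w, psu_efficiency=0.90, other_components_w=150):
    """Subtract PSU losses and the rest of the system from the wall reading."""
    dc_power = wall_w * psu_efficiency    # power actually delivered to components
    return dc_power - other_components_w  # remainder attributed to the GPU

# A 350 W wall reading with ~150 W assumed for CPU/fans/RAM/etc.
print(round(estimate_gpu_power(350)))
```

Obviously this is only as good as the numbers you feed it, but it shows why a loaded overclocked CPU, fans, and LED strips make the GPU's share hard to pin down from the wall reading alone.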


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Rude? Not my intention, really. Sorry it came out like that man.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The original 290s were quite stupidly "overvolted", from what I remember. As in, they could undervolt all the way all day with no problems. When undervolted far enough, they sip power. Like really, they consume really little. The 390/390x are, AFAIK, better in that regard due to maturation of the process etc.
> 
> And no, I didn't use a multi-meter, but I did use a kill-a-watt. Don't remember the exact number, but it was somewhere in the neighbourhood of 300-350W during BF4 if I remember correctly. Couple that with a 4.8GHz 4790k at 1.325V (VID, actual core voltage was something like 1.35V), 8 140mm Phanteks fans, 3 LED strips, overclocked RAM etc, and you got an approximation of how much the card consumes. It also seemed to agree with GPU-z, strangely.
> 
> The "lol no" part, however, was mostly about needing to go down to 900/1250MHz. I'd say leave the memory at 1500MHz, put the core voltage at -100mV and see how far core clocks will go. I'd guess it's gonna be in the 980-1020MHz range most likely, but that's just a hunch.
> 
> Again, sorry for sounding like an arse. I tend to do this sometimes. Dunno why, it just comes naturally.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS : Also, keeping the core and VRMs cool goes a long way to decreasing power consumption. When undervolted, that's especially easy to do, so that's a thing to keep in mind too.


All good. No worries.


----------



## Amaregaz

Does anyone know why it says my GPU is Hawaii? I have an Asus Strix 390X; isn't it supposed to be Grenada?


----------



## tolis626

Quote:


> Originally Posted by *Amaregaz*
> 
> Does anyone know why it says my GPU is Hawaii? I have an Asus Strix 390X; isn't it supposed to be Grenada?


Same thing, really.


----------



## mus1mus

Probably just a GPU-Z version bug. Try newer ones.


----------



## jdorje

Grenada is in Hawaii, bro.


----------



## mus1mus

The Grenada or Hawaii branding is in the BIOS and chip info. Apparently, some software was not updated to differentiate or read them correctly. There was even a time when a modded 290X with a 390 BIOS was being read as Grenada; 3DMark used to do that.


----------



## HeXBLiTz

So I redid the thermal paste with some Arctic Silver 5 I had lying around. Temps dropped 10C to 75 with 60%-70% fan speed, so that'll do.


----------



## ronaldoz

Quote:


> Originally Posted by *HeXBLiTz*
> 
> So I redid the thermal paste with some Arctic Silver 5 I had lying around. Temps dropped 10C to 75 with 60%-70% fan speed, so that'll do.


That's a great temp difference!


----------



## christoph

Quote:


> Originally Posted by *HeXBLiTz*
> 
> So I redid the thermal paste with some Arctic Silver 5 I had lying around. Temps dropped 10C to 75 with 60%-70% fan speed, so that'll do.


over stock paste??


----------



## HeXBLiTz

Quote:


> Originally Posted by *christoph*
> 
> over stock paste??


Yeah, I assume it was the stock paste, although the card was second-hand, but in good condition, clean, and less than a year old.

The paste on there was very dry and quite thin, not much at all tbh.


----------



## battleaxe

Quote:


> Originally Posted by *HeXBLiTz*
> 
> Yeah, I assume it was the stock paste, although the card was second-hand, but in good condition, clean, and less than a year old.
> 
> The paste on there was very dry and quite thin, not much at all tbh.


You did remove the old paste first, I assume?


----------



## HeXBLiTz

Quote:


> Originally Posted by *battleaxe*
> 
> You did remove the old paste first, I assume?


Obviously lol. I'm not that much of a noob


----------



## bluej511

Really depends on the factory paste and how much they put on. I redid mine and it wasn't too big a difference. I bet using GC Extreme or Thermal Grizzly the difference would be bigger. You could use CLU/CLP, but it's been known to need lapping to clean it off. Pro seems to be easier to clean than Ultra. One is 36 W/mK, the other is 32. Conductonaut is 86 lol.


----------



## yuannan

Would high VRM2 temps mean bad paste? Or is it just that VRM2 is usually hotter anyway? VRM2 idles at 50-60 and VRM1 is at 30-40.


----------



## bluej511

There's no paste on the VRMs, just pads. What you idle at is what mine is at full load, so something is definitely not right there haha. Could be why you're having issues needing higher voltages. What's your VRM2 like at load?


----------



## battleaxe

Quote:


> Originally Posted by *yuannan*
> 
> Would high VRM2 temps mean bad paste? Or is it just that VRM2 is usually hotter anyway? VRM2 idles at 50-60 and VRM1 is at 30-40.




What card do you have?


----------



## christoph

I've wanted to try GC Extreme a few times before, so I'm going to look into buying a tube so I can test...

VRMs are always gonna be hotter than the GPU.


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> I've wanted to try GC Extreme a few times before, so I'm going to look into buying a tube so I can test...
> 
> VRMs are always gonna be hotter than the GPU.


It came with my EKWB block; I was able to do the CPU block 2-3x and the GPU die once before I used it all. Was impressed it came with that much. Now I've got some Noctua NT-H1, but I might get some GC Extreme, it worked really well.


----------



## jdorje

The internet says Kryonaut is better than GC Extreme.

Pretty sure you can use liquid metal on these cards easily enough though. CLU is easy. The problem with improving the core cooling is that it just lowers fan speeds and leaves your VRMs hotter. Or you raise the fan curve to match, and then it's not really gaining you anything.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> The internet says Kryonaut is better than GC Extreme.
> 
> Pretty sure you can use liquid metal on these cards easily enough though. CLU is easy. The problem with improving the core cooling is that it just lowers fan speeds and leaves your VRMs hotter. Or you raise the fan curve to match, and then it's not really gaining you anything.


Yup, it's why I'm leaving the GPU core as is; 41C is plenty cool already.


----------



## jdorje

It makes no sense that the VRMs should always be hotter. I mean, the amount of heat generated is way less, right? So it's just a question of cooling. And the issue there is that the VRMs can't be shorted out (right?) and are uneven, so thermal padding is needed to connect them to the cooling apparatus.

Hmm.


----------



## bluej511

Well, because the amount of amps and volts the VRMs handle is much greater than the core or VRAM. The VRMs are pretty much what regulates the current going to the card. Pretty sure the watts and amps that come in go to the VRM first, THEN to the core; it's why it's called a voltage REGULATOR module. It will always be the hottest electronic part; it's the same for mobos and so on.


----------



## mus1mus

Blame Physics.


----------



## Worldwin

By that logic the PSU should be the hottest. After all, ALL of the power pulled from the wall goes through the PSU before the rest of the system. The VRM shouldn't always be the hottest part; you are ignoring the efficiency. If the VRM is, say, 80% efficient, then 20% is turned into waste heat at the VRM, making it hot. The VRMs are not always the hottest part, depending on the cooling. Take a GPU with an 80% efficient VRM pulling 300W from the PSU: 240W is delivered (and eventually dissipated) at the die, whereas only 60W is dissipated at the VRM. It is easier to cool 60W than 240W, which is why manufacturers often skimp on VRM cooling; it's drastically easier to cool compared to the core.

TLDR: If your VRM temps > core temps, then the cooling isn't strong enough for the VRM. The fact that the VRM converts ~12V -> 1.X V is not, by itself, the reason it is hotter.
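That split is easy to sanity-check in code. A minimal sketch (the flat 80% efficiency and the 300W figure are just the assumptions from my example above; real VRM efficiency varies with load and temperature):

```python
# Split a GPU's input power into watts delivered to the die vs. watts
# burned as heat in the VRM, for a given (assumed) flat VRM efficiency.
def vrm_dissipation(input_power_w, efficiency=0.80):
    """Return (watts delivered to the core, watts dissipated in the VRM)."""
    delivered = input_power_w * efficiency
    waste = input_power_w - delivered
    return delivered, waste

# The 300 W / 80% example from above: ~240 W at the die, ~60 W at the VRM.
core_w, vrm_w = vrm_dissipation(300, 0.80)
print(core_w, vrm_w)
```

Point being, the die always dissipates several times more heat than the VRM; the VRM only reads hotter because its cooling is so much weaker.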


----------



## tolis626

So I ordered myself a set of Fujipoly pads at last. I hope I'll get them sooner rather than later, but sometimes standard shipping takes ages. Anyways, I'm hoping for the best here.

I just have a couple of questions. First one is, should I put a thermal pad on the chokes (I did take the green one that MSI had on them off and left them that way)? When I use the Fujipolys I'll have a spare set of Phobya pads. I'd guess as long as it's not a thick pad and doesn't prevent contact to the other parts of the GPU, it wouldn't hurt. But should I bother? Or is there absolutely no reason to cool the chokes?

Second question. I'm also thinking of changing the pads on VRM2, in the hopes that it improves my memory overclocking, much in the same way that VRM1 cooling increased my core overclocking. Is there any chance it will work, or am I wasting my time? I will most probably do it because I already bought a 50x15x1mm pad to use there, but I'm asking just in case.

Thanks in advance!


----------



## mus1mus

Quote:


> Originally Posted by *Worldwin*
> 
> By that logic the PSU should be the hottest. After all, ALL of the power pulled from the wall goes through the PSU before the rest of the system. The VRM shouldn't always be the hottest part; you are ignoring the efficiency. If the VRM is, say, 80% efficient, then 20% is turned into waste heat at the VRM, making it hot. The VRMs are not always the hottest part, depending on the cooling. Take a GPU with an 80% efficient VRM pulling 300W from the PSU: 240W is delivered (and eventually dissipated) at the die, whereas only 60W is dissipated at the VRM. It is easier to cool 60W than 240W, which is why manufacturers often skimp on VRM cooling; it's drastically easier to cool compared to the core.
> 
> TLDR: If your VRM temps > core temps, then the cooling isn't strong enough for the VRM. The fact that the VRM converts ~12V -> 1.X V is not, by itself, the reason it is hotter.


You are taking it into irrelevance.


----------



## Worldwin

Quote:


> Originally Posted by *mus1mus*
> 
> You are taking it into irrelevance.


Care to explain?


----------



## bluej511

Quote:


> Originally Posted by *Worldwin*
> 
> Care to explain?


I think he means PSUs function a bit differently than GPUs, and yes, PSUs do get warm. But they're also massive, completely open, and not squashed between 2 pieces of aluminum lol.


----------



## Worldwin

Quote:


> Originally Posted by *bluej511*
> 
> I think he means PSUs function a bit differently than GPUs, and yes, PSUs do get warm. But they're also massive, completely open, and not squashed between 2 pieces of aluminum lol.


The PSU was just an example to show that just because all the power flows through it doesn't mean that power is where it is used. I most likely should have been more clear.


----------



## mus1mus

Quote:


> Originally Posted by *Worldwin*
> 
> Care to explain?


First, consider the load that the VRMs are facing.
Quote:


> Originally Posted by *Worldwin*
> 
> The PSU was just an example to show that just because all the power flows through it doesn't mean that power is where it is used. I most likely should have been more clear.


The power section is not always the hottest part of a system, but it does get less attention when it comes to the cooling solution.

Electronics always run their power sections hot when you're talking about power levels in the hundreds of watts.


----------



## Worldwin

Anyone know the thickness of the thermal pads for the MSI 390X?
Quote:


> Originally Posted by *mus1mus*
> 
> First, consider the load that the VRMs are facing.
> The power section is not always the hottest part of a system, but it does get less attention when it comes to the cooling solution.
> 
> Electronics always run their power sections hot when you're talking about power levels in the hundreds of watts.


Why do I feel like I'm talking about unicorns and you're talking about chairs? What you're saying seems like add-ons rather than actual replies.


----------



## kizwan

VRM is not the hottest part.


----------



## jdorje

My vrms get 20C hotter than my core. It's not the most heat but it is the highest temps. Those who have better cooling can do better. But if you put the core on water then most likely the vrms aren't on water so they're still significantly hotter. As mentioned the issue is obviously cooling solutions.

If the vrms are 80% efficient then they are dropping 60 watts. The gpu core will be dropping about 240 watts. And my ~90% efficient psu will be dropping about 40 watts, but over a larger area.

An aio that could cool the vrms would be so convenient.


----------



## Stige

Quote:


> Originally Posted by *kizwan*
> 
> VRM is not the hottest part.


And what card is that? VRM is the hottest part on these cards 99.9% of the time.


----------



## mus1mus

Quote:


> Originally Posted by *Worldwin*
> 
> Why do I feel like I'm talking about unicorns and you're talking about chairs? What you're saying seems like add-ons rather than actual replies.


Well, on one point, what you are describing is based more on your imagination than on what happens in the real world.

I do feel you have the knowledge, so I'll drop a few things based on your analogy.

Batteries and your phone, for one: does the battery get hot or warm when you are using the phone with mobile data ON and gaming on it?

The battery is of course rated to deliver the power the phone needs. But as you load up the phone's functionality, you can see how it affects the power section. The phone's components can even stay cool during that usage.

Another one: when a processor is stressed under load, the VRMs on the motherboard get hot or warm, depending on your perspective, but they are designed to, and have the efficiency to, withstand that kind of load.

Why do you think that is?

This is due to the load they are faced with, not the efficiency or the internal power consumption of the VRMs. Remember your efficiency figure example?

The thing is, even if you have a very powerful power section driving a certain load, a GPU or CPU for example, once those components become active or go into high utilisation, the power section will shoulder the load.

High utilisation of the load looks, from the power stage's perspective, like something close to a partial short circuit. I bet you know how a short circuit affects power supplies.

Your example about the efficiency of the VRM circuit actually relates to something else, and should not be pointed at high temps under full load or heavy utilisation. In fact, it should be directed at why the VRMs get warm even when the GPU is not loaded or is under low utilisation.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> VRM is not the hottest part.


It's a matter of design perspective: whether to focus on a beefier power stage or on a beefier cooler for the main component, the GPU.


----------



## christoph

It all comes down to the COOLING SOLUTION. First you have to keep in mind the STRESS you are putting on the electronics, then consider the cooling solution for it...

Years ago, the VRMs were the hottest part of any video card because they didn't have the cooling they have nowadays. The stress you put on the VRMs is still there, but with a better cooling solution; in the AMD 300 series the VRMs are hotter than the GPU by just a few degrees, BUT it also depends on the case cooling.


----------



## christoph

Quote:


> Originally Posted by *tolis626*
> 
> So I ordered myself a set of Fujipoly pads at last. I hope I'll get them sooner rather than later, but sometimes standard shipping takes ages. Anyways, I'm hoping for the best here.
> 
> I just have a couple of questions. First one is, should I put a thermal pad on the chokes (I did take the green one that MSI had on them off and left them that way)? When I use the Fujipolys I'll have a spare set of Phobya pads. I'd guess as long as it's not a thick pad and doesn't prevent contact to the other parts of the GPU, it wouldn't hurt. But should I bother? Or is there absolutely no reason to cool the chokes?
> 
> Second question. I'm also thinking of changing the pads on VRM2, in the hopes that it improves my memory overclocking, much in the same way that VRM1 cooling increased my core overclocking. Is there any chance it will work, or am I wasting my time? I will most probably do it because I already bought a 50x15x1mm pad to use there, but I'm asking just in case.
> 
> Thanks in advance!


I'm gonna change the VRM thermal pads, both VRM1 and VRM2. The chokes I will leave as they are. So in the end I'll just change the VRM thermal pads as well as the GPU thermal paste, but the latter only because you have to replace the thermal paste once you take the video card apart.


----------



## jdorje

Vrm2 never gets hot. Cooling the vram itself is beneficial but most cards just have a heat sink on top of it to do that already.

My cpu chokes get up to like 60C. Seems like cooling wouldn't be too bad.


----------



## christoph

Quote:


> Originally Posted by *jdorje*
> 
> Vrm2 never gets hot. Cooling the vram itself is beneficial but most cards just have a heat sink on top of it to do that already.
> 
> My cpu chokes get up to like 60C. Seems like cooling wouldn't be too bad.


Then why is VRM2 hotter than VRM1?


----------



## jdorje

Quote:


> Originally Posted by *christoph*
> 
> Then why is VRM2 hotter than VRM1?


Okay I misspoke...my vram vrm never gets hot. Maybe yours does.


----------



## christoph

Hmm.

It would be nice if we had replies about the VRM temperatures across the different 200/300 series cards.

We've been asking if our temps are OK because we have nothing to compare against...

Or do we have that info?


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> Hmm.
> 
> It would be nice if we had replies about the VRM temperatures across the different 200/300 series cards.
> 
> We've been asking if our temps are OK because we have nothing to compare against...
> 
> Or do we have that info?


Sapphire seems to have the best VRM temps. I think Asus and MSI are the worst. Idk about XFX or reference.


----------



## Stige

ASUS cards are easily the worst and Sapphire cards are the best.


----------



## tolis626

Anyone know what the small square pieces of thermal pad are used for? The ones to the right of the VRM strip. (Well, in this picture it's on the left because the heatsink is upside down)


----------



## jdorje

Vrm temps on my xfx vary massively by load. At stock 1225 mV and 1090 mhz and 20C ambient my core gets up to low 70s in gaming and vrms to 80-90. If I do OCCT core is barely hotter but vrms will approach 100. At +100 mV in valley the core will be around 80 while vrms can get up near 120 if it runs long enough.

I've replaced core tim and vrm thermal padding. Both made a large improvement.

Vrm2 aka the vram gets up to maybe 60-70C tops. My vram is at 1740 with 1250 strap timings.

Msi and sapphire have the best vrm temps. Both cool the vrms with the main heat sink. Pcs+ and both xfx cards use a small heat sink "inside" the main one and can get much hotter. In particular, vrm temps take a long time to max out.

There does seem to be large variation in individual cards though.

They say that over 70C voltage ripple starts to rise. Not sure though; I've never really seen stability drop over time as the vrm temps get higher.


----------



## Worldwin

Quote:


> Originally Posted by *mus1mus*
> 
> Well, on one point, what you are describing is based more on your imagination than on what happens in the real world.
> 
> I do feel you have the knowledge, so I'll drop a few things based on your analogy.
> 
> Batteries and your phone, for one: does the battery get hot or warm when you are using the phone with mobile data ON and gaming on it?
> 
> The battery is of course rated to deliver the power the phone needs. But as you load up the phone's functionality, you can see how it affects the power section. The phone's components can even stay cool during that usage.
> 
> Another one: when a processor is stressed under load, the VRMs on the motherboard get hot or warm, depending on your perspective, but they are designed to, and have the efficiency to, withstand that kind of load.
> 
> Why do you think that is?
> 
> This is due to the load they are faced with, not the efficiency or the internal power consumption of the VRMs. Remember your efficiency figure example?
> 
> The thing is, even if you have a very powerful power section driving a certain load, a GPU or CPU for example, once those components become active or go into high utilisation, the power section will shoulder the load.
> 
> High utilisation of the load looks, from the power stage's perspective, like something close to a partial short circuit. I bet you know how a short circuit affects power supplies.
> 
> Your example about the efficiency of the VRM circuit actually relates to something else, and should not be pointed at high temps under full load or heavy utilisation. In fact, it should be directed at why the VRMs get warm even when the GPU is not loaded or is under low utilisation.


From what I understand, you are saying that the reason the VRM gets hot is that you are moving X watts through it, rather than waste energy being burnt off at the VRM.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> 
> 
> Anyone know what the small square pieces of thermal pad are used for? The ones to the right of the VRM strip. (Well, in this picture it's on the left because the heatsink is upside down)


Yeah, looks like it's for cooling something else, maybe a chip of some sort. I don't think my Sapphire had that.


----------



## jdorje

Quote:


> Originally Posted by *tolis626*
> 
> Anyone know what the small square pieces of thermal pad are used for? The ones to the right of the VRM strip. (Well, in this picture it's on the left because the heatsink is upside down)


That's the msi?

Really hard to tell since the top is flipped so you can't line it up visually. Looks like they should contact the capacitors you can see on the top right. It also looks like the padding on the chokes (green) doesn't contact anything?

Someone posted a bit back that the msi didn't have a vram heat sink, but that one obviously does.


----------



## Worldwin

Quote:


> Originally Posted by *tolis626*
> 
> 
> 
> Anyone know what the small square pieces of thermal pad are used for? The ones to the right of the VRM strip. (Well, in this picture it's on the left because the heatsink is upside down)


It should be contacting what appears to be the mosfets?


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> That's the msi?
> 
> Really hard to tell since the top is flipped so you can't line it up visually. Looks like they should contact the capacitors you can see on the top right. It also looks like the padding on the chokes (green) doesn't contact anything?
> 
> Someone posted a bit back that the msi didn't have a vram heat sink, but that one obviously does.


Well, it's not a heatsink per se, more like a contact plate. But I think it does touch the heatsink and it definitely is enough to keep the VRAM cool. I don't know about VRM2 though, as we can't monitor its temperatures on the MSI. If changing its padding improves anything, it's probably getting too hot.

Also, the green strip on the chokes touches the part of the heatsink that has denser fins, right next to the VRM contact plate. I'm thinking I could use the stock VRM pad there, as that's really pliable and wouldn't hinder contact on any other point. The Phobya pads are really firm and might create a gap between the core and heatsink or VRM and heatsink.


----------



## jdorje

Yeah I wouldn't change the green pads. As you say it's touching the fins, not the sink, so contact depends on its pliability.

VRM2 is the VRAM VRM. It's in the top left, off the edge of your picture. It doesn't even have a heat sink? Or maybe a tiny one. But there'd be no way to cool it with the main heat sink.


----------



## bluej511

Btw, whoever applied that TIM should be fired. Way too much, way too sloppy.


----------



## jdorje

I don't know what the best way to apply tim on the gpu core is. On a cpu the die is under the ihs and tiny; you just want a thin covering of tim on and around the die location. So just dropping a dot on the die spot will do that and avoid bubbles as the heat and pressure spread it.

But on the gpu core you want full coverage right? The die is under that whole thing. So a small dot leaves a fair amount uncovered, while a larger amount both spills over and leaves it too thick in the center.

I wasn't sure and ended up just spreading a thin layer of my ceramique on the whole thing with a piece of plastic. This too is supposedly a bad method but I got a 10c improvement in temps.


----------



## bluej511

If done right any application works. I did double line on my gpu seems to work. X works really best too but has to be thin lines in an x.


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> Yeah I wouldn't change the green pads. As you say it's touching the fins, not the sink, so contact depends on its pliability.
> 
> Vrm2 is the vram vrm. It's in the top left and off the screen on your picture. It doesn't even have a heat sink? Or maybe a tiny one. But they'd no way to cool it with the main heat sink.


Yup, VRM2 is under there and that's what worries me. And seeing as, from what I've gathered, the MSI cards seem to be the worst when it comes to memory overclocking, I think it may be overheating. And the fact that there is no sensor for it strikes me as suspicious.

Also, the green pad is gone. The moment I tried to peel it off it crumbled and I had to remove it in 10 pieces or so. It's been running with no pad on the chokes though for quite a while and it seems fine? I dunno. Maybe it's not needed at all as even EK doesn't include a pad for them and leaves them naked under the block.
Quote:


> Originally Posted by *jdorje*
> 
> I don't know what the best way to apply tim on the gpu core is. On a cpu the die is under the ihs and tiny; you just want a thin covering of tim on and around the die location. So just dropping a dot on the die spot will do that and avoid bubbles as the heat and pressure spread it.
> 
> But on the gpu core you want full coverage right? The die is under that whole thing. So a small dot leaves a fair amount uncovered, while a larger amount both spills over and leaves it too thick in the center.
> 
> I wasn't sure and ended up just spreading a thin layer of my ceramique on the whole thing with a piece of plastic. This too is supposedly a bad method but I got a 10c improvement in temps.


Well, as the contact between the die and heatsink is direct, you want full coverage. Any spot that isn't covered by TIM will most probably overheat, as it's not even metal-to-metal contact. So you need to spread. I've seen people use credit cards and I see no reason why that wouldn't work. I've also seen people use latex gloves to spread it. Again, why not? For me, I use the tiny plastic spatula that came with GC Extreme to spread it. Way more convenient and it actually leads to a pretty even spread if you're careful. Can you avoid bubbles 100%? Probably not. But they're preferable to uncovered spots overheating.
Quote:


> Originally Posted by *bluej511*
> 
> Btw, whoever applied that TIM should be fired. Way too much, way too sloppy.


Yup, tell me about it. I've cleaned it two times and I still can't get it all out. If you look closely, there was some TIM all the way on the metal plate close to the PCIe connectors. Thing is, there is still some of the old TIM left on the capacitors near the die and I have no idea how to clean that. Even a cotton bud can't clean it properly, as it can't get between the capacitors to clean it all off. I was about to shower it with alcohol, but all I have is 93% ethanol with glycerine mixed in (to avoid drinking, I use that because it's the cheapest here, lol) and I'm not comfortable having too much of that on there. Sigh... Although I guess as long as it doesn't cause problems, it's fine.


----------



## Stige

Electronics cleaner does it for me, cleans it right up. Spray it so the TIM just drips off the card.


----------



## yuannan

Quote:


> Originally Posted by *battleaxe*
> 
> What card do you have?


Sapphire Tri-X 390x.

I think it's all down to the fact that the mem clock does not throttle down, but I'm not sure.
The GPU temps are very weird at 50C idle (mem clock again) and 60C load. The fans sit at 30-40% and remain very quiet.

At idle if I push the fans to max VRM2 drops to 45C and VRM1 drops to low 30s.


----------



## christoph

I think it's something like that? And VRM2 should be around the green circle, right?

Besides, if you took off the stock thermal pads from the VRMs, aren't they better than the green pads on the chokes? I mean, you could replace the green pads with the stock VRM thermal pads. I don't know if the Sapphire ones have that green thermal pad.


----------



## yuannan

Quote:


> Originally Posted by *christoph*
> 
> I think it's something like that? And VRM2 should be around the green circle, right?


My whole backplate is hot, but the heat does seem to come from the back of the card aka the end where it connects to the case.


----------



## christoph

The backplate has that purpose too, to absorb heat from the card. The Sapphire ones have a thermal pad between the backplate and the back of the video card's PCB, right in the gap that aligns with the back of the VRMs.


----------



## tolis626

Quote:


> Originally Posted by *christoph*
> 
> I think it's something like that? And VRM2 should be around the green circle, right?
> 
> 
> 
> Besides, if you took off the stock thermal pads from the VRMs, aren't they better than the green pads on the chokes? I mean, you could replace the green pads with the stock VRM thermal pads. I don't know if the Sapphire ones have that green thermal pad.


Yup, VRM2 is under there, where you circled in green. And that's what I've been thinking about doing too. The stock thermal pad seems like crap, but that green thing was even worse. Plus, the VRM pad is really easily pliable and most probably wouldn't get in the way of anything else, while the firmer Phobya pads would if they weren't the right size.

It would be great if someone with another brand of card chimed in here and said whether their card has any sort of cooling for the chokes.


----------



## yuannan

Quote:


> Originally Posted by *christoph*
> 
> the back plate has that purpose too, to absorb heat from the card, the Sapphire ones have thermal pad between the backplate and the counterpart of the video cad's PCB, just in between that gap where should align with the back of the VRMs


I bought and installed the BP myself. It has this soft rubber bit that connects to the blue bit on the PCB.


----------



## christoph

Quote:


> Originally Posted by *tolis626*
> 
> Yup, VRM2 is under there, where you circled using green. And also, that's what I've been thinking about doing. The stock thermal pad seems like crap, but that green thing was even worse. Plus, the VRM pad is really easily pliable and wouldn't get in the way of anything else, most probaly, while the firmer Phobya pads would if they weren't the right size.
> 
> It would be great if someone with other brand of card chimed in here and said whether their cards have any sort of cooling for the chokes.


Yeah, I have the Sapphire, but the thermal pads I ordered haven't gotten here; it's gonna take like another 2 weeks...

Quote:


> Originally Posted by *yuannan*
> 
> I brought and installed the BP my self. It has this soft rubber bit that connects to the blue bit on the PCB.


What rubber bit? Do you have a link to the backplate you bought?


----------



## yuannan

Quote:


> Originally Posted by *christoph*
> 
> yeah I have the Sapphire but the thermal pads that I order haven't got here, is gonna take like other 2 weeks...
> what rubber bit? do you have a link to the backplate you bought?


It's straight from Sapphire. I could take a picture if you really want. But all it is is a really soft rubber-like material that bridges the gap between BP and PCB.


----------



## christoph

Quote:


> Originally Posted by *yuannan*
> 
> It's straight from sapphire. I could take a picture if you really want. But all it is a really soft rubber like material that bridges the gap BP and PCB.


Ohhhhhhh, that's something I really want to know, cuz I haven't taken my video card apart. I want to know if that "rubber pad" is a thermal pad or not, cuz if it's not, you can replace that rubber with an actual thermal pad and cool the back of the PCB, keeping in mind that almost every thermal pad is NOT electrically conductive.

You didn't take pictures when installing the backplate, right?


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> ohhhhhhh, That's something I really want to know, cuz I haven't take my video card apart, I want to know if that "rubber pad" is thermal pad or not, cuz if is not, you can replace that rubber with actual thermal pad and be cooling the PCB's back, taking in mind that almost every thermal pad is NOT conductance
> 
> you didn't take pictures when installing the backplate right?


Mine is taken apart, back in the box, haha. That is indeed a thermal pad on the backplate; it's super sticky. I'd leave that alone as it's probably closer to 3mm thick, and that thickness is hard to find.


----------



## yuannan

Quote:


> Originally Posted by *christoph*
> 
> ohhhhhhh, That's something I really want to know, cuz I haven't take my video card apart, I want to know if that "rubber pad" is thermal pad or not, cuz if is not, you can replace that rubber with actual thermal pad and be cooling the PCB's back, taking in mind that almost every thermal pad is NOT conductance


That pad doesn't feel conductive at all when I touch it. The rubber is VERY soft. It squidges around and is shaped easily. I've never touched a "raw" thermal pad before, so I might just be BSing. Despite all this, the backplate is still very hot. Not enough to burn, but enough to warm up your hands, quite well actually.


----------



## jdorje

My backplate is like 60C. Surely the PCB can't be that hot. My core is barely that hot, and no way the heatsink is. The heat must be coming from VRMs or...somewhere else that doesn't have good cooling.


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> My backplate is like 60C. Surely the PCB can't be that hot. My core is barely that hot, and no way the heatsink is. The heat must be coming from VRMs or...somewhere else that doesn't have good cooling.


Yup, I have the same "issue". My core is barely over 40C but the VRM gets into the 60-70C range, which heats up the backplate nicely. Need to set up a fan to blow over the backplate somehow.


----------



## jdorje

I tried just blowing one at it but it did nothing. It's just not a good surface for dumping heat. I considered putting some sinks on it... I think they would want to connect to the screws. That has to be what is bringing the heat.


----------



## Stige

Quote:


> Originally Posted by *jdorje*
> 
> I tried just blowing one at it but it did nothing. Just not a good surface for dumping heat. I considered putting some sinks on it...I think they would want to connect to the screws. That has to be what is bringing the heat.


This is what I did to my R9 280X in my secondary PC. The VRM heatsink on it is tiny and it only has a tiny backplate over the VRM; I just put some thermal paste there and slapped some of those 1cm x 1cm heatsinks across it. The VRM can actually be touched now, so it does help.

The GPX block I have does have sort of small "fins" on the backplate, but nothing major, so blowing air over it probably won't help much either.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> This is what I did to my R9 280X in my secondary PC. The VRM heatsink is tiny on it and it only has a tiny backplate over the VRM, I just put some thermal paste there and slapped some of those 1cm x 1cm heat sinks across it. The VRM actually can be touched now so it does help.
> 
> The GPX block I have does have sort of small "fins" on the backplate but nothing major so propably won't help much at all blowing air over it either.


You'd be surprised. For me it's VRM2 that's usually hotter than VRM1. It now stays under 60C for both.


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> Youd be surprised. For me its VRM2 thats usually hotter then 1. It now stays under 60C for both.


Mine too, the VRM2 is hotter than VRM1...

Anyone know if the Kryonaut thermal paste is conductive? Can it be applied to the GPU without any strange issues?


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> mine too, the VRM2 is hotter than VRM1...
> 
> anyone knows if the Kryounat? thermal paste is conductive? can it be apply on the GPU without any strange issue?


Kryonaut is non-conductive, super hard to find, and def baller haha. It's supposed to be the best TIM on the market, bar none. Beats out everything else if I'm correct.

So I'm trying out my new just-released Corsair M65 Pro RGB; just played some Far Cry 4 (forgot I even had the game to test, haha). VRM1 is 53°C and VRM2 is 59°C. This is with the Alphacool GPX waterblock; I have the intake fans up top so they're blowing right over the card, plus a fan blowing over the fins, but it's only a 900rpm fan so it's not moving much air. Temps are about the same when playing Syndicate, and dropped a few degrees after adding more intake fans to my case. I wonder if getting Fujipoly 11W/mK pads would drop it down even more. Those are pretty decent temps for passively cooled VRMs.


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> Kyronaut is non conductive, super hard to find, and def baller haha. Its supposed to be the best TIM on the market bar none. Beats out everything else if im correct.
> 
> So i'm trying out my new just released Corsair M65 Pro RGB, just played some Far Cry 4 (forgot i even had the game to test haha). VRM 1 is 53°C and VRM 2 is 59°C. This is with the alphacool gpx waterblock, i have the intake fans up top so its blowing right over the card, i have a fan blowing over the fins but its only a 900rpm fan so not moving much air. Temps are about the same when playing Syndicate, so dropped a few degrees adding more intake fans in my case. I wonder if getting Fujipoly 11w/mk pads would drop it down even more. Those are pretty decent temps for passively cooled VRMs.


Yes, and actually those temps are not bad at all. Sapphire has just about the best cooling solution, but we're never happy with anything; that's why I'm replacing the thermal pads with the Fujipoly 11W/mK ones, but they haven't arrived yet...

But what to replace the GPU thermal paste with? Is this for real? 11 grams? 12.5 W/mK? Non-conductive, you said? I know the GC Extreme is not conductive, but I don't know what thermal rating it has.

http://www.amazon.com/Thermal-Grizzly-Kryonaut-Grease-Paste/dp/B00ZJS8Q6S


----------



## bluej511

Yeah, it's real; just check their site, it's non-conductive. If you're only doing your GPU and CPU (cuz might as well, right lol) go with the 1g, it's more than enough. Idk how well TIM keeps; 11g is way too much.

Gelid GC Extreme is about 8.5 W/mK; Thermal Grizzly also has a liquid metal that's 86W/mK, lol.


----------



## yuannan

Quote:


> Originally Posted by *jdorje*
> 
> My backplate is like 60C. Surely the PCB can't be that hot. My core is barely that hot, and no way the heatsink is. The heat must be coming from VRMs or...somewhere else that doesn't have good cooling.


No idea; my GPU only hits 60C under load anyway, so I'm stuffed.
Quote:


> Originally Posted by *bluej511*
> 
> Yea its real just check their site its non conductive. If your only doing your gpu and cpu (cuz might as well right lol) go with the 1g its more then enough. Idk how well tim keeps. 11g is way too much.
> 
> Gelid extreme is about 8.5 thermal grizzly have a liquid metal thats 86w/mK lol.


I got the be quiet! thermal paste with my DRP3 and I still have about 1/3 of the tube left. Good for about another 5 applications.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> I got the be quiet thermal paste with my DRP3 and I still have about 1/3 of the tube left. Good for about another 5 apps.


Yeah, 1g is def more than enough for a few applications. My EK waterblock came with 1.5g; I did my CPU probably 3-4x and used a good amount on my GPU.

On another note, there's this latest Nvidia driver that's bricking cards, again.

P.S. I just scored a barely-released M65 Pro RGB yesterday. Sometimes I love being the first to get new things haha.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> Yea 1g is def more then enough for a few applications. My ek water block came with 1.5g, i did my cpu prob 3-4x and a good amount for my gpu.
> 
> On the other hand about this latest nvidia drivers thats bricking cards, again.
> 
> P.S. I just scored a barely released M65 Pro RGB yesterday. Sometimes i love being the first to get new things haha.


I got a normal M65, very good mouse. I'm at 400 DPI anyway so no need for me. But the surface calibration tool, how is it?


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> I got a normal m65, very good mouse. I'm at 400DPI anyway so no need for me. But the surface calibration tool, how is it?


400 or 4000, lol. The tool is fun to use; you just do circles at a certain speed. I'm using a Razer Goliathus Speed anyway. I'd love a hard mat, just need a size that fits. The mouse was 70€ here (it's 60$ US), but my main shopping site had 25% off. Good deal for a just-released mouse; I might be the first on Overclock with it. It's super smooth too. Everyone said the sniper button is in a bad spot, but honestly it's not even close. My thumb has a hard time pressing it accidentally; it's a bit recessed and firm. So far so good: click delay is good, feels great, very smooth.

It doesn't like my MSI Gaming ports for firmware updates though; I had to do it through the 3.0 port.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> 400 or 4000 lol. The tool is fun to use just do circles at a certain speed. Im using a razer goliath speed anyways. Id love a hard mat just need a size that fits. The mouse was 70€ here its 60$ US but my main shopping site had 25% off. Goodp deal for a just released mouse. I might be the first on overclock with it. Its super smooth too. Every said the sniper button is in a bad spot but honestly not even close. My thumb has a hard time pressing it its a bit recessed and firm. So far so good, click delay is good, feels great very smooth.
> 
> It doesnt like my MSI gaming ports for fw updates though had to do it thru the 3.0 port.


400 DPI; I changed to it for CS:GO and stuck with it. Corsair MM200 mat, plenty of space for me.


----------



## Mattuz

Hi guys, I overclocked my Sapphire R9 390 to 1135/1690MHz at stock voltage. Max temp with auto fan mode is 76℃, and with 100% fan it's 67℃. Is that all OK?

I would also like to know why MSI Afterburner cannot stop the fan from spinning like my card does automatically when under 47℃; in Afterburner it only goes as low as 20%. Is it fixable?


----------



## jdorje

I'd try changing the fan curve in the BIOS. And upgrade your memory timing straps while you're at it.

On most cards VRM temps are a problem long before core temps, so watch those too.
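For what it's worth, a fan curve (whether in the BIOS or in Afterburner) is just linear interpolation between (temperature, duty) points. Here's a minimal Python sketch of the idea; the points below are made up for illustration, not anyone's actual curve:

```python
# Hypothetical fan curve as (temperature in C, fan duty in %) points, sorted by temp.
CURVE = [(40, 0), (55, 25), (70, 50), (85, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Linearly interpolate the fan duty for a given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]   # below the first point the fan can stop entirely
    if temp_c >= curve[-1][0]:
        return curve[-1][1]  # above the last point, full speed
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

The 20% floor Mattuz is seeing is simply the software clamping the bottom of the duty range; a curve whose first point has 0% duty, as in this sketch, is what a true fan-stop idle mode implements.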


----------



## Mattuz

OK, how can I change it? I mean, how do I get into the BIOS?


----------



## ronaldoz

Quote:


> Originally Posted by *Mattuz*
> 
> Hi guys, I overclocked my sapphire r9 390 to 1135/1690mhz with stock voltage, max temp with auto fan mode is 76℃ with 100% fan is 67℃, is it all ok?
> 
> I would also like to know why msi afterburner cannot stop the fan to spin like my card automatically does when under 47℃, on afterbutner it only goes as down as 20%, is it fixable?


Could the temperatures rise just from using higher clocks, or only when adding voltage? I have a Sapphire Nitro as well (actually tested 1 x 390 and 2 x 390X). The GPU chip temperature while gaming is around 66C with 80% fan max.


----------



## ronaldoz

Sorry, I quoted myself and then added jdorje's text, oops!

Quote:


> Originally Posted by *jdorje*
> 
> I'd try changing fan curve in the bios. And upgrade your memory timing straps while you're at it.
> 
> On most cards vrm temps are a problem long before core temps. So watch those too.


Will this help gaming performance? Did you check how much extra score you get in Firestrike / Heaven with those upgraded timing straps?


----------



## jdorje

Lower timings are pretty helpful.

I get ~2950 on the Valley preset with stock 1225 mV.

Still struggling to get 60 fps in Fallout at 1440p. Anyone have settings suggestions?

Edit: read the BIOS modding thread for more on strap editing. I just followed a YouTube video they linked.


----------



## Mattuz

Quote:


> Originally Posted by *ronaldoz*
> 
> Could the temperatures raise, because using higher clocks, or is this only when adding voltage? I'm having a Sapphire Nitro as well (Actually tested 1 x 390 and 2 x 390x). The temperature of the GPU chip is with gaming around 66C with 80% fan max.


Well, those temperatures are under Kombustor/FurMark; in games with auto fan I don't pass ~64℃.


----------



## Stige

Quote:


> Originally Posted by *Mattuz*
> 
> Well these temperatures are under kombustor furmark, under game with auto fan I don't pass ~64℃


Stop using **** like Furmark...


----------



## Mattuz

Well, the OCCT GPU test wasn't working, so I used it, and afterwards I tried Unigine Valley and gamed a bit.


----------



## Stige

Worst GPU benchmarks for stability:
1. Furmark
2. Valley

Furmark doesn't do you any good in any possible case; everyone that uses it needs to STOP. That software is garbage.


----------



## Mattuz

So what do you suggest to use?


----------



## navjack27

I totally agree. The only thing a benchmark is good for with stability testing is seeing if your card is stable in that benchmark. I've passed Fire Strike and everything, only to have CS:GO fail during a comp.


----------



## Stige

Quote:


> Originally Posted by *Mattuz*
> 
> So what do you suggest to use?


Heaven and Firestrike are a good place to start.

Or just play games.


----------



## Mattuz

Understood, thanks. But why is Valley not good while Heaven is?


----------



## bluej511

This is exactly why everyone should have at least one very demanding game on their PC. Run it maxed out, at a stable 60fps or whatever your refresh rate is. Run the game so the GPU is at 100% usage at all times, and not just 100% usage; make sure it's also holding full core speed, especially with the R9 300 series. I use Syndicate for temps as it pushes my GPU to 100%, and believe it or not the CPU average is usually around 80% or so. This way I can see if it artifacts, and it gets my water temp to equilibrium.

I believe Valley stresses both while Heaven is more GPU. For me, I was stable in Heaven but saw artifacts in Firestrike, so it really depends. Personally I don't play benchmarks, I play games haha.
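On the "just play games" approach: if you want hard numbers from a session rather than eyeballing an overlay, a simple poll-and-record loop over whatever temperature source you have does the job. A Python sketch, where `read_temp` is a hypothetical stand-in (on Linux it might read an amdgpu hwmon file; on Windows you'd parse a HWiNFO or GPU-Z log instead):

```python
import time

def monitor(read_temp, n_samples, interval_s=1.0):
    """Poll a temperature source n_samples times; return (max, mean)."""
    samples = []
    for _ in range(n_samples):
        samples.append(read_temp())  # read_temp() -> current temp in C
        time.sleep(interval_s)
    return max(samples), sum(samples) / len(samples)
```

Calling `monitor(read_temp, 600)` during a ten-minute gaming run then gives you a max and an average to compare before and after a pad or fan change.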


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> Stop using **** like Furmark...


Why don't you guys like furmark?


----------



## Stige

Quote:


> Originally Posted by *yuannan*
> 
> Why don't you guys like furmark?


Because it doesn't do anything useful at all.


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> Because it doesn't do anything useful at all.


I use it to test my fan profiles and stress my system for cooling and temps.

I don't use it for stability testing, so am I doing the "right" thing?


----------



## Stige

Quote:


> Originally Posted by *yuannan*
> 
> I use it to test my fan profiles, and stress my system for cooling and temps.
> 
> I don't use it for stability testing so am I doing the "right" thing?


It gives you temps that are out of this world, not realistic in any way.


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> It gives you temps that are out of this world, not realistic in any way.


They are the same as the ones in game; FurMark can't make my GPU work harder than 100%.

Or am I missing something? Unless it's a less demanding game, all of mine top out the GPU and I sit at 60-80 fps at 60C and 40% fan speed. Furmark does the same: 60C and 40% fan speed. Sure, I have to look at a hairy arsehole, but it kicks up the temps in about 30 seconds for me. Perfect for fan curve testing.


----------



## Stige

Quote:


> Originally Posted by *yuannan*
> 
> They are the same as the ones in game, furmark can't make my GPU work harder than 100%.
> 
> Or am I missing something? Unless it's a less demanding game all of mine tops out the GPU and I sit at 60-80 fps at 60C and 40% fan speed. Furmark does the same, 60C and 40% fan speed. Sure I have to look at a hairy arsehole but it kicks up the temps in about 30 seconds for me. perfect for fan curve testing.


Furmark is like lighting your GPU on fire. It pushes temps way past anything else ever will, and you can see why if you have a watt-o-meter. With ONLY Furmark, I can get my PC to pull just about 600W from the wall, but no game or anything else even goes above 500W in any situation.

If Furmark makes your GPU use more power than your whole system put together uses in a game, there is something very wrong with that software.


----------



## yuannan

Quote:


> Originally Posted by *Stige*
> 
> Furmark is like lighting your GPU on fire. It pushes temps way past anything else ever will, and you can see why if you have a watt-o-meter. With ONLY Furmark, I can get my PC to pull just about 600W from the wall. But no game or anything even goes above 500W in any situation.
> 
> If Furmark makes your GPU use more power than your whole system added together in a game, there is something very wrong with that software.


Hmm, I'll give it a go now.

What game do you normally use?


----------



## diggiddi

Quote:


> Originally Posted by *yuannan*
> 
> I got a normal m65, very good mouse. I'm at 400DPI anyway so no need for me. But the surface calibration tool, how is it?


You can download it off Corsair's site; it's really easy to use, just move your mouse around the mouse pad after clicking the button to start it.


----------



## tolis626

So my Fujipoly pads just arrived. Gonna install them in a bit and see how it goes. Hope the 1mm choice was a good one, or else I've wasted 30€.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> So my Fujipoly pads just arrived. Gonna install them in a bit and see how it goes. Hope the 1mm choice was a good one, or else I wasted 30€.


Dont forget before and after temps. Hope you got the right size. Id do a smell test before giving her 100% just in case theres an air gap.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Dont forget before and after temps. Hope you got the right size. Id do a smell test before giving her 100% just in case theres an air gap.


Smell test? What's that?

Anyways... So something's wrong here. Either temperature reporting is wrong, or 1mm is the wrong size. I can't think of any other reason why a 14W/mK Fujipoly pad would give the same temps as a 7W/mK Phobya or whatever crappy pad MSI uses. Thing is, it's about the same as the Phobya, maybe even worse.

Before I continue, this time I took off the metal plate that covers the RAM. Here it is.



First off, notice how some memory chips aren't fully covered by a thermal pad. That in itself concerns me. Then there is the thickness of the padding: this is most certainly not 0.5mm padding (as EK states in their manual), but more like 1mm, and it seems as low quality as it gets. Then there is the VRM2 pad, which is very thick (looks like 3mm, or 2.5mm, but definitely not thinner than that; it's huge) and which I replaced with a 1.5mm Phobya and a 1mm Fujipoly pad stuck together with Gelid GC Extreme. I know, not the best idea, but that thing got torn to pieces as I tried to take it off.

So, unless there is a 0.5mm gap between the VRMs' pad and the heatsink and that's what's causing the temps to go so high (it quickly hit 90C and even a bit over that in Valley 1440p at 1180/1650MHz at +80/+50mV), I don't know what's going on. Like really, I'm out of ideas here. I will test more with Witcher 3 during the day just to rule out the possibility of this being a fluke, but color me disappointed.









Worst thing is, I can't even find 1.5mm Fujipolys at decent prices. Not only that, but I had some bills to pay this month and I'm broke as hell. God damn it...


----------



## Stige

You don't need to worry even if the memory is not fully covered; the chips don't get hot anyway, so it won't be an issue even if they're only half covered like that.

I'm also pretty sure I stated earlier that 1mm is too thin? The 11W/mK at 1.5mm doesn't cost that much. 14W/mK is overkill I think; 11W/mK should already offer a big improvement over the Phobya pads.
I will probably order some later this week, once I get the money from my second PC and old parts, and replace my Phobya pads with 11W/mK ones.
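The thickness-versus-rating trade-off follows from Fourier's law: the temperature drop across a pad is roughly Q·d/(k·A), so a thinner pad with a lower rating can beat a thicker high-W/mK one, but only if it still makes full contact (an air gap dominates everything else). A back-of-the-envelope Python sketch, where the 30 W and 600 mm² figures are assumptions for illustration, not measured values:

```python
def pad_delta_t(power_w, k_w_per_mk, area_mm2, thickness_mm):
    """Temperature drop in kelvin across a thermal pad (1-D Fourier's law)."""
    area_m2 = area_mm2 * 1e-6          # mm^2 -> m^2
    thickness_m = thickness_mm * 1e-3  # mm -> m
    return power_w * thickness_m / (k_w_per_mk * area_m2)

# Assumed load: 30 W spread over a 600 mm^2 VRM pad.
for k, d in [(7, 1.5), (11, 1.5), (14, 1.0)]:
    print(f"{k} W/mK @ {d} mm: dT = {pad_delta_t(30, k, 600, d):.1f} K")
```

On paper the 1mm 14W/mK pad wins here, which is why a contact gap is the prime suspect when it performs worse than the stock pad in practice.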


----------



## bluej511

Doesn't look like you covered the VRM, only the VRAM. VRAM doesn't have a temp sensor.


----------



## bluej511

Here, what I circled is what the Fujipoly thermal pads should be on/over (smell test was a typo, I meant small test). If you didn't rip the factory pads, put 'em back over the VRAM and use the Fuji on the VRM. Idk how it's cooled on the MSI; I don't see a heatsink. The 3 smaller ones look like they touch the MSI cooler, but the longer line of 'em doesn't look like there's anything over 'em; maybe there's a heatsink we don't see.


----------



## tolis626

Quote:


> Originally Posted by *Stige*
> 
> You don't need to worry even if the memory is not fully covered, they don't get hot anyway so it won't be an issue even if it is only half covered like that.
> 
> I'm also pretty sure I stated earlier that 1mm is too thin? The 11w/mk at 1.5mm doesn't cost that much. 14w is overkill I think, 11w should offer a big improvement over the Phobya pads already.
> I will propably order some later this week once I get the money from my second PC and old parts and replace my Phobya pads with 11w/mk.


Well, I went by what EK says in their manual. That and I could only find 1mm 14W/mK pads. Sigh...

Anyways, I'll probably get 11W/mK 1.5mm ones if I don't figure anything else out. I just hate taking the card apart again and again...
Quote:


> Originally Posted by *bluej511*
> 
> Doesnt look like u covered the vrm only vram. Vram doesnt have a temp sensor


Quote:


> Originally Posted by *bluej511*
> 
> Here what i circled is what the fujilpoly thermal pads should be on/over (smell test, typo meant small test). If you didn't rip the factory pads put em back over the vram and use the fuji on the VRM. Idk how its cooled on the msi i dont see a heatsink or heatsink. The 3 smaller ones look like they touch the msi cooler but the longer line of em doesnt look like theres anything over em, maybe theres a heatsink we don't see.


Well, you got me wrong. The picture was taken right after I disassembled the card. That on the right is the RAM contact plate (it adds rigidity to the PCB too). The cooler is far to the right and you can't see it in the picture; most of the pads are on it too. You're right about where the VRMs are, and that's where I put my pads. The long strip of VRMs (VRM1) is in direct contact with the heatsink, and VRM2 (the small one next to the RAM modules) is in contact with the aforementioned plate. I didn't reassemble the card wrong.









I also didn't mess with the VRAM padding.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluej511*
> 
> Dont forget before and after temps. Hope you got the right size. Id do a smell test before giving her 100% just in case theres an air gap.
> 
> 
> 
> Smell test? What's that?
> 
> Anyways... So something's wrong here. Either temperature reporting is wrong, or 1mm is the wrong size. I can't think of any other reason why a 14W/mK Fujipoly pad can have the same temps as a 7W/mK Phoybya or a crappy whatever it is MSI uses. Thing is, it's about the same as the Phobya, maybe even worse.
> 
> Before I continue, this time I took off the metal plate that covers the RAM. Here it is.
> 
> 
> 
> First off, notice how some memory chips aren't fully covered by a thermal pad. That in itself concerns me. Then there is the thickness of the padding. This is most certainly not 0.5mm padding (as EK states in their manual), but more like 1mm and it seems as low quality as it goes. Then there is the VRM2 pad that's a very thick (looks like 3mm, or 2.5mm but definitely not thinner than that, it's huge) which I replaced with a 1.5mm Phobya and a 1mm Fujipoly pad stuck together with Gelid GC Extreme. I know, not the best idea, but that thing got torn to pieces as I tried to take it off.
> 
> So, unless there is a 0.5mm gap between the VRMs' pad and the heatsink and that's causing the temps (it quickly hit 90C and even a bit over that in Valley 1440p at 1180/1650MHz at +80/+50mV) to go so high, I don't know what's going on. Like really, I'm out of ideas here. I will test more with Witcher 3 during the day just to rule out the possibility of this being a fluke, but color me disappointed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Worst thing is, I can't even find 1.5mm Fujipolys with decent pricing. Not only that, but at the moment I had some bills to pay and I'm broke as hell. God damn it...
Click to expand...

I can't recall any Hawaii on air getting "good" results with Fujipoly. VRM temps did get better as far as I remember, but not by a lot. It seems the only way to fully utilize Fujipoly is with the GPU under water. The Phobya 7W/mK is already a very good pad; I remember someone using it on Hawaii (under water). So it's not surprising you didn't notice any improvement going with Fujipoly. If you can see an impression of the VRMs on the Fujipoly pad, that rules out a contact issue. You can try sandwiching two Fujipoly pads on the VRMs just in case.


----------



## bluej511

I'm on water, but the VRMs are passively cooled and I get better temps than that, so that statement is false lol. If the pad is too thin you've got a gap and it won't cool at all. Someone with the Ares R9 290 got ridiculous temp drops with it.

Check ebay.co.uk, they usually ship worldwide. Aquatuning does too.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Im on water but passively cooled and get better temps then that so that statement is false lol. If the pad is too thin you got a gap and wont cool at all. Someone with the ares r9 290 got ridiculous temp drops on it.
> 
> Check ebay.co.uk they usually ship worldwide. Aquatuning does too.


The ones I got were from pcerb (awesome customer service, by the way) in the UK.









I checked Aquatuning and Amazon, and both require me to pay ridiculous sums to ship to Greece. Like 30-40€ for a 2-pack of pads. God damn ridiculous.
Quote:


> Originally Posted by *kizwan*
> 
> false
> I can not recall any Hawaii on air get "good" result with Fujipoly. VRMs temps did get better as far as I remember but not a lot. It seems the only way to fully utilized Fujipoly is with gpu under water. The Phobya 7W/mK is already a very good pad because I remember someone using it on Hawaii (under water). So it's not surprising you did not noticed any improvement when going with Fujipoly. If you can see impression of VRMs on the Fujipoly pad, that can rule out contact issue. You can try sandwich two fujipoly pads on the VRMs just in case.


Thing is, as reported, my temps are worse than stock. At stock, the max I had seen was 80-85C on the VRMs, and with all three of the alternatives I've tried I've been getting about the same temps, but still consistently worse than stock. Temps are practically the same between the Fujipoly 1mm pads and both Phobya pads (1mm and 1.5mm). The only one left on there long enough was the 1.5mm pad, and that one had the impressions of the VRMs on it (it actually took a while because the Phobya pads are really firm). And on top of all that, my card behaves as if its VRM temps are actually better than before. Like, drastically so. But alas, I don't know what conclusion to draw from all of this, apart from the fact that it's getting frustrating.

For now, I'll game, as I have spare time these days. With the Phobya pads my temps got better after the first few hours of use, I suppose because the high temps softened the pad and that let it make better contact with the VRMs, taking their shape. That's just a theory that makes sense to me, though.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> false
> I can not recall any Hawaii on air get "good" result with Fujipoly. VRMs temps did get better as far as I remember but not a lot. It seems the only way to fully utilized Fujipoly is with gpu under water. The Phobya 7W/mK is already a very good pad because I remember someone using it on Hawaii (under water). So it's not surprising you did not noticed any improvement when going with Fujipoly. If you can see impression of VRMs on the Fujipoly pad, that can rule out contact issue. You can try sandwich two fujipoly pads on the VRMs just in case.
> 
> 
> 
> Thing is, as they're reported, my temps are worse than stock. At stock max I had seen was 80-85C on the VRMs and with all three of the alternatives I've tried I've been getting about the same temps, but still consistently worse than stock. Temps are practically the same between the Fujipoly 1mm pads and both Phobya pads (1mm and 1.5mm). The only one left on there long enough was the 1.5mm pad and that one had the impressions of the VRMs on it (It actually took a while because the Phobya pads are really firm). And on top of all that, my card behaves as if its VRM temps are actually better than before. Like, drastically so. But alas, I don't know what conclusion I can draw from all of this, apart from the fact that it's getting frustrating.
> 
> For now, I'll game as I have spare time these days. With the Phobya pads my temps got better after the first few hours of use. I suppose because high temps led to softening the pad and that allowed it to have better contact with the VRMs, taking their shape. That's just a theory that makes sense to me though.

You should see impressions on the Fujipoly pad right away. If you don't, it isn't making good contact. Try sandwiching two Fujipoly pads together on the VRMs.

I don't know the quality of the VRM temp diode/sensor, so better not to start any conspiracy theories about it.
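The sandwich suggestion trades pad thickness for contact. As a back-of-the-envelope illustration (all numbers below are made up for the example, not measurements from any card), conductance through a pad scales as k·A/t, so stacking two pads roughly doubles the thickness and cuts conductance, yet that can still win if the thinner pad wasn't touching the VRMs at all:

```python
def pad_conductance(k_w_per_mk, area_mm2, thickness_mm):
    """Thermal conductance (W/K) of a flat pad slab: k * A / t."""
    return k_w_per_mk * (area_mm2 * 1e-6) / (thickness_mm * 1e-3)

# Hypothetical 11 W/mK Fujipoly over a ~300 mm^2 VRM strip:
single = pad_conductance(11.0, 300.0, 1.5)      # one 1.5 mm pad
sandwich = pad_conductance(11.0, 300.0, 2.0)    # two 1 mm pads stacked

# The sandwich conducts less per kelvin of delta-T, but contact beats
# conductance: air is ~0.026 W/mK, so even a thick stack that actually
# touches the VRMs vastly outperforms a thin pad with an air gap.
print(single, sandwich)
```

This ignores the extra pad-to-pad interface in the stack, which adds a little more resistance in practice.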


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> The ones I got were from pcerb (awesome customer service, by the way) in the UK.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I checked Aquatuning and Amazon and both require me to pay some ridiculous sums to ship to Greece. Like 30-40€ for a 2-pack of pads. God damn ridiculous.
> Thing is, as they're reported, my temps are worse than stock. At stock max I had seen was 80-85C on the VRMs and with all three of the alternatives I've tried I've been getting about the same temps, but still consistently worse than stock. Temps are practically the same between the Fujipoly 1mm pads and both Phobya pads (1mm and 1.5mm). The only one left on there long enough was the 1.5mm pad and that one had the impressions of the VRMs on it (It actually took a while because the Phobya pads are really firm). And on top of all that, my card behaves as if its VRM temps are actually better than before. Like, drastically so. But alas, I don't know what conclusion I can draw from all of this, apart from the fact that it's getting frustrating.
> 
> For now, I'll game as I have spare time these days. With the Phobya pads my temps got better after the first few hours of use. I suppose because high temps led to softening the pad and that allowed it to have better contact with the VRMs, taking their shape. That's just a theory that makes sense to me though.


Are you sure the block lined back up properly? Some air coolers have inserts that go into the holes around the GPU; if these are out of line, double-check your alignment. And as kizwan stated, there should be imprints in the VRM pads. If there are not, then something is not lined up properly. Be very careful, as the Fuji pads are extremely delicate and tear easily.

I've never seen a case where they don't make a big improvement, but I have misaligned things myself, and if the contact is not good they do not work properly. That is for sure.


----------



## christoph

Hey guys, I have just one question. I was reading the Sapphire webpage, and it says the video card has its UEFI BIOS enabled when the switch is NOT LIT UP.

But since I don't have a UEFI BIOS on my mobo, should I be running the video card in legacy mode, with the switch turned off?


----------



## Mattuz

Actually it's not like that. When the light is on, UEFI is enabled; if you have a UEFI mobo you can use either UEFI or legacy. If you have a legacy BIOS mobo, then don't push the button on the Sapphire card, and it will use its legacy BIOS.


----------



## Minusorange

Well, after a week in I'm impressed with my Asus card. The cooling on it is actually pretty decent; I prefer it over the Sapphire Tri-X cooling, as it's not only quieter, but the VRMs also stay nice and cool.

GPU core on load never goes above 69°C at 50% fan speed
VRMs hover around the 65°C mark

GPU core when idle is around the 38°C mark
VRMs around the 33°C mark

The only minor annoyance with the card is the stupid light that fades in and out on the side, making it look like a neon shop sign rather than something cool-looking, but I can live with it.

I'd definitely recommend the card to anyone looking to buy: it's well built, has good cooling and a 3-year warranty, so you can't really go wrong.


----------



## christoph

Quote:


> Originally Posted by *Mattuz*
> 
> Actually is not like that. When the light is on the UEFI is enabled, if you have a UEFI mobo you can either use the uefi or the legacy. If you have a BIOS mobo then you'll have to not push the button on the sapphire card, so it will use it's legacy bios.


Please read the Sapphire webpage again, because I thought that too. And guess what? With the light on the video card's switch (which I believe is legacy BIOS), the coil whine went away...

Can anyone confirm this? Read the Sapphire web page.


----------



## patriotaki

Quote:


> Originally Posted by *christoph*
> 
> please, read the Sapphire webpage again, cuz I thought that too; and guess what? with the light on of the switch of the video card (which I believe is legacy bios) the coil whine went away...
> 
> anyone please confirm this, read the Sapphire web page


Is there something similar on my PCS+ 390? Maybe there's a way to avoid coil whine?


----------



## jdorje

Can you use regular thermal paste on the VRAM? Seems like that would be convenient if so.

Also, I was playing FO4 at 1440p and getting 40 fps. I googled optimal settings and that's what it gave me. Switching to high/medium brought no real improvement. It turns out the fps was being creamed by having a second (desktop) monitor on. After turning that off, the "optimal settings" (which I think are closer to ultra than high) give ~65 fps. I thought FO4 was locked at 60? Regardless, it seems like my 390 crushes this game at 1440p, as it does with so many others (everything except The Witcher 3).

Hilariously, the game looked quite decent at 45 fps. FreeSync is awesome.


----------



## Mattuz

I have that card right here, and with my previous mobo, which wasn't UEFI, the card worked only with the light turned off. Now with the new mobo it works fine with both on and off.

_"This product supports both Legacy and UEFI BIOS. By factory default, the card itself is under Legacy mode. By pressing the button with SAPPHIRE logo, UEFI mode will be easily enabled."_

They made a mistake in the photo; the description is right.


----------



## bluej511

You've got to install Windows in UEFI mode though, or else it won't work. I might do a fresh install one of these days and have it in UEFI mode.


----------



## patriotaki

How can I check which mode my GPU is running in?


----------



## bluej511

Honestly, if you installed Windows and didn't do it in UEFI, then it's in legacy, haha.


----------



## patriotaki

Quote:


> Originally Posted by *bluej511*
> 
> Honestly if you installed windows and didn't do it in UEFI then its in legacy haha.


... lol

How do I install Windows in UEFI mode?


----------



## patriotaki

I'm on legacy, just checked. If I install Windows in UEFI mode, will this solve the coil whine?


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> ....
> lol
> 
> how do i install windows in uefi


I had to look it up too; it looks identical to how I installed it in the first place. Maybe you have to set the BIOS to UEFI-only and then install Windows, who knows.


----------



## patriotaki

Can anyone confirm that the coil whine goes away?


----------



## Mattuz

But I thought that just clicking the button changes the UEFI mode. Why are you saying that you have to install the OS with UEFI or you can't turn it on?


----------



## christoph

So the card works in my computer with legacy Windows whether UEFI is enabled or not. When I check the GPU-Z tab, does the checkbox mean UEFI IS enabled, or just that the card HAS UEFI support?

With the button on, the UEFI box is not checked, and with the button off, the UEFI box is checked...


----------



## Stige

Quote:


> Originally Posted by *patriotaki*
> 
> can anyone confirm that coil whine goes away?


It doesn't.


----------



## patriotaki

Why all the hassle then for using it in UEFI?


----------



## christoph

Quote:


> Originally Posted by *christoph*
> 
> the card then, it works UEFI or not enabled in my computer with windows legacy


Quote:


> Originally Posted by *patriotaki*
> 
> why all the hassle then for using it on uefi?


That's what I'm trying to figure out...

The card works with UEFI enabled or not on my legacy Windows 7 64-bit...


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> why all the hassle then for using it on uefi?


Supposedly it boots faster. I just hate the 1-2 second black screen before the welcome screen; I wonder if UEFI gets rid of it.


----------



## patriotaki

Quote:


> Originally Posted by *bluej511*
> 
> Boots faster supposedly. I just hate the 1-2 sec black screen before the welcome screen wonder if uefi gets rid of it


No performance gain on the 390? Or avoiding coil whine?


----------



## mus1mus

Why would UEFI avoid coil whine?


----------



## christoph

well guess what?


----------



## christoph

Quote:


> Originally Posted by *mus1mus*
> 
> Why would UEFI avoid coil whine?


Because it does not.

I spoke too soon.


----------



## mus1mus

And it won't. Ever.


----------



## christoph

Quote:


> Originally Posted by *mus1mus*
> 
> And it won't. Ever.


but did you read my post about the tech support?


----------



## mus1mus

Quote:


> Originally Posted by *christoph*
> 
> but did you read my post about the tech support?


Nothing in there talks about UEFI and coil whine.


----------



## christoph

Quote:


> Originally Posted by *mus1mus*
> 
> Nothing in there talks about UEFI and COIL Whine.


Yes, BUT it says that with the button ON it's in legacy mode.


----------



## kizwan

Quote:


> Originally Posted by *christoph*
> 
> the card then, it works UEFI or not enabled in my computer with windows legacy, and check in the GPU-Z tab, that it says is UEFI enabled ?? or is it IF JUST have Uefi support?
> 
> with the button on, the UEFI box is not checked, and with the button off the UEFI box is checked...


The GPU's UEFI BIOS is basically a GPU UEFI (GOP) driver for the UEFI boot process. It's independent of Windows or any OS: once the BIOS/UEFI boot process completes, it hands over to Windows/the OS. You can have Windows/the OS boot in either MBR (legacy) or UEFI mode.
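Since the GOP driver only matters during boot, which mode the OS itself booted in is a separate question (the thing patriotaki was checking above). A minimal sketch of the usual check; the path and function name here are just illustrative. On Linux the kernel exposes `/sys/firmware/efi` only on UEFI boots, while on Windows you would look at "BIOS Mode" in msinfo32 or at whether `bcdedit` shows `winload.efi` rather than `winload.exe`:

```python
import os

def boot_firmware_mode(efi_dir="/sys/firmware/efi"):
    # This directory exists only when the Linux kernel was started via
    # UEFI firmware; its absence means a legacy/CSM boot.
    return "UEFI" if os.path.isdir(efi_dir) else "Legacy (CSM)"

print(boot_firmware_mode())
```

Note this reports the OS boot path, not which VBIOS (legacy VGA vs. UEFI GOP) the card itself presented to the firmware.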


----------



## christoph

Quote:


> Originally Posted by *kizwan*
> 
> GPU UEFI BIOS is basically GPU UEFI driver for (UEFI) BIOS boot process. It's independent from windows or OS. Once BIOS/UEFI BIOS boot process completed it will hand over to windows/OS. You can have either windows/OS boot in MBR or UEFI mode.


That I knew already. But what if I boot into legacy Windows, with the mobo's legacy BIOS (not UEFI mode, because I don't have it), WITH the video card in UEFI mode? Isn't the legacy BIOS not supposed to recognize the UEFI video card?


----------



## kizwan

Quote:


> Originally Posted by *christoph*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> GPU UEFI BIOS is basically GPU UEFI driver for (UEFI) BIOS boot process. It's independent from windows or OS. Once BIOS/UEFI BIOS boot process completed it will hand over to windows/OS. You can have either windows/OS boot in MBR or UEFI mode.
> 
> 
> 
> that' I knew already, but what If I boot into legacy windows, with a MOBO's legacy bios (not UEFI mode cuz don;t have it) WITH the video card in UEFI mode? isn't suppose the legacy bios not to recognize the UEFI video card?

I don't have a copy of your BIOS, so I don't know how your computer is going to react with the GPU in UEFI mode. Even if the GPU only has the UEFI GOP driver, your computer should still be able to boot into Windows; your screen just won't show anything until Windows has loaded. If the GPU has both legacy VGA and UEFI GOP BIOS in the same ROM, your computer will automatically use the legacy VGA BIOS.


----------



## jdorje

Quote:


> Originally Posted by *christoph*
> 
> well guess what?


Sapphire tech support people don't speak English?


----------



## christoph

Quote:


> Originally Posted by *jdorje*
> 
> Sapphire tech support people don't speak English?


----------



## Mattuz

Guys, they just made a mistake in the photo, but the description of the photo is right: when you push the button it's in UEFI and the light is on. With my Crosshair IV Formula it didn't boot with the card in UEFI (light on) and worked fine when the light was off (legacy), so it seems a little bit obvious to me that it's just the photo that's wrong.


----------



## Stige

Just ordered some 11 W/mK Fujipoly for my card.
http://www.ebay.com/itm/Fujipoly-Thermal-Pads-1-5mm-11W-mK-for-GPU-CPU-LED-XBOX-PS3-PC-Laptop-Heatsink-/181701216935?hash=item2a4e3c8aa7:g:z~UAAOSw9r1V-Ai-

Hoping for great results! That strip should just about cover the VRM area for my card.


----------



## tolis626

Would sandwiching together two 1mm Fujipoly pads work? I mean, at least as a temporary solution. I suppose it would, seeing how soft and pliable they are, but I dunno...


----------



## Stige

Doesn't hurt to try? It would still be better than making very little contact, if any at all, if you ask me.


----------



## christoph

Quote:


> Originally Posted by *Mattuz*
> 
> Guys, they just made a mistake in the photo but the description of the photo is right, when you push it is in UEFI and the light is on. With my crosshair IV formula it didn't boot with the card in uefi (light on) and worked fine when the light was off (legacy), so it seems a little bit obvious to me that is just the photo that is wrong


Did you read the tech support email that I quoted? They're telling me that with the button on (blue light) it's in legacy mode. I'd have believed it's the other way around too, but my computer boots either way.


----------



## christoph

Quote:


> Originally Posted by *tolis626*
> 
> Would sandwiching together two 1mm Fujipoly pads work? I mean, at least as a temporary solution. I suppose it would, seeing how soft and pliable they are, but I dunno...


I don't see why it wouldn't work


----------



## christoph

One more time, for those whose cards didn't work in legacy mode.


----------



## bluej511

Stige told you so. I think I'll get some Kryonaut for bare die.

http://www.overclock.net/t/1596338/photos-of-liquid-pro-bonding-leaving-solid-residue/0_20#post_25040745


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Stige told you so. I think ill get some Kyronaut for bare die.
> 
> http://www.overclock.net/t/1596338/photos-of-liquid-pro-bonding-leaving-solid-residue/0_20#post_25040745


It seems you have some issues with understanding what you have read.

I have ALWAYS said that CLU does not harden between the IHS and die, which is correct, because it just doesn't do that. This guy is also using CLP and not CLU. CLP does harden faster than CLU, and maybe to a more severe effect; I have never had any issue with CLU. It does harden, but it never sticks things together like that.

I'm taking my setup apart tomorrow in preparation for my Skylake parts and I doubt I will have any issues with it.

If you put anything but CLU between the IHS and die, you are making a massive mistake.

TL;DR: Use CLU between IHS/block, not CLP.

Get your facts straight, and perhaps read my posts again before trying to put words in my mouth that are clearly not mine...


----------



## bluej511

You really are idiotic, it's amazing. CLU, CLP, whatever you want to call it: it's liquid METAL. See the main word, metal? The same **** that's in solder. Just like solder, CLU/CLP will leave a residue. This is from the DOZENS of posts I've read here and elsewhere.

Like someone else posted, going bare die will ALWAYS give better results. You don't seem to understand how insulation works, or heat transfer, or thermodynamics. Any layer (copper/aluminum/brass/nickel, whatever) is another layer for the heat to go through. The IHS acts as a heatsink, but it can't dissipate heat as fast as you think. The CLU on the bottom conducts more wattage than the paste on top; what do you think happens with that excess heat?

Just because you haven't had issues doesn't mean issues don't exist. I'm stable at 1.155V for 4.5GHz; that doesn't mean everyone else will be. Give it a rest, man, honestly, you're making yourself look dumb. From thinking VRAM/RAM doesn't need cooling... go check the difference between water-cooled RAM and passively cooled RAM, then talk.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> You really are idiotic its amazing. Clu clp wtv you wanna call it its liquid METAL. See the main word metal? The same **** thats in solder. Just like solder clu/clp will leave a residue. This is from the DOZENS of posts ive read here and otherwise.
> 
> Like someone else posted going bare die will ALWAYS give better results. You dont seem to understand how isolation works or heat travel or thermodynamics. Any layercopper/aluminum/brass/nickel wtv is another layer for the heat to go thru. The ihs acts as a heatsink but cant dissapate heat as fast as you think. The clu on the bottom dissapates more wattage then the paste on top. What do you happens with that excess heat?
> 
> Just because you havent had issues doesnt mean issues dont exist. Im stable at 1.155v for 4.5 ghz. Doesnt mean everyone else will be. Give it a rest man honestly youre making yourself look dumb. From thinking vram/ram doesnt need cooling. Go check the difference between water cooled ram and passively cooled ram then talk.


Why do you have to be so mad just because you are wrong? And then you try to twist my words into something I have never said.

1. I have ALWAYS said that direct-to-die vs. CLU on the inside and outside of the IHS is a negligible difference, not enough to make direct-to-die worth the effort that it is. I have tried both and I know: the difference won't be more than 1-2°C at most, which won't matter in most cases unless you run something like the Intel stock cooler, lol.

2. Also, again, you are talking about CLP/CLU hardening between the IHS and block, when I have always said that it *DOES NOT HARDEN BETWEEN IHS AND DIE*.

I found these in my archives; this was CLP between the IHS and CPU block. I have never claimed it doesn't do that between those two... This was after a year or so. I still had no trouble taking it off, but I also never said CLP doesn't fuse them together; I said CLU should not do that.
Strike that: I just checked, and that was this 3570K I have now and not the 2500K I thought it was. I should have known when it's only the IHS there, lol... So it was CLU and not CLP, which does harden, but it shouldn't fuse them together permanently like CLP might. But I never had issues with CLP either, and I had it between my 2500K and this same CPU block for a year without swapping it.





These are from over 2 years ago.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> Why you have to be so mad just because you are wrong? Then you are trying to twist my words to something I have never said.
> 
> 1. I have ALWAYS said that direct-to-die vs. CLU on inside and outside the IHS = neglible difference, not enough to make direct-to-die worth the effort that it is. I have tried both and I know, the difference won't be more than 1-2C at most, which won't make a difference in most cases unless you run something like the intel stock cooler lol
> 
> 2. Also again, you are talking about CLP/CLU hardening between the IHS and Block when I have always said that it *DOES NOT HARDEN BETWEEN IHS AND DIE*
> 
> I found these from my archives, this was CLP between the IHS and CPU Block. I have never claimed it does not do that between those two... This was after a year or so. Still had no troubles taking it off but I also never said CLP doesn't fuse them together, I said CLU should not do that.
> Strike that, I just checked and that was this 3570K I have now and not the 2500K I thought it was, should have known when it's only the IHS there lol... So it was CLU and not CLP there, which does harden but it shouldn't fuse them together permanently like CLP might. But I never had issues with CLP either and I had it between my 2500K and this same CPU block for a year without swapping it.
> 
> 
> 
> 
> 
> These are from over 2 years ago.


So strike that, and yeah, you were wrong. That looks awful. By the way, it's supposed to harden the same way solder does; if it stayed a liquid it would run off the die and between the dies and the IHS. You don't know how soldering and flux work, so I'll forgive you.

In short: YOU WERE WRONG, IT IS CORROSIVE!!!!!!!!!!!!


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> So strike that and yea you were wrong. That looks awful. Btw its supposed to harden same way solder does. If it stayed a liquid it would run off the die between the dies and ihs. You dont know how soldering and flux works so ill forgive you.
> 
> In short. YOU WERE WRONG IT IS CORROSIVE!!!!!!!!!!!!


Here, this is just for you, since it seems you're too dumb to read:

The image says: *DOES NOT HARDEN BETWEEN IHS AND DIE*

Do you understand now that it only says that in this post?


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> So strike that and yea you were wrong. That looks awful. Btw its supposed to harden same way solder does. If it stayed a liquid it would run off the die between the dies and ihs. You dont know how soldering and flux works so ill forgive you.
> 
> In short. YOU WERE WRONG IT IS CORROSIVE!!!!!!!!!!!!


Quote:


> Originally Posted by *Stige*
> 
> Here is just for you cause you are too dumb to read it seems:
> 
> 
> The image says: *DOES NOT HARDEN BETWEEN IHS AND DIE*
> 
> Do you understand now that it only says that in this post?


Jeez guys. Can we just give it a rest?


----------



## mus1mus

Agreed. If you have personal views to hash out, do it in PMs, as your rants at each other are rather a NUISANCE for the other guys who visit and frequent this thread.

There's a reason forum private messaging was created.


----------



## dagget3450

Quote:


> Originally Posted by *mus1mus*
> 
> Agree. If you have your own personal views to talk about, do it in PM's. As your rants at each other are rather NUISANCE for other guys who visit and frequent here.
> 
> There's a reason forum Private Messaging was created.


I dunno, listening to two people argue about Hardening can be stimulating.


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> I dunno, listening to two people argue about Hardening can be stimulating.


That all depends on how long you have been Hard.









Okay... back on topic. The RMA process has begun. Yay. Sucks so much. Hope my card gets fixed or replaced. Fixed, hopefully; I really don't want a turd replacement.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dagget3450*
> 
> I dunno, listening to two people argue about Hardening can be stimulating.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That all depends on how long you have been Hard.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Okay... back on topic. RMA process has begun. Yay. Sucks a much. Hope my card gets fixed or replaced. Fixed hopefully. I really don't want a turd replacement.



RMA is always a pain. And yep, a replacement is still a gamble.

My 290 was replaced with a 290X, but it can only do 1250 on the core and has Elpida memory.


----------



## leslie0880

Hi guys,
I've got a Sapphire Tri-X 390X and I'm trying to find a waterblock for it, but I can't find one. Can anyone help?


----------



## mus1mus

Look for the Alphacool Block.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Look for the Alphacool Block.


Yeah, pretty much the only one for the Sapphire. It's actually not too bad; my VRMs are under 60°C at stock speeds even with its ****ty thermal pads. Order some 7 or 11 (see what I did there, lol) W/mK thermal pads, 1.5mm, and use them for the VRMs. It's easier installing them now than afterwards.


----------



## dagget3450

Quote:


> Originally Posted by *Scorpion49*
> 
> So I swapped the cooler out just now. The reference 290X has a much different cooler than this 390X. Check it out:
> 
> All ready to go!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Reference 290X heatsink:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 390X heatsink:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I first did the small heatsinks as the instructions said, but I added a few extras in some spots since I had them, but then I had to change out some of the ones on the VRAM because they interfered with the heat pipes.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Topped it all off with a pair of AP-45's, running at 7V with my fan controller they move more air than any 3rd party graphics card I have ever encountered and nearly silent (can only hear them with the case open).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Surprisingly it didn't sag much, but I put a little prop in there anyways just for safety. Going to start testing temps now.
> 
> 
> Spoiler: Warning: Spoiler!


Did you do any testing on this XFX reference cooler? I was curious how it did, because in the pic it looks like they shrunk the heatsink. Also, I'm seriously thinking about getting some of these for my EK 290X copper blocks; mine are the old original version and I think these will fit perfectly.


----------



## leslie0880

Quote:


> Originally Posted by *mus1mus*
> 
> Look for the Alphacool Block.


OK... but I'd rather not, because of the looks, haha.
Anyway, thanks for the reply, bro.

Quote:


> Originally Posted by *bluej511*
> 
> Yea pretty much the only one for sapphire. Its actually not too bad. My VRMs are under 60C stock speeds with their ****ty thermal pads. Order some 7 or 11 (see wat i did there lol) w/mk thermal pads 1.5mm use em for the vrm. Easier installing em now then after.


Really? Maybe I'll try it. I also found a Chinese brand named BYSKI.
I'm actually looking for two blocks for CF.


----------



## Transmaniacon

Sapphire R9 390X Nitro with Hitman for $390, good deal? My 7950 is showing its age and I would like something that can handle 1440p in the future when I get a new monitor.


----------



## Mister300

I have an XFX 390X and it runs the new AOC 4K 28-inch 60Hz monitor very well, and it's FreeSync too.


----------



## Scorpion49

Quote:


> Originally Posted by *dagget3450*
> 
> Did you do any testing on this XFX Ref cooler? I was curious how it did because in the pic it looks like they shrunk the heatsink. Also, i am seriously thinking about getting some of these for my EK 290x copper blocks mine are old original ver and i think these will fit perfectly.


The stock heatsink is about what you would get from a reference 290X. It does cool better, thanks to the lower voltages involved on the more mature chips, but it still gets to 94°C and sits there. Unlike a 290X though, 40% fan speed will keep it at 90°C and not throttling.

Any 290/290X block that fits reference cards should fit it; it seems to be an identical PCB with a 390X core and memory ICs strapped to it.


----------



## mus1mus

Quote:


> Originally Posted by *leslie0880*
> 
> ok..... but no prefer since the outlook .........
> haha
> anyway thx reply bro
> really? maybe will try on it~
> since find a china brand, name BYSKI oso
> i m looking 2 block for CF actually


Those Chinese blocks were floating around AliExpress. I'm kind of intrigued by them too.

Maybe you can try one and let us know.


----------



## bluej511

Quote:


> Originally Posted by *Scorpion49*
> 
> Stock heatsink is about what you would get from a reference 290X. It does cool better, thanks to the lower voltages involved on the more mature chips but it still gets to 94C and sits there. Unlike a 290X though, 40% fan speed will keep it at 90C and not throttling.
> 
> Any 290/X block that fits reference cards should fit it, it seems to be identical PCB with a 390X core and memory IC's strapped to it.


I'd be careful with that though; the DVI port assembly seems to be slightly different. I'd always check with the manufacturer to make sure it fits.

I've got to admit those Bykski blocks look pretty good; I just don't trust purchasing from those sites.


----------



## Scorpion49

Quote:


> Originally Posted by *bluej511*
> 
> Id be careful with that though the dvi port assembly seems to be slitghly different. Id always check the manufacturer make sure it fits.
> 
> I gotta admit those byksi blocks look pretty good, i just dont trust purchasing from those sites.


They are identical. This is a reference AMD stamped PCB.


----------



## mus1mus

The company I work for purchased a sheetload of toners from a seller on AliExpress. We never had an issue, and their support is quite good to be honest. Their English is not too bad either.

But it's a different thing, I know. I might as well try buying a block for my 290X from the site one day. But then, this thing already cools my 290X too darn well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> Sapphire R9 390X Nitro with Hitman for $390, good deal? My 7950 is showing it's age and I would like something that can handle 1440P in the future when I get a new monitor.


That's not a bad price, but don't expect much from Hitman until they fix the damn thing. The game won't run more than 5-10 minutes in DX12 without crashing. That seems to be the story for everyone so far. The benchmark in it is nice, though, lol.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's not a bad price, but don't expect much from Hitman until they fix the damn thing. Game won't run more thna 5-10 minutes in DX12 without crashes. Seems to be the story for everyone so far. The benchmark on it is nice though, lol


Yeah I have heard there are some DX12 issues? I wasn't planning on buying it but did enjoy the previous games in the series. I just want a card that will max 1080P games, and be solid at 1440P.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> Yeah I have heard there are some DX12 issues? I wasn't planning on buying it but did enjoy the previous games in the series. I just want a card that will max 1080P games, and be solid at 1440P.


I play everything on my 390x @ 4k with 35-60 FPS pretty much max settings, no AA


----------



## Agent Smith1984

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I play everything on my 390x @ 4k with 35-60 FPS pretty much max settings, no AA


1440p is fantastic on these cards!!


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I play everything on my 390x @ 4k with 35-60 FPS pretty much max settings, no AA


Well that is good to know! Seems like it should be a pretty substantial upgrade over my 7950, I average around 30-45 FPS on high at 1080P in The Division, and it's not as smooth as I would like.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 1440p is fantastic on these cards!!


I am debating between 1440P or 1080P at 144Hz, I do play a lot of FPS games (not competitively) and would like the smooth gameplay, but a bigger resolution/screen is more immersive. I currently just have a basic Asus 24" 1080P TN panel.


----------



## m70b1jr

Quote:


> Originally Posted by *Transmaniacon*
> 
> I am debating between 1440P or 1080P at 144Hz, I do play a lot of FPS games (not competitively) and would like the smooth gameplay, but a bigger resolution/screen is more immersive. I currently just have a basic Asus 24" 1080P TN panel.


I have an Acer 1080p monitor, and I overclocked it to 70Hz. In all honesty, I wish I would've invested in a 4k monitor. If I were you, I would save for a 4k monitor and just overclock it for a bit more Hz. Oh, also go for a FreeSync panel. My friends say that low FPS looks a lot smoother with FreeSync, if that makes any sense.


----------



## Transmaniacon

Quote:


> Originally Posted by *m70b1jr*
> 
> I have an acer 1080p monitor, and overclocked it to 70Hz. In all honesty, I wish I would've invested into a 4k monitor. If i were you, I would save for a 4k monitor, and just overclock it for a bit more Hz.


Even with the lower FPS though?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> I am debating between 1440P or 1080P at 144Hz, I do play a lot of FPS games (not competitively) and would like the smooth gameplay, but a bigger resolution/screen is more immersive. I currently just have a basic Asus 24" 1080P TN panel.


My 390X at my daily clocks of 1100/1600 on stock voltage is almost twice as powerful as my son's 7950 @ 1100/1500.

I personally don't see any benefit in 144+ FPS, but maybe that's just my untrained eye... I do enjoy 60+ FPS in shooters, though. The other stuff like GTA V and Witcher 3 etc. is just fine for me in the 40 FPS range... I prefer a bigger, higher-res screen capable of 120Hz over a more expensive smaller screen that does 144Hz...


----------



## m70b1jr

Quote:


> Originally Posted by *Transmaniacon*
> 
> Even with the lower FPS though?


Get a smaller 4k screen (23 - 28 inch) and play graphically intensive games at 1440p or 1080, and play games like CSGO, TF2, and War thunder @ 4k.


----------



## bluej511

I was thinking 1440 with FreeSync, but I've always wanted ultrawide, so that's what I'm going for next. It has more pixels, but the R9 390 should have no problem. I'm waiting to see the price on the new 29" curved ultrawide from LG; if it's too much, I'm going with the 29UM68. The price went up over the 67, which is a shame, but the FreeSync range is 40-75, and with a custom driver it could maybe go down to 30-35.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My 390X at my daily clocks of 1100/1600 on stock voltage is almost twice as powerful as my son's 7950 @ 1100/1500.
> 
> I personally don't see any benefit in 144+ FPS, but maybe that's just my untrained eye... I do enjoy 60+ FPS in shooters, though. The other stuff like GTA V and Witcher 3 etc. is just fine for me in the 40 FPS range... I prefer a bigger, higher-res screen capable of 120Hz over a more expensive smaller screen that does 144Hz...


Wow this should be a huge upgrade then, I don't even have my 7950 clocked that high... I have never gamed on a high refresh monitor, but a lot of people can't go back from the "buttery smooth". I think I would probably end up with a 1440P monitor though, more immersive and as long as I can get around 60FPS I would be happy.

Also nice to see a fellow Raleigh OCNer.


----------



## Transmaniacon

Quote:


> Originally Posted by *bluej511*
> 
> I was thinking 1440 with FreeSync, but I've always wanted ultrawide, so that's what I'm going for next. It has more pixels, but the R9 390 should have no problem. I'm waiting to see the price on the new 29" curved ultrawide from LG; if it's too much, I'm going with the 29UM68. The price went up over the 67, which is a shame, but the FreeSync range is 40-75, and with a custom driver it could maybe go down to 30-35.


Is there a sweet spot in terms of FPS for free-sync?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> Wow this should be a huge upgrade then, I don't even have my 7950 clocked that high... I have never gamed on a high refresh monitor, but a lot of people can't go back from the "buttery smooth". I think I would probably end up with a 1440P monitor though, more immersive and as long as I can get around 60FPS I would be happy.
> 
> Also nice to see a fellow Raleigh OCNer.


Oh dude, you are in Raleigh??

SWEET

Not a lot of PC gamers around here to my knowledge.....


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh dude, you are in Raleigh??
> 
> SWEET
> 
> Not a lot of PC gamers around here to my knowledge.....


Yeah I don't know a ton of people that do, but there was a big games convention here last year IIRC.


----------



## bluej511

Quote:


> Originally Posted by *Transmaniacon*
> 
> Is there a sweet spot in terms of FPS for free-sync?


Yeah, just stay inside that range. I hear that below it you just get lag/stutter, but then again I don't ever go under 40. I'd just use FRTC (Frame Rate Target Control) to cap at 74 and bam.


----------



## Transmaniacon

Quote:


> Originally Posted by *bluej511*
> 
> Yeah, just stay inside that range. I hear that below it you just get lag/stutter, but then again I don't ever go under 40. I'd just use FRTC (Frame Rate Target Control) to cap at 74 and bam.


Thanks! Seems like the 390X should be able to manage that just fine at 1440P, that might be the route I end up going.


----------



## bluej511

Quote:


> Originally Posted by *Transmaniacon*
> 
> Thanks! Seems like the 390X should be able to manage that just fine at 1440P, that might be the route I end up going.


FreeSync 1440 and you're golden. 1440s are too much for my taste though; I'd just get a 4k and run it at 1440, same price.


----------



## Transmaniacon

Well, the damage is done. Have a Sapphire R9 390X Nitro on the way!
Quote:


> FreeSync 1440 and you're golden. 1440s are too much for my taste though; I'd just get a 4k and run it at 1440, same price.


Is there any downside to running it at a lower resolution?


----------



## bluej511

Quote:


> Originally Posted by *Transmaniacon*
> 
> Well, the damage is done. Have a Sapphire R9 390X Nitro on the way!
> Is there any downside to running it at a lower resolution?


Never done it, but all I could think of would be reading text; it might be a bit more pixelated.

Has anyone tried 1440 on their 4k to chime in?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> Well, the damage is done. Have a Sapphire R9 390X Nitro on the way!
> Is there any downside to running it at a lower resolution?


Well, 4k is absolutely stunning when compared even to 1440p, however there is some downside to 4k....

First, you will need a Club3D adapter to get 60Hz at 4k, whereas these cards will do 1440p at 60Hz+ no problem.... Also, the performance takes quite a hit, but if you are okay with ~45FPS (you can still get over 60FPS at 4k in most games if you lower texture settings some), then a large 4k is the way to go. I have fallen in love with 4k and would not take the additional 60Hz of 1440 over it, no way, no how.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, 4k is absolutely stunning when compared even to 1440p, however there is some downside to 4k....
> 
> First being, you will need a club3d adapter to get 60hz at 4k where these cards will do 1440p at 60hz+ no problem.... Also, the performance takes quite a hit, but if you are okay with ~45fps (you can still get over 60FPS at 4k in most games if you lower texture settings some), then a large 4k is the way to go. I have fallen in love with 4k and would not take the additional 60hz of 1440 over it, no way-no how.


Yeah I would imagine 4K is pretty stunning, I think I would probably start with 1440P as I would really like the higher frames.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Never done it but all i could think of would be reading text, might be a bit more pixelated.
> 
> Has anyone tried 1440 on their 4k to chime in?


1440 looks great on my 55" LG 4k TV; hell, even upscaled 720p video looks pretty good, but a lot of that has to do with the post-processing of the TV itself. Not sure how much the GPU is doing on its end... I have seen upscaling vary quite a bit between different manufacturers. I know it's a BIG DEAL for TV manufacturers to make sure the upscaling is good, but I'm not sure the PC monitor segment does anything to improve upscaling. I'll never play on anything other than a large TV again, but that's just me... some prefer the desk/monitor experience... I like couch, beer, wireless controls, and a big 4k TV.


----------



## m70b1jr

Quote:


> Originally Posted by *Transmaniacon*
> 
> Yeah I would imagine 4K is pretty stunning, I think I would probably start with 1440P as I would really like the higher frames.


Lowering your resolution to 1440 or even 1080p on a smaller monitor, you won't notice much of a difference at all because of the high PPI, as long as the screen size is pretty small (23 - 28 inches). I'd still go for a 4k, overclock the Hz on it, and play more demanding games at 1440p.
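The PPI point is easy to sanity-check with a quick back-of-the-envelope calculation. A minimal sketch (the 24" 1080p and 28" 4k sizes are just the example panels mentioned in this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch = diagonal pixel count / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_in

# A basic 24" 1080p panel vs a 28" 4k panel
print(round(ppi(1920, 1080, 24)))   # ~92 PPI
print(round(ppi(3840, 2160, 28)))   # ~157 PPI
```

Even running the 28" 4k panel at non-native 1440p, the pixel density stays well above the 24" 1080p screen, which is why the downscaled image still looks sharp at small sizes.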


----------



## bluej511

Sorry fellas, ultrawide for me; since the day I first saw them I've wanted one.


----------



## Mister300

My adventure with 4K

If you read all the noise on the net they all say that there is no single card option for 4K. Not true.

The XFX 390X I own, set to factory clocks, has enough VRAM to load textures at 4K at decent FPS. I was ready to venture into an LG FreeSync ultra-wide, but came across an AOC 4K 60Hz TN 28" for 350 USD at my local Microcenter, so I pulled the trigger. Do not start flaming the TN panel; it is the same one used by Samsung in their $700 version.

I do scientific imaging for a living and have seen many displays, including a $20,000 Eizo which will correct any issues you may have, at a price beyond affordable. My AOC has one flaw: a single stuck pixel on the edge, in a spot where it is not noticeable. Yes, there is minor off-axis color shift.

I also have a 4K TV for other viewing; it has a decent scaler, as Agent Smith noted with his experience.

There are some minor desktop scaling issues with Win 10 and 4K; however, all program windows and text are crisp and sharp (Word, Excel, PowerPoint, and the Adobe Photoshop CS suite).

My gaming target is 40 FPS minimum; anything lower and FreeSync drops out unless you mod the drivers. I use V-sync or frame rate target control to avoid tearing on some titles when above the refresh rate.
I also turn off AA at 4K, personal preference.

I game on ultra on Verdun @ 75 FPS,
Multiplayer Crysis 3 @ 60 FPS very high settings,
Star Wars BF on ultra 60 FPS steady and higher if V sync is disabled,
Need For Speed 60 FPS ultra, med AO, Gorgeous in 4K
Metro LL 50 FPS very high, no PhysX, no SSAA,
COD BO3 90 FPS maxed out
Alien Isolation ultra 60 FPS min
Bioshock inf ultra 60 FPS min
Titanfall ultra 60 FPS with AO turned down a bit.
Dirt Rally ultra 60 FPS steady.

For the price it is highly recommended; my son has an Asus Strix 970 and there are no affordable 4K G-Sync displays. Go team Red!!!!!

I will link to some videos later for those interested.

Thanks


----------



## m70b1jr

Can someone tell me how bad of an idea it would be to put my CPU radiator in ice cold water?


----------



## Transmaniacon

Quote:


> Originally Posted by *Mister300*
> 
> My adventure with 4K
> 
> If you read all the noise on the net they all say that there is no single card option for 4K. Not true.
> 
> The XFX 390X I own, set to factory clocks, has enough VRAM to load textures at 4K at decent FPS. I was ready to venture into an LG FreeSync ultra-wide, but came across an AOC 4K 60Hz TN 28" for 350 USD at my local Microcenter, so I pulled the trigger. Do not start flaming the TN panel; it is the same one used by Samsung in their $700 version.
> 
> I do scientific imaging for a living and have seen many displays including a $20,000 Eizo which will correct any issues you may have at a price beyond affordable. My AOC has one flaw, a single edge stuck pixel which is in a spot where it is not noticeable. Yes there is minor off axis shift of colors
> 
> I also have a 4K TV for other viewing, it has a decent scaler as Agent Smith noted with his experience.
> 
> There are some minor desktop scaling issues with Win 10 and 4K, however all program windows and text are crisp and sharp.(word, excel, PP and Adobe Photoshop CS suite).
> 
> My gaming target is 40 FPS min, anything lower the free sync drops out unless you mod the drivers. I use V sync or target frame control to avoid tearing on some titles if above refresh rate.
> I also turn off AA at 4K, personal preference.
> 
> I game on ultra on Verdun @ 75 FPS,
> Multiplayer Crysis 3 @ 60 FPS very high settings,
> Star Wars BF on ultra 60 FPS steady and higher if V sync is disabled,
> Need For Speed 60 FPS ultra, med AO, Gorgeous in 4K
> Metro LL 50 FPS very high, no PhysX, no SSAA,
> COD BO3 90 FPS maxed out
> Alien Isolation ultra 60 FPS min
> Bioshock inf ultra 60 FPS min
> Titanfall ultra 60 FPS with AO turned down a bit.
> Dirt Rally ultra 60 FPS steady.
> 
> For the price it is highly recommended, my son has a Asus Strix 970 and there are no affordable 4K g sync displays, go team Red!!!!!
> 
> I will link to some videos later for those interested.
> 
> Thanks


Impressive numbers for 4K! Are there any good gaming IPS displays these days, or are we pretty much stuck with TN?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Mister300*
> 
> My adventure with 4K
> 
> If you read all the noise on the net they all say that there is no single card option for 4K. Not true.
> 
> The XFX 390X I own, set to factory clocks, has enough VRAM to load textures at 4K at decent FPS. I was ready to venture into an LG FreeSync ultra-wide, but came across an AOC 4K 60Hz TN 28" for 350 USD at my local Microcenter, so I pulled the trigger. Do not start flaming the TN panel; it is the same one used by Samsung in their $700 version.
> 
> I do scientific imaging for a living and have seen many displays including a $20,000 Eizo which will correct any issues you may have at a price beyond affordable. My AOC has one flaw, a single edge stuck pixel which is in a spot where it is not noticeable. Yes there is minor off axis shift of colors
> 
> I also have a 4K TV for other viewing, it has a decent scaler as Agent Smith noted with his experience.
> 
> There are some minor desktop scaling issues with Win 10 and 4K, however all program windows and text are crisp and sharp.(word, excel, PP and Adobe Photoshop CS suite).
> 
> My gaming target is 40 FPS min, anything lower the free sync drops out unless you mod the drivers. I use V sync or target frame control to avoid tearing on some titles if above refresh rate.
> I also turn off AA at 4K, personal preference.
> 
> I game on ultra on Verdun @ 75 FPS,
> Multiplayer Crysis 3 @ 60 FPS very high settings,
> Star Wars BF on ultra 60 FPS steady and higher if V sync is disabled,
> Need For Speed 60 FPS ultra, med AO, Gorgeous in 4K
> Metro LL 50 FPS very high, no PhysX, no SSAA,
> COD BO3 90 FPS maxed out
> Alien Isolation ultra 60 FPS min
> Bioshock inf ultra 60 FPS min
> Titanfall ultra 60 FPS with AO turned down a bit.
> Dirt Rally ultra 60 FPS steady.
> 
> For the price it is highly recommended, my son has a Asus Strix 970 and there are no affordable 4K g sync displays, go team Red!!!!!
> 
> I will link to some videos later for those interested.
> 
> Thanks


I have had the same impressive results at 4k with no AA on my card. Of the games you listed, I have Crysis 3 and Dirt Rally... even in multiplayer, Crysis 3 is no problem for this card at 4k with very high settings. Dirt Rally is peanuts even with the AA on.


----------



## Mister300

IPS tech at 4K with FreeSync has a price premium.


----------



## Transmaniacon

Quote:


> Originally Posted by *Mister300*
> 
> IPS tech at 4K with FreeSync has a price premium.


A 1440P IPS panel might be more attainable then, but hopefully when I do get around to upgrading the prices will have come down.


----------



## christoph

Quote:


> Originally Posted by *m70b1jr*
> 
> Can someone tell me how bad of an idea it would be to put my CPU radiator in ice cold water?


The water inside the radiator is going to get colder than ambient, so when the water lines inside your rig start to drop in temperature, there's a chance of condensation: drops of water forming on the outside of the lines. One drop in your rig and bye bye. But if you keep the ice water NOT THAT COLD, you can avoid this; some guys like to drop a few ice cubes into their water reservoir.
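The temperature to stay above is the dew point of the room air: any surface colder than that collects condensation. A rough sketch using the standard Magnus approximation (the constants are the usual textbook values; the 25C room at 50% relative humidity is just an example):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    # Magnus approximation: surfaces below this temperature will condense
    a, b = 17.27, 237.7
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A 25C room at 50% RH condenses on anything below roughly 14C,
# so near-0C ice water in the loop is asking for drips on the PCB.
print(round(dew_point_c(25.0, 50.0), 1))
```

So "not that cold" in practice means keeping the coolant above the dew point of wherever the rig lives.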


----------



## tolis626

So today I took my card apart AGAIN (for God's sake, I hate doing this; I fear I'll break it every time) and I put all my leftover Fujipoly padding on top of the other pad I had on the card's VRM. There was a small section it could not cover, as there simply wasn't enough, so I used a roughly 2mm-long and 15mm-wide strip of Phobya pad just to bridge the gap. And lo and behold, it works. My VRM temps are back to normal: under 70C most of the time, with spikes in the 70-75C range. Nowhere near the 90C it hit yesterday with only one piece of pad, so I'm happy about that.

BUT (and there's always a but, isn't there?) not all could have gone well, and it seems the thickness of the padding (2mm effectively) may be preventing the core from making proper contact with the heatsink. The result is noticeably worse core temps. Whereas before I couldn't usually reach 75C at over +100mV, I now find it sitting in the 75-80C range most of the time at a +80mV offset, and that's with the fan at 90-100% too. So tomorrow I'll try tightening the screws or something in the hopes that it alleviates the problem. If not... Sigh, I don't know. This is getting too tiring.


----------



## christoph

Quote:


> Originally Posted by *tolis626*
> 
> So today I took my card apart AGAIN (for God's sake, I hate doing this; I fear I'll break it every time) and I put all my leftover Fujipoly padding on top of the other pad I had on the card's VRM. There was a small section it could not cover, as there simply wasn't enough, so I used a roughly 2mm-long and 15mm-wide strip of Phobya pad just to bridge the gap. And lo and behold, it works. My VRM temps are back to normal: under 70C most of the time, with spikes in the 70-75C range. Nowhere near the 90C it hit yesterday with only one piece of pad, so I'm happy about that.
> 
> BUT (and there's always a but, isn't there?) not all could have gone well, and it seems the thickness of the padding (2mm effectively) may be preventing the core from making proper contact with the heatsink. The result is noticeably worse core temps. Whereas before I couldn't usually reach 75C at over +100mV, I now find it sitting in the 75-80C range most of the time at a +80mV offset, and that's with the fan at 90-100% too. So tomorrow I'll try tightening the screws or something in the hopes that it alleviates the problem. If not... Sigh, I don't know. This is getting too tiring.


Did you squeeze the heatsink against the PCB before tightening the screws?

Try squeezing them together with your hands (even get another pair of hands to help keep it straight), put the screws in while squeezing, and let the video card sit for, let's say, 15 min; then squeeze the card even more and tighten the screws...


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> So today I took my card apart AGAIN (for God's sake, I hate doing this; I fear I'll break it every time) and I put all my leftover Fujipoly padding on top of the other pad I had on the card's VRM. There was a small section it could not cover, as there simply wasn't enough, so I used a roughly 2mm-long and 15mm-wide strip of Phobya pad just to bridge the gap. And lo and behold, it works. My VRM temps are back to normal: under 70C most of the time, with spikes in the 70-75C range. Nowhere near the 90C it hit yesterday with only one piece of pad, so I'm happy about that.
> 
> BUT (and there's always a but, isn't there?) not all could have gone well, and it seems the thickness of the padding (2mm effectively) may be preventing the core from making proper contact with the heatsink. The result is noticeably worse core temps. Whereas before I couldn't usually reach 75C at over +100mV, I now find it sitting in the 75-80C range most of the time at a +80mV offset, and that's with the fan at 90-100% too. So tomorrow I'll try tightening the screws or something in the hopes that it alleviates the problem. If not... Sigh, I don't know. This is getting too tiring.


That's definitely a temporary solution; I would not keep it that way. You can try tightening it down even more; usually screws have a limited amount of thread, so you can't exactly overtighten, same as the screws that come with an Alphacool radiator. I bet if you got 1.5mm pads, temps would be better for both core and VRM.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, 4k is absolutely stunning when compared even to 1440p, however there is some downside to 4k....
> 
> First being, you will need a club3d adapter to get 60hz at 4k where these cards will do 1440p at 60hz+ no problem.... Also, the performance takes quite a hit, but if you are okay with ~45fps (you can still get over 60FPS at 4k in most games if you lower texture settings some), then a large 4k is the way to go. I have fallen in love with 4k and would not take the additional 60hz of 1440 over it, no way-no how.


That's why I await Polaris: HDR and 4k, as you can have both,
60Hz HDR and 120Hz SDR at 4k.
Screens are finally getting somewhere good.


----------



## Scorpion49

Quote:


> Originally Posted by *christoph*
> 
> Did you squeeze the heatsink against the PCB before tightening the screws?
> 
> Try squeezing them together with your hands (even get another pair of hands to help keep it straight), put the screws in while squeezing, and let the video card sit for, let's say, 15 min; then squeeze the card even more and tighten the screws...


Thermal pads do not need to be squeezed into oblivion, if they aren't working correctly he has the wrong size or clearance issues somewhere else. All you'll achieve by trying to compress them this much is breaking the PCB.


----------



## yuannan

Sorry if this is off topic, but what's the best way to get rep?

I'm looking to sell my 390x and you need 35 rep to do so.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> Sorry if this is off topic, but what's the best way to get rep?
> 
> I'm looking to sell my 390x and you need 35 rep to do so.


We can all just rep your post haha


----------



## Agent Smith1984

Quote:


> Originally Posted by *yuannan*
> 
> Sorry if this is off topic, but what's the best way to get rep?
> 
> I'm looking to sell my 390x and you need 35 rep to do so.


What model, what you looking to get for it, and why you selling?

You can PM if you want... just curious


----------



## Transmaniacon

So according to an article in the rumor section, the R9 490 and R9 490X are going to be debuted at Computex with a June release. Obviously this is just a rumor, but that website has apparently been pretty accurate with regards to AMD speculation. Now I have to wonder if I should return the 390X and wait out the new cards? My 390X should be here tomorrow, will be tough not to open it up


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> So according to an article in the rumor section, the R9 490 and R9 490X are going to be debuted at Computex with a June release. Obviously this is just a rumor, but that website has apparently been pretty accurate with regards to AMD speculation. Now I have to wonder if I should return the 390X and wait out the new cards? My 390X should be here tomorrow, will be tough not to open it up


It won't matter really.... if it's a Polaris-based card, it's going to be $650, so you can keep the 390X at much less than that and have less performance, or you can spend a ton more on the better card. The way these things have slowed down over the last few years has helped keep card values pretty stable... I mean, just look at how much the 970 and 390 still sell for....


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It won't matter really.... if it's a polaris based card, it's going to be $650, so you can keep the 390x at much less than that, and have less performance, or you can spend a ton more on the better card. The way these things have slowed down over the last few years has helped keep card values pretty stable... I mean, just look at how much the 970 and 390 still sells for....


That's a good point. It's supposed to be Polaris 10, but I was under the impression the high-end cards will be Vega and out in Q1 '17? Would they not keep the pricing structure similar to prior generations? Maybe at a slight premium? I was figuring maybe $379 for the R9 490 and $479 for the R9 490X, but then it messes with the Fury cards if the performance is close.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> That's a good point. It's supposed to be Polaris 10, but I was under the impression the high-end cards will be Vega and out in Q1 '17? Would they not keep the pricing structure similar to prior generations? Maybe at a slight premium? I was figuring maybe $379 for the R9 490 and $479 for the R9 490X, but then it messes with the Fury cards if the performance is close.


You could totally be right, I haven't followed it that closely.

I thought Polaris was to be the next high-end card, but if it's going to be the upper-middle-class card, then yes, they would be looking for about $450, I'm guessing.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You could totally be right, I haven't followed it that closely.
> 
> I thought Polaris was to be the next high-end card, but if it's going to be the upper-middle-class card, then yes, they would be looking for about $450, I'm guessing.


Most of it is speculation, but the HBM2 cards are most likely coming in 2017, which should replace the Fury cards. I think people aren't sure where the new 400 series will fall. If they supplant the current Fury cards, then those are basically useless until the new flagships show up. If they come in under Fury, then it's not a huge performance leap from the 300 series. Not sure at the moment what I will do.


----------



## bluej511

Polaris should be GDDR5X. Vega and so on should be HBM, even though the GTX 1080 is supposed to be HBM2 with 16GB; my guess is easily over $1000 if not more, it is Nvidia after all.


----------



## Devildog83

Hey Agent Smith,

Fixn' to join the ranks. MSI R9 390 Gaming

http://www.techpowerup.com/gpuz/details.php?id=advbm

20160407_081142_HDR.jpg 1505k .jpg file


20160407_083844_HDR.jpg 1224k .jpg file


----------



## Agent Smith1984

The next HIGH-END AMD cards will have a minimum of 8GB of HBM2; this has been confirmed by AMD themselves, I believe.

The reality is, single 390X cards aren't that far behind the Fury cards at all, so where Polaris will fit in the market is confusing to me, unless it's going to be what Tonga was to Tahiti and simply replace the 390 series in the market with similar performance, but much better power consumption and temps to go with it.

I couldn't tell you to be honest.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The next HIGH-END AMD cards will have a minimum of 8GB of HBM2; this has been confirmed by AMD themselves, I believe.
> 
> The reality is, single 390X cards aren't that far behind Fury cards at all, so where the polaris will fit in the market is confusing to me, unless it's going to be what tonga was to tahiti, and simply replace the 390 series in the market with similar performance, but much better power consumption and temps to go with that.
> 
> I couldn't tell you to be honest.


Yeah it's kind of a weird situation. AMD has said the Polaris cards will be a big leap forward, so they might have some overlap with Fury and Polaris until the new HBM2 cards drop in 2017. They had a Polaris card on demo recently running Hitman @ 60FPS on a 1440P monitor. We don't know which card it was, and what the graphical settings were, but it's possible they could be releasing the 480 and 480X instead, which would replace the 390 and 390X, and then wait until later in the year for the new upper mid-range cards. I hate this game of speculating though, I might just end up keeping the 390X and then if I want to upgrade next year I can sell it and do so. It will still be a monster at 1080P for anything that's coming out, and it can handle 1440P very well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> Yeah it's kind of a weird situation. AMD has said the Polaris cards will be a big leap forward, so they might have some overlap with Fury and Polaris until the new HBM2 cards drop in 2017. They had a Polaris card on demo recently running Hitman @ 60FPS on a 1440P monitor. We don't know which card it was, and what the graphical settings were, but it's possible they could be releasing the 480 and 480X instead, which would replace the 390 and 390X, and then wait until later in the year for the new upper mid-range cards. I hate this game of speculating though, I might just end up keeping the 390X and then if I want to upgrade next year I can sell it and do so. It will still be a monster at 1080P for anything that's coming out, and it can handle 1440P very well.


Yeah the 390x at sub $400 is still a "safe" card to have right now.... I mean yeah, we overpaid for a 290x with more VRAM, but we still have very capable cards at 1440p....

Hell, my wife's 7870 and my son's 7950 are both giving them great gaming experiences at 1080P, and I play at 4k on my 390x so I am pretty happy with the shelf life on hardware these days.

DX12 is looking even more promising for things..... I've seen some benches where the 390X goes from being 20-30% behind a stock 980 Ti to being around 3-5% faster after implementing DX12.... Of course that is likely drivers and whatnot, but when all things are equal, it still says a lot about how well even AMD's current and previous GCN architecture products are going to perform moving forward. If it's a hardware thing, then NVIDIA will obviously address it with their next cards, but that won't change a thing for folks who want to continue using the hardware they have already invested in for a few years....


----------



## Agent Smith1984

After a little more research, and just using some common sense, I am guessing the following for Polaris 10:

Replaces the 390 series
2.5x more power efficient (AND YES, THAT IS AMD'S ACTUAL CLAIM!!!!! WOW)
Somewhere in the range of 2816-3540 shaders running at ~1000MHz (just a guess on my end, to keep it competitive with the 390X but not faster than the Fury X)
Will use GDDR5X memory...
Will have 8GB of VRAM on a 256-bit memory bus running at 1750MHz for roughly 448GB/s of bandwidth.

So basically, these cards will probably be in the $350-450 range at launch, and will perform around the same as or slightly faster than the 390X; they're more or less a completely new replacement for that performance segment, instead of another rebrand.

I can't see Fury being rebranded into the 490 series though, not with a Fury 2 (or whatever they decide to call it) launching, because a 4GB card sitting in an upper-midrange segment, sandwiched between a lower-level 8GB GDDR5X card and a higher-level 8GB HBM2 card, won't make sense at all. Would it?
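The 448GB/s figure above checks out: peak memory bandwidth is just bus width in bytes times the effective transfer rate. A quick sketch (GDDR5 moves 4 transfers per memory clock and GDDR5X 8; the 512-bit/1500MHz line is the 390's actual shipping configuration for comparison):

```python
def bandwidth_gbs(bus_width_bits, mem_clock_mhz, transfers_per_clock):
    # Peak bandwidth in GB/s = bytes moved per transfer x effective transfer rate
    bytes_per_transfer = bus_width_bits / 8
    effective_rate_hz = mem_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * effective_rate_hz / 1e9

print(bandwidth_gbs(512, 1500, 4))  # R9 390/390X, GDDR5: 384.0 GB/s
print(bandwidth_gbs(256, 1750, 8))  # rumored GDDR5X config: 448.0 GB/s
```

So the rumored 256-bit GDDR5X setup would actually out-run the 390's 512-bit GDDR5 bus despite having half the width.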


----------



## m70b1jr

Quote:


> Originally Posted by *Agent Smith1984*
> 
> After a little more research, and just using some common sense, I am guessing the following for Polaris 10:
> 
> Replaces the 390 series
> 2.5x more power efficient (AND YES, THAT IS AMD'S ACTUAL CLAIM!!!!! WOW)
> Somewhere in the range of 2816-3540 shaders running at 1000~MHz (just a guess on my end to keep it competitive with 390X but not faster than Fury X)
> Will use GDDR5X memory...
> Will have a 8GB of VRAM on a 256bit memory bus running at 1750MHz for roughly 448GB/s bandwidth.
> 
> So basically, these cards will probably be in the $350-450 range at launch, and will perform around the same or slightly faster than the 390x, and are more or less a completely new replacement to that performance segment, instead of another rebrand.
> 
> I can't see Fury being rebranded to 490 series though, not with a FURY 2 (or wth ever they decide to call it) launching , because the 4GB sitting in an uppder-midrange segment, sandwiched between the lower level 8GB GDDR5X card and the higher level 8GB HBM2 card won't make sense at all. Would it?


I think there will be a full 4xx series, low to high-end cards. I think Polaris will have its own flagship card that will outdo the Fury X, in all honesty. I think there will be a 470(X), 480(X), 490(X) and then one faster than the 490X.
I think the 470(X) will be aimed at 1080p high-ultra at 60+ FPS on newer titles and will have 4GB of VRAM ($150-250).
I think the 480(X) will be aimed at 1440p medium-ultra, hitting 60+ FPS, again on newer titles, with 4-8GB of VRAM ($250-300).
Then the 490(X) aimed at 4K at high-ultra settings, again 60+ FPS on newer titles, with a minimum of 8GB of VRAM ($300-600).
Then there will be a flagship card aimed at 4K, but also pushing framerates higher than 60 FPS, or even driving 4K on multiple monitors at 60 FPS ($600+).

Of course, all of these will have 2.5x the performance per watt of the 2xx and 3xx series, plus better async compute and GDDR5X.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> After a little more research, and just using some common sense, I am guessing the following for Polaris 10:
> 
> Replaces the 390 series
> 2.5x more power efficient (AND YES THIS IS AMD ACTUAL CLAIMS!!!!! WOW)
> Somewhere in the range of 2816-3540 shaders running at 1000~MHz (just a guess on my end to keep it competitive with 390X but not faster than Fury X)
> Will use GDDR5X memory...
> Will have a 8GB of VRAM on a 256bit memory bus running at 1750MHz for roughly 448GB/s bandwidth.
> 
> So basically, these cards will probably be in the $350-450 range at launch, and will perform around the same or slightly faster than the 390x, and are more or less a completely new replacement to that performance segment, instead of another rebrand.
> 
> I can't see Fury being rebranded to 490 series though, not with a FURY 2 (or wth ever they decide to call it) launching , because the 4GB sitting in an uppder-midrange segment, sandwiched between the lower level 8GB GDDR5X card and the higher level 8GB HBM2 card won't make sense at all. Would it?


Yeah I think you are on the money here.


----------



## Mattuz

Quote:


> Originally Posted by *Transmaniacon*
> 
> So according to an article in the rumor section, the R9 490 and R9 490X are going to be debuted at Computex with a June release. Obviously this is just a rumor, but that website has apparently been pretty accurate with regards to AMD speculation. Now I have to wonder if I should return the 390X and wait out the new cards? My 390X should be here tomorrow, will be tough not to open it up


Well, for just two months, maybe you can consider waiting. I bought my 390 in December and I don't regret it, because I've been playing with it for 4 months now; if I had waited I'd still be playing everything on low with my two 6850s. But for two months, maybe it's worth waiting.


----------



## Transmaniacon

Quote:


> Originally Posted by *Mattuz*
> 
> Well maybe for just two months you can think to wait. I bought my 390 in December and I don't regret it because I played with it for 4 months since now, and if I waited I was still playing all on low with my two 6850s but for two months maybe is worth waiting


Yeah I think I am going to wait. I can overclock my 7950 and it should last me fine until then. Seems like the 490(X) cards should be worth the wait.


----------



## Agent Smith1984

I literally bought a 390 the day they dropped (the MSI), and at $330 I thought it was well worth it. Then you find out a month later that the majority of the performance improvements were driver related, and the 290 guys ended up getting the same goodies (minus the 8GB of better-clocking RAM)..... I am glad they were offered the performance improvements though... I dunno if AMD went through a moral change of heart, or if they felt they needed to make the change for all drivers before hackers completely exposed what they had done (which they did, and I believe this is the ONLY reason they made the driver improvements for all hardware). Either way, the 390 series still stacks up great against even the best cards on the market when you look at performance per dollar. I'd never have gotten a 390X if I hadn't found it for $370 after rebate with a decent mouse and a copy of Hitman......

I have used 290's, 290's in crossfire, different combinations of 390's, an XFX Fury, and a GTX 980 KPE, and for the money this 390x has been second ONLY to the pair of tri-x 290's I had in crossfire, which I had spent about $400 in total on....


----------



## m70b1jr

I got my R9 390 in December as well for $330. I'll end up selling it and buying the highest-end card that comes out. It'll be the last graphics card I buy, because I'll be off to college in 1 1/2 years, so I'll be getting a laptop instead (Polaris / Zen laptop?)


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The next HIGH-END AMD cards will be minimum 8GB HBM2, this has been confirmed by AMD themselves I believe.
> 
> The reality is, single 390X cards aren't that far behind Fury cards at all, so where the polaris will fit in the market is confusing to me, unless it's going to be what tonga was to tahiti, and simply replace the 390 series in the market with similar performance, but much better power consumption and temps to go with that.
> 
> I couldn't tell you to be honest.


That seems doubtful at best.
With a new die shrink, they'd be cutting the new 490 down so badly that it wouldn't make sense.
If the 232mm2 core is accurate, the 490X card would be faster than the Fury X.

Anything else seems to me like shooting themselves in the foot and handing it all to NVIDIA.


----------



## Agent Smith1984

Makes me think now that they will snub Hawaii altogether instead of relaunching it as a 480X....

I am thinking Fiji may get snubbed also...

So that leads me to believe Polaris 10 will replace the 390 series as the 480 series, Polaris 11 will be the 490 series, and we will get a Vega 10 GPU for the top-tier cards.

That would make Fiji one of the shortest-lived GPUs we've seen in a while, unless you consider its also short-lived little brother Tonga part of its lineage.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Makes me think now, that they will snub Hawaii altogether instead of relaunching it as a 480X....
> 
> I am thinking Fiji may get snubbed also...
> 
> So that just leads me to believe Polaris 10 will replace 390 series as 480 series , Polaris 11 will be the 490 series, and we will get a Vega 10 GPU for the top tier cards.
> 
> That would make Fiji one of the shortest lived GPU's we've seen in a while, unless you consider it's also short lived little brother tonga part of it's lineage.


Or add the Nano as a 480X.
All in all it depends on the cost of manufacturing the Fiji series and the performance delta of the new cards.
But yeah, short-lived, unless you count the x2 card for VR.


----------



## yuannan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What model, what you looking to get for it, and why you selling?
> 
> You can PM if you want... just curious


Sapphire Tri-X 390X, about £350, can be mailed anywhere in the world really, you have to pay postage ofc.

I'm selling it to get money. I barely game anymore, and I'm going to college next year and need the money for a bike to get there. Walking is about an hour and my parents don't love me enough to buy one for me

ASIC 77.2%

clocks up to 1255/1750
You can see the fanboy thread to see my benches


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> Sapphire tri-x 390x, about £350, can be mailed to any in the world really, you have to pay ofc.
> 
> Im selling it to get money, I barely game anymore and I need the money as I'm going to college next year and need the money for a bike to get there. Walking is about an hour and my parents don't love me enough to buy one for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ASIC 77.2%
> 
> clocks up to 1255/1750
> You can see the fanboy thread to see my benches


Bike or bicycle lol. I would, but I'd need another rad and another GPU block.

Off topic: turns out NT-H1 TIM has an issue with bare die and doesn't last long, the "press-out" effect. Can't wait to get my screws and test it with NT-H1 again. The only variable would be the IHS.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> Bike or bicycle lol. I would but id need another rad and another gpu block.
> 
> Off topic turns out nh-t1 tim has an issue with bare die and doesnt last long, "press out" effect. Cant wait to get my screws and test it with nh-t1 again. Only variable would be the IHS.


Bicycle, I live in the UK so we call it a bike. Your bike would be what we'd call a moped or a motorbike, I guess.

They are no joke tho, cheap bikes are £150 a toss and good ones easily go over £400.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> bicycle, I live in the UK so we call it a bike. Your bike would what we would call a moped or a motorbike I guess.
> 
> They are no joke tho, cheap bikes are £150 a toss and good ones easily go ver £400.


Yea, my bike costs more than my build lol. If I sold it I could quad-fire and watercool, but no need at all. I live in southern France; I ride year round.


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> Yea my bike costs more then my build lol. If i sold it i could quad fire and watercool but no need at all. I live in southern france i ride year round.


You could get a real fire if you didn't water cool


----------



## diggiddi

Quote:


> Originally Posted by *yuannan*
> 
> Sorry if this is off topic, but what's the best way to get rep?
> 
> I'm looking to sell my 390x and you need 35 rep to do so.


Write a good review of a product; that's a quick way, aside from being helpful.


----------



## patriotaki

what are your idle temps on the 390?
with fans off


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> what are your idle temps on the 390?
> with fans off


My nitro used to be 34-35C at idle.


----------



## patriotaki

My friend's PCS+ is at 59-60C idle all the time... he has a 140mm fan blowing air directly on the GPU. Why is it so high? And under load it goes up to 65-67C.


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> my friends pcs+ is at 59-60C idle all the time...he has a 140mm fan blowing air directly on gpu ..why is it so high? and on load goes up to 65-67C


Seems like it probably doesn't underclock at idle, or something must be running and using it. Check GPU-Z and see what it pulls for wattage at idle.


----------



## Transmaniacon

Quote:


> Originally Posted by *patriotaki*
> 
> my friends pcs+ is at 59-60C idle all the time...he has a 140mm fan blowing air directly on gpu ..why is it so high? and on load goes up to 65-67C


Pretty sure the Nitro series won't turn the fans on until it needs to, when it gets up around 65C, so it leads to high idle temps, but that's the intended behavior to keep the GPU silent.


----------



## bluej511

Quote:


> Originally Posted by *Transmaniacon*
> 
> Pretty sure the Nitro series won't turn the fan on until it needs to, when it gets up around 65C, so it leads to high idle temps but it's the intended use to keep the GPU silent.


Correct but no reason in hell they should idle at 60C


----------



## Transmaniacon

Quote:


> Originally Posted by *bluej511*
> 
> Correct but no reason in hell they should idle at 60C


You can just set up a fan profile so it kicks on like normal and keeps it down around 30.


----------



## patriotaki

Quote:


> Originally Posted by *Transmaniacon*
> 
> Pretty sure the Nitro series won't turn the fan on until it needs to, when it gets up around 65C, so it leads to high idle temps but it's the intended use to keep the GPU silent.


Yes, the PCS+ also has mute-fan technology; the fans don't spin below 60C.

My PCS+ runs at 38C idle with the fans at idle. His PCS+ though runs at 59-60C. Big difference..


----------



## Agent Smith1984

The GPU will easily idle in the mid to high 50s with the fan off. My MSI does it too, but I use a 30% fan for anything up to 40C, and then it climbs drastically from there.

I am actually working out my "summer clocks" right now.... I always end up pulling some voltages and slightly reducing some clock speeds for the CPU and the GPU when summer comes around.

Reduced the CPU from 5GHz at 1.55v to 4.9GHz at 1.5v, and have the GPU at 1100 (-25mv) and 1750 VRAM @ +50mv AUX.

Going from 1600MHz VRAM with no AUX voltage to 1750MHz with the 50mv literally gives me 2-4 FPS in games at 4K, so it's well worth a few degrees. At 1080p it adds about 1 FPS at best. So anyone making the argument that VRAM speed has no effect on performance is obviously drawing that conclusion from 1080p testing.
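That "30% up to 40C, then climbing drastically" profile is just linear interpolation between curve points, which is all a custom fan curve in Afterburner-style tools really is. A rough sketch; everything other than the flat 30%-to-40C segment is an invented point for illustration:

```python
# Hypothetical fan curve: only the flat 30% up to 40C comes from the post above,
# the remaining (temp C, fan %) points are made up for illustration.
CURVE = [(0, 30), (40, 30), (55, 60), (70, 90), (80, 100)]

def fan_percent(temp_c, points=CURVE):
    """Linearly interpolate fan % between curve points, clamping at both ends."""
    if temp_c <= points[0][0]:
        return float(points[0][1])
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return float(points[-1][1])

print(fan_percent(35))    # -> 30.0 (flat segment)
print(fan_percent(62.5))  # -> 75.0 (halfway up the 55C-70C ramp)
```

The steep slope past 40C is what makes the card loud the moment it starts working, which is why pulling a little voltage for summer pays off twice.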


----------



## afyeung

I'm currently running just my 390X, because the pump header on my 290 broke while I was moving stuff around my PC.

Still an excellent card for 1440p (80-100+ FPS in BF4 with 2xMSAA and Mantle). I'm using the Kraken G10 with the Kraken X41 and a single Corsair AF140L in pull, because the stock X41 fan went bad. Temps are decent considering the AF140L is struggling with the rad: 70C max at 1180/1725 with +100mv core and +50mv aux. Is it worth adding more voltage for 1200MHz? Also, is there a substantial difference between 1725 and 1750, since they run on different straps I believe? The only thing I'm not really happy with is VRM 1, which gets into the 90C range even with a 92mm fan and a 120mm fan blowing on it. I'm also using the Gelid VRM kit, which works with the MSI 390X. VRM 2 temps stay nice at 45C.


----------



## Agent Smith1984

Quote:


> Originally Posted by *afyeung*
> 
> I'm currently running just my 390x because the pump header on my 290 broke while I was moving stuff around my PC
> 
> 
> 
> 
> 
> 
> 
> Still an excellent card for 1440p (80-100+fps in BF4 with 2xMSAA and Mantle). I'm using the Kraken G10 with the Kraken X41 and a single Corsair AF140L in pull because the stock X41 fan went bad. Temps are decent considering the AF140L is struggling with the rad. 70c max with 1180/1725 +100mv core and +50mv aux. Is it worth it to add more voltage for 1200mhz? Also is there a substantial difference between 1725 and 1750 because they run at different straps I believe? The only thing I'm not really happy with is the VRM 1 which gets into the 90c range even with a 92mm fan blowing on it and a 120mm fan blowing on it. I'm also using the Gelid VRM kit which works with the MSI 390x. VRM 2 temps stay nice at 45c.


Actually the strap changes after 1625.

I notice the more I play with additional voltage on my card, the bigger of a headache it is.... it can be stable on some days, and not on others.
That's mainly because I'm on air though.... if you are on water, 100mv should be fine (unless you get black screen flickering, which I believe is a result of the 390's TDP limitations, which were not apparent on the 290 series).

Bottom line is, these cards do great on lower voltage up to around 1150MHz (not to say they won't clock higher, but the scaling is terrible), and after that they die out. The older Hawaii chips seem to scale with voltage. Again, I don't know if that's all down to the newer manufacturing process, or if it's something with the boards too.

I see several XFX cards (which are almost identical to their 290 predecessors) that seem to keep climbing with voltage, while the others all completely die out between 1175 and 1200.
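The strap point is worth spelling out, since it answers the 1725-vs-1750 question: the BIOS holds one set of memory timings per clock bracket, so every clock inside a bracket runs the same timings. A sketch; the bracket edges below are my own assumptions for illustration, only the 1625 boundary comes from the discussion:

```python
# Assumed strap boundaries in MHz; only the 1625 edge is from the post above,
# the rest are hypothetical bracket edges for illustration.
STRAP_CEILINGS_MHZ = [1375, 1500, 1625, 1750, 2000]

def strap_for(clock_mhz):
    """Return the ceiling of the bracket this clock falls in, i.e. which timing set applies."""
    for ceiling in STRAP_CEILINGS_MHZ:
        if clock_mhz <= ceiling:
            return ceiling
    return STRAP_CEILINGS_MHZ[-1]

# 1725 and 1750 share a strap, so the timings are identical between them:
print(strap_for(1725) == strap_for(1750))  # -> True
# 1600 -> 1650 crosses the 1625 boundary, so looser timings kick in:
print(strap_for(1600) == strap_for(1650))  # -> False
```

Under this model, the 25MHz from 1725 to 1750 is pure bandwidth with no timing penalty, whereas crossing a strap edge can actually lose effective performance despite the higher clock.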


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I see several XFX cards (which are almost identical to their 290 predecessors) that seem to keep climbing with voltage, while the others all completely die out between 1175 and 1200.


My XFX card does indeed keep climbing with voltage, as much as I can possibly add. It is also a 100% reference 290X PCB with a 390X core and bigger memory ICs strapped to it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> My XFX card does indeed keep climbing with voltage, as much as I can possibly add. It is also 100% reference 290X PCB with a 390X core and bigger memory IC's strapped to it.


Good to know; I have seen several others who have confirmed my theory on that one too.

Thanks for the input!


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good to know, I have seen several others that have confirmed my theory on that one too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the input!


Well, my MSI was like that too. After 1175MHz, no matter how much voltage I added it would crap out. That was until I replaced the TIM. I don't know how and I don't know why it helped, but I can now do 1190MHz game stable clocks at +90mV no problem. For benching I can go over 1200MHz and I think I'm limited by my cooling at that point more than anything. These things under water must be beasts. You should give it a try. Just, don't make my mistakes and don't bother changing your VRM pads to anything other than 1.5mm Fujipoly pads.


----------



## Agent Smith1984

I don't think they'll clock all that impressively on water either, unless you have an XFX. Just a hunch though....

Haven't seen anyone breaking 1250 stable on these...


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I don't think they'll clock all that impressive on water either, unless you have an xfx. Just a hunch though....
> 
> Not seen anyone breaking 1250 stable on these...


Well, I think that because of the high leakage (comparatively speaking) of the 300 series cards, cooling them better takes them the extra mile they need. I tend to believe these can see even larger improvements than their 200 series brethren (again, by comparison). It's not like 290Xs were breaking 1250MHz easily on air, so the comparison is unfair to begin with. I'm not saying we'd suddenly see the 390X destroying the 290X's overclocking, but it could at least close the gap.


----------



## danielwhitt

Here's my 390 crossfire rig. Had to change the case three times because of the card temps; I finally settled on this case, but I still may have to water-cool the cards after I've tried everything I know to cool the top card down. Lol


----------



## Transmaniacon

Quote:


> Originally Posted by *danielwhitt*
> 
> 
> 
> Here's my 390 crossfire rig, had to change the case three times because of the card temps, finally settled on this case but still may have to water-cooled the cards, after I have tried everything I know to cool the top card down. Lol


That's the Air 540? What fans are you using up front?


----------



## danielwhitt

Yeah, it's the Carbide Air 540; all the fans are Corsair PWM fans. The same SP fans as the red ones but without the red rings; those came with the radiator. Just a bit more uniform.


----------



## Transmaniacon

Quote:


> Originally Posted by *danielwhitt*
> 
> Yeah, it's the carbide air 540, all the fans are corsair pwm fans. The same SP fans as the red ones but without the red rings, the fans came with the radiator. Just a bit more uniformed.


They are decent fans, but there are better performers out there. Maybe go for some of those Noctua industrial fans; they can really move some serious air. You will need PWM control for them so they aren't running at 3000 RPM, but if they can't keep the GPU cool then I don't know what can (aside from making a wind tunnel with Deltas...).


----------



## danielwhitt

I've changed the thermal paste on the GPUs, as MSI slapped it on with a trowel, and that made a big difference. The bottom card idles at 30 degrees and tops out around 65-70, while the top card idles around 33 and tops out at 94 and then starts throttling. There's nothing in the way of the rear of the GPUs, and the fans blow air directly over the cards. With the fans on full I can kinda tame it, but it sounds like a rocket ship about to take off. During games the top card tops out around 85, but benchmarking and the Witcher games blow the top card to bits. Lol


----------



## Transmaniacon

Quote:


> Originally Posted by *danielwhitt*
> 
> I've changed the thermal paste on the GPUs as MSI slapped it on with a trowel, which made a big difference. The bottom card idles 30 degrees and tops out around 65 - 70, and the top card idles around 33 and tops out 94 and then starts throttling, there's nothing in the way of the rear of the GPUs and the fans blow air directly over the cards, with the fans on full I can kinda tame it, it sounds like a rocket ship about to take off. During games the top card tops out around 85 but during benchmarking and the witcher games blow the top cards to bits. Lol


You would probably do better with a blower-type cooler since right now all that hot air from the bottom card is baking the top one. Or consider grabbing a pair of those Corsair GPU liquid coolers and taking off the air coolers. I would imagine you could fit them on both cards, and you have plenty of rad space in that case.


----------



## danielwhitt

Corsair don't do one for the MSI R9 390, only reference cards. I have ordered two white Kraken G10s, as I have my old Corsair H75 lying about, and if it works well I will do the bottom one as well. I've heard good things about them and hope they work; I just like the stock coolers more than the G10s.

Both cards overclock really well on their own, so I'm hoping with water I can push them harder and keep them cooler.


----------



## Transmaniacon

Quote:


> Originally Posted by *danielwhitt*
> 
> Corsair don't do one for the msi r9 390, only reference cards, I have ordered two white kraken g10s as I have my old corsair h75 lying about and if it works well, I will do the bottom one aswell. Ive heard good things about them and hope they work, just like the stock coolers more then the g10s.
> 
> Both cards overclock really well on their own, so I'm hoping with water I can push them harder and keep them cooler.


Yeah the G10 should do a good job, at the very least it should be quieter. My MSI 7950 sounds like a plane taking off when it gets up into the mid-70s OCed pretty high.


----------



## danielwhitt

My last GPU in my old build was a PowerColor HD 6990 and that thing still holds its own, but f@#k me that was loud, though it was cracking in the winter for central heating. Lol


----------



## bluej511

A single 120 isn't enough for a Hawaii; I'd easily recommend a 240 for decent temps. I'm at 42C with a 360 and a 240, with a dozen fans only putting out 20dB. The Nitro alone at 60% fan speed would be louder lol.


----------



## anti-duck

Has anyone banged a Kraken G10 on an XFX R9 390 (specifically the Black Edition with the updated PCB - R9-390P-8DB6)? Just wondering if I would be able to leave the VRM heatsink on?

I was gonna wait for Polaris before getting a new GPU, but I think I'm gonna grab the XFX R9 390 next payday (currently using the iGPU on my 4790K and I can't wait any longer lol) and then just give it to my brother when Polaris arrives. I already have the G10 and a Corsair H90; hopefully it fits!


----------



## Slowpoke66

Quote:


> Originally Posted by *danielwhitt*
> 
> Corsair don't do one for the msi r9 390, only reference cards, I have ordered two white kraken g10s as I have my old corsair h75 lying about and if it works well, I will do the bottom one aswell. Ive heard good things about them and hope they work, just like the stock coolers more then the g10s.
> 
> Both cards overclock really well on their own, so I'm hoping with water I can push them harder and keep them cooler.


I have the same GPU setup with the same thermal problems (on the top card). I put a Kraken G10 on the top card (with a Kraken X41) and the temps dropped from ~94C to ~60C while playing BF4 (22C ambient). The 2nd card tops out at ~65C (fans @55%).

But beware of the VRM1 temps! I had to put some copper heatsinks on to lower them. Now they're between 80-90C (playing longer sessions of BF4).


----------



## christoph

Quote:


> Originally Posted by *danielwhitt*
> 
> 
> 
> Here's my 390 crossfire rig, had to change the case three times because of the card temps, finally settled on this case but still may have to water-cooled the cards, after I have tried everything I know to cool the top card down. Lol


what about that Red WD hard drive? what real speeds you get from it?


----------



## danielwhitt

Quote:


> Originally Posted by *Slowpoke66*
> 
> I have the same GPU-setup with the same thermal problems (on the top card). I put a Kraken G10 on the top card (with a Kraken X41) and the temps dropped from ~94c to ~60c while playing BF4 (22c ambient). 2nd card tops ~65c (fans @55%).
> 
> But beware of the VRM1-temps! Had to put on some copper heatsinks to lower the temps. Now they're between 80-90c (playing longer sessions of BF4).


Glad you were able to sort it with the Kraken G10. Did you leave the backplate on and the half heatsink on the front of the GPU? I've heard mixed things about the VRM temps, with some people saying they were fine with and without. Gonna have a play once my Krakens turn up; hopefully I'll do both when I get another AIO cooler for the bottom card.


----------



## danielwhitt

Quote:


> Originally Posted by *christoph*
> 
> what about that Red WD hard drive? what real speeds you get from it?


The drive is a 3TB drive used for storage and backup. Honestly I've never tried speed tests with it; I bought it because the NAS drives were recommended to me and all the reviews said they were great. I've never had any problems with it, and it's designed to run 24/7 and not consume a lot of power. If you want me to run some speed tests for you, I don't mind, just let me know.


----------



## Slowpoke66

Quote:


> Originally Posted by *danielwhitt*
> 
> Glad you were able to sort it with the kraken g10, did you leave the backplate on and the half heat sink on the front of the GPU, I've heard mixed things about the vrm temps, with some people saying they were fine with and without, gonna have a play once my krakens turn up, hopefully do both when I get another aio cooler for the bottom card.


Yes, I kept the back- & midplate on and attached some EnzoTech BCC9 on the VRMs.


----------



## danielwhitt

Quote:


> Originally Posted by *Slowpoke66*
> 
> Yes, I kept the back- & midplate on and attached some EnzoTech BCC9 on the VRMs.


Just had an email to say it's coming today. Did you need longer bolts, or did they come the right size? Also, how did you wire yours up? I've got some Gelid GPU-to-PWM adapters so I can run the pump and the fan off that adapter, and then the fan on the radiator off a fan controller. I take it you mean the VRMs to the right of the midplate?


----------



## Stige

Quote:


> Originally Posted by *Transmaniacon*
> 
> They are decent fans, but there are better performers out there, maybe go for some of those Noctua Industrial fans, they can really move some serious air. You will need PWM control for them so aren't running at 3000 rpm, but if they can't keep the GPU cool than I don't know what can (aside from making a wind tunnel with Deltas...).


People need to stop recommending that noctua crap, there are way better fans out there for less. Like GTs. Just check the fan thread.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> People need to stop recommending that noctua crap, there are way better fans out there for less. Like GTs. Just check the fan thread.


Here we go again haha. So a Noctua has more static pressure than a Vardar fan, and is quieter, but somehow they're crap. Depending on the radiator's FPI (fins per inch), any fan will work. With high FPI you want a high static pressure fan; with low FPI it doesn't really matter, even a low-pressure fan will work just fine.

P.S. I guess a 6yr warranty is worthless on fans, huh.


----------



## danielwhitt

Here's the top GPU that was giving me grief temperature-wise, now with a Kraken G10 and Corsair H75 on it. The temps have dropped a stupid amount. It idles around 29-32 degrees, and under load it's around 55-60 degrees as opposed to the 94 degrees it was getting to and throttling. Just need to save now for the bottom card's H75 and then they will both be done. VRM 1 temps have only got into the mid-sixties so far under load without any heatsinks on. That's my only concern really, so I'm monitoring those temps to see if I need to add them. Really happy with the results.


----------



## bluej511

Quote:


> Originally Posted by *danielwhitt*
> 
> Here's the top GPU that was giving me grief temperature wise with a kraken g10 and corsair h75 on it. And the temps have dropped a stupid amount. Idles around 29 - 32 degrees and under load it's around 55 - 60 degrees as apposed to the 94 degrees it was getting too and throttling. Just need to save now for the bottom cards h75 and then they will both be done. Vrm 1 temps have only got into the mid sixties so far under load without any heatsinks on. That's my only concern really and monitoring those temps to see if I need to add them. Really happy with the results.


Decent core temps. That's what I don't like about those GPU coolers and the Alphacool: the VRM temps. Although mine is staying below 60; with different pads it should be even better, 3W/mK is pretty horrid for pads lol.


----------



## Transmaniacon

So I was going to return the R9 390X and wait, but now I am looking at it in my PC and regret nothing


----------



## danielwhitt

I didn't think those VRM temps were too bad. My VRM 2 temps are at a permanent 48, which I assume is because of the midplate that's still installed. If the VRM 1 temps stay under seventy I'm pretty happy with that without heatsinks on, although if they creep over then I shall put some on.


----------



## bluej511

Quote:


> Originally Posted by *danielwhitt*
> 
> I didn't think those vrm temps were too bad, my vrm 2 temps are at a permanent 48, I assume that's because of the mid plate that is still installed, if the vrm 1 temps stay under seventy I'm pretty happy with that without heatsinks on. Although if they creep over then I shall put some on.


If it's also 48 at idle, it could be a faulty sensor. I've seen that a few times with Hawaii cards. Both of mine go up and down, but I honestly don't think they're reliable. My VRM1 shows 19C at idle when my ambient is 21C lol.


----------



## danielwhitt

I'm gonna wait for the whole PC to reach ambient and then run the sensors again; I've been benchmarking the living daylights out of it to see if there was a temp difference between stock and water. Just worried about the VRM temps, as I'm not sure what the upper limit is for them.


----------



## Noirgheos

So I was taking my card out of the box when I felt it brush the side/edge of my palm, and the connector hit the thumb of my sweaty hand (I get nervous around my PC parts). I instantly started to freak out, but after calming down I looked at the connector and saw nothing on it. Just to be safe I gave it a light wipe with a dry coffee filter; not even sure if I broke something doing that. Hope my natural skin oils don't cause any issues, but the contacts are gold, so should anything even stick?

My question is: should I even be worried? I tested it at a friend's house before taking it home and opening it, and it worked fine over there. Still, I'm worried. I see reviewers touch them all the time, so I'm guessing it's OK.


----------



## navjack27

get nervous around computer parts? 



 after watching this i don't get nervous anymore


----------



## Devildog83

I did some gaming with my MSI 390 and with everything maxed out in Crysis 3 at 1080p I hit a max of 68c. It was butter smooth but I guess it should be at 1080p. When I get a 1440p monitor I am expecting it to tax the card a lot more but I am as happy as I could be with it for now.


----------



## christoph

Quote:


> Originally Posted by *danielwhitt*
> 
> The drive is a 3tb drive used for storage and backup, honestly I've never tried speed tests with it, I bought it because I was recommended the nas drives and all the reviews said they were great, I've never had any problems with it and it's designed to be run 24/7 and not consume alot of power, if you want me to run some speed tests for you, I don't mind, just let me know.


no, cuz those drives are actually the best for storage when you keep in mind that you shouldn't pay attention to the speed. I'm going to buy a couple for my storage


----------



## navjack27

Quote:


> Originally Posted by *christoph*
> 
> no, cuz those drives are actually the best for storage when you keep in mind that you shouldn't pay attention to the speed. I'm going to buy a couple for my storage


sadly my 3tb drive i got from a nas was a Seagate model with a high failure rate. rip. i just got a 900-something gig ssd to do video editing on from now on. the 3tb drive gets down to like 1.5tb free space and then hits 100% usage and reallocated sectors.


----------



## TainePC

I have an XFX card, and people should know that the XSPC R9 290 waterblock and backplate, and the EKWB waterblocks, will fit, but you have to do a modification: http://www.overclock.net/t/1569032/xfx-r9-390-dd-removed-from-the-ek-fc-r9-290x-v2-ca-cooling-configurator. I will post a picture of the mod I did to the block. It wasn't too hard, and there are no water channels in that general area on the XSPC model. The EKWB acrylic one requires you to dremel into the acrylic, and that might put some people off.


----------



## danielwhitt

Looks like it's an afternoon for ghetto mods to get water cooling working. I butchered the original VRM plate for the MSI card so I could mount it under the Kraken G10 and add some heatsinks as well, to try and curb the VRM temps. And lo and behold, my ghetto mod has worked: my VRM temps haven't gone over 76 degrees, so now I have a cool GPU core, cooler VRM temps, and the whole card with the Kraken is more solid, with a pretty much complete midplate as well as a backplate.


----------



## bluej511

Spending the afternoon installing Windows 10 in UEFI mode. So on the Nitro, once the light is on it's in UEFI mode; light off, it's in legacy+UEFI mode. It will automatically pick the one that's set up.


----------



## gupsterg

A ROM is either UEFI or non-UEFI.

So if you want the UEFI ROM, the switch needs to be in that position, and in the other position for the non-UEFI ROM.


----------



## bluej511

Yea, just installed W10 in UEFI, didn't take long. Seems like with the light on it doesn't work and with the light off it works, so just gonna leave it alone haha.


----------



## gupsterg

Usually light on = UEFI, light off = non-UEFI; I used to own a Sapphire Vapor-X 290X with the LED-lit push-button BIOS switch.

IIRC the Sapphire site also highlights the same info for the switch positions.

I once had an Asus DCUII 290X STD edition with one switch side holding an OC edition clocks ROM and the other the STD edition clocks ROM. A bodge had occurred in the factory flashing per switch position; as Asus ROMs store the serial number of the card, it was clear to see the OC edition ROM had a differing serial from the STD edition ROM (which matched the sticker on the PCB).

Open GPU-Z and you will see a tick in the UEFI box to denote whether the ROM has UEFI enabled.


----------



## bluej511

Quote:


> Originally Posted by *gupsterg*
> 
> Usually light on = UEFI, light off = Non UEFI , used to own a Sapphire Vapor-X 290X with the LED lit push button bios switch.
> 
> IIRC Sapphire site also highlight same info for switch position.
> 
> I once had an Asus DCUII 290X STD edition with one side as OC edition clocks ROM and other STD edition clocks ROM. A bodge had occurred in factory of flashing per switch position, as Asus ROMs store serial number of card it was clear to see the OC edition ROM had a differing serial to the STD edition ROM (which matched the sticker on PCB).
> 
> Open GPU-Z and you will see a tick in the UEFI box to denote if ROM has UEFI enabled.


Well, it showed enabled before; I think I had to install the drivers first, then turn it on. It's on now with the light on and it booted no problem. Feels nice to have a clean OS though. I'm using 16.3.2 since it's WHQL.


----------



## gupsterg

Yep, a fresh OS feels so peppy to use, just did a fresh UEFI Win 7 Pro install a few days ago.


----------



## Majentrix

Moved cases from a vertical Silverstone FT02 to a more conventional NZXT S340. Temps on my 390 Strix immediately dropped from 94+ to around 80 under heavy load and the card no longer throttles. I can actually overclock the card now as well, I have it sitting at 1100. I'll have a go at getting it higher after work.
Maybe I missed something when I installed the Strix in my FT02, but it seems to work at least anecdotally better in a "normal" case.



Edit: Here's a Fire Strike comparison. Same settings on everything. A 1000 point difference! An average performance increase of 11%!

http://www.3dmark.com/compare/fs/8066931/fs/8160466
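For anyone wanting to sanity-check a comparison like this, the percentage gain between two runs is just the score delta over the baseline. The scores below are hypothetical stand-ins roughly matching a ~1000-point jump, not the actual numbers from the linked result.

```python
def percent_gain(before: float, after: float) -> float:
    """Percentage improvement of `after` over `before`."""
    return (after - before) / before * 100.0

# Hypothetical Fire Strike scores, chosen to illustrate a ~1000-point delta
before, after = 9000, 10000
print(f"{percent_gain(before, after):.1f}% faster")  # about 11.1%
```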


----------



## Transmaniacon

Quote:


> Originally Posted by *Majentrix*
> 
> Moved cases from a vertical Silverstone FT02 to a more conventional NZXT S340. Temps on my 390 Strix Immediately dropped from 94+ to around 80 under heavy load and the card no longer throttles. I can actually overclock the card now as well, I have it sitting at 1100. I'll have a go at getting it higher after work.
> Maybe I missed something when I installed the Strix in my FT02, but it seems to work at least anecdotally better in a "normal" case.
> 
> 
> 
> Edit: Here's a Fire Strike comparison. Same settings on everything. A 1000 point difference! An average performance increase of 11%!
> 
> http://www.3dmark.com/compare/fs/8066931/fs/8160466


Good looking build! Those vertical cases do better with blower style coolers, so it makes sense the temps were higher.


----------



## bluej511

Quote:


> Originally Posted by *Transmaniacon*
> 
> Good looking build! Those vertical cases do better with blower style coolers, so it makes sense the temps were higher.


Yea, I just saw the FT02, had to google it; that's quite a weird setup. Mine is a Core X5 though, it's horizontal. Unfortunately I didn't try it air-cooled, but I could see it being very good. I can add 8 intake fans and have 7 exhaust fans, so it would have had some good flow. The FT02 is just a weird setup, as half the air would just blow under the motherboard tray.


----------



## Majentrix

Yeah, I actually got the Strix for the bent heatpipes and the cooling fins that run along the board. From what I've heard, straight horizontal heatpipes like those on the 390 Nitro don't work as well when stood up in an FT02; the working fluid inside the pipes pools at the bottom or something.
I always thought I got a dud card, but it turns out it was likely the case that was the problem.


----------



## Chaoz

I got the same card. Really happy with it. The only thing that bothers me is the blinking LED light and the red parts, as my PC has nothing else red (theme is blue, black and white). So I covered up the red and the LED light.


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> I got the same card. Really happy with it. The only thing that bothers me is the blinking LED light and the red parts, as my PC has nothing else red (theme is blue, black and white). So I covered up the red and the LED light.


Take it apart and paint the parts, it's easy. LEDs can be changed too, just use a different color. Soldering is fun lol.


----------



## Chaoz

Quote:


> Originally Posted by *bluej511*
> 
> Take it apart paint the parts its easy. LEDs can be changed too just do a dif color. Soldering is fun lol.


The LED itself is white. There is a red clear vinyl in front of it with the logo on it.
I just put some black vinyl wrap over it, and over the other red parts I put some carbon vinyl.
Don't wanna mess with my card too much while it's still under warranty.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Majentrix*
> 
> Moved cases from a vertical Silverstone FT02 to a more conventional NZXT S340. Temps on my 390 Strix immediately dropped from 94+ to around 80 under heavy load and the card no longer throttles. I can actually overclock the card now as well, I have it sitting at 1100. I'll have a go at getting it higher after work.
> Maybe I missed something when I installed the Strix in my FT02, but it seems to work at least anecdotally better in a "normal" case.
> 
> 
> 
> Edit: Here's a Fire Strike comparison. Same settings on everything. A 1000 point difference! An average performance increase of 11%!
> 
> http://www.3dmark.com/compare/fs/8066931/fs/8160466


Same case here and love it!

Had to beef up the stock exhaust fans for crossfire though...


----------



## Majentrix

Yeah, the stock cooling is pretty weak; imo the two fans the case comes with should be set as intake for positive pressure.
I've popped two spare Corsair SP140s in the front as intake, which is keeping my 390 nice and cool.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Same case here and love it!
> 
> Had to beef up the stock exhaust fans for crossfire though...


What is the red metallic strip in the case for?


----------



## Majentrix

Cable management, it's used in lieu of grommets. It works really well.


----------



## diggiddi

Quote:


> Originally Posted by *Majentrix*
> 
> Cable management, it's used in lieu of grommets. It works really well.


I see


----------



## christoph

the Sapphire website says otherwise; my video card is UEFI with the light off. I have GPU-Z open and it says UEFI is not enabled with the light on


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> the Sapphire websie says otherwise, my video card is UEFI off with the Light off, I have the gpu-z open and is says the UEFI is not enabled with the light on


Yea, you are correct, I just checked GPU-Z as well. I've always had the light off though, so my guess is light off = UEFI AND legacy, and light on is legacy only. Because for me light off worked for UEFI and BIOS OS installs, and with the light on my BIOS reverts back to UEFI+legacy boot right away.

Off topic (slightly): have you guys seen how sexy the water cooler is for the Radeon Duo? Shame it's a single 120mm though, I'd try to replace it haha. It seems to water-cool the VRMs on both sides as well, so sexy.


----------



## kizwan

Quote:


> Originally Posted by *bluej511*
> 
> Quote:
> 
> 
> 
> Originally Posted by *christoph*
> 
> the Sapphire websie says otherwise, my video card is UEFI off with the Light off, I have the gpu-z open and is says the UEFI is not enabled with the light on
> 
> 
> 
> Yea you are correct i just checked gpuz as well. Ive always had the light off though so my guess is light off = uefi AND legacy and light on is legacy only. Because for me light off worked for uefi and bios OS and light on my bios reverts back to uefi+legacy boot right away.
> 
> Off topic (slightly) have you guys seen how sexy the water cooler is for the Radeon duo? Shame its a single 120mm though, id try to replace it haha. Seems like it watercools the vrms on both sides as well, so sexy.
Click to expand...

Windows can boot in UEFI mode whether your GPU has a legacy or UEFI VBIOS. It seems this is true for Secure Boot too; I have been able to Secure Boot with either a legacy or UEFI VBIOS.

What do you mean by _"...light on my bios reverts back to uefi+legacy boot right away..."_? Can you boot with either switch position, or only with UEFI enabled (light off)? You can check whether Windows booted in UEFI mode, and whether it is in a Secure Boot state, in *msinfo*.


----------



## bluej511

Quote:


> Originally Posted by *kizwan*
> 
> Windows can boot in UEFI mode whether your gpu have legacy or UEFI VBIOS. It seems this is true for secure boot too. I have been able to secure boot with legacy or UEFI VBIOS.
> 
> What do you mean by _"...light on my bios reverts back to uefi+legacy boot right away..."_? Can you boot with either switch mode or only when UEFI enabled (light off)? You can check whether windows is boot in UEFI mode & whether it is in Secure Boot state in *msinfo*.


It boots fine with the light off, and UEFI is checked in GPU-Z. Since I installed my OS last night in UEFI only, with the light on it reverts back to legacy+UEFI. So light off is UEFI and legacy, and light on is legacy only. With the light on my BIOS switches back cuz it doesn't support legacy only.

Windows will boot, yes, but it won't be in UEFI; with the light off it will.


----------



## Stige

Running Firestrike for some epeen again; highest overall score for me, and highest graphics score as well, on this:
http://www.3dmark.com/3dm/11603634


----------



## Drag0g0

Edit, Wrong thread sorry.


----------



## kizwan

Quote:


> Originally Posted by *bluej511*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Windows can boot in UEFI mode whether your gpu have legacy or UEFI VBIOS. It seems this is true for secure boot too. I have been able to secure boot with legacy or UEFI VBIOS.
> 
> What do you mean by _"...light on my bios reverts back to uefi+legacy boot right away..."_? Can you boot with either switch mode or only when UEFI enabled (light off)? You can check whether windows is boot in UEFI mode & whether it is in Secure Boot state in *msinfo*.
> 
> 
> 
> It boots fine with the light off and uefi is checked in gpuz. Since i installed my os last night in uefi ONLY with the light on it reverts back to legacy+uefi. So light off is uefi and legacy and light on is legacy only. With the light on my bios switches back cuz it doesnt support legacy only.
> 
> Windows will boot yes but it wont be in uefi, with the light off it will.
Click to expand...

Please post screenshots of msinfo for both switch light off & light on.


----------



## bluej511

Quote:


> Originally Posted by *kizwan*
> 
> Please post screenshots of msinfo for both switch light off & light on.


I'll post one with the light off; if I boot with the light on, it pretty much reverts back to UEFI+legacy in the BIOS. Then it ends up giving me a black screen and it doesn't boot to the login.


----------



## christoph

Quote:


> Originally Posted by *kizwan*
> 
> Windows can boot in UEFI mode whether your gpu have legacy or UEFI VBIOS. It seems this is true for secure boot too. I have been able to secure boot with legacy or UEFI VBIOS.
> 
> What do you mean by _"...light on my bios reverts back to uefi+legacy boot right away..."_? Can you boot with either switch mode or only when UEFI enabled (light off)? You can check whether windows is boot in UEFI mode & whether it is in Secure Boot state in *msinfo*.


yeah, mine boots either way, UEFI enabled or not, and I've always thought that it was light off = legacy, light on = UEFI


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Running Firestrike for some epeen again, highest score for me all around and highest graphics score aswell on this
> http://www.3dmark.com/3dm/11603634


And that's on the Asus???

With water I assume?


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> And that's on the Asus???
> 
> With water I assume?


Yeah ASUS with the hybrid Alphacool GPX block (water only cools core, not VRM).
+200mV Voltage.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Yeah ASUS with the hybrid Alphacool GPX block (water only cools core, not VRM).
> +200mV Voltage.


Hmmm, heavy artifacts or a pretty clean run? Good to finally see one climb over 1200, but 200mV sure is a lot of juice.... what are temps like with that Alphacool block at that voltage?


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmmm, heavy artifacts or pretty clean run? Good to finally see one climb over 1200, but 200mv sure is a lot of juice.... what are temps like with that alpha block at that voltage?


Firestrike is so lightweight it doesn't go above 54-55C on the VRM, Core is like sub-40C always.

I think I saw one black flicker at the start of the first test and that's it so I think I could push it just a little bit further. Need to install my Fujipoly pads to hopefully get those VRM temps further down to get extra stability out of the VRM.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> Firestrike is so lightweight it doesn't go above 54-55C on the VRM, Core is like sub-40C always.
> 
> I think I saw one black flicker at the start of the first test and that's it so I think I could push it just a little bit further. Need to install my Fujipoly pads to hopefully get those VRM temps further down to get extra stability out of the VRM.


Awesome!

You tried the VR demo on the latest FS yet??

That thing runs at like 250 FPS on my card. It's crazy cool though, you can move around a little in the demo and look at 3DMark benchmark "exhibits"....

I notice it finds artifacts easier with the high framerate load than the more difficult fs benchmark itself does.


----------



## Stige

I've got this weird problem with my QNIX though when going past +175mV or so: I get these weird lines on my screen like you used to get on old CRT monitors if you set too high a refresh rate.
And it only happens at high voltages. The GPU is pushing too much power through the DVI or something, somehow. I need to get another DVI cable to try it out.
I first noticed this issue in R6 Siege.

And it only happens at 100Hz+; I can benchmark fine if I set my desktop to 60Hz, zero issues.


----------



## Agent Smith1984

I'm pretty happy with my undervolted summer clocks...

14,300 gfx score @ 1100/1750 with -25mV on the core and +50mV on the AUX.

It's 100% daily stable and stays around 77°C at 4K.... I can keep it down to around 72°C without the AUX voltage, but that extra RAM clock is giving me noticeable gains at 4K (in the 3-5 FPS range!!!), so I'm keeping it.

Funny though, cause at 1080p it's like a 1 FPS difference.... I think that's why so many people feel that overclocking the VRAM doesn't help... cause they are testing it at 1080p or 1440p.


----------



## bluej511

Mine was 1200/1650 with only +100mV and didn't artifact at all. My VRMs were mid 60s so not too bad; this was during Heaven though, Firestrike doesn't seem to get as hot. The Alphacool blocks do extremely well on core temps provided you don't use the AC thermal paste (which is absolute garbage); the block is just really restrictive so you get cooler temps.

I usually just test mine gaming anyways. My memory past 1650 is useless, it artifacts right away; the core was stable at 1200 with +100mV. If I spent more time on it I could probably get 1200 with +50mV without artifacts, I think.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Mine was 1200/1650 with only +100mV and didn't artifact at all. My VRMs were mid 60s so not too bad; this was during Heaven though, Firestrike doesn't seem to get as hot. The Alphacool blocks do extremely well on core temps provided you don't use the AC thermal paste (which is absolute garbage); the block is just really restrictive so you get cooler temps.
> 
> I usually just test mine gaming anyways. My memory past 1650 is useless, it artifacts right away; the core was stable at 1200 with +100mV. If I spent more time on it I could probably get 1200 with +50mV without artifacts, I think.


I can't go past 1625 on my VRAM without AUX voltage; I've seen it on several cards. The extra heat from increasing AUX voltage won't be worth it for people playing at 1080p, in my opinion.....

I can bench 1190 @ +60mV and 1200 on air at +83mV, so I could see 1200 on +50mV being very doable with good water. The main thing is keeping the VRM below 60°C; getting down to around 50°C seems to be icing on the cake....

Either way, I still don't think we will see any 1275-1300 benchable 300-series cards on water. It's just not stacking up that way so far, though that could have something to do with the lack of watercooling interest in such an old architecture.

Tons of samples and cards in people's hands now, but the number of people wanting to go water on this series has been small....


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I can't go past 1625 on my VRAM without AUX voltage. Seen it on several cards. The extra heat from increasing AUX voltage won't be worth it for people playing on 1080 in my opinion.....
> 
> I can bench 1190 @ 60mv and 1200 on air at 83mv so I could see 1200 on 50mv being very doable with good water. Main thing is keeping the VRM below 60C and getting down to around 50c seems to be icing on the cake....
> 
> Either way, I still don't think we will see any 1275-1300 benchable 300 cards on water. It's just not stacking up that way so far, though that could have something to do with the lack of watercooling interest on such an old architecture.
> 
> Tons of samples and cards in people's hands now, but the amount of people wanting to go water on this series has been small....


Makes sense. In my case though the memory is limited cuz it's Elpida (got stuck with it haha). For me heat isn't an issue; even in a 24°C room it hovers around 40°C, with the VRMs hovering around the 60s while gaming. I might just try it now to see if I can get to 1250.


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I can't go past 1625 on my VRAM without AUX voltage. Seen it on several cards. The extra heat from increasing AUX voltage won't be worth it for people playing on 1080 in my opinion.....
> 
> I can bench 1190 @ 60mv and 1200 on air at 83mv so I could see 1200 on 50mv being very doable with good water. Main thing is keeping the VRM below 60C and getting down to around 50c seems to be icing on the cake....
> 
> Either way, I still don't think we will see any 1275-1300 benchable 300 cards on water. It's just not stacking up that way so far, though that could have something to do with the lack of watercooling interest on such an old architecture.
> 
> Tons of samples and cards in people's hands now, but the amount of people wanting to go water on this series has been small....


I actually have -25mV on my AUX voltage; I have yet to see any benefit from doing anything to it at all, so I just lowered it. Haven't tried yet how far lower I could go.

I'm at 1650 on my card atm because it is using way tighter timings than anything past that. I need to mod my BIOS again at some point to put tighter timings up to like 1750 or something and see how it can do then.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> I actually have -25mV on my AUX Voltage, I have yet to see any benefit from doing anything to it all so I just lowered it. Haven't tried yet how far lower I could go.
> 
> I'm at 1650 on my card atm because it is using way tighter timings than anything past that, I need to mod my BIOS again at some point to put tighter tmings upto like 1750 or something and see how how it can do then.


FYI, if you go too low on AUX you will get a black screen, and you'll have to wipe the driver in safe mode, uninstall MSI Afterburner and all profiles, and start over.. lol

The memory needs the AUX voltage for some reason... not sure what it actually powers, but it directly affects memory clocks, and in no way shape or form impacts core clock capabilities... I have tested it on 4 different 390's with the same results.

I need to flash my card to the custom BIOS I have with 1250 straps all the way up, but I haven't gotten a flash drive to do it with yet.. just been lazy about it really.


----------



## bluej511

Ok so at 100mv and 50% power my limit is 1200/1650. After that it just artifacts, tested it using AC Syndicate, core temp was at 43°C and VRMs were both under 70°C.


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> FYI, you go too low on AUX and you will get black screen, have to wipe driver in safe mode, uninstall MSI and all profiles, and start over.. lol
> 
> The memory needs the AUX voltage for some reason... not sure what it actually powers, but it is directly affecting memory clocks, and in no way shape or form impacting core clock capabilities... I have tested it on 4 different 390's with same results.
> 
> I need to flash my card to the custom BIOS I have with 1250 straps all the way up but haven't gotten a flash drive to do it with yet.. just been lazy about it really.


I have yet to find the bottom end, lowered it to -31mV atm.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> I have yet to find the bottom end, lowered it to -31mV atm.


Good news is, it DIRECTLY impacts core and VRM temps, so the lower you can go the better!!

The other good part is, you'll know for sure when you find the bottom, cause you'll be wiping everything to fix it! lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Ok so at 100mv and 50% power my limit is 1200/1650. After that it just artifacts, tested it using AC Syndicate, core temp was at 43°C and VRMs were both under 70°C.


Yeah, unless you can get the VRM to around 60°C, adding anything over +100mV won't help much, even with the core low. That's the only reason the MSI card does even remotely well for overclocking, in my opinion.... even when the core hits the high 70s at 4K (low 70s, even high 60s at 1080p), I still only hit 65°C or so on the VRM. MSI really did well cooling the VRM on these cards with the air cooler, but I think this crappy TIM is getting swapped soon. I'm just waiting to get my Fuji pads.

I am thinking I need 1.5mm.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, unless you can get the VRM around 60c, adding anything over 100mv won't help much, even with the core low. That's the only reason why the MSI card does even remotely well for overclocking in my opinion.... even the core hits high 70's at 4k (low 70's, even high 60's in 1080P) I still only hit 65~ or so on the VRM. MSI really did good cooling the VRM on these cards with the air cooler, but I think this crappy TIM is getting swapped soon. I'm just waiting to get my fuji pads.
> 
> I am thinking I need 1.5mm.


The Alphacool pads are only 3 W/mK I believe, so replacing the VRM ones with 8+ W/mK should see a good difference. I'll order 'em eventually with some Kryonaut TIM and do it that way.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> The alphacool pads are only 3w/mk i believe so replacing the vrm ones with 8+ should see a good difference. Ill order em eventually with some Kryonaut TIM and do it that way.


The Fujipolys I'm ordering are 17 W/mK.... that should get me some really good results.
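Rough math on why a pad swap like this should help: conduction resistance through a pad is R = t / (k·A), so at the same thickness and contact area, resistance scales with 1/k. The thickness and contact area below are made-up illustrative values, not measurements of any actual card.

```python
def pad_resistance(thickness_m: float, k_w_per_mk: float, area_m2: float) -> float:
    """Conduction resistance of a thermal pad in K/W: R = t / (k * A)."""
    return thickness_m / (k_w_per_mk * area_m2)

# Hypothetical 1.5 mm pad over a 1 cm^2 VRM contact patch
t, area = 1.5e-3, 1e-4
stock = pad_resistance(t, 3.0, area)    # ~3 W/mK stock-style pad
fuji = pad_resistance(t, 17.0, area)    # 17 W/mK Fujipoly-class pad
print(f"stock: {stock:.1f} K/W, fuji: {fuji:.2f} K/W, ratio: {stock / fuji:.1f}x")
```

In practice contact resistance and pad compression matter too, so the real-world improvement will be smaller than the raw 17/3 ≈ 5.7x ratio suggests.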


----------



## Gdourado

I am reading through this thread and am really torn.
I currently have a sapphire 390x tri-x oc.
I was offered the chance to buy an identical card from a friend for a good price.
My current monitor is 1080p 75hz freesync.

Should I jump on the card and crossfire?
Or are there more issues and problems than performance boost by going crossfire together with freesync?

Cheers


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gdourado*
> 
> I am reading through this thread and am really torn.
> I currently have a sapphire 390x tri-x oc.
> I was offered the chance to buy an identical card from a friend for a good price.
> My current monitor is 1080p 75hz freesync.
> 
> Should I jump on the card and crossfire?
> Or are there more issues and problems than performance boost by going crossfire together with freesync?
> 
> Cheers


Dude, you are going to see way more benefit from having crossfire than not having it..... but what I would say is, you should be tearing up 75Hz pretty easily on one card already, so unless you are going to upgrade to 1440p 120-144Hz, or to 4K in the near future, you don't really need the second card.


----------



## Stige

Crossfire is useless.


----------



## Gdourado

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Dude, you are going to see way more benefits of having crossfire, than not having it..... but what I would say is.... you should be tearing up 75hz pretty easily on one card already, so unless you are going to upgrade to 1440 120-144hz, or to 4k in the near future, you don't really need the second card.


Monitor upgrade is next in line.
Got this AOC for the price while I'm waiting to see what takes off in gaming: 4K or 21:9.


----------



## bluej511

Quote:


> Originally Posted by *Gdourado*
> 
> Monitor upgrade is next in line.
> Got this aoc for the price while I am waiting to see what takes off in gaming. 4k or 21:9


Honestly I think it's 21:9. Unless they make an affordable GPU that can max out 4K, I just don't see it; pointless getting 4K to play on medium or low. Me personally, I can't play if it's below 40fps, hurts my eyes like crazy. 15/20 vision.


----------



## Transmaniacon

Quote:


> Originally Posted by *bluej511*
> 
> Honestly i think its 21:9. Unless they make an affordable gpu that can max out 4k i just domt see it. Pointless getting 4k to play on med or low. Me personally i cant play if its below 40fps hurts my eyes like crazy. 15/20 vision


Well the first substantial GPU improvement in a long while is coming soon. I think the HBM2 cards are poised to deliver a high quality 4K gaming experience, and personally that seems to be the way forward.


----------



## Transmaniacon

So I am leaning towards getting myself a FreeSync monitor; my current Asus VE247H is pretty lousy and only 60Hz. The ViewSonic XG2401 seems like the best choice right now for a 1080p panel. It seems to me the high-speed IPS panels need time to mature, as most people have to send them back due to defects. Figure this ViewSonic is a nice option for the time being until we get some better options in the future.


----------



## Agent Smith1984

I don't see what stands to be gained with the 21:9 aspect ratio moving forward.... I guess the wide screens are cool, and I have seen a few demo'd, but for me it's the better image quality that comes through increased pixel density that really moves things forward. The 21:9 1440p monitors don't offer anything in the way of PPI that any other 1440p panel doesn't already offer.

We are already seeing 390x and Fury Nano (sub $500 GPU's) give 50fps + @ 4k in several titles because there is ZERO need for AA at that resolution. If even the mid-high end Polaris offering is faster than these cards, then you will see close to 60FPS in several titles, and if they are priced in the upper $300 to lower $400 range then that to me is what will drive the industry.

The question is, where does VR stand in the midst of all this? I mean, if we are saying this thing is for real and going to stick around a while, then GPUs need to be able to render two images at a given resolution.
If that resolution is 1080p, then no problem, right? But if we start wanting 1440p and even 4K VR, then both NVIDIA and AMD are really, REALLY, going to have to step up their game for hi-res VR gaming.

I personally have NO interest in VR myself, as it just seems to tune people out even further than they already are from reality..... Right now, my family can sit with me and watch me play GTA V or whatever on my 55" 4k in my living room, but how does that make me look and feel when I get submerged into this other world? I dunno, guess it would be cool for younger folks on their own.

I am much more interested in crazy stuff like 5K and 8K, and these insanely dense resolutions that are taking pixel density beyond a point that we can even see.

I ran a 4K fireplace at Christmas time and the image looked so real that it blew my grandparents' minds, lol


----------



## bluej511

Well the 21:9 that's 1440 is actually 3440x1440, so it's closer to 4K than to 2K to be honest. So more real estate than a 2560x1440, with more resolution. I'd love to get a 34in WQHD, but honestly I sit way too close for anything bigger than a 29.


----------



## dagget3450

21:9 in my opinion is way too thin across the vertical range. Makes it feel like I am this guy:


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> 21:9 in my opinion is way too thin across the vertical range. Makes it feel like I am this guy:


Haha, that seems to be the issue with most people. A 29" 21:9 is about the vertical height of a 23-24" from my understanding, which is what I'm on now, and it's not an issue vertically. I'm also only 2' from my screen though, so a 34" ultrawide would be WAY too much for me. It really depends on where you're sitting, I guess.
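That vertical-height comparison checks out with basic geometry: a panel's height is its diagonal times aspect-height over the aspect diagonal. A quick sketch using the 29" 21:9 and 24" 16:9 sizes mentioned above:

```python
import math

def panel_height_in(diagonal_in: float, aspect_w: int, aspect_h: int) -> float:
    """Physical height in inches, from diagonal size and aspect ratio."""
    return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

# A 29" 21:9 is within about half an inch of a 24" 16:9 in height.
print(round(panel_height_in(29, 21, 9), 2))  # ~11.42"
print(round(panel_height_in(24, 16, 9), 2))  # ~11.77"
```

So the 29" ultrawide is actually slightly *shorter* than a 24" 16:9, which is why it doesn't feel cramped vertically if a 24" already works for you.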

Off topic: I just had to delete a folder in my appdata under Intel, I believe a bitmining file of some sort, msscv or whatever it's called. Was playing GTA V and wondering why my GPU was using 5.3GB of RAM when it's usually 3GB. Quit, opened Task Manager and found it. I do believe I've had this before, and once I deleted it, it was gone for good. Shocking though that it happened on a 2-day-old fresh install.


----------



## Majentrix

You guys were right, the Strix really is a terrible overclocker. With +100mV and +50% power limit I get artifacting at 1175MHz and long-term instability at 1150MHz. I've settled at 1125MHz and I'll try playing around with voltages to keep temps low.


----------



## bluej511

Quote:


> Originally Posted by *Majentrix*
> 
> You guys were right, the Strix really is a terrible overclocker. With +100mV and +50% power limit I get artifacting at 1175MHz and long-term instability at 1150MHz. I've settled at 1125MHz and I'll try playing around with voltages to keep temps low.


I wouldn't say terrible; my Nitro only gets to 1200/1650, after that it artifacts. I tried 1250 and got artifacts in Syndicate right at the menu, 1225 got them in game, 1200 is stable. I could try +200mV using TriXX, but it's pointless until I get my new thermal pads. Core temps will never be an issue with 2 rads.


----------



## jdorje

Most cards won't go to 1150 at +100 mV...

Anyway I just noticed there's a 16.4.1 driver hotfix release. Should I get it?

http://support.amd.com/en-us/download/desktop?os=Windows%2010%20-%2064


----------



## Stige

Quote:


> Originally Posted by *Majentrix*
> 
> You guys were right, the Strix really is a terrible overclocker. With +100mv and +50% power limit I get artifacting at 1175MHz and long term instability at 1150MHz.. I've settled at 1125MHz and I'll try playing around with voltages to keep temps low.


The cooling is terrible, not the overclocking... You can't say any brand overclocks better than another.

Your VRM is probably hitting 100C at +100mV easily, considering even at stock it hits 90C+.

You need to be aware of your temps... You can probably do that 1125MHz with almost no increase in voltage, or very little.


----------



## r_aquarii

I currently have a Gigabyte 390, and I'm getting 2pcs of MSI 280X back from RMA soon.
Should I use a single 390 or 280X crossfire? Which config will have better performance
gaming at 1440p?


----------



## jdorje

280X crossfire will obviously perform better, when crossfire works. It doesn't get much from DX12 though, and doesn't support FreeSync.


----------



## Stige

And doesn't really work at all in games, more drawbacks than gains from Crossfire.

Also 390 is like 70-80% faster than a 280X so...


----------



## Agent Smith1984

280X in crossfire with decent overclocks will actually do really well... however, you will find that you have enough shader horsepower to run high settings at high resolution, but then you get stutters once you pop the 3GB cherry, so it kind of negates the benefits... I would run a single 390 personally.


----------



## OneB1t

Avoid crossfire if you can, that's my advice.

Go with as fast a single card as you can afford.


----------



## Transmaniacon

Got my Viewsonic XG2401 in yesterday, and while I could only spend a little while gaming, it is an enormous difference. GTAV was so smooth and I think I noticed the biggest difference in Warframe. It's like a whole new game! The Division was smoother but it's still a pretty taxing game, might lower some settings from Ultra so I can stay up in the 80s/90s more consistently.


----------



## r_aquarii

Thanks for the advice. Most likely I will sell off the 2 GPUs and upgrade my old Sandy Bridge to Skylake.


----------



## jdorje

Fixed my Fallout 4 performance. Disabling "power efficiency" in Amd settings helped a lot. But even more was updating to the 16.4.1 hotfix drivers.



FPS rarely drops below 60 now, on high+ settings. I'm using settings from https://www.reddit.com/r/3s8wa2/optimal_fallout_4_settings_for_r9_390_users/, but dropped AA to FXAA and shadows/decals/lighting all to high. I might bump them back up though now that the game is performing as it should.

CPU is overclocked at 4.7 with 2133 ram, though I don't know if that affects things. The game is capped at 72 FPS anyway, which is disappointing since I think I'd average higher than that without the cap. 390 is on stock 1225 mV but core, memory, and timings are all overclocked.

I'm not sure what causes the occasional FPS drop into the 40s and the noticeable spike there. Seems to happen during fast movement so I guess it could be a CPU thing. FPS almost never drops below my freesync minimum of 40 now so performance is quite good.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Fixed my Fallout 4 performance. Disabling "power efficiency" in Amd settings helped a lot. But even more was updating to the 16.4.1 hotfix drivers.
> 
> 
> 
> FPS rarely drops below 60 now, on high+ settings. I'm using settings from https://www.reddit.com/r/3s8wa2/optimal_fallout_4_settings_for_r9_390_users/, but dropped AA to FXAA and shadows/decals/lighting all to high. I might bump them back up though now that the game is performing as it should.
> 
> CPU is overclocked at 4.7 with 2133 ram, though I don't know if that affects things. The game is capped at 72 FPS anyway, which is disappointing since I think I'd average higher than that without the cap. 390 is on stock 1225 mV but core, memory, and timings are all overclocked.
> 
> I'm not sure what causes the occasional FPS drop into the 40s and the noticeable spike there. Seems to happen during fast movement so I guess it could be a CPU thing. FPS almost never drops below my freesync minimum of 40 now so performance is quite good.


I don't have Fallout 4, but after my fresh install I did install 16.3.2 and the drivers crashed on me once, so I went back to 16.3.1, which has been totally stable with no issues. Afterburner as of recently has been giving me a super hard freeze, so idk if it's because my HDD is mounted vertically or what, but it's annoying.


----------



## diggiddi

Quote:


> Originally Posted by *Stige*
> 
> And doesn't really work at all in games, more drawbacks than gains from Crossfire.
> 
> *Also 390 is like 70-80% faster than a 280X* so...


Whaat! I'm jus' sayin'...

Well, in my experience crossfire works fine, but it just depends on the games played and which games need it the most (looking at you, pCARS and Crysis 3).
Quote:


> Originally Posted by *bluej511*
> 
> I don't have Fallout 4, but after my fresh install I did install 16.3.2 and the drivers crashed on me once, so I went back to 16.3.1, which has been totally stable with no issues. Afterburner as of recently has been giving me a super hard freeze, so idk if it's because my HDD is mounted vertically or what, but it's annoying.


Afterburner has been acting up quite a bit recently; I dunno what's causing it, could be clashing with Crimson. I used to get the hard freezes but they've stopped.


----------



## m70b1jr

Hey guys, off topic, but could you guys help me out here..
http://www.overclock.net/t/1597468/weird-wifi-issue-with-my-galaxy-s6-edge-plus


----------



## battleaxe

Just sent off my 390X for RMA. Sucky. Better get it back... and not some turd clocker.


----------



## Stige

Quote:


> Originally Posted by *diggiddi*
> 
> Whaat! I'm jus' sayin' you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, in my experience crossfire works fine, but it just depends on the games played and which games need it the most (looking at you, pCARS and Crysis 3).
> Afterburner has been acting up quite a bit recently; I dunno what's causing it, could be clashing with Crimson. I used to get the hard freezes but they've stopped


I was thinking about the 7950 vs the 390, which is ~80%, herpderp.
280X vs 390 is about 50-60%, which is definitely worth it over Crossfire that works in 2 games and gives you an unplayable experience anyway.

Only thing CF is good for is benchmarks, and that's it. I had 2x HD7950 back in the day and it was just garbage.
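To put rough numbers on the single-card-vs-CF trade-off being argued here: the sketch below uses a made-up 40 FPS baseline and the scaling percentages quoted above, so it's purely illustrative, not a benchmark.

```python
# Hypothetical 280X baseline FPS in some title (illustrative only).
base_280x = 40.0

single_390 = base_280x * 1.55   # ~55% faster, per the 50-60% estimate above
cf_best    = base_280x * 1.80   # optimistic CF scaling when a profile works
cf_broken  = base_280x * 1.00   # no CF profile: the second card sits idle

print(f"single 390:           {single_390:.0f} FPS")
print(f"280X CF (working):    {cf_best:.0f} FPS")
print(f"280X CF (no profile): {cf_broken:.0f} FPS")
```

The point of the arithmetic: CF at its best only edges out the single 390 by a little, while CF at its worst falls all the way back to one card, which is why the consistent single-card experience keeps winning these arguments.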


----------



## diggiddi

Quote:


> Originally Posted by *Stige*
> 
> I was thinking about 7950 vs 390 which is ~80% herpderp.
> 280X vs 390 is about 50-60% which is definitely worth it over Crossfire that works in 2 games and gives you unplayable game experience anyway.
> 
> Only thing CF is good for is benchmarks and that's it. I had *2x HD7950 back in the day and it was just garbage*.


Ohh, that 'splains a lot. Well, I don't have any major issues with XFire 290Xs; maybe the crossfire bridge has something to do with it?


----------



## Stige

Quote:


> Originally Posted by *diggiddi*
> 
> Ohh, that 'splains a lot. Well, I don't have any major issues with XFire 290Xs; maybe the crossfire bridge has something to do with it?


I mainly meant the lack of proper CF support for the majority of games. And even when it does work, the gaming experience is far from smooth. I would always go single card in the future to avoid all the stuttering, plus the fact that CF only works in exclusive fullscreen, not borderless or windowed, made it nearly useless for me, as I prefer borderless in every case for faster alt-tabbing.

You might get more FPS with CF, but it still introduces more issues than I consider worth the trade-off to be able to play 1-2 games at higher FPS. I would rather use a faster single card and lower 1-2 settings to get to acceptable levels without stuttering etc.


----------



## Agent Smith1984

I have run three different crossfire setups and only had issues with micro stutter when the two cards were clocked differently. Any time I locked the cards to the same speeds, everything performed great.

Crysis 3 with (2) 390's at 4k was like


----------



## dagget3450

I'll be joining this club very soon; I am picking up some XFX 390Xs, on order now. Sadly I will be selling my Furies, and then I'll keep an eye on Polaris, but I am not holding my breath, as I have a feeling they will be mediocre. I am downgrading because VRAM at 4K is an issue if you're pushing settings. My most recent example was Dragon Age: Inquisition. It's aggravating when I get 150+ fps until switching to a cutscene and having massive hitches causing 5-sec freezes.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dagget3450*
> 
> Ill be joining this club very soon, i am picking up some xfx 390x on order now. Sadly i will be selling my furies and then ill keep an eye on polaris but i am not holding my breath as i have a feeling they will be mediocre. I am downgrading because of vram at 4k is an issue if your pushing settings. My most recent one was Dragon Age inq. Its aggravating when i get 150+fps until switching to a cutscene and having massive hitches causing 5 sec freezes.


I got rid of my Fury for the same reason buddy. Welcome to the club!!!


----------



## mus1mus

Quote:


> Originally Posted by *dagget3450*
> 
> Ill be joining this club very soon, i am picking up some xfx 390x on order now. Sadly i will be selling my furies and then ill keep an eye on polaris but i am not holding my breath as i have a feeling they will be mediocre. I am downgrading because of vram at 4k is an issue if your pushing settings. My most recent one was Dragon Age inq. Its aggravating when i get 150+fps until switching to a cutscene and having massive hitches causing 5 sec freezes.


Wow! Surprised mate!

You will be surprised too. 390Xs can get very close to what your Furies can do.


----------



## m70b1jr

Random question, but does anyone have any dead 290(X), 390(X) or Fury(X) GPU CHIPS laying around? Not the full PCB and cooler, just the core itself. I'd like to get a GPU core for my keychain, along with my AMD 965 BE.


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I got rid of my Fury for the same reason buddy. Welcome to the club!!!


Quote:


> Originally Posted by *mus1mus*
> 
> Wow! Surprised mate!
> 
> You will be surprised too. 390Xs can almost get very close to what your Furies can do.


Yeah, the Fury was more disappointing than anything. The overclocking and VRAM are the biggest reasons I decided to change.
I like to use VSR, and on Fury it does 4K VSR, which is great, but it's forcing 4K and not allowing 3K or lower, which conflicts with my setup and VRAM limit.

I ordered the ref 390X because they will work with my EK blocks. I need to order the Fujipoly or whatever pads for the VRMs. Does anyone know the size and the best-conducting ones? I also hope to be able to tighten VRAM timings and OC a little to try to get closer to Fury.

I am hoping I can coast until Vega 10, if Microsoft doesn't kill my gaming love


----------



## kizwan

Quote:


> Originally Posted by *dagget3450*
> 
> Ill be joining this club very soon, i am picking up some xfx 390x on order now. Sadly i will be selling my furies and then ill keep an eye on polaris but i am not holding my breath as i have a feeling they will be mediocre. I am downgrading because of vram at 4k is an issue if your pushing settings. My most recent one was Dragon Age inq. Its aggravating when i get 150+fps until switching to a cutscene and having massive hitches causing 5 sec freezes.


Congrats! You will regret it...erm, I mean you will not regret it.


----------



## dagget3450

Quote:


> Originally Posted by *kizwan*
> 
> Congrats! You will regret it...erm, I mean you will not regret it.


Probably both lol.

Anyone know if I should bother replacing the memory IC pads on the 390X? Nabbing some Fujipoly 17W/mK now for the VRMs; not sure about the VRAM modules.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> Probably both lol.
> 
> Anyone know if i should bother replacing memory ic pads on 390x? Nabbing some fujipoly 17w/mk now for vrm. not sure about vrams modules.


VRAM is fine; they don't get nearly as hot as the VRMs do. They only push 1.050v compared to whatever the VRMs push. Voltage x amps = wattage, and the VRMs push a LOT of amps.

Idk where you bought those Fujis, but I can't get the 17W ones for cheaper than like 50€ here, it's crazy. I'll settle on the 11W.
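The P = V x I point is the whole story here. A tiny sketch with made-up example figures (the voltages and currents below are illustrative round numbers, not measured values for any specific card):

```python
def power_w(volts: float, amps: float) -> float:
    """Electrical power: P = V * I."""
    return volts * amps

# Illustrative only: a memory rail at modest current vs a core VRM
# feeding a big GPU at low voltage but very high current.
vram_rail_w = power_w(1.5, 20)    # ~30 W, spread across many memory ICs
core_vrm_w  = power_w(1.2, 250)   # ~300 W, through a handful of VRM phases

print(vram_rail_w, core_vrm_w)
```

Even with both rails at similar voltages, the order-of-magnitude difference in current is why the core VRMs need serious pads and the memory ICs mostly don't.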


----------



## Agent Smith1984

To me, 2 390Xs in crossfire will have a longer lifespan than Fury just because of the VRAM limitations. 4K needs a minimum of 6GB now; I see games break 4GB all the time, including GTA V and now Hitman...


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To me, 2 390x in crossfire will have a longer lifespan than fury just because of the vram limitations. 4k needs a minimum of 6gb now, i see games break 4gb all time including gta v and now Hitman...


Yeah, I've only found a handful of times my Fury's VRAM caps out; that said, it's not pretty when it does.

I knew about GTA 5; DAI is also one I knew about. I haven't bought any new games for the last 6 months. I get free Hitman copies though, so I'll have that one. My Fury X quadfire has been 50/50 love and hate.
My 290X quadfire was the same way for the first few months, but then it really shined for a long time. Hopefully I will recoup most of my Fury money and easily cover the 390X quadfire. The physical presentation of the Fury X is phenomenal; I do love the LEDs, especially the load LEDs. I had a couple of 390Xs before I bought the Fury Xs and returned them for Furies. Wish I would have talked myself out of that one. That said, I did luck out and win some money with them in an overclocking comp. Never won anything in my life before - I'll take what I can get


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> VRAM is fine; they don't get nearly as hot as the VRMs do. They only push 1.050v compared to whatever the VRMs push. Voltage x amps = wattage, and the VRMs push a LOT of amps.
> 
> Idk where you bought those Fujis, but I can't get the 17W ones for cheaper than like 50€ here, it's crazy. I'll settle on the 11W.


FYI, VRAM voltage is not VDDCI. It's higher, and it's not going to be sitting at 1.050V.

VDDCI is for the memory interface, or controller.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> FYI, VRAM Voltage is not VDDCI. And is higher and not going to sitting at 1.050V.
> 
> VDDCI is for the memory interface or controller.


Going by the memory voltage in HWiNFO64. Makes sense btw: DDR3 is 1.5V, DDR4 is 1.2V, DDR5 1.05V; it's not going to be higher than the DDR4 voltage.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Going by memory voltage in info64. Makes sense btw ddr3 1.5v ddr4 1.2v ddr5 1.05 its not going to be higher then ddr4 voltage.


All the GDDR5 on graphics cards usually runs at 1.5v or so...

DDR and GDDR are totally different power usages.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> All the GDDR5 on graphics cards usually runs at 1.5v or so...
> 
> DDR and GDDR are totally different power usages.


You sure about that? HWiNFO64 reports MVDDC, GPU Memory Voltage, as 1.050V. It monitors my core voltage and VRM temps and all the wattages correctly, so I have no reason to doubt it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> You sure about that? hwinfo64 reports MVDDC, GPU Memory Voltage as 1.050v. It monitors my core voltage and vrm temps and all the wattages correctly so i have no reason to doubt it.


Yes, prior to the Hawaii cards you could control memory voltage, and all 7800/270 and 7900/280 memory ran a stock 1.5V that could be adjusted up to 1.7V "safely", though some chips suffered from the voltage increases. My 280X loved 1.55V at 1800MHz.

1.6V made it artifact...


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yes, prior to the Hawaii cards you could control memory voltage, and all 7800/270 and 7900/280 memory ran a stock 1.5V that could be adjusted up to 1.7V "safely", though some chips suffered from the voltage increases. My 280X loved 1.55V at 1800MHz.
> 
> 1.6V made it artifact...


Nice, so it's inaccurate reporting, just like what the VRM temp sensors report. Those at least had direct contacts to measure with a voltmeter.


----------



## mus1mus

Looking up data sheets will make it clearer.

http://html.alldatasheet.com/html-pdf/458072/ELPIDA/EDW2032BBBG/220/1/EDW2032BBBG.html


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Looking up data sheets will make it clearer.
> 
> http://html.alldatasheet.com/html-pdf/458072/ELPIDA/EDW2032BBBG/220/1/EDW2032BBBG.html


----------



## tolis626

Man, how cool would it be if we could push memory voltage a bit on these cards? Kinda disappointed that there isn't a chip for it (or that someone didn't magically figure it out in the BIOS modding thread). I suppose all these 1750MHz cards are 1.5V ICs. 1.6V, so not a crazy increase, would yield some pretty interesting results, especially at higher resolutions. But, things are what they are and we aren't lacking in memory bandwidth anyway, but still... Damn.

And then there's my card that craps its pants at anything over 1650MHz. God damn it.


----------



## dagget3450

Quote:


> Originally Posted by *tolis626*
> 
> Man, how cool would it be if we could push memory voltage a bit on these cards? Kinda disappointed that there isn't a chip for it (or that someone didn't magically figure it out in the BIOS modding thread). I suppose all these 1750MHz cards are 1.5V ICs. 1.6V, so not a crazy increase, would yield some pretty interesting results, especially at higher resolutions. But, things are what they are and we aren't lacking in memory bandwidth anyway, but still... Damn.
> 
> And then there's my card that craps its pants at anything over 1650MHz. God damn it.


Get it some diapers?


----------



## tolis626

Quote:


> Originally Posted by *dagget3450*
> 
> Get it some diapers?


I got it diapers made of 14W/mK Fujipolys. No luck. It's crapped all over my hopes and dreams of that extra 1% of performance. It also craps itself so hard it blacks out.


----------



## dagget3450

Quote:


> Originally Posted by *tolis626*
> 
> I got it diapers made of 14W/mK Fujipolys. No luck. It's crapped all over my hopes and dreams of that extra 1% of performance. It also craps itself so hard it blacks out.


haha, sorry i am not laughing at you but with you on that comment.


----------



## bluej511

Mine stays below 70C on the VRMs even with +100mV and 50% power. At the factory 1040/1500 they stay at around 60. Not sure if getting Fujis will do much. It artifacts at anything above 1200/1650, and I doubt it's a temp issue.


----------



## tolis626

Quote:


> Originally Posted by *dagget3450*
> 
> haha, sorry i am not laughing at you but with you on that comment.


Laugh at me all you want, I won't judge.

Like, seriously though, I can't understand how I ended up with such a dud of a card when it comes to memory overclocking. It's fine when overclocking the core, so I'm not losing out on performance, but you know, it just rubs me the wrong way seeing most people here getting 1700MHz+ (usually 1750MHz) when pushing AUX voltage a bit. It's not silicon lottery stuff, it's worse.


----------



## mus1mus

Is it time? Or willingness?
Quote:


> Originally Posted by *tolis626*
> 
> Man, how cool would it be if we could push memory voltage a bit on these cards? Kinda disappointed that there isn't a chip for it (or that someone didn't magically figure it out in the BIOS modding thread). I suppose all these 1750MHz cards are 1.5V ICs. 1.6V, so not a crazy increase, would yield some pretty interesting results, especially at higher resolutions. But, things are what they are and we aren't lacking in memory bandwidth anyway, but still... Damn.
> 
> And then there's my card that craps its pants at anything over 1650MHz. God damn it.


They can be adjusted. You only need either a Lightning or a Matrix.

Or simply to solder some parts.

If you have Hynix, 1650 is a bit weird. 1750 can normally be had with the proper tools. That includes a block, tons of voltage, and a good air-conditioned room.


----------



## tolis626

Quote:


> Originally Posted by *mus1mus*
> 
> Is it time? Or willingness
> They can be adjusted. You only need either a Lightning, or a Matrix.
> 
> 
> 
> 
> 
> 
> 
> Or simply by soldering some parts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you have Hynix, 1650 is a bit weird. 1750, normally can be had with proper tools. That includes, a BLOCK, Tons of Voltage, and a good airconditioned room.


I'm willing, don't get me wrong. Just too broke to start a whole custom loop at this point. And if I do, it probably won't be with the 390X, sadly, but with something like a Vega or the like.

Yeah, it's Hynix, that's what's so strange. I've even pushed over 1.05V AUX (+50mV) and still no joy. I've also tried it at high core voltages and it's a no-go. 1750MHz just doesn't work. It can work at up to 1700MHz, but (I think) only at higher voltages, and even then it's flaky. Like, the other day I played 4 hours of Witcher 3 at 1700MHz mem with no problems, and the next day, using the exact same settings, I had problems with it. And by problems I mean all sorts, except for it crashing and burning. No, it just crashed. I got a black screen at about 20 minutes of gameplay. I'm telling you man, this card hates me. It's playing with my mind.

Temps are fine, though. I ordered 1.5mm 11W/mK Fujipolys (best 1.5mm I could find on eBay; the other options are Amazon and Aquatuning, but their prices are ridiculously ridiculous) so that I can solve my contact issues on the core. I'll even get some Kryonaut at a later date, as I figured that, while CPUs with their IHS and low thermal output (relatively speaking) don't benefit too much from higher-conductivity pastes, GPUs may in fact see large improvements, as they're moving much more energy and it's a direct-to-die mount. I'm still debating (internally) whether I should get some LM TIM and hope it works, or if I should stick to the non-conductive stuff and play it safe. I just cringe at the thought of spilling it and frying the card, or it not making any contact and making temps go crazy, or even if it works, I'd cringe at having to lap my GPU heatsink to get it off. So it's probably a no. Or...?


----------



## mus1mus

Quote:


> Originally Posted by *tolis626*
> 
> I'm willing, don't get me wrong. Just too broke to start a whole custom loop at this point. And if I do, it probably won't be with the 390x, sadly, but with something like a Vega or like that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, it's Hynix, that's what's so strange. I've even pushed over 1.05V AUX (+50mV) and still no joy. I've also tried it at high core voltages and it's a no go. 1750MHz just doesn't work. It can work at up to 1700MHz, but (I think) only at higher voltages and even then it's flaky. Like, the other day I played 4 hours of Witcher 3 at 1700MHz mem no problems and the next day, using the exact same settings I had problems with it. And by problems I mean all sorts except for it crashing and burning. No, it just crashed. I got a black screen at about 20 minutes of gameplay. I'm telling you man, this card hates me. It's playing with my mind.


Take it this way, 1750 usually can be had at +200mV Core.


----------



## dagget3450

I will supposedly have the 390Xs by Saturday; I think I am just gonna order the Fujipoly VRM pads. I am not looking to go super extreme by any means, I just want the VRMs cooler than they were on my 290X cards. What's a reasonable clock speed to expect on a 390X with stock or a slight amount of extra voltage, give or take? 1150?


----------



## kizwan

Quote:


> Originally Posted by *bluej511*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> All the GDDR5 on graphics cards usually runs at 1.5v or so...
> 
> DDR and GDDR are totally different power usages.
> 
> 
> 
> You sure about that? hwinfo64 reports MVDDC, GPU Memory Voltage as 1.050v. It monitors my core voltage and vrm temps and all the wattages correctly so i have no reason to doubt it.

Wrong labeling. The "GPU Memory Voltage (MVDDC)" is actually the VRM2 output voltage, which is VDDCI or AUX voltage. It's supposedly the memory controller voltage. The memory voltage is fixed at 1.5V. You can check this with a DMM.

The authors of monitoring software usually work from little or no documentation at all. With the old cards it may have been MVDDC, but not anymore with the Hawaii cards; the labeling just got carried forward.


----------



## Majentrix

Has this happened to anyone else when clicking on imgur links? I'm assuming it's a driver error.


----------



## bluej511

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Has this happened to anyone else when clicking on imgur links? I'm assuming it's a driver error.


What in God's name is that? Looks like aliens trying to communicate. Never seen that.


----------



## mus1mus

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Has this happened to anyone else when clicking on imgur links? I'm assuming it's a driver error.


You have been hacked!

WALLPAPER WORTHY!


----------



## bluej511

So my EKWB Naked Ivy screws finally arrived after about a 2-week wait. The seller on eBay refunded me 3 days ago, as after 2 weeks it's deemed lost. They arrived today; stupid me didn't wait and ordered another set from EK, as I thought they were lost, so looks like I'll be getting a free set of Naked Ivy mounting screws. Might sell 'em or give 'em away, who knows.

Temps after delid went from 50°C to 45°C on the NH-T1; after a couple weeks it seems like the TIM pressed out, and temps went to around 53-54°C, tested this morning at 4.2GHz/1.088V. Now running it bare die with the same TIM (for consistency of testing), it peaks out at 43°C with an ambient a few degrees warmer; if we take that into account, the same test as before would probably show 40-41°C.

So Stige said it made no difference, but I dropped 4-5°C going from delidded to bare die, so yes, it does make a difference, AND this is at quite a low voltage; it would probably be even more so at higher voltages. Cores 1-3 have identical temps, with core 4 running at 39°C instead of 42-43°C. Installation was super easy, as all I had to do was replace 4 screws and remove the IHS bracket; star key included. Pics to follow.


----------



## Stige

If you were to use CLU in all cases it would barely make a difference.

But such high temps at only 1.088V? That is pretty bad if you ask me.


----------



## dagget3450

So did I fall for the hype? I bought Grizzly thermal paste and Fujipoly pads for the GPUs. Will I get -30C cooling and a 700MHz OC? Hehe, but really though, is this stuff worth the money?


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> I will supposedly have 390x by saturday, I think i am just gonna order vrm fujipoly pads. I am not looking to go super extreme by any means i just want vrms cooler than they were on my 290x cards. Whats a reasonable clock speed to expect on 390x with stock or slight amount of voltage give or take. 1150?


I have bought three XFX 390Xs now, and all three have done over 1220MHz in Fire Strike. One of them kicked the bucket and is out on RMA (the best one, which did 1270MHz). So, IDK. But if I end up with another one, I'm really hoping that XFX is somehow binning these and most are doing thereabouts of 1220MHz like the others I have had. Wishing I had not returned the one now, but there's no way I could have known.


----------



## bluej511

Quote:


> Originally Posted by *Stige*
> 
> If you were to use CLU in all cases it would barely make a difference.
> 
> But that high temps at only 1.088v? That is pretty bad if you ask me.


Here we go again, haha. I've also explained that the Alphacool block is very restrictive, and in my loop order it causes the water to stay in the CPU block a bit longer than if I were to run a straight CPU-only loop. I can probably bump it to 1.155 and the temps would prob only increase a degree or so.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> So did i fall for the hype i bout grizzly thermal paste and fujipoly pads for gpus? Will i get -30c cooling aand 700mhz oc? Hehe but really though is this stuff worth the money?


Kryonaut is supposed to be the best TIM on the market; it comes very close to some liquid metals (no matter what Stige will tell you). Once my store has some back in stock I will be purchasing some.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Here we go again haha. Ive also explained that the alphacool block is very restrictive, and in my loop order it causes the water to stay in the cpu block a bit longer than if i was running a cpu-only loop. I can probably bump it to 1.155 and the temps would probably only increase a degree or so.


I wanna see the temps at 1.45V or higher. I have the same Alphacool GPX block myself.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Dear lord you are as dumb as you sound. Hence the smiley faces dumb dumb. No kidding its already delided. Good grief charlie brown.


Reading your previous posts, one can never know if you are serious, a failed troll, or just... dumb. Your ability to refuse to accept anything other than what you think is astonishing, even when you are provided with clear proof that you are wrong. What does that make you?


----------



## dagget3450

Quote:


> Originally Posted by *battleaxe*
> 
> I have bought three XFX 390X now and all three have done over 1220mhz in Fire Strike. One of them kicked the bucket and is out on RMA (the best one, which did 1270mhz). So, IDK. But if I end up with another one, I'm really hoping that XFX is somehow binning these and most are doing around 1220mhz like the others I have had. Wishing I had not returned the one now, but there's no way I could have known.


I hope i get similar results. Right now my biggest hurdle is making sure the cards fit the blocks, which, thanks to OCN user Scorpion who posted pics, is looking good.
I really like the looks of the xfx ref 390x; sadly i wont be using those coolers long. I would love to get a 100mhz oc game stable and maybe a bit on the ram as well. I am also curious how much 2xx/3xx crossfire performance has improved since i last tested. I know my Furies were fast, so i will expect a small letdown there. I seriously don't understand why AMD advertised 4k so hard with Fury given its small vram buffer.


----------



## Agent Smith1984

I'm almost positive that XFX is using old GPU's on their 390's.... OR the limitation on the other 390's is down to their boards/BIOS or something, and XFX's more reference-like 290 board is letting them fly. I just find it very doubtful that XFX is doing that much binning, and I have seen countless samples now that are breaking 1220, meanwhile no other cards will.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm almost positive that XFX is using old GPU's on their 390's.... OR the limitation on the other 390's is down to their boards/BIOS or something, and XFX's more reference-like 290 board is letting them fly. I just find it very doubtful that XFX is doing that much binning, and I have seen countless samples now that are breaking 1220, meanwhile no other cards will.


I wouldnt doubt that one bit, they were the first ones out with the r9 390/x. Even Best Buy had em in stock way before the release date. I thought the xfx pcbs were custom as well, or is that just the Black series?


----------



## danielwhitt

I can get my MSI R9 390 up to 1220 before it artifacts, and my temps are around 65 with 80 degree vrms. I'd need more than +100mv to get it any higher.


----------



## Agent Smith1984

Quote:


> Originally Posted by *danielwhitt*
> 
> I can get my MSI R9 390 up to 1220 before it artifacts, and my temps are around 65 with 80 degree vrms. I'd need more than +100mv to get it any higher.


I'd love to see some benches at that clock speed.

I've had my hands on three different MSI samples and none will bench higher than 1210 without the worst artifacts you have ever seen.


----------



## bluej511

Is there any way to get 200mv using afterburner, or is that trixx only?


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Is there any way to get 200mv using afterburner, or is that trixx only?


You can edit the batch file on MSI to add whatever voltage you want, but you do it "blindly" and if you move the slider it screws it up.

Trixx will give you 200mv but I have not seen any good come from it on most of these cards. Some get additional core scaling though, and I just don't understand it. Even on water people just don't seem to be able to push anything other than XFX cards past 1200.
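For anyone hunting for that edit: the commonly described tweak on the Afterburner side is the "unofficial overclocking" switch in MSIAfterburner.cfg (the per-card voltage-offset keys live in the profile files and vary, so this is only a sketch; key names here are from memory for Afterburner 4.x and should be verified against your install):

```ini
; MSIAfterburner.cfg -- sketch of the commonly described edit.
; Key names are assumptions for AB 4.x; verify before use.
[ATIADLHAL]
; Accepting the EULA line enables "unofficial overclocking mode",
; which lifts the official slider limits:
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

As with any blind voltage edit, move in small steps and watch VRM temps.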


----------



## danielwhitt

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd love to see some benches at that clock speed.
> 
> I've had my hands on three different MSI samples and none will bench higher than 1210 without the worst artifacts you have ever seen.


I'll try and get some done later when I get home. Both cards are water-cooled though. Is the only way to get higher voltage through BIOS mods?


----------



## n64ADL

after hearing the new news about the polaris gpus, i might just get another r9 390 for now; skip this gen and wait for HBM2. does anyone advise that at this point? or i could sell my 390 and get 2 480x, which would work well with a 750 watt power supply, but a single 390 isn't enough for 3440 x 1440.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You can edit the batch file on MSI to add whatever voltage you want, but you do it "blindly" and if you move the slider it screws it up.
> 
> Trixx will give you 200mv but I have not seen any good come from it on most of these cards. Some get additional core scaling though, and I just don't understand it. Even on water people just don't seem to be able to push anything other than XFX cards past 1200.


Seems to be core and memory related not much heat issues. Mine runs cool but anything past 1200/1650 it artifacts right away. My firestrike graphics score was 15,640 so can't complain. Ive seen people get lower scores with higher OCs.


----------



## Agent Smith1984

Quote:


> Originally Posted by *n64ADL*
> 
> after hearing the new news about the polaris gpus, i might just get another r9 390 for now; skip this gen and wait for HBM2. does anyone advise that at this point? or i could sell my 390 and get 2 480x, which would work well with a 750 watt power supply, but a single 390 isn't enough for 3440 x 1440.


Looks like the 480x will have between 2560 and 2816 shaders and use a 256-bit GDDR5X bus (which will be the equivalent of GDDR5 at 512-bit).

The only real benefit is that it's going to run at 100w instead of 250+ watts. From a performance standpoint I don't think we are looking at much going from a 390 to a 480, though the power drop is nice.

The question will be: is this new GCN architecture faster per clock than the current one? Because if it is, then even with only 2560 shaders we could see cards that are faster than the current 390x. It looks like the clock ceilings are still going to be in the 1050 range so far, but who knows what overclocking headroom will look like. My guess is... not that great.
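To show where that "256-bit GDDR5X equals 512-bit GDDR5" equivalence comes from, here's the bandwidth arithmetic; the per-pin data rates are assumptions for illustration (6 Gbps GDDR5 as on the 390, roughly 12 Gbps for GDDR5X):

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gb/s) / 8.
# Data rates here are assumptions for illustration: 6 Gbps GDDR5 (390/390X),
# ~12 Gbps for early GDDR5X.

def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

r9_390_gddr5 = bandwidth_gbs(512, 6.0)    # 384.0 GB/s, the 390's spec number
gddr5x_256bit = bandwidth_gbs(256, 12.0)  # 384.0 GB/s on half the bus width

print(r9_390_gddr5, gddr5x_256bit)
```

So at double the per-pin rate, a 256-bit GDDR5X bus matches a 512-bit GDDR5 bus on paper.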


----------



## danielwhitt

heres a quick one on Valley with the settings most benchmarking charts use, at 1220/1600


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks like the 480x will have between 2560 and 2816 shaders and use a 256-bit GDDR5X bus (which will be the equivalent of GDDR5 at 512-bit).
> 
> The only real benefit is that it's going to run at 100w instead of 250+ watts. From a performance standpoint I don't think we are looking at much going from a 390 to a 480, though the power drop is nice.
> 
> The question will be: is this new GCN architecture faster per clock than the current one? Because if it is, then even with only 2560 shaders we could see cards that are faster than the current 390x. It looks like the clock ceilings are still going to be in the 1050 range so far, but who knows what overclocking headroom will look like. My guess is... not that great.


If you use vsync or frtc the r9 390 doesnt use that much wattage anyways. I think running dirt rally my card is around 150w or so if not less. Thats fully maxed out too.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> If you use vsync or frtc the r9 390 doesnt use that much wattage anyways. I think running dirt rally my card is around 150w or so if not less. Thats fully maxed out too.


Yeah, I run Dirt Rally at 4k with everything completely maxed..... LOVE THAT GAME!!!


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I run Dirt Rally at 4k with everything completely maxed..... LOVE THAT GAME!!!


I didnt even know there was an update i loaded up Steam a week ago and bam 17gb update, and boy did it take a while on 13mbps. I miss my 100mbps in the US.

If ever there was an optimized game Dirt Rally is it, i don't think ive seen it hiccup once at all.


----------



## danielwhitt

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd love to see some benches at that clock speed.
> 
> I've had my hands on three different MSI samples and none will bench higher than 1210 without the worst artifacts you have ever seen.


here is a close up of the corner of the main image i posted


----------



## Agent Smith1984

Not familiar with Valley scores too much; you got a maxed out Heaven or a Firestrike run we can peek at?

Thanks


----------



## bluej511

Id love to see it compared to my 1200/1650, just out of curiosity whether that extra core and mem makes any difference.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Id love to see it compared to my 1200/1650, just out of curiosity whether that extra core and mem makes any difference.


Did you say you put up 15,600 GPU score in FS? Is that with tess on???


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Did you say you put up 15,600 GPU score in FS? Is that with tess on???


With tess off for the competition. I think i was at 14 something with it on.


----------



## Stige

15056 GPU score without any tess tweaks: http://www.3dmark.com/3dm/11603634?


----------



## Agent Smith1984

http://www.3dmark.com/fs/7975613

14,800+ no tess mods and 1200MHz core, thought that was pretty damn impressive for 1200 core clock....


----------



## Agent Smith1984

Here is over 16,100 with tess off:
http://www.3dmark.com/fs/7984641


----------



## Stige

390X lol, mine is just a 390.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Stige*
> 
> 390X lol, mine is just a 390.


Yeah I saw that, your core is doing a lot; are you at 1250 though? You still haven't tried running 1750 memory yet huh?


----------



## Stige

I have been too lazy to mod my BIOS for tighter timings past 1650 yet. For some reason, for any BIOS mods to take effect for me, I need a complete driver reinstall or something... Very annoying.

Something to do with the QNIX driver patch or.. something...


----------



## danielwhitt

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not familiar with Valley scores too much; you got a maxed out Heaven or a Firestrike run we can peek at?
> 
> Thanks


heres a Heaven benchmark

1220/1600

memory wise though i cannot get above 1670 at all; it just black screens on me and needs a reset


----------



## Agent Smith1984

Quote:


> Originally Posted by *danielwhitt*
> 
> heres a Heaven benchmark
> 
> 1220/1600
> 
> memory wise though i cannot get above 1670 at all; it just black screens on me and needs a reset


Try running AUX voltage at 50mv+ and you should be able to get the memory to 1750.

That's a nice sample if that 1220 clock is game stable...


----------



## danielwhitt

can that be done through afterburner?

also i can play Tomb Raider at those clock speeds and havent had any issues with Witcher either.

also, on firestrike im getting around 18500 to 19000 on clocks up to 1200. any more doesnt make a fat lot of difference


----------



## Agent Smith1984

Quote:


> Originally Posted by *danielwhitt*
> 
> can that be done through afterburner?
> 
> also i can play Tomb Raider at those clock speeds and havent had any issues with Witcher either.
> 
> also, on firestrike im getting around 18500 to 19000 on clocks up to 1200. any more doesnt make a fat lot of difference


Yes, AUX can be raised in AB.

19000? Is that the two-card overall score? 2 cards should be 25000+ on the GPU score; 1 card should be in the high 14's / low 15's...


----------



## danielwhitt

really? it tends to be driver dependent for me.
what settings are they at for firestrike?

i have been quoting the combined score for firestrike

these are my daily clocks:

graphics score 26725 @ 1120/1600
physics score 10491 @ 1120/1600

overall score 18434
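For anyone comparing numbers: 3DMark's overall score is a weighted harmonic mean of the sub-scores, which is why it sits well below the graphics score. A sketch, assuming the often-quoted Fire Strike weights (0.75 graphics / 0.15 physics / 0.10 combined; verify against your 3DMark version) and a hypothetical combined score:

```python
# Fire Strike overall score as a weighted harmonic mean of sub-scores.
# Weights are assumptions (commonly quoted for Fire Strike); the 8400
# combined score is hypothetical, just to show the shape of the formula.

def fire_strike_overall(graphics: float, physics: float, combined: float) -> float:
    weights = (0.75, 0.15, 0.10)
    scores = (graphics, physics, combined)
    return sum(weights) / sum(w / s for w, s in zip(weights, scores))

# graphics and physics from the scores above, combined assumed:
overall = fire_strike_overall(26725, 10491, 8400)
print(round(overall))  # lands close to the posted 18434
```

This is why a big graphics overclock barely moves the overall number: the physics (CPU) and combined tests drag the weighted mean down.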


----------



## Agent Smith1984

Quote:


> Originally Posted by *danielwhitt*
> 
> really? it tends to be driver dependent for me.
> what settings are they at for firestrike?
> 
> i have been quoting the combined score for firestrike
> 
> these are my daily clocks:
> 
> graphics score 26725 @ 1120/1600
> physics score 10491 @ 1120/1600
> 
> overall score 18434


What are you running your 6600k at just out of curiosity?


----------



## danielwhitt

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are you running your 6600k at just out of curiosity?


It's at 4.7 GHz core and 4.5 cache @1.325 volts. It can go more but I'm not sure how much to push the volts on those CPUs yet


----------



## Agent Smith1984

Quote:


> Originally Posted by *danielwhitt*
> 
> It's at 4.7 GHz core and 4.5 cache @1.325 volts. It can go more but I'm not sure how much to push the volts on those CPUs yet


Nice clock for skylake.... I'm not sure what's safe for voltage on those either.

I am thinking real hard about doing a 6600k build myself


----------



## danielwhitt

It wasn't really a massive choice for me to make coming from an all-AMD background. If it was an Intel upgrade from an i5 4690k or something, I'm not sure how much you'd notice. For me the upgrade was huge. Never had a computer this fast or powerful before. Well worth the expense.

Regarding the voltage, a lot of people have been saying you can push them really high, 1.4 - 1.5, but I'm happy with my clock speeds and voltage without worrying whether it's gonna blow. Lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *danielwhitt*
> 
> It wasn't really a massive choice for me to make coming from an all-AMD background. If it was an Intel upgrade from an i5 4690k or something, I'm not sure how much you'd notice. For me the upgrade was huge. Never had a computer this fast or powerful before. Well worth the expense.


I'm on AMD right now.... really excited to see what Zen brings, but probably going to sell off my current stuff before Zen kills the value of it....

I can get about $300 for my CPU/board/RAM and add $120 for a nice 6600k combo deal that would give me a good bit more single thread performance, and it'd be something new to "play with" until I see what Zen does.


----------



## danielwhitt

I got my board with 8gb ddr4 for £175 and the CPU was £200 for a new OEM one, and that's when it spiralled into everything having to be new, and lots of money later. The only thing I need now is a new monitor, as mine's only 1080p, but I'd run out of cash. Lol. Need to save again.


----------



## Agent Smith1984

Quote:


> Originally Posted by *danielwhitt*
> 
> I got my board with 8gb ddr4 for £175 and the CPU was £200 for a new OEM one, and that's when it spiralled into everything having to be new, and lots of money later. The only thing I need now is a new monitor, as mine's only 1080p, but I'd run out of cash. Lol. Need to save again.


What kind of temps on the CPU do you see with the H100i?


----------



## bluej511

I honestly wouldnt push anything 4th generation and up more than 1.35v, to be honest. My limit would be 1.3. I have mine set to 1.155v for 4.5; ran the intel stress test for 15 mins, was stable. If i do 1.145 it blue screens running the test but boots no problem. All the programs show different voltages anyway (one says 1.152, the intel stress test says 1.144) so who knows; i havent measured with my DMM yet. I love my mobo, it has check points for voltages.


----------



## danielwhitt

Quote:


> Originally Posted by *bluej511*
> 
> I honestly wouldnt push anything 4th generation and up more than 1.35v, to be honest. My limit would be 1.3. I have mine set to 1.155v for 4.5; ran the intel stress test for 15 mins, was stable. If i do 1.145 it blue screens running the test but boots no problem. All the programs show different voltages anyway (one says 1.152, the intel stress test says 1.144) so who knows; i havent measured with my DMM yet. I love my mobo, it has check points for voltages.


My board has got those as well, it's a cracking bit of kit without being stupidly expensive.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> I honestly wouldnt push anything 4th generation and up more than 1.35v, to be honest. My limit would be 1.3. I have mine set to 1.155v for 4.5; ran the intel stress test for 15 mins, was stable. If i do 1.145 it blue screens running the test but boots no problem. All the programs show different voltages anyway (one says 1.152, the intel stress test says 1.144) so who knows; i havent measured with my DMM yet. I love my mobo, it has check points for voltages.


1.35V is like ***** territory.

Anything below 1.5V is plenty safe.


----------



## danielwhitt

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What kind of temps on the CPU do you see with the H100i?


On idle it's anything from 23 - 25 and tops out around 50 - 55 under load.


----------



## Agent Smith1984

I've got my eyes on this little setup right here:
http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.2466098


----------



## danielwhitt

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've got my eyes on this little setup right here:
> http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.2466098


Yeah, that would spank mine I think. Pretty similar apart from processor though.


----------



## Agent Smith1984

My only reluctance is that the 5820k costs marginally more than the 6700k and has two more HT cores.... kind of seems senseless to go 6700k with that in mind....


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My only reluctance is that the 5820k costs marginally more than the 6700k and has two more HT cores.... kind of seems senseless to go 6700k with that in mind....


The thinness of those skylake wafers puts me off, they look so brittle. The vise method to delid doesnt even work anymore, it just chips and bends the pcb. Intel made em cheaper and made em cost more haha.

I love my msi z97 gaming 5 board, but honestly if i could i would have loved to go x99 and watercool the mosfet/cpu combo as well. But for now this is plenty.

And no, anything above 1.35v might be stable but it shortens cpu life quite quickly.


----------



## bluej511

P.S. Pft who needs a 980 ti lol.

http://www.overclock.net/t/1597620/gamegpu-killer-instinct-dx12-gpu-test/0_20#post_25079894


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> The thinness of those skylake wafers puts me off, they look so brittle. The vise method to delid doesnt even work anymore, it just chips and bends the pcb. Intel made em cheaper and made em cost more haha.
> 
> I love my msi z97 gaming 5 board, but honestly if i could i would have loved to go x99 and watercool the mosfet/cpu combo as well. But for now this is plenty.
> 
> And no, anything above 1.35v might be stable but it shortens cpu life quite quickly.


Razor works fine, has worked every time and will.

If my 3570K lasted 2½ years without degradation at 1.52V and much more, no they won't. You are wrong.

Oh, and also, there is no reason to cool the mosfets if the motherboard is any decent at all; you just hinder the performance of the rest of the loop for no gains.


----------



## Agent Smith1984

Dude, in all the last 3 DX12 benches I've seen, the 390x competes closely with the 980ti.....

People were saying all along.... wait till DX12 comes out, it's going to be like mantle.... so on and so forth, and you want to believe it but know the thought of a 390X competing closely with a 980ti is just ridiculous.... but now DX12 starts popping up and you see it really happen right before your eyes and you're glad you got an AMD card....

Anybody who picked up used 290's for $150-200 in 2014 is sitting VERY pretty right now from a performance per dollar standpoint.

At one point I had (2) 290 tri-x cards in crossfire that I have a total of $380 in, and even then they were destroying games (prior to going to 4k), that same setup in DX12 is beastly, but I will say for sure that DX12, like mantle, uses a TON more VRAM than DX11, so the 4GB limitation could finally be a widespread problem and not a minimal problem like it is now.

On the flip side of that coin, if developers would do us all the favor of programming their games to use both frame buffers from each card then that issue is solved entirely.

Fury X owners can finally start throwing each other some high-fives after seeing how well it does in DX12.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Dude, in all the last 3 DX12 benches I've seen, the 390x competes closely with the 980ti.....
> 
> People were saying all along.... wait till DX12 comes out, it's going to be like mantle.... so on and so forth, and you want to believe it but know the thought of a 390X competing closely with a 980ti is just ridiculous.... but now DX12 starts popping up and you see it really happen right before your eyes and you're glad you got an AMD card....
> 
> Anybody who picked up used 290's for $150-200 in 2014 is sitting VERY pretty right now from a performance per dollar standpoint.
> 
> At one point I had (2) 290 tri-x cards in crossfire that I have a total of $380 in, and even then they were destroying games (prior to going to 4k), that same setup in DX12 is beastly, but I will say for sure that DX12, like mantle, uses a TON more VRAM than DX11, so the 4GB limitation could finally be a widespread problem and not a minimal problem like it is now.
> 
> On the flip side of that coin, if developers would do us all the favor of programming their games to use both frame buffers from each card then that issue is solved entirely.
> 
> Fury X owners can finally start throwing each other some high-fives after seeing how well it does in DX12.


Seems like it, yea. Some games still need a lot of work, but its a promising start. Now if they fix Gears of War and Hitman we should be good to go. Im waiting to see how Squad does in its final release; the latest reviews aren't too good, but its a game totally based on DX12.


----------



## Agent Smith1984

Doom is in alpha testing right now and it shows AMD wiping the floor with NVIDIA... not sure if the game is DX12 or not but it's shocking how well AMD cards are doing in it. I am REALLY looking forward to that game...


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Dude, in all the last 3 DX12 benches I've seen, the 390x competes closely with the 980ti.....
> 
> People were saying all along.... wait till DX12 comes out, it's going to be like mantle.... so on and so forth, and you want to believe it but know the thought of a 390X competing closely with a 980ti is just ridiculous.... but now DX12 starts popping up and you see it really happen right before your eyes and you're glad you got an AMD card....
> 
> Anybody who picked up used 290's for $150-200 in 2014 is sitting VERY pretty right now from a performance per dollar standpoint.
> 
> At one point I had (2) 290 tri-x cards in crossfire that I have a total of $380 in, and even then they were destroying games (prior to going to 4k), that same setup in DX12 is beastly, but I will say for sure that DX12, like mantle, uses a TON more VRAM than DX11, so the 4GB limitation could finally be a widespread problem and not a minimal problem like it is now.
> 
> On the flip side of that coin, if developers would do us all the favor of programming their games to use both frame buffers from each card then that issue is solved entirely.
> 
> Fury X owners can finally start throwing each other some high-fives after seeing how well it does in DX12.


Im giving my furyx the high five right onto fleabay. 4gb vram isnt enough for me, and 8gb is. I will be interested in what vram sizes polaris launches with. If they have more than 8gb at the middle to high end, i may go ahead and pick them up after reviews and whatnot. I dont mind coasting on the 390x's until then.


----------



## Mister300

i would go 5820K: more cache (15 MB) and more cores. Mine is a beast and cost less than 300 USD when our Houston Microcenter runs a sale.


----------



## n64ADL

I love my i7 5820k, don't regret it for a day. just wish i'd stuck with liquid instead of air cooling. i just want to stream my pc and leave it on while im gone, and aircooling sucks with this processor.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've got my eyes on this little setup right here:
> http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.2466098


That's a sweet deal IMO. Go for it.








Quote:


> Originally Posted by *bluej511*
> 
> I honestly wouldnt push anything 4th generation and up more than 1.35v, to be honest. My limit would be 1.3. I have mine set to 1.155v for 4.5; ran the intel stress test for 15 mins, was stable. If i do 1.145 it blue screens running the test but boots no problem. All the programs show different voltages anyway (one says 1.152, the intel stress test says 1.144) so who knows; i havent measured with my DMM yet. I love my mobo, it has check points for voltages.


With Skylake's FIVR no longer on die, people started pumping the voltages again. It's actually the FIVR that is finicky on Haswell and Broadwell, including Haswell-E.

I killed one (5930K) with 2.5 VCCIN. But that poor fella survived 2.1 VCore.
Quote:


> Originally Posted by *Stige*
> 
> 1.35V is like ***** territory.
> 
> Anything below 1.5V is plenty safe.


Anything below 1.4 for 24/7 and keeping the chip.

Quote:


> Originally Posted by *Stige*
> 
> Razor works fine, has worked everytime and will.
> 
> If my 3570K lasted 2½ years without degradation at 1.52V and much more, no they won't. You are wrong.
> 
> Oh, and also, there is no reason to cool the mosfets if the motherboard is any decent at all, just hinder the performance of the rest of the loop for no gains.


MOSFETs get warm. Their efficiency suffers with heat. They should be cooled.

If you mean on watercooling, then yeah, they don't really need blocks on them. But they look soo KEWL with blocks, so meh!


----------



## Geoclock

Hi guys, just tested my 390x with the Assassins Creed Syndicate game and it reached 94C.
Then i turned all case fans (3x240mm) up to 100% speed and it dropped down to 83c at full load.
Is that normal?
Any recommendations?
Thanks.


----------



## battleaxe

Quote:


> Originally Posted by *Geoclock*
> 
> Hi guys, just tested my 390x with the Assassins Creed Syndicate game and it reached 94C.
> Then i turned all case fans (3x240mm) up to 100% speed and it dropped down to 83c at full load.
> Is that normal?
> Any recommendations?
> Thanks.


Probably. Go water. Then see 40c


----------



## dagget3450

Silly question i forgot to ask: does the 3xx suffer like the 2xx did, where with over a 100mv oc the hdmi/dp outputs start blanking out during benching/gaming? I remember my 290x would blank out a lot when heavily overvolted, 150mv i think. Ive noticed something similar on Fury as well, but only on DP and at a lower voltage oc.


----------



## mus1mus

Card dependent.

The voltage level at which they blank out varies.


----------



## Stige

Quote:


> Originally Posted by *dagget3450*
> 
> Silly question i forgot to ask: does the 3xx suffer like the 2xx did, where with over a 100mv oc the hdmi/dp outputs start blanking out during benching/gaming? I remember my 290x would blank out a lot when heavily overvolted, 150mv i think. Ive noticed something similar on Fury as well, but only on DP and at a lower voltage oc.


I have a... similar issue with my 390, but it does not blank out.

When I start pushing it past 175mV my monitor goes crazy if above 60Hz; you get those lines you used to get on a CRT if you set too high a refresh rate.
I need to get another DVI cable to try it out and see if it still does that. My monitor is one of those korean overclocking monitors, a QNIX QX2710.

60Hz = fine
120Hz = goes nuts instantly in Firestrike if past 175mV or so.


----------



## mus1mus

I dont think Single Link DVI was even mentioned, except for 60Hz.

HDMI has several revisions too, so you're assuming, dear, I would add. So does DP.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> I dont think Single Link DVI was even mentioned. Except for 60Hz.
> 
> HDMI has several revisions too. So you're assuming dear, I would add. So is DP.


I hope you know that even HDMI 1.3 has far more bandwidth than dvi dual link, and that was released 10 YEARS ago.

Pretty sure you guys dont use SSDs on sata I or II, so why do the same to a monitor? DVI had no revisions; hdmi on the 390 is 1.4 and dp is 1.2. Even a cheap monitor made in the last DECADE would have both.
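The TMDS math bears this out; a sketch assuming the spec-maximum pixel clocks (165 MHz per DVI link, 340 MHz for HDMI 1.3):

```python
# Usable TMDS video data rate: 3 channels x 8 data bits per 10-bit symbol
# (8b/10b coding) per link, times the pixel clock.

def tmds_data_gbps(pixel_clock_mhz: float, links: int = 1) -> float:
    """Video data rate in Gb/s for a TMDS interface."""
    return pixel_clock_mhz * 1e6 * 3 * 8 * links / 1e9

dvi_dual_link = tmds_data_gbps(165, links=2)  # 7.92 Gb/s
hdmi_1_3 = tmds_data_gbps(340)                # 8.16 Gb/s

print(dvi_dual_link, hdmi_1_3)
```

So a 2006-era HDMI 1.3 port already edges out dual-link DVI on raw video bandwidth, and DP 1.2 is well beyond either.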


----------



## bluej511

Quote:


> Originally Posted by *Geoclock*
> 
> Hi guys, just tested my 390x with Assassins Creed Syndicate game and it reached 94C.
> Then i turn on all case fans (3x240mm) 100% speed and dropped down to 83c at full load.
> It's normal?
> Any recommendations?
> Thanks.


As long as it doesn't throttle, 94C is fine; its actually what they're rated to run at. You can try a custom fan curve in msi afterburner. Could be bad case airflow as well. Some coolers run hotter than others, depends on the card. Running 30% fan speed my Nitro could get up into the high 80s.
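A custom curve is just a set of (temperature, fan %) points that the software interpolates between. A minimal sketch; the points here are purely illustrative, not a recommendation for any particular card:

```python
# Piecewise-linear fan curve, like the one you draw in Afterburner.
# Points are (temp C, fan %); values are illustrative only.
CURVE = [(40, 20), (60, 35), (75, 55), (85, 75), (94, 100)]

def fan_percent(temp_c: float) -> float:
    """Interpolate fan speed for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return float(CURVE[-1][1])  # pin at the last point past 94C

print(fan_percent(80))  # 65.0
```

The trade-off is the usual one: a steeper curve keeps the core cooler at the cost of noise.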


----------



## patriotaki

So something is faulty in my pc: it makes gpus coil whine. My friend's r9 390 pcs+ doesnt coil whine on his PC (he has a worse psu than me), but now his 390 coil whines on my machine too.
Not as much as mine, but it still makes some noise..

whats wrong?


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> So something is faulty in my pc: it makes gpus coil whine. My friend's r9 390 pcs+ doesnt coil whine on his PC (he has a worse psu than me), but now his 390 coil whines on my machine too.
> Not as much as mine, but it still makes some noise..
> 
> whats wrong?


Educated guess: quite possibly the psu. Have you tried your card in his machine to see if it whines? That would rule out the possibility of a bad card.


----------



## patriotaki

Quote:


> Originally Posted by *bluej511*
> 
> Educated guess: quite possibly the psu. Have you tried your card in his machine to see if it whines? That would rule out the possibility of a bad card.


not yet.. but still, why does his 390 coil whine on mine? i got an xfx ts 750w gold


----------



## patriotaki

im so frustrated now...


----------



## dagget3450

And so the project begins...


Well, i cant block them just yet as i am waiting on pads/thermal paste. This will give me time to confirm/test each one though before blocking them all.


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> And so the project begins...
> 
> 
> Well, i cant block them just yet as i am waiting on pads/thermal paste. This will give me time to confirm/test each one though before blocking them all.


Nice. Nice. Nice....


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> And so the project begins...
> 
> 
> Well, i cant block them just yet as i am waiting on pads/thermal paste. This will give me time to confirm/test each one though before blocking them all.


Jealous lol. Dont even think my mobo can do quads.


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> Jealous lol. Dont even think my mobo can do quads.


Quad can be overrated for a lot of things, but I am going to try pushing crazy resolutions and need more VRAM. I will cry inside a bit if AMD releases a 16GB card in two months, but I am already building funds in case I get surprised. I've got tons of stuff to sell off by then.

These new ref design coolers seem rather decent. I haven't done a lot of testing, but they quickly reminded me of the noise levels, lol. They still seem slightly quieter than the 290X reference.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> Quad can be overrated for a lot of things, i am going to try pushing crazy resolutions and need more vram. I will cry inside some if AMD releases a 16gb card in two months but i am already building funds in case i get surprised. I got tons of stuff to sell off by then.
> 
> These new ref design coolers seem rather decent. haven't done a lot of testing but it quickly reminded me of the noise levels lol. it is still seemingly slightly quieter than 290x ref.


Still only has 8GB per card anyway, but it's pretty insane. Hopefully they bring games that can tie all the cards together; 32GB of VRAM sounds useful, although theoretically only about 20GB would be usable.


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> Still only has 8gb anyways but its pretty insane. Hopefully they bring games to tie all the cards up together, 32gb of ram is useful although theoretically only about 20gb would be useful.


As dumb as it sounds, I would love to get my hands on a 16GB or 32GB FirePro W9100 just to play with the VRAM buffer







- I'll put that right next to winning the lottery on my list.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> As dumb as it sounds, i would love to get my hands on a 16gb or 32gb firepro w9100 just to play with the vram buffer
> 
> 
> 
> 
> 
> 
> 
> - I'll put that right next to winning the lottery on my list.


Quadfire those. Drool.


----------



## Sycksyde

Quote:


> Originally Posted by *dagget3450*
> 
> And so the project begins...
> 
> 
> Well, I can't block them just yet as I am waiting on pads/thermal paste. This will give me time to confirm/test each one before blocking them all.


Awesome, talk about sheer brute force


----------



## vallonen

Quote:


> Originally Posted by *dagget3450*
> 
> And so the project begins...
> 
> 
> Well, I can't block them just yet as I am waiting on pads/thermal paste. This will give me time to confirm/test each one before blocking them all.


Stop the madness!


----------



## dagget3450

Quote:


> Originally Posted by *Sycksyde*
> 
> Awesome, talk about sheer brute force











Quote:


> Originally Posted by *vallonen*
> 
> Stop the madness!












So, silly question: what is a good/normal VRAM clock range for a 390X? I am able to push 1800 on the first card I am testing so far. That seems really good?
FireStrike, stock GPU, 1800 VRAM test:
http://www.3dmark.com/3dm/11680580?

Is this good? It seems stable in Valley also.


----------



## kizwan

Quote:


> Originally Posted by *dagget3450*
> 
> So silly question, what is a good/normal vram clock range for 390x? I am able to push 1800 on the first card i am testing so far. This seems really good?
> firestrike stock gpu 1800 vram test
> http://www.3dmark.com/3dm/11680580?
> 
> is this good? It seems stable in valley also


1750 seems like "piece of cake" on 390/390X.


----------



## dagget3450

Quote:


> Originally Posted by *kizwan*
> 
> 1750 seems like "piece of cake" on 390/390X.


So this is normal then. I'm just kinda surprised; I recall my 290Xs barely getting to 1700, and they had Hynix.


----------



## mus1mus

Interesting.

When you have the blocks on, let's do some little shootout.

I have been trying my best to break 60K graphics on four cards, but I have never been able to. Maybe you can.


----------



## patriotaki

Guys, can someone help me out please?
My friend's 390 does coil whine on my PC, but on his machine it does NOT. Also, my 390 coil whines a lot more than his when running in my machine.

I have xfx ts 750w gold
Any idea what causes this?


----------



## danielwhitt

What voltage have people run to get past 1750? At +50mV aux mine won't go past 1750, and I didn't really want to go up any higher.


----------



## yuannan

Quote:


> Originally Posted by *patriotaki*
> 
> Guys can someone help me out please.
> My friends 390 does coil whine on my PC and on his machine does NOT. Also my 390 coil whines a lot more than his when running in my machine.
> 
> I have xfx ts 750w gold
> Any idea what causes this?


Are you sure it's not just the case?

If not, that sounds like a faulty PSU or something in the power delivery, such as the motherboard.


----------



## mus1mus

Quote:


> Originally Posted by *patriotaki*
> 
> Guys can someone help me out please.
> My friends 390 does coil whine on my PC and on his machine does NOT. Also my 390 coil whines a lot more than his when running in my machine.
> 
> I have xfx ts 750w gold
> Any idea what causes this?


Try his PSU.


----------



## patriotaki

Am I that unlucky? I got that PSU 1-2 months ago. Everyone said it's a quality PSU :/ Pff, damn


----------



## dagget3450

Quote:


> Originally Posted by *mus1mus*
> 
> Interesting.
> 
> When you have the blocks on, let's do some little shootout.
> 
> I have been trying my best to break 60K graphics on fours but I have never been to. Maybe you can.


This build was going to be more for gaming than benching, but I will surely do some benching. I don't know that I'll push it that far, but I will try to give some samples to see where it stands.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> This build was going to be more for gaming than benching, but I will surely do some benching. I don't know that I'll push it that far, but I will try to give some samples to see where it stands.


You must have quite a few rads or just one huge one like one of those with 9 120mm fans lol.


----------



## patriotaki

Can someone help me? What should I do with my card?


----------



## Stige

Ignore it, like every card these days has coil whine.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> Ignore it, like every card these days has coil whine.


I have yet to have a card with loud coil whine. A few did have it, yes. But even crossfired with the last four 390Xs I have had and two 290Xs, even with my fans turned off, the hard drives are the loudest items in the case. A few times in benchmarks it has shown up. But overall, very quiet cards. I did have one 290 that squealed like a dying bat though. Seems some have bad luck with coil whine, others good luck. Or it has something more to do with the PSU or something. Coil whine is a strange animal.

Ignore my phone's autocorrect


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> can someone help me? what should i do with my card?


Try a different PSU, or try yours in your friend's system. If it goes away, it's your PSU; maybe too much 12V ripple is causing it. My 5770, 7850, and my 390 have never coil whined, nor have I ever heard coil whine in any of my systems. Try his and see what happens. The fact that his whines on yours means it's either your PSU or you're hearing case vibrations.


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> You must have quite a few rads or just one huge one like one of those with 9 120mm fans lol.


I have 3 large external rads: 2x Phobya Xtreme 400 and 1x MagiCool XTREME Hexa 720.


----------



## PunkX 1

Hey guys I just got the XFX R9 390 DD. The stock clocks are 1015/1500Mhz. Has anyone ever tried flashing a 390X bios to this card to unlock the extra shaders? Also have many here attempted to edit the bios of the 390's?


----------



## Stige

Quote:


> Originally Posted by *PunkX 1*
> 
> Hey guys I just got the XFX R9 390 DD. The stock clocks are 1015/1500Mhz. Has anyone ever tried flashing a 390X bios to this card to unlock the extra shaders? Also have many here attempted to edit the bios of the 390's?


1. No, you cannot unlock a 390 to a 390X. The 290 was a possibility, but the 390 is not(?). It's possible but very unlikely; check the link below.

2. Plenty of people, including myself; there is another thread for BIOS modding.

You can use this tool to confirm if your GPU can be unlocked just to be sure: http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool


----------



## PunkX 1

Quote:


> Originally Posted by *Stige*
> 
> 1. No, you cannot unlock a 390 to a 390X. The 290 was a possibility, but the 390 is not(?). It's possible but very unlikely; check the link below.
> 
> 2. Plenty of people, including myself; there is another thread for BIOS modding.
> 
> You can use this tool to confirm if your GPU can be unlocked just to be sure: http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool


The tool shows it as hardware locked. Seems like I got unlucky when it comes to overclocking, as I can only manage 1080/1700MHz on stock voltage and around 1110MHz on the core with +90mV in Afterburner. Will editing the BIOS help increase my threshold? The reason I'm asking is because I've noticed that the vdroop is quite bad: under load the voltage, even when set high, drops really low and continues to fluctuate. So ultimately the max voltage I get under load is around 1.21-1.23V. The ASIC quality of my card is 65.2%.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> The tool shows it as hardware locked. Seems like I got unlucky when it comes to overclocking as I can only manage 1080/1700Mhz on stock voltage and around 1110Mhz on the core with +90mv in afterburner. Will editing the bios help increase my threshold? The reason I'm asking is cause I've noticed that the vdroop is quite bad. Under load the voltage even when set high drops really low and continues to fluctuate. So ultimately the max voltage I get under load is around 1.21-1.23v. The ASIC quality of my card is 65.2%.


Yeah, my ASIC is 75% and maxed out I can get 1200/1650. Try +100mV and 50% power limit and see if it makes any difference. You should be able to get 1150 easily.


----------



## Stige

Quote:


> Originally Posted by *PunkX 1*
> 
> The tool shows it as hardware locked. Seems like I got unlucky when it comes to overclocking as I can only manage 1080/1700Mhz on stock voltage and around 1110Mhz on the core with +90mv in afterburner. Will editing the bios help increase my threshold? The reason I'm asking is cause I've noticed that the vdroop is quite bad. Under load the voltage even when set high drops really low and continues to fluctuate. So ultimately the max voltage I get under load is around 1.21-1.23v. The ASIC quality of my card is 65.2%.


Your VRM temps. That is what it is. Check your VRM temps. Asic quality doesn't mean anything.


----------



## PunkX 1

Both VRM temps never exceed 75C. They are mostly in the late 60s, with the core temps maxing out at around 78C even with the voltage slider maxed out.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> Yea my ASIC is 75% and maxed out i can get 1200/1650. Try 100mv and 50% power limit see if it makes any difference. You should be able to get 1150 easily.


With max voltage and power limit I can get 1130 but with white lines coming up once in a while in benchmarks. Are you using an XFX card?


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> With max voltage and power limit I can get 1130 but with white lines coming up once in a while in benchmarks. Are you using an XFX card?


Yeah, that seems pretty low, but a few members on here haven't been able to get high core clocks yet have gotten very high memory clocks. For me it's the opposite: anything past 1650 memory and nada, but my core is stable to 1200. 1100 is a pretty good jump from 1010, especially with that memory at 1750. Someone on here does BIOS editing, so it's worth a shot.


----------



## Stige

Quote:


> Originally Posted by *PunkX 1*
> 
> Both VRM temps never exceed 75C. They are mostly in the late 60s, with the core temps maxing out at around 78C even with the voltage slider maxed out.


The lower the VRM temp is, the less voltage you will need.


----------



## PunkX 1

Quote:


> Originally Posted by *Stige*
> 
> The lower the VRM temp is, the less voltage you will need.


Thanks


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> Yea that seems pretty low but a few members on here haven't been able to get high core clocks but have gotten very high memory clocks.. For me its the opposite, anything past 1650 memory and nada but my core is stable to 1200. 1100 is a pretty good jump from 1010 especially with that memory at 1750. Someone on here does bios editing so worth a shot.


Do you happen to know who he is?


----------



## Stige

Quote:


> Originally Posted by *PunkX 1*
> 
> Thanks


And just to give you an example: My card is fine at 1200/1650 at +131mV but once the VRM gets to 70C+, I start to see artifacts, anything below that and nothing.


----------



## PunkX 1

Quote:


> Originally Posted by *Stige*
> 
> And just to give you an example: My card is fine at 1200/1650 at +131mV but once the VRM gets to 70C+, I start to see artifacts, anything below that and nothing.


I'll watch for that next time. Sometimes I've noticed that in certain games the GPU tends to downclock, so as of now I'm using an app called ClockBlocker which helps more or less maintain 3D clocks. I'm quite impressed with this card; it's managing most games well at 1440p. Is there any way to lower VRM temps?


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> I must notice that next time. Sometimes I've noticed that in certain games the GPU tends to downclock so as of now I'm using the an app called clock blocker which helps more or less maintain 3d clocks. I'm quite impressed with this card, it's managing most games well at 1440p.


I get that but with less voltage, and VRMs stay around the 60s or so. It's supposed to downclock; that's the beauty of it. I'd hate to run 250W just surfing Chrome. It's why I OCed my CPU using turbo clocks: 4.5GHz isn't needed just chilling at idle, it's a total waste.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> Do you happen to know who he is?


He was offering it during the AMD/NVIDIA competition, but I completely forgot his username. If your BIOS won't flash or unlock, it doesn't matter much anyway; even with a good OC, in most games you only see a slight improvement depending on the resolution. This card is plenty for me to keep most games at 60fps.


----------



## patriotaki

Should I replace the PSU or my GPU? My friend's card coil whines on my system and on his it doesn't. He has a worse PSU though.


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> Should I replace the PSU or my GPU? My friend's card coil whines on my system and on his it doesn't. He has a worse PSU though.


Have you tried your gpu on your friend's system?


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> I get that but with less voltage and VRMs stay around the 60s or so. Its supposed to downclock thats the beauty of it. Id hate to run 250w just surfing chrome. Its why i oced my cpu using turbo clocks. 4.5ghz isn't needed just chilling at idle its a total waste.


No I mean it downclocks under 3D applications such as in games.


----------



## Stige

Quote:


> Originally Posted by *PunkX 1*
> 
> I must notice that next time. Sometimes I've noticed that in certain games the GPU tends to downclock so as of now I'm using the an app called clock blocker which helps more or less maintain 3d clocks. I'm quite impressed with this card, it's managing most games well at 1440p. Is there any way to lower vrm temps?


Yeah, some lighter games can cause the clocks to fluctuate, World of Tanks for example. I use ClockBlocker as well to make sure they stay at max every time I have a game open; such a wonderful piece of software it is









In other news: need to get these installed on my VRMs and test again how big the difference is vs the Phobya pads I have right now


----------



## patriotaki

Quote:


> Originally Posted by *rdr09*
> 
> Have you tried your gpu on your friend's system?


I will tomorrow


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> I will tomorrow


Do that first.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> No I mean it downclocks under 3D applications such as in games.


No point in running max clocks if the game runs fine at 900MHz, haha. For example, DiRT Rally doesn't even come close to fully using 1040MHz and the game runs perfectly maxed out. Why use more power when it's totally unnecessary? If a game lags and stutters I can understand locking the clocks, but besides that it's pointless.


----------



## PunkX 1

Quote:


> Originally Posted by *Stige*
> 
> Yeah some lighter games can cause the clocks to fluctuate, like World of Tanks for example. I use ClockBlocker aswell to make sure they stay at max everytime I have a game open, such a wonderful piece of software it is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other news: Need to get these installed on my VRM and test again how big the difference is vs the phobya pads I have right now


Let me know if they do make a difference.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> No point in running max clocks if the game can run it fine with 900mhz haha. For example dirt rally doesnt even come close to fully using 1040mhz and the game runs perfect maxed out. Why use more power when its totally unnecessary. If a game lags and stutters i can understand locking the clocks but besides that pointless.


Fluctuating clocks will cause stuttering in almost every case; it's just a bad design. Locking the clocks to max while gaming has no drawbacks, while fluctuating clocks do. There is absolutely no reason NOT to lock them while you have a game open.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> No point in running max clocks if the game can run it fine with 900mhz haha. For example dirt rally doesnt even come close to fully using 1040mhz and the game runs perfect maxed out. Why use more power when its totally unnecessary. If a game lags and stutters i can understand locking the clocks but besides that pointless.


Makes sense. Here's a look at my temps after running a few benchmarks including in game benchmarks. Voltage is at stock while AUX voltage is at +50mv. Clocks are at 1080/1700Mhz.


----------



## PunkX 1

Quote:


> Originally Posted by *Stige*
> 
> Fluctuating clocks will cause stuttering in almost every case, it's just a bad design, locking the clocks to max while gaming has no drawbacks to it while fluctuating clocks do. There is absolutely no reason to NOT lock them while you have a game open.


I've experienced this, and trust me, I'm fed up with stuttering and micro stutter. I dealt with that for almost a year when using my R9 285s in CF.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> Makes sense. Here's a look at my temps after running a few benchmarks including in game benchmarks. Voltage is at stock while AUX voltage is at +50mv. Clocks are at 1080/1700Mhz.


Quote:


> Originally Posted by *PunkX 1*
> 
> I experienced this and trust me I'm fed up of stuttering and micro stutter. I dealt with that for almost a year when using my R9 285's in CF.


Yeah, those VRM temps are FINE; it's just a low-OCing card is all.

Well, that's also because CF and SLI suck at everything, haha.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> Yea those VRM temps are FINE, just a low OCing card is all.
> 
> Well thats also because CF and SLI sucks, at everything haha.


This is the first card I've had that is terrible at overclocking. I guess one has to run out of luck at some point. I've heard about people tightening the timings on the GPU memory. Does that actually yield any performance increase?


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> This is the first card I've had that is terrible at overclocking. I guess one has to run out of luck at some point. I've heard about people tightening the timings on the GPU memory. Does that actually yield any performance increase?


My 7850 had more overclocking room than my 390, but the performance is night and day. It might help the memory OC, but you're already at 1750; guessing you have Hynix. It's weird, it seems like one or the other will OC high but it's rare to see both. I've yet to see a 390/390X do 1300/1750 or so.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> My 7850 had more overclocking room then my 390 but the performance is night and day. It might help the memory OC but youre already at 1750, guessing you have hynix. Its weird it seems like one or the other will OC high but its rare to see both. Ive yet to see a 390/x do 1300/1750 or so.


It's Hynix. I guess with the 390s either the core or the memory goes high. Would have been amazing to hit 1200 on the core though. I'm really happy to have 8GB of VRAM; it's needed at 1440p, especially when you increase textures.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> It's hynix. I guess with the 390's either the core or memory goes high. Would have been amazing to hit 1200 on the core though. I'm really happy to have 8GB vram. It's needed at 1440p, especially when you increase textures.


Mine tops out at 1200; even 1225 starts to artifact, not much but enough that I'd not consider it stable. 8GB is nice, yeah, but I think after 5-6GB or so it becomes pointless; R9 390s can't really use all 8 anyway. Try maxing out GTA V or something: it will use around 5-6GB but stay below 30fps, and that's at 1080p.


----------



## PunkX 1

I've maxed out GTA V at 1440p. Everything is at maximum except for MSAA, and it does stop at around 6GB of VRAM. However, FPS is in the late 40s to mid 50s.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> I've maxed out GTA V at 1440p. Everything is at the maximum except for MSAA and it does stop at around 6GB on the vram. However, FPS is in the late 40's to mid 50s.


No MSAA? Not maxed out then, haha. I turned everything on, and I mean EVERYTHING, even the additional graphics shadows and all that, and it was definitely unplayable. MSAA is the killer. Right now I'm running it at 2x with tweaked ultra/very high settings and it's beautiful; stays at 60fps until you get to grassy areas.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> No msaa not maxed out haha. I turned everything on and i mean EVERYTHING, even the additional graphics shadows and all that and it was def unplayable. MSAA is the killer. Right now im running it at 2x tweaked ultra/very high and its beautiful stays 60fps until you get to grassy areas.


Haha. MSAA sure does know how to kill your frames. You're gaming at 1080p or 1440p?


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> Haha. MSAA sure does know how to kill your frames. You're gaming at 1080p or 1440p?


1080p till I get my ultrawide, then it's 2560x1080, so about 700,000 more pixels. Shouldn't affect performance much.
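
For what it's worth, the "about 700,000 more pixels" figure checks out. A quick back-of-the-envelope check (just pixel counting, not a benchmark):

```python
# Back-of-envelope pixel math behind the "about 700,000 more pixels" figure.
standard = 1920 * 1080   # 2,073,600 pixels
ultrawide = 2560 * 1080  # 2,764,800 pixels

extra = ultrawide - standard   # 691,200 extra pixels, i.e. roughly 700k
ratio = ultrawide / standard   # ~1.33x the pixels to render each frame
print(extra, round(ratio, 2))  # 691200 1.33
```

So the ultrawide asks the GPU for about a third more pixels per frame than plain 1080p.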


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> 1080 till i get my ultrawide then its 2560x1080 so about 700,000 more pixels souldnt affect perfromance much


From what I've noticed, if you get 60fps at 1080p you'll end up getting around 45-50fps at 1440p.
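
That rough 60 to 45-50fps drop fits the pixel math: 1440p has about 1.78x the pixels of 1080p, so a naive model that scales frame rate inversely with pixel count predicts roughly 34fps, and real games usually land above that because not all GPU work is per-pixel. A minimal sketch of that naive estimate (an illustration, not a benchmark):

```python
# Naive fill-rate model: assume frame time scales linearly with pixel count.
# Real scaling is usually better, since geometry/CPU work doesn't grow with pixels.
def scaled_fps(fps_1080p: float, width: int, height: int) -> float:
    """Estimate fps at width x height from a 1920x1080 baseline."""
    return fps_1080p * (1920 * 1080) / (width * height)

print(scaled_fps(60, 2560, 1440))  # 33.75 -- the pessimistic floor for 1440p
```

The observed 45-50fps sitting between 33.75 and 60 is exactly what you'd expect from partly pixel-bound workloads.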


----------



## bluej511

This is the next upgrade I'm looking at; if it's too curved I'll get the flat one. 29" ultrawide, curved. If it gets more than 60 I just cap it; screen tearing drives me nuts.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> This is the next upgrade im looking at, if its too much curved ill get the flat one. 29" ultrawide curved. Gets more then 60 i just cap it, screen tearing drives me nuts.


Dude that's one sexy monitor!


----------



## dagget3450

Quote:


> Originally Posted by *PunkX 1*
> 
> Dude that's one sexy monitor!


Its the curves isn't it


----------



## PunkX 1

Quote:


> Originally Posted by *dagget3450*
> 
> Its the curves isn't it


Now who doesn't like them, eh?


----------



## dagget3450

Sweet, my junk has arrived: paste and 17W/mK pads


Hope it will be worth it


----------



## rdr09

Quote:


> Originally Posted by *dagget3450*
> 
> Sweet my junk has arrived -paste - 17w/mk pads
> 
> 
> Hope it will be worth it


Nice. Hell yeah, that's worth it. That's more for benching.


----------



## patriotaki

I'm over at my friend's house testing my R9 390.

The card whines way less; if you don't pay attention it's not audible.

On my system it does coil whine a lot.

His 390, though, does not coil whine at all.


----------



## Stige

I guess next step would be to try another PSU in your PC? If you can borrow one from him or somewhere.
Easiest way to find if it is your PSU causing the extra interference or something else.


----------



## patriotaki

Quote:


> Originally Posted by *Stige*
> 
> I guess next step would be to try another PSU in your PC? If you can borrow one from him or somewhere.
> Easiest way to find if it is your PSU causing the extra interference or something else.


I have my previous PSU, an old Super Flower 650W. The card also coil whines with that one.


----------



## tolis626

Maybe it's a bad ground somewhere? I don't even know if it can cause coil whine, but it can cause interference to the audio circuits for example. I have such a problem where when I'm using my headphones and the system is under load I can hear a sound like coil whine from the cans themselves. Quite annoying really, but I can't be arsed to take everything apart and search for the cause.


----------



## patriotaki

How can I check that?


----------



## tolis626

Quote:


> Originally Posted by *patriotaki*
> 
> How can I check that?


That's the bad part. You mostly need to take everything apart and check whether things like the motherboard are making contact with the case metal. Too tedious for me. What you could try is to take apart your system and reassemble it outside of the case, and see if the whine persists. If it does, I'm officially out of ideas.


----------



## mus1mus

Just accept the card as it is, or whine to a store to get it replaced. I'd do the first one if I were you. Or sell it to someone. Done.


----------



## patriotaki

Quote:


> Originally Posted by *tolis626*
> 
> That's the bad part. You mostly need to take everything apart and check whether there's things like the motherboard making contact with the case metal or such. Too tedious for me. What you could try is to take apart your system and reassemble it out of the case, see if the whine persists. If it does, I'm out of ideas officially.


The thing is, I don't know what to replace. Should I return the GPU? The PSU?
My friend's card whines less than mine on my system, and my card has almost no coil whine on his.


----------



## mus1mus

Quote:


> Originally Posted by *patriotaki*
> 
> The thing is, I don't know what to replace. Should I return the GPU? The PSU?
> My friend's card whines less than mine on my system, and my card has almost no coil whine on his.


Return both.


----------



## dagget3450

I don't understand coil whine issues; maybe it's because I don't have it or don't notice it? I mean, I hear a slight high-pitched squeal on the Unigine exit screen. Outside of that I never hear it, and I have been through a massive number of GPUs.

Does this coil whine happen constantly while in use? I keep thinking back to old CRT TVs and how they had a constant high-pitched squeal. Is this similar to coil whine?


----------



## patriotaki

Quote:


> Originally Posted by *dagget3450*
> 
> I don't understand coil whine issues, maybe its cause i don't have it or notice it? I mean i hear a slight high pitched squeal on unigine exit screen. Outside of that i never hear it. I have been through massive number of gpus.
> 
> Does this coil whine happen constantly while in use? I keep thinking back to old CRT tvs and how i remember they had a high pitched squeal and it was constant. Is this similar to coil whine?


No, only under load. The noise is very strong, very loud, in my case.


----------



## patriotaki

Which PSU should I replace it with?


----------



## battleaxe

Quote:


> Originally Posted by *patriotaki*
> 
> with which psu should i change it for ?


Corsair AX860 (made by Seasonic).

I've got two of them. If you have a Micro Center close by, you can get them refurb for around $120, which is a steal. Great PSU. Mine are both silent, and the cards are silent running on them as well. They benched over 1000W in the review.


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> Corsair AX860 (made by Seasonic)
> 
> I've got two of them. If you have a MicroCenter close you can get them refurb for around $120 which is a steal. Great PSU. Mine are both silent and the cards are silent running on them as well. They benched over 1000W on the review.


I personally prefer to stay away from refurbs no matter how good the deal. Yeah, they have a warranty, but if it goes bad in a year it's a PITA to have the system down. Then again, that can happen with a new unit too.

My TX650 crapped out after 5 years; got a new TX650 V2 with new cables and everything and put that in my carry-on when I moved to France. Used it for a while, then got a good deal on an RM1000. I hear it's got very bad voltage ripple, so I MIGHT end up getting rid of it, or keeping it as a spare and picking up the new RM1000i, especially since my CPU is overclocked.

Off topic: started my PC this morning, fans ran, then poof, it shut off. This happened before on my old system and PSU (dunno why, don't ask). Decided to do the basic thing, since all I did recently was go bare die. Took out 2 RAM sticks and bam, booted right up.

Turns out whatever water I had spilled doing my install got into one of the RAM slots. Cleaned up the slot and the RAM stick and good to go.


----------



## Agent Smith1984

The only refurb products I will use in a system are the GPU, or probably a case...

PSUs? No way; snap, crackle, pop...

HDD... I have had 6 different refurbished drives fail, in both personal use and server-class usage.

SSD... seen several go bad in 1-3 months' time.

CPUs... every one I got refurbed would not OC for crap (though this was back in the Venice days, lol).

RAM... never seen any for sale, but bad RAM could be a major PITA.

Motherboard... seen SEVERAL refurbs fail in 3-6 months' time...

GPU... generally a refurbished GPU is fine. It probably had a bad cap or VRM and all else checks out. I have used SEVERAL refurbished GPUs, with only one seeming to not OC well at all, and that was back in the X850 Pro days; it didn't clock well because I flashed it to an XT. I have had several good results with refurbished GPUs, but as always, your mileage may vary.

Case... who cares, if it's a good deal, right?


----------



## Agent Smith1984

Looks like PS4 is either upgrading their GPU, OR implementing a form of Crossfire.....

How the hell is this going to run 4k???

http://www.giantbomb.com/articles/sources-the-upgraded-playstation-4-is-codenamed-ne/1100-5437/#ftag=YHF323e137


----------



## Transmaniacon

I saw an article on Reddit saying they believe the new PS4.5 has a Polaris 10 GPU?

Maybe developers will target 4K @ 30FPS for consoles. I can't see them pushing 60 FPS without some serious horsepower, heck a 980Ti can't even do that without lowering the settings a good bit.


----------



## bluej511

Yeah, it's more than likely 4K at 30 FPS, which I'm pretty sure an R9 390 already does with ease. Don't forget there won't be much need for AA at 4K either, so it's not too crazy. It's going to be some version of Polaris; educated guess, I see the "R9 480/PS4.5 chip" being close to R9 380X or even R9 390 performance.


----------



## Transmaniacon

Quote:


> Originally Posted by *bluej511*
> 
> Yea its more then likely 4k at 30fps which im pretty sure an r9 390 already does with ease. Don't forget there wont be a need for AA at 4k either so its not too crazy. Its going to a version of the Polaris version which educated guess, i see the "r9 480/ps4.5 chip" being close to r9 380x or even r9 390 performance.


Yeah that's my guess as well, but at a much lower TDP. I am looking forward to 4K @ 60FPS from a single GPU, hopefully Vega can deliver.


----------



## Agent Smith1984

Very good point about Polaris.

It is very possible that they are using a 32-CU Polaris variant specifically for the PS4.5.

Microsoft can take this time to refresh the One as well and put a GPU in it that actually competes with Sony's (they really screwed up last time and slapped a measly 7770 equivalent into a system that is supposed to be "next gen"... lol).

From my understanding, the systems won't all be rendering games at 4K, just outputting 4K to the display. So games could still be running 1440p at 60 FPS, or in some cases 4K at 30 FPS.

I can say, though, that games like GTA V are fine at 30 FPS (if a tad cinematic), but if they are releasing new shooters, then I would expect 1440p to be the norm.

I guess they could also run some custom resolution like 3200x1800. That resolution looks great on a 4K display, and can help a lot of folks on 290 Crossfire who are fighting VRAM capacity limits.


----------



## bluej511

Do TVs even run a 1440p mode? I haven't checked 4K TV specs in ages; it might just be 2160 and 1080, so who knows. Polaris won't be my upgrade, I think it's Vega or later for me.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Do tvs even run a 1440p mode? I havent checked 4k tv specs in ages. It might just be 2160 and 1080 so who knows. Polaris wont be my update i think its vega or after for me.


Well, you can run any res you want into the TV and stretch it.

I run 1440p and 1800p on stuff all the time for testing.

Even current Xbox and PS4 games run at 900p in some cases already...

A few notes though...

1440p doesn't quite scale on 4K the way 1080p does. When upscaling 1080p images, the upscaler in the TV simply quadruples the pixels. When scaling something like 1440p or 1800p, there are fractions involved, and that can make certain images appear not quite as crisp...

LG had originally wanted to make 1440p TVs before 4K went widespread, and those TVs would have scaled 720p television beautifully. But due to the scaling issues they would have faced with 1080p movie and game content, and the resulting loss of image sharpness, I think they just bypassed that step altogether. We all know damn well that if there was an additional generation of TVs to make money off of, the industry would have done it. Think about it: we saw 480p, then 720p, then 1080p, but all of a sudden everything in TV land went straight to 4K? My theory is that it came down to image sharpness when upscaling 1080p content, which by then mattered more than 720p content. This wasn't as much of an issue with 720p upscaling on 1080p sets, because when those entered the market, all major media available in HD was either 720p or 1080i (still the case on satellite and cable). To be honest, I sometimes prefer my 46" 1080p bedroom set over my 55" 4K living-room set when watching OTA networks, because it is essentially playing back upscaled 1080i video.
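The scaling point above comes down to whether the per-axis scale factor to 2160p is a whole number. A minimal sketch of that arithmetic (the source resolutions are the ones mentioned in the post; 3840x2160 is standard 4K UHD):

```python
# Per-axis scale factors when upscaling to a 3840x2160 (4K UHD) panel.
# An integer factor (1080p -> 2160p is exactly 2x per axis) lets the TV
# simply duplicate pixels; a fractional factor (1440p -> 2160p is 1.5x)
# forces interpolation, which is why 1440p can look slightly softer on 4K.

TARGET = (3840, 2160)

def scale_factors(width, height, target=TARGET):
    """Return (horizontal, vertical) scale factors from a source res to the target."""
    return target[0] / width, target[1] / height

for res in [(1920, 1080), (2560, 1440), (3200, 1800)]:
    sx, sy = scale_factors(*res)
    kind = "integer (clean pixel duplication)" if sx.is_integer() and sy.is_integer() else "fractional (interpolation)"
    print(f"{res[0]}x{res[1]}: {sx:g}x / {sy:g}x -> {kind}")
```

Run against a 2160p target, this gives 2x for 1080p but 1.5x and 1.2x for 1440p and 1800p, which is the "some fractions involved" point in a nutshell.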


----------



## Tobiman

Quote:


> Originally Posted by *patriotaki*
> 
> the thing is I don't know what to replace should I return the gpu? The psu?
> My friends card whines less than mine on my system and almost no coil. Whine my card on his


Return the GPU for a replacement. You have to start somewhere and that's likely the culprit.


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> the thing is I don't know what to replace should I return the gpu? The psu?
> My friends card whines less than mine on my system and almost no coil. Whine my card on his


I'd guess there's a 90% or better chance that the GPU is causing the whining... here is the difficult part, though: I have heard some GPUs whine with certain PSUs and not with others. So you can keep digging until you find a GPU that plays nice with your PSU, or you can return both in hopes of fixing the issue. I have had a few whiners in my day, and to be honest, it's pretty common at this point. My Fury was the loudest GPU I've ever heard, but my 980 KPE made a little noise under load too.

My MSI 390 was silent, but my 390X chirps a little on load screens. It's simply a vibration in the power circuitry: at certain switching frequencies the noise can be heard, at others it cannot.

Odds are you'll return both items and get the same results, and if not, you got lucky.

In my case (no pun intended), the Cooler Master JetFlo exhaust fans spin up so high under load to deal with temps that I don't hear anything but them. In turn, I crank game volumes up and rock out.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd guess it's a 90% chance or better that the GPU is causing the whining..... here is the difficult part though. I have heard some GPU's whine with certain PSU's and not with others.... SO you can keep digging until you find a GPU that plays nice with your PSU, or you can return both in hopes of fixing the issue.... I have had a few whiners in my day, and to be honest, it's pretty common at this point. My Fury was the loudest GPU I've ever heard, but my 980 KPE made a little noise under load also.
> 
> My MSI 390 was silent, but my 390X chirps a little on load screens. It's simply a vibration of the power circuitry... at a certain frequency of power delivery, the noise can be heard. At other frequencies it can not.
> 
> Odds are, you'll return both items, and get the same results, and if not you got lucky.
> 
> In my case (no pun intended) the Cooler Master Jet-flo exhaust fans are up so high under load to deal with temps, that I don't hear anything but them. In turn, I crank game volumes up and rock out.


I don't mind how loud the noise is... it's just very annoying when you play for long; my ears go crazy with that high-pitched sound.

What if the card does not whine in their system? That's a possibility too.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd guess it's a 90% chance or better that the GPU is causing the whining..... here is the difficult part though. I have heard some GPU's whine with certain PSU's and not with others.... SO you can keep digging until you find a GPU that plays nice with your PSU, or you can return both in hopes of fixing the issue.... I have had a few whiners in my day, and to be honest, it's pretty common at this point. My Fury was the loudest GPU I've ever heard, but my 980 KPE made a little noise under load also.
> 
> My MSI 390 was silent, but my 390X chirps a little on load screens. It's simply a vibration of the power circuitry... at a certain frequency of power delivery, the noise can be heard. At other frequencies it can not.
> 
> Odds are, you'll return both items, and get the same results, and if not you got lucky.
> 
> In my case (no pun intended) the Cooler Master Jet-flo exhaust fans are up so high under load to deal with temps, that I don't hear anything but them. In turn, I crank game volumes up and rock out.


I must be the only one who has never had, or even heard, coil whine. I have near-perfect hearing too; if one of my fans hums from a vibration, I can hear it. I know that my Noctua NF-14 at 1050 RPM has the slightest hum, enough that I run it at either 1100 or 1000. I've never had coil whine, and I'd hear it.


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, you can run any res you want into the TV and stretch it.
> 
> I run 1440 and 1800P on stuff all the time for testing.
> 
> Even current Xbox and PS4 games run at 900P in some cases already...
> 
> A few notes though...
> 
> 1440 doesn't quite scale on 4k the way 1080p does. When upscaling 1080p images, the upscaler in the TV itself simply quaddrooples the pixels.
> When scaling something like 1440, or 1800, etc..... there is some fractions involved and it can make certain images appear to be not quite as crisp....
> 
> LG had originally wanted to make 1440p TV's before 4k went widespread, and those TV's would have scaled 720p television beautifully, but due to scaling issues they would have faced with 1080p movies and game content, and the loss of image sharpness, I think they just bypassed that step altogether.... we all know damn-a-well that if there was an additional generation of TV's to make money off of, the industry would of done it..... think about how we saw 480p then 720p, then 1080p, but now all the sudden everything in the TV land went straight to 4k? Well, my theory is that it was due to the image sharpness pertaining to the upscaling of more 1080P content than 720p content. They didn't have this issue with as much with 720p upscaling on 1080p monitors because at the time of it' entry into the market all major media avalable in HD was either 720p OR 1080i (still the case on satellite and cable) and to be honest I sometimes prefer my 46" 1080P bedroom set over my 55" 4k living set when watching OTA networks because it is essentially playing back upscaled 1080i video.


I think it all comes down to the media, Blu-Ray is 1080P, and that will up-scale better to 4K. I think my plan is wait for 4K to mature and then make that upgrade when we have the GPU horsepower to handle it well.


----------



## Agent Smith1984

My Fury had a certain pitch it would whine at on load screens when I got it that could literally cut through any sound in the room.

I ran that thing on loops of FireStrike at 4k for about 3 hours trying to "break it in".... finally after about a week's time I noticed it had lessened quite a bit. In the end, it wasn't too bad at all, but was definitely always there....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Transmaniacon*
> 
> I think it all comes down to the media, Blu-Ray is 1080P, and that will up-scale better to 4K. I think my plan is wait for 4K to mature and then make that upgrade when we have the GPU horsepower to handle it well.


Exactly my point!

I went ahead and made the move because the pricing on entry-level sets is great, and you have nothing to lose by making the change.

4K has four times the pixels of 1080p (double the linear pixel density). I'm not 100% sure how the upscaler on my TV works for ALL media, but even if it simply quadrupled each pixel of a 1080p image, you would still see the exact same clarity and sharpness as native 1080p, because each source pixel maps onto a clean 2x2 block. My TV seems to go a step beyond that and do some additional upscaling, and it definitely looks great with 1080p content. Where the real benefit is, is with all of the 4K content available on Netflix, etc., and then of course the gaming side of things, where the image quality is just bar none. 4K nature content on Netflix is absolutely insane to look at. You feel as if you could reach into the screen and pet the lizards... lmao
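To pin down the numbers behind the "quadrupled pixels" point: 4K UHD has four times the pixels of 1080p, which works out to double the linear pixel density at a given screen size. A quick sketch of that arithmetic (the 55" diagonal is just an example size; nothing here is measured from a specific set):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

# Total pixel count quadruples...
print(pixels_4k / pixels_1080p)  # 4.0

# ...but linear density (PPI) only doubles, so each 1080p pixel
# maps onto a 2x2 block of 4K pixels.
print(round(ppi(3840, 2160, 55) / ppi(1920, 1080, 55), 6))  # 2.0
```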


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Exactly my point!
> 
> I went ahead and made the move because the pricing on entry level sets is great, and you have nothing to lose by making the change.
> 
> The DPI of 4k is 4 times that of 1080p. I'm not 100% sure on how the upscaler on my TV works on ALL media, but in reality, even it simply quadroppled each pixel in a 1080p image, you would sill see the exact same clarity and sharpness at 1080P because the DPI would remain the same. My TV seems to go a step beyond that and do some additional upscaling and it definitely looks great with 1080 content...... where the real benefit is, is with all of the 4k content available on Netflix, etc..... AND then of course the gaming side of things, where the image quality is just bar-none. 4k nature content on netflix is just absolutely insane to look at. You feel as if you could put your hand in the screen and pet this lizards... lmao


Haha yeah, it's quite impressive. I will occasionally go and look at them at Best Buy, but that price tag promptly brings me back to reality.

I think I am going to wait for the HBM2 cards to come out, and at that point consider jumping to 4K if they can offer high enough performance. The prices should be pretty good by then as well, and maybe we will see some higher refresh rates on the 4K panels.


----------



## bluej511

I'd love to get 4K Netflix. Problem is, 13 Mbps ADSL doesn't cut it, lol. It's why I bought GTA V on discs and saved about 20€ on release day.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Id love to get 4k netflix. Problem is 13mbps adsl doesnt cut it lol. Its why i bought gta v on discs and saved about 20€ on release day.


Oh man, you're telling me... I had 5 Mb DSL where I lived before (friggin' boonies), and now I am in the city and have 200 Mb internet. I friggin' love it!!!

I didn't get GTA V until I moved, for that reason alone (I don't use an optical disc drive).

I get a solid 47-52 Mb stream rate on Netflix, so I can watch 4K in the living room while everybody else in the house streams 1080p, and there isn't a single stutter.
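A rough sanity check on why several simultaneous streams fit: assuming the commonly cited per-stream recommendations of about 25 Mbps for 4K and 5 Mbps for 1080p (published guidance, not measured bitrates), a 200 Mbps line has plenty of headroom:

```python
# Rough bandwidth budget for simultaneous streams on a 200 Mbps connection.
# Per-stream rates are assumed, commonly cited recommendations
# (~25 Mbps for 4K, ~5 Mbps for 1080p), not measured bitrates.
LINE_MBPS = 200
RATES = {"4K": 25, "1080p": 5}

def budget(stream_counts, line=LINE_MBPS, rates=RATES):
    """Return (Mbps used, Mbps headroom) for a dict of {quality: count}."""
    used = sum(rates[quality] * count for quality, count in stream_counts.items())
    return used, line - used

used, headroom = budget({"4K": 1, "1080p": 3})
print(f"{used} Mbps used, {headroom} Mbps headroom")  # 40 Mbps used, 160 Mbps headroom
```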


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh man you are telling me.... I had 5mb DSL where I lived before (friggin boonies), and now I am in the city and have 200mb internet. I friggin love it!!!
> 
> I didn't get GTA V until I moved simply for that reason alone (I don't use an optical disc drive).
> 
> I get a soliod 47-52mb stream rate on Netflix, so I can watch 4k in the living room while everybody else in the house streams 1080P and there isn't a single stutter.


You on RoadRunner? I've got 100/10 right now from them, waiting for Google Fiber to get here.


----------



## m70b1jr

Quote:


> Originally Posted by *Transmaniacon*
> 
> You on RoadRunner? I've got 100/10 right now from them, waiting for Google Fiber to get here.


I have Time Warner and used to have 30 Mb/s down and 5 Mb/s up, but all the packages got upgraded FOR FREE, so now I'm at 200 Mb/s down and 20 Mb/s up without paying a penny extra.


----------



## bluej511

I used to have 100/5 for around $170 a month with TV/net/phone. Here I get 12/1 for 50€ a month with TV/net/phone and cell with 4G and 10 GB of data. Fiber at 30/5 costs the same, as do 100 and 200 Mbps in areas that have them. I live in an apartment building and there's fiber at the pole outside; it's just a PITA to get the building company to approve it.


----------



## Transmaniacon

Quote:


> Originally Posted by *m70b1jr*
> 
> I have time warner and used to have 30mb/s down and 5mb/s upload, but all the packages got upgraded FOR FREE, so now i'm 200mb/s down and 20mb/s upload without paying a penny extra.


Nice, the free upgrade was pretty sweet. I was on the turbo with 20/5, and that bumped me to 100/10.


----------



## Agent Smith1984

Yeah, I got 200 down / 20 up with TWC.

It's really good internet. It costs money, of course, but with the amount of content we stream in my house, it's well worth it.

On promotion, I pay $150 right now for 2 HD boxes with a 200+ channel package, 2 standard boxes with 72 channels, unlimited LD phone, and 200 Mb internet with a dual-band Wi-Fi router (2.4 GHz and 5 GHz).


----------



## Transmaniacon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah. I got 200 down/20 up with TWC
> 
> It's really good internet. It costs money of course, but with the amount of content we stream in my house, it's well worth it.
> 
> On promotion, I pay $150 right now for 2 HD boxes with 200+ channel package, 2 standard boxes with 72 channels, unlimited LD phone, and 200mb internet with dual band wifi router , 2.4GHz and 5Ghz bands.


That's a solid deal! The tricky part is renegotiating that same deal when it's up...


----------



## bluej511

Quote:


> Originally Posted by *Transmaniacon*
> 
> That's a solid deal! The tricky part is renegotiating that same deal when it's up...


You gotta tell 'em you're switching to another provider and watch how fast they keep that rate. So many times my net service was unreliable and I'd call every time; a few times I got service refunded to my account. By law I think they're supposed to provide 80% of the advertised speed or something.

It's worked for me quite often, including keeping the promotion deal. Most customer service reps have no idea what they're doing, and they don't care if the company refunds you; it's not their money, lol.


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> I personally prefer to stay away on refurbs no matter how good the deal. Yea they have a warranty but if it goes bad in a year its a pita to have the system down. Then again can happen with a new unit too.
> 
> My TX650 crapped out after 5 yrs, got a new tx650v2 with new cables and everything and put that in my carry on when i moved to France, used it for a while got a good deal on an RM1000. I hear its got very bad voltage ripple so i MIGHT end up getting rid of it or keeping it as a spare and picking up the new RM1000i. Especially since my cpu is overclocked.
> 
> Off topic. Started my PC this morning, fans ran then poof it shut off. This happened before on my old system and psu (dunno why dont ask). Decided to do the basic thing since all i did recently was go bare die. Took out 2 rams sticks and bam booted right up.
> 
> Turns out wtv water i had spilled doing my install got into one of the ram slots, cleaned up the slot and the ram stick and good to go.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> The only refurb product I will use in a system IS the GPU or a case probbaly.....
> 
> PSU's? No way, snap crackle pop...
> 
> HDD.... have had 6 different various refurbished drives fail in personal use and server class usage.
> 
> SSD.... seen several go bad in 1-3 months time.
> 
> CPU's..... every one I got refurbed would not OC for crap (though this was back in the Venice days, lol)
> 
> RAM.. never seen any for sale but bad RAM could be a major PITA
> 
> Motherboard.... seen SEVRAL refurbs fail in 3-6 months time...
> 
> GPU... generally the GPU that is refurbished is fine. Pobably had a bad cap or VRM and all else is fine. I have used SEVERAL refurnished GPU's with only one seeming to not OC well at all, and that back in x850 pro days, and it didn't clock well cause I flashed it to an XT. I have had several good results with refurbished GPU's, but as always your mileage may vary.
> 
> Case... who cares if it's a good deal right?


You guys have some terrible luck. So far I've bought 4 PSUs, 3 GPUs, and two motherboards refurbished in the last 3-4 years.

All are still in use today. I have had a motherboard and a few GPUs go bad, but none were refurbished. To each his own, I suppose.


----------



## Agent Smith1984

Yeah, I am all too familiar with the "I am going to cancel if you don't..." routine.
Quote:


> Originally Posted by *battleaxe*
> 
> You guys have some terrible luck. So far I've bought 4 PSU's, and 3 GPU's that were refurbished in the last 3-4 years. Oh, two motherboards too.
> 
> All still in use today. I have had a mother board, and a few GPU's go bad. But none were refurbished. To each his own I suppose.


Well, that's not the worst of it, buddy... you know how much new stuff I've had break??? lol


----------



## afyeung

Hi all. I recently installed the X41 on my 390X. I'm running push/pull, and with fans at 60% I'm getting a max of 52C on the core. With my old fan I was getting a max of 54C after 2 hours of BF4. I highly recommend going the G10 route; even heavily overclocked cards like mine will have frosty temps. I can't say the same for the VRMs, though. They average around 84C under max load, even with the Gelid sink. I might have installed it wrong, but I don't really care since they don't get awfully high. I have pictures and stuff here: https://linustechtips.com/main/topic/584424-frosty-r9-390x-48c-max-load-max-overclock/#comment-7622541 All I can say is that I'm very impressed with the core temps.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> Hi all. I recently installed the X41 on my 390X. I'm running push pull and with fans at 60%, I'm getting max 52c on the core. With my old fan I was getting max 54c after 2 hours of BF4. I highly recommend going the G10 route, even heavy overclocked cards like my own will have frosty temps. I can't say the same for the VRM though. It averages around 84c when under max load even with the gelid sink. I might have installed it wrong but I don't really care since they don't get awfully high. I have pictures and stuff here https://linustechtips.com/main/topic/584424-frosty-r9-390x-48c-max-load-max-overclock/#comment-7622541 All I can say is that I'm very impressed with the core temps


Eh, for overclocking I'd say that's awfully high, haha. My Alphacool block, which passively cools the VRMs with no direct fans at all, runs cooler than that. I think those X41s and other universal AIO coolers made for GPUs have basically zero airflow over the VRMs, even if they have a tiny fan blowing on them. VRM temps are way, way more important than core temps.


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> VRM temps are way way more important then core temps.


Both are important.

Lower VRM temps just get you more stability when the core is already cold, and can get you more MHz too in some cases... some...

If the core is hot, though, you're not going anywhere high (OC-wise) regardless of the VRM temps (compared to what the same card could do at 45C core, for example).

Edit:
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I am all to familiar with the "I am going to cancel if you don't thing/..."
> Well, that's not the worst of it buddy... you know how much knew stuff I've had break??? lol


Yeah, I've definitely had more new stuff break than refurbed, but I've likely bought more new too.

I think I've bought about 6 refurbished Corsair H80 AIO water coolers, all for around $40 back while mining. All are still working today. One of them is pretty noisy, but I seldom use it anymore, so I don't care.


----------



## dagget3450

Posting this here because it may be relevant for someone out there looking at EK water blocks for the 390X.

In my case I have EK-FC R9-290X water blocks, I believe Rev 1.0. There are issues with 390X/390 cards fitting these blocks due to small layout changes, I think even on AMD reference designs; there is a thread about an inductor being too large next to the AUX VRMs: http://www.overclock.net/t/1569032/xfx-r9-390-dd-removed-from-the-ek-fc-r9-290x-v2-ca-cooling-configurator


Quote:


> I know this probably a rare thing and may not even help one other person but as confirmed by scorpion49 in his post but not intended for the reason of this thread.
> 
> I have 4 390x of R9-390X-8VR6 i just ordered. I have only taken one apart but i don't see anything changing as these are AMD ref design cards.
> 
> The inductor in question is the small one
> 
> It looks identical to my 290x ref card. I am blocking these now so if there is a development I'll update but i don't foresee anything - I realize this isn't the same as the OP model but maybe people looking for this specific need will find it!


Scorpion49's post with pics
Quote:


> Originally Posted by *Scorpion49*
> 
> So I swapped the cooler out just now. The reference 290X has a much different cooler than this 390X. Check it out:
> 
> All ready to go!
> 
> Reference 290X heatsink:
> 
> 390X heatsink:
> 
> I first did the small heatsinks as the instructions said, adding a few extras in some spots since I had them, but then I had to change out some of the ones on the VRAM because they interfered with the heat pipes.
> 
> Topped it all off with a pair of AP-45's. Running at 7V on my fan controller, they move more air than any 3rd-party graphics card cooler I have ever encountered and are nearly silent (I can only hear them with the case open).
> 
> Surprisingly it didn't sag much, but I put a little prop in there anyway, just for safety. Going to start testing temps now.


----------



## PunkX 1

COD: Black Ops III maxing out vram on the R9 390


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> COD: Black Ops III maxing out vram on the R9 390


WHAAAAAAAAAAAAAAAAATTTTTTTTTTT. Got enough taskbar icons lol. Vram leak must be nice.


----------



## spyshagg

I believe that is normal with Blops 3. It caches all it can in VRAM, but doesn't actually use all of it.


----------



## battleaxe

Quote:


> Originally Posted by *PunkX 1*
> 
> COD: Black Ops III maxing out vram on the R9 390


Holy carp. And I thought 8GB would be plenty for a while to come...


----------



## christoph

Quote:


> Originally Posted by *PunkX 1*
> 
> COD: Black Ops III maxing out vram on the R9 390


But how is the performance? Do you play at 1080p with everything maxed out?


----------



## dagget3450

Quote:


> Originally Posted by *spyshagg*
> 
> I believe that is normal with blops3. It caches all it can on vmem, but doesn't actually uses all of it.


What Spyshagg said


----------



## Majentrix

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Has this happened to anyone else when clicking on imgur links? I'm assuming it's a driver error.


Worked out what was causing this: it was just Flash being wacky. Uninstalling and reinstalling fixed it.


----------



## PunkX 1

Quote:


> Originally Posted by *christoph*
> 
> but how is the performance? you play in 1080p? everything max out?


1440p maxed out. FPS ranges between a minimum of 45 and 70, with an average of around 55 FPS.


----------



## dopemoney

Had a PowerColor R9 390; it failed after just two months. The highest Heaven score I achieved with that card was 1488. I returned it and purchased an MSI Gaming R9 390 (while still priced at $329). As the 390 gains popularity, the price is going up; if you don't have one, get one soon. This new Heaven score is with clocks at 1067/1610 and a 5% power increase. Quality-wise, I think this card is ahead of the PowerColor series.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dopemoney*
> 
> 
> 
> Had PowerColor R9 390, it failed after just two months. Highest Heaven score achieved with that card was 1488. Returned and purchased MSI Gaming R9 390 (while still priced at $329). As 390 gains popularity the price is going up. If you don't have one, get one soon. This new Heaven score is with clocks at 1067/1610, and 5% power increase. Quality-wise, I think this card is ahead of the PowerColor series.


The 390 should stay around the same price for now... I paid $330 for my first MSI 390, and just got my MSI 390X on sale for $370 after rebate, with a mouse and Hitman...

Polaris will be out in two months, so I highly doubt it benefits them at all to raise prices on Hawaii cards. If anything, the prices will tank...


----------



## flopper

Quote:


> Originally Posted by *battleaxe*
> 
> Holy Carp. And I thought 8gb would be plenty for while to come...


cache..

Quote:


> Originally Posted by *PunkX 1*
> 
> 1440p maxed out. FPS is anything between a minimum of 45-70 FPS with an average of mainly 55FPS.


yea 390 is a good dog.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> 390 should be staying around the same prices for now... I paid $330 for my first MSI 390.... and just got my MSI 390X on sale for $370 agter rebate ith a mouse and Hitman....
> 
> Polaris will be out in two months so I highly doubt it benefits them at all to raise prices on Hawaii cards. If anything the prices will tank....


There's a slightly new situation, though: a die shrink changes things. So, say, the same price or lower with, say, 30% more performance?
A Polaris card priced at $250 with 30% more performance than a 390?
That seems like a really good deal for anyone with a 1440p screen.


----------



## christoph

Quote:


> Originally Posted by *PunkX 1*
> 
> 1440p maxed out. FPS is anything between a minimum of 45-70 FPS with an average of mainly 55FPS.


1440p good


----------



## patriotaki

2x PCS+ 390, no OC, stock settings
i5 6600K


----------



## battleaxe

XFX just shipped my replacement 390X, and that's knowing I had taken off the heatsink. Just an FYI: if you level with them and are honest, it seems they are pretty fair.

I'm pretty impressed, TBH. So glad I didn't get boned on this one; I will definitely buy from XFX again. Hopefully the replacement will hit 1200MHz like my other one does. I can't expect it to hit 1280MHz like the one that died. I wish, but I seriously doubt it; I haven't heard of any that were doing as well as she was. Why did she have to die? So sad.


----------



## Nedrozak

Pretty nice overclock!


----------



## tolis626

Hey, small update on my mess-ups. The 11 W/mK Fujipoly strip I had ordered arrived today. I installed it a few hours ago and, lo and behold, it's the right size (finally). Temps aren't too great (compared to stock), but not too bad either. I've been playing a bit of BF4 (at 1440p, so it runs quite a bit hotter than even Witcher 3 at 1080p) and the max I got was 78C on the core and 81C on the VRMs. That was with 1180/1675MHz at +80/+50mV. Certainly an improvement over stock, but still meh.

My next moves? Well, I think that the Phobya 1mm strip I placed on the chokes may be causing some contact issues. Since the time I installed that I think I've been getting worse temps. I don't like removing it completely even though it should be fine, because MSI did have a strip on it. So I dunno, I may get a cheapo 0.5mm pad and call it a day. I will also get me some Kryonaut in the hopes that the added conductivity will help over the GC Extreme. Doubt it, but I've tried stupider things already.

Anyways. Just when I thought that "Boy, that went well!", I realized that my card coil whines a lot more since taking it apart and replacing the TIM and VRM pads this last time. And I mean A LOT. Until yesterday it would whine a bit, but nothing major and certainly nothing I could hear over my case fans. Not the case anymore. If a game is running and I take my headphones off, it's like there's insects flying in the room, and large ones at that. I have no idea what caused this, so any help is appreciated.

PS : Just a warning of sorts. Fujipoly pads are one-use only. And what I mean by that is you can't remove them without destroying them. On my card it was stuck on the VRMs and taking it off led to tearing it to pieces. So yeah, there goes 15€.


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Hey, small update on my mess-ups. The 11W/mK Fujipoly strip I had ordered arrived today. I installed it a few hours ago and lo and behold, it's the right size (Finally). Temps aren't too great (compared to stock), but not too bad either. I've been playing a bit of BF4 (at 1440p, so it's quite hotter than even Witcher 3 at 1080p) and max I got was 78C on the core and 81C on the VRMs. That was with 1180/1675MHz at +80/+50mV. Certainly an improvement over stock, but still meh.
> 
> My next moves? Well, I think that the Phobya 1mm strip I placed on the chokes may be causing some contact issues. Since the time I installed that I think I've been getting worse temps. I don't like removing it completely even though it should be fine, because MSI did have a strip on it. So I dunno, I may get a cheapo 0.5mm pad and call it a day. I will also get me some Kryonaut in the hopes that the added conductivity will help over the GC Extreme. Doubt it, but I've tried stupider things already.
> 
> Anyways. Just when I thought that "Boy, that went well!", I realized that my card coil whines a lot more since taking it apart and replacing the TIM and VRM pads this last time. And I mean A LOT. Until yesterday it would whine a bit, but nothing major and certainly nothing I could hear over my case fans. Not the case anymore. If a game is running and I take my headphones off, it's like there's insects flying in the room, and large ones at that. I have no idea what caused this, so any help is appreciated.
> 
> PS : Just a warning of sorts. Fujipoly pads are one-use only. And what I mean by that is you can't remove them without destroying them. On my card it was stuck on the VRMs and taking it off led to tearing it to pieces. So yeah, there goes 15€.


Cover the coils with some thin pads if you have them. The ones next to the VRM's.

They don't get super hot.

Until you get on water the FujiPoly don't really get their worth IMO. But you can see what they are capable of a little bit on air.









And, that's absolutely true. They only last one or two (max) applications if you are very careful. Very delicate for sure. Kinda makes me mad TBH with how much they cost. Can't count the amount of $ I have spent on them for replacements.


----------



## tolis626

Hey
Quote:


> Originally Posted by *battleaxe*
> 
> Cover the coils with some thin pads if you have them. The ones next to the VRM's.
> 
> They don't get super hot.
> 
> Until you get on water the FujiPoly don't really get their worth IMO. But you can see what they are capable of a little bit on air.


Damn it man! Stop trying to make me spend my money (my summer vacation money, mind you!) on custom water cooling! I'll never forgive myself if I give in and make a custom loop instead of going to an island and swim my GPU sorrows away.









On a more serious note, VRM temps were better with the two 1mm 14W/mK pads stacked together than with a single 1.5mm 11W/mK pad. Not drastically so, but they were. Sad thing is, it's nigh on impossible to get 14 or, god forbid, 17W/mK pads in Europe in all sizes without paying like 50-100€. That's absurd on so many levels. I'd have ordered from PerformancePCs were it not for the very high shipping costs. Europeans getting the short end of the stick. Again.

Anyway. Which ones are the coils? The ones that look like capacitors (or are they capacitors?)?
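Tangentially, for anyone comparing pad options: a rough first-order way to judge a pad layer is its thermal resistance per unit area, R = thickness / conductivity (lower is better). Here's a quick sketch using the numbers from this post. It's simple 1-D conduction only, and it ignores contact resistance between stacked pads and how much the pads compress under mounting pressure, which is probably why real-world results can differ:

```python
# Rough thermal resistance per unit area for a pad layer: R = t / k.
# 1-D conduction only; ignores pad-to-pad contact resistance and compression.

def pad_resistance(thickness_mm, k_w_mk):
    """Return thermal resistance in (K*m^2)/W for one pad layer."""
    return (thickness_mm / 1000.0) / k_w_mk

single = pad_resistance(1.5, 11)        # one 1.5mm 11 W/mK pad
stacked = 2 * pad_resistance(1.0, 14)   # two 1mm 14 W/mK pads in series

print(f"single 1.5mm 11W/mK : {single:.2e} K*m^2/W")
print(f"stacked 2x1mm 14W/mK: {stacked:.2e} K*m^2/W")
```

On paper the single 1.5mm 11W/mK pad actually comes out marginally ahead of the 2mm stack, so the stacked pads winning in practice likely comes down to compression and better mounting contact rather than raw conductivity.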


----------



## battleaxe

Quote:


> Originally Posted by *tolis626*
> 
> Hey
> Damn it man! Stop trying to make me spend my money (my summer vacation money, mind you!) on custom water cooling! I'll never forgive myself if I give in and make a custom loop instead of going to an island and swim my GPU sorrows away.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a more serious note, VRM temps were better with the two 1mm 14W/mK pads stacked together than with a single 1.5mm 11W/mK pad. Not drastically so, but they were. Sad thing is, it's nigh on impossible to get 14 or, god forbid, 17W/mK pads in Europe in all sizes without paying like 50-100€. That's absurd on so many levels. I'd have ordered from PerformancePCs were it not for the very high shipping costs. Europeans drawing the short end of the stick. Again.
> 
> Anyway. Which ones are the coils? The ones that look like capacitors (or are they capacitors?)?


The semi-square blocks close to the VRMs.

Can you use Amazon? That's where I order from.


----------



## Slowpoke66

Quote:


> Originally Posted by *tolis626*
> 
> Hey
> Damn it man! Stop trying to make me spend my money (my summer vacation money, mind you!) on custom water cooling! I'll never forgive myself if I give in and make a custom loop instead of going to an island and swim my GPU sorrows away.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a more serious note, VRM temps were better with the two 1mm 14W/mK pads stacked together than with a single 1.5mm 11W/mK pad. Not drastically so, but they were. *Sad thing is, it's nigh on impossible to get 14 or, god forbid, 17W/mK pads in Europe in all sizes without paying like 50-100€*. That's absurd on so many levels. I'd have ordered from PerformancePCs were it not for the very high shipping costs. Europeans drawing the short end of the stick. Again.
> 
> Anyway. Which ones are the coils? The ones that look like capacitors (or are they capacitors?)?


Have you checked eBay?

Here's a seller from UK:
http://www.ebay.com/itm/Fujipoly-Thermal-Pad-1-0mm-14W-mK-for-GPU-CPU-LED-XBOX-PS3-PS4-PC-Laptop-/181831706474?hash=item2a5603a76a:g:vokAAOSwLqFV-ApG


----------



## bluej511

Quote:


> Originally Posted by *Slowpoke66*
> 
> Have U checked Ebay?
> 
> Here's a seller from UK:
> http://www.ebay.com/itm/Fujipoly-Thermal-Pad-1-0mm-14W-mK-for-GPU-CPU-LED-XBOX-PS3-PS4-PC-Laptop-/181831706474?hash=item2a5603a76a:g:vokAAOSwLqFV-ApG


No matter where you buy from, 0.5 and 1mm pads are cheap; unfortunately for the R9 390s you need 1.5mm, and that's when the price goes up. Hell, two strips of 17W/mK from Aquatuning cost like 50€+.


----------



## bluej511

Quote:


> Originally Posted by *Nedrozak*
> 
> Pretty nice overclock!


That's quite a high score there. Don't think that's maxed out, but if it is, that's ridiculous.


----------



## christoph

My Fujipoly pads arrived yesterday, so I did a full swap on my computer. I applied the 11W/mK pad to my Sapphire 390; the one I took off the video card was 1.0mm thick, but I installed the 1.5mm anyway and I'm checking temps right now. Everything seems normal. The pad I took off the video card I applied to my motherboard VRMs, which improved temps by 3 degrees at idle. I'm going to test the computer with a game later on...


----------



## Nedrozak

If you mean that the benchmark is not maxed out you are wrong. You can see it in the first picture.
Quality: Ultra
Tessellation: Extreme


----------



## bluej511

Quote:


> Originally Posted by *Nedrozak*
> 
> If you mean that the benchmark is not maxed out you are wrong. You can see it in the first picture.
> Quality: Ultra
> Tessellation: Extreme


Reason I say that is because on all the runs we've done, we've gotten 60fps or so.

Edit: Ha, nvm, I see it, you ran it with no AA whatsoever haha.

Here's mine with 2 different OCs.


----------



## kizwan

Quote:


> Originally Posted by *Nedrozak*
> 
> If you mean that the benchmark is not maxed out you are wrong. You can see it in the first picture.
> Quality: Ultra
> Tessellation: Extreme


No 8xAA. In any Heaven competition that's required, and even if you're not competing, most people follow the competition rules anyway. It makes it easier to compare scores with others.


----------



## bluej511

Took me a while to notice there was no AA haha.


----------



## Nedrozak

Sorry for not running the benchmark with AA.
Here are the results with 8xAA.


----------



## bluej511

There you go, still impressive haha.

I can see how Fujis don't make as much difference on air as on water, because water dissipates the heat that much faster. It's why I've been wondering if it's even worth it on my Alphacool block. Even on water the VRMs and VRAM are passively cooled; it does a better job than all the custom air coolers, but I wonder if adding 11W/mK Fujis would even make a difference.


----------



## patriotaki

Quote:


> Originally Posted by *patriotaki*
> 
> 2x pcs+ 390 no OC, stock settings
> i5 6600k


is this score good for 2x 390s?


----------



## Stige

My Heaven run with my daily clocks.


----------



## bluej511

Oh, how I hate being right. Just put CLU on my die and, lo and behold, it made absolutely no difference at all. My package temps are identical. I applied it in a very, very thin layer and mounted it. I checked the paste this morning and the block makes very good contact with the die. I might take it apart later on to double-check the contact, but I'm pretty sure it's fine. Unless there's a break-in period I don't know about haha.


----------



## PunkX 1

Just a shot in the dark here, but is there any chance that the pre-applied thermal paste is crappy? I have some NT-H1 which I could use. Would that make a difference?

@Stige that's an impressive score! Is that with tightening the memory timings as well?


----------



## Stige

Quote:


> Originally Posted by *PunkX 1*
> 
> Just a shot in the dark here but is there any chance that the pre applied thermal is crappy? I have some NT-H1 which I could use. Would that make a difference?
> 
> @Stige that's an impressive score! Is that with tightening the memory timings as well?


Only one notch, still haven't gone all out on them.









I used GC Extreme on my GPU. I never compared temps, but I'd imagine the stock paste is rather crap.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> Just a shot in the dark here but is there any chance that the pre applied thermal is crappy? I have some NT-H1 which I could use. Would that make a difference?
> 
> @Stige that's an impressive score! Is that with tightening the memory timings as well?


It's not so much that the paste is crappy, but more the way it's applied; they pretty much stick it on and forget it. Any high-viscosity paste will do for the GPU: GC Extreme, Kryonaut, Hydronaut. When I installed my water block I used leftover GC Extreme and have had no degradation; temps have stayed the same the past couple of months. As long as your cores stay around 70°C or less on air, it's perfectly fine.

My daily clocks. No memory timings or anything, just 1200/1650.


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> is this score good for 2x 390s?


Yes, I have to OC my 290s to 1200 core to match your graphics score, I think.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> Its not so much the paste thats crappy but more the way its applied. They pretty much stick it and forget it. Any high viscocity paste will do for the GPU, GC Extreme, Kryonaut, Hydronaut. When i installed my water block i used left over gc extreme and have had no degradations temps have stayed the same the past couple months. As long as your cores stay around 70°C or less on air its perfectly fine.
> 
> My daily clocks. No memory timings or nada just 1200/1650.


I've noticed that VRM Temp 1 gets way hotter than VRM Temp 2. What are VRM Temp 1 and 2? My max GPU temps are around 78-80C with extra voltage. Being in India doesn't play out well for overclocking, especially when the ambient temps are around 40C throughout the day. I have NT-H1 thermal paste. Would that do?

Here's what my temps look like after a long gaming session of Dying Light which gives the GPU a good work out.


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> I've noticed that VRM Temp 1 gets way hotter than VRM Temp 2. What is VRM Temp 1 and 2? My max GPU temps are around 78-80c with extra voltage. Being in India doesn't play out well for overclocking, especially when the ambient temps are around 40c throughout the day. I have NT-H1 thermal paste. Would that do?


Both my VRMs stay close to each other. I think the hotter one is the VRAM VRM, and the one that usually runs cooler is the core VRM, as it has a little bit less voltage. I could be totally wrong haha.

I tried Noctua on my card; it did well for a few weeks, then performance went back to whatever it was before. I'd use something thick like Kryonaut or GC Extreme. Seems like viscosity is more important than W/mK.

Edit: So I just tested my theory using Euro Truck Simulator 2, and the hottest VRM seems to be the memory one, for me anyway. VRM 1 got to 40°C and VRM 2 got to 56°C. That game barely uses core power (it just goes up and down), so it's a good way to test.


----------



## danielwhitt

Quote:


> Originally Posted by *patriotaki*
> 
> is this score good for 2x 390s?


Hiya, I get around an 18500-19000 score in Fire Strike with the same setup. Hope this helps.


----------



## PunkX 1

Quote:


> Originally Posted by *bluej511*
> 
> Both my VRMs stay closed to each other, i think the hotter one is VRAM VRM and the one that usually runs cooler is the Core VRM has it has a little bit less voltage. I could be totally wrong haha.
> 
> I tried Noctua on my card, did well for a few weeks then performance went back to wtv it was before. Id use something thick like Kryonaut or GC Extreme. Seems like viscocity is more important then w/mk.
> 
> Edit: So i just tested my theory using Euro Truck Simulator 2, the hottest VRM seems to be the memory, for me anyways. VRM 1 got to 40°C and VRM 2 got to 56°C. That game barely uses core power it just goes up and down so good way to test.


Thanks







Those are really low vrm temps. What are you using to cool them?


----------



## bluej511

Quote:


> Originally Posted by *PunkX 1*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> Those are really low vrm temps. What are you using to cool them?


My Alphacool water block, but the VRMs are passively cooled. It's not a demanding game so they don't get hot. In Syndicate I'm at around 60C.


----------



## Stige

VRM1 should be hotter than VRM2 on every card, at least from what I have observed.


----------



## bluej511

Memory VRM runs at 1.5V, as we established; core VRM at 1.1-1.2V. The memory VRM will get hotter. Under GPU-Z, VRM 2 is the memory, at least on the Sapphire Nitro.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Memory VRM 1.5v as we established. Core VRM 1.1x-1.2x. Memory vrm will get hotter. Under gpuz vrm 2 is the memory at least on the Sapphire Nitro.


Well, the voltage for the memory is higher, but the current that goes through the memory VRM is like an order of magnitude lower than the core VRM. The only reason the memory VRM gets hot is that, well, there's only one or two phases compared to 6 or more for the core. Also, it's almost never cooled as well as the core VRM.
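To put very rough numbers on that (the currents, phase counts, and path resistance below are illustrative guesses, not measurements from any card): conduction loss in a phase scales with the square of the current through it, so even splitting a large core current across many phases leaves each core phase dissipating more than a lone, lightly loaded memory phase.

```python
# Rough per-phase VRM conduction loss: P ~ (I_total / phases)^2 * R.
# All figures (currents, phase counts, 5 mOhm path resistance) are
# illustrative guesses, not measurements.

def phase_loss_w(total_current_a, phases, r_ohm=0.005):
    """I^2 * R loss for one phase, assuming current splits evenly."""
    i_phase = total_current_a / phases
    return i_phase ** 2 * r_ohm

core_per_phase = phase_loss_w(200, 6)  # ~33 A per core phase
mem_per_phase = phase_loss_w(15, 1)    # one lightly loaded memory phase

print(f"core: ~{core_per_phase:.1f} W/phase, memory: ~{mem_per_phase:.1f} W/phase")
```

Even with these made-up numbers, each core phase dissipates several times what the memory phase does, which supports the point that a hot memory VRM is mostly a cooling problem rather than a load problem.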


----------



## gupsterg

Ref PCB 290/X: the rear VRM has 1 phase MVDDC and 5 phases VDDC (i.e. VRM 1); the front VRM has 1 phase VDDCI (i.e. VRM 2).


Spoiler: ref PCB 290/X hard volt mod / measure points






Source is Shammy's voltmod guide on Kingpin cooling forum.


----------



## flopper

Been happy with my card.
Still am.

Polaris seems to be something really nice for those into gaming and BF5.
Vega is going to be a star.


----------



## m70b1jr

If Polaris doesn't bring something in the $300-$500 range, I'll probably grab another XFX R9 390 to CrossFire with my current one.


----------



## Stige

Quote:


> Originally Posted by *m70b1jr*
> 
> If polaris doesn't release something in the #300 - $500 range i'll probably grab me another XFX r9 390 to XFire with my current one.


Only thing Xfire is good for is benchmarks. Anything else = not worth it.


----------



## dagget3450

Using a blanket statement like that isn't really accurate. Might want to elaborate more than just that.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> Using a blanket statement like that isn't really accurate. Might want to elaborate more than just that.


Don't bother haha. When it works, it works great; the issue is that not too many devs optimize for CrossFire, or SLI for that matter. It's a shame, because two R9 390s would destroy a single 980 Ti at 1440p and 4K and still cost a fair amount less, even watercooled.


----------



## bluej511

So it looks like Mirror's Edge runs just great on the R9 390. But am I seeing this right, 6GB of VRAM and 12GB of RAM?!


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> Only thing Xfire is good for is benchmarks. Anything else = not worth it.


Hmmm.... must be your opinion because I like it just fine. Thank you. I don't notice any difference besides higher FPS using Xfire. Maybe you haven't tried it lately? It works great for me and lots of other folks around here.


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> Only thing Xfire is good for is benchmarks. Anything else = not worth it.


Not using the right drivers maybe?


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> Don't bother haha. When it works it works great, the issue is not too many devs optimize for crossfire or sli for that matter. Its a shame because 2 r9 390s would destroy a single 980ti in 1440 and 4k and still cost a fair amount less even if watercooled.


Two 390s would kill a 980 Ti, that's a given. It would be significantly more expensive if both have waterblocks, though. The MSI 980 Ti Gaming went for only $530 like 3 days ago.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Hmmm.... must be your opinion because I like it just fine. Thank you. I don't notice any difference besides higher FPS using Xfire. Maybe you haven't tried it lately? It works great for me and lots of other folks around here.


Some of those who complain about CrossFire are pairing their cards with an i5. My Sandy i7 with HT off got maxed out by 2 HD 7900s in BF3. I don't even think about disabling it now with 2 290s.

With a single 290 it's fine to turn HT off.


----------



## Agent Smith1984

I gotta be honest, I had an awesome experience with dual 7970s, dual 290s, and a dual 390X/390 hybrid.....

And on the following games tested:

BF4

Far Cry 3

Far Cry 4

Crysis 3

Dirt Rally

GTA V

Mind you, I waited a little for profiles before running in CF on some of them (mainly GTA V and Dirt)

But it was all butter smooth......

I must admit though, I had trouble when trying to clock two cards differently. I found that locking clocks gave me perfect gameplay.


----------



## Stige

I used to have 2x HD7950. The games they worked on were... minimal, to say the least. And by "work" I mean without stuttering or any other crap that commonly comes with CF setups. You are much better off buying a single card that is faster; you will be a happy gamer without any of the CF drawbacks. If only dual-card setups didn't have all the stuttering and worked in more than 3 games a year, it might be good, especially with these high-memory 390s, but it's just meh.


----------



## rdr09

Quote:


> Originally Posted by *Stige*
> 
> I used to have 2x HD7950. The games they worked on were.. minimal to say atleast. And by work I mean without stuttering or any other crap that commonly comes with CF setups. You are much better off buying a single card that is faster and you will be a happy gamer without any of the CF drawbacks. If only dual card setups didn't have all the stuttering and worked in more than 3 games a year, it might be good, especially with these high memory 390s but it's just meh.


With an i5, right? Here was my i7 at 4.5 with HT off in BF3 with 7970/7950 crossfire . . .



At first, HT on in BF3 was a mess. When they finally fixed it, I switched to HT on, because during large explosions my fps dropped with HT off. A similar thing happened with C3 and BF4.

I must admit, I ran crossfire HD 7900s before the frame pacing fix for a short period of time, and it was bad.


----------



## bluej511

It really depends on the game, though; you can't say you shouldn't CrossFire with an i5, but I agree. For example, Syndicate maxes out my CPU; Dirt Rally, Black Flag, Euro Truck, and Far Cry 4 don't. I'd be fine getting another 390.


----------



## rdr09

Quote:


> Originally Posted by *bluej511*
> 
> Really depends on the game though you cant say you shouldnt crossfire with an i5 but i agree. For example Syndicate maxes out my cpu, dirt rally, black flag, euro truck, far cry 4 dont. Id be fine getting another 390.


None of your games should be maxing out your CPU with a single 390 and causing issues. Like I said, I was fine with HT off with a single 290 and all my games. Another 390 in CrossFire would be fine with your setup, so long as you are aware that some games will give it a hard time no matter how good the crossfire profile is and/or how optimized those games are. Check out post #19 . . .

http://www.overclock.net/t/1587616/i5-4690k-100-usage-gaming-temps-fine-windows-10/10


----------



## bluej511

Quote:


> Originally Posted by *rdr09*
> 
> None of your games should be maxing out your cpu with a single 390 and cause issue. Like i said, i was fine with HT off with a single 290 and all my games. Another 390 to crossfire would be fine with your setup so long as you are aware that some games will give it a hard time no matter how good a crossfire profile there is and/or how optimized these games are. Checkout post #19 . . .
> 
> http://www.overclock.net/t/1587616/i5-4690k-100-usage-gaming-temps-fine-windows-10/10


You'd be surprised, I regularly see 100% usage out of Syndicate. It's the only game I have that gets my CPU temps as hot as Intel Burn Test, so clearly it's working hard.


----------



## rdr09

Quote:


> Originally Posted by *bluej511*
> 
> Youd be surprised, i regularly see 100% usage out of Syndicate. Its the only game i have that gets my CPU temps as hot as intel burn test so clearly its working hard.


Is it causing issues playing the game? If it's the temps causing the issue, then a better cooler is in order. I got high usage with HT off too in BF4, but never had to turn HT on with a single 290. I only had a 120 rad per block, so I had to turn HT off to keep my CPU temp below 60C. Recently, I upgraded my rad space.


----------



## bluej511

Quote:


> Originally Posted by *rdr09*
> 
> Is it causing issue playing the game? Unless it's the temp that's causing the issue, then a better cooler is in order. I got high usage with HT off, too, in BF4 but never had to turn on HT with a single 290. I only had a 120 rad per block, so i had to turn HT off to keep my cpu temp below 60C. Recently, i upgraded my rad space.


Haha, no, temp is definitely not the issue. Still a toasty 47°C at 1.2V. It's clear though that it definitely uses the CPU quite a bit. Usage has gone down a tiny bit since going from 4.2 to 4.5, though. I think in Syndicate at least, getting another R9 390 would cause issues; in Far Cry 4 or Dirt Rally I'd have zero issues.

It really, really does depend on the game. BF4 is KNOWN to be ridiculously CPU-heavy.


----------



## rdr09

Quote:


> Originally Posted by *bluej511*
> 
> Haha no temp is def not the issue. Still a toasty 47°C at 1.2v. Its clear though that its def using the cpu quite a bit. Usage has gone down a tiny bit since going from 4.2 to 4.5 though. I think at least in Syndicate getting another r9 390 would cause issues, in Far Cry 4 or Dirt Rally id have 0 issues.
> 
> It really really does depend on the game. BF4 is KNOWN to be ridiculously CPU heavy.


Yes, OC the CPU a bit more. Like this member . . .

http://www.overclock.net/t/1598288/cpu-gpu-bottleneck-please-help

I advised them to OC their CPU.


----------



## flopper

Quote:


> Originally Posted by *afyeung*
> 
> 2 390s would kill a 980ti, that's a given. It should be significantly more expensive if both have waterblocks though. MSI 980ti Gaming only when for $530 like 3 days ago.


Soon previous-generation cards will be dirt cheap to buy anyhow.
4K is the main obstacle, which Vega will address.
Polaris might be a golden-era card for 1080p and 1440p gaming.

Can't wait to see what happens; I've prepared my computer for what might soon come its way.
It nags me all day about the next-generation stuff.


----------



## afyeung

Quote:


> Originally Posted by *flopper*
> 
> soon previous generation cards be dirt cheap to buy anyhow.
> 4k is the main obstacle which Vega will adress.
> Polaris might be a golden era card for 1080p and 1440p gaming.
> 
> Cant wait what to do as I prepared my computer for stuff that might soon happen to it.
> the computer nags me all day about the next generation stuff


True. My friend sold his 980 Ti + 240mm AIO for $500, and I got my R9 290 for $143 and my 390X for $175.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> I used to have 2x HD7950. The games they worked on were.. minimal to say atleast. And by work I mean without stuttering or any other crap that commonly comes with CF setups. You are much better off buying a single card that is faster and you will be a happy gamer without any of the CF drawbacks. If only dual card setups didn't have all the stuttering and worked in more than 3 games a year, it might be good, especially with these high memory 390s but it's just meh.


There are other things you have to know, too. CrossFire setups should be used by those who know how to use them. The polling interval of any monitoring software, video recording, etc. should be set to a lengthy delay in order to eliminate stutter. You were likely experiencing system-wide CrossFire setup problems instead of actual micro-stutter.


----------



## Agent Smith1984

I would just say that if you haven't run crossfire within the last year or so, it'd be worth trying again....

I ran two overclocked 390's on my 9590 and it was perfect. No stutter issues, and no bottlenecking at all....

Had to move and needed the extra $300, so I sold one off, but I plan on adding one back depending on what Polaris looks like....

The temps with two 390's are frightening. With the power usage of Polaris, I don't think that will be an issue, so equally powerful cards with lower TDPs would be really good for crossfire.

We shall see....


----------



## dagget3450

It's real simple: if you want 4K, you're gonna need crossfire unless you're okay with 30-40ish fps. Judging by the next GPU releases (speculation), this is still going to be the case. I am going to start looking for a 4K FreeSync monitor soon. I want 40 inches or bigger though, so that's going to leave me in the Korean market most likely. My Crossover 44K will never get FreeSync.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> It's real simple if you want 4k, your gonna need crossfire unless your okay with 30/40ish fps. Judging by next gpu release (speculation) this sill still going to be the case. I am going to start looking for a 4k freesynch monitor soon. I want 40inches or bigger though so thats going to leave me in the Korean market most likely. My crossover 44k will never get freesynch.


I don't think there's anything around 40" with FreeSync haha. 34" are the closest, and those are ultrawide. You must be sitting pretty damn far, cuz my TV is a 40" and I sit about 5-6 feet away from it. I did try it on my computer desk before, sitting 2 feet away, and holy hell, it was pretty badass but awful lol.

I do agree that Polaris 10/11 probably won't see a huge difference in 4K numbers, BUT we could be wrong; we have no idea yet, not even any remotely close leaks. Then again, everyone thought the Fury/Fury X would crush 4K, and that didn't happen.


----------



## afyeung

What is everyone's experience with adding more than +100mV? Currently sitting at 1200/1730 with +100mV for daily clocks, but it would be cool if I could go a bit above 1200, to like 1220, with +125mV or higher.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> What are your all's experience with adding more than +100mv? Currently sitting at 1200/1730 with +100mv for daily clocks but would be cool if I could go a bit above 1200 to like 1220 with +125mv or higher


Honestly, I'd be totally happy with that; an extra 20MHz for an extra 25mV+ is totally not worth it. The difference would be marginal at best. I'd be happy with 1200/1730; my memory can only manage 1650.


----------



## Stige

Quote:


> Originally Posted by *afyeung*
> 
> What are your all's experience with adding more than +100mv? Currently sitting at 1200/1730 with +100mv for daily clocks but would be cool if I could go a bit above 1200 to like 1220 with +125mv or higher


My daily clocks are at +131mV or thereabouts atm. Anything up to +200mV seems fine, as the vdroop is massive.
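To illustrate why a large offset can end up tame under load: the loadline drops the voltage actually delivered to the die in proportion to load current. The loadline resistance, VID, and current figures here are guesses for illustration, not Hawaii's actual values.

```python
# Effective core voltage under vdroop: V_load = VID + offset - I * R_loadline.
# VID, offset, load current, and loadline resistance are illustrative guesses.

def load_voltage(vid_v, offset_mv, current_a, loadline_mohm):
    """Voltage actually seen at the die under a given load current."""
    return vid_v + offset_mv / 1000.0 - current_a * loadline_mohm / 1000.0

idle = load_voltage(1.20, 200, 10, 1.2)    # light load, little droop
heavy = load_voltage(1.20, 200, 250, 1.2)  # heavy load, big droop

print(f"idle: ~{idle:.2f} V, heavy load: ~{heavy:.2f} V")
```

With numbers like these, a scary-looking +200mV set point droops to roughly 1.10V at the die under heavy load, which is why the VDDC you read under load matters more than the offset itself.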


----------



## bluej511

16.4.2 is out if anyone is interested. It has crossfire profiles for Need for Speed, among other minor fixes. I gotta say though, lately the driver releases have been ridiculous; it's like once or twice a month. The latest WHQL crashed on me once, so I went back to 16.3.1, and it's been 100% stable for a beta. Ridiculous.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.4.2.aspx

And for anyone wondering how hard 4K is to push: the Pro Duo barely gets 60 maxed out haha. Btw, that power consumption, WOW!!!!! AMD knows what they're doing now, Jesus.

http://wccftech.com/amd-radeon-pro-duo-benchmark-results-leaked/


----------



## bluej511

So, not sure if anyone is interested, but here's the confirmation.

VRM 1 under GPU-Z and under HWiNFO's GPU core section is the long line of VRMs. Those SHOULD be the ones that get the hottest, as there are way more of them.

VRM 2 is the cluster of 3 VRMs next to the PCIe bracket. Not sure which does which, but my guess is the long line of VRMs is for the memory?

Anyways, my new VRM temps while playing AC Syndicate: VRM 1 has gone from about 60-65°C to 47°C, and VRM 2 has remained unchanged at 59-63°C. So putting a fan right on the Alphacool fins dropped VRM 1 a good 15°C. I'd love to make some kind of bracket and maybe add 2 fans, who knows.


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> Honestly id be totally happy with that, the extra 20mhz for the extra 25mv+ is totally not worth it. The difference would be marginal at best. Id be happy with 1200/1730, my mem can only manage 1650.


Very impressed with the memory clocks on both the Hynix 290 series and the 390 series. The 1700 range puts them near 450GB/s, which is very close to the Fury X, not to mention the actual memory clock is a lot faster. So if games somehow make use of that, the 390/290s will see a benefit as well.
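The ~450GB/s figure checks out from the 390's 512-bit bus and GDDR5's quad data rate. A quick sanity check:

```python
# GDDR5 peak bandwidth: (bus width in bytes) * base memory clock * 4 (quad data rate).

def gddr5_bandwidth_gbs(bus_bits, mem_clock_mhz):
    """Peak bandwidth in GB/s for a GDDR5 bus at a given base memory clock."""
    return bus_bits / 8 * mem_clock_mhz * 4 / 1000

stock = gddr5_bandwidth_gbs(512, 1500)  # 390 stock 1500 MHz -> 384 GB/s
oc = gddr5_bandwidth_gbs(512, 1750)     # ~1750 MHz OC -> 448 GB/s

print(f"stock: {stock:.0f} GB/s, OC: {oc:.0f} GB/s (Fury X HBM: 512 GB/s)")
```

So a 1700-1750MHz overclock lands within about 12-15% of the Fury X's 512GB/s of HBM bandwidth.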


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> So not sure if anyone is interested but heres the confirmation.
> 
> VRM 1 under gpuz and VRM 1 under hwinfo-gpu core section, are the long line of VRMs. Those SHOULD be the ones that get the hottest as there are way more of them.
> 
> VRM 2 is the cluster of 3 VRMs next to the PCIe bracket. Not sure which does which but my guess is the long line of VRMs is for the memory?
> 
> Anyways my new VRM temps while AC Syndicate, VRM 1 has gone from about 60-65°C to 47°C and VRM 2 has remained unchanged at 59-63°C. So putting a fan right on the alphacool fins dropped the temps a good 15°C for VRM 1. Id love to make some kind of bracket and maybe add 2 fans who knows.


Awesome







I run VRM 2 passive and get 40-50C max, but your VRM 1 temps are so much cooler even though I have quite a bit of cooling on those.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> Awesome
> 
> 
> 
> 
> 
> 
> 
> I run passive VRM 2 and get 40-50c max but your VRM 1 temps are so much cooler even though I have quite a bit of cooling on those.


Yeah, mine are passive as well, but with a giant heatsink pretty much. Was shocked to see it get lower than 50°C. If I had another 120mm over VRM 2, it would drop to about the same.

Passive VRM 2 at 40-50C is pretty damn cool; you must have really good airflow in your case.


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> I dont think theres anything 40" or so with freesync haha. 34" are the closests and those are ultrawide. You must be sitting pretty damn far though cuz my tv is a 40" and i sit about 5-6 feet away from it. I did try it on my computer desk before sitting 2 feet away from it and holy hell it was pretty badass but awful lol.
> 
> I do agree that Polaris 10-11 prob wont see a huge difference in 4k numbers BUT we could be wrong we have no idea yet not even any remotely close leaks. Then again everyone though the Fury/Fury X would crush in 4k but didn't happen.


There are a few 40-inch and over that do FreeSync and 4K IPS. I just don't know if it's worth it for FreeSync @ 4K or not. I sit really close to my screen, but it's complicated because I use custom resolutions above 4K.

Fury X does awesome @ 4K performance-wise, especially with 4x crossfire (when it works). It's just that the VRAM becomes an issue for me, unfortunately.


----------



## Agent Smith1984

Quote:


> Originally Posted by *afyeung*
> 
> What are your all's experience with adding more than +100mv? Currently sitting at 1200/1730 with +100mv for daily clocks but would be cool if I could go a bit above 1200 to like 1220 with +125mv or higher


You tell us, my dude....

Try it out... I wouldn't bet on it being a daily without water, and even then the vdroop is real, and I wonder if it's all just board limits.


----------



## mus1mus

Quote:


> Originally Posted by *afyeung*
> 
> What are your all's experience with adding more than +100mv? Currently sitting at 1200/1730 with +100mv for daily clocks but would be cool if I could go a bit above 1200 to like 1220 with +125mv or higher


Pretty good card I should say.

Try adding more volts and see for yourself if heat will be bearable.


----------



## djtmalta

Does anyone have any BIOS mods for the R9 390? I looked up and down the internet and could only find ways to unlock some of the cores (mine were all unlockable). But I used to have an Nvidia GTX card and there were BIOS mods everywhere. I was able to enter the BIOS and add 20 percent to the idle fan speeds, so now my card doesn't get near 60C while idling. Stays at about 35C now.

Any help with bios modding would be greatly appreciated.

Thanks,

David


----------



## mus1mus

Quote:


> Originally Posted by *djtmalta*
> 
> 
> 
> Does anyone have any BIOS mods for the R9 390? I looked up and down the internet and could only find ways to unlock some of the cores (mine were all unlockable). But I used to have an Nvidia GTX card and there were BIOS mods everywhere. I was able to enter the BIOS and add 20 percent to the idle fan speeds, so now my card doesn't get near 60C while idling. Stays at about 35C now.
> 
> Any help with bios modding would be greatly appreciated.
> 
> Thanks,
> 
> David


Here's your best resource: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50


----------



## Streetdragon

Need fast help!
I uninstalled my driver with DDU and now I can't install a new graphics driver. Nothing works!
The installation crashes at the driver install... every... damn... time and I can't fix it.
I uninstalled my antivirus and stopped all programs that I don't need, and it still crashes.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Streetdragon*
> 
> Need fast help!
> I uninstalled my driver with DDU and now I can't install a new graphics driver. Nothing works!
> The installation crashes at the driver install... every... damn... time and I can't fix it.
> I uninstalled my antivirus and stopped all programs that I don't need, and it still crashes.


Go into safe mode and remove everything AMD/Catalyst/Crimson related. Also run DDU a few times, and remove any overclocking software entirely, including profiles.

Then restart and attempt to install the latest driver.
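Side note, in case it helps someone scripting this: the "remove everything AMD-related" step is basically just a filter over what's installed. A toy sketch (illustrative only; DDU does the real work, and the keyword list is my own guess, not an official one):

```python
# Illustrative only: mimics the "remove everything AMD-related" step by
# filtering a list of installed package names. Real cleanup should still
# be done with DDU in safe mode.
AMD_KEYWORDS = ("amd", "catalyst", "crimson", "radeon")

def find_leftovers(installed_packages):
    """Return package names that look AMD/driver-related (case-insensitive)."""
    return [p for p in installed_packages
            if any(k in p.lower() for k in AMD_KEYWORDS)]

print(find_leftovers(["AMD Catalyst Install Manager", "7-Zip", "Radeon Settings"]))
```

Anything the filter flags is what you'd want gone before attempting a clean install.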


----------



## Streetdragon

Still crashes. Don't wanna reinstall Windows -.-


----------



## Agent Smith1984

Quote:


> Originally Posted by *Streetdragon*
> 
> 
> 
> 
> Still crashes. Don't wanna reinstall Windows -.-


Try a different driver, from both the Catalyst and the Crimson variety.


----------



## Streetdragon

Won't work either. Crashes at display driver install. Never had something like this before. And Google doesn't know **** about it either.


----------



## m70b1jr

I got my R9 390 in January, but with some recent news, they're making a Polaris 10 GPU with similar performance to the 980 Ti for $300. Not sure if I should sell mine, wait and see if prices drop like hot cakes and get a 2nd 390, or sell my 390 for a Polaris chip.


----------



## kizwan

Quote:


> Originally Posted by *Streetdragon*
> 
> Won't work either. Crashes at display driver install. Never had something like this before. And Google doesn't know **** about it either.


System restore?


----------



## bluej511

Quote:


> Originally Posted by *Streetdragon*
> 
> Won't work either. Crashes at display driver install. Never had something like this before. And Google doesn't know **** about it either.


Here, I'll fix it for you haha. First of all, disable Windows driver updates because they suck and make everything a nightmare. Right click My PC, then Properties, then Advanced System Settings, then go under Hardware, then Device Installation Settings, and press No.

Then for your AMD crashing issue, unplug your Ethernet or disable your WiFi. That MIGHT fix it, as it did in my case: I'd get stuck at either 2 or 14% and then it wouldn't install. Try it, and if it works, thank me later haha


----------



## Streetdragon

I wish I could kiss your feet, but it didn't help me.... Tried to unplug everything, without my second card, only the second card... still crashes. Windows doesn't like me.

Doing a fresh install and sending a package filled with poo to Microsoft -.-


----------



## bluej511

Quote:


> Originally Posted by *Streetdragon*
> 
> I wish I could kiss your feet, but it didn't help me.... Tried to unplug everything, without my second card, only the second card... still crashes. Windows doesn't like me.
> 
> Doing a fresh install and sending a package filled with poo to Microsoft -.-


Odd, for me that's always worked. It's an issue with the AMD installer and Bitdefender for me, which would do exactly what yours does too.

Have you tried installing directly from the C:/amd folder as well?


----------



## battleaxe

Quote:


> Originally Posted by *Streetdragon*
> 
> Wont work too. Crash at displaydriver install. Never had domething like this befor. And google dont know **** about it too


Plug your monitor into the motherboard if you have onboard video, then install via that port. Unplug the GPUs before doing this.

Edit: nvm, I just saw you don't have onboard graphics. Sorry.


----------



## Agent Smith1984

Quote:


> Originally Posted by *m70b1jr*
> 
> I got my R9 390 in January, but with some recent news, they're making a Polaris 10 GPU with similar performance to the 980 Ti for $300. Not sure if I should sell mine, wait and see if prices drop like hot cakes and get a 2nd 390, or sell my 390 for a Polaris chip.


Polaris is NOT going to perform like a 980 Ti for $300.....

It's more likely going to perform like a 390X for around $300 and use half the power or less....


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Polaris is NOT going to perform like 980ti for $300.....
> 
> It's more likely going to perform like a 390x for around $300 and use half the power or less....


I'd say it's definitely closer to the 390, maybe even the 380X. Polaris is not supposed to be the high-end card; I do believe that's going to be Vega and whatever else. I could be wrong and Polaris could destroy the 390X, who knows. If that's the case and EK makes a water block, I'll make the jump.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> Id say def closer to the 390 maybe even the 380x. Polaris is not supposed to be the high end card, i do believe thats going to be Vega and wtv else. I could be wrong and Polaris could destroy the 390x who knows. If thats the case and EK make a water block ill make the jump.


Well, there are going to be two offerings: one around 380X performance, and one around 390X performance or slightly better for $300-350, is the best sense I can make of it.
The big deal with Polaris is the price/performance/power efficiency, all combined.

I mean, right now anybody running crossfire 390s on air is dealing with a top card in the upper 85C+ range, even if the bottom card runs at decent temps.....

Now take the TDP and slash it in half, and you are talking cards that hit 60c on air, so running two would be cake, not to mention your run-of-the-mill 600w PSU will be just fine doing it









I am not going to go crossfire 390X until I determine: a) is it worth it to sell this card and get two Polaris offerings, or b) will I just watch 390 prices tank, scoop up a 390X on the cheap, and deal with the power and heat......


----------



## patriotaki

What will be the price tag of a Polaris card similar to the Sapphire 390?








If it's around 300 EUR, I can sell my 390 now and wait for Polaris.


----------



## Streetdragon

Did a fresh Windows install. Now it works again... just *** xD But thanks for the help ^^

Have a new little "problem" with the HWiNFO sidebar. Wrong forum, but maybe you know how to fix it.

I'm missing the MHz, % and temperature icons...
The voltage readout is cut off too...


----------



## mus1mus

Use AIDA 64 Gadget. Better.


----------



## bluej511

Ok if this is true ill buy one and put her on water.

http://www.game-debate.com/news/?news=20006&graphics=Radeon%20R9%20490X%208GB&title=AMD%20Polaris%2010%20GPU%20Reportedly%20Offers%20Near%20980%20Ti%20Performance%20For%20300%20USD


----------



## dagget3450

lol, I finally started blocking my 390X cards. Of the 3 tested so far, only 1 was able to bench @ 1800 on RAM with a small vcore bump. Not sure if this will change once I get them water cooled, but I doubt it will. Looks like all three so far hit 1700 with no problem, as suggested by others. I haven't done much testing with GPU core clocks really, but I would be super stoked if I could pull 1200 on all 4 of them. Then all this may be wasted if Polaris comes around almost matching a 980 Ti/Fury X. However, I don't think it will.


----------



## mus1mus

1300 FTW!


----------



## dagget3450

Quote:


> Originally Posted by *mus1mus*
> 
> 1300 FTW!


One can dream lol. I'm sure I will have a piggy or two when I get to testing. I am not expecting to game on anything over 1150/1200 if I am lucky, is my guess. I am hoping to see how these compare to my old 290X reference cards. They were all Hynix and hit 1600-1700 VRAM clocks. I think I was able to bench close to 1200 on the cores, but I wasn't pushing them that hard. I also seem to recall my VRM temps were pretty high because I was using the stock EK pads.


----------



## mus1mus

Quote:


> Originally Posted by *dagget3450*
> 
> one can dream lol, im sure i will have a piggy or two when i get to testing. I am not expecting to game on anything over 1150/1200 if i am lucky is my guess. I am hoping to see how these will compare to my old 290x ref cards. They were all Hynix and hit 1600-1700 vram clocks. I think i was able to bench close to 1200 on cores, but i wasnt pushing them that hard. I also seem to recall my vrm temps were pretty high cause i was using stock ek pads.


Cooling them does wonders. Voltage headroom comes along with that.

When you hit a voltage limit but can keep them cool enough, let me know. Maybe we can unlock them.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Cooling them does wonders. Voltage comes along that line.
> 
> When you hit a Voltage limit, but maintains them cool enough, let me know. Maybe we can unlock them.


Eh, even with low VRM temps I was never able to get past 1200/1650; Elpida is definitely limited. My VRM 1 is even cooler now, staying below 50°C. I could try again to see if I get more than 1650, but doubtful. Even on water, if they stay below 40 it won't really matter much; these cards are limited, we all know that.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dagget3450*
> 
> lol i finally started blocking my 390x cards. of 3 tested so far only 1 was able to bench @ 1800 on ram with a small vcore bump. Not sure if this will change once i get them water cooled but i doubt it will. looks like all three so far hit 1700 with no problem as suggested by others. gpu core clocks i havent done much testing with really but i would be super stoked if i could pull 1200 on all 4 of them. Then all this may be wasted if polaris comes around almost matching a 980ti/furyx. However i don't think they will.


Looking good!!

Remember, if you do want the 1750 VRAM (it seems to be the magic mark for all 390s), you may need to add 25-50mV of AUX voltage. Can't explain why, but it's been the norm on almost all the ones I've tested.

Can't wait to see how the cores do on water...... I would expect no less than 1175 each, and with your setup you may be able to add a little voltage and get 1200.

That's going to be a seriously powerful setup!
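For anyone new to this, the bump-test-back-off routine we keep describing can be sketched as a simple loop. Illustrative only: `is_stable` stands in for whatever stress test you actually run (a FireStrike loop, a game session), and nothing here touches the card; you'd still do the clock changes by hand in Afterburner/Trixx.

```python
# Hedged sketch of the manual "raise clock, stress test, back off on
# first failure" overclocking workflow. is_stable() is a placeholder
# for a real stability test; no hardware is touched here.
def find_max_stable(start_mhz, limit_mhz, step_mhz, is_stable):
    """Walk the clock up in fixed steps; return the last stable value."""
    best = start_mhz                  # known-good starting clock
    clock = start_mhz + step_mhz
    while clock <= limit_mhz:
        if not is_stable(clock):
            break                     # first failure: keep previous stable clock
        best = clock
        clock += step_mhz
    return best

# Example: a hypothetical card that artifacts above 1750 MHz on the VRAM
print(find_max_stable(1500, 1900, 25, lambda c: c <= 1750))
```

In practice each `is_stable` call is minutes of testing, which is why people settle on coarse 25-50 MHz steps rather than hunting for the exact limit.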


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looking good!!
> 
> Remember, if you do want the 1750 VRAM (seems to be the magic mark for all 390's), then you may need to add 25-50mv of AUX voltage. Can't explain why, but it's been the norm on almost all the ones I've tested.
> 
> Can't wait to see how the cores do on water...... I would expect no less than 1175 each, and with your setup you may be able to add a little voltage and get 1200.
> 
> That's going to be a seriously powerful setup!


The reference blocks (depending on which company) cool the VRMs decently. It's not as amazing as people think, because water doesn't flow directly over VRM 2 (the trio, or single VRM, next to the PCIe bracket); VRM 1, on the other hand, gets cooled very efficiently with some blocks. Considering I can get 46°C maxed out on VRM 1, passively cooled with a single 900rpm fan, that should be testament to that.


----------



## patriotaki

My R9 390 looks sweet in my new case








What do you think of my build?




edit: waiting for my red LED strips to arrive


----------



## battleaxe

She's running again. I don't know how they fixed it so fast, but my old card is back. So glad I didn't get it replaced with a dud. I decided to steel wool the copper pipe VRM coolers and put a coat of clear finish on them so finger prints don't create oxidation. Looks better.

Looking forward to next gen so I can get some full blocks.










Spoiler: Warning: Spoiler!


----------



## n64ADL

How much should I overclock the MSI R9 390 GPU? I'm considering overclocking this bad boy to make it last longer until Polaris comes out, or buying another R9 390. What do you guys recommend? I know MSI has an OC feature in their software that brings it up 80 MHz, but I feel like it can perform a lot better than that.


----------



## battleaxe

Quote:


> Originally Posted by *n64ADL*
> 
> How much should I overclock the MSI R9 390 GPU? I'm considering overclocking this bad boy to make it last longer until Polaris comes out, or buying another R9 390. What do you guys recommend? I know MSI has an OC feature in their software that brings it up 80 MHz, but I feel like it can perform a lot better than that.


What monitor are you running? What games do you play? Is FPS a problem now?

If running 4k then you can benefit from Xfire. I use Xfire, but even in single card on 4k most games are playable with decent settings. If you don't have 4k then Xfire is completely pointless. One card is plenty for 1080p IMO.


----------



## n64ADL

Quote:


> Originally Posted by *battleaxe*
> 
> What monitor are you running? What games do you play? Is FPS a problem now?
> 
> If running 4k then you can benefit from Xfire. I use Xfire, but even in single card on 4k most games are playable with decent settings. If you don't have 4k then Xfire is completely pointless. One card is plenty for 1080p IMO.


3440x1440. Some games play better than others; BF4 starts struggling if I turn it up to ultra at 3440x1440. I have an i7 5820K CPU and Windows 10. My framerate in The Division is just awful at 3440x1440, and the worst part is the game crashes EVERY TIME I try to change the resolution to a lower one. That game has some of the worst options handling I've ever seen, next to Dark Souls. I just would like to get more FPS to last a little bit longer, because not all games play nice at that resolution, including MGS V. I always turn on FreeSync, as it's an Acer 3440x1440 ultrawide 34-inch FreeSync monitor.


----------



## battleaxe

Quote:


> Originally Posted by *n64ADL*
> 
> 3440 x 1440. some games play better than others, bf4 starts struggles if i turn it up to ultra at 3440 x 1440. have a i7 5820k cpu, and windows 10. my framerate in the division is just awful at 3440 x 1440, the worst part is the game crashes EVERYTIME i try to change the resolution to lower one. that game is one of the worst option changing available for customers i've ever seen next to dark souls. i just would like to get more fps to last a little bit longer bc not all games play nice at that resolution, including mgs V i always turn on freesync as its an acer 3440 x 1440 ultra wide 34 inch freesynce monitor.


If the game is crashing that has nothing to do with the GPU frames. Something is wrong. I can play BF4 on one GPU, but I have to use medium and some high settings and no AA. Frames are over 60 avg. This is on 4k.

Something is wrong there.


----------



## diggiddi

Quote:


> Originally Posted by *n64ADL*
> 
> 3440 x 1440. some games play better than others, bf4 starts struggles if i turn it up to ultra at 3440 x 1440. have a i7 5820k cpu, and windows 10. my framerate in the division is just awful at 3440 x 1440, the worst part is the game crashes EVERYTIME i try to change the resolution to lower one. that game is one of the worst option changing available for customers i've ever seen next to dark souls. i just would like to get more fps to last a little bit longer bc not all games play nice at that resolution, including mgs V i always turn on freesync as its an acer 3440 x 1440 ultra wide 34 inch freesynce monitor.


What's your RAM speed? 2400, I hope? BF4 likes fast RAM. Anyhoo, I say double up on the GPU as long as you have the power to run it.


----------



## kizwan

Quote:


> Originally Posted by *n64ADL*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> What monitor are you running? What games do you play? Is FPS a problem now?
> 
> If running 4k then you can benefit from Xfire. I use Xfire, but even in single card on 4k most games are playable with decent settings. If you don't have 4k then Xfire is completely pointless. One card is plenty for 1080p IMO.
> 
> 
> 
> 3440 x 1440. some games play better than others, bf4 starts struggles if i turn it up to ultra at 3440 x 1440. have a i7 5820k cpu, and windows 10. my framerate in the division is just awful at 3440 x 1440, the worst part is the game crashes EVERYTIME i try to change the resolution to lower one. that game is one of the worst option changing available for customers i've ever seen next to dark souls. i just would like to get more fps to last a little bit longer bc not all games play nice at that resolution, including mgs V i always turn on freesync as its an acer 3440 x 1440 ultra wide 34 inch freesynce monitor.
Click to expand...

BF4 Ultra @1440p with crossfire runs great here, but I don't have numbers to give you. No lagging/stuttering for sure. @4K, if I remember correctly, either High or Very High w/o AA, averaging 60 FPS. Not so sure, need to check it again.


----------



## peejay2104

Hi everyone, I've got an MSI R9 390X which gets too hot, I think :/
When I play Assetto Corsa, for example, the fans on the card have to ramp up to about 70% to keep the card at 82°C. Isn't this a bit much, or is this normal? Anyone who can help me with this? Many thanks!
Peejay


----------



## bluej511

Quote:


> Originally Posted by *peejay2104*
> 
> Hi everyone, I've got an MSI R9 390X which gets too hot, I think :/
> When I play Assetto Corsa, for example, the fans on the card have to ramp up to about 70% to keep the card at 82°C. Isn't this a bit much, or is this normal? Anyone who can help me with this? Many thanks!
> Peejay


Depends where you live, your case, and your case airflow. 82°C sounds about right for an MSI; it isn't a card that tends to stay cool. You can change the thermal paste, which might help, but unless it throttles you're fine for temps. It's a bit warm, but it's fine.

https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/34.html
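For what it's worth, the fan curves you set in Afterburner/Trixx are just piecewise-linear maps from temperature to duty cycle, so "70% at 82°C" is simply where the stock curve happens to land. A rough sketch (the curve points below are made-up examples, not MSI's stock curve):

```python
# Rough sketch of a piecewise-linear fan curve like the ones set in
# Afterburner/Trixx. The (temp_C, duty_%) points are invented examples.
CURVE = [(40, 20), (60, 35), (75, 55), (85, 80), (95, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Linearly interpolate fan duty (%) for a given core temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]            # clamp below the first point
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]               # clamp above the last point

print(fan_duty(82))
```

Flattening the upper points of a custom curve is how you trade a few degrees of temperature for less fan noise.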


----------



## spyshagg

Crossfire at high resolutions will devastate a slow CPU/RAM combo.

Assetto Corsa 6400x1440 single card: 35% CPU usage.
Assetto Corsa 6400x1440 crossfire: 75% CPU usage, into hyperthreading territory.

[email protected] DDR3 1333

When you reach the portion of the game that does require CPU power, you won't have CPU resources available.


----------



## peejay2104

Quote:


> Originally Posted by *bluej511*
> 
> Depends where you live, your case and case flow. 82°C sounds about right for an msi. The MSI isnt a card that tends to stay cool, you can change the thermal paste might help but unless it throttles youre fine for temps, its a bit warm but its fine.
> 
> https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/34.html


It's only around 18°C in my room, and airflow in the case is good too; I've got three intake and three exhaust fans. I'm fine with the temps, I just don't think it should be spinning at 70% fan speed in order to stay at 80°C :/

Pj


----------



## Rmosher

I want to update my entry. With the new Crimson 16.3.2 I was able to push my OC higher than before. Here is my GPUZ link: http://www.techpowerup.com/gpuz/details/b3sdm. Sapphire Tri-X R9 390X. Stock/Air. 100 mV. Not bad for stock air cooling.


----------



## dagget3450

So quiet in here...

3 cards blocked, testing the 4th here. Still stoked that I get to keep my water blocks in action. Would already be further along if I hadn't been sick with a cold last week.

I have Hitman now (came with the GPU); wondering if it's any good for benching.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> So quiet in here...
> 
> 3 cards blocked testing 4th here. still stoked that i get to keep my water blocks in action. would already be further along if i hadn't been sick with a cold last week.
> 
> I have Hitman now(came with gpu), wondering if its any good for benching.


I hear it's a DX12 nightmare, but I would love to see the results. I've been getting back into Euro Truck Simulator 2 haha.


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> I hear its a DX12 nightmare but would love to see the results. Ive been getting back into Euro Truck Simulator 2 haha.


The quietness is due to gaming, now that's a good sign









I hate AMD so much:


j/k - my 12th card is in a box atm. rest are being switched around or put up for sale


----------



## Agent Smith1984

Hitman is the most broken game I've ever played. DX12 lockups constantly. I have uninstalled it altogether now....


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hitman is the most broken game I've ever played. DX12 lockups constantly. I have uninstalled it altogether now....












Is 2016 turning out to be the worst PC gaming year? 2015 was rough, but it had tons of good titles.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dagget3450*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is 2016 turning out to be the worst PC gaming year? 2015 was rough, but it had tons of good titles.


It should get better... I hope, anyway!

Thinking Doom is going to be awesome, and the beta tests show AMD dominating in it.


----------



## Stige

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It should get better... I hope anyways!
> 
> Thinking Doom is going to be awesome, and beta test show AMD dominating in it


Doom is just utter garbage; it's just a Halo copy and has nothing to do with Doom really. Only COD and Halo fanbois can like crap like that.


----------



## gupsterg

Wasn't bowled over by the DooM beta. I know plenty don't like SWBF, but I've been loving it as a quick session of MP.


----------



## tolis626

I know it's old news by now, but gimme Witcher 3 all day every day and I'm happy. It may slaughter my 390x regardless of overclock, but it's gorgeous on top of having some of the best storytelling ever. It also just works, no problems whatsoever. CDPR set the bar a bit too high, especially for RPGs. Gotta love these guys.

Sent from my GT-I9300 using Tapatalk


----------



## GorillaSceptre

I've been meaning to pick up TW3, but from everything I've heard the game is a monster... I'll need months to finish it. I loved the first two, so I have to get it at some point.

Also been playing a lot of SWBF; I'm liking it a lot lately, far more than I did at launch for some reason. Quality game, just nowhere near worth $60.


----------



## gupsterg

I paid £1 for SWBF









----------



## tolis626

Quote:


> Originally Posted by *GorillaSceptre*
> 
> I've been meaning to pick up TW3 but from everything I've heard the game is a monster... I'll need months to finish it. I loved the first two so i have to get it at some point.
> 
> Also been playing a lot of SWBF, I'm liking it a lot lately, far more than i did at launch for some reason. Quality game, just nowhere near worth $60.


It's a monster alright. I've been playing and playing and playing and still have a lot to go. 100% completion needs 150-200 hours or more. I've been playing just under 100 hours and I still have a lot of quests and a lot of undiscovered places to go. If, however, you decide to just follow the main storyline and only the important side quests... well, it's still long, but not as long. It's also beautiful. B-e-a-utiful.

Sent from my GT-I9300 using Tapatalk


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hitman is the most broke game I've ever played. Dx12 lockups constantly. I have uninstalled altogether now....


Does it have an in-game benchmark? Also, have any Mirror's Edge Catalyst benchmarks been done?


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> Does it have an in game benchmark? Also has there been any Mirrors edge catalyst benchmarks done?


Mirror's Edge is still under embargo; there are a few vids on YouTube with the R9 390. Seems to run pretty well.


----------



## Chaoz

Quote:


> Originally Posted by *bluej511*
> 
> Mirrors edge is still under embargo, theres a few vids on youtube with the r9 390. Seems to run pretty well.


I've played the beta on max settings and it was very smooth, no lag or issues. Okay, it's still in beta, but I liked what I saw already.


----------



## afyeung

Quote:


> Originally Posted by *dagget3450*
> 
> The quietness is due to gaming, now that's a good sign
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hate AMD so much:
> 
> 
> j/k - my 12th card is in a box atm. rest are being switched around or put up for sale


Nice collection! Why do you need 4 way 390x's?


----------



## dagget3450

Quote:


> Originally Posted by *afyeung*
> 
> Nice collection! Why do you need 4 way 390x's?


It's not really "needed", except the 8GB of VRAM for me, but if a game uses it then it's nice. It's also for a project I am reviving for high resolutions.


----------



## battleaxe

I just checked the serial numbers, and I do not have the same card back from RMA that I thought I sent in.

And the new one does at least 1240MHz.

So that is four XFX 390X cards I have had now that all do over 1220MHz. I'm thinking about buying another one for 3-way Tri-Fire. Very happy with XFX, I have to say. 4 out of 4 cards that hit 1220MHz+ on the core is pretty crazy if you ask me. One of them did 1270MHz, and I'm not sure what this new one from RMA is going to hit yet; I haven't really had time to test it out properly.

I sure hope the next gen is worth the wait, because the 290X/390X cards have been the best deal going for over two years now, if you ask me.


----------



## dagget3450

Quote:


> Originally Posted by *battleaxe*
> 
> I just check the serial numbers and I do not have the same card back from RMA that I had thought I sent in.
> 
> And the new one does at least over 1240mhz.
> 
> So that is four XFX 390X cards I have had now that all do over 1220mhz. I'm thinking about buying another one for 3way Tri-Fire. Very happy with XFX I have to say. 4 out of 4 cards that hit 1220mhz + on core is pretty crazy if you ask me. One of them did 1270mhz, and I"m not sure what this new one from RMA is going to hit yet. I haven't really had time to test it out very well yet.
> 
> I sure hope the next gen is worth the wait because the 290x/390x cards have been the best deal going now for over two years if you ask me.


which model, and what voltage?


----------



## afyeung

Quote:


> Originally Posted by *dagget3450*
> 
> It's not really "needed" except 8gb vram for me but, if a game uses it then its nice. It's also for a project i am reviving for high resolutions


BF4 Ultrawide 1440P would be a dream with that setup







hopefully you also have the cpu power to back it up lol


----------



## dagget3450

Quote:


> Originally Posted by *afyeung*
> 
> BF4 Ultrawide 1440P would be a dream with that setup
> 
> 
> 
> 
> 
> 
> 
> hopefully you also have the cpu power to back it up lol


If I do high-res only, the CPU won't be a big deal, but right now I am debating teaming it up with my 5960X. It did fine with 4x Fury X mostly. However, I have an SR-2 I am debating using instead. Personally, I don't like ultrawide monitors due to the narrow vertical size. That said, they are nice, just not for me.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> If i do high res only cpu wont be a big deal, but right now i am debating teaming it up with my 5960x. It did fine with 4x FuryX mostly. However i have an SR2 i am debating using instead. Me personally i don't like the ultra wide monitors due to the narrow vertical size. That said they are nice though just not for me.


Depends how close you sit. I'm about 2 feet away and using a 23-inch right now. Vertically it's just fine, not too small, not too big. A 29-inch has the same height. If you're sitting further away, I can understand though.


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> which model, and what voltage?


Stock voltage is 1.211V (under load) and I was putting +65mV to it when I did this. I need to find time to really see how this card does. My second one in crossfire tops out around 1225MHz in FireStrike, I know that. Not sure what the new one in slot 0 does yet.

And I'm not sure what version they are. How do I check that? GPU-Z? I didn't realize there were different versions of the 390X.

Edit: Right now I've got an air bubble in the VRM loop somewhere that's keeping the VRMs hotter than they should be. I need to figure that out before I push it too much.


----------



## dagget3450

Quote:


> Originally Posted by *battleaxe*
> 
> Stock voltage is 1.211 (under load) and I was putting +65 to it when I did this. I need to get time to really see how this card does. My second one in Xfire tops out around 1225mhz on FS I know that. Not sure what the new on in slot0 does yet.
> 
> And I'm not sure what version they are. How do I check that? GPUZ? I didn't realize there were different versions of the 390x?
> 
> Edit: Right now I've got an air bubble in the VRM loop somewhere that's keeping the VRM's hotter than they should be. I need to figure that out before I push it too much.


I was just curious if they are reference PCB or something else. What video ports do you have? That might narrow it down some. Mine are straight AMD reference, identical to the AMD 290X PCB. Does yours have a blower-style fan or the DD cooler?


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> I was just curious if they are reference pcb or something else. What video ports do you have, that might narrow it some. Mine are straight AMD reference and identical to 290x pcb from AMD. Does yours have blower style fan or DD cooler?


All of mine have been the 390X DD versions. All look identical: one DVI-D, one DVI-A, DP, HDMI. That's all I know. I think they have all been the same version, as the BIOS versions have (I think) been the latest. I'm pretty sure they are all the latest models of the DD 390X.

The VRM on the second card is 13C cooler than the first card under stress, so I know I've got a bubble in there. Gotta figure out how to get it out. I'm still learning about full loops. I love them, but when you change gear out as often as I do, the bubbles seem to be a pain in the rear. I'm sure I'll get better with it in time though.


----------



## dagget3450

Quote:


> Originally Posted by *battleaxe*
> 
> All of mine have been the 390x DD versions. All look identical. One DVI-D, DVI-A, DP, HDMI. That's all I know. I think they have all been the same version as the BIOS versions have been (I think) the latest BIOS versions. I'm pretty sure they are all the latest models of DD 390x.
> 
> The VRM on the second card is 13c cooler than the first card under stress. So I know I've bot a bubble in there. Gotta figure out how to get it out. I'm still learning about full loops. I love the full loop, but when you change gear out as often as I do, seems the bubbles are a pain in the rear to me. I'm sure I'll get better with it in time though.


They sound like they are reference. I almost bought those myself, but XFX support in another thread said an inductor had changed size and thus my old EK blocks wouldn't fit without cutting them with a Dremel. So I settled for the VR6 model that came with the reference blower.

Yeah, hope you get that sorted. I went with 17 W/mK Fujipoly this time, as my 290Xs were using the EK-provided pads before. I was getting like 70/80c on the VRMs with my 290X at +200mv.


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> They sound like they are reference, i almost bought those myself but XFX support in another thread said an inductor had changed size and thus my old ek blocks wouldnt fit without cutting them with a dremel. So i settled for the VR6 model that came with ref blower.
> 
> Yeah hope you get that sorted, i went with 17wmk fujipoly this time as my 290x were using ek provided pads before. I was getting like 70/80c on vrms with my 290x and 200mv.


Yes. These are the ones with the new cap; they don't fit the old block without some mods. That's why I made my own VRM coolers. My second card hits 50c on VRM1, but the first hits around 63c, so I definitely have an air bubble in there.


----------



## afyeung

Quote:


> Originally Posted by *battleaxe*
> 
> Stock voltage is 1.211 (under load) and I was putting +65 to it when I did this. I need to get time to really see how this card does. My second one in Xfire tops out around 1225mhz on FS I know that. Not sure what the new on in slot0 does yet.
> 
> And I'm not sure what version they are. How do I check that? GPUZ? I didn't realize there were different versions of the 390x?
> 
> Edit: Right now I've got an air bubble in the VRM loop somewhere that's keeping the VRM's hotter than they should be. I need to figure that out before I push it too much.


Only +65mv to hit 1200mhz+?


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> All of mine have been the 390x DD versions. All look identical. One DVI-D, DVI-A, DP, HDMI. That's all I know. I think they have all been the same version as the BIOS versions have been (I think) the latest BIOS versions. I'm pretty sure they are all the latest models of DD 390x.
> 
> The VRM on the second card is 13c cooler than the first card under stress. So I know I've bot a bubble in there. Gotta figure out how to get it out. I'm still learning about full loops. I love the full loop, but when you change gear out as often as I do, seems the bubbles are a pain in the rear to me. I'm sure I'll get better with it in time though.


Running the pump at full speed should get rid of it; it seems like it's a big enough bubble to push temps that much higher. Unless the block isn't sitting well against the VRM? Don't worry about the air bubbles getting sucked back into the pump, it will break them up into tinier and tinier bubbles, so that when you run it back at a lower speed they'll just pass through.

You should have seen how crazy I went tilting my case to get air bubbles out, I almost had it flat on its side haha.


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> Run the pump at full speed should get rid of it, seems like its a big enough bubble to get temps that much higher. Unless its not sitting too well against the VRM? Dont worry about the air bubbles getting sucked back into the pump, it will break em up into tinier and tinier bubbles so that when you run it back at a lower speed theyl just pass thru.
> 
> You should see how crazy i went titling my case to get air bubbles out, i almost had it flat on its side haha.


I just did this. Temps dropped like a rock. Very nice.

I could hear the bubbles going crazy. It sounded like a bag of marbles was let loose in my case. Thank you!

+1

When I get some time away from CAD I'll do some bench runs to see what she can do now.

Edit: Decided to do a quick run. Not sure what happened; VRM temps shot back up again and I can't seem to keep them down. They were down in the high 40's, then all of a sudden shot up to almost 80 in a few minutes. Doesn't add up. Anyway, a decent little run.

It holds 1220MHz at only +63mv. The max voltage shown in GPU-Z is before vdroop.

I ran it up to 1270MHz at +100mv, but the VRM was just getting too hot. I have to figure out how to get VRM1 cooler again. I had it working on the old card, but it seems I'm not getting very good contact or something.
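For anyone following the offset math: these cards take a millivolt offset on top of the stock VID, and the number GPU-Z reports is pre-droop, so the voltage actually held under load is a bit lower. A rough sketch of that arithmetic (the 1.211 V stock figure and the +63/+100 offsets come from this thread; the 30 mV droop is just an assumed example, not a measured value):

```python
# Rough voltage-offset math for Hawaii cards (illustrative only).
STOCK_VID = 1.211  # V under load, as reported earlier in the thread

def requested_voltage(offset_mv):
    """Stock VID plus the software offset, i.e. roughly what GPU-Z shows pre-droop."""
    return STOCK_VID + offset_mv / 1000.0

def loaded_voltage(offset_mv, droop_mv=30):
    """Estimated voltage actually held under load; droop_mv is an assumption."""
    return requested_voltage(offset_mv) - droop_mv / 1000.0

for offset in (63, 100):
    print(f"+{offset}mv -> {requested_voltage(offset):.3f} V requested, "
          f"~{loaded_voltage(offset):.3f} V under load (assuming 30 mV droop)")
```

The actual droop depends on the VRM's load-line setting, so treat the second number as a ballpark rather than a measurement.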


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> I just did this. Temps dropped like a rock. Very nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I could hear the bubbles going crazy. Sounded like a bag of marbles were let loose in my case. Thank you!
> 
> +1
> 
> When I get some time away from CAD I'll do some bench runs to see what she can do now.
> 
> Edit: Decided to hit a quick run. Not sure what happened. VRM temps shot back up again. Can't seem to keep them down. They were down in the high 40's then all of a sudden shot up to almost 80 in a few minutes. Doesn't add up. Anyway, decent little run.
> 
> Holds 1220mhz at only +63. The max voltage shown on GPUZ is before Vdroop.
> 
> I ran it up to 1270mhz at +100mv but the VRM was just getting too hot. I have to figure out how to get VRM1 cooler again. I had it working on the old card, but seems I'm not getting very good contact or something.


Glad it helped, and I'm a noob at water cooling too haha. You should definitely be getting lower VRM temps than I'm getting on my Alphacool. I did notice however that VRM2 on most blocks doesn't even have water flowing directly over it, just very close by.

For me, having a fan blowing over the long strip of VRMs gets temps below 50C. I also noticed that VRM2 on the Sapphire is only a single VRM, while reference cards have 3.


----------



## lefteris82

Hello guys. I recently bought an MSI R9 390X Gaming 8GB. When I'm playing games, for example Rise of the Tomb Raider or The Evil Within at max settings at 1440p, I'm getting 93-94 degrees on the GPU core regardless of the GPU fan speed.
I mean that whether I set the fan to 100% rpm or 50%, I'm getting the same temperature. Is this normal?

The case has good airflow, with 2 x 140mm front and 1 x 120mm rear at high rpm.


----------



## bluej511

Quote:


> Originally Posted by *lefteris82*
> 
> Hello guys. I recently bought an MSI R390X gaming 8GB. When i 'm playing games for example Rise of the Tomb raider or Evil within at max settings 1440p . i'm getting 93-94 degrees on the GPU core REGALDLESS the FAN of the GPU.
> I mean that if i set the fan at full rpm 100% or at 50% i 'm getting the same temperature. Is this logical?
> 
> the case has good airflow with 2 x 140mm front and 1 120mm back high rpms.


You could take it apart and replace the thermal paste. GC Extreme, with its thickness, seems to be the best for GPUs. The MSI cards do tend to run hot, but at 94°C it would start throttling.


----------



## lefteris82

I cannot remove the heatsink to replace the thermal paste, because then I would lose the warranty.

There is no throttling though; I ran benchmarks and I'm getting the same results as review sites.


----------



## bluej511

Quote:


> Originally Posted by *lefteris82*
> 
> I cannot remove the heatsink to replace thermal paste because then i will lose the warranty.
> 
> Although there is no throttling, I ran benchmarks and im getting same results from review sites.


You won't lose your warranty. You can call MSI and ask, but I'm pretty sure just replacing the thermal paste won't void it. A few manufacturers have even stated so.


----------



## gupsterg

AFAIK on MSI it does, and also on Sapphire. I do know of people that have RMA'd successfully even with the TIM changed, though; perhaps it depends on who checks the card when it's RMA'd, and whether the company pulls the owner up on it.


----------



## lefteris82

I will contact MSI, and if it's allowed I will change the thermal paste.

I read in the MSI forums, in a topic about a GTX 980 Ti and a 700W PSU, that this may be a PSU issue.

My PSU is a Cooler Master RS-700-ASAA.


----------



## battleaxe

Quote:


> Originally Posted by *gupsterg*
> 
> AFAIK on MSI it does, also Sapphire
> 
> 
> 
> 
> 
> 
> 
> . I do know of people that have RMA'd sucessfully though with even TIM changed
> 
> 
> 
> 
> 
> 
> 
> , perhaps it depends who checks card when RMA'd / company if they pull an owner on it.


Yes. I ran a water block on my Sapphire and had to RMA it. No issues. XFX honors the warranty too, even if the stickers are removed. Sapphire doesn't even use stickers AFAIK; there weren't any on my 290X anyway. I have the replacement cards back from both companies now. Both were pretty fast at replacing them.


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> Glad it helped, and im a noob at water cooling too haha. You should definitely be getting lower vrm temps then im getting on my alphacool. I did notice however that VRM 2 on most block doesnt even have water flowing directly over it just very closely.
> 
> For me having a fan blowing over the long strip of VRMs i get temps below 50C. I did notice however that VRM 2 on the sapphire is only a single VRM while reference ones have 3.


Is that on load? How much voltage are you seeing below 50c on VRM1?


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> Is that on load? How much voltage are you seeing below 50c on VRM1?


Yeah, on load, and at whatever stock volts are, so I'm guessing 1.2 or slightly less. I have a 120mm fan blowing right on the fins lol. I went from 60-ish to 50 just by adding a fan right on my Alphacool block. I was shocked.


----------



## patriotaki

Hmm, maybe I should stick with my 390.. I don't know.
I found someone who wants to buy my PCS+ 390 for €300.
My only options are:
1) sell the 390 and get a used GTX 970
2) sell the 390 and keep the money for Polaris
3) keep the card

Should I sell it now for €300, or later for less? Because when Polaris comes out, no one will buy the 390 for €300.

What do you think guys? Help me out here xD

PS: the reason I want to get rid of my 390 is coil whine


----------



## TrueForm

Sell card and wait for Polaris.


----------



## anti-duck

Quote:


> Originally Posted by *patriotaki*
> 
> Hmm maybe i should stick with my 390.. i dont know
> I found someone who wants to buy my pcs+ 390 for 300euro,
> my only options are :
> 1) sell 390 get a used gtx 970
> 2) sell 390 keep money for polaris
> 3) keep card
> 
> should i sell it now for 300euro or later cheaper..because when Polaris come out no one will buy the 390 for 300euro
> 
> what do you think guys? help me out here xD
> 
> PS: the reason i want get rid of my 390 is coil whine


A bit confused as to why #1 is even an option, why sell an R9 390 due to coil whine to get a slightly worse card known for coil whine? Just a bit strange.

If I had an R9 390, I'd just sell it now if I had the kind of offer you have and then wait for Polaris 10, but it will be a couple of months.


----------



## mus1mus

I'd be hard pressed to sell it now if I were you.

Look at two possible scenarios: if Polaris comes out great, 200 and 300 series cards will surely go down in price, meaning you can no longer sell them for decent money. But on the bright side, you could easily grab another one and still come out more powerful.

Polaris is still uncertain at this point. Even Pascal, in fact.

If they turn out to be duds, your card will still sell quite well. Plus, you can still game while waiting.


----------



## patriotaki

Quote:


> Originally Posted by *mus1mus*
> 
> I'd be hard press on selling it now if I were you.
> 
> Look at two possible scenarios, if Polaris comes out great, 200 and 300 series cards will surely go down in terms of prices. Means, you can no longer sell them at decent deals. But on the bright side, you can easily grab another and will still come out more powerful.
> 
> Polaris is still uncertain at this point. Even Pascal in fact.
> 
> If they turn out to be duds, your card will still sell quite well. Plus, you can still game while waiting.


Agreed, but getting a 2nd card is not an option for me xD I tried 2x PCS+ R9 390 on my machine once and the heat was....lol, you couldn't imagine; the whole room temp kept getting higher and higher.

I just installed my old 6870 in my PC. My exam period is almost here, so not playing games during exams is fine by me, and hopefully by the time I'm done Polaris will be out.


----------



## mus1mus

Oh yeah. That's just the main issue. lol

If you have a backup card, then yeah.

€300 is still quite a lot, isn't it?


----------



## TainePC

Quote:


> Originally Posted by *patriotaki*
> 
> Hmm maybe i should stick with my 390.. i dont know
> I found someone who wants to buy my pcs+ 390 for 300euro,
> my only options are :
> 1) sell 390 get a used gtx 970
> 2) sell 390 keep money for polaris
> 3) keep card
> 
> should i sell it now for 300euro or later cheaper..because when Polaris come out no one will buy the 390 for 300euro
> 
> what do you think guys? help me out here xD
> 
> PS: the reason i want get rid of my 390 is coil whine


If you are getting coil whine you really hate, return the 390 and ask to swap to a different model. If you are stubborn enough they will let you swap, at the very least with you paying the difference in cash. I did this with an RM750 to an EVGA G2 750 Gold because my RM was making coil noise.

Don't say the name "970". Replace it with "fraudulent piece of crap" and you will start getting into the territory of Nvidia's scam. A tonne of people (myself included) switched from Nvidia to AMD once they realized the 970 fraud. I was getting lag in Shadow of Mordor, GTA V and Watch Dogs with the 970. I fought with PC Case Gear and managed to get a swap to the XFX R9 390 I'm using right now.
PS: Slightly off topic, but if you do swap to a different 390 and plan to stay on air, don't get the XFX cards. Their cooler is horrible; I was hitting 74 degrees at stock clocks. The MSI and Asus ones look solid though.


----------



## THUMPer1

What thermal grease/paste/liquid are you guys using for your GPUs?

How are you applying it? As an "X" and letting the heatsink or block spread it out, or do you use a pea-sized amount and then spread it evenly, being careful not to put on too much?


----------



## Agent Smith1984

Quote:


> Originally Posted by *TainePC*
> 
> if you are getting coil wine you really hate return the 390 and ask to swap to a different model. if you are stubborn enough they will let you swap, at the very least paying the difference in real money. i did this with an rm750 to an evga g2 750 gold cos my rm was making coil noise.
> 
> Don't say the name "970" . Replace it with "Fraudulent piece of crap"" and you will start getting into the territory of Nvidias scam. a tonne of people (Myself included) switched from nvidia to amd once the realized the 970 fraud. i was getting lag in shadow of mordaw, gtav and watchdogs with the 970. i fought with pc case gear and managed to get a swap to the xfx r9 390 im using right now.
> ps. slightly off topic but if you do swap to a different 390 and plan to stay on air don't get the xfx cards. their cooler is horrible i was hitting 74 degrees stock clocks. The MSI and asus ones look solid though


In my opinion the XFX DD cards are really good; they are only two slots wide and seem to clock well since the improved VRM cooling.

The MSI does okay, but I honestly see 85c sometimes while gaming at 4K on 80% fan speed (though the VRM cooling is great and stays in the lower 70's).

I have great case airflow too: two Cooler Master JetFlo's at full tilt pulling air out, and two 120mm LEPA fans at full tilt intaking on my 240 radiator.

The Sapphire seems to cool really well, but the chips aren't binned as well.

The primary one to stay away from is the Asus; though I could keep the core within the low 80's, the VRM was almost 90c on stock voltage.


----------



## patriotaki

Which is the best solution for the 390?


----------



## tolis626

Quote:


> Originally Posted by *THUMPer1*
> 
> What thermal grease/paste/liquid are you guys using for your GPU's?
> 
> How are you applying it? As an "X" and let the heatsink or block spread it out, or do you use a pea size then spread it evenly careful not to put on too much?


On GPUs I usually suggest spreading the paste by hand, since it's a direct-to-die mount. You can't afford to have bare spots, so to speak. The safest method is to put down the amount you want (an X or a pea-sized blob or whatever) and spread it using some piece of plastic. I use the spatula that came with the tube of Gelid GC Extreme, but something like an old credit card will work just fine.

Best pastes for GPUs... Well, about the same as the ones that are best for CPUs, really. One could make the argument that the higher heat load benefits more from higher conductivity than from lower thermal resistance, but both matter, and a high-quality paste is a high-quality paste. Gelid GC Extreme, Thermal Grizzly Kryonaut, Arctic MX-4 and Phobya HE-Grease are some of the best, but there are others as well. There are lots of roundups online, so look at a few to make sure. The Kryonaut should perform the best among those, but I haven't tried it yet. GC Extreme works really well though.


----------



## christoph

I found out that the coil whine is not from my Sapphire 390, it comes from the PSU


----------



## mus1mus

I always spread. I'm reminded of the time I pulled off the block and found some un-TIMed areas. Before that there were numerous black screens.


----------



## bluej511

On GPUs I use an X or a dual line. It's going to spread anyway, and you pretty much only need a super thin layer. Never had issues, and I've done quite a few CPUs/GPUs.


----------



## anti-duck

I always apply a little more TIM than usual to a GPU so that when pressure is applied (if you don't spread), the paste covers the transistors on the GPU die.


----------



## Streetdragon

I use liquid metal paste on my GPU. Works great. Just be careful^^


----------



## spyshagg

Quote:


> Originally Posted by *christoph*
> 
> I found out that the coil whine is not from my Sapphire 390, it comes from the PSU


I had it happen to my 700W psu with 2x 290x. It nearly killed her.


----------



## bluej511

Quote:


> Originally Posted by *spyshagg*
> 
> I had it happen to my 700W psu with 2x 290x. It nearly killed her.


That's close to pushing that PSU to the max though. A 290X pulls 250W each easily without an overclock, and the CPU probably another 150W or so. For CrossFire I'd easily get an 850W unit if not 1000W.


----------



## n3o611

Quote:


> Originally Posted by *gupsterg*
> 
> AFAIK on MSI it does, also Sapphire
> 
> 
> 
> 
> 
> 
> 
> . I do know of people that have RMA'd sucessfully though with even TIM changed
> 
> 
> 
> 
> 
> 
> 
> , perhaps it depends who checks card when RMA'd / company if they pull an owner on it.


I asked Sapphire support before I changed my TIM, and they just replied: "As long as the card comes back in one piece, it's fine.. it won't void your warranty."


----------



## gupsterg

Like I said, it depends on who you chat to at the company and what happens with your RMA. If we view the wording of the warranties:

Sapphire Warranty Link
Quote:


> 1. Product Warranty will not be valid even if returned after purchased for the following cases:
> 
> Products that are defaced or physically damaged and modified by customer.


Also ref post 2 and 3 of this thread started by OCN Sapphire rep.

MSI Warranty Link
Quote:


> The following circumstances excluded from warranty coverage
> 
> 3.Unauthorized changes of non MSI parts, modifications or alterations , parts removal in or to the products


Recently when I bought a Sapphire Fury X I was surprised to find a full-cover sticker on the top/bottom of the card.

Removing this to install a custom panel voids the warranty: a) per the terms and conditions; b) I contacted Sapphire support twice using differing names/serials, hoping that if someone once stated it's OK I could use that info if refused an RMA.

The MSI one I had was like most Fury Xs, i.e. no sticker, just the soft-feel panels (Sapphire has these panels under the stickers).

In most web reports it's stated that AMD announced custom panels for the Fury X; from my own research this is untrue. The page linked in many of these reports is this; now read the section near the end of that posting by dtjong.

And who is dtjong? David Tjong is Product Marketing Engineer for AMD.



Spoiler: AMD's stance on Fury X custom panel



Quote:


> IMPORTANT: AMD's product warranty does not cover damage to your graphics card or system caused in whole or in part by removing, modifying or reinstalling the AMD Radeon Fury X faceplate, which activities you agree to carry out at your own risk. AMD will not provide replacement faceplates for any faceplates lost or damaged, nor will AMD be liable for any damages to the graphics card or your system caused during the removal, modification or reinstallation of the faceplate.
> 
> David Tjong is Product Marketing Engineer for AMD. His postings are his own opinions and may not represent AMD's positions, strategies or opinions. Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied.






All in all, AFAIK no company (in this context) wants users to mod what they make and will still honor the warranty, if we go by the wording. I'd advise keeping a record of that discussion with tech support.


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> Like I said depends who you chat to in company and happens with your RMA
> 
> 
> 
> 
> 
> 
> 
> . If we view the wording of warranty:-
> 
> Sapphire Warranty Link
> Also ref post 2 and 3 of this thread started by OCN Sapphire rep.
> 
> MSI Warranty Link
> Recently when I bought a Sapphire Fury X I was surprised to find a full cover sticker on the top/bottom of card.
> 
> 
> 
> Removing this to install a custom panel is warranty void a) ref terms conditions b) contacted sapphire support 2x using differing names/serials, hoping if once someone stated it's OK I'd use info if refused a RMA.
> 
> The MSI one I had was like most Fury Xs, ie no sticker but the soft feel panels (sapphire has these panels under stickers).
> 
> 
> 
> Now in most web reports it's stated AMD announced custom panels for Fury X, this is untrue from my own research
> 
> 
> 
> 
> 
> 
> 
> . The page linked in many of these reports is this, now read section near the end of that posting by dtjong.
> 
> And who is dtjong? David Tjong is Product Marketing Engineer for AMD.
> 
> 
> All in all AFAIK no company wishes a users to mod what they make (in this context) and honor warranty if we view wording. I'd advise keep a record of that discussion with tech support
> 
> 
> 
> 
> 
> 
> 
> .


Man, I am glad I am not the only one. I was super upset about the stickers on my Fury Xs as well.. all Sapphire.. and after reading that thread I may just not get Sapphire anymore. I just picked up some XFX cards. When Polaris launches I'll keep this in mind.


----------



## christoph

Quote:


> Originally Posted by *spyshagg*
> 
> I had it happen to my 700W psu with 2x 290x. It nearly killed her.


Quote:


> Originally Posted by *bluej511*
> 
> Thats closing to pushing that PSU to the max though. 290x 250w each easy without an overclock, cpu prob another 150 or so. For crossfire id easily get an 850 if not 1000.


Mine is an 850W with 70 amps on the 12V line. It doesn't make that much noise, but I'd try changing the PSU. So far it has been working with no problems at all: a smooth 60 FPS all day in any game.


----------



## barrelz

Hey guys,

I just recently got a Sapphire R9 390 & i5 6600K, but I have some performance issues that I can't solve.
When I play GTA 5 & run a benchmark on high settings I get around 20-25fps.
I was told that the drivers might be the problem, but since this is my first AMD card I have literally no clue what to do.

current driver:


cheers


----------



## bluej511

Quote:


> Originally Posted by *barrelz*
> 
> Hey guys,
> 
> just recently got a Sapphire R9 390 & i5 6600k, but i have some performance issues that i can't solve.
> When i play GTA 5 & do a benchmark on high settings i get around 20-25fps.
> I was told that the drivers might be the problems but since this is my first amd card i have literally no clue what to do.
> 
> current driver:
> 
> 
> cheers


You should check what the settings are set to under gaming and global settings. Make sure most of them are set to use application settings, and that frame rate target control and power efficiency are off.

On the other hand, AMD came out with ANOTHER new driver; I've already gone to 16.5.1, so we'll see how it does. I have backups of so many AMD drivers, I even have Catalyst 15.7.1 haha.


----------



## Hardstyler3

Quote:


> Originally Posted by *barrelz*
> 
> Hey guys,
> 
> just recently got a Sapphire R9 390 & i5 6600k, but i have some performance issues that i can't solve.
> When i play GTA 5 & do a benchmark on high settings i get around 20-25fps.
> I was told that the drivers might be the problems but since this is my first amd card i have literally no clue what to do.
> 
> current driver:
> 
> 
> cheers


Uninstall the driver with DDU and install the newest one.


----------



## barrelz

Went to 16.5.1 now, i'll have a look & report back.


----------



## patriotaki

Just sold my 390.

Waiting for Polaris now.

Remove me from the list.


----------



## afyeung

Quote:


> Originally Posted by *patriotaki*
> 
> just sold my 390
> 
> 
> 
> 
> 
> 
> 
> 
> waiting for polaris now
> 
> remove me from the list


lol. Gaming experience wasn't good enough?


----------



## mus1mus

That was quick.

Yet, http://thetechnews.com/2016/05/04/amd-radeon-r9-480-will-not-outperform-your-old-r9-390x/

Don't be disappointed....


----------



## patriotaki

Quote:


> Originally Posted by *afyeung*
> 
> lol. Gaming experience wasn't good enough?


Coil whine was good enough??

At least I can grab a Sapphire R9 390 even cheaper when Polaris comes out.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> That was quick.
> 
> Yet, http://thetechnews.com/2016/05/04/amd-radeon-r9-480-will-not-outperform-your-old-r9-390x/
> 
> Don't be disappointed....


Yet they're saying it will outperform the 980 Ti, so who to believe, haha.


----------



## patriotaki

Quote:


> Originally Posted by *bluej511*
> 
> Yet their saying it will outperform the 980ti so who to believe haha.


No one.

Just wait, and benchmarks will answer all our questions.

I just got €300 and don't have a GPU to play most games during my exam period, so for me it's a double WIN.


----------



## bluej511

Quote:


> Originally Posted by *patriotaki*
> 
> no one
> 
> 
> 
> 
> 
> 
> 
> 
> just wait and benchmarks will answer all our questions
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i just got 300EUR and dont have a gpu to play most games during my exam period so for me its double WIN


I know for a fact Polaris won't beat a 390/X, so I'm OK with it. I will wait until 2017 and rebuild, maybe do Vega with the new Intel chips, or with Zen if it's spectacular. Or upgrade to a 4790K. I just need to make sure the next card I get has a full waterblock.


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> Man i am glad i am not the only one, i was super upset about the stickers on my furyx as well.. all sapphire.. and after reading that thread i may just not get sapphire anymore. I just picked up some XFX cards. When Polaris launches ill keep this in mind


Yep, I was upset, but the fun thing about the web is there's a workaround, which I'm just about to exploit. Will PM you when successful, if you still have the Sapphire Fury X at that time.


----------



## mus1mus

They are now doing what Intel has done for the past few years: a mid-cycle release.

I'll just wait for a few more days after release. Surely, current-gen cards will drop in price.


----------



## afyeung

Quote:


> Originally Posted by *patriotaki*
> 
> Coil whine wass good enough??
> 
> At least I could grab a sapphire r9 390 when Polaris come out even cheaper


My MSI 390X has coil whine. I think a lot of cards have coil whine no matter the brand. It also has to do with the PSU.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> My MSI 390X has coil whine. I think a lot of cards have coil whine no matter the brand. Also has to do with PSU


Of my past 3 cards, none have had coil whine, same with my last 3 PSUs. The reason might be that I don't run my PSUs at full power? Who knows: TX650 > TX650 V2 > RM1000, no whine. MSI 5770 > Sapphire 7850 > Sapphire R9 390, no whine. Maybe I'm just lucky? I'd definitely hear it too, as my system is stupid quiet.


----------



## gupsterg

_Or_ you can't hear the high-frequency sound range.


----------



## bluej511

Quote:


> Originally Posted by *gupsterg*
> 
> _Or_ you can't hear high frequency sound range
> 
> 
> 
> 
> 
> 
> 
> .


Oh, how I wish that was true lol. I have pitch-perfect hearing; I actually hear highs better than lows according to my hearing test. I watched Linus's video and I gotta say I've never heard such awful screeching. I've played with vsync and without, with FRTC and without, and I've never heard it. I definitely would though; it would be enough for me to RMA a card or get a new one.


----------



## gupsterg




----------



## Synntx

Hey all, I haven't posted here in quite some time, as I've been waiting to receive the waterblock from AquaTuning for my 390X. Well, I got it, and I'm in love. Here is my latest Firestrike score:

http://www.3dmark.com/3dm/11906742?

LOOK AT DEM CLOCKS THO

So I've got about 10 degrees of thermal margin to play with, and need to get my hands on HIS iTurbo to push farther, but their link seems to be dead. Does anyone have a different download link I can snag?? Thanks.


----------



## m70b1jr

Quote:


> Originally Posted by *Synntx*
> 
> Hey all, I haven't posted here in quite some time as I've been waiting to receive my waterblock from AquaTuning for me 390x. Well, I got it, and I'm in love. Here is my latest firestrike score:
> 
> http://www.3dmark.com/3dm/11906742?
> 
> LOOK AT DEM CLOCKS THO
> 
> So I've got about 10 degrees of thermal margin to play with, and need to get my hands on HIS iTurbo to push farther, but their link seems to be dead. Does anyone have a different download link I can snag?? Thanks.


How did you get those clock speeds?...
I can't find a link to HIS iTurbo either.


----------



## Synntx

Quote:


> Originally Posted by *m70b1jr*
> 
> How did you get those clock speeds?...
> But I can't find a link to HIS Iturbo either.


Well, I'm on water, so I've got a much larger thermal margin than on air. To achieve those clocks I'm pushing +200mv.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Synntx*
> 
> Well, I'm on water so ivery got a much larger thermal margin than air. To achieve those clocks in pushing +200mv


Awesome score but.... Why u no overclock that cpu more?!?

5ghz will give that thing a tad more headroom and get that overall score up a lot too


----------



## Synntx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Awesome score but.... Why u no overclock that cpu more?!?
> 
> 5ghz will give that thing a tad more headroom and get that overall score up a lot too


I've tried a bazillion times to push past 4650MHz and it just won't stabilize, even at 1.65V vcore. Just a quirk of this particular 8320E.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Synntx*
> 
> I've tried a bazillion times to push past 4650MHz and it just won't stabilize, even at 1.65V vcore. Just a quirk of this particular 8320E.


Oh man, that's nuts, all the ones I've seen do at least 4.8GHz at around 1.4V.... My 8300 did 4.95 at around 1.475V, my 9590 needs a tad over 1.525V to do that, and 1.55V for 5GHz. Have you tried running around 1.4-1.45V and pushing the multi only? Those 95W chips seem to scale well at lower voltages up until a certain point, and then they just poop out. What are your RAM/NB/HT settings and voltages?


----------



## Synntx

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh man, that's nuts, all the ones I've seen do at least 4.8GHz at around 1.4V.... My 8300 did 4.95 at around 1.475V, my 9590 needs a tad over 1.525V to do that, and 1.55V for 5GHz. Have you tried running around 1.4-1.45V and pushing the multi only? Those 95W chips seem to scale well at lower voltages up until a certain point, and then they just poop out. What are your RAM/NB/HT settings and voltages?


I haven't tried running multi only. I guess I could try that real quick


----------



## Synntx

Failed a blend test instantly at 4.8GHz with 1.59V vcore. Though I've got thermal margin to push beyond 1.6V vcore, I'm not interested in doing so.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Synntx*
> 
> Failed a blend test instantly at 4.8GHz with 1.59V vcore. Though I've got thermal margin to push beyond 1.6V vcore, I'm not interested in doing so.


It shouldn't need any more than 1.45V or so for 4.8GHz. Are you sure your RAM is dialed in right? Those chips don't respond much to voltage that high. Your instability can be something as simple as not having enough CPU-NB voltage, or an IMC that doesn't like 1T RAM... Regardless, even if the chip really is that bad of a clocker (it would be the worst "E" I've ever seen) and really stops at 4.6, it would still only need 1.45-1.475V or so to get there.


----------



## Synntx

Sent you a PM


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It shouldn't need any more than 1.45V or so for 4.8GHz. Are you sure your RAM is dialed in right? Those chips don't respond much to voltage that high. Your instability can be something as simple as not having enough CPU-NB voltage, or an IMC that doesn't like 1T RAM... Regardless, even if the chip really is that bad of a clocker (it would be the worst "E" I've ever seen) and really stops at 4.6, it would still only need 1.45-1.475V or so to get there.


Very true, except for the Vcore part, which depends on the quality of the individual CPU.

Best to ask the guy to visit the FX forum.
Quote:


> Originally Posted by *Synntx*
> 
> Sent you a PM


Like I said, visit the FX forum. And not to derail this thread too much, here are a few tips:

Prime95 Blend tests core + CPU-NB + RAM.
Prime95 Small FFT tests the core only.


----------



## battleaxe

delete please


----------



## Agent Smith1984

If anyone is interested:

http://www.overclock.net/t/1599383/msi-390x-gaming-like-new-with-box-and-all-accessories

Everybody knows I go through hardware like candy...... so here is me moving on to the next thing that catches my eye, I guess... lol


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If anyone is interested:
> 
> http://www.overclock.net/t/1599383/msi-390x-gaming-like-new-with-box-and-all-accessories
> 
> Everybody knows I go through hardware like candy...... so here is me moving onto the next thing that catches my eye I guess... lol


which direction will you follow?


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> which direction will you follow?


No clue yet, I'm selling my CPU/board/GPU all off and going in a totally different direction, just not sure what that will be yet, lol


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No clue yet, I'm selling my CPU/board/GPU all off and going in a totally different direction, just not sure what that will be yet, lol


wait for polaris xD i want someone with me


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> wait for polaris xD i want someone with me


I'm also really curious to see this GTX 1070...... though I am bothered by the fact that it uses 8GB of GDDR5 on a 256-bit bus instead of going to GDDR5X like the 1080 is doing (at least, that's what's rumored).


----------



## flopper

Quote:


> Originally Posted by *patriotaki*
> 
> wait for polaris xD i want someone with me


anything red is great


----------



## patriotaki

I am sure the 1070 will be very, very expensive for what it has to offer if you compare it to the 490.


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> I am sure the 1070 will be very, very expensive for what it has to offer if you compare it to the 490.


Looks more to me like the 1070 will be targeted the same way the 970 was, at around $330-360 depending on the variant.

It should perform about as well as an overclocked 980, since it will have the exact same memory bus but with 8GB.

The Polaris cards are slated to offer 970/290X performance at a mainstream cost, meaning around $230-300 (depending on 480 or 480X I guess, not sure of the nomenclature yet)..... so they won't be any faster than what we are using now if you have a 390X...

AMD has stated this publicly. They aren't going for any performance achievements until Vega; Polaris will simply have better power efficiency.

The GTX 1080 should be about as fast as, or slightly faster than, a 980 Ti but for around $500, and the 1080 Ti should launch near Vega and compete directly with it.

If you have a 390X now, you won't gain much at all from going Polaris outside of lower power usage and getting a slightly cheaper card.


----------



## patriotaki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks more to me like the 1070 will be targeted the same way the 970 was, at around $330-360 depending on the variant.
> 
> It should perform about as well as an overclocked 980, since it will have the exact same memory bus but with 8GB.
> 
> The Polaris cards are slated to offer 970/290X performance at a mainstream cost, meaning around $230-300 (depending on 480 or 480X I guess, not sure of the nomenclature yet)..... so they won't be any faster than what we are using now if you have a 390X...
> 
> AMD has stated this publicly. They aren't going for any performance achievements until Vega; Polaris will simply have better power efficiency.
> 
> The GTX 1080 should be about as fast as, or slightly faster than, a 980 Ti but for around $500, and the 1080 Ti should launch near Vega and compete directly with it.
> 
> If you have a 390X now, you won't gain much at all from going Polaris outside of lower power usage and getting a slightly cheaper card.


Yes, but I don't think they will release the 490 with the same performance as the 390, that's just stupid..
Also, the 970 here in Greece cost about €400 when it came out lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *patriotaki*
> 
> Yes, but I don't think they will release the 490 with the same performance as the 390, that's just stupid..
> Also, the 970 here in Greece cost about €400 when it came out lol


No, if there is a 490 it will probably be Vega..... They are going to have to reduce the SKU number if the performance is going to be where they are claiming for Polaris.

UNLESS they are saying that Polaris 11 will compete with the 290X/970, and Polaris 10 is a 490 in the $450-500 range with a 20-25% improvement in performance that performs as well as Fiji does now but with 8GB of GDDR5X, and then maybe they launch a side-named card like they did with Fury, such as a "Fury II" or maybe something really cool like "Nightmare", that would be around 40-50% faster (than the 390X) in the $650 range....

Who knows how it will really play out...... what's funny is that my summer 2015 pricing chart in the OP is STILL relevant, lol

I have dropped my 390X to $325 shipped to the US48...









I am actually expecting someone international will probably buy it, though, and just pay an extra $25 in shipping to get a good deal on it.


----------



## Agent Smith1984

Here you go....

Looks like the 1070 is right around the 980 Ti in 3DMark 11, but I would say both are being bottlenecked by the i3.....

http://www.overclock.net/t/1595065/wccftech-nvidia-pascal-gtx-1080-1070-1060-benchmarks-leaked-3dmark-11-performance-entries-spotted/210#post_25003734

The results are sketchy due to the CPU, in my opinion. I have seen my brother's 980 Ti pull a 28k GPU score by itself in 3DMark 11 with a decent OC, and around 24k stock. His CPU is basically a 5960X though (a Xeon 16** v3 or something like that)....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm also really curious to see this GTX 1070...... though I am bothered by the fact that it uses 8GB of GDDR5 on a 256-bit bus instead of going to GDDR5X like the 1080 is doing (at least, that's what's rumored).


It won't be 8GB. It will be 7GB with 1GB of donkey-cart-throw-out-the-boat-anchor-into-the-oncoming-traffic kind of RAM.

Never again will I buy an Nvidia card at launch. Never. I'll wait until the dust settles and we actually know what we are getting from now on. Or just buy an AMD card whichever is easier.


----------



## bluej511

Has anyone tried the Forza beta? It's got some serious stutter/lag issues haha.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> It won't be 8GB. It will be 7GB with 1GB of donkey-cart-throw-out-the-boat-anchor-into-the-oncoming-traffic kind of RAM.
> 
> Never again will I buy an Nvidia card at launch. Never. I'll wait until the dust settles and we actually know what we are getting from now on. Or just buy an AMD card whichever is easier.


It's absolutely going to be a cut-down card with only about 7GB of usable memory..... Even the benchmark result shows 76**MB of VRAM.

And don't forget that if you are going 4K, the memory bus on the 1070 will STILL be fairly slow......

Sadly for AMD, if NVIDIA were to do something as simple as put it on a 512-bit memory bus, the card would probably destroy even the Fury X because the core architecture is so fast... err, because AMD has been gimped by devs for the most part, making the NVIDIA CUDA cores appear to be faster anyway. Thing is, I don't believe Pascal will have async compute like AMD, so DX12 will still belong to AMD, especially when Polaris drops on the smaller process
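For anyone who wants to sanity-check the bus-width argument, peak GDDR5 bandwidth is just bus width (in bytes) times effective data rate. A quick Python sketch; the 1070 numbers here are the rumored specs, not confirmed:

```python
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate."""
    return bus_bits / 8 * data_rate_gbps

# R9 390/390X: 512-bit bus, 6 Gbps effective GDDR5 -> 384 GB/s
print(bandwidth_gbs(512, 6))
# Rumored GTX 1070: 256-bit bus, 8 Gbps GDDR5 -> 256 GB/s
print(bandwidth_gbs(256, 8))
```

So even with faster memory chips, a 256-bit card would still have notably less raw bandwidth than Hawaii's 512-bit bus, which is the point being made above.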


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's absolutely going to be a cut-down card with only about 7GB of usable memory..... Even the benchmark result shows 76**MB of VRAM.
> 
> And don't forget that if you are going 4K, the memory bus on the 1070 will STILL be fairly slow......
> 
> Sadly for AMD, if NVIDIA were to do something as simple as put it on a 512-bit memory bus, the card would probably destroy even the Fury X because the core architecture is so fast... err, because AMD has been gimped by devs for the most part, making the NVIDIA CUDA cores appear to be faster anyway. Thing is, I don't believe Pascal will have async compute like AMD, so DX12 will still belong to AMD, especially when Polaris drops on the smaller process


I'm really hoping AMD keeps its lead on DX12 as we have seen so far. I would like to see AMD make a solid comeback this season. They need all the help they can get.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I'm really hoping AMD keeps its lead on DX12 as we have seen so far. I would like to see AMD make a solid comeback this season. They need all the help they can get.


So when are we getting some overclocked 390X CF benchmarks, man??


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So when are we getting some overclocked 390X CF benchmarks, man??


I just finished finals. Sorry. LOL

It's been a crazy last few weeks. I still have to figure out why VRM1 on the slot-0 card is getting too hot. I haven't had time to reseat the blocks or anything to try and figure it out.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's absolutely going to be a cut-down card with only about 7GB of usable memory..... Even the benchmark result shows 76**MB of VRAM.
> 
> And don't forget that if you are going 4K, the memory bus on the 1070 will STILL be fairly slow......
> 
> Sadly for AMD, if NVIDIA were to do something as simple as put it on a 512-bit memory bus, the card would probably destroy even the Fury X because the core architecture is so fast... err, because AMD has been gimped by devs for the most part, making the NVIDIA CUDA cores appear to be faster anyway. Thing is, I don't believe Pascal will have async compute like AMD, so DX12 will still belong to AMD, especially when Polaris drops on the smaller process


If Pascal doesn't do async compute, they might be in for a world of hurt with DX12 games and engines.
No future with that card if that's the case.


----------



## bluej511

In any case, DX12 seems to be very CPU heavy. I just tried Forza 6 (hey, it's free) and it uses 2 cores at 100% while the other two are all over the place, but boy does it look AMAZING. The R9 390 has no problems at ultra; it needs a bit of work, it's a beta after all, but it looks ridiculous. It's only an 18GB download, so worth a shot if you have W10. Going into the Xbox app and turning off Game DVR seems to help quite a bit.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> If Pascal doesn't do async compute, they might be in for a world of hurt with DX12 games and engines.
> No future with that card if that's the case.


I agree with this, but the main issue is only a handful of devs will use async. Why? Because NVIDIA still has all the market share, so it's totally pointless to use async. It's really a shame. I think the 1070 and 1080 will be complete busts, but we know how NVIDIA owners operate; they'll buy them anyway since NVIDIA will stop supporting older hardware.


----------



## patriotaki

gtx 1080 and 1070...
damn


----------



## patriotaki

The GTX 1070 will not have GDDR5X, that's a bonus for AMD, right?


----------



## flopper

Quote:


> Originally Posted by *bluej511*
> 
> I agree with this, but the main issue is only a handful of devs will use async. Why? Because NVIDIA still has all the market share, so it's totally pointless to use async. It's really a shame. I think the 1070 and 1080 will be complete busts, but we know how NVIDIA owners operate; they'll buy them anyway since NVIDIA will stop supporting older hardware.


Games are designed for consoles on AMD hardware, so async will be used.

Quote:


> Originally Posted by *patriotaki*
> 
> The GTX 1070 will not have GDDR5X, that's a bonus for AMD, right?


GDDR5 will be used, and GDDR5X will be used by AMD also.
Cost/price is the deciding factor.

To me it seems Polaris might be a price/performance card and the new king, as AMD can price and position it as *the choice to upgrade to* for anyone with a 370/380/960 or such; even a 970/390 owner could go there too. That would increase their market share.


----------



## patriotaki

Quote:


> Originally Posted by *flopper*
> 
> Games are designed for consoles on AMD hardware, so async will be used.
> GDDR5 will be used, and GDDR5X will be used by AMD also.
> Cost/price is the deciding factor.
> 
> To me it seems Polaris might be a price/performance card and the new king, as AMD can price and position it as *the choice to upgrade to* for anyone with a 370/380/960 or such; even a 970/390 owner could go there too. That would increase their market share.


Hmm, I don't know if AMD will be the king with the 490/490X; NVIDIA claimed that the GTX 1080 is twice as fast as the Titan X, and the GTX 1070 a bit faster than the Titan X.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> Games are designed for consoles on AMD hardware, so async will be used.
> GDDR5 will be used, and GDDR5X will be used by AMD also.
> Cost/price is the deciding factor.
> 
> To me it seems Polaris might be a price/performance card and the new king, as AMD can price and position it as *the choice to upgrade to* for anyone with a 370/380/960 or such; even a 970/390 owner could go there too. That would increase their market share.


True, but there are already TONS of console games that NVIDIA gets its hands on during development haha. Let's not kid ourselves. If they do async it won't be anywhere near as good as AMD's. They're kicking ass with a 4-year-old chip, I mean come on haha.


----------



## Master0fBlunt

Sooo, remember how I was having performance issues? Not anymore; turns out my junky 42" television was to blame. Who woulda thunk it. I now have a stable OC in Witcher 3 @ 1776x1000 Ultra+NoHair, OC'd to 1150 (+100MHz) / 1650 (+150MHz).

Proof:
http://steamcommunity.com/sharedfiles/filedetails/?id=680057949

Going to do a fresh GPU test and see how this new monitor boosts my score @ stock clock, then going for super clock. Fingers crossed, hope I don't burn her up, will have fan set to manual 80% lol....


----------



## dagget3450

Quote:


> Originally Posted by *patriotaki*
> 
> Hmm, I don't know if AMD will be the king with the 490/490X; NVIDIA claimed that the GTX 1080 is twice as fast as the Titan X, and the GTX 1070 a bit faster than the Titan X.


Excellent marketing; if you pay attention, they are talking about VR though. It's not twice as fast outside VR. We need to see reviews before we know. Both AMD and NVIDIA always boost their own benchmarks, etc.


----------



## afyeung

Hey guys. My VRM 1 is getting super hot (106°C max). The Gelid VRM kit that I got was supposed to lower it by a lot, but it didn't really. I was thinking of replacing the thermal pad on VRM 1 with something better. Would this be good? http://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJPZQ2/ref=sr_1_1?ie=UTF8&qid=1462633648&sr=8-1&keywords=fujipoly+pad I don't know much about thermal pads, but I hear Fujipoly is great.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> Hey guys. My VRM 1 is getting super hot (106°C max). The Gelid VRM kit that I got was supposed to lower it by a lot, but it didn't really. I was thinking of replacing the thermal pad on VRM 1 with something better. Would this be good? http://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJPZQ2/ref=sr_1_1?ie=UTF8&qid=1462633648&sr=8-1&keywords=fujipoly+pad I don't know much about thermal pads, but I hear Fujipoly is great.


I'd try to reseat it, because that is quite high. Too thin a pad and you'll get crazy hot temps; too thick and it won't dissipate as well. I do believe VRM1 is the long row of VRMs. You could always blow a fan over/across it and see how it does. Def seems like there's an issue there.


----------



## dagget3450

I hope to have mine together today. I noticed my pads felt thinner than the stock EK pads, so I am a bit worried, even though they are the right thickness according to the packaging.


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> I'd try to reseat it, because that is quite high. Too thin a pad and you'll get crazy hot temps; too thick and it won't dissipate as well. I do believe VRM1 is the long row of VRMs. You could always blow a fan over/across it and see how it does. Def seems like there's an issue there.


Lol I have 2 fans blowing on it. Which pad do you think would work? I'll get that and reseat


----------



## m70b1jr

Quote:


> Originally Posted by *Synntx*
> 
> Well, I'm on water so I've got a much larger thermal margin than on air. To achieve those clocks I'm pushing +200mV.


I'm on water as well, but I get a black screen in Trixx.


----------



## m70b1jr

Hey guys. I asked this before but never got a solution. I am on liquid cooling, but I can only get 1165MHz stable at +100mV core with max power limit, AUX voltage and all of that. I want to pass the 100mV limit in Afterburner, but I can't, and in Trixx anything above +135mV black screens my PC. Apparently there's a BIOS-based voltage limit, or maybe Trixx is garbage. I remember sending my BIOS to someone who somehow removed that limit, and I was able to get 1225MHz stable, but after I rebooted my PC something went wrong: my display drivers crash and my voltage won't go over 1.023V (that exact number) until I flash back the default BIOS. So the edited BIOS doesn't stay permanent. I was wondering if someone could help me get towards 1200MHz. Mainly for bragging rights


----------



## dagget3450

Quote:


> Originally Posted by *m70b1jr*
> 
> Hey guys. I asked this before but never got a solution. I am on liquid cooling, but I can only get 1165MHz stable at +100mV core with max power limit, AUX voltage and all of that. I want to pass the 100mV limit in Afterburner, but I can't, and in Trixx anything above +135mV black screens my PC. Apparently there's a BIOS-based voltage limit, or maybe Trixx is garbage. I remember sending my BIOS to someone who somehow removed that limit, and I was able to get 1225MHz stable, but after I rebooted my PC something went wrong: my display drivers crash and my voltage won't go over 1.023V (that exact number) until I flash back the default BIOS. So the edited BIOS doesn't stay permanent. I was wondering if someone could help me get towards 1200MHz. Mainly for bragging rights


Can you verify it's not a video port issue? Can you try DVI only? I seem to recall DP/HDMI blanking out on me above +100mV, to the point it stays blank once closer to +200mV.


----------



## m70b1jr

Quote:


> Originally Posted by *dagget3450*
> 
> Can you verify it's not a video port issue? Can you try DVI only? I seem to recall DP/HDMI blanking out on me above +100mV, to the point it stays blank once closer to +200mV.


I don't think it's my ports, because my GPU fan cranks up all the way when it happens.


----------



## eryklok

Guys is this good for a Sapphire 390x Nitro OC'd?
http://www.3dmark.com/3dm11/11231186

Code:


Unigine Heaven Benchmark 4.0

FPS:    
57.4
Score:  
1446
Min FPS:        
26.5
Max FPS:        
116.3
System

Platform:       
Windows NT 6.2 (build 9200) 64bit
CPU model:      
Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (3499MHz) x4
GPU model:      
AMD Radeon (TM) R9 390 Series 16.150.2211.0 (4095MB) x1
Settings

Render: 
Direct3D11
Mode:   
1920x1080 8xAA fullscreen
Preset  
Custom
Quality 
Ultra
Tessellation:   
Extreme
Powered by UNIGINE Engine
Unigine Corp. © 2005-2013


----------



## m70b1jr

Quote:


> Originally Posted by *eryklok*
> 
> Guys is this good for a Sapphire 390x Nitro OC'd?
> http://www.3dmark.com/3dm11/11231186
> 
> Code:
> 
> 
> Unigine Heaven Benchmark 4.0
> 
> FPS:
> 57.4
> Score:
> 1446
> Min FPS:
> 26.5
> Max FPS:
> 116.3
> System
> 
> Platform:
> Windows NT 6.2 (build 9200) 64bit
> CPU model:
> Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (3499MHz) x4
> GPU model:
> AMD Radeon (TM) R9 390 Series 16.150.2211.0 (4095MB) x1
> Settings
> 
> Render:
> Direct3D11
> Mode:
> 1920x1080 8xAA fullscreen
> Preset
> Custom
> Quality
> Ultra
> Tessellation:
> Extreme
> Powered by UNIGINE Engine
> Unigine Corp. © 2005-2013


According to the link you sent, it's a 290x.


----------



## eryklok

Quote:


> Originally Posted by *m70b1jr*
> 
> According to the link you sent, it's a 290x.


It's a 390X, man, that's just what it's being picked up as; if you notice, it's got 8GB of memory, and in the Heaven benchmark it's picked up as 390 series. I just wanted to know if I'm getting decent clocks for this card; it sits at around 80°C while benchmarking.


----------



## m70b1jr

Quote:


> Originally Posted by *eryklok*
> 
> It's a 390X, man, that's just what it's being picked up as; if you notice, it's got 8GB of memory, and in the Heaven benchmark it's picked up as 390 series. I just wanted to know if I'm getting decent clocks for this card; it sits at around 80°C while benchmarking.


Better OC than what I'm getting, and I'm on liquid.. yeah, good card man.


----------



## bluej511

Quote:


> Originally Posted by *eryklok*
> 
> It's a 390X, man, that's just what it's being picked up as; if you notice, it's got 8GB of memory, and in the Heaven benchmark it's picked up as 390 series. I just wanted to know if I'm getting decent clocks for this card; it sits at around 80°C while benchmarking.


Same mobo, same CPU as me lol. I can get 1200/1650 on my 390, so that's a good start, especially if it's with no extra voltage. I forgot what I got under Heaven on mine.


----------



## eryklok

Quote:


> Originally Posted by *m70b1jr*
> 
> Better OC than what I'm getting and I'm on liquid.. yea. good card man.


Yours is the Sapphire as well?
Quote:


> Originally Posted by *bluej511*
> 
> Same mobo, same CPU as me lol. I can get 1200/1650 on my 390, so that's a good start, especially if it's with no extra voltage. I forgot what I got under Heaven on mine.


Oh nice. It's at +100mV just because I read someone else got stable clocks with that voltage. What's your voltage at, man?


----------



## kizwan

Quote:


> Originally Posted by *m70b1jr*
> 
> Hey guys. I asked this before but never got a solution. I am on liquid cooling, but I can only get 1165MHz stable at +100mV core with max power limit, AUX voltage and all of that. I want to pass the 100mV limit in Afterburner, but I can't, and in Trixx anything above +135mV black screens my PC. Apparently there's a BIOS-based voltage limit, or maybe Trixx is garbage, but *I remember sending my BIOS to someone who somehow removed that limit, and I was able to get 1225MHz stable, but after I rebooted my PC something went wrong: my display drivers crash and my voltage won't go over 1.023V (that exact number) until I flash back the default BIOS.* So the edited BIOS doesn't stay permanent. I was wondering if someone could help me get towards 1200MHz. Mainly for bragging rights


That's just the Crimson drivers acting up; the voltage gets reset but not the clock. You should use non-Crimson drivers or try a newer Crimson release, like 16.3.2 for example.


----------



## bluej511

Quote:


> Originally Posted by *eryklok*
> 
> Yours is the Sapphire as well?
> Oh nice. It's at +100mV just because I read someone else got stable clocks with that voltage. What's your voltage at, man?


That's what I had mine set to: +100mV and 50% power. I leave it at factory clocks to game, though, as the 390 can run most games at 60fps without breaking a sweat. Screen tearing drives me crazy.


----------



## m70b1jr

Quote:


> Originally Posted by *kizwan*
> 
> That's just the Crimson drivers acting up; the voltage gets reset but not the clock. You should use non-Crimson drivers or try a newer Crimson release, like 16.3.2 for example.


I'm on the newest Crimson drivers.


----------



## TheLAWNOOB

Is a used 290 worth $200 US, or 250 CAD? Someone is offering me an ASUS DCUII, which is exactly what I have right now.


----------



## DR4G00N

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Is a used 290 worth $200 US, or 250 CAD? Someone is offering me an ASUS DCUII, which is exactly what I have right now.


It's not a great deal; I've seen a few 290Xs for that price.

I would just wait to see what AMD & NVIDIA have to offer with the new-generation GPUs for now.


----------



## m70b1jr

I got some more info on my issue. I re-flashed the custom BIOS that was made for me and was doing benchmarks, etc. I was able to get up to 1230MHz and was pushing it further in FireStrike. I artifacted at 1250MHz and tried to Alt-F4 out of the test, which didn't work, so I flipped the power switch on my PC. When I booted into Windows, Afterburner's voltage and power limit sliders were back to +0, so I ramped them back up, but I couldn't even do anything; opening Chrome would cause AMD's drivers to crash. Lowering my clock speed to 1120MHz fixed the issue, but it seems like the voltage is being reset and not staying permanent, which is weird. Here is the custom BIOS that was made for me a couple months back.

max_oc.zip 99k .zip file


----------



## tolis626

Speaking of BIOSes, could anybody take a look at mine? I tried opening it with HawaiiBIOSReader and can't make too much sense of it. I also opened m70b1jr's BIOS from the post above and some things are completely different, like how the voltages are set (his are in mV; mine shows some odd numbers like 65.000 or something). Also, the memory timings are different? Although I dunno, maybe his is modded, so I don't know what's happening. Attached is my card's stock BIOS. Thanks in advance!









Hawaii.zip 97k .zip file


----------



## m70b1jr

Quote:


> Originally Posted by *tolis626*
> 
> Speaking of BIOSes, could anybody take a look at mine? I tried opening it with HawaiiBIOSReader and can't make too much sense of it. I also opened m70b1jr's BIOS from the post above and some things are completely different, like how the voltages are set (his are in mV; mine shows some odd numbers like 65.000 or something). Also, the memory timings are different? Although I dunno, maybe his is modded, so I don't know what's happening. Attached is my card's stock BIOS. Thanks in advance!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hawaii.zip 97k .zip file


Yea, mine is modded. A lot.


----------



## fyzzz

Quote:


> Originally Posted by *tolis626*
> 
> Speaking of BIOSes, could anybody take a look at mine? I tried opening it with HawaiiBIOSReader and can't make too much sense of it. I also opened m70b1jr's BIOS from the post above and some things are completely different, like how the voltages are set (his are in mV; mine shows some odd numbers like 65.000 or something). Also, the memory timings are different? Although I dunno, maybe his is modded, so I don't know what's happening. Attached is my card's stock BIOS. Thanks in advance!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hawaii.zip 97k .zip file


The 65... etc. numbers are normal for a stock BIOS; it automatically sets the voltage for all the DPM states. It is best to set them manually like he has in his BIOS. You can look up what voltages your card is running per DPM with AIDA64 > Video Debug > ATI GPU Registers. He also has modded timings in his BIOS, which is a great way to boost memory performance.
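If it helps anyone picture it, the voltage slider in Afterburner/Trixx just gets added on top of whatever voltage the active DPM state resolves to. A quick sketch with made-up table values (illustrative numbers only, not from any real BIOS; read your actual per-DPM voltages out of AIDA64 as described above):

```python
# Hypothetical stock DPM voltage table (volts) for a Hawaii card.
# Illustrative numbers only -- real values vary per card and BIOS.
stock_dpm = {0: 0.968, 1: 1.000, 2: 1.050, 3: 1.100,
             4: 1.150, 5: 1.200, 6: 1.225, 7: 1.250}

def apply_offset(dpm_table, offset_mv):
    """Add a software voltage offset (in mV) to every DPM state."""
    return {state: round(volts + offset_mv / 1000, 3)
            for state, volts in dpm_table.items()}

boosted = apply_offset(stock_dpm, 100)  # +100 mV in Afterburner/Trixx
print(boosted[7])  # a 1.25 V DPM7 becomes 1.35 V (before droop)
```

Which is why a BIOS with DPM7 already raised plus a big software offset can land you way above safe air-cooled voltages.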


----------



## tolis626

Quote:


> Originally Posted by *m70b1jr*
> 
> Yea, mine is modded. A lot.


I figured that out. 1.35V DPM7 voltage, huh? Someone likes to live on the edge.

Can't imagine how hot your card would be getting if it was using the stock cooler... I mean, at +100mV you're pushing over 1.4V (without droop), right? Jesus...
Quote:


> Originally Posted by *fyzzz*
> 
> The 65... etc. numbers are normal for a stock BIOS; it automatically sets the voltage for all the DPM states. It is best to set them manually like he has in his BIOS. You can look up what voltages your card is running per DPM with AIDA64 > Video Debug > ATI GPU Registers. He also has modded timings in his BIOS, which is a great way to boost memory performance.


Really? I figured they'd be there with their normal values; I didn't know what these numbers were. Good to learn! Thanks!

I was mostly curious if there's anything out of place with that BIOS. It's kind of the last thing I have in mind regarding my bad memory overclocking. I mean, from what I've seen around here, blackscreening at anything over 1650MHz on Hynix memory is legendarily bad. Add to that the fact that stability is inconsistent (like, one time it'll work at 1725MHz and the next day not even 1675MHz cuts it) and I've gone through a lot of frustration with this card. The good ("good") thing is I'm still playing on a 1080p screen, so memory overclocking doesn't have a huge impact on performance.


----------



## m70b1jr

Quote:


> Originally Posted by *tolis626*
> 
> I figured that out. 1.35V DPM7 voltage, huh? Someone likes to live on the edge.
> 
> 
> 
> 
> 
> 
> 
> 
> Can't imagine how hot your card would be getting if it was using the stock cooler... I mean, you're pushing (at +100mV) over 1.4V (without droop), right? Jesus...
> Really? I figured they'd be there with their normal values, didn't know what these numbers were. Good to learn! Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was mostly curious if there's anything out of place with that BIOS. It's kind of the last thing I have in mind about my bad memory overclocking. I mean, from what I've seen around here, black screening at anything over 1650MHz is legendarily bad for Hynix memory. Add to that the fact that stability is inconsistent (like, one time it'll work at 1725MHz and the next day not even 1675MHz cuts it) and I've gone through a lot of frustration with this card. The "good" thing is I'm still playing on a 1080p screen, so memory overclocking doesn't have a huge impact on performance.


I lowered it to 1250 for DPM7 and haven't noticed any issues. Yet.


----------



## m70b1jr

Well, I'm noticing issues. Anyone have an idea on what my DPMs should be? Trying to hit 1200MHz+. I'm on a 140mm AIO liquid cooler.


----------



## Chaoz

Well I just changed the thermal pads on my R9 390 and the temps dropped a bit.
So to test this I ran Valley.

With the standard thermal pad


With the Fujipoly Extreme thermal pad (11W/mK)


Not an extreme difference, but noticeable. Can't seem to find any 17W/mK; there are some shops that sell it, but they don't ship to the EU.

Temps in my room are a bit high, tbh it's around 25°C or more. So I'll test this again when it has cooled down a bit.


----------



## afyeung

Quote:


> Originally Posted by *m70b1jr*
> 
> Well I'm noticing issues. Anyone have an idea on what my DPM's should be. Trying to hit 1200mhz +. I'm on a AIO 140mm Liquid cooler.


I'm also on a 140mm AIO. My 24/7 clocks are 1200/1730 with +100mV, and temps on the core are at a nice 53°C.


----------



## afyeung

Quote:


> Originally Posted by *Chaoz*
> 
> Well I just changed the thermalpad on my R9 390 and the temps dropped a bit.
> So to test this I ran Valley.
> 
> With standard Thermal pad
> 
> 
> With Fujipoly Extreme thermal pad (11W/mk)
> 
> 
> Not an extreme difference, but noticable. Can't seem to find any 17w/mk. There are some shops that sell it but they don't ship to EU.
> 
> Temps in my room are a bit high, tbh it's around 25°C or more. So I'll test this agin when it has cooled down a bit.


What thickness pad are you using? I just bought the 11W/mK to use with the Gelid VRM sink on my Kraken G10. I think I might have applied the sink wrong, because even with two fans blowing on the VRM area, load VRM1 temps are in the 90s. Hopefully the new pad and reseating the sink, plus reorienting the Kraken fan to pull air away from the VRM, will lower it to the 70s.


----------



## Chaoz

My temps were never really high to begin with; the max I got on VRM1 was 90°C. But I wanted to try and lower it. The 1mm pads I tried didn't touch the VRMs at all, and I didn't want to risk damaging the card. So I found some 1.5mm and they touch the VRMs just fine. 2mm pads would've been better, but I can't find any 11W/mK or better in 2mm thickness.


----------



## afyeung

Quote:


> Originally Posted by *Chaoz*
> 
> My temps were never really high to begin with. Max temp I got on vrm1 was 90°C. But wanted to try and lower it. So I tried the 1mm but they didn't touch the vrm's at all. I didn't want risk damaging the card. So I found some 1.5mm and they touch the vrm's just fine. 2mm pads would've been better but can't find any +11w/mk with 2mm thickness.


stock cooler?


----------



## Chaoz

It's the DC3 cooler from Asus.


----------



## afyeung

Quote:


> Originally Posted by *Chaoz*
> 
> It's the dc3 cooler from Asus.


Oh ok. I bought 0.5mm pads. But I'm using a dedicated sink for the VRM.


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> Well I just changed the thermalpad on my R9 390 and the temps dropped a bit.
> So to test this I ran Valley.
> 
> With standard Thermal pad
> 
> 
> With Fujipoly Extreme thermal pad (11W/mk)
> 
> 
> Not an extreme difference, but noticable. Can't seem to find any 17w/mk. There are some shops that sell it but they don't ship to EU.
> 
> Temps in my room are a bit high, tbh it's around 25°C or more. So I'll test this agin when it has cooled down a bit.


That's not much of an improvement. You'd get better results adding a 120mm fan behind the card or something. Your temps seem a bit high; even my stock Nitro cooler would never reach that. And instead of using Valley, use Heaven. It's far more GPU intensive, so you'll probably get even higher VRM temps lol.

On my passively cooled Alphacool with one fan blowing over it, VRM1 stays under 50°C without an overclock, and that's with 3W/mK thermal pads. On the stock cooler it'd be around mid to high 60s. My VRM2 reaches high 50s to low 60s depending on the game.

Btw, Aquatuning has the Alphacool 17W/mK pads (Fujipoly makes them), but they are ludicrously expensive.


----------



## bluej511

So I did a quick Heaven test this morning at 1200/1650, +100mV, 50% fan. Still the highest I've found stable; after that it's blah blah. Temps on VRM1 went up 10°C from stock; VRM2 hasn't changed.


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> So did a quick Heaven test this morning at 1200/1650 +100mv 50%. Still the highest i found stable. After that its blah blah. Temps on vrm 1 went up 10°C from stock VRM 2 hasn't changed.


Wow! Your GPU temps are extremely low. Is this with a full cover block?


----------



## Chaoz

I can't put a fan in front of it due to my H100i V2 being there as exhaust. There is no other spot I can place it with 4 fans in push/pull.

As I said before, it's 30°C outside atm and at least 25°C in my room, so temps will probably be better when it cools down. The first test without Fujipoly was a month ago.

I'm still using the stock cooler, unlike you with a full cover block.

The 11W/mK pads were already 20€ with shipping, so I'm not gonna pay even more for a small chance of improvement.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> Wow! Your GPU temps are extremely low. Is this with a full cover block?


True, but guys, I'm using this; it cools the core very well at the expense of heating up my CPU slightly more (not an issue, it's OC'd at 1.2V and staying at 50°C). It cools the VRMs passively, though I now have a fan sitting right against the fins that are over VRM1.

http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/19917/alphacool-nexxxos-gpx-ati-r9-390-m01-mit-backplate-schwarz


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> True but guys im using this, it cools the core very very well at the expense of heating up my cpu slightly more (not an issue its oced at 1.2v and staying at 50C). It passively cools the vrms though but i now have a fan sitting right against the fins that are over VRM1.
> 
> http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/19917/alphacool-nexxxos-gpx-ati-r9-390-m01-mit-backplate-schwarz


Awesome

What thickness pads are you using?


----------



## m70b1jr

Could anyone give me a hint as to what my DPM voltages can/should be on liquid?


----------



## mus1mus

DPM is not your concern even on liquid. In fact, you won't even need to change anything if overclocking is your only goal.

Just remove the limit if you are shooting for benchmark scores. Maintaining default DPM values is good even when the card stays very cool.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> Awesome
> 
> 
> 
> 
> 
> 
> 
> what thickness pads are you using?


The ones that came with it are 1.5mm on the front and 3.0mm on the back against the backplate. Might be why the VRMs are decent; there's thermal pads everywhere.


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> Ones that came with it are 1.5mm on the front and 3.0mm for the back against the backplate. Might be why the vrms are decent theres thermal pads everywhere.


Hopefully the 0.5mm pads I bought aren't too thin then. Even though they're just for the VRM sink.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> DPM is not your concern even on liquid. In fact, you won't even need to change anything if overclocking is your only goal.
> 
> Just remove the limit if you are shooting for benchmark scores. Maintaining default DPM values is good even when the card stays very cool.


How do you remove the limit?


----------



## Chaoz

Quote:


> Originally Posted by *afyeung*
> 
> Hopefully the 0.5mm pads I bought aren't too thin then. Even though they're just for the VRM sink.


Dunno. Even the 1.0mm pads I bought were too thin for my card, so yours could be just enough or not enough.
My card only has one row that needs thermal pads.


----------



## rv8000

Can someone with a 390 run FireStrike at stock, and then again at stock core but 1000MHz on the RAM? Just trying to make some sense of where Polaris will sit, plus some other speculation.


----------



## gupsterg

Quote:


> Originally Posted by *m70b1jr*
> 
> Could anyone hint me as to what my DPM voltages can / should be on liquid?


VID per DPM is based on LeakageID + default GPU clock + other GPU properties, not the cooling solution, in the stock ROM. In the Hawaii BIOS mod OP see the heading "What is ASIC quality?"; it explains how a high leakage ASIC draws more power, placing more demand on the VRM.

In the stock ROM only DPM 0 (lowest/idle state) is manually set; the other states are set automatically by ASIC Profiling using the parameters stated in the first sentence.

Overclocking by software is great when you wish to just OC "on the fly", but a BIOS mod is a more elegant solution, and you can still OC "on the fly" on top of it.

When we OC using software we apply voltage as an offset, which *affects all* DPMs, which usually *is not needed*.

OCing via the ROM means you can add voltage to a DPM as required, and usually only DPM 7 is where you need the extra voltage. With a ROM OC you can also change clocks per DPM, which will gain you performance when the card is not sticking to the highest DPM, for example in a game that floats between DPM states due to PowerTune tech. In the Hawaii BIOS mod OP see the heading "Making OC bios like factory pre OC'd card/ROM".

Investigating VID per DPM also tells you what voltage (VID) your GPU is set to; *no other method* tells you that, as monitoring software shows VDDC.

VDDC is the realtime voltage of the GPU. It varies with the DPM state the GPU is in, plus PowerTune / variable voltage tech, which means that even in the same DPM / clock state the voltage will vary with GPU load. Let's say you apply +50mV in software, run 3DM FS and see a VDDC of 1.22V, then on the same settings run Valley and see 1.18V: you have no idea of the VID.

Knowing the VID per DPM also lets you know the final VID when you add an offset via software. Let's say DPM 7 is 1.250V on your card; when you add +50mV you're at a VID of 1.300V. When people comparing OCs on a card only state the offset, you have no idea of the final VID. They may have a card which defaults to 1.187V, so +50mV results in 1.237V, whereas a person with a 1.250V card ends up at 1.300V. A higher leakage ASIC has a lower threshold for sustaining voltage before breakdown compared with a lower leakage ASIC.

If this has whetted your appetite for a BIOS mod, check out the Hawaii BIOS mod thread. With a BIOS mod you can also mod the RAM timings, gaining a small performance boost; depending on the memory controller of the GPU it may lower your max RAM frequency.

I choose a BIOS mod over software OC all day long; I then have no software app applying the OC or running in the background.
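The distinction above can be sketched in a few lines: a software offset shifts every DPM state's VID, while a BIOS mod touches only the state you pick (usually DPM 7). The VID table below is hypothetical, purely for illustration, not from any real ROM.

```python
# Sketch: software voltage offset vs. per-DPM BIOS edit.
# The VID values (mV) below are hypothetical examples for DPM 0..7.

stock_vids_mv = [968, 1000, 1050, 1100, 1150, 1200, 1225, 1250]

def apply_sw_offset(vids, offset_mv):
    """Software OC tools add the offset to every DPM state."""
    return [v + offset_mv for v in vids]

def apply_bios_mod(vids, dpm, new_vid_mv):
    """A BIOS mod can change just the one state that needs it."""
    modded = list(vids)
    modded[dpm] = new_vid_mv
    return modded

sw = apply_sw_offset(stock_vids_mv, 50)       # +50mV everywhere, idle states too
rom = apply_bios_mod(stock_vids_mv, 7, 1300)  # only the top state is raised

print(sw[0], sw[7])    # 1018 1300
print(rom[0], rom[7])  # 968 1300
```

The idle state staying at its stock VID in the ROM-modded table is the point gupsterg makes: the offset approach raises voltage in states that never needed it.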


----------



## m70b1jr

Quote:


> Originally Posted by *gupsterg*
> 
> VID per DPM is based on LeakageID + default GPU clock + other GPU properties and not cooling solution in stock ROM. In Hawaii bios mod OP see heading "What is ASIC quality?", this heading has info as to how a high leakage ASIC draws more power, placing more demand on VRM.
> 
> In stock ROM only DPM 0 (lowest/idle state) is manually set, the other states are set automatically with ASIC Profiling using parameters as stated in first sentence.
> 
> Overclocking by SW is great when you wish to just OC "on the fly" , bios mod is a more elegant solution
> 
> 
> 
> 
> 
> 
> 
> plus you can still OC "on the fly".
> 
> When we OC using SW we apply voltage as an offset, which *effects all* DPMs, which usually *is not needed*.
> 
> OC using ROM means you can add voltage to a DPM as required
> 
> 
> 
> 
> 
> 
> 
> and usually only DPM 7 is where you need the extra voltage. With ROM OC you can also change clocks per DPM, which will gain you performance when card is not sticking to highest DPM. For example in a game that floats within DPM states due to PowerTune tech. In Hawaii bios mod OP see heading "Making OC bios like factory pre OC'd card/ROM".
> 
> Investigating VID per DPM also tells you what voltage (VID) your GPU is set to *no other method* tells you that, as when we use monitoring SW we see VDDC.
> 
> VDDC is realtime voltage of GPU, this variates due to what DPM the GPU maybe in plus depends on PowerTune / Variable voltage tech which means even when GPU is in like DPM / clock state depending on loading of GPU the voltage will vary. Lets say you apply +50mV in SW, you run 3DM FS and see VDDC of 1.22V, then on same settings you run Valley and see 1.18V you have no idea of VID.
> 
> Knowing VID per DPM also lets you know when you add an offset via SW what is final VID. Let's say DPM 7 is 1.250V on your card, when you add +50mV you're at 1.300V as VID. When people comparing OC on a card only state offset you have no idea of what is final VID. They may have a card which defaults to 1.187V, so +50mv result in 1.237V, where as a person with 1.250V card ends up at 1.300V. Higher leakage ASIC has lower threshold of sustaining voltage before breakdown compared with lower leakage ASIC.
> 
> If this has wet your appetite for bios mod check out Hawaii bios mod thread. With bios mod you can also mod RAM timings, gaining you small performance boost, depending on memory controller of GPU it may lower your max RAM frequency.
> 
> I choose bios mod over SW OC all day long
> 
> 
> 
> 
> 
> 
> 
> , I then have no SW app applying OC or running in background
> 
> 
> 
> 
> 
> 
> 
> .


I've read that thread a ton. One thing I did was get the "fixed BIOS" that doesn't black screen, set the starting VID at 1031, and increase each state by 31.25. So DPM7 is about 1274 now, and in Afterburner I did +100mV. I did some benchmarking and so far I'm able to get 1240MHz before artifacting. I'm at school now, so I'll work on it when I get home. Temps are about 31°C idle with the fan running at 10%.

EDIT: Will there be any hardware restrictions on adding voltage? If I can handle it, I might go up to 1.4 - 1.5 volts to push for a 1300MHz+ core clock. In all honesty, as long as I stay below 80°C (on liquid) I'll probably push this card as far as it can go.
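The evenly stepped VID table described above (a base VID plus a fixed increment per state) can be generated mechanically. This is a hypothetical sketch, not the poster's actual BIOS values; note that 1031.25mV with 31.25mV steps lands DPM 7 at 1250mV, a bit below the ~1274mV he reports, so his real table presumably differs slightly.

```python
# Build an 8-state DPM VID ladder from a base voltage and a fixed step.
# Values are hypothetical. Hawaii VIDs are programmed in 6.25mV increments,
# so a 31.25mV step corresponds to 5 VID steps.

def vid_ladder(base_mv, step_mv, states=8):
    """Return the VID (in mV) for DPM 0..states-1."""
    return [base_mv + i * step_mv for i in range(states)]

ladder = vid_ladder(1031.25, 31.25)
print(ladder[0], ladder[7])  # 1031.25 1250.0
```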


----------



## gupsterg

In stock ROMs (at least the ones I've come across), 1.48125V is the max VID.

So if your DPM 7 is 1274mV, +100mV puts you at 1374mV, and you've still got some headroom without removing the limit.


----------



## m70b1jr

I know my PC would black screen (default BIOS) in TriXX when I went above +130mV. Then mus1mus a while back removed the software limit in the BIOS (idk how) and now I'm working off the one he sent me. I'll try to get my DPM7 up to 1374 and then do +100mV in MSI. Also, do my DPMs HAVE to be in even intervals? If so, does that mean DPM0 has to be 1131?


----------



## mus1mus

gups, the limit, from what I have seen, varies a lot. And regardless of the SW offset, you cannot override each card's limit without doing the mod. Each card reacts differently as well.

BIOS modding the DPMs is good for 24/7 OCs, meaning less software intervention. But when shooting for high scores and clocks, I tend to just take the limit off. That way, after reboots, the cards go back to normal: apply a humongous offset for a run, and get back to the stock state when done.


----------



## gupsterg

Quote:


> Originally Posted by *m70b1jr*
> 
> I know my PC would black screen (Default BIOS) in Trixx when I went above +130mv. Then mus1mus a while back removed the software limit in the BIOS (Idk how) and now i'm working off the one he sent me. I'll try to get my DPM7 up to 1374 then do +100mv in MSI. Also, do my DPM's HAVE to be in intervals? If so, does that mean DPM0 has to be 1131?


Don't worry about the lower DPMs' VID, just mod DPM 7. I'd only mod the lower DPMs' VID if you're changing the GPU and/or RAM clock for a lower DPM in the ROM; if they are stock values there's no need to change them.

Mus1mus is proficient in BIOS modding and has far more OC experience on Hawaii than I, so you're in good hands
Quote:


> Originally Posted by *mus1mus*
> 
> gups, limit - from what I have seen varies a lot. And irregardless of the SW Offset, you can not override each card's limit without doing the mod. Each card reacts differently as well.
> 
> Bios modding the DPMs is good for 24/7 OCs. Meaning - less software intervention. But shooting for high scores and clocks, I tend to just take the limit off. That way, after reboots, the cards go back to normal. Apply humungous Offset for a run, and get back to stock state when done.


I see where you're coming from and agree; I didn't know m70b1jr was under your wing for help

On the voltage limit (VID), the way The Stilt explained it / the IR PDF came across, the set VID determines the MAX offset that can be applied. For example, The Stilt states for his MLU builds:-
Quote:


> At default the maximum VDDCR voltage is 1.48125V. All of my bioses use lower limit for enhanced protection (1.36875V). No matter what kind of offset you use or what voltage level you input into PowerPlay table the voltage will never exceed these values.


So basically, for a GPU where DPM 7 is a 1.187V VID the max offset a person can use is ~+293mV, and where DPM 7 is 1.250V the max offset is ~+231mV. This is what I understood; are you seeing it based off VDDC?
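The arithmetic behind those numbers is just the ceiling minus the card's DPM 7 VID. A minimal sketch, assuming the 1.48125V stock-ROM ceiling quoted above:

```python
# Max usable software offset = ROM voltage ceiling - card's DPM 7 VID.
# Ceiling per the stock ROMs discussed in the thread; cards with The Stilt's
# lowered 1.36875V limit would plug that value in instead.

ROM_MAX_VID_V = 1.48125

def max_offset_mv(dpm7_vid_v, ceiling_v=ROM_MAX_VID_V):
    """Headroom in mV before the ROM clamps the voltage."""
    return round((ceiling_v - dpm7_vid_v) * 1000)

print(max_offset_mv(1.187))  # ~294mV of headroom
print(max_offset_mv(1.250))  # ~231mV
```

Any offset beyond this is silently clamped, which is why two cards given the same +50mV can end up at very different final VIDs.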

Cheers for the info share; as you know, I've never given a GPU the kind of voltage you have, as I've never been on WC.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> gups, limit - from what I have seen varies a lot. And irregardless of the SW Offset, you can not override each card's limit without doing the mod. Each card reacts differently as well.
> 
> Bios modding the DPMs is good for 24/7 OCs. Meaning - less software intervention. But shooting for high scores and clocks, I tend to just take the limit off. That way, after reboots, the cards go back to normal. Apply humungous Offset for a run, and get back to stock state when done.


So is this why, when I applied your default modified BIOS and rebooted, I'd get all those issues I've stated before, such as the mV in Afterburner resetting? If so, manually setting my DPMs seems to have fixed that. For now. When I get home I'm gonna try to push for 1.45 volts. I really wanna push as many MHz as I can.


----------



## m70b1jr

It seems that if I reboot my PC, the voltage resets. That's the only issue I'm really having. I was able to get 1.388 volts and was pushing 1265MHz stable, until I flipped the power switch on my PC, only for it to boot up with the voltage reset and start to artifact at 1125MHz.


----------



## Agent Smith1984

Quote:


> Originally Posted by *m70b1jr*
> 
> It seems if I reboot my PC, the Voltage resets. Only issue I'm really having. I was able to get 1.388 Volts, and was pushing 1265mhz stable, until flipped my power switch on my PC, only for it to boot up and have the voltage reset, and start to artifact at 1125mhz


Never turn it off again?


----------



## m70b1jr

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Never turn it off again?


Honestly I thought about it, but if I have to turn my PC off, I have to reflash the BIOS.


----------



## mus1mus

Quote:


> Originally Posted by *gupsterg*
> 
> Don't worry about lower DPMs VID, just mod DPM 7. I'd mod lower DPMs VID if you're changing GPU and/or RAM clock for lower DPM in ROM, if they are stock values no need to change.
> 
> Mus1mus is proficient in bios mod
> 
> 
> 
> 
> 
> 
> 
> and has far more OC experience on Hawaii than I
> 
> 
> 
> 
> 
> 
> 
> , so you're in good hands
> 
> 
> 
> 
> 
> 
> 
> .
> I see where you're coming from and agree
> 
> 
> 
> 
> 
> 
> 
> 
> , I didn't know m70b1jr was under your wing for help
> 
> 
> 
> 
> 
> 
> 
> .
> 
> On Voltage limit (VID) how the Stilt explained / IR PDF came across, based on set VID you have the MAX offset that can be applied. For example The Stilt states for his MLU builds:-
> So basically a GPU where DPM 7 is 1.187V VID the max offset a person can use is ~+293mV, where DPM 7 is 1.250V the max offset a person can use is ~+231mV. *This is what I understood, are you seeing it based of VDDC?*
> 
> Cheers for info share
> 
> 
> 
> 
> 
> 
> 
> , as you know I've never given GPU the kind of voltage you have as never been on WC.


In a way, cards react to a BIOS' voltage limit differently, partly maybe due to ASIC (I never really took a deep look into the limit values). But I'll put it this way:

My 290 has a pretty low limit, 1.33ish to 1.35ish under load. Even at +400, the max VDDC will be the same. And if I put a certain value for DPM7 without removing the limit, the same maximum VDDC is still reached.

If you remove the limit and put a manual value for DPM7, things get scary. That is IMO.
Quote:


> Originally Posted by *m70b1jr*
> 
> So is this why when I applied your default modified BIOS, when I rebooted, i'd get all those issues ive stated before? Such as the MV in afterburner reseting? If so, manually setting my DPM's seems to have fixed that. For now. When I getr home imma try to push for 1.45 volts. I really wanan push as much mhz as I can.


I see what the problem is now. It's Crimson maintaining the clocks and not the voltages, isn't it?

I don't do high OCs on Crimson for that reason.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> In a way, cards react to a bios' voltage limit differently. Partly maybe due to ASIC. (I never really did some deep look into the limit values) But I'll put it this way:
> 
> My 290 has a pretty low limit. 1.33ish to 1.35ish Under Load. Even at +400, the max vddc will be the same. Now if I put a certain value for dpm7 without removing the limit, the same maximum vddc is still achieved.
> 
> If you remove the limit, and put a manual value for dpm7, things get scurry. That is IMO.
> I see what the problem is now. It's Crimson's maintaining the clocks and not the Voltages. Isn't it?
> 
> I down do high OC on Crimson for that fact.


Any way to fix it?


----------



## mus1mus

My fix is not using Crimson when shooting for high scores.

Anyway, just a few tips:

1. Gauge whether the card needs voltage to clock. The card scales with voltage if artifacts disappear when you add SW offsets. There's not much that can be done if the card doesn't react well to voltage.

2. Keep it under 50°C absolute max when you are playing with more than 1.4V VDDC. Can you?

3. Don't feed it too much. I don't need to reiterate the value of the card; card death is not the target.

4. Observe the first tip. How much do you need to clock 1200? How much is needed for another 25MHz? If it takes +100 to reach another 25MHz, just stop. Note that I am talking about VDDC values, so observe whether VDDC scales well with the offset added.
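Tip 4 amounts to tracking how many MHz each extra millivolt buys and stopping once the ratio collapses. A sketch with made-up numbers (not from any particular card):

```python
# Hypothetical stability log: (voltage offset in mV, max stable core MHz).
runs = [(0, 1100), (50, 1160), (100, 1200), (200, 1225)]

# MHz gained per extra mV between consecutive runs.
for (v0, f0), (v1, f1) in zip(runs, runs[1:]):
    rate = (f1 - f0) / (v1 - v0)
    print(f"+{v0}mV -> +{v1}mV: {rate:.2f} MHz/mV")
# The final step here (25MHz for another +100mV, i.e. 0.25 MHz/mV)
# is the kind of diminishing return where the advice says "just stop".
```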


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> In a way, cards react to a bios' voltage limit differently. Partly maybe due to ASIC. (I never really did some deep look into the limit values) But I'll put it this way:
> 
> My 290 has a pretty low limit. 1.33ish to 1.35ish Under Load. Even at +400, the max vddc will be the same. Now if I put a certain value for dpm7 without removing the limit, the same maximum vddc is still achieved.
> 
> If you remove the limit, and put a manual value for dpm7, things get scurry. That is IMO.


The voltage of 1.33ish to 1.35ish, is that the VDDC shown in MSI AB, TriXX, GPU-Z, HWiNFO? Or are you discussing the VID as shown in a registers dump via AIDA64?

Cheers.


----------



## mus1mus

GPU-Z and HWiNFO.

And to add: the more voltage you add, the greater the risk of encountering a black screen or signal loss.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> My fix is not using Crimson when shooting for high scores.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, just for a few tips,
> 
> 1. Gauge whether the card needs Voltage to clock. If the card scales with Voltage - artifacts disappear when you add SW Offsets.
> 
> There's nothing much that can be done if the card doesn't react well to Voltage.
> 
> 2. Keep it under 50C abosolute max when you are playing with more than 1.4V VDDC.
> 
> Can you?
> 
> 3. Don't feed it to much. I don't need to iterate the value of the card. Card death is not the target.
> 
> 4. Observe the first tip. How much do you need to clock 1200? How much is needed for another 25MHz? If it takes +100 to reach another 25Meg, just stop. Note that I am talking of VDDC values so observe if VDDC scales well with the Offset added.


Why do I have to keep it under 50°C?


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> Why do I have to keep it under 50C?


Well, I don't think keeping it under 50°C is possible unless you have at least a 480mm rad for the card alone. Mine is OC'd with only +100mV on the best core cooler around (yes, better than EK), and under Heaven I'd hit 43°C easy. With an extra 300mV I'd be over 50 for sure.

Unless he means 50 for the VRMs, then yeah, but for the core no way haha.


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> GPU-Z and HWInfo.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And to add, the more Voltage you add, the greater the risk to encounter Black Screen or Signal Loss.


Cheers, so we're talking VDDC. I'm assuming 1.33ish to 1.35ish under load is with LLC off? If so the VID is higher than the VDDC, as you'd only see 1.33-1.35V VDDC with stock LLC under load if the VID is higher.

Just to put this into perspective: on my Vapor-X 290X I had DPM 7 at a VID of 1.300V, and depending on the app loading the card I'd see as low as 1.18V VDDC for the DPM 7 state due to LLC / PowerTune varying the voltage.


----------



## m70b1jr

http://www.techpowerup.com/gpuz/details/6r3sy

I'll run a Valley benchmark real quick. Ignore the 1160MHz default core clock, it's a modified BIOS.


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> http://www.techpowerup.com/gpuz/details/6r3sy
> 
> I'll run a valley benchmark real quick. Ignore the 1160mhz default core clock, it's a modified BIOS.


Valley blows. Use Heaven; it peaks the GPU the whole time.


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> Why do I have to keep it under 50C?


Well, since you want to add a ton of Voltage, you need to make sure you have all the support needed.

Heat and Voltage will kill the chip. Power consumption rises with heat. So that silicon is in danger the higher you go.
Quote:


> Originally Posted by *bluej511*
> 
> Well i dont think thats possible keeping it under 50C unless you have at least a 480mm rad for the card alone. Mine OCed with only 100mv, and the best core cooler around, yes better then ek. Under heaven id hit 43C easy. An extra 300mv id be over 50 for sure.
> 
> Unless he means 50 for VRMs then yea but for the core no way haha.


You have not seen my temps nor the voltage I pump.
Quote:


> Originally Posted by *gupsterg*
> 
> Cheers
> 
> 
> 
> 
> 
> 
> 
> , so we're talking VDDC
> 
> 
> 
> 
> 
> 
> 
> . I'm assuming 133ish to 135ish under load is LLC off? if so VID is higher than VDDC
> 
> 
> 
> 
> 
> 
> 
> , as you'd only see 1.33-1.35V VDDC with stock LLC under load if VID is higher.
> 
> Just to put this into perspective on my Vapor-X 290X I had DPM 7 @ VID 1.300V depending on app loading card I'd see as low as 1.18V VDDC for DPM 7 state due to LLC / PowerTune varying voltage.


That card has a VID of 1.18, so yeah, it's quite low. The good thing is, it does 1300 within 1.3.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Well, since you want to add a ton of Voltage, you need to make sure you have all the support needed.
> 
> Heat and Voltage will kill the chip. Power consumption rises with heat. So that silicon is in danger the higher you go.
> You have not seen my temps nor the Voltage I pump.
> 
> 
> 
> 
> 
> 
> 
> 
> That card has a VID of 1.18. So yeah, it's quite low. The good thing is, it does 1300 within 1.3.


You've got hexa crossfire or octa crossfire; probably got one of those 12x120 external rads haha.


----------



## m70b1jr

Much FPS gain, such wow. 1015MHz / 1500MHz vs 1235MHz / 1675MHz.


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> 
> 
> 
> 
> Much FPS Gain, such wow. 1015mhz / 1500mhz VS 1235mhz / 1675mhz.


Something's def wrong there haha.

1. Don't ever use windowed mode, as you'll get lower fps and stuff will run in the background; use exclusive fullscreen for gaming and benches. Here's mine, with tessellation, at 1200/1650. Yours is clocked higher, and with no tess it should be WAY higher. Btw, you must have Vsync or FRTC on from the looks of it.


----------



## m70b1jr

Quote:


> Originally Posted by *bluej511*
> 
> Something def wrong there haha.
> 
> 1. Dont ever use windowed as youll get lower fps, stuff will run in the background, exclusive fullscreen in gaming and benches. Heres mine and with tesselation at 1200/1650. Yours is clocked higher, with no tess should be WAY higher. Btw you must have vsync on or frtc from the looks of it.


Wow, I did have Vsync on. I don't remember turning that on... I hate fullscreen, cause I like to Alt+Tab in gaming sessions to add people to Skype calls, etc. Some games crash if you Alt+Tab in fullscreen. Re-running benchmarks.


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> Wow. I did have Vysnc on. I don't remember turning that on.. I hate fullscreen, cause I like to ALT+Tab in gaming sessions to add people to skype calls, etc. Some games crash if you ALT+Tab in Fullscreen. Re-running bench marks.


I ALWAYS game fullscreen. Running windowed, your system is allowed to run background programs/apps, and if the game is CPU intensive it's gonna ruin your gaming. Thanks but no thanks. When I game, I game.

Spoiler alert: after years of loving 21:9 I finally ordered my 29UM68 today. Hoping to get it by Wednesday if it ships out tomorrow. Everything in France arrives in a day or two, oh, and with free shipping haha.


----------



## m70b1jr

OH YEA BOI. Due to some artifacting, I'mma lower the core clock to 1230MHz and call it quits. For now.


----------



## m70b1jr

http://www.3dmark.com/3dm/11957120?


----------



## diggiddi

Quote:


> Originally Posted by *bluej511*
> 
> I ALWAYS game full screen. *Running windowed your system is allowed to run background programs/apps. If thegame is cpu intensive its gonnaruin your gaming, thanks but no thanks*. When i game i game.
> 
> Spoiler alert. After years of loving 21:9 i finally ordered my 29UM68 today. Hoping to get it by wed if it ships out tomorrow. Everything in France arrives in either a day or two oh with free shipping haha.


That explains the slowdowns and stuttering


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> 
> 
> 
> 
> OH YEA BOI. Due to some artifacting, imma lower the core clock to 1230mhz and call it quits. For now.


Now try that with tessellation on haha.
Quote:


> Originally Posted by *diggiddi*
> 
> That explains the slowdowns and stuttering


It could for sure, yea. I've never used it any other way so idk. It's always been fullscreen for me no matter what.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Youve got *hexa* crossfire or *octa* *crossfire*, probably got one of those 12x120 external rads haha.


You are out of your mind.

Just to point that out, actually, Heaven is CPU reliant. In case that never crossed either of your minds.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> You are out of your mind.
> 
> Just to point that out, actually, HEAVEN is CPU reliant. In case that never crossed both your attention.


Well, my dad cut off the power to the house to do something, booted up my PC, and the voltage reset... Time to reflash...


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> You are out of your mind.
> 
> Just to point that out, actually, HEAVEN is CPU reliant. In case that never crossed both your attention.


Everything is CPU reliant haha. Heaven uses way more GPU than Valley though, that's what I'm saying.

We never know what system you have, it changes daily haha. I'll be picking up a GTX1070 once water blocks become available. YEA RIGHT!!!!!!!!!


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Everything is CPU reliant haha. Heaven uses way more gpu then valley though thats what i,m saying.
> 
> We never know what system you have it changes daily haha. Ill be picking up a GTX1070 once water blocks become available. YEA RIGHT!!!!!!!!!


They're just 200 series cards. Not 300s.

Which 1070 though?
Founders?

Get that and be pissed that they are priced higher than the AIBs.

Quote:


> Originally Posted by *m70b1jr*
> 
> Well, my dad cut off the power to the house to do something, booted up my PC, and the voltage reset... Time to reflash...


Don't use Crimson for the purpose you are into.. Use Cat 15.10. The last of the Pre-Crimson crap.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> They're just 200 series cards. Not 300s.
> 
> Which 1070 though?
> Founders?
> 
> Get that and be pissed that they are priced higher than the AIBs.
> 
> Don't use Crimson for the purpose you are into.. Use Cat 15.10. The last of the Pre-Crimson crap.


Yup saw that. CEO lines his pockets and stupid fanboys lose support every 2 years.


----------



## m70b1jr

@mus1mus , if you don't mind me asking, how did you fix the black screening with too much voltage with the BIOS you sent me a while back?


----------



## mus1mus

By limiting the Voltage.

It can be fixed by a hardmod though. The Hawaii Bios Editing thread has the info.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> By limiting the Voltage.
> 
> It can fixed by a hardmod though. The Hawaii Bios Editting thread has the info.


And how'd you do that xddd


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> And how'd you do that xddd


http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1950_50#post_24936560


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1950_50#post_24936560


What I mean is, what are the exact steps you did to my BIOS?


----------



## Agent Smith1984

So....

Honest question...

Is anyone interested in taking over this thread and giving it the attention it deserves moving forward? I can continue to add members weekly, but was wondering if someone wanted to make a new hobby of it...

Just throwing that out there!


----------



## mus1mus

Did that fix the issue?


----------



## m70b1jr

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So....
> 
> Honest question...
> 
> Is anyone interested in taking over this thread and giving it the attention it deserves moving forward? I can continue to add members weekly, but was wondering if someone wanted to make a new hobby of it...
> 
> Just throwing that out there!


Idk if I have enough experience but I'm down.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> Fid that fix the issue?


What?


----------



## DillBer

Hey guys, I'm having some trouble with overclocking my MSI 390. I hear most people say that the MSI model should get to 1150-1160 core and 1600-1650 memory without extra voltage; never mind after adding voltage, where most are getting even more than that. It seems I can only get my 390 to 1110 core and 1550 memory at default voltage. It takes me +70mv to even get to 1150 core and only 1575 memory. I can't even reach 1600MHz memory when it seems most are getting 1650-1700 with very little to no voltage added.

I'm using Afterburner to OC. Are there certain settings that could possibly be causing instability? Could this possibly be caused by a faulty PSU?
Sucks to think I just got a bad card. Unfortunately one that's not bad enough to RMA.


----------



## bluej511

Quote:


> Originally Posted by *DillBer*
> 
> Hey guys, I'm having some trouble with overclocking my MSI 390. I hear most people say the the MSI model should get to 1150-1160 core and 1600-1650 memory without extra voltage; nevermind after adding voltage which most are getting even more than that. It seems I can only get my 390 to 1110 core and 1550 memory at default voltage. It takes me +70mv to even get to 1150 core and only 1575 memory. I cant even reach 1600 mhz memory when it seems most are getting 1650-1700 with very little to no voltage added.
> 
> I'm using afterburner to OC. Are there certain settings that could possibly be causing instability? Could this be possibly caused by a faulty PSU cause this?
> Sucks to think I just got a bad card. Unfortunately one thats not bad enough to RMA.


Yea, idk where you read that, but I don't see anyone getting 1150 core without extra voltage. Most of us need +100mv to get 1150-1200 core, and some don't even reach 1200. Memory, I can only get 1650, while some reach 1750 with Hynix memory.

Don't forget to turn off power efficiency and set your power limit to +50%.


----------



## gupsterg

Out of 4 Hawaii cards I had 1 which would do 1140/1495 with +6.25mV on the stock detected EVV VID of 1.250V. This was not just bench stable but usable for gaming, etc. IIRC I have some screenies of [email protected] runs over 12hrs without issues.


----------



## flopper

Quote:


> Originally Posted by *DillBer*
> 
> Hey guys, I'm having some trouble with overclocking my MSI 390. I hear most people say the the MSI model should get to 1150-1160 core and 1600-1650 memory without extra voltage; nevermind after adding voltage which most are getting even more than that. It seems I can only get my 390 to 1110 core and 1550 memory at default voltage. It takes me +70mv to even get to 1150 core and only 1575 memory. I cant even reach 1600 mhz memory when it seems most are getting 1650-1700 with very little to no voltage added.
> 
> I'm using afterburner to OC. Are there certain settings that could possibly be causing instability? Could this be possibly caused by a faulty PSU cause this?
> Sucks to think I just got a bad card. Unfortunately one thats not bad enough to RMA.


I set 1100MHz and call it a day.
The fps difference from pushing the hardware to the limit seems not worth it, not a good idea.

I just wonder what AMD will make of Polaris;
the 1080, for example, will be able to reach 2.5GHz.
Unless AMD has done something special, Polaris won't be close to that with OC.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> I set 1100mhz and call it a day.
> fps difference while pushing the hrdware to the limit sems, not a good idea.
> 
> I just wonder what amd make with Polaris,
> the 1080 for example will be able to reach 2.5ghz.
> Unless amd done something special Polaris wont be close to that with OC.


Honestly who cares lol. The biggest news I've seen today is that Vega is being pulled up to October of this year instead of Q1-Q2 2017. If it absolutely destroys my R9 390 and it's under 500€ for 8GB of HBM2, I would totally get one and water cool it. Still want to buy that Uncharted PS4 edition though, it's so damn SEXY!!!!


----------



## mus1mus

People seem to really dig OC numbers. meh! Imagine a Fury clocked to 1500MHz.

Performance is not about the MHz value, people. You are not comparing cards of the same die.


----------



## dagget3450

Quote:


> Originally Posted by *flopper*
> 
> I set 1100mhz and call it a day.
> fps difference while pushing the hrdware to the limit sems, not a good idea.
> 
> I just wonder what amd make with Polaris,
> the 1080 for example will be able to reach 2.5ghz.
> Unless amd done something special Polaris wont be close to that with OC.


I keep seeing 2.5GHz tossed around, is there any proof of this yet?


----------



## yuannan

Quote:


> Originally Posted by *dagget3450*
> 
> I keep seeing 2.5ghz tossed around, if there any proof of this yet?


I think the demo ran at 2.1GHz.

so 2.5?
Maybe not, but I think 1.8-2.0 would be quite possible for normal GPUs.


----------



## m70b1jr

Quote:


> Originally Posted by *mus1mus*
> 
> People seem to really dig OC numbers. meh! Imagine a Fury clocked to 1500MHz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Performance is not about the MHz value, people. You are not comparing cards of the same die.


More the merrier. My R9 390 OC'd by 200MHz got +10FPS. Just imagine an R9 390 OC'd to 2GHz; besides the house fire that would start, imagine the performance numbers.
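For what it's worth, that gain is roughly what linear scaling predicts. Quick back-of-the-envelope sketch (the 1040MHz stock clock and 55 FPS baseline here are made-up illustrative numbers, not anyone's actual results):

```python
# Hypothetical 390: ~1040 MHz stock, ~55 FPS in some game.
stock_clock, oc_clock = 1040, 1240  # MHz, i.e. a +200 MHz overclock
stock_fps = 55.0

clock_gain = (oc_clock - stock_clock) / stock_clock  # fractional gain
# If FPS scaled perfectly with core clock (it rarely does exactly):
ideal_fps = stock_fps * (1 + clock_gain)

print(round(clock_gain * 100, 1))  # 19.2 (% core clock increase)
print(round(ideal_fps, 1))         # 65.6 (so ~+10 FPS is plausible)
```

Real games land somewhere below that ideal line since memory bandwidth and the CPU don't scale with the core clock.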


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> More the merrier. My r9 390 oc'd by 200mhz got +10FPS. Just imagine a r9 390 oc'd to 2ghz, beside the house fire that would start, but imagine the performance numbers.


10fps in games or just benchmarks? For games that's huge, but if you're already at 60fps with a 60Hz display it won't make any difference lol


----------



## m70b1jr

Quote:


> Originally Posted by *bluej511*
> 
> 10fps in games or just benchmarks? For games thats huge but if youre already at 60fps with a 60hz display wont make any difference lol


Well personally, I can notice a difference between 60FPS and 240FPS on a 60Hz monitor. I can notice it. Also, I get a 10FPS boost in Verdun with the OC. Speaking of Verdun, if any of you are hyped for BF1, go play some Verdun. It's really fun and requires skill.


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> Well personally, I can notice a difference between 60FPS and 240FPS on a 60htz monitor. I can notice it. Also, I get a 10FPS boost in Verdun with the OC. Speaking of Verdun, if any of you are hyped for BF1, go play some Verdun. It's really fun and requires skill.


Not possible, since you'll only see your monitor's refresh rate. You must have some crazy screen tearing though.


----------



## m70b1jr

Quote:


> Originally Posted by *bluej511*
> 
> Not possible since youll only see your monitors refresh rate. You must have some crazy screen tearing though


No screen tearing, and I can notice a difference. I know what I see and I can without a doubt notice a difference. Probably still not as big as going to a 120hz monitor would be but it's there.


----------



## battleaxe

Quote:


> Originally Posted by *bluej511*
> 
> Not possible since youll only see your monitors refresh rate. You must have some crazy screen tearing though


Quote:


> Originally Posted by *m70b1jr*
> 
> No screen tearing, and I can notice a difference. I know what I see and I can without a doubt notice a difference. Probably still not as big as going to a 120hz monitor would be but it's there.


Typically frame time will be lower with higher FPS on a given monitor. So same refresh rate but lower frame time, which is likely what you are noticing, not more FPS.
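To put rough numbers on that: frame time is just the inverse of frame rate, so a quick sketch (illustrative only, nothing card-specific):

```python
# Frame time in milliseconds is the inverse of the frame rate.
def frame_time_ms(fps):
    return 1000.0 / fps

# Even capped by a 60 Hz refresh, a game rendering at 240 FPS has a much
# "fresher" frame ready at each refresh than one rendering at 60 FPS:
print(frame_time_ms(60))   # ~16.7 ms between frames
print(frame_time_ms(240))  # ~4.2 ms between frames
```

That fresher frame is also why input feels snappier at high FPS even when the display can't show every frame.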


----------



## m70b1jr

Quote:


> Originally Posted by *battleaxe*
> 
> Typically frame time will be lower with higher FPS on a given monitor. So same refresh rate but lower frame time. Which is likely what you are noticing, not more FPS.


Probably. I'm saving for a car, then I'll probably pre-order BF1, save up for a high end Polaris card if they come out, or a Vega card, then get a water block for it. By then, hopefully I'll have learned more. Maybe in time to be the starter/owner of the Vega/Navi threads.


----------



## battleaxe

Quote:


> Originally Posted by *m70b1jr*
> 
> Probabably. I'm saving for a Car, then i'll probably pre-order BF1, save up for a high end polaris card if they come out, or a vega card then get a water block for it. By then, hopefully I'll have learned more. Maybe intime to be the starter / owner of the Vega / Navi threads


Either way, lower frame times will equal a better experience. It's not all about high FPS, or at least that's not the only count that matters these days. Low frame times matter too.


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> No screen tearing, and I can notice a difference. I know what I see and I can without a doubt notice a difference. Probably still not as big as going to a 120hz monitor would be but it's there.


You either have a freesync monitor or you're just not noticing it haha. Me, I got 15/20 vision, I notice screen tearing below and above 60Hz. I can't wait for my ultrawide freesync tomorrow, no more input lag and no more tearing. Every monitor going above or below its set refresh rate without vsync will tear, this has been proven over and over. It's why they invented FreeSync/G-Sync.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> You either have a freesync monitor or just not noticing it haha. Me i got 15/20 vision, i notice screen tearing below and above 60hz, i cant wait to my ultrawide freesync tomorrow, no more input lag and no more tearing. Every monitor if going above or below its set refresh rate without vsync will tear, this has been proven over and over. Its why they invented freesync/gsync


Well, while what you say is actually true, it depends. Me? I'm playing on a 60Hz panel too. I have better than 20/20 vision (don't even remember by how much, but it sucks being so picky about details so I sit at a distance) and in Battlefield 4, so fast-paced action, I can't notice almost any tearing when I play at my usual 90-110FPS. That's with everything maxed out at 1440p, with just MSAA at 2x. It's not that tearing isn't there, it's just that it's much less noticeable. If I enable 4x MSAA I will dip to the 70-90FPS range and then tearing is far more prevalent. The experience shouldn't be less smooth, but it is. Lower frametimes, coupled with the tearing lines being visible for shorter periods of time, lead to a better overall experience.

With all that said, this is Battlefield and tearing isn't as bad in it as it is in other games, but my point still stands I think.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Well, while what you say is actually true, it depends. Me? I'm playing on a 60Hz panel too. I have better than 20/20 vision (don't even remember by how much, but it sucks being so picky about details so I sit at a distance) and in Battlefield 4, so fast-paced action, I can't notice almost any tearing when I play at my usual 90-110FPS. That's with everything maxed out at 1440p, with just MSAA at 2x. It's not that tearing isn't there, it's just that it's much less noticeable. If I enable 4x MSAA I will dip to the 70-90FPS range and then tearing is far more prevalent. The experience shouldn't be less smooth, but it is. Lower frametimes, coupled with the tearing lines being visible for shorter periods of time, lead to a better overall experience.
> 
> With all that said, this is Battlefield and tearing isn't as bad in it as it is in other games, but my point still stands I think.


Your point stands on one fast paced game, so yes, it could be less noticeable; betcha I'd still notice it. I notice it in GTA V instantly, without the screen even moving around, if it goes above my FRTC of 60fps; I notice it on Netflix too if I don't have hardware acceleration enabled. It all depends on the person and the display I think. I'm using a 5yr old HP display so idk, could be why.

I'm very very very picky. ACIII had it absolutely the worst; inside the Animus with cut scenes it was neck breaking.


----------



## m70b1jr

Hey guys, I got a question. Which of you have a graphics card that doesn't black screen adding +200mv? If you do, tell me which version/brand of the card, and if you could upload your BIOS, it would be nice.


----------



## ChevChelios

someone here with a 390X own Rise of the Tomb Raider ?

if so - what are the min/avg frames on maxed out 1080P (Very High/Extreme settings with Very High PureHair) ?

ideally on both DX11 & 12

I've seen conflicting tests, with some showing 55-60 and others showing ~40-45+ for the 390X in completely maxed out 1080P (granted, it may have just been different levels)


----------



## afyeung

Quote:


> Originally Posted by *ChevChelios*
> 
> someone here with a 390X own Rise of the Tomb Raider ?
> 
> if so - what are the min/avg frames on maxed out 1080P (Very High/Extreme settings with Very High PureHair) ?
> 
> ideally on both DX11 & 12
> 
> Ive seen conflicting tests with some showing 55-60 and others showing ~40-45+ for 390X in completely maxed out 1080P (granted it may have just been different levels)


Maxed out at 1080p with SMAA you're looking at an average FPS in the high 50s and drops into the 40s at times.


----------



## afyeung

Quote:


> Originally Posted by *m70b1jr*
> 
> Hey guys, I got a question. Which of you have a graphics card that doesn't blackscreen adding +200mv? If you do, tell me which version / brand of the card and f you could upload you BIOS, it would be nice.


Why does yours blackscreen with +200mv? My MSI 390x is just fine when I use Trixx and +200mv. Sapphire should be fine as well since they made Trixx lol


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> Quote:
> 
> Originally Posted by *mus1mus*
> 
> People seem to really dig OC numbers. meh! Imagine a Fury clocked to 1500MHz
> 
> Performance is not about the MHz value, people. You are not comparing cards of the same die.
> 
> More the merrier. My r9 390 oc'd by 200mhz got +10FPS. Just imagine a r9 390 oc'd to 2ghz, beside the house fire that would start, but imagine the performance numbers.

That is a silly assumption. Like I said, if you are not comparing same cards, it's pointless.

nVidia have been doing some high clocking cards as of late. You might think those are OC numbers but they are in fact, designed to.

If their current 1000 series cards have a base clock of 1000 MHz, Boost of 1400ish MHz, then the 2000 MHz OC clock is significant.

But look, that OC number is still relatively low in terms of the actual increase.

Consider 980TIs that can OC up to 1500MHz pretty easily: you are looking at an increase of ~23.3% over the 1216MHz boost clock.

Now, bring the same OC increase to the 1080, and you are looking at 2100+ MHz OC clock. Not so hard right?
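Spelled out as arithmetic (sketch; the 1216MHz and 1733MHz figures are the commonly quoted 980 Ti and 1080 boost clocks, used here just for the math):

```python
def oc_headroom(stock_mhz, oc_mhz):
    """Fractional overclock relative to the stock boost clock."""
    return (oc_mhz - stock_mhz) / stock_mhz

# 980 Ti: 1500 MHz over a 1216 MHz boost clock
gain = oc_headroom(1216, 1500)
print(round(gain * 100, 1))  # 23.4 (% headroom, the ~23.3% above)

# The same relative headroom applied to a 1733 MHz boost clock (1080):
print(round(1733 * (1 + gain)))  # 2138, i.e. the "2100+ MHz" figure
```

Which is the point: a 2GHz number on a card that boosts to 1733MHz out of the box is ordinary headroom, not some new class of overclocker.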

Quote:


> Originally Posted by *afyeung*
> 
> Why does yours blackscreen with +200mv? My MSI 390x is just fine when I use Trixx and +200mv. Sapphire should be fine as well since they made Trixx lol


The way black screen due to signal loss happens is dependent on each card. Just because yours doesn't black screen at +200, doesn't mean everyone's cards will do the same. Let alone the effective Voltage that varies with each card's ASIC requirements.


----------



## dagget3450

I still think it's a great marketing ploy to compare a 980 to a 1080, even though they are both in the same performance slot in their respective series. Meanwhile on OCN, most people I see posting that they will buy it are using a 980ti. Then there's the whole "faster than Titan X" thing, when even a 980ti can be faster than a Titan X. So it has me thinking the benchmarks, once out, will be less impressive for the 1080. I also think the 980ti adopters will possibly be underwhelmed with the 1080.

Just a hunch, but I can't say much about fanboy GPU purchases beyond that, given my own AMD GPU collection.


----------



## mus1mus

Quote:


> Originally Posted by *dagget3450*
> 
> I still think its a great marketing ploy to compare a 980 to 1080 even though they are both in line of performance per respective series. While on OCN most people i see posting they will buy it are using 980ti. Then the whole faster than TitanX when even 980ti can be faster than a Titanx. So it has me thinking the benchmarks once out will less impressive for 1080. I also think the 980ti adopters will possibly be underwhelmed with 1080.
> 
> Just a hunch but i cant say much about fanboy and gpu purchases outside that due to my own AMD gpu collection.


Well, for sure the 1080 can't, or is not aimed at, replacing the Titan X and the 980TI. A tad better at best maybe. Remember the case with 780TIs vs 980s at launch?

What I do know for sure is, drivers will dictate their upcoming sales. I am looking at declining 980TI/Titan X performance with new drivers to give the 1000 series the leverage.

Best time is to either wait for the new Titan or the TI cards IMO.


----------



## patriotaki

I heard that the 1080/1070s are only faster in VR performance...


----------



## flopper

Quote:


> Originally Posted by *mus1mus*
> 
> Well, for sure the 1080 can't or is not aimed towards replacing the Titan-X and the 980TI.
> .


The 1080 replaces the 980.
Once they drop the 1080ti, anyone with a 980ti that bought a 1080 card will cry.
People don't think much, but the 1080 will have a high MHz OC for sure.
We will see 2.5GHz cards out there.

The 1070 card at around 370 euro will be a whole new 970 for Nvidia, unless AMD has a much better Polaris to replace the 390 series with.
Otherwise AMD faces another year against a superb 970/980 series with no real answer, and while the 390 and 290 still beat the 970, the customer sees 2GHz boost clocks.
Guess how they reason then? The IPC with Pascal is worse than Maxwell, which is why they went with high clock speeds.

Polaris might be the whole new 9700 and a real killer upgrade for anyone, even from a 390.


----------



## mus1mus

Quote:


> Originally Posted by *flopper*
> 
> 1080 replace the 980.
> Once they drop the 1080ti anyone with a 980ti that bought a 1080 card will cry.
> People dont think much but the 1080 will have a high mhz OC for sure.
> we will might see 2.5ghz cards out there.
> 
> the 1070 card at around 370euro will be a whole new 970 for nvidia unless AMD have a much better Polaris to replace the 390 series with.
> Or AMD face another year with a superb 970/980 serie with no real answer and *while the 390 and 290 still beats the 970 the customer see 2ghz boost clocks*
> guess how they then reason? The IPC with Pascal is worse than Maxwell and why they went with high clockspeeds.
> 
> Polaris might be the whole new 9700 and a real killer for anyone even with a 390 to upgrade into.


corrected

Bold item is very true.

I have a 1500/1650 290X benching and people were like...

They can't even see the scores were lower than 1300/1600+ cards.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> 1080 replace the 980.
> Once they drop the 1080ti anyone with a 980ti that bought a 1080 card will cry.
> People dont think much but the 1080 will have a high mhz OC for sure.
> we will see 2.5ghz cards out there.
> 
> the 1070 card at around 370euro will be a whole new 970 for nvidia unless AMD have a much better Polaris to replace the 390 series with.
> Or AMD face another year with a superb 970/980 serie with no real answer and while the 390 and 290 still beats the 970 the customer see 2ghz boost clocks
> guess how they then reason? The IPC with Pascal is worse than Maxwell and why they went with high clockspeeds.
> 
> Polaris might be the whole new 9700 and a real killer for anyone even with a 390 to upgrade into.


No, no and NO, sorry. Polaris 10 and 11 were never meant to be 390/390X replacements; even AMD said so. It's more mainstream, aka 1080p-only cards, which let's face it is the main market. AMD wasn't expecting the 1070 to have such performance for a cheap MSRP (although we know MSRP means nothing).

This is why AMD is bringing up Vega at least 4-6 months ahead of what they had planned. But Polaris isn't meant for 1440p and up, not from everything I've read. Vega, if it does well, should destroy the 1080 and possibly even the 1080ti, unless that comes with HBM2 as well.

Personally if Vega does ridiculous numbers I'll be getting it, or not, since I'm getting my ultrawide today and the R9 390 handles that no problem, unless games get ridiculous, but I don't see it.


----------



## flopper

Quote:


> Originally Posted by *bluej511*
> 
> No no and NO sorry. Polaris 10 and 11 were never meant to be 390/390x replacements even amd said so its more mainstream aka 1080ponly cards which lets face it is the main market. AMD wasnt expecting the 1070 to have such performance for a cheap MSRP (although we know mrsp means nothing)
> 
> This is why amd is bringing up vega at least 4-6 months ahead of what they had planned. But polaris isnt meant for 1440p and up not from everything ive read. Vega if it does well should destroy the 1080and possibly even the 1080ti unless that comeswith HBM2 as well.
> 
> Personally if Vega does ridiculous ill be getting it, or not since im getting my ultrawide today and r9 390 handles that no problem, unless games get ridiculous but i dont see it.


Polaris demoed Hitman at 1440p; you don't demo at 1440p unless the card is aimed at that level of performance.
aka replacing the 390 with a low cost die and clocks as high as doable.
The VR statement is about making it as cheap as possible while still doing 390-level performance or better; AMD stated such.

Your argument doesn't make sense vs what AMD has already stated, so guess what?
Pascal shot them in the foot with the OC ability; if a 1070, for example, can reach 2.5GHz with a third-party design, guess which card a load of people would buy?
Unless Polaris OCs a lot and offers a similar performance curve as we have had with the 390, IPC might save them; however, that does not seem to be the case, as they moved Vega ahead of schedule.
Pascal did better than they expected, simple as that.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> Polaris demo on 1440p hitman, you wont demo on 1440p unless the card is aimed towards such performance.
> aka replacing the 390 with a low cost die and with as high clocks as doable.
> The VR statement is making it as cheap as possible while still do a 390 level performance or better, AMD stated such.
> 
> Your argument dont make sense vs what AMD already stated and said so guess what?
> Pascal shot them in the foot with the OC ability, if a 1070 for example can reach 2.5ghz with thirdparty design, guess what card a load of people would buy?
> Unless Polaris OC a lot and offers a similar curve for performance as we have had with the 390 as IPC might save them that however does not seem to be the case as they Moved Vega ahead of schedule.
> Pascal did better than they expected, simple as that.


Total BS, and it's why I never ever take manufacturer benchmarks seriously. The Fury does Hitman at 1440p at 60fps maxed out. Polaris is nowhere near Fury power; it might have been 1440p, but we have no idea at what settings, could have been medium for all we know. High end will do 60fps at 1440p; Polaris won't.

Remember when the Fury X demoed at 4K, yet the 980ti still destroys the Fury X at 4K in every single game haha? Demos and press conferences mean nothing; a mainstream card is a mainstream card. Polaris is just that, affordable 1080p gaming, and it's where AMD has the upper hand. 380/380X vs the weak 960, AMD wins every time. AMD knows they can't compete with the 980ti/1070/1080, so they decided to move up Vega; don't forget Vega was slated for Q2 of 2017.

Btw, you do realize only a very small minority OC their cards, right? It's not what sells. That OC for the 1080 is again pure speculation; again, can't trust Nvidia in press conferences. It might be that 2.5 is the absolute max OC of a binned 1080.


----------



## Agent Smith1984

I wouldn't be shocked if the 1070/1080 show up and are total 1080p/VR monsters, but choke up at anything over 1440p....

I may seriously grab a 1070 to check out though.

I am mainly curious to see how it compares to my brother's 980ti that runs at 1500/8200, and also how it compares to what my Fury did at 1060/560 (highest clock I got without voltage before I sold it).

Also curious to see the jump from my 980 KPE that ran at 1526/7940 daily, since they will have essentially the same amount of cores at a higher clock, and a similar memory bus....


----------



## bluej511

So just got my ultrawide and it is AWESOME!!!!!!!!!

First thing I did, ran the FreeSync demo. And then PROBLEM: it started to flicker on and off, so I thought the VESA bracket (that's touching the DP cable, bad design on LG there) was the culprit. Bent the bracket back a hair and started her back up again. Then back to the same problem, little specks of flicker on the image, the image cutting in and out, and the FreeSync demo crashed.

So I switched DP ports on the GPU, as I've read about other people having issues. I'm not sure if it matters which port you're supposed to use, but I think it just might. Seems OK now; it didn't do it at the desktop, it was only in the FreeSync demo. Haven't tried gaming yet but will do. I set it to 75Hz of course.


----------



## tolis626

Hey guys, quick question here out of curiosity mostly. What would you say is the max "safe" voltage for an air cooled Hawaii card? I mean, as long as temps are in check, how high can it go before the high voltage itself is a problem?

And to those who do BIOS modding, is there any way to have LLC reduce VDroop or is it what it is and that's it? Thanks!

@bluej511

Nice! I was hoping to get an ultrawide for myself at some point. Glad to see more people seem to like them and it's not just a niche thing. Hope you have fun with it!


----------



## Streetdragon

*Green is ugly*

I have a little question about ULPS in Crossfire.
If I start my PC, my second card is running: 0% idle, but it is active.
If I now disable Crossfire in CCC and re-activate it, then the card goes into the ULPS state. Don't know why, but it is irritating.
Is there a solution? Didn't find any on Google and search^^
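For reference, the workaround usually passed around for ULPS quirks is disabling it entirely, either via the ULPS option in MSI Afterburner's AMD compatibility settings or with a registry tweak along these lines (a sketch: the `0000` subkey number varies per machine, so check every `00xx` key under the display class key for an `EnableUlps` value):

```reg
Windows Registry Editor Version 5.00

; Set EnableUlps to 0 for each GPU subkey (0000, 0001, ...) that has it,
; then reboot. This keeps the second card out of the ULPS sleep state.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Usual caveat: export the key first as a backup, and re-check after driver reinstalls since they can recreate the value.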


----------



## Streetdragon

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, quick question here out of curiosity mostly. What would you say is the max "safe" voltage for an air cooled Hawaii card? I mean, as long as temps are in check, how high can it go before the high voltage itself is a problem?
> 
> And to those who do BIOS modding, is there any way to have LLC reduce VDroop or is it what it is and that's it? Thanks!
> 
> @bluej511
> 
> Nice! I was hoping to get an ultrawide for myself at some point. Glad to see more people seem to like them and it's not just a niche thing. Hope you have fun with it!


I would say <=1.3V should be OK if it stays cool enough.


----------



## djtmalta

Hello, does anyone know how to mod a BIOS for my MSI 390? I can't figure out how to leave the voltage as it is (or increase it a little) and increase the power limit to +50. I know (I think) how to do the core to 1135 and the memory to 1600... If someone could help me out I'd really appreciate it. I'm tired of redoing the overclock settings every time I reinstall the Crimson drivers. Or does someone already have a moderately overclocked BIOS that I could use and flash?

Thanks,

David


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, quick question here out of curiosity mostly. What would you say is the max "safe" voltage for an air cooled Hawaii card? I mean, as long as temps are in check, how high can it go before the high voltage itself is a problem?
> 
> And to those who do BIOS modding, is there any way to have LLC reduce VDroop or is it what it is and that's it? Thanks!
> 
> @bluej511
> 
> Nice! I was hoping to get an ultrawide for myself at some point. Glad to see more people seem to like them and it's not just a niche thing. Hope you have fun with it!


Core temps are OK even at pretty high values; the thing to check is VRM temps, which are a bit more important.


----------



## tolis626

Quote:


> Originally Posted by *Streetdragon*
> 
> I would say <= 1,3V. should be OK if it stay cool enough


Well, on my setup the max temps I see are in the high 70s, or (very rarely) maybe low 80s, but that's at +125mV (a VID of 1.375V). 1.3V does seem a bit on the low side. By safe I didn't mean TOO safe.
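For reference, the arithmetic behind that VID figure. The 1.25V stock VID here is an assumption (stock VID varies per card); it's only meant to show how the offset adds up:

```python
# +125 mV offset on an assumed 1.25 V stock VID (stock VID varies per card)
stock_vid = 1.250                    # volts -- assumed baseline, not measured
offset_mv = 125                      # the +125 mV offset mentioned above

print(stock_vid + offset_mv / 1000)  # 1.375 V, matching the VID quoted above
```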









Quote:


> Originally Posted by *bluej511*
> 
> Core temps are ok to pretty high temps, the thing to check is VRM temps its a bit more important.


Well, my VRMs are usually a little cooler than the core is, so usually low to mid 70s, even at high overclocks. So no, temperatures aren't the issue for what it's worth.









PS : I noticed that increasing or decreasing my AUX voltage didn't actually do anything for the temperatures. They don't seem to be higher at +50mV than at stock. What's up with that?


----------



## bluej511

So here's a god-awful picture (the S4 Mini takes awful pictures in low light). It's def taxing my 390 haha. Unity drops into the low 40s, when before it would drop to the high 40s/low 50s. I still need to adjust the colors and whatnot; they're awful. GTA V is just insane on ultra. The height is just about identical to my 23", the width is just BAM.



GTA V in awful phone quality haha. I shouldn't even be putting it up.


----------



## m70b1jr

This is so weird. I flashed Asus's R9 390 BIOS on my card, and in Trixx I don't black screen throwing +200mV, but if I edit my DPMs manually, rebooting the PC causes the voltages to reset... I love Afterburner, mainly because I can monitor all my temps and clock speeds in the top left corner of my monitor. I guess I'll have to suck it up and use Trixx if I want that high an OC.


----------



## bluej511

Some ultrawide goodness. It def takes a little bit of a hit depending on the game.


----------



## mus1mus

Quote:


> Originally Posted by *m70b1jr*
> 
> This is so weird. I flashed Asus's R9 390 BIOS on my card, and in Trixx I don't black screen throwing +200mV, but if I edit my DPMs manually, rebooting the PC causes the voltages to reset... I love Afterburner, mainly because I can monitor all my temps and clock speeds in the top left corner of my monitor. I guess I'll have to suck it up and use Trixx if I want that high an OC.


ITurbo for the win.


----------



## tolis626

Quote:


> Originally Posted by *mus1mus*
> 
> ITurbo for the win.


Why not use AB scripts? That's how I do it, seeing how I despise both iTurbo and Trixx.

@m70b1jr
You can use this to have AB set offsets over +100mV. It won't show in the application, but it works, so there's that. If you move the slider you reset back to the +100mV max, though I think it persists through reboots.

Code:


cd "C:\Program Files (x86)\MSI Afterburner"
MsiAfterburner.exe /wi6,30,8d,14
MsiAfterburner.exe

Your voltage offset is that last number, 14. It's hexadecimal, and it's the number of voltage steps (6.25mV each) you apply as your offset. 14 in hexadecimal is 20 in decimal, so the above means a +125mV offset.









PS : I have this saved as a .bat file on my desktop and I just right-click and run it as administrator whenever I need more voltage. Quick and almost easier than going into Afterburner.
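If you want a different offset, the last `/wi` argument can be computed from the 6.25mV step size described above. A minimal sketch (the function name is just illustrative):

```python
# Convert a desired +mV offset into the hexadecimal step count
# (6.25 mV per step) used as the last argument of the /wi switch above.

def offset_to_hex_steps(mv_offset):
    """Return the hex step count for a +mV offset (6.25 mV per step)."""
    hundredths = round(mv_offset * 100)        # work in integer hundredths of a mV
    steps, remainder = divmod(hundredths, 625)  # 6.25 mV == 625 hundredths
    if remainder:
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(steps, 'x')

print(offset_to_hex_steps(125))  # '14' -> the +125 mV example above (20 steps)
print(offset_to_hex_steps(100))  # '10' -> +100 mV (16 steps)
```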


----------



## Streetdragon

Quote:


> Originally Posted by *Streetdragon*
> 
> *Green is ugly*
> 
> I have a little question about ULPS in Crossfire.
> If i start my PC my secound card is running. 0% Idle, but is aktive.
> If i now disable crossfire in CCC and aktivate it, then the card goes in the ULPS state. Dont know why but it is irritating.
> Is there a solution? Didnt found any on google and search^^


Found the reason...
Skype is waking my GPU 2 all the time -.- just ***


----------



## Agent Smith1984

My apologies, I'm not trying to avoid you. My wife can't seem to find the tracking number and I have been calling USPS to see if they can recover it for me which is why I haven't gotten back to you. I never have her ship anything so she didn't know the importance of keeping the receipt. I know for a fact the GPU is delivering by Monday to you though. Please don't worry, I have sold 6 or 7 high end GPU's on OCN with no issues or complaints, I just got held up a little longer than usual with this one. My apologies again, and if I get the number I will send it over ASAP. Thanks again for your purchase.


----------



## bluej511

Quote:


> Originally Posted by *Streetdragon*
> 
> Found the reason...
> Skype is waking my gpu 2 all the time-.- just ***


Haha, nice. I used to have to block Skype Home's connections, as it would spike my GPU usage. It's fixed now through one of their updates.


----------



## n64ADL

Anybody know if the 480X will be compatible with the 390? This gen of GPUs is starting to really make me regret getting a FreeSync monitor....


----------



## bluej511

Quote:


> Originally Posted by *n64ADL*
> 
> Anybody know if the 480x will be compatible with the 390. This gen of gpu's is starting to really make me regret getting a free sync monitor....


More than likely NOT; it's a completely different architecture. Why would you regret FreeSync?


----------



## n64ADL

I don't regret FreeSync, but with that 1070 by the green team.....


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> Some ultrawide goodness. It def takes a little bit of a hit depending on the game.


What can you tell me about that monitor? Any complaints? What's good? What's bad?


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> what can you tell about that monitor? any complaints? what's good? what's bad?


So far ultrawide is AWESOME. It def taxes the GPU quite a bit more than 1080p does (more than I thought, anyway); it has about 700,000 more pixels, but it's not too bad.

FreeSync works great. In some games you notice it way better than in others (weird, I know): Tomb Raider, for example, is super smooth virtually down to 40fps, while in AC Unity/Syndicate anything below 50fps is a total crap fest (dunno if that's FreeSync or just the game, but I do recall that when the fps dropped it ran juddery on my other screen too; GTA V does the same in grassy areas). Screen tearing is absolutely gone, which is what drove me insane. I have FRTC set to 74fps so there's no chance of it going over.

Idk if I should set vsync on or off with FreeSync, but I couldn't tell the difference. The desktop is absolutely massive. I had an issue with the picture cutting in and out when I first tried it; I believe it was a bad DP port on my card, as the other one I'm using is just fine.

Only bad thing is the bezel is very VERY thin BUT there's a small black edge that's part of the screen, so it's actually more like 28.5" instead of 29". No worries there, though. Here's how the desktop looks, taken with a DSLR; new keyboard and new mouse in the shot as well. Excuse the mess.



All my games have worked so far, even going back to Mafia II. AC Black Flag needs a tweak, but it works (and looks fantastic).
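The "about 700,000 more pixels" figure above checks out; a quick back-of-the-envelope comparison, assuming a 2560x1080 ultrawide versus a standard 1920x1080 panel:

```python
# Rough pixel-count comparison: 21:9 ultrawide vs. standard 1080p.
uw = 2560 * 1080    # 2,764,800 pixels
fhd = 1920 * 1080   # 2,073,600 pixels

print(uw - fhd)                 # 691200 -> "about 700,000 more pixels"
print(round(uw / fhd - 1, 3))   # 0.333  -> roughly a third more work per frame
```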


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> So far ultrawide is AWESOME. It def taxes the gpu quite a bit more then 1080p (more then i thought anyways) does have about 700,000 more pixels but not too bad.
> 
> Fressync works great, some games you notice it way better then others (weird i know), for example Tomb Raider is super smooth virtually down to 40fps, AC Unity/Syndicate anything below 50fps and its total crap fest (dunno if thats freesync or just the game, but i do recall when the fps drops it ran juddery on my other screen too, GTA V does the same in grassy areas). Screen tearing is absolutely gone which is drove me insane. I have the frtc set to 74fps so no chance of it going over.
> 
> Idk if i should set vsync on with freesync or off but i couldnt tell the difference. Desktop is absolutely massive. I had an issue with the picture cutting in and out when i first tried it out, bad dp port on my card i believe other one im using is just fine.
> 
> Only bad thing is the bezel is very VERY thin BUT theres a small black edge thats part or the screen so its actually more like 28.5" instead of 29" but no worries there. Heres how the desktop looks taken with a DSLR, new keyboard and new mouse in the shot as well. Excuse the mess.
> 
> 
> 
> All my games have worked so far even going back to Mafia II, AC Black Flag has a tweak and it works (looks fantastic).


Have you noticed any ghosting? You know, with images changing at a fast pace, or when playing videos...


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> have you noticed any ghosting? you know between changing images in a fast pace? or playing videos...


I haven't yet. I tried the UFO ghosting test online and the FreeSync demo, and I've found zero ghosting. Idk if LG changed the panel between the 67 and 68 versions, but I haven't noticed any, and I would (I'm super anal about it). I did watch the videos of the 67's ghosting, so idk. I don't have my brightness crazy high either, as my eyes are sensitive to light; I think I'm at a 30-40 brightness level lol.


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> I havent yet, i tried ufo ghosting online and the freesync demo and i have found zero ghosting. Idk if LG changed the panel between the 67 and 68 versions but i havent noticed any. I would too as im super anal but i havent. I did watch the videos of the 67 ghosting so idk. I dont have my brightness crazy high either as my eyes are sensitive to light. I think im at 30-40 brightness level lol.


Nice. I need a new monitor, and I saw this one, but I couldn't find enough info about it.


----------



## patriotaki

Well, remember when I had coil whine on my R9 390?








I think the cause was the PSU; my brand new XFX TS 750W Gold died today.


----------



## tolis626

Quote:


> Originally Posted by *patriotaki*
> 
> well remember when i had coil whine on my r9 390?
> 
> 
> 
> 
> 
> 
> 
> 
> i think the cause was the PSU, my brand new XFX TS 750w gold died today


Well, would you look at that. Talk about bad luck. At least it didn't decide to die spectacularly and it didn't take anything else with it. So that's good.









Too bad you sold your 390. You could've seen if the whine improved with the new PSU.


----------



## patriotaki

Quote:


> Originally Posted by *tolis626*
> 
> Well, would you look at that. Talk about bad luck. At least it didn't decide to die spectacularly and it didn't take anything else with it. So that's good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too bad you sold your 390. You could've seen if the whine improved with the new PSU.


Bad luck everywhere!!!
I'll try to get the Corsair RM650x instead of replacing the XFX TS 750W.
Hope my luck changes from now on!

Yeah, I sold my 390... waiting for Polaris vs Pascal benchmarks. I'll get either the 1070 or the 490.
Benchmarks will show up soon....


----------



## battleaxe

Quote:


> Originally Posted by *patriotaki*
> 
> Bad luck everywhere!!!
> Ill try to get the corsair rm650x instead of replacing the xfx ts 750w
> Hope my luck changes from now on!
> 
> Yea i sold my 390..waiting for polaris vs pascal benchmarks.. Ill get either the 1070 or the 490
> Benchmarks will show up soon....


DX12 = 490 supremacy as a guess.









Let's hope anyway. LOL


----------



## Mazda6i07

Really hoping AMD drops some good cards this year; while I'm happy with my 390, it's always fun to upgrade.







Looking forward to their release whenever that may be.


----------



## bluej511

Quote:


> Originally Posted by *Mazda6i07*
> 
> Really hoping Amd drops some good cards this year; while I'm happy with my 390 it's always fun to upgrade
> 
> 
> 
> 
> 
> 
> 
> Looking forward to their release whenever that may be.


Polaris 10 is supposed to compete with the R9 390 and the GTX 1070, but at a lot fewer watts from what I understand. Polaris 11 is going to be even less, which is ridiculous. If Polaris 10 beats the R9 390 by a fair amount, I'll get one and put her on water; if not, I'm waiting for Vega, and maybe I'll eventually add another ultrawide lol.


----------



## Mazda6i07

Yep, I'm planning on waiting for Vega as well, and will likely upgrade then, assuming their cards perform well. Looking forward to it; I'm also not in a rush to upgrade like most people.


----------



## patriotaki

Offtopic: hey guys, sorry for asking this, but I have a thread here and no one is answering xD
http://www.overclock.net/t/1599931/how-to-clean-acrylic-window-panel#post_25164339 , go to my last post there
Is it safe to use this product to clean the acrylic window?







It says it's non-toxic, alcohol-free, a biological product (it's even safe to pour into the sea), and an all-surface cleaner.


----------



## Harry604

Do 390s unlock to 390Xs?


----------



## mus1mus

Quote:


> Originally Posted by *Harry604*
> 
> do 390s unlock to 390x


Pretty uncertain.

But you'd have better odds going the other way and turning a 390X into a 390.


----------



## kizwan

Quote:


> Originally Posted by *Harry604*
> 
> do 390s unlock to 390x


So far I only know of one successful unlock, but seriously, don't hold your breath.
http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool/1030#post_25062718


----------



## rdr09

Quote:


> Originally Posted by *n64ADL*
> 
> I don't regret freesync but with that 1070 by green team.....


I read the specs and I'm not sure if they're true. The 1070 has a 256-bit bus.


----------



## bluej511

Quote:


> Originally Posted by *rdr09*
> 
> I read the spec and not sure if it is true. The 1070 has 256 bit.


The 1070 looks like a total bust and a weak card; I'd be shocked if it beats a 970 lol. The 1080 on the other hand looks pretty crazy, but $700? No thanks, and the 1080 Ti is probably gonna be a grand. I really REALLY hope Vega is $500 and pisses over both lol.


----------



## rdr09

Quote:


> Originally Posted by *bluej511*
> 
> The 1070 looks like a total bust and weak card, id be shocked if it beats a 970 lol. The 1080 on the other hand looks pretty crazy but 700$ no thanks, the 1080ti is probably gonna be a grand. I really REALLY hope Vega is 500$ and pisses over both lol.


WTH, just saw the specs for the 1080 . . . 256-bit. I hope it's a typo. lol


----------



## bluej511

Quote:


> Originally Posted by *rdr09*
> 
> WTH, just saw the spec for the 1080 . . . 256 bit. I hope it is a typo. lol


Yeah, but it uses GDDR5X; honestly it doesn't even seem like 256-bit vs 512-bit makes any difference haha. The 1080 is crushing everything, it's ridiculous. And I'm talking actual gaming benchmarks; STILL no 4K 60fps on a single card, as I've said all along lol.


----------



## rdr09

Quote:


> Originally Posted by *bluej511*
> 
> Yea but it uses ddr5x, honestly doesnt even seem like 256 to 512 bit makes any difference haha. The 1080 is crushing everything its ridiculous. And im talking actual gaming benchmarks, STILL no 4k 60fps in a single card as ive said all along lol.


A single card, maybe not. SLI . . . that's prolly why they set it up for only 2-way. Ugh.


----------



## flopper

Quote:


> Originally Posted by *bluej511*
> 
> Yea but it uses ddr5x, honestly doesnt even seem like 256 to 512 bit makes any difference haha. The 1080 is crushing everything its ridiculous. And im talking actual gaming benchmarks, STILL no 4k 60fps in a single card as ive said all along lol.


You won't see 1440p 60fps either;
the 980 Ti couldn't even do that at 1080p last year.
4K is basically at least 6 years away before it's reasonable to expect anything close to 60fps.


----------



## TehMasterSword

http://i.imgur.com/x8pf71O.png

Hype, boys?


----------



## yuannan

Quote:


> Originally Posted by *TehMasterSword*
> 
> http://i.imgur.com/x8pf71O.png
> 
> Hype, boys?


Man, this is heating up.

I was for sure going to get a GTX 1080 for the new "fast sync", as I find tearing annoying, but I don't want a new monitor; my current one is already overpriced and OP for my needs.

The 490 better be 980 Ti+!!!


----------



## m70b1jr

Quote:


> Originally Posted by *yuannan*
> 
> Man, this is heating up.
> 
> I was for sure going to get a gtx 1080 for the new "fast sync" as I find tearing annoying but don't want a new monitor as my current one is already overpriced and OP for my need.
> 
> The 490 better be 980ti+!!!


Well I guess XFX has a job opening now.


----------



## monster4bob

Need some help with my MSI 390x

It idles at around 47-55 degrees Celsius; I don't know if that is normal. Also, when I use the MSI Gaming App and turn on silent mode, I reach temps of 75+ at idle and it only goes up.

Otherwise, when I use Afterburner and set the fans to about 40-50%, I get 65-68 degrees Celsius playing BioShock Infinite maxed out at 1080p for about 30 minutes. I haven't tried for any longer.

It's the idle temps that have me worried; I've heard of people getting idle temps of 30-35 Celsius with no GPU fans spinning.


----------



## bluej511

Quote:


> Originally Posted by *monster4bob*
> 
> Need some help with my MSI 390x
> 
> It idles at around 47-55 degrees Celsius. I don't know if that is normal. Also when I use the MSI gaming app and turn on silent mode I reach Temps of 75+ idle and it only goes up.
> 
> Otherwise wise when I use Afterburner and set fans to about 40-50% I get 65-68 degrees Celsius playing bioshock infinite maxed out at 1080p for about 30min. I haven't tried for any longer.
> 
> It's the idle Temps that has me worried I heard ppl getting idle Temps of 30-35 Celcius with no gpu fans spinning.


Depends on your case and airflow. When people say idle they mean absolutely 0 load on the GPU; even having Chrome/IE/Firefox open will load your GPU a bit.

If I remember correctly, my Nitro with the fans off at idle was 35-36°C. Idle temps don't matter much, especially at such low loads. Personally I didn't like it, so I made a custom fan curve in Afterburner: 30% @ 30°C ramping to 60% @ 60°C, then capped at 60%, since beyond that it's loud.
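A curve like that is just linear interpolation between two set points. A minimal sketch using the 30%@30°C to 60%@60°C endpoints described above (those endpoints are one user's preference, not a default):

```python
# Sketch of a linear fan curve: 30% duty at 30C rising to 60% at 60C,
# then held at 60% (anything higher gets loud on these coolers).

def fan_speed(temp_c, t_min=30, t_max=60, s_min=30, s_max=60):
    """Return fan duty cycle (%) for a given core temperature (C)."""
    if temp_c <= t_min:
        return s_min
    if temp_c >= t_max:
        return s_max
    # linear ramp between the two set points
    return s_min + (s_max - s_min) * (temp_c - t_min) / (t_max - t_min)

for t in (25, 30, 45, 60, 80):
    print(t, fan_speed(t))  # 45C lands mid-ramp at 45%
```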


----------



## monster4bob

Quote:


> Originally Posted by *bluej511*
> 
> Depends on your case and airflow. When people mean idle they mean absolute 0 loads on the gpu, even having chrome/ie/firefox open will load your gpu a bit.
> 
> I think if i remember correctly my nitro with no fan at idle was 35-36°C. Idle temps don't matter much especially at such low loads. Me personally i didn't like it and made a custom fan curve using afterburner. 30%@30°C>60%@60°C would keep it at 60% after that its loud.


Okay, when I say idle, the only thing I have running is Skype. My airflow is mediocre, I think; I'll post a pic later, I'm out at the moment. I have one Cooler Master fan at the top plus the stock case fan.

It's more or less gaming on silent mode that has me worried, which makes me reach temps of 75+ with nothing but Skype open. I didn't have this problem until I took the card out and reinstalled it to make my cables neater. Also, thanks for the quick response.


----------



## bluej511

Quote:


> Originally Posted by *monster4bob*
> 
> Okay when I mean idle the only thing I have running is Skype. My air flow is mediocre I think. I'll post a pic later. I'm out at the moment. I have one cool air master fan at the top plus the stock case fan.
> 
> It's more or less that gaming on silent mode that has me worried which makes me reach Temps of 75+ with nothing but Skype open. I didn't have this problem until I took it out and reinstalled it again to make my cables neater. Also thanks for the quick response


Silent mode is pretty much useless for temps but good for, well, quiet. I ran it just ONCE with MSI Afterburner, letting Crimson do the auto fan profile, and I reached 84°C lol. Put my custom profile back in and it was back to around 68-71°C. Now on water? Yeah, 41°C with a dozen fans that are STILL quieter than the Nitro cooler at 60% lol.


----------



## monster4bob

Quote:


> Originally Posted by *bluej511*
> 
> Silent mode is pretty much useless for temps but good for well quiet. I just ONCE ran it with msi afterburner and let Crimson do the auto fan profile. I reached 84°C lol. Put my custom profile back in back to around 68-71°C. Now on water? Yea 41°C with a dozen fans that is STILL quitter then the nitro cooler at 60% lol.


Alright, thanks for the help. So it's nothing I should be worried about?


----------



## Agent Smith1984

On MSI cards, the silent fan profile will hit 45-60°C at idle, and the stock fan profile will hit around 80-90°C depending on case airflow.


----------



## bluej511

Nothing to worry about. I will say this though: a few months ago, Skype put a load on my GPU because of Skype Home and the ads running in its banner; it would peg my GPU at 100%. Check whether your GPU is at 0% usage at idle.


----------



## monster4bob

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Silent fan will hit 45-60c idle and stock fan profile will hit around 80-90c depending on case flow on msi cards


That's my problem: on silent via the MSI Gaming App I'm getting 75+ temps.

What do you mean by the stock fan profile? Thanks for the input.
Quote:


> Originally Posted by *bluej511*
> 
> Nothing to worry about. I will say this though, skype did a few months ago put a load on my gpu because of skype home and the ads they had running in the banner. Would peg my gpu at 100%. Check to see at idle if your gpu is at 0% usage.


I'll be home in about 15min can check and tell you then.


----------



## monster4bob

Okay, so when I set my Afterburner profile to 1:1 for fan speed and temp, my core load jumps around: it goes from 0-93, usually staying at 0 but jumping to random numbers, with 93 being the highest. This is with Chrome (one tab open), Skype, and Open Hardware Monitor running.


----------



## monster4bob

Here is a picture of my setup, in case you were wondering about the airflow.


----------



## Worldwin

Get two fans in the front for intake. This should help your airflow immensely, as you have no intake right now.


----------



## monster4bob

Quote:


> Originally Posted by *Worldwin*
> 
> Get two fans in the front for intake. This should help your airflow immensely as you have no intake right now.


You mean here, where I circled in red?


----------



## Worldwin

Quote:


> Originally Posted by *monster4bob*
> 
> you mean here where i circled in red?


Close enough. I mean put two fans for INTAKE at the front. In the image, both positions are empty; there are no fans there.


----------



## monster4bob

Ah, I got you now. Okay, you mean to the right of that? Will do. Thank you.


----------



## bluej511

Intake fans always help. If that's an 850 G1 PSU, be careful before the PSU nazis come in here lol.


----------



## monster4bob

Quote:


> Originally Posted by *bluej511*
> 
> Intake fans always help. If thats an 850 G1 psu be careful before the psu nazi come in here lol.


Thanks, will get the fans for sure : )

LOL, why did I make the mistake of buying that? I'm a first-time PC builder, so I don't know the ins and outs of system building yet. Thanks for putting up with me. Also, sorry, last question: is it normal for the GPU load to fluctuate when not playing a game? My GPU load goes from 0-100 and anywhere in between, randomly.


----------



## bluej511

Quote:


> Originally Posted by *monster4bob*
> 
> Thank will get the fans for sure : )
> 
> LOL why did i make a mistake buying that. I'm a first time pc builder so don't know the INs and OUTs of system building yet. Thanks for putting up with me. Also sorry last question, is gpu load fluctuating when not playing game normal. My gpu load goes from 0-100 and anywhere in between randomly.


If that's what you've got as a PSU, it's fine; it's a basic, average PSU, nothing good, nothing bad. He just likes to come in and tell people to replace their PSUs unless they're up to his standards. Pretty sure he's paid by someone, as no one cares more about what PSU to put in than this guy, but oh well. He has several threads on PSUs where he says it's not a hate/bash thread, yet he goes around telling people to replace them in literally every thread. He doesn't understand basic electricity, but it's OK.

And as far as idle goes, it needs to be completely idle: no Skype, no Chrome, no nothing. With those running it will always fluctuate, as there's something using the GPU. I had to block Microsoft/Skype ads a few months ago, as the Skype homepage with ads ran my GPU to 100%. Very odd, but it's fixed now.

Since the stock fan profile on the MSI/Sapphire cards doesn't turn the fans on till 60°C, they get hot quite fast, but honestly it's nothing to worry about. Depends on whether you like it quiet or cool.


----------



## monster4bob

Quote:


> Originally Posted by *bluej511*
> 
> If thats what you got as a PSU its fine, its a basic average PSU nothing good nothing bad, he just likes to come in and tell people to replace their PSUs unless its ones up to his standards. Pretty sure hes paid by someone as no one really cares what PSU to put in more then this guy but oh well. He has several threads on PSUs where he says its not a hate/bash thread but goes around telling people to replace em, in literally every thread. He doesnt understand basic electricity buts its ok.
> 
> And as far as idle, it needs to be completely idle, no skype no chrome no nothing. With those running it will always fluctuate as theres something using the GPU. I had to block microsoft/skype ads few months ago as the skype homepage with ads ran my gpu to 100% very odd but its fixed now.
> 
> Since the stock fan profile of the MSI/Sapphire cards doesnt turn the fans on till 60°C they get hot quite fast but honestly its nothing to worry about. Depends if you like it quiet or cool.


Alright, perfect. Thank you for the info, and thanks for putting up with me (and everyone else that helped me).







I really appreciate it.

LOL at the PSU guy, that gave me a good laugh.


----------



## bluej511

Quote:


> Originally Posted by *monster4bob*
> 
> alright perfect thank you for the info, and thanks for putting up with me ( and everyone else that helped me )
> 
> 
> 
> 
> 
> 
> 
> I really appreciate it.
> 
> LOL at the PSU guy, that gave me a good laugh.


It's what the forum is for; if no one asked for help, the forum wouldn't even exist.

And yes, if your case has room for intake fans and you don't have any, I would def add some. It might help build some positive pressure in there and get the warmer air out of the case a bit easier.


----------



## rdr09

Quote:


> Originally Posted by *flopper*
> 
> you wont see 1440p 60fps either.
> the 980ti couldnt do that with 1080p last year.
> 4k is basically 6 years away at least before its reasonable to expect anything close to 60fps


True. Looks like 4K will still require at least 2 GPUs, if maxing out the games is the goal.


----------



## patriotaki

Quote:


> Originally Posted by *rdr09*
> 
> True. Looks like 4K will require at least 2 gpus still. If maxing out the games is the goal.


Maybe 2x 1080s are enough for 4K?
Hmm... maybe the upcoming 1080 Ti will be almost enough for 4K?
We don't know... graphics evolve much faster than the GPUs do...


----------



## rdr09

Quote:


> Originally Posted by *patriotaki*
> 
> maybe 2x1080's are enough for 4k?
> hm.. maybe the upcoming 1080ti would be almost enough for 4K?
> we dont know...graphics evolve much faster than the gpus..


2 should be more than enough, though I'm not sure about maxing out the newer titles. Heck, I only have 2 290s at stock to play at 4K.



Kinda hard to win if it is not playable, right?

EDIT: Not sure how the 256-bit bus will come into play, though, for such a fast card. I read in the news section it is good for up to 2-way SLI.


----------



## flopper

Quote:


> Originally Posted by *rdr09*
> 
> True. Looks like 4K will require at least 2 gpus still. If maxing out the games is the goal.


Vega might be a good step toward a single card for 4K, but seeing how much fps is pushed today at 1080p and 1440p, I don't hope for too much.
At least MSAA etc. isn't needed at 4K.

Quote:


> Originally Posted by *patriotaki*
> 
> maybe 2x1080's are enough for 4k?
> hm.. maybe the upcoming 1080ti would be almost enough for 4K?
> we dont know...graphics evolve much faster than the gpus..


Lower a few settings and image quality won't be an issue, as you're normally focused on gameplay anyhow.
I ran BF4 with a 290 at 115fps at 5040x1050, so if you're chasing fps, a single high-end card is well doable at 4K with some adjustments.

It kind of depends what you want.
I found the 390 offers splendid fps for my gaming needs at 1440p.

Waiting for 4K HDR monitor support has made me hold off for whatever comes out on the market.
What good is the GPU if you can't see things well on screen?


----------



## ziggystardust

Recently I started getting those infamous driver crashes and the "driver stopped responding and has recovered" error message while browsing in Firefox or Chrome occasionally. I don't remember exactly after which driver it started happening. As far as I'm aware it doesn't happen while playing games, since I finished Doom and played a lot of Stellaris, Witcher 3, Forza and some other games....

I have 390X Nitro, i7 4770k, 16gb of ram and currently on Windows 10 Pro with latest updates and Crimson 16.5.2.1.

Any suggestions?


----------



## bluej511

Quote:


> Originally Posted by *ziggystardust*
> 
> Recently I started getting those infamous driver crash and "driver stopped responding and has recovered" error message while browsing on Firefox or Chrome occasionally. I don't remember after which driver it started happening exactly. As far as I'm aware it doesn't happening while playing games since I finished Doom, played a lot of Stellaris, Witcher 3, Forza and some other games....
> 
> I have 390X Nitro, i7 4770k, 16gb of ram and currently on Windows 10 Pro with latest updates and Crimson 16.5.2.1.
> 
> Any suggestions?


Yeah, suck it up haha. I'm on 16.5.2 and mine actually just crashed the other day while watching videos in Chrome and nothing else. It hasn't happened in 2 days, but it's happened to me before where it ended up being constant. I'm guessing it might have something to do with Flash, so I can't wait till Chrome goes to HTML5 and ditches Flash altogether.

Before switching over, though, I was on 16.3.1 and had ZERO crashes.


----------



## ziggystardust

Quote:


> Originally Posted by *bluej511*
> 
> Yea suck it up haha. Im on 16.5.2 and mine actually just crashed the other day while watching videos on chrome and nothing else. Hasnt happened in 2 days but its happened to me before where it ended up being constant. Im guessing it might have something to do with flash so cant wait till chrome goes to html5 and ditches flash all together.
> 
> Before switching over though i was on 16.3.1 and had ZERO crashes.


That really sucks... It was fine for the last 2 days for me as well, but today I switched power efficiency off to play some Fallout 4 and it happened twice.

I asked some friends with AMD cards, but none of them seem to be getting these crashes.

So there is no fix or workaround yet?


----------



## bluej511

Quote:


> Originally Posted by *ziggystardust*
> 
> That really sucks... It was fine for the last 2 days for me as well but today I switched the power efficiency off to play some Fallout 4 and it happened twice.
> 
> I asked to some friends with AMD but none of them seems to getting these crashes.
> 
> So there is no fix or a workaround yet?


None that I've seen. A few people say to turn off hardware acceleration in Chrome/Firefox, which I've done before when it was happening, and it made zero difference, so I stopped bothering. It's annoying but whatever, I think some Nvidia people get it too.


----------



## christoph

Quote:


> Originally Posted by *ziggystardust*
> 
> That really sucks... It was fine for the last 2 days for me as well but today I switched the power efficiency off to play some Fallout 4 and it happened twice.
> 
> I've asked some friends with AMD cards, but none of them seem to be getting these crashes.
> 
> So there is no fix or a workaround yet?


Have you OC'd your system? Try stepping back a little...

What's your system? Have you checked voltages?


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> Have you OC'd your system? Try stepping back a little...
> 
> What's your system? Have you checked voltages?


OC doesn't matter, my R9 390 does it at stock speeds haha.


----------



## ziggystardust

Quote:


> Originally Posted by *christoph*
> 
> Have you OC'd your system? Try stepping back a little...
> 
> What's your system? Have you checked voltages?


No OC at the moment. Like I said, it wasn't happening a few drivers ago, and it only happens while browsing in Firefox or Chrome.


----------



## Streetdragon

How much of a performance gain could I get if I swapped my R9 290 Vapor-X for a second R9 390 Nitro?


----------



## rdr09

Quote:


> Originally Posted by *Streetdragon*
> 
> How much of a performance gain could I get if I swapped my R9 290 Vapor-X for a second R9 390 Nitro?


I compared my scores using only the Firestrike bench, and it seems I have to OC my 290 by 100 MHz more to match a 390. I think the performance gain will come mainly from the 8GB goodness, which I wish I had.









BTW, i have a reference one at 947 MHz stock.

http://www.3dmark.com/3dm/11285778?

Original bios. 1200 core to beat a stock 390X.


----------



## Streetdragon

Quote:


> Originally Posted by *rdr09*
> 
> I compared my scores using only the Firestrike bench, and it seems I have to OC my 290 by 100 MHz more to match a 390. I think the performance gain will come mainly from the 8GB goodness, which I wish I had.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW, i have a reference one at 947 MHz stock.
> 
> http://www.3dmark.com/3dm/11285778?
> 
> Original bios. 1200 core to beat a stock 390X.


Yeah, the 4GB of VRAM is the main reason, but I don't know if an upgrade is worth it. A new one would cost me around 300€.

Edit: and I wanna keep the Vapor-X, because it looks sweet^^

And maybe put the M01 waterblock on both cards.


----------



## rdr09

Quote:


> Originally Posted by *Streetdragon*
> 
> Yeah the 4Gb Vram are the main reason. but i dont know if a upgrade is worth it. A new one would cost me around 300€
> 
> Edit and i wanna keep the vapor, because it looks sweet^^


I've seen it and it does look sweet. Just hang on for HBM2.


----------



## m70b1jr

Quote:


> Originally Posted by *Streetdragon*
> 
> Yeah the 4Gb Vram are the main reason. but i dont know if a upgrade is worth it. A new one would cost me around 300€
> 
> Edit and i wanna keep the vapor, because it looks sweet^^
> 
> and maybe put the waterblock m01 on both cards


I'd just hold on and wait for Polaris / Vega


----------



## Streetdragon

Quote:


> Originally Posted by *m70b1jr*
> 
> I'd just hold on and wait for Polaris / Vega


Hmm, then I'll wait for Vega, save some €, and go straight for crossfire... that sounds good^^


----------



## ziggystardust

Quote:


> Originally Posted by *bluej511*
> 
> None that I've seen. A few people say to turn off hardware acceleration in Chrome/Firefox, which I've done before when it was happening, and it made zero difference, so I stopped bothering. It's annoying but whatever, I think some Nvidia people get it too.


I think I'll just roll back to one of the previous drivers. 16.4.2 maybe; I'm fairly sure I didn't have the problem with that driver.

Do you know a working report/support e-mail where I can report this issue? I've filled out the form at www.amd.com/report before, but I didn't get any response back.


----------



## Stige

Quote:


> Originally Posted by *Streetdragon*
> 
> hmm then i wait for vega. save some € and go for crossfire instantly... that sounds good^^


Crossfire is a waste of time for anything but benchmarks.


----------



## battleaxe

Quote:


> Originally Posted by *Stige*
> 
> Crossfire is a waste of time for anything but benchmarks.


Oh jeez. Not this again.

Fact or OPINION?

Your OPINION is all that is. Nothing more. Not a fact at all.


----------



## bluej511

Quote:


> Originally Posted by *battleaxe*
> 
> Oh jeez. Not this again.
> 
> Fact or OPINION?
> 
> Your OPINION is all that is. Nothing more. Not a fact at all.


Yup, check sig haha.

Off topic: the GTX 1080 Founders Edition here in France is 790€ from both EVGA and MSI. MSRP my ass haha.


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> OC dont matter my r9 390 does it at stock speeds haha.


Quote:


> Originally Posted by *ziggystardust*
> 
> No OC at the moment. Like I said it wasn't the case like a few drivers ago and it only happens while browsing on Firefox or Chrome.


I wasn't talking about video card OC


----------



## Stige

Quote:


> Originally Posted by *battleaxe*
> 
> Oh jeez. Not this again.
> 
> Fact or OPINION?
> 
> Your OPINION is all that is. Nothing more. Not a fact at all.


I had 2x HD 7950. The number of games it actually made better is probably close to zero. I might have gained FPS in some games, but the stuttering and frame times were just awful.
Unplayable even with higher frame rates.


----------



## rdr09

Quote:


> Originally Posted by *Stige*
> 
> I had 2x HD7950. The number of games that it made it better in is probably close to zero. I might have gained FPS in some games but the stuttering and awful frametimes was just.. awful.
> Unplayable even with higher frame rates.


You had them paired with an i5 Sandy, right? My i7 Sandy at 4.5 GHz with HT off got maxed out in BF3 MP with crossfired 7950/7970. A single 290 was fine, even in BF4 MP.


----------



## Streetdragon

Quote:


> Originally Posted by *Stige*
> 
> I had 2x HD7950. The number of games that it made it better in is probably close to zero. I might have gained FPS in some games but the stuttering and awful frametimes was just.. awful.
> Unplayable even with higher frame rates.


390+290 with a 5820K is awesome. IF the game has a crossfire profile, I get no micro stutter or anything like that. Pure power. So far an FPS boost of 80-100%. And it looks cool and it satisfies me.


----------



## ziggystardust

Quote:


> Originally Posted by *christoph*
> 
> I wasn't talking about video card OC


No overclock on the system. Like I said, it wasn't crashing until a few drivers ago.


----------



## TrueForm

I got my Gigabyte G1 gaming 390 today!

(I was planning on getting a 1080, but it's too expensive, and G-Sync monitors are the same story.)


----------



## SiD997

PowerColor Radeon R9 390, stock cooling.

Great card, and I only paid £180.


----------



## Ayyemdee

Hey guys. Might have slightly stupidly purchased an R9 390 XFX BLACK at the beginning of May. Should have waited for Polaris or the 1070, but whatever, that's not why I'm here, and so far it's been doing great in 1080p games, and that's all I really want.

So, to get to the point. I've never done any overclocking whatsoever so excuse me if the following questions seem stupid








1a. Would an overclock from 1050/1500 to a 1150/1650~ significantly increase my GPUs idle&load temps? (My R9 never goes above 30-35 when idling and I have yet to see it go above 75 under load)
1b. What load temps should I avoid having my card working at for extended periods of time? (heard 80c for 4hours is pushing it but I'm pretty new to the PC scene)
2. How would I go about getting a good stable OC on the card?(literally step by step, moron friendly instructions) (need anything else except MSI Afterburner?)
3. Any cons to having my 390 OCed?

Sorry for the loaded questions but like I said, just recently started to look into PC hardware and I still know very little.


----------



## bluej511

Quote:


> Originally Posted by *Ayyemdee*
> 
> Hey guys. Might have slightly stupidly purchased a R9 390 XFX BLACK in the beginning of May. Should have waited for Polaris or the 1070, but whatever, that's not why I'm here and so far it's been doing great in 1080p games and that's all I really want.
> 
> So, to get to the point. I've never done any overclocking whatsoever so excuse me if the following questions seem stupid
> 
> 
> 
> 
> 
> 
> 
> 
> 1a. Would an overclock from 1050/1500 to a 1150/1650~ significantly increase my GPUs idle&load temps? (My R9 never goes above 30-35 when idling and I have yet to see it go above 75 under load)
> 1b. What load temps should I avoid having my card working at for extended periods of time? (heard 80c for 4hours is pushing it but I'm pretty new to the PC scene)
> 2. How would I go about getting a good stable OC on the card?(literally step by step, moron friendly instructions) (need anything else except MSI Afterburner?)
> 3. Any cons to having my 390 OCed?
> 
> Sorry for the loaded questions but like I said, just recently started to look into PC hardware and I still know very little.


No cons to OCing a card EXCEPT you need to watch the one temp that matters quite a bit, and that's VRM temps. GPU-Z or HWiNFO will show you those. Generally, the lower the VRM temps the better for OCing.

You should be able to get 1150/1650 (if you're lucky) by adding very minimal voltage. I can get 1100/1600 without any extra voltage at all on my Nitro. Core temps are OK up to 94°C, then it starts to throttle, so generally you wanna stay under that. For stability you can use Firestrike, Heaven, Valley, or just game on it and watch for artifacts.

I'd start by just setting it to 1100/1600 without any extra voltage, run a few benches or games, and see if you get any artifacts. If not, you can try higher clocks.


----------



## Ayyemdee

Quote:


> Originally Posted by *bluej511*
> 
> No cons to OCing a card EXCEPT you need to watch the one temp that matters quite a bit, and that's VRM temps. GPU-Z or HWiNFO will show you those. Generally, the lower the VRM temps the better for OCing.
> 
> You should be able to get 1150/1650 (if you're lucky) by adding very minimal voltage. I can get 1100/1600 without any extra voltage at all on my Nitro. Core temps are OK up to 94°C, then it starts to throttle, so generally you wanna stay under that. For stability you can use Firestrike, Heaven, Valley, or just game on it and watch for artifacts.
> 
> I'd start by just setting it to 1100/1600 without any extra voltage, run a few benches or games, and see if you get any artifacts. If not, you can try higher clocks.


OCed my card to 1100/1600 (no voltage change), benched it in Valley a couple of times. So far so good: no stability issues, and core/VRM reached a max of 70/60°C respectively (any thoughts on these? I actually have no clue what 'good' VRM temps are). Will keep my 390 at this clock speed for a couple of days to see if any issues arise with crashes/artifacts/temps in a real gaming scenario. Thanks for your time, will hit you guys up again in a few days to maybe push this card a little further still.







Cheers.


----------



## tolis626

All this discussion about Crossfire and no one seems to mention how much it has improved with the 200 and 300 series compared to older generations. From my own experience (And most people testing it online seem to agree), Crossfire 290/390s are a vastly superior solution to Crossfire 7950/7970s. And I'm not talking about the obvious performance benefits, but rather the experience itself. Since Hawaii came out, Crossfire has performed as well or even better than SLI in the frame times department and stuttering is much, much less prevalent. Now, is it a superior solution to a single card if both have the performance to satisfy your needs? No, it isn't and multi card setups never have been. But would I take Crossfire 390s over a 980ti/Fury X for 1440p/4K? Absolutely.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> All this discussion about Crossfire and no one seems to mention how much it has improved with the 200 and 300 series compared to older generations. From my own experience (And most people testing it online seem to agree), Crossfire 290/390s are a vastly superior solution to Crossfire 7950/7970s. And I'm not talking about the obvious performance benefits, but rather the experience itself. Since Hawaii came out, Crossfire has performed as well or even better than SLI in the frame times department and stuttering is much, much less prevalent. Now, is it a superior solution to a single card if both have the performance to satisfy your needs? No, it isn't and multi card setups never have been. But would I take Crossfire 390s over a 980ti/Fury X for 1440p/4K? Absolutely.


If there's no need, I would never EVER get crossfire over a single card. Simple reason? Crossfire profiles at launch are always a mess. You gotta understand that crossfire makes up maybe 1% of the entire market (idk the actual figure, but it's very VERY small).

As far as I understand, the reason it's gotten better is that there's no crossfire bridge anymore; it's all done through PCIe. It's why Nvidia has, and will always have, issues till they switch.

Me personally, I would never get crossfire for that simple reason: no good profiles for most games. Good luck getting them in Windows Store games btw haha.


----------



## battleaxe

Agreed. I have been running Xfire for two generations now. I used to run SLI GTX 670s, and the 390X Xfire setup is smoother (IMO) than the SLI setup was. It runs great for me. For the games I play (BF3, BF4) it runs just fine. I get that it's not for everyone and that it's only good in some games, but a person should know that going in, IMO.

Xfire is awesome in my opinion, and it works perfectly for many of us.

Yes, there are a few things a person needs to know in order to set it up properly. If using monitoring software, set a long polling interval. I run 5000 ms on mine just to keep any possible latency delays at bay. I haven't experienced stuttering at all since doing this.
Quote:


> Originally Posted by *bluej511*
> 
> If theres no need i would never EVER get crossfire over a single card. Simple reason? Crossfire profiles on launch are always a mess. You gotta understand that crossfire maybe makes up 1% of the entire market (idk the actual figure but its very VERY small)
> 
> As far as i understand reason its gotten better is because theres no crossfire/sli bridge anymore its all done thru pcie. Its why nvidia has and will always have issues till they switch.
> 
> Me personally i would never get crossfire just for that simple reason, no good profiles for most games. Good luck getting em on windows games btw haha.


Sure, I can agree with most of this. But when these cards are as strong as the 390X and 290X were, it's hard to argue against them versus the competition, as in crossfire they're simply stronger than any single GPU (currently).


----------



## SLK

Quote:


> Originally Posted by *ziggystardust*
> 
> Recently I started getting that infamous "driver stopped responding and has recovered" driver crash occasionally while browsing in Firefox or Chrome. I don't remember exactly which driver it started happening after. As far as I'm aware it doesn't happen while playing games, since I've finished Doom and played a lot of Stellaris, Witcher 3, Forza and some other games....
> 
> I have 390X Nitro, i7 4770k, 16gb of ram and currently on Windows 10 Pro with latest updates and Crimson 16.5.2.1.
> 
> Any suggestions?


This just happened to me last night. Got a random TDR from just using the Edge browser on my Sapphire Nitro 390X. Wonder if it's the 16.5.2.1 driver, since I just reinstalled the OS and then this started happening.

Had no TDRs on 16.3.


----------



## tolis626

I don't disagree with you guys, I just wanted to point out that it's a highly personal thing whether you'll accept the trade-offs of multi-card setups for the performance increase.









On another note, anyone having problems with BF4 not launching after installing the 16.5.2.1 driver? I don't know if the driver is to blame, but until yesterday I could play BF4 just fine. Since yesterday night, though, it's nothing but trouble. It just doesn't work, it doesn't start. It loads and then crashes to desktop before it makes it into the game. Any ideas?

PS : Yup, it's the drivers. Found it out in the Battlelog forums. They screwed something up about Mantle, so now to play with it one has to delete Mantle's cache from the BF4 folder (In Documents). Bummer.


----------



## ziggystardust

Quote:


> Originally Posted by *SLK*
> 
> This just happened to me last night. Got a random TDR from just using the Edge browser on my Sapphire Nitro 390X. Wonder if it's the 16.5.2.1 driver, since I just reinstalled the OS and then this started happening.
> 
> Had no TDRs on 16.3.

Looks like there's a problem with 16.5.2 and 16.5.2.1 drivers.

Do you also have flickering screen bug while browsing?

People with 290/390s are reporting occasional flickering on screen while browsing, or sometimes on the desktop (not in games), with the 16.5.2 and 16.5.2.1 drivers. I haven't had the flickering yet, but apparently the two issues are somewhat related, and hopefully there's a fix ready for the next drivers.

Do you have power efficiency on or off by the way?

I felt like it's happening more often when it's off (still rarely though).

Like you said, I never had any TDRs before 16.5.2/16.5.2.1.


----------



## bluej511

Quote:


> Originally Posted by *ziggystardust*
> 
> Looks like there's a problem with 16.5.2 and 16.5.2.1 drivers.
> 
> Do you also have flickering screen bug while browsing?
> 
> People with 290/390 are reporting they are having occasional flickering screen while browsing or sometimes on desktop (not in games) on 16.5.2 and 16.5.2.1. I didn't have those flickerings yet though but apparently these two issues are somewhat related and hopefully there's a fix ready for the next drivers.
> 
> Do you have power efficiency on or off by the way?
> 
> I felt like it's happening more often when it's off (still rarely though).
> 
> Like you said, I never had any TDRs before 16.5.2/16.5.2.1.


That flickering thing caught my attention. I thought it was my new screen, or my PSU putting out crap power with some oscillation.

I did notice this once in a while on my old screen with my old driver, but I've noticed it a couple of times in the past couple of days. Didn't think it was from the driver, cuz it happened on my old screen with older drivers. It's happening on this one at the desktop or while browsing; haven't seen it in gaming yet. Maybe I'll revert back to 16.5.1 or 16.3.1. This is why I keep the installers on my spare drive haha, I even have Catalyst 15.7.1.


----------



## SLK

Quote:


> Originally Posted by *ziggystardust*
> 
> Looks like there's a problem with 16.5.2 and 16.5.2.1 drivers.
> 
> Do you also have flickering screen bug while browsing?
> 
> People with 290/390 are reporting that they are having occasional flickerings on the screen while browsing or sometimes on desktop (not in games) with 16.5.2 and 16.5.2.1 drivers. I didn't have those flickerings yet though but apparently these two issues are somewhat related and hopefully there's a fix ready for the next drivers.
> 
> Do you have power efficiency on or off by the way?
> 
> I felt like it's happening more often when it's off (still rarely though).
> 
> Like you said, I never had any TDRs before 16.5.2/16.5.2.1.


No flickering, and power efficiency was on at the time. I'd actually forgotten about that feature. Just some random TDRs while on the desktop. Maybe because I use FreeSync there's some kind of buffering, so I don't see the flicker.


----------



## bluej511

Quote:


> Originally Posted by *SLK*
> 
> No Flickering and power efficiency was on at the time. I actually forgot about that feature. Just some random TDRs while on the desktop. Maybe because I use freesync there is some kind of buffer that I don't see the flicker.


I got freesync and still saw it. I went back to 16.5.1 so we'll see how it goes haha.


----------



## spyshagg

One should understand that if your go-to games support crossfire, then crossfire becomes a good option.

Out of the games I play regularly (for years now), only two don't support crossfire. The ones that support are:

BF4
BLOPS3
Assetto Corsa
Dirt
Pcars

It does not matter if the last 4 or 5 games released this year don't support crossfire. I won't play all of them, and the ones I do play I'll only spend 8-10 hours on.

The games I mentioned above, I have thousands of hours in. All have great crossfire scaling, and all need as much horsepower as you can throw at them (BF4 + BLOPS3 at 144 FPS; AC + DiRT + Pcars on triple screens). Crossfire is more than justified.


----------



## SLK

Quote:


> Originally Posted by *bluej511*
> 
> I got freesync and still saw it. I went back to 16.5.1 so we'll see how it goes haha.


I'm just glad it's not the card. 16.3 works just fine so far.


----------



## battleaxe

Quote:


> Originally Posted by *spyshagg*
> 
> One should understand that if your go-to games support crossfire, then crossfire becomes a good option.
> 
> Out of the games I play regularly (for years now), only two don't support crossfire. The ones that support are:
> 
> BF4
> BLOPS3
> Assetto Corsa
> Dirt
> Pcars
> 
> It does not matter if the last 4 or 5 games released this year dont support crossfire. I won't play all of them, and the ones I play I'll only spend 8-10 hours on it..
> 
> On the games I mentioned above I have 1000's hours. All have great crossfire scaling and all need as much horsepower as you can throw at them (BF4 + Blops = 144fps AC + DIRT + Pcard = triples screens). Crossfire is more than justified.


This ^^


----------



## ziggystardust

Quote:


> Originally Posted by *bluej511*
> 
> I got freesync and still saw it. I went back to 16.5.1 so we'll see how it goes haha.


16.5.3 is out, but people are still reporting the flickering issue. I didn't have the flickering myself, but it probably won't fix these occasional TDRs either. Still worth a try though.


----------



## bluej511

Quote:


> Originally Posted by *ziggystardust*
> 
> 16.5.3 is out but people still reporting that the flickering issue is still there. I didn't have those flickerings but probably it won't fix these occasional TDRs as well. Still worth to try though.


Seems fine on 16.5.1; I haven't noticed it since going back. Seems like 16.5.2-16.5.3 have it, especially given that people are reporting it.


----------



## ziggystardust

Quote:


> Originally Posted by *bluej511*
> 
> Seems like 16.5.1 i havent noticed it since going back. Seems like 16.5.2-16.5.3 have it especially if people are reporting it.


Yeah, it's weird that there's still no acknowledgement from AMD. I also reported the issue to AMD support last week, but no response so far.


----------



## tolis626

Stupid question but what are TDRs? Also, 16.5.2.1 is all problems. BF4's making my PC unstable as all hell.


----------



## ziggystardust

Quote:


> Originally Posted by *tolis626*
> 
> Stupid question but what are TDRs? Also, 16.5.2.1 is all problems. BF4's making my PC unstable as all hell.


Timeout Detection and Recovery. Basically that infamous message you get when your GPU driver stops responding and has successfully recovered.

I'm on 16.5.3 right now. So far no flickering or TDRs, but it needs more time. There are already some reports on reddit.
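For what it's worth, a common stopgap while waiting for a fixed driver is raising Windows' TDR timeout in the registry. A minimal sketch: the key path and the `TdrDelay` value name are Microsoft's documented ones, but the 8-second figure and the `build_tdr_command` helper are just illustration, not an AMD recommendation.

```python
# Sketch of the usual TDR-timeout workaround. The registry key and the
# TdrDelay (REG_DWORD, seconds) value are documented by Microsoft; the
# 8-second delay below is an arbitrary example value.

TDR_KEY = r"HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

def build_tdr_command(delay_seconds=8):
    """Return the reg.exe command line that sets TdrDelay."""
    return (f'reg add "{TDR_KEY}" /v TdrDelay /t REG_DWORD '
            f'/d {delay_seconds} /f')

print(build_tdr_command())
```

Keep in mind this only gives the driver longer to recover before Windows resets it; if the driver is genuinely crashing, the hang just lasts longer. Run the printed command from an elevated prompt and reboot.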


----------



## Ayyemdee

Wrote a post about OCing my R9 390 XFX BLACK a couple of pages ago. I ran an OC from 1050/1500 (stock) to 1100/1600. It ran well for an hour or two (Witcher 3), but then my GPU temp spiked from a stable 68-70°C to 75-77°C and artifacting began (discoloured squares, just FYI). I then lowered the memory clock to stock but left the core clock at 1100. The same thing happened, minus the temp spike (maybe that was due to ambient heat in the room or something). Feeling kinda bad, since I've heard multiple reports of people pushing 1150/1650 without any voltage changes; maybe the silicon lottery wasn't kind to me (ASIC score of 66.6%, not even sure what that means but some people requested it). So, my question is: how would I go about achieving a stable 1150/1650 OC? What kind of changes to the voltage should I make? (Please be as specific as you can; I've never dabbled in OCing before.)


----------



## bluej511

Quote:


> Originally Posted by *Ayyemdee*
> 
> Wrote a post about OCing my R9 390 XFX BLACK a couple of pages ago. I ran an OC from 1050/1500 (stock) to 1100/1600. It ran well for an hour or two (Witcher 3) but then my GPU temp spiked from a stable 68-70c~ to 75-77c and artifacting begun (discoloured squares just fyi). I then lowered the memory clock to stock but left the core clock at 1100. Same thing happened minus the temp spike (maybe that was due to ambient heat in the room or smthing). Feeling kinda bad since I heard there are multiple reports of ppl pushing 1150/1650 without any voltage changes, maybe the silicon lottery wasn't kind to me (ASIC score of 66.6%, not even sure what that means but some people requested it). So, my question is. How would I got about achieving a stable 1150/1650 OC? What kind of changes to the voltage should I make (please be as specific as you can be, I never dabbled in OCing before)


I haven't seen anyone do 1150/1650 without any voltage. I do 1200/1650 stable, and that's WITH +100mV, so idk where you got that idea haha. Without extra voltage I can do 1100/1600 stable.


----------



## TainePC

Could I get added to the members list please? Pretty sure I added my GPU-Z above.
Thanks


----------



## Ayyemdee

Quote:


> Originally Posted by *bluej511*
> 
> I havent seen anyone do 1150/1650 without any voltage. I do 1200/1650 stable and thats WITH 100mv so idk where you got that idea haha. Without voltage i can do 1100/1600 and stable.


What kind of temps are you getting running 1200/1650?


----------



## bluej511

Quote:


> Originally Posted by *Ayyemdee*
> 
> What kind of temps are you getting running 1200/1650?


I'm under water, so it might have raised my core a couple of degrees, that's about it; VRMs went up maybe 8-10°C at that load. I haven't run it like that in a while, so I'm not sure I remember. Right now my VRM1/VRM2 are at about 50/60°C respectively.


----------



## bluej511

Well, up to date. Going back to 16.5.1 has fixed the screen flicker issue I was having with 16.5.2. Wondering now if 16.5.3 has that issue; it's optimized for games I don't play, so no reason to try lol.


----------



## mus1mus

Quote:


> Originally Posted by *Ayyemdee*
> 
> Wrote a post about OCing my R9 390 XFX BLACK a couple of pages ago. I ran an OC from 1050/1500 (stock) to 1100/1600. It ran well for an hour or two (Witcher 3) but then my GPU temp spiked from a stable 68-70c~ to 75-77c and artifacting begun (discoloured squares just fyi). I then lowered the memory clock to stock but left the core clock at 1100. Same thing happened minus the temp spike (maybe that was due to ambient heat in the room or smthing). Feeling kinda bad since I heard there are multiple reports of ppl pushing 1150/1650 without any voltage changes, maybe the silicon lottery wasn't kind to me (ASIC score of 66.6%, not even sure what that means but some people requested it). So, my question is. How would I got about achieving a stable 1150/1650 OC? What kind of changes to the voltage should I make (please be as specific as you can be, I never dabbled in OCing before)


GPU temps will rise even without adding voltage, so be ready to ramp up your fan speed along with the overclock.

If you wanna do it systematically, take it slow and check whether the gains are worth the extra heat and, eventually, noise.

1. Grab good software like MSI Afterburner.
2. Increase the fan speed up to a point you can bear.
3. You might also want to sort out your case airflow before delving into this foray.
4. As usual, basic settings:


5. Add 25 MHz on the core and test with a game or the Heaven benchmark. Keep adding 25 MHz after each successful, artifact-free run until artifacts appear.
6. Once artifacts appear, try adding a small amount of voltage (VDDC) and see if that fixes them. Add 25 mV at a time and watch your temps.
7. Once you hit your personal thermal limit (pick a ceiling you're willing to live with) and the core clock is dialed in, play your games at that clock to verify you've got things dialed in properly.
8. If everything goes right, proceed with the memory OC. This time you don't need to touch anything other than the memory clock. Try 1625 off the bat and test. If it makes the card unstable, drop to 1600, and so on.

9. Optional - once you have the core, memory and temps in check, you might add another 25 mV of VDDC if temps allow, or drop back 25 MHz to the last clock you attained for real stability. Voltage can be dropped as well; note that 25 MHz sometimes needs more than 25 mV to stabilize.








I find Overclocking fun. If you don't have that itch, run stock.
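The stepping procedure above can be sketched as a loop. Everything here is illustration: `is_artifact_free` stands in for an actual Heaven/game run (a fake tester below), the 25 MHz / 25 mV steps mirror the post, and the +100 mV ceiling is an arbitrary choice for the sketch.

```python
CORE_STEP_MHZ = 25
VOLT_STEP_MV = 25
MAX_EXTRA_MV = 100   # arbitrary ceiling for this sketch

def find_stable_core(start_mhz, is_artifact_free):
    """Walk the core clock up 25 MHz at a time; when artifacts appear,
    try adding 25 mV of VDDC (up to MAX_EXTRA_MV) before settling on
    the last clock that passed."""
    core, extra_mv = start_mhz, 0
    while True:
        if is_artifact_free(core + CORE_STEP_MHZ, extra_mv):
            core += CORE_STEP_MHZ
        elif extra_mv < MAX_EXTRA_MV and is_artifact_free(
                core + CORE_STEP_MHZ, extra_mv + VOLT_STEP_MV):
            extra_mv += VOLT_STEP_MV
            core += CORE_STEP_MHZ
        else:
            return core, extra_mv

# Fake tester: pretend the card needs +25mV past 1100 and dies past 1150.
def fake_run(core, mv):
    return core <= 1100 or (core <= 1150 and mv >= 25)

print(find_stable_core(1050, fake_run))  # -> (1150, 25)
```

In practice each `is_artifact_free` "call" is a bench run or a gaming session, so the whole search takes an evening, not milliseconds.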


----------



## tolis626

Hey guys, anyone using HWiNFO64 that has updated to the latest beta? It added a GPU memory errors monitor. Now, I have no idea how accurate it is, but if it is, there's probably something wrong with my PC. There are errors in the thousands. Like, seriously, thousands. And that's on otherwise stable memory overclocks (my sack-of-crap memory does 1650MHz at +50mV AUX or something. Totally horrible). What the hell is up with that? Or is ECC supposed to correct those?

Other than that, 16.5.3 also had problems with TDRs. I never had any flickering with either 16.5.2.1 or 16.5.3, so I can't comment on that. For now, 16.5.1 it is.

PS: Nope, spoke too soon. Even 1650MHz generated just over 10,000 errors. What. The. Hell.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, anyone using HWiNFO64 that has updated to the latest beta? It added a GPU memory errors monitor. Now, I have no idea how accurate it is, but if it is accurate, there's something wrong with my PC probably. There are errors in the thousands. Like, seriously thousands. And that's on otherwise stable memory overclocks (my sack-of-crap memory does 1650MHz at +50mV AUX or something. Totally horrible). What the hell is up with that? Or is ECC supposed to correct those?
> 
> Other than that, 16.5.3 also had problems with TDRs. I never had any flickering with either 16.5.2.1 or 16.5.3, so I can't comment on that. For now, 16.5.1 it is.
> 
> PS : Nope, spoke too soon. Even 1650MHz generated just over 10.000 errors. What. The. Hell.


Idk how well I'd trust that haha. It's the same thing as when you run a RAM test or a SMART test on HDDs and end up with thousands, sometimes millions, of errors, but it's meaningless.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Idk how well id trust that haha. Its the same thing when running RAM test or SMART test on HDDs when you end up with thousands sometimes millions of errors but its meaningless.


Is it, though? That's what I'm concerned about. So far, strangely, the least amount of errors is with high core voltages and clocks (in excess of +100mV and 1190MHz) and with the memory at 1625MHz with no additional AUX voltage. Go figure.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Is it, though? That's what I'm concerned about. So far, strangely, the least amount of errors is with high core votlages and clocks (In excess of +100mV and 1190MHz) and with the memory at 1625MHz with no additional AUX voltage. Go figure.


I believe so cuz even when i installed a brand new hard drive the SMART data under diskdefrag told me there was errors lol. Idk how reliable these are for GPU/CPU/RAM errors and all that. If it works just fine id say theres no worries.

Mine also just reaches 1650mhz guessing you have elpida memory as well.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, anyone using HWiNFO64 that has updated to the latest beta? It added a GPU memory errors monitor. Now, I have no idea how accurate it is, but if it is accurate, there's something wrong with my PC probably. There are errors in the thousands. Like, seriously thousands. And that's on otherwise stable memory overclocks (my sack-of-crap memory does 1650MHz at +50mV AUX or something. Totally horrible). What the hell is up with that? Or is ECC supposed to correct those?
> 
> Other than that, 16.5.3 also had problems with TDRs. I never had any flickering with either 16.5.2.1 or 16.5.3, so I can't comment on that. For now, 16.5.1 it is.
> 
> PS : Nope, spoke too soon. Even 1650MHz generated just over 10.000 errors. What. The. Hell.


I think the error count you get is from before the memory controller makes the correction. You can ask the HWiNFO author. I'd rather have zero errors than errors being corrected by the memory controller, because that just wastes energy without getting full performance.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> I believe so cuz even when i installed a brand new hard drive the SMART data under diskdefrag told me there was errors lol. Idk how reliable these are for GPU/CPU/RAM errors and all that. If it works just fine id say theres no worries.
> 
> Mine also just reaches 1650mhz guessing you have elpida memory as well.


Mine never shows errors that aren't there. SMART etc have never showed me errors. Only my RAM did at some point but that was with half-stable overclocks, so no point to those. This is the first time I see errors from the GPU. And strange thing is I have Hynix memory, so in theory I should have been getting better clocks. But no, 1650MHz max for me. God damn it. If I hadn't taken the card apart like 10 times and I hadn't replaced the TIM and every thermal pad it had, I'd try to RMA it. But it works and I've tampered with it, so that's a no go.
Quote:


> Originally Posted by *kizwan*
> 
> I think the error count you get is before the memory controller make the correction. You can ask HWiNFO author. I rather have zero error than error that being corrected by memory controller because it just wasting energy without getting full performance.


Well, I too prefer not getting errors, but this just seems impossible at this point. Maybe you could try it on your PC too? I'm kind of lost here.


----------



## spyshagg

ECC corrects it and in the process it wastes cycles (performance). Before this tool, we were blind. Now we can see and adjust the mem clock to its most efficient setting.
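As I understand it (my own gloss, not something from the thread), GDDR5's EDC doesn't correct bits the way DRAM ECC does: it CRC-checks transfers and the controller re-sends the ones that fail, which is where the wasted cycles come from. A back-of-envelope sketch, where every figure is an illustrative assumption rather than a measurement:

```python
# Back-of-envelope: GDDR5 EDC detects bad transfers with a CRC and the
# controller re-sends them, so each logged "error" costs a retransmission
# rather than corrupt data.  All figures here are illustrative assumptions.

def effective_bandwidth(peak_gb_s, errors_per_sec, burst_bytes=32, retry_cost=2.0):
    """Peak bandwidth minus the traffic spent re-sending failed bursts.

    burst_bytes: assumed size of one GDDR5 burst
    retry_cost:  assumed bytes re-sent per error, as a multiple of burst_bytes
    """
    wasted_gb_s = errors_per_sec * burst_bytes * retry_cost / 1e9
    return peak_gb_s - wasted_gb_s

# A 384 GB/s-class card logging ~200 errors/s (roughly the rates in-thread):
print(effective_bandwidth(384.0, 200))  # barely below 384.0
```

If the model is anywhere near right, the raw throughput lost to re-sends is tiny even at the error rates reported in this thread; the performance drop people see from marginal memory clocks is more about the stalls from retrying than about bandwidth itself.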


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I think the error count you get is before the memory controller make the correction. You can ask HWiNFO author. I rather have zero error than error that being corrected by memory controller because it just wasting energy without getting full performance.
> 
> 
> 
> Well, I too prefer not getting errors, but this just seems impossible at this point. Maybe you could try it on your PC too? I'm kind of lost here.

I already tried the beta version of the beta version. lol There's an earlier version posted by the author in the Hawaii BIOS mod thread. I got 1754 errors on my Hynix card at 1500MHz with +100mV core & +50mV AUX. I did get flickering & artifacting when testing at this frequency. My Hynix card is going downhill after frequent abuse over a couple of years.

Zero error on the Elpida card at the same frequency but need re-testing later.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> I already tried the beta version of the beta version. lol There's earlier version posted by the author in the Hawaii BIOS mod thread. I got 1754 errors on my Hynix card at 1500MHz with +100mV core & +50mV AUX. I did get flickering & artifacting when testing at this frequency. My Hynix card is going downhill after frequent abuse for a couple of years.
> 
> Zero error on the Elpida card at the same frequency but need re-testing later.


After a 20-25 minute game of Battlefield 4 at 1185/1675MHz at +100mV with no AUX added I got about 226.000 errors. I'm telling you, I got problems.


----------



## Ayyemdee

Quote:


> Originally Posted by *mus1mus*
> 
> GPU temps will rise even without adding Voltages so be aware to ramp up your fan speed along with overclocking the card.
> 
> If you wanna do it systematically, do it slow, observe your gains if they're worth the extra heat and eventually--noise.
> 
> 1. Grab a good software like MSI Afterburner.
> 2. Increase the fan speed up to a point that you can bear.
> 3. You might also consider your case air flow before delving into this foray.
> 4. As usual, basic settings:
> 
> 
> 5. Add 25 MHz on the core and test with a game or Heaven Benchmark. Do this till you get artifacts, adding 25 MHz after each successful, artifact-free run.
> 6. Once you get to a point where the artifacts appear, Try to add a little amount of Voltage (VDDC) and see if that fixes the artifacts issue. Add 25mV at a time and observe your temps.
> 7. Once you hit your personal thermal limit ( pick a ceiling you are willing to play with ), and the Core clocks dialed in, play your games on that clock just to verify you got things dialed in properly.
> 8. If everything goes right, Proceed with the Memory OC. This time, you don't need to touch anything other than Memory Clock. Try 1625 off the bat and test. If it causes the card to become unstable, drop it to 1600. and so on..
> 
> 9. Optional - once you get the Core, Memory and temps in check, you might consider adding another 25mV for the VDDC if temps allow or go back to the previous (-25 MHz) clock you have attained for real stability. Voltage can be dropped as well as 25 Mhz sometimes require more than 25mV to be stabilized.
> 
> 
> 
> 
> 
> 
> 
> 
> I find Overclocking fun. If you don't have that itch, run stock.


Thank you so much







Exactly what I needed. Following your method I got to 1125/1600, which is still pretty mild as R9 390 OCs go (afaik), but the temperatures are starting to get into the 79-81C range on the core (VRAM temps sitting at ~70C), which is probably the limit I'm physically comfortable with. My attic room barely dissipates any heat and summer's on its way, so the few extra FPS are probably not worth sweating all over my chair and desk. Once again, thanks ^^
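For what it's worth, the quoted procedure is essentially a bounded hill-climb and can be modelled as one. This is only a sketch of the logic above, not a tuning tool: `is_stable()` stands in for an actual Heaven/game run, and every name and number here is hypothetical.

```python
# Sketch of the quoted method as a loop: raise the core 25 MHz at a time,
# and when a step artifacts, spend voltage budget on it before giving up.
# is_stable() stands in for a manual Heaven/game run; all names hypothetical.

CORE_STEP_MHZ = 25
VOLT_STEP_MV = 25

def tune_core(base_mhz, is_stable, max_extra_mv=100, thermal_ok=lambda: True):
    mhz, mv = base_mhz, 0
    while thermal_ok():
        trial = mhz + CORE_STEP_MHZ
        # bump voltage until the new clock passes, or give up on this step
        for extra in range(mv, max_extra_mv + 1, VOLT_STEP_MV):
            if is_stable(trial, extra):
                mhz, mv = trial, extra
                break
        else:
            break  # no voltage within budget stabilised this step
    return mhz, mv

# Toy model of a card that does 1100 MHz stock and ~2 MHz per extra mV:
limit = lambda clock, extra_mv: clock <= 1100 + 2 * extra_mv
print(tune_core(1040, limit))  # -> (1290, 100)
```

Step 9 of the quote corresponds to the inner loop: a 25 MHz step sometimes needs more than one 25 mV bump before it passes, and when nothing within your budget fixes it, you fall back to the last good clock.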


----------



## tolis626

So I tried playing with 1675MHz and no AUX voltage and it plays... Fine. There's even a small performance boost over 1625MHz (that's at 1440p). And temps are more under control too. Errors do end up in the 200.000 range, but as long as I don't lose performance I'm ok, I think. I also tried 1700MHz but it started flickering. Which is strange, because I've tried 1700MHz with up to +75mV and it always, sooner or later, ends up in a black screen, usually with no prior flickering. Maybe my card doesn't respond well to AUX voltage? Or maybe this stability is a fluke? Because with BF4 in particular, even 1675MHz at +50mV AUX crashes after a while.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> So I tried playing with 1675MHz and no AUX voltage and it plays... Fine. There's even a small performance boost over 1625MHz (that's at 1440p). And temps are more in control too. Errors do end up in the 200.000 range, but as long as I don't lose performance I'm ok, I think. I also tried 1700MHz but it started flickering. Which is strange, because I've tried 1700MHz with up to +75mV and it always, sooner or later, ends up in a black screen and usually with no prior flickering. Maybe my card doesn't respond well to AUX voltage? Or maybe is this stability a fluke? Because with BF4 in particular, even 1675MHz at +50mV AUX crashes after a while.


Honestly it depends on the game and benchmark. I can be stable in Heaven but Firestrike will artifact. That's what happened when I tried 1225/1700: stable in Heaven but not in Firestrike. Settled on 1200/1650.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> So I tried playing with 1675MHz and no AUX voltage and it plays... Fine. There's even a small performance boost over 1625MHz (that's at 1440p). And temps are more in control too. Errors do end up in the 200.000 range, but as long as I don't lose performance I'm ok, I think. I also tried 1700MHz but it started flickering. Which is strange, because I've tried 1700MHz with up to +75mV and it always, sooner or later, ends up in a black screen and usually with no prior flickering. Maybe my card doesn't respond well to AUX voltage? Or maybe is this stability a fluke? Because with BF4 in particular, even 1675MHz at +50mV AUX crashes after a while.


I remember reading that the memory PHY / controller is the hottest part, so increasing AUX will only make it run hotter, which means higher-than-stock AUX voltage isn't always going to help stability. I would run up to +50mV AUX max. BF4 is pretty hard on both system memory & GPU memory. I've had crashes with unstable system memory due to too-high VCCSA voltage; reducing the voltage fixed it. If your 1675MHz with +50mV crashed, I would settle for a lower clock with +0mV AUX. Even with a low memory overclock (*≥ 1375MHz*), my cards need +50mV AUX.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> I remember reading that the memory PHY / controller is the hottest part. So increasing AUX will only make it run hotter which means higher AUX voltage than stock not always going to help stability. I would run up to +50mV AUX max. BF4 pretty hard on both system memory & GPU memory. I've had crash with unstable system memory due to too high VCCSA voltage. Reducing the voltage fixed it. If your 1675MHz with +50mV crashed, I would settle lower MHz with +0mV AUX. My cards even with low memory overclock (*≥ 1375MHz*), it needs +50mV AUX.


You didn't understand (And that tends to happen when I'm sleepy and write things like a drunk, but I digress







). I mean that 1675MHz crashes after a few minutes. Quite a lot of minutes, but it does. I just completed like 2-3 hours of 64-player Conquest on Siege of Shanghai (so about as harsh as BF4 gets) with my mem at 1675MHz and no AUX voltage and it ran fine. It did gather quite a lot of errors (>2.000.000, yes 2 million), but over the span of 3 hours I'd say it's acceptable. At first I was playing at 1185/1675MHz at +100mV core but it hit 80C on the core and 83C on the VRM, so I backed down to 1175/1675MHz at +75mV and it topped out at 76C (both core and VRM). Pretty good actually and it ran perfectly.
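One thing that makes these reports hard to compare is session length; normalising to errors per second puts the ~226.000-error BF4 round from earlier and this ~2-million-error session on the same scale. A trivial helper, with durations being my rough reading of the quoted posts:

```python
# Normalise HWiNFO error totals to errors/second so sessions of different
# lengths can be compared.  Durations are rough figures from the thread.

def errors_per_second(total_errors, minutes):
    return total_errors / (minutes * 60.0)

print(round(errors_per_second(226_000, 22), 1))     # ~226k errors, ~22 min round
print(round(errors_per_second(2_000_000, 180), 1))  # ~2M errors over ~3 hours
```

Both runs land in the same ~170-185 errors/s band, which at least suggests a steady error rate at 1675MHz rather than the card degrading mid-session.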

I surely have more testing to do, but it seems promising. The ultimate test will, as always, be Witcher 3. But that's for tomorrow. Now I just need a good night's sleep.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I remember reading that the memory PHY / controller is the hottest part. So increasing AUX will only make it run hotter which means higher AUX voltage than stock not always going to help stability. I would run up to +50mV AUX max. BF4 pretty hard on both system memory & GPU memory. I've had crash with unstable system memory due to too high VCCSA voltage. Reducing the voltage fixed it. If your 1675MHz with +50mV crashed, I would settle lower MHz with +0mV AUX. My cards even with low memory overclock (*≥ 1375MHz*), it needs +50mV AUX.
> 
> 
> 
> You didn't understand (And that tends to happen when I'm sleepy and write things like a drunk, but I digress
> 
> 
> 
> 
> 
> 
> 
> ). I mean that 1675MHz crashes after a few minutes. Quite a lot of minutes, but it does. I just completed like 2-3 hours of 64 player conquest on Siege of Sanghai (So about as harsh as BF4 gets) with my mem at 1675MHz and no AUX voltage and it ran fine. It did gather quite a lot of errors (>2.000.000, yes 2 million), but over the span of 3 hours I'd say it's acceptable. At first I was playing at 1185/1675MHz at +100mV core but it hit 80C on the core and 83C on the VRM, so I backed down to 1175/1675MHz at +75mV and it topped out at 76C (both core and VRM). Pretty good actually and it ran perfectly.
> 
> I surely have more testing to do, but it seems promising. The ultimate test will, as always, be Witcher 3. But that's for tomorrow. Now I just need a good night's sleep.

Which part I don't understand?







Whether it crashed after a while or after a few minutes, it is still not stable. It's stable if it never crashes. Well, if most of the time it's fine then it's fine.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> Which part I don't understand?
> 
> 
> 
> 
> 
> 
> 
> Either it crashed after a while or crashed after few minutes, it is still not stable. It is stable if it doesn't crash ever. Well, if most of the time it's fine then it's fine.


I meant that it *usually* crashes after a bit. But I *usually* run it with +50mV AUX. First time (I think) that I'm trying it with no AUX voltage added and it worked fine, no crashes. Not after a bit like before, not after 3 hours either. I mean, I haven't been able to play BF4 64 player conquest for 3 hours with anything higher than 1650MHz and that was with +50mV AUX too. So there's something here. Maybe, like VCCSA, sometimes more isn't better. But for now, it seems 1675MHz with no added AUX voltage is stable. Of course, more testing is needed to confirm, but still... Strange. Oh well.

On the plus side, temps are a bit better, so I'm able to push higher overclocks more "safely". 76C with a 20-25C ambient isn't shabby for a +75mV overvolt. And 1175MHz is a nice overclock too. I guess 80C isn't too much either, so I may be able to push it a bit more, but that'll come after I set up my air conditioning.


----------



## DarthBaggins

Any room for an AMD R9 390X? (Won it in the evening drawing at the PC Expo for MicroCenter vendors)


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> Any room for a AMD R9 390x (won it in the evening drawing at the PC Expo for MicroCenter vendors)


Time to mod it and put a different cooler on it haha.


----------



## Streetdragon

Interesting with the errors. So far I don't have any errors, but I don't try to push it that far xD. And a second 390 would be nice. Would watercool it


----------



## bluej511

You guys are making me wanna try this now to see what i get haha.


----------



## DarthBaggins

Quote:


> Originally Posted by *bluej511*
> 
> Time to mod it and put a different cooler on it haha.


Yup, after winning it I immediately started looking for water blocks. Also have to send my 960 off for RMA, so I'll be cramming this beast into my Elite 130; it's going from console killer to destroyer. Glad I finally get an opportunity to play with one of these cards. The last red team cards I had were the 270X & 7870, which served me well til the upgrade to the 970 that is being upgraded to a 1080/1070


----------



## jendle

Hi guys, I'm planning to do a full thermal paste + thermal pad replacement on my R9 390 (MSI Cooler) to increase the card's overclocking ability and reduce noise when idle.
I'm just posting to make sure I'm buying the correct items:

99% Isopropyl Alcohol
Gelid GC-Extreme (GC-3) thermal compound (3.5g)
Fujipoly 1.5mm 11W/mK thermal pad (one strip of 100mm by 15mm)
Can anyone tell me if this is enough paste, and whether the strip will be long and thick enough to replace the stock thermal pads? Also, is 99% isopropyl fine?

Thanks









PS: Do any of you know the cheapest place to get these items in the UK? The pads are a bit pricey on ebay.


----------



## tolis626

So I tried running my card at 1185/1700MHz with no additional AUX voltage and in BF4's test range out of all things, it flickered sometimes. It wouldn't crash though, so I fired up Witcher 3 and, sure enough, it crashed after a few minutes of gameplay. Temps, however, were about 80C. After a reboot, I tried 1175/1700MHz with no AUX voltage again and I managed to play for a bit under 3 hours no problem. Temps topped at 74C for both the core and VRM and it was rock stable. No artifacts, no flickering, no crashing. And not TOO many errors. Which is strange, I suppose, but I'll see where this goes. 1725MHz flickered so I backed down. Will play some BF4 again and see how it goes.

If more people try the HWiNFO64 beta with the error reporting and post their results here, it'd offer a nice set of information about what's going on with these cards.







Quote:


> Originally Posted by *jendle*
> 
> Hi guys, I'm planning to do a full thermal paste + thermal pad replacement on my R9 390 (MSI Cooler) to increase the card's overclocking ability and reduce noise when idle.
> I'm just posting to make sure I'm buying the correct items:
> 
> 99% Isopropyl Alcohol
> Gelid GC-Extreme (GC-3) thermal compound (3.5g)
> Fujipoly 1.5mm 11W/mK thermal pad (one strip of 100mm by 15mm)
> Can anyone tell me is this enough paste, and will the strip be long and thick enough to replace the stock thermal pads? Also, is 99% isopropyl fine?
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS: Do any of you know the cheapest place to get these items in the UK? The pads are a bit pricey on ebay.


Take it from a guy that did this exact thing on the same card (Well, technically I have the 390x, but it's the same PCB), don't bother changing the thermal pads. 11W/mK is what I installed and they perform exactly the same as the stock pads. Unless you wanna go full baller and install some 17W/mK ones, but those are too expensive IMO and wouldn't really show their worth with an air cooler. Also don't bother removing the green strip of pad that covers the chokes, there's no point.

Other than that, 99% isopropyl alcohol is probably the best way to clean the old TIM off the card, but even plain pure ethanol will do fine. Also, Gelid's GC Extreme is consistently among the top performing TIMs on the market, so it's fine. Maybe if you can get your hands on some Thermal Grizzly Kryonaut it'll be better, but GC Extreme works on my card, so it'll work for yours too.


----------



## jendle

Quote:


> Originally Posted by *tolis626*
> 
> So I tried running my card at 1185/1700MHz with no additional AUX voltage and in BF4's test range out of all things, it flickered sometimes. It wouldn't crash though, so I fired up Witcher 3 and, sure enough, it crashed after a few minutes of gameplay. Temps, however, were about 80C. After a reboot, I tried 1175/1700MHz with no AUX voltage again and I managed to play for a bit under 3 hours no problem. Temps topped at 74C for both the core and VRM and it was rock stable. No artifacts, no flickering, no crashing. And not TOO many errors. Which is strange, I suppose, but I'll see where this goes. 1725MHz flickered so I backed down. Will play some BF4 again and see how it goes.
> 
> If more people try the HWiNFO64 beta with the error reporting and post their results here, it'd offer a nice set of information about what's going on with these cards.
> 
> 
> 
> 
> 
> 
> 
> 
> Take it from a guy that did this exact thing on the same card (Well, technically I have the 390x, but it's the same PCB), don't bother changing the thermal pads. 11W/mK is what I installed and they perform exactly the same as the stock pads. Unless you wanna go full baller and install some 17W/mK ones, but those are too expensive IMO and wouldn't really show their worth with an air cooler. Also don't bother removing the green strip of pad that covers the chokes, there's no point.
> 
> Other than that, 99% isopropyl alcohol is probably the best way to clean the old TIM off the card, but even simple pure ethanol will do fine. Also, Gelid's GC Extreme is consistently among the top perfrorming TIMs on the market, so it's fine. Maybe if you can get your hands on some Thermal Grizzly Kryonaut it'll be better, but GC Extreme works on my card, so it'll work for yours too.


Thank you so much for the advice! I was worried that if I open up the GPU and close it back up, the thermal pads wouldn't be as effective. Since I won't be spending as much on the pads, I think I'll just get the Kryonaut (can it be applied on the CPU cooler as well, by the way? Also, is 5.5g enough for CPU and GPU?). Anything else you'd recommend to decrease temps?

Also, do I need to heat the TIM with a hair dryer before application?

One extra thing, wouldn't it be even better to apply thermal paste onto the VRAM chips?


----------



## tolis626

Quote:


> Originally Posted by *jendle*
> 
> Thank you so much for the advice! I was worried that if i open up the GPU and close it the thermal pads aren't gonna be as effective. Since I won't be spending as much on the pads I think I'll just get the Kryonaut (can it be applied on the CPU cooler as well, by the way? Also, is 5.5g enough for CPU and GPU?). Anything else you'd recommend to decrease temps?
> 
> Also, do I need to heat the TIM with a hair dryer before application?
> 
> One extra thing, wouldn't it be even better to apply thermal paste onto the VRAM chips?


If you can splurge for a tube of Kryonaut, I say do it. As for the quantity, it's more than enough. To give you some perspective, a 4g tube of Arctic MX-4 was enough for 4 CPUs and a GPU and I still have some of it left over. And TIM is TIM, it can be applied anywhere. At least as long as it's a non-conductive one (all of the aforementioned are, so no worries). Just make sure to cover the whole die on the GPU. It should come with a small plastic spatula, so use that to spread the TIM on the die and cover it completely. Don't put too much on it though.

Thermal pads can be reused, they won't lose performance. Just be careful not to get too much dust/hair etc on them, as it tends to stick and it may decrease their performance. VRAM doesn't really get THAT hot, so leave it alone. I say don't even remove the bracket that the MSI has that touches the VRAM chips. No point in doing so. To remove the cooler, just remove the 4 screws surrounding the core from the back and the 5 screws on the right of the backplate. You can leave the others on and keep the backplate in place, so you won't have to be extra careful where you place the card as it won't rest on its bare PCB.

Other than that, mind your case airflow. On my system, the difference between having my case fans (140mm Phanteks fans) barely spinning (500RPM) and spinning at high-ish RPMs (1000RPM) is 5-10C for the GPU. Granted, I have a lot of fans, my case has excellent airflow and whatnot, but it still needs air. And lots of it.


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> Yup after winning it I immediately started looking for water blocks. Also have to send my 960 off for RMA so I'll be cramming this beast into my Elite 130 so it's going from console killer to destroyer. Glad I finally get an opportunity to play with one of these cards. Last red team cards I had were the 270x&7870 which served me well til the upgrade to the 970 that is being upgraded to a 1080/1070


Lucky for you, the reference PCB has tons of water blocks available. Kinda wish EK made one for mine, but hey, can't complain about a core that never sees 45°C lol
Quote:


> Originally Posted by *jendle*
> 
> Hi guys, I'm planning to do a full thermal paste + thermal pad replacement on my R9 390 (MSI Cooler) to increase the card's overclocking ability and reduce noise when idle.
> I'm just posting to make sure I'm buying the correct items:
> 
> 99% Isopropyl Alcohol
> Gelid GC-Extreme (GC-3) thermal compound (3.5g)
> Fujipoly 1.5mm 11W/mK thermal pad (one strip of 100mm by 15mm)
> Can anyone tell me is this enough paste, and will the strip be long and thick enough to replace the stock thermal pads? Also, is 99% isopropyl fine?
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS: Do any of you know the cheapest place to get these items in the UK? The pads are a bit pricey on ebay.


Those pads are expensive no matter where you get em; here in France I'd have to order em from the UK as well. Aquatuning has some but it's even more lol.

GC Extreme is awesome its what im using, the tube i got with my waterblock was 1-1.5g and that was enough to do my gpu and my cpu at least 3-4x so depends how much you need.

Does seem like the Fujis won't do much for air, but for water they make quite a difference. I don't think air can dissipate heat fast enough for 11W/mK pads to matter.


----------



## jendle

Quote:


> Originally Posted by *tolis626*
> 
> If you can splurge for a tube of Kryonaut, I say do it. As for the quantity, it's more than enough. To give you some perspective, a 4g tube of Arctic MX-4 was enoug for 4 CPUs and a GPU and I still have some of it left over. And TIM is TIM, it can be applied anywhere. At least as long as it's a non-conductive one (all of the aforementioned are, so no worries). Just make sure to cover the whole die on the GPU. It should come with a small plastic spatula, so use that to spread the TIM on the die and cover it completely. Don't put too much on it though.
> 
> Thermal pads can be reused, they won't lose performance. Just be careful not to get too much dust/hair etc on them, as it tends to stick and it may decrease their performance. VRAM doesn't really get THAT hot, so leave it alone. I say don't even remove the bracket that the MSI has that touches the VRAM chips. No point in doing so. To remove the cooler, just remove the 4 screws surrounding the core from the back and the 5 screws on the right of the backplate. You can leave the others on and keep the backplate in place, so you won't have to be extra careful where you place the card as it won't rest on its bare PCB.
> 
> Other than that, be careful of your case airflow. On my system, the difference between having my case fans (140mm Phanteks fans) barely spin (500RPM) and spin at high-ish RPMs (1000RPM) is 5-10C for the GPU. Granted, I have a lot of fans, my case has excellent airflow and whatnot, but still it need air. And lots of it.


Bought the Kryonaut. I think I have enough fans at the moment (2x140mm intake in the front, one 140mm at the back to blow hot air out, one 120mm at the top for extra intake).
Quote:


> Originally Posted by *bluej511*
> 
> Lucky for the reference pcb theres tons of water coolers. Kinda wish ek made one for mine but hey cant complain about a core that never sees 45°C lol
> Those pads are expensive no matter where you get em, here in France id have to order em from the UK aswell. Aquatuning has some but its even more lol.
> 
> GC Extreme is awesome its what im using, the tube i got with my waterblock was 1-1.5g and that was enough to do my gpu and my cpu at least 3-4x so depends how much you need.
> 
> Does seem like the fujis for air wont do much but for water it makes quite a difference. I dont think the air cant dissipate 11w/mk fast enough.


Yep, I don't think replacing the pads on an air cooler is very cost-effective after reading about it. My VRAM temperatures aren't that high either.


----------



## bluej511

Quote:


> Originally Posted by *jendle*
> 
> Bought the Kryonaut. I think I have enough fans at the moment (2x200mm intake in the front, one 200mm at the back to blow hot air out, one 120mm at the top for extra intake).
> Yep, I don't think replacing the pads on an air cooler is very cost-effective after reading about it. My VRAM temperatures aren't that high either.


Then one day, for better airflow, you should replace the 200mm fans. They tend to move the same amount of air as a 120/140 but have almost no static pressure, so if you have stuff behind them or filters, they're useless.


----------



## DarthBaggins

Yeah I was looking at either the Aquacomputers Kryo block or this setup:


----------



## jendle

Quote:


> Originally Posted by *bluej511*
> 
> Then one day for better airflow you should replace the 200mm fans. They tend to move the same amount of air as a 120/140 but have almost no pressure so if you have stuff behind em or filters theyre useless.


****. I bought them thinking they would actually be better than 120mm's. I also have air filters on all intakes, which in this case is also a problem.
I will switch the bottom front fan (200mm) with the 120mm fan then, so the card gets more air for cooling. Is that a good idea?


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> Yeah I was looking at either the Aquacomputers Kryo block or this setup:


The aquacomputer stuff is ridiculously top notch, im not sure if they fit the reference r9 390 but if i had one and they fit i totally would, being in europe theyre a lot easier to get as well. I think if i get polaris or vega ill get the clear acrylic copper block for it, my case would REALLY show it off too.

Yes, still needs management haha.


----------



## DarthBaggins

Well, EK was showing the 390X is based on the 290X reference PCB, so the Aqua should work fine. Personally I'd rather buy the Aquacomputer or a Heatkiller over EK, but that's because of a CPU block of theirs I've used and loved


----------



## bluej511

So I just tried the HWiNFO error thing and got 0 at 1040/1500 stock speeds. Weird, I thought I'd at least get one haha.


----------



## dagget3450

Well, I finally got my Lian Li T60B case in, so I can finally put my quad 390X/5960X into action... good thing it's a 3-day weekend!


----------



## DarthBaggins

Can't wait to cram this 390x into my CoolerMaster Elite 130 and take it for a test spin. At least now I'll have an even ratio of Red & Green team in the house (how it should be)


----------



## Harry604

just picked up an MSI R9 390 Gaming

let's see what this can do...

any BIOS worth flashing?


----------



## DarthBaggins

Quote:


> Originally Posted by *Harry604*
> 
> just picked up a msi r9 390 gaming
> 
> lets see what this can do...
> 
> any bios worth flashing


Just to give you a heads up I've noticed a high failure and return rate on those models at my store, normally I recommend the Sapphire model or Asus models. (One of the return complaints is coil whine). Not trying to discourage your new investment so hoping you got a good one (fan of MSI's products)


----------



## dagget3450

Quote:


> Originally Posted by *DarthBaggins*
> 
> Just to give you a heads up I've noticed a high failure and return rate on those models at my store, normally I recommend the Sapphire model or Asus models. (One of the return complaints is coil whine). Not trying to discourage your new investment so hoping you got a good one (fan of MSI's products)


I wish we could get a microcenter in Florida


----------



## Metalbeard

Quote:


> Originally Posted by *DarthBaggins*
> 
> Just to give you a heads up I've noticed a high failure and return rate on those models at my store, normally I recommend the Sapphire model or Asus models. (One of the return complaints is coil whine). Not trying to discourage your new investment so hoping you got a good one (fan of MSI's products)


I bought an open box MSI 390 Gaming from my local Microcenter. I also noticed that there were a few open box there but I took the chance anyway. I've noticed a little coil whine in certain games but it's nothing that I can't deal with.


----------



## spdaimon

Quote:


> Originally Posted by *dagget3450*
> 
> I wish we could get a microcenter in Florida


I wish I had one that wasn't on the other side of the state..








Quote:


> Originally Posted by *Metalbeard*
> 
> I bought an open box MSI 390 Gaming from my local Microcenter. I also noticed that there were a few open box there but I took the chance anyway. I've noticed a little coil whine in certain games but it's nothing that I can't deal with.


Microcenter sells a lot of open box stuff. I didn't realize it until my friend had a problem with a motherboard with a bent pin. He thought he was buying new. They allowed an exchange after hassling him a bit. That's when they told him they sell open box.


----------



## Scorpion49

I've had a lot of requests for the reference 390X vBIOS so here is a rom for anyone interested.


----------



## DarthBaggins

Quote:


> Originally Posted by *Scorpion49*
> 
> I've had a lot of requests for the reference 390X vBIOS so here is a rom for anyone interested.


And I could've gotten that from mine (glad I have a dual BIOS option)

I always keep my eyes out on what all we have on open-box, currently have 2 290x's and a lot of other gpu's - always check the pins on a board on open box prior to purchase too. I do my best to ensure they are good but occasionally a few get by


----------



## Scorpion49

Quote:


> Originally Posted by *DarthBaggins*
> 
> And I could've gotten that from mine (glad I have a dual BIOS option)
> 
> I always keep my eyes out on what all we have on open-box, currently have 2 290x's and a lot of other gpu's - always check the pins on a board on open box prior to purchase too. I do my best to ensure they are good but occasionally a few get by


Not many people had the blower 390X, only XFX made them and only a few. Some people were curious as to what changes there might have been.


----------



## DarthBaggins

I'm glad mine is straight AMD, but curious about the dual BIOS and which position is which, since I'm not sure what was loaded onto it. It was donated to the drawing via an AMD vendor, and all the Sapphire rep donated was an R7 240 lol


----------



## Scorpion49

Quote:


> Originally Posted by *DarthBaggins*
> 
> I'm glad mine is straight AMD, but curious about the dual BIOS and which position is which, since I'm not sure what was loaded onto it. It was donated to the drawing via an AMD vendor, and all the Sapphire rep donated was an R7 240 lol


Thats the same card that XFX sells, and since XFX is usually the OEM for AMD cards this isn't surprising.


----------



## DarthBaggins

Yet XFX uses a black PCB vs the brown one AMD OE uses


----------



## Harry604

anyone with msi r9 390 here flash their bios ?


----------



## Darknessrise13

I might be sending my 290 ref in for RMA with gigabyte... Is it worth the risk of getting a g1 gaming 390? Are they that bad that I'm reading from reviews on newegg and amazon?


----------



## Harry604

tried to modify memory timings and flashed new bios.

computer restarts and just a black screen think i bricked my 390 lol


----------



## Scorpion49

Quote:


> Originally Posted by *DarthBaggins*
> 
> Yet xfx uses a black PCB vs the brown AMD OE uses


The AMD part number is even identical to yours save one digit. What does GPU-Z list your vendor as?


----------



## diggiddi

Quote:


> Originally Posted by *battleaxe*
> 
> This ^^


I agree wholly with you guys especially with Pcars and Cry3
Quote:


> Originally Posted by *DarthBaggins*
> 
> Yeah I was looking at either the Aquacomputers Kryo block or this setup:
> 
> 
> Spoiler: Warning: Spoiler!


Nice, I was looking at that exact setup, looking at bagging an XFX 390x in the future, btw are there any other blocks out there?


----------



## dagget3450

Gah, well I still need watercooling parts (mainly connectors), but I think I am going to just put it together with what I have on hand. Always wanted a test bench setup cause I am tired of changing stuff and leaving the case half done. Anyways, here are some pics of the quadfire 390X setup partially assembled










Spoiler: Warning: Spoiler!


----------



## kizwan

Quote:


> Originally Posted by *Harry604*
> 
> tried to modify memory timings and flashed new bios.
> 
> computer restarts and just a black screen think i bricked my 390 lol


You should be able to blind flash the card.


----------



## DarthBaggins

Right now running the card in F@H full bore, and yeah the card is screaming at me to put it under water lol


----------



## Devildog83

Quote:


> Originally Posted by *DarthBaggins*
> 
> Yeah I was looking at either the Aquacomputers Kryo block or this setup:


I have that exact back plate. It was modified, but I just need to peel off the photo and it would be like new. If you decide to go EK you can have it if you want. My 290x died and I don't need it.

This is what it looks like now.


----------



## tolis626

So, I just tested my card with stock settings (1080/1500MHz) and just added 50% to the power limit in AB. I still get some errors. Like, in each benchmark run of Heaven I get like 100-150 of them. There's either something wrong with my card or the counter is unreliable. But seeing as my card is among the few with Hynix memory that struggles with mem overclocking, I don't know what to believe...


----------



## dasitman67

Anyone running the 390x in crossfire? Just ordered 2 msi 390x's and im kinda worried about the power consumption


----------



## bluej511

Quote:


> Originally Posted by *dasitman67*
> 
> Anyone running the 390x in crossfire? Just ordered 2 msi 390x's and im kinda worried about the power consumption


Looking at roughly over 450W for both cards alone if they're running full tilt. Honestly for crossfire I'd get a 1000W PSU, especially if you wanna be efficient. GPU-Z was telling me that when OC'ed my R9 390 peaked at about 300-something watts lol.

I think it's pretty accurate, as its stock wattage in GPU-Z seems to be on par with all the reviews.
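A rough sanity check on those numbers, as a sketch; every wattage and the 30% margin below are ballpark assumptions, not measurements:

```python
# Back-of-the-envelope PSU sizing for a crossfire build -- a sketch.
# card_watts/system_watts/margin are assumed ballpark figures.
def recommended_psu_watts(n_cards, card_watts=275, system_watts=200, margin=1.3):
    """Peak system draw plus ~30% headroom, rounded up to the next 50 W tier."""
    total = n_cards * card_watts + system_watts
    return int(-(-total * margin // 50) * 50)  # ceiling to a 50 W multiple

print(recommended_psu_watts(2))  # two 390Xs plus the rest of the rig
```

With those assumptions two cards land right at the 1000W class, which lines up with the advice above.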


----------



## dasitman67

Quote:


> Originally Posted by *bluej511*
> 
> Looking at roughly over 450W for both card alone if theyre running full tilt. Honestly for crossfire id get a 1000w psu especially if you wanna be efficient. GPUZ was telling me when oced that i peaked my R9 390 at about 300 something watts lol.
> 
> I think its pretty accurate as its stock wattage on gpuz seems to be on par with all the reviews.


Damn, I really hope this doesnt destroy my electricity bill lol. I ordered the corsair ax1200i with them so hopefully all good


----------



## bluej511

Quote:


> Originally Posted by *dasitman67*
> 
> Damn, I really hope this doesnt destroy my electricity bill lol. I ordered the corsair ax1200i with them so hopefully all good


Depends where you live haha. In my apt, cuz my parents are retired here in France, it's cheaper for retirees, so it's €0.09/kWh, ridiculously cheap.

My GPU is only at full load all the time in a handful of games; AC IV/Unity/Syndicate are a few of em. Unless you're gaming at 1440p or 4K it's probably not going to use much wattage. Playing old games might not even break 100W.
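Out of curiosity, the bill impact is easy to ballpark; the 0.09 rate is the figure from this post, and the daily hours are an assumption:

```python
# Monthly electricity cost of a gaming load -- a sketch.
# Rate defaults to the 0.09 EUR/kWh mentioned above; hours are assumed.
def monthly_cost_eur(load_watts, hours_per_day, eur_per_kwh=0.09, days=30):
    kwh = load_watts / 1000 * hours_per_day * days
    return kwh * eur_per_kwh

# ~450 W of GPUs at full tilt for 3 hours a day: only a few euros a month
print(f"{monthly_cost_eur(450, 3):.2f} EUR")
```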


----------



## mus1mus

Just something to bear in mind: I would trip OCP on my 1250 with 2 cards running at 1250MHz + 200mV combined with a 5930K.

So yeah be careful when giving the cards the juice.


----------



## yuannan

Quote:


> Originally Posted by *dasitman67*
> 
> Damn, I really hope this doesnt destroy my electricity bill lol. I ordered the corsair ax1200i with them so hopefully all good


You can also set the frame rate target in the control panel to 60fps; it'll still give great gameplay without vsync, and it was designed to reduce power usage.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> You can also set power target thing in the control panel to 60fps, it'll still give great game play without vsync and was designed to reduce power usage.


Yup thats what i use so i dont overshoot freesync, i set it to 74. Only my old games go over if its not turned on. Games look so awesome on ultrawide.


----------



## tolis626

Hey guys, 2 things. First off, anyone else having issues with their cards not downclocking fully? For some reason my card's been stuck at 533MHz on the core and max memory clocks and won't go further down. I tried closing all applications, restarting, even uninstalling ClockBlocker but to no avail. What is up with that?

Secondly, could some of you that have high memory clocks please check for errors with the latest HWiNFO64 beta? Last round I played on BF4 with memory at 1700MHz yielded a bit over 15.000.000 errors. Like what the hell? My card is even throwing errors when casually browsing with Chrome.









PS: I got fed up (mostly with the noise) and DDU'ed my drivers and reinstalled them. The no-downclocking issue seems to have gone away. Strange...


----------



## Worldwin

Quote:


> Originally Posted by *tolis626*
> 
> Hey guys, 2 things. First off, anyone else having issues with their cards not downclocking fully? For some reason my card's been stuck at 533MHz on the core and max memory clocks and won't go further down. I tried closing all applications, restarting, even uninstalling ClockBlocker but to no avail. What is up with that?
> 
> Secondly, could some of you that have high memory clocks please check for errors with the latest HWiNFO64 beta? Last round I played on BF4 with memory at 1700MHz yielded a bit over 15.000.000 errors. Like what the hell? My card is even throwing errors when casually browsing with Chrome.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS : I fed up (Mostly with the noise) and DDU'ed my drivers and reinstalled them. The no downclocking issue seems to have gone away. Strange...


Could you explain how you check for errors with a guide.


----------



## tolis626

Quote:


> Originally Posted by *Worldwin*
> 
> Could you explain how you check for errors with a guide.


No need. Just install the latest HWiNFO64 beta and there is a "GPU Memory Errors" meter. If you can't see it by default, you may need to go to settings and enable it, but I think it's enabled by default. Then all you have to do is run some 3D apps and see how it goes.


----------



## Worldwin

OK, so one thing I note is that the memory errors seem to be "permanent", meaning that the only way to get rid of them is to reset the driver. This is done by restarting the computer, logging off, etc.


----------



## tolis626

Quote:


> Originally Posted by *Worldwin*
> 
> OK, so one thing I note is that the memory errors seem to be "permanent", meaning that the only way to get rid of them is to reset the driver. This is done by restarting the computer, logging off, etc.


Yeah, noticed that too. Strange to say the least. Also, sometimes the error count will suddenly decrease, as if the ECC just corrected a bunch of errors in one swoop. But that seems kinda random? I dunno.


----------



## Worldwin

Quote:


> Originally Posted by *tolis626*
> 
> Yeah, noticed that too. Strange to say the least. Also, sometimes the error count will suddenly decrease, as if the ECC just corrected a bunch of errors in one swoop. But that seems kinda random? I dunno.


I do suspect you have among the worst memory possible. At 1720MHz I get less than 10k errors. In theory I could be on the other end of the spectrum, but I think it is more likely you got screwed over.

Now the real question is at what point the errors that occur show up on screen. To be more clear, I mean something like: 15,000 errors shows memory artifacts on screen. Once someone discovers this I think we'll have a general benchmark to aim at.


----------



## tolis626

Quote:


> Originally Posted by *Worldwin*
> 
> I do suspect you have among the worst memory possible. At 1720mhz i get less than 10k errors. In theory i could be on the other end of the spectrum but i think it is more likely you got screwed over.
> 
> Now the real question is at what point do the # of errors that occur show up on screen. To be more clear i mean say 15000 errors shows memory artifacts on screen. Once someone discovers this i think we'll have a general benchmark to aim at.


That's what's strange, I get no artifacting whatsoever. Its behavior is unpredictable and it will sometimes crash outright (black screens, usually), but it never shows artifacts. Like, never. Period. Not because of memory at least.

Similar error counts to you means about 1625MHz for my card. For Hynix memory that's legendarily bad.
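Worldwin's "error budget" idea a few posts up can be sketched as a clock search. The names here are hypothetical, `errors_at` is a stand-in for an actual benchmark pass at a given clock, and the toy error model is purely illustrative:

```python
# Binary search for the highest memory clock whose error count stays
# within a budget. `errors_at` is a stand-in for a real benchmark run.
def max_stable_clock(lo, hi, errors_at, budget=10_000, step=5):
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2 // step * step  # snap to the clock step
        if errors_at(mid) <= budget:
            best, lo = mid, mid + step
        else:
            hi = mid - step
    return best

# toy error model: errors explode past 1650 MHz (illustrative only)
toy = lambda mhz: 0 if mhz <= 1650 else (mhz - 1650) ** 3
print(max_stable_clock(1500, 1750, toy))
```

In practice each `errors_at` call is a long benchmark session, which is why the binary search matters: it needs far fewer runs than stepping the clock 25 MHz at a time.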


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Worldwin*
> 
> I do suspect you have among the worst memory possible. At 1720mhz i get less than 10k errors. In theory i could be on the other end of the spectrum but i think it is more likely you got screwed over.
> 
> Now the real question is at what point do the # of errors that occur show up on screen. To be more clear i mean say 15000 errors shows memory artifacts on screen. Once someone discovers this i think we'll have a general benchmark to aim at.
> 
> 
> 
> That's what's strange, I get no artifacting whatsoever. Its behavior is unpredictable and it will sometimes crash downright (Black screens, usually), but it never shows artifacts. Like, never. Period. Not because of memory at least.
> 
> Similar error counts to you means about 1625MHz for my card. For Hynix memory that's legendarily bad.
Click to expand...

I have selected few posts from Hawaii BIOS mod thread that may help you understand the counter.
Quote:


> Originally Posted by *Mumak*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> OK guys, so here the first HWiNFO build featuring the EDC counter: www.hwinfo.com/beta/hw64_525_2869.zip
> 
> 
> 
> The value is shown as "GPU Memory Errors" and as already described by @The Stilt, *it counts errors only when the GPU is under load*.
> 
> Note, that on some GPUs the current counter might reset back to 0 when the GPU goes idle (i.e. Tobago), while on others the current value should keep the last error count when going idle (i.e. Tahiti, Hawaii) and continue counting after the GPU is under load again. The behavior on GPUs like Tobago might change in future versions...
> 
> Happy testing and clock adjusting !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Many thanks to @The Stilt !
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> EDIT: Build 2869 has been pulled meanwhile. Check the next posts here for a later build.


Quote:


> Originally Posted by *The Stilt*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lard*
> 
> Thanks, +Rep!
> 
> I got 19 errors with my HD 7970 1200/1700MHz and tight Memory Timings with a comlete CompuBenchCL run.
> I guess this is tolerable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You want to stress the ROPs at their maximum capacity (i.e high resolution 3D load, such as 3DMarks)
> 
> 
> 
> 
> 
> 
> 
> 
> *The EDC doesn't separate correctable errors from uncorrectable ones, so you don't really know if they were corrected or not. Once you have a significant amount of errors which get through, you'll get visible artifacts. Prior that you just lose performance.*
Click to expand...

Quote:


> Originally Posted by *Mumak*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bartouille*
> 
> I've been monitoring edc past couple days. I don't know if this is a bug or something... I got around 280 errors after playing the Witcher 3 for 45 mins on one card which is fine, the other card had around the same, and then all of a sudden one card started producing like 200 errors every 2 seconds for no reason.
> 
> I find monitoring edc very useful, but I don't know how much I can trust it right now. Wish I could have access to the raw data of that register or wherever that information is stored, but honestly I have no clue how to even access gpu registers. Does amd provide some libraries to do this (ADL?) ?
> 
> 
> 
> *I believe it should be accurate as long as you don't get sudden peaks by 4,000,000,000, etc.*
> I'm sorry but I cannot disclose the registers, this is confidential. ADL won't help there either.
Click to expand...

Quote:


> Originally Posted by *The Stilt*
> 
> Yeah, outside the unexpected roll-overs happening on the earlier versions the count seems to be correct and perfectly aligned with the figures read using internal tools.


----------



## Mumak

Quote:


> Originally Posted by *Worldwin*
> 
> OK, so one thing I note is that the memory errors seem to be "permanent", meaning that the only way to get rid of them is to reset the driver. This is done by restarting the computer, logging off, etc.


It's a counter. And when the GPU goes idle, it stops counting (and the GPU resets it to 0). But HWiNFO sums all errors counted as long as it's active - otherwise the user might not notice the count when the GPU is not under load.
So no need to reset the driver, just close/open HWiNFO when the GPU is idle and it will start from 0.
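A toy sketch of that accumulation behavior (illustration only, not HWiNFO's actual code); the raw counter dropping is treated as the idle reset Mumak describes:

```python
# Summing a hardware error counter that resets to 0 when the GPU idles.
class ErrorAccumulator:
    def __init__(self):
        self.total = 0       # what monitoring software would display
        self.last_raw = 0    # last raw reading from the GPU counter

    def sample(self, raw):
        if raw < self.last_raw:
            # counter went backwards: the GPU idled and reset it, so the
            # new reading is a fresh count and is added in full
            self.total += raw
        else:
            self.total += raw - self.last_raw
        self.last_raw = raw
        return self.total

acc = ErrorAccumulator()
for reading in [0, 120, 450, 0, 30]:  # GPU idles (reset to 0) after 450
    acc.sample(reading)
print(acc.total)
```

This is also why closing and reopening the monitor while idle restarts the displayed sum from 0: the accumulated total lives in the software, not the GPU.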


----------



## bluej511

Quote:


> Originally Posted by *Mumak*
> 
> It's a counter. And when the GPU goes idle, it stops counting (and the GPU resets it to 0). But HWiNFO sums all errors counted as long as it's active - otherwise the user might not notice the count when the GPU is not under load.
> So no need to reset the driver, just close/open HWiNFO when the GPU is idle and it will start from 0.


I have zero errors at stock speeds. Haven't tried to overclock it, but my guess would be I'd get no errors the same.

Honestly, if you have no issues with performance I wouldn't worry. Max my card gets is 1650 at +100mV, but my core goes to 1200. Depends on the memory.


----------



## Worldwin

Quote:


> Originally Posted by *Mumak*
> 
> It's a counter. And when the GPU goes idle, it stops counting (and the GPU resets it to 0). But HWiNFO sums all errors counted as long as it's active - otherwise the user might not notice the count when the GPU is not under load.
> So no need to reset the driver, just close/open HWiNFO when the GPU is idle and it will start from 0.


Good to know.


----------



## bluej511

So I tried it at 1100/1600 with no AUX voltage since it's stable, and got 147 errors max haha.


----------



## dasitman67

Can you overclock a 390x? People on reddit have been telling me that they dont overclock well


----------



## bluej511

Quote:


> Originally Posted by *dasitman67*
> 
> Can you overclock a 390x? People on reddit have been telling me that they dont overclock well


You can, and if it's a good board it will OC real well. AMD has never compared to Nvidia in the OC department, it's not even close haha.


----------



## dasitman67

Quote:


> Originally Posted by *bluej511*
> 
> You can and if its a good board will oc real well. *AMD has never compared to nvidia in the oc department its not even close haha/*


What do you mean by this? This will be my first AMD card so im not sure what to expect lol


----------



## bluej511

Quote:


> Originally Posted by *dasitman67*
> 
> What do you mean by this? This will be my first AMD card so im not sure what to expect lol


For example I can go from 1040 to 1200 with voltage and that's it; I could add more voltage but the gains are minute. With Nvidia cards you can get 300-400MHz easy if you're watercooled.


----------



## tolis626

First off, thanks for your input guys, much appreciated.









Now, seems that it depends a lot on core voltage. 1185/1700MHz at +100mV core voltage produced a little over 1.500.000 errors during a BF4 game, but eventually crashed. I then played another round at 1200/1700MHz at +125mV and it ran fine with "only" about 130.000 errors, although it did seem to correct quite a few of them during my playtime. So it seems I just have a weak IMC or something? I have no idea. At least I don't think it's downright faulty, so that's a relief. Bad thing is that at +125mV I hit 80C on the core and 78C on the VRMs during a 20 minute BF4 round, so I wouldn't call it a solution just yet.


----------



## ziggystardust

It's the beta version of HWInfo that shows these errors right?


----------



## tolis626

Yup.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> For example i can go from 1040 to 1200 with voltage and thats it, i could add more voltage but the gains are minute. Nvidia cards you can get 3-400mhz easy if your watercooled.


But.... you didn't consider the gains from OC'ing both nVidia and AMD cards.

In my experience, AMD cards react to overclocking more linearly than nVidia cards. Also, the figures nVidia has been putting up are rather hideous. nVidia Boost Tech to blame.

Although in all honesty, OC'ing nVidia cards for both benching and gaming, you can get gaming clocks really close to the benching OC. While you can push AMD cards to their knees when benching, gaming will never get close. A 200 MHz difference is really not far fetched.


----------



## Streetdragon

I can't OC my Nitro memory much. 1625 with ~40mV produces 4 errors in Far Cry Primal over an hour. Pretty stable, but while streaming the rig freezes from time to time.
I gave the AUX +20mV too. Maybe I should pump more voltage through it...

The problem with the errors before (my last post): I reset the order of the listed entries in HWiNFO and now it shows both error readers.


----------



## tolis626

Quote:


> Originally Posted by *Streetdragon*
> 
> I can't OC my Nitro memory much. 1625 with ~40mV produces 4 errors in Far Cry Primal over an hour. Pretty stable, but while streaming the rig freezes from time to time.
> I gave the AUX +20mV too. Maybe I should pump more voltage through it...
> 
> The problem with the errors before (my last post): I reset the order of the listed entries in HWiNFO and now it shows both error readers.


First off, I'd consider 4 errors nothing. It's not system memory, where errors can prove to be a big problem. Worst case scenario here is a pixel is a bit off or something, but you won't notice it anyway. Plus, there's ECC, so they're probably corrected anyway. To give you some perspective, I just finished a round of BF4 at 1175/1625MHz at +75mV and it produced just over 11.000 errors. The performance and experience weren't affected, though, so I consider this okay. If I push some AUX voltage it produces fewer errors, but I wanted to try with no AUX added. So I'd say push yours more, it can handle it. I don't think the freezing is due to the GPU memory anyway.

BUT. I wouldn't push any AUX voltage on the Sapphire cards as they have a VDDCI of 1.05V by default, while the other brands have 1.0V. So +50mV AUX on my MSI is your stock VDDCI. Just something to keep in mind. You can check for yourself, but I think I'm right.
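That VDDCI arithmetic as a sketch; the baseline values are the forum-reported ones from this post (Sapphire 1.05 V, other brands 1.00 V), so verify your own card's default before trusting them:

```python
# Effective VDDCI (AUX) voltage after a software offset -- arithmetic only.
# Baselines are forum-reported assumptions, not datasheet values.
BASE_VDDCI = {"sapphire": 1.05, "msi": 1.00}

def effective_vddci(brand, offset_mv):
    return round(BASE_VDDCI[brand] + offset_mv / 1000, 3)

# +50 mV AUX on an MSI card lands at Sapphire's stock level
print(effective_vddci("msi", 50), effective_vddci("sapphire", 0))
```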


----------



## Streetdragon

Yeah, +50 is stock. But I let it run for 2 days without any errors or black screens/freezes at stock speed, so I think it was because of the higher memory clock. Maybe I am not lucky with my chips, like always^^
I have Hynix memory.
Maybe I'll push the core voltage offset to 100, because I have a waterblock, so it should be OK. That would stabilize the memory too, or?

Edit: with stock AUX and +100mV on core my stream started to flicker at 1625 on memory


----------



## tolis626

Quote:


> Originally Posted by *Streetdragon*
> 
> Yeah, +50 is stock. But I let it run for 2 days without any errors or black screens/freezes at stock speed, so I think it was because of the higher memory clock. Maybe I am not lucky with my chips, like always^^
> I have Hynix memory.
> Maybe I'll push the core voltage offset to 100, because I have a waterblock, so it should be OK. That would stabilize the memory too, or?
> 
> Edit: with stock AUX and +100mV on core my stream started to flicker at 1625 on memory


Oh man, if there isn't something else going on, this could be even worse than my card.









First off, are you absolutely sure your GPU core and CPU are stable? It seems strange to me that it occurs only when you're streaming. If it's unstable, it should be so even when just gaming. Secondly, what's your stock DPM7 voltage? It may be low, so your card may not be getting enough juice even at +100mV.


----------



## DarthBaggins

So far this 390X has done nothing but impress me. Ran FC4 wide-open at 1080p last night, which of course my 970 can do too, but it's still such an impressive card, to the point I plan on buying a FreeSync monitor for it. Also running stock settings on the card vs my high OC on the 970.


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> So far this 390x I got has done nothing but impress me further, ran FC4 wide-open at 1080p last night which of course my 970 can do the same but still such an impressive card to the point I plan on buying a FreeSync monitor for it. Also running stock setting on the card vs my high OC on the 970.


Do it, FreeSync is awesome, provided you don't drop below 45fps, where you'll still see slowdowns like any other monitor. I got an ultrawide FreeSync so it taxes the card a bit more, but playing Unity I easily drop into the low 40s and the game definitely slows down, as in not as smooth as 60-75fps. In Tomb Raider it drops the same but doesn't slow down as much; it's much smoother. Guess it depends on the game.


----------



## Streetdragon

Quote:


> Originally Posted by *tolis626*
> 
> Oh man, if there isn't something else going on, this could be even worse than my card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First off, are you absolutely sure your GPU core and CPU are stable? It seems strange to me that it occurs only when you're streaming. If it's unstable, it should be so even when just gaming. Secondly, what's your stock DPM7 voltage? It may be low, so your card may not be getting enough juice even at +100mV.


CPU is at stock at the moment and the RAM too, because... don't need it xD
GPU is only at 1100, which I can reach without adding voltage. Just did it for fun and for later core clocking. While streaming it only peaks at 880MHz if I do something else while streaming/*******

With +100mV the highest voltage I saw on the core was 1.34, so stock should be 1.24. I'm at work so I can tell you later.

I'm never lucky with my chips... always the same...


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> But.... you didn't consider the gains from OC'ing both nVidia and AMD cards.
> 
> In my experience, AMD cards react to overclocking more linearly than nVidia cards. Also, the figures nVidia has been putting up are rather hideous. nVidia Boost Tech to blame.
> 
> Although in all honesty, OC'ing nVidia cards on both Benching and Gaming, you can get Gaming clocks really close to Benching OC. While you can push AMD Cards to it's knees when benching but Gaming will never get close. 200 MHz difference is really not far fetched.


Really ?? I can game at max bench, but then again.... Lightnings


----------



## bluej511

So if the performance is the same and EK makes more than one waterblock available I MAY just switch to Polaris 10 (not sure which one, it seems like they're going to have a few cards). But how small is that PCB? Damn, it's Fury Nano sized.


----------



## Streetdragon

Quote:


> Originally Posted by *bluej511*
> 
> So if the performance is the same and ek makes more then one waterblock available i MAY just switch to Polaris 10 (not sure which one it seems like theyre going to have a few cards) but how small is that PCB damn its fury nano sized.


or buy a used 290/390 and go crossfire/3-way xD

No more heating in the winter!


----------



## flopper

Quote:


> Originally Posted by *bluej511*
> 
> So if the performance is the same and ek makes more then one waterblock available i MAY just switch to Polaris 10 (not sure which one it seems like theyre going to have a few cards) but how small is that PCB damn its fury nano sized.


Going to replace my current card for sure.
Too good to pass up


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> Really ?? I can game at max bench, but then again.... Lightnings


Ohhhh. And what is your max bench clock again?









Should I remind?
http://www.3dmark.com/fs/7829384
http://www.3dmark.com/3dm11/11075578


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> Ohhhh. And what is your max bench clock again?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should I remind?
> http://www.3dmark.com/fs/7829384
> http://www.3dmark.com/3dm11/11075578


I know my post is off topic but couldn't resist







.

3DM 11 compare

3DM FS compare

I would say my OC is not just bench stable but usable everyday (except F@H, which I'm finding is fussy about HBM clock over lengthy runs). I am using ref Fury X AIO cooler + factory TIM application and not at 100% fan but normal daily use profile. I am feeding GPU 1.243V VID (ie ~+31mV), which isn't a lot in my books.

To me Fury X is like a golden sample Hawaii/Grenada on steroids







.


----------



## mus1mus

You are a cheater.









How hard is it to push it tops?


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> You are a cheater.


Yep







, about the only time I'd ever be able to bench quicker than you! (LOL).
Quote:


> Originally Posted by *mus1mus*
> 
> How hard is it to push it tops?


Fiji is weird for OC'ing







.

Too much voltage, even without clock increase can produce negative performance scaling. In buildzoid's data share +50mV was point of negative scaling for him, I've only given that once or twice to a card and didn't really make my own data set. Reason being I never planned to give more than +50mV due to what I saw with other aspects of ROM testing.

You see lowest leakage ASIC can have max 1250mV set by EVV using stock ROM and when we mod for manual VID via PowerPlay in ROM it can be max 1300mV, any higher and driver will BSOD at OS load. Hence why I've stayed at +50mV only for testing, perhaps AMD deemed +50mV is max ASIC can take (but dunno). We can only get more than 1300mV by using offset in VRM controller.

Hawaii/Grenada did not seem to have VID limit we could set manually in PowerPlay. I can't find anything so far in ROM that is stopping manual VID higher than 1300mV.

Then there is this "power" aspect, for example I run Heaven/Valley and it will not stick to max clock (ie DPM7), even when PowerLimit is raised in ROM and it's not reaching it. Only way to make Heaven/Valley stick to DPM 7 is by switching "Power Efficiency" off in driver. 3DM V / 11 / FS does not need PE off. When PE is off Fiji owners get the nutty clock bounce at desktop use which Hawaii/Grenada had, with PE on card stick to 300MHz pretty much all the time at desktop use. It's like the driver has control on how much "power" a particular app can use.

Some places I've read mentioned that Fiji at times is not "fully awake" ie GPU is not using max resources, I don't know and have not looked into testing it. For the SP count there should be more performance IMO.



Spoiler: My Fiji cards tests in brief



Fury Tri-X stock DPM 7 1.243V max OC 1090 / 525 at stock VID upto +50mV given no increase in OC, did gain 3840SP unlock placed it pretty much bang on with a Fury X in 3DM benches. (sold)

Sapphire Fury X (no 1) stock DPM 7 1.250V max OC 1100 / 525 at stock VID upto +50mV given no increase in OC. (sold)

Sapphire Fury X (no 2) stock DPM 7 1.212V max OC 1135 / 535 @ 1.243V , +50mV gave no extra clocks or so little I didn't test for long period. (best sample so far IMO, due to some other aspects like power usage figures, good/bad asic determination).

MSI Fury X (no 3) stock DPM 7 1.187V max OC 1125 / 545 @ 1.225V , within week or so couldn't keep that OC so RMA'd for refund.

Sapphire Fury X (no 4) stock DPM 7 1.193V max OC 1100 @ 1.231V , approx 15hrs testing for that OC, was showing signs of allowing 550 on HBM but got bored testing.

Sapphire Fury X (no 5) stock DPM 7 1.231V so far max OC 1125 @ stock so still testing in other rig.

Sapphire Fury X (no 6) stock DPM 7 1.231V just plugged into main rig this afternoon, soon to be tested.



TBH, even with a slight OC at stock voltage Fiji is pretty good; my 3D Fanboy benches were run that way.


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Ohhhh. And what is your max bench clock again?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should I remind?
> http://www.3dmark.com/fs/7829384
> http://www.3dmark.com/3dm11/11075578


Actually on my 7950, the max game clock was the same as the max OC too, now that I recall. My point is that you are making a blanket statement.


----------



## DarthBaggins

Quote:


> Originally Posted by *Streetdragon*
> 
> or buy a used 290/390 and go crossfire/3-way xD
> 
> No more heating in the winter!


Lmao, I can say my 390X does throw off some serious heat running Folding@Home


----------



## Streetdragon

Quote:


> Originally Posted by *tolis626*
> 
> Oh man, if there isn't something else going on, this could be even worse than my card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First off, are you absolutely sure your GPU core and CPU are stable? It seems strange to me that it occurs only when you're streaming. If it's unstable, it should be so even when just gaming. Secondly, what's your stock DPM7 voltage? It may be low, so your card may not be getting enough juice even at +100mV.


My DPM volts are:
1.02669
0.98768
1.11602
1.19302
1.23203
1.30903
1.30903
1.30903 <- DPM7

I'll now try an offset of +150.

Edit: a little bench run:

Unbenannt.png 1238k .png file

only 9 errors.... hm

Edit: still flickering. I'll live with 1200/1600 for now then.


----------



## FooSkiii

Hey guys been looking into getting a freesync monitor and i saw this

http://www.microcenter.com/product/462882/S22F350FHN_22_HD_LED_Monitor

Has anybody used this monitor before?
Looks kinda insane for $100 freesync monitor


----------



## TrueForm

^ I'd get a 120 or 144hz monitor tbh. It's so much better esp with freesync.


----------



## FooSkiii

Quote:


> Originally Posted by *TrueForm*
> 
> ^ I'd get a 120 or 144hz monitor tbh. It's so much better esp with freesync.


How is the AOC monitor working out for you?
I'm running a XFX r9 390


----------



## bluej511

I love FreeSync, don't think I could ever go back. Especially with my eyesight I'd see tearing at anything above or below 60fps if I don't use vsync (which in some games I can't, as it ends up dropping down to 30fps whenever the framerate dips to 58fps).

It's so much more pleasing.


----------



## tolis626

Quote:


> Originally Posted by *Streetdragon*
> 
> My DPM volts are:
> 1.02669
> 0.98768
> 1.11602
> 1.19302
> 1.23203
> 1.30903
> 1.30903
> 1.30903 <- DPM7
> 
> I'll now try an offset of +150.
> 
> Edit: a little bench run:
> 
> Unbenannt.png 1238k .png file
> 
> only 9 errors.... hm
> 
> Edit: still flickering. I'll live with 1200/1600 for now then.


Holy crap, that's a lot of voltage.









One thing I've noticed. On my card, I can run 1175MHz on the core at +75mV. +70mV gives some very very rare artifacts in Witcher 3, but +75mV is ok. With the memory at 1625MHz, +75mV on the core gives me quite a few errors. Bumping it to +80mV does lead to them being fewer. But then again, so does adding 25mV to AUX voltage, so there's that. Temps are about the same anyway.

My point is, play with it. I still think that there's something else that may be causing flickering if it only occurs when streaming. 9 errors should behave like no errors. Also, I think that's a low score for a 1200MHz overclock, so you may be throttling. Scaling goes out the window with too much voltage and high temps anyway.
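The trial-and-error above (bump the offset, count the errors, repeat) amounts to picking the smallest offset that gives a clean run. A throwaway helper, with made-up numbers standing in for real HWiNFO error counts:

```python
def lowest_clean_offset(observations):
    """observations: dict of {offset_mV: error count for that run}.
    Returns the smallest offset that produced zero errors, or None."""
    clean = [mv for mv, errors in sorted(observations.items()) if errors == 0]
    return clean[0] if clean else None

# Hypothetical sweep at 1175 MHz core / 1625 MHz memory
runs = {70: 3, 75: 9, 80: 2, 90: 0, 100: 0}
print(lowest_clean_offset(runs))  # -> 90
```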


----------



## FooSkiii

Quote:


> Originally Posted by *bluej511*
> 
> I love freesync dont think i could ever go back. Especially with my eyesight id see tearing at anything above and below 60fps if i dont use vsync (which in some games i cant as it ends up dropping down to 30fps if it drops to 58fps).
> 
> Its so much more pleasing.


Thnx bro, I will definitely be looking into that monitor.
Did you install the firmware before you turned on FreeSync, and do you keep it on all the time?


----------



## bluej511

Quote:


> Originally Posted by *FooSkiii*
> 
> Thnx bro i will definitely be looking into that monitor.
> did you install the firmware before you turned on freesync and do you keep it on all the time?


I turned it on in the monitor, then in Crimson. The firmware did nothing for the display, so no issues.


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> Actually on my 7950 , max game clock was also same as max oc also now that I recall , my point is you are making a blanket statement


Blanket statement?

Get your facts straight, fella. Just because you can't bench past 1300 doesn't make your statement true.

How many Hawaii / Grenada cards have you been through? 2?

I have been playing around with 6!

You can bench Hawaii with tons of errors/artifacts; nVidia cards will crash the driver before you see that. It doesn't only happen with Maxwell. My 780s do the same, even the 550 Tis and several other 600-series cards I have laid my hands on.


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Blanket statement?
> 
> Get your facts right fella. Just because you can't bench past 1300, doesn't make your statement true.
> 
> How many Hawaii / Grenada cards have you been through? 2?
> 
> I have been playing around with 6!
> 
> You can bench Hawaii with tons of errors / artifacts, nVidia cards will crash the driver before you see that. It doesn't only happen with Maxwell. My 780s does the same. Even the 550TIs and several other 600 series cards I have laid my hands on.


So 6 cards is a meaningful statistic?? Wow, just take a chill pill dude, it's not that serious


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> So 6 cards meaningful statistic?? wow, just take a chill pill dude, its not that serious


Don't accuse someone of making a "blanket statement" when you have nothing to prove your point.









Word of advice: cool your Lightnings and you will see higher clocks.









And by the way, 6 beats 2


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Don't accuse someone going like "blanket statement" when you have nothing to prove your point.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Word of advice, cool your lightnings and you will see higher clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And by the way, 6 beats 2


Just.......Let ....it ....go...


----------



## mus1mus

NO


----------



## Streetdragon

Quote:


> Originally Posted by *tolis626*
> 
> Holy crap, that's a lot of voltage.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One thing I've noticed. On my card, I can run 1175MHz on the core at +75mV. +70mV gives some very very rare artifacts in Witcher 3, but +75mV is ok. With the memory at 1625MHz, +75mV on the core gives me quite a few errors. Bumping it to +80mV does lead to them being fewer. But then again, so does adding 25mV to AUX voltage, so there's that. Temps are about the same anyway.
> 
> My point is, play with it. I still think that there's something else that may be causing flickering if it only occurs when streaming. 9 errors should behave like no errors. Also, I think that's a low score for a 1200MHz overclock, so you may be throttling. Scaling goes out the window with too much voltage and high temps anyway.


Temps should be OK... maybe lift the power limit a bit? Like +25% or something? My VRMs were at 90+ on air with only +50mV on the core.

And bump the AUX? It is already a +50 offset on the Nitro. I'll try 10mV more.

What else could slow down the card? The CPU is strong enough.


----------



## tolis626

Quote:


> Originally Posted by *Streetdragon*
> 
> Temps should be OK... maybe lift the power limit a bit? Like +25% or something? My VRMs were at 90+ on air with only +50mV on the core.
> 
> And bump the AUX? It is already a +50 offset on the Nitro. I'll try 10mV more.
> 
> What else could slow down the card? The CPU is strong enough.


If your power limit isn't raised, then probably there's your problem man. Just max it out to +50%. Won't hurt your card (It's about the only slider that won't hurt it at all) and it will let it draw the power it needs. Overvoltage needs a ton of power on these cards, so raising power limit is essential. I even run it maxed out even when using the card at otherwise stock settings to avoid any and all throttling. Back off on everything else, max out PL and see how it goes.

Also, I meant raising the AUX voltage on mine. Mine's the MSI one, so stock AUX voltage is 1V flat, not 1.05V like the Nitro. Nitro owners shouldn't raise AUX voltage further. That is, unless the Stilt is wrong, but I'm not the one to say that.
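For reference, the PowerTune slider is just a percentage on top of the board-power figure in the BIOS. The 275W base below is an assumption for illustration (check your own ROM for the real PowerLimit):

```python
def effective_power_limit(base_watts, slider_percent):
    """PowerTune target: the BIOS PowerLimit scaled by the driver slider."""
    return base_watts * (1 + slider_percent / 100)

base = 275  # assumed BIOS PowerLimit in watts; yours may differ
print(effective_power_limit(base, 50))   # +50% -> 412.5 W ceiling
print(effective_power_limit(base, -20))  # the slider also goes down
```

This is why maxing the slider can't hurt on its own: it only raises the ceiling the card is allowed to reach, it doesn't force extra draw.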


----------



## Streetdragon

Quote:


> Originally Posted by *tolis626*
> 
> If your power limit isn't raised, then probably there's your problem man. Just max it out to +50%. Won't hurt your card (It's about the only slider that won't hurt it at all) and it will let it draw the power it needs. Overvoltage needs a ton of power on these cards, so raising power limit is essential. I even run it maxed out even when using the card at otherwise stock settings to avoid any and all throttling. Back off on everything else, max out PL and see how it goes.
> 
> Also, I meant raising the AUX voltage on mine. Mine's the MSI one, so stock AUX voltage is 1V flat, not 1.05V like the Nitro. Nitro owners shouldn't raise AUX voltage further. That is, unless the Stilt is wrong, but I'm not the one to say that.


So I tried adding power limit to the card with Sapphire TriXX........

It didn't help. Even without it the card holds full speed. I even lowered the voltage a bit to see what happens. Still stable with +140mV.

Can't break the 14000 points...


----------



## tolis626

Quote:


> Originally Posted by *Streetdragon*
> 
> So I tried adding power limit to the card with Sapphire TriXX........
> 
> It didn't help. Even without it the card holds full speed. I even lowered the voltage a bit to see what happens. Still stable with +140mV.
> 
> Can't break the 14000 points...


Back off that voltage more, that's what I'm saying. And try pushing your memory a bit again. Without PL at max there's no point overclocking, really. Also, on my card I get the highest scores at just above +100mV, like +110-120mV, but I'm above 15000 points at this point. That's what's confusing me about your card. Then again, mine's a 390X, but I don't think the difference should be nearly as large at the same clock.

Try whatever is the max you can get at, I dunno, +100mV. Also try for higher memory. See how it goes. Too much voltage without proper cooling will also hurt your performance, not only the lifespan of your card.


----------



## rdr09

Quote:


> Originally Posted by *Streetdragon*
> 
> Temps should be OK... maybe lift the power limit a bit? Like +25% or something? My VRMs were at 90+ on air with only +50mV on the core.
> 
> And bump the AUX? It is already a +50 offset on the Nitro. I'll try 10mV more.
> 
> What else could slow down the card? The CPU is strong enough.


I know the VRMs are good to like 120, but when OC'ing you want them much cooler, like 80 or lower. When you run a bench, use GPU-Z in the background like so . . .



Use the Sensors tab and make sure to set the temps to read MAX. You can also use HWiNFO64. Use only one app at a time. To me, 90 is high.
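GPU-Z can also write its sensor readings to a log file (a comma-separated text file), so if you'd rather not babysit the Sensors tab you can pull the maxima out afterwards with a small script. The column names below are assumptions — match them to whatever header your GPU-Z version actually writes:

```python
import csv

def max_sensor_values(log_path, columns):
    """Scan a GPU-Z style CSV sensor log and return the maximum
    reading seen for each named column (skipping blank cells)."""
    maxima = {c: float("-inf") for c in columns}
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            for c in columns:
                cell = (row.get(c) or "").strip()
                if cell:
                    maxima[c] = max(maxima[c], float(cell))
    return maxima

# Example (column names are assumptions, check your log's header):
# print(max_sensor_values("GPU-Z Sensor Log.txt",
#                         ["GPU Temperature [C]", "VRM Temperature [C]"]))
```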


----------



## Streetdragon

I changed the TDP / PowerLimit / TDC in the BIOS by +50%:
216 -> 324
216 -> 324
208 -> 312

and I switched the power cables from one split cable to two dedicated 8-pin cables, so now I have a bit more power. I can try to push the memory more.
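Those BIOS numbers are just the stock limits scaled by 1.5, which is easy to sanity-check before flashing:

```python
def scale_limits(stock_limits, percent):
    """Scale a list of BIOS power/current limits by a percentage,
    rounding to whole units as a BIOS editor would store them."""
    return [round(v * (1 + percent / 100)) for v in stock_limits]

# TDP / PowerLimit / TDC values from the post above
print(scale_limits([216, 216, 208], 50))  # -> [324, 324, 312]
```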


----------



## rdr09

Quote:


> Originally Posted by *Streetdragon*
> 
> 
> 
> i changed in the bios the TDP Powerlimit TDC +50%
> 216->324
> 216->324
> 208->312
> 
> and i switched the powercables from one splitet cable to two 8pin cables now i have a bit more power. can try to push the memory more


Those are beautiful temps. What's your goal? For benching, I think Unigine benches like Valley benefit from a higher memory OC. You've surpassed 14K.


----------



## bluej511

Very nice, those are about my temps; my VRM1 is a bit lower but about the same. OC'ing with +100mV makes it shoot up haha.


----------



## Streetdragon

Memory is now at 1700. It produced 375 errors during FireStrike... I think that is OK.
Score is now 14285. NOW I'm happy^^

My goal? Just see what my block can do and clock it "safe" so I get 2 more years out of the card^^
But it is not a golden chip... I want one golden chip in my life.
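Whether 375 errors at 1700MHz is "ok" depends on your tolerance; a scripted way to frame it is to keep the highest clock whose error count stays under a chosen threshold. The error counts here are invented for illustration:

```python
def best_memory_clock(results, max_errors=0):
    """results: {memory_MHz: error count from a benchmark pass}.
    Returns the highest clock at or under the error tolerance."""
    ok = [mhz for mhz, errors in results.items() if errors <= max_errors]
    return max(ok) if ok else None

# Hypothetical FireStrike passes at increasing memory clocks
sweep = {1500: 0, 1600: 0, 1625: 12, 1700: 375}
print(best_memory_clock(sweep))                 # strict: -> 1600
print(best_memory_clock(sweep, max_errors=50))  # lenient: -> 1625
```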


----------



## rdr09

Quote:


> Originally Posted by *Streetdragon*
> 
> memory is now at 1700. produced 375 errors while firestrike.. think that is ok.
> Score is now 14285. NOW im happy^^
> 
> My goal? just look what my block can do and clock it "safe" so i have the card 2 more years^^
> But it is not a golden chip.. i want one golden chip in my life.


Reasonable goal. Reaching 1700 oc is like - Wow.


----------



## Streetdragon

Quote:


> Originally Posted by *rdr09*
> 
> Reasonable goal. Reaching 1700 oc is like - Wow.


I thought that was normal for the Nitro... but first I must test if it is stable. 1625 was not stable, flickering from time to time; I will test it a bit.

Edit: THX to @spyshagg,
he made a timing mod for me^^

http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/2790#post_25225406
that is so cool


----------



## tolis626

Quote:


> Originally Posted by *Streetdragon*
> 
> i thougt that is normal at the Nitro... but first i must test if it is stable.. 1625 was not stable, flickering from time to time will test it a bit
> 
> edit THX to @spyshagg
> he made a timing mod for me^^
> 
> http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/2790#post_25225406
> that is so cool


Sweet!

Man, I am so thinking about making a custom timings BIOS (with some other doo-dads as well) and just doing it. My only, literally my only fear, is that the MSI cards only have one BIOS, so if something goes wrong, no 390X for me. And with how my financials are right now, that also means no GPU and no gaming for me either. I get depressed at the mere thought of it. On the other hand, even if I brick it, I should be able to flash it back to life no problem using the iGPU. Decisions...


----------



## Streetdragon

If you know what you are doing, and with the help of the other guys on this forum (awesome here!), you should have no problems. Just don't push too much voltage and you are fine^^

If I can trade my R9 290 Vapor-X for a second Nitro... block on it with the same BIOS... man^^ that would be awesome


----------



## afyeung

Quote:


> Originally Posted by *rdr09*
> 
> Reasonable goal. Reaching 1700 oc is like - Wow.


I'm running 1730 lol.


----------



## bluej511

So my retailer is already slashing prices on the R9 390, or Sapphire is haha. It's on sale here from 330€ down to 290€, 40€ off. Tempted to crossfire + watercool haha, nah jk.

Oh and prices for the non-Founders Edition 1080s have already shown up; the Asus is 840€, the EVGA is 750€. What a rip off, what happened to the $599 MSRP haha.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> So my retailer is already slashing prices on the r9 390 or sapphire is haha. Its on sale here from 330€ to 290€ 40€ off. Tempted to crossfire watercool haha, nah jk.
> 
> Oh and prices for the 1080 none founders edition has already showed up, the Asus is 840€ the Evga is 750€, What a rip off, what happened to 599$ MSRP haha.


Europe man, we always get the short (and ridiculously expensive) end of the stick. Having a proper PC is a luxury. God damn it.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Europe man, we always get the short (and ridiculously expensive) end of the stick. Having a proper PC is a luxury. God damnt it.


Haha, try this on for size then: the true cost of my PC. My account is in the US in $; when I buy anything in the EU I pay in €, so not only do I pay more to begin with, I also have to convert haha. So my 2000€ PC ends up being even more in $. I got the short end twice lol.

It's not so bad now that the euro is around $1.12, but when I moved 2-3 years ago it was close to $1.35. So yeah, it's why I slowly collected my parts over time.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Haha try this on for size then. The true cost of my pc. My account is in the US in $, when i buy anything from the EU i pay in € so not only do i pay more to begin with i also have to convert haha. So my 2000€ pc ends up being even more in $. I got the short end twice lol.


If you didn't have custom watercooling, mine would be more expensive.









I had to hold back on purchasing computer stuff for 2 years because I didn't have enough to buy what I wanted. But when I had the money, I splurged all over. In retrospect, I should have kept some on the side to have a more robust upgrade path. But do I regret it? No, not at all. It's been almost 2 years with this rig (minus the 390X, I got that in September) and I still look at it like I'm in love.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> If you didn't have custom watercooling, mine would be more expensive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had to hold back on purchasing computer stuff for 2 years because I didn't have enough to buy what I want. But when I had the money, I splurged all over. In retrospect, I should have kept some on the side, have a more robust upgrade path. But do I regret it? No, not at all. It's been almost 2 years with this rig (minus the 390x, I got that in September) and I still look at it like I'm in love.


The watercooling wasn't too bad; the most expensive parts are the pump/fittings/GPU block, the rest is pretty decent.

I got my R9 390 for about 350€, so not bad, but that ends up being about $400 or so, if not more, at whatever the exchange rate was a few months back. Meanwhile it's probably $350 if not less on Newegg. But you can't compare EUR to USD, as the pay is WAY different between the EU and US, at least in France anyway.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> The watercooling wasn't too bad, the most expensive is the actual pump/fittings/gpu block the rest is pretty decent.
> 
> I got my r9 390 for about 350€ so not bad, but ends up being about 400$ or so if not more wtv the currency was few months back. Meanwhile its prob 350$ if not even less on newegg. But you can't compare eur to usd as the pay is WAY different between the EU and US at least in France anyways.


Yeah, try that with the average pay in Greece. Get depressed.









Anyway, we have it better in other aspects in Europe, so I ain't complaining. It's just that I'm kinda envious of US citizens because of the hobbies I have: PC gaming, cycling and audio are all far cheaper across the pond. But what can you do?


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Yeah, try that with the average pay in Greece. Get depressed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, we have it better in other aspects in Europe, so I ain't complaining. It's just that I'm kinda envious of US citizens because my hobbies are such. PC gaming, cycling, audio are all far cheaper across the pond. But what can you do?


It's why I bought my Trek before moving. I'll take $2500 over 2500€ any day. Virgin Atlantic didn't charge, so I bought a soft case for it; still cheaper than buying it here. All I had to do was re-true my rear wheel, which costs $5 haha. I have quad citizenship so I'm all set.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Its why i bought my Trek before moving. Ill take 2500$ over 2500€ any day. Virgin atlantic didnt charge so i bought a soft case for it still cheaper then buying it here all i had to do was re-true my rear wheel costs 5$ haha. I have quad citizenship so im all set.


Damn you. I got myself a custom Merida Scultura Comp 905 used (but barely so, the guy could have told me it was new and I would've believed him) for 1700€. It's not bad, but for about the same money I could have got it new in the US.

If I could import stuff from the US without having to pay huge amounts in customs (not to mention waiting for potentially several months before my things get processed), I would do it all the time. But I can't so... Yeah, I'm stuck.

Anyways! Sorry for the OT everyone! (Or not, it's been awfully quiet here the past 2 days anyway.)


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Damn you. I got myself a custom Merida Scultura Comp 905 used (but barely so, the guy could have told me it was new and I would've believed him) for 1700€. It's not bad, but for about the same money I could have got it new in the US.
> 
> If I could import stuff from the US without having to pay huge amounts in customs (not to mention waiting for potentially several months before my things get processed), I would do it all the time. But I can't so... Yeah, I'm stuck.
> 
> Anyways! Sorry for the OT everyone! (Or not, it's been awfully quiet here the past 2 days anyway
> 
> 
> 
> 
> 
> 
> 
> )


Yeah, I kinda regret buying mine as 11-speed came out like a few months after, but oh well, I could always convert, though there's no need.

Back on topic:
The card runs Rainbow Six Siege in ultrawide maxed out with 2x MSAA, no issues, so I can't complain.


----------



## Gdourado

What are the differences between the Sapphire 390X Tri-X and the Nitro?


----------



## bluej511

Quote:


> Originally Posted by *Gdourado*
> 
> What are the differences between the Sapphire 390X Tri-X and the Nitro?


Honestly I think it's just a name change. Tri-X used to be what the R9 290 was called, then they went to Nitro for the R9 390. I think the 390 is Nitro and the 390X is Tri-X, but I could be wrong.


----------



## Gdourado

Quote:


> Originally Posted by *bluej511*
> 
> Honestly i think its just a name change. Tri-x used to be what the r9 290 was called then they went to Nitro for the r9 390. i think the 390 is nitro and 390x is tri-x but could be wrong.


I know that in the Fury's case the Nitro is different, as it offers better power delivery circuitry on the PCB compared to the Tri-X.
I am wondering if it's the same with the 390X, with the Nitro having a better PCB.


----------



## mus1mus

390 doesn't have the Tri-X name. Only the 390X has a Tri-X model.

http://www.sapphiretech.com/catapage_pd.asp?cataid=69&lang=eng

http://www.sapphiretech.com/catapage_pd.asp?cataid=68&lang=eng


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> 390 doesn't have the Tri-X name. Only the 390X has a Tri-X model.
> 
> http://www.sapphiretech.com/catapage_pd.asp?cataid=69&lang=eng
> 
> http://www.sapphiretech.com/catapage_pd.asp?cataid=68&lang=eng


Yeah, I thought so, they change names too often lol. The 390X is also a Nitro haha http://www.sapphiretech.com/productdetial.asp?pid=40831187-0C5F-493F-BDF5-108E243206F3&lang=eng

The new Sapphire Toxic looks like it's binned though, should be nice. http://www.techpowerup.com/222135/sapphire-unveils-the-radeon-r9-390-toxic-graphics-card


----------



## TrueForm

Quote:


> Originally Posted by *FooSkiii*
> 
> How is the AOC monitor working out for you?
> I'm running a XFX r9 390


AMAZING! Such a difference in FPS games like Fallout 4, TF2 and DOOM! Felt like I got a whole new computer.


----------



## rdr09

Quote:


> Originally Posted by *afyeung*
> 
> I'm running 1730 lol.


You sure you have an amd gpu? lol


----------



## bluej511

Quote:


> Originally Posted by *rdr09*
> 
> You sure you have an amd gpu? lol


Those lucky Hynix bastards, imagine 1730core though haha.


----------



## rdr09

Quote:


> Originally Posted by *bluej511*
> 
> Those lucky Hynix bastards, imagine *1730core* though haha.


Now, that's definitely an nVidia card at stock. lol

@ Trueform, you have a freesync monitor?


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluej511*
> 
> Those lucky Hynix bastards, imagine 1730core though haha.
> 
> 
> 
> Now, that's definitely an nVidia card at stock. lol
> 
> @ Trueform, you have a freesync monitor?
Click to expand...

Throttling edition clocks!









These are my nVidia basterds:
http://www.3dmark.com/3dm11/11103009


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Throttling edition clocks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These are my nVidia basterds:
> http://www.3dmark.com/3dm11/11103009


Ewww. I think my highest is 35K graphics, but with tess at normal.


----------



## mus1mus

Which leads us to this:

The GTX 1080 is being beaten by 2 RX 480s.

Do you think the 480s are stronger than the 390/290Xs?

I can't find AOTS comparisons between 1080s and Hawaiis.


----------



## DarthBaggins

Quote:


> Originally Posted by *mus1mus*
> 
> Which leads us to this:
> 
> GTX 1080 is being beaten by 2 RX 480s.
> 
> Do you think the 480s are stronger than 390/290Xs?
> 
> I can't find comparisons for AOTS between 1080s and Hawaiis.


What I wondered from that bench is why they used dual 480s at 51%, when a single card shown in the bench would give us a real picture of single-card performance. Still, seeing two at only 51% power beating a 1080 is not bad.

I do think they are an improvement over the 390s, and now I want to see the 490s.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Which leads us to this:
> 
> GTX 1080 is being beaten by 2 RX 480s.
> 
> Do you think the 480s are stronger than 390/290Xs?
> 
> I can't find comparisons for AOTS between 1080s and Hawaiis.


The 480 is supposed to be dead even with a 390, maybe performing slightly better because of the generation change.


----------



## Agent Smith1984

The RX 480 is going to land smack in the middle of 390X and Fury Pro performance. The thing to look out for, though, is that it's clocked at 1266MHz out of the box. If the 14nm FinFET process clocks well (say we get maybe 1500MHz capability), then the card should really scale when overclocked and easily outperform a Fury. Mind you, this is a $200 card in 4GB form. That's game changing!!! That's putting high-end-capable hardware in the hands of the average GPU shopper...... This is what AMD does best, and they stand to gain a lot of market share with this approach, though I do think the $380 (yeah right, expect $450 to be the norm) GTX 1070 will also hold a large share of the higher-end market based on its 980Ti+ level of performance for $200 less than last year.


----------



## tolis626

Anybody tried the 16.6.1 driver? I installed it yesterday and it seems to be more of the same after 16.5.1. Mantle seems to be done for (at least BF4 doesn't work), and I think it still has some black screen and TDR issues. God damn it AMD, get your crap together.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Anybody tried the 16.6.1 driver? I installed it yesterday and it seems to be more of the same after 16.5.1. Mantle seems to be done for (At least BF4 doesn't work) and I think it still has some black screen and TDR issues. God damn it AMD, get your crap together.


I'm on it now, and so far a couple hours with no flickering, so hopefully it goes well.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> RX 480 is going to be smack dead in the middle of 390X and Fury pro performance. The thing to look out for though, is that it's clocked at 1266 out of the box. If the 14nm Finfet clocks were (say we get maybe 1500mhz capability) then the card should really raise some when overclocked and easily outperform a Fury. Mind you this is a $200 card in 4GB form. That's game changing!!! That's putting high end capable hardware in the hands of the average GPU shopper...... This is what AMD does best and they stand to gain a lot of market share with this approach, though I do think the $380 (yeah right, expect $450 to be the norm) GTX 1070 will also hold a large share in the higher end market based on *it's 980ti+ level of performance* for $200 less than last year.


Hmmm. 980Ti level due to driver gimping done by nVidia.

The 1080, for instance, is just a hair over the 980Ti btw. When people get hold of the 1070, that's when you can see where it should stand.

Makes you feel good that AMD doesn't gimp old cards with new drivers


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> hmmm. 980TI level due to driver gimping done by nVidia.
> 
> 1080 i.e. is just a hair over the 980ti btw. When people get a hold of the 1070, that's when you can see where it should stand.
> 
> Makes you feel good, AMD doesn't gump old cards with new drivers


Well, all the 1070 reviews show it in a dead heat with the 980Ti, even when both are overclocked, but it could definitely be driver related.

What I will say is that at the 390's initial release, AMD made a driver specific to the card which made them appear to be around 10% faster even at the same clock speeds; then AMD released a driver shortly after for all cards that revealed the performance boost in the 300 series was totally driver based.

Here is an overclocked 1070 vs other cards overclocked:
http://www.overclockersclub.com/reviews/nvidia_geforcegtx_1070_overclocking/4.htm


----------



## fyzzz

All 390 cards are on sale here. I could get an XFX R9 390 for only 269€ now. I have an extra kryographics waterblock I could use, and I could use my current 290 as a secondary card. But I'm wondering if I should just wait for the new cards to launch?


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, all the 1070 reviews show it being in a dead heat with 980ti, even when both are overclocked, but it could definitely be driver related.
> 
> What I will say, is that on 390 initial release, AMD made driver specfic to the card which made them appear to be around 10% faster, even with the same clock speeds, then AMD released a driver shortly after for all cards that revealed the performance boost in the 3 series was totally driver based.
> 
> Here is an overclocked 1070 vs other cards overclocked:
> http://www.overclockersclub.com/reviews/nvidia_geforcegtx_1070_overclocking/4.htm


This is not the 1070, but here's your clue. Real end-user results. Best 5930K/1080 vs 5930K/980Ti:

http://www.3dmark.com/compare/fs/8613682/fs/5757415

That is why I don't expect the 1070, using its best driver, to topple a 980Ti with its driver.









Also take note of the AOTS results. We already knew that even the 390X slightly beats the 980Ti in AOTS. Notice something? These are paid reviews.









I'm waiting for end-user results to believe it. That's just me.

While on the AMD side (thread related), when the 300-series parts were released the driver gave them the edge over the 200-series cards, yes. But we 290/X users also gained some, so we're not gimped.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> This is not the 1070, but here's your clue. Real end user results. Best 5930K/1080 vs 5930K/980TI
> 
> http://www.3dmark.com/compare/fs/8613682/fs/5757415
> 
> That is why, I don't expect the 1070 using it's best driver to topple a 980TI with it's driver.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also note of the AOTS results. We knew the fact that even the 390X slightly beats the 980TI in AOTS. Something you notice? These are paid reviews.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm waiting for end-user results to believe. That's just me.
> 
> While on AMD side (thread related) when the 300 series parts were released, the driver gave them the edge over 200 series cards. Yes. But we, 290/X users also gained some. So we're not gimped.


Yeah, I definitely want to see some user results, but if you go into the 1080 thread plenty of users are posting 24k+ graphics scores. I scrolled through about 3 pages' worth yesterday, lol

The card is certainly impressive, but I think the people who just dumped their 980ti's off for $350-400 are out of their minds. The maxwell architecture is actually faster clock for clock than Pascal, but it just so happens to be able to run at a crazy high clock speed to make up for it. Kind of the same tactic AMD took with going from Phenom to FX series.

The worst part about the 1080 is how badly NVIDIA has gimped it with power limitations, hard voltage caps, etc.... even the LN2 guys are hating those things right now. Best clock so far has been near 2.5GHz I think, which is a really high clock speed for a GPU, but from a percentage standpoint it's nowhere near the headroom the Maxwell cards had.

If NVIDIA truly wanted these new cards to dominate they should have used a tad more memory bandwidth. But I guess that's what we will see with the 1080ti.....

I am very interested in seeing how a ~$460 RX480 8GB crossfire setup does in people's hands because I get the feeling that card is going to give us some good overclockability.
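To make the clock-for-clock point concrete, here's the normalization I have in mind (the scores and clocks below are made-up illustrative numbers, not benchmark results):

```python
# Perf-per-MHz comparison sketch: divide a graphics score by the
# sustained core clock. All numbers here are hypothetical, just to
# show the math, not measured results.
def perf_per_mhz(graphics_score, core_mhz):
    """Score points per MHz of sustained core clock."""
    return graphics_score / core_mhz

# Hypothetical example values:
maxwell = perf_per_mhz(21000, 1500)  # a 980Ti-class card at ~1.5GHz
pascal = perf_per_mhz(24000, 2000)   # a 1080-class card at ~2.0GHz

print(f"Maxwell: {maxwell:.1f} pts/MHz, Pascal: {pascal:.1f} pts/MHz")
# Maxwell comes out higher per MHz; Pascal wins on absolute clock speed.
```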


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I definitely want to see some user results, but if you go into the 1080 thread plenty of users are posting up 24k+ graphics scores. I scrolled through about 3 pages worth yesterday, lol
> 
> The card is certainly impressive, but I think the people who just dumped their 980Tis for $350-400 are out of their minds. The Maxwell architecture is actually faster clock for clock than Pascal; Pascal just happens to run at crazy high clock speeds to make up for it. Kind of the same tactic AMD took going from Phenom to the FX series.
> 
> The worst part about the 1080 is how badly NVIDIA has gimped it with power limitations, hard voltage caps, etc.... even the LN2 guys are hating those things right now. Best clock so far has been near 2.5GHz I think, which is a really high clock speed for a GPU, but from a percentage standpoint it's nowhere near the headroom the Maxwell cards had.
> 
> If NVIDIA truly wanted these new cards to dominate they should have used a tad more memory bandwidth. But I guess that's what we will see with the 1080ti.....
> 
> I am very interested in seeing how a ~$460 RX480 8GB crossfire setup does in people's hands because I get the feeling that card is going to give us some good overclockability.


And this is why NVIDIA has such god-awful business practices. Imagine if Microsoft stopped supporting their software after a couple of years. Or motherboard drivers being the same. Hell, Euro Truck Simulator 2 still gets game updates, and that's god knows how old. An AMD card from 4 years ago still benefits from today's drivers.

Their CEO is so goddamn greedy. It's why, if the only GPUs left are ever NVIDIA, I'll go back to consoles full time. Unless those have NVIDIA GPUs too, in which case I'll stop gaming and go back to biking lol.


----------



## Agent Smith1984

Thought you all may find this little comparison interesting:

http://www.3dmark.com/compare/3dm11/11263084/3dm11/11085257/3dm11/11170600

Card on the left is the 480x


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thought you all may find this little comparison interesting:
> 
> http://www.3dmark.com/compare/3dm11/11263084/3dm11/11085257/3dm11/11170600
> 
> Card on the left is the 480x


So, new 480x is about the same power as the 390x? Looks good to me.

Here's to my new upgrade: 490X coming soon. When is release date? Late 2016?

So what is replacing the FuryX then?

I am so hoping that AMD is shaping up to really compete well this year.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> So, new 480x is about the same power as the 390x? Looks good to me.
> 
> Here's to my new upgrade: 490X coming soon. When is release date? Late 2016?
> 
> So what is replacing the FuryX then?
> 
> I am so hoping that AMD is shaping up to really compete well this year.


The 490 should be a Vega part that competes at the high end, but I'm not sure if it will be the highest-end card, or if an additional card is coming to compete with the 1080ti, such as a "Fury 2" or whatever they decide to call it. If that's the case, expect the Vega chip in the 490 to be a good bit cut down from that card but still capable of competing with a 1080. Because if the top-tier Vega part is AMD's answer to the vanilla GTX 1080, and not something to compete with the 1080ti down the road, then it's going to be a long year and a half or so for them.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The 490 should be a Vega part that competes at the high end, but I'm not sure if it will be the highest-end card, or if an additional card is coming to compete with the 1080ti, such as a "Fury 2" or whatever they decide to call it. If that's the case, expect the Vega chip in the 490 to be a good bit cut down from that card but still capable of competing with a 1080


Here's to hoping...


----------



## jdorje

I just noticed today that hwinfo has gained a "GPU Memory Errors" field. And I'm up to 2 million of them on my current settings!

Based on some quick tests it does seem to be actual memory errors, not core errors. I have to drop memory from 1740 to 1575 (!) to get them to "stop". But I do have very aggressive memory timings as well that might contribute to that.

Anyone know what's up with this data? Has this been covered while I haven't been paying attention?


----------



## tolis626

Quote:


> Originally Posted by *jdorje*
> 
> I just noticed today that hwinfo has gained a "GPU Memory Errors" field. And I'm up to 2 million of them on my current settings!
> 
> Based on some quick tests it does seem to be actual memory errors, not core errors. I have to drop memory from 1740 to 1575 (!) to get them to "stop". But I do have very aggressive memory timings as well that might contribute to that.
> 
> Anyone know what's up with this data? Has this been covered while I haven't been paying attention?


Yay! At least I'm not alone!









If you bother reading a few pages back (I'd say don't, it's boring stuff really), you'll see that I and a couple of other guys tested it and, well, there are errors. Your card seems to be in line with the others; errors start creeping up at about 1600MHz. My card too spits out a few million errors at anything over 1650MHz, but what can you do? What can I do? As long as it doesn't hurt your performance or cause artifacts, I would stick with it. Problem is, my card will black screen if the error count gets too high, so yeah, there's that.

Maybe back down a bit. Also, adding some core voltage helps along with AUX voltage. I mean, as long as your cooling can handle more, it's ok.


----------



## Zyphur

CPUZ Validation for my upgraded build: http://valid.x86.fr/spprl1 | XFX 390 DD


----------



## kizwan

Quote:


> Originally Posted by *jdorje*
> 
> I just noticed today that hwinfo has gained a "GPU Memory Errors" field. And I'm up to 2 million of them on my current settings!
> 
> Based on some quick tests it does seem to be actual memory errors, not core errors. I have to drop memory from 1740 to 1575 (!) to get them to "stop". But I do have very aggressive memory timings as well that might contribute to that.
> 
> Anyone know what's up with this data? Has this been covered while I haven't been paying attention?


A bit of history: in the Hawaii BIOS mod thread, we were told months ago that memory controller EDC could be monitored, and it was only a couple of weeks ago that this feature was implemented in HWiNFO (it was originally planned for GPU-Z, but that never happened). This is the feature most people who modify memory timings have been waiting for. I won't link the first post talking about this feature, but you may want to read from *[this post]* (& onward) in case you have questions that have probably already been answered there.

Your card's memory modules most likely can do 1740, but it's throwing errors because of the memory controller limit. It just can't handle it without errors occurring. Read more about it *[here]* (the Grenada memory controller limit may be higher than Hawaii's). Keep in mind EDC doesn't separate correctable & uncorrectable errors.


----------



## gupsterg

@kizwan
Quote:


> To get around the MEMCLK barrier in Grenada, AMD has latched some of the memory controller related timings (through bios). So for higher than 1375MHz use 390 memory block with corrected memory density and corrected timings.


Quote from link

It would be interesting to know when using 390/X VRAM_Info modded for a 290/X if the headroom for MCLK increases without errors.

Have you had a chance to check?


----------



## Gdourado

So I just saw a great promotion on a 390X.
It's probably a clearance sale or something...
But either way, a pair of 390X comes at a sweet price.
How is the performance of a 390X crossfire?
How do they compare against a single Aircooled Fury?
From some reviews I see online, at 1080p, the fury is anywhere from 3 to 15 fps ahead of a single 390X.
So even with bad scaling, a crossfire 390X setup can pull ahead of a Fury by 50-60 fps?
How is the current state of MicroStutter? Is it still an issue with latest drivers?

Cheers!


----------



## mus1mus

No single card can beat 2 Hawaiis/Grenada.

Scaling is pretty good with 2 cards. Very close to X2. (Tip - some stores allow direct replacements. Get to know the cards well if you purchase 2. Always sync your clocks when you do. If one is weak, return it and hope to get a better one to complement the good card)









You also get 8GB of buffer that will serve you pretty fine in 4K. Fury is no mas!

Microstutter will always be a question. But crossfire owners can't be bothered it seems.

Just remember, these are Power and Heat hogs. But then again, even when underclocked and undervolted, you are getting better frames than any single card. On Crossfire supported games of course.
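A quick sketch of what "very close to X2" means in FPS terms (the scaling factor is an assumption and varies per title):

```python
def crossfire_fps(single_card_fps, scaling=0.9):
    """Estimated FPS for two cards, given a per-title scaling factor.
    0.9 (90%) is an assumed typical figure for well-supported games,
    not a measured number."""
    return single_card_fps * (1 + scaling)

# e.g. a title running 60 FPS on one card:
print(crossfire_fps(60))       # ~114 FPS with 90% scaling
print(crossfire_fps(60, 0.0))  # 60 FPS in a game with no CF support
```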


----------



## Gdourado

Quote:


> Originally Posted by *mus1mus*
> 
> No single card can beat 2 Hawaiis/Grenada.
> 
> Scaling is pretty good with 2 cards. Very close to X2. (Tip - some stores allow direct replacements. Get to know the cards well if you purchase 2. Always sync your clocks when you do. If one is weak, return it and hope to get a better one to complement the good card)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You also get 8GB of buffer that will serve you pretty fine in 4K. Fury is no mas!
> 
> Microstutter will always be a question. But crossfire owners can't be bothered it seems.
> 
> Just remember, these are Power and Heat hogs. But then again, even when underclocked and undervolted, you are getting better frames than any single card. On Crossfire supported games of course.


So let's assume that for a 60 euro price premium, the pair of 390Xs is worth it over a single Fury?


----------



## Gdourado

Also, can a Tier 1 850W gold PSU run 390X crossfire?


----------



## Streetdragon

If you don't clock the hell out of them and your CPU, it's OK. But only if the PSU isn't some China-bomb.
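Rough math (all wattages are ballpark assumptions, not measured figures):

```python
# PSU headroom check for 390X crossfire. Assumed ballpark numbers:
# ~300W per overclocked 390X, ~150W for CPU/board/drives/fans.
def psu_headroom(psu_watts, gpu_watts=300, n_gpus=2, system_watts=150):
    """Watts left over after the estimated system load."""
    load = gpu_watts * n_gpus + system_watts
    return psu_watts - load

print(psu_headroom(850))                  # 100W spare on an 850W unit
print(psu_headroom(850, gpu_watts=350))   # 0W spare if you push clocks hard
```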


----------



## mus1mus

Yes and No.


----------



## kizwan

Quote:


> Originally Posted by *gupsterg*
> 
> @kizwan
> Quote:
> 
> 
> 
> To get around the MEMCLK barrier in Grenada, AMD has latched some of the memory controller related timings (through bios). So for higher than 1375MHz use 390 memory block with corrected memory density and corrected timings.
> 
> 
> 
> Quote from link
> 
> It would be interesting to know when using 390/X VRAM_Info modded for a 290/X if the headroom for MCLK increases without errors.
> 
> Have you had a chance to check?
Click to expand...

Do I need to use 290 + 390_VRAM_Info block ROM? I'm using 390 + 390_VRAM_Info block ROM with rated timings but still producing errors at 1500.


----------



## gupsterg

Quote:


> Originally Posted by *kizwan*
> 
> Do I need to use 290 + 390_VRAM_Info block ROM? I'm using 390 + 390_VRAM_Info block ROM with rated timings but still producing errors at 1500.


I would assume a 390 ROM with the 390_VRAM_Info block modified to support the RAM IC is the same as testing a 290 ROM with the 390_VRAM_Info block modified to support the RAM IC.

If I had a Hawaii card (wish I'd flipping kept 1 at least! Perhaps I'll have to get one again) I'd do this test method:-

i) stock 290/X ROM
ii) 290/X ROM with 390/X VRAM_Info
iii) 390/X ROM with 390/X VRAM_Info

In all cases above I'd stick to stock timings per strap. In cases ii & iii I'd only do RAM size, RAM IC density and timings per strap to support the RAM IC on 290/X card to 390/X VRAM_Info.

After those tests I'd also be tempted to play with ucRefreshRateFactor and/or VDDCI to see if it aids MCLK clocks/error count lowering.
Quote:


> Originally Posted by *Gdourado*
> 
> So let's assume that for a 60 euro proce premium, the pair of 390X is worth it over a single fury?


Definitely, the caveat being CF support.


----------



## jackblk

My 390 is running at full load at 94 Celsius. I'm afraid it's a bit too high. I did some research, but some say it's alright, some say I should reapply the thermal paste. Mine is the MSI.

If I have to reapply thermal paste, are there any instructions? I tried to find some on YouTube but there are none for the MSI 390. Is a guide for something similar (e.g. a 290) alright to follow?

Thanks guys


----------



## Agent Smith1984

So...... been tinkering with a $70 Xeon 2670 8C/16T my brother got on ebay.

Trying to get the turbo multi to lock to 3.3GHz and then bump the bclk to around 105-108 if possible.

Totally off topic, just sayin... lol

I am hoping to push (2) 480's in crossfire with this for the ultimate budget build!!! Problem is I gotta give this board back, and X79 board prices have spiked due to the fall in chip prices.

If you see 390's for dirt cheap right now, I would say grab one, BUT, understand it runs hot if you want to do a cheap CF build!!


----------



## bluej511

Quote:


> Originally Posted by *jackblk*
> 
> My 390 is running at full load at 94 Celsius. I'm afraid it's a bit too high. I did some research, but some say it's alright, some say I should reapply the thermal paste. Mine is the MSI.
> 
> If I have to reapply thermal paste, are there any instructions? I tried to find some on YouTube but there are none for the MSI 390. Is a guide for something similar (e.g. a 290) alright to follow?
> 
> Thanks guys


Depends what your fan is set at and your case/airflow but yea 94°C is quite hot.


----------



## kizwan

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Do I need to use 290 + 390_VRAM_Info block ROM? I'm using 390 + 390_VRAM_Info block ROM with rated timings but still producing errors at 1500.
> 
> 
> 
> I would assume a 390 ROM with the 390_VRAM_Info block modified to support the RAM IC is the same as testing a 290 ROM with the 390_VRAM_Info block modified to support the RAM IC.
> 
> If I had a Hawaii card (wish I'd flipping kept 1 at least!, perhaps I'll have to get one again
> 
> 
> 
> 
> 
> 
> 
> 
> ) I'd do test method:-
> 
> i) stock 290/X ROM
> ii) 290/X ROM with 390/X VRAM_Info
> iii) 390/X ROM with 390/X VRAM_Info
> 
> In all cases above I'd stick to stock timings per strap. In cases ii & iii I'd only do RAM size, RAM IC density and timings per strap to support the RAM IC on 290/X card to 390/X VRAM_Info.
> 
> After those tests I'd also be tempted to play with ucRefreshRateFactor and/or VDDCI to see if it aids MCLK clocks/error count lowering.
Click to expand...

A little tidbit: my Elpida card so far has not produced any errors at 1500, but at 1600 it produced errors with artifacts, and that's already with Vddc at +100mV.


----------



## bluej511

Quote:


> Originally Posted by *kizwan*
> 
> A little tidbit: my Elpida card so far has not produced any errors at 1500, but at 1600 it produced errors with artifacts, and that's already with Vddc at +100mV.


Mine has no errors at 1500, and just a few at 1600. It's nothing crazy like in the millions, but under 100 errors. That's without extra voltage though.


----------



## DarthBaggins

Debating on shelling out for another 390X (since I didn't have to pay for my initial one) or waiting on the 490s. I do still plan on snagging an EK block for my 390X since it will be replacing my much loved 970 in JAC once it's sold.


----------



## dasitman67

Ok so, I just got a pair of 390Xs and I'm hitting 94 degrees when running Valley. Is this normal?


----------



## mus1mus

Push the fans.


----------



## dasitman67

Quote:


> Originally Posted by *mus1mus*
> 
> Push the fans.


They're at 100%.


----------



## DarthBaggins

Which version of the 390X? I know mine likes to hit the high 80s in games (FC4 & Wolfenstein: The New Order), but that's with the fan set to auto (blower style).


----------



## afyeung

Quote:


> Originally Posted by *dasitman67*
> 
> Ok so, I just got a pair of 390Xs and I'm hitting 94 degrees when running Valley. Is this normal?


That's normal. How far apart are they? 94C means they are most likely throttling. It is very hard to dissipate 500+W of heat in most cases with open-air coolers, especially in a case with little exhaust. I switched to water and won't go back. Can't imagine Grenada on air, especially with 2 cards.


----------



## dasitman67

Quote:


> Originally Posted by *DarthBaggins*
> 
> Which version of the 390X? I know mine likes to hit the high 80s in games (FC4 & Wolfenstein: The New Order), but that's with the fan set to auto (blower style).


I have the MSI version. Coming from a 970 that barely hit 50 degrees in game, I was a little worried lol.
Quote:


> Originally Posted by *afyeung*
> 
> That's normal. How far apart are they? 94C means they are most likely throttling. It is very hard to dissipate 500+W of heat in most cases with open-air coolers, especially in a case with little exhaust. I switched to water and won't go back. Can't imagine Grenada on air, especially with 2 cards.


I will definitely need to switch to water cooling I think. My cards are about 1cm apart, really not much room at all; it's the main problem with my mobo.


----------



## DarthBaggins

Yeah my 970 barely hits 43c under full load (under water), but doesn't compare performance wise to the 390x


----------



## jdorje

Quote:


> Originally Posted by *dasitman67*
> 
> Ok so, I just got a pair of 390Xs and I'm hitting 94 degrees when running Valley. Is this normal?


Drop voltage as far as you need to keep it under 85ish. These cards don't lose a lot with reduced voltage.

Throwing an aio on the top card is the real fix of course. But simply remounting the cooler with better tim can help a lot too.
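Rough intuition for why undervolting helps so much: dynamic power scales roughly with V² times frequency. A quick sketch (the numbers are illustrative, not card-specific data):

```python
def relative_power(v_new, f_new, v_old=1.225, f_old=1.0):
    """Approximate dynamic power ratio: (V_new/V_old)^2 * (f_new/f_old).
    The 1.225V default is just an assumed stock-ish voltage; frequency
    is expressed as a ratio (1.0 = unchanged clocks)."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Dropping ~75mV at the same clock:
print(f"{relative_power(1.150, 1.0):.0%}")  # ~88% of original power
```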


----------



## Chaoz

Quote:


> Originally Posted by *jdorje*
> 
> Drop voltage as far as you need to keep it under 85ish. These cards don't lose a lot with reduced voltage.
> 
> Throwing an aio on the top card is the real fix of course. But simply remounting the cooler with better tim can help a lot too.


True, I repasted my ASUS 390 with Thermal Grizzly Kryonaut and the temps dropped around 4-5°C.


----------



## bluej511

Quote:


> Originally Posted by *Chaoz*
> 
> True, I repasted my ASUS 390 with Thermal Grizzly Kryonaut and the temps dropped around 4-5°C.


True, but on crossfire that will only get you so much. Anyone who crossfires, I wouldn't even bother with AIOs or CLCs; your VRMs are gonna run just as hot if not hotter.

You could always get an EK Predator and 2 MSI waterblocks, but a single 360 for 2 cards is just meh. I love my temps though, and that's not even on a full waterblock. Rainbow Six Siege seems to be my newest GPU test; it made my GPU reach 43°C, and that's with an ambient of around 24-25°C haha.

94°C is quite hot, and even a huge case with good airflow won't matter in this case. One card exhausts heat right into the case, right into the other card, so it's just recycling hot air. You'd need negative pressure here, so you get more exhaust than intake. It ends up being like a vacuum and sucks all the hot air right out. If the card runs at 94°C, I can't even imagine the actual temperature of the air coming out of them haha.


----------



## tolis626

Following the whole memory errors thing, here's more.

The other day I tried 1175/1700MHz at +80/+50mV and it worked fine... for a bit. Like maybe 20 minutes to half an hour. Then I got a big flicker and then in-game textures were messed up for some reason, so I guess it was a driver crash or something. One thing I noticed at that moment is that BF4 under DX11, apart from having worse performance, doesn't crash as violently as Mantle. Mantle is a done deal if it's unstable, it WILL black screen. DX11 sometimes recovers and lets me at least exit the game without having to reset the system.

Today I tried 1180/1700MHz at +90/+50mV. At first it worked, and it worked great. I mean, the performance was there, it wasn't flickering, it wasn't artifacting, but errors started showing up in the thousands after a while. It reached about 100,000 errors, then the counter reset. Then it reached about 300,000 errors and the counter reset again. That was within the span of 2 BF4 matches, so about 40 minutes-ish. Then I noticed that performance wasn't what it should be, barely hitting 100FPS at 1080p when normally it's over 120FPS. So I quit the game, and the first thing I noticed, other than the error counter having reset in HWiNFO, was that the card wasn't running hot (70C max on the core and 68C on the VRM), and the power usage meter, however inaccurate it may be, showed it was only consuming 250W or thereabouts. Then, without touching anything, I ran a Heaven benchmark and got a measly 1150-ish score, so there was definitely something fishy there. My guess is that it was the ECC running amok. More testing is in order, but it seems that even if it somehow stabilizes, 1700MHz and upwards just isn't viable on my card, unfortunately. I'll see if maybe adding more VDDC helps. I'm just debating whether I should increase voltage without touching the clocks, or let the core clock follow suit. I mean, the card can do 1190MHz at +100mV, so I dunno.

Other than that, 16.6.1 seems to be working fine, other than Mantle. It just plain stopped working properly after 16.5.1. I hope they fix it because BF4 benefits quite a bit from Mantle over DX11. I hope Battlefield 1 is a DX12 title (do we have any info on that?).
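Since the raw counter keeps resetting, comparing readings is a pain. Here's a rough way I'd total the errors across resets (just a sketch; actually sampling the counter from HWiNFO is left out):

```python
def accumulate_errors(samples):
    """Sum periodic memory-error counter samples into a running total,
    treating any drop in the raw counter as a reset (counter restarted
    near zero between samples)."""
    total, prev = 0, 0
    for raw in samples:
        if raw < prev:        # counter reset between samples
            total += raw      # count everything seen since the reset
        else:
            total += raw - prev
        prev = raw
    return total

# e.g. counter climbs to 100,000, resets, then climbs to 300,000:
print(accumulate_errors([40_000, 100_000, 20_000, 300_000]))  # 400000
```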


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> True but on crossfire that will only get you so much. Anyone that crossfires i wouldnt even bother with AIOs or CLCs, your vrms are gonna run just as hot if not hotter.
> 
> You could always get an ek predator and 2 msi waterblocks but a single 360 for 2 cards is just meh. I love my temps though and thats not even on a full waterblock. Rainbow six siege seems to be my newest gpu test made my gpu reach 43°C and thats with an ambient of around 24-25°C haha.
> 
> 94°C is quite hot and even with a huge case and good air flow it wont matter in this case. One car exahusts heat right into the case right into the other cpu so its just recycling hot air. Youd need negative pressure in this case so you get more exhausts then intake. Ends up being like a vacuum and sucks all the hot air right out. If the card runs at 94°C i cant even imagine the actual temperature of the air coming out of em haha


I used to run 2 CLCs with crossfire. I only added +20mv so VRMs were fine. I used heatsinks for the VRM 1. VRM2 is covered by the mosfet plate in the MSI card.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> I used to run 2 CLCs with crossfire. I only added +20mv so VRMs were fine. I used heatsinks for the VRM 1. VRM2 is covered by the mosfet plate in the MSI card.


Yea, 20mV is nothing. If you unlock voltage control it adds 19mV anyway right off the bat.

What were the core and VRM temps, just curious?


----------



## afyeung

Quote:


> Originally Posted by *bluej511*
> 
> Yea 20mv is nothing. If you unlock voltage control it adds 19mv anyways right off the bat.
> 
> What were the core and VRM temps, just curious?


Core temps were in the 60s for both cards and VRM temps in the 80s for VRM1. My case didn't have that great airflow.


----------



## bluej511

Quote:


> Originally Posted by *afyeung*
> 
> Core temps were in the 60s for both cards and VRM temps in the 80s for VRM1. My case didn't have that great airflow.


Yea, I figured it wouldn't be drastic. A decent custom setup will drop those core temps another 10C if not more, and the VRMs probably by half depending on rads and water temps.


----------



## tbob22

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So...... been tinkering with a $70 Xeon 2670 8C/16T my brother got on ebay.
> 
> Trying to get the turbo multi to lock to 3.3GHz and then bump the bclk to around 105-108 if possible.
> 
> Totally off topic, just sayin... lol
> 
> I am hoping to push (2) 480's in crossfire with this for the ultimate budget build!!! problem is I gotta give this board back and x79 boards are spiked due to the fall in chip prices.
> 
> If you see 390's for dirt cheap right now, I would say grab one, BUT, understand it runs hot if you want to do a cheap CF build!!


Those chips are great, I have a few myself, not great for gaming though. Single thread performance is kind of terrible. In a dual setup they would be very good for rendering though.

Recently rebuilt my machine with a x79 deluxe (from x58 P6T6/x5670) due to some great deals I came across, basically ended up trading my old board/cpu.









The 390 is still running great; Mirror's Edge Catalyst gets a solid 60fps at 1080p/110%/ultra.


----------



## jdorje

If you do run an AIO on the GPU, you definitely want it as exhaust for just that reason. VRM temps are going to be an issue but with the core heat moved away cooling them is easier. Might still need some custom heat sinks for it though.

I kinda want to put a water cooler on my 390 (XFX 8256). The noise under load is annoying compared to the whisper of the rest of my build. I've got an H80i to throw on it that'll fit as rear exhaust on my case, but that won't work with a G10, and supposedly the HG10 won't fit the 390. What should I do? Is it time to bite the bullet and build a full loop?

And lastly, my Valley scores seem lower than they used to be. Card is still running 1090/1740 on 1225 mV, same as it was 6 months ago. Haven't really looked that deeply or noticed any decline in performance in games though.


----------



## fyzzz

So I recently bought an XFX 390 DD. I was hoping for Hynix memory, but it had Elpida. 1140/1575 +50mV passed Heaven without issues, but with thousands of memory errors. The card reached 81C core/78C VRM1 during Heaven with fan speed locked at 60% (probably really bad airflow since I have radiators in the front and top). This card also whines a lot. So I'm not really happy with this card; I'm going to sell it or send it back if I can, I think.


----------



## mus1mus

How does it perform?

Mind uploading the BIOS?

My Vantage Scores improved btw.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> How does it perform?
> 
> Mind uploading the BIOS?
> 
> My Vantage Scores improved btw.


Nothing wrong with the performance, much better than my Elpida 290s at least. It seems to overclock OK. Broke a valid 15k on air when I was testing it with a custom BIOS: http://www.3dmark.com/fs/8743342.
Stock BIOS, with support for AJR and BABG memory:

XFX390DD.zip 101k .zip file


----------



## mus1mus

Thanks man. Will try to mod again.

http://www.3dmark.com/3dmv/5469545

Was able to reach ^ on a stock bios with Voltage Limit Removed. Pretty good run.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Thanks man. Will try to mod again.
> 
> http://www.3dmark.com/3dmv/5469545
> 
> Was able to reach ^ on a stock bios with Voltage Limit Removed. Pretty good run.


A very nice score. I have the 390 under water now and it definitely helps; must test it in Vantage too. My 3DMark11 score is way higher than what my 290 could do: http://www.3dmark.com/compare/3dm11/11316753/3dm11/11273475


----------



## DarthBaggins

Need to try my 390X under Vantage; think my FS score was only around 14k (stock).


----------



## Gdourado

I just saw 1070 prices here in Europe.
I just did the math: a single 1070 is more expensive than a pair of Sapphire 390 Nitros.
How does a pair of 390 crossfire perform against a single 1070?
I am looking at 1080p.

Cheers


----------



## battleaxe

Quote:


> Originally Posted by *Gdourado*
> 
> I just saw 1070 prices here in Europe.
> I just did the math: a single 1070 is more expensive than a pair of Sapphire 390 Nitros.
> How does a pair of 390 crossfire perform against a single 1070?
> I am looking at 1080p.
> 
> Cheers


390 Xfire is going to destroy a single 1070 as long as the games you play support Xfire. By a large margin, 30-70% more (depending on title), and especially if you are playing with DX12.

One 390X is comparable to a 980ti on some DX12 titles, just for reference.


----------



## rdr09

Quote:


> Originally Posted by *Gdourado*
> 
> I just saw 1070 prices here in Europe.
> I just did the math a single 1070 is more expensive than a pair of sapphire 390 Nitro.
> How does a pair of 390 crossfire perform against a single 1070?
> I am looking at 1080p.
> 
> Cheers


390s will beat even a 1080. But, for 1080P, go wait for the 480 or 1060. Will be hard to push 2 390s at 1080P. 1440 minimum.


----------



## Gdourado

Quote:


> Originally Posted by *rdr09*
> 
> 390s will beat even a 1080. But, for 1080P, go wait for the 480 or 1060. Will be hard to push 2 390s at 1080P. 1440 minimum.


Why will they be hard to push at 1080p?
Also, 390 crossfire is that much different from 390x crossfire?


----------



## rdr09

Quote:


> Originally Posted by *Gdourado*
> 
> Why will they be hard to push at 1080p?
> Also, 390 crossfire is that much different from 390x crossfire?


Too much GPU power for the resolution. The load will fall on the CPU a lot, and coupled with unoptimized games... you'll be here on OCN complaining. lol

1080

http://www.3dmark.com/fs/8613612

2 290s

http://www.3dmark.com/3dm/4644611?

compare graphics scores. that's 14 driver.

EDIT: Gundermann in post #252 explained it best . . .

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/250


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> 390 Xfire is going to destroy a single 1070 as long as the games you play support Xfire. By a large margin 30-70% more (depending on title).... And especially if you are playing with DX12.
> 
> *One 390X is comparable to a 980ti on some DX12 titles, just for reference.*


3 cruising 390Xs = 2 980TIs at their peaks.

http://www.3dmark.com/compare/fs/7659119/fs/8064348#


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> 3 cruising 390Xs = 2 980TIs at their peaks.
> 
> http://www.3dmark.com/compare/fs/7659119/fs/8064348#


I'm not sure if you are agreeing or correcting me? I did say DX12 though. I wasn't talking about benching or DX11. And this is all very title-specific as well. Just in case it was a correction. Lol, no worries either way though. You bench more than I do, so I know you are more up on it than I am.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> I'm not sure if you are agreeing or correcting me? I did say DX12 though. I wasn't talking about benching or DX11. And this is all very title-specific as well. Just in case it was a correction. Lol, no worries either way though. You bench more than I do, so I know you are more up on it than I am.


DX11 already shows 3 290Xs beating a couple of 980TIs. Imagine DX12, where AMD's compute power can come into play and be utilized to its potential.









No NVIDIA card can beat 2 290Xs at the moment, be it DX11 or DX12, to put it short.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> DX11 already shows 3 290Xs beating a couple of 980TIs. Imagine DX12, where AMD's compute power can come into play and be utilized to its potential.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No NVIDIA card can beat 2 290Xs at the moment, be it DX11 or DX12, to put it short.


Yes. Agreed.

But even on DX12, one 390X is doing as well as the 980 Ti in some benchmarks. At 4K, that is, anyway.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Yes. Agreed.
> 
> But even on DX12 one 390x is doing as well as the 980ti on some benchmarks. On 4k that is anyway.


No denying these cards, tbh.

And yep, I have little faith in reviewers; they're paid by the honchos.

OCN end users are better sources.


----------



## artyle81704

Hello all!

Wish I had found this page under better circumstances but I am hoping you guys can help me turn it around.

I can't keep my MSI 390X from throttling. It quickly jumps from the low 40s to 94C and throttles within 10 minutes of gaming. I know it's an airflow issue, because it drops to 80C when I remove the side of the case, but I can't figure it out.

I have an inwin 703 with:

- two 120mm intakes on the front, a stock case bottom fan (obstructed by the immobile HDD cage), and a Noctua NF-F12 on top
- InWin stock exhaust fan remounted as bottom intake (not shown in Pic)
- One Thermaltake Riing 12 exhaust
- CPU is cooled by a Noctua NH-D14 pushing toward the exhaust fan

I have already reapplied thermal paste with Noctua NT-H1.

The CPU is great and stays at 30-40C; however, the GPU just dumps heat toward the case wall (I can feel a wash of hot air coming from the space between the fans when I hold my hand up to it, and for some reason the unobstructed NF-F12 isn't pushing it out).

I am considering cutting a hole in the case wall and mounting a 120mm intake, and if that doesn't work just watercooling the whole system, but I'd rather not spend that kind of money at the moment.

I have tried mounting it in the other PCI slot to no effect.

If applicable I live in a high desert, 5000ft above sea level in Arizona.

Sorry for the disjointed post, just trying to answer all the anticipated questions.

Pic


http://imgur.com/Qb1axKI

Also, the GPU power cord has since been tidied up.


----------



## mus1mus

Insert the card into the first PCIe slot; that should give the card the breathing room it needs, plus it can benefit from your intake fan in the middle of the case.

Set a higher-RPM fan profile on the GPU, as long as you can live with the noise that comes with it.

Use higher static pressure fans (high static pressure for case fans -- am I nuts?). That can ensure you get enough airflow, considering it's a small, not very well-ventilated case.

Visit the Hawaii BIOS editing thread; you might pick up some tips on how to undervolt the card. Less voltage means less heat, at the expense of a little performance.

Positive air pressure is good, but you might need these optimizations to fully exploit the benefits of good case airflow. The rear fan needs to spin a bit faster to pull the hot air out of the case.


----------



## Gdourado

Is it worth the premium for 2 390X crossfire against 2 390?


----------



## HyeVltg3

Quote:


> Originally Posted by *Gdourado*
> 
> Is it worth the premium for 2 390X crossfire against 2 390?


Imho, no.
Two 390s only need a 1050W PSU at minimum. (I have them running on a Corsair AX850 and all is fine, except I don't want to run any GPU stress tests; I'm pretty sure a stress test would pull well over 900W and shut my PC down. But 390 CF on an 850W works fine for gaming: Witcher 3, Crysis 3, Black Desert, all fine, and I can play for hours.)

Two 390Xs, on the other hand, would most definitely need 1200W at the least.

In gaming scenarios, most games only gain 5-10 FPS going from a 390 to a 390X.

If you think the bigger PSU and the price of those 5-10 FPS are worth the almost $100 extra, then sure, 390X CF is fine.


----------



## Gdourado

My PSU is an XFX 850W Black Edition.
80 Plus Gold, tier 1.
Would it handle 390 crossfire?


----------



## bluej511

Quote:


> Originally Posted by *Gdourado*
> 
> My psu is a xfx 850w black edition.
> 80 gold tier 1.
> Would it handle 390 crossfire?


It would, but it wouldn't be very efficient, and if you OC a bit with added voltage you might get very close to the max. Just wait for Polaris and get two of those, haha.


----------



## tolis626

Quote:


> Originally Posted by *HyeVltg3*
> 
> Imho, No.
> 2 390s only need a 1050 PSU (Minimum) (I have them running on a Corsair AX850, all fine, except I dont want to test any GPU stresstests, pretty sure the test would sap way more than 900w causing my PC to shutdown, but 390CF on a 850W works fine gaming Witcher 3, Crysis 3, Black Desert, all fine, can play hours)
> 
> whereas 2 390Xs would most definitely need 1200 at the least.
> 
> in gaming scenarios, most games only gain 5-10fps going 390 to 390x.
> 
> if you think the PSU and price of 5-10fps is worth the almost $100 extra, than sure, 390X CF is fine.


Ermmm... A 390 and a 390X should consume more or less the same amount of power; I'd guess exactly the same, as long as they use the same clocks and voltages. Certainly not a 100W+ difference, no way. My heavily overclocked 390X, coupled with a 4790K at 4.7GHz at 1.29V, consumes about 400-450W when playing BF4. So I could easily fit another 390X in there with a quality 850W PSU. Sure, heavy overclocking may be pushing it too much, but at stock voltage, or even a slight overclock/overvolt, you'll probably be fine. Unless you're also running a heavily overclocked 8350 or something.

Now, is the 390X worth the premium over the 390... Depends. If you want the last bit of performance you can get and the price difference isn't too steep, then yes. I'd say anything over $70-80 per card is too much for what you get, though. The 390 is within 5-10% of the 390X anyway, so the difference between them is never the difference between playable and unplayable.


----------



## flopper

Quote:


> Originally Posted by *Gdourado*
> 
> My psu is a xfx 850w black edition.
> 80 gold tier 1.
> Would it handle 390 crossfire?


Not really.
Wait for the 480.


----------



## HyeVltg3

Quote:


> Originally Posted by *tolis626*
> 
> Ermmm... A 390 and a 390x should consume more or less the same amount of power. I'd guess exactly the same, as long as they use the same clocks and voltages, too. Certainly not a 100W+ difference though, no way. My 390x heavily overclocked coupled with a 4790k at 4.7GHz at 1.29V consume about 400-450W when playing BF4. So I could easily fit another 390x in there with a quality 850W PSU. Sure, heavy overclocking may be pushing it too much, but for stock voltage or even a slight overclock/overvolt, you'll probably be fine. Unless you're also running a heavily overclocked 8350 or something.
> 
> Now, is the 390x worth the premium over the 390... Depends. If you want the last bit of performance you can get and the price difference isn't too steep, then yes. I'd say anything over 70-80$ per card is too much for what you get though. The 390 is within 5-10% of the 390x anyway, so the difference between them is never the difference between playble and unplayable.


I believe tolis' insight on this is more accurate. I don't have 390 CF anymore (sold it to save up for a 490 =D), but my single stock Gigabyte G1 Gaming at 1025MHz ran my system (in sig) at about 432-469W playing games and FurMark (separately), so adding another 390 probably only bumped me up by another 200-350W (I didn't have a wall outlet meter when I had CF).
I don't have hard results for the 390X like I do for the 390.

But yes, as the others are recommending, either wait for the 480, or the "490", or whatever card comes after the 480.


----------



## bluej511

Quote:


> Originally Posted by *HyeVltg3*
> 
> I believe tolis' insight on this is more accurate. I dont have 390 CF anymore (sold it to save up for 490 =D) but my single Stock Gigabyte G1 Gaming 1025mhz run my system(in sig) at about 432-469w playing games and furmark (seperate) so adding another 390 (didnt have wall outlet meter when I had CF) probably only bumped me up to another 200-350w)
> I dont have hard results for the 390x like I do 390.
> 
> but yes, like the above are recommending it, either wait for 480 or "490" or whatever card comes after the 480.


This is true, but a fully loaded PSU is less efficient than one that isn't. On top of that, you'll get more heat, since a less efficient PSU wastes more wattage as heat.

So for CF RX 480s, an 850W PSU would be ideal, depending on the GPU and CPU OCs.


----------



## DarthBaggins

I've debated on going with another 390x but don't want to part with my CoolerMaster v850 lol


----------



## diggiddi

Quote:


> Originally Posted by *DarthBaggins*
> 
> I've debated on going with another 390x but don't want to part with my CoolerMaster v850 lol


Ah, you'll be fine with that V850. I was able to run 290X CF on an Antec 750W HCG; just don't go crazy overclocking the core and it should handle stock clocks alright.


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> I've debated on going with another 390x but don't want to part with my CoolerMaster v850 lol


Better safe than sorry, and I'd rather be safe and efficient, lol. Plus, at this point it's totally not worth it. The RX 480 is supposed to be slightly faster than a GTX 980, so it will definitely be faster than an R9 390/390X; I might just get one for the hell of it and watercool it. It seems AMD might only optimize CF RX 480s for VR games, though.


----------



## Streetdragon

I could get an R9 390 Nitro for 240€... maybe I'll sell my Vapor-X and get this card for a net cost of around 50€ or so, hmm...

(Off topic) A 240 radiator plus an H240-X should handle two Nitros and a 5820K without problems, right? OC'd to hell, for sure ^^


----------



## bluej511

Quote:


> Originally Posted by *Streetdragon*
> 
> could get a r9 390 nitro for 240€... maybe i sell my vapor and get this card for around 50€ or so hmm....
> 
> "offtopic" 240 radiator + h240-x should pull 2 nitro and 5820k without problems or? oc to hell for sure^^


I've got a 360 AND a 240 rad for a single 390 and the temps are AWESOME. Two 240s will give you decent temps at best, especially with a 5820K.


----------



## tbob22

Has anyone noticed any performance difference with newer drivers? I reverted to 16.2.1, as that's the last version where VSR works with my projector (_a known issue over HDMI_).

Supposedly newer drivers have "performance improvements", but I haven't really noticed any difference, other than significantly less frame drops with 16.2.1 in certain older games _(Trackmania 2 comes to mind)._

Here's an example in Mirrors Edge Catalyst:


*16.6.1*


*16.2.1*

Game Settings:
1920x1200
Fullscreen
FOV: 75
Detail: Hyper
Memory Restriction Disabled
Scaling: 1.00

System Settings:
Power Efficiency: OFF _(only applicable to 16.3 and up)_
Power: +30%
CPU Forced to 4.7ghz
DDU between driver versions


----------



## Irev

Quote:


> Originally Posted by *Gdourado*
> 
> My psu is a xfx 850w black edition.
> 80 gold tier 1.
> Would it handle 390 crossfire?


Yes.

I have an EVGA SuperNOVA 850W... it handles my R9 390X PowerColor PCS+ crossfire setup fine.

Tested with a power meter running Fire Strike 1.1, the maximum system power draw was 692W.

Also, guys, I have a question regarding crossfire temps...

My top card hits 86C at 100% fan speed during FurMark; the bottom card hits 72C.

I have a game, Assetto Corsa, that also makes the top card hit 86C... in a lot of other games, like Metal Gear Solid V: The Phantom Pain, it hits around 81-82C.

Is 86C considered safe, though?


----------



## bluej511

Quote:


> Originally Posted by *Irev*
> 
> Yes.
> 
> I have a EVGA supernova 850W..... handles my R9 390x powercolor PCS+ crossfire setup fine...
> 
> tested with a power meter running Firestrike 1.1 and the maximum system power draw was 692watts


Is that with any OC on the GPUs? I do believe at +50mV (a very mild OC) they pull close to 300W each; add another 150-200W for an overclocked CPU and you're getting up there in wattage, haha.

Can't wait to see what the RX 480/490 do with power consumption. I might switch over just for that.


----------



## Gdourado

Quote:


> Originally Posted by *Irev*
> 
> Yes.
> 
> I have a EVGA supernova 850W..... handles my R9 390x powercolor PCS+ crossfire setup fine...
> 
> tested with a power meter running Firestrike 1.1 and the maximum system power draw was 692watts
> 
> Also guys I have a Q regarding Crossfire temps...
> 
> my top card hits 86 degrees @ 100% fan speed during furmark... the bottom card hits 72 degrees......
> 
> I have a game assetto corsa that makes the top card also hit 86..... alot of other games like metal gear phantom pain it hits like 81-82....
> 
> However is 86 degrees considered safe?


That's interesting.
What is your CPU and is it overclocked?


----------



## bluej511

Quote:


> Originally Posted by *Gdourado*
> 
> That's interesting.
> What is your CPU and is it overclocked?


Check his sig lol. Guessing with that wattage the GPUs aren't overclocked though.


----------



## Irev

Quote:


> Originally Posted by *bluej511*
> 
> Check his sig lol. Guessing with that wattage the GPUs aren't overclocked though.


My test was run with the cards stock, 1060MHz on the core and the CPU running at 4.4GHz.

With a little overclocking, I estimate the power draw would go up to around 750W... that still leaves 100W of headroom on the PSU, which is fine.

P.S. Anyone have an answer to my temp question?
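For anyone sanity-checking this kind of headroom math, here's a rough sketch. Every per-component wattage below is an assumption based on figures quoted in this thread, not a measurement; adjust for your own hardware.

```python
# Rough PSU headroom estimate for a 390/390X crossfire build.
# All per-component figures are assumptions based on numbers quoted
# in this thread, not measured values -- adjust for your own hardware.

def psu_headroom(psu_watts, component_watts, margin=0.20):
    """Return (total draw, spare watts, within-budget?) keeping a safety margin."""
    total = sum(component_watts.values())
    budget = psu_watts * (1 - margin)   # e.g. stay under 80% load
    return total, psu_watts - total, total <= budget

build = {
    "gpu_1 (390X, stock)": 275,
    "gpu_2 (390X, stock)": 275,
    "cpu (i5 @ 4.4GHz)": 100,
    "board/ram/drives/fans": 75,
}

total, spare, ok = psu_headroom(850, build)
print(f"estimated draw: {total}W, spare: {spare}W, within 80% load: {ok}")
```

With those assumed numbers an 850W unit technically covers the draw, but it lands above the 80% load point, which is the "marginal" territory people in this thread keep warning about.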


----------



## flopper

Quote:


> Originally Posted by *Irev*
> 
> Yes.
> 
> I have a EVGA supernova 850W..... handles my R9 390x powercolor PCS+ crossfire setup fine...
> 
> tested with a power meter running Firestrike 1.1 and the maximum system power draw was 692watts


I wouldn't push a PSU to 700W when it's rated at 850W.
390 crossfire is a 1000W+ PSU solution.
The 480 will obviously change that with its new low wattage, but 390 crossfire on an 850W seems marginal and risky at best.

With hardware, two things come to mind: use a quality PSU rated well above what you need, and use a solid motherboard.
You don't want to push power usage to its strained limits.
That's when bad things can happen.

----------



## bluej511

Quote:


> Originally Posted by *Irev*
> 
> My test was run with cards stock, 1060mhz on core and CPU running 4.4ghz
> 
> with a little overclocking I estimate the power draw would go up around 750w.... that still leaves 100W spare on the psu which is fine.
> 
> p.s anyone have answer regarding my temp question?


Didn't see the edit. My GPU at +100mV pulls around 330W or so; if I had two of them OC'd, plus my CPU OC, I'd probably be closer to 800W or so, and that's not including the water pumps, RAM, HDDs and all that.

It wouldn't be very efficient either, and there'd be a fair bit of wasted wattage/heat.

As far as temps, there's not much you can do in crossfire; that top card is just sucking up all the hot air from the second GPU. Open the side panel and see how much cooler both cards run, then you'll know.


----------



## Irev

Quote:


> Originally Posted by *bluej511*
> 
> Didn't see the edit. My gpu at 100mv pulls around 330w or so, if i had 2 and OCed and my cpu OC id probably be closer to 800w or so, thats not including the water pumps, ram, hdd and all that.
> 
> Wouldnt be very efficient either and have a bit of wasted wattage/heat.
> 
> As far as temps, not much you can do in a crossfire. That top card is just sucking up all the hot air from the second gpu. Open the side panel and see how much cooler both cards run at then youll know.


If I open the side panel, temps drop to 83C instead of 86C... so not a major change there.

One idea I had was using a PCIe riser cable to move the second GPU lower down in the case, providing a much larger gap between the cards.

Is this worth doing?


----------



## bluej511

Quote:


> Originally Posted by *Irev*
> 
> if I open the side panel temps drop to 83 instead of 86..... so not a major change here,
> 
> An idea that I thought of was using a PCIE riser cable to move the 2nd gpu lower down in the case providing a much larger gap between the cards.
> 
> Is this worth doing?


Could be, yeah, but as others have said in this thread before, crossfire is always going to run hot no matter what. I'd be more worried about VRM temps than core temps at those levels, though.


----------



## tolis626

Man, I've been trying to push my card to the limit and I had nearly forgotten how nice it is to have it run cool and quiet. I just played a bit of BF4 at 1140/1625MHz with only 25mV of AUX voltage added (Yeah, my card can do 1140MHz at stock volts) and temps never exceeded 65C on both the core and VRMs. And it was quiet. It was so damn quiet I nearly started questioning whether the extra performance when overclocking is worth the noise. Then I realized I mostly play with headphones on, so I don't care.









PS : I was getting some pretty mediocre to bad temperatures after my last attempt at messing with my card and the thermal pad I put on the chokes was to blame, of all things. I just put a 1mm pad on there just 'cause, but it seems that that led to the cooler not making proper contact with the core, so I started seeing temps in the low 80s up from the low-mid 70s on the same clocks. At first I thought it was the ambient temp rising that caused it, but even with the AC on it was running hotter than before. So I took it apart, got a 0.5mm pad for the chokes and lo and behold, temps are back down. I just have some doubts about whether the cooler is now making proper contact with the chokes, but at this point I couldn't care less. I don't think chokes even NEED that much cooling to begin with and would be good with just the airflow of the card. Point is, I'm not taking my card apart again unless there's pretty good reason to do so. The only reasons I can think of is if I get my hands on some Kryonaut or if I decide to go water. Both are highly unlikely at this point, so there's that.


----------



## jdorje

If I run my 390 at 1125mV (a -100mV offset), I draw around 400-425W at the wall. Crossfire would certainly be possible on a 650W PSU.

At 1325mV it's about 525W at the wall; 850W would be enough.
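The big swing between those two wall figures lines up with the usual rule of thumb that dynamic power scales roughly with frequency times voltage squared. A quick sketch of that rule (the 275W-at-1225mV reference point is an assumption for illustration, not a measured value, and leakage is ignored, so treat the output as ballpark only):

```python
# Back-of-the-envelope GPU power scaling with voltage.
# Dynamic power scales roughly with frequency * voltage^2; this ignores
# leakage and board losses, so the numbers are a rough estimate only.

def scale_power(p_ref, v_ref, v_new, f_ref=1.0, f_new=1.0):
    """Estimate power at a new voltage/clock from a reference point."""
    return p_ref * (f_new / f_ref) * (v_new / v_ref) ** 2

# Assumed reference: a 390 drawing ~275W of GPU power at 1225mV stock.
p_under = scale_power(275, 1.225, 1.125)  # undervolted by 100mV
p_over  = scale_power(275, 1.225, 1.325)  # overvolted by 100mV
print(f"~{p_under:.0f}W at 1125mV, ~{p_over:.0f}W at 1325mV")
```

Under those assumptions the same card swings by roughly 90W between the undervolted and overvolted settings, which is in the same ballpark as the ~100W wall-draw difference reported above.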


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> If i run my 390 at 1125 mV (-100), i use around 400-425w wall draw. Crossfire would certainly be possible on a 650w psu.
> 
> At 1325 mV it's about 525w at the wall. 850w would be enough.


Well, on a 650W we've already said there's no way; you're only drawing that because of the -100mV undervolt. Also, you can't look at the average draw; you have to look at the peak, so the PSU doesn't trip its protection and shut off. It also depends on the CPU OC and whatnot.

As a rough guess from reviews, a 290/390/X maxes out at about 300W or so peak. Two of those alone is 600W.


----------



## m70b1jr

Hey guys, I'm selling my PC and just wanted to advertise it in this thread (hopefully I can). I'm getting older and need money for a vehicle. Anyway, here are the specs.

Case - NZXT S340
Motherboard - Asus M5A99FX Evo R2.0
CPU - FX-8350
CPU Cooler - H80i GT
GPU - XFX R9 390
GPU Cooler - Arctic Accelero Hybrid III 140mm
8GB AMD R5 RAM
128GB PNY Optima SSD
1TB Hard Drive
500 GB Hard Drive
750W Corsair CXM (I think it's a CXM) power supply
140mm Noctua fan
Custom LED Strips in the PC

Razer Deathadder Keyboard
1080p IPS Acer Display (Forgot monitor name)
And Snowball Microphone, Also willing to sell my steam account.. if anyone is interested in this build, PM me. Photos of the PC are on my instagram @corbinxtitus ...


----------



## jdorje

Quote:


> Originally Posted by *bluej511*
> 
> Well on a 650w weve already said theres no way, youre only drawing that because of -100mv. Also you cant look at avg you have to look at peak so the psu doesnt overvolt and shut off. Also depends on cpu oc and what not.
> 
> Rough guess and from reviews a 290/390/x maxes out at about 300w or so peak. Two of those alone is 600w.


Of course it's only possible because of the undervolt; that's what I said.

I believe that test was with x264 and Valley running at the same time, so about 125W on my 4690K. That's higher than any gaming power draw would be. I may test it again.

The 390X uses moderately more power than the 390, simply from having 10% more cores and likely higher stock clocks and voltage.

Note I'm not actually saying I plan to do that. I have a 1050W PSU (too cheap to pass up) and may add a second 390 this fall if the price is right.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Of course it's only possible because of the undervolt; that's what i said.
> 
> I believe that test was with x264 and valley running at the same time, so about 125W ony 4690k. It's higher than any gaming power draw would be. I may test it again.
> 
> 390x is moderately higher power use than the 390, simply from having 10% more cores and a likely higher stock clock and voltage.
> 
> Note I'm not actually saying i plan to do that. I have a 1050w psu (too cheap to pass up) and may add a second 390 this fall if the price is right.


I have a 1000W and I'm right at peak efficiency, so I'm OK with that. If I get an RX 480, or an RX 490, or even a Vega if it's efficient, I'll go down to an 850W, depending on their consumption.

Problem is, I hate seeing people run their PSU at 80-90% load: one, it's a bit less efficient, and two, you get more voltage ripple the higher the draw. Higher ripple will end up giving you a less stable OC, especially on a CPU, which is very picky about it.
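To put rough numbers on the efficiency side of that argument, here's a quick sketch; the efficiency figures are assumed, typical-looking 80 Plus Gold values, not taken from any specific unit's datasheet.

```python
# Why PSU load point matters: the same DC load costs different amounts
# at the wall depending on efficiency. Efficiency figures below are
# assumed, typical-looking 80 Plus Gold values, not from a datasheet.

def wall_draw(dc_watts, efficiency):
    """Return (AC wall draw, waste heat) for a given DC load and efficiency."""
    ac = dc_watts / efficiency
    return ac, ac - dc_watts

# Same 600W DC load: a 1000W unit near its sweet spot (~60% load)
# versus a smaller unit being hammered near full load.
ac_mid, heat_mid = wall_draw(600, 0.92)   # near the efficiency sweet spot
ac_high, heat_high = wall_draw(600, 0.87) # efficiency sags near full load
print(f"{ac_mid:.0f}W vs {ac_high:.0f}W at the wall "
      f"({heat_mid:.0f}W vs {heat_high:.0f}W wasted as heat)")
```

Under those assumed efficiencies the heavily loaded unit pulls about 38W more from the wall and dumps nearly twice the waste heat into the PSU itself, which is the "less efficient and hotter" point being made above. (Ripple is a separate issue the math here doesn't capture.)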


----------



## DarthBaggins

Gotta love protection plans, should have a new 1000w psu this evening.


----------



## Irev

So I was thinking of getting rid of the 390X and replacing it with an RX 480, but from a performance view it probably isn't worth it. I might wait for the RX 490 or a Fury X 2?

Just don't want to muck around selling and losing money unless it's a decent performance upgrade.


----------



## flopper

Quote:


> Originally Posted by *Irev*
> 
> so was thinking of getting ridd of the 390x and replace with rx480 but from a performance view it probably isn't worth it. I might wait for RX490 or Furyx 2?
> 
> Just don't want to muck around selling and loosing money unless its a decent performance upgrade


Vega makes more sense for you, yes.
You're not really the target group for the 480.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> Vega make more sense for you yes.
> your not really the targetgroup for the 480.


For 1080p the RX 480 is perfect from what I've been seeing so far; if it OCs well, even better. Vega is more than likely going to target 4K 60 FPS (hoping), and the RX 490, if they come out with one, should be geared toward 1440p.


----------



## Irev

btw here is a photo of my setup feel free to add me to the club,

Also, notice how sandwiched the cards are together? The top card tops out at 86C at 100% fan speed during gaming (depending on the game; some only work it up to 82C).

I'm going to use a PCIe riser cable to shift the bottom card down in the case by about 5-10cm; that should fix the heat problem.

I had a 980 Ti and swapped it out for two of these 390Xs in CF, mostly because I now have a FreeSync monitor.


----------



## mus1mus

Quote:


> Originally Posted by *Irev*
> 
> 
> 
> btw here is a photo of my setup feel free to add me to the club,
> 
> also notice how sandwiched the cards are together? top card topping out @ 86 degrees 100% fan speed during gaming ( depending on game some games only work it up to 82 degrees )
> 
> Gonna use a PCIE riser cable to shift the bottom card downwards in the case by about 5-10cm. should fix the heat problem
> 
> I had a 980Ti and swapped it out for 2 of these 390x in CF.... mostly due to the fact I now have a Free Sync monitor.


Nice looking cards in there.









The PCIe spacing seems weird. But it may just be the cards.









A 980 Ti has nothing on two of these.


----------



## tolis626

Quote:


> Originally Posted by *Irev*
> 
> 
> 
> btw here is a photo of my setup feel free to add me to the club,
> 
> also notice how sandwiched the cards are together? top card topping out @ 86 degrees 100% fan speed during gaming ( depending on game some games only work it up to 82 degrees )
> 
> Gonna use a PCIE riser cable to shift the bottom card downwards in the case by about 5-10cm. should fix the heat problem
> 
> I had a 980Ti and swapped it out for 2 of these 390x in CF.... mostly due to the fact I now have a Free Sync monitor.


As a temporary solution, I would just ghetto-rig a high static pressure 120mm fan so that it forces air into the space between the cards. I think you could even get away with using zip-ties to strap it to the cards themselves. That, and mind your case airflow, as you may need to ramp up the case fans manually too (that's worth a lot of degrees, so don't ignore it).

Also keep in mind that two cards that consume 250W-ish each are going to run hot. Couple that with an air cooler for the CPU (That's an NH-D15, right?) and your case fans have to move north of 600W of heat out of the case. That's no easy task to ask of them.


----------



## Irev

Quote:


> Originally Posted by *mus1mus*
> 
> Nice looking cards in there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The PCIe spacing seems weird. But it may just be the cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A 980TI has nothing for 2 of these.


The PCIe spacing is horrible; I can't believe no board manufacturers have implemented a solution to this problem. If I run the bottom card in the lowest slot, I think it drops to x4 speeds, so I'll have to use a PCIe riser cable. The top card would be fine if it didn't suck in all the hot air from the bottom card; granted, the air cooling on these PowerColor cards is amazing (but noisy).


----------



## mus1mus

Quote:


> Originally Posted by *Irev*
> 
> The pcie spacing is horrible, I cant believe that no board manufacturers have implemented a solution to this problem.... If I run bottom card in lowest slot I think it drops to x4 speeds...... I'll have to use a pcie riser cable.... The top card would be fine if it didnt suck all hot air in from bottom card, granted the air cooling on these power color cards is amazing (but noisey)


What board is that? Z170 or Z97?

Mainstream Intel boards are terrible for X-Fire/SLI unless you're on water. The Extreme series is different: you'll have ample spacing thanks to native support for quad cards.


----------



## Irev

Quote:


> Originally Posted by *mus1mus*
> 
> What board is that? Z107 or Z97?
> 
> Mainstream Intels are terribad with X-Fire/SLI unless on Water. Xtreme series are different. You will have ample of spacing due to the native support for Quad cards.


A Z97X Killer Fatal1ty by ASRock... My only real option to eliminate the heat issue is to have decent spacing between the cards, apart from watercooling of course. If the 6cm of extra space doesn't fix it, I'll need a bigger case.


----------



## mus1mus

Quote:


> Originally Posted by *Irev*
> 
> Z97x killer fatality by asrock.... My only real option to eliminate heat issue is to.have decent spacing.between cards... Apart from water cooling of course... If the 6cm of extra space doesn't fix it I'll need a.bigger.case


Water will cost you a lot, and so will a mobo with quad support.

As I said above, a fan on top of the cards will probably be your best bet for now: a high static pressure, high-speed, high-airflow fan. A vented side panel would also be a great help.


----------



## fantasticdave

I am giving serious thought to getting a second one of these card in crossfire.

I have this motherboard - https://www.asus.com/uk/Motherboards/A88XGAMER/, which looks like it has a larger gap between the cards than the one above.

Irev - do you see much improvement in performance over just the one card?


----------



## rdr09

Quote:


> Originally Posted by *fantasticdave*
> 
> I am giving serious thought to getting a second one of these card in crossfire.
> 
> I have this motherboard - https://www.asus.com/uk/Motherboards/A88XGAMER/, which looks like it has a larger gap between the cars that the one above.
> 
> Irev - do you see much improvement in performance over just the one card?


Isn't one too much for the cpu already?


----------



## bluej511

Quote:


> Originally Posted by *rdr09*
> 
> Isn't one too much for the cpu already?


Depends on the CPU. Personally, I wouldn't crossfire with anything less than an i7. I'd love to see how the new RX cards handle i5s or even i3s.


----------



## fantasticdave

I have this -

AMD APU A10 7870K Black Edition Quad Core CPU Processor

Do you reckon this would cope?


----------



## rdr09

Quote:


> Originally Posted by *fantasticdave*
> 
> I have this -
> 
> AMD APU A10 7870K Black Edition Quad Core CPU Processor
> 
> Do you reckon this would cope?


I concur with blu; that CPU is not going to cope. If you run Fire Strike and post your link, I can easily show you what I'm talking about.

You can test it yourself when you play your games by using Afterburner to check the various usages. See if your current card is even hitting 90-100% usage and not any lower.


----------



## fantasticdave

Quote:


> Originally Posted by *rdr09*
> 
> I concur with blu. that cpu is not going to cope. if you run Firestrike and post your link I can easily tell you what i am talking about.
> 
> You can test yourself when you play your games by using Afterburner to check the various usages. See if your current card is even hitting 90 - 100% usage and not any lower.


I am at work at the moment, but I will.

I know that, looking at Afterburner, my card hits 100% usage.


----------



## mus1mus

http://www.anandtech.com/show/9307/the-kaveri-refresh-godavari-review-testing-amds-a10-7870k/7

Just a slight hit from an I3-4XXx on some titles when paired with a 290X.

Overclocking it may improve things.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.anandtech.com/show/9307/the-kaveri-refresh-godavari-review-testing-amds-a10-7870k/7
> 
> Just a slight hit from an I3-4XXx on some titles when paired with a 290X.
> 
> Overclocking it may improve things.


Some games are easy; some are quite the opposite. Even a locked i5 will not be ideal with crossfired 390s. Very seldom do you see reviewers include the minimum FPS; some scenes get such high FPS that the average becomes unreliable. Check out this BF4 MP run with a single 290 and an i7 at 4.5GHz with HT off . . .

See how the minimum goes below 60. Maybe a higher OC will work. Here is an i5 with two 980s . . .

http://www.overclock.net/t/1587616/i5-4690k-100-usage-gaming-temps-fine-windows-10


----------



## mus1mus

Awwe! Totally lost the XFire question.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Awwe! Totally lost the XFire question.


How do we take away your rep for not reading lol?


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> How do we take away your rep for not reading lol?


You hit the report button.


----------



## Agent Smith1984

This little guy is a game changer:
http://wccftech.com/amd-rx-480-faster-than-nano-980/

AMD just kicked the whole industry directly in the testicles.


----------



## mus1mus

Hmmm.. I'm selling my cards!


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Hmmm.. I'm selling my cards!


GL getting one in the first few weeks . . .



What we should consider buying, too, is AMD stock. lol


----------



## THUMPer1

Quote:


> Originally Posted by *rdr09*
> 
> GL getting one in the first few weeks . . .


AMD/AIB's will have more stock. I'm sure there won't be much of an issue.


----------



## rdr09

Quote:


> Originally Posted by *THUMPer1*
> 
> AMD/AIB's will have more stock. I'm sure there won't be much of an issue.


True. The reference cards, though, are almost always the ones built like tanks, meant for watercooling and OC'ing. The driver team should step it up!


----------



## Agent Smith1984

There are reports that the thing will overclock to around 1400MHz from 1266MHz. Not crazy, but still good... at that speed it is faster than a Fury X, according to leaked benchmarks.

This card at $230 is an astounding value, especially considering it runs at around 100W and 60C under load. Clearly, two of these in CF is hands-down value redefined.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> GL getting one in the first few weeks . . .
> 
> 
> 
> What we need to consider buying, too, are AMD stocks. lol


Well, it won't be available for another month or so after release. So yeah. That's what you get for living in a small market country.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> There are reports that the thing will overclock to around 1400Mhz from the 1266. Not crazy but still good.... at that speed it is faster than a Fury X accoring to leaked benchmarks.
> 
> This card at $230 is an astounding value, especially considering it runs at 100w and 60c under load. Clearly two of these in CF is hands down value redefined.


The mere fact that they are highly efficient is enough for me to let go of my Hawaiis. I'm toying with the idea of 390Xs, but it seems these cards can trump Grenada.

AMD will be in for a beautiful ride this time. Not in performance, but sales. And that will give them the leverage to venture into the high end next release date. More sales - more funds!


----------



## bluej511

I'd still take it with a grain of salt; it's just a synthetic benchmark. The R9 390X beats the GTX 980 in benchmarks but definitely not in gaming, so I'll still wait for reviews.

I do like the fact that with an EKWB block it more than likely will come with a new IO plate, meaning it will be a single-slot watercooled card. Bye bye DVI, no more need to cut it or trim it.

If it gets 10fps or more per game than the R9 390 I have, I will sell it (or keep it as a spare) and pick up a 480 and an EKWB block for it. If not, then I'll stick with this. It means that with my dual rads I'll get even cooler temps, and also that you could theoretically cool a CPU/GPU with good temps on a single 360 or 480mm rad.


----------



## mus1mus

Definitely not in stock configs. The trouble with 980s is OC scaling. They boost to around 1300 by default. And another 300MHz doesn't equate to their clock percentage advantage.

Also worth noting, Vega will likely have twice the specs of the RX 480. Good times coming.


----------



## Agent Smith1984

The thing is, I am always one to get the second tier of performance at a good value. The highest card I have ever bought was a Fury (non-X) and I was actually quite disappointed in what I got for $500.....

This card brings that level of performance, or within 5% of it, at $200 (or $230 for 8GB) which changes everything for someone like me because I try to get the best I can for the least amount of money, versus spending whatever amount necessary for "x" level of performance.

The 390's were great for this reason, and second-hand 290's were even better, but this 480 literally changes everything we know (or knew). It's more important even than the $380 GTX 1070 (which is a stellar value all its own).

I can go with a high-power crossfire setup for $460, only use 200W of power, and temps are easily controllable. What??

I sold my 390X just in time to get a decent return on it ($300 shipped).


----------



## spyshagg

Quote:


> Originally Posted by *bluej511*
> 
> Id still take it with a grain of salt its just a synthetic benchmark. the r9 390x beats the gtx 980 in benchmarks but def not in gaming so ill still wait for reviews.


Not in gaming? they traded blows this entire year and then some.


----------



## Agent Smith1984

Quote:


> Originally Posted by *spyshagg*
> 
> Not in gaming? they traded blows this entire year and then some.


According to sources, the gaming benchmarks look even better for the 480 than the synthetics suggest! Wuuhh? And yes, stock 390X vs 980 shows blows traded in almost every situation. Very equal competition. It's in synthetic benchmarks that the 980 walks over the 390's because of the (insert whatever green-hating conspiracy jargon you would like)


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> Id still take it with a grain of salt its just a synthetic benchmark. the r9 390x beats the gtx 980 in benchmarks but def not in gaming so ill still wait for reviews.


From my experience, it's exactly the other way around. The performance difference is quite small in most games (bar some gameworks gimped titles) and is stupidly exaggerated in benchmarks. The 980 scores what? 1000 points higher on average in Firestrike? I don't think the difference is ever this big in a game.

Anyways, seems AMD vindicated me for telling everyone to hold off on buying a GPU for the past 2-3 months. I'm not sidegrading to Polaris (because that's what it is for us 390x owners, even factoring in the lower power consumption), but Vega will most probably be very, VERY interesting. Here's to hoping AMD will finally get their **** together for once and also market it right.


----------



## bluej511

Quote:


> Originally Posted by *spyshagg*
> 
> Not in gaming? they traded blows this entire year and then some.


The 390 and even the X version are usually a few fps under the 980, and that's a stock 980; the OCed ones take off like nobody's business. And it's a shame, since the 390 is a more powerful card (on paper anyway, haha). The fact that a less powerful card can beat it should tell you something.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> According to sources the gaming benchmarks show better for the 480 than the synthetics even suggest! Wuuhh? And yes stock 390x vs 980 shows blows traded in almost every situation. Very equal competition. It's in synthetic benchmarks that the 980 walks over the 390's because of the (*TESSELLATION*)


Fixed it for you.

I live in France, where people honestly don't give 2 poops about computers or even computer modding. The Fury and Fury X came out here right off the bat and were available; they never sold out for weeks or months like in the US. It's the same with the 1080, all the AIB boards are available within a week or so.

I'm hoping I can snatch one up after the reviews, then contact EK and get one of their waterblocks shipped out to me.
Quote:


> Originally Posted by *tolis626*
> 
> From my experience, it's exactly the other way around. The performance difference is quite small in most games (bar some gameworks gimped titles) and is stupidly exaggerated in benchmarks. The 980 scores what? 1000 points higher on average in Firestrike? I don't think the difference is ever this big in a game.
> 
> Anyways, seems AMD vindicated me for telling everyone to hold off on buying a GPU for the past 2-3 months. I'm not sidegrading to Polaris (because that's what it is for us 390x owners, even factoring in the lower power consumption), but Vega will most probably be very, VERY interesting. Here's to hoping AMD will finally get their **** together for once and also market it right.


The issue I have is, Vega will more than likely be priced quite high like the Fury/X was when it came out, if not even higher because of HBM2 and the new technology. Although I'm not sure, so who knows; October isn't that far off, and it would be a nice birthday present.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> There are reports that the thing will overclock to around 1400Mhz from the 1266. Not crazy but still good.... at that speed it is faster than a Fury X accoring to leaked benchmarks.
> 
> This card at $230 is an astounding value, especially considering it runs at 100w and 60c under load. Clearly two of these in CF is hands down value redefined.


Hoping for a good watercooling MHz OC.


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> Hope for good watercooling mhz oc


You are probably looking at a card that will OC between 1380-1420 on air, and maybe 1450+ on water.... I do expect TDP to be an issue on reference cards (considering the history of GCN's voltage/clock scaling) and probably a tad better on cards with 2x6 pins or 1x8 pin. That's all just my theory though. AMD has done a masterful job of creating hype around a mainstream product this time around..... wow. There may be some NDA's in place but you can believe nobody at AMD is mad about these positive performance leaks.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are probably looking at a card that will OC between 1380-1420 on air, and maybe 1450+ on water.... I do expect TDP to be an issue on reference cards (considering the history of GCN's voltage/clock scaling) and probably a tad better on cards with 2x6 pins or 1x8 pin. That's all just my theory though. AMD has done a masterful job of creating hype around a mainstream product this time around..... wow. There may be some NDA's in place but you can believe nobody at AMD is mad about these positive performance leaks.


I suspect a 300MHz OC should be obtainable.
The current card does 100W.
AMD is presenting a low-wattage-per-fps version at the moment.
It could be real awesome.
Looking to upgrade my 390 anyhow.
2 weeks.


----------



## Devildog83

I recently read an article from Tweak Town about the 480 performance in the Steam VR bench and I am a bit confused. The article said the 480 score was a 6.3 and was on par with the 390, the issue is that I ran the same test on my 390 at stock and got a 7.2. That doesn't seem to be on par to me. What am I missing here?


----------



## tolis626

Quote:


> Originally Posted by *Devildog83*
> 
> I recently read an article from Tweak Town about the 480 performance in the Steam VR bench and I am a bit confused. The article said the 480 score was a 6.3 and was on par with the 390, the issue is that I ran the same test on my 390 at stock and got a 7.2. That doesn't seem to be on par to me. What am I missing here?


If they tested the 4GB version, I guess there's your answer. I have trouble believing 4GB is gonna be remotely enough for VR.


----------



## Devildog83

I agree but the statement that it's on par with the 390 is what threw me. Maybe the 8 Gig card would be.


----------



## Joe88

Quote:


> Originally Posted by *Devildog83*
> 
> I recently read an article from Tweak Town about the 480 performance in the Steam VR bench and I am a bit confused. The article said the 480 score was a 6.3 and was on par with the 390, the issue is that I ran the same test on my 390 at stock and got a 7.2. That doesn't seem to be on par to me. What am I missing here?


Unless they are testing that Asia-only 4GB 390, I'm not sure how they are getting 6.3.


----------



## mus1mus

Guess the sentence is meant to be as GENERAL as it gets?

I'm roughly mad --- though I am not!

And this should never be ignored as a disclaimer.


----------



## Irev

I did a bench, however I think it ran WITH CF on... from my understanding CF doesn't work with VR, right?

https://postimg.org/image/dj2i0bcqp/


----------



## fantasticdave

Quote:


> Originally Posted by *rdr09*
> 
> I concur with blu. that cpu is not going to cope. if you run Firestrike and post your link I can easily tell you what i am talking about.
> 
> You can test yourself when you play your games by using Afterburner to check the various usages. See if your current card is even hitting 90 - 100% usage and not any lower.


Here are my scores

Graphics score
12 654
Graphics test 1
58.64 FPS
Graphics test 2
51.82 FPS

Physics score
4 232
Physics test
13.44 FPS

Combined score
2 165
Combined test
10.07 FPS


----------



## tolis626

Quote:


> Originally Posted by *fantasticdave*
> 
> Here is my scores
> 
> Graphics score
> 12 654
> Graphics test 1
> 58.64 FPS
> Graphics test 2
> 51.82 FPS
> 
> Physics score
> 4 232
> Physics test
> 13.44 FPS
> 
> Combined score
> 2 165
> Combined test
> 10.07 FPS


Yup, that's kinda low. It's better than I expected, to be honest, but there's still a bottleneck. At least it's not a huge, crippling one. What I'd do if I were you (apart from upgrading that CPU) would be undervolting and underclocking that card to save power and run cool and quiet. Your performance will probably not suffer, as you're already hitting a wall because you're CPU-bound. Also, I would try hybrid Crossfire between the iGPU and the 390 for the lulz. I know it's stupid and will probably be a stuttery mess with worse performance than the card alone, but I always wanted to try it for fun.

For comparison's sake, my results would be something like 14,500 points for the graphics score, 13,000-13,500 for the physics, and about 6,000 for the combined score. That's with an overclocked 390X and a 4790K at 4.7GHz, so nothing extremely crazy.
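To put rough numbers on that comparison (the scores are taken from the two posts above; the percentages are just arithmetic on them):

```python
# fantasticdave's Firestrike results (A10 CPU + 390) vs tolis626's
# ballpark figures (4790K @ 4.7GHz + overclocked 390X).
a10_rig = {"graphics": 12654, "physics": 4232, "combined": 2165}
i7_rig = {"graphics": 14500, "physics": 13000, "combined": 6000}

for test, score in a10_rig.items():
    deficit = 1 - score / i7_rig[test]
    print(f"{test}: {deficit:.0%} behind")
# graphics: 13% behind
# physics: 67% behind
# combined: 64% behind
```

The graphics score is only ~13% behind, but physics and combined are ~65% behind, which is the signature of a CPU bottleneck rather than a GPU problem.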


----------



## battleaxe

Crap... looks like AMD is about to knock it out of the park. This is awesome. Time to sell some hardware... thank you AMD... thank you!!!


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Crap... looks like AMD is about to knock it out of the park. This is awesome. Time to sell some hardware... thank you AMD... thank you!!!


If the 480 can reach 1600MHz core . . . ugh . . . we have no choice but to go 480 first, then Vega. Ain't too bad for a $250 card.


----------



## Streetdragon

When will the 480 be released? One month? ..I just watercooled my Nitro. Maybe I'll just buy two 480s. I hope an Alphacool NexXxoS GPX for the 480 comes out so I can just buy a second cooler.


----------



## fantasticdave

Quote:


> Originally Posted by *tolis626*
> 
> Yup, that's kinda low. It's better than I expected, to be honest, but still there's a bottleneck. At least it's not a huge, crippling one. What I'd do if I were you (apart from upgrading that CPU) would be undervolting and underclocking that card to save power and run cool and quiet. Your performance will probably not suffer as you're already hitting a wall because you're CPU bound. Also, I would try hybrid Crossfire between the iGPU and the 390 for the lulz. I know it's stupid and will probably be a stuttery mess with worse performance than the card alone, but I always wanted to try it for fun.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For comparison's sake, my results would be something like 14.500 points for the graphics score, 13.000-13.500 for the physics and about 6000 for the combined score. That's with an overclocked 390x and a 4790k at 4.7GHz, so nothing extremely crazy.


I think I will try and overclock the CPU. Never done it before.

My motherboard is FM2 socket, so I am not sure I could get a better CPU without a new motherboard as well


----------



## bluej511

Quote:


> Originally Posted by *Streetdragon*
> 
> when would the 480 released? 1 month? ..just watercooled my nitro. Maybe i just buy 2 480. Hope that a Alphacool NexXxoS GPX for the 480 comes out and just buy a secound cooler.


That more than likely would take a while; then you'd have to wait till Alphacool actually sells the 480 block without the GPX, which they don't even do with the R9 390, or it's SUPER hard to find.

Seems EKWB is making waterblocks for AIB 1080s now, so here's to hoping they actually make them for the 480, which they should. For the 390 it was only the AMD PCB and the MSI PCB, and that was it. XFX, Sapphire, and a few others got screwed.


----------



## flopper

Quote:


> Originally Posted by *rdr09*
> 
> if the 480 can reach 1600MHz core . . . ugh . . . we have no choice but to go 480 first, then Vega. Ain't too bad for $250 card.


1500MHz seems reachable, but who knows what's doable with third-party cards.

Quote:


> Originally Posted by *Streetdragon*
> 
> when would the 480 released? 1 month? ..just watercooled my nitro. Maybe i just buy 2 480. Hope that a Alphacool NexXxoS GPX for the 480 comes out and just buy a secound cooler.


29 June, 2 weeks.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> 1500mhz seems reachable but who knows whats doable with thirdparty cards.
> 29June,2 weeks.


Hopefully it doesn't end up being like the 1080s, where the AIB cards overclock WORSE than the Founders Edition even with an extra plug, lol. The FinFET process is so new, who knows what it will OC to; leaks show 1400 so far, but who really knows.

Remember when they said the Fury/X would OC really well, haha.


----------



## flopper

Quote:


> Originally Posted by *bluej511*
> 
> Hopefully doesnt end up being like the 1080s where the AIB overclocks WORSE then the founders edition even with an extra plug lol. The new finfet is so new who knows what it will OC too, leaks show so far 1400 but who really knows.
> 
> Remember when they said the fury/x would oc really well haha.


Absolutely, a 300MHz OC would be 20%+, which seems OK.
Lots of ifs so far, but that's part of the fun before stuff is out.
The engineer was drunk with that OC comment on the Fury, but they do overclock decently.
And btw, it seems AMD developed new OC software for the 480.
All good and on track
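For reference, here's how those OC percentages work out against the rumored 1266MHz clock mentioned earlier in the thread (the three target clocks are just illustrative):

```python
# OC headroom relative to the rumored 1266MHz RX 480 clock.
BASE_MHZ = 1266
for target in (1400, 1500, 1566):  # leaked OC, hoped-for OC, base + 300MHz
    gain = (target - BASE_MHZ) / BASE_MHZ
    print(f"{target}MHz -> +{gain:.1%}")
# 1400MHz -> +10.6%
# 1500MHz -> +18.5%
# 1566MHz -> +23.7%
```

So a full +300MHz would indeed be north of 20%, as stated.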


----------



## Streetdragon

All I know now is, in 2 weeks I'll be poor!

BUT which card do you wanna buy? I want more than a 6 pin, maybe an 8 pin for more overclock range. Otherwise a waterblock is a bit overpowered^^
What power plugs do you want?


----------



## rdr09

Quote:


> Originally Posted by *flopper*
> 
> 1500mhz seems reachable but who knows whats doable with thirdparty cards.
> 29June,2 weeks.


Yeah, 1500 should be achievable. Some Hawaiis achieve a 300MHz OC, so who knows - mostly reference cards, though.

Don't forget . . . we have gupsterg, fyzz, mus1mus, Stilt, Kizwan, and others who will tweak bioses for sure.


----------



## Agent Smith1984

What's really nuts is this little RX 470 that is performing right between a 290 and a 970. It will cost roughly $140!! Two of those in crossfire for less than $300 will give you roughly 295X2 performance, especially with some overclocks. AMD is about to dominate the mid-range market with these releases.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What's really nuts is, this little RX 470 that is performing right between a 290 and a 970. It will cost roughly $140!! Two of those in crossfire for less than $300 will give you roughly 295x2 performance, especially with some overclocks. AMD is about to dominate the mid-range market with these releases.


I hope so; that's why it's so tempting to get a 480, especially since I only game in ultrawide. But if I get a 4K TV by the end of the year or so, I'd hate to have to buy another card. I may just wait for Vega unless the RX 480 is just out of this world.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> I hope so, thats why its so tempting to get a 480 especially since i only game in ultrawide, but if i get a 4k tv by end of the year or so id hate to have to buy another card. I may just wait out for Vega unless the rx480 is just out of this world.


I don't think the 480 will be out of this world by any means, but it will deliver solid 390X/980 performance at a great price, and there could be the potential for some good overclocking which will take it into Fury X territory. I think the important thing is that the nightmares associated with running two Hawaii cards in crossfire (too much power and too much heat) won't be an issue with these. You can have two cards for a great price, get enthusiast-level performance, and not have temps in the 90's while burning up 700W of power. Instead you'll be looking at roughly 300W of power with a heavy overclock on two cards, with temps probably in the 70's on the top card, and 60's on the bottom.

You can tell that AMD is definitely trying to sell crossfire with this release based on their comparison of the 1080 to a pair of 480's in AoS. Two reasons for that; they need to sell as many as possible since their margins are going to be so low, and also they know they have nothing that can compete with the 1080 in single form until Vega, so to be relevant they are showcasing the power of two cards together. Nothing wrong with that tactic in my opinion but there are still trolls all over the net asking why the 480 doesn't compete with the 1070, which is ridiculous, as it was never meant to compete with it to begin with. I mean seriously, GTX 980 performance, with some good OC potential, and it's $200-230..... even in 2016 that type of performance is still very good and these cards (470 and 480) will put PC gaming at the finger tips of many more young people.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I don't think the 480 will be out of this world by any means, but it will deliver solid 390x/980 performance at a great price and there could be the potential for some good overclocking which will take it into Fury X territory. I think the important thing is that the nightmares associated with running 2) hawaii cards in crossfire (too much power and too much heat) won't be an issue with these. You can have two cards for a great price, get enthusiast level performance, and not have temps in the 90's while burning up 700w of power. Instead you"ll be looking at roughly 300w of power with a heavy overclock on two cards, with temps probably in the 70's on top card, and 60's on the bottom.
> 
> You can tell that AMD is definitely trying to sell crossfire with this release based on their comparison of the 1080 to a pair of 480's in AoS. Two reasons for that; they need to sell as many as possible since their margins are going to be so low, and also they know they have nothing that can compete with the 1080 in single form until Vega, so to be relevant they are showcasing the power of two cards together. Nothing wrong with that tactic in my opinion but there are still trolls all over the net asking why the 480 doesn't compete with the 1070, which is ridiculous, as it was never meant to compete with it to begin with. I mean seriously, GTX 980 performance, with some good OC potential, and it's $200-230..... even in 2016 that type of performance is still very good and these cards (470 and 480) will put PC gaming at the finger tips of many more young people.


I don't think it's so much crossfire as VR. One card per eye would be amazing, and it seems like more VR games will be developed with 2 cards in mind. The issue is most games (at least I think, anyway) won't get good crossfire support even if 2 RX 480s will be powerful.

Again, this is just my opinion, but they haven't had good crossfire in probably a decade, so I don't see it changing now since it really is such a small market. Unless everyone ends up buying 2 RX 480s instead of one, devs and AMD just won't put that much effort into crossfire, not as much as they will into VR (which I think won't pick up for a while anyway)


----------



## rdr09

Quote:


> Originally Posted by *fantasticdave*
> 
> Here is my scores
> 
> Graphics score
> 12 654
> Graphics test 1
> 58.64 FPS
> Graphics test 2
> 51.82 FPS
> 
> Physics score
> 4 232
> Physics test
> 13.44 FPS
> 
> Combined score
> 2 165
> Combined test
> 10.07 FPS


Maybe the A10 has some new features that can help it cope with a card like the 390. I paired my Phenom II quad with my 290 and it really struggled; had to unlock it to six cores just to cope. Not sure if the Physics score (CPU) is a good indication of CPU performance in games, but my Phenom gets 6K at 4GHz in the same bench.


----------



## Rexer

I'm not good at navigating through websites. I'm trying to join the R9 390 & X overclocking club and am sort of lost on how to do it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> I dont think so much crossfire as VR. One card per eye would be amazing and it seems like more VR games will be developed with 2 card in mind. The issue is most games (at least i think anyways) wont get good crossfire support even if 2 rx480s will be powerful.
> 
> Again this is just my opinion but they havent had good crossfire in probably a decade so i dont see it changing now since it really is such a small marker. Unless everyone ends up buying 2 rx480s instead of one, devs and AMD just wont put that much effort into crossfire, not as much as they will into VR (which i think wont pick up for a while anyways)


I ran crossfire for a bit and honestly thought it was great. The performance boost I got in Crysis 3, BF4, and SoM at the time was awesome. GTA V showed no improvement though, but I believe that has since been fixed (I may be wrong).
The worst issues I had were with Far Cry 4 (possibly fixed now) and Skyrim, with Skyrim being absolutely unplayable with CF turned on, but got hundreds of frames per second with only one card anyways so no big deal...


----------



## yuannan

Quote:


> Originally Posted by *bluej511*
> 
> I dont think so much crossfire as VR. One card per eye would be amazing and it seems like more VR games will be developed with 2 card in mind. The issue is most games (at least i think anyways) wont get good crossfire support even if 2 rx480s will be powerful.
> 
> Again this is just my opinion but they havent had good crossfire in probably a decade so i dont see it changing now since it really is such a small marker. Unless everyone ends up buying 2 rx480s instead of one, devs and AMD just wont put that much effort into crossfire, not as much as they will into VR (which i think wont pick up for a while anyways)


What monitor are you running right now?

Also, I don't think crossfire VR will take off. Too many variables and too much stutter. Unless AMD ups their game quite a bit, VR with crossfire will be a one-way ticket to the toilet.


----------



## tolis626

Regarding VR Crossfire, I think one GPU per screen (or per eye in this instance) will be a completely different beast compared to classic Crossfire. There is no need to interlace the frames, split the load or anything. With some clever software, they can just split the workload 50-50, as both screens show roughly the same thing anyway. I think that instead of VR being the final nail in the coffin of SLI and Crossfire, it will bring it more towards the mainstream. Think about it. With such an implementation, you're effectively looking at double the performance. So Crossfire 480s would be a much better solution than a single 1080.

Of course, I could be wrong and this is impossible to pull off. After all I'm just talking out of my arse here. I just don't see why it wouldn't be possible.


----------



## Streetdragon

http://www.gamestar.de/hardware/grafikkarten/amd-radeon-rx-480/news-artikel/amd_radeon_rx,988,3274372.html

so the price could be:
RX 460 2GB ~ 79€
RX 470 4GB ~ 149€
RX 480 4GB ~ 209€
RX 480 8GB ~ 249€

I would say +30€ to all


----------



## gupsterg

Quote:


> Originally Posted by *Streetdragon*
> 
> I want more than a 6 pin, maybe an 8 pin for more overclock range


75W is the PCI-SIG spec for a 6 pin, then you have 75W from the PCI-E slot, making it come up to 150W.

Hardware spec for a 6 pin is ~190W, so add on 75W from the PCI-E slot = ~265W. There have been cards by AMD which don't conform to PCI-SIG spec but to hardware spec, mainly dual-GPU cards (ie R9 295X2).

On Videocardz they have a leaked RX 480 PCB image, 6+1 phases (GPU/RAM). To me it would seem higher spec than what would be required for a low-TDP card, *so* perhaps the AMD ref PCB is more in context to allow usage of power to the hardware spec of the 6 pin / PCI-E slot. All this depends on the spec of the VRM on the PCB as well.

Most 6 pin connectors on PSUs are 6/8, so they have 3x 12V wires. No idea though if most PCBs with a 6 pin connector have the 3rd 12V pin connected, plus I would think a matching 3rd ground line would be needed to make use of it; perhaps someone with better knowledge can comment.

I kept the info in the hawaii/fiji bios mod powerlimit section more towards PCI-SIG; here is hardware spec info I collected from a page on THG.
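As a rough illustration, those budgets can be tallied like this (the 75W figures are PCI-SIG spec and the ~190W hardware figure and 3x 12V wires come from the post above; the ~10A-per-18AWG-wire figure is my assumption based on the ampacity numbers discussed later in the thread):

```python
# Power budget for a card with one 6-pin PCI-E connector (sketch).
SLOT_W = 75          # PCI-E x16 slot, PCI-SIG spec
SIX_PIN_SIG_W = 75   # 6-pin connector, PCI-SIG spec
SIX_PIN_HW_W = 190   # approximate 6-pin hardware limit cited above

print("PCI-SIG budget:", SLOT_W + SIX_PIN_SIG_W, "W")  # 150 W
print("Hardware budget:", SLOT_W + SIX_PIN_HW_W, "W")  # 265 W

# Rough cross-check of the hardware figure: three 12V wires at an
# assumed ~10A each (18AWG) could carry 3 * 10A * 12V = 360W, so a
# ~190W connector rating still leaves a healthy safety margin.
print("3x 12V wire ceiling:", 3 * 10 * 12, "W")        # 360 W
```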


Quote:


> Originally Posted by *rdr09*
> 
> Don't forget . . . we have gupsterg, fyzz, mus1mus, Stilt, Kizwan, and others who will tweak bioses for sure.


So far it seems to me it's gonna use similar PowerPlay 7.0, same as Tonga/Fiji, as Polaris is in the updated tonga_pptable.h.

Really looking forward to these cards; was thinking of selling the Fury X but just couldn't; may still just get an RX 480 to meddle with though.


----------



## bluej511

Quote:


> Originally Posted by *yuannan*
> 
> What monitor are you running right now?
> 
> Also, I don't think crossfire VR will take off. Too much variables and stutter. Unless AMD ups their game quite a bit VR with crossfire will be a one way ticket to the toilet.


Got an LG 29UM68 with FreeSync, I can't ever go back haha.
Quote:


> Originally Posted by *Streetdragon*
> 
> http://www.gamestar.de/hardware/grafikkarten/amd-radeon-rx-480/news-artikel/amd_radeon_rx,988,3274372.html
> 
> so the price could be:
> RX 460 2GB ~ 79€
> RX 470 4GB ~ 149€
> RX 480 4GB ~ 209€
> RX 480 8GB ~ 249€
> 
> I would say +30€ to all


I hope these prices are consistent across all of Europe; they seem to vary widely, which is crazy. For example, the 1080 Founders is like 799€ here, and in some euro countries it's like 50€ less.
Quote:


> Originally Posted by *gupsterg*
> 
> 75W is PCI-E Sig spec for 6 pin , then you have 75W from PCI-E slot, making it come up to 150W.
> 
> Hardware spec for 6 pin is ~190W so add on 75W from PCI-E slot = ~265W . There have been cards by AMD which don't conform to PCI-SIG spec but hardware spec, mainly dual GPU cards (ie R9 295X2).
> 
> On Videocardz they have a leaked RX 480 PCB image, 6+1 phases (GPU/RAM) to me it would seem higher spec than what would be required for low TDP card *so* perhaps AMD ref PCB is more in context to allow usage of power to hardware spec of 6 pin / PCI-E slot. All this depends on spec of VRM on PCB as well.
> 
> Most 6 pin connectors on PSUs are 6/8 so they have 3x 12V wires, no idea though if most PCBs with 6 pin connector have the 3rd 12V pin connected plus I would think a 3rd negative 12V line would be needed to make use of it, perhaps another with better knowledge can comment.
> 
> I kept info in the hawaii/fiji bios mod powerlimit section more towards PCI-SIG, here is hardware spec info I collected from a page on THG.
> 
> 
> So far to me it seems it's gonna use similar PowerPlay 7.0, same as Tonga\Fiji, as Polaris is in updated tonga_pptable.h.
> 
> Really looking forward to these cards; was thinking of selling Fury X but just couldn't; may still just get a RX 480 to meddle with though.


These are just specs; a PCI-E cable can indeed deliver more power provided it's the right gauge and the PSU can handle it. A 6 pin might be rated for 150W but easily surpass that and probably even provide 200W of power. I think the reason AIB RX 480s will have an extra 6/8 pin is, one, because they run a couple of fans and have more VRM phases. It unfortunately doesn't automatically mean it's going to OC any better. Pretty sure the Sapphire boards have quite a few more VRM phases than the AMD boards but OC just the same if not worse haha.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I ran crossfire for a bit and honestly thought it was great. The performance boost I got in Crysis 3, BF4, and SoM at the time was awesome. GTA V showed no improvement though, but I believe that has since been fixed (I may be wrong).
> The worst issues I had were with Far Cry 4 (possibly fixed now) and Skyrim, with Skyrim being absolutely unplayable with CF turned on, but got hundreds of frames per second with only one card anyways so no big deal...


I played Skyrim in 4K, so I must have been using both my 290s 'cause I never bothered disabling crossfire. It was very smooth. It played as smooth as my 7950 does at 1080p. Vanilla, though.

Quote:


> Originally Posted by *gupsterg*
> 
> 75W is the PCI-SIG spec for a 6-pin, then you have 75W from the PCI-E slot, making it come up to 150W.
>
> Hardware spec for a 6-pin is ~190W, so add on 75W from the PCI-E slot = ~265W. There have been cards by AMD which don't conform to PCI-SIG spec but to hardware spec, mainly dual GPU cards (ie R9 295X2).
>
> On Videocardz they have a leaked RX 480 PCB image, 6+1 phases (GPU/RAM); to me it would seem higher spec than what would be required for a low-TDP card, so perhaps the AMD ref PCB is meant to allow usage of power to the hardware spec of the 6-pin / PCI-E slot. All this depends on the spec of the VRM on the PCB as well.
>
> Most 6-pin connectors on PSUs are 6/8, so they have 3x 12V wires; no idea though if most PCBs with a 6-pin connector have the 3rd 12V pin connected, plus I would think a 3rd ground return line would be needed to make use of it. Perhaps another with better knowledge can comment.
>
> I kept info in the Hawaii/Fiji bios mod powerlimit section more towards PCI-SIG; here is hardware spec info I collected from a page on THG.
>
> So far to me it seems it's gonna use similar PowerPlay 7.0, same as Tonga/Fiji, as Polaris is in the updated tonga_pptable.h.
>
> Really looking forward to these cards, was thinking of selling Fury X but just couldn't, may still just get a RX 480 to meddle with though.


You're gonna be busy.


----------



## gupsterg

@bluej511

Yep, those specs are based on 18AWG IIRC (usual ATX spec for those connectors). As you're an electronics man, am I reading this table correctly - 18AWG, 10A @ 75C?

I concur that a higher phase count does not equal a better OC. My mention of that "stuff" was just in the context that if the VRM spec is good, then taking it into consideration along with the hardware spec of the 6-pin, the card should cope with the power load when OC'ing.

@rdr09

Bios mod "stuff" just enthralls me more at times than gaming.
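To put some numbers on the spec talk above, here's a back-of-envelope sketch (my own figures where noted, so treat them as ballpark) comparing the PCI-SIG budgets with a raw wire-gauge estimate:

```python
# Back-of-envelope power budgets for the connector discussion above.
# Spec values are PCI-SIG; the wire estimate is a raw ceiling that
# ignores connector pin ratings and safety derating.

PCIE_SLOT_W = 75        # power available from the x16 slot
SIX_PIN_SPEC_W = 75     # PCI-SIG rating for a 6-pin plug
EIGHT_PIN_SPEC_W = 150  # PCI-SIG rating for an 8-pin plug

def wire_ceiling_w(n_12v_wires, amps_per_wire=10, volts=12):
    """Gauge-based ceiling, e.g. 18AWG is commonly rated ~10A."""
    return n_12v_wires * amps_per_wire * volts

# A reference card with one 6-pin, by spec:
print(PCIE_SLOT_W + SIX_PIN_SPEC_W)   # 150 W

# Raw capacity of a 6-pin wired with 3x 12V lines of 18AWG:
print(wire_ceiling_w(3))              # 360 W before any derating
```

The gap between the 360W bare-wire number and the ~190W "hardware spec" figure quoted above is mostly the connector itself: pin current ratings and safety margin, not the wire, tend to be the limit.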


----------



## Streetdragon

Quote:


> Originally Posted by *bluej511*
> 
> Got an LG 29UM68 with FreeSync; I can't ever go back haha.
> I hope these prices conform to all of Europe; they seem to vary widely, which is crazy. For example the 1080 Founders is like 799€ here and in some euro countries it's like 50€ less.
> These are just specs; a PCIe cable can indeed deliver more power provided it's the right gauge and the PSU can handle it. A 6-pin might be rated for 75W but easily surpass that and probably even provide 200W of power. I think the reason AIB RX 480s will have an extra 6/8-pin is partly because they run a couple of fans and have more VRM phases. It unfortunately doesn't automatically mean it's going to OC any better. Pretty sure the Sapphire boards have quite a few more VRM phases than the AMD boards but OC just the same if not worse haha.


So all in all, what board would you buy for clocking - AMD stock, Sapphire, MSI...? And which one will get the first waterblock?


----------



## bluej511

Quote:


> Originally Posted by *gupsterg*
> 
> @bluej511
>
> Yep, those specs are based on 18AWG IIRC (usual ATX spec for those connectors). As you're an electronics man, am I reading this table correctly - 18AWG, 10A @ 75C?
>
> I concur that a higher phase count does not equal a better OC. My mention of that "stuff" was just in the context that if the VRM spec is good, then taking it into consideration along with the hardware spec of the 6-pin, the card should cope with the power load when OC'ing.
>
> @rdr09
>
> Bios mod "stuff" just enthralls me more at times than gaming.


Yeah, those graphs are a bit deceiving though, as it's a best-case scenario - not attached to anything and whatnot.
Quote:


> Originally Posted by *Streetdragon*
> 
> So all in all, what board would you buy for clocking - AMD stock, Sapphire, MSI...? And which one will get the first waterblock?


Not sure how EKWB or other manufacturers decide what comes first, but usually it's a reference PCB that gets the first waterblock. I mean, EKWB had theirs to JayzTwoCents before the cards were even on sale to the public, so it depends. They don't make too many AIB AMD waterblocks because, according to them, not many people want them. They'll make a Radeon Pro Duo block but not a Sapphire R9 390 block, which there are far more of - and yes, I know the Radeon Pro Duo is already on water haha.


----------



## gupsterg

Quote:


> Originally Posted by *bluej511*
> 
> Yea those graphs are a bit deceiving though as its best case scenario not attached to anything and what not.


I thought as much, so many thanks, +rep.


----------



## bluej511

Quote:


> Originally Posted by *gupsterg*
> 
> I thought as much, so many thanks, +rep.


Best way to test would actually be at full load with a DMM. Idk if what GPU-Z and HWiNFO report is correct, but I do believe the 390 pulls something like 18A.


----------



## gupsterg

Quote:


> Originally Posted by *bluej511*
> 
> Best way to test would actually be at full load with a DMM. Idk if what GPU-Z and HWiNFO report is correct, but I do believe the 390 pulls something like 18A.


Cheers.

Yep, The Stilt has highlighted Hawaii at ~20A on the 12V rail, so Grenada should be similar.
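For anyone converting those amp figures into watts, a trivial sketch (assuming the readings really are current on the 12V rail):

```python
# Convert reported 12V-rail current draw into watts.
def rail_watts(amps, volts=12):
    return amps * volts

print(rail_watts(18))  # 216 W for the ~18A GPU-Z-style reading
print(rail_watts(20))  # 240 W for The Stilt's ~20A Hawaii figure
```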


----------



## Linkinred

Hi guys

I have 2 R9 390Xs in crossfire... however, running stock clocks they throw up artifacts... I have tried both cards one by one and same deal, artifacts; I also tried them in a totally different system...

one card seems to be worse than the other one...

If I however downclock the core by say -20MHz, then all artifacts vanish and are gone for good...

if I raise the clock by say +10 or +20MHz on the core I get crazy artifacts and screen flashes on both cards

is it safe to say these are REALLY bad overclocking cards, or did I just get duds???

should I RMA?


----------



## Gdourado

Quote:


> Originally Posted by *Linkinred*
> 
> Hi guys
>
> I have 2 R9 390Xs in crossfire... however, running stock clocks they throw up artifacts... I have tried both cards one by one and same deal, artifacts; I also tried them in a totally different system...
>
> one card seems to be worse than the other one...
>
> If I however downclock the core by say -20MHz, then all artifacts vanish and are gone for good...
>
> if I raise the clock by say +10 or +20MHz on the core I get crazy artifacts and screen flashes on both cards
>
> is it safe to say these are REALLY bad overclocking cards, or did I just get duds???
>
> should I RMA?


Artifacts with everything stock is an RMA...


----------



## bluej511

Quote:


> Originally Posted by *Linkinred*
> 
> Hi guys
>
> I have 2 R9 390Xs in crossfire... however, running stock clocks they throw up artifacts... I have tried both cards one by one and same deal, artifacts; I also tried them in a totally different system...
>
> one card seems to be worse than the other one...
>
> If I however downclock the core by say -20MHz, then all artifacts vanish and are gone for good...
>
> if I raise the clock by say +10 or +20MHz on the core I get crazy artifacts and screen flashes on both cards
>
> is it safe to say these are REALLY bad overclocking cards, or did I just get duds???
>
> should I RMA?


Sounds like at stock volts they get artifacts, and even more so when overvolted. Kinda weird to have BOTH cards be that bad, but not impossible. Unless it's a driver issue somewhere.


----------



## bluej511

For my AMD people: someone posted this video in another thread; I'd never seen it before and it's 200% true, I absolutely love it. I've known about this for years as well.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> For my AMD people: someone posted this video in another thread; I'd never seen it before and it's 200% true, I absolutely love it. I've known about this for years as well.


Man... At first I was like "Here we go again, more butthurt AMD fanboys that blame NVidia for everything". I don't like NVidia either, mainly for their business practices, but people online seem to treat AMD like saints sometimes and I like that even less.

But this guy makes some good points. And most importantly, it's his tone that sets him apart from most similar "rants". NVidia should follow the rest of the industry instead of trying to force their way upon us. If they believe their products are so superior they should prove that on a level playing field. But I guess they aren't as impressive when it's fair, and impressive is what sells these days, so... Yeah. I hope some of this backfires on them for our sakes.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> Man... At first I was like "Here we go again, more butthurt AMD fanboys that blame NVidia for everything". I don't like NVidia either, mainly for their business practices, but people online seem to treat AMD like saints sometimes and I like that even less.
> 
> But this guy makes some good points. And most importantly, it's his tone that sets him apart from most similar "rants". NVidia should follow the rest of the industry instead of trying to force their way upon us. If they believe their products are so superior they should prove that on a level playing field. But I guess they aren't as impressive when it's fair, and impressive is what sells these days, so... Yeah. I hope some of this backfires on them for our sakes.


You can tell it's not biased where he says he likes Nvidia 3D better than AMD lol. But I mean, does anyone really game in 3D? It hasn't picked up like they thought it would.

I mean, honestly I blame Nvidia too, but it's not so much blame as shoddy capitalist business practice. Also, let's not forget their CEO worked at and learned everything from AMD/ATI, so don't forget that one.

The tessellation one is absolutely true and it's a known issue in a few games. This is the chart I love to show people. Why, Nvidia, why?


----------



## Linkinred

Quote:


> Originally Posted by *Gdourado*
> 
> Artifacts with everything stock is a RMA...


one card artifacts constantly @ stock speeds

the other one does as well, but only every so often - it still DOES... both cards are the same brand, produced on the same manufacture date, so I guess they are from the same batch?


----------



## Agent Smith1984

Even more reason to be excited folks!!

http://wccftech.com/amd-rx-480-1500mhz-overclocking-tool-voltage-control/

AMD is even giving us overclocking software WITH VOLTAGE CONTROL OUT OF THE GATE!!!!!
And 1500MHz?? Yes, we'll take it

These guys are learning as they go


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Even more reason to be excited folks!!
> 
> http://wccftech.com/amd-rx-480-1500mhz-overclocking-tool-voltage-control/
> 
> AMD is even giving overclocking software WITH VOLTAGE CONTROL OUT THE GATE!!!!!
> And 1500mhz?? Yes, we'll take it
> 
> These guys are learning as they go


I mean, can't we do that with Afterburner anyways? I don't see the big deal, but that's just me; you still can't monitor in-game with Crimson lol.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluej511*
> 
> I mean, can't we do that with Afterburner anyways? I don't see the big deal, but that's just me; you still can't monitor in-game with Crimson lol.


Well yes, but remember that Fury took months to get voltage control, and also keep in mind that NO GPU manufacturer has ever offered direct driver level voltage control to their customers. This is an industry first!

The card overclocked in the 1400-1500mhz range will be Fury X performance. That's pretty damn impressive for less than half the cost!


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well yes, but remember that Fury took months to get voltage control, and also keep in mind that NO GPU manufacturer has ever offered direct driver level voltage control to their customers. This is an industry first!
> 
> The card overclocked in the 1400-1500mhz range will be Fury X performance. That's pretty damn impressive for less than half the cost!


I highly doubt it's going to be Fury X real world performance. I'd expect more. A smaller, cooler and simpler chip shouldn't have the trouble the Fury line had when overclocking. I'm expecting real world results to show it as being faster than the Fury X and right into 980ti territory, while scraping by during benchmarks. I guess we'll see.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well yes, but remember that Fury took months to get voltage control, and also keep in mind that NO GPU manufacturer has ever offered direct driver level voltage control to their customers. This is an industry first!
> 
> The card overclocked in the 1400-1500mhz range will be Fury X performance. That's pretty damn impressive for less than half the cost!


True. Guess I'm so used to Afterburner I don't even OC or think about OCing with Crimson/Catalyst. It just makes it easier to OC and monitor all in one, you know. If I do get a 480 I'd still OC with Afterburner lol.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> I highly doubt it's going to be Fury X real world performance. I'd expect more. A smaller, cooler and simpler chip shouldn't have the trouble the Fury line had when overclocking. I'm expecting real world results to show it as being faster than the Fury X and right into 980ti territory, while scraping by during benchmarks. I guess we'll see.


http://videocardz.com/61154/amd-radeon-rx-480-crossfire-3dmark-performance

We shall see... it's about 390X level out of the box, but the question will be how much the overclocking helps... I mean, if we seriously see 200MHz+ overclocks on core, and if the memory gets into the 9GHz range like the 1070 does, then this card will certainly surpass a Fury level of performance.


----------



## bluej511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> http://videocardz.com/61154/amd-radeon-rx-480-crossfire-3dmark-performance
> 
> We shall see... it's about 390X level out of the box, but the question will be how much the overclocking helps... I mean, if we seriously see 200MHz+ overclocks on core, and if the memory gets into the 9GHz range like the 1070 does, then this card will certainly surpass a Fury level of performance.


I saw that, but then again I'm not sure how accurate those are. My R9 390 on stock clocks pulls 12,600 and closer to 13,500 with a simple 60MHz overclock. Their numbers for the 390X seem low to begin with. Pretty sure they score more than 300 points over a 390.

Idc about synthetic benchmarks; I'm waiting for UNBIASED reviews, aka probably only TechPowerUp lol.
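A quick scaling check on scores like those quoted above (the ~1040MHz stock clock is my assumption for an aftermarket 390 - adjust to your card):

```python
# Does a +60 MHz core bump explain a 12,600 -> 13,500 graphics score?
# Stock clock of 1040 MHz is an assumption for an aftermarket 390.
stock_clock, oc_clock = 1040, 1100
stock_score, oc_score = 12600, 13500

clock_gain = oc_clock / stock_clock - 1   # ~5.8%
score_gain = oc_score / stock_score - 1   # ~7.1%
print(f"{clock_gain:.1%} clock gain, {score_gain:.1%} score gain")
```

A score gain bigger than the clock gain usually means something besides core clock moved too (memory clock, power limit, drivers), which is one more reason to treat leaked synthetic numbers carefully.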


----------



## tolis626

Quote:


> Originally Posted by *Agent Smith1984*
> 
> http://videocardz.com/61154/amd-radeon-rx-480-crossfire-3dmark-performance
> 
> We shall see..... it's about 390x level out of the box, but the question will be, how much will the overclocking help..... I mean, if we will seriously see 200mhz+ overclocks on core, and if the memory will get into the 9GHz range like the 1070 does, then this card will certainly surpass a Fury level of performance.


That's exactly what I meant. We can't go by Firestrike results. It's not terribly biased or anything, but it does favor big chips with tons of bandwidth. Seriously, just look at the 390X. How much of a real-world performance benefit is there going from 1625MHz memory to 1750MHz? None? 1-2%? Firestrike gains are in the three-digit range. Also, how far behind a Fury is a 390X in games? That's why I believe it may not bench that well, but in games it should deliver. Especially if it's not hitting any memory bottlenecks and the memory does clock well. I guess we shall see in a couple of weeks. Interesting times ahead.


----------



## Darkchild

Quote:


> Originally Posted by *Devildog83*
> 
> I recently read an article from Tweak Town about the 480 performance in the Steam VR bench and I am a bit confused. The article said the 480 score was a 6.3 and was on par with the 390, the issue is that I ran the same test on my 390 at stock and got a 7.2. That doesn't seem to be on par to me. What am I missing here?


I doubt it's the 4GB, because I just ran it on my 290 and scored 7.6. Maybe it's the bandwidth and not the amount of VRAM limiting the card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tolis626*
> 
> That's exactly what I meant. We can't go by Firestrike results. It's not terribly biased or anything, but it does favor big chips with tons of bandwidth. Seriously, just look at the 390X. How much of a real-world performance benefit is there going from 1625MHz memory to 1750MHz? None? 1-2%? Firestrike gains are in the three-digit range. Also, how far behind a Fury is a 390X in games? That's why I believe it may not bench that well, but in games it should deliver. Especially if it's not hitting any memory bottlenecks and the memory does clock well. I guess we shall see in a couple of weeks. Interesting times ahead.


It's definitely going to be interesting. AMD may have just knocked this one out of the park. I'm all but convinced I will be buying two of these cards over a single 1070 at this point.

My concern, though, is TDP. These reference samples will all have one power port, and I'm sure we will see special AIBs with two later, but if we have seen anything from the 1070/1080 so far, adding additional connectors doesn't matter because the firmware on those cards heavily limits the clock speeds. Hopefully AMD didn't do that with these, because as we have seen from the Nano, these single-connector/low-TDP cards suffer greatly from power limitations.

One thing to note, though, is that reviewers have said the clock speeds are pretty much staying pegged and the power draw is around 100W. That means increasing the power limiter by 50% and adding some voltage may not be an issue, as we will be able to push into 150W territory and still be within both the card's designed power limit and the physical power available to the board itself.

I also wonder about pricing... this $199 thing is for a vanilla 4GB model, which is fine, and the speculation was that the 8GB would be $30-50 more. But now I am seeing that the XFX RX 480 8GB with a slight OC is going to be $300...
At $300, if it doesn't show a good deal better than the 390 series in games (OC or not), then I think they stand to sell very few of those. I certainly won't pay $300 for one unless it truly does offer that Fury X level of performance.
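The headroom argument above, as bare arithmetic (the ~100W draw is a reviewer-reported figure, taken on faith here):

```python
# Power-limit headroom for a reference RX 480: slot + one 6-pin.
board_draw_w = 100              # reported typical draw (assumption)
available_w = 75 + 75           # PCIe slot + 6-pin, per PCI-SIG spec

raised_w = board_draw_w * 1.5   # +50% power limit slider
print(raised_w)                 # 150.0 W
print(raised_w <= available_w)  # True - right at the spec ceiling
```

So a +50% limit lands exactly at the spec budget, with no margin; anything past that (or added voltage) would lean on whatever headroom the connector hardware has beyond spec.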


----------



## Streetdragon

The beast card. Wanna touch! I hope the benchmarks show nice scores. Crossfire, here I come!


----------



## monster4bob

quick question guys, I mixed up my PSU power cable with my monitor one - they are interchangeable, right?


----------



## Irev

Quote:


> Originally Posted by *monster4bob*
> 
> quick question guys, I mixed up my PSU power cable with my monitor one - they are interchangeable, right?


yes


----------



## monster4bob

Quote:


> Originally Posted by *Irev*
> 
> yes


okay thank you. I just wanted to make sure I wasn't gonna short anything out. Thanks


----------



## Linkinred

hey guys, do you have the right to RMA a card if it artifacts very rarely @ stock settings?

for example, if I play a game and see maybe 1 or 2 quick artifacts flash on screen in a 1-hour gaming session?

also, the card doesn't overclock at all; if I put +10MHz on the core, the artifacts come much more frequently.

my worry is that I RMA the card and they don't find artifacts at stock settings in their testing, as it doesn't happen all the time... I'd hate to be told they won't RMA it and have them send it back to me


----------



## gupsterg

If I had a card that artifacts at stock I would RMA. I would make a detailed account, videos, etc. Caveat being knowing the rest of your system is not an issue.

For example: no OC on CPU/RAM, the OS has no issue which would lead to artifacts (ie corrupt install, etc), and no known bug in the app/game loading the GPU that could cause the artifacts.


----------



## battleaxe

Quote:


> Originally Posted by *Linkinred*
> 
> hey guys, do you have the right to RMA a card if it artifacts very rarely @ stock settings?
>
> for example, if I play a game and see maybe 1 or 2 quick artifacts flash on screen in a 1-hour gaming session?
>
> also, the card doesn't overclock at all; if I put +10MHz on the core, the artifacts come much more frequently.
>
> my worry is that I RMA the card and they don't find artifacts at stock settings in their testing, as it doesn't happen all the time... I'd hate to be told they won't RMA it and have them send it back to me


I would say yes. Stock artifacting is no good. As it gets older it will get worse almost certainly.


----------



## Linkinred

Quote:


> Originally Posted by *battleaxe*
> 
> I would say yes. Stock artifacting is no good. As it gets older it will get worse almost certainly.


Here is a video:

the other card isn't as bad but it _still does_ artifact at stock settings


----------



## bluej511

Quote:


> Originally Posted by *Linkinred*
> 
> Here is a video:
>
> the other card isn't as bad but it _still does_ artifact at stock settings


Yeah, def RMA that; at stock speeds they're def either faulty or refurbs.


----------



## gupsterg

I concur, RMA.


----------



## tolis626

Dude, my card (Not the greatest overclocker by any stretch of the imagination) doesn't artifact like that at heavy overclocks. That's definitely cause for RMA. What I would do to make sure is to try and reseat the cards and their respective PCIe power connectors and maybe, if you're able to, test them in another system, just to make sure. Also, make sure to document the process if you do.

However, I'd say none of this is needed. There is definitely something wrong there and you are entitled to a replacement. I would RMA both cards too. "Doesn't artifact AS MUCH" doesn't cover me, it artifacts at stock settings, period.


----------



## bluej511

Quote:


> Originally Posted by *Linkinred*
> 
> Here is a video:
>
> the other card isn't as bad but it _still does_ artifact at stock settings


I'm curious, but what PSU are you using? And are you using daisy-chained power cables, ie a single PCIe 8-pin that splits into another PCIe 8-pin?


----------



## jdorje

Quote:


> Originally Posted by *Linkinred*
> 
> Here is a video:
>
> the other card isn't as bad but it _still does_ artifact at stock settings


Instability at stock is obviously cause for RMA.

But if you have two cards that don't work, I would assume the problem is not those cards. More likely the mobo, PSU, or drivers.

The white snow artifacting is usually from an unstable core. Raise voltage or lower clock to stabilize it.


----------



## Linkinred

Quote:


> Originally Posted by *bluej511*
> 
> I'm curious, but what PSU are you using? And are you using daisy-chained power cables, ie a single PCIe 8-pin that splits into another PCIe 8-pin?


I have tried the cards in a friend's system to rule out any issue with my computer.

My power supply is an EVGA SuperNOVA G2 Gold 850W.

I am not daisy chaining; EVGA has the 8-pin + 6-pin on a single cable.

If I drop that card back to 1040MHz (-20MHz from stock), it seems stable and doesn't show artifacts.

The OTHER card artifacts also, but not nearly as much; it's also a terrible card - just +10MHz on either of these cards increases the frequency of the artifacts by a lot


----------



## bluej511

Quote:


> Originally Posted by *Linkinred*
> 
> I have tried the cards in a friend's system to rule out any issue with my computer.
>
> My power supply is an EVGA SuperNOVA G2 Gold 850W.
>
> I am not daisy chaining; EVGA has the 8-pin + 6-pin on a single cable.
>
> If I drop that card back to 1040MHz (-20MHz from stock), it seems stable and doesn't show artifacts.
>
> The OTHER card artifacts also, but not nearly as much; it's also a terrible card - just +10MHz on either of these cards increases the frequency of the artifacts by a lot


Yeah, def RMA, especially if you tried it on a different system and got the same result.


----------



## Pohernori

Just an update on my PowerColor PCS+ R9 390X. It burnt itself out a week ago and is now in RMA. Will update on what PowerColor does.


----------



## kizwan

Quote:


> Originally Posted by *Linkinred*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluej511*
> 
> I'm curious, but what PSU are you using? And are you using daisy-chained power cables, ie a single PCIe 8-pin that splits into another PCIe 8-pin?
> 
> 
> 
> I have tried the cards in a friend's system to rule out any issue with my computer.
>
> My power supply is an EVGA SuperNOVA G2 Gold 850W.
>
> *I am not daisy chaining; EVGA has the 8-pin + 6-pin on a single cable.*
>
> If I drop that card back to 1040MHz (-20MHz from stock), it seems stable and doesn't show artifacts.
>
> The OTHER card artifacts also, but not nearly as much; it's also a terrible card - just +10MHz on either of these cards increases the frequency of the artifacts by a lot

8-pin & 6-pin on the same cable is basically similar to daisy chaining. Use separate cables for each PCIe connector.


----------



## bluej511

Quote:


> Originally Posted by *kizwan*
> 
> 8-pin & 6-pin on the same cable is basically similar to daisy chaining. Use separate cables for each PCIe connector.


Nice flowers, I got a pic just like that haha.

It's what I meant, but it's kinda hard to explain. Basically, use 2 SEPARATE cables for each 8-pin. I think he means each cable has an 8-pin and a 6-pin, but he's using 2 cables. I've def heard of issues and problems on the 290/390 series where daisy chaining CAN cause issues, but it's not a certainty. The fact that both cards have issues isn't a good thing.

I'm curious as to what OC and whatnot the CPU has and what else the system is using. Could be that it's getting excessive voltage ripple and the -20MHz adjustment on the cards stabilizes it, but he did try it on another rig as well, so who knows.


----------



## battleaxe

Happily, I won't be buying any Nvidia cards this generation. So glad they screwed me over so nicely on the 970. It really made things a lot easier for me. Now I only have to buy from one brand of GPUs - AMD, that is.

Thanks Nvidia for making my life so much easier. Now I can buy from a good brand instead of a bunch of thugs with no morals or business ethics.

Nice knowing you, Nvidisuck.


----------



## Linkinred

Quote:


> Originally Posted by *bluej511*
> 
> Nice flowers, I got a pic just like that haha.
>
> It's what I meant, but it's kinda hard to explain. Basically, use 2 SEPARATE cables for each 8-pin. I think he means each cable has an 8-pin and a 6-pin, but he's using 2 cables. I've def heard of issues and problems on the 290/390 series where daisy chaining CAN cause issues, but it's not a certainty. The fact that both cards have issues isn't a good thing.
>
> I'm curious as to what OC and whatnot the CPU has and what else the system is using. Could be that it's getting excessive voltage ripple and the -20MHz adjustment on the cards stabilizes it, but he did try it on another rig as well, so who knows.



This is how the cable looks - it is 8+6 on the one cable. The only way to crossfire using the EVGA 850W G2 SuperNOVA is by using these two cables with 8+6 on a single cable; the only bundled cables are 2x 8-pin or 2x 8+6... It was confirmed on the EVGA forums by an EVGA staff member that it is perfectly normal to use the cable like this - it is how it was intended to be used.

my CPU is running stock standard, no overclock at all.

The cards did the exact same thing in a totally different computer; I think we can rule out any issue with my PSU/machine by now, surely.


----------



## Streetdragon

Normally the cable should provide enough power / won't melt.
So RMA it.

BTW, some news about the 480:
http://videocardz.com/61193/amd-radeon-rx-480-rumors

Sapphire, or switch to XFX... what should I choose, hmm


----------



## bluej511

Quote:


> Originally Posted by *Linkinred*
> 
> 
> This is how the cable looks - it is 8+6 on the one cable. The only way to crossfire using the EVGA 850W G2 SuperNOVA is by using these two cables with 8+6 on a single cable; the only bundled cables are 2x 8-pin or 2x 8+6... It was confirmed on the EVGA forums by an EVGA staff member that it is perfectly normal to use the cable like this - it is how it was intended to be used.
>
> my CPU is running stock standard, no overclock at all.
>
> The cards did the exact same thing in a totally different computer; I think we can rule out any issue with my PSU/machine by now, surely.


I gotcha - still daisy-chained, but yeah, not a problem. If you used the same method on the other PC, though, it is possible that it's the issue.

Honestly, if you can, I would try to remove one card entirely from the PC - not even have it in the slot. Then use BOTH cables into the remaining card and see if you still have the issue; if you do, RMA time, and if not, then you'll know why. Worth a shot; that way when you RMA you can tell 'em you tried that. $20 says they'll make you try that anyways if they're anal lol.

I just saw the pic. You're not trying to plug a 6-pin into an 8-pin connector, right? Is your card 8-pin+8-pin or 8-pin+6-pin? Cause I can't tell if both those connectors are 8-pin; idk of many R9 390s that are 8+6.


----------



## DarthBaggins

Even my 390X is an 8+6-pin; guess it depends on the aftermarket manufacturer's decision to do dual 8-pins.


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> Even my 390X is an 8+6-pin; guess it depends on the aftermarket manufacturer's decision to do dual 8-pins.


Maybe Sapphire is the only 8-pin+8-pin; I thought they were all 8+8. Then yeah, you're fine if you tried it on two SEPARATE cables and still have issues - some cards can be finicky with daisy-chained power cables.


----------



## Gdourado

Quote:


> Originally Posted by *Linkinred*
> 
> Hi guys
>
> I have 2 R9 390Xs in crossfire... however, running stock clocks they throw up artifacts... I have tried both cards one by one and same deal, artifacts; I also tried them in a totally different system...
>
> one card seems to be worse than the other one...
>
> If I however downclock the core by say -20MHz, then all artifacts vanish and are gone for good...
>
> if I raise the clock by say +10 or +20MHz on the core I get crazy artifacts and screen flashes on both cards
>
> is it safe to say these are REALLY bad overclocking cards, or did I just get duds???
>
> should I RMA?


This reminds me of the whole issue with the Gigabyte 780 Ti GHz Edition, which would artifact like crazy at its out-of-the-box OC setting.


----------



## tolis626

Finally, FINALLY, a store in Greece has brought in some Thermal Grizzly compounds. You're gonna think that's a lot of excitement over something as mundane as TIM, but it's more the fact that it's (supposedly) the best TIM out there and we had no access to it that got on my nerves. Well, now I get to try it too. And it couldn't have come at a better time, as I'm having some CPU temp issues that I hope to sort out. Woo!

So I bought myself a 5.5g tube of Kryonaut and it's supposed to be here early next week. I hope they are punctual and don't screw me over. I'm more excited than I should be...

PS: I'm really, really curious how much better than GC Extreme it can be. I mean, right now my GPU doesn't go over 72C when I game, and that's with a hefty overclock of 1175/1625MHz at +75/+50mV. At just 10MHz and 25mV less, it stays below 70C all the time. And that's without even having to take my side panel off. Turns out that the MSI cooler is indeed phenomenal and it's being held back by some bad decisions in the thermal compounds used.

A rather big tip from me: use a 0.5mm thermal pad on the chokes. Or, if you believe that they're chokes and don't need cooling, don't use padding at all. It seems that a >0.5mm pad there (like the stock one or the 1mm Phobya I had before) hinders contact between the core and heatsink. With everything else exactly the same, with actually higher ambients and just the pad on the chokes changed, I shaved around 6C in your average gaming scenario and over 10C during a max overclock run. Ah... I'm finally enjoying this GPU even though its memory overclocking is crap.


----------



## m0nsky

Hello fellow 390/390X owners. I'm very late to the club.
I bought my MSI R9 390X Gaming 8GB about two months ago for my first PC build. I was planning on using this as a temporary GPU until the GTX1080 release, but I've decided to keep it.

I've read that the thermal paste on the MSI R9 390X cards has sometimes been applied poorly, so when it arrived I opened it up and applied some Thermal Grizzly Kryonaut.
I'm currently running a 1250MHz core clock and 1750MHz memory clock.



I've run 3DMark Firestrike, in which my PC scored 13252, with a 15590 graphics score. The temperature of the card stayed below 69C, which I consider pretty good because the ambient temperature is pretty high right now.
http://www.3dmark.com/fs/8950378

I've also run UserBenchmark, on which it scored 105% for gaming, with a 114% graphics score. Somehow the GPU charts say the R9 390X has never scored higher than 112%.
http://www.userbenchmark.com/UserRun/1239531


----------



## bluej511

Quote:


> Originally Posted by *m0nsky*
> 
> Hello fellow 390/390X owners. I'm very late to the club.
> I bought my MSI R9 390X Gaming 8GB about two months ago for my first PC build. I was planning on using this as a temporary GPU until the GTX1080 release, but I've decided to keep it.
> 
> I've read that the thermal paste on the MSI R9 390X cards has sometimes been applied poorly, so when it arrived I opened it up and apply some Thermal Grizzly Kryonaut.
> I'm currently running a 1250mhz core clock and 1750mhz memory clock.
> 
> 
> 
> I've ran 3DMark Firestrike in which my PC scored 13252, with a 15590 graphics score. The temperature of the card stayed below 69c which I consider pretty good because the ambient temperature is pretty high right now.
> http://www.3dmark.com/fs/8950378
> 
> I've also ran UserBenchmark, on which it scored 105% for gaming, with a 114% graphics score. Somehow the GPU charts say the R9 390X has never scored higher than 112%.
> http://www.userbenchmark.com/UserRun/1239531


Count yourself lucky, that's a hell of a nice OC. What is it running for aux voltage and power limit? That's a pretty good card. MSIs run pretty hot, so be glad it's at 69. Kryonaut is delicious stuff though.


----------



## mus1mus

How much additional voltage?


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> How much additional voltage?


Damn bro, -rep, did I not just ask that haha.


----------



## m0nsky

Thanks for replying. I knew I forgot something!
I have 3 presets in MSI Afterburner.

The first one is my 1200/1700 preset. I plan on using this if we get some crazy temperatures during the summer.
Voltages : +81/+63



The second one is the 1250/1750 preset which I am using right now.
Voltages : +100/+81



The third one is a factory preset which also uses my own fan curve.



The card always idles below 40C; when I boot up Battlefield 4, I let the fan kick in aggressively to keep things cool. I haven't really noticed the sound.

If you see something I'm doing completely wrong, please let me know! These presets are the results of lots of testing in Furmark, 3DMark and Battlefield 4. I'd like to go higher on the 1250/1750 setup, but I'm afraid I'll have to up the voltage, which doesn't seem to be possible.

If there's anything else you'd like to know, just ask.


----------



## mus1mus

Damn.

That's pretty good!

Jealous here! I could put a block on that thing and push it further.

Be sure to use GPU-Z and check the VDDC voltage. I wonder how +100 reflects into its VDDC.

Either way, you have a winner in there! Congrats!

Two of those can easily laugh at any current single card in XFire games btw.


----------



## christoph

What about the ASIC quality of that card? Just to see if it's related or not.


----------



## m0nsky

Here's the ASIC test in GPU-Z.



Some other stats (including VDDC).


----------



## tolis626

Quote:


> Originally Posted by *m0nsky*
> 
> Here's the ASIC test in GPU-Z.
> 
> 
> 
> Some other stats (including VDDC).


That VDDC screenshot doesn't tell us much. The best way to tell us is to install The Stilt's EVV app and see what your DPM7 voltage is. Also, you could just use GPU-Z and have VDDC show its max value. The only thing we can see from this screenie is that at some point your voltage was at 1.2-ish V.









Also, could you install HWiNFO64 and monitor for memory errors? I'm kind of curious to see how many your card gets.

Oh and, damn you, that's a golden card.


----------



## kizwan

The current value (VDDC) is what you want to look at because that is your card operating voltage.

Let me show you...


Max VDDC is 1.445V. But how often does the card actually run at 1.445V? Look at the graph above.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> The current value (VDDC) is what you want to look at because that is your card operating voltage.
> 
> Let me show you...
> 
> 
> Max VDDC is 1.445V. How often the card run at 1.445V shown in the graph above?


Well, I don't disagree that, ideally, we want VDDC, but wouldn't it be useless without some graph showing how it changes over time? I mean, that it was 1.2V at some point doesn't mean much, if anything. If 1.2V was the average VID, then ok.

That's why I asked for max VDDC achieved. That should tell us more or less what is going on. LLC functions pretty much the same on these cards (At least from the same manufacturer), so we could draw some conclusions. I could be wrong though.









Where did you see that 1.445V VDDC? Or is that from your GPU? That's quite high anyway.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The current value (VDDC) is what you want to look at because that is your card operating voltage.
> 
> Let me show you...
> 
> 
> Max VDDC is 1.445V. How often the card run at 1.445V shown in the graph above?
> 
> 
> 
> 
> 
> 
> 
> Well, I don't disagree that, ideally, we want VDDC, but wouldn't it be useless without some graph showing how it changes over time? I mean, that it was 1.2V at some point doesn't mean much, if anything. If 1.2V was the average VID, then ok.
> 
> That's why I asked for max VDDC achieved. That should tell us more or less what is going on. LLC functions pretty much the same on these cards (At least from the same manufacturer), so we could draw some conclusions. I could be wrong though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Where did you see that 1.445V VDDC? Or is that from your GPU? That's quite high anyway.
Click to expand...

The (his) screenshot shows GPU load is 100%. 1.2V was not just at some point but when the GPU load was 100%. That is what you want to look at. Max VDDC doesn't tell us anything &, if you look at my graph, the card doesn't run at that voltage when under load.

This is another graph with Max VDDC 1.453V. You can see the card is running at 1.3XXV when the card is under load.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> The (his) screenshot show GPU load is 100%. 1.2V was not just at some point but when the GPU load is 100%. That is what you want to look at. Max VDDC doesn't tell us anything & if you look at my graph, the card doesn't run at that voltage when under load.
> 
> This is another graph with Max VDDC 1.453V. You can see the card is running at 1.3XXV when the card is under load.


... I think we're saying the same thing but something's lost in the translation.









I agree. I was just saying that, as with your graph, voltage fluctuates a lot. Even at 100% load. On your example you can see it bouncing between 1.3V and 1.35V all the time. What I merely pointed out is that just VDDC at some random point in time isn't all that useful without more information, like max VDDC. A graph would be the most useful, though, so yeah, @m0nsky if you could post a graph, it'd be perfect. I think you can do that with GPU-z too, but HWiNFO and MSI Afterburner make it a lot easier.
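For what it's worth, if your logger can export a CSV, summarizing the whole trace beats eyeballing a single readout. A minimal sketch; the column headers here ("GPU load [%]", "VDDC [V]") are assumptions for illustration, so substitute whatever your tool actually writes:

```python
# Sketch: min/avg/max VDDC over the samples where the GPU was under load.
# Column names are hypothetical; adjust to your logger's CSV headers.
import csv
import statistics

def vddc_summary(path, load_threshold=95.0):
    """Return (min, mean, max) VDDC for samples at or above load_threshold %."""
    under_load = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["GPU load [%]"]) >= load_threshold:
                under_load.append(float(row["VDDC [V]"]))
    return min(under_load), statistics.mean(under_load), max(under_load)
```

That gives you the load-voltage range in one shot instead of watching the sensor panel bounce around.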









Also, kizwan, how much does that thing consume under load? Mine goes over 300W quite often in the 1.25-1.3V VDDC range.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Also, kizwan, how much does that thing consume under load? Mine goes over 300W quite often in the 1.25-1.3V VDDC range.


VDDC power out right? 220 to 240W. VDDC power in is around 250 to 300W.


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> VDDC power out right? 220 to 240W. VDDC power in is around 250 to 300W.


Meh, it's not that much higher than mine. Seems Hawaii/Grenada/Whatever just has a point where efficiency goes completely out the window. I didn't buy a 390x for its efficiency anyway...









I would be curious to see how temperatures, especially VRM temps, affect power draw. Could be interesting, but it would require water and I don't have a loop, sadly.


----------



## kizwan

Quote:


> Originally Posted by *tolis626*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> VDDC power out right? 220 to 240W. VDDC power in is around 250 to 300W.
> 
> 
> 
> Meh, it's not that much higher than mine. Seems Hawaii/Grenada/Whatever just has a point where efficiency goes completely out the window. I didn't buy a 390x for its efficiency anyway...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I would be curious to see how temperatures, especially VRM temps, affect power draw. Could be interesting, but it would require water and I don't have a loop, sadly.
Click to expand...

Between VDDC power out vs. power in, I like to look at VDDC power in, because that is the amount of power being drawn from the PSU (ignoring the accuracy of the reading).


----------



## tolis626

Quote:


> Originally Posted by *kizwan*
> 
> Between VDDC power out vs. power in, I like to look at VDDC power in because that is the amount of power being drawn from PSU (ignoring the accuracy of the reading).


Yeah, the only time I look at VDDC power out is if I want to roughly calculate the VRMs' efficiency at certain operating conditions. What's surprising to me is that these readings are quite accurate. At some point I was using a Kill-a-Watt, and the reading from that agreed somewhat with what the software readouts were saying. And the "somewhat" part is because there was a discrepancy, sure (about 30W or so), but factoring in 8 fans, 2 HDDs and 2 LED strips along with PSU efficiency... Yeah, it's quite close, I'd say.
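The two back-of-the-envelope checks being discussed here can be sketched in a few lines. All the wattages below are hypothetical example numbers, not readings from anyone's card in this thread:

```python
# Sketch: VRM efficiency from the power-in/power-out readouts, and expected
# wall draw given PSU efficiency. Example numbers only.

def vrm_efficiency(power_out_w, power_in_w):
    """Fraction of the rail's input power that reaches the core; the rest is VRM heat."""
    return power_out_w / power_in_w

def expected_wall_draw(dc_loads_w, psu_efficiency):
    """Total DC load divided by PSU efficiency approximates the AC draw at the wall."""
    return sum(dc_loads_w) / psu_efficiency

eff = vrm_efficiency(230.0, 280.0)                    # roughly 0.82
wall = expected_wall_draw([280.0, 90.0, 30.0], 0.90)  # GPU + CPU + rest, 90% PSU
print(f"VRM efficiency ~{eff:.0%}, expected wall draw ~{wall:.0f} W")
```

With numbers in that ballpark, a few dozen watts of gap between a wall meter and the software readouts is exactly what you'd expect.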


----------



## mus1mus

Quote:


> Originally Posted by *m0nsky*
> 
> Here's the ASIC test in GPU-Z.
> 
> 
> 
> Some other stats (including VDDC).


Awesome card!

1.203 under load is superb for 1250MHz.


----------



## MK-Professor

In MSI Afterburner's unofficial overclocking mode, I switched from "disabled" to "without powerplay support" to test something, then reverted back to "disabled", and now the GPU is stuck at 3D clocks all the time, resulting in high temps and noise at idle.

**How can I enable powerplay again?**


----------



## DarthBaggins

I have mine set at power play support, but I haven't pushed my clocks yet (1078/1750ish)


----------



## christoph

Quote:


> Originally Posted by *m0nsky*
> 
> Here's the ASIC test in GPU-Z.
> 
> 
> 
> Some other stats (including VDDC).


So the ASIC, as many guys have said, is not related to its overclock potential... or is it? Mine is 75.4%, but I haven't tried to OC my card yet.


----------



## kizwan

Quote:


> Originally Posted by *MK-Professor*
> 
> in msi afterburner in the unofficial overclocking mode I select "without powerplay support" from "disabled" that I had before to test something and after that I revert back to "disabled" and now the GPU is stack at 3D locks all the time as a result high temp and noise at idle.
> 
> **How I can enable again the powerplay?**


Uninstall & reinstall the driver.

When overclocking with "without powerplay support", it is important to reset your overclock before restarting the computer or before disabling the "without powerplay support" option. Otherwise it'll be stuck at 3D clocks all the time.


----------



## mus1mus

It can mean something up to a point.

It can OC to *X* MHz at *Y* voltage, as an example.

How far can it OC from stock is a different question.


----------



## bluej511

So I got Rise of the Tomb Raider, and DX12 seems to work pretty damn well. I used the in-game benchmark and the avg fps for all 3 scenes was pretty damn close; HOWEVER, the min framerate on DX11 was 8-10 and on DX12 was 35-40. So not sure how accurate it is, as we can't monitor in-game with RTSS, but whatever. Not sure which one plays smoother though, haven't tried.


----------



## m70b1jr

I'm selling my R9 390 with an Arctic Accelero Hybrid III-140 AIO liquid cooler on it to try to get an RX 480... Hopefully it's worth it. PM me if anyone here wants it.


----------



## gordesky1

Guys, I finally got a 390 Gaming 8GB from the MSI RMA, which in their mind replaced my 290X Lightning... Though so far it's not a bad card at all, and the 8GB does help in games that are VRAM hungry, like Ark, where it made a big difference.

Still miss my Lightning because... well... it's a Lightning lol.

So far I've got 1180 core and 1700 mem at +69 out of it. Didn't try any higher yet because I need to work on my case cooling first; it's getting toasty at 77-78C after a few hours of gaming...

Though I've been wanting to ask this question. In HWiNFO there is a monitor for GPU memory errors... I was wondering, is that accurate? Because the higher I overclock the memory, the higher it goes; right now it's at 222,109 errors current and the average is 51k, which keeps going higher and higher even in 2D. I'm pretty sure if that were true I would be having all kinds of issues in games or even in benches? Because I'm not seeing any problems at all. Heck, even going as little as 50MHz over on the memory will show errors on it.

My cousin wanted to know this too, because his Sapphire 7950 is also seeing about 6 to 10 errors even at stock, but no issues. None of my other cards, even the Lightning when I had it, had that monitor.

Doing multiple Valley benchmark runs shows no issues on the 390 even though the errors keep going higher and higher... Googling GPU memory errors for HWiNFO comes up with nothing much about it.


----------



## kizwan

Quote:


> Originally Posted by *gordesky1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Guys i finally got 390 gaming 8gb from msi rma which in their mind replaced my 290x lightning.... Tho so far its not a bad card at all and the 8gb does help in games that is vram hungry like ark which made a big difference in that game.
> 
> Still miss my lightning cause ...well ... its a lightning lol..
> 
> So far i got 1150 core and 1700mem out of it. didn't try any higher yet cause i need to work on my case cooling first cause it is getting toasty at 77-78 after a few hours of gaming...
> 
> 
> 
> Tho i been wanting to ask this question. In hwinfo there is a monitor that monitors gpu memory errors... i was wondering is that accurate? cause the higher i overclock the memory it goes higher and higher like right now its at current 222,109 errors and avg is 51k which keeps going higher and higher even in 2d.. Im pretty sure if that's true i would be having all kind of issues in games or even in bench's? cause im not seeing any problems at all. heck even going as small as 50mhz on the memory errors will show on it..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> My cousin wanted to know this too cause his sapphire 7950 is also seeing about 6 to 10 errors even at stock but seeing no issues. none of my other cards even the lightning when i had it had that monitor.
> 
> doing multiple valley benchmarks shows no issues on the 390 even tho the errors keeps going higher and higher... looking gpu memory errors for hwin on google comes up with nothing much about it.


There are two kinds of errors: correctable & uncorrectable. The EDC error counter (reported in HWiNFO) doesn't differentiate between the two. The uncorrectable ones are what produce artifacts. All I know is it's pretty accurate, based on info gathered from people that have an NDA agreement with AMD.

Your screenshot shows the core is in 2D but the memory is not, which is likely the reason why the error counter is still running; it should stop counting when there's nothing running that uses VRAM. If it doesn't cause any issue at all in gaming, then the errors are handled properly & you should not need to worry about it.
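Since the HWiNFO reading is a cumulative counter, a quick sanity check is to look at how fast it grows between samples rather than at the absolute number. A small sketch; the (time, count) pairs below are made up, not from anyone's actual log:

```python
# Sketch: turn samples of a cumulative error counter into errors-per-second
# between consecutive samples. The sample log here is hypothetical.

samples = [
    (0, 222_109),
    (60, 222_109),   # no growth: no new errors in this window
    (120, 223_400),  # counter grew: errors accumulated under load
    (180, 226_950),
    (240, 226_950),
]

def error_rates(log):
    """Errors per second between consecutive samples of a cumulative counter."""
    return [
        (t1, (c1 - c0) / (t1 - t0))
        for (t0, c0), (t1, c1) in zip(log, log[1:])
    ]

for t, rate in error_rates(samples):
    print(f"t={t:4d}s  {rate:7.1f} err/s")
```

A big number that never grows during your test window is old history; a number that climbs while you game is what actually tells you the current clocks are producing errors.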


----------



## gordesky1

Quote:


> Originally Posted by *kizwan*
> 
> There's two kind of errors; correctable & uncorrectable errors. The EDC error (reported in HWiNFO) doesn't differentiate between the two. The uncorrectables ones that will produce artifacts. All I know it's pretty accurate based on the info gathered from people that have NDA agreement with AMD.
> 
> Your screenshot show core is in 2D but memory is not which is likely the reason why the error counter still running but it should stop reading when there's nothing running that use VRAM. If it doesn't cause any issue at all in gaming, then the errors are handled properly & you should not need to worry about it.


I see, so pretty much as long as no artifacts show in games or benchmarks, everything is fine and can be ignored? Can I still go higher on the overclock as long as I don't see any artifacts in games etc.? It just worries me when the errors keep going higher as I overclock the mem more... The weird thing is even a very small overclock on the mem will report some errors.


----------



## kizwan

Quote:


> Originally Posted by *gordesky1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> There's two kind of errors; correctable & uncorrectable errors. The EDC error (reported in HWiNFO) doesn't differentiate between the two. The uncorrectables ones that will produce artifacts. All I know it's pretty accurate based on the info gathered from people that have NDA agreement with AMD.
> 
> Your screenshot show core is in 2D but memory is not which is likely the reason why the error counter still running but it should stop reading when there's nothing running that use VRAM. If it doesn't cause any issue at all in gaming, then the errors are handled properly & you should not need to worry about it.
> 
> 
> 
> I see so pretty much as long no artifacts shows in games or benchmark everything is fine and can be ignored? Can i still go higher on the overclock as long as i don't see any artifacts in games etc? it just worrys me when the errors keeps going higher when i overclock the mem more.. The weird thing is even very little overclock on the mem will report some errors..
Click to expand...

If it doesn't cause visual artifacts or crashes, then it should be fine. Forgot to mention that you should not get any errors at stock, because that likely indicates a bad card/memory/BIOS. The reading is a counter, meaning the number keeps increasing when errors occur; it doesn't reset.

At what frequency are you starting to get EDC error readings? For the 290/290X, the limit starts around 1500MHz & higher, where you'll likely get EDC errors. For the 390/390X, it seems to be around 1600MHz & higher. On Hawaii/Grenada, the memory controller is what limits the card's memory overclockability, not the memory itself.
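The stepped approach people use for this (creep up in small increments, back off at the first sign of trouble) can be sketched like so. `is_stable` here is just a placeholder for whatever check you trust, e.g. a benchmark pass with no artifacts and no EDC error growth:

```python
# Sketch of stepped memory overclocking: raise the clock in small increments
# and keep the last step that passed the (user-supplied) stability check.

def find_max_mem_clock(is_stable, start_mhz=1500, limit_mhz=1800, step_mhz=25):
    best = start_mhz
    clock = start_mhz + step_mhz
    while clock <= limit_mhz:
        if not is_stable(clock):
            break  # first failing step: back off to the last good clock
        best = clock
        clock += step_mhz
    return best

# Example with a pretend check that passes below 1625 MHz:
print(find_max_mem_clock(lambda mhz: mhz < 1625))  # -> 1600
```

Smaller steps take longer but land you closer to the real ceiling before errors start stacking up.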


----------



## gordesky1

Quote:


> Originally Posted by *kizwan*
> 
> If it doesn't cause visual artifacts or crash, then is should be fine. Forgot to mention that you should not get any error at stock because that likely bad card/memory/bios. The reading is a counter meaning the number will keep increasing when error occurs, it doesn't reset.
> 
> At what frequency you're starting to get EDC error reading? For 290/290X, the limit start around 1500MHz & higher when you'll likely to get EDC error. For 390/390, seems like around 1600MHz & higher. On Hawaii/Grenada, the memory controller is what limit the card memory overclockability not the memory itself .


At stock I get none, but I get some errors around 1550, and the count climbs very, very slowly. At 1600+ it starts to climb fast, and in the screenshot I posted I had the memory around 1700, with no artifacts or crashing.

So pretty much every card gets these errors when overclocking the memory? I see people in this forum overclocking to 1750/1800; do they get a lot?

I'm pretty much a newb at this error counter, because my AMD 5x series didn't have it, nor my GTX 570, nor even my 290X Lightning when I had it, which I find weird...

Trying to find the sweet spot for this card, and when I see those errors climbing it scares me lol...


----------



## Stige

Quote:


> Originally Posted by *m70b1jr*
> 
> I'm selling my R9 390 with a Arctic Acclero Hybrid III-140 AIO Liquid cooler on it to try to get a RX480.. Hopefully it's worth it. PM Me if anyone here wants it


480 is worse than a 390, why would it be worth it in any way?


----------



## mus1mus

Quote:


> Originally Posted by *Stige*
> 
> 480 is worse than a 390, why would it be worth it in any way?


How do you know?
Quote:


> Originally Posted by *gordesky1*
> 
> at stock i get none, butI get some errors around 1550 and very very slowly climbs. at 1600+ it starts to climb fast and in the screenshot i posted i had the memory around 1700 which had no artifacts or crashing.
> 
> So pretty much every card gets these errors when overclocking the memory? i see people in this forum overclocking to 1750 1800 do they get alot?
> 
> Im pretty much a newb at this error counter because my amd 5x series didn't have it or my 570gtx or even my 290x lightning when i had it which i find weird..
> 
> Trying to find the sweet spot for this card and when i see those errors climbing it scares me lol..


I get some errors on one card at 1550. But yeah, modded rom and timings.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> How do you know?
> I get some errors on one card at 1550. But yeah, modded rom and timings.


It's not that he doesn't know. They are pretty much even, or close, in performance. The RX 480 even tessellates better, which would be worth the upgrade alone. Considering there's an RX 490 coming out before the end of the year, that's what I'm waiting for.

Would have jumped on the 480, it's around 270€ here, but no need if a 490 is coming.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Its not he doesnt know. They are pretty much even or close to in performance. The rx480 even tessells better which would be worth the upgrade alone. Considering theres an rx 490 coming out before the end of the year its what im waiting for.
> 
> Would have juimped on the 480 its around 270€ here but no need if a 490 is coming.


You guys may be talking about an OC'ed 390 vs a 480 with premature OC results.

Review samples are not gonna be allowed the same OC treatment we are enjoying now with Hawaii.

And oops, a new Devil is coming. Dual RX 480s? Sweet.


----------



## Irev

So the RX 480 is a little slower than the R9 390... and has no overclocking headroom (from Tech of Tomorrow's review sample, anyway).

I want to see other reviews first before buying


----------



## mus1mus

Teasers mate. Teasers. Tech of Tomorrow would be damned to release a proper review and neglect the embargo.


----------



## bluej511

Quote:


> Originally Posted by *Irev*
> 
> 
> 
> 
> 
> 
> so rx480 is a little slower then R9 390... and has no overclocking (from tech of tomorrows review sample anyway)
> 
> I want to see other reviews first before buying


The Firestrike score is higher than my 390's, and OC'ed it's higher as well. So they seem to be about even. A lot of people are testing them with the 16.6.1 drivers when they should be on 16.6.2, so who knows. AIB cards should be coming out during the week as well, so we'll see how it goes. One guy was getting higher framerates on a 480 with HairWorks on than on a 390 with it off, so we'll see.


----------



## bluej511

Our French prices, and why I won't even buy an Asus card haha.


----------



## tolis626

I'm finding the RX 480 kinda underwhelming from the reviews I'm seeing. I thought it would at least be handily beating the 970 and 390, not struggling against them. Yes, it's very good for $200 (well, $240; I wouldn't consider the 4GB version really viable for me), but the 970 is what? A year and a half old? More? Not to mention the 390.

Anyway, my only hope is that once custom boards show up and people start pushing some voltage, it'll get there. JayzTwoCents got 1360MHz, for example, without adding voltage (I think). Kyle from AwesomeSauceNetwork got 1320MHz with no added voltage (although he did run into throttling). If it scales, maybe we'll see 1500MHz and some nice results. But for now, meh. The decreased power consumption alone isn't even remotely worth it for me.









Anyone know if we're going to get Wattman for the 390/390x too? I would love to be able to configure my overclock per P-state without having to mod my BIOS. Or is it only possible on the 480?


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> I'm finding the RX 480 kinda underwhelming from the reviews I'm seeing. I thought it would at least be handily beating the 970 and 390, not struggle against them. Yes, it's very good for 200$ (well, 240$, I wouldn't consider the 4GB version really viable for me), but the 970 is what? A year and a half old? More? Not to mention the 390.
> 
> Anyways, my only hope is that once custom boards show and people start pushing some voltage, it'll get there. JayzTwoCents got 1360MHz for example without adding voltage (I think). Kyle fron AwesomeSauceNetwork got 1320MHz with no voltage (Although he did run into throttling). If it scales, maybe we'll see 1500MHz and some nice results. But for now, meh. The decreased power consumption alone isn't even remotely worth it for me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone know if we're going to get Wattman for the 390/390x too? I would love to be able to configure my overclock per P-state without having to mod my BIOS. Or is it only possible on the 480?


I felt the same way till I looked at power consumption haha. It destroys the GTX 970 in wattage and draws almost 100W less than the R9 390 for almost the same performance. Also don't forget, these are very, very immature drivers and it's pretty much a new process.

Also, a 5% OC on the core ends up giving 10% more fps in some games; that's pretty damn impressive for only like a 100MHz OC.


----------



## kizwan

Quote:


> Originally Posted by *gordesky1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> If it doesn't cause visual artifacts or crash, then is should be fine. Forgot to mention that you should not get any error at stock because that likely bad card/memory/bios. The reading is a counter meaning the number will keep increasing when error occurs, it doesn't reset.
> 
> At what frequency you're starting to get EDC error reading? For 290/290X, the limit start around 1500MHz & higher when you'll likely to get EDC error. For 390/390, seems like around 1600MHz & higher. On Hawaii/Grenada, the memory controller is what limit the card memory overclockability not the memory itself .
> 
> 
> 
> 
> 
> 
> 
> at stock i get none, butI get some errors around 1550 and very very slowly climbs. at 1600+ it starts to climb fast and in the screenshot i posted i had the memory around 1700 which had no artifacts or crashing.
> 
> So pretty much every card gets these errors when overclocking the memory? i see people in this forum overclocking to 1750 1800 do they get alot?
> 
> Im pretty much a newb at this error counter because my amd 5x series didn't have it or my 570gtx or even my 290x lightning when i had it which i find weird..
> 
> Trying to find the sweet spot for this card and when i see those errors climbing it scares me lol..
Click to expand...

You can try increasing the AUX voltage to 1.050V - 1.100V. I would not recommend going over 1.1V, because the memory controller is sensitive to AUX voltage increases.

If you can game at that freq for hours without crashing & artifacting, you can ignore the errors. If your overclock is really not stable, it can crash within 30 to 45 minutes of gameplay.

If I remember correctly, the memory error counter became available earlier this month. You had probably already RMA-ed your card by that time.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Its not he doesnt know. They are pretty much even or close to in performance. The rx480 even tessells better which would be worth the upgrade alone. Considering theres an rx 490 coming out before the end of the year its what im waiting for.
> 
> Would have juimped on the 480 its around 270€ here but no need if a 490 is coming.


I know because there are some benchmarks out there already. Legit ones, not fake. The 480 is just about the same as or slightly slower than the 970, so definitely slower than a 390.


----------



## tolis626

So 16.6.2 driver came out and guess what? No Wattman for us. Booooooooooooo!


----------



## DarthBaggins

So looks like I'll be sticking with the beta driver I'm rocking right now


----------



## bluej511

How sexy.


----------



## tolis626

Quote:


> Originally Posted by *bluej511*
> 
> How sexy.


That's not sexy at all. Cute would be the right word. But cute like a kitten, not like a girl. You can pet it and play with it, but that's about as much interaction as you can get.









I wouldn't bother overclocking a reference card, to be honest. Single 6-pin power is sure going to be a limiting factor.


----------



## bluej511

Quote:


> Originally Posted by *tolis626*
> 
> That's not sexy at all. Cute would be the right word. But cute like a kitten, not like a girl. You can pet it and play with it, but that's about as much interaction as you can get.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wouldn't bother overclocking a reference card, to be honest. Single 6-pin power is sure going to be a limiting factor.


Idk, I don't think that's the issue; I think it's more thermals than anything. Even with the blower at 100% it's horrible.

Unfortunately AMD has nowhere near Nvidia's market share, let alone for watercooling. EKWB and the other manufacturers only end up making blocks for the reference PCB designs. There probably won't be a Sapphire or MSI RX 480 waterblock. I'm hoping the sales are so huge that they do make one, but they haven't for the R9 200/300 series at all.

P.S. I think bare copper is damn sexy, I'd never get nickel plated haha.


----------



## gordesky1

Quote:


> Originally Posted by *kizwan*
> 
> You can try increase the AUX voltage to 1.050V - 1.100V. I would not recommend going over 1.1V because the memory controller sensitive with AUX voltage increase.
> 
> If you can gaming at that freq without crashing & artifacting for hours, you can ignore the error. If your overclock is really not stable, it can crash within 30 to 45 minutes of gameplay.
> 
> If I remember correctly the memory error counter started available earlier this month. You probably already RMA-ed your card that time.


Just tried up to 1.1V, and at 1600 I got no errors, but anything over that and they still stack up pretty fast. But yeah, I'm at 1730 at the moment with no artifacting or crashing so far, and I'd say I've been gaming for about 3 to 4 hours. So I guess I can ignore them till I start getting crashes or artifacts lol.


----------



## gordesky1

Another thing: is there a way to have MSI Afterburner not run high voltage through the card at idle? Like, I have my clock settings at 1190/1730; core voltage was at +90, which is a little over 1.3, and at load it's around 1.22. As soon as I exit a game or stress program it goes back to 1.3... Or is this just how these run?

I tried force constant voltage, hoping I could lower that voltage down to 1.22 and have it stay that way, but so far that setting is not doing a thing?


----------



## diggiddi

Quote:


> Originally Posted by *gordesky1*
> 
> Just tried up to 1.1v and at 1600 i got no errors but anything over that they stack up pretty fast still. But yea im at 1730 at the moment and so far no artifacting or crashing and i say i been gaming for about 3 to 4 hours. so i guess i can ignore them till i start having crashing or artifacts lol


^The error thing is funny. I OC'd the mem to 1620 IIRC and it gave me beaucoup errors; then I stepped up from 1500 in +25MHz increments back to the same freq and it gave me 0 errors
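The step-by-step approach above can be sketched roughly like this. It's purely illustrative: `set_mem_clock` and `read_edc_errors` are hypothetical stand-ins for whatever tool you actually use to apply the clock and read the memory-error counter (e.g. HWiNFO's GPU memory errors field), not a real API.

```python
# Hedged sketch: walk the memory clock up in small steps and keep the
# highest setting that reported zero errors, instead of jumping straight
# to a target frequency.

def find_stable_mem_clock(set_mem_clock, read_edc_errors,
                          start=1500, step=25, limit=1750):
    """Return the highest clock (MHz) that produced zero memory errors."""
    best = start
    clock = start
    while clock <= limit:
        set_mem_clock(clock)           # apply the candidate clock
        if read_edc_errors() > 0:      # first failing step: stop here
            break
        best = clock                   # this step was clean; keep it
        clock += step
    return best
```

In practice you would also run a load test at each step before trusting the error counter, since errors only accumulate under load.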











I hope they let us older GCN'ers get in on that Wattman after the newer GPUs have played with it for a little while,
just like when the 390 came out with that new driver


----------



## kizwan

Quote:


> Originally Posted by *gordesky1*
> 
> Another thing: is there a way to have MSI Afterburner not run high voltage through the card at idle? I have my clocks set at 1190/1730 with core voltage at +90, which is a little over 1.3V, and at load it's around 1.22V. As soon as I exit a game or stress program it goes back to 1.3V... Or is this just how these run?
> 
> I tried "force constant voltage" hoping I could lower it to 1.22V and have it stay that way, but so far that setting isn't doing a thing.


I remember your card's memory freq does not throttle down when idle; that's why you're seeing 1.3V at idle. If you have multiple monitors, or a single monitor with a refresh rate higher than 60Hz, the memory freq can get stuck at the 3D clock when idle.


----------



## gordesky1

Quote:


> Originally Posted by *kizwan*
> 
> I remember your card's memory freq does not throttle down when idle; that's why you're seeing 1.3V at idle. If you have multiple monitors, or a single monitor with a refresh rate higher than 60Hz, the memory freq can get stuck at the 3D clock when idle.


Yeah, I'm using dual monitors. But is there a reason the voltage drops from 1.3 to 1.22 at load?


----------



## gordesky1

Hmm, is this normal too? If I keep the memory at the stock 1500, the voltage stays around 1.141V at +56, but at load it jumps to 1.2-something, which seems fine. But if I raise the memory even 5MHz over stock, the screen blinks and the voltage goes to 1.289V... and at load it drops lower.

I know it's not the memory downclocking, because with 2 monitors it always sits at 1500 at stock. I just find it weird that even 1MHz over stock on the memory raises the voltage on its own, while if I keep it at 1500 and just overclock the core, the voltage is pretty stable: instead of dropping at load it goes up as it should, and when I quit the program it comes back down.


----------



## jdorje

Okay, serious question, might have asked or hinted at it before, but never gotten very good answers.

I have an XFX 8256 390. I also have an H80i, currently on my CPU. I'm willing to get another AIO (not preferred) or work toward a custom loop (sounds fun!).

How can I hook my H80i up to the 390? Should I disassemble my GPU cooler and put it right on? The Kraken G10 won't fit my H80i, and the HG10 won't work well with my existing cooler. It would be ideal to move my AIO to the GPU and get a better CPU cooler, but that doesn't seem possible.

Alternately, I wouldn't mind building or working toward a loop. But the 8256 (with the notorious "Z" VRAM VRM) won't fit any full-cover water blocks to my knowledge. Are core-only blocks viable? Those would transfer over to a new GPU, right? A distinct advantage, I guess.

I welcome any advice or discussion on this matter.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> Okay, serious question, might have asked or hinted at it before, but never gotten very good answers.
> 
> I have an XFX 8256 390. I also have an h80i, currently on my CPU. Willing to get another AIO (not preferred) or work on a custom loop (sounds fun!).
> 
> How can I hook my H80i up to the 390? Should I disassemble my GPU cooler and put it right on? The Kraken G10 won't fit my H80i, and the HG10 won't work well with my existing cooler. It would be ideal to move my AIO to the GPU and get a better CPU cooler, but that doesn't seem possible.
> 
> Alternately, I wouldn't mind building or working toward a loop. But the 8256 (with the notorious "Z" VRAM VRM) won't fit any full-cover water blocks to my knowledge. Are core-only blocks viable? Those would transfer over to a new GPU, right? A distinct advantage, I guess.
> 
> I welcome any advice or discussion on this matter.


A 120mm rad is barely enough for a GPU; I still don't understand why people do it. The point of watercooling is to run cooler than air and quieter, and that fits neither bill.

Personally, I went straight to a full loop and haven't looked back. My Alphacool block only passively cools the VRMs, but it does a great job of it, and the core has only recently reached 45°C because ambient case temp is now 27+ whereas it was 22+ before; so it's a bit warmer in summer, but not much.

Honestly, if you've got the money to do a full loop, do it. There's probably an Alphacool block for it, or you can always get the EK Thermosphere, stick small heatsinks on the RAM and nice beefy ones on the VRM, and have a fan blow over them. I dropped 10°C off my VRM just by pointing a 900rpm 120mm fan right at its heatsink.


----------



## jdorje

There aren't any full-cover blocks for it to my knowledge.

A 120mm AIO cooler will be way better than the stock cooler. I'd probably have to rig up some VRM cooling though... same as for a loop with a regular core-only water block.


----------



## Master0fBlunt

To update my VRAM/GPU clocks on the page-1 OP list: I've been running 1150 GPU / 1650 VRAM stable for quite some time now, several months. When I push the VRAM up to 1700 I get system crashes though: the screen freezes and nothing happens, and I have to hard-reset the PC. Using Afterburner with the power limit maxed and +25mV.


----------



## mus1mus

Quote:


> Originally Posted by *jdorje*
> 
> There aren't any full cover blocks for it to my knowledge.
> 
> A 120mm aio cooler will be way better than the stock cooler. I'd probably have to rig up some vrm cooling though....same as for a loop with a regular water block.


Check out Bykski on AliExpress. They have some blocks that may fit some cards.

http://m.aliexpress.com/item/32253411798.html
Quote:


> Originally Posted by *Master0fBlunt*
> 
> To update my VRAM/GPU clocks on the page 1 OP list, I've been running 1150 GPU / 1650 VRAM Stable for quite some time now, several months. When I push the VRAM up to 1700 i get system crashes though, screen freezes and nothing happens, have to hard reset PC. Using Afterburner with "Power Limit" maxed and +25mV


You might need to bump VDDC to help the memory get to 1750.

Somehow, memory voltage and OC scale with core voltage.


----------



## Ayyemdee

As someone who bought a 390 at the beginning of May this year, how much did I lose out on by not waiting for the RX 480?


----------



## bluej511

Quote:


> Originally Posted by *Ayyemdee*
> 
> As someone who bought a 390 at the beginning of May this year, how much did I lose out on by not waiting for the RX 480?


I probably bought mine around the same time and don't regret it. The R9 390 performs better in everything; the only things you lost out on are Wattman and an 80W or so power difference. And if you use FRTC or Vsync, probably not even that.

A more powerful card using Vsync will probably draw about the same power as a weaker card going full-out to hold the same fps. I thought I was going to see a significant difference, but that's not the case. I'll wait for the AIB reviews and see, but I don't expect much improvement, probably just in temps. The RX 490 is probably what I'll upgrade to.


----------



## battleaxe

Quote:


> Originally Posted by *Ayyemdee*
> 
> As someone who bought a 390 at the beginning of May this year, how much did I lose out on by not waiting for the RX 480?


You're probably burning about 125 more watts while gaming. Big deal, right?

Me? I don't care. I'm waiting for the 490X to see how she does.

The 480 seems a good card, especially for the money. But the 480 isn't really for us; it's for the $200 card market. We are in a different class IMO... so we wait some more.


----------



## Streetdragon

Bought a second R9 390 Nitro + waterblock.
If the new card has the same memory, can I copy the modded BIOS from my first card (Hynix) onto the second without problems, assuming it has Hynix too?


----------



## diggiddi

Quote:


> Originally Posted by *Streetdragon*
> 
> Bought a second R9 390 Nitro + waterblock.
> If the new card has the same memory, can I copy the modded BIOS from my first card (Hynix) onto the second without problems, assuming it has Hynix too?


Sounds like a yes to me


----------



## mus1mus

Just mod it the same way you modded the first card.


----------



## Majentrix

Now that the 390 is nearing the end of its shelf life, it's a shame we didn't get more waterblocks for vendor cards. Would've loved a full cover block for my Strix.


----------



## flopper

Quote:


> Originally Posted by *Ayyemdee*
> 
> As someone who bought a 390 at the beginning of May this year, how much did I lose out on by not waiting for the RX 480?


Quote:


> Originally Posted by *bluej511*
> 
> I probably bought mine around the same time and don't regret it. The R9 390 performs better in everything; the only things you lost out on are Wattman and an 80W or so power difference. And if you use FRTC or Vsync, probably not even that.
> 
> A more powerful card using Vsync will probably draw about the same power as a weaker card going full-out to hold the same fps. I thought I was going to see a significant difference, but that's not the case. I'll wait for the AIB reviews and see, but I don't expect much improvement, probably just in temps. The RX 490 is probably what I'll upgrade to.


The 390 seems to have a long life ahead of it, and I don't expect anything really worth upgrading to until Vega hits.
The 490 might be, but I don't expect it to be.
Still awaiting the custom 480 cards; if one fits, my sister's kid might get a good upgrade deal.


----------



## deskiller

I bought a Sapphire Nitro 390X 8GB Wednesday; I should be receiving it next Tuesday.

It will be replacing my Nvidia 780 Classified. I used to have two 780s in SLI, but during maintenance I accidentally broke three transistors on the back of one card.

I've been wanting a card with more memory anyway... and I've always leaned more toward Team AMD.


----------



## Streetdragon

Quote:


> Originally Posted by *mus1mus*
> 
> Just mod it the same way you modded the first card.


I didn't mod the memory; spyshagg made it for me (one more big thx!)

I flashed the BIOS onto the second card and it works! http://www.3dmark.com/fs/9086609 I think it's nice ^^


----------



## Krzych04650

I have two questions:

What is the difference between Sapphire 390 Nitro 11244-*01*-20G and 11244-*02*-20G ?

I am a bit tired of waiting for the 480, and now there are additional weeks of waiting for custom designs. Also, I don't see it giving any improvement over the 390 in performance or price-to-performance (at least in my country). But I am still curious how this card will perform and OC without temperature and power limitations. Here are some more of my thoughts from another forum; what do you think I should do? Pick up a 390 or wait?
Quote:


> And now I am even more confused... I wanted to get a 480 temporarily until Vega is out, but since it doesn't bring anything new in terms of performance or price-to-performance (at least in my country: a reference 480 costs 1349 zł while the Sapphire 390 Nitro costs 1440 zł, so a 480 Nitro will be at the same price or most likely more expensive than the 390 Nitro), I decided not to wait another who-knows-how-many weeks for custom 480s and just get the 390 Nitro (still didn't order it). Power draw is not a problem; I already calculated that my monthly electricity bill would grow from 250 to 258 zł if I take the 390 over the 480, so this is a less-than-marginal difference. Noise and temps are also not a problem; the Sapphire 390 Nitro is a very cool and quiet card, especially for its TDP. But now this 480 Nitro looks really promising, and I am still curious how power-limited the 480 is and whether it can beat a ~1150-1200 MHz 390 once its OC potential is unleashed, if there is any at all, and whether the additional power connectors will give any benefit and allow ~1500 overclocks. But there is still no release date and no reviews, who knows how much waiting is left, and I could get a 390 in like 3 days from now. My patience is really running out, but I am still curious what the 480 truly is once the temperature and power limitations are removed by AIBs. On the other hand, waiting 2+ weeks for reviews and probably a month or so for availability at a reasonable price is too much for me at the moment; I have been without a GPU for a month now...
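For what it's worth, the bill estimate in the quote above is easy to sanity-check: extra watts × hours per day × days, converted to kWh, times the tariff. The 0.55 zł/kWh rate below is just an assumed example tariff for illustration, not one quoted in the thread.

```python
# Rough monthly cost of a GPU's extra power draw.
def monthly_extra_cost(extra_watts, hours_per_day, price_per_kwh, days=30):
    kwh = extra_watts / 1000 * hours_per_day * days  # extra energy in kWh
    return kwh * price_per_kwh

# e.g. ~80 W more for 5 h/day at an assumed 0.55 zl/kWh tariff:
print(round(monthly_extra_cost(80, 5, 0.55), 2))  # prints 6.6
```

A few zł per month, which is the same order of magnitude as the 250 → 258 zł difference mentioned above.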


Also, I play at 3440x1440, so the 390's greater memory bandwidth should be another argument for getting it over the 480, right? The 480 seems to do very well at 1080p, but it loses to the 390 at 1440p and especially 4K, and 3440x1440 is somewhere in the middle.

Overall I don't think the 480 will beat a maxed-out overclocked 390 by any significant margin at 3440x1440 even with custom designs; it could be faster by less than 5%, or so I think...

Two more questions:

Is it true that the Sapphire overclocking tool allows +200mV of overvolting?
Is it true that the power section on the 390 Nitro gets as hot as 100°C+ after overvolting?


----------



## flopper

Quote:


> Originally Posted by *Krzych04650*
> 
> I have two questions:
> 
> What is the difference between Sapphire 390 Nitro 11244-*01*-20G and 11244-*02*-20G ?
> 
> I am a bit tired of waiting for 480 and now there are additional weeks of waiting for custom designs. Also I don't see it giving any improvement over 390 in terms of performance and price to performance (at least in my country). But I am still curious about how this card is going to perform and OC without temperature and power limitations. Here are some more of my thoughts about it from other forum, what do you think should I do? Pick up 390 or wait?
> Also I play at 3440x1440 resolution so huge bandwidth of 390 should also be argument for getting it over 480 right? 480 seem to be doing very well in 1080p but it is loosing with 390 in 1440p and especially 4K, 3440x1440 is somewhere in the middle.


The older Nitro version had NO backplate; the newer version has one (likely the reason for the price difference).
Hard to say on the 480: it has early, immature drivers, so one can expect a ~10% increase over time.
I suspect the overall playing experience is similar between the two.

I would say I'd rather go for the 480, as newer tech is often the better option in the long run.
Can't go wrong with either purchase, though.

A 390 OCs to 1100-1150MHz, give or take.
A custom 480: 1350-1500(?) MHz.


----------



## tolis626

My tube of Kryonaut got here this morning and first impressions are extremely good. It's the details, you know. You purchase a premium product, you want it to feel like one. It's well packaged, has two applicators (Maybe in case one of them gets lost or something), a Thermal Grizzly sticker... It was an attractive package. And when a company goes to great lengths for something as mundane as thermal paste, they have my attention.

Now, I installed it on my 390X and... meh at first. A 1C improvement? Maybe? Then I realized the numbers I was comparing in my head were against Gelid GC Extreme at an ambient of about 21C, while the room was certainly over 25C while I was playing. So take that how you will. It works. GC Extreme is a close second, but neither Arctic MX-4 nor, god forbid, MSI's stock paste cut it; both led to temps over 80C with a +50mV overvolt. At my usual 1175/1625MHz at +70/+50mV, the Gelid paste would land at about 73C max, while Kryonaut stayed at 72C max but was usually at 70-71C. I'll test again after having my AC run for a few hours.

TL;DR : If you can find it, get some. For the CPU, I dunno how much of a difference it'll make (I'll try in the next few days), but for the GPU it's worth it. We're talking about over 10C improvement over stock for my MSI 390x. It's like getting a new cooler for it.


----------



## DarthBaggins

I have Hydronaut on my CPU and it works better than anything else I've used. Very pleased with it, and I definitely plan on using it on my 390X.


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> I have Hydronaut on my CPU and it works better than anything I've used. Pleasantly pleased with it, and definitely plan on using it on my 390x


Hydronaut is pretty good, but Kryonaut is better for GPUs; it's nice and thick, which is the same reason Gelid GC Extreme works so well on GPUs.


----------



## Krzych04650

Quote:


> Originally Posted by *flopper*
> 
> Older nitro version had NO backplate the newer version has one.
> (likely the reason)
> Hard to say, the 480 has early not mature drivers so one can expect a 10% increase over time or so.
> I suspect the overall experience playing is similiar to one another.
> 
> I would say I rather go for the 480 as newer tech is often the better option in the long run.
> cant go wrong in any of the purschases though.
> 
> a 390 Oc to 1100-1150mhz give or take.
> a 480 custom 1350-1500? mhz


This is a temporary card until Vega is out; then I will resell it or hand it down to my younger brother even cheaper, so the long term doesn't really matter here. I will switch away from it soon, and since I have a 3440x1440 monitor I won't be playing the AAA games where all of those newer technologies can even matter; I just need something to wait for Vega on and play some less demanding games in the meantime. My younger brother is not any kind of enthusiast; this kind of GPU will be amazing for him compared to his current crappy laptop anyway, and he will play at 1080p at best, probably 900p, so again nothing to gain from newer technologies.

I will get the 390. Some people who pre-ordered this card from Overclockers UK said they were told they will get them in late July, the 22nd to be precise, and Overclockers is the first shop in the EU to get them. So until those cards reach my country... and given the very small chance of reasonable pricing for the first month or so... I am not going to wait two months; I have already been a full month without a GPU.


----------



## TotemTed

I would like to join in on the shenanigans.
https://www.techpowerup.com/gpuz/details/uqff3

Also, the spreadsheet on the first page is broken for me; I can't scroll left or right.


----------



## Hanjin

Picked up a Sapphire R9 390 Nitro w/backplate for cheap after trading my GTX 950, and I was wondering what mods and tweaks I can do to it for better overclocks.

https://www.techpowerup.com/gpuz/details/arc8


----------



## SuperZan

Dunno why I never bothered to join, I've had these cards for a bit. Anyhow:

http://i.imgur.com/y5bxwHt.png

https://www.techpowerup.com/gpuz/details/6hyn6

https://www.techpowerup.com/gpuz/details/bf9qd

2x Gigabyte G1 Gaming R9 390s - stock for now


----------



## Krzych04650

Also, I still don't know what the difference is between 11244-*01*-20G and 11244-*02*-20G for the Sapphire 390 Nitro. 11244-*00*-20G was the version without a backplate, 11244-*01*-20G is the one with a backplate, and 11244-*02*-20G is who knows what, and it is more expensive (1440 zł vs 1569 zł).


----------



## gupsterg

Have you checked the GPU clock spec for each?


----------



## Master0fBlunt

Quote:


> Originally Posted by *Ayyemdee*
> 
> As someone who bought a 390 at the beginning of May this year, how much did I lose out on by not waiting for the RX 480?


Nothing; the 480 is a heap of junk, and my 390X runs circles around it IMHO. At the very least, it's not a huge step up TBH.
Quote:


> Originally Posted by *mus1mus*
> 
> Check out bykski from aliexpress. They have some blocks that may fit some cards.
> 
> http://m.aliexpress.com/item/32253411798.html
> You might need to bump VDDC to help get to 1750 Memory.
> 
> Somehow, memory Voltage and OC scales with Core Voltage.


VDDC isn't changeable in MSI Afterburner for me; I don't even see the option. Running 1150/1650 stable in GTA V for a couple of days now.

Core Voltage (mV) / +25
Power Limit (%) / +50
Core Clock (MHz) / 1150
Memory Clock (MHz) / 1650
Fan Speed (%) / Aggressive Custom Profile
Memory Voltage (mV) / Locked, not changeable in MSI Afterburner for Asus Strix 390x
Aux Voltage (mV) / +25

Not sure if SpeedFan's voltage sensors are accurate, but I've seen VDDC bouncing off 1.3V. Pretty high, but it's stable.







I keep an aggressive fan profile, fan % never goes above 65% and temps stay below 68ish.


----------



## mus1mus

VDDC = Core Voltage
VDDCI = Aux Voltage

Sorry for the confusion.


----------



## buddatech

Is the "upgrade" from a 390 to a 390X worth $75?


----------



## Krzych04650

Quote:


> Originally Posted by *buddatech*
> 
> Is the "upgrade" from a 390 to a 390X worth $75?


Probably not. There is not a huge difference in performance, but TDP and power draw grow significantly compared to the 390, and the 390X is a really hot beast. Also, $75 is quite a lot for such a small upgrade; I would rather save the money for some more meaningful upgrade, or maybe a FreeSync monitor if you don't have one yet. That would be much more of an upgrade than going from a 390 to a 390X.

But if you are in need of extra performance and don't have a better option, then why not; it's not some horrible deal, but personally I wouldn't do it.


----------



## TrueForm

Quote:


> Originally Posted by *battleaxe*
> 
> You're probably burning about 125 more watts while gaming. Big deal, right?
> 
> Me? I don't care. I'm waiting for the 490X to see how she does.
> 
> The 480 seems a good card, especially for the money. But the 480 isn't really for us; it's for the $200 card market. We are in a different class IMO... so we wait some more.


I'm in the same boat as you, waiting for the 490 since I have a FreeSync monitor. So I'm kinda stuck.


----------



## buddatech

Quote:


> Originally Posted by *Krzych04650*
> 
> Probably not. There is not a huge difference in performance, but TDP and power draw grow significantly compared to the 390, and the 390X is a really hot beast. Also, $75 is quite a lot for such a small upgrade; I would rather save the money for some more meaningful upgrade, or maybe a FreeSync monitor if you don't have one yet. That would be much more of an upgrade than going from a 390 to a 390X.
> 
> But if you are in need of extra performance and don't have a better option, then why not; it's not some horrible deal, but personally I wouldn't do it.


Ty for the quick reply. Not really in need of the extra performance, actually; I downgraded from 1600p (the panel died) to 1080p. I have my signature rig and my new $400 CL build (specs below). I asked someone what they'd want in cash with my card (PowerColor PCS+ 390) for his XFX 390X; he said $75, which is why I ask.

$410 CL Special








CPU: Intel Core i5 6600k
Motherboard: Gigabyte GA-Z170XP
GPU: PowerColor PCS+ R9 390 8GB GDDR5
RAM: Crucial Ballistix Sport 2×4GB DDR4 2400MHz
SSD: 120GB Intel 530 Series
HDD: Western Digital Blue 1TB 7200RPM
Optical: Samsung Combo drive
PSU: EVGA 650 GQ PSU
CPU Cooler: Cooler Master Hyper 212 EVO
Case: White Enermax


----------



## tolis626

Quote:


> Originally Posted by *Krzych04650*
> 
> Probably not. There is not a huge difference in performance, but TDP and power draw grow significantly compared to the 390, and the 390X is a really hot beast. Also, $75 is quite a lot for such a small upgrade; I would rather save the money for some more meaningful upgrade, or maybe a FreeSync monitor if you don't have one yet. That would be much more of an upgrade than going from a 390 to a 390X.
> 
> But if you are in need of extra performance and don't have a better option, then why not; it's not some horrible deal, but personally I wouldn't do it.


You're not the first one to say this, but where did that "The 390x runs hotter and uses more power than the 390" come from? Absolutely not true. The difference in shaders is 10% in favor of the 390x over the 390, so performance should be about 10% better, usually a bit less than that. Sometimes a bit more, depends on the title. But as far as I'm concerned, the power draw and thermals are identical as long as the cards are clocked equally and use the same voltage. And even if there is a difference, it'll be negligible, especially in thermals.

@buddatech

The performance difference depends on how each card overclocks. Ask him to tell you how his card overclocks at a certain voltage (Ask for VDDC, not the offset in Afterburner). If your card can overclock to, say, 1125MHz and his can overclock to 1175MHz, then I'd say go for it. If they are about the same or your card overclocks better, meh, not worth it from a performance standpoint.


----------



## Krzych04650

Quote:


> Originally Posted by *tolis626*
> 
> You're not the first one to say this, but where did that "The 390x runs hotter and uses more power than the 390" come from? Absolutely not true. The difference in shaders is 10% in favor of the 390x over the 390, so performance should be about 10% better, usually a bit less than that. Sometimes a bit more, depends on the title. But as far as I'm concerned, the power draw and thermals are identical as long as the cards are clocked equally and use the same voltage. And even if there is a difference, it'll be negligible, especially in thermals.


Tests are a bit misleading: some of them show a 10-20W power draw difference, some 70-80W. I agree that a 10-20W difference is negligible, but an 80W difference is about a 30% difference in power draw. I don't know; I was just repeating what I saw in tests, so maybe I am wrong.


----------



## buddatech

Quote:


> Originally Posted by *tolis626*
> 
> You're not the first one to say this, but where did that "The 390x runs hotter and uses more power than the 390" come from? Absolutely not true. The difference in shaders is 10% in favor of the 390x over the 390, so performance should be about 10% better, usually a bit less than that. Sometimes a bit more, depends on the title. But as far as I'm concerned, the power draw and thermals are identical as long as the cards are clocked equally and use the same voltage. And even if there is a difference, it'll be negligible, especially in thermals.
> 
> @buddatech
> 
> The performance difference depends on how each card overclocks. Ask him to tell you how his card overclocks at a certain voltage (Ask for VDDC, not the offset in Afterburner). If your card can overclock to, say, 1125MHz and his can overclock to 1175MHz, then I'd say go for it. If they are about the same or your card overclocks better, meh, not worth it from a performance standpoint.


I will text him and let you know if he gets back to me. I haven't overclocked my 390 yet; I just got it a couple of days ago, powered it on to make sure it POSTed, then tore it down for a good cleaning and painted my case. I just put it back together yesterday, so today I will be installing the OS and drivers and OC'ing a bit.


----------



## Stige

Lol people caring about power consumption, third world problems?


----------



## Krzych04650

Quote:


> Originally Posted by *Stige*
> 
> Lol people caring about power consumption, third world problems?


More power consumption means more heat, and that affects noise levels; this is why I care. Cost-wise, even a 150W difference in power consumption makes a negligible difference on my electricity bill if I play 5 hours per day. But the heat generated by cards like an overclocked 980 Ti or an overclocked 390 is already on the edge of what you can cool on air while maintaining acceptable noise levels, at least with the currently available AIB cooling solutions from Sapphire or MSI; adding another 30% of power consumption would generate additional heat and cross that line. I am not one of those who choose a GPU by power consumption just to save $5 a month on the electricity bill. I am just very sensitive to sound, and I play on speakers because headsets are not comfortable for me. That is why heat and power consumption matter to me: I need to keep my PC's noise basically at ambient level, and additional heat obviously does not help.


----------



## mus1mus

980Ti is hot?

















Something must be wrong in your area. I don't even need to watercool them to bench four at 1500/2000.

Maybe not as hot as four reference 290Xs, but you get the idea.

Same goes for power draw: e.g., three 980 Tis run on a 1250W PSU, while with 290Xs it'd be two without OC.

But to clear things up, the 390X and the 390 use around the same amount of power. It boils down to chip quality; a 390 can sometimes draw more than a 390X. Heat output too.


----------



## Krzych04650

Quote:


> Originally Posted by *mus1mus*
> 
> 980Ti is hot?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Something must be wrong in your area. I don't even need to Watercool them to bench 4 at 1500/2000


I am sure that you can somehow keep them alive and bench them in 3- or 4-way SLI by setting the fans sky-high so you cannot even hear your own thoughts over the noise, but I am talking about cooling with low noise levels and daily gaming usage.

Nothing is wrong in my area: 20-24C ambient, and I had a case with quite good airflow back then, with six 120mm fans, 3 intake and 3 exhaust. I had the MSI 980 Ti Gaming 6G, the red one, and keeping the card in the low 80s during constant 99% load in games like Witcher 3 at acceptable noise levels (for this particular card 50% fan speed was the max acceptable noise, ~40% was ideal) was just barely possible. I had to lower the OC a bit because the card was crashing after reaching 82-83C.

That was a very thin and small cooler for such a powerful GPU, so a beefier cooler would probably do better; on the other hand it had two 100mm fans, and MSI fans have very good noise levels, so it wasn't a bad cooler, and yet it was barely able to cool the GPU at good noise levels (by which I mean not exceeding the case fans' noise). I am just overly sensitive to noise, so I guess there is no point in debating what is hot and what is loud with me


----------



## bluej511

Quote:


> Originally Posted by *Krzych04650*
> 
> I am sure that you can somehow keep them alive and bench them in 3 or 4-way SLI by setting fans sky high so you cannot even hear your thoughts because of noise, but I am talking about cooling with low noise levels and daily gaming usage.
> 
> Nothing is wrong in my area, 20-24C ambient temp and I had a case with quite good air flow back then, there were six 120mm fans, 3 intake 3 exhaust. I had MSI 980 Ti Gaming 6G, this red one, and keeping card in low 80s during constant 99% load in games like Witcher 3 with acceptable noise levels (for this particular card 50% fan speed was max acceptable noise, ~40% was ideal) was just barely possible. I had to lower OC for a bit because card was crashing after reaching 82-83C.
> 
> This was very thin and small cooler for this kind of powerful GPU, so probably some beastly cooler would do better, but on the other hand there were two 100mm fans, and MSI have great fans with very good noise levels, so this wasn't a bad cooler, and yet it was barely able to cool GPU with good noise levels (by this I mean not crossing case fans noise level). I am just overly sensitive for noise levels so I guess there i no point on talking about what is hot and what is loud with me


The best way to get good temps and low noise is, and will always be, watercooling. With decent quiet fans your GPU/CPU will run cooler. I dropped 30°C on my GPU alone, and about 10-15dB of noise. The R9 390 Nitro cooler at 50% is about the same noise level I have now; at 60% it was at my limit for sound, and 70% and up, forget it.


----------



## Krzych04650

Quote:


> Originally Posted by *bluej511*
> 
> The best way to get good temps and low noise is, and will always be, watercooling. With decent quiet fans your GPU/CPU will run cooler. I dropped 30°C on my GPU alone, and about 10-15dB of noise. The R9 390 Nitro cooler at 50% is about the same noise level I have now; at 60% it was at my limit for sound, and 70% and up, forget it.


Yeah, I know. I even tried a hybrid GPU once, but an AIO with no manual control over the pump is just too loud for me. And a custom water loop is expensive and requires too much care and maintenance; at least those are my feelings about it for now. I will try it in the future, but not now.

What temps were you getting at 50% fan speed on the 390 Nitro?


----------



## Krzych04650

Double post.


----------



## tolis626

Well, the 980 Ti isn't a cool card by any means, but it's not 290X-hot either. That's partially due to its sheer die size: it's easier to transfer the same amount of heat from a ~600 mm² die than from a ~430 mm² one. Simple as that. But you do need a large cooler, as the power consumption is pretty much the same, especially when overclocked.
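To put rough numbers on that point (the die areas are the public figures for these GPU classes, and the 250W draw is an illustrative assumption, not a measurement from this thread), a quick heat-flux sketch:

```python
# Rough heat-flux comparison between a large and a small die dissipating
# similar power. Die areas are public figures (~601 mm^2 for a 980 Ti
# class GPU, ~438 mm^2 for a 290X/390X class GPU); the 250 W figure is
# an illustrative assumption.

def heat_flux(power_w: float, area_mm2: float) -> float:
    """Power density across the die, in W/mm^2."""
    return power_w / area_mm2

large = heat_flux(250, 601)   # big die: same heat over more area
small = heat_flux(250, 438)   # small die: same heat concentrated

print(f"large die: {large:.3f} W/mm^2")
print(f"small die: {small:.3f} W/mm^2")
print(f"smaller die has ~{small / large - 1:.0%} higher heat flux")
```

At equal wattage, the smaller die concentrates roughly 37% more heat per square millimeter, which is why it is harder to keep cool even though total power is the same.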


----------



## bluej511

Quote:


> Originally Posted by *Krzych04650*
> 
> Yea I know, I even tried hybrid GPU once but AIO with no manual control over pump is just too loud for me. And custom water loop is expensive and requires too much care and maintenance, at least those are my feelings about it for now, I will try in future, but not now.
> 
> What temps were you getting at 50% fan speed on 390 Nitro?


Was getting about 71-74°C with winter temps and no heater in the house, so it stays cool. Now with water I'm at 47°C with an ambient of about 28°C and no AC haha.


----------



## Stige

Quote:


> Originally Posted by *bluej511*
> 
> Was getting about 71-74°C with winter temps and no heater in the house so it stays cool. Now with water im at 47°C with an ambient of about 28°C and no ac haha.


Core temps don't mean anything, VRM is what matters. And 47C is high for core temp under water.


----------



## Stige

Quote:


> Originally Posted by *Krzych04650*
> 
> Yea I know, I even tried hybrid GPU once but AIO with no manual control over pump is just too loud for me. And custom water loop is expensive and requires too much care and maintenance, at least those are my feelings about it for now, I will try in future, but not now.
> 
> What temps were you getting at 50% fan speed on 390 Nitro?


Why would a custom loop require too much care and maintenance? I flush mine like once a year max without any issues.


----------



## buddatech

Quote:


> Originally Posted by *tolis626*
> 
> You're not the first one to say this, but where did that "The 390x runs hotter and uses more power than the 390" come from? Absolutely not true. The difference in shaders is 10% in favor of the 390x over the 390, so performance should be about 10% better, usually a bit less than that. Sometimes a bit more, depends on the title. But as far as I'm concerned, the power draw and thermals are identical as long as the cards are clocked equally and use the same voltage. And even if there is a difference, it'll be negligible, especially in thermals.
> 
> @buddatech
> 
> The performance difference depends on how each card overclocks. Ask him to tell you how his card overclocks at a certain voltage (Ask for VDDC, not the offset in Afterburner). If your card can overclock to, say, 1125MHz and his can overclock to 1175MHz, then I'd say go for it. If they are about the same or your card overclocks better, meh, not worth it from a performance standpoint.


He said that he never overclocked either of his 390X GPUs.


----------



## tolis626

Quote:


> Originally Posted by *buddatech*
> 
> He said that he never overclocked either of his 390X GPUs.


Hmmm... Well, XFX does have a good track record, but that means next to nothing. It's luck of the draw. What I'd do is overclock your card and, if it's a dud or mediocre start thinking about biting the bullet. But don't expect to be blown away by the difference by any means.


----------



## Harry604

.


----------



## Harry604

Anyone here flash a Gigabyte 290X Windforce to 390X? Does it use the reference PCB?


----------



## Irev

Quote:


> Originally Posted by *buddatech*
> 
> "Upgrade" from 390 to 390x $75 worth?


No, don't do it.

The 390X is like 5% faster. Just overclock the 390.
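The ceiling on that gap falls straight out of the stream-processor counts (2816 on the 390X vs 2560 on the 390); the clocks in the second half are hypothetical examples, not numbers from this thread:

```python
# Theoretical uplift of a 390X over a 390 at equal clocks, from the
# public stream-processor counts. Real-game scaling is usually below
# this ceiling, which is why ~5% is often quoted.
sp_390x, sp_390 = 2816, 2560

uplift = sp_390x / sp_390 - 1
print(f"theoretical uplift at equal clocks: {uplift:.0%}")  # 10%

# A modest overclock on the 390 closes the gap (example clocks, MHz):
clock_390, clock_390x = 1100, 1050
ratio = (sp_390 * clock_390) / (sp_390x * clock_390x)
print(f"1100MHz 390 vs 1050MHz 390X shader throughput: {ratio:.2f}")
```

In other words, a 50MHz overclock on the 390 already brings its raw shader throughput to within ~5% of a stock 390X, which is why paying extra for the X rarely makes sense.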


----------



## buddatech

Thanks for all the answers pertaining to my question, +Rep to all who provided advice/help. Think I will keep the 390 for now. As for overclocking, I just started messing with it and was able to achieve 1125/1550 stable so far, in only the couple of games I've tried, not done yet. All settings on auto; max temp I saw was 62C, max fan was 68%, which was very audible, so I will be making a custom fan profile for this guy.


----------



## bluej511

Quote:


> Originally Posted by *buddatech*
> 
> Thanks for all the answers pertaining to my question, +Rep to all who provided advice/help. Think I will keep the 390 for now. As for overclocking, I just started messing with it and was able to achieve 1125/1550 stable so far, in only the couple of games I've tried, not done yet. All settings on auto; max temp I saw was 62C, max fan was 68%, which was very audible, so I will be making a custom fan profile for this guy.


Depending on which 390, 60% was the max I was able to tolerate. 62°C is damn cold though, ambient must be like 18-19°C if not colder haha. Don't think I've ever seen ANY 390/390X run that cold. Wondering what your VRM temps must be if the core is that low.


----------



## buddatech

Have a finished basement where my PCs are set up and it's naturally cool; with AC on this time of the year temps range 18-22C, mostly hovering at 20C. I find myself wearing a sweatshirt quite often when I'm downstairs, which is often lol. I'll check VRM temps and double-check GPU temps later today or tomorrow.


----------



## mus1mus

Do all Nitros have Hynix VRAM ?


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Do all Nitros have Hynix VRAM ?


Nada, i got stuck with Elpida.


----------



## Streetdragon

Quote:


> Originally Posted by *mus1mus*
> 
> Do all Nitros have Hynix VRAM ?


Have two 390 Nitros and both have Hynix RAM.


----------



## mus1mus

hmm. I believe fyzzz has Elpida too.

I posted my cards with waterblocks on a local marketplace with an option for a 390 swap.

Already have an offer for a Nitro.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> hmm. I believe fyzzz has Elpida too.
> 
> I posted my cards with waterblocks on a local marketplace with an option for a 390 swap.
> 
> Already have an offer for a Nitro.


Yep, I have Elpida RAM on my XFX card. But it's not so bad, it can beat my 290 scores. I'm starting to get quite impressed by this card. BIOS-modded 390 vs 290 with 390 BIOS: http://www.3dmark.com/compare/fs/9162766/fs/8202624 and I'm currently testing tighter timings.


----------



## bluej511

Mine reaches a hard 1650 and that's it, it won't do any more. My core can reach 1200, so not too bad on that one.


----------



## DarthBaggins

Finally loaded GPU-z to see more info on this card I acquired:


----------



## buddatech

Okay, confused. My 390 plays LoL and COD BO3 and passes the Heaven bench multiple times @ 1125/1550 (max GPU temp 62C, max fan 68%), but it only sometimes makes it through The Division bench @ 1100/1500. Tried @ 1125/1550: fail. 1120/1500: fail. 1100/1500: sometimes passes, both with and without the +20% power limit, and @ 1100/1500 with +50% power limit it also passes only sometimes. The whole time the GPU is throttling like crazy; temps NEVER over 66C, VRM1 68C, VRM2 68C.


----------



## Krzych04650

I got my R9 390 Nitro today. What a disaster. Coil whine is just crazy and there are sudden frequency drops from 1040 to like 500-700, even with temps around 75C. I wasn't even able to check my monitor for FreeSync properly because this card was doing crazy things like those sudden frequency drops every half minute. I thought that maybe FreeSync on my monitor was not working properly, but once I went back to all default settings, no FreeSync and no 75 Hz (these LG monitors usually run 60 Hz without FreeSync and 75 Hz with it), I was still getting those crazy frequency drops. Looks like I got a super faulty card, not only whining like crazy but also losing frequency.









What is interesting, I was able to get through Fire Strike a few times without such issues, and the scores are okay, but once games are loaded... terrible. Anyway, I am returning it right away and will think about what to do next, whether to get another 390 or wait for the 480. I am also not satisfied with how the fans work on the Nitro; they are set up very badly: you jump from passive straight to 1200-1300 RPM at 25% fan speed, and 35% is 1700 RPM. My MSI 980 Ti had 1700 RPM at 70% fan speed. Makes no sense at all; everything past 30% is basically too loud on the Nitro because it is already getting close to 2000 RPM. I don't get it, who designed those fan speeds? I think I will go MSI again, their Twin Frozr cooler is probably the best on the market in terms of noise/performance ratio. I wanted to try something new and see if there is anything better, but it didn't turn out too well, as you can see.


----------



## kizwan

Did you try disabling Power Efficiency in the Crimson control panel?


----------



## Krzych04650

Quote:


> Originally Posted by *kizwan*
> 
> Did you try disable Power Efficiency in the Crimson control panel?


Yes, I also set the power target to +50%. This card is just faulty; thanks to the crazy coil whine it sounds like an engine, so I can hear how it operates. It has micro-breaks in operation that cause microstutter and flicker, and it chokes when those frequency drops happen. I tried to record it, but the microphone in my phone can't even catch those crazy whining sounds. It was working fine for the first 15 minutes or so, except for the coil whine; I tested FreeSync and was amazed that 45 FPS is now smooth, and then it started having those problems, no matter if FreeSync was on or off, drivers reinstalled, etc.


----------



## cainy1991

MSI gaming VS Sapphire Tri-X

Anyone have any opinions on which is better?


----------



## bluej511

Quote:


> Originally Posted by *Krzych04650*
> 
> Yes, I also set power target to +50%. This card is just faulty, thanks to this crazy coil whine that it has it works like an engine so I can hear how it works. It has some micro breaks in operation that causes microstutter and flicker, and it is choking when those frequency drops are present. I tried to record it but microphone in my phone is not even able to catch those crazy whining sounds. It was working fine for some first 15 minutes, except for coil whine, I tested FreeSync and was amazed that 45 FPS is now smooth, and then it started to have those problems, no matter if FreeSync was on or off, drivers reinstalled and etc.


45fps is the limit for smooth; below that it starts to lag/stutter/slow down. Depends on the game too; FreeSync works awesome on my LG, but once it drops to like 45-40 you can tell it's slowing down.

Quote:


> Originally Posted by *cainy1991*
> 
> MSI gaming VS Sapphire Tri-X
> 
> Anyone have any opinions on which is better?


Most people will prob say Sapphire, as it cools much better than the MSI, especially the VRMs.


----------



## cainy1991

Quote:


> Originally Posted by *bluej511*
> 
> Most people will prob say Sapphire, as it cools much better than the MSI, especially the VRMs.


Yeah, I am leaning towards the Sapphire anyway... seems MSI has a super gimped warranty in Australia: 1 year here vs 3 years elsewhere.


----------



## deskiller

Temps are good for me with the Sapphire Nitro 390X.

I played GTA V at 3840x2160 at 30Hz and temps stayed at around 70C.

My monitor is the LG 34UM67 and AMD VSR is disabled for it, even though it's a FreeSync monitor.

For my monitor "4K" would be 5120x2160, and custom resolution settings won't accept the 5120.

If VSR was working I could have done it at 60Hz.

So why no VSR support for 21:9?

My older Nvidia 780's DSR supported it, but performance sucked at higher res due to lack of memory.

This 390X seems to handle higher res pretty well.


----------



## diggiddi

From what I understand there was an earlier driver with VSR for ultrawide, then it was dropped.


----------



## Ron Soak

After switching from a low-airflow case (H440) to a high-airflow Thermaltake Core V21, my card's temperature and OC'ing ability have improved significantly.

Now I'm sitting comfortably at 1180/1750 (+75mv & 50% power limit).

Who knew the airflow helped? (or coincidence)


----------



## bluej511

Quote:


> Originally Posted by *Ron Soak*
> 
> After switching from a low-airflow case (H440) to a high-airflow Thermaltake Core V21, my card's temperature and OC'ing ability have improved significantly.
> 
> Now I'm sitting comfortably at 1180/1750 (+75mv & 50% power limit).
> 
> Who knew the airflow helped? (or coincidence)


Airflow always helps, especially with open coolers instead of reference blowers.

If anyone is interested, here are my findings with Rise of the Tomb Raider in DX11 vs DX12 with the new patch. As we can see, DX12 is FAR better.


----------



## flopper

Quote:


> Originally Posted by *bluej511*
> 
> Airflow always helps, especially with open cooler instead of reference ones.
> 
> If anyone is interested, heres my findings with Rise of the Tomb Raider in dx11-dx12 and the new patch. As we can see, dx12 is FAR better.


CPU overhead limits a lot.
The min fps rising is a good sign.
Sadly, not enough games/engines with DX12 yet.


----------



## bluej511

Quote:


> Originally Posted by *flopper*
> 
> cpu overhead limits a lot.
> the min fps arising is a good sign.
> sadly not enough games/engines with dx12 yet.


Yeah, it's definitely CPU-taxing. There's a lot more to the game than the first one and it looks fantastic, especially in ultrawide. Picked it up in the Steam sale. I can't monitor in-game though, since Afterburner doesn't work in DX12 and FRTC doesn't work in DX12 either, but she stays in FreeSync range anyway so it's all good.


----------



## kuss

What voltage should I not exceed when overclocking? 390X Tri-X.


----------



## Irev

Those of you with a 390, are you looking at the RX 480 wishing it was a faster 490/Fury 2?

I'm so keen for an upgrade, wishing AMD would hurry up with the enthusiast grade.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Irev*
> 
> Those with 390 are you looking at the rx480 wishing it was a faster 490/fury2 ?
> 
> Im so keen for an upgrade wanting AMD to hurry up for enthusiast grade


Something like that. I'm waiting for a ≤€500 card (that's the price of a GTX 1070 here). I always buy the cut-down versions of AMD chips as they offer incredible price/performance. So: cut-down Vega.


----------



## bluej511

The 490 should probably be out by year's end.

I was wondering why my card was running at 54°C today. Turns out I forgot to set my pump speed; back to 75% and it's back to 45°C, phew. Damn high ambient.


----------



## mus1mus

Quote:


> Originally Posted by *kuss*
> 
> What voltage should i not exceed in OC ? 390x tri


Temps, not voltage. You can pump +400mV as long as you keep the card within 60C, and keep an eye on the VRMs too.

I have a guy willing to trade Nitro 390s for my 290X and a 290 with blocks. I'm gonna miss my cards when they're gone.


----------



## kuss

Quote:


> Originally Posted by *mus1mus*
> 
> Temps, not Voltage. You can pump +400mV as long as you keep the card within 60C and the VRMs.
> 
> I have a guy willing to trade Nitro 390s for my 290X and a 290 with blocks. I'm gonna miss my cards when they're gone.


Max core 64C; VRM, I think 79C is the max I've seen. Too hot?

https://i.gyazo.com/c07d04f33e56e858fe17777a7dfbdc0d.png


----------



## Rexer

A high-airflow case is encouraging. I was just browsing in here thinking I might get an idea, and you guys may be my answer. Thank you. I have a 390 Strix DC3. I bet I could fry eggs on it, but it hardly crashes. It's pretty fast and buttery smooth. I get off a night of CoD AW or BF4 and I see temps of 92C/86C. It cools quickly though; a few minutes later, 56C/59C. My clocks are 1140/1600, +32mv. It doesn't crash much, but I'm not interested in going further because of those temps.
I feel pretty lucky I got blessed with such a good card. It's a monster in close-quarters games. I wish every player could have one.


----------



## mus1mus

Quote:


> Originally Posted by *kuss*
> 
> Max core 64C; VRM, I think 79C is the max I've seen. Too hot?
> 
> https://i.gyazo.com/c07d04f33e56e858fe17777a7dfbdc0d.png


Right where you should consider calling it a day.

If you add another 50mV, temps will rise faster and higher.
Quote:


> Originally Posted by *Rexer*
> 
> Hi airflow case is encouraging. I was just browsing in here thinking I may get an idea and you guys may be my answer. Thank you. I have 390 Strix dc3. I bet I can fry eggs on but hardly crashes. It's pretty fast and buttery smooth. I get off a night of CoD AW or BF4 and I see temps 92c/86c. It cools instantly and a few minutes, 56c/59c. My clocks are 1140/1600, +32mv. It doesn't crash much but I'm not interested in going further because of those temps.
> I feel pretty lucky I got blessed with such a good card. It's a monster in close quarter games. I wish every player could have one.https://www.overclock.net/images/smilies/thumb.gif


Those are hot. Yeah. Is 1100/1625 not enough for you?


----------



## kuss

Quote:


> Originally Posted by *mus1mus*
> 
> Right where you should consider calling it a day.
> 
> If you add another 50mV, temps will rise faster and higher.
> Those are hot. Yeah. Is 1100/1625 not enough for you?


Can I keep it at that or only go to 70c?


----------



## mus1mus

I'd take a 75C limit for sustained periods of time (gaming).

Some cards start to artifact once the VRM reaches 75C. And as always, cooler is better.

Try to optimise your airflow before pushing things higher. Take off the case fan and see if the temps drop. If they do by a significant amount, play around with your case optimisation.

But bottom line, gaming at 64 fps may not really give you a significant advantage over 60 fps. That is sometimes all you gain from another 50MHz of OC, but at the cost of temps.








I hope that makes sense.
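The 64-vs-60 fps point is even clearer in frame times; a quick sketch:

```python
# Frame-time view of a small fps gain: going from 60 to 64 fps saves
# about 1 ms per frame, which is hard to perceive next to the extra
# heat a 50 MHz overclock costs.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent per frame at a given frame rate."""
    return 1000.0 / fps

base, oc = frame_time_ms(60), frame_time_ms(64)
print(f"60 fps: {base:.2f} ms/frame")          # 16.67
print(f"64 fps: {oc:.2f} ms/frame")            # 15.62
print(f"saved per frame: {base - oc:.2f} ms")  # 1.04
```

Roughly one millisecond per frame: a gain well below what most people can notice, especially compared against the temperature cost of the extra voltage.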


----------



## kuss

Quote:


> Originally Posted by *mus1mus*
> 
> I'd take a 75C limit for suspended period of time (Gaming).
> 
> Some cards start to artifact once VRM reaches 75C. And as always, cooler is better.
> 
> Try to optimise your air flow before pushing things higher. Take off the case fan and see if the temps drop. If they do by a significant amount, play around your case optimisation.
> 
> But bottomline, gaming at 64 fps may not really give you a significant advantage over 60 fps. That is sometimes what you gain for a 50MHz more OC. But with the cost of temps.
> 
> 
> 
> 
> 
> 
> 
> 
> I hope that makes sense.


https://i.gyazo.com/70128f53913404555d9bff475228fe7e.png
safe now


----------



## Streetdragon

What a bit of undervolting can do... Still a good bench. The clock went from 1040 to 1000 and the mem from 1500 to 1400 (with better timings).
The temps just reach 60° (core and VRM) with ambient @ 26°.
With lower memory I could lower the voltage more, but... this is already nice.

How far can you undervolt? Silly summer ^^


----------



## Worldwin

Undervolting requires you to lower the voltage. From the screenshot you are just underclocking. Actual undervolting increases the average voltage but drastically decreases current.


----------



## Streetdragon

Quote:


> Originally Posted by *Worldwin*
> 
> Undervolting requires you to lower the voltage. From the screenshot you are just underclocking. Actual undervolting increases the average voltage but drastically decreases current.


it is -31 mV so far ^^


----------



## Worldwin

Quote:


> Originally Posted by *Streetdragon*
> 
> it is -31 mvolt so far^^


You can go lower. I believe in you. For reference, my 390X was able to undervolt from ~1.275V (BIOS level) to 1.156V at 1050MHz.
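For a rough sense of what an undervolt like that buys: dynamic power scales roughly with V² at a fixed clock (an approximation that ignores leakage and other terms), so plugging in the voltages from the post above:

```python
# Back-of-envelope power saving from an undervolt at a fixed clock.
# Dynamic power follows roughly P ~ C * V^2 * f, so at the same clock
# the saving scales with the square of the voltage ratio. This ignores
# leakage current, which does not follow the same curve.
v_stock, v_uv = 1.275, 1.156  # volts, from the post above

relative_power = (v_uv / v_stock) ** 2
print(f"dynamic power vs stock: {relative_power:.0%}")  # ~82%
print(f"approximate saving: {1 - relative_power:.0%}")  # ~18%
```

So a ~0.12V drop is worth on the order of 18% less dynamic power, which is why undervolting helps temperatures so much on these cards.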


----------



## Rexer

Lol. To go fast.
Or the sound; the intelligent word the world prefers is performance. It's like buying a Shelby Mustang over a Civic. No way do you buy a Corvette just to wave at the neighbors. I like playing first-person shooters; I'm like a kid with a water gun again. Win or lose, I'd rather it come down to how capable I am, physically and mentally, of beating my opponent. At least then I know the failure is me, not the hardware. But if you're asking why I didn't buy the 390X instead of the 390, well, that's a matter of small, funny funds. Otherwise, I'd love to buy a Ferrari over a Corvette, lol.
But why not overclock? Heck, we paid for the performance of the card, and it'll be down the wayside with the Oldsmobiles and Pontiacs in a couple of years. We already know overclocking doesn't kill a GPU outright, so why not enjoy what we have?


----------



## Rexer

Hmm, somehow I missed attaching my answer to mus1mus' comment. Sorry.


----------



## Streetdragon

Quote:


> Originally Posted by *Rexer*
> 
> Lol. To go fast.
> Or the sound, intelligent word the world prefers is performance. It's like buying a Shelby Mustang over a Civic. No way do you buy a Corvette just to wave at the neighbors. I like playing first person shooter. I'm like a kid with a water gun again. In the case of winning or losing, I rather it comes down to how capable I physically and mentally am to beat my opponent. At least I know the failure is me, not the hardware. But if you're asking why didn't I buy the 390X instead of the 390, well, that's a matter of small, funny funds. Otherwise, I'd love to buy a Ferrari over a Corvette, lol.
> But why not overclock? Heck, we paid for the performance of the card and it'll be down the wayside with the Oldsmobiles and Pontiacs in a couple of years. We already know overclocking doesn't kill a gpu card outright so why not enjoy what we have?


I have an OC profile @ 1200/1700... but with this heat in my room it's not funny. I want to still be able to game in the summer without dying of heat, while my GPUs are happy with their water xD. Not everyone has AC. I live under a sloping roof; it gets sun for 6 hours ^^


----------



## Geoclock

Hi guys.
Please recommend a good thermal paste for my R9 card; I'm still using the factory one and temps are a little high.
Do you recommend Coollaboratory Liquid Pro?
Thanks.


----------



## bluej511

Quote:


> Originally Posted by *Geoclock*
> 
> Hi guys.
> Please recommend a good thermal paste for my R9 card; I'm still using the factory one and temps are a little high.
> Do you recommend Coollaboratory Liquid Pro?
> Thanks.


Liquid Ultra would be awesome; you'd have to put something over the transistors in case the Liquid Ultra runs off just a tiny bit (better safe than sorry). Otherwise Gelid GC-Extreme or Kryonaut should be more than plenty. Liquid metals won't drop temps that much anyway.


----------



## Geoclock

Is Gelid Extreme better than Arctic MX-4 ?


----------



## bluej511

Quote:


> Originally Posted by *Geoclock*
> 
> Is Gelid Extreme better than Arctic MX-4 ?


Gelid GC-Extreme is probably top 3 in TIMs; Kryonaut and GC-Extreme are pretty even. They're a bit thicker than most TIMs, so they tend to last a bit longer than thinner ones that pump out.


----------



## Geoclock

Thanks, any members with Coollaboratory Liquid Pro and results?


----------



## Streetdragon

Quote:


> Originally Posted by *Geoclock*
> 
> Thanks, any members with Coollaboratory Liquid Pro and results?


I had it on air and now on water with Liquid Ultra; saw a drop of 3-5°. So it's not bad at all. Just use as little of it as possible so it can't spread.


----------



## jdorje

Still leaning more and more toward watercooling my 390.

But there's no full-cover block for it. I don't mind just cooling the core with one block, but there must be appropriately sized water blocks (i.e. just straight and flat...) that would also work to cool the VRM and VRAM, right?


----------



## mus1mus

Quote:


> Originally Posted by *Rexer*
> 
> Lol. To go fast.
> Or the sound, intelligent word the world prefers is performance. It's like buying a Shelby Mustang over a Civic. No way do you buy a Corvette just to wave at the neighbors. I like playing first person shooter. I'm like a kid with a water gun again. In the case of winning or losing, I rather it comes down to how capable I physically and mentally am to beat my opponent. At least I know the failure is me, not the hardware. But if you're asking why didn't I buy the 390X instead of the 390, well, that's a matter of small, funny funds. Otherwise, I'd love to buy a Ferrari over a Corvette, lol.
> But why not overclock? Heck, we paid for the performance of the card and it'll be down the wayside with the Oldsmobiles and Pontiacs in a couple of years. We already know overclocking doesn't kill a gpu card outright so why not enjoy what we have?


A line should be drawn between going Fast and Cruising Safe.

From your analogy, sure, you buy a Corvette coz it can go fast. Bet no one puts a big-ass Carburetor or a mightier Supercharger without first thinking about upgrading the Radiator and the Water Pump to cool down the engine along with the increased Horsepower figures.









That's the same idea I am injecting. Cool it, then clock it.


----------



## mus1mus

I will be joining this club officially later this afternoon.


----------



## SuperZan

Quote:


> Originally Posted by *mus1mus*
> 
> That's the same idea I am injecting. Cool it, then clock it.


Spoken like a true Vishera grand-master.

Quote:


> Originally Posted by *mus1mus*
> 
> I will be joining this club officially later this afternoon.










the Nitro 390's? You'll make them sing I'm sure.


----------



## mus1mus

I hope..









Ohh, got it. A 3-day-old card to replace my 290X!

And yeah, it's a terribad one!

1200 requires +200mV!


----------



## Rexer

@mus1mus. Yup, I agree. Running modified requires a larger radiator. I've even seen an old Ford with OEM power, air, and a dinky two-row radiator. But then again, it's a Ford.
Since last month I've been hunting for cooling solutions, such as the high-airflow case you and others have been discussing. The case I'm using is a mid-tower. It's very cool most of the time, but the 390 just cooks in high-graphics games, then cools rapidly when I jump out. So my first test was a jerry-rigged cooler: a 6" fan with plastic scoops (facing the fan) taped to the GPU card. I also used thin plastic strips to spoil incoming air away from the exit outlets (side of the fan cover). Boom, temps hit the low 70s. Great setup. Too bad it's so awkward I have to assemble/disassemble it every time I play games. I considered just building an open jig, but I've got cats around here.
So the high-airflow case really sounds good to me. Got any full-size cases in mind?


----------



## CoffeeIsLife

Hi all!

I've recently rejoined the club of desktop users and built a system with a 390X by Sapphire. I'm still looking for alternative cooling methods, but that doesn't seem too easy. So far I've only heard that the Arctic Accelero Xtreme IV is supposed to be compatible, but I couldn't get any info on the Raijintek Morpheus series. Is there any user here who has experience with that cooler on the Sapphire? How did it influence the temps?

What about liquid cooling (there seems to be a (sort of) full-cover block by Alphacool)? How are the temps compared to air cooling?

Thanks in advance!


----------



## Charcharo

I did a DOOM mini-bench. Nightmare settings, everything on. Necropolis (Crucible).

OpenGL:
1440x900 - 102 fps
2560x1600 -52 fps

Vulkan
1440x900 - 152 fps
2560x1600 - 73 fps

stock R9 390, i5 4460 + Windows 7

My R9 390 is maturing into something even greater than what it was at the start
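Worked out as relative gains from the numbers above:

```python
# Vulkan-vs-OpenGL speedups computed from the averages posted above
# (stock R9 390, i5 4460, DOOM at Nightmare settings).
results = {
    "1440x900":  {"opengl": 102, "vulkan": 152},
    "2560x1600": {"opengl": 52,  "vulkan": 73},
}

for res, fps in results.items():
    gain = fps["vulkan"] / fps["opengl"] - 1
    print(f"{res}: {gain:.0%} faster under Vulkan")
```

That is roughly a 49% gain at 1440x900 and 40% at 2560x1600, consistent with the lower resolution being more CPU/API-bound and so benefiting more from Vulkan's lower driver overhead.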


----------



## bluej511

Quote:


> Originally Posted by *Charcharo*
> 
> I did a DOOM mini-bench. Nightmare settings, everything on. Necropolis (Crucible).
> 
> OpenGL:
> 1440x900 - 102 fps
> 2560x1600 -52 fps
> 
> Vulkan
> 1440x900 - 152 fps
> 2560x1600 - 73 fps
> 
> stock R9 390, i5 4460 + Windows 7
> 
> My R9 390 is maturing into something even greater than what it was at the start


Very nice, for me it was Rise of the Tomb Raider that got a good boost in DX12.


----------



## AliNT77

Quote:


> Originally Posted by *bluej511*
> 
> Very nice, for me it was Rise of the Tomb Raider that got a good boost in DX12.


Can you do a little benchmarking please? ?


----------



## bluej511

Quote:


> Originally Posted by *AliNT77*
> 
> Can you do a little benchmarking please? ?


Already did haha. Oh, and between DX12 and DX12 with the new patch there was a CPU change (still an i5 4690K), running at about 4.3 instead of 4.5.


----------



## mus1mus

Quote:


> Originally Posted by *Charcharo*
> 
> I did a DOOM mini-bench. Nightmare settings, everything on. Necropolis (Crucible).
> 
> OpenGL:
> 1440x900 - 102 fps
> 2560x1600 -52 fps
> 
> Vulkan
> 1440x900 - 152 fps
> 2560x1600 - 73 fps
> 
> stock R9 390, i5 4460 + Windows 7
> 
> *My R9 390 is maturing into something even greater than what it was at the start
> 
> 
> 
> 
> 
> 
> 
> *


So true!

I did some benchmarks today and was blown away by the improvement in the drivers!

Coming from Catalyst 15.10 (my go-to benching driver) to Crimson 16.7.2 (I had avoided Crimson due to black-screen-after-reboot issues when pushing things hard), in some key areas of 3DMark11 I am seeing over 10fps of improvement! That is massive!

390 - http://www.3dmark.com/3dm11/11405546

290X - http://www.3dmark.com/3dm11/11075578

Sad to say my new card doesn't clock so well. But hey, still way better than a 290X clock for clock!


----------



## OneB1t

How is that 390 beating a 250MHz-faster 290X?

There is something not right; it's the same chip.


----------



## bluej511

Quote:


> Originally Posted by *OneB1t*
> 
> How is that 390 beating a 250MHz-faster 290X?
> 
> There is something not right; it's the same chip.


At lower clocks too, how crazy is that haha.

Kinda glad I stuck with my 390 and didn't buy an RX 480, although the water cooler for it looks awesome. I bet EK won't make them for AIB cards though; they barely did for the 290/390. I'll stick with it for now.


----------



## mus1mus

Quote:


> Originally Posted by *OneB1t*
> 
> how is that 390 is beating 250mhz faster 290X?
> 
> 
> 
> 
> 
> 
> 
> there is something not right its same chip


Drivers are different: 15.10 vs 16.7.2.
That 290X (reference VTX3D with a new board) needs to clock 100MHz higher than the older reference cards to equalize performance. Not sure why. The board seems pre-Grenada (with printed 8GB/4GB markers).

Quote:


> Originally Posted by *bluej511*
> 
> At lower clocks too how crazy is that haha.
> 
> Kinda glad i stuck to my 390 and didnt buy an rx 480, although the water cooler for it looks awesome, i bet EK wont make em for AIBs though, they barely made it for the 290/390. Ill stick with it for now.


The 480 is good for starters. Not really worth it for Hawaii and Grenada adopters, unless you really mind power consumption.

The Nitro runs pretty cool! But I won't stick with this card; it clocks too low for my liking.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Drivers are different. 15.10 vs 16.7.2
> That 290X (reference VTX3D with a new Board) needs to clock 100MHz more over the older reference cards to equalize performance. Not sure why. Board seems pre-Grenada (with printed 8GB/4GB markers).
> 480 is good for starters. Not worth it really for Hawaii and Grenada adopters. Unless you really mind power consumption.
> 
> The Nitro runs pretty cool! But I won't stick with this card; it clocks too low for my liking.


My Nitro at an ambient of 21°C runs at about 40°C or so. I have no AC here, so now I'm stuck running mid-40s. Still beats the 74°C I was stuck with on air, so IDC. I only have 360/240mm rads, but that's plenty. The RX 480 would probably run maybe a couple degrees cooler. My water delta is 5-7°C over case ambient, so no worry.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> My Nitro at an ambient of 21°C runs about 40°C or so. I have no ac here so now im stuck to running mid 40s. Still beats the 74°C i was stuck with on air so idc. i only have 360/240mm rads so its plenty. The rx480 would prob run maybe a couple degrees cooler. My water delta is 5-7°C over case ambient so not a worry.


I meant the stock cooler.

12C ambient, default fan settings, +250mV VDDC (1.38V), 1200/1625, short 3DMark11 runs; max temp 55C, VRM1 61C.

For comparison, that's 980 Ti HOF levels of cooling at 1500/2200, 1.212V.

Pretty good for me, coming from 290Xs that maxed at 55C with +381mV (no voltage limit) on water in 3-way CrossFire.


----------



## OneB1t

Quote:


> Originally Posted by *mus1mus*
> 
> Drivers are different. 15.10 vs 16.7.2
> That 290X (reference VTX3D with a new Board) needs to clock 100MHz more over the older reference cards to equalize performance. Not sure why. Board seems pre-Grenada (with printed 8GB/4GB markers).
> 480 is good for starters. Not worth it really for Hawaii and Grenada adopters. Unless you really mind power consumption.
> 
> The Nitro runs pretty cool! But I won't stick woth this card. It clocks very low for my liking.


Flash them with the same BIOS and the 290X will just thrash the 390.

Bad idea to downgrade from a 1500MHz 290X to a 1225MHz 390.


----------



## m0nsky

Oh dear, I'm now running a GTX 1080. But my 1250/1750 MSI R9 390X isn't collecting dust either!







A friend is running it in his chassis now, and he's running my MSI Afterburner presets too. If I ever install that card in my PC again (it's still mine), I'll watercool it, flash the BIOS, and see what's possible.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> I meant the stock cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 12C ambient, default fan settings, +250 VDDC(1.38V) , 1200/1625, short 3DMark11 runs, max temp - 55C, VRM1 - 61C.
> 
> For comparison, 980TI HOF levels of cooling at 1500/2200 1.212
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Pretty good for me. Coming from 290Xs that have temps of 55C max at +381mV no Voltage Limit on water; 3-way XFire.


12°C? Damn, do you live in a freezer?


----------



## mus1mus

Quote:


> Originally Posted by *OneB1t*
> 
> flash them with same bios and 290X will just thrash 390
> 
> 
> 
> 
> 
> 
> 
> bad idea to downgrade from 1500mhz 290X to 1225mhz 390


Not so bad when you consider using a 1600p monitor.

I really don't think the 290X will, TBH, given the same clocks.

This is simply based off my experience, though. I push the cards further than most people here, of course.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> 12°C damn, what do you live in a freezer?


lol. My co-workers call my spot "the igloo"!

I can stand 12C even though I live in a tropical country.

Canucks would still feel warm at this ambient, though.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> lol. My co-workers call my spot, "the igloo"!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can stand 12C even when I am living in tropical country.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Canucks will still feel warm with this ambient though.


I'm in southern France with no AC; ambient inside is 27-28°C. The case will get up to 31°C inside while gaming, with water temp around 36-37°C for very demanding games. In the winter there's no heat, so the inside gets nice and cool and water temp is probably 25-26°C at load; at first start-up it's like 18°C lol.

I think I can do 16°C ambient; any lower and my bad knees from my accident start to suffer. I lived 8 years in New England, so tons of cold, tons of snow. I cycle here in January in shorts while everyone else is wearing fur coats.


----------



## mus1mus

My delta is very low even with just two 360s, as my fans are, well, 4,000 RPM 38mm units.

Even with 3 cards at full throttle and a 5930K, I am not seeing 10C water on top of the ambient air. Pretty handy when benching.

I still have two 480 rads yet to be used, as I have still to finish my other bencher. It can house two 480s and two 360s.

Really eager to test those HWLabs GTXs.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> My delta is very low even with just 2 360s as my fans are, well, 4K rpms 38mms.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think with 3 cards on full throttle and a 5930K, I am not seeing 10C water on top of the ambient air. Pretty handy when benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still have 2 480 rads yet to be used as I am still to finish my other bencher. It can house 2 - 480s and 2 - 360s.
> 
> Really eager to test those HWLabs GTXs


Well, with an ambient of 12°C I'd be surprised to see 10°C of water haha.


----------



## jdorje

Vulkan is incredibly promising. In Dota I get nearly 250 fps with it, compared to 160 in DX12. It's at least partially a CPU bottleneck. Too glitchy though.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> So true!
> 
> I did some benchmarks today and got blown by the improvement on the Drivers!
> 
> Coming from Catalyst 15.10 (which is my go to benching driver) to Crimson 16.7.2 (I avoided Crimson due to the black screen after reboot issues when pushing things a lot) in some key areas in 3DMark11, I am seeing over 10fps of improvement! That is massive!
> 
> 390 - http://www.3dmark.com/3dm11/11405546
> 
> 290X - http://www.3dmark.com/3dm11/11075578
> 
> Sad to say my new card doesn't clock so well. But hey, still way better than a 290X clock to clock!


What kind of improvements in Firestrike?


----------



## spyshagg

The 16.7.2 driver doesn't keep OC clocks after a reboot? I freaking dread this issue. Still using 15.11.1 as my daily driver because of that.


----------



## bluej511

Quote:


> Originally Posted by *spyshagg*
> 
> 16.7.2 driver doesn't keep OC clocks after a reboot? I freaking dread this issue. Still using 15.11.1 as a daily driver because of that


Probably why Afterburner still exists. Even with AB, I've sometimes had issues with it not keeping OCed clocks after a reboot, or sometimes even a regular boot.


----------



## mus1mus

Quote:


> Originally Posted by *bluej511*
> 
> Well with an ambient of 12°C id be surprised to see 10°C of water haha.


10C on top of 12C means 10C over 12C ambient.

No?









Quote:


> Originally Posted by *Vellinious*
> 
> What kind of improvements in Firestrike?


Yet to try.
Quote:


> Originally Posted by *spyshagg*
> 
> 16.7.2 driver doesn't keep OC clocks after a reboot? I freaking dread this issue. Still using 15.11.1 as a daily driver because of that


hmm. Actually, you're better off with the driver resetting to stock clocks after the reboot. For obvious reasons.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> 10C on top of 12C, means 10C over 12C ambient.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No?
> 
> 
> 
> 
> 
> 
> 
> 
> Yet to try.
> hmm. Actually, you're better off with the driver resetting to stock clocks after the reboot. For obvious reasons.


Yeah, with those cold temps I can't see your water getting past 20°C, especially if it's air conditioned.


----------



## Sycksyde

Quote:


> Originally Posted by *mus1mus*
> 
> Do all Nitros have Hynix VRAM ?


Mine has Elpida, haven't pushed it yet though.


----------



## jdorje

https://en.wikipedia.org/wiki/List_of_games_with_Vulkan_support

https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support

Found these two lists. Probably lists of games my 390 is going to crush at 1440p, right?


----------



## spyshagg

Quote:


> Originally Posted by *bluej511*
> 
> Prob why afterburner still exists. Even with AB sometimes ive had issues with it not keeping OCed clocks after a reboot or sometimes even a regular boot.


Quote:


> Originally Posted by *mus1mus*
> 
> 10C on top of 12C, means 10C over 12C ambient.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No?
> 
> 
> 
> 
> 
> 
> 
> 
> Yet to try.
> hmm. Actually, you're better off with the driver resetting to stock clocks after the reboot. For obvious reasons.


I meant I want the driver to reset ALL values like Catalyst did. With Crimson, if my card crashed with overclocked values, Windows would boot with the OC clocks minus the voltage = black screen.


----------



## mus1mus

Yep.

I think the hidden offset will help with that.


----------



## bluej511

Me no likey 30°C ambient temps with no AC.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *bluej511*
> 
> Me no likey 30°C ambient temps with no AC.


Game naked?


----------



## mus1mus




----------



## spyshagg

A sacrifice most of us will make, I'm afraid!

...Until I made a tiny hole in the wall and moved all the radiators/fans to the other room!


----------



## bluej511

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Game naked?


Haha, it's already pretty close. I mean for PC temps, me no likey. Even with the window open and pretty much 40km/h gusts into the apartment, it makes absolutely no difference to temps. Case ambient and water temp at idle are already 30-31°C. Room ambient is probably 29-30°C.

At least I'll finally be getting fiber by the end of the year. Sick of 12Mbps; that translates to about 1.5-1.6MB/s.
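Since the Mbps-to-MB/s conversion trips people up, here's the quick divide-by-8 arithmetic (a throwaway sketch, nothing GPU-specific):

```python
def mbps_to_mbytes_per_s(mbps: float) -> float:
    """Convert a line rate in megabits/s to megabytes/s (8 bits per byte)."""
    return mbps / 8

# A 12 Mbps line has a 1.5 MB/s theoretical ceiling; ISPs often provision
# slightly above the advertised rate, hence real-world 1.5-1.6 MB/s.
print(mbps_to_mbytes_per_s(12))  # 1.5
```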


----------



## Spartan117J

I got 2178 in Heaven, and this is a pic of the 3116 I got in Valley.
The card is watercooled, overclocked to 1250 on the GPU and 1750 on the RAM, with full voltage and power limit.
Will try to push it further when I have more time.
Oh, and temps topped out at 49 degrees.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Spartan117J*
> 
> I got 2178 on heaven and this is a pic of the 3116 I got on valley.
> The card is watercooled.
> overclocked to 1250 gpu and 1750 ram, full voltage and power limit.
> Will try to push it further when I have more time
> Oh and temps topped out at 49 degrees


Could you post a baseline Fire Strike run at the 1250/1750 clock speeds?

Thanks


----------



## AliNT77

Anyone with a result from Time Spy?

Time Spy Result

R9 290 @ 1070/1625 with 1250-strap timings


----------



## SuperZan

Quote:


> Originally Posted by *AliNT77*
> 
> anyone with a Result from Time Spy?
> 
> Time Spy Result
> 
> R9 290 @1070-1625 with 1250Timings


Copypasta from my post in the news thread:

Total score of 6,588. Graphics score of 7,555. CPU score of 3,820.

GPU: 2x R9 390, stock
CPU: FX-8370 @ 4.8GHz

RAM: 16 GB 1926MHz, 9-10-9-28

http://www.3dmark.com/spy/9846

Just a quick first run to get my feet wet before I start tuning for best results.


----------



## AliNT77

Damn, that FX-8370 is destroying the i5-2500K in DX12...

My [email protected] + 1333 CL7-7-6-18 scores 7800 in the FireStrike Physics test. How much does your CPU score?


----------



## SuperZan

Quote:


> Originally Posted by *AliNT77*
> 
> damn, that FX8370 is destroying i5-2500k in DX12 ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my [email protected] + 1333CL7-7-6-18 scores 7800 in FireStrike Physics Test how much does your CPU score?


DX12 looks to utilise a bit more of the FX's potential, finally! 

As to FireStrike physics, quick and dirty testing under W10 gives me 9,307.

www.3dmark.com/3dm/13196922


----------



## jdorje

http://www.3dmark.com/3dm/13198214?

3957 graphics score, 3890 CPU score

This is my 390 on a 1125mV (<200W) summer overclock, with 1740 memory (3DMark never gets that right) and modded timings.

4690k at 4.7

3dmark seems to alternate between saying I have a valid score and that my graphics driver is not approved. Bit strange. I'm on the latest beta drivers though.

The Fire Strike physics score is just a fully threaded benchmark; a slow 8-core chip should score very well in it. And the graphics score is nearly CPU-independent. So I'd expect an 8350 to beat a 2500K.


----------



## SuperZan

Quote:


> Originally Posted by *jdorje*
> 
> The fire strike physics score is just a fully threaded benchmark; *a slow 8-core chip should score very well in it*. And the graphics score is nearly cpu independent. So I'd expect an 8350 to beat a 2500k.


FX has always underwhelmed in FireStrike physics, for whatever reason (poor IPC, for one). It's still not going to compete with a well-tuned Haswell/Skylake i7 or an HEDT hex/octa, but FX is a bit less sluggish in the combined scenes in Time Spy.

Light OC on the 390's (as I've got Gigabytes) was worth +500 on the graphics score: http://www.3dmark.com/spy/12227


----------



## mus1mus

The Fire Strike Physics score for FX is just about where it should be.

It's the Combined test that cripples the FX, with its one-core-per-CU utilisation.

----------



## JazzaHendo

Hi, I have the Gigabyte G1 Gaming model of the R9 390 and I was wondering if I could make any improvements to my fan curve. I currently have it at 30% fan speed until 45 degrees, and it goes up on a diagonal from there up to 100 degrees. I hit about 72 degrees in Grand Theft Auto and I'm happy with the temperature, but the noise level is a little high, and I was wondering if there were any improvements I could make to my fan curve to strike a better balance between temperature and fan noise.

Thank you for taking the time to read this post.


----------



## SuperZan

Quote:


> Originally Posted by *JazzaHendo*
> 
> Hi, I have a Gigabyte G1 Gaming model of an R9 390 and I was interested if i could make any improvements to my fan curve. I currently have it at 30% fan speed until 45 degrees and it goes up on a diagonal from there up until 100 degrees. I hit about 72 degrees in Grand Theft Auto and i happy with the temperature, but the noise level is a little high and i was wondering if there were any improvements I could make in my fan curve to make a better balance between temperature and fan noise.
> 
> Thank you for taking the time to read this post.


What's your case airflow situation? I've been able to save myself a bit of GPU fan noise in some of my cases through a more aggressive case-fan arrangement.


----------



## JazzaHendo

My case is the Define R5, and I have 2 intake fans in the front, 1 exhaust fan in the rear, and one intake fan mounted on the side panel putting cold air right on the GPU.

I plan to remove the rear ModuVent and put a 140mm exhaust fan in there.

Thank you for your reply.


----------



## diggiddi

Quote:


> Originally Posted by *SuperZan*
> 
> Copypasta from my post in the news thread:
> 
> Total score of 6,588 . Graphics score of 7,555 . CPU score of 3,820 .
> 
> GPU: 2x R9 390, stock
> 
> CPU: FX-8370 @ 4.8GHz
> RAM: 16 GB 1926MHz, 9-10-9-28
> 
> http://www.3dmark.com/spy/9846
> 
> Just a quick first run to get my feet wet before I start tuning for best results.


http://www.3dmark.com/3dm/13224743?

Total: 5,989
Graphics: 7,904 (stock CFX 290X)
CPU: 2,525 (stock FX 8350)


----------



## bluej511

3DMark is on sale for anyone interested.

It also has the Time Spy upgrade preview included. It's 10€ here; I believe it's $10 in the US. Can't wait to try DX12.

Here's a little API test if anyone is interested.

http://www.3dmark.com/3dm/13232620?


----------



## mus1mus

Anyone know if the 390X Devil uses a reference board?

It doesn't, though it looks like a good one.

I scrapped the 390, BTW. So now I'm looking at buying either another Nitro 390 or a 390X.

It had the potential for modding but lacked the headroom.

http://www.3dmark.com/3dm11/11413852


----------



## Kyneaz

Guys, which is the better choice for overclocking, the R9 390 Nitro or the MSI?


----------



## snurds

I've been experimenting a bit with my XFX 390X in MSI Afterburner, and it looks like I'm going to be temperature-throttled before any overclocking.

I played Crysis 3 at +0% power limit and my core clock stayed mostly in the 990s, with my temperature plateauing at 84 degrees. Then I changed to +50% power limit and shot up to a stable 1060MHz while my temperature climbed. Eventually temperature plateaued at 94 degrees while my clock rate moved around somewhere in the 1000s.

This can't be normal, right? It would seem to mean I can't overclock at all with my current case setup, because I can't keep a stable stock clock of 1060MHz in the first place.

Sorry if this is an obvious question; I haven't had a chance to look around in this thread yet.


----------



## bluej511

Quote:


> Originally Posted by *Kyneaz*
> 
> Guys what's the best solution between r9 390 nitro and msi for overcloking ?


Honestly, it's a crap shoot. Some Sapphires have Elpida, some have Hynix. The Hynix ones don't seem to get as high a core clock as the Elpida ones, and Elpida can't OC the memory as high as Hynix, but their cores seem to go higher.
Quote:


> Originally Posted by *snurds*
> 
> I've been experimenting a bit with my XFX 390x in MSI Afterburner and it looks like I'm going to be temperature throttled before any overclocking.
> 
> I played Crysis 3 at +0% Power Limit and my core clock stayed mostly in the 990s with my temperature plateauing at 84 degrees. Then I changed to +50% Power Limit and shot up to a stable 1060Mhz while my temperature climbed. Eventually temperature plateaued at 94 degrees while my clock rate moved around somewhere in the 1000s.
> 
> This can't be normal, right? It would seem to mean I can't overclock at all with my current case setup because I can't keep a stable stock clock of 1060 Mhz in the first place.
> 
> Sorry if this is an obvious question, I haven't had a chance to look around in this thread yet.


More info on your case/fans/etc., please, because yeah, hitting 94°C at stock clocks is pretty bad.


----------



## snurds

The case is the Thermaltake Chaser A31 ATX mid-tower. I bought the PC used and have never had occasion to mess with the fans, but it looks like there's just the one main exhaust fan in the back, immediately behind and orthogonal to the CPU. Then there's also the PSU fan.

I will very likely not be overclocking, since I only have a 600W PSU, but I would still like to get the full stock 1060MHz if possible.

Is it normal to install additional case fans for this card? It's the Double Dissipation.


----------



## bluej511

Quote:


> Originally Posted by *snurds*
> 
> The case is the Thermaltake Chaser A31 ATX Mid Tower Case. I bought the PC used and have never had occasion to mess with the fans but it looks like they're just the one main expulsion fan in the back, immediately behind and orthogonal to the CPU. Then there's also the PSU fan.
> 
> I will very likely not be overclocking since I only have a 600W PSU but still I would like to get the full stock 1060Mhz is possible.
> 
> Is it normal to install additional case fans for this card? It's the double dissipation.


There should be 2 intake fans as well. If they're missing, I would add 2 front intake fans immediately, and remove one or both of the HDD cages if you're not using them or have the HDD mounted somewhere else. Just looked at the pics; it looks like you can add a bottom fan as well, right behind the PSU. That should help a bit more than front intake fans, as it will blow air right onto the GPU.

That should help quite a bit right off the bat.


----------



## Sycksyde

Quote:


> Originally Posted by *Kyneaz*
> 
> Guys what's the best solution between r9 390 nitro and msi for overcloking ?


The Nitro has better cooling but it's still a lottery.


----------



## flopper

Quote:


> Originally Posted by *Kyneaz*
> 
> Guys what's the best solution between r9 390 nitro and msi for overcloking ?


Lottery.
MSI might edge it out, but basically every card OCs to around 1100-1150MHz.
Sure, some may do more, but then you really need good cooling.
Quote:


> Originally Posted by *snurds*
> 
> I've been experimenting a bit with my XFX 390x in MSI Afterburner and it looks like I'm going to be temperature throttled before any overclocking.
> 
> I played Crysis 3 at +0% Power Limit and my core clock stayed mostly in the 990s with my temperature plateauing at 84 degrees. Then I changed to +50% Power Limit and shot up to a stable 1060Mhz while my temperature climbed. Eventually temperature plateaued at 94 degrees while my clock rate moved around somewhere in the 1000s.
> 
> This can't be normal, right? It would seem to mean I can't overclock at all with my current case setup because I can't keep a stable stock clock of 1060 Mhz in the first place.
> 
> Sorry if this is an obvious question, I haven't had a chance to look around in this thread yet.


Might be bad contact with the core.
If XFX allows it under warranty, you might tighten the screws or replace the TIM.


----------



## Spartan117J

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Could you post a baseline Firstrike at the 1250/1750 clock speeds?
> 
> Thanks


Hi pal,
I don't have Fire Strike to test with, sorry.


----------



## diggiddi

Quote:


> Originally Posted by *Spartan117J*
> 
> Hi pal,
> I don't have fire strike to test with sorry


Download for free @ 3dmark.com


----------



## xboxshqip

Feels good being a 390 owner lately, what do you guys think?


----------



## snurds

Thanks for the replies about my thermal problem.

First of all there is an intake fan at the bottom, I didn't notice it before.

Anyway, I gave my case some more breathing room and it looked like the temperature was going to stabilize around 91 degrees. I didn't actually wait long enough to confirm that, because it didn't seem like a desirable result.

Then I tried removing the side of my case, and I got stabilization at 79 degrees. That's a good result, I think, but I'm not sure it's a good solution... it seems like it will lead to problems with dust?

But anyway I think it's confirmed that my basic problem is ventilation. I will look into moving things around in my case.

For anyone who has made it this far, what would you say is a good temperature target for a XFX 390x under full load but not overclocked?


----------



## TrueForm

Quote:


> Originally Posted by *xboxshqip*
> 
> Feels good being a 390 owner lately, what do you guys think?


Heck yeah.

It was my 390 or the 970; I chose the card that doesn't lie about the amount of VRAM it has.


----------



## JazzaHendo

Hi, I have the Gigabyte G1 Gaming model of the R9 390 and I was wondering if I could make any improvements to my fan curve. I currently have it at 30% fan speed until 45 degrees, and it goes up on a diagonal from there up to 100 degrees. I hit about 72 degrees in Grand Theft Auto and I'm happy with the temperature, but the noise level is a little high, and I was wondering if there were any improvements I could make to my fan curve to strike a better balance between temperature and fan noise.

Thank you for taking the time to read this post.

P.S. I did post this earlier, but I thought I would post it again as I received no replies.


----------



## SuperZan

I have it going from 30% at 40 degrees to 40% at 50, to 50% at 60, then I ramp the fans up by 5% every 5 degrees thereafter. This is in a Crossfire config without water blocks, and my fan speed is rarely above 70% for longer than a minute. If you're using a softer curve, try a more aggressive one like that; I find I save myself a bit of noise by cutting off heat early rather than trying to play catch-up.

The G1 fans are loud at 80% and up, though, so try to keep them from getting there.
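The stepped curve described above can be sketched in a few lines; the breakpoints are my reading of the post, not a tested profile:

```python
def fan_percent(temp_c: float) -> int:
    """Stepped GPU fan curve: 30% up to 40C, 40% to 50C, 50% to 60C,
    then +5% for every additional full 5 degrees, capped at 100%."""
    if temp_c <= 40:
        return 30
    if temp_c <= 50:
        return 40
    if temp_c <= 60:
        return 50
    steps = int((temp_c - 60) // 5)   # whole 5-degree steps past 60C
    return min(100, 50 + 5 * steps)

print(fan_percent(72))  # 60
```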


----------



## JazzaHendo

Thank you very much for your reply


----------



## 12Cores

How well do two 390Xs perform at 4K without AA? I'm considering picking up two of these instead of dual RX 480s. Thank you in advance.


----------



## snurds

I don't know but I think most people would tell you to go for a Fury X in your situation. They go for $400 on sale in the U.S.


----------



## SuperZan

Two 390s run everything I'm playing (Witcher 3, FO4, RoTR, Forza, CoD Black Ops 3, Blade and Soul, ARK, SC2, random AAA stuff) very well at 4K. Without AA I can run everything at Ultra-type settings, occasionally compromising on shadows or something down to High. I do have a FreeSync monitor, but I rarely experience dips below 50ish FPS, and if I do it's momentary.


----------



## 12Cores

Quote:


> Originally Posted by *SuperZan*
> 
> Two 390s run everything I'm playing (Witcher 3, FO4, RoTR, Forza, CoD black ops 3, Blade and Soul, ARK, SC2, random AAA stuff) very well at 4k. Without AA I can run everything AT Ultra type settings, occasionally compromising on shadows or something down to High. I do have a Free sync monitor but rarely experience dips below 50ish FPS and if I do it's momentary


SuperZan, interesting. Do you play Project Cars at 4K? If so, what is the experience like?

Snurds, I am staying away from the Fury X due to the 4GB of VRAM.


----------



## SuperZan

No Project Cars for me, unfortunately. Capping tessellation at 32x or lower helps in TWIMTBP games, though.


----------



## jdorje

I don't think the 4GB of VRAM should be any more of a turn-off than the downsides of Crossfire. All that happens when you run out of VRAM is the occasional microstutter when new textures have to be swapped in; that should be minor compared to the microstuttering you can get in Crossfire.

Not sure 4K gaming is really a good idea compared to 1440p, which can be run quite decently on a single Hawaii or very well on two.

Of course, a Fury X ($390 I think!) is only like 40% more powerful than a 390, so two 390s will actually outperform it.

Kinda does make me want to get a second 390/390X for my [email protected] though. Graphics in The Witcher could use some improvement with a single 390 at 1440p.


----------



## SuperZan

I had two Furies, and before that a Fury X (I like to swap around), and the experience with 390 Crossfire is basically the same as Fury Crossfire at 60Hz 4K. The 8GB of VRAM does seem to help with buffering vis-a-vis Crossfire, though, as what microstutter I've seen is short-lived and nearly imperceptible.


----------



## ZoePancakes

Posted this in the RX 480 thread as well.

I must choose between an RX 480 and an R9 390X. Reference RX 480s already cost 320 euros here in the Netherlands, while the Nitro R9 390X costs 340.

I can't decide which to take at the moment. I game at 1440p on my monitor and 4K on the TV. What should I do?


----------



## mus1mus

You will need to fall in line for the RX 480.


----------



## Mister300

My XFX 390X does 80°C under load in an NZXT 440.


----------



## TrueForm

My 390's full load temp is 73C with the core at 1100, up from 1025.


----------



## christoph

My 390 does 55C at full load.


----------



## bluej511

Quote:


> Originally Posted by *snurds*
> 
> Thanks for the replies about my thermal problem.
> 
> First of all there is an intake fan at the bottom, I didn't notice it before.
> 
> Anyway I gave my case some more breathing room and it looked like the temperature was going to stabilize around 91 degrees. I didn't actually wait long enough to confirm that because it didn't seem like a desirable result.
> 
> Then I tried removing the side of my case and I got stabilization at 79 degrees. That's a good result I think but not sure if that's a good solution...seems like it will lead to problems with dust?
> 
> But anyway I think it's confirmed that my basic problem is ventilation. I will look into moving things around in my case.
> 
> For anyone who has made it this far, what would you say is a good temperature target for a XFX 390x under full load but not overclocked?


If it dropped that much with the case open, it's a case airflow issue.

Most of our cards will stay under 80°C, but it also really depends on location and fan setup. Also, are you using the Crimson fan curve or one you made in Afterburner? Crimson keeps the fan curve very low for quietness, at the cost of more heat. I had mine on my Nitro peak at 60%; that would keep my temps under 75°C at all times.
Quote:


> Originally Posted by *christoph*
> 
> my 390 full load does 55c


Damn, again, someone else who lives in an igloo haha.


----------



## snurds

I'm using the Crimson curve.

Do you think it's unusual to use a 390x in case with only one intake and one exhaust fan? Or is that typical?

I think I'm just going to pop the side of my case off when I play games at 100% GPU load (which is not all that common, I play a lot of old games). But I'm still a bit curious about what the underlying cause of my ventilation problem is. Because my case setup seems pretty normal.


----------



## SuperZan

Quote:


> Originally Posted by *snurds*
> 
> I'm using the Crimson curve.
> 
> Do you think it's unusual to use a 390x in case with only one intake and one exhaust fan? Or is that typical?
> 
> I think I'm just going to pop the side of my case off when I play games at 100% GPU load (which is not all that common, I play a lot of old games). But I'm still a bit curious about what the underlying cause of my ventilation problem is. Because my case setup seems pretty normal.


Depending on your case size that's a little light on the airflow. I mean, the configuration is fine in itself and that can work just fine, but depending on the case I'm usually going for more airflow than that.

As an example on my main gamer with 390 crossfire, I've got rad intake topside, two side intake, two front intake, and two exhaust aft. It's a bit overkill and took some modding. On a rig in a Corsair 760T, I've got two intake front, two intake on the side, an exhaust aft, and two exhaust topside. Again, overkill perhaps, but Hawaii runs warm (as does Fermi in the 760T) and I like to keep things nice and cool.


----------



## bluej511

Quote:


> Originally Posted by *snurds*
> 
> I'm using the Crimson curve.
> 
> Do you think it's unusual to use a 390x in case with only one intake and one exhaust fan? Or is that typical?
> 
> I think I'm just going to pop the side of my case off when I play games at 100% GPU load (which is not all that common, I play a lot of old games). But I'm still a bit curious about what the underlying cause of my ventilation problem is. Because my case setup seems pretty normal.


I've got a dozen fans, but I'm running watercooled, so 8 of my fans are for the rads alone haha.

One intake is a bit light. I did look up your case, and it does have 2 intake fan spots. If you bought used, my guess is he's using the fans for something else. You can always buy two decent fans, install them, and remove one or both HDD cages to get better flow and need less static pressure.

The Crimson curve is also pretty weak sauce. If you can, install Afterburner and monitor the fan %; it might be staying below 40% or something. You can try to set it manually and leave it at like 50-60%, or whatever you deem acceptable for noise; it will stay at a single speed. If your temps drop even more, then that's why.

Usually running a case open and seeing a huge drop means you don't have enough exhaust/intake. Ask @doyll, he should be able to help you with that, or check out his threads/stickies. The guy is the case/fan guru.


----------



## doyll

Quote:


> Originally Posted by *snurds*
> 
> Thanks for the replies about my thermal problem.
> 
> First of all there is an intake fan at the bottom, I didn't notice it before.
> 
> Anyway I gave my case some more breathing room and it looked like the temperature was going to stabilize around 91 degrees. I didn't actually wait long enough to confirm that because it didn't seem like a desirable result.
> 
> Then I tried removing the side of my case and I got stabilization at 79 degrees. That's a good result I think but not sure if that's a good solution...seems like it will lead to problems with dust?
> 
> But anyway I think it's confirmed that my basic problem is ventilation. I will look into moving things around in my case.
> 
> For anyone who has made it this far, what would you say is a good temperature target for a XFX 390x under full load but not overclocked?


Like @bluej511 said, you have an airflow issue.
But it's 'not enough exhaust AND intake'. We can't have one without the other: what goes in must come out, and what comes out must go in. They both have to flow the same amount of air, and both are limited by whichever is the lesser of the two.
I suggest you look at the "Ways to Better Cooling" link in my sig. The 1st post is an index; click on topics to see them. Start with the 5th one to get a basic look at how case airflow works. This should give you enough knowledge to have a basic idea of what is going on. Ask if you want more, because that is just the tip of the iceberg.
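The 'limited by the lesser of the two' rule above can be put in one line. This is a gross simplification (real cases leak air through vents), and the CFM figures in the example are made-up illustration numbers, not measured fan specs:

```python
def effective_airflow(intake_cfm: list, exhaust_cfm: list) -> float:
    """Case throughput is capped by whichever side moves less air."""
    return min(sum(intake_cfm), sum(exhaust_cfm))

# Two 50 CFM front intakes against a single 40 CFM rear exhaust:
# the exhaust side is the bottleneck, so adding more intakes alone won't help.
print(effective_airflow([50, 50], [40]))  # 40
```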


----------



## Krzych04650

So like I said before, my 390 Nitro is whining like crazy and I have some issues with FreeSync enabled (I don't know yet if this is GPU or monitor related; I will get another AMD card soon to check that out), so I am returning the card. It did get through the 3DMark tests without issue though, so I will post results (TriXX, Fire Strike and GPU-Z)

Stock

Maximum stable core OC without touching memory and with +100mv overvoltage

(Card is not crashing at higher core frequencies like 1120 or so but gives visual artifacts)

Maximum stable core and memory OC

As for temps, all tests were made with the fan speed fixed at 30% (1500 RPM), because I find that speed the maximum acceptable in terms of noise. Case is a Fractal Design Define R5, with slow 140mm SilentiumPC Sigma Pro 3-pin fans running at 5V (~400 RPM). There are 5 of them: 2x front intake, bottom intake, side panel intake and rear exhaust.

As for the noise of the Nitro fans (keep in mind that I am playing on speakers, not a headset):
-the minimum speed at which the fans start to spin seems to be 25% (1200 RPM), and at this speed the GPU fan noise slightly exceeds the case fan noise, so Nitro fan noise at 25% (1200 RPM) is equivalent to a mid-range fan running at 500-600 RPM, i.e. a typical speed for the 4-pin PWM fans found on CPU coolers and the like while running at 5V
-I found the maximum acceptable noise level to be at 30% (1500 RPM) fan speed. This is equivalent to a mid-range case fan running at 7V (800-900 RPM).

Overall:
-the card is very well optimized for stock operation. At stock speeds, the maximum temp with a fixed 30% fan speed was as low as 73C, which is a great result considering the huge power consumption and low fan speed. You can easily get away with the minimum 25% fan speed and keep the card below/around 80C with little noise, which is impressive for a card with this kind of power consumption and heat output. VRM temperatures are in check at only 75C.
-the card is not very suitable for overclocking. The potential is very limited, as are the performance gains, while power consumption and heat go sky high and you cannot cool the card quietly anymore.
-coil whine is very loud, which makes the card unusable under gaming load, but this is quite a random thing and is present to some degree on any GPU on the market, so I wouldn't blame this particular Sapphire model. The industry just cannot deal with this issue, or they don't want to/don't care
-the card is very solidly built
-the card is not as huge as people say; it is just long, not wide or thick, and not super heavy either


----------



## ZoePancakes

Quote:


> Originally Posted by *mus1mus*
> 
> You will need to fall in line for the RX 480.


so the rx 480 is better? in some benchmarks it's neck and neck, but some favor the 390x and some the 480 >< it's confusing as hell


----------



## ThatGuy16

I'm thinking about adding another R9 390 to my rig. You guys think it's worth it, and will my PSU support two? There's a few games I get some lag in every now and then playing maxed out. For some reason I don't have the ability to lower graphics settings...







lol


----------



## mus1mus

Quote:


> Originally Posted by *ZoePancakes*
> 
> so the rx 480 is better? in some benchmarks it's neck and neck, but some favor the 390x and some the 480 >< it's confusing as hell


IMO, 390X = 480. But one thing that is very appealing about the 480 is power consumption. 2nd is cost.

If you already have a 390/X, just go crossfire. If starting from scratch, go for the 480.

I declined the store's offer of a "priority list" for the 480 when stock arrives. That was after returning the 390. I just went for a full refund and will keep the money til used prices drop further.








Quote:


> Originally Posted by *ThatGuy16*
> 
> I'm thinking about adding another R9 390 to my rig. You guys think it's worth it, and will my PSU support two? There's a few games I get some lag in every now and then playing maxed out. For some reason I don't have the ability to lower graphics settings...
> 
> 
> 
> 
> 
> 
> 
> lol


Mention the PSU model and capacity for guys on mobile.









You might need 1000 W or more when you overclock.
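If you want a rough sanity check before buying, here's a back-of-the-envelope sketch. Every wattage below is a ballpark assumption (the 390 is rated around 275 W board power at stock and pulls noticeably more overclocked), not a measurement:

```python
# Rough PSU sizing for CrossFire R9 390s. All wattages are ballpark
# assumptions; measure at the wall if you want real numbers.
def recommended_psu_watts(gpu_w=275, gpu_count=2, cpu_w=150,
                          rest_w=75, headroom=0.3):
    """Return (estimated draw, suggested PSU rating with headroom)."""
    total = gpu_w * gpu_count + cpu_w + rest_w
    return total, int(total * (1 + headroom) + 0.5)

draw, psu = recommended_psu_watts()                            # stock-ish clocks
draw_oc, psu_oc = recommended_psu_watts(gpu_w=375, cpu_w=180)  # heavy OC guess
print(draw, psu, draw_oc, psu_oc)
```

With overclocked guesses plugged in, the suggested rating lands above 1000 W, which is where the advice above comes from.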


----------



## ThatGuy16

Quote:


> Originally Posted by *mus1mus*
> 
> IMO, 390X = 480. But one thing that is very appealing about the 480 is power consumption. 2nd is cost.
> 
> If you already have a 390/X, just go crossfire. If starting from scratch, go for the 480.
> 
> I declined the store's offer of a "priority list" for the 480 when stock arrives. That was after returning the 390. I just went for a full refund and will keep the money til used prices drop further.
> 
> 
> 
> 
> 
> 
> 
> 
> Mention the PSU model and capacity for guys on mobile.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You might need 1000 W or more when you overclock.


EVGA G2 750W, the one made by Seasonic... probably not big enough; I'm overclocked with a 4690K and watercooling lol.


----------



## jdorje

Interesting little experiment I did/am doing today.

I've been having some flickering while on low power states of my modded bios. So I've used afterburner to set my clock/memory/voltage to each power state one at a time.

P3 is 840 mhz at 1050 mV with 1500 vram. Probably way lower clock/higher voltage than is needed, but I ran with it.

Scored 2355 on valley (extreme hd preset). Firestrike 10604 score, though strangely my physics score dropped also and I couldn't raise it. http://www.3dmark.com/3dm/13363305?

Max power usage as registered by the VRMs ("Power In" in HWiNFO) is 106+24 watts (106 watts on the core VRMs, 24 on the VRAM). Temperatures were absurdly low.

Now on to P2, which I assume is the offender with its 150 mhz vram.


----------



## mus1mus

Quote:


> Originally Posted by *ThatGuy16*
> 
> EVGA G2 750W, the one made by Seasonic... probably not big enough; I'm overclocked with a 4690K and watercooling lol.


Definitely not enough.









FYI, 2 290Xs and an overclocked 5930K will OCP trip my 1250 once I push things.


----------



## snurds

I've been experimenting with overclocking my 390x a bit and wonder if these results seem normal. I'm not going to be increasing the voltage because it looks like that would be pretty hazardous with my 600W PSU.

Anyway, I seem to have no problems at 1080MHz. Beyond that I got a little artifacting at 1085. From what I have read 1080 is a little weak at +0 V, but not too far off.

What seems odd to me is the memory clock. I have tried increments as small as 10 MHz either way (underclocking as well as overclocking) and nothing seems to improve my Unigine Heaven score. This is both at my stock 1060 clock and at my overclocked 1080, and I think at 1070 as well.

Is it normal to get no gain at all from a VRAM overclock? I haven't tried increments smaller than 10, although if I did the benefit would be so small it would just be to satisfy my curiosity.


----------



## gupsterg

RAM OC gains little due to the RAM bus width; the 512-bit bus already provides plenty of bandwidth.

You need larger increase to gain something discernible in bench and also depends on bench.

You also need to be setting RAM frequency closer to the end of a RAM strap to gain the most performance for the timings of that strap.

390/X also have slightly tighter RAM timings "out of the box" so gain is less vs tightening RAM timings on 290/X.

Also the 390/X has tweaked memory controller timings vs 290/X, from what The Stilt has posted. When we add the VRAM_Info from 390/X to 290/X (with mods to make it compatible) there is again an ever so slight boost.

All in all the 390/X out of the box is tweaked more vs 290/X.
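gupsterg's "end of a RAM strap" point can be sketched like this. Timings are fixed within a strap (a frequency range), so the best spot is the top of your current strap: maximum clock for the same timings. The boundary values below are purely illustrative, not a real strap table, and the helper name is made up:

```python
# Illustrative sketch only: strap boundary values here are NOT the real
# Hawaii/Grenada straps, just placeholders to show the idea.
STRAP_TOPS_MHZ = [1125, 1250, 1375, 1500, 1625, 1750]

def top_of_strap(freq_mhz):
    """Highest frequency sharing the same timing strap as freq_mhz."""
    for top in STRAP_TOPS_MHZ:
        if freq_mhz <= top:
            return top
    return freq_mhz  # past the last known strap; leave unchanged

print(top_of_strap(1400))  # same timings apply all the way up to 1500
```

In other words, running at 1400 on this (made-up) table leaves free performance on the table; 1500 gives more clock for identical timings.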


----------



## bottlefedchaney

http://www.3dmark.com/spy/84003

CPU- 3770k @ 4.7
GPU- Strix 390x @ 1150core 1600mem
Windows 10

Going to try and OC it more tonight and see. It's so humid in the south and it's been hot.










Love seeing Time Spy scores, keep up the good work fellas.


----------



## jdorje

I'd always discounted claims that memory and core overclocking had any relation to each other, but today I more or less proved the connection while playing around with my bios, lowering the voltage for all 8 of my pstates.

For example, if I run 936 on the core (pstate 5, aka the 3rd highest) with 1740 memory (my max) at a minimal core voltage, I get massive memory artifacting. Memory artifacting is very different from core artifacting - more like flickering, in this case, while core artifacting is white snow. Running OCCT with its error checking on I get 0 errors, but the screen flickers unusably.

Now if I lower memory from 1740 to 1500, it's immediately solved. But if I raise core voltage enough (like 50 mV), it's also solved. And this is a straight up on-off thing: at one voltage setting it's flickering to hell, and then with 6.25 mV more it's completely gone and stable. AUX voltage seems to have no effect.


----------



## jon666

Raised memory by fifty, then raised it again. I've never had GPU memory clock this easily before. Might try for more later. http://www.3dmark.com/fs/9398840

Guess I'd better add the scores from the shiny new benchmark.

http://www.3dmark.com/spy/96402


----------



## Krzych04650

Here is a clock stability test I did on the Sapphire 390 Nitro in Witcher 3, with and without the Power Efficiency option enabled:



I did this test because the game was stuttery as fcuk.

Is AMD completely insane? Power Efficiency mode completely breaks performance and smoothness, and it is enabled by default. How many users and reviewers even know about that? How many users are even using this Radeon Settings panel? How could they set something like this by default?


----------



## jdorje

Is there any difference in frame rate?


----------



## Krzych04650

Quote:


> Originally Posted by *jdorje*
> 
> Is there any difference in frame rate?


Well, if there is a lot of stuttering with the Power Efficiency option enabled then surely there is a difference: many more FPS drops and less consistency.


----------



## jdorje

Normally if I got stuttering in a game I'd assume it was from lack of CPU power or something running in the background. If upping the GPU clock fixes it, then it could be tied to GPU.


----------



## bluej511

Quote:


> Originally Posted by *Krzych04650*
> 
> Here is clock stability test I made on Sapphire 390 Nitro in Witcher 3 with and without Power Efficiency option enabled:
> 
> 
> 
> I did this test because the game was stuttery as fcuk.
> 
> Is AMD completely insane? Power Efficiency mode completely breaks performance and smoothness, and it is enabled by default. How many users and reviewers even know about that? How many users are even using this Radeon Settings panel? How could they set something like this by default?


It's all over the net already, at least in the US for English speakers. It was discovered when people were getting poor Firestrike scores. Mine's been off since day 1 anyway, so I've had no issues. Haven't even bothered trying it with it on haha.


----------



## dagget3450

Just be glad they added the option to turn off power efficiency. For fury owners it was a fracking nightmare as we had no way to turn it off


----------



## snurds

Is there a way to change the voltage setting in smaller increments than MSI Afterburner allows? I seem to just barely get artifacting at -6 (not sure what that stands for, -6%?). Or do you have to edit the BIOS for that?


----------



## PunkX 1

Quote:


> Originally Posted by *snurds*
> 
> Is there a way to change the voltage setting in smaller increments than MSI Afterburner allows? I seem to just barely get artifacting at -6 (not sure what that stands for, -6%?). Or do you have to edit the BIOS for that?


Bios.


----------



## gupsterg

Quote:


> Originally Posted by *snurds*
> 
> Is there a way to change the voltage setting in smaller increments than MSI Afterburner allows? I seem to just barely get artifacting at -6 (not sure what that stands for, -6%?). Or do you have to edit the BIOS for that?


What you see in MSI AB is a rounded number; you can't do a smaller increment than 6.25mV via an OC tool/ROM, as this is a stipulation of AMD SVI.
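As an illustration of that rounding (the 6.25 mV step is the AMD SVI stipulation above; the helper name is made up):

```python
# OC tools display whole numbers, but the hardware only moves in
# 6.25 mV steps, so every requested offset gets snapped to the grid.
STEP_MV = 6.25

def snap_to_svi(offset_mv):
    """Round a requested voltage offset (mV) to the nearest SVI step."""
    return round(offset_mv / STEP_MV) * STEP_MV

print(snap_to_svi(-6))   # a "-6" entered in an OC tool applies -6.25
print(snap_to_svi(10))   # a "10" actually lands on 12.5
```

So the "-6" snurds typed isn't a special value; it's just the nearest displayable number to the -6.25 mV step actually applied.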


----------



## Charcharo

Are there any real reports of people unlocking a 390X from a PowerColor R9 390?


----------



## OneB1t

nope


----------



## battleaxe

So far, I gotta say I'm feeling really good about my 390x purchases from a long time ago. Nothing really that great on the market yet.


----------



## jdorje

Quote:


> Originally Posted by *snurds*
> 
> Is there a way to change the voltage setting in smaller increments than MSI Afterburner allows? I seem to just barely get artifacting at -6 (not sure what that stands for, -6%?). Or do you have to edit the BIOS for that?


You can't do it in the bios either. The voltage steps are in 6.25 mV increments.


----------



## Rexer

Quote:


> Originally Posted by *dagget3450*
> 
> Just be glad they added the option to turn off power efficiency. For fury owners it was a fracking nightmare as we had no way to turn it off


I did a lot of workarounds, even turning the thing off a few months ago... only to find out it's turned on again. Makes ya think there's an Nvidia saboteur working at AMD.


----------



## tolis626

Quote:


> Originally Posted by *Rexer*
> 
> I did a lot of work arounds. Even turning the thing off a few months ago.. only to find out it's turned on again. Makes ya think there's an Nvidia saboteur working at AMD.


Tried DDUing the driver then reinstalling it? There's surely something wrong there. It's been working fine for most of us for many months now.


----------



## fyzzz

I think my 390 is officially dead...







I was just rebuilding my loop, and when I started my PC everything was fine. But after it had been on a while, suddenly white artifacts appeared all over the screen and the computer died. It refused to boot later; the motherboard was just showing b2 as an error code. My 290 boots up just fine, and when I switch over to the 390 again it actually boots this time, but after a while it died again with the same artifacts all over the screen. I don't think my next card is going to be an XFX card again. I had a 290 a while ago, also from XFX, which died a random death too.


----------



## Skry

I agree xboxshqip. I just picked up a 390X for a great price thanks to a kind soul, and yes, it is a nice improvement over the 290!


----------



## OneB1t

Quote:


> Originally Posted by *fyzzz*
> 
> I think my 390 is officially dead...
> 
> 
> 
> 
> 
> 
> 
> I was just rebuilding my loop, and when I started my PC everything was fine. But after it had been on a while, suddenly white artifacts appeared all over the screen and the computer died. It refused to boot later; the motherboard was just showing b2 as an error code. My 290 boots up just fine, and when I switch over to the 390 again it actually boots this time, but after a while it died again with the same artifacts all over the screen. I don't think my next card is going to be an XFX card again. I had a 290 a while ago, also from XFX, which died a random death too.


but did they have the reference PCB or not?


----------



## 12Cores

Just upgraded to two R9 390X's. I cannot believe the amount of heat they put out. I think I finally got the heat under control; I had to put two 120mm fans on the side of the cards







for added cooling. I can run all my games at 3200x1800 maxed out at 60Hz. The only game I am having issues with is The Witcher 3; it's pulling almost 800 watts from the wall on ultra without HairWorks.

I have not been on air in about 5 years, feels odd trying to figure out fan curves and such. Does anyone have any fan curve tips for crossfire using afterburner?

R9 390X Crossfire = responsible amount of power.


----------



## OneB1t

Check the 390X BIOSes and remove the voltage offsets from the VRM, then underclock them to 1050MHz, or 1000MHz if it's some overclocked edition. That will get thermals under control.


----------



## 12Cores

Quote:


> Originally Posted by *OneB1t*
> 
> Check the 390X BIOSes and remove the voltage offsets from the VRM, then underclock them to 1050MHz, or 1000MHz if it's some overclocked edition. That will get thermals under control.


I think I have the temps under control, but how would I access the BIOS for the card?


----------



## p1234

Is the R9 390X G1 Gaming a good model? I read on the internet that it has temperature problems, and the fact the voltage is locked makes me wonder how loud it might be. I couldn't find a single review for this model. Do you guys know if it is an OK card, or should I just wait even more for the custom 480?


----------



## Devildog83

Quote:


> Originally Posted by *p1234*
> 
> Is the R9 390X G1 Gaming a good model? I read on the internet that it has temperature problems, and the fact the voltage is locked makes me wonder how loud it might be. I couldn't find a single review for this model. Do you guys know if it is an OK card, or should I just wait even more for the custom 480?


Wait for the custom 480 for sure; the card is going to be so much better. The 490 will also be out soon, which should be worth waiting for too if you can.


----------



## Devildog83

Will a 480 crossfire with a 390, or will that only work with Explicit Multi-adapter?


----------



## p1234

Quote:


> Originally Posted by *Devildog83*
> 
> Wait for the custom 480 for sure. The card is going to be so much better in the future. The 490 will also be out soon which should be worth waiting for also if you can.


I don't have the money to buy a 490 for sure. I'm thinking of getting the 390x because it is in the price range where I expect the custom 480 to arrive in my country; basically I'm considering the G1 Gaming just to not wait or buy a reference 480. Do you still think it is a bad move?


----------



## Devildog83

Quote:


> Originally Posted by *p1234*
> 
> I don't have the money to buy a 490 for sure. I'm thinking of getting the 390x because it is in the price range where I expect the custom 480 to arrive in my country; basically I'm considering the G1 Gaming just to not wait or buy a reference 480. Do you still think it is a bad move?


From what I can tell the 480 will be at least on par with the 390x performance-wise, with less heat and less power draw, so wait for the 480 if you can. I bought a 390 a few months ago and I will be selling it and getting the 480, or maybe even waiting a bit to get a 490. I don't think the price is going to be crazy either, but we don't know yet what the card will have in it. The AIB partner cards should be out very soon for the 480.


----------



## PunkX 1

When you decide that the fans on your XFX R9 390 DD just make way too much noise and you come up with a ghetto fix







Zip-tied two Noctua Industrial PPC 120mm fans to the cooler. Well, that annoying noise is gone and temps are quite good too







Feast your eyes on this ungodly monstrosity.


----------



## TainePC

Quote:


> Originally Posted by *PunkX 1*
> 
> When you decide that the fans on your XFX R9 390 DD just make way too much noise and you come up with a ghetto fix
> 
> 
> 
> 
> 
> 
> 
> Zip tied two Noctua Industrial PPC 120mm fans to the cooler. Well, that annoying noise has gone and temps are quite good too
> 
> 
> 
> 
> 
> 
> 
> Feast your eyes on this ungodly monstrosity.


Ikr, the stock fans are loud. At first I thought it was coil whine, but after installing a water block I realised it was just the stock cooler


----------



## 12Cores

Upgraded from two water-cooled 7970's @ …/1575 to two R9 390X's on air to hold me over until Vega drops. Spent a good portion of the weekend trying to cool them down while running very demanding games like Crysis 3 and The Witcher 3 at 3200x1800 without AA. I took the side off my case and added two 120mm fans to the side of the cards, and they still would not run under 85 degrees in these two demanding games. I decided to reduce volts by .25 in Afterburner and drop the clocks from 1080/1500 to 1000/1250: massive improvement with no impact on fps.

*Results with Vsync on*

*Witcher 3* at 3200x1800, hairworks off, ambient occlusion off, everything else on max
1080/1500 ~780 watts, locked at 60fps, ~90c
1000/1250 ~550 watts, locked at 60fps, ~75c

*Crysis 3* 3200x1800 Very High, Lens Flare on, No AA
1080/1500 ~750 watts, locked at 60fps, ~90c
1000/1250 ~550 watts, locked at 60fps, ~73c

*BF4* 3200x1800 Ultra, No AA
1080/1500 ~500 watts, locked at 60fps, ~73c
1000/1250 ~400 watts, locked at 60fps, ~63c

*Project Cars* 3200x1800 all settings maxed, AA (SMAA Ultra), Thunderstorm
1080/1500 ~750 watts, ~55fps, ~90c
1000/1250 ~550 watts, ~55fps, ~75c
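Running the wall-power numbers above through a quick script puts the undervolt/underclock win in percentage terms (figures taken straight from the post; readings are approximate wall measurements):

```python
# Wall-power readings from the post above, per game:
# (watts at 1080/1500, watts at 1000/1250 undervolted)
results = {
    "Witcher 3":    (780, 550),
    "Crysis 3":     (750, 550),
    "BF4":          (500, 400),
    "Project Cars": (750, 550),
}

for game, (stock_w, uv_w) in results.items():
    saved = stock_w - uv_w
    print(f"{game}: -{saved} W ({100 * saved / stock_w:.0f}% less)")
```

Roughly a quarter to a third less power in the heavy titles, for the same locked framerate.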


----------



## Worldwin

Quote:


> Originally Posted by *12Cores*
> 
> Upgraded from two water-cooled 7970's @ …/1575 to two R9 390X's on air to hold me over until Vega drops. Spent a good portion of the weekend trying to cool them down while running very demanding games like Crysis 3 and The Witcher 3 at 3200x1800 without AA. I took the side off my case and added two 120mm fans to the side of the cards, and they still would not run under 85 degrees in these two demanding games. I decided to reduce volts by .25 in Afterburner and drop the clocks from 1080/1500 to 1000/1250: massive improvement with no impact on fps.
> 
> *Results with Vsync on*
> 
> *Witcher 3* at 3200x1800, hairworks off, ambient occlusion off, everything else on max
> 1080/1500 ~780 watts, locked at 60fps, ~90c
> 1000/1250 ~550 watts, locked at 60fps, ~75c
> 
> *Crysis 3* 3200x1800 Very High, Lens Flare on, No AA
> 1080/1500 ~750 watts, locked at 60fps, ~90c
> 1000/1250 ~550 watts, locked at 60fps, ~73c
> 
> *BF4* 3200x1800 Ultra, No AA
> 1080/1500 ~500 watts, locked at 60fps, ~73c
> 1000/1250 ~400 watts, locked at 60fps, ~63c
> 
> *Project Cars* 3200x1800 all settings maxed, AA (SMAA Ultra), Thunderstorm
> 1080/1500 ~750 watts, ~55fps, ~90c
> 1000/1250 ~550 watts, ~55fps, ~75c


Undervolt harder. I'm sure you should be able to shave another 30-60mV @ 1080MHz.


----------



## PunkX 1

Quote:


> Originally Posted by *TainePC*
> 
> Ikr, the stock fans are loud. At first I thought it was coil whine, but after installing a water block I realised it was just the stock cooler


Totally! Those fans are so noisy, probably louder than all the other fans in your system combined. I'm quite happy now


----------



## RWGTROLL

What are you guys' VRM1 and VRM2 temps getting up to on the R9 390X? I know they get hot; I just wanted to see if mine are normal


----------



## Rexer

Quote:


> Originally Posted by *tolis626*
> 
> Tried DDUing the driver then reinstalling it? There's surely something wrong there. It's been working fine for most of us for many months now.


Yeah lol. I'm forever uninstalling and reinstalling drivers, games etc. Every time I get an AMD update I tweak it in sync. Sometimes I think it's my hobby in life. My last driver was great. I just installed a new one and it stutters in every game (this is with FreeSync on). I'll get it smooth as silk, then download a new Crimson next month and do it all over again. Sometimes I have to remove everything and start over. What I should do is stop installing every new AMD driver update that comes out. I just keep thinking each new Crimson update is better than the last, so I download it. Once I took apart a card just to look at it. Threw some Arctic 5 paste on it and bam! My temps dropped 10c. Pretty surprised at that.


----------



## snurds

Quote:


> Originally Posted by *12Cores*
> 
> Upgraded from two water-cooled 7970's @ …/1575 to two R9 390X's on air to hold me over until Vega drops. Spent a good portion of the weekend trying to cool them down while running very demanding games like Crysis 3 and The Witcher 3 at 3200x1800 without AA. I took the side off my case and added two 120mm fans to the side of the cards, and they still would not run under 85 degrees in these two demanding games. I decided to reduce volts by .25 in Afterburner and drop the clocks from 1080/1500 to 1000/1250: massive improvement with no impact on fps.
> 
> *Results with Vsync on*
> 
> *Witcher 3* at 3200x1800, hairworks off, ambient occlusion off, everything else on max
> 1080/1500 ~780 watts, locked at 60fps, ~90c
> 1000/1250 ~550 watts, locked at 60fps, ~75c
> 
> *Crysis 3* 3200x1800 Very High, Lens Flare on, No AA
> 1080/1500 ~750 watts, locked at 60fps, ~90c
> 1000/1250 ~550 watts, locked at 60fps, ~73c
> 
> *BF4* 3200x1800 Ultra, No AA
> 1080/1500 ~500 watts, locked at 60fps, ~73c
> 1000/1250 ~400 watts, locked at 60fps, ~63c
> 
> *Project Cars* 3200x1800 all settings maxed, AA (SMAA Ultra), Thunderstorm
> 1080/1500 ~750 watts, ~55fps, ~90c
> 1000/1250 ~550 watts, ~55fps, ~75c


So I guess this means CF scaling is not limited by clock speeds. What does that leave, CPU speed?


----------



## 12Cores

Quote:


> Originally Posted by *snurds*
> 
> So I guess this means CF scaling is not limited by clock speeds. What does that leave, CPU speed?


Not sure, but my cpu does not seem to be holding the cards back.


----------



## flopper

Quote:


> Originally Posted by *snurds*
> 
> So I guess this means CF scaling is not limited by clock speeds. What does that leave, CPU speed?


he plays at 3200x1800..... not CPU-bound at that resolution, nah...


----------



## mus1mus

As you go higher in resolution, it becomes more GPU-Bound.

Also, LOCKED to 60FPS.


----------



## Wrecker66

I got a 390 Strix and would like to know, is it possible to flash it to a 390X? Is it the same procedure as with the 290-290X?
Also i would like to pair it with my 295x2.


----------



## mus1mus

You can pair them, no issues.

Flashing it to a 390X will work as long as you don't have an ROG board. ROG boards will not allow it to POST due to an ID mismatch. Others will just skip the PCI ID check.

But seriously, there's no reason to flash it to a 390X.


----------



## Wrecker66

Where can I find the best and simplest guide, and a 390X BIOS?


----------



## 12Cores

Quote:


> Originally Posted by *mus1mus*
> 
> As you go higher in resolution, it becomes more GPU-Bound.
> 
> Also, LOCKED to 60FPS.


This. If not locked at 60 there is no way I can keep these things cool without water. The cards have exceeded my expectations; I was expecting a minor upgrade over the 7970's, but they are running most games close to max settings at 3200x1800. That is 5.7 million pixels, folks, not bad at all.
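For anyone checking the arithmetic on that "5.7 million pixels" figure against a plain 1080p panel:

```python
# Pixel counts: the 3200x1800 VSR target vs a native 1080p panel.
vsr_pixels = 3200 * 1800
native_pixels = 1920 * 1080
print(vsr_pixels)                  # 5,760,000 pixels rendered per frame
print(vsr_pixels / native_pixels)  # ~2.78x the pixel load of 1080p
```

So the GPUs are pushing nearly three times the pixels of the panel they're hooked to, which is why the 60 fps cap matters so much for heat.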


----------



## rdr09

Quote:


> Originally Posted by *12Cores*
> 
> This. If not locked at 60 there is no way I can keep these things cool without water. The cards have exceeded my expectations; I was expecting a minor upgrade over the 7970's, but they are running most games close to max settings at 3200x1800. That is 5.7 million pixels, folks, not bad at all.


Since you know how to handle crossfire . . . go get a 4K monitor. They are getting cheaper. If you can find one with FreeSync, that's what I recommend.


----------



## diggiddi

Quote:


> Originally Posted by *rdr09*
> 
> Since you know how to handle crossfire . . . go get a 4K monitor. They are getting cheaper. If you can find one with FreeSync, that's what I recommend.


What is best 40" 4k monitor/tv ATM??


----------



## rdr09

Quote:


> Originally Posted by *diggiddi*
> 
> What is best 40" 4k monitor/tv ATM??


AFAIK, you're gonna need an adapter to use a 4K TV with these cards to get 60Hz. As for the monitor size, that depends how far you sit from it. If for example you pick a 40 inch, then you might need to move farther away. I sit quite close to my 28 inch monitor, about 2 ft.


----------



## diggiddi

Quote:


> Originally Posted by *rdr09*
> 
> AFAIK, you're gonna need an adapter to use a 4K TV with these cards to get 60Hz. As for the monitor size, that depends how far you sit from it. *If for example you pick a 40 inch, then you might need to move farther away. I sit quite close to my 28 inch monitor, about 2 ft*.


I'm about the same distance, a lil over 2'. I want to be "in" the TV lol, it's more immersive that way, especially if it's curved


----------



## AliNT77

So I have a question... Is there any way to get 4K VSR with these cards? I have both Win10 and Win7 btw...


----------



## diggiddi

3200 by 1800 pixels is max AFAIK


----------



## PunkX 1

Quote:


> Originally Posted by *AliNT77*
> 
> So i have a question... Is there any way to get 4k VSR with these cards? I have both win10 and win7 btw...


Honestly this sucks, as GPUs less capable of 4K, like the R9 285, are allowed VSR up to 4K but the R9 390 isn't.


----------



## AliNT77

And what's funny is that there was a way to get 4K VSR working before the 15.3 driver...

So these cards can undoubtedly do 4K VSR, but for some reason it is locked... Is there any way to talk to the driver team?


----------



## Seahawkshunt

Better late than never. I have been lurking in this owners club and the AMD Vishera owners club for a long time.


----------



## dagget3450

Quote:


> Originally Posted by *PunkX 1*
> 
> Honestly this sucks, as GPUs less capable of 4K, like the R9 285, are allowed VSR up to 4K but the R9 390 isn't.


Quote:


> Originally Posted by *AliNT77*
> 
> And what's funny is that there was a way to get 4K VSR working before the 15.3 driver...
> 
> So these cards can undoubtedly do 4K VSR, but for some reason it is locked... Is there any way to talk to the driver team?


Yes, they are doing some silly things lately. The RX 480 gets WattMan; last I knew they haven't and probably won't allow WattMan for GPUs outside Polaris. 4K VSR being limited to certain models... They are acting like you have to upgrade to use these features, which is not very nice imo.


----------



## rdr09

Quote:


> Originally Posted by *diggiddi*
> 
> I'm about the same distance, a lil over 2'. I want to be "in" the TV lol, it's more immersive that way, especially if it's curved


A 4K curved TV would be awesome. Might back off about 4 ft, and it might be a good idea to go bigger in size. I have a 40 inch hooked up to a PS and we sit about 5 ft away. Great for 4-player BLOPS.
Quote:


> Originally Posted by *PunkX 1*
> 
> Honestly this sucks, as GPUs less capable of 4K, like the R9 285, are allowed VSR up to 4K but the R9 390 isn't.


12Cores at post # 10046 is using VSR. Not sure what driver he is using. My 7950 hooked to a 1080p screen can do 1440 VSR with 16.4. Let me check again now that it is on 16.7.2.

EDIT: Yeah, my 7950 has VSR on and can be set to 1440 max (from 1080) using 16.7.2.

Hooked a 1080P screen to my 290s and VSR can be set up to 3200 using 16.7.2.

Wut is up with the 390s?


----------



## th3illusiveman

Quote:


> Originally Posted by *battleaxe*
> 
> So far, I gotta say I'm feeling really good about my 390x purchases from a long time ago. Nothing really that great on the market yet.


GCN just keeps on ticking


----------



## snurds

Quote:


> Originally Posted by *flopper*
> 
> he plays at 3200x1800..... not CPU-bound at that resolution, nah...


Quote:


> Originally Posted by *mus1mus*
> 
> As you go higher in resolution, it becomes more GPU-Bound.
> 
> Also, LOCKED to 60FPS.


Yes, I missed that it's locked. So he's in effect undervolting and power throttling, I guess.


----------



## bluej511

Has anyone tried the 16.7.3 and ran into issues? Seems like SOME people might be having issues with the driver but only on the rx 480, unless gamernexus is being very anti AMD again and posting garbage lol.


----------



## diggiddi

Its working fine for me 290x. I used the Crimson updater then deleted old driver


----------



## 12Cores

Quote:


> Originally Posted by *snurds*
> 
> Yes, I missed that it's locked. So he's in effect undervolting and power throttling, I guess.




Correct. I am running VSR 3200x1800 on a 1080p TV, locked at 60 fps to keep temps down. I usually run on water but will not with these two cards; I may change my mind later. VSR is a good way to simulate a higher resolution for a very low price; pretty sure a 32 inch 3200x1800 native monitor would cost more than $600.


----------



## rdr09

Quote:


> Originally Posted by *12Cores*
> 
> Correct, I am running VSR 3200x1800 on a 1080p TV, locked at 60 fps to keep temps down. I usually run on water but will not with these two cards; I may change my mind later. VSR is a good way to simulate a higher resolution for a very low price; pretty sure a 32-inch 3200x1800 native monitor would cost more than $600.


With Freesync, maybe? I got my Acer 28-inch 4K for $355 almost two years ago, a few days before Cyber Monday, as I recall.


----------



## Mister300

Dear All:
I just mounted a Kraken G10 with an H90 cooler on my XFX 390X. I had to file down the VRM heatsink and remount it to the PCB; no problems.

My max VRM temps are about 81C. I cannot mount another, larger fan to the bracket.

Which offset in AB would shave this down a few degrees? Core voltage, power limit, or an aux voltage decrease?

Any ideas? I am at 60 fps at 4K.


----------



## jdorje

81C on VRMs is into the range where it hurts stability, but far from the range where it's problematic. Still, it's a good argument against the AIO, I guess. My VRMs approach 100C at stock voltage and the mid 80s at my current 1131mV with the stock XFX cooler.

Lowering wattage, aka voltage, is the way to reduce it.


----------



## Mister300

I have a similar case. Do you use high-static-pressure fans or high-airflow fans? I want to replace the front intake fans on my NZXT 440.


----------



## jdorje

You don't want high-static-pressure fans for case fans. Moderate static pressure is okay.

I've got some phanteks XPs (pwm case fans) coming tomorrow...


----------



## Mister300

Yeah, I ordered the Corsair mag-levs and a Noctua for the H90 rad.


----------



## Mirko1988

Since February I've had my XFX 390 Black Edition and I'm in love with it. After many Nvidia cards (Kepler and Maxwell) I got an AMD card, and I'm pretty sure I will stay on team red for a long time.
The 390 is an amazing card (if you don't mind the power consumption); it's really future-proof and has a lot of raw power.

Only one problem: on Win 10 x64 I get low framerates and low performance in both Wolfenstein: The New Order and Wolfenstein: The Old Blood. On Win 7 x64 they run well. Any help for me?

Thanks to everyone!


----------



## doyll

Quote:


> Originally Posted by *jdorje*
> 
> You don't want high-static-pressure fans for case fans. Moderate static pressure is okay.
> 
> I've got some phanteks XPs (pwm case fans) coming tomorrow...


I guess it depends on the definition of 'moderate' and 'high'. To me, 'low' pressure fans are worthless; 'moderate' pressure is any fan good for coolers, cases and thin (normal) radiators; and 'high' pressure are fans like 32-38mm thick Nidec Servo and Delta high-speed fans.

You may know this, but others might not.
We need to keep in mind how restrictive grills and filters actually are. For example, a honeycomb back grill with 5mm holes is about as low-restriction as any grill, and it still blocks 20% of the airflow area. Round-hole mesh blocks 50-70% of its area. The best grill is a wire-ring design.

We also need to keep in mind how much lower the pressure rating is at lower fan speeds .. and that the pressure rating is the point at which the fan stops flowing air. A fan with a pressure rating of 1.40 mmH2O at 1250rpm drops to 0.69 mmH2O at 850rpm and 0.19 mmH2O at 450rpm. The same fan moves 79.4 cfm at 1250rpm in free airflow (no resistance at all), drops to 49.7 cfm at 850rpm and is 27.3 cfm at 450rpm. Don't forget this is with no resistance at all!
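Those rated figures line up roughly with the classic fan affinity laws (free airflow scales linearly with rpm, max static pressure with rpm squared). A quick sketch of that scaling — the function names are mine, not from any fan datasheet:

```python
def scaled_airflow(cfm_rated: float, rpm_rated: float, rpm: float) -> float:
    """Affinity law: free airflow scales linearly with fan speed."""
    return cfm_rated * (rpm / rpm_rated)

def scaled_pressure(mmh2o_rated: float, rpm_rated: float, rpm: float) -> float:
    """Affinity law: max static pressure scales with the square of fan speed."""
    return mmh2o_rated * (rpm / rpm_rated) ** 2

# Using the quoted ratings (1.40 mmH2O and 79.4 cfm at 1250 rpm):
print(round(scaled_pressure(1.40, 1250, 850), 2))  # prints 0.65 (quoted: 0.69)
print(round(scaled_airflow(79.4, 1250, 850), 1))   # prints 54.0 (quoted: 49.7)
```

The small gaps against the quoted 850 rpm numbers are expected; real fans only approximate the affinity laws, but the "pressure falls off much faster than airflow" point holds either way.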


----------



## THUMPer1

Quote:


> Originally Posted by *doyll*
> 
> You may know this, but others might not.
> We need to keep in mind how restrictive grills and filters actually are. For example a honeycomb back grill with 5mm holes is about as low restriction as any grill, and it blocks 20% of airflow area. Round hole mesh blocks 50-70% of it's area. Best grill is wire ring design.


I'd like to point out this is why I stay away from cases built from steel: Fractal, NZXT, Corsair (excluding the Air 540), Cooler Master, Phanteks. All the popular cases. I stick to Lian Li, which have open fan holes with zero restriction.


----------



## jdorje

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150727

To buy, or not to buy?

Would be crossfired with a 390 (the same XFX 8256).

It's an all-time low for the 390X, and probably better than any retail price we've seen on a 390. But I'm waiting for fall and Vega...


----------



## DarthBaggins

There's an open-box MSI 390X here I'm thinking of snagging for $250, and some 390s for $223.


----------



## Carniflex

I took a look at the first post in this thread and I have to point out there IS actually a block for the Gigabyte G1 Gaming 390/390X.

It's not a true full cover but a kind of hybrid solution: a very large passive aluminium heatsink for the VRMs plus a core block. Mine runs approx +20C core temperature over water temperature and approx +25C on the VRMs over water (and water is approx 20C delta over air with my loop), with the card at 1100 core / 1565 mem / +20% power after a few hours of gaming.


----------



## TainePC

I'm not sure that block was on the compatibility checker when the cards launched. When they launched, it was pretty much only reference or near-reference PCBs that even got marked "compatible", and it was 4 months until the MSI block was released.


----------



## mus1mus

Hoping Vega can at least give the Titan XP a run for its money.

Yeah people, 2X the 390 Performance!

Exciting year-ender!


----------



## sagrzmnky

I am new here and I just bought the Asus R9 390. My previous card was the MSI R7970. Can I get some information about installing the 390? Let me tell you what I did. I ran DDU to clean the system. Next I removed the R7970 and installed the 390. I plugged in both power plugs and plugged my monitor into the HDMI port. I started the system up, and the fans spun up on the 390, as did the white LED lights by the power plugs and the red logo. The monitor would not come on, though. I moved the 390 to another PCIe slot and still got the same results. I installed my old card and everything works normally, except in a lower resolution mode. My power supply is 1000 watts, so I would think it would have enough power. Is this card DOA or am I doing something wrong?


----------



## diggiddi

Try using another cable to connect the monitor to the card (DVI/DP). Also, have you checked the UEFI option in the BIOS?


----------



## sagrzmnky

I plugged into the HDMI port, used an active adapter into a DisplayPort, and plugged a different monitor into the DVI port. Still no luck. I have not messed with the BIOS. I assumed that the card should at least turn a monitor on, since my other card does.


----------



## sagrzmnky

Quote:


> Originally Posted by *diggiddi*
> 
> Try using other cable to connect monitor to card dvi/DP also have you checked uefi option in bios?


I am not sure where that is in the BIOS. I will have to check. I ordered an EVGA 1300-watt power supply that should be here tomorrow. Also, I only have until 8/17 to send this card back if it is DOA, so I hope I figure it out soon.


----------



## bluej511

The 7970 was also a UEFI card, so that shouldn't be an issue, I don't think. You didn't need to use DDU between a 7970 and an R9 390; they use the same drivers. I went from a 7850 to an R9 390 just by swapping cards lol.

Install whichever AMD driver you want on the 7970; if you use the AMD site, grab the one that says R9 200/300. Then restart so it finishes the install, then shut down and swap cards. Windows should automatically install a crap driver anyway, so if you got nothing, it's either a cable/adapter issue or a DOA card.


----------



## jdorje

Is there some reason to think Vega will give twice hawaii's performance?


----------



## mus1mus

Quote:


> Originally Posted by *jdorje*
> 
> Is there some reason to think Vega will give twice hawaii's performance?


Well, it has to keep up with the new TitanXP for one.

http://hwbot.org/hardware/videocard/titan_x_pascal/

http://www.3dmark.com/3dm/13843815?


----------



## Mirko1988

Usually, if you have issues after installing a new GPU, a fresh Windows installation is recommended.


----------



## sagrzmnky

Quote:


> Originally Posted by *bluej511*
> 
> 7970 was also a uefi card so shouldnt be an issue i dont think. You didnt need to use DDU between a 7970 to r9 390 they use the same drivers, i went from a 7850 to r9 390 just swapping cards lol.
> 
> Install whichever AMD driver you want on the 7970, if you want on the amd site use the one that says r9 200/300. Then restart it so it finishes the install then shut down and swap cards. Windows should automatically install a crap driver anyways so if you got nothing its either a cable/adapter issue or DOA card.


I suspected as much. Thank you.
Just an update: I know it is not the HDMI cable because I use it with the other card and it works perfectly. I ordered a Sapphire R9 390X, which is more expensive than the ASUS. The ASUS is on its way back to Amazon.


----------



## monster4bob

Are these okay temps to be getting? For instance, when I play Last Light I get 94 degrees Celsius. Or even BF4, for example. Nothing I do keeps the card cool, except underclocking it.


----------



## tbob22

Haven't run 3dmark in a while.
http://www.3dmark.com/3dm/13903676

Temps are still great on this card: max 73C core, VRM1 73C, VRM2 60C, max fan of course. Too bad it doesn't clock very well.

I'll have to give it another run with the CPU at 4.8-5ghz.


----------



## xanax40

Hello, FYI:

I got a Gigabyte R9 390 8GB G1, changed the loud, crappy default cooler for an Arctic Accelero Hybrid III-140, and the temps are 31°C idle and 52°C gaming, way better than the 92°C I was sometimes seeing.


----------



## sagrzmnky

Ok, I am not sure what I am doing wrong here. I have my MSI R7970 Lightning installed right now and it is working perfectly as I am typing. I returned the ASUS R9 390 as defective. I just received my Sapphire R9 390X. I installed it in place of the R7970 and I am having the same issues. The fans come on, but the monitor will not; it shows no input. I even grabbed my little 15" DVI monitor and plugged it in. I bought a new power supply a couple of days ago thinking that could be the issue; I now have the EVGA SuperNOVA 1300 G2, so I know it has plenty of power. What is going on here? Could it be a motherboard issue? My motherboard is an MSI 990FXA-GD65. It has two PCI Express 2.0 slots and I have 32GB of RAM. I am running Windows 7 64-bit for the OS. Do I need to send this card back and just forget about the upgrade?


----------



## sagrzmnky

Ok, I knew nothing about the Legacy/UEFI switch. There is no information about it anywhere. Now everything is working.


----------



## Carniflex

Looking at the table in the first post, I get the impression that 1100 MHz on the core is sort of a "normal" overclock at stock volts?

While the card is "stable" at 1125 MHz as well, in the sense that nothing crashes under load, it throws about 5000 errors a minute at me in the OCCT GPU stress test that checks memory integrity. Even at 1100 I normally get a few hundred errors over a few-minute test, but the card is stable when gaming and I have not been able to see any visual artifacts. The issue for me seems to be temperature: up to about 55-60C there are zero errors; past that, errors start appearing a hundred or so at a time at random intervals (tens of seconds between bursts); and once I hit 60-65C, errors start appearing in bursts of approximately a thousand every few seconds. The card is a Gigabyte 390X G1 Gaming with an Alphacool hybrid block; current clocks are 1100 core / 1565 memory (for 400 GB/s memory bandwidth) / +20% power allowance. Temperatures are up to 70C after a few hours of gaming when ambient creeps to 26-27C in the room.
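For anyone wondering where the "400 GB/s at 1565 memory" figure comes from: Hawaii/Grenada has a 512-bit memory bus, and GDDR5 makes 4 transfers per pin per clock as reported by overclocking tools, so peak bandwidth falls out of a one-liner (a sketch of the arithmetic, not output from any tool):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """Peak memory bandwidth in GB/s for a GDDR5 card.

    mem_clock_mhz is the clock as shown by tools like Afterburner;
    GDDR5 transfers 4 bits per pin per such clock.
    """
    effective_mts = mem_clock_mhz * 4                 # MT/s per pin
    return effective_mts * bus_width_bits / 8 / 1000  # bits -> bytes, MB -> GB

print(round(gddr5_bandwidth_gbs(1500)))  # prints 384 -> stock 390/390X
print(round(gddr5_bandwidth_gbs(1565)))  # prints 401 -> the overclock above
```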


----------



## kizwan

Quote:


> Originally Posted by *sagrzmnky*
> 
> Ok, I knew nothing about the Legacy/uefi button. There is no information about it anywhere. Now everything is working.


UEFI has caused grief & sorrow to a lot of people. At least you got rid of that ASUS, because it has an inefficient cooler. Some motherboards don't really care whether the GPU has a legacy or legacy/UEFI BIOS, like my motherboard for example. I can always boot with Secure Boot ON regardless.


----------



## sagrzmnky

This Sapphire 390X is more expensive, but I understand that it is much better too. I just hope others read this and do not have to go through the frustration that I did.


----------



## bluej511

Quote:


> Originally Posted by *sagrzmnky*
> 
> This Sapphire 390X is more expensive, but I understand that it is much better too. I just hope others read this and do not have to go through the frustration that I did.


It's on the site, but it seems to be a PITA; seems like on means off and off means on lol. I set mine with the light OFF; according to Sapphire that means the BIOS is set to UEFI/BIOS. Others have said light off is UEFI only lol. For me, with the light off and my UEFI install, it works fine, so I'm not messing with it.


----------



## sagrzmnky

Quote:


> Originally Posted by *bluej511*
> 
> Its on the site but it seems to be a pita, seems like on means off and off means on lol. I set mine to both wit the light OFF according to sapphire that means the bios is set to uefi/bios. Otheres have said light off is uefi only lol. FOr me with the light off and my uefi install it works fine so im not messing with it.


When the light was off, it would not turn the monitors on and the MB would not beep. Once I pushed it in and the light came on, I almost pissed my pants when the monitors lit up....


----------



## bluej511

Quote:


> Originally Posted by *sagrzmnky*
> 
> When the light was off it would not turn the monitors on and the MB would not beep. Once I pushed it in and the light came on, I almost pissed my pants when the monitors lit up....


Yea, see, for me it was the opposite: light on, I had nothing but trouble, it would barely boot haha. Light off, it works; I have a UEFI BIOS and Windows installed UEFI. I get the MSI logo as my boot screen as well, that's how I know for sure haha.


----------



## GorillaSceptre

I saw a few posts about high temps..

If you have an MSI 390X, try undervolting it. Mine's rock solid @ 1080/1500 with -80mV on the core, and it makes a _*huge*_ difference if you have high ambients like I do in summer (sometimes up to ~40C!).

I shaved off like 100W of power and dropped my temps from 85C @ 70% fan speed to 74C @ 44-55% fans. That's in Rise of the Tomb Raider, which hammers my card far more temp-wise than any other game I've played. In TW3 and others it hits 72C max with fans around 44-50%.

Can't speak for the other vendors, but MSI seem to have given their card far more voltage than most chips need. If that still doesn't help, then try changing the thermal paste, as that seems to be another thing MSI can't get right..
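As a rough sanity check on why undervolting helps so much: dynamic switching power scales roughly with voltage squared at a fixed clock, so an -80mV drop trims core power by a double-digit percentage on its own. A sketch of that estimate — the 1.200V stock voltage and 250W core power below are illustrative assumptions, not measured values, and the rest of a ~100W wall-draw saving would come from reduced leakage at lower temps plus VRM and fan losses:

```python
def undervolt_power_w(p_stock_w: float, v_stock: float, v_new: float) -> float:
    """Estimate dynamic core power after an undervolt at unchanged clocks.

    Uses the P ~ f * V^2 rule of thumb; frequency is held constant,
    so only the voltage ratio squared matters.
    """
    return p_stock_w * (v_new / v_stock) ** 2

# -80 mV from an assumed 1.200 V stock on an assumed 250 W core:
print(round(undervolt_power_w(250, 1.200, 1.120)))  # prints 218, i.e. ~32 W saved
```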


----------



## 12Cores

Quote:


> Originally Posted by *GorillaSceptre*
> 
> I saw a few posts about high temps..
> 
> If you have a MSI 390X try under-volting it, mines rock solid @ 1080/1500 with -80mV on the core, it makes a _*huge*_ difference if you have high-ambients like i do in summer (sometimes up to ~40C!).
> 
> I shaved off like 100W of power and dropped my temps down from 85C @ 70% fan speed, to 74C @ 44-55% fans. That's in Rise of the Tomb Raider, which hammers my card far more than any other game I've played temp-wise.. In TW3 and others it hits 72C Max with fans on around 44-50%.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't speak for the other vendors, but MSI seem to have given their card far more voltage than most chips need. If that still doesn't help then try changing the thermal paste as that seems to be another thing MSI can't get right..


I am amazed that you can hold those clocks with such a high undervolt; I can do 1050/[email protected] using Afterburner. I am running 1000/1250 with a -25mV undervolt 24/7 right now; temps and power draw are way down, especially with some kind of frame cap + Vsync combo.

Update on temps ([email protected]/Vsync enabled) -

I now have two Thermaltake 120mm radiator fans on the side of the cards which are virtually silent. They are giving me about 5 degrees, and most importantly, temps rise much more slowly on my top card and will often level off in the low 70s. I am now able to run BF4 uncapped, which is impressive, as fps are well over 120 at 3200x1800 on ultra with no AA. My entire rig only pulls about 575 watts from the wall in the most strenuous games (Witcher 3/Crysis 3); in most games I am under 500 watts, which is amazing for so much performance.

Loving these 390X's, just curious to see if a single Vega card will match these monsters.


----------



## GorillaSceptre

Quote:


> Originally Posted by *12Cores*
> 
> I am amazed that you can hold those clocks with such a high undervolt, I can do 1050/[email protected] using afterburner. I am running 1000/1250 with a -25mv undevolt 24/7 right now, temps and power draw are way down especially with some kind of frame cap vsync combo.
> 
> Update on temps([email protected]/Vysnc Enabled) -
> 
> I now have two Thermaltake 120mm radiator fans on the side of the cards which are virtually silent, they are giving me about 5 degrees and most importantly temps rise much slower on my top card and will often level off in the low 70's. I am now able to to run BF4 uncapped, which is impressive as fps are well over 120fps at 3200x1800 on ultra with no AA. My entire rig only pulls about 575 watts from the wall in most strenuous games(Witcher 3/Crysis 3), most games I am under 500 watts which is amazing for some much performance.
> 
> Loving these 390x's, just curious to see if a single Vega card will match these monsters.


Wish I had better airflow.. I have a stock 650D with an H100 CLC, so not the greatest in that respect.

Hmm, I haven't done much research into undervolting results, so I'm not sure how my chip stacks up. Are you running stuff like Furmark to test stability? If so, then that's maybe why mine can undervolt a lot further, and maybe I should retract my "rock solid" statement. I've only tested stability in games: I played RotTR for about 18 hours, put around 20 into Evolve F2P and about 10 into TW3 so far, plus a few hours in the usual stuff like Valley and Heaven; none crashed a single time or had artifacts at all.

The only game I had issues in was a certain area in LotR: War in the North, so I went down to -75 and -69, but it still crashed; I tried at stock and yet again it crashed, so I think that's down to drivers.. It's a really old title, and if RotTR doesn't break it, I don't see how that title has a chance.

But other than that it's been great; the decrease in temps/noise has been fantastic.

You're also running CF, so I assume that would find instability far easier too. As for Vega, well... it bloody better.


----------



## Thingamajig

Hi guys.

Got an ASUS Strix R9 390 here, sitting under water on a custom loop.

Downloaded Sapphire TRIXX for the additional voltage increase (+200mV), and at full whack I get 1200MHz core stable.

I don't seem to have a memory voltage slider anywhere, though, which is a shame, but I've tried 1600MHz memory and so far no problem. How high could I increase this on stock voltages?

Temps never exceed 65 degrees on the core, and VRMs are stable around 85.

What I wanted to ask, though, is what's an acceptable voltage for 24/7 use? I want to avoid degrading the card in any way, even though temps are a non-issue.


----------



## DarthBaggins

For me, I still prefer MSI Afterburner for OC'ing on AMD & Nvidia cards. So far it works great compared to TRIXX.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Thingamajig*
> 
> Hi guys.
> 
> Got a ASUS Strix r9 390 here, sitting under water on a custom loop.
> 
> Downloaded Sapphire TRIXX for the additional voltage increase (200mv+) and at full wack i get 1200mhz core stable.
> 
> I don't seem to have a memory voltage slider anywhere tho which is a shame, but i've tried 1600mhz memory and so far no problem. How high could i increase this on stock voltages?
> 
> Temps never exceed 65 degree's on core, and VRM's are stable around 85.
> 
> What i wanted to ask though is whats an acceptable voltage for 24/7 use? I want to avoid degrading the card in any way, even though temps are a non-issue.


Like Darth i also prefer AB for overclocking..

Temps aren't a problem so I'd say whatever voltage AB allows is "safe". Not sure about other software though.


----------



## bluej511

Quote:


> Originally Posted by *Thingamajig*
> 
> Hi guys.
> 
> Got a ASUS Strix r9 390 here, sitting under water on a custom loop.
> 
> Downloaded Sapphire TRIXX for the additional voltage increase (200mv+) and at full wack i get 1200mhz core stable.
> 
> I don't seem to have a memory voltage slider anywhere tho which is a shame, but i've tried 1600mhz memory and so far no problem. How high could i increase this on stock voltages?
> 
> Temps never exceed 65 degree's on core, and VRM's are stable around 85.
> 
> What i wanted to ask though is whats an acceptable voltage for 24/7 use? I want to avoid degrading the card in any way, even though temps are a non-issue.


85°C is quite high for VRMs; mine, passively cooled with a single 120mm fan blowing on it, stays below 60°C. 200mV is a lot for air. Core temps don't really matter until you get into the 80-90s; VRM is far more important, as that's what regulates voltage for the whole thing.


----------



## monster4bob

Okay, so I did some stress tests under load. For games like BF4, Last Light and Witcher my card gets to 94+ degrees, and that is with a 100% fan profile, 2 intake fans and another exhaust. Any thoughts?


----------



## Krzych04650

Quote:


> Originally Posted by *monster4bob*
> 
> Okay so I did some stress tests under load. For games like Bf4, last light, witcher my card gets to 94+ degrees that is with 100% fan profile and 2 intake fans and another exhaust. Any thoughts?


You mean core temperatures at stock clock and voltage? What model do you have?


----------



## Thingamajig

Quote:


> Originally Posted by *bluej511*
> 
> 85°C is quite high for VRM, me passively cooled with a single 120mm fan blowing on it my VRMs stay below 60°C. 200mv is a lot for air, core temps dont really matter until you get into the 80-90s. VRM is far more important as thats what voltage regulates the whole thing.


This temp is reached in Kombustor. The card sits under an Alphacool water block (with full-cover heatsink). Under usual gaming use, the VRMs never touch that.

Dunno why, but Kombustor seems to properly stress VRM temps.


----------



## ziggystardust

Can anyone with a Freesync monitor check if everything works with the new 16.8.1 driver?

Most of the users on Reddit, including myself, are having problems even on the desktop. The refresh rate is all over the place, even dropping to the 40s-50s on the desktop, making even simple web browsing laggy as hell.

Freesync was always hit or miss, working in one game while not working in another. But now it seems they have managed to break it completely.


----------



## kizwan

Quote:


> Originally Posted by *Thingamajig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluej511*
> 
> 85°C is quite high for VRM, me passively cooled with a single 120mm fan blowing on it my VRMs stay below 60°C. 200mv is a lot for air, core temps dont really matter until you get into the 80-90s. VRM is far more important as thats what voltage regulates the whole thing.
> 
> 
> 
> Dunno why but Kumbustor seems to properly stress VRM temps.

Yeah, it is a complete mystery.








Quote:


> Originally Posted by *ziggystardust*
> 
> Anyone with a Freesync monitor can check if everything works with the new driver 16.8.1?
> 
> Most of the users on Reddit, including myself, are having problems even on desktop. Refresh rate is all over the place, even dropping down to 40s-50s on desktop, making even a simple web browsing laggy as hell.
> 
> Freesync was always a hit or miss, working in one game while not working for other. But now, it seems they managed to break it completely.


I don't see any reason to use drivers 16.7.3 or above. Why not use an older driver, pre-16.7.3?


----------



## doyll

Quote:


> Originally Posted by *monster4bob*
> 
> Okay so I did some stress tests under load. For games like Bf4, last light, witcher my card gets to 94+ degrees that is with 100% fan profile and 2 intake fans and another exhaust. Any thoughts?


What is your system?
What are your CPU and GPU intake air temps?


----------



## bluej511

Quote:


> Originally Posted by *Thingamajig*
> 
> This temp is reached on Kombustor. The card sits under an Alphacool water block (With full cover heatsink). Under usual gaming use those VRM's never touch that.
> 
> Dunno why but Kumbustor seems to properly stress VRM temps.


'Cause it probably overvolts haha. I'm under the same block and my VRM1 will never reach 60°C in gaming, and VRM2 stays under 70°C. I haven't tested it in a while though; ambient is around 30°C lately, so I expect that to change a bit. I also have a 120mm fan literally right over the fins too lol.


----------



## jdorje

The new drivers (the beta is what I have) fail with Freesync. Sometimes it'll get stuck flickering and I have to turn Freesync off and back on again.

How do you see frame rate on the desktop?

With my summer overclock, core temps barely break 70 now. VRM temps are 80-85, so those remain a decent bit above core.


----------



## rdr09

Quote:


> Originally Posted by *jdorje*
> 
> The new drivers (beta is what i have) fail with freesync. Sometimes it'll get stuck flickering and i have to turn freesync off and back on again.
> 
> How do you see frame rate on the desktop?
> 
> With my summer overclock core temps barely break 70 now. Vrm temps 80-85 so those remain a decent bit above core.


Are you playing F1 in crossfire? I think the driver is primarily for the 4XX series.


----------



## bluej511

Quote:


> Originally Posted by *jdorje*
> 
> The new drivers (beta is what i have) fail with freesync. Sometimes it'll get stuck flickering and i have to turn freesync off and back on again.
> 
> How do you see frame rate on the desktop?
> 
> With my summer overclock core temps barely break 70 now. Vrm temps 80-85 so those remain a decent bit above core.


The Freesync thing happened to me once and it's been fine since. It's kinda weird: when playing Rainbow Six Siege sometimes I'll see a tear here and there, but playing Rocket League or RotTR or Car Mechanic and whatnot I'll see NONE. I even tested it on Unity, and below 40 it tears, above, NADA.


----------



## Thingamajig

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, it is a complete mystery.


Sarcasm, or do you know something I don't?

Quote:


> Originally Posted by *bluej511*
> 
> Cuz it probably over volts haha. Im under the same block and my vrm1 will never reach 60°C in gaming and vrm 2 stays under 70°C. I havent tested it in a while though, ambient is around 30°C lately so i expect that to change a bit, i also have a 120mm fan literally right over the fins too lol.


That 120mm fan is what does it. I don't have that, especially as I don't think it's really necessary at the current temperature values I get.

What sort of core temps are you getting? Mine never peaks past 65 here. The loop has 2x 240mm radiators, though.

VRMs are rated at around 125C, so I think 85 is plenty acceptable under heavy stress conditions, which regular gaming would never reach, so I'm not too concerned.


----------



## bluej511

Quote:


> Originally Posted by *Thingamajig*
> 
> Sarcasm or do you know something I don't?
> that 120mm fan is what does it. I don't have that, especially as I don't think it's really necessary under current temperature values i get.
> 
> What sort of core temps you getting? Never peak 65 here. Loop has 2x240mm radiators tho.
> 
> VRM's are rated at around 125c so i think 85 is plenty acceptable under heavy stress conditions. Of which regular gaming would never get it to so i'm not too concerned.


My core in 31°C ambient has never reached 50°C; it only did when I had my pump speed super slow haha. I have a 240 and a 360, though, so that helps quite a bit. VRMs are rated quite high, but the higher the temp, the less stability we end up with.

I tested it this morning playing Rainbow Six, which pins everything, and VRM1 was 61°C and VRM2 was 65°C max. This is also with my stupid high ambient and no AC. When my ambient is 21°C you can take 10°C off the VRM temps, and my core usually stays around 41°C.


----------



## jdorje

Quote:


> Originally Posted by *rdr09*
> 
> Are you playing F1 in crossfire? I think the driver is primarily for the 4XX series.


Definitely no crossfire, though I am running a second monitor on my igpu. I think I've seen the flickering even when the second monitor is disconnected.


----------



## iRUSH

I got tired of either waiting or being disappointed (thermally) with the RX 480. I picked up a new XFX 390 for $240. It's quiet and cool, in the mid-to-upper 60s. Overclocked to 1100+ MHz, it should match or come close to matching the 480.


----------



## jdorje

Doesn't the stock 390 beat the stock 480? But, you know, much worse thermally.


----------



## iRUSH

Quote:


> Originally Posted by *jdorje*
> 
> Doesn't the stock 390 beat the stock 480? But, you know, much worse thermally.


I knew the 390X did; I assumed the 390 non-X was behind. The temps on these 480s are pretty bad IMO. Not impressed considering the TDP improvement; I was expecting much better thermals, to be honest. By comparison, this XFX 390 hits around 65C, and it's not obnoxious while doing it.


----------



## Thingamajig

Quote:


> Originally Posted by *iRUSH*
> 
> I new the 390X did, I assumed the 390 non-x was behind. The temps on these 480's are pretty bad IMO. Not impressed considering the TDP improvement. I was expecting much better thermals to be honest. Compared the this XFX 390 hits around 65c and it's not obnoxious while doing it.


Isn't the 390 of a different... er... "tier" than the 480?

So a fairer comparison would be 490 vs 390.


----------



## Streetdragon

What do you think? http://www.3dmark.com/fs/9708825
Crossfire 390 Nitros at 1200/1600 with modded timings.


----------



## Vellinious

Quote:


> Originally Posted by *Streetdragon*
> 
> What du you think? http://www.3dmark.com/fs/9708825
> Crossfire 390 nitro 1200/1600 with modded timings.


Graphics score is right there with the 970s I had. Good run.


----------



## iRUSH

Quote:


> Originally Posted by *Thingamajig*
> 
> Isn't the 390 of a different... er... "tier" than the 480?
> 
> So a fairer comparison would be 490 vs 390.


They perform the same, so it's the same tier for me lol


----------



## jdorje

Quote:


> Originally Posted by *Thingamajig*
> 
> Isn't the 390 of a different... er... "tier" than the 480?
> 
> So a fairer comparison would be 490 vs 390.


No, the 390 is the same tier as the 480. Might be pretty comparable overall.

The 970 and 1060 and 390x and 980 are probably in the same tier, though it might depend on how big you want to make the tiers. Make them small enough and you have to pick an arbitrary cutoff somewhere.

I think it's 970<480<390<1060<390x<980. Hard to be sure though since there's always new drivers with new benchmarks with different results.


----------



## Carniflex

Quote:


> Originally Posted by *Thingamajig*
> 
> Isn't the 390 of a different... er... "tier" than the 480?
> 
> So a fairer comparison would be 490 vs 390.


Normally, a previous-generation top card is equivalent to the next gen one notch down, performance-wise (unless it's a rebrand).

I.e., 4870 ~ 5770; 5870 ~ 6870 (AMD changed branding, adding another tier); 6970 ~ 7870; and so on... so following the same logic, 390 ~ 480 performance-wise. But all the rebrands confuse things, of course, like the 290 being the same card as the 390 and so on.


----------



## bluej511

Quote:


> Originally Posted by *Carniflex*
> 
> Normally a previous-generation top card is equivalent to the next-gen card one notch down, performance-wise (unless it's a rebrand).
> 
> I.e., 4870 ~ 5770; 5870 ~ 6870 (AMD changed branding, adding another tier); 6970 ~ 7870; ... and so on ... so following the same logic, a 390 ~ 480 performance-wise. But all the rebrands confuse things, ofc, like the 290 being the same card as the 390 and so on.


From what I've seen so far it's mix and match; the r9 390 even performs better than the rx 480 in some games, although this could be due to driver immaturity. Games heavy on ngreedia tessellation seem to perform better on the rx 480, which leads me to believe AMD has worked on that. Otherwise it's pretty much a side grade, unless you water cool and are worried about power consumption lol.


----------



## TainePC

No way would a 1060 beat a 390. Sure, it might get close, but there is still a reasonably large gap between the 390 and the rx480.


----------



## Charcharo

This is how it is:
R9 290 < GTX 970 < 290X = R9 390 < RX 480 < 1060 < 390X < 980

The difference between the 290 and the 980 isn't big. In a sense they're all kinda close to one another, and it also depends on driver maturity (why I put the 480 over the 390) and the games tested, as well as resolutions.


----------



## Streetdragon

Quote:


> Originally Posted by *Charcharo*
> 
> This is how it is:
> R9 290 < GTX 970 < 290X = R9 390 < RX 480 < 1060 < 390X < 980
> 
> The difference between the 290 and the 980 isn't big. In a sense they're all kinda close to one another, and it also depends on driver maturity (why I put the 480 over the 390) and the games tested, as well as resolutions.


And an AMD Fury would go at the end, after the 980?


----------



## 12Cores

I now have the temps of these monsters under control; the highest temps with stock clocks and a -25mv undervolt are now in the mid 70's. The total system draw with 12 fans, 3 SSD's, a pump, 16gb of ram and the cards is around 600w when running games at 3200x1800 with Vsync on; most games not named Witcher 3 and Crysis 3 are around 500w.

I have to admit I was not expecting much from this crossfire setup, as most reviews harped on the insane power draw and heat from these things. I wish I would have gotten them sooner.

My AMD GPU History
4850 air
5750 air
5750/5770 air
5770/5770 water 1ghz
6870/6870 water 985mhz (cards voltage locked)
7970 water 1225/1750( best card I have ever owned, period)
7970/280x water 1185/1575


----------



## monster4bob

Quote:


> Originally Posted by *Krzych04650*
> 
> You mean core temperatures at stock clock and voltage? What model do you have?


Ya, everything at stock; I have the MSI model. For the life of me I can't figure out why I get such crazy temps. Only two things come to mind: I suck at fan placement and the airflow in my case sucks, or the thermal paste that I got is terrible?


----------



## Worldwin

Quote:


> Originally Posted by *monster4bob*
> 
> Ya, everything at stock; I have the MSI model. For the life of me I can't figure out why I get such crazy temps. Only two things come to mind: I suck at fan placement and the airflow in my case sucks, or the thermal paste that I got is terrible?


Reasons why your temps are high: high intake temp, high starting voltage, poor airflow, crappy thermal paste application. I know for sure voltage and TIM are gonna be unfavorable.


----------



## monster4bob

Quote:


> Originally Posted by *Worldwin*
> 
> Reasons why your temps are high: high intake temp, high starting voltage, poor airflow, crappy thermal paste application. I know for sure voltage and TIM are gonna be unfavorable.


So you're saying the thermal paste that came stock on my card probably sucks and I should change it? My card's voltage is at default; I haven't changed it. Or do you mean I should lower it? I can post pictures of my build if u want. Also, my card is still within warranty; if I change the thermal paste, would that not void it?


----------



## Dundundata

Quote:


> Originally Posted by *monster4bob*
> 
> So you're saying the thermal paste that came stock on my card probably sucks and I should change it? My card's voltage is at default; I haven't changed it. Or do you mean I should lower it? I can post pictures of my build if u want. Also, my card is still within warranty; if I change the thermal paste, would that not void it?


I have the MSI 390 and my temps are way lower: 75C tops, OC'd on air and under load. Stock would be in the 60s. So yeah, something is wrong. How is your case set up? My case is nothing special (a Corsair), but it is open inside (removed the drive bay), I've got cables out of the way, and I have 6 Noctua case fans running. I have a custom curve for the GPU fans but they never max out.

Assuming your case setup is good and ambient temps are OK, that points to a possible TIM issue. I've never looked at mine, but yeah, sometimes they put on way too much and it's pretty messy. I had an XFX card I repasted and dropped 10C.


----------



## bluej511

Quote:


> Originally Posted by *Dundundata*
> 
> I have the MSI 390 and my temps are way lower: 75C tops, OC'd on air and under load. Stock would be in the 60s. So yeah, something is wrong. How is your case set up? My case is nothing special (a Corsair), but it is open inside (removed the drive bay), I've got cables out of the way, and I have 6 Noctua case fans running. I have a custom curve for the GPU fans but they never max out.
> 
> Assuming your case setup is good and ambient temps are OK, that points to a possible TIM issue. I've never looked at mine, but yeah, sometimes they put on way too much and it's pretty messy. I had an XFX card I repasted and dropped 10C.


I didn't think of that; the Crimson fan curve might be horrid and letting the fans peak at maybe 38% or something. That's the only time my Sapphire saw 84°C, when I let Crimson control my fan curve.


----------



## Worldwin

Quote:


> Originally Posted by *monster4bob*
> 
> So you're saying the thermal paste that came stock on my card probably sucks and I should change it? My card's voltage is at default; I haven't changed it. Or do you mean I should lower it? I can post pictures of my build if u want. Also, my card is still within warranty; if I change the thermal paste, would that not void it?


If you search through this thread you will find people who have reported lower temperatures after changing the paste on MSI cards. Whether the warranty is voided depends on where you live; in NA, changing the TIM does NOT void your warranty with MSI.


----------



## DarthBaggins

I'm thinking of applying some of the Hydronaut I have since it's been a great paste for my 4790k


----------



## Streetdragon

Hi, I have a little new problem. I extended my water loop a bit. Everything was fine, no leaks or anything, and it dried overnight.
Now I started a game to check the new temps, and my memory got unstable. I had to go from 1600 to 1500 on the mem so the games and driver won't crash, and I still get a memory error from time to time.
A bit of distilled water got on the card, but like I said, it dried overnight.

I mean, I can still run 1200/1500 modded on both cards and have more power than I can handle on my monitor... but it is still... lower than before.


----------



## monster4bob

Quote:


> Originally Posted by *Dundundata*
> 
> I have the MSI 390 and my temps are way lower: 75C tops, OC'd on air and under load. Stock would be in the 60s. So yeah, something is wrong. How is your case set up? My case is nothing special (a Corsair), but it is open inside (removed the drive bay), I've got cables out of the way, and I have 6 Noctua case fans running. I have a custom curve for the GPU fans but they never max out.
> 
> Assuming your case setup is good and ambient temps are OK, that points to a possible TIM issue. I've never looked at mine, but yeah, sometimes they put on way too much and it's pretty messy. I had an XFX card I repasted and dropped 10C.


I don't have the best case in the world, a Corsair R100, but I do have pretty good cable management. But ya, maybe I have ****ty fan placement and poor airflow in my case. Either way I guess I'm going to send my card in. If I still get temps that high after that, then the problem was me to start with, I guess. Thank you for the feedback though.
Quote:


> Originally Posted by *Worldwin*
> 
> If you search through this thread you will find people who have reported lower temperatures after changing the paste on MSI cards. Whether the warranty is voided depends on where you live; in NA, changing the TIM does NOT void your warranty with MSI.


Ya, I called and checked, but it's okay, I'll just RMA it. The tech support guy said so too; he said if you're getting temps of 94+ within 5-10 min of playing games, something is off. The only games where I don't get that high of a temp are Borderlands 2 and Rocket League. Thanks for the feedback too.


----------



## Rexer

Quote:


> Originally Posted by *Worldwin*
> 
> If you search through this thread you will find people who have reported lower temperatures after changing the paste on MSI cards. Whether the warranty is voided depends on where you live; in NA, changing the TIM does NOT void your warranty with MSI.


Here's a loop for you. I decided to void the warranty on my Asus R9 390 Strix DC3. Took off the backplate and changed my technique for applying thermal paste to the gpu, and got lower temps. I usually blot-spot the mud on the chip, but this time I used a razor blade to spread the mud thinly and evenly over the entire chip face. (I had to redo it twice because I'm not nimble-fingered.) Got some great temps: 36c/39c at idle and 68c/78c tops under load on the clock (gpu 1153mhz, 67mv, 1620mhz mem). I used Arctic Silver 5. Stock mud is not O.K.


----------



## Chaoz

Quote:


> Originally Posted by *Rexer*
> 
> Here's a loop for you. I decided to void the warranty on my Asus R9 390 Strix DC3. Took off the backplate and changed my technique for applying thermal paste to the gpu, and got lower temps. I usually blot-spot the mud on the chip, but this time I used a razor blade to spread the mud thinly and evenly over the entire chip face. (I had to redo it twice because I'm not nimble-fingered.) Got some great temps: 36c/39c at idle and 68c/78c tops under load on the clock (gpu 1153mhz, 67mv, 1620mhz mem). I used Arctic Silver 5. Stock mud is not O.K.


Did the same thing a while ago on my Strix card, but used Thermal Grizzly Kryonaut, and also changed the thermal pads to Fujipoly 1.5mm. Temps at idle are 32°C, and under load in-game around 60°C max with games at high settings. VRM temps dropped drastically as well.


----------



## DarthBaggins

Thermal Grizzly is miracle TIM lol. But I still need to change out mine to see if it can be improved.


----------



## Rexer

Last winter I asked Jeff, the product representative for Asus, "Can I take the backplate off?" He said, "No," pretty blunt about it. Taking anything off voids the warranty. So I turned off my electric heater and let the Strix card warm the room; pretty remarkable heat. As summer came over the horizon and temps were heading into the 90s, I took off the backplate and changed the mud. The temps are so low, it's like night and day.
Quote:


> Originally Posted by *Chaoz*
> 
> Did the same thing a while ago on my Strix card, but used Thermal Grizzly Kryonaut, and also changed the thermal pads to Fujipoly 1.5mm. Temps at idle are 32°C, and under load in-game around 60°C max with games at high settings. VRM temps dropped drastically as well.


----------



## bluej511

Quote:


> Originally Posted by *Rexer*
> 
> Last winter I asked Jeff, the product representative for Asus, "Can I take the backplate off?" He said, "No," pretty blunt about it. Taking anything off voids the warranty. So I turned off my electric heater and let the Strix card warm the room; pretty remarkable heat. As summer came over the horizon and temps were heading into the 90s, I took off the backplate and changed the mud. The temps are so low, it's like night and day.


Asus blows at customer service; I've never heard of a GPU manufacturer denying warranty because someone replaced the TIM. It actually saves them money when we do it ourselves instead of sending it in for RMA and having them do it haha.


----------



## Chaoz

I didn't even take the backplate off, just the cooler.


----------



## gordesky1

Quote:


> Originally Posted by *bluej511*
> 
> Asus blows at customer service; I've never heard of a GPU manufacturer denying warranty because someone replaced the TIM. It actually saves them money when we do it ourselves instead of sending it in for RMA and having them do it haha.


MSI can be a bit tricky nowadays too, it seems. I asked Twitter support before I sent in my 290x Lightning for RMA whether I could repaste it, since I'd done it a few times and had to remove the screw sticker because I heard that was ok. They said nope, the warranty will be voided, sorry...

Sent it in anyways and everything was fine, and they sent me the 390 Gaming 8gb, which I wasn't that happy about... but that's another story.

A few weeks ago I messaged MSI on their support ticket page about repasting this 390, since it gets pretty hot, and they got back to me and said yes, as long as I don't damage it while doing so. So I screenshotted what they said as future proof lol.

I posted on Twitter next and told them their ticket support said it was fine, which I haven't heard back about lol... so it's best to avoid Twitter support...


----------



## bluej511

Quote:


> Originally Posted by *gordesky1*
> 
> MSI can be a bit tricky nowadays too, it seems. I asked Twitter support before I sent in my 290x Lightning for RMA whether I could repaste it, since I'd done it a few times and had to remove the screw sticker because I heard that was ok. They said nope, the warranty will be voided, sorry...
> 
> Sent it in anyways and everything was fine, and they sent me the 390 Gaming 8gb, which I wasn't that happy about... but that's another story.
> 
> A few weeks ago I messaged MSI on their support ticket page about repasting this 390, since it gets pretty hot, and they got back to me and said yes, as long as I don't damage it while doing so. So I screenshotted what they said as future proof lol.
> 
> I posted on Twitter next and told them their ticket support said it was fine, which I haven't heard back about lol... so it's best to avoid Twitter support...


It must just be a few manufacturers that have the stickers, cuz I took my Sapphire apart to put a waterblock on and I don't remember seeing a sticker at all. Maybe I forgot haha.


----------



## Carniflex

Quote:


> Originally Posted by *bluej511*
> 
> It must just be a few manufacturers that have the stickers, cuz I took my Sapphire apart to put a waterblock on and I don't remember seeing a sticker at all. Maybe I forgot haha.


There are a number of manufacturers which state specifically in their warranty conditions that replacing the stock cooler voids the warranty, if I remember correctly (Sapphire, Gigabyte, etc.). It has been a few years since I last looked. And there are a few manufacturers who specifically allow(ed) aftermarket coolers on their cards (like, for example, Club3D).

However, Gigabyte, for example, did not have that "warranty void" sticker on my 390X; Sapphire did not have it on the 7950 I had from them but did have it on the 5770, and XFX had it on the 6770 Eyefinity 5 card I still have. So even if they have it in their conditions, that does not mean they will enforce it all the time. But it would still be far better to buy a card from a company which allows repasting/aftermarket cooling without voiding the warranty than to play Russian roulette with the tech support of a company that has excluded that kind of stuff from its warranty.


----------



## jdorje

If you expect to ever rma, buy from xfx, sapphire, or msi. Probably in that order.


----------



## xanax40

Gigabyte R9 390 8GB G1 with a Artic Accelero Hybrid III-140 on a Thermaltake Core V1 case


----------



## battleaxe

Quote:


> Originally Posted by *jdorje*
> 
> If you expect to ever rma, buy from xfx, sapphire, or msi. Probably in that order.


Hear, hear...

I just RMA'd two 390x's with XFX not long ago. Coolers taken off of both and run in my water cooling loop, screw stickers missing as well...

And not a single question was asked. The new cards are running right now.

Same thing with Sapphire, not too long ago either. Not sure what was going on, but I had a 290x and two 390x's die on me within two months' time. All three are fine now and have been running great ever since. Must have been some bad silicon coming out of the growers or something.


----------



## Rexer

Quote:


> Originally Posted by *Chaoz*
> 
> I didn't even take the backplate off, just the cooler.


I'm not sure about putting the backplate back on. It was always hot to the touch on my Asus 390 Strix DC3, and after a long gaming session it was very slow to shed heat. It looks nice, though; I'm not going to knock that. I may take it to the garage and drill ventilation holes in it.


----------



## 12Cores

Quote:


> Originally Posted by *xanax40*
> 
> 
> 
> Gigabyte R9 390 8GB G1 with a Artic Accelero Hybrid III-140 on a Thermaltake Core V1 case


Impressive.


----------



## Rexer

Quote:


> Originally Posted by *bluej511*
> 
> Asus blows at customer service; I've never heard of a GPU manufacturer denying warranty because someone replaced the TIM. It actually saves them money when we do it ourselves instead of sending it in for RMA and having them do it haha.


Yeah, their service people can be sticklers. On several boards I worked on, I had to jump through a few hoops. It wasn't much of a coin toss once I decided I was going to screw around with it. But that aside, I can't complain much about how downright fast this card is. I have friends who have a 980 Ti and a 970 SC, and I beat them down in CoD AW clan games. The 390, I believe, is a hair trigger faster; it does not flinch for a moment. In knife melees, the 390 is ruthless. This is considering we have good ping between ourselves, and also considering I own a 970 SC too. Proof is in the performance, and this made me wait till summer to take it apart.
I've seen the spec comparisons and reviews and watched the YouTube videos on 390 vs 980 vs 970, and I'm dumbfounded. I wasn't able to beat everyone so consistently with the 970 SC.


----------



## DarthBaggins

I have to agree, my 390x smokes my Strix 970, even with the 970 clocked to 1535.


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> I have to agree, my 390x smokes my Strix 970, even with the 970 clocked to 1535.


I love fps games, and in some cases bigger maps are better. Make use of some of that 8gb of ram, lol. I'm addicted to speed and really prefer a natural picture, as opposed to the bright, punchy colors the 970 exhibits. The 390 really delivers on that. It really satiates me to think this is what I paid for. Yeah, sweetens the rim of the cup.


----------



## DarthBaggins

Still want to play with a pair of 480's too. Funny how I started gaming on AMD cards and dabbled with Nvidia, just to go back to AMD. Don't get me wrong, the 970 & 960 I have have been great cards overall, but when it comes down to performance, this 390x I won has been a workhorse.


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> Still want to play with a pair of 480's too. Funny how I started gaming on AMD cards and dabbled with Nvidia, just to go back to AMD. Don't get me wrong, the 970 & 960 I have have been great cards overall, but when it comes down to performance, this 390x I won has been a workhorse.


Two 480's looks like the way to go. I'm pretty much awestruck by Battlefield 1; the game looks massive. Any big card will have its work cut out. I wonder if the 256-bit bus is up to handling all the information passing through at one time.
About Nvidia's 970: I never believed Nvidia's word that the 256-bit bus was sufficient after it came out that the full width doesn't really apply to all of the memory. Even the usage of the 4gb of memory is wrong; it's really 3.5gb. If the 970 has to use the last 500mb of the 4gb, it has to run at a slower speed than the 3.5gb segment. What's up with that? If you've got a good product, why not tell everyone the truth? I believe it's a gimmick to sell us a new card every year: oh, it slows down with the bigger, newer game maps. A way to pinch off technology to us every year. Comparable to selling us a $250 card for $350, and a way of making us think 4gb is not fast enough for today's new games.
When everything comes down to the bottom line, it's all about money. They still print "256 bit memory bus" and "4gb of ram" on the box. With the sheer size of games like Witcher 3, I'm not going to pay $500 for any 256-bit memory bus. I want something you can cram big Mc Farlan's 300lb. mom in, you know? A bus, not a car. Lol. The way they're taking us, it's like buying a new card for every new game.
At least you know what you have with an AMD card.
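The 3.5gb spill-over penalty described above can be roughed out with a back-of-the-envelope model. The segment bandwidths below are the widely reported peak figures for the 970 (~196 GB/s for the fast 3.5 GB partition, ~28 GB/s for the 0.5 GB slow one), taken here as assumptions rather than measurements:

```python
# Sketch: average bandwidth when a workload streams across both segments
# of the GTX 970's split memory pool. Figures are reported peaks, not
# measured values; real access patterns will differ.

FAST_GB, FAST_BW = 3.5, 196.0  # size (GB) and bandwidth (GB/s) of fast segment
SLOW_GB, SLOW_BW = 0.5, 28.0   # size and bandwidth of the slow tail segment

def effective_bw(used_gb):
    """Average bandwidth (GB/s) when sweeping `used_gb` once across both segments."""
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, used_gb - FAST_GB)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return used_gb / seconds

print(round(effective_bw(3.5)))  # everything in the fast segment -> 196
print(round(effective_bw(4.0)))  # spilling into the slow 0.5 GB -> 112
```

Touching just the last half-gigabyte roughly halves the average bandwidth in this toy model, which is why the spill-over was so noticeable in practice.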


----------



## christoph

Quote:


> Originally Posted by *Rexer*
> 
> Two 480's looks like the way to go. I'm pretty much awestruck by Battlefield 1; the game looks massive. Any big card will have its work cut out. I wonder if the 256-bit bus is up to handling all the information passing through at one time.
> About Nvidia's 970: I never believed Nvidia's word that the 256-bit bus was sufficient after it came out that the full width doesn't really apply to all of the memory. Even the usage of the 4gb of memory is wrong; it's really 3.5gb. If the 970 has to use the last 500mb of the 4gb, it has to run at a slower speed than the 3.5gb segment. What's up with that? If you've got a good product, why not tell everyone the truth? I believe it's a gimmick to sell us a new card every year: oh, it slows down with the bigger, newer game maps. A way to pinch off technology to us every year. Comparable to selling us a $250 card for $350, and a way of making us think 4gb is not fast enough for today's new games.
> When everything comes down to the bottom line, it's all about money. They still print "256 bit memory bus" and "4gb of ram" on the box. With the sheer size of games like Witcher 3, I'm not going to pay $500 for any 256-bit memory bus. I want something you can cram big Mc Farlan's 300lb. mom in, you know? A bus, not a car. Lol. The way they're taking us, it's like buying a new card for every new game.
> At least you know what you have with an AMD card.


And do you want to add something to that?

Where I live, I could've bought 2 Sapphire 390's for the price of a single 980.


----------



## Rexer

Quote:


> Originally Posted by *christoph*
> 
> And do you want to add something to that?
> 
> Where I live, I could've bought 2 Sapphire 390's for the price of a single 980.


Wow. I stared at your words for 5 minutes. Better worded: I let it burn a hole in my head for 5 minutes. Lol. 2 Sapphire R9 390's for the price of a 980! I feel like I left my Rolex watch on a park bench in Central Park, but I don't have a Rolex, just a cheap Timex I bought for 9 bucks at Rite-Aid... and I've never been to Central Park. What can I say! Words escape me, lol. 2 Sapphire R9 390's. In my head I'm pulling one out of the box, tenderly pushing it into the slot, plugging into the display port and turning on the rig... load Afterburner and stroke the controls. First an easy 1125mhz with 50mv; she smiles and gives me a smooth picture. Crysis 3. Aaah! Then a twin shows up! Awww, drives me nuts!


----------



## christoph

Quote:


> Originally Posted by *Rexer*
> 
> Wow. I stared at your words for 5 minutes. Better worded: I let it burn a hole in my head for 5 minutes. Lol. 2 Sapphire R9 390's for the price of a 980! I feel like I left my Rolex watch on a park bench in Central Park, but I don't have a Rolex, just a cheap Timex I bought for 9 bucks at Rite-Aid... and I've never been to Central Park. What can I say! Words escape me, lol. 2 Sapphire R9 390's. In my head I'm pulling one out of the box, tenderly pushing it into the slot, plugging into the display port and turning on the rig... load Afterburner and stroke the controls. First an easy 1125mhz with 50mv; she smiles and gives me a smooth picture. Crysis 3. Aaah! Then a twin shows up! Awww, drives me nuts!


Well, the thing is that people everywhere got the idea from someone else that Nvidia is the best brand just because it's expensive. And why is AMD cheaper? Because "it's crap." That idea gets to the retailers here where I live, and they sell even more expensive than in the USA so they can make big money.


----------



## Rexer

Quote:


> Originally Posted by *christoph*
> 
> Well, the thing is that people everywhere got the idea from someone else that Nvidia is the best brand just because it's expensive. And why is AMD cheaper? Because "it's crap." That idea gets to the retailers here where I live, and they sell even more expensive than in the USA so they can make big money.


It was a pretty cheap trick to fool Nvidia GTX 970 users and, at the same time, try to push AMD between a rock and a hard place in the market. Advertising 1160+ clocks out of the box. A game using 4gb of memory at a constant rate is pretty uncommon, but when it does go over the 970's 3.5gb threshold, it drops to the slowest ram, the 500mb at the end of the chip layout. Phiff! A pretty rude trick. I'm glad Nvidia users were smart enough to sue Nvidia and win a class action lawsuit. False advertising.


----------



## bluej511

Quote:


> Originally Posted by *Rexer*
> 
> It was a pretty cheap trick to fool Nvidia GTX 970 users and, at the same time, try to push AMD between a rock and a hard place in the market. Advertising 1160+ clocks out of the box. A game using 4gb of memory at a constant rate is pretty uncommon, but when it does go over the 970's 3.5gb threshold, it drops to the slowest ram, the 500mb at the end of the chip layout. Phiff! A pretty rude trick. I'm glad Nvidia users were smart enough to sue Nvidia and win a class action lawsuit. False advertising.


There are plenty, PLENTY of games that reach that 3.5gb limit easily. Hell, even Rainbow Six Siege, which isn't too gpu heavy, will easily use 3.5gb maxed out.

But then again, an r9 390 can't use all 8gb anyways; it's closer to 5-6gb.


----------



## mus1mus

Even 3DMark for that matter.


----------



## gapottberg

Now that I feel I have a reasonable OC on my CPU, I've been tuning in here regularly to get advice and ideas for my 390X.

I must say I am impressed with the card's performance, though I bought mine pre-Polaris, so I got better value for the money, right?

I am, however, disappointed in the stock settings. I'm savvy enough to know to increase power limits and remove power-saving modes before using cards like this, but for those who are not, you will likely wonder what the hell is wrong with your card. They run hot and throttle like crazy until you unlock the potential AMD is stupidly putting behind closed doors, it seems.

After a slight underclock and undervolt, and a bit of tinkering with fan speeds, I am getting much better temps and quieter operation than any of the stock settings ever gave me. I can still play all my games at 1080p 60hz with zero issues, zero noise, and no more space heater next to me.

I will work on overclocking this card later this fall and see how high I can push it.

Thanks for the info everyone. Helpful as always.


----------



## Geoclock

Hi Gappotberg, can you send your specs please.
I got 390x MSI and was getting hot 95c on Gaming mode, then I applied Arctic Mx 4 and now I got near 75c at load but with heavy cooling 2x140mm fans plus 6x120mm fans.
I guess 75c still hot enough.


----------



## gapottberg

Quote:


> Originally Posted by *Geoclock*
> 
> Hi gapottberg, can you send your specs please?
> I've got the MSI 390x and it was getting hot, 95c in Gaming mode; then I applied Arctic MX-4 and now I get near 75c at load, but with heavy cooling: 2x140mm fans plus 6x120mm fans.
> I guess 75c is still pretty hot.


Here you go. For those on mobile: I downclocked the GPU to 1000MHz (from 1080/1100) and lowered the GPU voltage by -31mV. I started by dropping the voltage 1 tick at a time in MSI Afterburner, then ran the Firestrike Ultra stress test to validate that each drop in voltage was stable before stopping where I did. I might have been able to go lower, because I never got artifacts or crashes, but I decided to stop there and play with it for a week.



Spoiler: Warning: Spoiler!







During the 10-minute stress tests, temps never broke 70'C, thanks in part to my new fan curve, which I also customized based on the noise levels I prefer. That is also shown below.



Spoiler: Warning: Spoiler!


----------



## Geoclock

Thanks, I'll try. How did you make the custom settings?


----------



## gapottberg

Quote:


> Originally Posted by *Geoclock*
> 
> Thanks, I'll try. How did you make the custom settings?


Um... the fan settings? It's a tab in MSI Afterburner. You just have to enable it and move the thresholds where you like. Then on the main page there is a tiny checkbox at the bottom to enable custom fan settings, and another to have MSI Afterburner start when you boot your PC.
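A custom curve like the Afterburner tab mentioned above is just a handful of (temperature, fan %) points with linear interpolation between them. A minimal sketch with made-up points (not anyone's actual settings):

```python
# Sketch of how a fan-curve editor maps a temperature to a fan duty:
# piecewise-linear interpolation over user-chosen points, clamped at
# both ends. The CURVE points below are illustrative examples only.

CURVE = [(40, 20), (60, 35), (75, 60), (90, 100)]  # (°C, fan %)

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan duty (%) for a temperature, clamped at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_speed(50))  # midway between the 20% and 35% points -> 27.5
```

Shifting the middle points right is what trades a few degrees of temperature for the quieter operation described earlier in the thread.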


----------



## Geoclock

Thanks. Do you think 75c is good enough, or should I put on some aftermarket thermal pads?


----------



## gapottberg

It's really up to you. Graphics cards are designed to be capable of safely running hotter than many components, and 75 is well within the limits of what the card can theoretically handle.

That said, the cooler you can run a card the better, and if you are into overclocking for more performance, every little bit can help.

Personally, I'm fine with the temps and performance I am getting vs. what I had before my custom settings. Everyone has their own tolerances for thermals, noise, performance, and risk. You will have to decide for yourself whether it's worth it.

I will caution you on replacing thermal pads with TIM: many times the pads are thicker, and some types of TIM can leak out and fail to make good contact over time between certain components and the heatsink. It's often better to replace poor pads with better pads if you can find them.

Good luck.


----------



## Mister300

Son just purchased a 480 to replace our server gpu.

It's a downgrade from his 970 in brief testing. My 390X at 4k is better.

Will benchmark later.


----------



## Geoclock

Here is my Unigine Valley run: Extreme HD, 8x AA, 1080p.
How is the performance of an MSI 390x at 1100mhz/6080mhz?


----------



## gapottberg

For anyone who cares, all of Stardock's games are on sale for the weekend, including the now de facto 2016 DX12 benchmark, AoS, for $13.00: a game I am actually interested in playing as much as using for benchmarking. I'm not a huge RTS fan or player, but the visuals have me captivated as much as any game since maybe Endless Legend, which I love to death. Hoping for a similar experience... and if nothing else, another tool in the toolbox for benching my rigs.


----------



## christoph

Quote:


> Originally Posted by *gapottberg*
> 
> For anyone who cares, all of Stardock's games are on sale for the weekend, including the now de facto 2016 DX12 benchmark, AoS, for $13.00: a game I am actually interested in playing as much as using for benchmarking. I'm not a huge RTS fan or player, but the visuals have me captivated as much as any game since maybe Endless Legend, which I love to death. Hoping for a similar experience... and if nothing else, another tool in the toolbox for benching my rigs.


what game?


----------



## gapottberg

Ashes of the Singularity


----------



## christoph

Quote:


> Originally Posted by *gapottberg*
> 
> Ashes of the singularity


thanks


----------



## MadPolygon

Yeeeesss, I finally got my waterblock for the Sapphire 390 Nitro!

Running a leak test later to see if everything is OK and working.


----------



## Streetdragon

Quote:


> Originally Posted by *MadPolygon*
> 
> Yeeeesss, I finally got my waterblock for the Sapphire 390 Nitro!
> 
> Running a leak test later to see if everything is OK and working.


The R9 390 Nitro has a fullblock? i thought there is only the Alphacool nexxxos ones.... Can you link your block?


----------



## MadPolygon

It's custom made by Liquid Extasy. It's brand new and not yet listed in the shop. The reason you haven't seen this block is because I got the first one.


----------



## Streetdragon

Quote:


> Originally Posted by *MadPolygon*
> 
> It's custom made by Liquid Extasy. It's brand new and not yet listed in the shop. The reason you haven't seen this block is because I got the first one.


So awesome... I think I made myself a bit wet... or is it just so hot in here? Don't know...

So the VRMs are only getting "passively" cooled through the water? Hope you can test a little overclock for me and check the VRM temps when you push a +150mV offset through them. I hit 75°C max, and that's where it starts to get unstable. -.-


----------



## MadPolygon

Yeah it's looking better than I thought it would.

If you mean that there is no water directly flowing where the VRMs are then you're right. It's still one solid piece of copper, so heat from the VRMs is still taken away by the water in the loop. Overall I think the difference in temps vs. a block which has a direct waterflow above the VRMs should be pretty minimal.

Definitely gonna test overclocking with this block.


----------



## Dundundata

I have the MSI 390 (non-X), and 75°C is about my max when overclocked and under load.

I'm going to repaste soon and see if anything changes, but I think that's pretty normal.


----------



## bluej511

Quote:


> Originally Posted by *MadPolygon*
> 
> Yeah it's looking better than I thought it would.
> 
> If you mean that there is no water directly flowing where the VRMs are then you're right. It's still one solid piece of copper, so heat from the VRMs is still taken away by the water in the loop. Overall I think the difference in temps vs. a block which has a direct waterflow above the VRMs should be pretty minimal.
> 
> Definitely gonna test overclocking with this block.


I don't think the VRM temp difference will be minimal. The heat from the VRMs (they reach 60°C+ quite easily) will spread throughout that entire piece of copper covering them, and the water will only cool the copper where it flows directly above. Personally I think the VRM temps will be decent, but nowhere near a full block like EK's or Aqua Computer's.

I'd love to be wrong, so I can't wait to see your VRM temp results. If they're lower than on my Alphacool block I'll switch; if not, no need.
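A rough back-of-the-envelope check on that debate: Fourier's law gives the temperature rise across the copper itself. All the numbers below (VRM losses, path length, contact area) are assumptions for illustration, not measurements of this block.

```python
# Steady-state conduction through a copper section:
#   Q = k * A * dT / L   ->   dT = Q * L / (k * A)
# Every input here is an assumed, illustrative value.

def conduction_delta_t(power_w, path_m, area_m2, k_w_mk=385.0):
    """Temperature rise in °C across a conduction path (Fourier's law).

    k defaults to ~385 W/mK, a typical value for copper.
    """
    return power_w * path_m / (k_w_mk * area_m2)

# Assume ~60 W of VRM losses, a 30 mm copper path to the nearest water
# channel, and a 15 mm x 60 mm contact footprint (9e-4 m^2):
dt = conduction_delta_t(power_w=60.0, path_m=0.030, area_m2=9.0e-4)
print(f"Delta across the copper: ~{dt:.1f} °C")
```

With those guesses the copper path itself only adds around 5°C, which suggests the pad interface and the VRMs' own junction-to-case resistance, not the lack of water directly overhead, would dominate the measured temperature.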


----------



## MadPolygon

Well, I could definitely be wrong. We'll see when I do some benchmarks.

My leak test earlier was successful.


----------



## bluej511

Quote:


> Originally Posted by *MadPolygon*
> 
> Well, I could definitely be wrong. We'll see when I do some benchmarks.
> My leak test earlier was successful.


Yeah, I'm very curious. I don't see the water anywhere near VRM1 (the long line of VRMs; VRM2 is the single VRM chip close to the I/O bracket).

I can see the block staying cool all around the water channels, but the distance to the VRMs is quite a reach. Don't forget that even a water temp of 30°C might not be enough to keep VRM temps low. I'd love to see your results though; mine is just an educated guess. I have the Alphacool with a 120mm fan blowing over it, and when it's not crazy hot in the apartment my VRM1 doesn't exceed 50-55°C, and that's PASSIVELY cooled too.


----------



## Rexer

Quote:


> Originally Posted by *Streetdragon*
> 
> The R9 390 Nitro has a fullblock? i thought there is only the Alphacool nexxxos ones.... Can you link your block?


Had my eye on this one I saw on Amazon.

Nhowe A-AS39X-X Asus 390 STRIX-R9 390X-DC3OC

I just haven't seen any reviews yet.


----------



## Rexer

Quote:


> Originally Posted by *Rexer*
> 
> Had my eye on this one I seen on Amazon.
> 
> Nhowe A-AS39X-X Asus 390 STRIX-R9 390X-DC3OC
> 
> I just haven't seen any reviews yet.


Ooops! My bad. It's not a Sapphire Nitro waterblock. Sorry.


----------



## Rexer

Quote:


> Originally Posted by *bluej511*
> 
> There are plenty, PLENTY of games that reach that 3.5gb limit easy. Hell even Rainbow Six Siege which isnt too gpu heavy will easily use 3.5gb maxed out.
> 
> But then again an r9 390 can't use all 8gb anyways, its closer to 5-6gb.


I was just referring to staying at a constant 3.5GB.


----------



## Rexer

Quote:


> Originally Posted by *Dundundata*
> 
> I have the msi390 (non x) and 75 is about my high when OC and under load.
> 
> I'm going to repaste soon and see if anything changes but I think that's pretty normal.


On the Asus 390 I own, when I popped the heat sink off, there was too much factory mud. Enough to fill a third of the die block.


----------



## Dundundata

Quote:


> Originally Posted by *Rexer*
> 
> On the Asus 390 I own, when I popped the heat sink off, there was too much factory mud. Enough to fill a third of the die block.


Saw the same thing on an XFX 390; repasting dropped temps by 10°C!

Also, I'm curious how often people change out the paste on their CPU/GPU?


----------



## Chaoz

Quote:


> Originally Posted by *Rexer*
> 
> On the Asus 390 I own, when I popped the heat sink off, there was too much factory mud. Enough to fill a third of the die block.


Did that as well on my STRIX 390. There was indeed too much TIM on it. Wiped it clean, applied TG Kryonaut, and now it's running a lot cooler than before. I also replaced the thermal pads with 1.5mm Fujipoly 11W/mK pads, and the temps dropped a lot as well.


----------



## Dundundata

Quote:


> Originally Posted by *Chaoz*
> 
> Did that as well on my STRIX 390. There was indeed too much TIM on it. Wiped it clean, applied TG Kryonaut, and now it's running a lot cooler than before. I also replaced the thermal pads with 1.5mm Fujipoly 11W/mK pads, and the temps dropped a lot as well.


Hmmm, I hadn't thought about the pads. Would those also work on the MSI? I've been happy with my temps, but if I can drop them a bit, that would be nice and would let me push the OC further.

On another note, is it worth it to Xfire these cards? What I mean is, are there a lot of games that will take advantage of it?


----------



## Chaoz

Quote:


> Originally Posted by *Dundundata*
> 
> hmmm I had not thought about the pads, would those also work on the MSI? I have been happy with temps but if I can drop them a bit that would be nice and push the OC further.
> 
> On another note, is it worth it to Xfire these cards? What I mean is are there alot of games that will take advantage of it?


It should work fine. Just make sure you get the correct ones for your card. The STRIX needs 1.5mm because the Fujipolys are a bit harder than the stock pads, so you need to press the cooler gently into the pads for them to make contact. Did this a while back and my temps are still a lot lower than with the stock pads.

Could be that the MSI needs 1mm. Not sure, though. Bought mine on eBay for $15 including shipping. Only needed one strip.

Xfiring these cards gives a great performance boost and a lot of heat, so make sure you have good airflow in your case.


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> hmmm I had not thought about the pads, would those also work on the MSI? I have been happy with temps but if I can drop them a bit that would be nice and push the OC further.
> 
> On another note, is it worth it to Xfire these cards? What I mean is are there alot of games that will take advantage of it?


Don't change pads on the MSI, not worth it. MSI uses high quality thermal pads already, it seems, as using 11W/mK Fujipolys on mine just barely improved temps, if at all.

Repasting, on the other hand, can be a huge upgrade. Kryonaut is the bee's knees, but Gelid GC Extreme and other high quality TIMs will work just fine. Don't expect temps to drop 10C under light overclocking, but if you start pushing your card it's a must IMO. Before repasting I couldn't get over 1170MHz reliably because the temps were too high (over 80C) and adding voltage didn't improve stability (I suspect that my card also had some hot spots where the TIM hadn't spread properly, but I can't know now). Now I can easily game at 1180MHz and higher, all under 80C during the winter. Summer is another story and I don't use over 1150MHz, usually less than even that.

Last thing, these cards are VERY sensitive to case airflow. Good versus bad airflow in your case can be 10C worth alone. Maybe even more.


----------



## Rexer

Quote:


> Originally Posted by *Chaoz*
> 
> Did that as well on my STRIX 390. There was indeed too much TIM on it. Wiped it clean, applied TG Kryonaut, and now it's running a lot cooler than before. I also replaced the thermal pads with 1.5mm Fujipoly 11W/mK pads, and the temps dropped a lot as well.


Fujipoly 11? I reused my pads hoping they were still good (which they were), but I have an XFX HD7950 sitting in a cardboard box that needs pads. Where did you find them?


----------



## DarthBaggins

Thinking of pulling my card tonight and repasting with my Hydronaut. Need to get better temps with this card; I'm tired of 70-80°C.


----------



## gapottberg

So I had a weird issue yesterday. I have been monitoring temps on this card via the HWiNFO64 sensor utility (low 60s). Last night I changed it up and used MSI Afterburner and got wildly different results (spikes of 80+). I double-checked with the sensor utility in GPU-Z, and my results were more in line with the HWiNFO64 numbers (low 60s). Curious if there are any known issues with MSI Afterburner showing different temps than other programs?

[edit] OK, a bit embarrassing, but I think I discovered my issue: I was looking at the wrong graph for temps. I have so many stacked I get them mixed up sometimes.
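When tools disagree, it can also help to pull a reading straight from the driver's own sensor interface. A minimal sketch, assuming a Linux box with the amdgpu driver exposing hwmon sensors (Windows tools like HWiNFO64 and GPU-Z read the same hardware sensors through their own driver layers); the paths are the standard kernel hwmon layout, not anything specific to these cards:

```python
# Read every temperature sensor the amdgpu driver exposes via hwmon.
# Kernel hwmon reports temperatures in millidegrees Celsius.
from pathlib import Path

def millideg_to_c(raw):
    """Convert a raw hwmon reading (millidegrees C) to degrees C."""
    return raw / 1000.0

def read_gpu_temps_c(hwmon_root="/sys/class/hwmon"):
    """Return {sensor_label: temp_C} for every amdgpu temp input found."""
    temps = {}
    for hw in Path(hwmon_root).glob("hwmon*"):
        name = hw / "name"
        if not name.exists() or name.read_text().strip() != "amdgpu":
            continue
        for t in sorted(hw.glob("temp*_input")):
            label_file = hw / t.name.replace("_input", "_label")
            label = (label_file.read_text().strip()
                     if label_file.exists() else t.name)
            temps[label] = millideg_to_c(int(t.read_text()))
    return temps

if __name__ == "__main__":
    print(read_gpu_temps_c())  # empty dict if no amdgpu sensors exist
```

If two monitoring tools differ from each other but agree with the raw sensor, the odd one out is mislabeling or misreading, which was exactly the wrong-graph mix-up here.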


----------



## Chaoz

Quote:


> Originally Posted by *Rexer*
> 
> Fujipoly11? I reused my pads hoping they were still good (which they were) but I have a XFX HD7950 that's sitting in a cardboard needing pads. Where did you find them?


I even have 14W/mK ones, but they're not thick enough to fit on my card, so that was a wrong purchase. They're 1mm, and I needed 1.5mm for my STRIX card.

Bought em off eBay. From this guy's shop in the UK:

http://www.ebay.co.uk/sch/pcerb/m.html?_nkw=&_armrs=1&_ipg=&_from=

He has loads of different kinds, but this is the one I bought:
http://www.ebay.co.uk/itm/Fujipoly-Thermal-Pad-1-5mm-11W-mK-for-GPU-CPU-LED-XBOX-PS3-PS4-PC-Laptop-100x15-/181701216935?hash=item2a4e3c8aa7:g:z~UAAOSw9r1V-Ai-

The pads are quite hard, so I had to press my heatsink onto them to get them to sit snug on my VRMs and such.


----------



## 12Cores

Vulkan does not support multi-GPU as yet. I have been playing Doom at max settings on a single R9 390X using VSR at 3200x1800, and it's locked at 60fps. I am at a loss for words; this game throws so much at you at such high visual fidelity.

I am really looking forward to more games on these new APIs if this is the kind of performance that can be had with a single card.


----------



## gapottberg

Holy crap! I've never seen a game push my CPU as hard as Ashes of the Singularity does. I was seeing highs of about 65°C on my CPU during an hour-or-so game session with my current OC. May have to dial it back a hair. The graphics card was also taxed, pushing temps into the low 70s compared to the low 60s I see in every other game I play. Still, I'm pretty happy with my temps after my undervolting and custom fan curve.

Ran a quick benchmark in DX11 and DX12; there is a huge difference in performance, as one would expect. I'm still pushing over 40 FPS with High settings all around, and the game feels and looks good. Pretty happy with it so far, other than it crippling my AIO.


----------



## bluej511

Quote:


> Originally Posted by *gapottberg*
> 
> Holy crap! Never seen a game push my CPU as hard as Ashes of the Singularity does. I was getting highs of about 65'C during an hour or so long game session on my CPU with my current OC. May have to dial it back a hair. The Graphics card was also taxed actually pushing temps into the low 70's compared to the low 60's i see in every other game i play. Still, i'm pretty happy with my temps after my undervolting and custom fan curve.
> 
> Ran a quick benchmark in DX11 and DX12...there is a huge difference in performance as one would expect. Im still pushing over 40 FPS with High settings all around, and game feels and looks good. Pretty happy with it so far, other than it crippling my AIO.


You should try Rise of the Tomb Raider in DX12; the avg didn't jump too much for me, but the min almost quadrupled.


----------



## gapottberg

Quote:


> Originally Posted by *bluej511*
> 
> You should try Rise of the Tomb Raider in DX12, the avg didnt jump too much for me but the min almost quadrupled.


Yeah, I think my avg jumped by 10fps and my min by like 30fps. I'll have time today to do some proper benching and will post pics.


----------



## gapottberg

OK, here is a quick bench test I did in both DX11 and DX12 for Ashes of the Singularity. I made a chart that is easy to read, and will add photos of the actual bench for proof of the numbers. DX12 is looking very promising indeed for older CPUs, but I will say this: you really need to monitor your temps when playing DX12 games if you are overclocking your CPU. DX12 pushes temps higher than I have ever seen in DX11 games, much closer to what a true stress test like P95 pushes over a long enough session.














----------



## Dundundata

Quote:


> Originally Posted by *bluej511*
> 
> You should try Rise of the Tomb Raider in DX12, the avg didnt jump too much for me but the min almost quadrupled.


Just got this. What a cool game, and the 390 is handling it nicely.


----------



## bluej511

Quote:


> Originally Posted by *Dundundata*
> 
> just got this, what a cool game and the 390 is handling it nicely


It's so fun and beautiful; I just wish it were a tad longer. The cutscenes are ridiculous though, I will admit. I get 60fps avg, but I'm on ultrawide as well.


----------



## Dundundata

A Journey of Paste. Very messy. Lowered temps by 5°C!


----------



## tolis626

Anyone playing the BF1 open beta? How's it running for you guys? It seems to be pretty demanding.


----------



## AliNT77

Running on a [email protected].

A locked, flawless 60fps at 1080p ultra...


----------



## fyzzz

[email protected]/1500, around 70fps at 4K with the lowest settings; it still looks pretty good.


----------



## tolis626

Nice!

Mind posting what resolution scaling you guys are using? It seems like the slider shows wrong values: the default is 42%, and it looks like that corresponds to 100% scaling. Maxing the slider to 100% made my framerate plummet to the low 40s at all ultra with my card running 1165/1625MHz. It's not the CPU either; 4790K at 4.8GHz here. Either it's the res scaling or the fact that I was using DX12. More likely the former than the latter, although many report DX12 actually performing worse than DX11. I know the game has 1.5 months or so before release, and it's normal for stuff to not work correctly during a beta, but something tells me DX12 won't be working properly at launch. Call it a hunch.

To be honest, I was kinda hoping they would use Vulkan instead of DX12 (or have both, wishful thinking), but that was just me. I know it wasn't all that probable to begin with, but one can hope. Or could. Whatevs.


----------



## bluej511

On ultrawide I'm getting 74fps on ultra easily. I left everything alone, just switched HBAO to SSAO. DX12 runs pretty poorly for me, so I stuck with DX11.

It runs very smoothly for me; the 74fps is locked since I use FreeSync. I changed the FOV a bit, but that's about it. On ultra, no issues here.


----------



## 12Cores

Quote:


> Originally Posted by *tolis626*
> 
> Anyone playing the BF1 open beta? How's it running for you guys? It seems to be pretty demanding.


The game is not very demanding; the Frostbite engine is very mature at this point and well optimized. The 1060/480 will run this game at 1440p without any issues, and the same goes for the 980/390/970 crowd from the previous generation.

My framerates on ultra, DirectX 11:
3200x1800, CrossFire 390Xs: mid 80s
1440p, single 390X: mid 70s
1080p, single 390X: mid 90s


----------



## Worldwin

BF1 runs amazingly well. I get [email protected] at the ultra preset. This is expected of DICE at this point.

Something to note: DX12 does seem to work, going by the MSI AB OSD (which doesn't support DX12) disappearing once DX12 is enabled. Does anyone notice any improvement in framerates?
Also, does anyone think async shaders are being used, or should we all expect further optimization once launch hits?


----------



## DarthBaggins

I have noticed an improvement in DX12 on my 390x. FPS went from 100 to 120+


----------



## Worldwin

DX12 has a stuttering issue for me. Frametimes after respawning can go haywire, swapping between, say, 40ms and 6ms. It plays stupidly poorly, and I hope it gets fixed prior to launch.

Frametimes can be viewed by enabling "PerfOverlay.DrawGraph 1" in the console.
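The overlay only draws the graph; if you also log frametimes to a file (PresentMon and FRAPS can both dump them), a few lines of scripting puts numbers on "haywire". A sketch, with the 6ms/40ms swing above used as the bad example and the 1.5x-average spike threshold being an arbitrary choice of mine:

```python
# Summarize a list of frametimes (milliseconds) from any logging tool.
from statistics import mean

def frametime_stats(ft_ms):
    """Average FPS, ~99th-percentile frametime, and number of 'spike'
    frames that took more than 1.5x the average frame time."""
    avg = mean(ft_ms)
    ordered = sorted(ft_ms)
    p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
    spikes = sum(1 for t in ft_ms if t > 1.5 * avg)
    return {"avg_fps": 1000.0 / avg, "p99_ms": p99, "spikes": spikes}

# Steady ~60 fps pacing vs. the 6/40 ms swings described above. The
# average hides the problem; the spike count and p99 expose it.
smooth = [16.7] * 100
swingy = [6.0, 40.0] * 50
print(frametime_stats(smooth))
print(frametime_stats(swingy))
```

The swingy trace still averages around 43fps, which is why an FPS counter alone can look fine while the game feels terrible.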


----------



## AliNT77

Yeah, the frametimes under DX12 are all over the place.

I get serious stuttering under DX12 but silky smooth frametimes under DX11.


----------



## bluej511

Quote:


> Originally Posted by *AliNT77*
> 
> Yeah the frametimes under DX12 are all over the place
> 
> I get serious stuttering under dx12 but silky smooth frametimes under dx11


Same for me in BF1; in Tomb Raider, however, it's unbelievably smooth.


----------



## AliNT77

Same here

Btw, I can't believe my eyes when I play BF1... it looks so damn realistic. The lighting, terrain, character models, physics... it's the most photorealistic game ever (alongside SW:BF).


----------



## koxy

Anyone know if the Prolimatech MK-26 will fit the MSI R9 390?


----------



## DarthBaggins

Thinking of swapping my OEM 390x for the MSI 390x so I don't have to listen to the fighter jet turbine this thing has in it lol


----------



## gapottberg

Quote:


> Originally Posted by *DarthBaggins*
> 
> Thinking of swapping my OEM 390x for the MSI 390x so I don't have to listen to the fighter jet turbine this thing has in it lol


My MSI 390x has been exceptionally quiet and cool since I set custom fan speeds and undervolted at stock clocks. You shouldn't be disappointed if noise is your issue.


----------



## bluej511

Quote:


> Originally Posted by *DarthBaggins*
> 
> Thinking of swapping my OEM 390x for the MSI 390x so I don't have to listen to the fighter jet turbine this thing has in it lol


Or just go water and enjoy both, haha.

Try one of the Arctic GPU coolers instead of switching complete GPUs; it might make it super quiet for you.


----------



## DarthBaggins

I want to stick it under water, but EK has stopped making blocks for this card. I could go Aqua Computer if I knew the 290X block would fit properly (same PCB, but 4GB vs. 8GB).


----------



## fyzzz

Quote:


> Originally Posted by *DarthBaggins*
> 
> I want to stick it under water but EK has stopped making the blocks for this card, but I could go Aquacomputers if I knew the block for the 290x would fit properly (same PCB but 4GBvs8GB)


The Aqua Computer kryographics block should fit. I have one on my XFX R9 390 DD (even the version with the new, bigger inductors) and had no issues installing the block.


----------



## christoph

Hey guys, what about having multiple GPU brands in the same PC under Windows 10?

Nothing yet? Only rumors that disappear over time?


----------



## DarthBaggins

Quote:


> Originally Posted by *fyzzz*
> 
> The aquacomputer kryographics block should fit. I have one on my XFX R9 390 DD ( the new bigger inductor even) and I had no issues installing the block.


Good to know looks like I'll be ordering a Kryo block soon then


----------



## Streetdragon

Quote:


> Originally Posted by *christoph*
> 
> hey guys, what about having multiple GPUs brands in the same PC under windows 10?
> 
> nothing yet? only rumors that disappear over time?


I think it is already possible with DX12, IF the game's developer supports it AND the drivers allow it.


----------



## christoph

Quote:


> Originally Posted by *Streetdragon*
> 
> i think it is already possible with DX12 IF the developer of the game supports it AND the driver allow it.


then no


----------



## EternalRest

If I get a universal GPU block for my PowerColor 390, how would I keep the VRMs cool?


----------



## Chaoz

Quote:


> Originally Posted by *EternalRest*
> 
> If I get a universal GPU block for my PowerColor 390, how would I keep the VRMS cool?


With a fan directly over them. Like the Kraken G10.


----------



## bichael

Quote:


> Originally Posted by *EternalRest*
> 
> If I get a universal GPU block for my PowerColor 390, how would I keep the VRMS cool?


I've got the Alphacool GPX block on my PowerColor 390, which I would recommend. It's still upgradeable, and VRM cooling is pretty good even though it's a hybrid block.


----------



## Nameless1988

Hi mates, yesterday I got my new FreeSync monitor, an AOC G2460VQ6: 1080p, 24'' TN panel, 1ms, with a FreeSync range of 35-75Hz (max refresh rate of 75Hz through DisplayPort).
Great monitor for 155€. Plus FreeSync! I'd never tried FreeSync before, and now... it's gorgeous! My games run smooth: no more tearing, no more stuttering when the framerate dips to 40-50fps.

A list of demanding games I tried; they all run better and smoother with FreeSync:
Doom, Star Wars Battlefront, Far Cry Primal, Fallout 4, Tomb Raider 2013, BioShock Infinite, The Witcher 3, Black Ops 3, Rise of the Tomb Raider, Deus Ex: Mankind Divided (some minor stuttering due to its poor optimization); even Watch Dogs (worst optimization ever) runs smooth (some minor stuttering when driving, due to the engine).
The only game that won't run properly is Just Cause 3; we know it's badly optimized, especially on AMD hardware.

I am very satisfied. IMHO FreeSync is the best technology in PC gaming; FreeSync absolutely changed the game!
AMD is moving in the right direction: the drivers and software are very stable, performance is top notch, and I've had no issues since I got my XFX B.E. R9 390 (February 2016), and none with my new FreeSync monitor.

Everything runs just perfectly!

P.S. Once you try FreeSync, you get addicted to it!


----------



## bluej511

Quote:


> Originally Posted by *Nameless1988*
> 
> Hi mates, yesterday i got my new freesync monitor, an AOC G2460VQ6, 1080p, 24'' TN panel, with 1ms and a freesync range of 35-75 Hz, (max refresh rate = 75 through display port).
> Great monitor for 155€. Plus freesync! Never tried freesync before, and now ... it's gorgeous! Now my games run smooth, no more tearing, no more stuttering when framerate dips to 50-40 fps.
> 
> A list of demanding games i tried, they all run better and smooth with freesync:
> Doom, Star Wars Battlefront, Far Cry Primal, Fallout 4, Tomb Raider 2013, BioShock Infinite, The Witcher 3, Black Ops 3, Rise of The Tomb Raider, Deus Ex Mankind Divided (some minor stuttering due to the bad otpimization); even Watch dogs (worse optimization ever) runs smooth (some minor stuttering when driving, due to the bad engine).
> Only game that won't run properly is Just Cause 3, we know it has bad optimization specially on AMD hardware.
> 
> I am very satisfied, imho freesync is the best technology about pc-gaming, freesync abolutely changed the game!
> AMD is moving on the right way : driver and software are very stable, performance is top notch, no issue since i have my XFX B.E. R9 390 (february 2016) and no issue with my new freesync monitor.
> 
> Everything run just perfect!
> 
> P.S. Once you try freesync, you become addicted to it!


FreeSync is pretty awesome. 1. Don't use vsync; just set FRTC in Radeon Settings to 74 and you'll get lower input lag and better frametimes. FRTC now works in DX12 as well; I've checked in-game using MSI Afterburner's latest beta, which now offers a DX12 in-game overlay, and it works perfectly. My fps is now capped at 74. Before the recent drivers, FRTC did not work in DX12 and my frames would easily go over the 74fps cap I set; now it stays at 74.

2. FreeSync won't get rid of stuttering; it just makes it smoother. Anything below 45fps is too stuttery for me. AC Unity tends to dip even below 40fps, but rarely.
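The cap-one-below-refresh trick above is easy to sanity check with arithmetic. A sketch; the minus-one convention is what's used in this thread, not an AMD-documented rule:

```python
# FRTC cap and frame-time budget for a FreeSync panel.

def frtc_cap(max_refresh_hz):
    """Cap one frame below max refresh so the panel stays inside its
    FreeSync window instead of bouncing off the ceiling (tearing)."""
    return max_refresh_hz - 1

def in_freesync_range(fps, lo_hz, hi_hz):
    """True if a framerate falls inside the panel's FreeSync window."""
    return lo_hz <= fps <= hi_hz

cap = frtc_cap(75)               # 74, the value set above
budget_ms = 1000.0 / cap         # ~13.5 ms to render each frame
print(cap, round(budget_ms, 1))
print(in_freesync_range(40, 35, 75))   # dips to 40 fps stay synced
```

On a 35-75Hz panel like the AOC above, dips into the 40s stay inside the window, which is why they feel smoother than they would on a fixed 60Hz screen.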


----------



## Nameless1988

Quote:


> Originally Posted by *bluej511*
> 
> Freesync is pretty awesome. 1. Don't use vsync just set your frtc in radeon settings to 74, youll get lower input lag and frametimes. FRTC now works on dx12 as well, ive checked in game using msi afterburners latest beta that now offers dx12 ingame overlay and it works perfectly. My fps is now capped at 74fps, before the recent drivers frtc did not work on dx12 and my frames would easily go over the 74fps cap i set it too. Now it stays at 74.
> 
> 2. Freesync wont get rid of the stuttering, it just makes it smoother, anything beloew 45fps for me its too stuttery. AC Unity tends to dip even below 40fps but rarely.


1. Yes mate, I will never use in-game vsync; now with FreeSync I always play with the 74fps frame limiter in Radeon Crimson settings. If I cap the framerate at 75 I sometimes see tearing, but at 74 there is none.
2. Obviously, if a game is a stuttering mess, adaptive sync won't get rid of it, but it certainly makes the frametime variation smoother. When the framerate dips below the refresh rate, FreeSync helps keep the gaming experience smooth and consistent; on a classic 60Hz panel without adaptive sync, when the framerate dips below the refresh rate, there is some "natural" stuttering.

Which Afterburner version offers the DX12 in-game overlay?

EDIT: With both Wolfenstein: The New Order and The Old Blood, it seems like FreeSync doesn't run properly, maybe because the engine runs at 60fps and doesn't let you play above 60fps? My monitor is 75Hz, but I'm pretty sure the engine is capped at 60.


----------



## bluej511

Quote:


> Originally Posted by *Nameless1988*
> 
> 1. Yes mate, I will never use vsync in game, now with freesync I always play with 74 frame limiter in radeon crimson settings. If i cap the frame to 75 sometimes i see tearing, but on 74 there is no tearing.
> 2. Obviously, if a game is a stuttering-mess, adaptive sync wont get rid of it, but certainly makes the frametime variation smoother, when framerate dips below the refresh rate value, freesync come in help to keep the gaming experience smoother and consistent. On classic 60 Hz panel without adaptive sync, when framerate dips below the refresh rate value, there is some "natural" stuttering.
> 
> which afterburner version offers dx12 in-game overlay?
> 
> EDIT: with both Wolfenstein new order & old blood, seems like freesync doesn't run properly, maybe because the engine runs at 60 fps and doesn't let you play with 60+ fps ? My monitor is 75 Hz but I'm pretty sure the engine i set up on 60 Hz


What I mean is FreeSync won't smooth out stuttering below 45fps; you can still tell it's lower fps than 75.

MSI Afterburner 4.3.0 Beta 14; make sure it's Beta 14. I tried it with Rise of the Tomb Raider and Battlefield 1, both in DX12.


----------



## Nameless1988

Quote:


> Originally Posted by *bluej511*
> 
> What i mean is freesync wont smooth out below 45fps stuttering, you can still tell its lower fps then 75 is what i mean.
> 
> MSI Afterburner 4.3.0 Beta 14, make sure its Beta 14. I tried it on Rise of the Tomb Raider and Battlefield one, both in dx12.


My FreeSync range is 35-75, so even at 40fps FreeSync is still working.
Generally, smoothness depends on the game/engine too: some feel choppy at 50fps, some are smoother even at 40, like SW Battlefront (Frostbite 3, best engine out there).
I agree with you: 40fps/Hz is not 75fps/Hz.


----------



## DarthBaggins

I was looking at the ASUS mini ROG 144Hz 1080p (it has FreeSync) as my next monitor, but I'm enticed by the 27" Predator (hoping to have the money for it soon).


----------



## Dundundata

I'm looking to give 4K a try when newer cards come out but unsure which team I'll be on!


----------



## Nameless1988

Quote:


> Originally Posted by *DarthBaggins*
> 
> I was looking at the ASUS mini ROG 144hz 1080p (has FreeSync) as my next monitor. But I'm enticed by the 27" predator (hoping to have the money for it soon)


144Hz FreeSync is useless, IMHO. It's difficult to run demanding games at 60-70fps on high settings even with a 1070 (considering how many unoptimized games there are).


----------



## DarthBaggins

So far running games above 60-70fps hasn't been difficult with the 390x @1080p


----------



## Nameless1988

Quote:


> Originally Posted by *DarthBaggins*
> 
> So far running games above 60-70fps hasn't been difficult with the 390x @1080p


Yes, but you have to lower many settings! There are many AAA games (console ports) that are problematic to run at 60+ fps even on medium settings: Just Cause 3, Deus Ex: Mankind Divided, AC Syndicate, etc.


----------



## Dundundata

Quote:


> Originally Posted by *Nameless1988*
> 
> Yes, but you have to lower many settings! There are many AAA games (console ports) that are problematic to run at 60+ fps even on medium settings: Just Cause 3, Deus Ex: Mankind Divided, AC Syndicate, etc.


Well, the games I've been playing (Witcher 3, FO4, Tomb Raider, NMS) have all performed well on the 390 with a decent OC: 50-60fps @1080p with pretty much everything maxed. There may be an option or two I turned down to High. So I could see a 390X doing 60-70. Now, yeah, there are always bad ports that might give some trouble.


----------



## mandrix

Latest driver bumped my Fire Strike a little. This is my 390x at 1200/1670.


----------



## christoph

Quote:


> Originally Posted by *mandrix*
> 
> Latest driver bumped my Fire Strike a little. This is my 390x at 1200/1670.


what was your previous score with what driver?

and what driver are you using now?


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> what was your previous score with what driver?
> 
> and what driver are you using now?


I'll give mine a go as well and see what I end up with, haha.

Edit: Mine only went up 9 points in graphics score lol


----------



## mandrix

Quote:


> Originally Posted by *christoph*
> 
> what was your previous score with what driver?
> 
> and what driver are you using now?


Current driver is 16.8.3 hotfix.
12961 Fire Strike score with the last WQHL driver with the same 1200/1670 clocks.


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> Ill give mine a go as well see what i end up with haha.
> 
> Edit: Mine only went up 9 points in graphics score lol


Quote:


> Originally Posted by *mandrix*
> 
> well 9 is a 9
> 
> Current driver is 16.8.3 hotfix.
> 12961 Fire Strike score with the last WQHL driver with the same 1200/1670 clocks.


At least no one can say it's not improving.


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> at least no one can say is not improving


True. I can't even get 3DMark to run in ultrawide for whatever reason lol.


----------



## battleaxe

Quote:


> Originally Posted by *mandrix*
> 
> Current driver is 16.8.3 hotfix.
> 12961 Fire Strike score with the last WQHL driver with the same 1200/1670 clocks.


Pretty awesome how the 970 used to beat these cards and now the 290X/390X are just running away from them. Great cards, great value. Very happy I switched over to 390X Xfire; worlds better than the junk 970 cards, IMO.


----------



## daunow

Quote:


> Originally Posted by *battleaxe*
> 
> Pretty awesome how the 970 used to beat these cards and now the 290x/390x are just running away from them. Great cards. Great value. Very happy I switched over to the 390x Xfire, worlds better than the junk 970 cards IMO


The 390X was released a year+ after the 970 (and it's also a higher tier).
The sad part about all of this is the fact that the 970 ever beat them at all.


----------



## Nameless1988

970/980 are already dead.


----------



## mus1mus

The 970/980 are just 780/780 Tis with less muscle but more agility. Clock them the same and see where the improvement from NVIDIA's manufacturing went!


----------



## Vellinious

Except that Kepler and Maxwell were completely different architectures, so comparing like-for-like clock performance doesn't really tell you anything...


----------



## battleaxe

Quote:


> Originally Posted by *daunow*
> 
> The 390X was released a year+ after the 970 (and it's also a higher tier).
> The sad part about all of this is the fact that the 970 ever beat them at all.


I have a 290X also, so I guess I was mostly speaking of that one. The 390X was a rebrand, so hardly anything new really: same die, so not really a new card per se.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Except that Kepler and Maxwell were completely different architectures, so comparing like-for-like clock performance doesn't really tell you anything...


Of course it tells you something: at the same clocks, and with roughly the same memory config, they perform about the same.

What you are saying only makes sense because NVIDIA gimps previous gens to promote the latest, and its hottest card, to drive sales. Kepler went into legacy right before the 970/980 cards were released. Worse, it was pulled from the stores. Because seriously, you shouldn't charge a premium for newly released cards when they offer the same performance, TDP aside, as the previous gen.









That also happened with your 980 Ti, right? 980 Ti versus 1070.

The only good thing that happened is that 980 Tis are being let go for a very affordable price!


----------



## Vellinious

The 980 Tis I had went back to EVGA, stepping up to ACX 3.0 1080s. But before that happened, the latest driver versions were still increasing DX12 performance on the 980 Ti.

And... Kepler, Maxwell, and now Pascal are completely different architectures... they're designed differently.

The only thing "legacy" means is that they will no longer be creating driver versions specifically made to increase performance on it. If you're smart, you don't upgrade to the newest driver versions anyway... you find one that works for you, and you stay with it. Hell, I was still using a driver version from late 2015 right up to the point that I shipped my cards back to EVGA. And why? Because THAT driver was the last "performance improvement" driver for Maxwell... everything that released after it was a hot fix for some game...

That said, AMD is still releasing performance-improvement drivers for GPUs that are what... 4 years old now? Too bad they didn't do that when they were new, eh? /wink


----------



## christoph

Quote:


> Originally Posted by *Vellinious*
> 
> The 980tis I had went back to EVGA stepping up to ACX 3.0 1080s. But before that happened, the latest driver versions still increased DX12 performance on the 980ti.
> 
> And... Kepler, Maxwell, and now Pascal are completely different architectures... they're designed differently.
> 
> The only thing that "legacy" means, is that they will no longer be creating driver versions specifically created to increase performance on it. If you're smart, you don't upgrade to the newest driver versions anyway....you find one that works for you, and you stay with it. Hell, I was still using a driver version from late 2015 right up to the point that I shipped my cards back to EVGA. And why? Because THAT driver was the last "performance improvement" driver for Maxwell...everything that released after that was a hot fix for some game...
> 
> That said, AMD is still releasing performance improvement drivers for GPUs that are what....4 years old now? Too bad they didn't do that when they were new, eh? /wink


Not the way you see it...

I bought my card back when it was way cheaper because of the price-performance ratio, and now I have a card that has doubled in value.

Back when I bought my card, I could have bought two 390s for the price of a single 980 (and I have said this too many times, and I don't get tired of saying it), and now they are not at the same price, but getting close.

And what? People bought the 970 for its price-performance ratio, and that ended up being... you know.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> The 980tis I had went back to EVGA stepping up to ACX 3.0 1080s. But before that happened, the latest driver versions still increased DX12 performance on the 980ti.
> 
> And... Kepler, Maxwell, and now Pascal are completely different architectures... they're designed differently.
> 
> *The only thing that "legacy" means, is that they will no longer be creating driver versions specifically created to increase performance on it. If you're smart, you don't upgrade to the newest driver versions anyway....you find one that works for you, and you stay with it.* Hell, I was still using a driver version from late 2015 right up to the point that I shipped my cards back to EVGA. And why? Because THAT driver was the last "performance improvement" driver for Maxwell...everything that released after that was a hot fix for some game...
> 
> That said, AMD is still releasing performance improvement drivers for GPUs that are what....4 years old now? Too bad they didn't do that when they were new, eh? /wink


This is not about those who already own them, but about the ones on the shelves waiting for a buyer. NVIDIA's marketing department is really working wonders to promote new products when, in fact, the improvement is abysmal. That's all there is to it.

Plus, don't assume everyone is like you and knows their way through which driver is best.

And yeah, tell me about this:

Kepler - 28nm
Maxwell - 28nm

Where did that 35% go? I am serious, man. 35%, comparing a card that boosts to 1500MHz against a card that struggles to maintain 1000MHz, maybe?

Pascal is a different story. Yet how much did a 1080 really gain you over a 980 Ti at stock? Ever wondered why they are clocked so high in the first place?

Let's put it this way, since we can't dismiss the design characteristics anyway: a CPU architecture is judged by clock-for-clock performance to distinguish the improvement over the previous gen. Of course they vary. But seriously, if I offered you a 6900K that does not outperform a 5960X at the same clocks but is priced ridiculously higher, would you take the new CPU? No. That's where my argument points.

But GPUs (NVIDIA's, primarily) do not follow that rule, because it's a shame that after all the money spent on the new design, they have yet to offer something significantly better than the preceding gen! Clocks and memory config are just some of the ways to improve the overall system. That's where your argument centers.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> This is not about those who have owned them. But those on the shelves waiting for a buyer. The marketing department for nVidia is really working wonders to promote new products when the fact is, their improvement really is abysmal. That's all there is.
> 
> Plus, don't assume everyone is like you who knows their way thru which Driver is best.
> 
> And yeah, tell me about this.
> 
> Kepler - 28 nm
> Maxwell - 28nm
> 
> 
> Where did that 35% go? I am serious man. 35% comparing a card that boosts to 1500MHz versus a card that struggles to maintain 1000MHz maybe?
> 
> Pascal is a different story. Yet, how much did a 1080 gained you over a 980TI at stock really? Ever wondered why they are clocked so high in the first place?


The move from Kepler to Maxwell was more about energy efficiency than it was about IPC. Of course NVIDIA is going to come out and say they're more powerful. And they were, at the clocks they were running, accounting for really small IPC gains. That's marketing... gotta sell products. AMD does the same ****. Moving on... Pascal was a die shrink: less voltage, marginally less heat output, faster clocks, and quite a bit more performance, specifically in DX12 and VR applications. We have yet to see what Pascal is really capable of once we break open the BIOS. Something I'm hoping for soon... I have a Titan X sitting here just begging for higher power limits... it'll gain 10 fps easy when it isn't banging its head up against the artificially low power limits. Now THAT is criminal.

I know NVIDIA is the big baddie and everyone has to feel sorry for AMD, but... honestly... they dug their own grave with the FX line, and they'll be digging out of it for the next 2 years. I just hope, for all our sakes, and for the sake of my portfolio (I bought AMD when it was about the price of a Big Mac), that Zen is a home run. lol I'm already planning my Zen build. /shrug

Man, I tell everyone I can: if you don't NEED to update drivers, DON'T... if everyone sent the same message, there'd be a lot fewer unhappy people in the world. lol


----------



## mus1mus

Less than a year if they care about us IMO.


----------



## bluej511

Quote:


> Originally Posted by *mus1mus*
> 
> Less than a year if they care about us IMO.


He came to troll so don't worry about it.

Even NVIDIA owners know that ngreedia purposely gimps older models; it's common knowledge. Only people who have their head up their @ss can't comprehend why.

It's the main reason I'll never buy NVIDIA, period.


----------



## mus1mus

latest talks from the green camp:

DX12 sucks.


----------



## Vellinious

I wasn't trolling... I'm not an NVIDIA fanboy. I buy for performance and for the fun of overclocking, which is why I bought the 290X a while back... I got bored with overclocking Maxwell and it seemed like a good challenge. It didn't disappoint.

As I said before... if GPU owners were smart, they wouldn't keep updating to drivers that were never meant to improve their GPUs. That's just dumb on a stick.

Anyway....


----------



## mus1mus

I'm also thinking of going back to 980 Tis for the home PC, because they're cheap now. And the best part about that is that 390Xs are still way overpriced locally. So are the 480s.

It's not a dumb move IMO.


----------



## christoph

Quote:


> Originally Posted by *mus1mus*
> 
> I'm also thinking going back to 980TIs for home PC. Coz they're cheap now. And the best part about that is, 390X are still way overpriced locally. So are the 480s.
> 
> It's not a dumb move IMO.


If you buy a 970, do you still get your 20 dollars back?


----------



## daunow

Quote:


> Originally Posted by *christoph*
> 
> if you buy the 970 do you still get your 20 dollars back?


Refurb? No.
Used? More than likely not.
New? Probably.


----------



## mus1mus

Why would you buy a 970?


----------



## Nameless1988

Be advised: do not buy an already-dead GPU! Ahahahahah


----------



## christoph

Quote:


> Originally Posted by *Nameless1988*
> 
> Be advised: do not buy already dead GPU! Ahahahahah


checked


----------



## milkbreak

What exactly is the procedure to undervolt a 390X? I tried with Sapphire TriXX but the voltage seemed to not stay where I set it. Do I need to disable something in the Radeon driver settings or is there a better utility? Anything out there that doesn't need to run on startup to work?


----------



## GorillaSceptre

Quote:


> Originally Posted by *milkbreak*
> 
> What exactly is the procedure to undervolt a 390X? I tried with Sapphire TriXX but the voltage seemed to not stay where I set it. Do I need to disable something in the Radeon driver settings or is there a better utility? Anything out there that doesn't need to run on startup to work?


There's probably a lot of info out there on really getting things dialed in. But in my case, I just use Afterburner and back the core voltage slider off until I get no crashes in benchmarks or my most demanding games, then I test more to make sure I get no artifacts.

Can't help you with things that don't need to run at startup, but applying my AB settings at startup has always worked flawlessly for me. Never noticed slower startup times or anything.
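For anyone who wants to make that trial-and-error loop systematic, here's a minimal sketch in Python. Note that `apply_offset()` and `stress_test()` are hypothetical stand-ins for manually dragging the Afterburner slider and running your benchmark or game session; Afterburner doesn't expose a scripting API for this, so treat it as a sketch of the procedure, not something you can point at the card:

```python
# Sketch of the manual undervolting procedure described above.
# apply_offset() and stress_test() are hypothetical stand-ins for
# moving the Afterburner slider and running a benchmark by hand.

def find_lowest_stable_offset(apply_offset, stress_test,
                              start_mv=0, floor_mv=-150, step_mv=25):
    """Walk the core voltage offset down in steps until the stress
    test fails, then settle on the last known-good offset."""
    last_stable = start_mv
    offset = start_mv - step_mv
    while offset >= floor_mv:
        apply_offset(offset)
        if not stress_test():      # crash or artifacts observed
            break
        last_stable = offset
        offset -= step_mv
    apply_offset(last_stable)      # re-apply the last stable value
    return last_stable
```

With a card that happens to be stable down to around -119mV (like some Nitros mentioned in the thread), 25mV steps would land this at -100mV; smaller steps get you closer to the true limit at the cost of more test runs.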





----------



## Mister300

Did you check "force constant voltage" in the options?


----------



## milkbreak

Quote:


> Originally Posted by *Mister300*
> 
> Did you check force constant voltage in options?


I must be blind because I don't see this option anywhere in the latest TriXX.


----------



## Mister300

Sorry, it's in Afterburner.


----------



## milkbreak

I made the switch to Afterburner and everything seems to be working just fine. Thanks for the suggestion. I should be using "third party" voltage control in the advanced options for a Sapphire Nitro 390x, right? Rather than reference or MSI?


----------



## kaspar737

Has anyone undervolted a 390X by -100mV and measured the power draw? I'm looking at getting a 390 or 390X because the 470/480 8GB are hella expensive in Europe, but I'd like its power consumption to be around 150W.


----------



## tolis626

Quote:


> Originally Posted by *kaspar737*
> 
> Has anyone undervolted a 390X by -100mV and measured the power draw? I'm looking at getting a 390 or 390X because the 470/480 8GB are hella expensive in Europe, but I'd like its power consumption to be around 150W.


If software measurements are correct (they seem quite accurate in this case compared to what my Kill-A-Watt shows), then depending on your particular model and ASIC quality, -100mV should result in sub-150W power draw. Mine typically showed an average of 130-140W. Mine is also a 1.275V stock part, though, so others may see lower with 1.225V or 1.25V parts. YMMV, but it should fall within your desired power limits. My card can actually do 1040/1500MHz or so at -100mV, so not too shabby. They also become ridiculously easy to cool, so that's a plus.
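As a rough cross-check on figures like that: at unchanged clocks, dynamic power scales roughly with the square of voltage, so dropping a 1.275V stock part by 100mV predicts about a 15% reduction before leakage and game-load variation are considered. A back-of-the-envelope sketch (the 165W baseline is an illustrative number, not a measurement):

```python
# Rough P ~ V^2 estimate for a -100mV undervolt at unchanged clocks.
# Ignores leakage and varying game load, so real averages will differ.

stock_v = 1.275                  # stock voltage of this particular part
undervolt_v = stock_v - 0.100    # -100mV offset

ratio = (undervolt_v / stock_v) ** 2
print(f"predicted power ratio: {ratio:.3f}")    # ~0.849, i.e. ~15% lower

baseline_w = 165                 # assumed average gaming draw (illustrative)
print(f"estimated draw: {baseline_w * ratio:.0f}W")
```

That scaling lands in the same ballpark as the 130-140W averages reported above, which is about as much agreement as a back-of-the-envelope model deserves.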


----------



## Rexer

Quote:


> Originally Posted by *Chaoz*
> 
> I even have 14's but they're not thick enough to fit on my card, so it was a wrong purchase. They're 1mm and I needed 1.5mm for my STRIX card.
> 
> Bought em off eBay. From this guy's shop in the UK:
> 
> http://www.ebay.co.uk/sch/pcerb/m.html?_nkw=&_armrs=1&_ipg=&_from=
> 
> He has loads of different kinds, but this is the one I bought:
> http://www.ebay.co.uk/itm/Fujipoly-Thermal-Pad-1-5mm-11W-mK-for-GPU-CPU-LED-XBOX-PS3-PS4-PC-Laptop-100x15-/181701216935?hash=item2a4e3c8aa7:g:z~UAAOSw9r1V-Ai-
> 
> The pad is quite hard so I had to press my heatsink onto it for it to sit snug on my VRM's and such.


Lol. Well, this gives me a certain satisfaction. My old 7950's going to finally get new thermo undies on it. For 2 years, it sat naked in a box. Thanks for the info.


----------



## milkbreak

Turns out I can undervolt my Nitro 390X by -119mV (the default was +19mV for whatever reason). Seems to be stable in Heaven and Furmark. That's pretty decent, right? Is there some way to flash this setting into the VBIOS once it's confirmed 100% stable, so I don't have to run any utilities at all?


----------



## Rexer

Hey! I heard an odd rumor about Gigabyte cards. I had a conversation about overclocking 390 and 390X cards, and this techie guy said Gigabyte locked out overvolting on their 390(X) cards, so it's not possible to raise the mV. I thought: Buffalo Soup. Why would Gigabyte do that? You couldn't obtain the highest performance for your money. Well, lol, I've never owned one, so what would I know. But out of curiosity, did Gigabyte actually lock users out of raising the volts on their 390 cards?


----------



## battleaxe

Quote:


> Originally Posted by *Rexer*
> 
> Hey! I heard some odd rumor about Gigabyte cards. Had a conversation about overclocking 390 and 390x cards and this techie guy said Gigabyte locked out over volting on their 390 (x) cards and it's not possible to raise the mV. I thought Buffalo Soup. Why would Gigabyte do that? You couldn't obtain the highest performance for your money. Well, Lol, I never own one so what would I know. But out of curiosity, did Gigabyte actually lock out users from raising the volts on their 390 cards?


That is my understanding. Yes. Hardly anyone buys the Gigi cards here for that reason.


----------



## Rexer

Quote:


> Originally Posted by *battleaxe*
> 
> That is my understanding. Yes. Hardly anyone buys the Gigi cards here for that reason.


Uuuh. That's sad. It's almost insulting. Makes ya wonder what Gigabyte can be thinking. That's like a minor-league attempt to castrate the tech community.


----------



## battleaxe

Quote:


> Originally Posted by *Rexer*
> 
> Uuuh. That's sad. It's almost insulting. Makes ya wonder, what can Gigabyte be thinking? That's like a minor league attempt to castrate the tech community.


I suppose we (overclockers) are such a small minority that they figure it's safer to lock it down for the typical consumer. But we don't buy 'em, and I'd think reviewers wouldn't review them either. So, IDK... I agree... seems a bit short-sighted... they've been doing this for a few gens now on AMD cards.


----------



## MrMetaton

Sorry to just jump into the chit-chat, but I happen to own a Gigabyte G1 Gaming R9 390, and yep, its voltage is hard-locked. When I bought it I had no idea. It's for my small-budget build, and I have tried to OC it, though. I can go up to ~1150/1650 with +50% power limit, but it really depends on the driver as far as I can see. With the current 16.9.1 Crimson I can only go up to 1090/1630. But I've only tried a few drivers... I'm not even sure it's because of the driver. I just noticed that after the last Crimson update I had screen flickers etc., so I had to back off a little bit. This happened before too, but then there was another update that "magically" fixed it and I didn't have to lower my OC.
Now I'm on a mission to try maybe all the drivers to see which is best for performance and OC.


----------



## Rexer

Quote:


> Originally Posted by *battleaxe*
> 
> I suppose we are such a small minority (that OC) they figure its safer to lock it down for the typical consumer. But we don't buy em' and reviewers wouldn't review them as well either I would think. So, IDK... I agree.... seems a bit short sighted... they've been doing this for a few gens now on AMD cards.


Locked mV. That's such a slap in the face. My old manager likened techies to adrenaline junkies. Techies are like hot rodders of a different nature at a different time. He told me a story about how the powerful SS Chevelle in one odd year (1973, I think) came with a 2-barrel carburetor (not the power-inducing Rochester 4-barrel). He said the whole hot rod community went, "Huh?" For all the power and research modifications that go into 9- and 10-second street machines, here was an 18-second one. More fun to watch the skateboarders going down the sidewalk.
Why would we buy a graphics card if we can't modify its performance? That's what owning a powerful 390X or 980 Ti is all about, right? It's not about spending the money to cruise in an SUV or station wagon, or looking at table lamps in Amazon ads.
Lol. I love raising the roof in a good multiplayer game. BF3 & BF4, where 14 to 20 guys are trying to capture a flag in a dog-beating battle. If I can't melee with the best of them, I'm a deficit. I'm a dud. There's no mercy in dog-eat-dog. Lol. That's why we all have great rigs and 1,000 gigs of memory, zero ping, and monster 12-pack, fuel-injected processors.
Yes, locked voltage. That's not our science.


----------



## Rexer

Quote:


> Originally Posted by *MrMetaton*
> 
> Sorry to just jump into the chit-chat, but I happen to own a Gigabyte G1 Gaming R9 390, and yep, its voltage is hard-locked. When I bought it I had no idea. It's for my small-budget build, and I have tried to OC it, though. I can go up to ~1150/1650 with +50% power limit, but it really depends on the driver as far as I can see. With the current 16.9.1 Crimson I can only go up to 1090/1630. But I've only tried a few drivers... I'm not even sure it's because of the driver. I just noticed that after the last Crimson update I had screen flickers etc., so I had to back off a little bit. This happened before too, but then there was another update that "magically" fixed it and I didn't have to lower my OC.
> Now I'm on a mission to try maybe all the drivers to see which is best for performance and OC.


I think I had to roll back a driver, too. Started seeing snow patterns.


----------



## jdorje

Back in spring was the golden age for my gpu's clock. Crimson drivers had matured, performance was up, bugs were down.

Sometime between then and a couple of months ago, a driver update changed things. Previously, I could run a slightly unstable OC, get maybe one artifact an hour, and otherwise benefit from the extra clock. But after this update (and I believe it was an instantaneous thing) the same OC, with the same level of artifacting, would instead crash (black screen) every now and then. This was no good, so I lowered the clock, eventually by around 15MHz. At the time I was playing Witcher 3, which seemed far better at causing these crashes than other games.

Now, it's possible today's driver actually gets enough performance optimization that it beats the spring driver's performance, even with the lower clock. Giving it the benefit of the doubt, I'm using the modern driver.

But in any case, this would be a difficult thing to test, because there was no difference in stability before and after except for the crashing, and that takes a long time to happen.


----------



## Dundundata

Well, you were living on the edge. I had the same thing happen with W3: occasional artifacts. So I gave it a little more juice and tuned down 10MHz, and everything is stable.


----------



## Rexer

Quote:


> Originally Posted by *jdorje*
> 
> Back in spring was the golden age for my gpu's clock. Crimson drivers had matured, performance was up, bugs were down.
> 
> Sometime between then and a couple months ago, a driver update changed things. Previously, I could run a slightly unstable OC, get like one artifact an hour, and otherwise benefit from that extra clock. But after this update - and I believe it was an instantaneous thing - the same OC, with the same level of artifacting, would instead crash (blackscreen) every now and then. This was quite no good, so I lowered clock, eventually by around 15 mhz. At this time I was playing witcher 3, which seemed way better at causing these crashes than other games.
> 
> Now, it's possible today's driver actually gets enough performance optimization that it beats the spring driver's performance, even with the lower clock. Giving it the benefit of the doubt, I'm using the modern driver.
> 
> But in any case, this would be a difficult thing to test, because there was no difference in stability before and after except for the crashing, and that takes a long time to happen.


Spring was the last good driver for me too: 16.5.2 (May), around the time AMD introduced the 480. I skipped ahead to 16.7.3 (July-Aug) and started noticing artifacting (a few weeks after I replaced the thermal paste on my GPU), some overheating, and snow. So I loaded the latest driver, 16.9.1 (Sept), and then black-screened.


----------



## Rexer

Quote:


> Originally Posted by *Dundundata*
> 
> Well you were living on the edge. I had the same thing happen with W3, occasional artifacts. So I gave it a little more juice and tuned down 10MHz, and everything is stable.


Wondering if the 480 updates in the software had anything to do with it.


----------



## Pillendreher

Hey guys,

sorry to bust in here like that. I'm returning the ASUS 290X DirectCU II I got last week, since its cooling is straight-out terrible and I'm not willing to spend another 60 bucks on a custom cooler. Now I'm trying to find a 390(X) for a decent price. Are there any 390(X)s that I should avoid at all costs? I'm not going to buy another ASUS card, that's for sure. I've looked at some reviews, and it seems like PowerColor made a nice 390 with a great cooler. The most common used 390s being sold right now are PowerColors, MSIs, and Sapphires. Can I choose any of them, or should I look for a specific one?

I could get a PowerColor 390 PCS+, which is only a couple of weeks old, for 209 €.

Thanks in advance!


----------



## Streetdragon

Quote:


> Originally Posted by *Pillendreher*
> 
> Hey guys,
> 
> sorry to bust in here like that. I'm returning the ASUS 290X DirectCU II I got last week, since its cooling is straight-out terrible and I'm not willing to spend another 60 bucks on a custom cooler. Now I'm trying to find a 390(X) for a decent price. Are there any 390(X)s that I should avoid at all costs? I'm not going to buy another ASUS card, that's for sure. I've looked at some reviews, and it seems like PowerColor made a nice 390 with a great cooler. The most common used 390s being sold right now are PowerColors, MSIs, and Sapphires. Can I choose any of them, or should I look for a specific one?
> 
> I could get a PowerColor 390 PCS+, which is only a couple of weeks old, for 209 €.
> 
> Thanks in advance!


Go for it. During heavy gaming you'll see around 75°C on the GPU. The VRMs can go a bit higher, but that's normal, and for around 200€ it's OK!


----------



## Pillendreher

I went for it. It's going to be interesting to see if I can unlock it and whether I need to undervolt it.


----------



## GorillaSceptre

Nice!

Post back how it works out; I'm interested in your undervolting results (if needed). Thanks to my climate, I'm more interested in undervolting than overclocking.


----------



## Dundundata

Undervolt.net ?


----------



## 113802

Got my R9 390X yesterday, and these are my 100% stable clocks. The RAM didn't need any extra voltage to hit 7100MHz effective, but the core requires +50mV. Using a Raijintek Morpheus II.

https://www.techpowerup.com/forums/forums/gpu-z-test-builds.56/


----------



## Shatun-Bear

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Got my R9 390x yesterday and these are my 100% stable clocks. The ram didn't need any extra voltage to hit 7100Mhz but the core requires +50mv Using a Raijintek Morpheus II
> 
> https://www.techpowerup.com/forums/forums/gpu-z-test-builds.56/


Nice









I had a Nitro+ OC 480 last week, but the card wasn't stable much past 1340MHz and the acoustics were a little too loud to maintain that overclock, so I returned it and got £250 back. I then snagged a deal on Amazon Warehouse, plus a discount I had, to get a Nitro 390X for £211. A superb deal! The card is much quieter than the Nitro 480 when you turn the fans up.

My stable overclock seems to be around 1140 on the core and 1775 on the memory with 40% fan, which is still very quiet. I've been playing DOOM under Vulkan with it, and the performance in that game under that API is pretty spectacular.


----------



## ziggystardust

Has anyone else had Power Efficiency off by default with the latest drivers (16.9.2)?

It was always on by default since that option was introduced. I just found it strange, since I also had to enable FreeSync this time. I'm wondering what other settings are messed up.


----------



## bluej511

Quote:


> Originally Posted by *ziggystardust*
> 
> Anyone else had Power Efficiency off by default with the latest drivers (16.9.2)?
> 
> It was always on by default since that option is introduced. I just found it strange since I also had to enable Freesync this time. I'm wondering what other settings are messed up.


Honestly, who cares lol. Power efficiency should always be off; I don't know why it's even an option. Sometimes after an update I've had to reset my FRTC and set it back to 74fps. Considering I do a clean install every time, I'd say it's normal.


----------



## DarthBaggins

Those are two things I never associate with each other: power efficiency and AMD lol. Power efficiency is for casuals.


----------



## ziggystardust

Quote:


> Originally Posted by *bluej511*
> 
> Honestly, who cares lol. Power efficiency should always be off, don;t know why its even an option. Sometimes after an update ive had to reset my frtc and set it back to 74fps. Considering i do a clean install everytime id say its normal.


Quote:


> Originally Posted by *DarthBaggins*
> 
> Those are two things I never associate together- Power efficiency and AMD lol. Power efficiency is for casuals


Hahaha, yeah, I always turn it off manually. But since both FreeSync and Power Efficiency came disabled by default this time, I became a bit paranoid that some other settings might be messed up.


----------



## Dundundata

I didn't even know about that setting, and mine was set to On.


----------



## ziggystardust

Quote:


> Originally Posted by *Dundundata*
> 
> Didn't even know about that setting, and mine was set to On


After updating to the latest driver (16.9.2)?


----------



## Dundundata

Quote:


> Originally Posted by *ziggystardust*
> 
> After updating to the latest driver (16.9.2)?


I'm on the last one, will update now.

I just did a clean install (using DDU) to 16.9.2 and it was set to No. Could be AMD has a saved settings file somewhere, I'm not sure. I haven't really delved into Radeon Settings up until now, just OC with Afterburner and go! So I'm curious to see if I'll notice any difference with power efficiency off.


----------



## ziggystardust

Quote:


> Originally Posted by *Dundundata*
> 
> I'm on the last one will update now
> I just did a clean install (using DDU) to 16.9.2 and it was set to No. Could be AMD has a saved settings file somewhere I'm not sure. I haven't really delved into the Radeon Setting up until now, just OC with Afterburner and go! So I'm curious to see if I'll notice any difference with power efficiency off.


A new driver used to reset the power efficiency setting before, since I always had to toggle it off after an update (even with DDU). So it's off by default for you too after installing 16.9.2? Looks like AMD now sets it off by default.

Actually, there isn't really a difference in most games, but some games (like Fallout 4) didn't like that power efficiency thing: the core clock was fluctuating and dipping down to 800-900MHz in some areas while it was on.


----------



## Dundundata

Quote:


> Originally Posted by *ziggystardust*
> 
> A new driver was resetting the power efficiency setting before, since I always had to toggle it off after an update (even with DDU). So it's off by default for you too after installing 16.9.2? Looks like AMD set it off by default.
> 
> Actually there isn't really a difference in most of the games but some games (like Fallout 4) didn't like that power efficiency thing and core clock was fluctuating and dipping down to 800-900 mhz in some areas while it is on.


Well, now that you tell me this, I'm guessing that happened to me, because I got some weird dips in that game for sure, for no apparent reason.

Yes, it was set to Off after installing the latest driver.


----------



## ziggystardust

Quote:


> Originally Posted by *Dundundata*
> 
> Well now that you tell me this I am guessing that happened to me, because I got some weird dips in that game for sure for no apparent reason.
> 
> Yes it was set to Off after installing the latest driver


Yeah, those weird dips in F4 are exactly what the power efficiency setting causes. There are probably more games with similar problems from that setting.

But another weird issue: a few months ago, occasional driver crashes started to happen while browsing with Chrome or Firefox if power efficiency is off. Recent drivers didn't fix the issue; I hope this one does.


----------



## jkuddyh801

Hey, what's up? I want in. Definitely found another solid silicon-lottery winner: PowerColor PCS+ R9 390 8GB (430+ GB/s bandwidth) @ 1205 MHz core / 1700 MHz mem, using +50 power target / +100 mV core / +25 mV aux voltage / +100 fan speed. Still got room to go, and it's killing benchmarks at the moment! Thanks!





So a few things happened here. I found out through trial and error just how finicky this card really can be, across different clocks, voltages, etc. In some runs, lower core clocks scored better than higher clocks, and vice versa. I've included a benchmark on PCIe 2.0 for fun (the clocks are visible there), plus a previous GPU-Z shot with the typical overclocks seen on this card. The one thing I need to retract from GPU-Z #1: I said it wasn't the best silicon-lottery winner, but I later found it is, and I'm still clocking higher as I write this. I'm just making sure everything is stable before I submit anything; I want to be as thorough as possible with my methodology. So far I'm past 1750 mem and around 1266 core and going. Will update as results come in!


----------



## Pillendreher

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Nice!
> 
> Post back how it works out, interested in your undervolting results (if needed). Thanks to my climate I'm more interested in undervolting instead of overclocking..


I just installed the card. There's a buzzing coming from it under load. God damn it! Is that 'coil whine'?


----------



## GorillaSceptre

Quote:


> Originally Posted by *Pillendreher*
> 
> I just installed the card. There's a buzzing coming from it under load. God damn it! Is that 'coil whine'?


I'd imagine so.. It's loud enough that you can hear it from outside the case? Not sure what you could do tbh, sometimes the PSU can cause it too. That sucks, man. Are you able to get a replacement?


----------



## Pillendreher

Yeah I can hear it through the closed case. If I throttle the card down to 600 MHz, it's gone.

The card itself is used, but was only bought a week ago. I might be able to return it to the store it was bought at, though; I'll have to check the law. Having gone to law school comes in handy in situations like this.

Man, what a week. First that 290X with the ****y ASUS cooling and now this. Damn.


----------



## 113802

Anyone have the PowerColor PCS+ R9 390X BIOS? I've noticed that only PowerColor BIOSes will actually boot when flashed to my card. I found the Devil one, but it black-screens since it has a high core clock.


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> Those are two things I never associate together- Power efficiency and AMD lol. Power efficiency is for casuals


Ah ha, me too! I tweak my card every week. I also fry a card every now and then. I've got a 390 I'm just in love with, plus a spare 390X in the closet (got it in the summer sales). I also have to sneak into Device Manager to kill the AMD audio device. Lol. Out of habit I've been using the Realtek audio drivers, and when I update or reinstall Crimson, I turn off the AMD sound drivers. When I forget, I have no sound at all. Zip! I sometimes go the whole week not knowing the sound is off. When the clan calls and says, "Hey! Sleeping coward, join up the game!" they have to jingle my phone. Lol.


----------



## Pillendreher

This is what my card sounds like:


__
https://soundcloud.com/pillendreher-1%2Fcoil-whine-r9-390

I stopped the fans in the end and it didn't go away. Would another PSU help? I'm using a brand new Corsair Vengeance 550W CM.


----------



## Shatun-Bear

Quote:


> Originally Posted by *Pillendreher*
> 
> This is what my card sounds like:
> 
> 
> __
> https://soundcloud.com/pillendreher-1%2Fcoil-whine-r9-390
> 
> I stopped the fans in the end and it didn't go away. Would another PSU help? I'm using a brand new Corsair Vengeance 550W CM.


I have a Nitro 390X I bought on Amazon Warehouse last week, and it makes the exact same noise under load. But for me it's not noticeable at all with the case lid on, so if mine is coil whine it's very mild. There are ways to try to deal with coil whine if it's bad, although I've never tried anything like that before.


----------



## Dundundata

You could try a different PSU but who knows if that will do anything. I had 2 cards give me whine, returned one and got the same card from somewhere else - both whined. Then I bought a different brand and no whine.

By the way this was a really high pitched noise and very distracting, and got even worse when overclocking.


----------



## Pillendreher

Quote:


> Originally Posted by *Dundundata*
> 
> You could try a different PSU but who knows if that will do anything. I had 2 cards give me whine, returned one and got the same card from somewhere else - both whined. Then I bought a different brand and no whine.
> 
> By the way this was a really high pitched noise and very distracting, and got even worse when overclocking.


I just ordered 3 PSUs (RM550X, TP500C, Straight Power E10-CM-500). Hopefully they'll solve the coil whine. I really like this card for ~200 bucks - great performance, great cooling.


----------



## DarthBaggins

Why just a 500-550w psu?


----------



## Pillendreher

Why should I get a higher wattage? 500-550 should be quite enough.


----------



## DarthBaggins

I would've upped it to at least 600-650W to allow ample room for overclocking, but that's just me. It's not like these cards are the most power-efficient.


----------



## Pillendreher

Quote:


> Originally Posted by *DarthBaggins*
> 
> I would've upped to at least a 600-650 to allow ample room for OverClocking, but that's just me. Not like these cards are the most energy/power efficient.


Oh well, I generally don't overclock.


----------



## DarthBaggins

I'm also figuring your coil whine issue could be down to your current PSU's capacity (if it's around the 500W mark). The RM550x should be ample, but since you don't have your build in your signature we can only guess (we don't know what else is in your build).


----------



## Pillendreher

Quote:


> Originally Posted by *DarthBaggins*
> 
> Also figuring your current issue w/ the coil whine could be due to your current choice in voltage (if it's around the 500 mark). But I can say the RM550x should be ample, but since you don't have your build in signature we can only guess/assume (don't know what else is in your build)


X6 1090T
PowerColor PCS+ 390
ASRock 970 Extreme 3
8GB DDR3
2x 120mm case fans
Samsung Evo 850 500GB

I'm still waiting for Zen to change my whole CPU/Mainboard combo. I'm currently using a Corsair Vengeance 550M, which is totally new btw. I ordered those other PSUs to make sure it's not the Corsair that's causing the whining.

The whining was also there in my other PC (FX-8320, Gigabyte board), which also features a Corsair Vengeance 550M (had to replace the old bequiet PSU since the PC was shutting down on its own).


----------



## Pillendreher

According to some reviews, even though the PCS+ 390 has a dual-BIOS feature, there should be no difference between the two BIOSes. That's not the case for my card: Slot 1 is 015.049.000.004.000000 and Slot 2 is 015.049.000.008.000000.

I couldn't find either version for the PCS+ 390 on TechPowerUp.


----------



## Grenseal

Hi guys! Just got a brand new MSI R9 390 Gaming 8GB for $250 CAD!! The dude I bought it from seemed nervous when I picked it up; I wonder if he stole the card or something.
Right now it's on air; the EK waterblock for it is about $200 CAD. Not sure if I want to watercool it, so I'm hoping for a used block on eBay or a cheap Chinese Bykski knockoff.
I clocked it to 1125 core and 1700 mem, no added voltage yet. I'm new to this card so I don't know if this is any good; here are the numbers:


https://www.techpowerup.com/gpuz/details/avbzu


----------



## lanofsong

Hey R9 390/390X owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you'll need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Pillendreher

My R9 390 doesn't seem to need that much voltage so far. I've gotten it down to -75 mV via Afterburner and haven't had any crashes or artifacts or anything like that yet. Too bad there's still the stupid coil whine, although I think it has quieted down a bit over the last few days.


----------



## DarthBaggins

Quote:


> Originally Posted by *lanofsong*
> 
> Hey R9 390/390X owners,
> 
> Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.
> 
> September Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


There are teams looking for 390/390x folders to compete, I know my team Brass Bottom Boys is definitely looking for RED Team folders


----------



## ziggystardust

Hello folks,

I have a Sapphire 390X Nitro. After installing 16.9.2, I started getting memory errors in HWiNFO64 at stock clocks. Not excessive, though: just 3-5 errors per hour of gaming, sometimes 8-10. I don't remember getting that many errors before, just 1 or 2 sometimes. Is that normal? Anything to worry about?

As far as I can tell, card temps are alright. While playing Witcher 3, for example, the GPU maxes out around 67-68°C and the VRMs at 79-80°C.

I'm just a bit lazy these days to roll back some of those previous drivers to test thoroughly. But I can do it if it's necessary.


----------



## bluej511

Quote:


> Originally Posted by *ziggystardust*
> 
> Hello folks,
> 
> I have a Sapphire 390X Nitro. After installing 16.9.2, I started getting memory errors in HWiNFO64 at stock clocks. Not excessive, though: just 3-5 errors per hour of gaming, sometimes 8-10. I don't remember getting that many errors before, just 1 or 2 sometimes. Is that normal? Anything to worry about?
> 
> As far as I can tell, card temps are alright. While playing Witcher 3, for example, the GPU maxes out around 67-68°C and the VRMs at 79-80°C.
> 
> I'm just a bit lazy these days to roll back some of those previous drivers to test thoroughly. But I can do it if it's necessary.


Doesn't matter. If you have no artifacts, no issues, no crashes, then who gives a flying ..... lol. HWiNFO might be counting errors before they get corrected, so who really knows.


----------



## ziggystardust

Quote:


> Originally Posted by *bluej511*
> 
> Doesn't matter. If you have no artifacts, no issues, no crashes, then who gives a flying ..... lol. HWiNFO might be counting errors before they get corrected, so who really knows.


Yeah, so far no issues, no crashes and nothing visible as artifacts, but it still made me wonder if it's a normal thing or not.


----------



## MrMetaton

Sorry to just jump into the conversation,
but I just realized what true sadness is...
I have the Gigabyte version of the R9 390, which is hard voltage-locked...
And I'm sitting here at 1127 core / 1630 memory, with a max temp of 75°C in every single test I run after an hour...
Fans are at like 65%... For real, the sadness here is true...
I wonder how much I could get out of it with a normal OC.
Some people run their cards at like 90°C overclocked, which is just 4°C below where it starts to throttle...
Not to mention that if I really wanted to go ghetto style, I could just remove the backplate and point a fan at the back of the card. I've also heard some people put fans on the sides of the card, basically pulling air out of it...
Such sadness... xD The potential that I can never see...


----------



## 113802

Still looking for the BIOS of the PowerColor PCS+ R9 390X!

If anyone has it, please upload it!


----------



## TeslaHUN

390 Crossfire @ water, or a GTX 1080 @ water?
1440p resolution.
I already have one 390 under water, so I could buy another one plus a new PSU, or sell it and buy 1x 1080, in which case my current PSU (FSP Raider 550) will be enough.


----------



## battleaxe

I have 390x crossfire and I don't have any issues, works great and I'm very happy with it. That said though, I'd probably go 1080 if I had to buy another PSU as well... but honestly I'd prob wait a bit and get the next AMD card coming out instead of the 1080. I just can't stand the way Nvidia treats its customer base anymore.

DX12/Vulkan and all becoming the new standard... Just my 2c.


----------



## MRT-067

Hello to all,

I'm newly registered to this forum, but I'm familiar with its content, as I've visited this group before.

I too own an R9 390X (Sapphire variant), and I'm currently wondering whether this GPU fits on the ASRock Z170 OC Formula motherboard, because I saw a review in which someone said he was unable to make his R9 390 fit in the first PCIe x16 slot.

So, does anyone have this board?

Thanks to all.


----------



## christoph

why unable to make it fit?

Because of the CPU cooler? Isn't the plastic cover over the I/O shield, where it says OC FORMULA, somewhat in the way?


----------



## MRT-067

I know nothing more than that brief complaint from the stranger who wrote the review.

The reason I posted this question is that I was considering buying this board, but buying without knowing for sure is a no-no for me.

Apart from the review, my concern was that my GPU wouldn't go in because of the I/O plastic cover, which is a single piece extending down through the audio section. From looking at pictures of the board, I gathered that the top PCIe slot is very close to the bottom end of that cover, and I also feared that the small plastic pieces on the front end of my card would hit the slightly raised audio cover and go no further.

So basically I've dropped the idea of getting this board, since I just wasn't confident about it, considering I can't buy another GPU right now if something went wrong during installation. No returns, no refunds if that happens, right?


----------



## christoph

yeah you better get a usual board, one that people are using already with the video card you own


----------



## diggiddi

Are these cards no longer in production or what?


----------



## Rexer

Quote:


> Originally Posted by *battleaxe*
> 
> I have 390x crossfire and I don't have any issues, works great and I'm very happy with it. That said though, I'd probably go 1080 if I had to buy another PSU as well... but honestly I'd prob wait a bit and get the next AMD card coming out instead of the 1080. I just can't stand the way Nvidia treats its customer base anymore.
> 
> DX12/Vulkan and all becoming the new standard... Just my 2c.


If you had to buy something that big, that's the way to go: the 1080. I need more denture work before I can justify that. I'm currently convinced the 390 and 390X are better than the 480 and 1070. I don't think benchmark programs give accurate gameplay stats. Nvidia also likes to sing the song that 256-bit is enough, but I think bus width matters: the bigger the pipe, the more you can push through it at once. I can't figure out how they can claim compression is better; compressed data always has to be decompressed, while a 512-bit bus is straight through. My opinion is that no one right now will see full use of the 512-bit bus. The 390 alone has 384 GB/s of bandwidth, more than the 980 Ti's 336 GB/s on its 384-bit bus.

By the way, my XFX XTR 750W PSU is giving me fits. I need to RMA it. I've got a smaller Seasonic 550W sitting around for temporary use. Think that's enough for a 390? I could also borrow (secretly swap) my 550W for my boss's Seasonic 650W, since he doesn't use its full capacity. All of this is single-rail. Do you think I'll need more than 600W to run a 390?
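Those bandwidth figures fall straight out of bus width times effective memory data rate. A quick sketch of the arithmetic (the `mem_bandwidth_gbs` helper is just for illustration, not from any vendor tool):

```python
# Peak theoretical memory bandwidth: bus width in bytes x effective data rate per pin.
def mem_bandwidth_gbs(bus_bits, effective_gbps):
    """GB/s = (bus_bits / 8) bytes per transfer * per-pin data rate in Gbps."""
    return bus_bits / 8 * effective_gbps

# R9 390: 512-bit bus, GDDR5 at 1500 MHz = 6.0 Gbps effective per pin
print(mem_bandwidth_gbs(512, 6.0))  # 384.0 GB/s
# GTX 980 Ti: 384-bit bus, 7.0 Gbps effective per pin
print(mem_bandwidth_gbs(384, 7.0))  # 336.0 GB/s
```

The same math explains the ~435 GB/s quoted earlier for a 1700 MHz memory overclock: 512/8 × 6.8 Gbps ≈ 435 GB/s.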


----------



## TeslaHUN

What's the max voltage recommended for daily use? MSI 390 + Alphacool NexXxoS GPX water cooling (VRMs/VRAM cooled passively).
Also, is it worth overclocking the memory, or is it not worth the extra heat?


----------



## Rexer

Quote:


> Originally Posted by *diggiddi*
> 
> Are these cards no longer in production or what?


Yeah. While I had the chance, I bought a spare 390X in the summer sale for under $275 (that's close; I've forgotten the exact price). I'd have been a fool not to. I now have the option to keep it (which seems better) or sell it. But yeah, I don't think we're going to see a better AMD card till next year.

Cassettes! I used to sell custom-loaded TDK (SA) and Maxell cassettes for years, from packs of 10 to 500-count boxes and up, any size from C-5 to C-120. The last generation of audio cassette technology was really astonishing. The best CrO2 and Metal tape was actually better than CD: the audio range could reach 22,000 Hz as opposed to 20,000 Hz on CD, and it had a depth (or breath), a soundstage presence. I do miss working in the old audio shop.
The bad part of listening on audio gear is that the best-quality sound is now in the past.


----------



## diggiddi

Quote:


> Originally Posted by *Rexer*
> 
> If you had to buy something that big, that's the way to go: the 1080. I need more denture work before I can justify that. I'm currently convinced the 390 and 390X are better than the 480 and 1070. I don't think benchmark programs give accurate gameplay stats. Nvidia also likes to sing the song that 256-bit is enough, but I think bus width matters: the bigger the pipe, the more you can push through it at once. I can't figure out how they can claim compression is better; compressed data always has to be decompressed, while a 512-bit bus is straight through. My opinion is that no one right now will see full use of the 512-bit bus. The 390 alone has 384 GB/s of bandwidth, more than the 980 Ti's 336 GB/s on its 384-bit bus.
> 
> By the way, my XFX XTR 750W PSU is giving me fits. I need to RMA it. I've got a smaller Seasonic 550W sitting around for temporary use. *Think that's enough for a 390? I could also borrow (secretly swap) my 550W for my boss's Seasonic 650W, since he doesn't use its full capacity. All of this is single-rail. Do you think I'll need more than 600W to run a 390?*


I think it should work (550W), but it depends heavily on the CPU and whether it's stock or heavily OCed.
The 650W would be the better choice, as you already know; I'd get it in a heartbeat.
FYI, I was able to run my FX-8350 @ 4.8GHz and overclock two 290X Lightnings @ 1230/1620 on my Antec 750 HCG for a while,
then it started taking its toll.

A 390 should consume approx. 300W at full chat.


----------



## Rexer

Quote:


> Originally Posted by *diggiddi*
> 
> I think it should work(550w) but it depends heavily on CPU and whether its stock or heavily OCed,
> the 650 should be a better choice as you already know, I'd get it in a heartbeat
> FYI, I was able to run my FX 8350 @4.8 ghz and overclock 2 290X lightnings @1230/1620 on my Antec 750 HCG for a while
> then it started taking its toll.
> 
> 390 should consume approx 300watts at full chat


Thanks. He ain't gonna miss it.


----------



## diggiddi

Quote:


> Originally Posted by *Rexer*
> 
> Yeah. While I had the chance, I bought a spare 390x on summer sale for under $275 (that's close. I forgotten the exact price). I'd be a fool if I didn't. I have the option now to keep it (which seems better) or sell it. But yeah, I think we're not going to see a better AMD card till next year.
> 
> Cassettes! I used to sell custom-loaded TDK (SA) and Maxell cassettes for years, from packs of 10 to 500-count boxes and up, any size from C-5 to C-120. The last generation of audio cassette technology was really astonishing. The best CrO2 and Metal tape was actually better than CD: the audio range could reach 22,000 Hz as opposed to 20,000 Hz on CD, and it had a depth (or breath), a soundstage presence. I do miss working in the old audio shop.
> The bad part of listening on audio gear is that the best-quality sound is now in the past.


Good deal on the 390X, smart move on your part. I'm looking to acquire one in the future, and you're right,
we won't be seeing a better card till '17.
C-5? Never seen those before, just the regular C-60s and C-90s with the occasional C-30 sprinkled in.
I didn't know tapes had that much dynamic range, but you probably needed a Nakamichi Dragon to best a CD, no?
If your pockets are deep you can still get Metal tapes and tape decks on eBay.


----------



## Rexer

Quote:


> Originally Posted by *diggiddi*
> 
> Good deal on the 390x, smart move on your part. I'm looking to acquire one in the future and you are right,
> we wont be seeing a better card til '17
> C-5? never seen those before, just the regular C60's and C90's with occasional C30 sprinkled in
> Did not know tapes had that much dynamic range, but you prolly needed a Nakamichi Dragon to best a Cd,no??
> If ya pockets are deep you can still get Metal tapes and tape decks on Ebay


The 390 and 390X prices have shockingly jumped back up as availability diminishes, especially the 390X (outside of Gigabyte's voltage-locked board). But that's the way computer components go. One week I got a replacement for a customer's motherboard at half cost; the next week it was twice retail. Lucky me, the buy was at the right time. I've had great buys on CPUs, but sometimes motherboard prices were through the roof, or boards just weren't available.
Studio jocks and bands would buy C-5s to give away as samples to record promoters and agents. They used them for auditioning at night clubs and for event promotions (like political rallies). Radio jocks looking for jobs would use C-5s for voice samples from their past shows as an audition tool. I even had a guy who was selling black-market U.S. radio rock shows to England. It was a crazy business.
I sold junk cassettes, too. TV "get rich quick" scam guys would want thousands of cheapie cassettes: a ten-cassette pack on how to buy and sell homes, a ten-volume pack teaching how to buy stocks, etc. Big churches would buy in bulk on a monthly basis. They'd ask me for a size, type and price; my helpers would spin them off on big cassette-loading machines, put them in boxes and ship them off.
I remember the studio guys liked the Nakamichi Dragon. I used Tascam, Teac V-970X and Akai GX-9 decks, all for dbx archiving, and an Akai GX-70R for personal use. Can't be cheap around clients and customers. Funny how I can still remember that. I'm sure someday I'll get back into it. I still have a small cassette library in a cool-storage facility. The very best sound came from ramping the cassette speed (pitch); only a few machines could do that.


----------



## Rexer

Quote:


> Originally Posted by *christoph*
> 
> yeah you better get a usual board, one that people are using already with the video card you own


Yep.


----------



## Rexer

Quote:


> Originally Posted by *Pillendreher*
> 
> My R9 390 doesn't seem to need that much voltage so far. I've gotten it down to -75 mV via Afterburner and haven't had any crashes or artifacts or anything like that yet. Too bad there's still the stupid coil whine, although I think it has quieted down a bit over the last few days.


Coil whine. I've heard that term before. Is that a spinning sound? A vibrating sound? It's not the little buzzer speaker that emits error codes? The only moving parts are the fans, so if the fans aren't making the noise, whatever is making it obviously isn't normal.
I remember hearing buzzing noises from CPU-controlled machinery because a certain part wasn't functioning properly. I think it was capacitors hooked up in parallel: when one of them stopped working properly, the others would take up the slack. The device would keep running, but the full load would strain the remaining capacitors. Performance in general got slower and hotter, and it would make errors or halt mid-sequence. The buzzing sound was caused by voltage straining under the forced load, like a bottleneck.
I imagine lowering the voltage would relieve some of the load, but going lower than the recommended settings could cause voltage to arc across connections rather than making clean contact between components. That arcing would burn them intermittently (a brownout effect), widening the gap across the connections. When this happens at micro scale you can never find where it burns. Can you pinpoint where the sound is coming from?


----------



## Mister300

Coil whine is a strange one to pin down; I blame it on poor epoxy mounting of components on the PCB. If you crack open a pre-1990 piece of gear, you'll see globs of epoxy on the filter caps and related components. I second the cassette tape era: I have a three-head Nak CR-series deck and it still runs today. BTW, a relative of mine owns a recording studio, and his finest rig is a Nagra open-reel digital recorder. Amazing sonics, flat down to 10 Hz.


----------



## Dundundata

Coil whine (for me anyway) was a high-pitched sound, a sort of buzz I suppose, that got worse overclocked and under load. I can even hear it slightly on my new card on those rare occasions it's running 1000s of fps.

You can find a lot of deals on these cards now if you don't mind used; I've seen sub-$200 390s. I won't crossfire though, because my PSU is 750W and my case could be bigger.


----------



## Rexer

Quote:


> Originally Posted by *Mister300*
> 
> Coil whine is a strange one to pin down, I blame it on poor epoxy mounting of components on pcb. If you crack open a pre 1990 piece of gear you will see globs of epoxy on filter caps and related components. I second the cassette tape era I have a three head Nak CR series deck and it still runs today. BTW my relative owns a recording studio and his finest rig is a Nagra open reel digital recorder, amazing sonics. Flat to 10Hz.


I can't pinpoint the irritating buzz my PSU made before it shut down. I reckoned it to be coil whine, except my system shut down completely and didn't stay running. It sounded more like undervoltage (brownout) frying. It's running right now, and that makes me feel tentative about gaming on it.
A while back (in 2013), I bought a bogus Corsair RM750 on the advice of a friend, and I regretted it. It fried a GPU, a CPU and a motherboard. The automatic fan temperature settings were wrong; it got hot and had a hint of that strange electrical-burn smell. It had serial numbers that had been recalled before I bought it, which I didn't find out until I RMA'd it. I can't understand why Corsair didn't remove the bad ones from retail shelves. Rather than reinstalling it, I gave it to a friend.
So I got this XFX XTR 750W, made by Seasonic. It had been running for two years until now, and it too is acting like it's going south. That's why I asked diggiddi if my spare 550W had enough power to push a 390 until I RMA the 750W. I've got a 650W at my disposal, too.
I know the 390 and 390X draw about 300W at the top end, but how much can that spike to? I overclock like everyone else does, so I figure the draw runs to excess. Doing ultra-FPS gaming with 64 nutcases in Battlefield 4, I'm thinking there's a chance I somehow destroyed this 750-watter.

Yeah, I remember Nagra. They also made a great portable digital recorder that was the love of the audiophile world. They gushed with praise, lol.
What's the name of your relative's studio? Guys like sound men and gophers used to roam from studio to studio. I worked with some guy (can't remember his name anymore) who did mixing work for jazz guys John Klemmer, Lee Ritenour and Joe Sample. Sort of fun.
I used a huge cast-iron Ampex tub for broadcast editing. It was just a big 4-track reel-to-reel with its own stand. Studer Revox made a dandy reel-to-reel too, but I think the Nakamichis were the king-of-the-hill recorders. It's amazing how they would go past 21,000 Hz. You began hearing the breath of recording halls and night clubs, the slick surface of guitar strings and stand-up bass instruments, footsteps prattling around the recording stage, someone whispering for Ned to move his head. It was amazing what a cassette could do. In a good setup it would shine.


----------



## diggiddi

On a good-quality 750W you could still crossfire, just with no OC, or at most a memory OC with the core left at stock.


----------



## bluej511

Quote:


> Originally Posted by *diggiddi*
> 
> On a good quality 750w you could still Xfire just no OC or at most only mem OC but leave core at stock


It really depends on the CPU. I can push 225W on my R9 390 with ease, not even slightly OCed. With my CPU OCed and the card at stock I'm easily pushing 400W total; add in RAM, HDD, fans, etc. and it gets up there easily. Two cards alone are easily 400W. I like to be efficient, so I get a PSU that will sit at about 40-50% of its rating under my load. It's never good to load a PSU past 80% anyway.

Yeah, he could run crossfire on a 750W PSU, but why? What happens if he OCs both, lol.
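The 80% rule of thumb above is easy to put into numbers. A rough sizing sketch; the per-component wattages are ballpark guesses for illustration, not measurements:

```python
# Smallest PSU rating that keeps total draw at or under the given load fraction.
def min_psu_watts(component_watts, max_load_fraction=0.8):
    return sum(component_watts) / max_load_fraction

# Two stock R9 390s (~275 W each), an OCed CPU (~150 W), rest of system (~75 W)
crossfire_rig = [275, 275, 150, 75]
print(round(min_psu_watts(crossfire_rig)))  # 969 -> a 750 W unit is already past 80%
```

Which is roughly why a 750W unit leaves little headroom once both cards and the CPU are overclocked.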


----------



## Dundundata

Quote:


> Originally Posted by *diggiddi*
> 
> On a good quality 750w you could still Xfire just no OC or at most only mem OC but leave core at stock


No overclock







If I was going to run 2 cards I would just get a new case and PSU. You can see my case; it's good for one card. Actually my biggest issue is that most of the cables are too long, or have unnecessary extensions. Anyway, for 1080p one card does a great job. Looking ahead to 4K, I'd like to see what the next line of cards has to offer.


----------



## PunkX 1

Going anything above 1.3v on the R9 390 causes a black screen. Anyone else faced this?


----------



## Dundundata

I regularly run my card at 1.325V and can run it at 1.375V no problem


----------



## battleaxe

Quote:


> Originally Posted by *Dundundata*
> 
> I regularly run my card at 1.325 and can run it at 1.375 no problem


What kind of clocks can you achieve at that voltage?


----------



## Dundundata

My normal OC is 1140/1625 @ 1.325V. I can go up to 1180 @ 1.375V. Stock is 1.275V. I could push it further, but I like to be 100% stable with no artifacts.


----------



## tolis626

I think problems start popping up when the cards push over 1.3V under load. 1.3V at idle is fine, I've gone over 1.4V and had no problems. Under load I've only made it to about 1.3V, but that was already at a huge offset so I didn't even let it finish 'cause I want my card alive and, well, not well-cooked.


----------



## Rexer

Quote:


> Originally Posted by *Dundundata*
> 
> No overclock
> 
> 
> 
> 
> 
> 
> 
> If I was going to run 2 cards I would just get a new case and PSU. You can see my case, it's good for one card. Actually my biggest issue is most of the cables are too long, or have unnecessary extensions. Anyways for 1080p the 1 card does a great job. Looking forward to 4k I'd like to see what the next line of cards has to offer.


I just pulled the 650W from my boss's computer... I'm going to put it back and buy a bigger one. I'm thinking I'll need 800W+ for next year, moving up to crossfire and 1440p. Thanks.


----------



## Rexer

Quote:


> Originally Posted by *diggiddi*
> 
> On a good quality 750w you could still Xfire just no OC or at most only mem OC but leave core at stock


Wasn't expecting to hit a limit. Having second thoughts about moving up.


----------



## Dundundata

Quote:


> Originally Posted by *tolis626*
> 
> I think problems start popping up when the cards push over 1.3V under load. 1.3V at idle is fine, I've gone over 1.4V and had no problems. Under load I've only made it to about 1.3V, but that was already at a huge offset so I didn't even let it finish 'cause I want my card alive and, well, not well-cooked.


Hey we have the same motherboard, I just got it. Do you think it would be worth it for me to swap my 1600 ram for 2400?

What's your stock voltage? I'm only running +50mV most of the time but that puts me over 1.3V. Doesn't seem to be a problem for the card.


----------



## PunkX 1

I've been running my 390 at close to 1.28v under load for almost a year now. No problem whatsoever.


----------



## PunkX 1

Quote:


> Originally Posted by *Dundundata*
> 
> My normal OC is 1140/1625 @1.325. I can go up to 1180 @1.375. Stock is 1.275. I could push it further but I like to be 100% stable with no artifacts.


Is this the voltage under load?


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> Hey we have the same motherboard, I just got it. Do you think it would be worth it for me to swap my 1600 ram for 2400?
> 
> What's your stock voltage? I'm only running +50mV most of the time but that puts me over 1.3V. Doesn't seem to be a problem for the card.


Nice choice dude, the M7F is an awesome piece of hardware. And it looks pretty!









Now, I overclocked my RAM to 2400MHz and I did see a difference, mainly in the minimum framerates of some games, but that's about it. To be honest, with the current prices of DDR3, I see no reason to buy 1600MHz. As long as your CPU's IMC can handle it, I'd go to 2400MHz.

My GPU's stock voltage is 1.275V too, so we're about on the same boat here, that's why I was commenting. +50mV does make it run at 1.325V at idle (when running in 3D mode but without real load, like when running a GPU accelerated app). Under load is a different story, however. Fire up a game and it lands in the 1.23V-ish range. I have to go well over +100mV to get close to 1.3V under load. I've never had problems because I've never used over 1.3V for an extended period of time, as my cooling probably can't handle it. MSI's cooler is great and all, but there's only so much you can do with air cooling.

Sent from my GT-I9300 using Tapatalk


----------



## Dundundata

I will have to check tonight because I usually look at the maximum value and don't monitor voltage while gaming, so maybe it's running lower under load than I think.

I've been running +50mV for over a year now and temps are good. Not all the time just for certain games, but it sees a good amount of overclocking.


----------



## diggiddi

Quote:


> Originally Posted by *Rexer*
> 
> I just pulled the 650w from my bosses, computer... I'm going to put it back and buy a bigger one. I'm thinking *I'll need 800+ for next year*, moving up x-fire and 1440p. Thanks.


I'd go with a minimum of 1000w


----------



## bluej511

Quote:


> Originally Posted by *diggiddi*
> 
> I'd go with a minimum of 1000w


If he's getting R9 390s, sure. If he's going Vega or Polaris, there's no point; those use about 150W each, so 1000W would be a bit too much. He'd be fine and efficient with 850W.
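For anyone doing this math at home, the sizing logic in this thread boils down to a back-of-the-envelope calculation. A minimal sketch, where the TDP figures, the 20% OC margin and the 20% PSU headroom are illustrative assumptions, not measured values:

```python
def recommended_psu_watts(gpu_tdp_w, gpu_count, cpu_tdp_w,
                          other_w=75, oc_margin=1.2, headroom=0.8):
    """Back-of-the-envelope PSU sizing.

    gpu_tdp_w -- board power of one card (e.g. ~275W for an R9 390)
    oc_margin -- extra draw from overclocking/voltage bumps (assumed 20%)
    headroom  -- load the PSU to ~80% of its rating at most
    """
    draw = (gpu_tdp_w * gpu_count + cpu_tdp_w) * oc_margin + other_w
    return draw / headroom

# Two overclocked R9 390s plus a ~100W CPU:
print(round(recommended_psu_watts(275, 2, 100)))   # 1069 -> ~1000W-class unit
# A single ~150W Polaris-class card:
print(round(recommended_psu_watts(150, 1, 100)))   # 469 -> a 550-650W unit is plenty
```

With these assumptions the numbers land right where the thread does: about 1000W for overclocked 390 CrossFire, far less for a single next-gen card.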


----------



## Rexer

Quote:


> Originally Posted by *bluej511*
> 
> If hes getting r9 390s sure. If hes doing vega or polaris, no point. They use about 150w 1000w would be a bit too much. Hed be fine and efficient with 850


390. Maybe 390x if I find another.


----------



## Dundundata

Well I have much to learn. Yes my voltage was well under 1.3 overclocked and under load. What's with the spikes at idle though?


----------



## 2jzom

What are stable OC settings for a Sapphire R9 390 Nitro Tri-X OC 8GB?


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> Well I have much to learn. Yes my voltage was well under 1.3 overclocked and under load. What's with the spikes at idle though?


It's not spiking at idle; the idle voltage is fine, that's what it should be. It's just that vdroop kicks in under load and lands you at a lower voltage. In an ideal world there wouldn't be any vdroop, but it is what it is. I wouldn't worry too much about it though.


----------



## Rexer

Quote:


> Originally Posted by *2jzom*
> 
> what is a stable OC settings for sapphire r9 390 nitro tri-x oc 8gb??


That's a tough one. A universally stable setting is sort of nebulous; overclocking is a practice, and you have to work up to it. Everybody's computer is configured differently: fans, water coolers, heatsinks, PSU capacity, CrossFire, motherboard BIOS changes and programs all have different requirements, so they draw different voltages and load the GPU differently.
It's a gradual process of ramping up or downclocking; you have to find your own sweet spot.
For starters, raise the power limit all the way (+50%) and increase the GPU clock in 5-10MHz increments without touching voltage. When it starts artifacting or producing anomalies (like snow patterns), raise the voltage a few mV until it's stable. I wouldn't mess with the memory clock yet. I raised mine to 1600MHz, but honestly I don't see much difference from it.
Everyone has their own way to reach the best stable overclock; mine isn't the only one, but I think it's a reasonably safe practice.
Hope this helps.
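The ramp-up procedure above can be sketched as a loop. This is only an illustration: `is_stable` stands in for a real stability test (a Heaven run while you watch for artifacts), and all the numbers are hypothetical, not tuning advice for any particular card.

```python
def find_stable_oc(is_stable, base_clock=1000, max_clock=1200,
                   step=10, base_mv=0, max_mv=100, mv_step=6):
    """Raise the core clock in small steps; when a clock fails the
    stability test, add a few mV until it passes again.
    Returns (highest stable clock, voltage offset in mV)."""
    clock, mv = base_clock, base_mv
    while clock + step <= max_clock:
        candidate = clock + step
        while not is_stable(candidate, mv):
            if mv + mv_step > max_mv:
                return clock, mv      # out of voltage headroom: keep last good
            mv += mv_step
        clock = candidate
    return clock, mv

# Toy stability model: each +25mV buys ~20MHz over a 1040MHz baseline.
stable = lambda clk, mv: clk <= 1040 + (mv // 25) * 20
print(find_stable_oc(stable))
```

The point of the sketch is the shape of the process, not the numbers: clock first, voltage only when needed, and stop when you run out of voltage you're comfortable with.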


----------



## Rexer

Quote:


> Originally Posted by *Dundundata*
> 
> I will have to check tonight because I usually look at the maximum value and don't monitor voltage while gaming, so maybe it's running lower under load than I think.
> 
> I've been running +50mV for over a year now and temps are good. Not all the time just for certain games, but it sees a good amount of overclocking.


Hey, this is gpu voltage you're talking about, not Power Limit, right?


----------



## b0uncyfr0

There are currently at least 5 390X cards on the XFX page and I cannot tell what their differences are. Are they just revisions of each other?

I'm planning to upgrade from a VX 290X 8GB that overclocks like **** (+100mV for 1100 core) to an XFX 390X for a small exchange in cash. I tried to read the owners list and check voltage/temp/clocks but can't scroll to the right..

A few things to consider in my case:
1) I have an FT02 (vertical fans that blow from the bottom), so the long heatpipe design isn't ideal for these cards. The VX has excellent cooling and keeps the core around 80 even with +100mV; cooling-wise the VX is the best.
2) The XFX would be a tad shorter and is more closed off than the VX, so more air might get in from the bottom and be pushed out through the ports, compared to the VX where everything is dumped into the case by those three smaller fans.

As for the XFX cards:
1) Are there any issues like fans breaking, poor paste, the heatsink not making proper contact, specific revisions that are good (as hinted on the intro page), anything at all?
2) The intro page said that cooling on the core is merely adequate, which has me a bit concerned.
3) Is it worth it in your opinion?


----------



## jdorje

Fury x for $325. Worth the upgrade at [email protected]? Hawaii used prices are inflated so i could probably sell my 390 for $200. Haven't even had it for a year but that's only $35 less than i paid.

Or, wait for black Friday or Vega?


----------



## DarthBaggins

We had a Fury X on open box for $260 and I missed my chance at snagging it


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> We had a Fury X on open box for $260 and I missed my chance at snagging it


Ow. Yeah. Hopefully, there will be one drop from now till black Friday.


----------



## DarthBaggins

I am planning on selling off my 390x to save for the 490(x) as it's looking to be a really decent card, just hope it's price-point is as well


----------



## flopper

Quote:


> Originally Posted by *jdorje*
> 
> Fury x for $325. Worth the upgrade at [email protected]? Hawaii used prices are inflated so i could probably sell my 390 for $200. Haven't even had it for a year but that's only $35 less than i paid.
> 
> Or, wait for black Friday or Vega?


Clearing out stock means Vega is close.
I'm waiting for Vega myself.
The Fury is a great card for that resolution, however.


----------



## lanofsong

Hey R9 390/390X owners,

We are having our monthly Foldathon from Monday 17th - 19th 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

October Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> I am planning on selling off my 390x to save for the 490(x) as it's looking to be a really decent card, just hope it's price-point is as well


I heard that rumor. The RX 490 will supposedly be released in early 2017. From what I've heard, Vega is a pretty large chip compared to Polaris 10, with a wider than 256-bit bus for 4K. Price-wise, I know nothing.
I did hear Nvidia wants to take prices for their big GTX cards into the $1,000 range. So that sorta 5uck5 for Nvidia loyalists; they just want to jack up market prices for video cards. On the other hand, so what? I'm no fan of theirs. They've always got some kind of scheme going on their loyal customers.


----------



## DarthBaggins

The price point is what is veering me back to AMD on GPU's especially if the 490 is at a good price (less than a 1080 but closer to the 1070)


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> The price point is what is veering me back to AMD on GPU's especially if the 490 is at a good price (less than a 1080 but closer to the 1070)


Probably for me, too. I saved coin for the 1080, but every time I hear stuff like the class-action lawsuit filed by 970 owners, I get tentative about buying anything from them. Just a while back, I heard they even lied about the 970's render output units, claiming 64 instead of 56. So what else on the box is even right? When it comes down to it, it's really a $250 card with lies printed all over the box. Is even the 1140MHz clock speed correct?


----------



## diggiddi

Quote:


> Originally Posted by *Rexer*
> 
> Probably for me, too. I saved coin for the 1080 but every time I hear stuff like a class action lawsuit filed by 970 owners, I get tentative about buying anything from them. Just a while back, I heard they even lied about 970's render output units from 64 to 56. So what else could be right? *When comes down to it, it's really a $250 card with lies printed all over the box*. Is even the 1140mhz clock speed correct?


LOL Classic, I think i'll add that to my sig


----------



## b0uncyfr0

Can anyone see how my modded 290x compares to a similarly clocked 390x?

Specs:
3770K at 4.6GHz (HT on)
16GB DDR3 at 2600MHz
290X at 1100/1500 with 390X mods
SSD, naturally
Crimson 10.6.1
Windows 10 x64 build: not sure (will provide ASAP)
I will post benchmarks as soon as I get home today.


----------



## DarthBaggins

Quote:


> Originally Posted by *Rexer*
> 
> Probably for me, too. I saved coin for the 1080 but every time I hear stuff like a class action lawsuit filed by 970 owners, I get tentative about buying anything from them. Just a while back, I heard they even lied about 970's render output units from 64 to 56. So what else could be right? When comes down to it, it's really a $250 card with lies printed all over the box. Is even the 1140mhz clock speed correct?


I was actually able to get my Strix 970 to 1500mhz on a stable OC, so they are a good clocking card but I only paid $280 for it


----------



## 12Cores

Put my cards under water and was able to get them to 1150/1600. I can now run Witcher 3 with all settings at max, including HairWorks, at a locked 60fps with vsync on. Curious to see if a single Vega card will outperform these two.


----------



## TeslaHUN

Quote:


> Originally Posted by *12Cores*
> 
> Put my cards under water, was able to get them to 1150/1600. I can now run wither 3 with all settings at max including hairworks at a locked 60fps with vsync on. Curious to see if a single Vega card will outperform these two.


Let's see it! Post a photo of your config. Which waterblock did you buy, a full-cover for $120 or a universal block for $50?


----------



## 12Cores

Quote:


> Originally Posted by *TeslaHUN*
> 
> Let us see, make photo about ur config ! What waterblock did u buy ? fullcover for 120$ or universal block for 50$ ?


Running two universal VGA water blocks with two 120mm fans blowing on the VRMs; the blocks cost about $55 US apiece.


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> I was actually able to get my Strix 970 to 1500mhz on a stable OC, so they are a good clocking card but I only paid $280 for it


I think Nvidia built the 970 on lower specs but beefed up the printed numbers so they could charge a price that recouped the R&D they put into it. After all, who would want to pay $329-350+ for 3.5GB of properly utilized memory in a 4.0GB frame buffer, a 196-bit bus instead of the 256-bit advertised, and 56 render output units instead of 64? The specs land between an R9 280X and a 290.
The R9 280X uses a 384-bit bus, 32 ROPs and 3GB of memory.
The R9 290 uses a 512-bit bus, 64 ROPs with 4GB of memory.
The big box-marked specs (4GB) put it closer to R9 290 prices, which is why I think it was no mistake. I think its real price point should be $250 to $300.
I own an EVGA GTX 970 SC in a bench computer. By no means is it slow; I just think it's overpriced.


----------



## DarthBaggins

And in the end people thought they were getting a card that would perform similar to the 980 for a lower cost


----------



## flopper

Quote:


> Originally Posted by *DarthBaggins*
> 
> The price point is what is veering me back to AMD on GPU's especially if the 490 is at a good price (less than a 1080 but closer to the 1070)


I wouldn't expect Vega to be cheap.
Performance-wise, as seen with Fury, they do superbly with DX12, but it will still likely cost more than the Fury line did initially.


----------



## realmister

Guys, 390 Nitro owner here. I use FreeSync and can't live without an overclock. I was able to reach 1180MHz/1800MHz stable in the Heaven benchmark.

I want to know if someone else is having the same issues. With the latest driver, I'm getting a random crash that leads to a RED SCREEN while playing TW3 with a mild overclock (1160MHz/1500MHz). Now, I've heard of black screens and blue screens, but RED is a first for me. It seems to occur more frequently when I overclock, but I've had crashes at stock as well.

Yesterday I did some tests and noticed that whenever I clock my VRAM above the stock 1500, the core voltage spikes to over 1.35V, in a most uncanny manner. I mean, I can't reach that core voltage even when I crank the Afterburner slider all the way up, so why the hell do those spikes start appearing when I set my VRAM to 1501? Nevertheless, the spikes seem unrelated to the red screen crash, because I got it even with the VRAM clock at stock.

I really like this card, but this situation is getting on my nerves. I don't know if it's the driver's fault. Some dude on reddit suggested rolling back to the 16.7.x driver, but I didn't want to miss out on the new improvements.


----------



## ziggystardust

Quote:


> Originally Posted by *realmister*
> 
> Guys, 390 Nitro owner here. I use freesync and can't live without overclock. I was able to reach 1180Mhz/1800Mhz stable on Heaven Bench.
> 
> I'm want to know if someone else is havin the same issues. With the latest driver, I'm getting a random crash which leads to a RED SCREEN while playing TW3 with a mild overclock (1160Mhz/1500Mhz). Now, I've heard of black screens, blue screens, but RED it's a first for me. It seems to occur more frequently when I overclock, but I've had crashes on stock as well.
> 
> Yesterday I did some tests, and noticed that whenever I clock my VRAM above the 1500 stock, the core voltage spikes to over 1,350V, in a most uncanny manner. I mean, I can't reach that core voltage even when I crank the Afterburner slider all the way up, so why the hell when I put my vram to 1501 those spikes start appering? Nevertheless, the spikes seem unrelated to the RED screen crash, because I had it even with the vram clock on stock.
> 
> I really like this card, but this situation is getting on my nerves. I don't know if it's the drivers fault. Some dude on reddit suggested rolling back to the 16.7.x driver, but I didn't want to miss on the new improvements.


The best way to make sure if it's caused by the driver or not is to test with some of those previous drivers. It's the best thing to do for the time being.

I was actually getting noticeably more GPU memory errors with the 16.9.1 and 16.9.2 drivers while playing TW3. Not a single error on previous drivers. With 16.10.1 it's a lot less now, but it still happens from time to time. AMD is clearly messing something up in the drivers lately, so the best way to be sure is to test thoroughly for a while with other drivers.

Does it happen in any other game?


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> And in the end people thought they were getting a card that would perform similar to the 980 for a lower cost


Nvidia's gotta be getting a laugh outta this. The more you think about it, the more it's a slap in the face, and not only from them but from their partner companies too. Haw! They said it was their advertising dept that made a mistake, but everyone at Nvidia had to be looking at that box. They're a big company with lots of eyes, so how much of a mistake can it be, especially when their partners (Asus, EVGA, MSI, etc.) are tuning their particular board circuits to complement the GPU and then printing the boxes up. Har har, Nvidia fans. They ripped off as much as $100 from every 970 customer who paid full price.
C'mon, are the partner companies blind too? Lol. They're saying, "Oh, Nvidia wants to print false information and screw their fan base. Well, let's see if we can do that, too." It's pretty amazing. None of them said anything.
No, I don't think the 970's a bad card. In fact it's an engineering feat, but the truthfulness and the money are amiss. I'm just not that blind. Lol.


----------



## bluej511

Quote:


> Originally Posted by *ziggystardust*
> 
> The best way to make sure if it's caused by the driver or not is to test with some of those previous drivers. It's the best thing to do for the time being.
> 
> I was actually having noticeably higher gpu memory errors with 16.9.1 and 16.9.2 drivers while playing TW3. Not a single errors in previous drivers. W,ith 16.10.1 it's a lot less now but still happening from time to time. AMD is clearly messing up something in drivers lately, so best way to make sure is to test thoroughly for a while with other drivers.
> 
> Does it happen in any other game?


Those errors in HWiNFO or whatever you're using don't mean anything at all. It's probably showing you uncorrected memory errors, which isn't a problem in itself. If your card has no issues at stock and/or OCed clocks then there's no problem; don't go looking for one.


----------



## mynm

Quote:


> Originally Posted by *ziggystardust*
> 
> The best way to make sure if it's caused by the driver or not is to test with some of those previous drivers. It's the best thing to do for the time being.
> 
> I was actually having noticeably higher gpu memory errors with 16.9.1 and 16.9.2 drivers while playing TW3. Not a single errors in previous drivers. W,ith 16.10.1 it's a lot less now but still happening from time to time. AMD is clearly messing up something in drivers lately, so best way to make sure is to test thoroughly for a while with other drivers.
> 
> Does it happen in any other game?


I'm having GPU memory errors with my 380 too. Yesterday I reinstalled DirectX with this: https://www.microsoft.com/en-us/download/confirmation.aspx?id=8109 ; after that I don't see errors, and I don't see a problem with DirectShow in dxdiag, but I'm not 100% sure it solved the problem.

According to this: http://answers.microsoft.com/en-us/windows/forum/windows_10-hardware/how-do-i-unistallreinstall-directx-on-windows-10/f703b4ef-74e1-434e-8fce-1e84743301a6?auth=1 it only reinstalls the DX11 files, so if it is a DX12 problem you would have to reinstall Windows.


----------



## Dundundata

Quote:


> Originally Posted by *bluej511*
> 
> Those errors in hwinfo or wtv you're using don't mean anything at all. Its probably showing you uncorrected memory errors which isnt a problem. If your card has no issues at stock clocks and/or OCed clocks then there's no problem and don't look for one.


Yeah I just noticed I get errors on HWinfo, they show up when OCed but everything runs fine as far as I can tell - no artifacts anyway.


----------



## gupsterg

The Stilt gave the info to Mumak, the author of HWiNFO, to implement the feature, so do read his posts in a few threads; one such post.


----------



## Dundundata

Quote:


> Originally Posted by *gupsterg*
> 
> The Stilt gave info for Mumak author of HWiNFO to implement feature, so do read his posts in a few threads, one such post.


I was reading through that before, but I'm still confused about whether I should be worried or not. The most I've seen is 65,000; idk, sounds like a lot.

One quote:
"The EDC doesn't separate correctable errors from uncorrectable ones, so you don't really know if they were corrected or not. Once you have a significant amount of errors which get through, you'll get visible artifacts. Prior to that you just lose performance." -The Stilt


----------



## gupsterg

The counter does not differentiate between corrected and uncorrected errors, so it is basically the total number of errors the memory controller has detected. The counter does not work at idle/low loads, only under 3D loads.

The errors that are corrected result in a performance loss, as the memory controller has to spend cycles correcting them. If the errors that are not corrected are significant in number, you will see visual artifacts.

As to what number is high enough to be concerned about, no idea. You see, when The Stilt created RAM timings for his mining/gaming ROMs, the timings were tuned to be error free, and he went for clocks he knew most Hawaii memory controllers would be able to sustain without error.

The stock RAM timings in the ROM are also set and placed in the appropriate straps to be error free. If you do achieve an OC in the higher clock range error free, then your memory controller is simply better than another GPU's that does error.
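The corrected-errors-cost-cycles idea can be put into a toy model; the cycle cost per correction below is an invented constant, purely for illustration, not a real figure for Hawaii's memory controller:

```python
def effective_mclk(mclk_mhz, errors_per_sec, cycles_per_correction=500):
    """Toy model: corrected errors eat memory-controller cycles, so the
    effective throughput is the raw clock minus correction overhead.
    cycles_per_correction is an invented illustrative constant."""
    raw_hz = mclk_mhz * 1e6
    lost_hz = errors_per_sec * cycles_per_correction
    return (raw_hz - lost_hz) / 1e6

# Even with 65,000 errors/s, an overclocked 1625MHz can still come out
# ahead of an error-free 1500MHz under these made-up assumptions:
print(effective_mclk(1625, 65_000))   # 1592.5 effective MHz
print(effective_mclk(1500, 0))        # 1500.0
```

Which is exactly why benchmark scores, not the raw error count, are the thing to watch: the counter can't tell you whether the clock gain outweighs the correction overhead, but the score can.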


----------



## Dundundata

Yeah, I don't think my card likes mem OC very much; I run up to 1625. I will try to mess around a bit and see if I can reduce errors. But I do wonder if a higher memory clock can (basically) outweigh a certain amount of errors? You lose a bit of performance to the errors but overall gain because of the higher clock speed?


----------



## bluej511

Quote:


> Originally Posted by *Dundundata*
> 
> Yeah I don't think my card likes mem OC very much. I run up to 1625. So I will try to mess around a bit and see if I can reduce errors. But I do wonder if a greater memory clock speed can (basically) negate a certain amount of errors? So you lose a bit of performance with errors but overall are gaining because of the higher clock speeds?


The way it's supposed to be done CORRECTLY is to OC until your synthetic benchmark scores (i.e. Heaven or Fire Strike) start dropping, then go back down to what worked.
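That benchmark-guided approach can be sketched like this, where `run_benchmark` stands in for an actual Heaven or Fire Strike run and the score curve at the bottom is a made-up stand-in:

```python
def best_mem_clock(run_benchmark, start=1500, stop=1800, step=25):
    """Step the memory clock up and keep the setting that scored highest;
    a falling score means error correction is eating the clock gains."""
    best_clock, best_score = start, run_benchmark(start)
    for clk in range(start + step, stop + step, step):
        score = run_benchmark(clk)
        if score < best_score:
            break                     # scores dropping: past the sweet spot
        best_clock, best_score = clk, score
    return best_clock

# Toy score curve that peaks at 1650MHz:
score = lambda clk: -abs(clk - 1650)
print(best_mem_clock(score))          # 1650
```

In practice each "run" is a full benchmark pass, so real runs are noisy; averaging two or three passes per clock before comparing scores is the obvious refinement.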


----------



## Streetdragon

I wanna OC my mem too, but even with higher voltage and VDDCI I get a black screen from time to time while streaming etc... but whatever, the performance is enough ^^


----------



## gsdavid1

Hi,

Does anyone know if the MSI R9 390 (https://www.msi.com/Graphics-card/R9-390-GAMING-8G.html#hero-specification) can run 3 monitors: two 1200p screens on DVI connectors, already working, plus an additional one such as http://www.benq.com/product/monitor/bl3201pt/features/ ? I kinda need 3 monitors for work and I'm wondering if I can get all 3 working for extra work surface.

From what I can see, I can use the DisplayPort for 4K, but I'm not sure if all 3 will work, or whether the R9 390 can handle them all at once. This is obviously for desktop/non-gaming use only; I would only game at 1200p since 4K is too much for a 390 regardless of overclock.

I'm on Win10 if that's in any way relevant.

Please help a noob,

Thank you


----------



## Mister300

I use a stock XFX 390X with a FreeSync monitor for 4K and it is fine.

My gaming target is 40 FPS minimum; anything lower and FreeSync drops out unless you mod the drivers. I use vsync or frame rate target control to avoid tearing on some titles if I'm above the refresh rate.
I also turn off AA at 4K, personal preference.

I game on ultra on Verdun @ 75 FPS,
Multiplayer Crysis 3 @ 60 FPS very high settings,
Star Wars BF on ultra 60 FPS steady and higher if V sync is disabled,
Need For Speed 60 FPS ultra, med AO, Gorgeous in 4K
Metro LL 50 FPS very high, no PhysX, no SSAA,
COD BO3 90 FPS maxed out
Alien Isolation ultra 60 FPS min
Bioshock inf ultra 60 FPS min
Titanfall ultra 60 FPS with AO turned down a bit.
Dirt Rally ultra 60 FPS steady.


----------



## Mister300

Sorry, I also run three monitors with no issue; I game on a single 4K screen, no Eyefinity.


----------



## gsdavid1

Cool, tnx for detailed info


----------



## Dundundata

Quote:


> Originally Posted by *bluej511*
> 
> The way its supposed to be done CORRECTLY, is to OC until you start dropping synthetic benchmark scores ie heaven or firestrike then go back down to what worked.


Well this is why I am wondering, because at a higher mem OC I am still gaining in benchmarks, even though errors increase.


----------



## jdorje

The errors hwinfo shows are corrected errors and can be completely ignored. Until you start dropping performance you are fine.


----------



## KGB1st

Can I overclock my MSI 390X above 1500MHz? Do I need to make any modifications to the card's BIOS, or is 1500 just not possible for this card?


----------



## battleaxe

Quote:


> Originally Posted by *KGB1st*
> 
> Can I overclock my msi 390x vga more than 1500 mhz? Anyway..
> Need I do any modifications with vgas' bios? Or 1500 it's not for this card?


Not on the core. Never heard of one going over 1300mhz.


----------



## chumanga

Hey guys, I have a problem which is not related to overclocking but to the card's default behavior. The problem is that the card always crashes to a black screen (with the PC still running) during browsing, flash content, video playback and things like that. If I run a stressful 3D application, the problem never shows up. I already sent the card in for warranty; they found the problem and reballed it to exchange the GPU and said it was fine afterwards. But the problem has persisted, with crashes (black screen of death) during light tasks, for almost all the time I've owned the card.

Lately I've been paying attention to HWiNFO memory errors, and it always racks up a few thousand errors during light tasks. That's strange, because I can't reproduce the problem under full load, which is exactly what the warranty guys will test.

Can the card really be faulty hardware? I've seen some posts saying Elpida memory was better at 1500MHz, while the Hynix in my card was causing more errors. Some time ago, some guys with the same problem as me discovered that decreasing the memory clock during these light tasks stabilized the card and made the crashes vanish, which really does happen. But that still doesn't explain why, if the memory can't run at 1500MHz, it doesn't crash under stress.

I get so mad at AMD over this card.


----------



## gsdavid1

I had the exact same problem until a while ago... I would mostly crash while watching TV series or Twitch/YouTube but never in-game. I had 500+ stable in-game hours while getting 1-2 daily crashes related to Chrome/VLC... apparently it's AMD and Win10 related.

For me it got fixed with the 16.9.x series of drivers... perhaps try DDU or roll back to older drivers or something. I've googled and found this problem reported often; it seems it's almost never faulty hardware.


----------



## Weird0ne

I'm running 1150/1725 @ +75mV.
I just redid the TIM on it to try to get better temps; honestly, I don't think they dropped.
But should I redo my OC from stock and see what performance I can get?

XFX 390 DD


----------



## Dundundata

Quote:


> Originally Posted by *Weird0ne*
> 
> I'm running 1150/1725 @+75.
> I just re did my TIM on it to try to get better temps, I don't think it dropped honestly.
> But, should I re do my OC from stock and see what performance I can get?
> 
> XFX 390 DD


what temps are you getting?


----------



## Weird0ne

Quote:


> Originally Posted by *Dundundata*
> 
> what temps are you getting?


Right now about 45~ idle and around 70~ when I run heaven benchmark.


----------



## mus1mus

Quote:


> Originally Posted by *KGB1st*
> 
> Can I overclock my msi 390x vga more than 1500 mhz? Anyway..
> Need I do any modifications with vgas' bios? Or 1500 it's not for this card?


Pour liquid nitrogen on the card. Otherwise, pick up a 970 or a 1060 for the same money and clock those over 1500MHz on the core.


----------



## Dundundata

Quote:


> Originally Posted by *Weird0ne*
> 
> Right now about 45~ idle and around 70~ when I run heaven benchmark.


45 idle might be a little high, but 70 under load is quite normal for these cards. You might try messing with your GPU fan profile and case fan setup if you haven't already.


----------



## KGB1st

So I think I can throw this sht in the trash today )))

Or I could buy another one of these and run them as 2x390X, and when those cards are no longer enough for me, buy a 1080, which by then should cost half the price.


----------



## mus1mus

lol. So your target is 1500MHz after all.


----------



## KGB1st

Quote:


> Originally Posted by *mus1mus*
> 
> lol. So your target is 1500MHz after all.


Not 1500. I thought I could overclock the 390X to 1500-1600 and run two of them, like CrossFire.

Today I understand that this card doesn't support that; it would overheat the core =D


----------



## mus1mus

No, these cards are not capable of those clocks by design. Whoever gave you the idea that they could clock to 1500 deserves a kick in the nuts.

Anyway, a 390X is an equivalent, if not beating, to a GTX 980. OC per OC, they are right at each other. A GTX 1060 may come under them.

Read a lot before you purchase anything. Lesson #1.


----------



## Mister300

Got Battlefield 1 and it runs great with a stock XFX 390X at 4K.

The key to hitting 60 FPS is to set everything to ultra except the resolution scale, which you back down to 75%.

Very few issues with it for an impulse buy.


----------



## Rmosher

Just wondering what others have been able to get for an overclock on the stock voltage of their 390X's. I have a Sapphire Tri-X 390X OCed to 1105 core and 1725 mem running the latest WHQL Crimson drivers on the stock voltage. And when I say overclock I mean 24/7 able to play games on it overclock not an artifacting benchmark run overclock. Here is my GPUZ validation link: https://www.techpowerup.com/gpuz/details/cydwr. Look forward to hearing from you guys.


----------



## JerDerv

Quote:


> Originally Posted by *Rmosher*
> 
> Just wondering what others have been able to get for an overclock on the stock voltage of their 390X's. I have a Sapphire Tri-X 390X OCed to 1105 core and 1725 mem running the latest WHQL Crimson drivers on the stock voltage. And when I say overclock I mean 24/7 able to play games on it overclock not an artifacting benchmark run overclock. Here is my GPUZ validation link: https://www.techpowerup.com/gpuz/details/cydwr. Look forward to hearing from you guys.


My super basic Afterburner OC for daily use is 1100/1650 on a Gigabyte 390X. I had to set the power limit to +35% in order to maintain a steady core clock under load.


----------



## jdorje

Always put power limit to max.

I don't think I ever managed to exceed the 208+50% power limit of my 390 though. And the 390x has 8+8 pins right? It should have an even higher base limit.


----------



## Mister300

Way too much measurbating here with regard to clock speeds; go out and enjoy the hardware. BTW, a "measurbator" is a person who just looks at specs, not actual results. I am a scientist with over 25 years in the field and am guilty of this sometimes. We coined the term back when I moderated a photography forum.

Buy a modern AAA game and enjoy the 4K 60 fps experience.

In my experience, at least, an OC has little to no impact on my frame rates. Numbers in a synthetic benchmark do improve, as expected.

I still respect everyone's advice and efforts on this forum; you have a great group.


----------



## Rmosher

Quote:


> Originally Posted by *JerDerv*
> 
> My super basic afterburner oc for daily use is 1100 1650 on a gigabyte 390x. I had to put power limit to 35% in order to maintain a steady core clock under load.


That's pretty good. Maybe most 390Xs top out around 1100? That's what I'm most curious to know. Oh, and I have my power limit maxed out to maintain a steady 1100 MHz while gaming. Cheers.


----------



## bluej511

Quote:


> Originally Posted by *Rmosher*
> 
> That's pretty good. Maybe most 390X's top out at 1100? That's what I'm curious to know most of all. Oh and I have my power limit maxed out to maintain a steady 1100 MHz while gaming. Cheers.


I can get 1100 on mine with no added voltage or power limit whatsoever. After that I need +100 mV to get to 1200/1650.


----------



## Rmosher

Quote:


> Originally Posted by *bluej511*
> 
> I can get 1100 on mine with no added voltage or power limit whatsoever. After that i need 100mv to get to 1200/1650.


I added the power limit to keep the clock rate from bouncing around. It was stable without it; the added power limit just keeps it from downclocking. What brand is your card? Also, the highest OC I've been able to get is 1165 core / 1750 mem at +100 mV, and that isn't very stable; the highest stable I've gotten is 1150 core / 1750 mem at +83 mV.


----------



## sandmanza

Is there anyone with a PowerColor Radeon R9 390 PCS+ cooling it with a Kraken G10?

I'm interested in doing this but don't know if the VRMs will get too hot.

Thanks


----------



## bluej511

Quote:


> Originally Posted by *Rmosher*
> 
> I added the power limit to keep the clock rate from bouncing around. It was stable without it just with the added power limit keeps it from down clocking. What brand is your card? Also the highest OC I've been able to get is 1165 core 1750 mem with 100 mv. And that isn't very stable, highest stable I've gotten is 1150 core 1750 mem at 83 mv.


Sapphire R9 390 Nitro. I'm on water now, though; those numbers were on air, and VRM temps were pretty damn good. Even better now, of course.

Here's my latest Heaven run: 1200/1650 stable, +100 mV, +50% power limit. Don't feel like taking a screenshot, too annoying haha. The min FPS is still messed up, as people are aware, but not bad. Temps are as follows.

Core: 40°C
VRM1 (core VRM, I believe): 61°C
VRM2 (memory VRM): 62°C

FPS: 63.7
Score: 1606
Min FPS: 8.6
Max FPS: 132.5


----------



## yafatana

Please, can anyone help me find out why I'm getting this problem?
Almost every time I play the Battlefield 4 Metro map,
every 15 or 20 minutes or so,
I get a sudden stutter for maybe 10 to 20 seconds with FPS drops.
It happens on several servers.
I have Battlefield 4 installed on a Western Digital 2 TB Black,
and the HDD's SMART data does not report any problems.

I have Windows 10 64-bit
i7 6700K, not overclocked
Sapphire R9 390 Nitro, not overclocked
Seasonic G650
AMD driver 16.7.2

CPU temp: 24°C idle, 52°C max load
GPU: max 55 to 62°C

Do you think it is a problem with my HDD?


----------



## bluej511

Quote:


> Originally Posted by *yafatana*
> 
> please an one can help find why I'm getting that proplem
> almost every time I play battlefield 4 metro map
> every maybe 15 or 20 minutes
> I get sudden stutter for maybe 10 to 20 seconds with fps drops
> it happen on several servers
> I have battlefield 4 installed on western digital 2 TB black
> and the HDD SMART does not report any proplems
> 
> I have windows 10 64 bit
> i7 6700k not overclocked
> sapphire r9 390 nitro not overclocked
> seasonic g650
> amd driver 16.7.2
> 
> cpu temp in idle 24 C load max 52 C
> gpu max 55 to 62 C
> 
> do you think it is a proplem with my HDD ?


If it's only in one game, then it's definitely not an HDD issue. I'd start by verifying the game cache and see if that helps. Are you playing in windowed/borderless/full screen? Your memory clock seems to bounce around quite a lot while gaming; pretty sure that shouldn't be happening.

Try turning off power efficiency in Radeon Settings. Has this just started happening recently, or has it always been this way?


----------



## yafatana

Quote:


> Originally Posted by *bluej511*
> 
> If its only on one game then its def not an HDD issue. Id start by probably verifying game cache see if it helps. Are you playing in windowed/borderless/full screen? Your memory clock seems to bounce around quite a lot in gaming, pretty sure that shouldn't be happening.
> 
> Try turning off power efficiency in radeon settings. Has this just started happening recently or its always been this way?


I play in full screen.
As for the memory clock, it's the first time I've noticed this,
but maybe it's because it is a multiplayer game?
My ping was 150.
I will try another game and post again.


----------



## Streetdragon

Quote:


> Originally Posted by *bluej511*
> 
> Core: 40°C
> VRM1 (Core VRM i believe) : 61°C
> VRM2 (Memory VRM) : 62°C
> 
> 132.5


Hmm, my two Nitros reach up to 80°C on the VRMs (both); 60°C would be a dream... I redid the paste and replaced the pads, and there's a fan in front of the cards.


----------



## bluej511

Quote:


> Originally Posted by *Streetdragon*
> 
> hmm my 2 nitro reach up to 80° on the vrm(both) 60° dream... remade the paste and replaced the pads. Fan infront of the cards


I'm on water, but the VRMs aren't under the block: VRM1 has a 120mm fan blowing over it, while VRM2 is fully passive with no fan, and their temps are identical.


----------



## Rexer

Quote:


> Originally Posted by *sandmanza*
> 
> Is there anyone with a PowerColor Radeon R9 390 PCS+ being cooled by a Kraken G10 ?
> 
> I'm interested in doing this but dont know if the VRM's will get too hot or not.
> 
> Thanks


I did the Kraken G10 thing a few years back with an HD 7950. It needs heatsinks on the VRM and RAM; the 90mm fan couldn't cover enough area. GPU temps were very good, but would've been better with a water jacket. The Kraken G10, yeah, it's a good idea; however, there are better solutions.


----------



## Rexer

Quote:


> Originally Posted by *Mister300*
> 
> Got Battlefield 1 and it runs great with a stock XFX 390X at 4 K.
> 
> Key to hit 60 FPS is to set everything at ultra except back down resolution scale to 75%.
> 
> Very few issues with it for an impulse buy,.


Running BF1 @ 4K, hmm. They say the big map's gorgeous, so you're doing fine at 60. Hmm. Looks like I gotta find my little piggy and pop a coin for it.


----------



## sandmanza

Quote:


> Originally Posted by *Rexer*
> 
> I did this Kraken G 10 a few years back with a HD 7950. Needs heat sinks on vrm, and ram. 90mm fan couldn't cover enough area. gpu temps were very good but would've been better with a water jacket. Karken G-10 yeah, it's a good idea. however there are better solutions.


Hi, thanks for the response.

I do have good case airflow and will have heatsinks on the VRMs.


----------



## Rexer

Quote:


> Originally Posted by *yafatana*
> 
> I play on full screen
> as for memory clock it's the first time I notice this
> but maybe because it is a multiplayer game ?
> as my ping was 150
> I will try with onother game and I will post again .


A ping of 150 is sorta high; if you can, below 100 is a good idea. Got a 144Hz monitor? Push Frame Rate Target Control to 144.


----------



## AizenN1r

Hi guys, can I fit an Arctic Accelero Hybrid II on my MSI R9 390X?


----------



## bluej511

Quote:


> Originally Posted by *AizenN1r*
> 
> Hi guys , can i fit arctic Accelero Hybrid II on my msi r9 390x?


Click "technical data" at the link below. From my understanding, the MSI doesn't use a reference board, so it probably wouldn't work.

https://www.arctic.ac/eu_en/accelero-hybrid-ii-120.html


----------



## AizenN1r

Quote:


> Originally Posted by *bluej511*
> 
> Click technical data. From my understanding, the MSI doesnt use a reference board so probably wouldn't work.
> 
> https://www.arctic.ac/eu_en/accelero-hybrid-ii-120.html


Is there any way for me to watercool my MSI R9 390X other than a waterblock and a custom loop?


----------



## diggiddi

I think you could use an NZXT Kraken G10 VGA cooler:
http://www.techspot.com/products/cooling/nzxt-kraken-g10-vga-cooler.98892/

Another option is going semi-custom, with an AIO connected to a VGA waterblock,
e.g. an EKWB Predator with quick disconnects (EK-XLC Predator 140 (incl. QDC))
plus this VGA waterblock: EK-FC R9-390X TF5 - Acetal+Nickel.

Or even an Alphacool Eisbaer, Cooler Master Glacer, or Swiftech H220/240X AIO and a VGA block.


----------



## dimster

Hey guys, I need some help.
I have the chance to buy a used Sapphire R9 290X Tri-X OC (UEFI) (8GB) and I would like to know a few things.

1) Can I flash it to an R9 390X BIOS?
2) And if yes, would I see any benefit, or could I instead simply OC this GPU and match R9 390X performance?

As far as I know it is basically the same GPU, with tighter VRAM timings on the R9 390X and possibly higher stock speeds for both the VRAM and the chip.


----------



## diggiddi

Quote:


> Originally Posted by *dimster*
> 
> Hey guys I need some help.
> I have the chance to buy a used Sapphire R9 290X Tri-X OC (UEFI) (8Gb) and I would like to know a few things.
> 
> 1) Can I flash it to r9 390x bios
> 2) And if yes would I have some benefits or could I instead simply oc this gpu and match the r9 390x performance?
> 
> As far I know it is basically the same gpu with tighter vram timings for r9 390x and possibly higher stock speed both vram and chip.


Post here for more help:
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/3260_20#post_25611563


----------



## sandmanza

Just want to state for the record (since nowhere on the internet could I find definitive information on this...):

The PowerColor PCS+ R9 390 8GB and the Kraken G10 work just fine together. No problems during install, and VRM temps are about 70-80°C under load.


----------



## chumanga

I'm having a hard time figuring out whether my card is faulty or AMD's drivers are. I've been getting black screens during web browsing and video playback over the last few days, with every driver from 16.9.2 to 16.10.3. I just installed 16.11.1, and video playback with MPC-HC + madVR caused 10k memory errors in 2 minutes.
I just don't know if my card's memory is really faulty or if all the AMD drivers are broken on my system.

So for testing purposes I ran AotS: it loaded 6GB into GPU memory and produced not a single error over a 3-minute benchmark. If I go back and start video playback with madVR, dozens of errors accumulate per second until I stop it.


----------



## flopper

Quote:


> Originally Posted by *chumanga*
> 
> I'm with difficult to understand if my card is faulty or if AMD is faulty. Getting black screen during webbrowsing, video playback latest days with all drivers from 16.9.2 to 16.10.3. Now just installed 16.11.1 and using MPC-HC madVR for video playback it caused 10k memory errors in 2 minutes.
> I just dont know if my card memory is readlly faulty or if all AMD drivers are broken for my system.
> 
> So for test purpose i run AotS to test, it load 6GB in GPU memory and not a single error created after 3 minute benchmark. If i go back and start video playback with madVR then dozens of errors being accumulated per second until i stop it.


Bump the voltage slightly on the RAM/core in 2D mode to check if that fixes it.
Seems like a faulty card at first glance.


----------



## chumanga

Quote:


> Originally Posted by *flopper*
> 
> bump voltage slightly to ram/core in 2d mode to check if that fixes it.
> seems like a faulty card at first glance.


Let's make it a bit more complex: with EVR the memory errors don't occur, but with madVR they do. I know madVR uses the GPU for scaling, but I don't know whether EVR does too, or whether it's CPU-only and that's why it doesn't cause the same errors.


----------



## KGB1st

Quote:


> Originally Posted by *Mister300*
> 
> Way too much measurbating here with regard to clock speeds, go out and enjoy the hardware. BTW a "measurbator" is a person who just looks at specs not actual results. I am a scientist with over 25 years in the field and am guilty of this sometimes. When I moderated a photography forum we coined this term.
> 
> Buy a modern AAA game and enjoy the 4 K 60 fps experience.
> 
> I find that at least with my experience OC has little to no impact on my frame rates. Numbers on a synthetic benchmark do improve as expected.
> 
> Still respect everyone's advice and efforts on this forum, you have a great group.


I know the 390X is powerful now, but I also know that after a while it will cease to be.


----------



## Mister300

Still relevant at 60 fps in Battlefield 1.

The only concession I make is backing the render resolution down to 80%; with that, my rig hits 60 fps at 4K. No Titan needed here.

Everything else is on max eye candy.


----------



## Nameless1988

During the single-player campaign I noticed some stuttering, even at a constant 60+ fps, on my 390 + FreeSync at 1080p maxed out. Anyone else getting stuttering?


----------



## Dundundata

Quote:


> Originally Posted by *Nameless1988*
> 
> During single player campaign, I noticed some stuttering, even if 60+ fps all time on my 390 + freesync, 1080p maxed out. Someone else have some stuttering?


I can't speak for that game but I did notice going from a 7200rpm HDD to an SSD got rid of stutter I was having in Witcher 3.


----------



## Mister300

HardOCP ran an article discussing DX12 issues in BF1, and stuttering was one well-documented effect. I noticed it when I enabled DX12, so I switched back to DX11 and the multiplayer stuttering vanished.


----------



## Mister300

Quote:


> Originally Posted by *Nameless1988*
> 
> During single player campaign, I noticed some stuttering, even if 60+ fps all time on my 390 + freesync, 1080p maxed out. Someone else have some stuttering?


Here you go.

"DX12 introduces a stutter in BF1 on both video cards. We found it incredibly noticeable during mission load, and multiplayer map load. For the first 15 seconds of loading a new map or multiplayer map there would be random noticeable stutter. Then there would be random stutter as you played, especially more noticeable in multiplayer maps. DX11 was butter smooth with no issues on either video card in multiplayer.

Therefore, the answer to our question right now is simple; run BF1 in DX11. There is no reason to run BF1 in DX12, unless you just like to punish yourself. Both the Radeon RX 480 and GeForce GTX 1060 will be better off in DX11 in multiple ways.

Maybe a patch will come in the future that helps DX12, or new drivers, we can only hope and be on the lookout for such things."

http://www.hardocp.com/article/2016/10/24/battlefield_1_video_card_dx12_performance_preview


----------



## Rexer

Quote:


> Originally Posted by *DarthBaggins*
> 
> Why just a 500-550w psu?


Yeah, I agree. You never know if you'll want to overclock your card. If you do, you'll want the bigger psu.


----------



## Rexer

Quote:


> Originally Posted by *Mister300*
> 
> Here you go.
> 
> ' DX12 introduces a stutter in BF1 on both video cards. We found it incredibly noticeable during mission load, and multiplayer map load. For the first 15 seconds of loading a new map or multiplayer map there would be random noticeable stutter. Then there would be random stutter as you played, especially more noticeable in multiplayer maps. DX11 was butter smooth with no issues on either video card in multiplayer.
> 
> Therefore, the answer to our question right now is simple; run BF1 in DX11. There is no reason to run BF1 in DX12, unless you just like to punish yourself. Both the Radeon RX 480 and GeForce GTX 1060 will be better off in DX11 in multiple ways.
> 
> Maybe a patch will come in the future that helps DX12, or new drivers, we can only hope and be on the lookout for such things. '
> 
> http://www.hardocp.com/article/2016/10/24/battlefield_1_video_card_dx12_performance_preview


I get the 15-second stutter in DX11 all the time. I thought it was an online server problem; a lot of guys I know have the same issue. It happens on select maps, both large and small. It's weird, lol. It goes away after 15 seconds.


----------



## Darknessrise13

Quote:


> Originally Posted by *Rexer*
> 
> Yeah, I agree. You never know if you'll want to overclock your card. If you do, you'll want the bigger psu.


I overclock a 290 just fine on 550W.


----------



## Rexer

Quote:


> Originally Posted by *Darknessrise13*
> 
> I overclock a 290 just fine on 550w.


Sometimes you can get away with running a 500W PSU with an R9 290. The real game is amps: the more amps you have, the higher you can cruise above the minimum wattage requirement.
I don't know how your system is set up, but that doesn't matter; if you can run without any problems, that's fine. Personally, I didn't find it that way. I started my HD 7950 on a 450W unit and ran it just fine until I began overclocking it. The PSU couldn't handle the power spikes and crashed, and AMD's requirement was a 500W PSU. I moved up to a 650W PSU and it was fine.
The manufacturers' requirement for a 390 or 290 is a 750W PSU, and that's just a minimum. There's good reason for that: they don't just print out specs and let us guess the true power requirement. It might just be that they're protecting themselves from guys like us, who run what we have and burn our GPUs up on lower wattage.
Besides, if you're like me and start adding extra drives and PCIe boards, the extra power is always handy.
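The amps point is just Ohm's-law bookkeeping on the 12V rail. A quick sketch, where the 275W figure is the commonly quoted board power for a 390 and the 30% headroom is an assumed allowance for an overclock and transient spikes, not a measurement:

```python
# Amps drawn on the 12V rail for a given DC load: I = P / V.
def rail_amps(watts, volts=12.0):
    return watts / volts

card_w = 275          # typical quoted board power for an R9 390
oc_headroom = 1.3     # assumed ~30% extra for an overclock and power spikes
print(round(rail_amps(card_w), 1))               # ~22.9 A for the card alone
print(round(rail_amps(card_w * oc_headroom), 1)) # ~29.8 A with OC headroom
```

That's why the combined 12V amperage rating on the PSU label matters more than the headline wattage.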


----------



## bluej511

Quote:


> Originally Posted by *Rexer*
> 
> Sometimes you can get away running a 500w psu with an R9 290. The real game is amps. The more amps you have, the higher you can cruise above the minimum watt requirement.
> I don't know how you set your system up but that doesn't matter. If you can run without any problems, that's fine. Personally I didn't find it that way. I started with HD7950 with a 450w and ran it just fine till I began overclocking it. It just couldn't handle the power spikes and crashed. The AMD requirement was a 500w psu. Moved up to 650w psu and it was fine.
> Manufacturers requirements for a 390 or 290 are 750w psu. That's just a minimum. There's good reason for that. They don't just print out specs and let us guess it's true power requirement. It might just be they're protecting themselves from guys like us who, run what we have and burn our gpu's up on lower watts.
> Besides, if you're like me and start adding extra drives and pcie boards, the extra power's always handy.


It really depends on the OC and what else you have.

I have an R9 390, non-OCed, but my 4690K is at 1.21V, so I'm probably drawing around 400-450W, and I have a 1000W power supply. Why? Because at that load it's running at its peak efficiency of around 92-93%, whereas a 550W PSU at the same load would probably be closer to 89-90%. As it's a gaming PC, only used for gaming, and probably at full load quite often, I'm OK with it.
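The wall-draw difference being described is easy to put numbers on: wall draw = DC load / efficiency. A quick sketch using the efficiency figures quoted above (which are estimates, not measured values):

```python
# Power drawn from the wall for a given DC load and PSU efficiency.
def wall_draw(dc_load_w, efficiency):
    return dc_load_w / efficiency

load = 450  # W, the upper end of the load estimate above
print(round(wall_draw(load, 0.92)))  # ~489 W from the wall at 92% efficiency
print(round(wall_draw(load, 0.89)))  # ~506 W from the wall at 89% efficiency
```

So the efficiency gap is worth roughly 17W at the wall in this scenario; small, but it adds up under constant gaming load.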


----------



## Mister300

Sometimes I feel like a BF1 beta tester. There's so much pressure on devs/studios to release a game before the product is outdated. I really need to wait before purchasing new titles.

I always let a game run for ten minutes to smooth out hiccups in FPS.

People do not realize how many variables there are in today's modern tech.


----------



## bichael

Quote:


> Originally Posted by *chumanga*
> 
> I'm with difficult to understand if my card is faulty or if AMD is faulty. Getting black screen during webbrowsing, video playback latest days with all drivers from 16.9.2 to 16.10.3. Now just installed 16.11.1 and using MPC-HC madVR for video playback it caused 10k memory errors in 2 minutes.
> I just dont know if my card memory is readlly faulty or if all AMD drivers are broken for my system.
> 
> So for test purpose i run AotS to test, it load 6GB in GPU memory and not a single error created after 3 minute benchmark. If i go back and start video playback with madVR then dozens of errors being accumulated per second until i stop it.


Probably not the same issue, given you also mention web browsing, but with madVR I was getting crashes lately when going to full screen, probably since a driver update. Changing madVR's settings to DX11 seems to have fixed it, a tip I got from the link below.
https://yabb.jriver.com/interact/index.php?topic=107120.0


----------



## Nameless1988

OK, changed the setting from DX12 to DX11, and now BF1 runs silky smooth on max settings at 1440p, 60-70 fps with a few dips into the 50s.
P.S. FreeSync is awesome!!


----------



## Rexer

Quote:


> Originally Posted by *bluej511*
> 
> Really depends on OC and what else you have.
> 
> I have an r9 390 non OCed but my 4690k is at 1.21v, im probably running around 400-450w, i have a 1000w power supply. Why? Because its running at its peak efficiency of around 92-93%, whereas a 550w psu at the same voltage would probably be closer to 89-90%. As its a gaming PC and only used for gaming and its probably at full load quite often im ok with it.


Yup.


----------



## koxy

DIY fan mod on an MSI R9 390.

Stock MSI fans

Arctic F12 PWM rev. 2

Don't know why, but the tachometer readings go crazy when the fans are plugged into the GPU PWM header with PWM set below 15%; the fans still work and slow down (around 400 RPM when tested on the CPU fan header). I'm pretty impressed with these fans. They cost me 15€, postage included, for the pair, and they work pretty well. Maybe it isn't a significant drop in temperature, but in noise level it's a huge difference (1600 RPM vs ~1000 RPM). Next I'm going to try static-pressure fans (Cooler Master Silencio FP120), as the Arctics are case fans; hopefully temps will be a little better.


----------



## bluej511

Quote:


> Originally Posted by *koxy*
> 
> DIY mod with R9 390 from Msi.
> 
> 
> 
> 
> 
> Stock MSI fans
> 
> 
> 
> Artic F12 PWM rev2
> 
> 
> 
> Don't know why but tachometer, when plugged into gpu pwm header, get crazy with readings when PWM is set below 15%, but fans works and slow down (around 400 RPM tested when plugged into cpu fan header). Im pretty impressed with these fans, cost me 15e postage included, for both and yeah works pretty well, maybe isn't significant drop in temperatures but in noise level its a huge difference(1600 RPM vs ~1000 RPM). Next im gonna try with static pressure fans Cooler Master Silencio FP120 as Artic are case fans, hope temps will be little better.


Quite a nice little temp drop. My Nitro did 73°C with the stock fans, but they ran at something like 2200 RPM. Now, on water with a dozen fans, it's still quieter haha. Oh, and my core is only 38°C today at an ambient of 21°C.

You made the card into a 4-slot GPU, which is pretty sick lol.


----------



## koxy

Quote:


> Originally Posted by *bluej511*
> 
> You made the card into a 4slot gpu which is pretty sick lol.


True, but I have plenty of space in my Fractal R4 so I don't really care xD I can always use a 25mm-thick fan.


----------



## Dundundata

Hey guys, I just upgraded my 16GB of DDR3-1600 CAS 9 RAM to DDR3-2400 C10. I bought a 32GB kit, which is way more than I really need, but I was wondering if there is any disadvantage to installing all 32 gigs? The mobo is Z97, so no quad channel, and the chip is a 4790K.


----------



## mus1mus

Other than increased stress on the IMC, which sometimes limits the maximum DRAM frequency you can run, you don't really lose anything.


----------



## Rexer

I'm reading this article and thought I'd share it with the 390 club. The new 490 card looks pretty awesome; I hope that, along with VR, it'll keep AMD a strong competitor in the market.

http://en.yibada.com/articles/174013/20161117/amd-radeon-rx-490-feature-vega-10-gpu-release-slated.htm


----------



## Dundundata

Quote:


> Originally Posted by *Rexer*
> 
> I'm reading this article and I thought I'd share this with the 390 club. The new 490 card looks pretty awesome. I hope along with VR, it'll keep AMD a strong competitor in the market.
> 
> http://en.yibada.com/articles/174013/20161117/amd-radeon-rx-490-feature-vega-10-gpu-release-slated.htm


Do you think it will be a good 4K card? The 390 is still a beast for me @ 1080p/60, but I would upgrade to 4K eventually.


----------



## flopper

Quote:


> Originally Posted by *Dundundata*
> 
> do you think it will be a good 4k card? The 390 is still a beast for me @1080/60 but I would upgrade to 4k eventually.


Speculation at this point.
The 390 works for 4K even today; sure, you adjust a few settings, but it's a workhorse.
Wait for real news.


----------



## Rexer

Quote:


> Originally Posted by *Dundundata*
> 
> do you think it will be a good 4k card? The 390 is still a beast for me @1080/60 but I would upgrade to 4k eventually.


I don't know if it'll be a good 4K card. Nvidia hasn't got an answer for it yet, but I'm sure they're on it. The rumor I hear is that Nvidia's cards aren't that savvy at DX12. They certainly fooled the market with the false 970 specs; fooled me, since I bought one.
If the 490 is a good card, I'm going to wait till summer to get one. I want to wait a few months until they iron out some of the bugs. Anyway, I plan to build a new rig; my 3770K is over 4 years old.
I'm sold on the 390 and 390X. I have both cards plus an Nvidia 970 SC, and in my rig I like the 390 best. For some reason, I get higher stable clocks with it than with the 390X. When I game with the CoD AW and BF4 crowd, I sometimes have to put up with idiots calling me a hack. It's usually guys who have slower machines or live too far from the server.
Being in a small clan, I know a few of the guys who have 980 and 1080 machines. Let me tell you, the 390 is a real chest-beater. When it's tweaked, it stands toe to toe. It's fast, clear, and agile. It makes the Nvidia guys angry. It's too much fun.


----------



## Dundundata

390 Powah!


----------



## Skry

Quote:


> Originally Posted by *Dundundata*
> 
> 390 Powah!


Right on! What CPU?


----------



## koxy

Quote:


> Originally Posted by *Rexer*
> 
> I don't know if it'll be a good 4k card. Nvidia hasn't got an answer for it but I'm sure they're heads up. The rumor I hear is Nvidia's aren't savy 3D DX12 cards. They certainly fooled the market with false 970 specs. Fooled me, since I bought one.
> If 490 is a good card, I'm going to wait till summer to get one. I want to wait a few months till they iron out some of the bugs. Anyway, I plan to build a new rig. My 3770k is over 4 years old.
> I'm sold on 390 & 390x. I have both cards and a Nvidia 970 sc. In my rig, I like 390 best. For some reason, I get higher stable clock settings than with the 390x. When I game with the CoD aw and BF 4 crowd, I sometimes have to put up with idiots calling me a hack. It's usually guys who have slower machines or live to far from the server.
> Being in a small clan, I know a few of the guys who have 980 and 1080 machines. Let me tell you, 390 is a real chest beater. When it's tweaked, it stands toe to toe. It's fast, clear and agile. It makes the Nvidia guys angry. It's too much fun.


For me the R9 390 is a bit too weak for 1440p at 144Hz, but since FreeSync does the job it's OK. Hope the 490 will handle it no problem. I really doubt the 390 or 390X can stand toe to toe with a GTX 1080, though...


----------



## Carniflex

Quote:


> Originally Posted by *Rexer*
> 
> Hey! I heard some odd rumor about Gigabyte cards. Had a conversation about overclocking 390 and 390x cards and this techie guy said Gigabyte locked out over volting on their 390 (x) cards and it's not possible to raise the mV. I thought Buffalo Soup. Why would Gigabyte do that? You couldn't obtain the highest performance for your money. Well, Lol, I never own one so what would I know. But out of curiosity, did Gigabyte actually lock out users from raising the volts on their 390 cards?


I have a Gigabyte 390X, and to be frank, I'm not surprised they locked the volts. That thing thermal throttles at factory settings with a 140mm fan blowing straight into it from the side panel, while howling so loudly that my wife walked up from downstairs to see what the rattle was about when I pushed it to 100% for more than 5 minutes. I had to put it under water, and even then it barely limps up to 1100 on the core, hitting 65-70°C with two pumps and a 280mm rad with 4x 140mm fans in push-pull all to itself (Alphacool GPX "full cover" hybrid block).


----------



## diggiddi

Quote:


> Originally Posted by *Rexer*
> 
> I don't know if it'll be a good 4k card. Nvidia hasn't got an answer for it but I'm sure they're heads up. The rumor I hear is Nvidia's aren't savy 3D DX12 cards. They certainly fooled the market with false 970 specs. Fooled me, since I bought one.
> If 490 is a good card, I'm going to wait till summer to get one. I want to wait a few months till they iron out some of the bugs. Anyway, I plan to build a new rig. My 3770k is over 4 years old.
> I'm sold on 390 & 390x. I have both cards and a Nvidia 970 sc. In my rig, I like 390 best. For some reason, I get higher stable clock settings than with the 390x. When I game with the CoD aw and BF 4 crowd, I sometimes have to put up with idiots calling me a hack. It's usually guys who have slower machines or live to far from the server.
> Being in a small clan, I know a few of the guys who have 980 and 1080 machines. Let me tell you, 390 is a real chest beater. When it's tweaked, it stands toe to toe. It's fast, clear and agile. It makes the Nvidia guys angry. It's too much fun.


That was funny. I was telling my brother the other day that AMD has great engineering; they just need software that complements their hardware.


----------



## Dundundata

Quote:


> Originally Posted by *Skry*
> 
> Right on! What CPU?


4790k


----------



## lanofsong

Hey AMD R9 390/390X owners,

We are having our monthly Foldathon from Monday the 21st to Wednesday the 23rd, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

November Foldathon

To get started:

1. Get a passkey (allows for the speed bonus); you'll need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN's number - 37726

later
lanofsong


----------



## Mister300

Let's be realistic here: the 1080 goes for $599 to $1000 depending on the version. You can't expect a $400 390X to compete.


----------



## Hellegaard1

Owners Club Submission:

XFX R9-390, Stock Air
w/ Memory Strap Mod

The only reason AB was set to stock was that I was trying to run a Firestrike bench right before this screenshot, and unfortunately 1150/1750 wasn't stable enough.


----------



## Rexer

Quote:


> Originally Posted by *Carniflex*
> 
> I have a Gigabyte 390X - to be frank - I'm not surprised that they locked the volts. That thing thermal throttles at factory settings with a 140mm blowing straight into it from the side-panel while howling so loud that wife walked up from downstairs to see what the rattle is about when I pushed it under 100% for more than 5 minutes. Had to put it under water and even then it barely limps up to 1100 on core hitting 65..70C with two pumps and 280mm rad with 4x 140mm fans in push pull on it all for itself. Alphacool GPX "full cover" hybrid block.


AMD cards are all about overclocking, so I really don't get why they wanted to lock the voltage. I guess they assume we'll blow up our cards, eh? They don't want to see a pile of warranty returns on their doorstep. Who knows.


----------



## Rexer

Quote:


> Originally Posted by *Mister300*
> 
> Lets be realistic here the 1080 goes for 599 to 1000 dependent on the version. You can't expect a 390X at 400 usd to compete.


Aahhh. That's a good point. In fact, the 1080 is better, but in actual FPS gaming it just has to be close, because it comes down to the user. If my opponent isn't nimble or fast enough to wit, it does him no good to have excelsior or even premium hardware. Ducking and dodging, melee fighting in close quarters: it matters that I clearly see my opponent in a knife tussle instead of swirling shades of blur-stricken colors or the stutter and clog-stepping movements I got with the HD 7950 or GTX 970.
My 970 would slow down a tad bit in some maps, like losing a step from exhaustion. In those same maps, using the 390 against GTX 1080 users, I've come out top monkey so often I end up dying from carelessness or being big-headed. Quick draw is stink'in accurate. I can be easily satisfied until next spring when I build my new rig. Hopefully by then AMD will have worked some of the teething bugs out of the RX 490.


----------



## Nameless1988

My next card will be another Radeon x90, don't know if a 490 or 590. My 390 is the best GPU I've ever owned: Hawaii is the best chip and GCN the best architecture ever made.

No more Nvidia GPUs in my case!


----------



## Devildog83

I got an MSI Gaming 390 a few months back and was always a bit unsatisfied because it idled in the 40s, and recently it started idling in the 50s. So, even though I did not really want to, I took it apart, cleaned it, and replaced the thermal paste and pads. The paste job was horrible; there was barely any between the block and the GPU. Once back together it now idles around 25C, and under load playing Crysis 3 the temps max out in the low 60s. The fans don't even ramp up, where before they would ramp up to the point where it sounded like a freight train. I am disappointed that MSI would do such a crap job putting their cards together; I expected more from them. All is good now though, and I won't feel uncomfortable overclocking it a bit.


----------



## ronaldo9_r9

Guys, I bought an R9 390 for £195. I have an FX-8350 processor, and both the card and CPU are running on a Corsair CX500 (500 W) power supply.

Shall I buy an RX 480 8GB or a GTX 1060, or shall I keep this one? I game at 1080p and do a bit of video rendering using After Effects.

Sent from my SM-G935F using Tapatalk


----------



## Dundundata

Quote:


> Originally Posted by *Devildog83*
> 
> I got a MSI Gaming 390 a few months back and was always a bit unsatisfied because it idled in the 40's and recently started idling in the 50's so even though I did not really want to I took it apart, cleaned it and replaced the thermal paste and pads. The paste job was horrible, it barely had any between the block and the GPU. Once back together it now is idling around 25c and under load playing Crysis 3 the temps max in the low 60's. The fans don't even ramp up where before the would ramp up to the point where it sounded like a freight train. I am disappointed the MSI would do such a crap job putting their cards together. I expected more from them. All is good now though and I won't feel uncomfortable overclocking it a bit.


Seems to be a common theme, and not just with MSI. Mine had the opposite problem: too much paste. At least it's a simple fix, and now you know your card is ready for action. It really is a great GPU.


----------



## Dundundata

Quote:


> Originally Posted by *ronaldo9_r9*
> 
> Guys I bought R9 390 for £195. I have Fx 8350 processor and both card and cpu are running on crosair cw500 watt power supply.
> 
> Shall i buy rx 480 8gb or gtx 1060? Or shall I keep this one. I game at 1080p and do bit of video rendering using after effects.
> 
> Sent from my SM-G935F using Tapatalk


I would stick with it especially at 1080p


----------



## christoph

Quote:


> Originally Posted by *ronaldo9_r9*
> 
> Guys I bought R9 390 for £195. I have Fx 8350 processor and both card and cpu are running on crosair cw500 watt power supply.
> 
> Shall i buy rx 480 8gb or gtx 1060? Or shall I keep this one. I game at 1080p and do bit of video rendering using after effects.
> 
> Sent from my SM-G935F using Tapatalk


yeah there's absolutely no reason to upgrade now, at least wait until 490 comes down in price


----------



## ronaldo9_r9

Thanks for the prompt reply. Is my power supply okay? It's a CX500 (500 W); do I need to upgrade the power supply or will I be okay?

I have read lots of positive reviews on this card. I am going from an MSI R9 270 to an XFX R9 390 and hoping it will serve me well for a few years.

Sent from my SM-G935F using Tapatalk


----------



## bichael

Quote:


> Originally Posted by *ronaldo9_r9*
> 
> Its CX500 watts, do i need to upgrade the Power supply or will I be okay.
> 
> i am going from Msi R9 270 to XFX R9 390. I am hoping that this will serve me well for few years.


I went from a 270x to a 390 and have been very happy. Can't see me needing to upgrade for a couple of years at least with my 1080p TV.

I'm using a 450W PSU and it has been fine. Your 500W is probably similar, given your CPU has a higher TDP. The only issue is it probably limits overclocking. I have tried a small OC on the GPU (no voltage increase), which worked okay, but to be honest performance is good at stock so I just run that.

Edit - your PSU actually has slightly lower amps on the 12V rail than mine (34 vs 36). Do you have a plug-in monitor to check how many watts you are pulling? I would suggest keeping the GPU at stock clocks; you could maybe even underclock (can't remember what clocks the XFX runs at). So it's probably pretty close, but as long as it's running okay...
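To put rough numbers on that 12V comparison, here's a back-of-the-envelope estimate. This is a sketch only, using AMD's published figures (R9 390 typical board power ~275 W, FX-8350 TDP 125 W); actual draw varies with load and overclock:

```python
# Rough 12 V rail headroom estimate for a CX500 + FX-8350 + R9 390 combo.
rail_amps = 34                   # CX500's 12 V rail rating, from the post above
rail_watts = rail_amps * 12      # watts available on the 12 V rail

gpu_w = 275                      # R9 390 typical board power (AMD spec)
cpu_w = 125                      # FX-8350 TDP
other_w = 30                     # rough allowance for drives, fans, RAM

peak_w = gpu_w + cpu_w + other_w
print(f"12 V capacity: {rail_watts} W, estimated worst-case load: {peak_w} W")
```

The worst-case estimate lands right around (even slightly above) the rail's rating, which is why keeping the GPU at stock clocks, or even undervolting, is the safe play on that PSU; in practice the card rarely sits at its full board power.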


----------



## Devildog83

Quote:


> Originally Posted by *ronaldo9_r9*
> 
> Guys I bought R9 390 for £195. I have Fx 8350 processor and both card and cpu are running on crosair cw500 watt power supply.
> 
> Shall i buy rx 480 8gb or gtx 1060? Or shall I keep this one. I game at 1080p and do bit of video rendering using after effects.
> 
> Sent from my SM-G935F using Tapatalk


Upgrade the CPU if anything. If you are looking to upgrade the GPU hold for a bit and wait for Vega.


----------



## Dundundata

You can run something like HWiNFO and see how much power your system draws. You might be close to the limit at 500W, and it's good to have room to spare. But hey, if it works...

By the by, this is how upgraditis starts. I've replaced the PSU, RAM, and mobo and upgraded to SSDs since I built my system. But the 390 is the one part I haven't touched.


----------



## ronaldo9_r9

Quote:


> Originally Posted by *bichael*
> 
> I went from a 270x to 390 and have been very happy. Can't see me needing to upgrade for a couple of years at least with my 1080p TV.
> 
> I'm using 450W psu and has been fine. Your 500W probably similar given your cpu has higher tdp. Only issue would be it probably limits overclocking. I have tried some small OC on gpu (no voltage increase) which worked okay but to be honest performance is good at stock so I just run that.
> 
> Edit - your psu actually has slightly lower amps on 12v rail than myne (34 vs 36). Do you have a plug in monitor to check how many watts you are pulling? Would suggest keeping gpu at stock clocks, could maybe even underclock (cant remember what clocks the xfx is at). So probably is pretty close but as long as it's running okay...


No, I don't have a device to plug in to monitor the watts. I am not thinking of overclocking yet, maybe when I get a 750W or even 650W power supply. Any suggestions on a decent power supply that can handle an overclock on both the CPU and GPU?


----------



## Nameless1988

Hi guys, I am interested in an Arctic Accelero Xtreme 3 or 4, or an AIO kit from Arctic. Which is better for my XFX 390?









Can you suggest other coolers from other brands in the €60-70 range?

Thank you!


----------



## Rexer

Quote:


> Originally Posted by *Devildog83*
> 
> I got a MSI Gaming 390 a few months back and was always a bit unsatisfied because it idled in the 40's and recently started idling in the 50's so even though I did not really want to I took it apart, cleaned it and replaced the thermal paste and pads. The paste job was horrible, it barely had any between the block and the GPU. Once back together it now is idling around 25c and under load playing Crysis 3 the temps max in the low 60's. The fans don't even ramp up where before the would ramp up to the point where it sounded like a freight train. I am disappointed the MSI would do such a crap job putting their cards together. I expected more from them. All is good now though and I won't feel uncomfortable overclocking it a bit.


Yo, man. All the manufacturers of the 390 and 390X are curiously the same way. Must be an amazingly boring job to spread mud on a GPU. When I reapplied the goop on my GPU, I kept thinking of peanut butter on toast. Yeah, my temps went way down.


----------



## Rexer

Quote:


> Originally Posted by *Nameless1988*
> 
> Hi guys, i am interested in an Arctic accelero Xtreme 3 or 4 or an AIO kit from Arctic. Which is better for my xfx 390??
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you suggest me other cooler from other brands in the 60-70€?
> 
> Thank you!


You know, for some reason all the 390(X) manufacturers love spreading thermal mud like creamy peanut butter. The heat trapped by excess mud gets temps pretty high. Try removing the old mud and applying a thinner coat. Most of us who've done this had really good results. Hope it helps.


----------



## Stige

Quote:


> Originally Posted by *ronaldo9_r9*
> 
> No I dont have device to plugin to monitor the watts. I am not thinking of over locking yet maybe when I get 750W power supply or even 650W. Any suggestions on whats the decent power supply that handle the overclock on both CPU and GPU?


I can barely pull 500W from the wall with my 390 and 6600K, and that is with +175mV on the GPU and 1.47V on the CPU.

You won't be pulling anywhere near that much.


----------



## Nameless1988

Quote:


> Originally Posted by *Rexer*
> 
> You know, for some reason all the 390 (x) manufacturers love spreading thermal mud like creamy peanut butter. The heat from excess mud gets temps pretty high. Try removing the old gpu mud and apply a thinner coat. Most of us who've done this had really good results. Hope it helps.


Already changed it at least 2-3 times. Temps are fine, in the 75-76C range on the GPU and 60-65C on the VRMs, but I would change the cooler because the XFX one is noisy: the fans are too loud for my ears at 60%. I don't like too many dB in my case.









The Accelero Xtreme coolers are very quiet and cool.


----------



## Devildog83

Mine wasn't from excess paste, by the way; it was from too little, or just poor, paste. It was like it had turned to liquid and there was almost nothing between the heatsink and GPU. I loaded it up pretty good, as I have found too much is not a problem but too little can cause big ones. Better paste helps too.


----------



## Devildog83

My 6700k and 390 only pull about 400w from the wall with both overclocked. My Seasonic 660w platinum is plenty.


----------



## bluej511

Quote:


> Originally Posted by *Devildog83*
> 
> My 6700k and 390 only pull about 400w from the wall with both overclocked. My Seasonic 660w platinum is plenty.


Yeah, that sounds about right. My 4690K is at 1.21V and my R9 390 at factory clocks pulls about 215W max (less than average since I'm on water), so I'm at about 400W as well. Add in the HDDs, SSD, RAM, and water pump and I'm a tiny bit higher, haha. I went for efficiency and got a 1000W PSU. If Vega and Zen pull high wattage I'll keep it; if not I'll get an 850W or something.

People still tell me my 1000W PSU is overkill, but I tell 'em to shove it as I'm using it EFFICIENTLY lol.


----------



## Stige

Quote:


> Originally Posted by *Devildog83*
> 
> My 6700k and 390 only pull about 400w from the wall with both overclocked. My Seasonic 660w platinum is plenty.


This. I think people overestimate the power usage of their PC a lot for some reason. It is only gonna get lower really.


----------



## Devildog83

Quote:


> Originally Posted by *bluej511*
> 
> Yea that sounds about right, my 4690k is at 1.21v and my r9 390 factory clocked pulls about 215w max (less then average since im on water), im about 400w as well.. Add in the HDDs, SSD, RAM. water pump and im a tiny bit higher haha. I went for efficiency and got a 1000w psu. If vega and zen pull high wattage ill keep it, if not ill get an 850w or something.
> 
> People still tell me my 1000w psu is overkill but i tell em to shove it as im using it EFFICIENTLY lol.


Most good PSUs, like your Corsair, have their best efficiency between 25 and 80% load. While 1000W is a bit much, and Vega and Zen together will use less power than what you have now, I don't think you will hurt anything with what you have, and if you ever decide to add a GPU you have plenty of room.


----------



## Devildog83

Quote:


> Originally Posted by *Stige*
> 
> This. I think people overestimate the power usage of their PC a lot for some reason. It is only gonna get lower really.


So true. Yours is about the worst case you can get from a system with a 390 and any CPU. Maybe an FX CPU would use a bit more, but at almost 1.5V, not much more. A bit of research goes a long way toward helping folks pick a brand and wattage. I believe the right PSU is about the most important part you can buy for a system costing as much as we put into ours.

Check out this video; it's a prime example of a newb not understanding how important good quality is.


----------



## bluej511

Quote:


> Originally Posted by *Devildog83*
> 
> Most good PSU's like your Corsair have the best efficiency between 25 and 80% load. While 1000w is a bit much and Vega and Zen will use less power together than what you have I don't think you will hurt anything with what you have and if you ever decide to add a GPU you have plenty of room.


Oh, I know all about the range, but I mean PEAK efficiency is almost always between 30-50% load; after that it drops off. Not by much, but with the amount I game it definitely makes a difference.

Plus the fan never turns on, so it's dead silent haha.


----------



## Devildog83

Quote:


> Originally Posted by *bluej511*
> 
> Oh i know all about the range, but i mean PEAK efficiency is most always between 30-50% after that it drops down. Not much but the amount i game it def makes a difference.
> 
> Plus the fan never turns on its dead silent haha.


Yep, yours at 500w load is over 93% efficient according to Anandtech.
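For anyone wanting the arithmetic behind that: PSU efficiency is DC output divided by AC input, so the wall draw is always a bit higher than the load the components actually see. A quick sketch, using the 93% figure cited above:

```python
# Convert a DC component load into wall (AC) draw at a given PSU efficiency.
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC watts pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

# A 500 W DC load on a PSU that is 93% efficient at that operating point
# pulls roughly 500 / 0.93 ~= 538 W from the wall.
print(round(wall_draw(500, 0.93)))
```

This is also why wall-meter readings overstate what the PSU must actually deliver: a 500 W reading at the wall is only about 465 W of real load on the supply.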


----------



## Dundundata

Any recommendations on some GPU and CPU pastes to try? Haven't tinkered inside my case for a couple of weeks!


----------



## tolis626

Quote:


> Originally Posted by *Dundundata*
> 
> Any recommendations on some GPU and CPU pastes to try? Haven't tinkered inside my case for a couple weeks !


Unless you wanna go liquid metal (and there are many reasons both to do and not to do so, but I digress), I highly recommend the Thermal Grizzly Kryonaut as my #1 and the Gelid GC Extreme as my #2. There are, of course, others, but I've had the best results with those.


----------



## Exposal

New to forum and saw this awesome post. I just recently watercooled my MSI Gaming 390 and wanted to start overclocking it a bit. Was wondering what a safe start would be to overclocking this card.

Thanks!


----------



## Stige

Quote:


> Originally Posted by *Exposal*
> 
> New to forum and saw this awesome post. I just recently watercooled my MSI Gaming 390 and wanted to start overclocking it a bit. Was wondering what a safe start would be to overclocking this card.
> 
> Thanks!


You can't do any harm to it without BIOS mods; the software limits are that restrictive.


----------



## Exposal

Quote:


> Originally Posted by *Stige*
> 
> You can't do any harm to it without BIOS mods, the limits from software are that limited.


So top the voltage and see how high it goes with afterburner?


----------



## Stige

Quote:


> Originally Posted by *Exposal*
> 
> So top the voltage and see how high it goes with afterburner?


Yes.


----------



## Master0fBlunt

Quote:


> Originally Posted by *Devildog83*
> 
> Most good PSU's like your Corsair have the best efficiency between 25 and 80% load. While 1000w is a bit much and Vega and Zen will use less power together than what you have I don't think you will hurt anything with what you have and if you ever decide to add a GPU you have plenty of room.


I'm using a Rosewill 1000W RBR....

https://pcpartpicker.com/list/gWBt9W


----------



## Master0fBlunt

Quote:


> Originally Posted by *Stige*
> 
> You can't do any harm to it without BIOS mods, the limits from software are that limited.


LOL! Not as risky as CPU tweaking, but you can SURELY damage your card if you're a loose cannon. I run mine at stock usually, unless I'm showing off lol. But when I have cranked her up, bad things have happened. Ask the occasional scan line, or the occasional complete and instantaneous PC freeze with that annoying loud, steady high-pitched noise through my speakers.


----------



## Nameless1988

I think flashing the BIOS is risky, and generally worthless.


----------



## Master0fBlunt

Quote:


> Originally Posted by *Nameless1988*
> 
> I think flashing bios is a risky thing, and generally is worthless.


I agree 100%. That's the kind of thing you do to a card when it reaches EOL and you truly don't care whether it lives or dies. I would never suggest anyone attempt it unless I knew they were EXPLICITLY knowledgeable and experienced with such things, and even then I would implore them to hold off until the card is entirely obsolete with a net worth of $50 or less.


----------



## Stige

Quote:


> Originally Posted by *Master0fBlunt*
> 
> LOL ! ! ! Not as risky as CPU tweaking but you can SURELY damage your card if you're a loose cannon. I run mine at stock usually unless im showing off lol. But when I have cranked her up, bad things have happened. Ask the occasional scan line or the occasional complete and instantaneous PC freeze up with that annoying loud and steady high pitched noise through my speakers.


Quote:


> Originally Posted by *Master0fBlunt*
> 
> I agree 100%, thats the kind of thing you do to a card when it reaches EOL, and you truly don't care whether it lives or dies. I would never suggest to anyone that they attempt to do so unless I knew that they were EXPLICITLY knowledgeable and experienced with such things, and even then I would emplore them to hold off until the card is entirely obsolete with a net worth of $50 or less.


+100mV can't do you any harm; it's just not possible, it is such a small increase in voltage. I run my card at a constant +175mV and +21mV or something on the AUX voltage. Stop spreading false information here when you clearly have no experience whatsoever.

There is absolutely no way you can damage your card with only +100mV, unless you have a really bad cooler and the card cooks with that extra voltage, which is your own fault then because you didn't look at the temps or do something about the crappy cooling first.
Quote:


> Originally Posted by *Nameless1988*
> 
> I think flashing bios is a risky thing, and generally is worthless.


Not really, if you know what you are doing. I have flashed an extra +100mV of juice into my 390 through the BIOS and tightened the memory timings by quite a bit, which actually gives a pretty decent performance boost in itself. So now I can run +200mV total using MSI Afterburner, or even run stock voltage with -100mV in MSI AB if I wanted to for any reason. The stock +100mV limit just isn't enough.

I've also done multiple BIOS mods here for other people and they haven't had any issues either.


----------



## Nameless1988

Quote:


> Originally Posted by *Stige*
> 
> +100mV can't do you any harm, just not possible, it is such a low increase in voltage. I run my card at constant +175mV and +21mV or something on AUX voltage. Stop spreading false information here when you clearly have no experience whatsoever.
> 
> There is absolutely no way you can damage your card with only +100mV unless you have a really bad cooler and it's on fire with that extra voltage, which is your own fault then cause you didn't look at the temps or did something about the crappy cooling first.
> Not really if you know what you are doing. I have flashed extra +100mV juice for my 390 through BIOS and tightened the memory timings by quite a bit which actually gives a pretty decent performance boost in itself already. So now I can run +200mV total using MSI Afterburner. Or even run stock voltage with -100mV on MSI AB if I wanted to for any reason. +100mV just isn't enough.
> 
> Also done multiple BIOS mods here for other people and they haven't had any issues either.


We were talking about flashing the BIOS, not about OC. I can run my 390 at +100mV too, with temps under 80C (XFX DD cooler); that is fine, but it is not about flashing the BIOS.

I used to flash Kepler BIOSes (680 and 770), but that was worthless.

I don't know about Hawaii because I have not done it before, but generally, on a €300-400 card, flashing the BIOS is worthless. I could flash the VRAM timings on my 390, I have the proper knowledge of how to do that, but again, it is worthless.


----------



## m70b1jr

Quote:


> Originally Posted by *Stige*
> 
> +100mV can't do you any harm, just not possible, it is such a low increase in voltage. I run my card at constant +175mV and +21mV or something on AUX voltage. Stop spreading false information here when you clearly have no experience whatsoever.
> 
> There is absolutely no way you can damage your card with only +100mV unless you have a really bad cooler and it's on fire with that extra voltage, which is your own fault then cause you didn't look at the temps or did something about the crappy cooling first.
> Not really if you know what you are doing. I have flashed extra +100mV juice for my 390 through BIOS and tightened the memory timings by quite a bit which actually gives a pretty decent performance boost in itself already. So now I can run +200mV total using MSI Afterburner. Or even run stock voltage with -100mV on MSI AB if I wanted to for any reason. +100mV just isn't enough.
> 
> Also done multiple BIOS mods here for other people and they haven't had any issues either.


How are you pushing that much voltage without black screening? I really want your BIOS.


----------



## Stige

Quote:


> Originally Posted by *Nameless1988*
> 
> We were talking about flashing bios, not about OC. I can run my 390 at +100mV too, with temps under 80C (XFX DD cooler), this is ok, but this is not about flashing bios.
> 
> I used to flash kepler bios: 680 and 770 but that was worthless.
> 
> Don't know about Hawaii because I have not done it before, but generally, on a 300\400€ VGA flashing the bios is worthless, and I could flash the vram timings on my 390 , i have proper Knowledge how to do that but again, it'is worthless.


The whole point here was that you cannot do any harm to your card without BIOS mods?
Quote:


> Originally Posted by *m70b1jr*
> 
> How are you pushing that much volts without black screening? I really want your BIOS.


I think my black screens were fixed after adding some AUX Voltage? Haven't had any since.


----------



## CALiteral

Quote:


> Originally Posted by *Nameless1988*
> 
> Don't know about Hawaii because I have not done it before, but generally, on a 300\400€ VGA flashing the bios is worthless, and I could flash the vram timings on my 390 , i have proper Knowledge how to do that but again, it'is worthless.


Why do you say it's worthless? By putting the 1125MHz timings in the 1500MHz strap, I got between a 2 and 3% increase in performance, depending on the benchmark. I admit it's not huge, but it's still a free boost.

I know @The Stilt said he wasn't tuning memory timings anymore but if someone was able to talk him into tweaking a set for our Hynix AJR IC's, it might just boost performance even more.
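For anyone curious what the strap mod actually does mechanically: the BIOS stores one block of memory timings per frequency range (a "strap"), and the mod copies the tighter low-frequency block over the high-frequency one before reflashing, so the memory runs at 1500MHz with the 1125MHz timings. The sketch below is conceptual only; the entry size and offsets are hypothetical placeholders, and on a real Hawaii BIOS you would locate the straps with a dedicated BIOS editor rather than by hand:

```python
# Conceptual illustration of the memory strap mod: overwrite the timing
# entry used up to 1500 MHz with the tighter entry used up to 1125 MHz.
# STRAP_SIZE and the offsets are HYPOTHETICAL; real layouts differ per BIOS.

STRAP_SIZE = 48  # hypothetical size of one timing entry, in bytes

def copy_strap(bios: bytearray, src_off: int, dst_off: int) -> None:
    """Copy the timing entry at src_off over the one at dst_off, in place."""
    bios[dst_off:dst_off + STRAP_SIZE] = bios[src_off:src_off + STRAP_SIZE]

# Toy demonstration on a fake BIOS image:
image = bytearray(range(256)) * 2
src_1125, dst_1500 = 0x40, 0xA0     # hypothetical strap offsets
copy_strap(image, src_1125, dst_1500)
```

After a real edit you would also fix any checksum the BIOS carries and reflash with atiflash; the 2-3% gain quoted above comes purely from the memory controller running tighter timings at the same clock.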


----------



## Dundundata

I've tweaked my memory timings but never got around to actually flashing the bios


----------



## Nameless1988

Hi guys, which is the best overall RX 480 in terms of cooling and PCB quality?
It seems like the Strix and the MSI are the coolest and have the best PCBs.
And which is the quietest with the fans at 2500 RPM? (Something that's difficult to get from reviews.)


----------



## THUMPer1

16.12.1 drivers are out.
Afterburner no longer changes clocks for me on my 390x. And Wattman is garbage.


----------



## gordesky1

So far Wattman is very buggy with the 390 series. I keep getting the memory stuck at 150MHz, and it stays that way until you reinstall the drivers.


----------



## TehMasterSword

Updated to the ReLive drivers. Everything worked great at first, until I tried playing RL and Dota; both suffered very low FPS. I didn't bother investigating why (probably the same memory problem gordesky1 had) and went straight to fixing it.

DDU > fresh install of the brand new drivers > yes, install ReLive > performance is back to normal, but ReLive didn't install. No tab for it anywhere, no "Check for updates" button.

Try again. Nothing.

Try again, this time installing older drivers, then upgrading from there. Same problem.

Tried again, this time declining ReLive. SUCCESS. The tab is there, and Radeon correctly says "ReLive is not installed. Would you like to install now?" Did so; it installed correctly and functions. Performance is perfect in all the games I own. 20 minutes down the drain, but it's all working.

As for ReLive, I am very impressed. It really is just as good as Shadowplay. File sizes are small, quality is better than Plays.tv in my experience, and there's no stuttering like Plays.tv was giving me. The only mark against it is that one of my instant replays had a slight audio sync problem.

I have no feelings on Wattman yet.


----------



## ziggystardust

What about Trixx? Is it working?

I'm still downloading drivers. For some reason it's very slow.


----------



## THUMPer1

Quote:


> Originally Posted by *ziggystardust*
> 
> What about Trixx? Is it working?
> 
> I'm still downloading drivers. For some reason it's very slow.


IDK, I'm not going to try Trixx.


----------



## ziggystardust

Quote:


> Originally Posted by *THUMPer1*
> 
> IDK, I'm not going to try Trixx.


Found a solution with AB? Or we'll have to wait for an updated version?


----------



## bluej511

Quote:


> Originally Posted by *TehMasterSword*
> 
> Updated to the ReLive drivers. Everything worked great at first, until I tried playing RL and Dota. Both suffered very low FPS. I didn't bother investigating why (probably the same memory problem that gordesky1 had) and immediately went to fixing it.
> 
> DDU>Fresh install of brand new drivers>Yes, install ReLive>Performance is back to normal but ReLive didn't install. No tab for it anywhere. No "Check for updates" button.
> 
> Try again. Nothing
> 
> Try again, this time installing older drivers, then upgrading from there. Same problem.
> 
> Tried again, this time declining ReLive. SUCCESS. The tab is there, and Radeon correctly says "ReLive is not install. Would you like to install now?". Do so, installs correctly and functions. Performance is perfect in all games I own. 20 minutes down the drain, but it's all working.
> 
> As for ReLive, I am very impressed. It really is just as good as Shadowplay. File sized are small, quality is better than Plays.tv in my experience, and no stuttering like plays.tv was giving me. The only mark against it is that one of my instant replays had a slight audio sync problem.
> 
> I have no feelings on Wattman yet.


You must be the only one; I found the quality to be decent at best. For me 16.12.1 is the best and I reverted back to it. Using ReLive playing Rainbow Six Siege, it made the game freeze for an entire second TWICE just recording a 1-minute clip. Then my FreeSync did not work in the game (it worked everywhere else but Siege); reverting back fixed the issue. I also thought the quality of ReLive was just okay: it didn't record in ultrawide for me, just 1080p, it didn't look as sharp as the game actually is, and it gave me black bars top and bottom as well.


----------



## coffeeplus

Has anyone successfully managed to use the manual voltage control from Crimson ReLive Edition 16.12.1 with an R9 390?

Since I heard reports that MSI Afterburner overclocking/volting no longer works well with this driver, I tried to use Wattman to apply the 50mV undervolt that I had applied before via MSI AB.

When I set voltage control to manual, I seem to get wrong values for the power states.

For example:

at its lowest state the card sits at 0.930 V according to GPU-Z, while the manual voltage control's minimum is greater than that value, approximately 0.98 V (can't remember exactly and don't have access to my machine now);
the same happens for the highest state: when I look at the core voltage in GPU-Z at maximum clock, I read a value lower than the one shown by default in Wattman.
More than this, I can't seem to change the values for all the states; some of them keep getting reset to the 'wrong defaults' as soon as I hit ENTER or change focus on the text input.


----------



## THUMPer1

I don't think the manual voltage control works for the 390's.


----------



## Dundundata

I'll wait for newer drivers, not ready to ditch afterburner


----------



## Nameless1988

Sent back my 390 XFX BE, i just ordered the MSI Gaming X 480 8GB









N.B.
This upgrade cost me €0.
If you have to pay to change from a 390 to a 480, it makes no sense (they perform the same).

It seems like the MSI Gaming X is the best 480 out there: coolest, quietest, great OC potential.


----------



## Dundundata

Been very happy with my MSI 390 over the last 14 months. Originally had an XFX but I like the MSI better.


----------



## h2323

I can't change manual voltages either


----------



## amusa

I've been happy with my MSI 390X. I was planning to get another 390X, but I'm still debating whether I want to dump it for a Fury X or just keep it since I have no issues with it.

The only thing I don't like about my graphics card is that you have to modify it to add liquid cooling.


----------



## bluej511

Quote:


> Originally Posted by *amusa*
> 
> I been happy with my MSI 390x. I was planning to get another 390x but still debating if I want to dump my 390x for Fury X or just keep the 390x since I have no issue with it.
> 
> The only thing I don't like about my graphic card is you have to modified it to add liquid cooling.


EKWB makes a block specifically for the MSI r9 390/x so no modding needed.


----------



## bluej511

Quote:


> Originally Posted by *amusa*
> 
> I been happy with my MSI 390x. I was planning to get another 390x but still debating if I want to dump my 390x for Fury X or just keep the 390x since I have no issue with it.
> 
> The only thing I don't like about my graphic card is you have to modified it to add liquid cooling.


Stupid double post.


----------



## amusa

Quote:


> Originally Posted by *bluej511*
> 
> EKWB makes a block specifically for the MSI r9 390/x so no modding needed.


You have to remove the radiator (cooler) fins on the GPU to add the water block. I know that EKWB came out with one because I looked into it when they released it.


----------



## ziggystardust

Quote:


> Originally Posted by *amusa*
> 
> I been happy with my MSI 390x. I was planning to get another 390x but still debating if I want to dump my 390x for Fury X or just keep the 390x since I have no issue with it.
> 
> The only thing I don't like about my graphic card is you have to modified it to add liquid cooling.


Wait for Vega imo.


----------



## Rexer

Quote:


> Originally Posted by *ziggystardust*
> 
> Wait for Vega imo.


Yes, Vega.. . hold on, it's almost here.


----------



## Rexer

Quote:


> Originally Posted by *ziggystardust*
> 
> What about Trixx? Is it working?
> 
> I'm still downloading drivers. For some reason it's very slow.


Tried Trixx for a while. It's good, basic, user friendly. It's Sapphire so it's got some shine.


----------



## Rexer

Hey fellow video boarders. I just saw the AMD live presentation of the Zen CPU. It's called 'Ryzen' and the first ones coming are 8-core, 16-thread at 3.4GHz. Bigger numbers are coming, and so far it looks like it's going to be competition for Intel. They didn't announce any prices, but AMD alluded to the i7 6700K Skylake's price tag.
I saw a demonstration of the Vega graphics card with Ryzen @ 60fps 4K. Watching on a computer it was really small, but the gameplay looked smooth.
I think they'll bring Ryzen and Vega out after CES next month.
I'm feeling pretty good for AMD, and as for the road ahead, I'm happy with my 390X and very happy with my 390. I plan to build my new rig next summer, but I just might wait longer. I got a pee-chee rig. It's great when everything's tuned up and the last internet bill got paid.


----------



## Carniflex

I installed the Radeon ReLive drivers that were released on 8th Dec. However, it seems that the Overdrive panel for my primary GPU (390X from Gigabyte) is missing. I still have the Overdrive panel for my secondary GPU (7870 Eyefinity 6 from Club3D). Does anyone know what is going on here? Has anyone else encountered something like this, and if so, did you manage to fix it somehow? The OS is Windows 7 Pro.

Edit: Figured it out, I think. There is now a new tab for the primary card called WattMan. It's now possible to change the GPU frequency much more granularly between the individual power states. Overall it seems to hold the card slightly cooler under the same OC. I haven't figured out yet how I can overclock the GPU memory; I kept mine at 1565 for a nice round 400 GB/s bandwidth number, but am atm stuck at the default 1500.


----------



## Rexer

Quote:


> Originally Posted by *Carniflex*
> 
> I installed the Radeon ReLive drivers that were released on 8th Dec. However, it seems that the Overdrive panel for my primary GPU (390X from Gigabyte) is missing. I still have the Overdrive panel for my secondary GPU (7870 Eyefinity 6 from Club3D). Does anyone know what is going on here? Has anyone else encountered something like this, and if so, did you manage to fix it somehow? The OS is Windows 7 Pro.
> 
> Edit: Figured it out, I think. There is now a new tab for the primary card called WattMan. It's now possible to change the GPU frequency much more granularly between the individual power states. Overall it seems to hold the card slightly cooler under the same OC. I haven't figured out yet how I can overclock the GPU memory; I kept mine at 1565 for a nice round 400 GB/s bandwidth number, but am atm stuck at the default 1500.


I rolled back to the drivers from June (16.6). I had problems with drivers from 16.8 and later: crashing and overheating. I'm not sure what exactly the problem is. I used to suspect it was the RX 400 series cards, which came out around May/June, that created the problem. They share the same drivers. Polaris tech might be different from the R-series cards, but AMD would know there would be a problem with drivers if that were the case.
Anyway, I looked up AMD to find out more about 'WattMan' and found other people having problems, so you're not alone. Hope this helps.

https://community.amd.com/thread/204322


----------



## m70b1jr

My MSI Afterburner randomly started doing this. I'm not sure if it's the result of me updating to ReLive. I've tried uninstalling Afterburner, then the drivers, reinstalling, etc., and even tried flashing my BIOS(s). I can't change my voltage in WattMan either.


----------



## lanofsong

Hey AMD R9 390/390X owners,

We are having our monthly Foldathon from 12 noon EST Monday the 19th through the 21st.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

December Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Rexer

Anyone have an Asus board and Windows 10? I just discovered F2 can't get me into the bios. Arrr. Looks like I gotta hunt down the fix.


----------



## tolis626

Quote:


> Originally Posted by *Rexer*
> 
> Anyone have an Asus board and Windows 10? I just discovered F2 can't get me into the bios. Arrr. Looks like I gotta hunt down the fix.


Pressing the "Delete" button doesn't work? When did F2 become a thing?


----------



## jkuddyh801

HEY BUDDY, ITS OCN_JKUDDYH801 - THX SO MUCH. HERES SOME BROLL IMAGES+VALIDATION PIX/URLS







GPU-Z VALIDATION: https://www.techpowerup.com/gpuz/details/b6y7k

THX AGAIN!


----------



## TehMasterSword

Nice buy, J! And good overclock


----------



## Slowpoke66

Quote:


> Originally Posted by *m70b1jr*
> 
> My MSI Afterburner randomly started doing this. I'm not sure if it's the result of me updating to ReLive. I've tried uninstalling Afterburner, then the drivers, reinstalling, etc., and even tried flashing my BIOS(s). I can't change my voltage in WattMan either.


Seems like you're using an older beta version of AB. Try the latest 4.3.0 (final): http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


----------



## diggiddi

Quote:


> Originally Posted by *jkuddyh801*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> HEY BUDDY, ITS OCN_JKUDDYH801 - THX SO MUCH. HERES SOME BROLL IMAGES+VALIDATION PIX/URLS
> 
> 
> 
> 
> 
> 
> 
> GPU-Z VALIDATION: https://www.techpowerup.com/gpuz/details/b6y7k
> 
> 
> THX AGAIN!


Better late than never








btw what car is that?


----------



## amusa

Quote:


> Originally Posted by *ziggystardust*
> 
> Wait for Vega imo.


Quote:


> Originally Posted by *Rexer*
> 
> Yes, Vega.. . hold on, it's almost here.


I will look into it, Thanks for the recommendation.


----------



## Subduck

Just thought I'd share my XFX R9 390 undervolt. Valley and Heaven max out at 69C, while some early-access game called Guardians of Orion on Steam decides to just melt any GPU to like 90C... so we'll forget about that one :3


----------



## Regnitto

I have an ASUS DCUIII 390X and one of the fans is going out on it. I also have a Cooler Master Seidon 120XL laying around from AIO-cooling my old 290, so I'm thinking about slapping it on the 390X now. Only problem is I need a cooling solution for my VRMs. Anyone got heatsink recommendations for this card?


----------



## Carniflex

Quote:


> Originally Posted by *Regnitto*
> 
> I have an ASUS dcuIII 390x and one of the fans is going out on it. I also have a cooler master seidon 120xl laying around from aio cooling my old 290, so I'm thinking about slapping it on the 390x now. Only problem is I need a cooling solution for my vrms. Anyone got heatsink recommendations for this card?


I kind of like Alphacool GPX series for their very large semipassive heatsink part. Although it might not be exactly what you are looking for as the passive heatsink part is intended to function with their own core-only blocks.


----------



## Stige

Quote:


> Originally Posted by *Regnitto*
> 
> I have an ASUS dcuIII 390x and one of the fans is going out on it. I also have a cooler master seidon 120xl laying around from aio cooling my old 290, so I'm thinking about slapping it on the 390x now. Only problem is I need a cooling solution for my vrms. Anyone got heatsink recommendations for this card?


There is no heatsink you can put there that would be enough as even the DC3 cooler doesn't cool it well enough.

The Alphacool GPX block is your one and only option really, unless you send it in for warranty. Even with that, having a fan on top of the VRM helped temps massively.


----------



## Regnitto

Quote:


> Originally Posted by *Stige*
> 
> There is no heatsink you can put there that would be enough as even the DC3 cooler doesn't cool it well enough.
> 
> The Alphacool GPX block is your one and only option really unless you send it for warranty. Even with that, having fan on top of the VRM provided a massive help in temps.


Well, I found a 20-pack of 8.8x8.8x5mm aluminum heatsinks, and I plan on putting at least one 120mm fan, or one or two 80mm or 40mm fans, directly over the VRMs. I could RMA it, but I don't want to go without the card - no backup while I wait.


----------



## Stige

Quote:


> Originally Posted by *Regnitto*
> 
> Well, I found a 20-pack of 8.8x8.8x5mm aluminum heatsinks, and I plan on putting at least one 120mm fan, or one or two 80mm or 40mm fans, directly over the VRMs. I could RMA it, but I don't want to go without the card - no backup while I wait.


A friend had fans die on his DC3, so I lent him my own cooler, which works for now at least. It seems to be a trend with these fans to fail...

You are definitely gonna need some real fans over the VRM if you are really gonna try that, "silent" isn't gonna cut it :l
I have a GT AP-15 (1850rpm) strapped on top of the GPX block at the VRM end as the cooling performance was abysmal out of the box for the VRM.


----------



## Derek129

Hey guys, my Giga R9 390, since the latest driver updates I assume, is flickering horizontally at only about the top inch of my monitor. Has anyone else experienced any problems?


----------



## -OC-GameRR

Latest drivers totally crashed my computer. I couldn't even boot because it crashed mid-booting. I however got into safe mode and managed to uninstall every single display driver I had. Now my computer boots to windows, but my bios does not detect my gpu (Msi r9 390x 8g). I have installed the card properly, fans start spinning and leds light up. However, I cannot install AMD drivers because my pc does not recognize any amd hardware in my system. What can I do?

Any help?


----------



## Stige

edit wrong thread


----------



## TehMasterSword

Quote:


> Originally Posted by *Derek129*
> 
> Hey guys, my Giga R9 390, since the latest driver updates I assume, is flickering horizontally at only about the top inch of my monitor. Has anyone else experienced any problems?


I had the same problem. My current solution was to switch from 144hz to 120hz. Flickering went away.


----------



## Derek129

Quote:


> Originally Posted by *TehMasterSword*
> 
> I had the same problem. My current solution was to switch from 144hz to 120hz. Flickering went away.


Thank you for the reply, I thought my card was ****ting the bed because it crashed on me while watching a YouTube video and then the flickering at the top of the screen started happening. My Msi afterburner settings were all gone after the amd driver updates. I'll give the 120hz a go tonight


----------



## Rexer

Quote:


> Originally Posted by *tolis626*
> 
> Pressing the "Delete" button doesn't work? When did F2 become a thing?


Yeah, 'Delete' should've gotten me into the BIOS, too. Didn't work either. But yesterday, the number pad failed too... so you can guess what it was. Dug out an old keyboard from the closet and plugged it in. There was the board BIOS on both F2 and Delete. Lol. Just in time to make me feel good, Windows 10. Thanks bro.


----------



## Regnitto

Quote:


> Originally Posted by *Stige*
> 
> Friend had fans die on his DC3 so I borrowed him my own cooler that works for now atleast, seems to be a trend with these fans to fail...
> 
> You are definitely gonna need some real fans over the VRM if you are really gonna try that, "silent" isn't gonna cut it :l
> I have a GT AP-15 (1850rpm) strapped on top of the GPX block at the VRM end as the cooling performance was abysmal out of the box for the VRM.


So I'm assuming the VRMs are even harder to keep cool than the ones on my old PowerColor reference R9 290... the heatsink PowerColor put on the VRMs, plus a 120mm fan zip-tied to the card, was enough to keep them around 65C with +100mV on it.

Was that the NexXxoS GPX 390 M03 block? I started looking at it over the weekend, but my phone wasn't cooperating with Alphacool's site very well. Maybe it's time for my first custom loop... it would be GPU-only; my non-K SKU i5 doesn't even need the H100i that's already on it.


----------



## Carniflex

Quote:


> Originally Posted by *Regnitto*
> 
> So I'm assuming the VRMs are even harder to keep cool than the ones on my old PowerColor reference R9 290... the heatsink PowerColor put on the VRMs, plus a 120mm fan zip-tied to the card, was enough to keep them around 65C with +100mV on it.
> 
> Was that the NexXxoS GPX 390 M03 block? I started looking at it over the weekend, but my phone wasn't cooperating with Alphacool's site very well. Maybe it's time for my first custom loop... it would be GPU-only; my non-K SKU i5 doesn't even need the H100i that's already on it.


If you decide to go for a custom loop in a TU case for only a GPU, check out their GPX Pro series - it has a slightly modified DC-LT pump on the block, so you could get away with just a radiator + fans + GPU block, which would make life a lot easier for you than trying to find a place for a pump and, God forbid, a res inside the case. If you must have a separate pump in there, the DC-LT is one of the smallest ones, but even that is damn bulky for such a tiny case once you take the needed fittings into account. Basically the DC-LT with plexi top is 50x50x37mm; however, the fittings are on opposite sides of the top, meaning you end up with approx 100mm in one direction even with the most compact 90-degree fittings you can find. The exact dimensions for the DC-LT can be found in a thread here in these forums dedicated to this pump. Newer revisions are somewhat quieter, but the first revisions tended to be a bit on the loud side.

Although the GPX Pro series might not be available for your particular card, as it's a relatively recent addition to their lineup, and they have said they do not intend to release Pro versions for older cards under the scheme where you send them a card, they release a block for it if it's not in their lineup, and you get the first block for free as a reward.

Edit:
I have a GPX M04 on a Gigabyte 390X with a 280mm radiator and 4x 140mm fans at 1000 rpm in push-pull, and my VRMs tend to hit up to 75C with the card running at 1100 MHz and +7% power allocation (it's voltage-locked like all Gigabyte GFX cards).


----------



## Regnitto

Quote:


> Originally Posted by *Carniflex*
> 
> If you decide to go for a custom loop with TU case for only a GPU check out their GPX Pro series - it has basically a bit modified DC-LT pump on the block so you could get away with just a radiator + fans + GPU block which would make the life a lot easier to you than to try to find a place for a pump and, God forbid, for a res inside the case. If you must have a separate pump in there then DC-LT is one of the smallest ones but even that is damn bulky for such a tiny case when you take into account the fittings needed as well. Basically the DC-LT with plexi top is 50x50x37mm, however, the fittings are on opposite sides of the top meaning that you end up with approx 100mm in one direction even if you use as compact 90 degree ones as you can find. The exact dimensions for DC-LT can be found in a thread here in these forums dedicated to this pump. Newer revisions are somewhat quieter but the first revisions tended to be a bit on the loud side.
> 
> Although GPX Pro series might not be available for your particular card as its a relatively recent addition to their lineup and they have said that they do not intend to release Pro versions for older cards under the scheme where you can send them a card and they release a block for it if it's not in their lineup and you get the first block for free as a reward.
> 
> Edit:
> I have a GPX M04 on a Gigabyte 390X with a 280mm radiator and 4x 140mm fans at 1000 rpm in push-pull, and my VRMs tend to hit up to 75C with the card running at 1100 MHz and +7% power allocation (it's voltage-locked like all Gigabyte GFX cards).


I was just on Performance-PCs doing a little window shopping, and found a dual 5.25" bay res/pump combo for about $60 and a 120x30 rad for just under $30... still gotta figure in tubing and fittings, etc. Now I remember why I went AIO for the CPU...

Counter option... anyone got a DC3 cooler not being used and want to either sell me the cooler or one of the fans off it? If so, PM me.

I can't RMA the card now; I disassembled it to unplug the bad fan, and part of the fan header broke...


----------



## bluej511

Quote:


> Originally Posted by *Carniflex*
> 
> If you decide to go for a custom loop with TU case for only a GPU check out their GPX Pro series - it has basically a bit modified DC-LT pump on the block so you could get away with just a radiator + fans + GPU block which would make the life a lot easier to you than to try to find a place for a pump and, God forbid, for a res inside the case. If you must have a separate pump in there then DC-LT is one of the smallest ones but even that is damn bulky for such a tiny case when you take into account the fittings needed as well. Basically the DC-LT with plexi top is 50x50x37mm, however, the fittings are on opposite sides of the top meaning that you end up with approx 100mm in one direction even if you use as compact 90 degree ones as you can find. The exact dimensions for DC-LT can be found in a thread here in these forums dedicated to this pump. Newer revisions are somewhat quieter but the first revisions tended to be a bit on the loud side.
> 
> Although GPX Pro series might not be available for your particular card as its a relatively recent addition to their lineup and they have said that they do not intend to release Pro versions for older cards under the scheme where you can send them a card and they release a block for it if it's not in their lineup and you get the first block for free as a reward.
> 
> Edit:
> I have a GPX M04 on a Gigabyte 390X with a 280mm radiator and 4x 140mm fans at 1000 rpm in push-pull, and my VRMs tend to hit up to 75C with the card running at 1100 MHz and +7% power allocation (it's voltage-locked like all Gigabyte GFX cards).


Yeah, those GPX waterblocks don't cool the VRM too well, but mine has never ever hit 75°C while gaming. I have a fan blowing over VRM1 on my R9 390 and that usually stays close to 60°C, maybe even under because of the cooler weather; I haven't checked in a while.


----------



## chris89

So what kind of cool stuff can the 390X do, guys? What has been your max core overclock?

Any word on what memory modules the 390X uses and what voltages and speeds they allow? Can they do 512GB/s, being @ 512-bit?

AtomBiosReader has them set at 1000mv, I'm guessing... I plan on de-clocking the RAM to 256GB/s on the 512-bit bus to give the core the overhead to take on other cards that usually outrun this card.

This card should be able to outrun dang near everything; it appears more emphasis was placed on memory bandwidth than on core throughput. FPS benefits more from giving the core most of the power budget than from a high memory-bandwidth-to-core ratio. For instance, the GTX 980 has 64 ROPs just as the 390X does, yet clocks higher on the core because of its 256-bit memory bus. I think it's possible to de-clock the 390X RAM to 256GB/s (1000MHz memory vs 1500MHz - a significant reduction in load TDP requirement, giving theoretical headroom of around 50% for core overclockability; it might need the memory @ 750mv compared to 1000mv to shift further memory power to the GPU core - still faster than GTX 980 RAM) and clock the core out to 80+ GPixel/s: 1250MHz x 64 = 80 GPixel/s, and 1250MHz x 176 = 220 GTexel/s.

It's so clear to me this card should be able to run up with the GTX 1000-series cards quite easily if clocked and programmed appropriately.









Here we see 1438mv core voltage... and knowing the R9 300 series really doesn't want to boot or work much above 1450mv, it's clear the 512-bit bus and memory clock are hogging all the power. What's stock voltage - 1100mv/1000mv offset @ 1050mhz? So to achieve a 14.28% increase in core clock we need a 30.72% increase in power/voltage? A small hop from 1050mhz to 1200mhz requires 30% more power for only a 14% increase in performance? And that's on a non-reference PCB - what about reference PCB overclock performance? If we reduce the memory clock by a third (1500MHz to 1000MHz), we're still rolling @ RX 480 memory bandwidth, which is 256GB/s... that reduction could theoretically mean a comparable cut in memory power, meaning we can then clock out the core on less voltage. This would need extensive testing, but it is undeniably a possibility.

I just sense that R9 290/390 owners massively dislike the huge power consumption and heat of these GPUs... great performance with a downside. And not clearly beating a GTX 980 is a downer to say the least, because it very well should do amazingly well. De-clocked 512-bit GDDR5 is still gonna rock every game out there. Just overclock the core and see the FPS numbers the Hawaii XT should have seen all along...
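The arithmetic in the post above works out like this (a rough sketch in Python; it assumes GDDR5's 4 data transfers per pin per clock and the Hawaii XT unit counts cited above - the function names are mine, not from any tool):

```python
# Rough calculator for the figures discussed above.
# Assumes GDDR5 (4 transfers per clock per pin) and Hawaii XT's
# 64 ROPs / 176 TMUs / 512-bit memory bus.

def mem_bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    """Bandwidth in GB/s: clock (MHz) * 4 transfers * (bus width / 8) bytes."""
    return mem_clock_mhz * 4 * (bus_width_bits // 8) / 1000

def pixel_fill_gpixs(core_clock_mhz, rops=64):
    """Peak pixel fillrate in GPixel/s."""
    return core_clock_mhz * rops / 1000

def texel_fill_gtexs(core_clock_mhz, tmus=176):
    """Peak texture fillrate in GTexel/s."""
    return core_clock_mhz * tmus / 1000

print(mem_bandwidth_gbs(1500))   # 384.0 GB/s (stock 390X)
print(mem_bandwidth_gbs(1000))   # 256.0 GB/s (the proposed de-clock)
print(pixel_fill_gpixs(1250))    # 80.0 GPixel/s
print(texel_fill_gtexs(1250))    # 220.0 GTexel/s
```

Note the de-clock cuts bandwidth by a third (384 to 256 GB/s), which happens to match the RX 480's 256 GB/s, and 1565MHz on the same bus gives the "nice round" ~400 GB/s mentioned earlier in the thread.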


----------



## m70b1jr

PC is still doing the same thing even after driver reinstalls etc.


----------



## m70b1jr

EDIT: Fixed


----------



## Rexer

Quote:


> Originally Posted by *-OC-GameRR*
> 
> Latest drivers totally crashed my computer. I couldn't even boot because it crashed mid-booting. I however got into safe mode and managed to uninstall every single display driver I had. Now my computer boots to windows, but my bios does not detect my gpu (Msi r9 390x 8g). I have installed the card properly, fans start spinning and leds light up. However, I cannot install AMD drivers because my pc does not recognize any amd hardware in my system. What can I do?
> 
> Any help?


What AMD drivers are they? Crimson? Which version?


Wow. It doesn't show up in Device Manager? I know weird things happen if the video driver isn't completely uninstalled, down to the registry paths. But to get to them after Crimson has been uninstalled, you have to reinstall the drivers that were on your computer, then do a complete uninstall. (I use a 3rd-party uninstaller like 'Your Uninstaller' or 'Revo Uninstaller'.) But you said you can't install them, so I'm not sure how to do that. Did you check 'Programs and Features' to see if the driver's still there?
Can you find it in 'Device Manager' and roll back the driver? You may try rolling back Windows in 'Recovery' to an earlier date (last known good).
I know AMD's got an auto-detect download. You might try it. Other than that, I couldn't tell you. Are you using ReLive? An earlier version of Crimson?
I had trouble with AMD Crimson 16.8, so I just rolled back to the last known good.

http://support.amd.com/en-us/download/auto-detect-tool

I'm only guessing here. Hope it helps.


----------



## christoph

Try pulling the card out and booting without it, then put it back in and boot again; see if that helps the BIOS detect the video card.


----------



## dagget3450

I seem to recall something similar on my setup one time. I think I had to boot to safe mode and found the card was disabled. Once re-enabled, it worked. I think the disabled device was somehow hidden. It's been a long time; it may have even been Win7.


----------



## -OC-GameRR

Quote:


> Originally Posted by *Rexer*
> 
> What AMD drivers are they? Crimson? Which version?
> Wow. It doesn't show up in Device Manager? I know weird things happen if the video driver isn't completely uninstalled, down to the registry paths. But to get to them after Crimson has been uninstalled, you have to reinstall the drivers that were on your computer, then do a complete uninstall. (I use a 3rd-party uninstaller like 'Your Uninstaller' or 'Revo Uninstaller'.) But you said you can't install them, so I'm not sure how to do that. Did you check 'Programs and Features' to see if the driver's still there?
> Can you find it in 'Device Manager' and roll back the driver? You may try rolling back Windows in 'Recovery' to an earlier date (last known good).
> I know AMD's got an auto-detect download. You might try it. Other than that, I couldn't tell you. Are you using ReLive? An earlier version of Crimson?
> I had trouble with AMD Crimson 16.8, so I just rolled back to the last known good.
> 
> http://support.amd.com/en-us/download/auto-detect-tool
> 
> I'm only guessing here. Hope it helps.


Thank you all for help! I've been really busy, so, sorry for not posting sooner.

I've managed to solve the problem: a fried PCIe slot. The GPU works in a different slot and also on a different mobo. I had to swap mobos because the lowest slot blocked all the IO cables, but now I have a fully working system. Yey! Shame though that I had to get a "new" mobo.

I wish all of you lots of frames and pleasant new year!


----------



## chris89

I just got my 390X... stock, it hit 84C on the reference blower at 240 watts for 60fps in Firestrike... I reduced the RAM from 1500MHz to 1000MHz, from 384GB/s to 256GB/s. Now it hits 55fps @ 60C and 180 watts... I can't tell a difference, except now it's very cool and efficient. This is a masterpiece of a card. I have no doubt in my mind it's capable of straight waxing the GTX 980 with reprogramming. It can also now handle higher clocks with the RAM at 256GB/s on stock volts - around 1200MHz or higher - and run much faster and cooler. Updates to come.


----------



## bichael

Has anyone tried the Bykski water blocks for the 390s?
The link below is for the Powercolor cards, but it looks like they do other versions too.
https://world.taobao.com/item/521275666173.htm?spm=a312a.7700714.0.0.yj1W3l#detail

The price isn't bad, so I'm considering a switch from my Alphacool GPX. I still think the GPX is great, but since moving to an RVZ02 with no case fans (using an external rad) my PC has been getting pretty hot - I mean the case, PSU and SSD are hot to the touch. The core temps on the CPU and GPU are fine at around 55°C; the GPU VRM is around 75, which I'm also not really worried about. My main worry is that the PSU gets much noisier than it was in my SG05 (with a front case fan), plus the potential impact of higher temps on the SSD/HDD. So I'm thinking a full-cover block will help by getting more heat out of the case to my external rad.


----------



## chris89

So what's weird is the 380X I had could list 3840x2160 in all games, but the 390X only goes up to 3200x1800...

Yet it is perfectly capable of even 5120x2880 in a custom run in 3DMark... quite annoying. I'm sure we're all certain it's just a software limitation. Hopefully a new driver will give the in-game option up to 5120x2880 on the 390X and many others.


----------



## m70b1jr

Anyone know any volt mods for the XFX R9 390? Anything over like +120mV black-screens my PC...


----------



## chris89

*@m70b1jr* Hi, how high are you clocking now? My reference 390X can do like 1137 totally stable @ 65288 = 1.250v... I run my VDDCI @ 875mv to reduce memory controller power while taxing all 8GB. I also run my RAM at 1000MHz = 256GB/s, which is fine and cool at full tilt. Using Hawaii Bios Reader you can tune the card nicely. Stock is 65288 @ full-speed 3D @ 1.250v... Gains are not noticed beyond 1.250v... I cranked this sucker out to 1250MHz @ 1400mv and saw no gains whatsoever... It can do 1250MHz easily, though... power consumption is just too high. Then again, I think you need to find the ideal power limit to yield gains at a given clock... 10 watts too high could be 10% less fps... The stock 208 watts can go around 20% beyond that... so if you set 225W, it can go to 270 watts... It's best to set the limit 20% lower than what you actually want: 190 watts means around 230 watts. So I guess I could fiddle with the power limit to yield gains at 1250MHz, but overvolting doesn't seem to show higher fps, at least in Firestrike. It runs so nice underclocked at cool temps, so that's just me.

On a side note, Nvidia DSR can do 4K on old-school GPUs like the GTX 400 series, so the VSR limit isn't the hardware limitation AMD says it is... just software. It certainly should and can do more resolution... I'm just kind of annoyed at a max of 3200x1800 when my 380X showed 3840x2160...

For a long time I was all Nvidia and it was great. I love Nvidia, and I love AMD; maybe now I love AMD just a bit more. Maybe AMD Vega will offer a lowest-end entry card with 8GB on a 128-bit bus @ 128GB/s with 32 ROPs and 128 TMUs, in standard and 6-pin variants. With massive cache in the core and an extra-low TDP, it should be unreal. I always thought it would be awesome to see a low-TDP card with 8GB of RAM and 32 ROPs @ 2GHz... that sounds impossible, but idk, it's cool. Once they integrate the core VRM into the die, 2GHz will be effortless and the input : output power ratio will be near 1:1. On the 390X we see 25% more input than output power; that's ASIC loss from the VRM's distance from the die and the temperature of the ASIC VRM. The 390X can be extremely efficient, though, @ 875mv VDDCI @ 65288 or less @ an even 1GHz, and it's still very powerful with the 512-bit RAM @ 256GB/s. You can run the 390X on a low-end power supply at these settings and still max everything out. We're only looking at a fraction less fps for a massive decrease in power consumption and increase in efficiency. It's clear the 390X has top-of-the-line power-delivery components, which can give extreme efficiency or exceptional performance. I'm very happy with the 390X, but will pick up an RX 480 for stuffs and giggles. By the way, as far as power limit and TDC: always setting the TDC at 75% of the TDP is ideal. 225 watts with 169 amps showed higher scores than the stock 208W/200A. Take your desired TDP and multiply by 0.75 for your ideal TDC amperage.

*Side Note*

I'm using a power supply whose sticker says 22A on the 12V rail, and that's it... I modded the fan - red to yellow, black to black - and oiled it so it runs cooler at high load.

This BIOS I made makes it possible to run the 390X on a power supply rated @ 22A on a single 12V rail. 12V-rail droop does occur, but no more than 160 watts @ 3200x1800 ultra. I tested 850mv VDDCI, which doesn't work, so 875mv is the floor, and 875 is perfectly stable (the free power savings are worth it). The card is extremely sensitive below 1.250v; even 1.200v BSODs @ 1GHz. Maybe 1.225v is possible. It takes like 5 minutes to flash the BIOS on this card, so you really have to think it through and be patient with this card.
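The TDP/TDC rule of thumb above can be sketched like this (a quick Python sketch; the helper names are mine, and the 0.75 and ~20% figures are just the rules of thumb from this post, not official AMD numbers):

```python
def ideal_tdc_amps(tdp_watts):
    """Rule of thumb from above: set TDC (amps) to 75% of the TDP figure."""
    return round(tdp_watts * 0.75)

def limit_to_set(target_watts):
    """The card can run roughly 20% past the set power limit,
    so set the limit about 20% under where you want to land."""
    return round(target_watts / 1.2)

print(ideal_tdc_amps(225))   # 169 A for a 225 W TDP
print(limit_to_set(270))     # set 225 W if you want ~270 W actual
print(limit_to_set(230))     # ~192 W, near the "190 W means ~230 W" example
```

These match the numbers in the post: 225 W gives 169 A, and setting 225 W allows roughly 270 W under load.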


----------



## zmora

Sapphire NITRO R9 390, stock cooling.

First post here









Mainly because I have to ask a question: is anybody else having problems with the latest TriXX software? I am getting major screen flicker after clicking the "apply" button (no matter whether I made any changes or not). The flickering stops after changing the refresh rate to 60 and back to 144Hz. Also, I'm wondering whether it's TriXX or GPU-Z making the error, because TriXX is showing me 1500 on the GPU clock while GPU-Z is showing stock values (under both "default" and "current").

I have been using TriXX before without those issues; now suddenly here they are...

Anybody got similar experiences?

EDIT:
I've just noticed some earlier posts about screen flickering.
Today my whole screen was flickering after OC'ing at 144Hz (even after switching to 60 and back). After trying to OC a bit more, the switching only works between 60Hz and 120Hz, so no more 144Hz for me ;(


----------



## m70b1jr

Well, I was playing H1Z1 and a slew of rainbow-colored pixels came up and my PC turned off. I turned it on and random lines go through my monitor, so I'm pretty sure I killed my graphics card. Tried switching BIOSes but no luck. I'm flashing my default one in safe mode, but I'm pretty sure she's dead.


----------



## m70b1jr

*sigh*
Looks like no PC until vega..


----------



## m70b1jr

Tried using DVI and flashing the BIOS. The PC black-screens when installing drivers.


----------



## bichael

Quote:


> Originally Posted by *bichael*
> 
> Anyone try the Bykski water blocks for the 390's?
> Link below is for the Powercolor cards but looks like they do other versions too.
> https://world.taobao.com/item/521275666173.htm?spm=a312a.7700714.0.0.yj1W3l#detail
> 
> Price isn't bad so I'm considering a switch from my Alphacool GPX. I still think the GPX is great but since moving to an RVZ02 with no case fans (using an external rad) my PC has been getting pretty hot - I mean the case, psu and ssd are hot to the touch, the core temps on cpu and gpu are fine at around 55oC, gpu vrm is around 75 which I'm also not really worried about. My main worry is the psu gets much noisier than it was in my SG05 (with a front case fan) and the potential impact of higher temps on SSD/HDD. So thinking a full cover block will help by getting more heat out of the case to my external rad.


Have found another option so would appreciate any thoughts, currently deciding between;

- Keep my GPX hybrid cooler and add a few small fans to help keep it (and the case overall) cooler
- Get the Bykski full cover block for my 390, probably around 95SGD ($65) so if I can sell the GPX wouldn't be a huge outlay
- Pick up a reference 290 plus EK full cover block, there are a couple at a good price locally so think I could buy these for around 220SGD ($150) and probably sell my 390 and GPX for a small profit (I think if sell back in the UK second hand prices are a bit higher generally than here in Singapore), not sure if this would be a noticeable downgrade or not though


----------



## chris89

@m70b1jr under warranty? If not, you could break the card down to just the heatsink, remove all plastic, wrap the whole card in one layer of tin foil, and start at 325°F for 10 min... keep raising the temperature by 10° until the card works normally. I have 2 bad 290Xs on the shelf; my 390X is fine. Make sure it doesn't exceed 88C and it won't fail... the default is over 100C, which will cause the card to fail; it must be set via BIOS.


----------



## m70b1jr

I got my card from Newegg 1 year ago, so I have another year left. I started a support ticket. I'm not sure if warranty will cover BIOS flashing or custom coolers or not.


----------



## chris89

@m70b1jr... the card has a dual BIOS. Switch it to the stock BIOS, boot the machine, then switch back to the corrupt BIOS and reflash stock to the corrupt slot. Good to go... I have had the same issue as you from incorrectly modding the BIOS.

Are my scores way less than everyone else's 390X?

stock 3dmark firestrike



modded 3dmark firestrike (with no voltage change, still on 65288 / 225W limit & 169A TDC, which is 225 × 0.75 = 168.75, rounded up to 169)


----------



## m70b1jr

I've already tried that, but it still isn't working. I'm guessing I did physical damage by pushing 1350mv into it (temps were fine tho).


----------



## chris89

If it's non-reference, boosting voltage or TDP will insta-kill the card haha. You have the MSI one, huh?

I had my reference up to 1,250MHz @ 1400mv @ 400 flippin' watts, no gains in sight haha. No problem on reference.

You're gonna either need an RMA direct from XFX (or whoever the manufacturer is) or bake the beast.









I was going to get a non-reference XFX DD 390X myself until it became clear to me that reference was far superior to all the others.

What this reference card could use is having the paint stripped and the VRM/memory cooling plate dipped in molten copper, plating both the core heatsink and the entire plate... even bonding them together as one piece of pure copper through-and-through would give this card solid gains. A hefty pure-copper backplate 0.25" thick, with thermal material to absorb rear PCB heat, would add further gains... I may do this, possibly, even though I haven't touched the internals on this 390X yet. I'm exceedingly disappointed so far by the lack of 3840x2160 VSR, so I hope they release the next driver with 4K unlocked on all cards.


----------



## m70b1jr

Baked at 350 degrees for 8 minutes and it works. I've tried baking methods in the past but they never worked; this is a first.


----------



## AverdanOriginal

Hi guys,

It's been a while since I posted here. Since I am currently looking into putting a custom loop into my case, I was looking for waterblocks for my sweet MSI R9 390. So far I have only stumbled across the Alphacool NexXxoS GPX r9-390-m02, which is not full cover (VRMs only passively cooled), and the EKWB full-cover block ek-fc-r9-390x-tf5-acetal-nickel.

Does anyone have any experience with either of them?
I am currently leaning towards the EKWB full cover as it will help cool the VRMs as well, while the Alphacool version does good but not perfect, from what I have been reading.
Anyone have any experience he would like to share?


----------



## battleaxe

Quote:


> Originally Posted by *m70b1jr*
> 
> Baked 350 degrees for 8 minutes and works. I've tried baking methods in the past but they never worked. this is a first.


Seriously? You baked it and now it's working again? That's awesome man... never tried it but did want to. Last time I was tempted to try, I was able to RMA instead, but the new card is nowhere near the card the old one was. I shoulda tried the bake method on it.


----------



## m70b1jr

Is crossfiring these cards worth it? Or wait for Vega? I found R9 390's near me for $140... I don't know what's going on around here, but I also found an FX-8350 for $50 lol.


----------



## battleaxe

Quote:


> Originally Posted by *m70b1jr*
> 
> Is crossfiring these cards worth it? Or wait for vega? I found r9 390's near me for $140.. I don't know what going on around here but I also found a fx-8350 for $50 lol.


I have a pair of 390X's and love 'em. But not sure if it's better to wait for Vega or not... It all depends on how much you want to spend, I think, really.


----------



## 12Cores

Quote:


> Originally Posted by *battleaxe*
> 
> I have a pair of 390x's and love em. But not sure if its better to wait for Vega or not... All how much you want to spend I think really.


I can run most games that support CrossFire at 4K with my two 390X's. I originally got the cards as a stopgap until Vega is released, but they blow me away with their performance, and I was planning on holding on to them for another two years. Right now, though, BF1 does not support multi-GPU, and if that continues I will be forced to get a single card that can run that game at 4K@60Hz.

Looking forward to Vega









----------



## bluej511

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi guys,
> 
> It's been a while since I posted here. Since I am currently looking into putting a custom loop into my case I was looking for waterblocks for my sweet MSI R9 390. So far I have only stumbled acroos the Alphacool NexXxos GPX r9-390-m02 which is not full cover (VRMs only passively cooled) and the EKWB full-cover block ek-fc-r9-390x-tf5-acetal-nickel
> 
> Does anyone have any experience with either of them?
> I am currently leaning towards the EKWB full-cover as it will help to cool the VRMs aswell, while the Alphacool version does good, but not perfect as I have been reading.
> Anyone have any experience he would like to share?


If your card has an EKWB available then get it; you can't go wrong with their blocks. I wanted to buy the MSI but went with the Sapphire as it had WAY better VRM and core temps, then I got the urge to watercool and there were no blocks for my card. Didn't want to buy a new card just to WC, so I bought an Alphacool block and could not be happier. At stock clocks my VRMs, with a 120mm fan blowing right over them, just barely touch 60°C in a very demanding game after 2-3 hrs. The core barely touches 40°C.

The Alphacool has quite a bit of backpressure (but gets better core temps) and the EKWB has very low backpressure and slightly higher core temps. Go EKWB, it's worth it.


----------



## chris89

*@m70b1jr*: Sick bro, that's awesome... yeah, 325 to 350 for 10 minutes should do fine. The GTX 400 series needed a hotter melt to get working again... probably could have done 325-350 though haha.

I'm up to 1172MHz, and as for what voltage a given clock needs: *stock is 1250mv at a stock 1050MHz, which is a ratio of 1.19047 (19.047%)*, so for 1172MHz multiply by 1.19047 = 1395.23mv... that's quite a bit for just 1172MHz, but that's how it's clocked at stock... *Sure, we could try 17%, 16%, 15%, or 14%*... *1172* multiplied by *1.14* equals *1336mv*, which sounds better. You just gotta find what *voltage percentage* works for your clock, artifact-free and all that jazz...

I tried 10% and it does artifact, but it completes, exceeding stock performance with a reduction in heat, noise, and power @ 1172MHz core & 1000MHz RAM (256GB/s) @ 875mv VDDCI.
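The percentage method above is just linear scaling from the stock voltage/clock point. A quick sketch in Python (the 1.19 and 1.14 ratios are the poster's numbers, not validated safe values):

```python
STOCK_MV = 1250   # stock core voltage, mv
STOCK_MHZ = 1050  # stock core clock, MHz

def volts_for_clock(target_mhz: float, mv_per_mhz: float = STOCK_MV / STOCK_MHZ) -> float:
    """Linear voltage estimate: target clock times a mv-per-MHz ratio."""
    return round(target_mhz * mv_per_mhz, 2)

print(volts_for_clock(1172))        # stock ratio (~1.19047) -> ~1395.24mv
print(volts_for_clock(1172, 1.14))  # trimmed 14% ratio -> 1336.08mv
```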

Totally stock



Tuned @ 10% But will raise to 14% 1336mv


----------



## Carniflex

Well... mine is voltage locked... so I can drag it as far as 1100 core; OCCT detects a few hundred errors over a couple-minute run, but it's stable enough for everyday usage. TBH, mine is also damn hot - with the stock cooler it was hitting 90+C under full load after ~10 minutes, so I had to reduce its power allowance to -5% to even be able to use it without my wife coming up from downstairs to complain about the noise. It really was that bad: the first time I fired it up and went to check the Firestrike score, my wife heard it from downstairs over the TV and came up to see what was going on. That was with a 140mm on the side panel blowing straight into the card.

With a custom loop all for itself the card is running 70... 72 C under full load after the temps stabilize. Alphacool GPX M04 for 390, 2x DC-LT pumps, 280mm 45mm Alphacool radiator, 4x 140mm 1200rpm in push/pull, radiator mounted externally to the case. Power allowance +10%, stock volts as this thing is locked, 1100MHz on core.


----------



## chris89

Wow, yeah, those are some hot temps. I got this reference card, which I can run past 1200MHz no problem, and if you dial in the power/TDC and fan profile it's quiet and cool under load... I can mod it to load out at 60-65C or 70-75C while noise stays low. I threw an offer on this reference 390X, which handles power effortlessly. I set my max temp to 88C; the BIOS default is over 100C, and if it ever got that hot: permanent damage. 88C prevents that. I ran it out to 1,250MHz @ almost 400 watts @ 1,400mv... pretty loud, but it didn't fail. You can dial in these cards a lot with Hawaii Bios Reader.
http://www.ebay.com/itm/HP-AMD-R9-390X-Aries-E3-8GB-Video-Card-New-832894-001-/332064729916?hash=item4d5099ab3c:g:w8UAAOSwEzxYVQ~Q


----------



## AverdanOriginal

Quote:


> Originally Posted by *bluej511*
> 
> If your card has an EKWB available then get it; you can't go wrong with their blocks. I wanted to buy the MSI but went with the Sapphire as it had WAY better VRM and core temps, then I got the urge to watercool and there were no blocks for my card. Didn't want to buy a new card just to WC, so I bought an Alphacool block and could not be happier. At stock clocks my VRMs, with a 120mm fan blowing right over them, just barely touch 60°C in a very demanding game after 2-3 hrs. The core barely touches 40°C.
> 
> The Alphacool has quite a bit of backpressure (but gets better core temps) and the EKWB has very low backpressure and slightly higher core temps. Go EKWB, it's worth it.


Thanks for the help. Now I know what to spend my money on next


----------



## Streetdragon

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Hi guys,
> 
> It's been a while since I posted here. Since I am currently looking into putting a custom loop into my case I was looking for waterblocks for my sweet MSI R9 390. So far I have only stumbled acroos the Alphacool NexXxos GPX r9-390-m02 which is not full cover (VRMs only passively cooled) and the EKWB full-cover block ek-fc-r9-390x-tf5-acetal-nickel
> 
> Does anyone have any experience with either of them?
> I am currently leaning towards the EKWB full-cover as it will help to cool the VRMs aswell, while the Alphacool version does good, but not perfect as I have been reading.
> Anyone have any experience he would like to share?


Go for EK. The NexXxoS can't cool the VRMs enough when you start to overclock. They easily hit 80+ with more than a 100mv offset on my 2 Nitros, and I have a fan in front of each block (yes, I assembled them like in the instructions).

It's a bit better than the stock air cooler but... it's not worth it. Maybe it's better on the MSI card.


----------



## bluej511

Quote:


> Originally Posted by *Streetdragon*
> 
> Go for EK. The NexXxoS can't cool the VRMs enough when you start to overclock. They easily hit 80+ with more than a 100mv offset on my 2 Nitros, and I have a fan in front of each block (yes, I assembled them like in the instructions).
> 
> It's a bit better than the stock air cooler but... it's not worth it. Maybe it's better on the MSI card.


Mine were fine with 100mv, barely reached 70°C, but I do have great case airflow, which could be the difference, and I only have one lol.


----------



## b0uncyfr0

Just benched my modded 290x in firestrike.

CPU - 3770k at 4.6
GPU - 290x at 1080/1450 (Relive 16.12.2 drivers)
RAM - DDR3 at 2400Mhz
Win 10

Score - 11420. How does this compare to overclocked 390x's or stock ones? Thanks.


----------



## bluej511

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Just benched my modded 290x in firestrike.
> 
> CPU - 3770k at 4.6
> GPU - 290x at 1080/1450 (Relive 16.12.2 drivers)
> RAM - DDR3 at 2400Mhz
> Win 10
> 
> Score - 11420. How does this compare to overclocked 390x's or stock ones? Thanks.


Is that your graphics score or full score?

My R9 390 does graphics scores of 13k at stock clocks and 15640 at 1200/1650. Stock clocks are 1040/1500.


----------



## chris89

_*@b0uncyfr0
@bluej511*_

Hi, I see your 290X is at 11,420 at up to 1080MHz? I must be limited severely by my aging processor(s) and the PCIe 2.0 bus on this card. My 390X total score is 10,615 stock, modded up to 11,206... which is the graphics score going from 12,872 to a modded 13,917. I suppose PCIe 2.0 is limiting this card in particular? Can you guys post a screenshot of your scores? Thanks


----------



## m70b1jr

Can't you overclock your pci express slot in the BIOS?


----------



## chris89

PCIe 2.0 and 3.0 differ in signaling and bandwidth: 3.0 nearly doubles effective per-lane bandwidth (8 GT/s with 128b/130b encoding vs 5 GT/s with 8b/10b). You can't overclock the PCIe "version", but you can turn up the slot's base clock from the typical 100MHz to maybe 105MHz or so (not ideal). The only thing I can do is prioritize the data between the PCIe bus and the processor(s), or between the processor(s) and memory. Maybe I will tinker a bit to see what yields the highest performance on this GPU in particular. I have it set up for most others, but I'm sure I will need PCIe 3.0 and a current-gen processor to see the real numbers this card is capable of. Firestrike test 1: stock 60fps, overclocked 65fps; that's not much... Are you guys hitting 100 to 500fps on test 1 on PCIe 3.0? It's funny - the 390X is solid, yet we go to 3DMark results and see 250fps GTX 980s and whatnot, which seems far from a valid score.
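For reference, theoretical per-direction PCIe bandwidth is lanes × transfer rate × encoding efficiency. A quick sketch using the published link rates (simplified - real throughput loses a bit more to protocol overhead):

```python
def pcie_gbps(lanes: int, gt_per_s: float, payload_bits: int, symbol_bits: int) -> float:
    """Usable PCIe bandwidth in GB/s (decimal) per direction."""
    return lanes * gt_per_s * payload_bits / symbol_bits / 8

print(pcie_gbps(16, 5.0, 8, 10))     # PCIe 2.0 x16 (8b/10b)    -> 8.0 GB/s
print(pcie_gbps(16, 8.0, 128, 130))  # PCIe 3.0 x16 (128b/130b) -> ~15.75 GB/s
```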


----------



## Removed1

Quote:


> Originally Posted by *Carniflex*
> 
> Well .. mine is voltage locked ... So I can drag it as far as 1100 core and OCCT detects few hundred errors over couple minute run but its stable enough for everyday usage. TBH at least mine is also damn hot - with the stock cooler it was hitting 90+C under full load after ~10 minutes so I had to reduce its power allowance to -5% to even be able to make use of it without wife coming up from downstairs to complain about the noise. It really was that bad, first time I fired it up and went to see firestrike score wife heard it from downstairs over watching the TV and came up to see what is going on. That was with an 140mm on a side-panel blowing straight into the card.
> 
> *With a custom loop all for itself the card is running 70... 72 C under full load after the temps stabilize.* Alphacool GPX M04 for 390, 2x DC-LT pumps, 280mm 45mm Alphacool radiator, 4x 140mm 1200rpm in push/pull, radiator mounted externally to the case. Power allowance +10%, stock volts as this thing is locked, 1100MHz on core.


There is some problem there?! Did you check the contact between the GPU and the block?
Because 70° is really too much under water, even if your 390 is an oven.
Voltage locked? You can't raise the voltage with WattMan? Tried to flash the BIOS with a modded one?

With my 290 I top 14670pts on GPU score; before the ReLive drivers 15k was doable, at 1280/1665MHz, under water.


----------



## Carniflex

Quote:


> Originally Posted by *Wimpzilla*
> 
> There is some problem there?! You checked the contact between the gpu and block?
> Because 70° it is really too much under water, even if your 390 is an hoven.
> Voltage blocked? You can't raise the voltage with Wattman? Tried to flash the bios with a modded one?*
> 
> With my 290, i top 14670pts on gpu score, before Relive drivers 15k was done, at 1280/1665Mhz, under water.


I think I probably just got a particularly hot one in the silicon lottery. I mean, it is not average for a 390X to hit 90+C at stock settings either, with a 140mm blowing cool air from outside right into the stock cooler. The water-to-core delta is pretty normal, with the core running approx 20C above water temp at full load. I.e., my water temp is hitting around 50C when the core is hitting 70C, so there is clearly enough thermal energy being transferred. At idle I'm usually sitting around 45C at the core with water at about 30-33C, while room ambient is about 20-23C. I have not looked at what amps my PCIe rails report, but the total load for the whole system when gaming is around 600W. That is an [email protected] GHz with a very minor voltage bump, a 390X at stock volts with the core at 1100MHz and power at +10%, and an idle 7870 running auxiliary displays.

I have not tried a modded BIOS. I'm not that motivated, TBH, and considering the temperatures mine is putting out, I probably don't have much headroom even if I manage to push the volts a bit. I'm just going to settle for 1100 core for the time being, and when that ain't cutting it anymore I can either try to find a second-hand 390X to crossfire for the titles where that works, or it's time to upgrade the main card.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Streetdragon*
> 
> Go for EK. The NexXxoS can't cool the VRMs enough when you start to overclock. They easily hit 80+ with more than a 100mv offset on my 2 Nitros, and I have a fan in front of each block (yes, I assembled them like in the instructions).
> 
> It's a bit better than the stock air cooler but... it's not worth it. Maybe it's better on the MSI card.


Quote:


> Originally Posted by *bluej511*
> 
> Mine were fine with 100mv, barely reached 70°C but i do have great case airflow could be the difference, and i only have one lol.


Yeah, thought as much. The Alphacool would cost around 100€ over here and the EKWB 120€; considering the better cooling for the VRMs, I might as well pay the extra 20€ just in case (it doesn't really make a difference considering all the money spent on fittings and such for a custom loop).

By the way, does the backplate make sense to buy for the EKWB? Other than a bit of stability (which shouldn't be needed anymore, as the heavy cooler will be taken off and swapped for a lighter water block) and looks, I do not see much point in it.


----------



## Removed1

Quote:


> Originally Posted by *Carniflex*
> 
> I think I probably just got an particularly hot one with silicon lottery. I mean it is not an average for 390x to hit 90+ C on stock settings either with a 140mm blowing cool air from outside right into it with a stock cooler. Water delta to the core temp is pretty normal with the core running approx 20C under water temp at full load. I.e., my water temp is hitting around 50C when the core is hitting 70C. So there is clearly enough thermal energy transferred. At Idle I'm usually sitting around 45C at core with water at about 30.. 33C while room ambient is about 20..23C. I have not looked what amps do my PCIe rails report but the total load for the whole system when gaming is around 600W. That is an [email protected] GHz with very minor voltage bump, and 390X at stock volts with core at 1100 MHz and power at +10% and an idle 7870 running auxiliary displays.
> 
> I have not tried the modded bios. I'm not that motivated TBH and considering the temperatures mine is putting out I probably do not have that much headroom either even if I manage to push the volts a bit. I'm just going to settle for 1100 core for time being and when that aint cutting it anymore I can either try to find a second hand 390X to crossfire for the titles where that thing works or its time to upgrade the main card.


Not to be a bore, but I rather believe there is a problem there. Your water/block temperature should not climb so high, same as the GPU core.
Like you wisely said, if you notice that heat is being exchanged, that means the contact is good.
Nevertheless, the problem is elsewhere, so I would simply guess the water is not flowing through the loop, or you didn't fill the loop carefully and there are air bubbles somewhere.
It is not possible to get the same temperatures as air cooling with watercooling, especially with a full block like yours; I've got a stupid H100 on the 290, modded OK, but it still sits at 65° full load at 1280MHz at 1.32v.
So you should troubleshoot the water flow in your small loop. Hope I helped a bit and wasn't boring.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Carniflex*
> 
> With a custom loop all for itself the card is running 70... 72 C under full load after the temps stabilize. Alphacool GPX M04 for 390, 2x DC-LT pumps, 280mm 45mm Alphacool radiator, 4x 140mm 1200rpm in push/pull, radiator mounted externally to the case. Power allowance +10%, stock volts as this thing is locked, 1100MHz on core.


Quote:


> Originally Posted by *Carniflex*
> 
> my water temp is hitting around 50C when the core is hitting 70C. So there is clearly enough thermal energy transferred. At Idle I'm usually sitting around 45C at core with water at about 30.. 33C while room ambient is about 20..23C. I have not looked what amps do my PCIe rails report but the total load for the whole system when gaming is around 600W.


My MSI is hitting 35-40C at idle with my slight daily overclock of 1060/1550 (actually running while I write this), and room temp is around 19C.
I am not on water yet, so still the standard MSI Zero Frozr cooler, but I changed the TIM half a year ago.

Looking at your Kohver v.4, it seems the only exhaust fans on the case are your radiator fans, which means the hot air from the CPU will also need to go through there... OK, you have your PSU sucking in hot air from the CPU... how does the case get fresh cold air? Or am I wrong in thinking your fans on the radiator are pushing and pulling air out of the case?


----------



## chris89

I could probably get a tad bit more, but this is on an untouched reference 390X. I'm guessing the RX 480 still outperforms this card on a lot less power, huh? This card is ASIC-temperature limited, as all others are. Had to set this turboprop of a blower to 75%, which could be felt from at least 3 feet off the back of the case. Core temp was just 75C, but I have max ASIC temperature @ 88C to prevent damage. It shut down at first on a stock-ish profile; had to set it manually to pass the test @ 1438mv (1,250MHz) & a crazy 320W/320A TDP.


----------



## Dundundata

I have owned the lovely MSI 390 for well over a year now, and this card seems to just get better. Well, it's most likely drivers and patches for games, but it shows how impressive it is. I re-pasted recently, which gave a slight but noticeable decrease of a few degrees. Temps are great; VRM sub-70C.

For stable clocks here are my results:

*volts/clocks/max tempC (firestrike graphics score):*
stock 1040/1500 (12.7k)
stock 1100/1500, 65 (13.5k)
+50mV 1140/1600, 72 (13.9k)
+100mV 1170/1625, 76 (14.3k)

I run either stock or +50; the other settings are just for benchmarks. The cool thing is that at stock voltage most games run a solid 1080/60. Witcher 3 has really been my most demanding test. Even when things pass in Firestrike, W3 may demand more volts!


----------



## bichael

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Yeah thought as much. the Alphacool would cost over here around 100€ and the EKWB 120€ considering the better cooling for VRMs i might aswell pay the extra 20€ just in case (doesn't really make a difference considering all the money spent on fittings and such for custom loop).
> 
> By the way. does the backplate make sense to buy for the EKWB? other than a bit of stability (which shouldn't be needed anymore as the heavy cooler will be taken off and swoped for a bit lighter water block) and looks I do not see much point in it.?


If I were choosing now, I would go with the Bykski full-cover block. It might depend on what price you can get one delivered for, but for me they are a lot cheaper than EK and the build quality still looks good. I haven't found any feedback on their 390 blocks, but there's a thread about Bykski stuff somewhere and the feedback seems okay. I've used Barrow for all my fittings, which have been great, so I'm pretty open to a Chinese brand.
https://world.taobao.com/item/521275666173.htm?spm=a312a.7700714.0.0.yj1W3l#detail

My GPX keeps the core <55°C and the VRMs <75°C, but I never overclocked much at all. I'm also now thinking about going to a full cover, mainly due to changing to a case that doesn't have any case fans, which is a pretty bad combination with the GPX...

And yeah, as far as I know the EK backplate is mostly cosmetic. The water block will be pretty heavy though, as it's basically a lump of metal.


----------



## Carniflex

Quote:


> Originally Posted by *AverdanOriginal*
> 
> My MSI is hitting 35-40C in idle and my slight daily overclock of 1060/1550 (actually while writing this right now) and room temp is around 19C.
> I am not on water yet, so still the standard MSI Zero Frozer cooler but I changed the TIM half a year ago.
> 
> looking at your Kohver v.4 it seems your only exhaust fans on the case are your radiator fans. which means also the hot from the CPU will need to go through there... ok you have your psu sucking in hot air from the CPU.... how does the case get fresh cold air? or am I wrong in thinking your fans on the radiator are pushing and pulling air out of the case?


That is correct. The airflow in the case is in from the back and out the front through the radiator. There is also a 120mm behind the reservoir pushing air out through the grill above the radiator. The PSU fan is actually off unless I'm also using the second gfx card; it's a 1.2kW unit which runs passive up to around 550W. There is also an 80mm in a little air duct pushing some fresh air from outside into the CPU heatsink. So the air comes in over the GPU area through the PCIe slot opening, through the PSU, and via the fan sucking in air from the back, and is exhausted at the front.

So the airflow is less than ideal, but there is plenty of it, and I'm reasonably sure there aren't any hot air pockets inside the case this way.

Then there is a slight problem with the core area of the block itself - one of the screws does not align properly, so the core area has only 3 screws as opposed to the 4 it's supposed to have. But as noted, the heat exchange is good enough, so I'm not blaming that for my temperatures. Because the card was already hitting 90C at stock settings in my previous case with decent enough airflow, I'm convinced it's just the card itself that is particularly hot.

Water flow should be decent enough, because switching off the second pump does not affect the temperatures by more than about 1C. It is a very simple loop with the majority of the restriction coming from the GPU block. I do not believe an air bubble could be stubborn enough to survive in that loop, because this is a mobile system - it works in an upright position but is transported on its side, so there is no comfy pocket for an air bubble to settle into undisturbed. Unless it has found some particularly clever spot somewhere in the GPU block.


----------



## simonfredette

I'm having all kinds of trouble overclocking this card, or even getting it to work properly at all during tests. A friend of mine bought a 1070 and ran the Time Spy test, so I figured I'd give it a go, and my results were worse than a gaming laptop!! That being said, when I looked at the compare settings, the core clock was 300 and the memory clock 800. I'm trying to use Afterburner to lock in the core and memory to what I want but have not been able to; the power slider is grayed out and I have no fan control. I'm running drivers 16.12.1.



How can I unlock the controls I need in Afterburner, and what's wrong with Time Spy that it doesn't use the proper clock speeds? Also, I remember when I bought this card it had a handy tray icon that let you choose between performance and quiet, change LED settings, etc. Whatever happened to that?


----------



## b0uncyfr0

Quote:


> Originally Posted by *bluej511*
> 
> That your graphic score or full score?
> 
> My r9 390 does graphic scores of 13k at stock clocks and 15640 at 1200/1650. Stock clocks are 1040/1500.


Ah sorry, I should have specified: that is my graphics score. I managed to find a safe sweet spot running 1090/1500. My graphics score is now 13.66k. How is that?

My card has horrible vdroop. With +50 in Afterburner, I'm seeing 1.164v as the average. No wonder some are pushing past 1100 with just a slight nudge in voltage.


----------



## chris89

Oh yeah - before flashing, you will want to change the ASIC values in the far-right tab of Hawaii Bios Reader back to your own card's settings, i.e. copy your card's values into these ROM files. Altering any one ASIC value from your card's factory spec will cause a no-boot black screen, so at least for me, I don't touch them.

Hi, I've been working on finding the best speed and stability out of my 390X. Anyone could flash this BIOS on their 390X. I play at 3200x1800, so I went looking for the coolest, fastest, most stable BIOS to run. Having the ASIC max temp @ 88C is very important, as it comes into play on the hotter variants of this BIOS. I tested 1166MHz with no issue, then 1177MHz, up to 1250MHz. I find that on air, or at least stock air, the card can reliably pull only up to 1177MHz throttle-free; anything beyond 1177MHz hits the ASIC temp shutdown, which is very good, as my card is fine. I also tested the watt TDP to ampere TDC ratio, and too great a gap between them is bad; I find a TDC about 10 amps below the TDP wattage is ideal. When running at 230 to 240 watts TDP, 3200x1800 or 4K @ 400 gigabytes per second is really smooth - something like 40fps or more @ 1177MHz in ROTR DX12 exclusive fullscreen. I did find that @ TDP 200W / TDC 190A it will stay at around 75C load. Loud, yes, but you can't make this card quiet unless you're on water or just running it at 1000MHz. A slight reduction in VDDCI from 1000mv to 975mv helps reduce temperature by 2.56% @ 1,563MHz memory clock and 400 gigabytes per second. Memory clock helps when rendering large open scenes like GTA V and ROTR, and Witcher 3. Although it blurs through hefty wide-open scenes effortlessly @ 230 to 240 watts @ 1177MHz, it does get a bit louder and hotter; I use earphones to ignore the unavoidable noise. 240 watts is peak potential for 1177MHz, and it runs so smooth with everything completely maxed @ 3200x1800. So here I made a zip of all the BIOSes if anyone wants to try... you always have the switch, so there's nothing to worry about, and I thoroughly tested these settings.

The lower-TDP BIOS scales clocks with temperature a lot better; the higher-TDP one is a solid flat WattMan line of 1177MHz at 240W. I do have the temp target max at 80C, since damage occurs above 88C, and if the GPU is allowed to reach 84-86C the ASIC max temp shuts the GPU down to prevent damage (i.e. black screen); the ASIC runs a couple of degrees Celsius hotter than the GPU reading, limited by the thermal conductivity of the thermal material used. So with the max target at 80C, it will ride out 1177MHz until 80C is hit, then scale back to stay stable at or below 80C, which settles the ASIC temperature in the 80-87C range. It's a tight performance window, but it does very well with these BIOSes. Good luck!
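The hold-until-target-then-scale-back behavior described above amounts to a simple feedback loop. A toy sketch of the idea (step size and limits are made up for illustration; the real PowerTune controller in the BIOS is far more sophisticated):

```python
def next_clock(temp_c: float, clock_mhz: int,
               target_c: float = 80.0, max_clock: int = 1177,
               min_clock: int = 300, step: int = 13) -> int:
    """Toy PowerTune-style throttle: drop the core clock while over the
    temperature target, creep back toward max when under it."""
    if temp_c > target_c:
        return max(min_clock, clock_mhz - step)
    return min(max_clock, clock_mhz + step)

# Over the target -> back off one step; under it -> ride at max again.
print(next_clock(84.0, 1177))  # 1164
print(next_clock(76.0, 1164))  # 1177
```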









390X-1177Mhz-400GBps-Warm-Hot-Hotter-Hottest-Molten.zip 496k .zip file


----------



## dagget3450

Finally got my quad 390X setup installed on waterblocks and in the system. The only issue is that my CPU is a Xeon temporarily, until I get my i7 back from Intel RMA. Hope to have some testing and benches soon. I'm specifically going to run super high resolutions, as I want to test VRAM limits. Sadly, the lack of CrossFire support in newer games means I'm limited to games that have it.

I'm using Fury X cards as well, and at high resolutions I'm hitting VRAM limits. It sucks, because they have the power but not always enough VRAM. I hope the new benchmarks from 3DMark and Unigine get here soon.


----------



## Regnitto

Took me forever to find this, so I thought I'd share it with you guys.
https://www.amazon.com/gp/aw/d/B01MECH12B/ref=aw_wl_ov_dp_1_1?colid=31BUOK6U2UNOB&coliid=I8CK6DVSC0EL1&vs=1&th=1&psc=1
Replacement fans for Strix DC3 390x/980ti


----------



## chris89

So I finally got bored of the factory specification of my 390X; it did well, but I wanted to improve it. Looking at the factory rear plate (no picture shown), roughly 13.33% of the air escaping through the reference cooler's outlet is blocked off by the grille, which means the card runs hotter than it has to. So, as I've done before, I modified the plate to pass the full CFM the card can move, at least until I remove the DVI ports (flux and a heat gun, about 3 minutes of work), to get the maximum thermal performance out of this reference AMD R9 390X.

It would be cool if AMD moved away from the standard run-of-the-mill rear GPU IO plate and made a clean "grille-less" rear outlet, where the cooler's plastic has rolled edges for a sort of afterburning-turbofan appearance (a rolled-edge oval with nothing blocking airflow at all). So far I've seen around a 10C reduction without removing the DVI ports; after DVI removal, maybe 13-15C. More CFM at lower fan speed means a cooler, higher-performing graphics card. I also removed the VRM pads and used Arctic Silver Ceramique 2, and the VRM stays perfectly fine (a reasonable 65-75C at TDP). The VRM temperature can't be improved further unless I use 2,000 W/mK Panasonic PGS, strip the entire cooling plate and heatsink of paint and finish, and have it plated in a thick, beautiful coat of shiny copper.







Which I very much plan on doing. I'll probably also buy a large 0.25"-thick piece of copper to make a custom backplate to absorb rear-PCB heat, which currently has nothing pulling it away, and man, it gets hot. (By the way, a backplate alone does nothing at all for thermals; you must add a full-length, full-width piece of 2mm-thick thermal material to effectively transfer the heat from the PCB to the copper.) If you set Max Temp and High Temp to your desired maximum, the card will maintain that temperature: set them both to 75C, say, and it will scale clocks back to hold 75C. That's why I make the BIOS clocked faster, so it scales to peak performance within that temperature envelope.


----------



## m70b1jr

So, since no Vega right now, would picking up another XFX R9 390 near me for $140 be worth it?


----------



## christoph

Quote:


> Originally Posted by *m70b1jr*
> 
> So, since no Vega right now, would picking up another XFX R9 390 near me for $140 be worth it?


140? what?

buy it and sell to me for 100


----------



## m70b1jr

Quote:


> Originally Posted by *christoph*
> 
> 140? what?
> 
> buy it and sell to me for 100


Yea, the guy has had it up for a while and has been lowering the price.


----------



## christoph

Quote:


> Originally Posted by *m70b1jr*
> 
> Yea, the guy has had it up for a while and has been lowering the price.


then get it


----------



## dagget3450

Man, I just spent a lot of time tracking down an issue between my X99/Xeon and the 390X. Basically I cannot get PCIe 3.0 to work with my 390X, but my Fury X does just fine. I have an X79 I'll port them to and test. So weird; I broke down my 4-GPU-plus-CPU water blocks and loop twice before figuring it out. I POST-tested the cards dry-blocked on an X58 board. In the BIOS on the X99 I had to set PCIe to Gen2 or Auto, and when testing on Auto on the X99 I only get 2.0 speeds. Something's up with the X99 or the 390X.

On top of that, running high resolutions is being hampered by PCIe 2.0.









I may just have to put these to the side until i get my cpu back from intel and put them in a backup rig.


----------



## bluej511

Quote:


> Originally Posted by *dagget3450*
> 
> Man i just spent alot of time to find out an issue with my x99/xeon and the 390x. Basically i cannot get pcie3.0 to work with my 390x but my furyx does just fine. I have an x79 ill port them to and test. So wierd, i broke down my 4 gpu and 1 cpu water blocks and loop twice before figuring it out. I post tested the cards dry blocked on an x58 board. In the bios on the x99 i had to put pcie to gen2 or auto. When testing them on auto x99 i get only 2.0 max speeds. Something up with the x99 or 390x.
> 
> Top of that running high resolution is being hampered by pcie2.0
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I may just have to put these to the side until i get my cpu back from intel and put them in a backup rig.


You know there's not much performance difference between 2.0 x16 and 3.0 x16, right?


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> You know between 2.0x16 and 3.0x16 theres not much difference in terms of performance right?


Only one of them is getting x16 2.0; the other three seem to be x8 2.0. It's really weird on this X99 board. I'm also pushing 6400x3600, since I'm limited to 3800x1600 VSR.

Also, I just remembered only two of the four slots on this board are physically wired for full x16. I'll have to double-check, but I believe that means x8/x8/x16/x8 in four-way.

So the BIOS sets the PCIe version, and the slots operate at whatever speed they're wired for.


----------



## diggiddi

Quote:


> Originally Posted by *dagget3450*
> 
> Finally finally got my quad 390x installed on waterblocks and in system. The only issue is my cpu is a xeon tempoarily until i get my i7 back from intel rma. Hope to have some testing and benchs soon. I am specifically going to run super high resolutions as i want to test vram limits. Sadly lack of crossfire in newer games will mean only games that have crossfire.
> 
> I am using fury x cards as well and at *high resolutions i am hitting vram limjts*. It sucks because they have power but not always enough vram. I hope the new benchmarks get here soon from 3dmark and unigine.


What games are having these issues?


----------



## diggiddi

Quote:


> Originally Posted by *simonfredette*
> 
> Im having all kinds of trouble overclocking this card or even getting it to work properly at all during tests. A friend of mine bought a 1070 and ran the time spy test soI figured id give it a go and my results were worse than a gaming laptop!! That being said when I looked in the compare settings the core clock was 300 and memory clock of 800. Im trying to use afterburner to lock in the core and memory to what I want but have not been able to, the power slider is grayed out and I have no fan control. Im running drivers 16.12.1.
> 
> 
> 
> How can I unlock the controls I need for afterburner and whats wrong with time spy that it doesnt use the proper clock speeds. Also I remember when I bought this card it had a handy tray icon that let you choose between performance and quiet and change led settings etc. what ever happened to that ?


Reinstall the software; try using Revo Uninstaller to clean the registry.


----------



## dagget3450

Quote:


> Originally Posted by *diggiddi*
> 
> What games are having these issues?


Well, pretty much any game once you push the resolution far enough. However, with the game I'm currently playing I suspect it's more down to bad CrossFire support in the game engine (maybe no support at all).

On the Fury X setup I find in Elder Scrolls Online that enabling the third GPU causes major pauses and spikes in certain busy areas in town, while out in the wild it runs rather smoothly. The resolution I'm using for this test is 4x 4K, so 7680x4320 I think (not on the PC atm). On the 390X at 6400x3600 I can see VRAM usage going just over 4GB; on the Fury X it seems to float right around 4GB as well.

A similar example is The Witcher 3 at 6400x3600: I get playable frame rates (60-30fps) on two Fury Xs, but throw in a third GPU and it's the same issue. Interesting to see. It doesn't help that I'm on a low-clocked, locked Xeon at the moment; I'm waiting on my 5960X back from Intel, and I don't doubt that will help as well.


----------



## chris89

I recommend not using any overclocking software, but if anything, ASUS GPU Tweak II works best. Trixx worked great on the 380X and Polaris, but on the 390X, hitting Apply or even Reset causes graphical corruption. I use Hawaii Bios Reader if I want to change something: add a couple more fan % at a particular temperature, a bit more TDP to hold true at 1177MHz, or reduce memory-controller temperature and power consumption (VDDCI from 1000mV down to 975mV works up to 1563MHz).

Send me your BIOS .rom dump (via GPU-Z) and I can dial it in to your preference and make some solid gains, with safety being key; we want our cards to last. In case anyone didn't know: ASIC temperature by itself is unmonitorable, and if it exceeds 90C, damage occurs; let that go on long enough and you've got a bad card. So setting the ASIC max to 88C is paramount.

I think you could reduce ASIC temperature by 20C if you used thermal glue between the caps to let their heat spread evenly across all of them. Rather than letting some caps run smoking hot and others okay, we can even out the temps of all the caps, increasing performance and longevity along the way. These cards use top-of-the-line VRM modules; there isn't really anything better, per se. It's the caps that fail, or more specifically the solder for the caps melts if the ASIC temperature goes above 90C, causing a non-booting card over time. I know this because I received two bad 290X cards which had been run on the stock 95C BIOS; the solder on all the caps had melted and cracked. I'm replacing all the caps to get them going again, and modding the BIOS to an ASIC max of 88C so it never happens again.

Just for shiznizzies and giggles, I'll mod the 290Xs to a 390X BIOS and run 3x "390X" CrossFire. They're 4GB cards, so it will limit my 8GB 390X to 4GB, correct? Or does DX12 stack memory?

HawaiiBiosReader-master.zip 73k .zip file


atifwinlash_274.zip 1214k .zip file


GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file


I always BIOS mod. I can run a solid 1177MHz core @ 1333mV and 1563MHz RAM @ 975mV VDDCI, at 233W / 223A, with minimal throttling (it only micro-throttles at 233W; 240-280W is throttle-free), at 77C with 63% fan and VRM1/VRM2 at 70C.

I had two 290X cards/coolers lying around to compare against my reference HP cooler. Popped the 290X cooler on and it's 10C lower on both core and VRM. By the way, the reference card comes stock with grey 17 W/mK Fujipoly pads; the only thing better is Panasonic PGS at 2,000 W/mK.


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> I recommend not using any software, but if anything ASUS Tweak II works best... Trixx worked great on 380x and Polaris but 390x if u hit apply or even reset, graphical corruption... I use Hawaii Bios Reader if I want to change something... add a couple more fan % at a particular temperature or a bit more TDP to hold true at 1177mhz or reduce memory controller temperature and power consumption by 14.28% (VDDCI from 1000 to 975 up to 1563mhz)...
> 
> Send your bios .rom dump via GPU-Z and I can dial it in to your preference... Make some solid gains as well among safety being key, we want our cards to last. Just if anyone didn't know, ASIC temperature by itself is unmonitorable and if it exceeds 90C damage occurs... This goes on long enough, you got a bad card. So setting ASIC to 88C is paramount. I think you could reduce ASIC tempature by 20C if you use thermal glue between caps to allow their heat to distribute evenly across all caps. Rather than allowing some caps to run smokin hot and others okay, we can combine their temperature and even out the temps of all caps/ increasing performance and longevity along the way. These cards use top of the line vrm modules, there isn't anything better per say. It's the caps that fail, or more specifically the solder for the cap melts if ASIC temperature goes above 90C. Causing a non bootable card over time. I know this because I received two bad 290X cards which had been ran on stock 95C bios. All caps solder had melted and cracked. I'm replacing all the caps to get them going again and modding the BIOS to max ASIC of 88C so it will never happen again.
> 
> HawaiiBiosReader-master.zip 73k .zip file
> 
> 
> atifwinlash_274.zip 1214k .zip file
> 
> 
> GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file
> 
> 
> I just always BIOS mod, I can run solid 1177mhz core @ 1333mv & 1563mhz ram @ 975 vddci @ 233W @ 223A minimal throttle (only micro throttles at 233w/ 240-280w throttle free) 77C 63% fan VRM 1/2 70C.
> 
> I had 2x 290x cards/ coolers laying around to compare to my reference hp cooler... Popped on the 290x cooler and its 10C less core and VRM. By the way the reference comes stock with grey 17w/m k fujipoly. Only thing better is Panasonic PGS 2,000w/m k.


You got 10C less with the 290X reference cooler? I have XFX 390Xs that have the smaller heatsink, like in your pic on the right, and I have 290X reference coolers lying around too. I may swap them over if I ever pull the cards off the water loop.


----------



## chris89

@dagget3450 Sweet, dude! So nice having 290Xs lying around, haha. Yeah, the 290X cooler is quite a bit heavier, so it's transferring VRM and core heat into a larger fin area. It's a little cooler and a little quieter, and worth it for now, IMO; it holds 1177MHz way better (the 390X cooler ran 80-88C at 80% fan, compared to 77C at 63% on the 290X cooler so far at 1177MHz, and I reused my new HP fan rather than the red one). A must until I water cool mine.

I have an Alienware Aurora R4 CPU water cooler; is that sufficient to cool a 390X? I'd have to make a custom mounting brace, but I want to try water cooling for once. I pulled the block apart thinking it was gunked up, and it was clean as a whistle, haha. I plan on replacing the hose with clear vinyl; will clear vinyl work? It's an AIO 120mm water cooler; can I mod the hosing up to a reservoir? I need some good old Water Cooling 101 on building custom loops from an AIO cooler, haha.









I'm seriously considering sending all three of my Hawaii XT coolers (minus the fans) off to a plating company, having them stripped of stickers and paint, and plating them in beautiful, sexy copper! I want to get some copper spray paint for the fan and shroud as well, and add a copper backplate. I dream of decked-out, all-copper Hawaii XTs!














What if AMD's next card (a possible Founder's-Edition-style release, expensive, but I WOULD SO FLIPPIN' PAY!) was unveiled in all copper? Haha. So beautiful, sexy and shiny as the polished copper glistens in the light, copper being the coolest-running, most thermally conductive metal short of very expensive ones like silver and exotic materials above 500 W/mK.


----------



## gordesky1

So, with the WattMan drivers, it seems like my MSI 390 isn't detecting overclocked clocks? I usually run 1650-1700 memory through MSI Afterburner, but on these drivers, when I set anything over 1500, Afterburner shows it correctly (as does GPU-Z in the sensors section), yet HWiNFO64 still shows it running at 1500, and even GPU-Z's main screen shows it running at stock?

Here are pics.


----------



## chris89

@gordesky1 Upload your stock BIOS .rom and I can get it up to 1177 or 1188MHz (you're on water, right?), and the memory could go to 1758MHz max at 1000mV VDDCI, or maybe 1100mV (HWiNFO reports memory errors, though games are fine at 450 gigabytes per second). I find even 1166MHz is too laggy at 3200x1800 in high-end games, so 1177MHz at 1333-1366mV runs really nicely. I prefer the WHQL-certified 16.9.2 driver until another WHQL release comes out, since non-WHQL drivers give weird results and values in HWiNFO. Running 16.9.2 gives accurate readings in my preferred HWiNFO v5.24, which is less busy than the newer versions.

I find 1563MHz is the most reasonable memory clock power-wise. Going from 975mV VDDCI @ 1563MHz to 1100mV VDDCI @ 1758MHz costs an extra ~70 memory watts. You can run 1758MHz at 1000mV VDDCI and save about 30 of those watts, but with errors; that's still ~140W at 1758MHz, versus ~100W for 1563MHz at 975mV. So gaining 50 gigabytes per second costs 40-70 watts, with or without errors; a lot of extra power for the bandwidth.
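Using the wattage figures quoted above (around 100 W at 1563 MHz / 400 GB/s versus around 170 W at 1758 MHz / 450 GB/s; these are the poster's own estimates, not datasheet numbers), the marginal cost of that last bandwidth step works out like this:

```python
def marginal_watts_per_gbs(w_low: float, gbs_low: float,
                           w_high: float, gbs_high: float) -> float:
    """Extra watts paid per extra GB/s when stepping up the memory clock."""
    return (w_high - w_low) / (gbs_high - gbs_low)

# ~100 W @ 400 GB/s vs ~170 W @ 450 GB/s (figures from the post above):
print(marginal_watts_per_gbs(100, 400, 170, 450))  # 1.4 W per GB/s
```

About 1.4 W per GB/s for the 1563 to 1758 MHz step, which is why the post calls 1563 MHz the reasonable stopping point.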

After some more testing I also found it can run 1563MHz RAM at 966mV instead of 975mV, meaning about 3.52% less power for the memory VRM, roughly 2C and about 5 watts (from 105 down to 98). Mafia III, like every other game on this card, needs more TDP for full fps: even though 1177MHz doesn't throttle at 233W, it runs about 5fps faster at 244W. The higher you go, the higher the fps and temperatures, until you reach a really nice smooth frame rate. I dialed my fan profile in to a decent speed at 78C. With the FPS display on, I took it up to 1200MHz; hot and loud, about 82C, but fps is 20-30, mostly 22-28, which is on par with a GTX 1080.

Oh yeah: if it BSODs because you forgot to close MSI Afterburner or HWiNFO before opening ATIWinFlash, turn off the power supply immediately! The card goes into an electrical anomaly and the caps start smelling, so kill the PSU right away to prevent this. During these BSODs all the GPU BIOS safety features are inactive and the ASIC temperature goes sky-high. Not a major deal, but if it BSODs, run to the power supply switch, haha.









Once all apps like HWiNFO and MSI Afterburner (and anything else that so much as thinks about the GPU) are closed, you can open ATIWinFlash; this applies to all AMD cards. Once flashing begins, be patient: the image is large, so it can take around 5 minutes. Well worth it, though.

I find it takes 5-7 minutes to flash, though I've found a way to cut it to about 2 minutes. Close all apps, open ATIWinFlash and wait for it to load, then start a YouTube video and switch back to ATIWinFlash; you can't touch anything else, so just enjoy the video while it flashes. Keeping the GPU slightly alert seems to speed up the process. Once complete, obviously restart.

By the way, guys, setting Fan Control Type to 0 is very important; at 1 the card ignores your fan settings entirely, the fan speed stays too low, and temps go sky-high. Setting it to 0 makes it actually abide by your specific fan table, so you can dial it in the way you want without some nonsense custom fan profile in an application.
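The fan table this refers to is just a set of (temperature, duty) points the controller interpolates between. A minimal sketch of that interpolation, with illustrative curve points that are not from any particular BIOS:

```python
def fan_duty(temp_c: float,
             curve=((40, 25), (65, 45), (78, 70), (85, 100))) -> float:
    """Piecewise-linear fan duty (%) for a temperature, clamped at both ends.
    `curve` is a sorted tuple of (temp_C, duty_%) points; these defaults are
    made up for illustration."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_duty(30))    # 25 (floor)
print(fan_duty(71.5))  # 57.5 (halfway between the 65C and 78C points)
print(fan_duty(90))    # 100 (ceiling)
```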

I did some more testing with other games, everything maxed out, and well beyond maxed since I have the AMD Settings cranked as well (minus AA). It doesn't exceed 79C, but the fan keeps hopping to 100% and back down continuously (zero 1177MHz throttling, though, and it holds its fps). So I have a few options, but the most logical would be to raise Max Temp to 88C and High to 85C, and raise the high-temp fan speed to 80%, or maybe 75%, but most likely around 85%.

I was playing Mafia III at 3200x1800 on High with the AMD Settings set like this, and it performs beautifully, way up there with the high-end Nvidia cards, which feels good. Temps, noise, power consumption, though... but who cares? The 8GB 390X cost me in the low $200 range. Mafia III totally tweaked out to the max at 3200x1800 at a really nice, smooth ~35fps is pretty good, no? I see 390X reviews where the 390X gets smoked by everything and runs too hot while doing it, and I'm sitting here thinking we can "boost the beast": spend power consumption and noise, reduce temps, and hang with the big dogs. Why not? HAHA



High preset, 3200x1800 (a bit loud and hot at 1200MHz, but this game is a beast, probably the most demanding to date).







I think the game is memory-bandwidth limited. I'm at 400 gigabytes per second; maybe if I cranked it to 450 it would go up, but who cares? Here's stock voltage (65288) @ 1133MHz...


----------



## m70b1jr

The guy lowered the XFX R9 390 on Craigslist to $120... CrossFire, here I come!


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> @gordesky1 Upload your stock bios .rom and I can get it up to 1177 or 1188 (Your on water, right?) and memory could go to 1758Mhz max @ 1000mv VDDCI or maybe 1100mv VDDCI (Memory errors by HWInfo though games are fine @ 450 Gigabytes per second). I find even 1166Mhz is too laggy @ 3200x1800 on high end games, so 1177mhz runs really nice @ 1333mv to 1366mv. I prefer WHQL Certified 16.9.2 until another WHQL Certified driver comes out. Since non-whql gives weird results and values in HWInfo. Running 16.9.2 gives accurate results on my preferred HWInfo v524 as its less busy than the newer ones.
> 
> I find 1563mhz is most reasonable power wise. I went from 975mv vddci @ 1563mhz to 1100mv vddci @ 1758mhz and it's an extra 70 memory watts... you can run 1758mhz @ 1000mv vddci and run about 30 watts but with errors... Still 140 watts @ 1000mv vddci @ 1758mhz... So 1563mhz @ 975mv vddci is about 100 watts... a lot less power for 50 Gigabytes per second... to gain 50 gigabytes per second is 40-70 watts with or without errors.
> 
> I find also after some testing that it can do 1563mhz ram on 966mv from 975mv, meaning a 3.52% reduction in VRM temperature for the memory VRM about 2C, and about 5 watts from 105 to 98 watts. MAFIA III among every other game with this card needs more tdp to full fps. Even know 1177mhz doesn't throttle @ 233 watts, it runs about 5 fps faster at 244 watts... The higher you go, the higher the fps goes and temperatures until you get to a real nice smooth fps. I dialed in my fan profile to 78C decent fan speed. After turning on the FPS display, I took it up to 1200mhz, hot and loud about 82C but fps is 20 to 30fps... mostly 22-28fps. Which is on par with GTX 1080.
> 
> Oh yeah if it BSOD's if u accidentally don't close MSI or HWInfo before opening ATIWinFlash, turn off the power supply immediately! It goes into an electrical anomaly and caps start smelling. So turn off psu right away to prevent this... During these BSOD's all GPU BIOS safety features are inactive and the ASIC goes sky high. Not a major deal but if it BSOD's run to the power supply switch in 1 second. haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Once all apps like HWinfo and MSI among anything that even thinks about the GPU are closed you can proceed to open ATIWinFlash, applies to all AMD cards. Once flashing begins, be patient the Image size is massive like 20 million lines of code so it takes like 5 minutes... Well worth it though.
> 
> I find it takes 5 to 7 minutes to flash, though I have found a way to reduce it to about 2 minutes. Close all apps, then open ATIWinFlash and wait for it to open. Once open, you can go to youtube and watch a video. You cannot touch anything, so start the video and move to ATIWinFlash and then proceed to enjoy your video while flashing in 2 minutes. As the video keeps the gpu slightly alert and speeds up the process... Once complete, obviously restart.
> 
> 
> 
> High Preset 3200x1800 (A bit loud and hot at 1200mhz but this game is a beast, probably the most devastating to date
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> 
> 
> 
> 
> 
> I think the game is memory bandwidth limited... I'm at 400 gigabytes per second. Maybe if I cranked it out to 450 gigabytes per second it would go up but who cares? Here's stock voltage 65288 @ 1133mhz...


Here's my BIOS.

Also, this is the MSI 390 non-X; just wanted to let you know, because I see you have the 390X, and I'm not sure if editing the BIOS is the same or not, lol.

And no, only the CPU is on water; the GPU is on the stock MSI cooler. Wish it was water cooled, lol.

Biosfile.zip 99k .zip file


----------



## m70b1jr

Okay, so I enabled CrossFire in Crimson, but in Unigine Heaven, GPU 2 usage is 0%...


----------



## Streetdragon

Quote:


> Originally Posted by *m70b1jr*
> 
> Okay so I enabled crossfire in crimson, but in unigine heaven, gpu 2 usage is 0%...


enabled fullscreen?


----------



## chris89

So far I can get the VDDCI down to 955mV @ 1563MHz. The first thing I notice is DXVA: the way it jumps to high memory clocks and back to idle, power at stock went from 15W idle to 25W during DXVA. Now, at 955mV, it uses 15W at 1563MHz and about 3W at idle. Going from 1000mV to 955mV is 4.71% less power; maybe it can do 954 or 953mV. I know 950mV is not possible, but it's fun to see just how low it can go. This massively reduces load power consumption; it's something like 5 watts per millivolt.

More testing: it booted to Windows at 951mV, with a driver crash at first, but it worked, though it reported 956mV. So I tried 953mV; works great, even with all 8GB taxed, totally fine. Finally I pulled 952mV: no driver crash, but it reports 953mV. So at 1563MHz / 400 gigabytes per second it can do 952mV minimum, but really 953mV is the actual floor, since that's what it reports. From 1000mV down to 953mV is 4.93% less power (5.04% at the 952mV setting), and the VRM went from 71C to 67C. At least I know how low it can go; it's worth saving power on this card every way you can.
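A note on the percentages in this post: the "4.93% less power" style figures are the voltage delta taken against the new value (47 mV / 953 mV). A sketch of that arithmetic, alongside the more common dynamic-power approximation where power scales roughly with V-squared (both are rough estimates, not measurements):

```python
def pct_saving_linear(v_old_mv: float, v_new_mv: float) -> float:
    """Linear-in-V saving, expressed against the new voltage
    (the convention used in the post above)."""
    return (v_old_mv - v_new_mv) / v_new_mv * 100

def pct_saving_quadratic(v_old_mv: float, v_new_mv: float) -> float:
    """Dynamic power ~ V^2: saving relative to the old operating point."""
    return (1 - (v_new_mv / v_old_mv) ** 2) * 100

print(round(pct_saving_linear(1000, 953), 2))     # 4.93, matching the post
print(round(pct_saving_quadratic(1000, 953), 2))  # 9.18, the V^2 estimate
```

Neither model accounts for clock, load, or leakage, so treat the absolute watt figures in the thread as the only real data.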

*@gordesky1* I recommend not running it at 1188MHz, but if you want to see, it should be fine, just loud and hot. At 1133MHz it's cool, quiet and fast. I can also run the memory-controller VDDCI down to 960mV, maybe less (I'll try). If it can go down to 957mV, that's ~4.5% less heat and power: if you were at 85C you'd be at 81C, basically 210W down to 200W. So undervolting the RAM controller is well worth it. It's totally stable for me in Mafia III at 3200x1800 High.

Hawaii_MSI_R9_390_1133Mhz_1563Mhz.zip 99k .zip file


Hawaii_MSI_R9_390_1188Mhz_1563Mhz.zip 99k .zip file


I'll need to make a custom mounting plate but maybe this could cool the 390x... Which clear tubing is ideal?





Here's a comparison of the VDDCI settings: the software reports a fractionally higher power draw at 952mV than at 956mV, though the card is undeniably running cooler at 952mV. So monitoring aside, in reality it's drawing less power at 952mV.


----------



## m70b1jr

Wow! So my 2nd card is unlockable to a 390X! Now, if I BIOS flash it from an R9 390 to a 390X, can I still use CrossFire?


----------



## m70b1jr

So, I unlocked it to a 390X and did some light overclocking (currently stable at 1130 core, 1620 memory). Really a steal: a 390 I was able to unlock to a 390X for $120, AND it has a lifetime warranty through Best Buy.
Right now I'm getting a score of 2535 in Unigine Heaven (Extreme), and my goal is to beat my friend with a GTX 1070 (he's scoring 2585). If I can hit 2600 I'll be happy, but since I only have a Corsair CX750M, my headroom is limited.


----------



## dagget3450

Quote:


> Originally Posted by *m70b1jr*
> 
> So, I unlocked it to a 390x, and did some light overclocking (Currently stable with 1130 Core, 1620 Mem). Really a good steal. A 390 that I was able to unlock to a 390x for $120, AND it has a lifetime warranty at bestbuy.
> Right now I'm getting a Score of 2535 in Unigine Heaven (Extreme) and my goal is to beat my friend, who has a GTX 1070, (He's scoring 2585) so If I can get a score of 2600, I'll be happy, but since I only have a Corsair CXM750, My headroom is limited.


Look at the top 30 in the Unigine Heaven benchmark thread for tweaks!


----------



## chris89

For benchmarks you can go up to 1,758MHz memory max on stock VDDCI; that's 450GB/s. It will improve your score, though TDP goes up ~30% from that alone, so with both core and memory increased you could see a 75-85% TDP increase. I was able to pull 1,250MHz @ ~1400mV and 1,758MHz @ the stock 1000mV VDDCI. Send your BIOS and I can build you a benchmark BIOS: not to be used for gaming, just quick ultra-fast run-throughs, if you wish. The stock cooler would need 80-100% fan speed throughout testing to get those really high scores.

I did more testing, and even though I saved power at 1,563MHz memory @ 952mV VDDCI, it did black-screen on me after an hour of stable gaming. When it black-screens, run to the power supply switch like your hair's on fire! It enters that electrical anomaly again and the caps smell bad. No damage, but shut off the PSU within 10 seconds, preferably much sooner.

Since 952mV VDDCI was stable in-game but prone to crashing on returning to the desktop, I settled on 953mV, which doesn't have the issue. It's funny how one tiny millivolt can make it or break it, haha.

1133-1144MHz is easily possible on stock core voltage, with up to 1,758MHz memory on stock 1000mV VDDCI for benchmarks. But a hop to 1,563MHz, a solid 400 gigabytes per second, is great while saving power at the same time at 953mV VDDCI, totally stable and error-free (953mV down from 1000mV is a 4.93% decrease in power, i.e. roughly 4 degrees Celsius). Versus 952mV, 953mV is just a ~0.1% increase in power for total stability at 1,563MHz.


----------



## m70b1jr

Quote:


> Originally Posted by *chris89*
> 
> For benchmarks you can go up to 1,758Mhz memory max on stock VDDCI, that's 450GB/s... Just to improve your score, though TDP goes up by 30% alone so with core and memory increase could be 75-85% increase in TDP. I was able to pull 1,250Mhz @ 1400mv'ish & 1,758Mhz @ stock 1000mv VDDCI. Send your bios and I can build you a benchmark bios... not to be used for gaming but quick ultra fast run throughs if you wish... Stock cooler would need 80-90-100% fan speed throughout testing to get those real high scores.
> 
> I did more testing and even know I saved power at 1,563Mhz memory @ 952mv VDDCI, it did BLACK SCREEN on me after an hour of stable gaming... Which when it goes to BLACK SCREEN run to the power supply switch like your hairs on fire! It enters into that Electrical Anomaly again and caps smell bad! No damage, but shut off the PSU within 10 seconds but way way less preferably.
> 
> So since 952mv VDDCI was stable in game but was prone to crashing upon returning to desktop, So I settled @ 953mv VDDCI which does not have the issue. It's funny how 1 tiny little millivolt can make it or break it, haha.
> 
> 1133Mhz to 1144Mhz is easily possible on stock voltage at the core, up to 1,758Mhz max memory clock on stock 1000mv VDDCI for benchmarks. Though a hop up to 1,563Mhz @ a solid 400 gigabytes per second is great while saving power at the same time @ 953mv VDDCI totally stable and error free... (953mv from 1000mv is 4.931% decrease in power ie like 4 degrees Celsius) Just a 0.00105% increase in power for total stability @ 1,563mhz (952mv VDDCI vs 953mv VDDCI).


I just have the default Hawaii.. Use TechPowerUp to search for the Bios(s).. I have the XFX r9 390 DD, and I guess I'll need a XFX 390x DD Bios, since I unlocked one of em to a r9 390x. I'd send you mine, but I'm at school.


----------



## b0uncyfr0

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Ah sorry, I should have specified. That is my graphics score. I managed to find a safe sweet spot running 1090/1500. My graphics score is now 13.66k. How is that?
> 
> My card has horrible vdroop. With +50 in Afterburner, I'm seeing 1.164v as the average. No wonder some are pushing past 1100 with just a slight nudge in voltage.


Anyone??


----------



## m70b1jr




----------



## h2323

I have an MSI 390X. It was hot from the start, but it has been worse since I switched to a 4K monitor.

I re-applied paste. Huge results: dropped 13C under load, 94 max to 81 max. The factory thermal paste was sloppy, too much, and some of it was dry.

I used Noctua NT-H1 Thermal Compound, line of paste down the middle of the die technique.


----------



## chris89

I've been doing some more testing, finding ideal settings etc. I feel like maybe water cooling isn't worth the hassle. I've been able to reduce temps dramatically, primarily because using the R9 290X heatsink on the 390X helps a lot.

I find, though, that when I actually wanna pull more fps, clocking the RAM out doesn't improve fps at all... except to waste a bunch of power. The difference between 1000MHz memory @ 875mv VDDCI and 1,758MHz memory @ 1000mv in terms of fps is nothing really, except a solid 125 watt increase going from 256GB/s to 450GB/s. Memory clock isn't as important for actual FPS gains as the core clock.

I was struggling to pull 30fps beyond "Very High" @ 3200x1800 in Rise of the Tomb Raider. I'd clock it to 1234MHz core and, without a power limit increase, no fps change from 1160MHz or 1133MHz. Once I increased TDP to 250-280 and TDC to 240-270, it went up to 40fps for about 5-10 seconds and then hit my 88C ASIC temperature limit. I adjusted the fan speed to 30% and no change; the ASIC goes uncontrollably high at high TDP/fps.

I suppose making little copper hat heatsinks and using thermal glue to join all the caps nearest each other could prevent hitting the ASIC max temp I have set @ 88C to prevent damage. Anyway, it is possible to get way higher fps, but you need lots of TDP, and it's the ASIC that gets out of control; you can't monitor it unless you placed a thermal sensor on all the capacitors and measured which ones were the hottest, to hold those high fps numbers. The core and VRM stay cool enough, yet the ASIC goes sky high.

Here are some screenshots with some testing. I am running 1000MHz memory @ 875mv VDDCI, which saves a total of 125 free watts with no noticeable fps difference, maybe 1 fps give or take. 875mv is 12.5% down from 1000mv... From 85C to 74C...
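The post above estimates the VDDCI undervolt savings by scaling linearly with voltage. As a hedged aside: for the switching portion of a rail, CMOS dynamic power goes roughly with V², so the real saving is likely somewhere between the two figures below. A quick sketch of both estimates:

```python
# Sanity check on the 1000mv -> 875mv VDDCI undervolt figures.
# Assumption: "linear" treats power as proportional to V; for CMOS
# dynamic power P ~ f * V^2, so the quadratic figure is closer for
# the switching portion of the rail.

def savings_linear(v_old, v_new):
    return 1 - v_new / v_old

def savings_quadratic(v_old, v_new):
    return 1 - (v_new / v_old) ** 2

v_old, v_new = 1.000, 0.875  # VDDCI in volts
print(f"linear:    {savings_linear(v_old, v_new):.1%}")    # 12.5%
print(f"quadratic: {savings_quadratic(v_old, v_new):.1%}") # 23.4%
```

Either way, the thread's practical takeaway stands: the memory rail undervolt saves real power for almost no fps cost at these resolutions.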


----------



## m70b1jr

Overclocking the memory on my Crossfire setup causes a red screen (guessing due to power draw), so I'm gonna leave it at 1500MHz.


----------



## bluej511

Quote:


> Originally Posted by *m70b1jr*
> 
> Overclocking the memory on my Crossfire setup causes a Red Screen (Guess due to power draw) So i'm gonna leave it at 1500mhz.


That's a shame. What PSU are you using?


----------



## m70b1jr

I have a corsair CX750M


----------



## chris89

I actually took the card apart yet again and did some more modding to resolve the ASIC overheat issue, which it actually did, and it uses less power now, meaning it runs cooler for higher overclocks. The ASIC overheat issue, meaning "the reason this card fails", wouldn't be an issue if the cooling plate also cooled the capacitors as well...




Black dots from lack of core voltage; trying to find the ideal voltage. That was 1399mv, so I'll need to raise it to resolve that. The temperatures and power draw are amazing. That's at 250 watts TDP and 240 amps TDC. Once I find the proper voltage-to-TDP ratio, the fps will go up along with it.


*@m70b1jr* Are you running dual XFX DD 390Xs? They have thermal material on the memory modules and direct heatsink contact, right? I had this issue on my Sapphire Nitro 380X because the memory voltage regulators did not have any cooling, just sitting right out on the PCB. They got smoking hot and caused green screens quite often until I added a heatsink... I haven't had a green screen on my 390X at all, and I took it up to 1,758MHz memory. The 3 VRMs on the front of the PCB are for the memory, and they should run near 1:1 with the rear core regulators, meaning they get just as hot. How much clock are you throwing at the modules? As high as you should go for realistic gaming's sake is 1,563MHz, with a memory VRM undervolt to about 950mv to let the VRM run cooler. I don't get more FPS from memory clock at 3200x1800 as I would in benchmarks... 3200x1800 gains are a whole different ball game; we're dealing with power and temperature limitations up at 3200x1800.


----------



## m70b1jr

Quote:


> Originally Posted by *chris89*
> 
> I actually took the card apart yet again, and did some more modding to resolve the ASIC overheat issue which it actually resolved and it uses less power now meaning cooler for higher overclocks.
> 
> 
> 
> 
> Black dots from lack of core voltage, trying to find the ideal voltage. That was 1399mv, so will need to raise to resolve.
> 
> 
> *@m70b1jr* Are you running dual XFX DD 390X? They have thermal material on the memory modules and direct heatsink contact right? I had this issue on my Sapphire Nitro 380X because the memory voltage regulators did not have any cooling just sitting out right on the PCB. Which they got smoking hot and caused green screen quite often until I added a heatsink... I haven't had a Green Screen on my 390X at all and took it up to 1,758Mhz memory. The 3 VRM on front of the PCB are for the memory and they should run near 1:1 with the rear Core Regulators... Meaning getting just as hot... How much clock are you throwing at the modules? As high as you should go for realistic gaming sake is 1,563Mhz on a Memory VRM Under-Volt to about 950mv to let the VRM run cooler... I don't get more FPS from memory clock while at 3200x1800 as I would with benchmarks... 3200x1800 gains are a whole different ball game, we're dealing with power and temperature limitations up at 3200x1800.


YES, they're both DD, but my 1st 390 has an AIO cooler with a heatsink. My 2nd 390, the one I BIOS-unlocked to a 390X, is on the stock DD cooler.


----------



## chris89

Nice. I'm not sure; maybe reference is ideal... Highest graphics score so far... I gained 164 points from adding the Fujipoly Sarcon (I think 2.5mm thickness) between the capacitors. Plus, rather than a 77C max, it hit 70C max.

Is a Firestrike graphics score of 14,830 weak cheese? It's probably still way slower than everyone else's score. I'm on the reference blower and PCIe 2.0, on stock slow-as-heck Xeons. Maybe if AMD releases CPUs with dual logical threads per physical core, I'll go dual 8-core (48 thread) rig in the future. Titan X Pascal must be hitting like 400fps...?



After a whole bunch of testing, I finally have 1,234MHz (@1400mv) pretty well stable, as it didn't crash. It can't run throttle-free at this clock, but it handled it pretty well. Crazy how it's set to 190W TDP / 180A TDC and it went way higher. This is just about as fast as it will go without crashing. I can tell how much power it used by taking the 12-volt rail reading under load and subtracting the idle minimum, though that also counts the dual 95W Xeons in turbo boost on a 375-watt total. It was only using like 5% of a CPU, though, so most of it was GPU power.
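One reading of that 12V-rail power estimate, sketched out (the amp readings here are hypothetical; a real setup would log them from the PSU or a sensor, and as the post notes, the CPUs share the rail, so the delta is an upper bound on GPU draw):

```python
# Sketch of the load-minus-idle power estimate described above.
# Assumption: you can read 12V rail current at idle and under load;
# the delta is attributed (roughly) to the GPU.

RAIL_V = 12.0

def delta_power_watts(idle_amps, load_amps, rail_volts=RAIL_V):
    """Estimate extra draw under load from the 12V rail current delta."""
    return (load_amps - idle_amps) * rail_volts

# Hypothetical readings: 6 A idle, 31 A under load.
print(delta_power_watts(idle_amps=6.0, load_amps=31.0), "W")  # 300.0 W
```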


----------



## diggiddi

Quote:


> Originally Posted by *chris89*
> 
> Nice, I'm not sure. Maybe reference is ideal... Highest Graphics Score so far... I gained 164 points from adding the Fujipoly Sarcon I think 2.5mm thickness between capacitors. Plus rather than 77C max, hit 70C max.
> 
> Is a Firestrike graphics score of 14,830 weak cheese? It's probably still way slower and worse than everyone else's score. I'm on the reference blower and PCIe 2.0 as well on stock slow as heck Xeon's. Maybe if AMD releases dual logical to one physical core CPU's, I'll go dual 8-core (48 thread) rig in the future. Titan X Pascal must be hitting like 400fps...?
> 
> 
> 
> After a whole bunch of testing, finally have 1,234Mhz (@1400mv) pretty well stable as it didn't crash. It can't run throttle-free at this clock but it did handle it pretty well. Crazy how its set to 190W TDP/ 180A TDC and it went way higher. This is just about as fast as it will go without crashing. I can tell how much power it used by taking idle 12 volt rail and subtracting the minimum, though it does account for dual 95W Xeon's in turbo boost totalling 375 watts. Though it was only using like 5% a cpu so most of it was GPU power.


That's a pretty decent score. My 290X @ 1230/1620, give or take 10MHz, is around the 15k mark.


----------



## chris89

Thanks, yeah, it looks like I'm about 8 to 9 fps slower on PCIe 2.0 than PCIe 3.0, so I have to clock it out to even pull anything worth looking at. The 290X/390X are just amazing cards, man. I have 2x 290X 4GB getting fixed and I'll try to flash them to a 390X BIOS; that's possible, right?

Anyway, I finally figured out why I was getting black screens after leaving a game. It was the ASIC temperature: it remained high after hitting the desktop, and even when the core was 39C the ASIC was still right up near my 88C max ASIC temperature. It has become increasingly clear that the ASIC max temp is in fact BUGGED on the 390X/Hawaii GPU, as when it hits black screen, i.e. max ASIC temp, it doesn't shut down. It also loses all touch with itself, and the fan goes to nothing when there's no display, while the ASIC actually climbs higher and higher with the fan off. Fan controls only work when the card is functioning, not while in black screen at max ASIC temp. So you gotta shut off the PSU really quick before the card burns itself. It takes a couple of minutes just to cool back down to even 80C.

I'm running 1219MHz core / 1563MHz RAM. I thought it was my VDDCI being too low at 952mv, but I tested 1000mv and it still happened. So I had two options: raise the max ASIC temperature, or raise my minimum and medium temperature targets and fan speeds so that when it hits the desktop the fan still hangs up there, which actually resolved the issue, meaning I saw it go to black screen again yet recover, which it never does. I had the min target at 35C and the medium target at 40C: 20% @ 35C & 40% @ 40C. That wasn't quite enough, because obviously it still occurred, yet it recovered.

So it appears I need to set my min to 29C and medium to 40C so the fan speed hangs higher for longer while returning to pure idle. It takes a good 10 minutes to cool the ASIC down from near 88C to even 85C. So I'm thinking about setting the max ASIC temp to 95C (first I'll work on fan speeds at 88C and see if it goes away, because 95C means a bad card, meaning not working anymore, haha), and setting the min target at 29C and medium target at 40C: 20% & 50% & 60% up to 80C. I only saw it hit 75C at 59% fan @ 1219MHz while VRM 1/2 were @ 70C. The ASIC runs much hotter than any monitorable sensor.

I've begun to sense the temperature of the ASIC so I can more accurately resolve its issues. I hope this helps anyone with a similar issue. I was just fine at 1,250MHz, but black screen occurred really quickly, i.e. 88C max ASIC in about 2 seconds in game. At 1234MHz it allowed 30 seconds before the black screen at 88C max ASIC; couldn't get around it unless I had the fan way louder than usual.

So I had to dial back to 78 billion pixels per second. 1250MHz is 80 billion, 1234MHz is 79 billion, so 78 billion is still a lot and way more do-able. It needs fan speed, though, because of that ASIC that's always there annoying the heck out of ya... haha.

So 1219MHz works for a good couple of minutes @ 1377mv, but 1377mv isn't enough; in reality 1219MHz calls for 1450mv. Stock in the BIOS at 65288 is 1250mv @ 1050MHz, so take 1250 and divide it by 1050, which is 1.19, i.e. 19% extra. That means for real stability, all clocks need that 19%: if you want 1180MHz, it's 1180 multiplied by 1.19, which equals about 1404mv; round up or down, whatever you feel like. I play with it way down at 12.5% and no stability in sight, haha, but I like to try, as the ASIC runs way cooler. The ASIC gets hot when it's even 1 millivolt above 65288, i.e. 1250mv.

It wants to hit that 88C readily and eagerly, really quick, so you have to crank the fans higher than all the displayed temps suggest, because the ASIC is non-monitorable. I know 100% I can get 0.000000001 seconds out of 1450mv, haha; it hits 88C in that short a time at 1450mv. So I have to dial back and crank the fans to work around that dang ASIC temperature. It's tedious, but I love tedious, difficult tasks where the solution feels just around the corner every second of my experimentation... haha. Anyway, happy gaming, guys!
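chris89's voltage rule of thumb above (stock 1250mv at 1050MHz, so scale the voltage with the clock) is the poster's own heuristic, not an AMD spec, and every chip will differ. As a sketch of the arithmetic:

```python
# chris89's rule of thumb: stock is 1250 mV / 1050 MHz ~= 1.19, so the
# "fully stable" voltage for a target clock is roughly clock * 1.19.
# This is a forum heuristic, not a spec -- silicon lottery applies.

STOCK_MV, STOCK_MHZ = 1250, 1050

def rule_of_thumb_mv(target_mhz, ratio=STOCK_MV / STOCK_MHZ):
    """Estimate core voltage (mV) for a target core clock (MHz)."""
    return target_mhz * ratio

for mhz in (1050, 1180, 1219, 1250):
    print(f"{mhz} MHz -> ~{rule_of_thumb_mv(mhz):.0f} mV")
```

For 1180MHz this gives ~1405mv and for 1219MHz ~1451mv, matching the ~1404/1450 figures in the post.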


----------



## m70b1jr

Anyone got some benchmarking tips? Also, for some reason, in regular Fire Strike, on graphics test 2 only, I get "Failed to produce workload", but all the other tests work.


----------



## diggiddi

Quote:


> Originally Posted by *chris89*
> 
> Thanks, yeah it looks like I'm about 8 to 9 fps slower than PCIe 3.0 on PCIe 2.0, so I have to clock it out to even pull anything worth looking at. That's amazing the 290x/ 390x are just amazing cards man. I have 2x 290X 4GB getting fixed and I'll try and flash them to 390x bios, That's possible right?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Anyway, I finally figured out why I was getting black screens after leaving a game. It was the ASIC temperature, it remained high after hitting the desktop even when the core was 39C the ASIC was still right up there near my max 88C ASIC temperature. It has become increasingly clear that the ASIC max temp is in fact BUGGED on the 390x/ Hawaii GPU. As when it hits black screen ie Max ASIC temp it doesn't shut down. It also loses all touch with itself and fan goes to nothing when no display... While ASIC actually begins to rise higher and higher as fan is off... Fan controls only work when the card is functioning, not while in black screen ie max ASIC temp again... So you gotta shut off the PSU really quick before the card burns it self... It takes a couple minutes to cool back down to 80C even... I'm running 1219mhz/ 1563mhz ram... I thought it was my VDDCI being too low at 952mv, but I tested 1000mv and it still happened. So I had 2 options, raise the ASIC Temperature Max or raise my minimum and medium temperature targets and fan speeds. That way when it hits the desktop the fan still kinda hangs up there which actually resolved the issue... Meaning I saw it actually went to Black Screen again yet recovered which it never does... I had min target at 35C and medium target at 40C... 20% @ 35C & 40% @ 40C... Which wasn't quite enough because obviously it still occured yet recovered. So it appears I need to set my min to 29C and medium to 40C so the fan speed will hang higher longer while returning to pure idle. It takes a good 10 minutes to cool the ASIC down from near 88C to 85C even... So I'm thinking about setting ASIC max temp to 95C(first I'll work on fan speeds at 88C and see if it goes away because 95C means a bad card meaning not working anymore haha
> 
> 
> 
> 
> 
> 
> 
> ), and setting min target at 29C and medium target at 40C... 20% & 50% & 60% up to 80C... Only saw it hit 75C at 59% @ 1219mhz while VRM 1/2 @ 70C... ASIC is much hotter running than any monitorable sensor... I've begun to sense the temperature of the ASIC so I can more accurately resolve its issues... I hope this helps anyone with a similar issue. I had it just fine at 1,250mhz but black screen occured really quick ie 88C max ASIC in about 2 seconds in game... At 1234mhz it allowed 30 seconds and black screen 88C max ASIC... couldn't get around it unless I had the fan way louder than usual. So I had to dial back to 78 billion pixel's per second... 1250mhz is 80 billion, 1234mhz is 79 billion... So 78 billion is still a lot and way more do-able... Needs fan speed though because of that ASIC that's always there annoying the heck out of ya... haha So 1219mhz works for a good couple minutes @ 1377mv but 1377mv isn't enough... Actually in reality 1219mhz calls on 1450mv... Meaning stock at 65288 is 1250mv @ 1050mhz so take 1250 and divide it by 1050 which is 19% or 1.19... That means in reality for real stability, all clocks need 19% ... so you want 1180mhz it's 1180 multiplied by 1.19 equals 1404.5mv... round up or down whatever you feel like... I play with it way down to 12.5% and no stability in sight haha but I like to try as it's way cooler ASIC... ASIC gets hot when even 1 millivolt above 65288 or 1250mv. It wants to hit that 88C readily and eagerly really quick... So you have to crank the fans higher than all the temps display because ASIC is non-monitorable. I know 100% I can get 0.000000001 seconds out of 1450mv haha it hits 88C in that short of time at 1450mv.... So i have to dial back and crank fans to work over that dang ASIC temperature. So it's tedius, but I love tedius difficult tasks that are highly difficult to solve yet the solution is just around the corner every second of my experimentation... haha anyway happy gaming guy's! :thumb
> 
> 


Chris, please use paragraphs to break up your wall of text; it's unreadable. Thanks.


----------



## chris89

*@diggiddi* Thank you. I hope it's better now.

*Side Note* Continuing from the above... I had the fan go to 75% at 40C, way too loud, but it still happened anyway when the core hit like 60C @ 1219MHz & VRM 1/2 @ like 47C...

So I have only two options: either turn up the max ASIC temp to 100C or reduce the clock. I am so stubborn about clocks; my mind isn't happy near stock, because I only like as fast as it will go and no less. Like with my extreme gear-head nature: as long as you're not nuts behind the wheel, a good pedal-to-the-metal run to the rev limit is plenty fun as long as you know when to let off... The most impressive part of a good driver is not the initial punch; it's when one chooses to let off that sticks around... This is key when girls are in the car: you can excite them... but not scaring them is important.

So I'm going to turn up the max ASIC temp and risk it all just to be as fast as possible... I don't know about you guys, but I get a thrill from the riskiness of max overclocks on powerful beasts like the 390X. I sit back and I'm like: 78 billion 16 million pixels per second, plus 214 billion 544 million texels per second, and how?!? what?!?
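Those fill-rate figures check out against the Hawaii/Grenada XT specs. A minimal sketch, assuming the 390X's 64 ROPs and 176 texture units:

```python
# Checking the fill-rate figures quoted above.
# Assumption: Hawaii/Grenada XT (390X) has 64 ROPs and 176 TMUs.

ROPS, TMUS = 64, 176

def fill_rates(core_mhz):
    """Return theoretical (pixel/s, texel/s) fill rates at a core clock."""
    pixels = core_mhz * 1e6 * ROPS
    texels = core_mhz * 1e6 * TMUS
    return pixels, texels

px, tx = fill_rates(1219)
print(f"{px / 1e9:.3f} Gpixel/s, {tx / 1e9:.3f} Gtexel/s")
```

At 1219MHz that's 78.016 Gpixel/s and 214.544 Gtexel/s, exactly the "78 billion 16 million" and "214 billion 544 million" numbers in the post.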


----------



## Streetdragon

I'm playing with speed and volts a bit now too. Stock VDDCI is 1050mv; went to 1000mv. Memory is stable so far at 1450 with 1250 timing. 1500 is gaming-stable too, but when I stream a bit or something like that it likes to start producing mem errors, and every week or so I get a black screen. So I think my mem doesn't like OC so much ^^

1350mv or so (stock voltage + 100mv offset) does 1170MHz stable so far. I'll try to push it a bit more to 1200. I hope the lower mem speed gives me more room for the core clock ^^


----------



## Streetdragon

@chris89 thx for the tips ^^

I modded a bit more now: went without the offset and used manual voltages in the BIOS.
The result is a cooler and a bit faster card:

http://www.3dmark.com/3dm/17316470?

Bios:


nice nice nice ^^ I like my cards more and more


----------



## chris89

Nice scores @Streetdragon, and you're welcome. I don't feel right with max ASIC above 88C... I set TDP and TDC to 1,000 watts, haha. Tomb Raider 2013 has a throttling bug that throttled even at 1,000 watts... maybe it needs 10,000 watts of TDP, haha. It doesn't go anywhere near that, but it sort of fixes the bug the higher I go...

I took my card apart again. I hate deviation between VRM 1/2, but I guess it's unavoidable; even though the max is always about equal on 1/2, when gaming one is about 8C hotter, which is normal with the core supplying way more power than the memory VRM.

I bought a 4" x 12" 24 AWG copper sheet and some M2 x 16mm screws to mount the Arctic Accelero Xtreme III or Gelid Icy Vision I have laying around, to keep it totally quiet under load. I'll mess with water cooling later. I'm going to cut out a front-plate to cool all the VRMs and memory and adhere it with 3M thermal tape, and with whatever sheet I have left I'll make a little back plate, haha. It's super thin, easy to bend by hand and cut with scissors or tin snips. I need super thin because of clearance between the memory modules and the Accelero Xtreme III and Gelid; the copper pipes are super close to the modules. I'll then strategically add heatsinks here and there across the PCB to help out more.

Pretty excited about silent gaming on the 390X. I can pretty well game silently on the blower with a nano application of thermal paste: a tiny dot, spread ever so thin across the chip edge-to-edge by finger. It helps me hit 65C @ 50% fan with the core @ 1133MHz, a little loud; I could set the max target to 85C and game @ 20-30% fan, which is nearly inaudible. The Gelid or Accelero will allow overclocked silent operation.


----------



## m70b1jr

Does lowering mem clocks really help out the core clock more? Which is more beneficial in all honesty?


----------



## Streetdragon

Quote:


> Originally Posted by *m70b1jr*
> 
> Does lowering mem clocks really help out the core clock more? Which is more beneficial in all honesty?


From my testing day yesterday, I would say yes, BUT only for the last 20MHz or so. It just helps to lower the power usage overall.


----------



## chris89

I have the 390X and literally not one game sounds fun to play... Suggestions? I want amazing graphics and cute girls with funny giggly bubbly personalities... That doesn't sound so crazy yet none exist...


----------



## Streetdragon

Play Crysis ^^ nice graphics but no chicks... but a cool guy in a sexy, awesome, super hyper suit!

Or Metro


----------



## chris89

Are those the games you're playing? I'm so totally bored of those games, haha. Yeah, it's like, where are all the girls in games? It seems developers would rather be looking at dudes; let's see some games with lots of girls, overflowing even... I find if you waste and release all your steam, even lame, unimpressive, boring games seem cool. Hold on to your steam and then you realize all games are pretty lame and there are no freakin' girls in them. Like, our minds need to be fed complex stuff and no games are doing it; at least girls in games in a good way, with cute personalities, funny and all that jazz. Third/first person like Black Desert is pretty lame other than the characters up close; 99.9% of the rest of the game is lame, like the running physics and all that other stuff. I like realistic physics and moving animations with tons of geometry. I can feast my eyes on geometry for hours: flowing lines and curves and whatnot...

Even during character animation they start out with tons of character geometry and polygons, then dial back when finished to a lame mode that runs on low-end tech, so we don't get to see the raw artwork, which is what we want to see! Raw uncompressed artwork within a radius of the viewport for good fps, and insane uncompressed characters, models, objects etc. Seriously, though, games should go over the top on polygons and geometry and realize that low fps is primarily about LOD distance (or distance detail) and distance culling. Line up the engine to render what's near in over-the-top quality and, at a slight distance, go low detail, to give the best graphics and performance: render only what's very near and design levels to work well with this, not forcing it to render too far into the distance and pulling fps down.

Where are the tropical games with green and sun and beaches and smiles and laughter and joking and goofiness... ?

idk title of these


----------



## Carniflex

Quote:


> Originally Posted by *chris89*
> 
> I have the 390X and literally not one game sounds fun to play... Suggestions? I want amazing graphics and cute girls with funny giggly bubbly personalities... That doesn't sound so crazy yet none exist...


I kind of liked Hydrophobia. It's not a very popular game and it's a bit on the short side (about 6h), but it was different from the usual shooters, and there is a female character you play. I'm not sure if it's demanding enough to make a 390X flex its muscles, but there are some nice water and smoke effects. Should be pretty cheap; I believe I got mine for around 5 EUR in a Steam sale.


----------



## chris89

Nice, thanks man, I appreciate the suggestions... What are you guys playing to flex the 390X's muscles? haha

Once you overclock the 390X and repaste it, it runs cool and handles everything easily up to 3200x1800... It needs 3840x2160 or 5120x2880 to really make it sweat a little...

I'm watching Hydrophobia 4K gameplay... this is a perfect example of why it would be unreal to have a crazy 16K VSR res: to give a little pixel excitement from low-end-looking games that hit 500fps @ 3200x1800. Even 3200x1800 runs too smooth on the 390X... it needs up to 5K or 16K VSR.







So when I finally get a 4K TV, I get to virtually oversample 4K to 16K, taking it to the next level haha






This is what I plan to do with my 390x.


----------



## tolis626

@chris89

If you need that kind of resolution to make your 390x sweat, you probably haven't really played much. Just play Witcher 3 at max settings and watch it crap its pants at 1080p. You can also play Dragon Age Inquisition (you can also play with women in that game, so it will satisfy your... desires, whatever they may be - but dude... Wat?), max it out and see the 390x kneel. Then there's the usual hogs that are many modern games, like Tomb Raider and ROTR, Crysis 3, Deus Ex Mankind Divided and others. Like, really, the 390x is a nice card for what it is, but making it sweat is easy peasy at 1440p and sometimes even 1080p. 4K is a joke on the 390x, unless you go medium settings or no AA or something (one could argue that you don't need AA at 4K, but still it struggles at high-ultra, even without AA).

On another note, and at least on my MSI 390x, it seems that you're onto something with the whole low memory clocks and VDDCI thing and pushing the card further. Thing is, on my card all it took was not forcing the memory to over 1600MHz, which itself isn't bad (I mean, many people have achieved 1750MHz, but my card's memory or memory controller is crap and can't really go over 1650MHz or even 1625MHz without pushing stability over the edge with a lot of errors). I can run up to 1700MHz on the memory with 1.05V VDDCI for short runs and 1650MHz game stable, but it seems both compromise stability a lot. In fact, even 1625MHz seems to do so. I can, however, run 1600MHz with ~0.98V VDDCI and it runs quite a bit cooler and allows me to push my core to 1200MHz or more, which is nice and offers more of a real world performance improvement than those extra 50-100MHz of memory overclock. It's not a night and day difference, but at 1080p I guess a bandwidth in excess of 400GB/s is overkill anyway, so... Yeah, why not? Core IS king after all. I'm even pushing about 15100-15200 points in Firestrike this way, so I'm happy.


----------



## chris89

*@tolis626* Maybe you can show off some Witcher 2/3 gfx screenshots with the hotties? It's not out of perversion, but rather love of real pure and true beauty.

Yeah, I tried Witcher 2. It didn't look good and ran badly too, on 24 threads of Xeon power & the 390X... it just didn't look good; maybe you have to mod it out, plus I'm not into it much. Witcher has amazing girls, but the rest of the game is not my cup of tea. Tomb Raider 2013 & ROTR run great for me on Ultimate & beyond Very High @ 3200x1800; at stock clocks it's a little laggy, below 30fps in the 20s, while overclocked it's in the mid 30s and runs really smoothly, and smooth maxed is all I want. I get around 100fps in ROTR and Tomb Raider 2013 maxed @ 1080p; 1440p is like 60-75, and then 3200x1800 goes to 30-50fps overclocked. So I wish I could go to 3840x2160 or 5120x2880; heck, I'd eagerly play all my old games over again if I could go up to 3840x2160 VSR...

I find the 390X sweats big time at stock clocks, for sure, and I find you can't get more fps unless you turn up the power limit @ 3200x1800. The behavior of clocks and performance when overclocked is completely different at 3200x1800 than at 1920x1080 or 2560x1440. I get like 20fps at stock clocks in ROTR @ 3200x1800, and like 30-50 @ 255W TDP / 245A TDC @ 1219MHz core @ 1430mv & 1563MHz memory @ 1000mv, with a 95C max ASIC, since it messes up at 88C; 88C is too low, it hits it really quickly overclocked.

It also doesn't run that hot or loud, since I used a tiny little dot of Arctic Silver Ceramique 2 in the center of the core diode and spread it across the entire chip by finger, edge to edge. The layer is literally about a thousandth of a mm thick; when pulling the cooler to check contact you can hardly see any paste, just pure perfect contact. That gives me more performance and way less temperature and noise.

You can tell if you're getting exceptional thermal performance from the 390X by running the stock BIOS through a run of Firestrike and seeing what the max GPU temp was on the 95C-fan stock BIOS. Mine went up to 72C, completely inaudible as well.


----------



## Carniflex

Well, as far as the need for _special_ graphical content goes, might I suggest Skyrim with the right mods









Throw an ENB at it and it can melt even an Nvidia 1080, although it's an x87-heavy DX9c game. Word of warning though: if you fire it up without mods first just to see what it's all about, it looks rather... bland.


----------



## chris89

*@Carniflex* Hi, can you make the game pre-modded so I can just open and play? Modding seems like such a huge hassle, even though what I do would appear a hassle too, with GPU BIOS/hardware testing etc. It seems like such a huge PITA just to get good-looking girls into the games, hahahaha. I don't think I'm that content... Wish someone could make a setup to mod it all: Next-Next-Next-Done, hahahaha.







They want you to have like 10 to 15 different tools and sign up on forums to download, etc. Like, seriously?

Yeah, I have seen Skyrim modded and it looks so good. The girls, though? Like, wow. I wish there was a mod where you are surrounded by 20 of the most beautiful girls and you get to go on quests together, haha. Maybe 3 is enough







If only Witcher and Skyrim revolved around the good-looking girls; everywhere you look you're like wow... wow... wow... hahaha. Know of any mods like that, where they follow you around, or you follow them on quests and adventures, and they give advice and whatnot?
















It's going to be nice to reduce core temperature and the noise on the Arctic Accelero III. This screenshot is a little higher than Very High @ 3200x1800 DX11 mode mind you it's a tiny bit smoother at DX12. As you can see the Dual X5650's don't break even a little sweat. I can run them at 30 watts @ 1.6Ghz and still play everything without any fps change. Each X5650 Core is very powerful in essence. If only it had 8 logical to each physical... 8 x 12 = 96 threads vs 24 now.. 3Ghz boost on all threads is plenty for any app or game... Benchmark sake sure it's not talking on a Dual 20-Core Xeon rig or a 10Ghz Overclocked i7... but it does very well and will do so for quite some time. The 390X however gets toasty @ 225W TDP/ 225A TDC @ 1333mv @ 1166Mhz @ 1563Mhz @ 1000mv... It's hit or miss at 952mv VDDCI... Black Screen issue... It black screens near 1400mv on a title that taxes all the VRAM like TROTR DX12 @ 3200x1800 beyond very high custom settings. I does hit 36fps, but at 225 watts throttles to 1Ghz and 28fps. So I must increase TDP... By a mathematically calculated amount... We throttle from 1166mhz to 1011mhz I see @ 225 watts... Meaning it needs no less than 15.33% increase in TDP to just hold 1166mhz @ 1333mv... Both the core voltage and clock are tied to this 15.33% TDP increase requirement. Meaning 225 X 1.1533 = 259.4925 Watts to just barely hold true @ 1166Mhz @ 1333mv... If I increase the voltage to 1388 at the exact same 1166mhz... 1388mv is 4.1260% more voltage meaning the TDP would now need to be 270.199 Watts... Quite a bit huh? So I suppose its better to run lower voltage and crash on ocassion than higher than 1333mv... Now I will need the fan to go to 60% @ 75C for some serious noise or let it ride out to 85 to 88C @ low fan speeds haha. By stock clocked it's in the low 20fps range average at these detail and resolutions levels... Meaning I increased the 3200x1800 frame rate by a lot... 36fps vs 20fps. 
Making gains at this resolution is like pumping iron: you don't make gains unless you work through the pain.
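The TDP math above can be sketched out as a quick check of the poster's numbers, assuming power scales linearly with clock and (as the post does) linearly with voltage:

```python
# Rough TDP-headroom math from the post above. All figures are the
# poster's observed values on his own card, not official specs.

def required_tdp(base_tdp_w, target_mhz, throttled_mhz):
    """TDP needed to hold target_mhz when base_tdp_w only sustains
    throttled_mhz, assuming power scales linearly with clock."""
    return base_tdp_w * (target_mhz / throttled_mhz)

def rescale_for_voltage(tdp_w, new_mv, old_mv):
    """First-order adjustment when raising core voltage at the same
    clock (the post uses a linear estimate, not the quadratic one)."""
    return tdp_w * (new_mv / old_mv)

hold_1166 = required_tdp(225, 1166, 1011)                      # ~259.5 W
hold_1166_1388mv = rescale_for_voltage(hold_1166, 1388, 1333)  # ~270.2 W

print(round(hold_1166, 1), round(hold_1166_1388mv, 1))
```

Physically, dynamic power goes with roughly V² rather than V, so the linear estimate understates the cost of the extra voltage; treat these as lower bounds.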



*Same 1333mv @ 259W @ 249A @ 1188MHz... unstable with the fan tune; needs 1360mv or more, leaving TDP/TDC alone. Max fps about 50, with a consistent 35fps average. At 3840x2160 it would probably be 30 to 32fps; at 5K, 20 to 30fps, not sure...*


----------



## Removed1

Quote:


> Originally Posted by *chris89*


You should go and throw it under water, like was said.
I'm sorry to disappoint you, but you will never hit the max performance of your card on air; that's just the way it works for Hawaii.
And looking at how much time you spend enjoying benches, you should really consider an AIO + Kraken/HG10 if you are not a water guy.
The major factor driving heavy power consumption is the heat produced by the GPU, simply because the chip is an oven, big and complex!
So on Hawaii, lowering the temperature from the 70°/90° you get now to 40°/60° lets you finally recover the full performance of the GPU.

The Grenada chip is no exception, since it is a tuned Hawaii, but with a big upside: it exploits water cooling even better, since it heats up and consumes a bit less than an R9 29x.
=> more room for OC, maybe reaching 1250/1280/1300MHz on the GPU = big perf.


----------



## gordesky1

Anyone having any issues with Crysis 3? It's being weird for me right now... if I have texture resolution set to very high and everything else on very high, then in weird spots, like around some objects in the game, the fps will drop down to the 20s and even as low as 18, then go back up if I look away from that object... but if I put texture resolution down to low and keep everything else on very high, I get 90+ fps everywhere. After more messing around I tried very high texture resolution and turned each option down one by one to see if I could find what was causing it.

And two options, shadows and object detail, are what's causing the very low fps... if I keep those on low I get 50s at the lowest, average 60+, and max around 150fps, even with texture resolution on very high.

This is at the beginning, on the objective "breach the cell". I bought this game back in 2013 and played it a good bit on my older system (an 1100T with a 5870 and 8GB of RAM), and I'm pretty sure I had decent fps back then.

Now I'm not sure if the game has been updated since then? Maybe they changed stuff around and made it more demanding since I last played?

Here's a benchmark I found for the game that shows AMD doing pretty well with it, so I know it's not my CPU (they're using a 690). It shows very high settings, though I'm not sure if they meant just the settings or the texture resolution too.



Heck, I even tried Crysis 1, and at the place where the sun rises the fps drops down into the low 30s??? That game still looks great, but I was expecting better performance from a 2007 game lol

edit: Hmm, in Crysis 3 I just passed that level, and in the level with the mines and towers shooting at you I put everything on max, even those two settings, and the fps is much better now??? Maybe it's just that one level that's buggy? I even tried 3200x1800 VSR and it was in the 30s and up to 42fps. Another edit... did some more research and found this link... http://kotaku.com/5986908/ah-so-thats-why-crysis-3s-first-level-is-so-bad It seems the ropes are causing the issue on the first level... a problem with the game itself... I was getting worried it was my system... lol. But still not sure if Crysis 1 is normal, though...


----------



## chris89

AMD really ought to take another look at the 390X; its technology is crazy. Taken back to the drawing board, reworking the PCB by placing all the VRMs and caps in one central location would improve power consumption an insane amount... Think about it: a 390X reworked on 14nm FinFET, with all the TMUs unlocked if possible (or just leave it at 64/176, that's plenty), and the memory bus reworked with HBM, or high-efficiency 256-bit GDDR5X... It could be the best GPU ever made.

I don't think AMD should *EVER* forget about the almighty 390X. It's unreal; Hawaii XT is a masterpiece of artwork and ingenuity. Its potential exceeds several years of so-called next-gen GPU architecture.

@gordesky1 I'll try Crysis 3... is there a benchmark in the game, or a way to run one? I only play @ 3200x1800, so I'll post those results.

Here I'm comparing overclocked to stock-tuned @ 3200x1800... the OSD robs a couple of frames... Compared overclocked DX11 maxed vs DX12 stock (DX12 has no OSD).

One of my Xeons cut out or something; the riser won't work, idk what's up, so I'm on one Xeon. Going from 1.6GHz to the 3.06GHz turbo boost is about a 7fps gain.




Dx12 stock tuned (ASIC @ 88C Max & Cooler Target Temps up to Max 88C)(Compared to 95C+ Stock)(Also still kept 208w TDP 200a TDC)


Dx11 overclocked


Dx11 stock


Crysis 3 appears to be TDP-limited @ 208W... it needs 240W at stock clocks/voltages to resolve most of it, with 230A TDC to match. Overclocked, it will need something like 275W TDP / 265A TDC to minimize throttling.

I tested 1188MHz @ 1377mv @ 275W TDP @ 265A TDC... Still throttled, but the max went from 96 to 102fps, 6fps faster... It's just going to need a massive TDP/TDC to maintain consistent FPS. This is really an optimization issue, most likely distance culling: you'd need to mod the .ini (or whatever) to pull the culling in, meaning reduce detail at distance. It's a waste of power with too small a culling radius (culling means reducing detail at distance, basically on what you can't even see). It's almost not worth overclocking; I'm just sticking with stock clocks/voltages and increasing TDP to 400W and TDC to 390A just to have the throttling headroom, or modding the config's culling distance.




The only way to keep it running nicely is turning system spec to low and texture resolution to very high... it's a console port and has issues. You can try this; it goes in the same folder as System.cfg / autoexec.cfg. I tried setting the distance culling, no change. Crysis 1 had better performance, with zero culling out to insane distances and still high fps. You can use this app to make Crysis look unreal...

CryConf.zip 765k .zip file


400W TDP / 390A TDC, 1377mv @ 1188MHz... utilization hopping all over. The game hardly looks any different on low system spec with max textures @ 3200x1800, and it runs beautifully.


I just set everything here to "Ultra High", and I'm sure some values can be tweaked to take it well beyond "Ultra High" to "Redonkulous High"... which resolved the drops and looks pretty nice.

Place it in the root of the Crysis 3 folder... this is set to "Ultra High" via CryConf; it changes System Spec to Custom... runs way better and looks better... haha

FOV tweaked... haven't tested it, but it will be nice with a wide FOV.

autoexec-distance-adjustment-near-60fps-constant-3200x1800.zip 1k .zip file



Just testing it... set a max frame rate to keep temps real low... it never once dropped below 50.


----------



## Dundundata

Quote:


> Originally Posted by *chris89*
> 
> This is what I plan to do with my 390x.


ooo so sexy









Witcher 3 has its fair share of hot chicks that will get into your bed. Of course it's quite a lengthy game, so it will take some time to see them all.


----------



## lanofsong

Hello AMD R9 390/390X owners,

We are having our monthly Foldathon from Monday 16th - 18th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

January 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## dasitman67

My 390Xs are still going strong; not even sure if there's a point to upgrading. Been playing Battlefield in 4K and can't complain. Now for a keyboard, hmm. I was reading this article on cheap mechanical keyboards, and they recommend the Corsair Strafe.

What do you fellow 390x lovers use?


----------



## bluej511

Quote:


> Originally Posted by *dasitman67*
> 
> My 390x's are still going strong, not even sure if there's a point to upgrade. Been playing Battlefield in 4K and cant complain. Now for a keyboard hmm, I was reading this article on cheap mechanical keyboards, and they recommend the Corsair strafe.
> 
> What do you fellow 390x lovers use?


I LOVE my Strafe; I've had no issues with it, and I'm pretty sure I've owned it just a year so far. Sometimes you have to restart your PC because the Corsair CUE software malfunctions, but it's very, very rare. Ever since switching to mechanical and ultrawide w/FreeSync, I don't see myself ever going back to anything other than this. My mech keyboard isn't even that loud with brown switches; the only loud thing on it is the spacebar haha.


----------



## dagget3450

Man, drivers must have gotten better, or I simply don't recall my old 290X numbers in FS. I just got a CPU back from RMA and wanted to drop the four 390Xs under water that had been sitting around for at least 6 months. Even at stock I was impressed with the numbers so far. I don't recall combined being this high, but I suspect it was broken for a long time. Maybe it's been fixed?


----------



## tolis626

Nope, it's definitely driver related. Back when it was new, my 390x couldn't break the 15000 mark in FS even at its max overclock of 1210/1725MHz (and that was only bench stable, especially the memory). Now it casually gets like 15100 points at 1200/1600MHz, and that's game stable-ish. AMD has been doing some pretty good work. On the flip side of the driver improvements argument, however, these "improvements" shouldn't have been needed. AMD would've been in a better position right now if their cards performed at their peak since day one. I'm not complaining, you know, it just proves that the hardware could do more than it was given credit for. The Hawaii GPU went from barely beating the 780 in 2014 to almost matching a mediocre overclocking 980. On one hand the free performance improvement is nice, but on the other hand the 290x should've been handily beating the 780 and maybe even the 780ti since back then. Damn AMD, can't you do everything right for once?


----------



## Dundundata

Quote:


> Originally Posted by *dasitman67*
> 
> My 390x's are still going strong, not even sure if there's a point to upgrade. Been playing Battlefield in 4K and cant complain. Now for a keyboard hmm, I was reading this article on cheap mechanical keyboards, and they recommend the Corsair strafe.
> 
> What do you fellow 390x lovers use?


Been doing a lot of research on mechanical switches, and I just ordered a Logitech G610 w/Cherry MX Brown switches. I looked at Corsair, Razer, etc., and they all seem fine. I stuck w/Logitech because I have their mice and like the software.


----------



## Carniflex

I remembered that Steam has a little test that was all the rage early in the previous year, so I ran it to see how it does with the 390X at 1100MHz. It is listed under the "tools" category in Steam. I remember that at release my score was hovering between 7.4 and 8.8.

For comparison, I looked up the screenshots I took in Feb 2016.

There was some discrepancy in the results depending on how the displays were connected. I have a couple of graphics cards in the system, and sometimes the test makes use of the secondary 7870 Eyefinity 6 card and sometimes it does not, and it's all a bit confusing. For reference, I have atm six 1080p displays connected: 4x to the 390X (3x DP, 1x HDMI) and 2x to the 7870 (2x DP). Back then the 7870 scored 9.5, but I suspect it might have made use of the 390X in the background. The 390X at stock scored 7.4, very consistently with no dropped or delayed frames, and the test with the forced multi-GPU flag scored 8.8.


----------



## chris89

@lanofsong Sorry I couldn't help; I'm sure others here, maybe the 4x 390X guy, could huff through a couple of folds... haha

Dang guys, I see the 4x 390X @ 250fps firestrike... WOW









If anyone is having throttling issues, the best way to resolve it is to unlock the limit causing the throttling...

I'm getting nearly 60fps all the time with vsync in Fallout 4 @ 4K ultra, modded to heck haha







It throttled down to 25fps, so I kept working at it until I found the fix... Fallout 4 doesn't hit so hard on the dual Xeons' 24 threads: 10% usage across the CPUs. So I'm CPU-clock bound in some scenes. Just need to pick up dual 1-terahertz Xeons... haha







Some scenes hit 25fps while the GPU clock holds but GPU utilization drops. So if I picked up dual X5675s (95W) that boost @ 3.33GHz vs 3.06GHz, I'm looking at 2 to 5 FPS HAHA... If I got dual X5690s up to 3733MHz, that's 21.75% more, so from 25 to 30fps... That would help, but X5690s for 5fps? Crazy. The game needs unlimited core utilization and finer-tuned "distance settings". If I walked away from the 25fps scene, instantly 60... ran away and came back, and it was at 60fps in the same scene... So, tiny software anomalies; whatever, it runs so flippin' good @ 99 million watts haha... Looked at VRM power in: 240 watts.

In case anyone hadn't noticed, HWiNFO reports over 27 million watts of core power (on 16.12.2 and 16.12.1 specifically)... Haha... So we must go well beyond it to bypass the software glitch.

It was also super important to specify "boost" clocks and voltages, which is clock + 15% up to 1166MHz; beyond 1166MHz (up to I'm not sure where?) it's clock + 17.5%. Does that make sense to anyone? The clock multiplied by 1.15 (or 1.175) equals the voltage in mV for stable operation. Using 17.5% on the big end also yields significant gains in FPS.

To completely dissolve pseudo power throttling on the Hawaii XT 390X/290X (at least for me, on a 390X with the reference-blower 290X cooling), set TDP MAX, TDP LIMIT, and TDC to the value *99999999*. Yes, you heard right: 99,999,999 watts and amps.







It appears the card reads total power incorrectly in nano-bursts, limiting even a value of 999, so 99999999 should do the trick.

I have not once experienced power-specific "throttling" at these settings. Keep reading...









You ask: wow, the value 99999999?!? What in the heck?







(It has not come ANYWHERE near 99999999; it sits somewhere in the meat of the 250-watt range.)

At 1,173MHz with temp targets medium and high @ 80C (40% fan) and 84C (50% fan) (quiet? yes... how?!? IDK?!?), along with ASIC max temp 88C as well as max temp target @ 88C, the glorious AMD Hawaii XT abides... Now it's throttled smoothly and beautifully by temperature, not by power, which is the key. When it hits 84C, it throttles back from the max clock of 1,173MHz (clock + 17.5%, i.e. clock times 1.175) to anywhere between 1,000 / 1,133 / 1,166MHz (clock + 15%, clock times 1.15), and temps DO NOT, I repeat, DO NOT exceed my high temperature target of 84C. All smooth and beautiful, like the card should have been all along.
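The clock-to-voltage rule of thumb described above can be sketched out. This is one owner's heuristic for his particular Hawaii XT sample, not a validated formula; silicon quality varies, so don't apply it blindly to another card:

```python
# Sketch of the poster's rule of thumb: core voltage in mV is taken as
# the clock in MHz times 1.15 up to 1166 MHz, and times 1.175 above that.
# These factors are this one card's observed values, nothing official.

def hawaii_vcore_mv(clock_mhz):
    factor = 1.15 if clock_mhz <= 1166 else 1.175
    return clock_mhz * factor

print(round(hawaii_vcore_mv(1166)))  # ~1341 mV (the post runs 1333 mV here)
print(round(hawaii_vcore_mv(1173)))  # ~1378 mV
```

Note the 1166MHz/1333mv pair from earlier in the thread lands within 10mV of this estimate, which is presumably where the heuristic came from.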


----------



## Regnitto

Got my replacement fan for my 390X today. China to USA in 10 days... not bad?

I was happy to see it was an exact match, brand and p/n both... funny side note, the date on the back of the original fan was my son's second birthday.



Got the fan installed; the card is running nice and quiet again.


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> @lanofsong Sorry that I couldn't help, I'm sure others here maybe the 4x 390X guy could huff through a couple folds... haha
> 
> *[snip - quoted in full above]*


No folding here; it's just a bench setup. I couldn't trust it left on for long periods. These 390Xs draw some damn power! I'm probably pulling close to 1800W-2kW now... and I barely even did anything, lol.
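That wall-draw figure is plausible for a quad-390X bench. A rough sanity check, where every number is an assumption (about 350 W per overclocked card, about 250 W for the CPU and rest of the bench, 90% PSU efficiency):

```python
# Ballpark wall-draw estimate for a multi-GPU bench. The per-card and
# rest-of-system wattages below are illustrative guesses, not measurements.

def wall_draw_w(cards, per_card_w, rest_of_system_w, psu_efficiency):
    dc_load = cards * per_card_w + rest_of_system_w  # DC-side load
    return dc_load / psu_efficiency                  # AC draw at the wall

print(round(wall_draw_w(4, 350, 250, 0.90)))  # ~1833 W at the wall
```

Which lands right in the "close to 1800/2kW" range quoted above.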

For fun:
Heaven 4.0, 1080p extreme


----------



## Streetdragon

Quote:


> Originally Posted by *dagget3450*
> 
> No folding here, it's just a bench setup. I couldn't trust it to leave on for long periods. These 390x draw some damn power! i am probably pulling close to 1800/2kw now.... i barely even did anything either lol.
> 
> for fun.
> Heaven 4.0 1080p extreme


Converting power into FPS and heat xD, I like that ^^

@chris89 I don't really understand why you have to use such a high TDP. I set mine to 200±5 watts and around 200 amps with a 1200 clock and 1.35v or so, and I have no problems with that ^^


----------



## chris89

The actual limit is 57599, so setting 99,999,999 just falls back to 57599... which does give zero throttling @ 1,173MHz and pulls the highest FPS. Yeah, this card uses the most power of, I think, any card ever... I have it hooked up to a dedicated KingWin 1,000W at the moment; it does fine. I get like 20fps extra in some parts with an unlimited power limit @ 4K ultra (3200x1800) over a power-limited card... So it's like I'm making my single card work, so to speak, like a CrossFire setup: consuming as much power as 2x 390X on one card and performing near CrossFire speeds @ 3200x1800. It's 40 to 100fps @ 3200x1800 ultra unlimited, compared to 25 to 35fps limited.

I finally received my 16mm Phillips M2 screws to mount the Arctic Accelero Xtreme III... and also used the copper I received to cool components. You need a thick piece of copper, about 1/4" thick, for the VRAM and rear VRM; the front VRM stays cool. Even the Arctic Accelero Xtreme III couldn't outperform a properly modded factory 290X blower on a 390X card. I checked contact and paste twice; both times contact and pressure were perfect. The Accelero III didn't get as hot as the stock blower by touch, which matters, because heat by touch means more efficient thermal transfer. The Accelero did not have what it takes to cool the 390X, period. You would need a pure all-copper heatsink assembly to outperform the reference design.

The Arctic Accelero Xtreme III just wasn't transferring enough heat. It's too light as well; its density is less than the 290X blower's, so the 290X blower performs best. A hefty, dense, pure-copper heatsink on the core would be necessary. You'd just need a 3D printer and a water-jet cutter to create an all-in-one copper piece that contacts all the VRMs, the VRAM, and the core in one nice heavy copper heatsink... blower style, or slim 120mm fans.


----------



## Removed1

@Chris89









Sorry to tell you this, but imo you are doing it wrong, mate, and you could kill your card if you continue to push and mod your bios/gpu/card further with this cooling setup.
It seems last time you didn't listen to the advice some of us gave you, to go under water.
If you don't want to go with a custom loop, go with a Kraken/HG10 + AIO.

For your info, setting 999 on TDP/PL/TDC does not remove all the real limits in the bios; you must remove the second layer of protection in the IR3567B to really get current flowing through the VRM without limit. That's why the values revert anyway to something lower. Only disabling that second layer allows setting 999, I think.
That is something that would *KILL* your card instantly with the cooling you made, especially *this stuff* that is supposed to cool the VRM!


----------



## chris89

Yeah, that's why I have max temp and ASIC @ 88C... 88C for a reason, as stock is 95. I did listen about water, and yes, that's what I have in mind. I would need a full water block that cools the VRM as well.

My factory cooler can handle unlimited TDP without any issue at all, and it doesn't run hot at all: 70C load @ 1,173MHz core / 1,563MHz RAM. Sure, it'll go to 83C after an hour, but it never goes above that and runs fine.

The test was only to see if the Accelero did well, which it didn't.

Sticking with the stock blower; it works fine, nice and cool and stable... 60fps @ 3200x1800 maxed.


----------



## Removed1

Quote:


> Originally Posted by *chris89*
> 
> Yeah that's why I have Max Temp And ASIC @ 88C... 88C for a reason...As stock is 95. I did listen about water, and yes that's what I have in mind. Would need a full water-block that cools VRM as well.
> 
> My factory cooler can handle unlimited TDP without any issue at all and it doesn't run hot at all... 70C load @ 1,173Mhz Core/ 1,563Mhz Ram.. sure it'll go to 83C after an hour but never goes above and runs fine...
> 
> The test was only to see if the Accelero did well, which it didn't.
> 
> Sticking with stock blower and it works fine, nice and cool and stable.. 60fps @ 3200x1800 maxed.


My concern was more about the VRM; killing the GPU by temperature is quite difficult, it's easier to kill it by voltage.
But for the VRM, 110° is almost the max they can handle. Remember that the 110° you read is the case temperature of the VRM, i.e. the package.
The IC inside the package runs far hotter than that. So imo it is good not to go higher than 100°, because inside, the IC is already almost at 110°/125°.

Full water or not, just buy or decently mod some VRM and RAM heatsinks; if you want to go with a simple metal shroud, get it as thick as possible.
Going water on the GPU is only for OCing it more; imo the GPU and the power delivery of a card are two separate things that you must each cool as well as possible.
Having a low temperature on the GPU with 110° on the VRM is useless, and the same goes for the opposite.

Mod the stock plate like everybody does and put an AIO cooler on top while waiting for your full block. It cools the VRM and the RAM decently with a bit of airflow on it.


----------



## simonfredette

Hi, I'm having trouble overclocking my R9 390. I've been running stock since I bought it in Feb, but now I want a bit more out of it. I was previously on Nvidia cards, so this is strange to me; I normally use MSI Afterburner to overclock and just run something like Heaven in the background. I'm finding that no matter what settings I put in Afterburner, the card is capped at a core clock of 1040. I uninstalled my gaming app to make sure it wasn't taking priority over clock settings, but still nothing. Are you guys actually overclocking using Crimson (or whatever WattMan crap they're calling it)?


----------



## chris89

@Wimpzilla
Thanks man yeah I appreciate the advice... I didn't properly go about cooling the video memory and core vrm at the back.. yeah they could use a 1/2" thick block of copper with the 2 drilled and thread tapped holes with Fujipoly 17w/m k or Panasonic PGS 2k w/m k. With a 1/2" block with threaded holes probably see 40C load VRM temps... My stock 290X blower on my 390X @ 1,173Mhz Core is like 70C VRM 1/2.. Core 75C'ish... I do have a water block... Just need to get the Clear PVC hose and submerge the radiator in water to bleed all the bubbles out... I would probably run just straight distilled water to start since the block and radiator are clean... I just need to make a custom mounting bracket.... Also need a custom forged copper front-plate to cool everything with 1/2" thickness on core VRM and 1/4" thickness across the video memory and memory VRM. If I could just create a custom forged copper heatsink I could get far better cooling with no need for water... I just don't have the tools... a custom 3d printer that can forge it in copper... if I did that would be insane... So we can only hope that maybe AMD or third party suppliers build full all-in-one forged copper heatsink/ fan assemblies that don't have the Cooper soldered to Allunimum fins because that doesn't work on high end gpu's... Need no solder binding anything and just have pure through and through cooper with no soldering or bonding anywhere... to keep conductivity at peak... I noticed the copper cools down like 20X faster than alluminum... the Copper absorbs very fast and cools down very fast as well as long as the density and mass is properly used.
@simonfredette
Send me your GPU BIOS dumped from GPU-Z... I can make you a very nice-running BIOS that's very stable with stock power limits, up to 1,173MHz, scaling by temperature and power limits.


----------



## simonfredette

Thanks, but I wouldn't even know how to flash the new BIOS; I'm really just trying to figure out how to overclock it the old-fashioned way. Now it's worse: it's running at a fixed 300 core clock and 150 mem clock with no way to change it.


----------



## chris89

It's super easy, like really super easy... Once flashed, you don't have to touch a single application, because the fan speed and temperatures are controlled by the firmware rather than software. So the card can save itself, whereas software can fail and take the card with it.

You dump and save the BIOS file from GPU-Z, then zip it into an archive and upload it here.











Then I can mod the file... Then you close out of all apps like MSI Afterburner and HWiNFO, along with anything in the taskbar minus antivirus. I can create a .bat so it auto-flashes... just double-click and wait; it will take up to 5 minutes to flash, but when it's complete, restart, and now you have a faster, more stable card that runs cooler, with no need to touch software apps.

GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file


----------



## simonfredette

Hawaii.zip 99k .zip file


Like this ^?


----------



## chris89

Yep. Thanks. Here ya go...

Simonfredette_r9_390_modded_hawaii.zip 1203k .zip file


----------



## simonfredette

I'll give it a shot. I'm reverting my drivers back to the suggested ones now and will refuse WattMan so it doesn't take over, then try to load this. Thanks; if it blows, it'll have been fun.


----------



## chris89

That's hilarious man! haha. Your card has power limits higher than reference, so I kind of dialed them back a bit... Also, your fan profile was a little loud, but I left it because I don't know how hot your card gets... Let me know about temps, performance, and noise; I can help make it quieter if need be.









I thought this was kinda funny... I had to take a double look at the fps... I was like what-what? haha, starting up Crysis 3.


----------



## simonfredette

I'll sacrifice noise for a cooler, healthier card; the computer isn't that close, so it doesn't bother me. These cards run hot, so I usually used SpeedFan and turned the fans to 100% any time the load was higher than 20.


----------



## simonfredette

I get an error: resource files missing.


----------



## chris89

Double-click ATIWinFlash.exe to open it... Open and find the modded file, then proceed to flash. Make sure MSI Afterburner, browsers, and everything else are closed. Wait, then restart.









Cool dude, yeah, the card will run cooler, and the max allowed temperatures were reduced to a far healthier level.

Though can you show a picture and the model of the card you use? The reference 390 can handle the BIOS and memory speeds, but some non-reference 390-series cards use passive memory cooling and need memory voltage/speed adjustments, because the modules could fail if passively cooled. I've seen the ASUS Strix use passive memory on the 390, causing even stock memory clocks and voltage to be an issue.


----------



## simonfredette

Double-clicking ATIWinFlash.exe doesn't open anything; it just pops up an error window: resource files missing. I went into Properties and set it to run as admin, and everything else possible is closed.


----------



## chris89

Right-click ATIWinFlash (click once, because it sometimes takes 30 seconds to open) and click Run as administrator... most of the other files are just "hidden" to keep the folder less busy. This BIOS may not work (meaning it may not work well under load, but it just might?) (it works fine in 2D, as it's near the same as stock), so I'll get to work on a revision BIOS that's more reasonable for low voltage. Stability with the BIOS I sent you depends on the paste application and the fan speed.


----------



## simonfredette

Still nothing; I'm restarting the computer before trying again. Right-clicking and running as admin does the same thing: all I get is an error popup.


----------



## chris89

Make sure User Account Control is turned off as well. On the same BIOS, I'm hitting 68C @ 1080p. At 3200x1800 is when the fan speed and everything gets a tiny bit iffy, though I have worked on that BIOS for a month straight, and it holds true even at 3200x1800 VSR. I just can't wait until 3840x2160 and 5120x2880 options are available. *CROSSING FINGERS*







*New drivers are out, by the way, guys*... If I had $400 lying around I'd pick up a 40" 4K Samsung, but maybe sometime. haha


----------



## simonfredette

No, it didn't work; this card is jinxed. User account settings were already turned off.


----------



## Dundundata

I have heard some people are having trouble w/Afterburner and the new drivers. It works for me on 16.12.2, but now I see there's a newer driver. I haven't used WattMan at all; some features were missing for me, like temperature control. Plus, I love Afterburner.


----------



## simonfredette

So do I, but even when I changed settings in Afterburner, it would not go over the stock settings used by WattMan. I never "used" WattMan; it just automatically takes over and trumps other OC software.


----------



## Removed1

Quote:


> Originally Posted by *simonfredette*
> 
> So do I but even when I changed settings on afterburner it would not go over the stock settings used by wattman. I never "used" wattman it just automatically takes over and trumps other oc software.


Wattman is easy: I leave everything on auto, just add a % on the GPU, set my RAM clock and 50% power, and that's all.
I check the clock using GPU-Z, so I increase it 0.5-1% each time; after a bit you begin to understand which % equals which clock.
To change the VDDC (aka vGPU), use MSI AB; since you have everything on auto, it will apply the +XXXmV on top of the stock VDDC.


----------



## simonfredette

When I try to run both MSI AB and Wattman at the same time, it crashes and sets the core clock down to 300 and the memory to 150; then I have to restart the computer to start over.


----------



## Removed1

Quote:


> Originally Posted by *simonfredette*
> 
> When i try to run both msi ab and wattman at the same time it crashes and sets the clock down to 300 and mem to 150, then i have to restart the computer to start over


Fine, can you please specify the brand of the 390?

-Did you use GPU-Z to check whether the clocks are applied, and likewise the VDDC of the GPU? Can you post a GPU-Z screenshot of your card?
-Can you change the GPU voltage with MSI AB, and do you see the VDDC change in GPU-Z?
-What overclock settings are you applying: the valid ones, and the ones that crash?
-What are your temps (GPU, VRM1, VRM2) under load?


----------



## chris89

Here's a video showing how to use ATIWinflash for windows 10... If you have Windows 7... I'll make another video.

http://www.mediafire.com/file/j62yl9d99nh1q90/How_To_Use_ATIWinFlash_Windows_10.mp4


----------



## simonfredette

It's an MSI R9 390. I took a screenshot of what I mean by "it crashes". I have reinstalled the AMD drivers and didn't accept the Wattman agreement, so Wattman is not what's causing the issue. As you can see, while running Heaven with AB set to stock (but after changing voltage settings), my clock speeds are locked at 300 and 150; you can also see that the fps is around 15 when it should be over 100.


----------



## chris89

I never messed with software overclocking on the 390X; it didn't work well like it did on other cards such as the 380X. So sticking to BIOS mods is the best and most reliable way to make massive strides in performance. I just compared the BIOS I made for you: my stock FireStrike GFX score was 12,800; with the mod, 13,800.


----------



## simonfredette

Quote:


> Originally Posted by *chris89*
> 
> I never messed with software overclocking the 390x, it didn't work well like it did on other cards like 380x. So sticking to BIOS mods is the best and most reliable way to make massive strides in performance gains... I just compared the BIOS I made for you... my stock firestrike GFX score was 12,800... with the mod 13,800...


The only thing I saw different in your video is that you have a copy of the zipped file in the folder with the BIOS mods and I did not. I did everything the same, but when I double-click the app I don't get the message asking to run as admin, I just get an error screen; same if I run as admin, and same with all windows and background applications closed. I'll try adding the zip file in with the unzipped folder and try again. Edit: tried it, same error.


----------



## chris89

Turn off the anti virus program and literally every single app in the task bar, as they conflict. I don't even run an anti virus... they're a total waste of time.


----------



## simonfredette

Quote:


> Originally Posted by *chris89*
> 
> Turn off the anti virus program and literally every single app in the task bar, as they conflict. I don't even run an anti virus... they're a total waste of time.




Nothing at all open, and I even cleared out more background processes; if I turn any more off, Windows won't be running anymore.


----------



## simonfredette

OK, so I took a screenshot with AB asking for a core clock of 1050. I'm running Kombustor, and you can see in GPU-Z that the core clock is still 1040; also, the screen gets glitchy with horizontal lines flashing in and out when AB is on. I also put my AB settings on the right. I've tried most combinations of settings I can think of, but if someone has the proper AB settings for an MSI 390, I can stop the trial and error. The card crashed while testing, I had to shut down, and when I rebooted I got the message from AMD saying they reverted to Wattman settings because of the crash. Again, I have not accepted the terms for Wattman, so I am definitely not using it. Could this be caused by a setting in my actual BIOS or something?


----------



## Removed1

Quote:


> Originally Posted by *simonfredette*
> 
> its an msi r9 390, I took a screenshot of what I mean by it crashes, I have reinstalled the amd drivers and didnt accept the wattman agreement so wattman is not whats causing the issue. As you can see while running heaven with AB settings set to stock card ( but after changing voltage settings) my clock speeds are locked at 300 and 150, you can also see that the fps is around 15 when it should be over 100


So if you apply +50mV in MSI AB your card crashes under Heaven and bugs out to 300/150MHz, with the same behavior if you change the GPU/RAM clock?
If so, you may have a voltage-locked card; I know some are like this.

But I don't know whether it is only a BIOS lock, in which case you could flash another one, or a hard lock. I can't help you more here since I only have a 290.
I looked at your BIOS and nothing suggests it is a locked one; it looks quite normal. If you have a dual BIOS, you can try flashing another BIOS; if it doesn't work under Windows, just do it under DOS with a bootable USB key.
You should also check the 290/390 unlock thread and read your GPU info to check whether you have a Grenada PRO or XT.


----------



## simonfredette

Yeah, if I open GPU-Z and run Kombustor I can see the memory reach its max; then I change only the voltage setting to +25, run Kombustor again, and I'm back down to 300/150 and locked until I restart the computer.


----------



## Removed1

Quote:


> Originally Posted by *simonfredette*
> 
> Ya if I open gpuz and run kombuster I can see the memory reaches its max , then I only change the voltage setting to +25 and run kombuster again and im back down to 150 300 and locked until I restart the computer.












At least you know where the problem is; google it and ask there. It's worth digging a bit, since the card has nice OC potential.









Maybe just flashing a BIOS on the other switch, if you have one, will solve the problem.


----------



## chris89

Try this: right-click, open as administrator, open the modded file, and flash... make sure everything is closed.









atiflash_274.zip 1214k .zip file


----------



## simonfredette

Quote:


> Originally Posted by *chris89*
> 
> Try this and right click open as administrator and open the modded file and flash... make sure everything is closed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> atiflash_274.zip 1214k .zip file


All right, finally a result, I think... I ran the atiflash application, which brought me to a series of cmd pages; I had to press any key to move on a few times and then the screen closed. I ran the benchmark and it was running 1045/1500. Did I still need to run ATIWinflash after?

Never mind: I went back and used Winflash, selected the modded ROM from earlier, and it just finished flashing. I'll restart and see how it goes.


----------



## simonfredette

Well, the flash did work; however, the core clock is now 895 when stock was 1050... that's hurting performance quite a bit.


----------



## dagget3450

I didn't think people used Kombustor on AMD GPUs anymore. Or am I thinking of something else?


----------



## simonfredette

It was just easier to show a screenshot at "full" load than running Heaven or 3DMark; I don't think anyone actually uses it to judge the stability of an overclock.


----------



## Removed1

Quote:


> Originally Posted by *simonfredette*
> 
> Well the flash did work, however the coreclock is now 895 when stock was 1050 .. thats hurting performance quite a bit.


Set a 50% power target to see if it recovers the stock speed.
Now that flashing works, you should just flash an unlocked 390 BIOS.


----------



## simonfredette

Quote:


> Originally Posted by *Wimpzilla*
> 
> Set a 50% power target to see if it recovers the stock speed.
> Now that flashing works, you should just flash an unlocked 390 BIOS.


Power limit at max still gives an 895 core clock. I did just flash it with the new ROM. I've been reading online, though, and none of these MSI Hawaii cards are voltage locked; only Sapphire and a couple of others are. I think I just need a new flash now that I know flashing works, but with somewhat of an overclock: something in the 1100s for the core clock and the 1500-1650 range for the memory clock.


----------



## chris89

Yeah, it's proper to hit 895 in FurMark; FurMark absolutely hammers these cards. HAHAHA

You're gonna want to "Reset" MSI AB and not touch anything, to test it as-is in game first and see how it does... if you're not happy with the fps, we can increase the power limits in the new BIOS.

You don't want 1173MHz in FurMark... you'd need an arc reactor on hand plus a 5,000mm radiator... only sort of kidding... these cards use a lot of power; even 225 watts is not enough for 1GHz on this card. So you can't set the power limit to what the clock wants; you have to set it to what the power supply and thermals can handle. Kinda cool: the performance of the card exceeds all the elements supporting its existence, which gives us enthusiasts massive overhead to see what it can actually do...

New BIOS on the way... Try the BIOS in-game first... See how it goes on temps... then we will go from there.

That BIOS is for actual gaming for hours on end or folding or some such, stable... We can throw an experimental bios at it with wild performance numbers but stability cannot be guaranteed.


----------



## simonfredette

I don't want wild performance, but now we're at less than stock clock speeds. I'll try it in game and see if I still have the same 101 fps that I had.


----------



## chris89

The first thing I notice is your memory isn't operating at the BIOS-specified 1563MHz, so you will need to reset MSI Afterburner to restore the defaults of 1173MHz core / 1563MHz memory. That's 384GB/s to 400GB/s, which is 4.16% faster, and from (presumably) 1000MHz to 1173MHz is 17.3% faster; combined, roughly 22% faster than stock (1.0416 x 1.173 ≈ 1.222). That means your 101fps could be around 123fps... far too much fps in my book unless you have a 120Hz display and want 120fps minimum, which it is now capable of.
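Quick sanity-check sketch of that math (my own back-of-envelope arithmetic, not anything measured: it treats the bandwidth and core gains as independent scaling factors, which is an upper bound on real fps):

```python
# Combine a memory-bandwidth gain and a core-clock gain multiplicatively
# (simple percentage addition slightly understates the combined factor).

def combined_speedup(bw_old_gbps, bw_new_gbps, core_old_mhz, core_new_mhz):
    """Upper-bound speedup if performance scaled perfectly with both gains."""
    bw_gain = bw_new_gbps / bw_old_gbps      # 400/384 ~ 1.0417 (+4.16%)
    core_gain = core_new_mhz / core_old_mhz  # 1173/1000 = 1.173 (+17.3%)
    return bw_gain * core_gain

factor = combined_speedup(384, 400, 1000, 1173)
print(f"combined factor: {factor:.4f}")        # ~1.2219, i.e. ~22%
print(f"projected fps from 101: {101 * factor:.1f}")
```

In practice games rarely scale perfectly with either number, so the real gain lands somewhere below this figure.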

The card will hop to the clock that is within the safe parameters of the BIOS specifications. FPS is much higher, about a 20% increase in fps potential, yet it's power limited and temperature limited to keep it huffing along for a long time. We can increase the power limits, but I want it set up in a way that you could load a fresh OS with only the driver and a game, walk away for a month, and know it'll still be working afterward; that's how it's dialed in. The stock BIOS is incapable of such a thing... maybe it'll work for a month at 95 degrees Celsius, and that's just about it.

On that BIOS at 1080p @ 60fps vsynced, it'll drop to 900MHz, never breaking a sweat at 59 degrees Celsius. If I wanted to push it to 500fps, the heat and power would come along with it, as well as 100% fan speed. So realistically it's perfect for continuous 1080p 60fps, never needing 1173MHz; 3200x1800 is when the 1173MHz core and 1563MHz memory shine through.


----------



## simonfredette

Well, I just tried gaming with the same graphics settings I used to have and never went past 20 fps when I was usually around 90-100. It was obviously unplayable at 20 fps, haha. I might have to reflash to the original until we have something better. These cards run pretty hot, so I'll have the GPU fans running at 100% even if it's just to increase airflow anyway. I wasn't satisfied with the stock settings, and MSI AB won't let me change voltages to increase the core and memory clocks past their stock boost clock.


----------



## chris89

First thing is to reset MSI Afterburner and verify in both GPU-Z and MSI AB that the defaults are 1173MHz and 1563MHz. It's throttling; it's getting too little power. Your power limit is 20% higher than reference, meaning the ASIC quality is lower than reference, so more power is needed to drive lower clocks. Verify the GPU-Z clocks and post a screenshot. New BIOS on the way.

This BIOS is the same but with a modified fan profile if you want 100% fan: it now reaches 80% at a far lower temperature, 70C, and 100% at 80C. Max temp is 88C; capping the ASIC at 88C is a good thing to prevent damage. The power limits go from a stock 225W max TDP, 208W power limit, and 169A TDC to a 275W max, 240W power limit, and 220A TDC. So this new BIOS has a 20% higher limit by stock, meaning a +50% power limit is 412.5 watts...
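The fan profile above could be sketched roughly like this (only the 70C→80% and 80C→100% points come from the post; the lower curve points and the linear interpolation between them are my assumptions, not the actual BIOS fan table):

```python
# Hypothetical fan curve: linear interpolation between (temp C, duty %) points.

def fan_speed(temp_c, curve=((40, 20), (70, 80), (80, 100))):
    """Return fan duty (%) for a core temperature, clamped to the curve ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linearly interpolate between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # above the last point: pin at max duty

print(fan_speed(70))   # 80
print(fan_speed(80))   # 100
print(fan_speed(100))  # 100 (clamped)
```

The point of the steeper ramp is that the fan hits 100% well before the 88C cap, instead of letting the core coast up to 95C like the stock profile.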

I also noticed I made a mistake on the memory clock in the v1 BIOS, so v2 is now corrected. You can delete the v1 BIOS and that whole package I sent originally.

*Please Upload GPUZ Screenshot to verify clocks are correct when done, thanks*.









Simonfredette_r9_390_modded_hawaii_v2.zip 99k .zip file


----------



## simonfredette

Yes, the settings are correct. I restarted and reran Heaven without opening AB, to make sure it didn't interfere, and the core clock did reach 1000; but the memory clock was still at 150 instead of ~1500, so my fps was 19. Good to see the flash worked; now I'll try the second one and see how it reacts.


----------



## chris89

*Make sure to reset MSI Afterburner prior to beginning the flash and restarting.

Also, if you can, screenshot it under the GPU-Z render test to make sure the clocks reach peak under that light load.*


----------



## simonfredette

Doesn't look like anything changed: the memory clock is stuck at 150, the core clock didn't go past 1000, and fps were around 20. I'll try a reflash and make sure everything is closed again.


----------



## chris89

Yeah, I'm not sure the flash is succeeding at all, as the clocks are all stock... something is interfering with your clocks. Did you disable ULPS? I would leave it enabled. Uninstall MSI AB, then test, and reinstall without checking all the unified overclocking jazz in MSI...

*If the flash actually succeeds, it should be even better than my reference 390X with a 225W Max TDP, 208W Power Limit, and 169A TDC.

AMD ReLive capture is far from perfect with AVC; the bitrate seems to vary asynchronously with the capture. It needs a CBR mode with a 2-pass option to clean it up...

http://www.mediafire.com/file/1kfynxn3p111417/390X-225W-TDP-MAX-UP-TO-1173MHZ.zip*


----------



## simonfredette

Do you have a screenshot of the actual settings page in MSI AB? I've tried a bunch, so I still don't know what they should be, especially compatibility and reference vs. MSI, etc. Mine look like this now, but I'll uninstall MSI AB and reinstall fresh.



I tried reflashing the v2 ROM and it said it was already loaded, so it definitely took.

I installed a new version of Afterburner and cleared all settings; the new version had the proper defaults after restarting the computer, but my clock speeds are still stuck at 1000/150...


----------



## chris89

Show a screenshot of the first page of GPU-Z. If it says already installed, first flash the stock hawaii.rom and do not restart, then flash the modded v2.

This is how I have my MSI AB set up. Once you have a proper BIOS installed, MSI AB's only use is to support the RivaTuner OSD; I use HWiNFO to relay info to RTSS, so I don't need MSI AB for anything other than being RTSS's companion. I don't mess with any overclocking software at all except Wattman on rare occasions, just if I want to turn up the fan or mess with clocks for some reason, which is never once a proper BIOS is installed.

So basically, I now believe v2 will not be ideal, because the max TDP etc. is far in excess. I believe if I re-modify v1, and you test it with MSI AB uninstalled and its folders and settings deleted, followed by a restart, it'll be pretty good. Your temperatures are high enough as-is, so turning up the power isn't ideal. I see your VRM @ 77C; I see 77C only after hours of 1173MHz gameplay, and it takes a minute to hit 77C on the core VRM.


----------



## simonfredette

Yeah, I'd like to not use it at all. I've been googling the stuck-at-150 issue and it's pretty common among 280/290/380/390 users, but their fix has been to go into Windows display settings and turn off aux monitors, and I'm only running one monitor. Others have rolled back to pre-ReLive drivers (before 16.12) to fix the issue.
I'll try going back to the 17.1 drivers before I start trying to fish out the old 16.12.1.


----------



## chris89

If Default and GPU Clock are different, uninstall MSI AB without keeping settings, and reset Wattman to correct it. I don't have that issue per se, unless I had a cluster of MSI settings and other tools interfering with the BIOS. The BIOS does the finest job of controlling everything.

If I were you, I would...

1) Uninstall MSI Afterburner, not keeping settings
2) In Device Manager, uninstall the display device, also check-marking "delete the driver software"
3) DDU (latest) to wipe the whole system clean and clear of AMD jazz (Clean and Do Not Restart); proceed to step 4
4) Flash a proper BIOS, for instance v1 re-modified, and do not restart prior to step 5
5) Install the latest AMD driver
6) Restart and test as-is with GPU-Z; make sure Default and GPU Clock are the same. Run the GPU-Z render test and check GPU-Z monitoring to make sure it hits peak clocks.
7) Test games with Wattman open to monitor clock throttling etc., and post a screenshot of Wattman monitoring after some solid fullscreen gaming (if it doesn't fail @ 275W TDP, which it most likely will, i.e. it would need the v1 BIOS re-modified)


----------



## simonfredette

Quote:


> Originally Posted by *chris89*
> 
> If Default and GPU Clock are different, uninstall MSI AB without keeping settings, and reset Wattman to correct it. I don't have that issue per se, unless I had a cluster of MSI settings and other tools interfering with the BIOS. The BIOS does the finest job of controlling everything.


I updated my drivers to the beta they offered and restarted, then ran Heaven with GPU-Z and didn't touch MSI AB at all, and the settings are correct: 1173 and 1563. But it's artifacting, and it crashed and reverted to Wattman settings. It's going to need more power to maintain those clocks for sure.


----------



## chris89

By the way, your power supply is about to blow up... so best not to push the limit, and stick with v1 re-modified until you get the money for a 1200W high-efficiency PSU. The Rosewill Photon 1200 has a single super powerful 100-ampere 12-volt rail. http://www.ebay.com/itm/Rosewill-Photon-1200-PHOTON-Series-1200W-Full-Modular-Power-Supply-80-PLUS-/192060048359?hash=item2cb7abc3e7:g:zJMAAOSw5cNYXF-U

Artifacting means that voltage cannot work on your card at a 17.5% voltage-to-clock ratio. It requires more voltage (the stock ratio is probably 1250mV at 1050MHz, i.e. about 1.19 mV per MHz, or 19.05%). Increasing the power limit won't fix it. V3 and a re-modified V1 are on the way.
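The "voltage-to-clock ratio" here, as I read it from the numbers in the post (my inference, not a documented formula), works out to core mV = core MHz × (1 + ratio):

```python
# Stock 1050 MHz @ 1250 mV implies a 19.05% ratio; the modded BIOS tests 17.5%.

def vddc_for_clock(clock_mhz, ratio):
    """Core voltage (mV) implied by a clock and a voltage-to-clock ratio."""
    return clock_mhz * (1 + ratio)

print(round(vddc_for_clock(1050, 0.1905)))  # ~1250 mV (stock point)
print(round(vddc_for_clock(1173, 0.175)))   # ~1378 mV (the lower-ratio target)
```

So dropping the ratio from 19.05% to 17.5% is an undervolt relative to the stock scaling; whether a given chip tolerates it is exactly what the artifact testing in this thread is probing.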









*MSI R9 390 BIOS MODS

Abiding by the stock voltage-to-clock ratio of 19.05% @ 1173MHz; testing 17.5% from 1133MHz to 1166MHz. The percentage reduction in voltage is important, as it yields a corresponding reduction in temperature when limits are reached. If the voltage is not reduced by that percentage, no temperature drop will occur and the card will crash.

Cool Running Max TDP 225W/ 208W Limit/ 169A TDC (15% Cooler than stock and faster)

Simonfredette_r9_390_modded_hawaii_v1-Re-Modified.zip 99k .zip file

Hot Running Max TDP 300W/ 240W Limit/ 220A TDC (30% Hotter than stock and way faster)

Simonfredette_r9_390_modded_hawaii_v3.zip 99k .zip file
*


----------



## gordesky1

The Wattman drivers also have issues with my MSI 390 card. Before I used chris's modded BIOS, doing anything in Wattman would do what simon's card did: put the memory down to 150MHz until I restarted or reinstalled the driver.

But then I noticed Wattman doesn't seem to like dual monitors either. Removing my 2nd DVI monitor and leaving just my main 50-inch HDMI connected, Wattman works great for me. Enable dual monitors again and bang, the driver will either crash and go to 150MHz or just get stuck at 150MHz with lines going across the screen...

And it seems like even in Afterburner the overclock I did never took for the memory, no matter what I set it to, even though Afterburner showed it. The core speed took, though.

Before these drivers I had no issues overclocking with Afterburner; I had the clocks at 1150/1730.

AMD really needs to rework these drivers for the 300 series.

But yeah, I've been using your 1130/1562 BIOS, chris, and so far haven't had any problems.







Didn't try the 1175 one yet tho.


----------



## Dundundata

I also have an MSI 390. I did a clean install, including DDU of AMD software/drivers, when I installed 16.12.2. Also did an uninstall and reinstall of MSI AB to 4.3.0, and everything seems to jibe together just fine.


----------



## simonfredette

Yeah, it's working now. I did the same: reflashed my original BIOS and DDU'd, but went to the 17 drivers. Installed a clean AB and OC'd to 1100 core, 1540 memory, power limit maxed and +56mV. I'll keep redoing the benchmarks to see how low I can get the voltage bump before I get artifacts again, and I might be able to get a bit more memory clock out of it if I improve my cooling. I'm printing new fan brackets to mount intake and exhaust fans to the back, as the computer is built into an antique sewing machine and had no real cooling to speak of. What did you get out of yours, stable of course?


----------



## chris89

Thank you, Gordesky1. Maybe you could try the 1173MHz BIOS, like me? v1 re-modified... I can make you a new one just like what I run.

Nice. So Simonfredette, I suppose you passed on the BIOS mods for 1173MHz core / 1563MHz RAM? Going beyond 1563MHz just hogs extra power for barely any extra fps...

I can run up to 1,250MHz core and 1,758MHz RAM, as I've said like 100 times... it's just hot and loud.

I run the 1173MHz BIOS just like the one I made for Simonfredette, 225W, and it is so stable and runs perfectly... I can always count on going in game and not crashing no matter how long I play, and it looks and runs gorgeously.


----------



## simonfredette

With the new cooling I got to 1140/1550, but that's as far as I could get before it got unstable. The card doesn't exceed 59C, but it peaks at 291W in and 230W out... lots of juice for my 750W PSU, especially with an OC'd CPU as well.


----------



## chris89

You might try the BIOS I made... it allows peak speeds of 1173MHz, and it'll hop between 1000MHz, 1133MHz, and 1166MHz, all revolving around the temperature and power limits. The voltage that supplies the clock is correct, so there's no instability whatsoever, just peak performance all the time. Plus it's cooler and faster than 1140MHz/1550MHz. At 1140MHz x 64 ROPs, the card is doing a maximum of 72,960 million pixels per second. My BIOS gives 4 hops: 1000MHz @ 64,000 Mpixels/s, 1133MHz @ 72,512 Mpixels/s, 1166MHz @ 74,624 Mpixels/s, and finally 1173MHz @ 75,072 Mpixels/s. So the BIOS gives a performance window of 11,072 Mpixels/s available at any given time to pull peak performance. If you want, my video shows how it works...

Watching the fps carefully, you'll see hops of around 30fps between clocks, like 160 to 190 instantly in a split second; not 1fps up and down, but big hops in fps. Running a static max clock won't give the best performance like a boostable BIOS will.

It's really neat: when it shows 1000MHz and then hops to 1173MHz, we just witnessed 11,072 million more pixels per second in that split second... so neat. This card is so insanely powerful...

http://www.mediafire.com/file/1kfynxn3p111417/390X-225W-TDP-MAX-UP-TO-1173MHZ.zip
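To make the hop math concrete, here's the same arithmetic as a tiny sketch (just core clock × ROP count, i.e. peak theoretical pixel fill rate, not measured throughput):

```python
# Peak pixel fill rate for each boost state of the BIOS described above.
ROPS = 64  # Hawaii has 64 ROPs

def pixel_rate_mpps(core_mhz, rops=ROPS):
    """Peak pixel fill rate in megapixels per second."""
    return core_mhz * rops

for clk in (1000, 1133, 1166, 1173):
    print(f"{clk} MHz -> {pixel_rate_mpps(clk):,} Mpixels/s")

# Size of the performance window between the lowest and highest hop:
print(pixel_rate_mpps(1173) - pixel_rate_mpps(1000))  # 11,072 Mpixels/s
```

Real frame rates depend on far more than fill rate, so treat these as the ceiling the boost states can theoretically reach, not a prediction of fps.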


----------



## Dundundata

I can get up to 1180/1625 @ +100mV, but I usually run 1140/1600 @ +50.

At stock voltage I run 1100/1500.

I haven't modded the BIOS.


----------



## simonfredette

Quote:


> Originally Posted by *chris89*
> 
> You might try the bios I made... it allows peak speeds of 1173mhz... and it'll go between 1000mhz, 1133mhz, and 1166mhz all revolving around the temperature and power limits... The voltage that supplies the clock is correct so no instability what so ever, just peak performance all the time. Plus cooler and faster than 1140mhz 1550mhz... 1140mhz x 64 rop means the card is doing 72 billion 960 million pixel's per second maximum... With the my bios it gives 4 hops... 1000mhz @ 64 billion pixel's per second/ 1133mhz @ 72 billion 512 million pixel's per second/ 1166mhz @ 74 billion 624 million pixel's per second/ and finally 1173mhz @ 75 billion 72 million pixel's per second... So the bios gives a window of performance of 11 billion 72 million pixel's available at any given time to pull peak performance... If you want my video it shows how it works...
> 
> Watching the fps carefully... You'll see between hops were talking 30fps hops between clocks... like 160 to instantly 190 in a split second... not 1fps up down but big hops in fps. Running a static max clock won't give the best performance like a boost-able bios will.
> 
> It's really neat like when it shows 1000Mhz... then it hops to 1173mhz.... We just witnessed 11 billion 72 million more pixel's in that split second... So Neat.. This card is so insanely powerful...
> 
> http://www.mediafire.com/file/1kfynxn3p111417/390X-225W-TDP-MAX-UP-TO-1173MHZ.zip


I tried both the v3 and the re-modded v1, and neither could run a benchmark test: artifacts all over. Maybe I'll try it again, or just game with it and see whether the same happens during games or only at 100% load.


----------



## Removed1

Happy to see your card was not VDDC locked.








Happily, I was wrong.








VRM temps at full load?


----------



## chris89

I always find it interesting that settings that work on my card don't work on any other... I get no artifacts at all; that video's artifacts were the fault of a 5Mbps video bitrate @ 1080p 60fps, which is too little bitrate for 1080p 60fps...

Usually, though, you get artifacts because the 12-volt rail droops too low, meaning the core voltage is also drooping, delivering too few volts to the core, hence artifacts.

Right now I run a dedicated 1000W PSU for the video card, doing nothing but powering the 390X alone, and it still droops the 12-volt rail on the 1000W.

It's 1398mV @ 1173MHz... though most likely the video memory cannot operate properly @ 1563MHz. That's probably the issue. New BIOS coming.

I feel 1398mV is too much per se, so the v4 BIOS is 1378mV @ 1173MHz.

Here's a 1500MHz-memory BIOS, up to 1173MHz, with TDP max/limit @ 240W and TDC @ 220A.

The stock MSI R9 390 is 230W max/limit and 210A TDC. The stock reference 390X is 208W max/limit and 200A TDC.

Simonfredette_r9_390_modded_hawaii_v4.zip 99k .zip file


----------



## simonfredette

I'll try it; doesn't hurt, and I have a couple of sets of known stable settings now.


----------



## chris89

Sweet. Yeah, I'd like to hear how it works for you guys. I don't know why the R9 390 can't do 1563MHz artifact-free like reference can... but it's okay, at least you can get 1173MHz.







It looks like the memory module nearest the GPU core diode isn't totally covered by thermal material; it gets too hot beyond 1500MHz.

I suspect the V4 BIOS to be a bit loud... and a bit hot? If it's too loud and hot, here's a quieter and cooler BIOS with the same 1173MHz peak boostable clock.

The best way to make a fast and cool BIOS is to have a nice gap between max power and the power limit; the window in this BIOS is 40 watts from limit to max. It also includes a 70-ampere current reduction, which is a massive power saving and runs a lot cooler. This BIOS has a max TDP of 240 watts, a power limit of 200 watts, and a current limit of 150 amperes, plus fan speed reductions so it's not whining like a jet engine...









The ampere current is based on 75% of the TDP limit, not the max TDP; this is a far cooler-running way to do it. If you want a little extra performance, and a bit more heat and noise, we can base the current on 75% of the max TDP instead, which would be 180A instead of 150A. That's a 20C difference.
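That 75% rule of thumb boils down to one line (this is the poster's heuristic as stated above, not an AMD spec; the numeric factor is the only thing assumed here):

```python
# TDC (current limit, amperes) derived from a chosen TDP figure (watts).

def tdc_amps(tdp_watts, fraction=0.75):
    """Current limit in amperes as a fraction of a TDP figure."""
    return tdp_watts * fraction

print(tdc_amps(200))  # 150.0 A, from the 200 W power limit (cooler option)
print(tdc_amps(240))  # 180.0 A, from the 240 W max TDP (hotter option)
```

Basing the current limit on the lower power-limit figure rather than the max TDP is what keeps the card throttling earlier and running cooler.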

Simonfredette_r9_390_modded_hawaii_v5.zip 99k .zip file




It looks like the modules under the memory cooling plate are also only contacting half the module... no good.


----------



## simonfredette

Can't do that either; I only made it about 7 seconds into the Heaven benchmark before the screen froze, shut off, and reverted to Wattman settings, which are now the default BIOS settings, so it fails again and again until Windows blocks Heaven from running. It's probably like you said: the PSU can't provide stable power to maintain that overclock under 100% load.


----------



## bluej511

Quote:


> Originally Posted by *chris89*
> 
> Sweet.. yeah I'd like to hear how it works for ya guys.. I don't know why r9 390 can't do 1563mhz artifact free like reference can.. but it's okay at least you can get 1173mhz
> 
> 
> 
> 
> 
> 
> 
> Looks like the memory module nearest to the gpu core diode isn't totally covered by thermal material... gets too hot beyond 1500mhz.
> 
> I suspect V4 BIOS to be a bit loud... and a bit hot? If it's too loud and hot... Here's a quiet and cooler BIOS with the same 1173mhz peak boostable clock.
> 
> The best way to make a fast and cool bios is to have a nice gap between max power and power limit... The window in this bios is 40 watts from limit to max... Also includes a 70 ampere current reduction which is massive power savings and a lot cooler... This bios has max tdp of 240 watts, power limit 200 watts, and ampere current of 150 amperes. Also fan speed reductions so it's not whining like a jet engine...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The ampere current is based off of 75% of the TDP limit, not the Max TDP... This is a far cooler running way to do it. If you want a little extra performance, and a bit more heat and noise we can base the Ampere Current off of 75% of the Max TDP... Which would be 180A Current from 150A Current... That's 20C difference.
> 
> Simonfredette_r9_390_modded_hawaii_v5.zip 99k .zip file
> 
> 
> 
> 
> looks like the modules under the memory cooling plate also are only contacting half the module... no good.


Are you talking about the actual VRAM? If that's only half covered it doesn't really matter. The memory chips don't need anywhere near the cooling that VRMs do; they don't even get remotely hot compared to VRMs, so I wouldn't worry about that.


----------



## chris89

By the way, the stock reference 390X cooler is too small and runs too hot; however, the 290X reference cooler rocks the socks for sure, as it's beefier. With the tiniest amount of thermal paste spread precisely across the whole die edge-to-edge, and keeping the factory-equipped Fujipoly 17 W/mK VRM pads (alongside the cheap, low-conductivity memory pads), the 390X on the reference 290X blower cools exceptionally well. Mine runs so well: stable, cool, and the fan doesn't need to go too high either. I'm gaming at 75C core, 60C VRM1/2 on a proper BIOS at 50% fan speed, very stable @ 1173MHz core and 1563MHz memory. If you watch my video, the temperature rises ever so slowly, 1C at a time, and does so very graciously. The Accelero Xtreme III went from 20C idle to 60C instantly in a split second, and temps kept rising beyond acceptable, so the Accelero is a no-go.

Yeah, if you felt the video RAM under even a light load, you'd realize it: the pain is gone and your finger is now missing... yes, it gets that hot.









Just a light load of 1GB out of 8GB and it's SMOKING HOT... so we can all see that 512-bit GDDR5 smokes on heat, power consumption, and performance.









I even find Fujipoly 17 W/mK could be a prized asset on the video memory modules, as it's very conductive. Panasonic PGS would be ideal, as it's like 100X more thermally conductive... It just requires more compression to out-transfer Fujipoly by vast margins.

Here's the new BIOS... I'm super stumped on Simonfredette's stability issues, but I'll work tirelessly until we find what works, and I'm exceedingly happy to do so. This is a learning lesson for me, one I wish to master.

For the moment, I made Gordesky1 & Simonfredette a new BIOS... cooler, more power efficient, yet with unlimited big-end potential...

I'll do some thinking over Simonfredette's BIOS and find a proper replacement. Simonfredette, did it go to black screen or artifact? Did the driver crash and drop to desktop? It gets confusing which settings go with which BIOS version I made for ya... This V6 is ideal @ 200W TDP... An insta-freeze means the TDP limit is too high, though de-limiting the max TDP helps keep things fast while still cool & quiet.

Tuned BIOSes for Gordesky1 and Simonfredette... dialed back... Should be cool, quiet, efficient, and perform very well.

Gordesky1_Modded_Hawaii_1173_Limited_Big_End_Cool_Quiet_v2.zip 99k .zip file


Simonfredette_r9_390_modded_hawaii_Up_To_1166_Cooler_v7.zip 99k .zip file


Don't Run These (Too Hot... Very Fast Though)

Gordesky1_Modded_Hawaii_1173_Delimited_Big_End_Cool_Quiet_v1.zip 99k .zip file


Simonfredette_r9_390_modded_hawaii_v6.zip 99k .zip file


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> Sweet.. yeah I'd like to hear how it works for ya guys.. I don't know why r9 390 can't do 1563mhz artifact free *like reference can*.. but it's okay at least you can get 1173mhz
> 
> 
> 
> 
> 
> 
> 
> Looks like the memory module nearest to the gpu core diode isn't totally covered by thermal material... gets too hot beyond 1500mhz.
> 
> I suspect V4 BIOS to be a bit loud... and a bit hot? If it's too loud and hot... Here's a quiet and cooler BIOS with the the same 1173mhz peak boostable clock.
> 
> The best way to make a fast and cool bios is to have a nice gap between max power and power limit... The window in this bios is 40 watts from limit to max... Also includes a 70 ampere current reduction which is massive power savings and a lot cooler... This bios has max tdp of 240 watts, power limit 200 watts, and ampere current of 150 amperes. Also fan speed reductions so it's not whining like a jet engine...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The ampere current is based off of 75% of the TDP limit, not the Max TDP... This is a far cooler running way to do it. If you want a little extra performance, and a bit more heat and noise we can base the Ampere Current off of 75% of the Max TDP... Which would be 180A Current from 150A Current... That's 20C difference.
> 
> Simonfredette_r9_390_modded_hawaii_v5.zip 99k .zip file
> 
> 
> 
> 
> looks like the modules under the memory cooling plate also are only contacting half the module... no good.


The reference 290x/290/390/390x got a bad rap due to the cooler, but the quality of the board/chip and the clocks are really good, to be honest. I've pushed my 290x pretty hard with the PT1 BIOS and water cooling. While they wouldn't do as well on the reference cooler, they absolutely shine under water blocks. I'll spend some time this weekend pushing these 390x cards under water and see what they've got.







I bet they will do really decently; they already easily do 1700 on the RAM and 1150 core with hardly any voltage change. So I feel they will do well, and I'm hoping for roughly 1200+/1700+ for benching.


----------



## chris89

@dagget3450 I find everything on the card's are great, the card and the cooler... AMD made two big mistakes, too hot of BIOS settings and too much thermal paste. Also they card is powerful so it needs the power to power the performance. In time AMD will find out how to reduce input/ output power loss by moving the VRM right by the Diode in a straight line trace with almost no resistance. Then maybe we will see the 390X remastered with low power consumption and crazy performance.

You can get very stable performance and amazing temperatures if you use the tiniest amount of thermal paste. A tiny dot in the center, spread by finger, unlike what most others say to do (an X or whatever)... That never guarantees complete edge-to-edge contact over every square millimeter. You also want so little that when it heats up, that small amount expands, just as thermodynamics would predict. Once spread perfectly by finger, that tiny amount makes perfect contact with the copper surface of the heatsink... Then tighten the screws until they stop and tighten just a bit more, like a light slow tapping; it's like torquing the screws a little tighter. Thermodynamics also implies that different materials heat up and expand at different rates, so by torquing you make sure they don't loosen when hot, which is what happens by default... They will loosen, but to the perfect tightness... Even a stress run followed by a re-torque will ensure optimal thermal performance, especially if torqued while hot. Just like the lug nuts on a car...

Just for fun I clocked my reference 390X with the reference 290X cooler and a new HP fan rather than the "Red" one; not sure if it's quieter or louder... These fans move serious CFM if the rear bracket is modified... I just need to get the DVI ports removed to improve CFM by literally 50%, since they block half of the exhaust...

1233MHz (though 1234MHz would yield that magic rounded-up number of 79.0 billion pixels per second). I can probably get away with it; at least it doesn't crash, but the ASIC gets hot... ASIC is that temp that floats around, unknowingly smoking haha... I have mine set to max 88C so zero damage ever occurs no matter how hard I push this card. 1233MHz x 1.175 = 1448.775mv, rounded up to the absolute max of 1449mv... The Rx 300 series cannot boot above 1449mv; 1450mv is an insta-black-screen with no POST... Any water cooling guys might try it...

To keep things doable, I have max TDP @ 266W, TDP limit @ 166W, and TDC @ 133A... That way, if it feels it can do it, it'll hop from 1173MHz to 1233MHz for a second or more here and there, while temps stay well within parameters on air.
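The MHz-to-mV heuristic and the 1449mv boot ceiling described above can be sketched as arithmetic. This is the poster's forum rule of thumb, not an AMD formula; the function name and the hard cap are assumptions taken from the post:

```python
import math

# Claimed boot ceiling from the post: 1450 mV reportedly fails to POST.
BOOT_CEILING_MV = 1449

def heuristic_core_mv(core_mhz: float) -> int:
    """Poster's rule of thumb: core mV ~= core MHz x 1.175, rounded up,
    hard-capped at the claimed 1449 mV boot ceiling."""
    mv = math.ceil(core_mhz * 1.175)
    return min(mv, BOOT_CEILING_MV)

print(heuristic_core_mv(1233))  # 1449 (1448.775 rounded up, right at the cap)
```

At 1233MHz the heuristic lands exactly on the ceiling, which is why the post treats that clock as the practical limit for this card.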


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> @dagget3450 I find everything on the card's are great, the card and the cooler... AMD made two big mistakes, too hot of BIOS settings and too much thermal paste. Also they card is powerful so it needs the power to power the performance. In time AMD will find out how to reduce input/ output power loss by moving the VRM right by the Diode in a straight line trace with almost no resistance. Then maybe we will see the 390X remastered with low power consumption and crazy performance.
> 
> You can get very stable performance and amazing temperatures if you use the tiniest amount of thermal paste. A tiny dot in the center and spread it by finger unlike what most others say too do... An X or whatever... That never guarantee's that you get utter complete contact from edge-to-edge every square millimeter... Also you want so little that when it heats up that small amount expands just like Thermal Dynamics would advocate because it is in fact what happens. Once spread perfectly by finger that tiny amount contacts perfectly the copper contact of the heatsink... Just tight screws until they stop and tighten just a bit more, like a light slow tapping to tighten... It's like torquing the screws a little tighter... Thermal Dynamics also implies that different material heat up and expand at different temperatures, so by torquing you make sure they don't loosen when hot is what happens by default... They will loosen but to the perfect tightness... Even a stress run, then a retorque will ensure optimal thermal performance. Especially if torqued while hot especially... Just like the lug nuts on a car...
> 
> Just for fun I clocked my Reference 390X with Reference 290X Cooler w/ New HP Fan Rather than "Red" Not sure if it's quieter or louder... These fans move serious cfm if the rear bracket is modified... I just need to get the DVI ports removed to improve CFM by literally 50%... Since it blocks 1/2 of the exhaust...
> 
> 1233Mhz (Though 1234MHZ would yield that magic rounded up number of 79.0 Billion Pixel's per second) Could probably get away with well at least doesn't crash but ASIC gets hot... ASIC is that temp that floats around unknowingly smoking haha... I have mine set max 88C so Zero Damage Ever Occurs no matter how hard I push this card. 1233MHZ X 1.175 (17.5%) = 1448.775mv... Rounded up to the absolute max of 1449mv... The Rx 300 series cannot boot above 1449mv... 1450mv is insta-black-screen upon no-post... Any water cooling guy's might try it...
> 
> To keep things DO-ABLE, I have MAX TDP @ 266W & TDP LIMIT @ 166W & TDC @ 133A... That way if it feels it could do it, it'll hop from 1173Mhz to 1233Mhz for a second or more maybe here and there... While temps well within parameters on air.


The main issue I had with the reference cooler was that crossfire was a mess. Even with spacing it was just too much heat in the box. Either way, all my reference cards OC'd well, with the exception of 4 290s I had with Elpida memory; they didn't OC as well. I am not pushing too much vcore just yet as I don't want to melt down my circuit breaker hehe. I do wish I had a 390x with three DP. I have too many issues with AMD's DVI pixel clock locked at 165MHz. I also get a lot of screen tearing on 390x CF, so I wish I had DP to test and see if it's more drivers or related to port type.


----------



## chris89

*@dagget3450*

Nice man. Yeah, I get tearing too; you have to dial in the vsync in many different ways. I find some settings enabled in AMD's control center (or whatever it's called) can conflict with the game's vsync.

Sometimes I set an fps target, which still tears, but you have to have AMD vsync off plus game vsync off for it to work... Also, if the game's vsync causes tearing, disable it in game, turn off the frame rate target, and set always-on vsync in AMD settings... That fixes tearing for me.

Yeah, I wish the DVI ports weren't there at all, totally cutting down on cooling. The funny thing is, before I tore the card down the first time I could do 1,250MHz core benchmarks, but after I can't... I think once you pull the cooler, air pockets develop in the memory pads, so basically the first time you use thermal pads is the best time for benchmarks... as long as they're Fujipoly 17 W/mK or maybe Panasonic PGS, which I might pick up both of to test... Though I really need to have the entire 290X heatsink stripped of paint and plated in copper first, plus a 1/2" thick backplate haha.

This card gets pretty hot, as we all know, when pushing it. I just don't like fan speed above 60%... 50-55 is fine though... So to achieve that, I can do maybe 240 to 250W TDP maximum on 225W/169A and game stable for hours @ 3200x1800. At 3200x1800 this card requires extra testing to stabilize, because 1080p is a breeze... I don't care about 1080p though, because the fps is too high and high clocks are totally pointless... I just want 60fps @ 3200x1800. Right now Fallout 4 is taxing the 390X @ 3200x1800, modded with 4K landscapes, maxed out... It's no easy game to render. I keep asking myself, would an RX 480 8GB do a better job than the 390X @ say 3840x2160 in Fallout 4? I wouldn't mind the low power consumption of the RX 480... But can 32 ROPs really outperform 64 ROPs @ 4K or near 4K? Everything points to no... but maybe I'm wrong?

Anyway, I find the way to keep it all nice and stable, with power as low as possible, maximum speed, and cool enough stable temps, is running 1173MHz and no more... Also, VDDCI @ 959mv helps reduce power @ 1,563MHz RAM by 4.3%... Not much, but that's a couple of well-deserved degrees Celsius less during long gaming sessions, especially @ 3200x1800...

If you want to try it while keeping everything acceptable, try 1173MHz @ 1366mv or 1378mv (1378mv gives no artifact occurrences where 1366mv rarely does)... Setting max TDP to 266 @ 1366mv or 1378mv @ 1173MHz gives a whole bunch of performance, but I can't run that high a TDP without temps rising too much and the fan getting too loud... You might try it? Okay, enough jabbering on my part haha









Send your bios, I'll make you a real nice one for water... Not pushing too far but making a solid stable performer without going too crazy...


----------



## Streetdragon

Quote:


> Originally Posted by *simonfredette*
> 
> Can't do that either; only made it about 7 seconds into the Heaven benchmark before the screen froze, turned off, and reverted to Wattman settings, which are now the default BIOS settings, so it fails again and again until Windows blocks Heaven from running. It's probably like you said: the PSU can't provide stable power to maintain that overclock under 100% load.


It's not your PSU.... every card needs more or less voltage for a given clock. Your card just needs more voltage to get this clock stable.

A BIOS that works on one card likely WON'T run on your card!


----------



## Removed1

Quote:


> Originally Posted by *Streetdragon*
> 
> its not your psu.... every card need more or less volts for a clock. Your card just needs more voltage to get this clock stable.
> 
> A bios _settings_ that works on one card likely WONT run on your card!


Edited.









Anyway, you're right: if you want to push no matter what, you will need at a certain point to raise the VCCD.
Whether your card is then able to sustain and deliver a good voltage is another issue.

To give you an example, my BIOS without any limit stabilizes my GPU with +140mv, where with the stock BIOS I use +275mv for the same 1250MHz clock. I use less offset and get a lower VCCD compared to the stock BIOS.
And even without limits, I spend a lot of time tuning the VCCD to be as low as possible atm, and it is a pain, because when you step up the clock, at a certain point you need to raise it no matter what.
And it could be +10/20mv just for a few MHz at high OC.









The PSU could be an issue too. If I figure my 290 eats 25A on the 12V rail, a basic 450/500W unit could start to fall short if the CPU and GPU are heavily OC'd, especially with a 390.
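The back-of-the-envelope PSU math above can be made explicit. The 25A GPU figure is from the post; the CPU draw and the "everything else" allowance are hypothetical placeholders I'm assuming for illustration:

```python
# Rough 12 V rail headroom check for the scenario above.
# Assumptions (not from the post): ~150 W for an overclocked CPU,
# ~75 W for fans, pumps, drives, and board losses.

def system_12v_watts(gpu_amps: float, cpu_watts: float,
                     other_watts: float = 75.0) -> float:
    """Estimate 12 V rail load: GPU amps x 12 V, plus CPU, plus the rest."""
    return gpu_amps * 12.0 + cpu_watts + other_watts

load = system_12v_watts(gpu_amps=25.0, cpu_watts=150.0)
print(load)         # 525.0 W estimated draw
print(load > 500)   # True: a basic 450/500 W unit would indeed be short
```

A 25A GPU alone is already 300W on the 12V rail, so the conclusion that a 450/500W unit is marginal with an overclocked CPU holds under almost any reasonable CPU assumption.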


----------



## simonfredette

Quote:


> Originally Posted by *chris89*
> 
> By the way stock reference 390X cooler is too small & too hot, However 290X reference Cooler rocks the socks for sure as it's beefier. With the tiniest amount of thermal paste spread precisely across the whole diode Edge-To-Edge and maintaining the integrity of the (Factory Equipped) VRM Fujipoly 17w/m K pads, among the cheap low conductive memory pads. The 390X on Ref. 290X blower cools exceptionally well. Mine runs soo good, so stable, so cool, and fan doesn't need to go too high either. I'm gaming 75C Core, 60C VRM1/2 on a proper BIOS at 50% fan speed very stable @ 1173mhz & 1563mhz memory. If you watch my video, the temperature ever so slowly rises 1C at a time and does so very graciously. The Accelero III Xtreme went from 20C idle, to instantly 60C in a split second and temps kept rising beyond acceptable so the Accelero is a no-go.
> 
> Yeah, if you felt the Video Ram under a light load, you'd then realize that... the pain is gone and your finger is now missing... Yes it gets that hot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just by a nano light load of just 1GB out of 8GB, SMOKING HOT... So we can all now know that 512-bit GDDR5 smokes on heat/ power consumption/ and performance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I even find Fujipoly 17w/m K could certainly be a prized asset to the Video Memory Modules, as it's very conductive. Panasonic PGS would be ideal as it's like 100X more thermal conductive... Just requires more compression to transfer beyond Fujipoly by vast margins.
> 
> Here's new BIOS... I'm super stumped on Simonfredette stability issues, but will work trielessly until we find what works and I'm exceedingly happy to do so. This is a learning lesson for me, one I wish to master.
> 
> For the moment, I made Gordesky1 & Simonfredette a new BIOS... Cooler/ More Power Efficient/ Yet has Unlimited Big-End potential...
> 
> I'll do some thinking over Simonfredette's BIOS, and find a proper replacement. Simonfredette, did it go to black screen or artifact? Did the driver crash and go to desktop? It gets confusing with what settings is for What Version BIOS I made for ya... This V6 is ideal @ 200W tdp... Insta-Freeze is too high of TDP Limit, though de-limiting the Max TDP helps keep things fast while still cool & quiet.
> 
> Tuned BIOS For Gordesky1 and Simonfredette... dialed back... Should be cool quiet efficient and perform very well.
> 
> Gordesky1_Modded_Hawaii_1173_Limited_Big_End_Cool_Quiet_v2.zip 99k .zip file
> 
> 
> Simonfredette_r9_390_modded_hawaii_Up_To_1166_Cooler_v7.zip 99k .zip file
> 
> 
> Don't Run These (Too Hot... Very Fast Though)
> 
> Gordesky1_Modded_Hawaii_1173_Delimited_Big_End_Cool_Quiet_v1.zip 99k .zip file
> 
> 
> Simonfredette_r9_390_modded_hawaii_v6.zip 99k .zip file


This is with the V7 BIOS. Seconds into the benchmark it artifacts, the screen freezes, and then I get the Wattman settings returned. I tried opening AB and pumping the power limit and core volts up to max and running again, but it does the same thing. Could it be because we're not using the same drivers?


----------



## tolis626

Quote:


> Originally Posted by *simonfredette*
> 
> This is with the V7 bios , seconds into the benchmark it artifacts , screen freezes and then I get the wattman settings returned. I tried opening AB and pumping up the power limit and core volts to max and running again but it does the same thing. Coul it be because were not using the same drivers.


Nope, it's simply because your card can't handle THAT frequency at THAT voltage. No matter how much theory you throw at it, the cold hard truth is this: no 2 cards are the same. It's not the BIOS, it's not the drivers. I can't take a set of settings that work for my card, flash them onto yours with a BIOS, and then hope it works, and neither can anyone else. If it works, it's luck. There are people around here that have run their cards at over 1300MHz. Don't try copying their settings.


----------



## simonfredette

Oh yeah, we're just trying to find that sweet, sweet medium where it'll run faster than I had it clocked and stay stable. Maybe it's unattainable, but at least now, with better fan cooling, I have the temps down to 60 even overclocked at 100% load. I can live with that; it used to reach the 90s.


----------



## chris89

*@simonfredette* That makes me happy, though, that you like the BIOS fan parameters. It's running nice and cool for ya? I see you're running your card at light speed, huh?







1,500MHz core clock & 2,000MHz memory... If that were possible... that's 96 BILLION pixels per second haha... along with the full 512GB/s of memory bandwidth...

The top right should say 1166MHz / 1500MHz... To keep it stable, reset Wattman, go in, test, and make sure it's running at the right speeds. Reset everything, then open the GPU-Z render test and monitor while in the test... run it for a few seconds... Make sure to set the clock *monitor* to MAX values via the GPU-Z sensors, along with temps, to make sure it is working at the clocks set.

The real true sweet spot is about 1133MHz with the right TDP; it will do very well, but I like more... I want 75 billion pixels per second, so that's 1,173MHz... The proper TDP sweet spot is also good to find... It's right around the 244 to 254 watt range for max TDP, though running a modest TDC will lower temps on a modest TDP limit... But if max TDP is high, it'll heat up a lot but just run like a beast... haha
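The "billions of pixels per second" figures used throughout this thread are just core clock multiplied by ROP count (64 ROPs on Hawaii/Grenada). A quick sketch reproducing that arithmetic:

```python
# Theoretical pixel fill rate: core clock x ROP count.
# Hawaii/Grenada (290X/390X) has 64 ROPs.

ROPS = 64

def gpixels_per_s(core_mhz: float, rops: int = ROPS) -> float:
    """Theoretical pixel fill rate in Gpixels/s."""
    return core_mhz * 1e6 * rops / 1e9

print(round(gpixels_per_s(1173), 1))  # 75.1, the "75 billion" in the post
print(round(gpixels_per_s(1234), 1))  # 79.0, the "magic rounded-up number"
print(round(gpixels_per_s(1500), 1))  # 96.0, the hypothetical 1,500 MHz case
```

This is a theoretical peak, of course; real frame rates depend on shader, bandwidth, and CPU limits as much as on ROP throughput.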


----------



## simonfredette

Quote:


> Originally Posted by *chris89*
> 
> *@simonfredette* I see your running your card at light speed huh?
> 
> 
> 
> 
> 
> 
> 
> 1,500Mhz Core Clock & 2,000Mhz Memory... If that were possible... That's 96 BILLION PIXEL'S per second haha... Among the full 512GB/s memory...
> 
> The top right should say 1166Mhz / 1500Mhz... To keep it stable... Reset Wattman and go in and test and make sure it's running at the right speeds...
> 
> The real true sweet spot is about 1133Mhz with the right TDP, will do very good but I like more... I want 75 Billion Pixel's per second... So that's 1,173Mhz... The proper TDP Sweet spot is also good to find... It's right around the 244 to 254 watts range for Max TDP, though running a modest TDC will lower temps on a modest TDP Limit... But if MAX TDP is high it'll heat up a lot but just run like a beast... haha


Heaven just doesn't display the proper settings; GPU-Z shows the card is running at 1166/1500.


----------



## chris89

*@simonfredette* Give me a minute... I'm going to make a Heaven run and record it to see... I'm on 17.1.1.









Plus I'm running a 248W Max TDP/ 240W Limit/ 180A TDC... It's way hotter than normal but handles higher average 3200x1800 frame rates.

Make sure to "RESET" to "DEFAULTS" MSI AFTERBURNER and make sure it's closed... If you slid the overclock slider to 10 Trillion Terahertz... It won't work.. haha

Let it go to default bios clocks but not overclocking it yet... haha

http://www.mediafire.com/file/29huvuyb5ux69te/Heaven_v4_390X_1173mhz.mp4

http://www.mediafire.com/file/0aw36l46m7j36ao/GpuZ_Render_Test_390X_Plus_Unigine_Heaven.mp4


----------



## simonfredette

The fps seems a bit low for those couple of scenes; mine will run it at 1140/1550 on Extreme at slightly higher fps, windowed at 1920x1080. My temps during that run never went over 56 though, which is nice, but the best score I can still get is 2676 on Ultra with tessellation off.


----------



## chris89

Nice *@Simonfredette*.

That's some good temps, man... So are you running, say, the v7 BIOS with your custom clock settings? I'll make one with your specified 1140MHz core & 1550MHz memory but with a little voltage overhead, to see if you can clock it higher... Then we can go from there...

This one is @ 1140MHz core & 1550MHz memory @ 959mv VDDCI (i.e. the memory bus voltage; it's 4.3% less power and runs cooler than 1000mv VDDCI). Low VDDCI tends to work best on a very stable 12-volt rail.

It also runs a 240 Watt Maximum TDP, with a very modest 208 Watt TDP Limit, and a nice and cool 166 Ampere TDC Current(ie 75% of the TDP Limit). Fan is a nice quieter slope.

This BIOS gives extra core voltage @ 1140MHz, at 1388mv, which should allow a good 4-5% more clock speed, so test how fast it can go with this BIOS.

Simonfredette_MSI-R9_390_1140_1550_240W_208W_166A_v8.zip 99k .zip file


----------



## simonfredette

No, those are with the original BIOS and my settings; I wasn't sure how it would be affected with my settings. I'll give v8 a shot.


----------



## chris89

I actually wanted to remake those BIOSes for Gordesky1 & Simonfredette, as I found mistakes I made. I often upload before I thoroughly look over the BIOS and work out most kinks, as far as predicting stable voltage and fan speed specs for your cards goes. Since I am not there testing them, I feel these BIOSes I made will give you guys some options. Questions? I'm happy to answer any you may have.

The 1GHz delimited one is very fast and, of all things, very cool; very doable. The 1133MHz & 1173MHz ones may be a stretch for your systems. I dialed in the fan speeds on each BIOS the same, with fan speed matched to temperature; not too much fan like the other BIOS. The fan starts at 1% @ 51C and scales with temperature up to 100% @ 84C... So each BIOS will keep fan noise and speed proportional to the temperature/speed of that BIOS.

Gordesky1MSIR9390_Delimted_Bios_Pack_v1.zip 297k .zip file


SIMONFREDETTE_R9390_Delimited_Bios_Pack_v1.zip 297k .zip file


I'm getting 15fps in Fallout 4 at 3200x1800 Ultra with 4K landscapes, ENB maxed to the hilt, and literally every knob at its highest, running de-limited at 1,173MHz... So at these settings, 75 billion pixels per second is 15fps... If 75 billion means 15fps, 150 billion would be 30... It would be unreal if this card could clock to 2GHz for 128 billion pixels per second. That means like 26 on the low end, wow... An AMD Radeon Excelsior Alpha Omega with 128 ROPs / 256 TMUs @ 2GHz would be flipping unreal... 256 billion pixels per second, and a half TRILLION texels per second... AMD Radeon Excelsior lineup, mightiest of the mighty. When the TITAN appears in sight, Excelsior only half throttles... slowly plants the pedal to the metal and disappears in a cloud of smoke. DROOL








Looks like 320GB/s (1250MHz memory) works just fine at 875mv VDDCI... Too loud and too little voltage, though, for 1204MHz. Maybe AMD or someone else will pull some of the 390X PCBs collecting dust on the shelf, remaster them by removing the DVI ports, rework the BIOS, make the cooler pure copper with triple variable-geometry inlet fans, and re-release them... I'd buy.
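The 320GB/s and 400GB/s figures in this thread track the usual GDDR5 bandwidth formula: memory clock x 4 transfers per clock (quad-pumped) x bus width in bits / 8 bits per byte. A quick sketch, with the 512-bit Hawaii/Grenada bus assumed from the card's spec:

```python
# GDDR5 bandwidth: base memory clock (MHz) x 4 x bus width (bits) / 8,
# giving MB/s, then divided by 1000 for GB/s.

BUS_WIDTH_BITS = 512  # Hawaii/Grenada 290X/390X memory bus

def gddr5_bandwidth_gbs(mem_mhz: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Theoretical GDDR5 bandwidth in GB/s for a given base memory clock."""
    return mem_mhz * 4 * bus_bits / 8 / 1000

print(round(gddr5_bandwidth_gbs(1250)))  # 320, the figure in the post
print(round(gddr5_bandwidth_gbs(1563)))  # 400, the overclocked figure
print(round(gddr5_bandwidth_gbs(1500)))  # 384, the stock 390X spec
```

Note these use decimal GB (1000 MB); the factor of 4 is why GDDR5 clocks are often quoted as "effective" rates four times the base clock.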


----------



## Regnitto

I never actually formally joined the club... guess I should do that now.

GPU-Z validation

max overclock 1150/1750, +69mv, +25mv aux, +50 power limit


----------



## infinite0180

Hey guys, thought I would ask this question here. Somebody local is selling an XFX DD R9 390. How much is a fair offer for it? He wants $175, but for that much I can grab a 4GB RX 480 brand new... Let me know what you guys think is fair!


----------



## chris89

*@infinite0180* If I were you I'd go 8GB, man, as it will last you a lot longer. Plus DX12 titles love 8GB, especially with VSR 4K, bro. However, the XFX DD looks beautiful, and you can't get the RX 480 8GB on sale anymore, since as soon as I notified everyone of the Visiontek RX 480 8GB sale they sold out in an hour.

Though $175 is a great deal; I paid $225 for an HP reference 390X 8GB... If you can get an 8GB 390X, hit us up, bro, or just pick up that XFX 390, whatever the GB rating is, and I'll help speed it up and cool it down a lot.









Sometimes I do wish I had the RX 480 8GB... So if you're thinking it over, pick up an RX 480 8GB and hit us up on the Polaris owners' thread. I own a 4GB RX 460 Unlocked; it's good at 50 watts... The R9 290/290X/390/390X are very powerful though, but only after I mod you up a BIOS. So if you find a good 290 or 290X, hit us up as well. I love modding these cards, man, since afterwards it's like a whole new card.

$216 or offer 8GB Sapphire Nitro 390 without backplate... Throw offer 175?
http://www.ebay.com/itm/Sapphire-AMD-R9-390-Nitro-8GB-Graphics-Card-/172442673717?hash=item2826626235:g:qxcAAOSwt5hYcsng

$220 or offer 8GB Sapphire Nitro 390 with backplate... Throw offer 175?
http://www.ebay.com/itm/Sapphire-AMD-R9-390-Nitro-w-Backplate-8GB-Graphics-Card-/172442673725?hash=item282662623d:g:GuwAAOSw-0xYcsir

$189 + $12 shipping 8GB MSI RX 470 (Still very powerful and capable) You could message the seller? Ask "Can you send me a One-Time Offer $175 Shipped? I'll pay now"
http://www.ebay.com/itm/MSI-Gaming-X-AMD-RX-470-8GB-GPU-GFX-Graphics-Card-PC-/282333822507?hash=item41bc683e2b:g:t2oAAOSw-0xYg6Tr


----------



## infinite0180

@chris89

Thanks man. I'm thinking I will offer the local guy $150 just to see if he will take it for the 390. If not, then I will have to make a decision on whether or not $175 is worth it for me. Thanks for the advice on the 8GB VRAM; I think it's a good idea as well.


----------



## chris89

*@infinite0180* Sure thing brother, good luck









So I was determined to unleash full CFM on my 390X; those DVI ports drove me nuts... Turns out the solder requires major heat to desolder, and I'm over here trying a 450F oven and a hair dryer of all things, which did not work (you need a heat gun)... haha

So I just pulled out the tin snips and went hacking away on my 390X...

This is the result...

Is this a bad Unigine Heaven benchmark Ultra Extreme 1920x1080 no AA?



Again, nano gains at 5GT/s... Also noticing a consistent Unigine frame rate discrepancy of about 10 to sometimes 50fps versus RTSS... Could be a coincidence or not...


Surprisingly, most of the fan noise was from the air restriction; now it's way cooler and quieter at the same fan speed.


*If you can see this pic, I modified the air inlet on the case for the GPU with a 150CFM Scythe Slipstream 120 (clean the bearing dry, then use the lightest-viscosity synthetic engine oil to get more CFM and continuous operation, checking yearly and topping off).
That was at low (5 volts); at high (12 volts) it can pull like 5-10C cooler.*



I gamed it up for a good hour but just chose to stop; it was totally stable for once... and I'm just straight blown away... These cards are only said to work at this clock on water, and water only, or liquid nitrogen...? I reduced the core VRM temp by about 10C by doing something you'd never think to do to them. This is totally fine @ 1395mv core & 959mv VDDCI @ 1,563MHz memory (400GB/s). You can only see the gains by setting MAX/LIMIT/TDC to 57599, and that's what it's at: unlimited power, 1kW dedicated. Plus 66% fan speed now is not unbearable like before; it's just an efficient jet engine running in supercruise.


Two of these at this specific setting would be unreal, legit... 160 billion pixels & 440 billion texels & 640GB/s, while saving some power on the 875mv VDDCI.

1,250MHz core for the full 80 billion pixels per second and 220 billion texels per second, with 875mv VDDCI @ 1,250MHz memory... Reference 290X cooler on the 390X.


----------



## Removed1

Tweaking the BIOS like you do, and knowing your card, allows you to achieve 1200MHz+ on air if you push and mod the stock cooler a bit like you did, and if you manage to keep the overall temperature low enough and give it enough voltage.
It seems AMD's drivers brought more stability and performance; it helped a lot in getting a decent stable OC.

Here's the 290 on water, a couple of hours of BF1, stable clocks. It is loud; I just pushed a 110CFM server fan onto the VRM!








I give +250mv for roughly 1.31v. I get lower fps because of HWiNFO; dunno how to remove this annoying frame drop during monitoring.



I can bench 1300MHz+ on the core and 1700MHz on the RAM with this BIOS.

On extreme water, up to 1350MHz!
On LN2 you could bench at 1450MHz, or somewhere near that.


----------



## bluej511

Quote:


> Originally Posted by *Wimpzilla*
> 
> Tweaking the bios like you do and knowing your card allow you to achieve 1200mh+ on air, if you push and mod a bit the stock cooler like you did.
> If you manage to keep the overall t° low enough and give enough voltage.
> Seems AMD drivers brought more stability and performance, it helped a lot to get decent stable oc.
> 
> Here the 290 on water, couple hours of BF1, stable clocks, it is loud, it just pushed a 110CFM server fan on vrm!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I give +250mv for roughly 1.31v, i get lower fps cause HWinfo, dunno how to remove this annoying drop of frame during monitoring.
> 
> 
> 
> I can bench 1300mhz+ on the core and 1700mhz on the ram with this bios.
> 
> Extreme water up to 1350Mhz!
> On LN2 you could bench at 1450mhz, somewhere near.


Not bad at all.

I love how people on air try to get amazing temps, but it just won't happen. Those aftermarket and stock coolers are pretty much already maxed out no matter what you do. It's true that lowering voltage will reduce TDP and therefore heat, but guess what: I can game for 6hrs straight and my GPU won't see 40°C on the core lol. My VRMs reach about 60°C, and that's because they're passively cooled.


----------



## bluej511

So I just checked my temps: VRM1 is at 48°C and VRM2 is at 58°C, so it's running quite a lot cooler than I thought. Stock clocks, though.


----------



## dopeonarope

Not sure what you guys think; is this a good OC for a 390X?

https://www.techpowerup.com/gpuz/details/7qmuw


----------



## dagget3450

Quote:


> Originally Posted by *dopeonarope*
> 
> not sure what you guys think. is this a good OC for a 390X
> 
> https://www.techpowerup.com/gpuz/details/7qmuw


looks decent.

for fun on quad 390x FS run still tweaking and probably going to roll with modified bios soon for memory straps and power/throttling.
Quote:


> Originally Posted by *dagget3450*
> 
> update
> 
> Dagget3450 - 4x [email protected]/1700 - cpu [email protected] score: 32393
> http://www.3dmark.com/3dm/17537204?


----------



## chris89

*@Wimpzilla* I'm clocking right now totally stable in everything...

@ 1,250Mhz Core @ 1400mv
@ 1,001Mhz Ram @ 1000mv

It appears it needs 1000mV if you don't wanna crash (not a crash per se; more like a random black screen after a day of gaming when below 975mV VDDCI, though 975 might work, I'll try it). Plus it still uses less power at 1001MHz memory, and I can't tell a difference when the core is straight screamin'...









The core is very stable and has not budged one degree past 80°C. I noticed after gaming recently... made a minor fan speed tune... went to look at the core... and I was like a monkey scratching his head... 67°C. What?!

VRM 1 & 2 @ 82°C & 65°C... You can look at my screenshots, but that's after hours and hours of 1250MHz core gameplay on air. So ridiculous, right?

The core VRM did go up to 82°C; the RAM VRM stays cooler at 1001MHz memory, and that's 75 fewer watts.

I can clock the RAM to 1758MHz, but it's pointless on my PCIe 2.0, which even limits me now @ 1250MHz core. We'd need 8GT/s on quad 16GT/s CPUs, or just an infinite direct CPU-to-GPU hyper-pipeline, to remove the northbridge limitation entirely... teasing, but maybe it could be done on one CPU...

Wrong screen, but that's just an example below of the ballpark performance numbers my 390X is pullin'.

I set 64x tessellation, with morphological filtering and high texture filtering, plus 16x anisotropic filtering, on top of Ultra everything with FXAA at 3200x1800.


----------



## tolis626

http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,7.html
http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,6.html

Well, I don't know what's up with this game, but DAMN Hawaii is looking good. The only card faster than the 390x is the Titan X Pascal (And the 1080 at 1080p only, that makes sense). We're talking about being faster than the 1080, 1070, 980ti, Titan X, Fury, Fury X... And no matter what's wrong with the game that makes all these other cards crawl (or makes the 390/390x shine, whatever way you want to take it), the 390x still manages 51fps at 4K. Not only is 50-ish FPS very playable, but with a bit of overclocking we'd probably be looking at 55-60fps. God damn AMD, god damn.

Still, I do believe the game is rather broken. The 390x is fast, sure, but it's not faster than the 1080 or 980ti, let's not be ridiculous.


----------



## dagget3450

Quote:


> Originally Posted by *tolis626*
> 
> http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,7.html
> http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,6.html
> 
> Well, I don't know what's up with this game, but DAMN Hawaii is looking good. The only card faster than the 390x is the Titan X Pascal (And the 1080 at 1080p only, that makes sense). We're talking about being faster than the 1080, 1070, 980ti, Titan X, Fury, Fury X... And no matter what's wrong with the game that makes all these other cards crawl (or makes the 390/390x shine, whatever way you want to take it), the 390x still manages 51fps at 4K. Not only is 50-ish FPS very playable, but with a bit of overclocking we'd probably be looking at 55-60fps. God damn AMD, god damn.
> 
> Still, I do believe the game is rather broken. The 390x is fast, sure, but it's not faster than the 1080 or 980ti, let's not be ridiculous.


I saw this and it made me laugh so hard.... i dunno whats up but its awesome. Does this mean we get to claim 390x is faster than 1080 or 980ti????


----------



## bluej511

The 390X is also faster than the Fury and Fury X at 1080p, which is a bit concerning, haha. I have no interest in it anyway, so I couldn't care less, lol.


----------



## dagget3450

Quote:


> Originally Posted by *bluej511*
> 
> The 390x is also faster then fury and fury x at 1080p which is a bit concerning haha. I have no interest in it anyways so could care less lol.


You should care!!! 390x is second only to titanX pascal!!!!!!! heheheeh


----------



## Removed1

Driver issues, IMO. It's not possible for a 1080 to have this frame rate; just look at the raw horsepower calculation it has at 2GHz. It should cope with the game easily!









Vega is a hidden update of our Hawaii/Grenada chip; that's why the drivers are so good and the card works so well!


----------



## chris89

Quote:


> Quote:
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tolis626*
> 
> http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,7.html
> http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,6.html
> 
> 
> 
> Well, I don't know what's up with this game, but DAMN Hawaii is looking good. The only card faster than the 390x is the Titan X Pascal (And the 1080 at 1080p only, that makes sense). We're talking about being faster than the 1080, 1070, 980ti, Titan X, Fury, Fury X... And no matter what's wrong with the game that makes all these other cards crawl (or makes the 390/390x shine, whatever way you want to take it), the 390x still manages 51fps at 4K. Not only is 50-ish FPS very playable, but with a bit of overclocking we'd probably be looking at 55-60fps. God damn AMD, god damn.
> 
> Still, I do believe the game is rather broken. The 390x is fast, sure, but it's not faster than the 1080 or 980ti, let's not be ridiculous.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *dagget3450*
> 
> I saw this and it made me laugh so hard.... i dunno whats up but its awesome. Does this mean we get to claim 390x is faster than 1080 or 980ti????
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluej511*
> 
> The 390x is also faster then fury and fury x at 1080p which is a bit concerning haha. I have no interest in it anyways so could care less lol.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *dagget3450*
> 
> You should care!!! 390x is second only to titanX pascal!!!!!!! heheheeh
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wimpzilla*
> 
> Driver issues imo, it is not possible for a 1080 to had this frame rate, just looking for the raw power horse calculation it had with 2kMhz. It should cope the game easy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Vega is an hidden update of our hawaii/grenada chip, that's why the drivers are so good and the card work so well! .
> 
> 
> 
> 
> 
> 
> 
> 

How do I compact the quotes to spoilers?

This is amazing, guys. My 390X is running nowhere near as fast as it could; I'm stuck on this dreaded PCIe 2.0, which limits it a lot. Clocking back to 1133MHz, de-limited but still limited by PCIe 2.0... I'm hogging the entire PCIe 2.0 bus like nothing; for a split second the 390X consumes the whole bus. I could only imagine a dual-Xeon PCIe 3.0 setup on this GPU, de-limited at 1250MHz.

I'm sure of it: the 390X is far ahead of its time. The technology, theory, and science are all on point, yet nothing today can truly show what it can actually do. We need more than an 8GT/s bus, and a CPU that can keep the PCIe bus fed so the CPU and GPU run 1:1 across the bus with minimal limitation. The CPU feeds everything, so the weak points are all over, primarily the northbridge; we're going to need to speed up the northbridge by vast margins to ever witness the true potential of these GPUs.

Just as a test to see if your CPU is keeping up with the 390X's max potential, run the AIDA64 Memory & Cache benchmark. If your CPU is pulling less than 450GB/s (about half a terabyte per second) across the cache levels, we still can't actually fuel the 390X; each of the Copy, Write, and Read values has to exceed the memory bandwidth of a single card. For CrossFire, the same but x2, x3, or x4: to feed 4x 390X at 450GB/s each, you would need Read/Copy/Write all over 1.8 terabytes per second. So it's likely we are all limited and aren't actually seeing what this GPU is truly all about.

My Xeons together do 1.1 terabytes per second combined (meaning I could feed 3 cards with a full 15GT/s total, with 10.6GT/s left over out of 25.6GT/s combined), yet the PCIe bus is just 5GT/s, leaving 20.6GT/s unused. So I'm performing 60% below its potential even with this kind of epic QPI bandwidth...

What that means for this card, the 390X: if I'm clocked for 450GB/s on PCIe 2.0, it's really only doing something like 280GB/s. That's why I notice no change in FPS with RAM; leaving it at 256GB/s @ 1001MHz (1001 so it shows 1000MHz in HWiNFO instead of 999MHz) is right at the bus limit.
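For reference, per-direction x16 link bandwidth by PCIe generation can be estimated from the transfer rate and line encoding. The formula below is standard PCIe arithmetic, not a measurement of anyone's system here:

```python
def pcie_x16_gb_s(gt_per_s, enc_payload, enc_total, lanes=16):
    """Per-direction PCIe bandwidth in GB/s for a x16 link.

    gt_per_s: transfer rate per lane; enc_payload/enc_total: line
    encoding efficiency (8b/10b for gen1/2, 128b/130b for gen3).
    """
    return gt_per_s * (enc_payload / enc_total) * lanes / 8

print(pcie_x16_gb_s(5.0, 8, 10))               # gen2: 8.0 GB/s
print(round(pcie_x16_gb_s(8.0, 128, 130), 2))  # gen3: ~15.75 GB/s
```

Note that even gen3's ~15.75 GB/s is far below the card's local 450GB/s VRAM bandwidth; resources resident in VRAM don't cross the host link every frame, so the two numbers aren't directly comparable.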
**By the way:** you're probably wondering how to get 1250MHz stable, right? It can't be done on 1000mV VDDCI; it has to be less than 1000mV VDDCI or it's not stable. I find 900mV VDDCI @ 1001MHz memory works amazingly, and trust me, I'm scratching my head like a monkey over the temps. Plus you need 1433mV to run stably @ 1250MHz core. But it doesn't matter for me: 1133MHz seems like the same FPS and is now totally silent at 25% fan speed, 75°C. So totally silent gaming on this card, de-limited, is still epic.

Also, the flat bars between the PCI slot openings in the case cut down on airflow by about a third, more if you don't remove the DVI ports. Future cases shouldn't use flat bars across the slots, but rather round ones, so the air flows over them with almost no CFM loss. The fan speed is now about 1% per 1°C at max overclock; at lower clocks it's more like 30°C at 1-5% fan speed. So you get more headroom with the exhaust port fully open, and tiny bumps in fan speed make much bigger changes, rather than stock behavior, where the fan keeps climbing and the temps keep climbing while you pull your hair out at this obvious flaw in the thermal design. Thermal engineering should never, ever be rushed; it takes days, weeks, months of continuous thought for a pinnacle outcome. Give it too little time, as if it needed no thinking over, and the end product gets ridiculed by all, as this one did.




*390X*
*2816:176:64 ---- 1.25GHz is 80.00 GPixel/s & 220 GTexel/s & a 512-bit bus up to 450.00GB/s*... If you had the PCIe bandwidth and CPU QPI power, you could witness the speed...








*GTX 1080*
*2560:160:64 ---- 2GHz is 128 GPixel/s & 320 GTexel/s*... Fast, but in some scenarios both the 1080 and Titan X are limited by the memory bus width...
*Titan X Pascal*
*3584:224:96 ---- 2GHz is 192 GPixel/s & 448 GTexel/s*... So the Titan X Pascal could probably pull nearly half a trillion texels per second.

That's why HBM is highly desired. AMD is gonna need something really special to even compare to the next Titan, since the next Titan is going to dwarf the Pascal Titan X, believe you me. I hope they are working passionately on the matter.
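For what it's worth, the fillrate and bandwidth figures being traded here fall out of simple back-of-the-envelope math: ROPs times core clock for pixels, TMUs times core clock for texels, and bus width times the quad-pumped GDDR5 data rate for bandwidth. A quick sketch using the 390X numbers quoted above:

```python
def gpixels(rops, core_ghz):
    """Peak pixel fillrate in GPixel/s (ROPs x core clock)."""
    return rops * core_ghz

def gtexels(tmus, core_ghz):
    """Peak texture fillrate in GTexel/s (TMUs x core clock)."""
    return tmus * core_ghz

def gddr5_gbs(bus_bits, mem_ghz):
    """Memory bandwidth in GB/s: bus bytes x quad-pumped GDDR5 clock."""
    return bus_bits / 8 * mem_ghz * 4

# 390X (2816:176:64) at the clocks discussed in this thread
print(gpixels(64, 1.25))             # 80.0 GPixel/s
print(gtexels(176, 1.25))            # 220.0 GTexel/s
print(round(gddr5_gbs(512, 1.758)))  # ~450 GB/s at 1758 MHz memory
print(round(gddr5_gbs(512, 0.782)))  # ~200 GB/s at the 782 MHz power-saver clock
```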


----------



## Removed1

At 1300/1700MHz my 290 achieves 83.2 GPixel/s, 208 GTexel/s, and 435GB/s of bus bandwidth.








Not even near a 1070 on pixel fillrate.
Anyway, the game seems well optimized; even low-end cards achieve 50fps at 1080p, so drivers are likely what's behind the gained/lost FPS.


----------



## DrFunkalupicus

Hawaii.zip 99k .zip file
Chris89, very sorry about that PM I shot ya... my searching skills suck. Anywho, I have attached my BIOS. If you could work some magic, that would be great.


----------



## dagget3450

Can someone with a 390x/390 test and see if they can disable tessellation for Firestrike. I cannot seem to disable it with crimson relive drivers.


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> Can someone with a 390x/390 test and see if they can disable tessellation for Firestrike. I cannot seem to disable it with crimson relive drivers.


Go to the Gaming tab, then Global Settings; you'll see it bottom right, in the "Override" settings.


----------



## dagget3450

Quote:


> Originally Posted by *battleaxe*
> 
> Go to Gaming tab, then global settings, you'll see it bottom right. "Over-ride" settings.


I tried that; the thing is, I don't think it's working. 3DMark isn't changing scores or telling me tessellation was modified. That's why I was hoping someone else could test.


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> I tried that, the thing is i dont think its working. 3dmark isnt changing scores or telling me tess modified. Thats why i was hoping someone else could test.


Ah... sorry to be redundant. Wish I could help, but I'm at work and can't run it.


----------



## chris89

*@dagget3450* Well, it appears the days of AMD struggling with tessellation are done and gone, because the 390X on 64x tessellation is like, "64x tessellation? No problemo, brother"... haha







It eats through 64x tessellation like it was disabled. No difference; I've tested it off vs. 64x with minimal difference. Shows how epically powerful the 390X core is...

If you wanna get more points, download RadeonPro, turn all settings to Max Performance, set LOD to +100, and experiment with the Flip Queue Size and whatnot...

Ruby says to get it... So yeah she's the boss haha











http://www.radeonpro.info/Download/

*@DrFunkalupicus*

De-limited BIOS, very powerful running. Just as you wanted: under-volted and de-clocked RAM to save 100 watts at the top end. Also de-limited, meaning no power limit, so 1133MHz and 1173MHz would be proportionally more powerful; in other words, 1173MHz will use almost exactly 3.5% more power than 1133MHz. Not much. This runs nice and cool; I run 1173MHz almost silent on the reference 290X blower on my 390X.

MSI_R9_390_8GB_DrFunkalupicus_BIOS_Pack_v1.zip 396k .zip file


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> *@dagget3450* Well it appears the the days of AMD Struggling with tessellation have done and gone, because the 390X on 64X Tessellation is like "64X Tessellation? No Problemo Brother".. haha
> 
> 
> 
> 
> 
> 
> 
> 
> It eats through 64X Tessellation like it was disabled... No difference... I've tested it off or 64X ... minimal difference... Shows how epicly powerful the 390X core is...
> 
> If you wanna get more points... download Radeon Pro and turn all settings to Max Performance and Set LOD to +100 among test ahead Flip Queue Size and what not...
> 
> Ruby says to get it... So yeah she's the boss haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.radeonpro.info/Download/
> 
> *@DrFunkalupicus*
> 
> De-Limited BIOS, very powerful running. Just as you wanted under-volted and de-clocked ram to save 100 watts on the big end. Also de-limited, meaning no power limit so 1133Mhz and 1173Mhz would be proportionally more powerful. In other words 1173Mhz will use exactly 3.5% more power than 1133Mhz... Not much... This runs nice and cool, I run 1173Mhz almost silent on reference 290X blower on my 390X.
> 
> MSI_R9_390_8GB_DrFunkalupicus_BIOS_Pack_v1.zip 396k .zip file


Sadly, tessellation still hits the 390X hard. I want to disable tess in the 3DMark benches for the Top 30 threads; it will provide a decent bump and is allowed even on HWBot / Top 30 OCN.

4x [email protected]/1700 with tess


same as above with no tess


----------



## chris89

Further testing shows these work a lot better. Plus, the 390X much prefers its VDDCI to stay close to home; it gets homesick the further you stray from stock VDDCI, at 1250MHz that is. When running up to 1133MHz on the 65288 table, it can do 875mV @ 1001MHz or even 1250MHz RAM. There are huge increases in power consumption from 1001MHz memory up to 1758MHz memory; you'd need more PCIe bus to utilize that kind of memory bandwidth. 450GB/s memory @ 1000mV passes, but with errors... we'd need a PCIe 5.0 bus.

On a side note, as far as TDP goes, I'm on 57599 @ 1250MHz de-limited and it's real fast.

If you get shutdowns on 57599, 1250MHz would throttle a little but be more doable @ 320 watts max TDP, a 280 watt limit, and 210 amps. This is power input, so we lose 30% into thin air; the GPU core itself is only using 256 watts.





Starting out, my boost clock table begins at 64 billion pixels per second, or 1000MHz. Then it has a range of a full 16 billion pixels to clock through, so there's always a higher clock on standby to eat through the harshest 3D rendering.

*@gordesky1* Here, I made more 1250MHz BIOSes for ya. I found 1410mV driver-crashes after 2 1/2 hours, so it's a trial-and-error testing game at this clock; every 1mV above 1410 could add 30 minutes, or an hour or two, of stability. So I made a pack with 1411mV, 1412mV, 1413mV, 1414mV, 1415mV, and 1416mV. Test the first 1410mV BIOS I sent; mine crashed after 2 1/2 hours (at only 75°C core, and 75°C core VRM, too), so I'm testing 1 millivolt at a time until it doesn't do it anymore.

Seems that stability is somewhere between 1415mV and maybe 1425mV or more, but it's not possible beyond 1425mV without a waterblock, I'm sure, or a masterpiece of an all-copper forged heatsink. Give or take, because when you're up in this range the 88°C ASIC limit is very near, so it's a very tight window... unless you're happy to let the ASIC and max temp eat well past 100 degrees Celsius?

Gordesky1_MSI_390_1250Mhz_Collection.zip 693k .zip file

No updates? haha 390x always on my mind











*@dagget3450* Yeah, for sure, but when you're not monitoring FPS and are just enjoying the visual beauty and the gameplay itself, tessellation is handled with ease on the 390X, right? Setting 64x on many titles runs without any difference that I can tell; it still runs fine, and tessellation looks unreal.


----------



## chris89

Double-post my bad hahaha









I've been further testing what the 390X can do, as I would hope all you guys are as well. No point in sitting around vegging out, bros.









I have been working out the VDDCI (memory voltage) to core voltage ratio, which is important now that I know the value beyond which things cannot be pushed.

I'm just relaying info I have collected for all of you who are running this Hawaii GPU.

Firstly, I find that no more than 1388 millivolts on the core, with an 875 millivolt VDDCI, is stable without the "black screen". The "black screen" issue is all about memory voltage, and/or the heat from memory voltage.

That means we cannot clock beyond a 500, max 513, millivolt offset.

Meaning mathematically, we now know how low the VDDCI can properly operate at any given core voltage...

Meaning if you wish to volt the core out to 1,400 millivolts, then the VDDCI cannot, under any circumstances, reliably operate below a 900 millivolt minimum.

This 500 millivolt max offset only comes into play once the gap between core voltage and VDDCI exceeds that 500 millivolt window.

I tested 1388mV core with 875mV VDDCI at a 782MHz memory clock (200GB/s), and it was totally fine at 1,234MHz core (79 GPixel/s) until a driver crash, which is useful for telling the "black screen" issue apart from a driver crash.

Once I raised the core to 1400mV, still with 875mV VDDCI at a 782MHz memory clock, the driver crash became the "black screen" issue...

Raising VDDCI to 1000mV while still at a 782MHz memory clock, we still save 100 watts, versus over 120 watts saved at 875mV VDDCI...

There is no need to run the RAM clock out unless you want high benchmark scores or an enormous power bill.

Running the core above 1,200MHz, in the 77 to 80 gigapixel range, we find that 200GB/s (782MHz) of video memory bandwidth is plenty...

That is all. Hope to hear from you all.
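The offset rule above reduces to a one-liner. Here's a hedged sketch; the 500mV window and the 1388/875 and 1400/900 pairs come straight from this post, and nothing else is measured:

```python
# Rule of thumb from the post above: black screens appear once the gap
# between core voltage and VDDCI exceeds roughly 500 mV (513 mV was the
# observed hard edge).
MAX_OFFSET_MV = 500

def min_stable_vddci(core_mv):
    """Lowest VDDCI (mV) this rule allows for a given core voltage (mV)."""
    return core_mv - MAX_OFFSET_MV

print(min_stable_vddci(1400))  # 900, matching the post's 900 mV minimum
print(1388 - 875)              # 513, the borderline offset that still ran
```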


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> Double-post my bad hahaha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been further testing what the 390x can do as I would hope all you guy's are as well... No point in sitting around vegging out bro's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have been working over the VDDCI (Memory Voltage) to Core Voltage ratio, which is now important as I now know what the value is at which what cannot be done.
> 
> I'm only kind of relaying info I have collected to knowledgabilize you all who are running this Hawaii GPU.
> 
> Firstly I find at no more than 1388 millivolts on the core, to an 875 millivolt VDDCI is stable without "Black Screen". The "Black Screen" issue is all Memory Voltage and or heat of Memory Voltage.
> 
> That means that we cannot clock beyond a 500 to max 513 millivolt offset.
> 
> Meaning mathematically now we know how low the VDDCI can operate properly at any given Core Clock...
> 
> Meaning if you wish to volt the core out to 1,400 millivolts, then the VDDCI value cannot under any circumstance reliably operate below 900 millivolts minimum.
> 
> This 500 millivolt max offset, correlates to only once the core clock to vddci comes out of that 500 millivolt window.
> 
> Since I tested 1388mv core to 875mv vddci on 782Mhz memory clock (200GB/s), and it was totally fine at 1,234Mhz core (79GPixel/s). Until driver crash, which is good to know the difference between "Black Screen" issue and a driver crash.
> 
> Once I raised core to 1400mv to 875mv vddci on 782Mhz memory clock, the driver crash now became the "Black Screen" issue...
> 
> Raise VDDCI to 1000mv while still at 782Mhz memory clock we still save 100 watts, over 120 watts at 875mv vddci...
> 
> There is no need to run the ram clock out unless you want high benchmark scores or an enormous power bill.
> 
> Running the core at above 1,200Mhz, in the 77 to 80 Giga Pixel area, we find that 200GB/s (782Mhz) video memory is plenty...
> 
> That is all. Hope to hear from you all.


I suspect the reduction of VRAM speed would have negative effects at higher resolutions. You're testing at 1080p?


----------



## milkbreak

Is there any way to permanently remove a driver version from Windows 10's driver cache or something like that? If I uninstall a newer AMD driver to run DDU and install newer drivers, Windows 10 will install some old crap driver (outside of windows update so it's not a windows update block that I need) in the intermediary and I want that to stop.


----------



## bluej511

Quote:


> Originally Posted by *milkbreak*
> 
> Is there any way to permanently remove a driver version from Windows 10's driver cache or something like that? If I uninstall a newer AMD driver to run DDU and install newer drivers, Windows 10 will install some old crap driver (outside of windows update so it's not a windows update block that I need) in the intermediary and I want that to stop.


That's one of the reasons I have Windows driver updates set to off, and I unplug my Ethernet while doing GPU driver updates. Windows will still have some in reserve, but I've never had it install one in between; it usually just stays on Microsoft Basic Display Adapter or something.


----------



## milkbreak

Quote:


> Originally Posted by *bluej511*
> 
> Thats one of the reasons i have the windows driver update set to off and i unplug my ethernet while doing gpu driver updates. Windows will still have some on reserve but ive never had it install one in between, usually just stays microsoft basic display adapter or something.


Unfortunately, this does not work for me. It will install despite me not having any connection, which is why I believe it has nothing to do with Windows Update. Very annoying.
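For what it's worth, Windows keeps staged driver packages in its driver store, and those can be removed with the built-in pnputil tool so there's nothing old left to fall back on. A sketch, from an elevated Command Prompt ("oem42.inf" is just a placeholder; you have to read the real oemNN.inf name out of the enumeration first):

```shell
rem Enumerate all third-party driver packages staged in the driver store
pnputil -e

rem Force-delete a staged package by its published name ("oem42.inf" is
rem a placeholder; use the oemNN.inf the enumeration above lists for the
rem unwanted AMD display driver)
pnputil -f -d oem42.inf
```

Obviously only delete the display driver packages you recognize as the stale ones; the enumeration output shows the provider and driver date for each entry.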


----------



## THUMPer1

I may have missed this, but does anyone else have system crashes with the new ReLive drivers on their 390/390X? With every version of the new drivers, after a while or just randomly, my screen will go black and the system hangs. It's annoying the **** out of me.


----------



## christoph

Quote:


> Originally Posted by *THUMPer1*
> 
> I may have missed this. But does anyone else have system crashes with the new Relive drivers on their 390/390x? Every version of the new drivers after a while or just randomly My screen will go black and system hangs. Its annoying the **** out of me.


Me, but I don't think it's the ReLive version. I've been having black screens: first it goes black, then in a few seconds the system hangs. I think it has to do with the HDMI link speed, or at least for me I think it's the link speed...


----------



## Tuxxey

Quote:


> Originally Posted by *THUMPer1*
> 
> I may have missed this. But does anyone else have system crashes with the new Relive drivers on their 390/390x? Every version of the new drivers after a while or just randomly My screen will go black and system hangs. Its annoying the **** out of me.


I think I'm having the same problem: black screen and the computer is frozen, resulting in having to restart the system?


----------



## battleaxe

Quote:


> Originally Posted by *THUMPer1*
> 
> I may have missed this. But does anyone else have system crashes with the new Relive drivers on their 390/390x? Every version of the new drivers after a while or just randomly My screen will go black and system hangs. Its annoying the **** out of me.


Yes. I'm having this on Win7. Seems fine on Win10 though.

Exact same PC. Dual boot.


----------



## THUMPer1

Quote:


> Originally Posted by *Tuxxey*
> 
> Think I'm having the same problem, black screen and the computer is frozen? Resulting in having to restart system?


Yep


----------



## Tuxxey

Quote:


> Originally Posted by *THUMPer1*
> 
> I may have missed this. But does anyone else have system crashes with the new Relive drivers on their 390/390x? Every version of the new drivers after a while or just randomly My screen will go black and system hangs. Its annoying the **** out of me.


Quote:


> Originally Posted by *battleaxe*
> 
> Yes. I'm having this on Win7. Seems fine on Win10 though.
> 
> Exact same PC. Dual boot.


Quote:


> Originally Posted by *THUMPer1*
> 
> Yep


When does the crash happen for both of you? I mainly experience it when playing The Division, and had it with BF1 last night. My GPU and CPU temps are high, but my MSI R9 390 has always run hot, and I don't believe it's heat causing the system to crash.


----------



## THUMPer1

Quote:


> Originally Posted by *Tuxxey*
> 
> When does the crash happen for both of you? I mainly experience it when playing The Division, had it with BF1 last night. My GPU and CPU temps are high, but my MSI R9 390 has always ran hot and I don't believe it's heat causing the system to crash


I'll give some more info since I have some time.

My crashes typically happen in BF1 and Fallout 4; I mainly play BF1, though. I have a 390X and was running a stable overclock of 1150/1625 on the old 16.x.x drivers. If I run the memory OCed I will get crashes, but if I run it at the default 1500, I don't seem to get crashes. I am using 17.1.1; I have not yet tried 17.1.2 with stock memory clocks, but I may do that.

My crash behavior is: the screen goes black, the sound loops, and I am forced to hard reboot.

I think it has something to do with WattMan, even though I have never touched WattMan or enabled it.

It was random in BF1; it could be an hour of playing, or it could be two. Typically it would crash right when my K/D was pushing 3.0, haha.


----------



## Streetdragon

So you get the crashes with OC? Try gaming/testing without OC, or only OC the GPU and not the memory.

Sounds a bit like a memory problem / too little voltage on the memory.


----------



## chris89

Here I'm comparing 1366mV and 1377mV @ 1204MHz core / 1367MHz RAM @ 968mV VDDCI. We get a decent score increase from 10 millivolts, but still not enough; need 1388 to 1400mV. Notice my $15 650W PSU's 12-volt rail; it'll be fine, but I don't plan on pushing it like this all the time. Running the core at 1173MHz @ 75 GPixel/s is more realistic power-wise.

Running on the reference 290X blower. Many, most, all have said these are bad coolers, which is incorrect. What's bad about them is the installer's lack of attention to detail during installation, not the cooler...


----------



## Rexer

I knew it was a matter of time before I'd see this: a modified reference cover. Somebody's going to design an aerodynamic fan shroud and heatsink with tuned port(s) soon. Box ends and jail-bar exhaust ports add nothing but drag and pockets of vacuum. Not that a reference cover doesn't cool the card, but it could be way more efficient.
I have a friend whose NVIDIA GTX 970 SSC was overheating at stock settings; crashing was pretty regular under load. So we fashioned scoops over the fans (to catch the air from the case fans). We used a hollow rubber ball, cut it into quarters, and taped a quarter over each fan on the fan shroud. We added an air dam on the front end (facing the two large case fans on the front of the computer) so hot air can escape out the side and front more easily and get caught in the draft going to the rear of the computer. Then we turned up his front case fans 25%, and his normal temps dropped from 47°C to 35°C (stock settings). His card stopped crashing under load, going from 76°C+ down to 65°C. This was just doctoring the air and draft.


----------



## chris89

*@Rexer* Sounds like pure ingenuity to me, haha. That's great to hear, and I'm glad it made a big difference.

It's fun manipulating computer fan/heatsink/airflow to increase velocity, since velocity is far more ideal than volume 99.9% of the time. For instance, diesel engines have low-velocity, high-volume air intake systems; high volume is very inefficient and can't achieve what high velocity can. Even a diesel semi or a boat or anything really could massively benefit from variable-geometry intake and exhaust. That's neat; glad to hear it, man.

Basically, on a boat or semi or really anything, this would give way more down-low grunt. Using variable butterflies on, say, a 6-cylinder engine, you'd use 12 runners (6 short narrow tubes, 6 long wider tubes). Use the long wide tubes up until peak volumetric efficiency is achieved in the plenum, then bang in the short narrow runners to give a nice arc of extreme performance up top. Done right, it's like two different engines behaving completely differently under the same hood.

GPU-wise, we would restrict the fan inlet, say 33%, up until too much turbulence/noise is heard by ear in the "shroud / GPU air intake plenum", then open to 100%. This 33%-to-0% restriction transition at a certain temperature is like the transition from load to idle (the temp drops with a bang). It would be a performance boost with a bang at that transition as well, giving the GPU dynamic thermal and performance characteristics, rather than the static ones we have always seen.









They just need new fin designs for the standard blower fans we have always seen. You know how a blower fan, say on the 290X, has slightly curved fins to scoop the air (increasing volumetric efficiency by the difference, in degrees, between curved and flat fins)? Instead, the exterior would look more like an oval shape; the fins wouldn't appear vertical from the outside, but circular, with much larger air scoops on the fins (i.e., fins with greater "scoop" angles than standard). That way it might be able to draw from the side rather than the front. This is hard to explain in words; I'd have to show a picture, or design and test it myself, which isn't a financially available option, haha.


----------



## Rexer

Yeah. In all seriousness, there seems to be no real efficiency in PC case airflow; it's terrible. What PC companies should look into is air management. They should take a hint from race cars; what a science that is.
Watching films of the old GTU/GTP series race cars running around, you can see how they used airflow in a variety of ways. The shapes those sports cars have use the wind to their best advantage: streamlined designs with air scoops, wings, spoilers and cowcatchers. Slipstream curves reduce drag, spoilers add downforce, dams add stability, and air scoops feed turbo air demand and the brakes. It also makes them very fuel efficient. Yeah, I know, the GPU card is no car, but their dynamic airflow sciences can be applied to almost anything that uses air. This is where I think AMD and Nvidia should get the idea that the reference card should be changed; its air management is very inefficient. Lol, it looks like a squirrel cage mounted in a cereal box.
Years ago (1992) in Santa Ana, CA, I had a chance to see Dan Gurney's speed shop, which looked like a barbed-wire compound. They had a wind tunnel they wouldn't let anyone in to see, but the cars designed in it were in the shop area and were nothing short of astonishing. No big V-8s like Ford made; Eagle's cars were 4-cylinder turbo jobbies.
The car bodies employed various types of air inlets, scoops and outlets that only work at high speeds. The brakes especially had both scoops and exit ducts just for cooling. The more I thought about it, the more I thought the computer and electronics people are way behind; we've just gotten used to radiators and water jackets.
When I saw the picture of your graphics card, I thought, someone's got the right clue: reference cooling designs are bogus and need to be changed.
Kudos to you, Chris!


----------



## Pointy

What do GPU memory errors mean in HWiNFO?

In regular usage I get quite a few of these.

When I put the power limit at -50% I get hundreds of them per minute.


----------



## Rexer

Was black screening even on low clocks (1116 MHz @ 28 mV). Rolled my drivers back to a known good from a few months ago (16.5), which put a cure on it. For some reason, every time AMD puts out a shared GPU driver (like RX 480 and R9 390X) I have issues. I'm not sure everyone has them.
The worst crashing I had came from not removing the old GPU drivers entirely. Windows 'Programs and Features' didn't completely remove the driver and left unseen fragments. I had to reinstall the GPU driver so the uninstall program could see it again, then used a 3rd-party uninstaller (Revo Uninstaller) to get the registry paths. It's free to use from Bleeping Computer.
Had power issues that caused black screens too. Replaced the PSU, and another time my APC/UPS unit was failing.
When I see uniform snow before a crash, I check my temps/fan. Especially in Battlefield games: 3 & 4 are large maps and a lot is going on. With smaller FPS maps like CoD Ghosts or CoD Advanced Warfare, not so much.
After crashing a half dozen or so times, for whatever reason, I'll run repair or uninstall and reinstall games, and run sfc /scannow to make sure there aren't any errors in Windows. Crashing seems to have a way of corrupting everything. After a while, I see temps rise if my game files are corrupt.
Seems like I'm constantly tuning various game and GPU updates for that sweet spot. But once I'm there, I keep a grip on it for as long as it lasts. I've played consistently well from 1116 to 1175. Once I'm in that sweet spot, I'm it. I'm a jr. deity and turn the turbochargers on... and git some.


----------



## componentgirl90

I have an XFX 390X DD which is about a year and a half old (Sep 2015 purchase).

It worked fine until two months ago, when the fan would occasionally accelerate to maximum at idle. Blowing the dust off and reseating the card seemed to improve things. The card has seen heavy use since purchase but is still under warranty.

I returned the card to the retailer (an RMA) but they said they found nothing wrong and suggested it was a software issue. However, their test consisted of running the card for a day watching YouTube. They did not put the card under any form of heavy load, which I feel is not satisfactory testing. I had previously tested some older drivers, but I guess I should try a new set of drivers and try it in another computer, etc.

Does anyone have any suggestions as to what might be wrong with this card?


----------



## AverdanOriginal

Quote:


> Originally Posted by *componentgirl90*
> 
> I have an XFX 390x DD which is about a year and a half old (Sep 2015 purchase).
> 
> It worked fine until two months ago when the fan would occasionally accelerate to maximum when idle. I felt that by blowing dust off of it and reseating the card that this improved things. This card has been used a lot since purchase but is still under warranty.
> 
> I returned the card to the retailer (an "RMA") but they said that they found nothing wrong and they suggested it was a software issue. However they said that their test consisted of running the card for a day watching You Tube. They did not put the card under any form of heavy load which I feel is not satisfactory testing. I had previously tested some older drivers but I guess I should try a new set of drivers and try it on a new computer etc.
> 
> Does anyone have any suggestions as to what might be wrong with this card?


Hi,

I had the same issue on and off. It turned out to be MSI Afterburner on my end. When I used Afterburner and changed the overclock more than 3 or 4 times without restarting the computer, the fan's "auto" mode would turn off and hold the last used speed, which was of course high (70% or so during benching), and it wouldn't drop back at idle. So I simply needed to turn auto on again.
So if you have MSI Afterburner and see max fan speed again, check whether the auto fan turned itself off.
Just a suggestion.


----------



## componentgirl90

Thanks for your reply and for replying quickly.

I do have MSI Afterburner, but auto fan is not off. When this fan over-speed occurs (I actually think it may be going above 100%), you are unable to control the fan. You cannot override it in MSI Afterburner; it is unresponsive.

I am also thinking it might be better to RMA it to XFX instead of the retailer, as they probably deal with many of these cards.


----------



## mus1mus

Quote:


> Originally Posted by *componentgirl90*
> 
> Thanks for your reply and for replying quickly.
> 
> I do have MSI Afterburner but the auto fan is not off. When this over-speed of the fan (I actually think it is going greater than 100% possibly) occurs, you are unable to control the fan. You cannot override it in MSI afterburner, it is unresponsive.
> 
> I am also thinking that it might be better to RMA it to XFX instead of the retailer as they probably deal with so many of these cards.


How's the temperature?

Try repasting it if you have some TIM around; it helps with temps, and could well be the reason it does that.

Or check Wattman if you have installed the latest driver.


----------



## componentgirl90

Quote:


> Originally Posted by *mus1mus*
> 
> How's the temperature?
> 
> Try to repaste it if you have some TIMs around. Helps with temps. And could probably be the reason why it does that.
> 
> Or check Wattman if you have installed the latest driver.


I think the temperature reads around 47 degrees at idle when this fan acceleration occurs. The fans are going at 3600 rpm (I think that may be above 100%?). I am not sure how GPUs behave when the thermal paste has degraded, but I would assume there would be an increase in temperature.

Edit: Under load the GPU is in the 70s or 80s I think. Certainly not in the 90s.
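For reference, converting a reported fan rpm to a duty-cycle percentage only needs the fan's rated maximum rpm; the rated maximums below are hypothetical, since the XFX DD cooler's spec isn't given here:

```python
def fan_duty_percent(rpm: float, rated_max_rpm: float) -> float:
    """Reported fan speed as a percentage of the fan's rated maximum."""
    return rpm / rated_max_rpm * 100.0

# Hypothetical rated maximums: a 3600 rpm reading can only mean "over 100%"
# if the fan's rated maximum is actually below 3600 rpm.
print(fan_duty_percent(3600, 4000))  # 90.0
print(fan_duty_percent(3600, 3000))  # 120.0
```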


----------



## Lifeshield

Reapplying thermal paste is likely a good idea. I took the heatsink off my R9 390X and the die and surrounding area were covered with a ridiculous amount.

After reapplying thermal paste it sits at 35C idle and 73C when playing Guild Wars 2; I was hitting near 85C before. The fans in general are a lot quieter too.


----------



## Apogee777

Just had my Sapphire R9 390 Nitro (backplate version) replaced by my supplier for random black screens.

Ramping up the fans via MSI Afterburner reduced the black screens, but the latest drivers killed Afterburner fan control, so I used Sapphire TRIXX fan control with success.
However, black screens still happened randomly at the desktop (not gaming).

Replaced by the supplier; all good now. Heat is lower than on the previous card when maxed.


----------



## robin69

Has anyone had major problems overclocking the Sapphire Nitro R9 390 non-X?
(I have the backplate version at 1040 MHz.)

My card instantly turns the screen black if I touch any setting besides the power target.

Sometimes it will just crash when starting AB, without any OC. I am really upset.

I can't even set 1050 MHz; the screen just turns black instantly and I have to restart.

Now I have removed AB and other software like RivaTuner and TRIXX and just set the power target to 50% in Wattman.

Sorry for the bad spelling, English isn't my native language.


----------



## Streetdragon

I had no problems with my two 390 Nitros. Clocked with AB first, then BIOS.
Try a clean install of the driver, then install AB. Restart one more time and start AB after boot.
Maybe then you can play with the clock/voltage.


----------



## jkuddyh801

Thanks a lot @TehMasterSword for your positive feedback, my fellow AMD overclocker! Add me on OCN, buddy, and I'll do the same. By the way, add me on Twitter too: @Imperial_PC_Gaming.


----------



## Rexer

Quote:


> Originally Posted by *robin69*
> 
> Anyone has had major problems when Overclocking the sapphire nitro r9 390 non X ?
> (Have the Backplate Version with 1040Mhz)
> 
> My Card instantly turns the screen black if I touch any settings beside Power Target.
> 
> Sometimes it will just crash when starting AB, without OC, I am really upset.
> 
> I even cant set 1050Mhz, it just will turn the screen black instantly and I have to restart.
> 
> Now I have removed AB and other software like Riva Tuner and TRIXX und just set the power target to 50% on Wattman
> 
> Sorry for the bad spelling, english isn't my native language


I agree with Streetdragon that you might try a clean install. I had troubles too when the 390X first came out. I uninstalled and reinstalled several times and it still kept crashing. Turns out the Windows uninstaller didn't get the traces of the video driver left in the registry path, so I had to reinstall the video driver and use a third-party uninstaller (Revo Uninstaller, free to use) to dig the driver out.
You can manually dig the traces out if you know where to look in the HKEY files. Personally, I don't dig around in the registry much because it's an area beyond my expertise. I've screwed around there once before and ended up reinstalling Windows. Lol.
After I reinstalled the video driver it was O.K. for a while, then it started black screening again. Turns out I was having a power failure: my backup battery unit (UPS) was failing. A few years before that, I had a power supply unit (PSU) go bad and do the same thing, black screen.
High clocks can cause black screens too, and an undersized PSU will cause blackouts as well. If you're like me and like to push the clocks and voltage till the card breaks, an 850 W PSU is the ticket. If you're not, and not running CrossFire, I'd use the recommended 750 W unit, just to stay on the safe side. It has more than enough power and leaves headroom for extra PCI cards/peripherals.
Hope this helps.
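Those PSU recommendations can be sanity-checked with a rough power budget; every wattage below is an assumed ballpark figure for a single-card 390 rig, not a measurement:

```python
# Rough PSU sizing sketch; every wattage here is an assumed ballpark figure.
components = {
    "R9 390, overclocked": 350,        # stock board power is around 275 W
    "CPU under load": 150,
    "board, RAM, drives, fans": 75,
}

total_draw = sum(components.values())  # estimated peak system draw
recommended = total_draw * 1.3         # ~30% headroom for aging and spikes

print(f"Estimated draw: {total_draw} W")
print(f"Recommended PSU: about {recommended:.0f} W")
```

With those assumptions the math lands right around the 750 W figure above, and adding a second card for CrossFire pushes it toward the 850 W class the same way.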


----------



## ziggystardust

Quote:


> Originally Posted by *robin69*
> 
> Anyone has had major problems when Overclocking the sapphire nitro r9 390 non X ?
> (Have the Backplate Version with 1040Mhz)
> 
> My Card instantly turns the screen black if I touch any settings beside Power Target.
> 
> Sometimes it will just crash when starting AB, without OC, I am really upset.
> 
> I even cant set 1050Mhz, it just will turn the screen black instantly and I have to restart.
> 
> Now I have removed AB and other software like Riva Tuner and TRIXX und just set the power target to 50% on Wattman
> 
> Sorry for the bad spelling, english isn't my native language


I have a 390X Nitro and am having the same issue with AB.

I can apply clock settings with Trixx just fine, but for some reason it doesn't change the clocks in real time.

The Wattman integration just ****ed up everything for good.

If you find a solution, please let me know.


----------



## Apogee777

Quote:


> Originally Posted by *ziggystardust*
> 
> I have 390X Nitro and having the same issue with AB.
> 
> I can apply clock setting with Trixx just fine, but for some reason it doesn't change the clocks in real time.
> 
> Wattman integration just ****ed up everything for good.
> 
> If you find a solution, please let me know.


1. Wattman doesn't work with the 390, only the 4xx series, hence why it is not installed for the 3xx series.
2. Trixx has a tick in its settings for real time.
3. AB does not work with the latest drivers.


----------



## bluej511

Quote:


> Originally Posted by *Apogee777*
> 
> 1. Wattman doesnt work with 390 only 4xx series, hence why it is not installed for 3xx series..
> 2 Trixx has a tick in settings for real time..
> 3 AB does not work with latest drivers.


Wattman works fine for the 390, i used an OC of 1100/1600 and worked just fine.


----------



## ziggystardust

Quote:


> Originally Posted by *Apogee777*
> 
> 1. Wattman doesnt work with 390 only 4xx series, hence why it is not installed for 3xx series..
> 2 Trixx has a tick in settings for real time..
> 3 AB does not work with latest drivers.


You are probably 4 months out of date; AMD has since introduced Wattman for non-4xx cards.









Also, I've been using Trixx for a long time, but now it doesn't work. It's not about the tick or anything else.

It's just that Wattman fcked up everything...


----------



## robin69

Quote:


> Originally Posted by *Rexer*
> 
> I agree with Streetdragon which you might try a clean install. I had troubles, too when 390x first came out. I uninstalled and reinstalled several times and it still kept crashing.Turns out, the Windows uninstaller didn't get the traces of the video driver left in the registry path so I had to reinstall the video driver and use a third party uninstaller (Revo Uninstaller) (free to use) to dig the driver out.
> You can manually dig the traces out if you know where to look in the HKEY files. Personally, I don't dig around in the registry much because it's a an area beyond my expertise. I've screwed around there once before and ended up reinstalling Windows. Lol.
> After I reinstalled the video driver It was O.K. for a while then it started black screening again. Turns out I was having a power failure. My backup battery unit was failing (UPS unit). A few years before that, I had a power supply unit (psu) go bad and do the same thing, black screen.
> High clocks can cause black screens, too. A small PSU will casue blackouts also. If you're like me and like to screw with the clocks and voltage till the card breaks, 850w psu is the ticket. If you're not and not using crossfire, I'd use the recommended 750w job. It's just to stay on the safe side. It's has more than enough power and allows enough power to expand for extra pci/ peripherals.
> Hope this helps.


Thank you for your reply









Reinstalled everything, and I came to the conclusion that my *crappy Corsair VS 650* is likely the part causing these problems and limiting my overclocking headroom.

I am now really upset that I can't give my card enough juice; I guess I should have invested better in the first place.









Just another question: would it be possible to achieve a small OC with little or no added power limit?

Thanks for your answers !


----------



## Carniflex

Quote:


> Originally Posted by *robin69*
> 
> Just another question, would it be possible to achieve a small OC with no or little power limit added ?


Probably depends on the card. My Gigabyte 390X was throttling even at stock settings without an increased power limit. Then again, it was such a howler that I had to put it under water even at stock. All it can do for me is 1100 MHz, and even that is not fully stable, with locked volts at that.


----------



## Apogee777

Quote:


> Originally Posted by *bluej511*
> 
> Wattman works fine for the 390, i used an OC of 1100/1600 and worked just fine.


Well, I can do that in Trixx, but Wattman never installed for my 390. Then I did a Google search and found this:
https://community.amd.com/thread/202320


----------



## Apogee777

Quote:


> Originally Posted by *ziggystardust*
> 
> You are probably 4 months late, but AMD introduced Wattman to non 4xx cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also I've been using Trixx for a long time, but now it doesn't work. It's not about the tick or anything else.
> 
> It's just Wattman fcked up everything...


Ahh, OK, got it.
I'll download the latest driver when I feel I need to.


----------



## lanofsong

Hey R9 390X/390 owners,

We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

March 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726

later
lanofsong


----------



## robin69

@lanofsong Sadly I missed this month's opportunity, but I'm looking forward to attending the event next month.


----------



## johnnyw

Is Afterburner still not working with these cards and the latest driver? I just bought a second-hand Sapphire 390 Nitro, and when I try to overclock even 1 MHz with Afterburner and hit apply, it results in an immediate black screen.


----------



## bluej511

Quote:


> Originally Posted by *johnnyw*
> 
> Afterburner still not working with these cards and latest driver? Just bought second hand sapphire 390 nitro few and when i try to overclock even 1mhz with afterburner and hit apply it results immidiate black screen.


Mine does the same, although I haven't tried it yet on my new W10 install and PC build, so who knows.


----------



## johnnyw

Quote:


> Originally Posted by *bluej511*
> 
> Mine does the same, although i havent tried it yet on my new w10 install and pc build so who knows.


I guess AB isn't compatible with these latest drivers. I just tried an overclock with Wattman and it works fine.


----------



## gordesky1

Quote:


> Originally Posted by *johnnyw*
> 
> I guess AB isnt compatible with these latest drivers. I just tried overclock with wattman and it works fine.


Hmm, my MSI 390 works fine with the new drivers and AB/Wattman. Even before, when the Wattman drivers were really buggy, AB worked fine, but Wattman made the memory clock drop to 100 until a restart or driver reinstall.


----------



## johnnyw

Well, you have an MSI card, and Afterburner is essentially made for those; I'd be surprised if it weren't working.

But after doing some googling yesterday, it seems that at least with this Sapphire Nitro there is a compatibility issue between the current drivers and Afterburner.


----------



## gordesky1

Quote:


> Originally Posted by *johnnyw*
> 
> Well you have MSI card and afterburner is essentially made for those. Would be suprised if it wouldnt be working.
> 
> But after done some googling yesterday it seems that atleast with this sapphire nitro there is some compatibility issue with current drivers and afterburner.


Yeah, that's probably true. Though I saw that when the ReLive drivers first came out, some people had the same problem with Afterburner even on MSI cards.


----------



## stephenn82

AMD locked down their drivers after 16.11.5. No newer driver will allow Afterburner or any other controlling program to work; this is all over Reddit. I don't think anyone has cracked the drivers yet to get them to work. I am running 17.3.3 for my R9 390 and performance is... meh at best. When running 16.11.5, I had 1125/1625 on almost stock power settings. I think it was +11 mV and +20% power just to be on the safe side. Ran games like a champ! Everything after that in BF1 just seemed like slow doodoo.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> AMD locked down their drivers after 16.11.5. No newer driver will allow Afterburner or any other controlling program to work. This is all over reddit. I dont think anyone has cracked the drivers yet to get them to work. I am running 17.3.3 for my R9 390 and performance is...meh at best. When running 16.11.5, I had 1125/1625 on almost stock power settings. I think it was +11mV and =20% power just to be on safe side. Ran games like a champ! Everything after that on BF1 just seemed like slow doodoo.


I can run 1100/1600 all stock with nothing changed in Wattman with ZERO issues. Exact same as I could with Afterburner.


----------



## m70b1jr

Selling one of my R9 390s on eBay:








http://www.ebay.com/itm/201868165368?ssPageName=STRK:MESELX:IT&_trksid=p3984.m1555.l2649


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Wattman works fine for the 390, i used an OC of 1100/1600 and worked just fine.


Oh, really now?

What driver are you running that allows AB to work? And how did you go about installing them to get them to play nice together?


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> I can run 1100/1600 all stock with nothing changed on wattman with ZERO issues. Exact same as i could with afterburner.


How do you change your memory clocks in Wattman? I can only change state 1 and none of the rest, like states 2-7, on memory clocks.

I can, however, change the GPU clock speeds per state. Is this normal? Let me Alt+PrtScn it.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> How do you change your memory clocks in Wattman? I can only change state 1 and none of the rest, like 2-7 on memory clocks.
> 
> I can, howerver, change the GPU clock speeds per state. This normal? Let me alt+Prt Screen it


Highlight the 1625 and change it to whatever you need. Or change it from dynamic and then use the slider.


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Highlight the 1625 and change it to wtv you need. Or change it from dynamic and then use the slide.


So state 1 is all you can change for memory speed: idle is state 0 and max is state 1, with no tiered performance states like the core clock?


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> So state 1 is all you can change for memory speed, idle state 0 and max is state 1, no tiered performance like core clock?


Yeah, same as you would in Afterburner: just one speed. It's either on or off for memory speed.


----------



## stephenn82

Great, I wasn't 100% sure about Wattman. I grew very familiar with AB after snagging my card in Jan 2016, but it doesn't play well with current drivers. It's neat that GPU performance scales as it does now.

Goodnight, everyone!


----------



## Rexer

Quote:


> Originally Posted by *stephenn82*
> 
> AMD locked down their drivers after 16.11.5. No newer driver will allow Afterburner or any other controlling program to work. This is all over reddit. I dont think anyone has cracked the drivers yet to get them to work. I am running 17.3.3 for my R9 390 and performance is...meh at best. When running 16.11.5, I had 1125/1625 on almost stock power settings. I think it was +11mV and =20% power just to be on safe side. Ran games like a champ! Everything after that on BF1 just seemed like slow doodoo.


Thanks for this bit of information. I'm using 16.5 because it's my last known good. I have no clue why 16.6 onward is buggy in my rig. I'm happy at 1136 MHz all day long.


----------



## gapottberg

I disagree. It is quirky, but I am on newer drivers and have gotten AB to work fine. It displays odd behavior for sure the first few times you try to set it up, and I'm still not sure what I did, if anything, to fix it, but I finally managed to get my undervolt with custom fan and clocks working on my MSI 390X using AB and one of the newer 17.xx drivers with Wattman. The trick seems to be to stay the **** out of Wattman and just use AB, at least for me.


----------



## stephenn82

Quote:


> Originally Posted by *gapottberg*
> 
> I disagree. It is quirky but i am on newer drivers and have gotten AB to work fine. It displays odd behavior for sure the first few times you try and set it up...and im still not sure what i did if anything to fix it...but i managed to finally get my undervolt with custom fan and clocks working on my msi 390x using AB, and one of the newer 17.xx drivers with wattman. Trick seems to be stay the **** out of wattman and just use AB, at least for me.


Every time I would try it, it would reset back to the stock 1040/1500 clocks on my 390. Funny, my model is supposed to default to 1060/1525. It wouldn't work for me on any driver and I gave up. I will clean install 17.3.3 and try again, as I have been using Wattman.


----------



## Rexer

Quote:


> Originally Posted by *gapottberg*
> 
> I disagree. It is quirky but i am on newer drivers and have gotten AB to work fine. It displays odd behavior for sure the first few times you try and set it up...and im still not sure what i did if anything to fix it...but i managed to finally get my undervolt with custom fan and clocks working on my msi 390x using AB, and one of the newer 17.xx drivers with wattman. Trick seems to be stay the **** out of wattman and just use AB, at least for me.


Most of my troubles with newer drivers come from the uninstallers not completely collecting and removing the old driver in the registry path. The old Catalyst wasn't such a problem a few years back; I could pile all the Catalyst updates one on top of another: "Hey, no problem, still runs."
But I can't do that with Crimson. It wants all the old drivers removed completely before I can install the latest driver. Discovering this headache took a few times repeating the same mistake, so now I'm apprehensive about updating drivers.
Yeah, sure, I feel the urge to update everything, but it's running fine on Crimson 16.5. I figure if it ain't broke, why fix it?


----------



## gapottberg

Yeah, I finally got it working and have not updated to the latest 17.whatever yet for the same reason. I think as long as you use a cleaner like DDU to remove drivers and then stay the hell out of Wattman, AB will be fine, but I too am scared to mess with a working setup atm. Such a finicky mess.


----------



## Rexer

DDU? I gotta check it out, thanks. I use Revo Uninstaller, but I like trying new programs. On my work computer I used CCleaner and was surprised it worked pretty well at removing the old drivers of an old XFX HD 7850.
I replaced an EVGA 960 SC with an Asus 390X Strix for a customer without completely removing the old drivers. Talk about a nightmare: his computer randomly crashed, sometimes on startup. I ended up manually going through HKEY and C:/Users looking for Nvidia folders to delete. I had to remove his games and reinstall them because of an Nvidia game-enhancing program he used. I worked on that thing for hours.
I'd advise that if you buy AMD, stay with AMD, and the same with Nvidia. The drivers nowadays have deep roots. Also, changing from G-Sync to FreeSync monitors is an expensive overhaul.


----------



## stephenn82

Quote:


> Originally Posted by *Rexer*
> 
> DDU? I gotta check it out. Thanks. I use Revo Uninstaller but I like trying new programs. On my work computer, I used Ccleaner and was surprised it worked pretty good taking off old drivers of a old XFX HD7850.
> Replaced an EVGA 960 sc with an Asus 390x Strix for a customer without completely removing the old drivers. Talk about a nightmare, his computer randomly crashed, sometimes on startup. Ended up manually going through HKEY and C:/ Users looking for Nvidia folders to delete. Had to remove his games and reinstall them because of an Nvidia game enhancing program he used. I worked on that thing for hours.
> I'd advise if you buy AMD, stay with AMD the same with Nvidia .The drivers nowadays have deep roots. Also, changing from Gsync to Freesync monitors is an expensive overhaul.


DDU runs a small installer/remover program in safe mode and allows total removal of driver packages, so you can reinstall OR power down to remove the card. It's a super handy installed or portable program, and it's updated often too!


----------



## orangeDrank

I have had an Asus Strix 390 8GB since Nov 2015 and it is great. Together with my i7-6700K, I have no problem running games at high or ultra at 60 fps at 1080p. Though I'm thinking of getting a new card because of one issue: the noise. I've dealt with it a long time, but now it's just becoming annoying; it's so loud under load. Someone on Reddit told me I would be losing money if I side-graded to a 480, and to lower the fan speed and underclock instead. Is that the best option?


----------



## gapottberg

Quote:


> Originally Posted by *orangeDrank*
> 
> I have had a asus strix 390 8gb since nov 2015 and it is great. Together with my i7-6700k I have no problems running games at high or ultra 60fps on 1080p. Though I'm thinking I'm getting a new card because of one issue, the noise. I've dealt with a long time but now its just becoming annoying, so loud under load. Someone on reddit told me I would be losing money if I wanted to side grade to a 480, and to lower the fan speed and underclock instead. Is that the best option?


A 480 by most metrics is indeed a side grade from a 390, and a slight downgrade from a 390X. Your best bet may be to try undervolting, possibly underclocking, and setting up custom fan curves. That did wonders for my 390X.

Second, you can try disassembling the card and replacing the thermal paste and heat pads with aftermarket material. It has been shown in many, many cases to make a huge difference in thermals and thus a huge difference in noise. There is a risk, but it is minimal, with tons of how-to videos, help diagrams, and forums like this.

It may get you more life for very little investment, and worst case you buy the 480 anyway. Worth a few bucks, if it were me, on a 2-year-old card to squeeze some more life out of it.


----------



## stephenn82

Quote:


> Originally Posted by *orangeDrank*
> 
> I have had a asus strix 390 8gb since nov 2015 and it is great. Together with my i7-6700k I have no problems running games at high or ultra 60fps on 1080p. Though I'm thinking I'm getting a new card because of one issue, the noise. I've dealt with a long time but now its just becoming annoying, so loud under load. Someone on reddit told me I would be losing money if I wanted to side grade to a 480, and to lower the fan speed and underclock instead. Is that the best option?


Get a decent 1440p monitor and the noise will probably go away; 1080p is NOTHING for that card. The Acer G257HU is a nice panel and can do 75 Hz.


----------



## stephenn82

Quote:


> Originally Posted by *gapottberg*
> 
> A 480 by most metrics is indeed a side grade to a 390. And a slight downgrade to a 390x. Your best bet may be to try undervolting...possibly underclocking...and setting up custom fan curves. That did wonders for my 390x.
> 
> Second you can try disassembling and replacing thermal paste and heat pads with aftermarket material. It has been shown in many many cases to make a huge difference in thermals and thus a huge difference in noise. There is a risk but it is minimal with tons of how to videos and help diagrams and forums like this.
> 
> It may get you more life for vwry little investment and worst case scenario you buy the 480 anyways. Worth a few bucks if it were me on a 2 year old card to squeeze some more life out of it.


I concur; thermal paste alone dropped my card from the mid 80s to about the mid-to-low 60s fully loaded down.

Hate to sound like a salesman, but Gelid GC-Extreme is epic.


----------



## orangeDrank

Quote:


> Originally Posted by *stephenn82*
> 
> Get a decent 1440p monitor and the noise will probably go away. 1080 is NOTHING for that card. Acre g257hu is a nice panel and can do 75hz ?


Um, what? How would the monitor affect the GPU noise? Not an expert at this stuff, sorry.


----------



## gapottberg

Quote:


> um what? How would the monitor affect the GPU noise? Not an expert at this stuff, sorry.


It's possible, but only in certain situations. If you are running a low-resolution monitor at, say, 1080p/60Hz, the 390 can push frames well above the 60Hz cap, sometimes rendering 100+ more frames per second than needed. That creates a lot of work for the card, and thus a lot of largely unnecessary heat. On a 1440p monitor the 390 will only be able to push out closer to 60fps, so it does less work and doesn't overshoot a 1440p/60Hz display.

The smarter way to deal with this is to just use AMD Crimson to frame-cap your card at 60-ish fps (I use 65). That way my 390X never renders more frames than my monitor can actually display.

There are situations where those higher frame rates matter to some people, most notably in competitive "twitch" shooters, but with a low-refresh-rate monitor the effective gain is debatable.

If you do have a high-refresh 144Hz monitor, that is the other obvious situation where you leave frame capping alone. Otherwise, frame capping will net you the same thermal gains, and thus noise-reduction gains, as moving up to a 1440p/60Hz monitor would (without the eye candy higher resolutions bring, of course).
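A back-of-envelope sketch of the frame-capping argument above, in Python. The uncapped rate is a made-up example number, not a measured 390 figure; the point is just how much rendering work a cap removes:

```python
# Rough illustration: frames rendered per minute, capped vs. uncapped.
# The uncapped rate is a hypothetical example, not a measurement.

def frames_per_minute(fps: float) -> int:
    return int(fps * 60)

uncapped_fps = 160.0   # assumed uncapped rate in a light scene
capped_fps = 65.0      # the frame cap discussed above

saved = frames_per_minute(uncapped_fps) - frames_per_minute(capped_fps)
print(f"Frames not rendered per minute with a 65 fps cap: {saved}")
```

Every one of those skipped frames is work (and heat) the card never has to produce, which is where the noise reduction comes from.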


----------



## chris89

So what's the word on cables etc. for 390X 4K 60Hz? I'm limited to 30Hz atm.







haha
Quote:


> Originally Posted by *gapottberg*
> 
> A 480 by most metrics is indeed a side grade to a 390. And a slight downgrade to a 390x. Your best bet may be to try undervolting...possibly underclocking...and setting up custom fan curves. That did wonders for my 390x.
> 
> Second you can try disassembling and replacing thermal paste and heat pads with aftermarket material. It has been shown in many many cases to make a huge difference in thermals and thus a huge difference in noise. There is a risk but it is minimal with tons of how to videos and help diagrams and forums like this.
> 
> It may get you more life for very little investment, and worst case scenario you buy the 480 anyway. Worth a few bucks if it were me, on a 2-year-old card, to squeeze some more life out of it.


By the way, the RX 480 isn't a downgrade at all from the 390X. The RX 480 is a beast and completely waxed my 390X in FireStrike at 1080p. Not to mention basically the same 4K performance, probably better actually: nearly 40fps in 4K Ultimate at 1407MHz stock voltage, or 1416MHz. Clocking the memory from 2000MHz (256GB/s) to 2266MHz on the reference PCB gives 290GB/s and really bridges the gap between the RX 480 and 390X. In some ways the 390X is still faster for compute, obviously, with 44 Compute Units. For gaming the RX 480 is a huge step up on way, way less power: less than 200 watts or thereabouts at 1416MHz delimited, and less than 200 watts at 1407MHz stock voltage, on the reference blower PCB only. Only the reference 480 can handle this kind of performance; I thoroughly tested it in the Polaris thread.

RX 480 8GB @ 1416MHz core, 2266MHz memory : FireStrike default 1080p : 15,819; Test 1 average 74.47fps : delimited power spikes FPS to near 100fps, like 96fps

390X 8GB @ 1250MHz core, 1758MHz memory : FireStrike default 1080p : 14,888; Test 1 average 70fps : delimited power spikes FPS to near 90fps, like 91fps

Honestly the RX 480 is the best upgrade ever for a 390X user. Experience the same performance on way less power, sometimes more performance. On such a tiny PCB, it's like, that little tiny thing does what?!?

On a VisionTek refurb RX 480 8GB: 1407MHz at stock voltage with ease from a BIOS mod. You have to dial in the BIOS to yield the big performance. Max RPM 3667; the sensitivity adjustment is key, 65200 range (65199 to 65299). Max temp, shutdown temp, and hot-spot temp are also highly important: the hotter you set them, the faster you go. I set max temp 80-84C, shutdown 88C, hot spot 80-84C. There's a big hop in performance from 80 to 84C; it's stable at 84C, so it's very fast. Power limits and TDC at 999,999,999. The core will climb to whatever max temp you set (80-84C) and throttle there; once the fan spools up to 3667RPM the core falls back from 80C to 67C and it straight-up screams, and the noise is way less than the 390X fan. haha

The RX 480 8GB has a fan delay built into the sensitivity: it takes a couple of minutes before the fan spools up to speed. Getting from 800RPM to 3667RPM takes about 2-3 minutes as it hops 25-250RPM every other second or so, and then performance soars. The first run through is slow since the fan hasn't spooled up yet; by the 2nd and 3rd runs the fan is consistently at speed, the core stops throttling as the temperature falls back, and the big clocks come out to play.


----------



## stephenn82

Quote:


> Originally Posted by *gapottberg*
> 
> It's possible, but only in certain situations. If you are running a low-resolution monitor at, say, 1080p/60Hz, the 390 can push frames well above the 60Hz cap, sometimes rendering 100+ more frames per second than needed. That creates a lot of work for the card, and thus a lot of largely unnecessary heat. On a 1440p monitor the 390 will only be able to push out closer to 60fps, so it does less work and doesn't overshoot a 1440p/60Hz display.
> 
> The smarter way to deal with this is to just use AMD Crimson to frame-cap your card at 60-ish fps (I use 65). That way my 390X never renders more frames than my monitor can actually display.
> 
> There are situations where those higher frame rates matter to some people, most notably in competitive "twitch" shooters, but with a low-refresh-rate monitor the effective gain is debatable.
> 
> If you do have a high-refresh 144Hz monitor, that is the other obvious situation where you leave frame capping alone. Otherwise, frame capping will net you the same thermal gains, and thus noise-reduction gains, as moving up to a 1440p/60Hz monitor would (without the eye candy higher resolutions bring, of course).


Well, not entirely that, but half. It also takes the strain off your CPU. I run my games at medium settings, with a slight bump to meshes, and push competitive shooters at 165fps on my 1440p monitor. There's a little tearing, but I don't mind. Frame limiting to 75Hz is good: solid, smooth performance at 1440p. All other games I play at ultra with no issues. Gamers Nexus has done testing showing that running beefy cards at higher resolutions eases the CPU work required.


----------



## gapottberg

Good catch. Forgot to mention the impact having fewer rendered frames has on the CPU in terms of helping overall cooling and noise levels.

As for the RX 480 8GB numbers someone posted... that is pretty impressive, and an extremely impressive OC if done on stock cooling as well. I don't recall anyone in the tech press getting that or much better, tbh. Well done!









Most of the data I have seen, and was referencing in my post, was at stock or factory-OC settings. At factory settings, and at the time of testing, almost every reputable tech review I saw showed 390 < 480 < 390X, with the margins rather tight all around. In time the 480 may indeed prove the 390X's better, but logistically it seemed unlikely considering the hardware specs. The 390X holds a substantive lead in various categories, including compute units, shaders, ROPs, TMUs, and bus width.

Here is a good spec summary from HW Bench. http://hwbench.com/vgas/radeon-rx-480-vs-radeon-r9-390x

The move to 14nm and the engineering optimizations in the RX 400 series may indeed allow it to close the gap despite those specs, but a large swing seems unlikely. Time will tell. It is pretty amazing how much performance AMD packed into such a small, power-efficient package at a $200 price point. I am encouraging many of my friends to consider it as a viable upgrade for their gaming needs.

[edit]
Quote:


> Honestly the RX 480 is the best *upgrade* ever for a 390X user. Experience the *same* performance on way less power, sometimes more performance. On such a tiny PCB, it's like, that little tiny thing does what?!?


Not to criticize, but I think this largely comes down to how the reader defines terms. If one's idea of an upgrade is strictly tied to better performance, then the 480 fails to deliver in their mind. If their idea of an upgrade means the same performance with a better experience, power efficiency, smaller package, etc., then sure, it's definitely a huge leap forward. I think most readers on this site would call the latter scenario a "side grade". I think we are all largely saying the same thing in the end.

Cheers


----------



## chris89

Haha yeah, that's what I meant: it's an upgrade in terms of paying less and using way less power to get slightly better results...

I guess it comes down to the money... sure, a 1080 Ti is a $700 upgrade from a 390X, for sure. But the RX 480 is, I would say, a better card for gaming than the 390X at $169, and quite capable with 8GB. For that little money, the RX 480's low power consumption is quite nice.

So it's like: throw in 2x 480s for $340, clock them delimited to 1407MHz at stock voltage with a BIOS tune, and be golden.


----------



## gapottberg

Wow... where are you finding RX 480s with 8GB of RAM for $169?!?!









I was just shopping the other day and thought the deal I saw for $204 with a $20 mail-in rebate was good, but those were the 4GB models. I haven't seen 4GB or 8GB ones break $200 yet, at least without mail-in rebates (which I abhor).


----------



## bichael

Quote:


> Originally Posted by *orangeDrank*
> 
> I have had an Asus Strix 390 8GB since Nov 2015 and it is great. Together with my i7-6700K I have no problems running games at high or ultra at 60fps on 1080p. Though I'm thinking of getting a new card because of one issue: the noise. I've dealt with it a long time, but now it's just becoming annoying, so loud under load. Someone on reddit told me I would be losing money if I side graded to a 480, and to lower the fan speed and underclock instead. Is that the best option?


In terms of lowering noise, most of it has been mentioned already: thermal paste, undervolting, a capped framerate, fan profiles. I would definitely try those and think about upgrading again in six months or so.

The other thing worth trying would be the 'Chill' feature. I must admit I haven't tried it; not sure if it's something people are using much? It sounds like a good idea though, e.g. sit at 40fps when nothing is really happening and ramp up to 60fps when needed, and the one review I read suggested the lowered rate wasn't really noticeable and could actually help reduce stuttering. It is only supported in certain games, though.

edit - the list still looks very short, though there are some big names on it at least.
http://www.amd.com/en-us/innovations/software-technologies/radeon-software/gaming/radeon-chill


----------



## robin69

Quote:


> Originally Posted by *gapottberg*
> 
> Wow... where are you finding RX 480s with 8GB of RAM for $169?!?!


Maybe used ?


----------



## chris89

@everyone
does anyone know of a better way to scale DPI for 4K in Windows? GPU-Z hangs off the screen too much and I can't see the "Render Test" button... Thanks

Not many in stock I'm sure, but an amazing deal and amazing cards. Capable of 1407MHz core at stock voltage and 2266MHz memory with minimal errors at stock voltage.

https://www.visiontek.com/refurbished/performance-graphics.html


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> @everyone
> does anyone know of a better way to scale DPI for 4K in Windows? GPU-Z hangs off the screen too much and I can't see the "Render Test" button... Thanks


First world problems... lol


----------



## chris89

HAHAHAHA









Yeah, I fixed it... GPU-Z only screws up under VSR 4K; setting 200% scaling @ 4096x2160 fixed it, no issues. Very happy, and wow, 4096x2160, say what? haha

By the way, one delimited 390X has no issues in-game at this resolution... Who would have thought?


----------



## gapottberg

Quote:


> Originally Posted by *chris89*
> 
> By the way, one delimited 390X has no issues in-game at this resolution... Who would have thought?


IDK. From a hardware perspective the 390X always seemed to have the nuts to provide an adequate 4K gaming experience at traditionally reasonable frame rates (i.e. a continuous 30+), but AMD's poor driver situation, among other things, held it back at release. I think you will find in many situations going forward that more of that fine-wine goodness is built into these cards than one might think.

For example, I believe memory bus width is one area where it shines with its 512-bit bus, and that is directly related to higher-resolution performance. It is one comparison where I think it will continue to outshine the RX 480 into the future, just due to the hardware being logistically better for such things.


----------



## chris89

I hear ya man. I find something different in all GPUs.

It's not about on-paper specs. ROPs and TMUs... we can throw unlimited ROPs and TMUs at it, but we still have to deal with the power.

That's what's limiting everything: dealing with power and controlling its heat. That's what it all comes down to. How much heat is coming off the card at idle/load.

If you were to place a GPU in a vacuum and monitor thermal pressures, we would find the 390X's thermal pressure is about 2x as much as Polaris. Maybe even as much as 2.5x.

That "thermal pressure" is basically lost energy: energy that is lost and never reaches the GPU core to throw out more frames, compute cycles, etc.

When it comes to this aspect it gets extremely scientific, requiring NASA-like scientists and engineers to research and experiment with the idea.

NASA engineers deal with this exact same thing GPUs suffer from every day. They learn and work with the energy loss, and attempt to capture and reduce it.

All manufacturers have been working on this forever. It comes down to a thorough understanding of what needs to be done to correct a naturally occurring phenomenon.

HAHAHA yes, TMI again, sorry.

To sum it all up: the performance of a GPU, so to speak, is not about the specs on paper. It's how much power from the wall it uses. Not just the power from the wall, but how much of that power is actually utilized by the GPU for throughput.

Less loss from the wall to the core means a faster GPU. Work on that first, then throw some ROPs and TMUs at it.

To explain further: all this stuff we call ROPs, TMUs, memory bus width, and memory throughput is energy-consuming components. Each ROP and each TMU has a heat value. They all individually have a power/heat value: degrees Celsius/Fahrenheit, watts, amperes, etc. It's all just plain old energy.

Reduce the loss required to power those ROPs and TMUs, and you will have GPUs performing in harmony, I suppose you could say.

I keep going on and on, but I really want this to make sense. Basically, the total loss across all components scales with the specs of those components. For instance, say you have a GPU with 512 ROPs and 1024 TMUs on a 16384-bit HBM bus.

Say today it would require 1,500 watts of input just to power on the card without load. Throw it under load and we see loss right away, within 0.01 nanoseconds; watts are flying out the window. Say 75% of the input is doing work and 25% is lost. The card then performs like a 384-ROP part (512 x 0.75), with 768 TMUs' worth of texturing and an effective 12288 bits of memory bus. That's what loss does.

This means that even though Polaris has fewer ROPs and TMUs than pretty much every card out there, its loss is so small that its throughput punches above its spec sheet. We are finally just now witnessing a tiny glimpse of what 32 ROPs and 144 TMUs can really do.
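To make the arithmetic in that example concrete, here is a tiny Python sketch. The 25% loss figure and the spec numbers are the hypothetical ones from the post above, not real measurements of any card:

```python
# Effective "paper spec" throughput under an assumed power-loss fraction.
# All numbers are the hypothetical ones from the example above.

def effective(spec: float, loss_fraction: float) -> float:
    """Scale a paper spec by the share of input power doing useful work."""
    return spec * (1.0 - loss_fraction)

loss = 0.25  # assume 25% of wall power is lost as heat

print(effective(512, loss))    # ROPs
print(effective(1024, loss))   # TMUs
print(effective(16384, loss))  # memory bus bits
```

With a 25% loss the 512-ROP part behaves like a 384-ROP part; shrink the loss and the same silicon "gains" effective throughput, which is the point the post is making.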


----------



## gapottberg

So, got a 390X question for all my gurus out there. I was doing some benching and testing, as I am prone to do from time to time, and noticed what I perceived to be some odd behavior regarding stuttering and screen tearing.

One game I benchmark a lot, due to how demanding it is on my 8 cores, is Ashes of the Singularity. I have always noticed that when I push details high enough, I get something akin to screen tearing and stuttering during the built-in bench test, particularly when the camera pans out during certain scenes. Last night I was doing something different and had it in windowed mode while testing, something I had not done before. Keep in mind I have a pretty strict setup for how I manage and track the settings I use, and in this case I ensured Vsync was off, though I was using the 65fps frame cap I always use with this particular 60Hz 1080p monitor (not that I hit that high in this game with my GFX bench settings anyway).

What I noticed was that in windowed mode the screen tearing and stuttering I normally see was completely gone. At first I chalked it up to running at a lower resolution, but for ****s and giggles I reran the test at 1080p, in borderless window instead of the fullscreen mode I normally use. Once again the stuttering and screen tearing seemed to be gone.

I have not had time to do more thorough testing on this, but I hope to compare several games in borderless window vs. fullscreen mode to see if I was just seeing things or if there is an appreciable difference.

My question is: has anybody already noticed this or accounted for it in AoS or other games? If so, what is going on that causes it? Is it a 390X issue? An AMD driver issue? A particular settings issue? Or is it all in my head? (Perfectly reasonable... the testing was done at 3am.)

I will report more info when I can find time outside of work to test more. Thanks in advance.


----------



## christoph

Quote:


> Originally Posted by *gapottberg*
> 
> So, got a 390X question for all my gurus out there. I was doing some benching and testing, as I am prone to do from time to time, and noticed what I perceived to be some odd behavior regarding stuttering and screen tearing.
> 
> One game I benchmark a lot, due to how demanding it is on my 8 cores, is Ashes of the Singularity. I have always noticed that when I push details high enough, I get something akin to screen tearing and stuttering during the built-in bench test, particularly when the camera pans out during certain scenes. Last night I was doing something different and had it in windowed mode while testing, something I had not done before. Keep in mind I have a pretty strict setup for how I manage and track the settings I use, and in this case I ensured Vsync was off, though I was using the 65fps frame cap I always use with this particular 60Hz 1080p monitor (not that I hit that high in this game with my GFX bench settings anyway).
> 
> What I noticed was that in windowed mode the screen tearing and stuttering I normally see was completely gone. At first I chalked it up to running at a lower resolution, but for ****s and giggles I reran the test at 1080p, in borderless window instead of the fullscreen mode I normally use. Once again the stuttering and screen tearing seemed to be gone.
> 
> I have not had time to do more thorough testing on this, but I hope to compare several games in borderless window vs. fullscreen mode to see if I was just seeing things or if there is an appreciable difference.
> 
> My question is: has anybody already noticed this or accounted for it in AoS or other games? If so, what is going on that causes it? Is it a 390X issue? An AMD driver issue? A particular settings issue? Or is it all in my head? (Perfectly reasonable... the testing was done at 3am.)
> 
> I will report more info when I can find time outside of work to test more. Thanks in advance.


do you clean out the old drivers in safe mode before updating?

sometimes your audio driver conflicts with the audio driver from the AMD package (the one that gets installed for HD audio over HDMI), so try uninstalling those drivers in safe mode too


----------



## gapottberg

Quote:


> Originally Posted by *christoph*
> 
> do you clean out the old drivers in safe mode before updating?
> 
> sometimes your audio driver conflicts with the audio driver from the AMD package (the one that gets installed for HD audio over HDMI), so try uninstalling those drivers in safe mode too


I primarily use DDU in safe mode. I believe it deals with the AMD audio drivers as well when it runs... but I'll double-check all my audio drivers tonight, maybe, and make sure they are up to date. It's been a while since I checked for motherboard updates.


----------



## chris89

If anyone wants me to mod their BIOS for cooler, more efficient, worry-free operation, send your BIOS to me or post it here; I'm happy to help everyone.

Please post HWiNFO readings at idle/load so I know where we stand before we begin, to note the gains.


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> If anyone wants me to mod their BIOS for cooler, more efficient, worry-free operation, send your BIOS to me or post it here; I'm happy to help everyone.
> 
> Please post HWiNFO readings at idle/load so I know where we stand before we begin, to note the gains.


the fans on my Sapphire 390 are set to kick in when the card reaches 50 degrees; can they be modded to kick in at a lower temperature, let's say 40 degrees?


----------



## chris89

Sure can, but consider that as the card sits idle, parts will run hot, using more power at idle than if the fan were set to 16%, for instance.

Send the BIOS, I'll check it out.


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> Sure can, but consider that as the card sits idle, parts will run hot, using more power at idle than if the fan were set to 16%, for instance.
> 
> Send the BIOS, I'll check it out.


ok, what do you use to read the BIOS ROM? I'm used to tweaking my old video card's BIOS, but not this 390; I haven't touched this one


----------



## chris89

Use GPU-Z to dump the stock .rom and keep it for safekeeping. It's easy to fix these cards since they have a dual-BIOS switch.

Then use Hawaii Bios Reader to mod the .rom, and flash it with ATIWinFlash v274, opened as administrator.

I can get you started if you'd like to upload your .rom here in .zip format?
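Before flashing anything, it's worth sanity-checking the dump. A legacy VGA BIOS image follows the PCI option-ROM layout: it starts with the signature bytes 0x55 0xAA, byte 2 gives the image size in 512-byte blocks, and the bytes of the image sum to zero modulo 256. This small Python sketch (the filename is a placeholder) verifies both on a dumped .rom:

```python
# Minimal sanity check for a dumped VGA BIOS image (PCI option-ROM layout):
# bytes 0-1 must be 0x55 0xAA, byte 2 gives the image size in 512-byte
# blocks, and all bytes of the image must sum to 0 modulo 256.

def check_rom(data: bytes) -> bool:
    if len(data) < 3 or data[0] != 0x55 or data[1] != 0xAA:
        return False                      # missing option-ROM signature
    size = data[2] * 512                  # declared legacy image size
    if size == 0 or size > len(data):
        return False                      # truncated or nonsense dump
    return sum(data[:size]) % 256 == 0    # 8-bit checksum must be zero

# Example usage with a dumped file (path is a placeholder):
# with open("stock_390.rom", "rb") as f:
#     print("ROM looks OK" if check_rom(f.read()) else "ROM looks damaged")
```

Note this only validates the first (legacy) image in the file; it doesn't prove a modded ROM is safe, just that the dump isn't obviously corrupt.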


----------



## christoph

Hawaii.zip 42k .zip file


what's the fan PWM percentage for?


----------



## chris89

The fan PWM is basically the percentage; the card will only abide by it if Fan Type is set to 0. Here, try this BIOS. It is delimited, so it's meant primarily for max-load 4K gaming, though 1080p should do fine.

Close everything in the taskbar before beginning the flash. Open ATIWinFlash.exe as administrator, open the .rom, and begin the flash. Be very patient: it takes 5 minutes, so don't touch anything while it's flashing.

Oh yeah, this is a worry-free continuous-gaming BIOS, not a max-benchmark BIOS.

atiflash_274.zip 1214k .zip file


Christoph-Sapphire390-1133-1267-HiSpeed.zip 42k .zip file


Christoph-Sapphire390-1133-1267-HiSpeed-48C-FanOffIdle.zip 42k .zip file


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> The fan PWM is basically the percentage; the card will only abide by it if Fan Type is set to 0. Here, try this BIOS. It is delimited, so it's meant primarily for max-load 4K gaming, though 1080p should do fine.
> 
> Close everything in the taskbar before beginning the flash. Open ATIWinFlash.exe as administrator, open the .rom, and begin the flash. Be very patient: it takes 5 minutes, so don't touch anything while it's flashing.
> 
> Oh yeah, this is a worry-free continuous-gaming BIOS, not a max-benchmark BIOS.
> 
> atiflash_274.zip 1214k .zip file
> 
> 
> Christoph-Sapphire390-1133-1267-HiSpeed.zip 42k .zip file
> 
> 
> Christoph-Sapphire390-1133-1267-HiSpeed-48C-FanOffIdle.zip 42k .zip file


yeah, just the same BIOS but with the fan start-up tweaked, right?

ok, I can see what you did, but TDP MAX and POWER LIMIT end up like that?


----------



## chris89

Cool. The 57599 is just the max allowable value, since intermittent anomalous wattage-sensor readings cause throttling at anything less than 57599.

We're looking at quite a bit of power, but not much more than stock. Not all computers/power supplies can handle this. It's just like taking off the restrictor plate.

Try this one instead; I made a mistake that would have caused instability. This one is a better fit.

Christoph-Sapphire390-1133-1267-HiSpeed-45C-FanOffIdle.zip 42k .zip file


----------



## chris89

By the way, the Hawaii/Grenada series will have a hard time keeping the fan off as low as 40C; the threshold would need to be higher for a longer fan-off time, since as soon as the fan is off the temperature rises. At total idle it may stay off for some time.

It's just not ideal to set the fan to turn off, since a hotter GPU uses more power than a cooler one. Setting a 20% minimum cools the GPU sufficiently at idle to reduce its idle power consumption by quite a lot. Just an FYI.


----------



## gapottberg

I concur with your thoughts. I have a custom fan profile based on this idea that never actually turns the fans off, but runs them at incredibly low speeds until certain thresholds are reached. Between that and some custom undervolting, I have greatly reduced the noise and temps of the stock settings, with better performance all around.

So much so that I haven't felt the need to replace the thermal paste or pads... but I am really tempted, just for ****s and giggles, based on the massively positive experience others here have had with it and these cards.
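The kind of profile described above can be sketched as a simple piecewise-linear curve: a quiet floor instead of zero-fan, then a ramp once thresholds are crossed. The breakpoints below are made-up illustrative values, not anyone's actual profile:

```python
# Piecewise-linear fan curve: never off, quiet floor at low temps,
# then ramps toward full speed. Breakpoints are illustrative only.

CURVE = [(0, 15), (50, 15), (65, 35), (75, 60), (85, 100)]  # (temp C, PWM %)

def fan_pwm(temp_c: float) -> float:
    """Interpolate fan duty (%) from the curve above."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at max above the last breakpoint

print(fan_pwm(40))   # still at the quiet 15% floor
print(fan_pwm(70))   # ramping between 35% and 60%
```

The flat 15% segment up to 50C is the "incredibly low but never off" part; everything past that trades noise for cooling progressively instead of all at once.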


----------



## chris89

If you don't feel like pulling the card apart but want cooler performance, by up to 10C depending on the thermal material used:

Pull the backplate, buy a sheet large enough to cover the entire PCB at about 2-3mm thickness, cut it to size, apply it to the PCB, and re-tighten the backplate. That alone could drop core and VRM load temperatures by 10C or more.

Sheets that big range from $50 and up, so we can only get a 9"x9" sheet, which, if you cut it, can cover the whole PCB. The 390X is 3.5" wide by about 10.75 to 11 inches long (88mm x 280mm).

This stuff is dirt cheap and effective. Not hugely conductive thermally, but you could see a 10-15C reduction at idle/load.

I didn't even cover the whole PCB with this stuff, and it lined a 380X up in OpenCL compute right near a TITAN X Maxwell. Covering the whole PCB will help a lot more.

The more expensive sheet is $28 and the cheaper one is $13. The difference is 1.1 W/mK vs 1.8 W/mK (watts per meter-kelvin of thermal conductivity). When we're talking about covering the whole PCB, the 1.8 W/mK sheet conducts roughly 64% more heat than the 1.1 W/mK one (1.8 divided by 1.1 is about 1.64). Quite a lot.

To estimate how many degrees Celsius you would drop: stock, the backplate conducts essentially nothing, I would say 0.01 W/mK or less. The only contact is at the mounting screw points; it's pretty much only for looks.

stock : 84C
1.1 W/mK : 79C
1.8 W/mK : 74C

https://www.digikey.com/product-detail/en/laird-technologies-thermal-materials/A14162-06/926-1116-ND/2445397

https://www.digikey.com/product-detail/en/laird-technologies-thermal-materials/A15959-10/926-1131-ND/2445429
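As a rough sanity check on those numbers, steady-state conduction through a flat pad follows Q = k·A·ΔT/t, so the temperature difference across the pad is ΔT = Q·t/(k·A). The sketch below uses assumed example values (heat load through the backplate, pad thickness) plus the PCB dimensions and conductivities from the post; it is not a measurement:

```python
# Steady-state conduction through a flat pad: Q = k * A * dT / t
# => dT = Q * t / (k * A). Heat load and thickness are assumptions.

def delta_t(q_watts: float, k: float, area_m2: float, thick_m: float) -> float:
    """Temperature difference (C) across a pad moving q_watts of heat."""
    return q_watts * thick_m / (k * area_m2)

AREA = 0.088 * 0.280   # full 390X PCB, 88mm x 280mm, in m^2
THICK = 0.002          # 2mm pad
HEAT = 30.0            # assumed watts sunk through the backplate

for k in (1.1, 1.8):
    print(f"k={k} W/mK -> dT across pad ~ {delta_t(HEAT, k, AREA, THICK):.1f} C")
```

Under these assumptions the pad itself only needs a couple of degrees of gradient to move the heat, so in practice the limiting factors are contact quality and how well the backplate itself can shed heat to the air, not the pad's conductivity.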


----------



## jon666

Hmm... a clean install option. I have no reason to use it, so now I am in the midst of trying it out.


----------



## stephenn82

Nice! You got anything for an MSI R9 390 Gaming 8G sitting around??

I saw a Hawaii BIOS from the other guy, but here is mine, just in case.

I used to run my card at 1125/1625 with +19mV and +20% power easily, with a custom fan curve in AB, but AB doesn't work with the new AMD drivers...

Can you work some magic?

Figured out my stupid mistake... didn't zip it... DOH!!

Attached now:

MSIR9390Gaming8G.zip 98k .zip file


----------



## Rexer

Quote:


> Originally Posted by *gapottberg*
> 
> Wow...where you finding RX480's with 8GB of ram for $169?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was just shopping the other day and thought the deal i saw for $204 with a $20 mail in rebate was good, but those where the 4GB models. Haven't seen 4 or 8GB ones break $200 yet, at least without mail in rebates (which i abhor).


A guy I game with bought a Sapphire Nitro RX 480+ for $224.00 last month from Newegg. It's 8GB, too. Out of the box it's 1306MHz, with a 1342MHz boost clock. Beats me by 200MHz. I'm wondering if I'm missing something.
I've been waiting 7 months for Vega. AMD rumors in September/October 2016 pointed to a release in late December or early January, after the CES show. No cigar. They pushed it back to the first quarter, then to before June. To be honest, I don't even know any specs yet. It might all just be a rumor. Linus said he put his hands on one, but heck, that could've been a 390X for all I know. How he got invited to AMD's exhibit over Jay2cents, I have no clue.


----------



## gapottberg

Most reliable rumors I have heard are that the RX 580 is basically a refresh of the 400 series, much like the 300 series succeeded the 200: nothing new to see, just better clocks until the actual Vega line drops, which may be something like a 590, or may get a new name like Fury.

Either way, if the 580 drops this year it should push the 8GB 480s below $200 more consistently. That would be pretty legit considering how well they seem to be doing in CrossFire these days in some games.


----------



## stephenn82

Quote:


> Originally Posted by *gapottberg*
> 
> Most reliable rumors I have heard are that the RX 580 is basically a refresh of the 400 series, much like the 300 series succeeded the 200: nothing new to see, just better clocks until the actual Vega line drops, which may be something like a 590, or may get a new name like Fury.
> 
> Either way, if the 580 drops this year it should push the 8GB 480s below $200 more consistently. That would be pretty legit considering how well they seem to be doing in CrossFire these days in some games.


If that's the case, I may just get two of those for less than anything else. It would be an improvement over one 390.


----------



## gapottberg

Quote:


> Originally Posted by *stephenn82*
> 
> If that's the case, I may just get two of those for less than anything else. It would be an improvement over one 390.


In CrossFire-supported titles... yes, two 480s look legit.

In most everything else... meh. A single 480 is very similar in performance to a 390, and context plays a big role in the subjective nature of "better".

If you are OK with the hit-and-miss nature of CrossFire support, then it can be a very good down-the-road upgrade over a single 480 for sure. Especially at sub-$200 prices.


----------



## stephenn82

No, I wouldn't go single 480. The 580 looks like a 10-15% increase due to clock rates, and the refined process technology lowers power consumption even more. That MAY be worth it over the 390: it would best it by a decent margin while eating less power... the 480 is almost a direct tie, but with less power.

I am STILL waiting for this RX 490/590/RX Vega (whatever Raja changes his mind to daily). If they release it in September, I have a feeling it will be stillborn, as they're waiting too long, dragging their feet (typical AMD, right?). Nvidia will have the second-generation 10 series out, including a Titan with updated specs, similar to the recent Ti. But better... because Titan!


----------



## chris89

This card is doing upwards of 1450MHz; on stock voltage, about 1418-1420 roughly...


----------



## robin69

@chris89: Can you help me with a BIOS mod? I am running stock clocks at 1040/1500 with a 50% power target.

Just wanted to know if I could get 1100MHz with less power consumption or stock voltage, and whether touching the memory is worth it.


----------



## chris89

@robin69 Sure thing. These GPU's use loads of power. I recommend DeLimited anyway. If you want a good running card even 1Ghz DeLimit is real fast. Sure send the bios .rom here in .zip format.

PS - I can make benchmarking bios and regular 24/7 bios which would you prefer? You can't have both as one uses too much power, and other runs cool stable and worry free with hardly any difference. These cards are real fast DeLimited, regardless of clock. Let me help you out.









To sum it up, any clock requiring more than default voltage is a benchmark BIOS. The clocks below show how delimited clocks behave; 1000MHz delimited is just as fast as stock with the power limit, while running cooler.

1000MHz : 65288 : ~1.25V : just fine, real fast and cool
1250MHz memory recommended : 888mV : 16% cooler and less power than stock, with no difference

1066-1088MHz : 65288 : possible if you want : no difference from 1000MHz, just more power and heat for maybe a couple fps

1133MHz : 1333mV : hotter, way more power : barely any difference, and you have to worry about temps

1166MHz : 1366mV : way hotter, could be faster, but some serious power and heat

1188MHz : 1388mV : way, way hotter and louder, way more power : we're not talking about a 25fps difference

1250MHz : 1425mV : only under continuous supervision of temperatures and with a proper setup; power is extra-high
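A quick way to sanity-check why the higher entries in that list run so much hotter: dynamic power scales roughly with frequency times voltage squared. This is only a rule-of-thumb sketch using the clock/voltage pairs from the list above, not a measurement of any real card (real cards add leakage and VRM losses):

```python
# Rule-of-thumb dynamic power scaling: P ~ f * V^2.
# Rough estimate only -- ignores leakage current and VRM losses.

def relative_power(freq_mhz: float, volts: float,
                   base_freq_mhz: float = 1000.0, base_volts: float = 1.25) -> float:
    """Estimated power draw relative to the 1000MHz / 1.25V baseline."""
    return (freq_mhz / base_freq_mhz) * (volts / base_volts) ** 2

# Clock/voltage pairs from the list above.
for f, v in [(1000, 1.25), (1133, 1.333), (1188, 1.388), (1250, 1.425)]:
    print(f"{f}MHz @ {v:.3f}V -> {relative_power(f, v):.2f}x baseline power")
```

By this estimate, 1250MHz at 1.425V draws roughly 60% more power than the 1GHz baseline for a 25% clock bump, which lines up with the warnings above.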


----------



## robin69

Hey Chris

Thanks for your help. I am looking for an enhanced 24/7 BIOS; I am really happy with the performance of the card since I am gaming @1080p/75Hz.
So more performance really isn't what I am looking for; I just thought I could give it a little bump. But I read that higher clocks can cause lower minimum framerates, and since I use a FreeSync display within a certain range (48-75Hz), could an OC hurt my gaming experience?

And why is everybody clocking down the memory? Does this help with stability?

Just to clarify, a 1GHz card with a 50% power target is already decent performance? I also tried to monitor the usage and clocks and couldn't really notice any throttling (or maybe I monitored wrong).

I will be back home tomorrow or maybe tonight; then I'll upload the BIOS for you.

PS: Sorry for the bad spelling/grammar, not a native speaker.
Really appreciate your support, awesome community here.


----------



## Streetdragon

Just wanna say: don't flash a BIOS with an overclock WITHOUT knowing whether the card/GPU can run it... robin, use WattMan and clock your GPU a bit higher. See what you can get at stock voltage, and test it with some benchmarks like Heaven or Fire Strike for a few rounds.


----------



## chris89

That's cool, I'm the same way. I tested the 290X up to 1200MHz and the 390X up to 1250MHz. The 290X could do 1250MHz too; I just didn't feel like pushing it as hard as the 390X.

We can use way less power at 1GHz with 320GB/s memory vs 384GB/s... we are limited by the PCIe bus transfer rate, i.e. the memory read/write between system and video RAM. PCIe 2.0 is 8GB/s, PCIe 3.0 is 16GB/s, and PCIe 4.0 is 32GB/s, which is still half the speed of the system memory bus of a PCIe 4.0 system.

They ought to give the PCIe bus 64GB/s so even the systems that will use it will see no system-memory-to-video-memory bottleneck.

So, just saying, I tested 320GB/s to 450GB/s on the 390X on PCIe 2.0. The bus is limiting even at 320GB/s, and the same applies to 384GB/s. Hardly any fps difference at all for the extreme increase in power consumption.

So yeah, to sum up, 320GB/s is as fast as you need to go to save power at 888mV. Maybe 1267MHz, but it's pointless.

Yep, sure thing man. I'll line up a really solid BIOS for you.
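The 320GB/s and 384GB/s figures above follow from the 512-bit Hawaii/Grenada memory bus and GDDR5's quad-pumped data rate. A minimal sketch of that arithmetic, assuming those stock specs:

```python
# GDDR5 effective data rate is 4x the memory clock (quad-pumped).
# Bandwidth (GB/s) = bus_width_bits / 8 * effective_rate_GT/s.

def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    effective_gtps = mem_clock_mhz * 4 / 1000  # e.g. 1250MHz -> 5GT/s
    return bus_width_bits / 8 * effective_gtps

print(gddr5_bandwidth_gbs(1250))  # 320.0 GB/s at the downclocked setting
print(gddr5_bandwidth_gbs(1500))  # 384.0 GB/s at stock
```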


----------



## Rexer

Quote:


> Originally Posted by *stephenn82*
> 
> No, I wouldnt go single 480. The 580 looks like a 10-15% increase due to clock rates, and refined process technology lowers power consumption even more. That MAY be worth it over the 390, it would best it by a decent margin while eating less power...the 480 is almost direct tie, but less power.
> 
> I am STILL waiting for this Rx 490/590/RX Vega (whatever Raja changes his mind to daily) If they release it in September, I have a feeliing it will be stillborn, as they;re waiting too long, dragging their feet (typical AMD, right?). Nvidia will have the second generation 10 series out, to include the Titan with updated specs, similar to the Ti of recent. But better...because Titan!


I hear you, bud. I think Ryzen was 6 months late at a minimum. Ryzen was a huge jump back into the CPU market, landing at the heels of the Kaby Lake i7s. It's supposedly better at workload efficiency. That's great, but it's not what I want to hear; I'm impetuous and want speed.
I heard some smoke yesterday that Vega would be able to use system RAM in addition to the onboard memory. They say it reduces power consumption, which would lower the heat signature, help sustain frame rates, and reduce stuttering. Yeah, that's adding a lot of happy testosterone, but again, it's getting old when Nvidia's already countering with 2nd-gen Ti cards and all we're seeing is smoke.


----------



## chris89

Yeah, I hear ya, that's funny. For sure on the whole next-gen bunch of GPUs. I would, though, for sure buy a 480. At first I was blinded by on-paper specs, which was a dull moment for me. I took the on-paper 390X specs and figured how fast it would be. Forget all that; on paper means nothing. The 480 is a card to use and have and appreciate. It's like a prettier, healthier 390X in better shape. The 390X has huge muscles, but it's heavy. The RX 480 has little muscles, but they get just as big, if not bigger, when flexed. HAHA. So yeah, I really, really like the RX 480 and think everyone should try one.

The 390X is like a card that has been working out 24/7: its muscles are enormous and swole, yet still rebuilding. The RX 480 is like a card that has been working out its entire life, yet has fully recovered and is full of strength and efficiency.


----------



## Rexer

Quote:


> Originally Posted by *chris89*
> 
> Yeah I hear ya that's funny. For sure on the whole next gen bunch of GPU's. I would though for surely buy a 480. At first because I was blinded by on paper specs, which was a dull moment for me. I took the on paper 390x specs and figured how fast it would be. Forget it all, on paper means nothing. The 480 is a card to use and have and appreciate. It's like a prettier/ healthier 390x in better fitter shape. The 390x has huge muscles but it's heavy. The Rx 480 has little muscles, but they get just as big if not bigger when flexed. HAHA So yeah I really really like the RX 480 and think everyone should try one.
> 
> 390x is like a card that had been working out 24/7 and muscles are enormous and swull yet still re-building. The RX 480 is like a card that had been working out it's entire life yet has fully recovered and is full of strength and efficiency.


Haw, haw. Yeah, the 390X got muscles. You know, I just got a Sapphire RX 480 Nitro+ for a clan member (it was a weird shuffle of cash, but the deal worked out). The old clan guy's hit his 70s. Just a cranky can of worms to deal with. He couldn't keep up with the rest of us fps guys in Call of Duty AW, so we elected to get him a better card than the Sapphire HD 7950 he was using. Granted, that was once a pretty spiffy card: great for dancing around in UT3 and Medal of Honor. Too bad those days are gone.
Time marches on, and the computer we built him needs some help. It's a 2nd-gen i7 Sandy Bridge that his brother clocked up a smidge so he'd get into games faster.
He doesn't know much except how to turn on his rig and play games. That's all he owns a computer for; keeps him outta the casinos. He's not hardware or tech savvy at all. So the clan got together, just sorta threw the 480 in his computer, and said, "Play with it."
Now he's kicking our 455e5 all over the place. Pretty sour feeling, since the guy's an old dog and just likes rubbing salt on our b4ll5. But I got to thinking, 'D4nm, he's really hustling with that 480. Am I missing something?'


----------



## chris89

Hahaha, good story. Yeah, the RX 480 is worth it if you have a 390X. Such a relief of a card: cooler, quieter, way easier on power, and just as fast if not faster.


----------



## stephenn82

Quote:


> Originally Posted by *stephenn82*
> 
> nice! You got anything for an MSI R9 390 Gaming 8G sitting around??
> 
> I saw a Hawaii BIOS from the other guy, but here is mine, just in case.
> 
> I used to run my card at 1125/1625 with +19mV and +20% power easily, with a custom fan curve in AB, but AB doesn't work with the new AMD drivers...
> 
> Can you work some magic?
> 
> Figured out my stupid mistake... didn't zip it... DOH!!
> 
> Attached now:
> 
> MSIR9390Gaming8G.zip 98k .zip file


@chris89 any luck with my MSI BIOS, or do you have one sitting around that I can use for 24/7 use on my 390 Gaming 8G?


----------



## chris89

Try this... let me know. Post HWiNFO plus GPU-Z, and throw it under some Time Spy.









atiflash_274.zip 1214k .zip file


Stephenn82_1080_1250.zip 99k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Try this... lmk post hwinfo plus gpuz throw it under some time spy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> atiflash_274.zip 1214k .zip file
> 
> 
> Stephenn82_1080_1250.zip 99k .zip file


What specs are those? I assume 1080MHz core and 1250MHz on the RAM?

How do you modify your BIOS anyway? You should do a vid and post it on YouTube. I think it was somewhere in this thread, but that's a lot of posts to comb through.

I just found RBE. Going to see if it's still applicable today.


----------



## chris89

Yep. I would try it first; it's delimited at 1080MHz, which means truly witnessing 69.12 billion pixels per second and 190.08 billion texels per second. The 1250MHz memory clock runs up to 33% cooler with less power than 1500MHz memory. We're talking 320GB/s vs 384GB/s for over 100 watts of savings and 25°C cooler. Fast RAM doesn't matter much anyway, because the PCIe slot can only transfer at most 32GB/s on PCIe 4.0, which hasn't even come out yet, and that's still 10 times slower than the card's memory bandwidth. PCIe 1.0 through PCIe 3.0 is 4-16GB/s, 20+ times slower than the card's memory output.
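For what it's worth, the pixel and texel figures quoted here are just the core clock multiplied by the ROP and TMU counts. This sketch assumes the 64 ROPs and 176 TMUs of a 390X-class Grenada GPU, which is what those numbers imply; these are theoretical peaks, not sustained throughput:

```python
# Theoretical fill rates: unit count * core clock.
# Assumes 390X-class counts (64 ROPs, 176 TMUs) -- peak figures only.

def fill_rates_gps(core_clock_mhz: float, rops: int = 64, tmus: int = 176):
    pixel = rops * core_clock_mhz / 1000   # Gpixel/s
    texel = tmus * core_clock_mhz / 1000   # Gtexel/s
    return pixel, texel

px, tx = fill_rates_gps(1080)
print(f"{px:.2f} Gpixel/s, {tx:.2f} Gtexel/s")  # 69.12 / 190.08 at 1080MHz
```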

Without a limit, you're actually seeing that much rendering performance on screen.

At stock, by contrast, you're only seeing a fraction of the clock you set. This applies to any clock set with a power limit in place; you won't ever see the actual performance of that clock with a power limit. Try it and let me know. By the way, here is the application.

HawaiiBiosReader-master.zip 73k .zip file


Use this to flash: close everything in the taskbar, really everything. Open ATIWinFlash as Administrator, open the .rom, and begin the flash. Be very patient; it can take up to 5 minutes (less if you open a movie and drag it to the side before beginning the flash). Don't click anything after starting the flash or it could corrupt the BIOS. Good luck.

atiflash_274.zip 1214k .zip file


----------



## gapottberg

Hey chris89,

Finding this all very interesting, as I am a strong proponent of undervolting video cards (thanks to The Undervolter and his heresy) to maximize performance while minimizing heat and noise.

I'm not familiar with delimiting. Is this the same as increasing the power limit in, say, Afterburner? Or is this a BIOS-only tweak?

Also, if, as you say, there is a true PCI Express bottleneck on memory... why are there measurable performance gains to be found with higher RAM speeds?

I am not saying your reasoning is wrong, but I think it's more complicated than describing it as a pure bottleneck. I do believe the gains do not justify the costs more often than not, and they may well be negligible or nonexistent in real workloads... but if it were a true bottleneck, we wouldn't see people with super-clocked memory seeing any gains in benchmarks at all.

A link to a written article on delimiting would be much appreciated. Thanks, mate. I'll keep following and may try one of your BIOS mods myself if I find it's right for me after reading up more on it.


----------



## chris89

Hi gapottberg,

Sure thing, man. I'm glad you found it interesting to read. I too found it interesting when I was dialing in my first 390X. With a bit of 380X knowledge on hand, I had a tiny idea of what I was doing haha.

For me to come right up on the OpenCL compute score of the TITAN X Maxwell, I had to set the TDP limit to, I think, 350 watts on the 380X, even though it didn't use anywhere near that. I was only getting an idea of what was limiting these new-tech AMD cards. The card I used would overheat the VRM; only when I used graphite thermal material on the VRM did it really change the whole card completely.

As far as the memory goes: the PCIe bus is the channel through which data flows from system memory to GPU memory. The CPU copies/writes/reads from system memory. Throughput inside the CPU's caches is near or beyond 1,000GB/s. It then writes what it computed to system memory at whatever rate the system memory can sustain, which could be as much as 64GB/s, or way more on supercomputers. So there's always a bottleneck between the CPU and system memory, and likewise between system memory and the PCIe bus.

Since the CPU wrote data to system RAM, dropping from 1,000GB/s down to 64GB/s, it then has to feed as much of that data as it can into the PCIe bus, which is limited to 32GB/s on PCIe 4.0: half the throughput of the system RAM itself. The PCIe bus then writes that data into GPU memory at up to 32GB/s; on older specs like PCIe 1.0 to 3.0 it's 4-16GB/s. Meanwhile, the video memory has a theoretical peak throughput of 384GB/s (call it up to 1TB/s on some cards). That bandwidth can only be thrown around on the GPU itself; it's confined there. The card has to accumulate data over time from system RAM, arriving through the PCIe "channel" at 16GB/s on PCIe 3.0. Basically, the internal video memory throughput runs circles around the read/write from the PCIe bus, i.e. from system RAM.

So, to sum up why faster video RAM stops helping at a certain level: you won't see gains past the point where a too-high memory clock stops yielding fps, because the PCIe "channel" feeding the card is already completely congested/packed/taxed. Make sense?

Now for the delimited power limit: it comes down to thermals. Back in the day, cards didn't have a power limit but could keep the VRM cool enough, because clocks weren't too high and the VRM could handle the voltage of those low clocks. If you rework the thermal design of a GPU so it can easily handle having no power limit at all, then every clock performs as it truly should, because it has all the power it could ever require to yield max performance. Make sense? I don't wanna make this a book, bro haha
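The per-generation x16 figures being rounded to 4/8/16/32GB/s in this thread can be derived from each generation's link rate and line encoding. A sketch using the published spec values:

```python
# PCIe x16 usable bandwidth per generation:
# raw GT/s per lane * encoding efficiency * 16 lanes / 8 bits per byte.
# Gen 1/2 use 8b/10b encoding; gen 3/4 use 128b/130b.

GENS = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130), 4: (16.0, 128 / 130)}

def pcie_x16_gbs(gen: int) -> float:
    gtps, efficiency = GENS[gen]
    return gtps * efficiency * 16 / 8

for g in GENS:
    print(f"PCIe {g}.0 x16: {pcie_x16_gbs(g):.2f} GB/s")
```

The exact gen 3 and gen 4 numbers come out to about 15.75GB/s and 31.5GB/s, which the thread rounds to 16 and 32.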


----------



## gapottberg

I am pretty sure I understand it, but I'll keep working it over in my head. I have been dealing with data-transfer-rate bottlenecks since my 386SX days, so I know how it works in theory; I am more interested in the exact points at which gains on one side of the equation become pointless due to the bottleneck they flow through.

Also... the power delimit thing... is what you're doing the same as what I do in software like Afterburner or WattMan when I increase the power limit? Just you do it via BIOS and I do it in software. Or is your BIOS tweak different?


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Hi gapottberg,
> 
> Sure thing man. I'm glad you found it interesting to read. I too found it interesting when I was dialing in my first 390x. With a bit of 380x knowledge on hand I had a tiny idea of what I was doing haha.
> 
> For me to come right up on a OpenCL Compute score of the TITAN X Maxwell, I had to set tdp limit to I think 350 watts on the 380x. Even know it didn't use anywhere near that. I was only getting an idea of what was limiting these new-tech AMD cards. The card I used would overheat the VRM, only when I used Graphite thermal material on the VRM did it really change the whole card completely.
> 
> As far as the memory see the PCIe bus is the channel where the data flows from System Memory to the GPU memory. See CPU copy's/ write's/ read's from system memory. CPU throughput inside the cpu's "Memory" or "Cache" is near or beyond 1,000GB/s. Then it write's what it compiled to System memory at what the System Memory can read at, which is could be as much as 64GB/s or way more on super computer's. So there's always a bottleneck between the CPU & the System Memory, same goes with the Bottleneck between System Memory and the PCIe bus.
> 
> Since now the CPU wrote data to system RAM from 1,000GB/s down to 64GB/s, it then has to feed as much of that data as it can into the PCIe bus. Which is limited to 32GB/s on PCIe 4.0. So there we have 2x worse throughput from system RAM to PCIe than the system RAM itself. See the PCIe bus would then write that 64GB/s system RAM at up to 32GB/s GPU memory Read/ Write. On older stuff like PCIe 1.0 to 3.0 it's 4-16GB/s. That means the Video memory that has theoretical peak throughput of 384GB/s or call it 1TB/s in Video Ram Throughput. That call it 1TB/s can only be thrown around on the GPU itself, it's confined 1TB/s. It has to accumulate over time data from system Ram that wrote it's 64GB/s to Video Ram through the PCIe "Channel" at 16GB/s on PCIe 3.0. Basically the internal video memory throughput runs circles around the Read and Write from the PCIe bus ie System Ram.
> 
> So to sum it up the limitation of Video Ram not helping at a certain level is, you won't see gains past a certain point when the latency becomes too great at too high video memory clock to yield fps gains because of the completely congested PCIe bus "Channel" being totally packed/ taxed/ accumulated. Make sense?
> 
> Now to DeLimited power limit, it comes down to thermals. Back in the day they didn't have a power limit but were able to keep the VRM cool enough because clocks weren't too high and VRM could handle the voltage of said "Low" clock. If you can work over the thermal design of a GPU to with ease handle absolutely no power limit. Then all clocks would be seen as they truly are because the clock has all the power it could ever require to yield max performance. Make sense? I don't wanna make this a book bro haha


That is super interesting.

From my understanding, my MSI R9 390 Gaming 8G has one of the best stock coolers out there.

I do have two VRM monitors showing in GPU-Z. One reads about 45-52°C when loaded... the other always shows 64°C no matter what, and I would presume it's not actually connected to anything.

I haven't loaded the BIOS yet. Still wary that downclocking the VRAM would help, but it's worth a shot. Thanks for the links to the stuff. I was searching around OC.net and found a lot, but then had to leave on an errand run, then go to work.


----------



## christoph

OK, just a question:

What are VRM temperature 1 and temperature 2 linked to?


----------



## stephenn82

Quote:


> Originally Posted by *christoph*
> 
> ok just question
> 
> what is VRM temprerature 1 and temprerature 2 link to?


IDK, that was a question for everyone else who knows the engineering side of the 390.

I am at work and cannot access my computer at this time.


----------



## gapottberg

Wow, preliminary results are looking good. Got through the entire Time Spy bench below 60°C, and the fans never broke 50% speed with my custom fan curve. Gonna do a stress test and see how hot I can make it next. Here is a pic of my setup.

[edit] Passed the full Time Spy stress test... topped out at 63°C and never broke 50% fan speed. I am impressed. At stock I was hitting over 90°C with this card during stress testing. With my undervolt and custom fan curve I was down to around 73°C IIRC. I will take another 10°C drop for next to no performance hit, as far as I can tell, from the 250MHz drop in memory speed.

Thanks, chris89!


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> It's easy to fix these cards since they have a dual BIOS switch.


I don't think I have dual BIOS on my MSI card... if it bricks, am I SOL?


----------



## gapottberg

@stephenn82: I am using a completely software based solution...no Bios tweaking yet. I am not certain, but fairly confident it does nearly the same thing chris82's bios tweak will do. MSI afterburner is a great tool but it has been a bit buggy since wattman came out. I have managed to get it to work after using an aggressive driver remover called DDU, and doing a clean install of both Relive drivers and Afterburner. Just stay out of wattman if you use AB because they dont play nice with each other.

hopefully chris89 will verify if his Bios tweak actually does anything different than what my software tweak is doing soon. I am curios. Either way you can try what I have done with less risk as a gfx card without duel Bios that has the bios go bad is usually bricked.


----------



## chris89

Right on. If you guys wanna stick with software, then by all means stick with your 110-120°C stock thermal limits... right at the edge of melting the card at stock. If having your card not last is what you prefer, then by all means stick to software and don't modify the BIOS to run cooler.

And yeah, you can't brick the card if you don't do something wrong... If you don't wanna do it, then sure thing, don't worry about it; stick with the stock BIOS and probably burn it out eventually.


----------



## Streetdragon

What does RAM transfer rate have to do with PCIe speed? Nothing!

I just changed my CPU, so I have CrossFire at x16/x16 instead of x16/x8, and my Fire Strike score went up about 2k points with the same clocks etc.

Lower VRAM speed is OK to save power etc. and won't have a big impact on performance, but it has NOTHING to do with the PCIe speed.

chris89, I totally agree with most of your text, but PCIe-limits-VRAM is nonsense.


----------



## chris89

As you can see, PCIe 2.0 is less than 8GB/s, about 6.5GB/s of memory read & write, all bound to the PCIe bus... the limitation of writing system memory to video memory.

The same applies to PCIe 2.0 vs 3.0... the difference is 8GB/s max vs 16GB/s max, and PCIe 4.0 is 32GB/s. The PCIe bus bandwidth is always one generation behind the system memory of a system that runs that PCIe specification.

By the way, the width, i.e. x1, x4, x8, x16: that's all about power and has nothing to do with the "speed". The speed is bound by the generational specification of the PCIe slot. Basically, x8 is 37.5 watts through the slot, which is made up for by drawing more power from the 6-pin, so no loss. At x16 it's 75 watts through the slot.

If your card were x16 using no external power, it would only run with a max TDP of 37.5 watts in an x16 slot wired for x8, without supplemental power from the PCIe power connectors.
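On the power side, the standard PCIe budgets are 75W from a full x16 slot, 75W per 6-pin connector, and 150W per 8-pin connector. A minimal sketch of the board-power budget this implies; the one-6-pin-plus-one-8-pin layout below is just an illustrative example, not a claim about any specific card:

```python
# Maximum board power under the standard PCIe power budgets:
# 75W from a full x16 slot, 75W per 6-pin, 150W per 8-pin connector.

CONNECTOR_WATTS = {"slot_x16": 75, "6pin": 75, "8pin": 150}

def board_power_budget(connectors: list) -> int:
    """Sum the power budget of the slot plus external connectors."""
    return sum(CONNECTOR_WATTS[c] for c in connectors)

# e.g. a card fed by the slot, one 6-pin, and one 8-pin:
print(board_power_budget(["slot_x16", "6pin", "8pin"]))  # 300 W
```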


----------



## Streetdragon

And VRAM speed still has nothing to do with the PCIe speed^^


----------



## chris89




----------



## bluej511

Quote:


> Originally Posted by *chris89*


And I'd love to see which GPU maxes out PCIe 2.0 speeds haha.


----------



## chris89

390X... haha, doesn't matter how high you clock the RAM, it's still limited by 6.8GB/s read/write... theoretical max 8GB/s.

If it were on PCIe 5.0 at 64GB/s, then it wouldn't be bottlenecked by the system RAM, since system RAM is over 40GB/s on dual X5650s.

I have results: only 0.1fps difference, give or take, from 1563MHz RAM to 1758MHz RAM... which is nearly 200 more watts for no fps gains.

Basically, PCIe is a bottleneck. If PCIe were an infinite, theoretical 1TB/s... it would be bound by system memory throughput instead...

Notice how CPU physical system memory read/write/copy are all near the same.

GPU memory read/write comes in from the system RAM at the slot spec... yet once the data is in video memory, it can be copied around really fast internally. That doesn't mean more fps, because it's still always limited by read/write...


----------



## Streetdragon

And still, VRAM speed has nothing to do with the PCIe speed... don't know why you try to ignore that.


----------



## chris89

Thank you for your comment; I appreciate it. What I would like to ask is: have you noticed that without the PCIe slot, your card does nothing at all? It sits there with no communication and can't pull even 0.01 frames per second... Did you ever think that maybe the data that ends up stored in video memory comes through the PCIe slot, just maybe? The PCIe slot has a bandwidth limit for reading and writing to video RAM... unless I'm a fool and PCIe has a billion terabytes per second of bandwidth and I don't know what I'm talking about?

The video memory might well perform faster at higher clocks, sure... but that data must first be stored in video memory... which comes from... thin air, possibly?


----------



## stephenn82

Quote:


> Originally Posted by *gapottberg*
> 
> @stephenn82: I am using a completely software based solution...no Bios tweaking yet. I am not certain, but fairly confident it does nearly the same thing chris82's bios tweak will do. MSI afterburner is a great tool but it has been a bit buggy since wattman came out. I have managed to get it to work after using an aggressive driver remover called DDU, and doing a clean install of both Relive drivers and Afterburner. Just stay out of wattman if you use AB because they dont play nice with each other.
> 
> hopefully chris89 will verify if his Bios tweak actually does anything different than what my software tweak is doing soon. I am curios. Either way you can try what I have done with less risk as a gfx card without duel Bios that has the bios go bad is usually bricked.


I have done that, MULTIPLE times... AB no longer works on ANY driver past 16.11.5 for me. I don't even open WattMan. I delete all the corresponding AMD folders as well; DDU doesn't always get them. Don't know what else to do. Maybe delete every MSI registry entry as well, just to be safe?

I am trying the BIOS he made for me today. Not as aggressive as I am used to, but the way chris89 explained it, it will be better than the 1150/1675 I was running, on LESS POWER. Getting ready to load this shindig up!


----------



## stephenn82

Quote:


> Originally Posted by *Streetdragon*
> 
> and still VRAM-Speed has nothing to do with the PCI-Speed.. dont knoe why you try to ignor that


I believe any data that goes to that card goes through the PCIe lanes, right? Just like the RAM on the motherboard has to connect through the CPU. If the CPU can't physically support DDR3-3200, it won't run it without some intervention from an onboard chip (which is STILL limited by the CPU speed feeding the northbridge).

Compare it to this: say you have a 600hp engine sitting in a three-speed VW Rabbit. Sure, it's fast... but with those ****ty stock gears, you can only go so fast. If you swap up to, say, a 6-speed close-ratio box with steep 5th and 6th gears, you will be going SOOOO much faster in the top end.


----------



## stephenn82

Pre-flash Time Spy, no AB, no WattMan, plain-jane run - http://www.3dmark.com/3dm/19169477?


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> Pre flash timespy, no AB, no Wattman, plain jane run - http://www.3dmark.com/3dm/19169477?


Here's mine to compare; haven't done it with my 1700X though lol.

http://www.3dmark.com/compare/spy/1532151/spy/1305103


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> heres mine to compare, havent done it with my 1700x though lol.
> 
> http://www.3dmark.com/compare/spy/1532151/spy/1305103


Quote:


> Originally Posted by *stephenn82*
> 
> Pre flash timespy, no AB, no Wattman, plain jane run - http://www.3dmark.com/3dm/19169477?


And POST-flash: a slight improvement over stock. I did notice the side window of my Air 740 wasn't hot like usual when loaded up; I popped open the case to put a hand near where the blower vents, and it was only slightly warm, not the typical sauna that could roast marshmallows at stock speeds.

GREAT WORK!!! @chris89

Quick comparison of results:
http://www.3dmark.com/compare/spy/1532297/spy/1532151
Going to load 17.4.1 and see if any more improvement can be had.


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> heres mine to compare, havent done it with my 1700x though lol.
> 
> http://www.3dmark.com/compare/spy/1532151/spy/1305103


Comparison of post-flash with yours; I have 1080/1250 for clocks. You SLIGHTLY edge mine out. What are your clocks?
http://www.3dmark.com/compare/spy/1532297/spy/1305103#

They say hyperthreading has NO benefit over non-HT CPUs. That's a lie; I blew your CPU score out of the water by almost 50%. Can't wait to see what that AMD will do! I'm glad they finally got their act together! The last AMD I owned was a 5000+ Black Edition Brisbane core; it was a little beast back in 2007-2008!


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> comparo of post flash with yours. i have 1080/1250 for clocks. you SLIGHTLY edge mine out. What are your clocks?
> http://www.3dmark.com/compare/spy/1532297/spy/1305103#
> 
> they say hyperthreading has NO results over not HT cpu's. Thats a lie. I blew yours out of the water by almost 50% more. Cant wait to see what that AMD will do! Im glad they finally got their act together! Last AMD i owned was a 5000+ Black Edition Brisbane core. was a little beast back in 2007-2008!


I did a CPU-only test; I'm over 7000, I think. I don't think I did it with the CPU or RAM OCed, but I will.

http://www.3dmark.com/spy/1313147

My clocks on the timespy were 1040/1500. I can do 1100/1600 on stock voltages.


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> http://www.3dmark.com/spy/1313147
> 
> My clocks on the timespy were 1040/1500. I can do 1100/1600 on stock voltages.


You running a modded BIOS from chris89? If not, what are your GPU max temps, if you don't mind me asking?
I just loaded up the new 17.4.1 drivers to see if any more improvements can be made over the modded BIOS.
I will set GPU-Z to record max GPU temp and post it here. I was definitely in the 70s without the AB fan profile on stock 16.11.5.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> You running a modded BIOS from chris89? If not, what are your GPU max temps, if you don't mind me asking?
> I just loaded up the new 17.4.1 drivers to see if any more improvements can be made over the modded BIOS.
> I will set GPU-Z to record max GPU temp and post it here. I was definitely in the 70s without the AB fan profile on stock 16.11.5.


Stock BIOS, GPU max temp is like 39°C lol


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Stock BIOS, GPU max temp is like 39°C lol


Post a log file from GPU-Z please. Is that card under water?


----------



## stephenn82

OK, so the newest drivers suck for the 390.

http://www.3dmark.com/compare/spy/1532297/spy/1532151/spy/1532439

1 is 16.11.5 post-flash,
2 is 17.4.1 post-flash,
3 is 16.11.5 pre-flash.

Max VRM temp: 56°C
Max GPU temp: 56°C
Max power: 185W


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> post a log file from GPUz please. Is that card under water?


It is under water, yeah. I need to redownload GPU-Z, and I need to run Time Spy again, fully this time.


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> It is under water yea, i need to redownload gpuz again but i need to run timespy again fully this time.


Mine is on air. Get a custom BIOS from chris89; I think his 1133 or 1166 would be epic on your card.

I'm just running the stock MSI Frozr V.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> mine is on air. get a custom bios from Chris89. i think his 1133 or 1166 would be epic on your card.
> 
> Im just running the stocker MSI Frozer V


Pointless, as I'll be getting a Vega or an RX 580 next lol.

Here is mine at 1100/1600 on stock voltages. The 1700X is at 3.8GHz with 2933MHz RAM.

http://www.3dmark.com/compare/spy/1532761/spy/1532151#


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Pointless as ill be getting a vega or rx 580 next lol.
> 
> Here is mine at 1100/1600 stock voltages. 1700x is at 3.8 and 2933mhz ram.
> 
> http://www.3dmark.com/compare/spy/1532761/spy/15321532297#


you mean THIS one
http://www.3dmark.com/compare/spy/1532761/spy/1532297#

That one you listed was stock clocked unmodded


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> you mean THIS one
> http://www.3dmark.com/compare/spy/1532761/spy/1532297#
> 
> That one you listed was stock clocked unmodded


Yeah, don't think I found your modded one. Compare mine to yours if you'd like and post it, I'd love to see it.


----------



## chris89

*

Stephenn82_1133Mhz_PACK.zip 297k .zip file
*


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Yea dont think i found your modded one, compare mine to yours if youd like and post it would love to see it.


dude, I posted it into a reply of mine...lol


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> *
> 
> Stephenn82_1133Mhz_PACK.zip 297k .zip file
> *


ooooh a new one!

I took your stock 1080 1250 and ran it, was nice. I tinkered in wattman, set power to 30, 1150 core, 1475 ram...and holy smokes!

It was keeping up with my 1160/1650 +25mv +30 power when running afterburner...and much cooler!

I was seeing temps of 62c max on GPU, 65c max on VRM. That is hella awesome!

I am going to load up the new one now

OH SNAP...THREE! which one? mwuuhahahaaahaaaahaaaaaaaaaaaa


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> dude, I posted it into a reply of mine...lol


Ah you posted a comparison i see it now, i couldn't compare as you had 2 results but this one should be with your post flash.

http://www.3dmark.com/compare/spy/1532297/spy/1532761#


----------



## stephenn82

results! so, it did even better than the 1125/1450 I set in your other bios via Wattman.

http://www.3dmark.com/compare/spy/1536736/spy/1536306

Same max temps, 62°C core, 65°C VRM1. I think I may take your advice from a previous post about getting the thicker TIM sheet for cooling. VRM1 is located on the back of the card, under the backplate? The cooling strip on the front side near the chokes is hella thick and good. It fully covers and makes contact with the bits and the cooler. I like MSI for that reason.


----------



## chris89

Which voltage BIOS did you use? Nice, man. Sometimes less voltage can score more points, with potentially less stability. Let's get you above 30fps... Try running it a couple times, it will go up.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> results! so, it did even better than the 1125/1450 I set in your other bios via Wattman.
> 
> http://www.3dmark.com/compare/spy/1536736/spy/1536306
> 
> Sam max temps, 62c core 65c VRM1. I think I may take your advice in a previous posting about getting the thicker TIM sheet for cooling. VRM1 is located on the back of card, under the back plate? The cooling strip on the front side near the chokes is hella thick and good. It fully covers and makes contact with the bits and the cooler. I like MSI for that reason.


Doing it with no special BIOS lol. Try running it at high priority and make sure you're running nothing else in the background; might give you more oomph.

http://www.3dmark.com/compare/spy/1536736/spy/1532761


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Which voltage bios did you use? Nice man. Sometimes the less voltage can prove more points with potentially less stability. Let's get you above 30fps... Try running it a couple times, it will go up.


oh snap....I think I went middle ground. How can I pull that info? Is it in the log file?

Remembered: HawaiiBiosReader will show it.

@chris89 I loaded the 1333 max volts one, just in case. Been busy running around the house checking stuff out, figuring out why two outlets don't have power but no breakers are tripped. Real fun :/


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Doing it with no special BIOS lol. Try to run priority to high and make sure youre running nothing else in the background might give you more oomf.
> 
> http://www.3dmark.com/compare/spy/1536736/spy/1532761


meh


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> meh


Yeah, I know, man, I got lucky; not too far off though. I can run her at 1200/1650 with +100mv and 50% power, not sure what she will do.


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Yea i know man i got lucky, not too far off though. I can run her at 1200/1650 with 100mv and 50% power not sure what she will do.


If you want to get luckier, drop a custom BIOS on there. I am getting better results with less heat and less clocking required. It's not super hard or scary to do.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> if you want to get luckier, drop a custom bios on there. I am getting better results after with less heat and less clocking required. its not super hard or scary to do.


I'm on water anyway; it barely reaches 40°C and the VRMs are under 60°C. I'm going RX 580 or Vega in a couple months so I'm not gonna bother. I don't even run it OCed.


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Im on water anyways, barely reaches 40°C and VRMs are under 60°C, im going rx 580 or vega in a couple months so im not gonna bother. I dont even run it OCed.


You want to sell it afterwards? It's a selling point... custom BIOS, better results... twofold.


----------



## chris89

If you're on water or a modded air reference you can do 1,250MHz core... that's a fill rate of 80 billion pixels per second... 80 GPixel/s.

Btw, on a modded reference cooler the gust coming off the blower can be felt from 10 feet at least. Feels and sounds like a leaf blower haha... 80 GPixel/s though... and 220 GTexel/s outright.
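Those fill-rate figures are just clock × unit-count arithmetic. A minimal sketch, assuming the published 390X counts of 64 ROPs and 176 TMUs (those numbers come from the card's spec sheet, not from this thread):

```python
# Fill-rate math behind the 80 GPixel/s / 220 GTexel/s figures.
# 64 ROPs and 176 TMUs are the published 390X specs, assumed here.

def fill_rates(core_mhz, rops=64, tmus=176):
    """Return (GPixel/s, GTexel/s) at a given core clock in MHz."""
    gpixels = core_mhz * rops / 1000.0   # MPix/s -> GPix/s
    gtexels = core_mhz * tmus / 1000.0   # MTex/s -> GTex/s
    return gpixels, gtexels

print(fill_rates(1250))  # -> (80.0, 220.0)
```

A plain 390 has 160 TMUs, so its texel rate at the same clock would come out lower.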


----------



## stephenn82

oh boy, page 1099...tax season is on the brain. anyways...

Results: 1 is the recent BIOS with RAM pushed to 1475, 2 is the 1333mv BIOS at stock, and 3 is where I started pre-flash at stock GPU.

http://www.3dmark.com/compare/spy/1536954/spy/1536736/spy/1532151

not bad!

max temps!
Core 62c
66c VRM1


----------



## stephenn82

@bluej511, im nipping at your heels!

http://www.3dmark.com/compare/spy/1536954/spy/1532761


----------



## chris89

What's your 1250mhz memory results at 888mv? Also which voltage did you use out of the 3 bios I sent? So i know if you could run less voltage. We can go up a lot with those temperatures. Well at least 1200Mhz I expect.

What's your VRM 1 & 2 Temperature?


----------



## bluej511

Quote:


> Originally Posted by *chris89*
> 
> If your on water or a modded air reference you can do 1,250mhz core... that's at the height of 80 billion pixel's ... 80 Giga Pixel's.
> 
> Btw - on reference modded, the gust coming off the blower can be felt from 10 feet at least. Feels and sound like a leaf blower haha... 80 Giga Pixel's though... and 220 Giga Texel's out right.


I've tried it quite a bit; it maxes out at 1200, even 1225 gets artifacts. My Elpida memory is garbage and stuck at 1650, that's the max.
Quote:


> Originally Posted by *stephenn82*
> 
> @bluej511, im nipping at your heels!
> 
> http://www.3dmark.com/compare/spy/1536954/spy/1532761


You beat me already; my CPU is higher obviously and my VRM temps might be as well, but nice job.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> What's your 1250mhz memory results at 888mv? Also which voltage did you use out of the 3 bios I sent? So i know if you could run less voltage. We can go up a lot with those temperatures. Well at least 1200Mhz I expect.
> 
> What's your VRM 1 & 2 Temperature?


I answered you earlier, my man.

oh boy, page 1099...tax season is on the brain. anyways...

results, 1 is recent bios, pushed ram to 1475, 2 is 1333 stock bios, and 3 is where I started preflash at stock GPU.

http://www.3dmark.com/compare/spy/1536954/spy/1536736/spy/1532151

not bad!

max temps!
Core 62c
66c VRM1

I used the 1333mv one, with stock 1250 mem. Let me run it; I think it was a couple hundred points less.


----------



## stephenn82

wow, it was only TWO less...TWO. I swear it was lower than that. Maybe just running it a few times helped "seat" it all in...just like an engine on a dyno.

http://www.3dmark.com/compare/spy/1536954/spy/1537220


----------



## stephenn82

So, running Valley showed a difference with faster VRAM.

1133/1250/1333mv


and 1133/1475/1333mv


It also bumped max core temp to 65°C and VRM1 to 70°C.

I could try another bios with LESS voltage, eh? I may do that. Test for stability


----------



## chris89

Yeah try it start on the lowest voltage and test..


----------



## stephenn82

I will on Wednesday, getting ready to go to bed, work tomorrow, then off wed morning.


----------



## stephenn82

So before bed last night, I did get on BF1 and noticed that a lot of that lag (I would shoot, the sound of firing was heard, the screen displayed it, dusting occurred, but nothing registered on the server and I WOULD DIE, or I'd die before even seeing who shot me) seems to have disappeared. Not sure what the hell it is... but this custom BIOS has resolved about 90% of that laggy feel in the game. I was sucking after three matches and turned in. I guess I had grown accustomed to compensating for that lag; I couldn't hit anything and it was lowering my KDR... so I got off there.

Looking forward to dropping the voltages and testing for stability... with this 390, I don't think it will be a problem. It's funny that my BIOS shows as "Hawaii" even in GPU-Z, but it's a Grenada card, right?


----------



## chris89

Grenada is just a refined Hawaii, is all. Cool, dude, glad to hear good things. The BIOS I sent is 888mv... which can't hold much more than 1293MHz memory clock stable, so that's probably your issue. You need 1000mv for high memory clocks, which comes with at least 100 more watts of load and lots more heat. I saw you gained like 0.1 fps from 1250MHz to 1600MHz memory.

Let's just focus on leaving ram at 888mv at 1250mhz to just maybe 1267mhz. Then focus on the Core Clock completely. We have a lot more to gain from core clock than memory bandwidth. Lmk bro. I wanna line you up with faster BIOS today.


----------



## componentgirl90

Hi Guys and gals,

My XFX 390X DD fan is still spinning up to 3600 RPM at idle sometimes, at 45-52 degrees Celsius.

Looking at a power supply issue but I need help as I am noob.

TechPowerUp GPU-Z says the 12V is 11.75V; the VDDC is 1.094V average with 1.086V minimum and 1.250V max; VDDCI 1.031V average with no variation; VDDC current is 2.0A (min), 6.8A max, 2.3A average.

Are those normal? Are there any other voltages I should be checking?

I also have another EVGA PSU which is 500W which I might swap in to test but thinking thats quite possibly too little for gaming but it might be good enough for idle?

I have a EVGA Supernova G2 850W Gold installed currently.


----------



## chris89

Hello. Post your HWInfo screenshot and we could tell you more. What's the issue again?

As far as powering the 290X/390X: at load you need no less than 40A total on the 12-volt rail... check the sticker or the specs online. It could be combined rails as well, e.g. Rail #1: 20A plus Rail #2: 20A for 12V.

Use this calculator to work out total output with droop... If your 850W unit droops to 11.75V on a 70A rail, total 12V output is 70 × 11.75 = 822.5W rather than the 840W you'd get at a full 12V...

The 500W has 40-44A, which is enough for load when not overclocked.

http://www.rapidtables.com/calc/electric/Amp_to_Watt_Calculator.htm
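The droop math is simply amps × volts. A quick sketch of the 70 A example above (the 70 A rating is the example figure from the post, not a measured value):

```python
# 12V rail output at nominal voltage vs. the drooped 11.75V GPU-Z reports.

def rail_watts(amps, volts):
    """Power delivered by a rail: P = I * V."""
    return amps * volts

print(rail_watts(70, 12.0))    # -> 840.0 W (rated)
print(rail_watts(70, 11.75))   # -> 822.5 W (under droop)
```

Same formula the linked calculator uses, just inverted from their amps-to-watts form.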


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Hello. So post your Hwinfo screenshot and could tell us more. What's the issue again?
> 
> As far as powering the 290x/390x at load would be need no less than 40A total on 12 volt rail... check sticker or specs online. Could be combined rails as well. Rail #1 : 20A, Rail #2 : 20A for 12V.


"XFX 390X DD (Sep 2015 purchase). The problem came around Jan 2017: fans at 3600 RPM (I call this "revving") at 0% load, normal idle temps.

Blew dust off + reseated. Possibly helped. Card has been used a lot.

RMAed it. Fault not Found. Testing was youtube "1 day" and not under load. They suspected software issue but card revs when not in windows.

I installed it again after the RMA (on approximately the 12th March), installed different drivers (the latest ones 17.2.1). Things improved. Up until today, the card revved just once a few minutes or so after windows start up, without any apparent exception. It would not rev at any other time.

However, today it seems to have gone downhill again. It has revved several times during being switched on, including when booting up and apparently not in windows.

145.png 35k .png file

"


----------



## chris89

Oh, I see what's going on haha, a BIOS issue and a software issue.

First, save your BIOS from GPU-Z and upload it here as a .zip.

Then uninstall the driver in Device Manager (ticking the software checkbox), run DDU without restarting, and delete the AMD folder from C:\.

Once DDU has cleaned without a restart, install 17.1.2 WHQL fresh. Then, without restarting again, let's flash the BIOS, and finally restart. Haha, long answer, but it will help you a lot.

GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file
Save the BIOS .rom, compress it to .zip, and attach it here

DDUv17.0.5.1.zip 1876k .zip file
Use DDU in Normal Mode with "Clean without restart" (AMD), after first uninstalling the display driver & software in Device Manager

Once done with complete uninstall and cleaning... I would install 17.1.2 WHQL as fresh and then without restart after you install ReLive at the end. We will flash the BIOS, then finally restart.

atiflash_274.zip 1214k .zip file
After you send me your BIOS, I will correct it and send it back. Once you're done with all the above steps, open ATIWinFlash.exe as Administrator. Make sure everything is closed, especially HWInfo, GPU-Z, MSI Afterburner, etc. When I send the BIOS back, open the .rom in ATIWinFlash and click Begin Flash. Be very patient and do not click anything while it flashes; it could take up to 5 minutes.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Oh I see what's going on haha BIOS issue and software issue.
> 
> First save bios from GPUz and upload here as .zip.
> 
> Then uninstall the driver as well as software checkbox in Device manager. Then DDU without restart. Install call it 17.1.2 WHQL fresh... Delete AMD folder C:\
> 
> Once DDU'ed without restart, and restart later. Install 17.1.2 WHQL. Then without a restart again let's flash the BIOS. HAHA Long answer but it will help you a lot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file
> Save BIOS .rom and .zip it/ Compress to .zip and attach here
> 
> DDUv17.0.5.1.zip 1876k .zip file
> Use DDU as Normal Mode and Clean without restart AMD after first uninstalling display driver & software Device Manager
> 
> Once done with complete uninstall and cleaning... I would install 17.1.2 WHQL as fresh and then without restart after you install ReLive at the end. We will flash the BIOS, then finally restart.
> 
> atiflash_274.zip 1214k .zip file
> After you send me your BIOS, I will correct it and send it back. After your done with all above steps. Open ATIWinFlash.exe as Administrator... Make sure everything is closed especially HWInfo and GPUz/ MSI Afterburner etc. After I send you back the bios you will open it in ATIWinFlash and will open the .rom and click begin flash. Be very patient and do not click anything while it flashes. It could take up to 5 minutes to flash.


Thanks for this! Here is the BIOS

BIOS.zip 98k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *componentgirl90*
> 
> Hi Guys and gals,
> 
> My xfx 390x DD is still going to 3600rpm on the fan when idle sometimes @ 45-52 degrees celcius.
> 
> Looking at a power supply issue but I need help as I am noob.
> 
> Techpowerup GPU z says the 12V is 11.75V, the VDDC is 1.094V average with 1.086V minimum and 1.250V max, VDDCI 1.031V average with no variation, VDDC current in 2.0A (min), 6.8A max, 2.3A average
> 
> Are those normal? Are there any other voltages I should be checking?
> 
> I also have another EVGA PSU which is 500W which I might swap in to test but thinking thats quite possibly too little for gaming but it might be good enough for idle?
> 
> I have a EVGA Supernova G2 850W Gold installed currently.


My 12V rail shows as 11.68-11.85 in GPU-Z, but my BIOS shows it as a solid 12.14. I have put a multimeter on it in the past and it read 12.13. I wouldn't trust the GPU-Z voltage reading entirely.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> My 12v rail shows as 11.68-11.85 in GPU-z, but my bios shows it as a solid 12.14. I have put a multimeter on it in the past and it was 12.13. I wouldnt follow the GPU-Z voltage reading entirely.


Correct, most software readings are WAY off for voltages. Even HWiNFO64 shows my 5V and 12V rails below spec, so I definitely wouldn't worry about that. BIOS or DMM is the way to go.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Hawaii is Grenada just a refined version is all. Cool dude glad to here good things. The BIOS i sent is 888mv... which can't hold much more than 1293mhz memory clock stable so that's probably your issue. You need 1000mv for high memory clocks which comes with at least 100 more watts load and lots more heat. I saw you gained like 0.1 fps from 1250mhz to 1600mhz memory.
> 
> Let's just focus on leaving ram at 888mv at 1250mhz to just maybe 1267mhz. Then focus on the Core Clock completely. We have a lot more to gain from core clock than memory bandwidth. Lmk bro. I wanna line you up with faster BIOS today.


huh... the Valley benchmark was A VAST IMPROVEMENT with the memory clock bumped up, as the post above shows.

No, I didn't mess with the voltage of the memory. It holds well and stable at 1475MHz on whatever voltage you provided. I didn't even bump up power, still at 0% in Wattman. That 1133 is very responsive. I will try tomorrow to get the lower core-voltage BIOS loaded and try again, as I am stuck at work for a 24-hour shift, since 6:40 this morning.


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Correct, most software reading is WAY off for voltages. Even hwinfo64 shows my 5 and 12v rails below spec so i def wouldnt worry about that. BIOS or DMM is the way to go.


but what if I like analog volt meter? just teasin


----------



## componentgirl90

Quote:


> Originally Posted by *bluej511*
> 
> Correct, most software reading is WAY off for voltages. Even hwinfo64 shows my 5 and 12v rails below spec so i def wouldnt worry about that. BIOS or DMM is the way to go.


Quote:


> Originally Posted by *stephenn82*
> 
> My 12v rail shows as 11.68-11.85 in GPU-z, but my bios shows it as a solid 12.14. I have put a multimeter on it in the past and it was 12.13. I wouldnt follow the GPU-Z voltage reading entirely.


Ok ty. I will try this bios thing from chris89 now rather than the whole power line of investigation. I hope it isn't dangerous to flash but my 390x sounds like it is about to take off sometimes!! I swear 3600rpm is higher than 100% fan speed!


----------



## stephenn82

he will take care of you, you will be fine


----------



## chris89

@componentgirl90 Let's take it easy to start because I'm almost thinking there might be something else going on with the GPU. Try this and follow my instructions just to start. Then let's start to turn up the boost as we find what works.

Here ya go. It's a "Test" bios just to see if it helps to start. It's not a high performance bios, it's power limited to rule out thermal issues.

Once you follow my instructions, upload a HWInfo At-Load screenshot to see how the temps are and fan speeds etc. By the way is your 390x a reference model? Thanks

*

ComponentGirl90_Radeon_Bios_1.zip 99k .zip file
*


----------



## ziggystardust

Long awaited Unigine Superposition is out guys.

http://www.guru3d.com/files-details/unigine-superpostition-benchmark-download.html

By the way, I'm on the 17.4.1 driver and can't overclock with Afterburner. Once I apply the clock, it goes back to its default value. I can raise the power limit though. What are you guys using to overclock your 390s these days?


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> but what if I like analog volt meter? just teasin


Who doesn't? Provided it's in good shape, it should be even more accurate haha.


----------



## chris89

@ziggystardust Send me your BIOS haha... depends... do you want more heat for more power? Or something barely noticeable in-game over long sessions, at less power consumption with great performance?

1133MHz core at 1333mv with 1563MHz memory at 1000mv is 72.5 GPixel/s & 400GB/s
1173MHz core at 1377mv with 1250MHz memory at 888mv is 25°C cooler and quieter: 75 GPixel/s & 320GB/s, plus you save 150 watts

Depends... lmk and I can make it.
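The GB/s figures quoted in this thread follow from the 390's 512-bit bus and GDDR5 moving 4 transfers per command clock (the clock GPU-Z shows). A back-of-envelope sketch, assuming those two published bus parameters:

```python
# GDDR5 bandwidth on a 512-bit Hawaii/Grenada bus:
# mem clock (MHz) x 4 transfers/clock x 512 bits / 8 bits-per-byte.

def bandwidth_gb_s(mem_mhz, bus_bits=512, transfers=4):
    bytes_per_clock = bus_bits * transfers / 8      # 256 bytes per clock
    return mem_mhz * 1e6 * bytes_per_clock / 1e9    # GB/s

for clk in (1250, 1563, 1758):
    print(clk, round(bandwidth_gb_s(clk), 1))       # 320.0, 400.1, 450.0
```

That reproduces the 320, 400, and 450 GB/s numbers mentioned for the 1250, 1563, and 1758MHz memory clocks.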


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> @ziggystardust Send me you bios haha... depends... want more heat for more power? or barely noticable in game in long sessions at less power consumption/ great performance?
> 
> 1133mhz core at 1333mv to 1563mhz memory at 1000mv is 72.5GPixel & 400GB/s
> 1173mhz core to 1377mv to 1250mhz memory at 888mv is 25C cooler and quieter : 75GPixel/s & 320GB/s plus save 150 watts?
> 
> Depends... lmk and I can make it.


So 1150 will run on my core with probably less than 1333mv.

If RAM runs at 1563MHz at 1000mv, what is the typical default for 1500/1525? Would 1550MHz RAM run at 950mv without too much heat?


----------



## gapottberg

Quote:


> Originally Posted by *ziggystardust*
> 
> Long awaited Unigine Superposition is out guys.
> 
> http://www.guru3d.com/files-details/unigine-superpostition-benchmark-download.html
> 
> By the way, I'm on 17.4.1 driver and can't overclock with Afterburner. Once I apply the clock, it goes back to its default value. I can raise the power limit though. What you guys using to overclock your 390s these days?


After some tinkering I managed to get AB to work again. I have no idea how, and I am still on 17.1.1 because of that. Too chicken that I'll mess it up with an update.


----------



## stephenn82

How dafuq did you manage that? I gave up trying to get AB to load... I removed all the MSI junk from the registry and haven't looked back. It hasn't worked since the 16.11.5 driver.

Since then, I have gotten a custom BIOS and it runs better with LOWER clocks than what I ever managed out of AB, with LESS power used and LESS heat made.
Talk to @chris89


----------



## ziggystardust

Quote:


> Originally Posted by *chris89*
> 
> @ziggystardust Send me you bios haha... depends... want more heat for more power? or barely noticable in game in long sessions at less power consumption/ great performance?
> 
> 1133mhz core at 1333mv to 1563mhz memory at 1000mv is 72.5GPixel & 400GB/s
> 1173mhz core to 1377mv to 1250mhz memory at 888mv is 25C cooler and quieter : 75GPixel/s & 320GB/s plus save 150 watts?
> 
> Depends... lmk and I can make it.


All I wanted was to raise my clock to 1150MHz though. I could do it before AMD introduced that stupid Wattman, but since then AB stopped working and I can't apply the clocks.

By the way, anyone tried the new Unigine benchmark? Post your results.


----------



## gapottberg

As far as I can tell... I may be the only one past 16.11.5 who has... and I have no idea what I did or didn't do other than being persistent. But it does indeed work for me, and I refuse to mess with a good thing at this point.


----------



## stephenn82

Quote:


> Originally Posted by *ziggystardust*
> 
> All I wanted was to reaise my clock to 1150 mhz though. I could do it before AMD introduced that stupid Wattman but since then AB stopped working and I can't apply the clocks.
> 
> By the way, anyone tried the new Unigine benchmark? Post your results.


ITS FINALLY OUT!!! OH SNAP!

I have been watching for that to drop. FINALLY! I will get on it tomorrow. Stand by for a mod to create the [Official] Unigine Superposition thread.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> ITS FINALLY OUT!!! OH SNAP!
> 
> I have been watching for that to drop. FINALLY! I will get on it tomorrow. Stand by for a mod to create the [Official] Unigine Superpostion thread


Downloading it now; 1GB is slow on a 12Mbps connection. Let's see how my stock R9 390 does lol. Not sure what setting to run it on, but gonna guess Extreme, just like on Heaven.


----------



## chris89

@stephenn82 Stock is 1000mv, so 1563MHz at 400GB/s is totally fine if you want? I suppose let's just go for it. Mind you, 25°C hotter, but 400GB/s, if you saw a difference. It'll help with benchmarks. By the way, 1758MHz is possible (450GB/s).

Here we have faster + cooler & faster + hotter

I made up for the temperature increase deviation with more proper fan profile.

Stephenn82_1173_1373mv_1250_888.zip 99k .zip file


Stephenn82_1173_1373mv_1563_1000.zip 99k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> @stephenn82 Stock is 1000mv so 1563mhz is 400GB/s totally fine if you want? I suppose let's just go for it. Mind you 25C hotter but 400GB/s if you saw a difference? It'll help with benchmarks. By the way 1758mhz is possible (450GB/s).
> 
> Here we have faster + cooler & faster + hotter
> 
> I made up for the temperature increase deviation with more proper fan profile.
> 
> Stephenn82_1173_1373mv_1250_888.zip 99k .zip file
> 
> 
> Stephenn82_1173_1373mv_1563_1000.zip 99k .zip file


HMM. I am inclined to go for the second one...lol

I will try the first one first... then bump RAM MHz up a little via Wattman... THEN GO FOR THE GOLD!

I bet this will push my card to about 68°C core and, say, 72 or 73 VRM... putting my bets on it now. I will let you know tomorrow.


Know what? It's STILL cooler than when I pulled this baby out of the box and slapped it into my system in Jan 2016.


----------



## gapottberg

Idk, I'd gladly use Wattman if they gave the 300 series the same fine-grained control they did the 400 series, something comparable to AB. Such BS that we get half-assed tools that are worthless and mess with things that work... and they can't use the excuse that the older hardware isn't capable of such control, because AB has been doing it just fine for years now.

I may have found the fix in AB after all just now. There is a checkbox in settings under the "AMD Compatibility Properties" header called "Erase autosaved startup settings". The tooltip seems to suggest it is the problem we are seeing with settings getting reapplied after being changed in AB. Checking the box may be the solution for those interested in using AB... though a BIOS tweak is a better solution in most cases, as chris89 has pointed out.


----------



## chris89

Use Trixx if you don't want me to bios mod you up something that would smoke a software overclock hahaha


----------



## stephenn82

I'm telling you all who don't believe in a custom BIOS... just try it. You'll like it...lol


----------



## gapottberg

Quote:


> Originally Posted by *chris89*
> 
> Use Trixx if you don't want me to bios mod you up something that would smoke a software overclock hahaha


More than happy to have you cook me up one at some point. Even more interested in learning how to cook up my own.


----------



## bluej511




----------



## stephenn82

He can probably even get you something going for that 270 of yours. I updated to 17.4.1 on our 7870, and now wattman options are missing...it had full controls previously. :\


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*


oh, Bluej511, I can't even run mine to compare! I really don't like 24hr shifts...


it drives me to


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> oh, Bluej511 I cant even run mine to compare! I really dont like 24hr shifts...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> it drives me to


Took me a while, but the soundtrack for it is awesome. I'm about 200 points shy of Guru3D's results and not sure why, but on another site my average beats the RX 480, and I'm at stock speeds, so who knows. May give it a go at 1100/1600 and see what she can pull.


----------



## stephenn82

NOICE!

So they aren't using the same music? Good. I got tired of hearing the same stuff from Heaven in Valley...

If you didn't notice it was the same, you haven't run either enough lol


----------



## daffy.duck

R9 390 at 1140-1600


----------



## bluej511

1040/1500

1100/1600


----------



## gapottberg

Quote:


> Originally Posted by *stephenn82*
> 
> He can probably even get you something going for that 270 of yours. I updated to 17.4.1 on our 7870, and now wattman options are missing...it had full controls previously. :\


Every time I have tried it I have only partial control of everything. Can't even set fan curves properly. Annoying AF when I had a pretty nice setup in AB that was working well for me. All I wanted to do was use Wattman to set the same damn thing up, but it never had the functionality I see in the 400-series reviews of it. Way more options for some reason with the RX 480, for example. Makes no sense.

That system with the 270 is my backup rig my son uses atm, but I can't ever seem to get it to move over into the second slot in my profile. Serenity is my main build now and I am very happy with her.


----------



## chris89

@gapottberg How about you send me your bios .rom in .zip format... I make one, you flash and test and ask yourself is it too hot or too fast? If none of the above then be happy.

Otherwise you can learn from my bios and the characteristics of the bios I make and maybe fine tune it to more refined performance to better suit your liking. lmk
*

GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file


atiflash_274.zip 1214k .zip file
*

@componentgirl90 How's it going? everything work or you have not tried it yet?


----------



## okiheh

Hi, so I bought my PowerColor PCS+ R9 390 last year, and when I got it I changed the thermal paste to STG2 and got some nice results with temps.
Today I decided to do some more cooling and got Thermal Grizzly Kryonaut.

Looks like this paste works

http://imgur.com/kPXfW


PS: 78°C to 72°C (172.4℉ to 161.6℉), and this is coming from Zalman STG2.

Room temp was 23°C during the tests; 25-30 min of Witcher 3 standing still in empty Kaer Morhen.
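For anyone double-checking the Fahrenheit figures in this post, it's just the standard Celsius-to-Fahrenheit conversion:

```python
# Celsius to Fahrenheit: F = C * 9/5 + 32

def c_to_f(c):
    return c * 9 / 5 + 32

print(round(c_to_f(78), 1), round(c_to_f(72), 1))  # -> 172.4 161.6
```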


----------



## bluej511

Quote:


> Originally Posted by *okiheh*
> 
> Hi so i bought my Powercolor PCS+ R9 390 last year and when i got it i changed thermal paste to STG2 and got some nice results with temps.
> Today i decided to do some more cooling and a got Thermal Grizzly Kryonaut
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like this paste works
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imgur.com/kPXfW
> 
> 
> Ps: 78°C to 72°C (172.4000℉ to 161.6000℉) and this is from Zelman STG2
> 
> 
> 
> 
> 
> 
> 
> room temp was 23°C during tests, 25-30 min of Witcher 3 standing still in empty Kaer Morhen


Yeah, good paste. I may pick some up to put on my Ryzen; for my R9 390 I'm using GC-Extreme. In 23°C ambient I barely hit 40°C.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @componentgirl90 Let's take it easy to start because I'm almost thinking there might be something else going on with the GPU. Try this and follow my instructions just to start. Then let's start to turn up the boost as we find what works.
> 
> Here ya go. It's a "Test" bios just to see if it helps to start. It's not a high performance bios, it's power limited to rule out thermal issues.
> 
> Once you follow my instructions, upload a HWInfo At-Load screenshot to see how the temps are and fan speeds etc. By the way is your 390x a reference model? Thanks
> 
> *
> 
> ComponentGirl90_Radeon_Bios_1.zip 99k .zip file
> *


Thank you. What is it that indicates this might be a software/BIOS issue? Would I be able to play games using this test BIOS?

Regardless, given that there is no certainty it's a BIOS problem, I would prefer to exhaust all other possible options first. Some people say that flashing the BIOS may invalidate the warranty, I want to do the least risky things first, and I want to learn a bit before flashing since I am a total noob.

The first thing I will do, unless you are fairly certain it is a BIOS issue, is to replace the EVGA SuperNOVA G2 Gold 850W with:

EVGA 100-W1-0500-K3 - 500W 80 Plus Power Supply (100-W1-0500-K3)


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @gapottberg How about you send me your bios .rom in .zip format... I make one, you flash and test and ask yourself is it too hot or too fast? If none of the above then be happy.
> 
> Otherwise you can learn from my bios and the characteristics of the bios I make and maybe fine tune it to more refined performance to better suit your liking. lmk
> *
> 
> GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file
> 
> 
> atiflash_274.zip 1214k .zip file
> *
> 
> @componentgirl90 How's it going? everything work or you have not tried it yet?


Hi Chris. Sorry, I was asleep. I haven't tried it yet (I explain why in my other recent post, probably above this one).

BTW my card is an XFX 390X DD, just a normal card with no mods.

The card has been heavily used, for some months 18 hours a day! I would say probably 3,000 hours of gaming since I bought it about 1.5 years ago.

Additional information: I also have an R9 270, which shows flashing textures and artifacts in games like Battlefield 1, Rainbow Six Siege, and Rust. I'm wondering if it's somehow related to the problems the 390X is having, because that card is only 2 years old and was used for just 8 months or so until I got the 390X.


----------



## chris89

Basically the stock BIOS has issues; it plain runs too hot, among many other things. So yeah haha.

The first thing to check for just about every issue is the BIOS. I would most certainly flash it; it won't void the warranty, and it's not risky if you read my instructions. My BIOS is a major problem solver. Test it out, and yes, you can game on it. It will just be below the power limit, so clocks will be low, but it will run fine: cool and smooth.

Then we can slowly speed things up as we monitor and assess the data after the BIOS flash and a DDU / fresh driver install.









PS - I can dial in your 270 BIOS as well so keep em comin. haha


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Basically the stock bios has issues, it's plain too hot among many other things. So yeah haha.
> 
> First thing to cause like every single issue, is the BIOS. I would most certainly flash it, it won't void warranty and it's not risky if you read my instructions. It's a major problem solver, my bios that is. Test out my BIOS and yes you can game it will just be below power limit but will run fine clocks will just be low but running fine cool and smooth.
> 
> Then we can slowly speed things up as we monitor and assess the data after the bios flash and ddu/ fresh driver install.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS - I can dial in your 270 BIOS as well so keep em comin. haha


Ok. I did it! Here is the HWInfo screenshot

hwinfonew.png 53k .png file


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> yea good paste, i may pick some up to put on my ryzen, for my r9 390 im using gc extreme. In 23°C ambient i barely hit 40°C.


dont forget, under water cooling.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Basically the stock bios has issues, it's plain too hot among many other things. So yeah haha.
> 
> First thing to cause like every single issue, is the BIOS. I would most certainly flash it, it won't void warranty and it's not risky if you read my instructions. It's a major problem solver, my bios that is. Test out my BIOS and yes you can game it will just be below power limit but will run fine clocks will just be low but running fine cool and smooth.
> 
> Then we can slowly speed things up as we monitor and assess the data after the bios flash and ddu/ fresh driver install.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS - I can dial in your 270 BIOS as well so keep em comin. haha


I'm telling you all, @chris89 is the man with this. I just ran Superposition with pretty dang good results... max heat was 63°C on the core with 1333mv core, 888mv for VRAM, and clocks at 1133/1250. Yes, I too worried about what 1250MHz memory would do to it; in Time Spy, almost nothing. 2 points lower than at 1625 RAM speed. TWO!!
Unlock your card's potential... send him your BIOS.


----------



## componentgirl90

Quote:


> Originally Posted by *stephenn82*
> 
> im telling you all, @chris89 is the man with this. I just ran Superpostion with pretty dang good results...max heat was 63C on core with 1333mv core, 888mv for VRAM, and clocks are 1133/1250. Yes, I worried too what 1250mhz would do to it, with timespy, almost nothing. 2 points lower than 1625 ram speed. TWO!!
> Unlock your cards potential...send him your bios.


I did! I followed the instructions. For a moment I thought it had crashed because ATIWinflash wasn't responding, but it worked.


----------



## stephenn82

Speaking of unlocked bios and benchmarks...
First run of Superposition
1133MHz core, 1250MHz RAM at 1333mv core / 888mv RAM. Max temp 63°C. I accidentally closed GPU-Z so I can't get power consumption; on the next run with higher RAM speed I will.


----------



## stephenn82

Quote:


> Originally Posted by *componentgirl90*
> 
> I did! I followed the instructions. For a moment I thought it had crashed because the ATIWinflash wasn't responding but it worked.


dont look back...your card will thank you for it...and your PSU will too!


----------



## chris89

@stephenn82 Thanks man. True, true.
















@componentgirl90 I'm proud of ya! You did it!







Now throw it under load with that nano 133W limit haha... Post results and your thoughts, then let's put the pedal to the metal and let it eat.

haha


----------



## stephenn82

Quote:


> Originally Posted by *stephenn82*
> 
> Speaking of unlocked bios and benchmarks...
> First run of Superposition
> 1133MHz core 1250Mhz Ram at 1333mv core/888mv ram. Max temp, 63C I accidently closed GPU-z and cant get power consumption. On next run with higher ram speed I will.


Run #2
1133 core 1450 Mem
63°C core, 65°C VRM; max power output 225.8W, max power input 301W


*As you can see... an improvement of only 56 points by bumping up RAM speed and making all that extra heat. Worth it? Probably not.*


----------



## chris89

Nice man. We can match and well exceed that 1450MHz RAM score on the core alone, with just 1250MHz memory, ya know? haha

Did you try the 1173MHz BIOS? I haven't tested 1373mv, but it sounded right.

Btw at 1373mv... you could bump the core via Trixx to 1204MHz for shiz and giggles to watch the score rise even further..


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @stephenn82 Thank man. True, True.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @componentgirl90 I'm proud of ya! You did it!
> 
> 
> 
> 
> 
> 
> 
> Now throw it under load with that nano 133W limit haha... Post results and your thoughts, then let's put the pedal to the metal and let it eat.
> 
> haha


I will have to leave it running and switch it on and off a few times over the next day before I can say anything with any degree of certainty, but will do.


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> Run #2
> 1133 core 1450 Mem
> 63c core 65c VRM max power output 225.8W Max power input 301W
> 
> 
> *AS you can see...an improvement of only 56 points by bumping up ram speed and making all that extra heat. worth it? probably not*.


Very nice, you got me there. I'm only at 1100/1600 though; she can do 1200 on the core, but it seems like WattMan and AB both black-screen my PC when I try to get there. Super lame, it used to work just fine before.


----------



## chris89

@bluej511 Send Bios?


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> @bluej511 Send Bios?


yeah, bluej511...send him a bios...hit HIGHER clocks than stock


----------



## gordesky1

Was expecting a lower score; not bad with an FX chip, running on Chris's 1204/1000 BIOS. I also forgot to close a video on my 2nd screen and have tons of stuff running. Though 1204 might be pushing it on air; I saw white dots sometimes...


----------



## bluej511

Quote:


> Originally Posted by *gordesky1*
> 
> Was expecting a less score not bad with a fx running on chris 1204/1000 bios also forgot to close a video on my 2nd screen and also have tons of stuff running. Tho 1204 might be pushing it on air saw white dots sometimes...


The processor doesn't matter in this benchmark unless you've got like 3-4 GPUs lol.


----------



## gordesky1

Quote:


> Originally Posted by *bluej511*
> 
> Processor doesnt matter on this benchmark unless you got like 3-4 gpus lol.


Then what happened with the lower score? At 1204 it should be higher than that if the CPU has little to do with it lol.

Going below 4.8 on the CPU, the score went lower.


----------



## bluej511

Quote:


> Originally Posted by *gordesky1*
> 
> Tho what happen with the lower score than? at 1204 it should be higher than that if cpu dont have little to do with it lol
> 
> Going lower than 4.8 the score went lower.


Bottleneck? Is your memory really at 1000MHz? Seems super low.


----------



## chris89

*@gordesky1* Were we unable to hit 888mv on the memory on your card? I dialed in a new BIOS for ya; I really think it can do 888mv. Only 4 extra millivolts on the core. LMK on the white dots; that's voltage coupled with temperature, they're a tight bond.

1270MHz memory is 325GB/s... and 888mv is way less than the 987mv we had you at, for some reason? haha







Been a while man.

By the way, the window of opportunity becomes very narrow at 1200MHz and above because of the potential for black screens. That's overheating of various components, which must be countered with precisely the right amount of airflow. Then we ride out this ultra-narrow wave of extremely high speed. haha

*

Gordesky1_1204Mhz_1404mv_1270Mhz_888mv_1.zip 99k .zip file
*
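As an aside, the 325GB/s figure falls straight out of the bus math: GDDR5 moves 4 bits per pin per memory clock, and Hawaii/Grenada has a 512-bit bus. A quick sketch (the 4x data rate and 512-bit width are standard Hawaii specs, assumed here rather than taken from the post):

```python
def gddr5_bandwidth_gbps(mem_clock_mhz, bus_width_bits=512):
    """Peak bandwidth in GB/s: GDDR5 transfers 4 bits per pin per memory clock."""
    bytes_per_clock = bus_width_bits / 8 * 4
    return mem_clock_mhz * 1e6 * bytes_per_clock / 1e9

print(gddr5_bandwidth_gbps(1270))  # ~325 GB/s, matching the figure quoted
print(gddr5_bandwidth_gbps(1500))  # ~384 GB/s at the usual 1500 MHz stock setting
```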


----------



## THUMPer1

Quote:


> Originally Posted by *chris89*
> 
> Use Trixx if you don't want me to bios mod you up something that would smoke a software overclock hahaha


I can get 1180/1625 on my MSI 390X without much effort using MSI Afterburner, though I typically run at 1150/1625. What can your BIOS do for me?


----------



## chris89

*@THUMPer1* Let's find out. haha


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> *@gordesky1* Were we unable to hit 888mv on memory on your card? I dialed in a new BIOS for ya. I really think it could do the 888mv. Only 4 extra millivolts on the core. LMK on white dots, that's voltage coupled with temperature. They are a tight bond.
> 
> 1270Mhz memory is 325GB/s... 888mv way less than we had you at now at 987mv for some reason? haha
> 
> 
> 
> 
> 
> 
> 
> Been a while man.
> 
> By the way the window of opportunity becomes very narrow at 1200mhz and above for potential of "Black Screen". That's Overheat of various components. Which must be matched with precisely the right amount of airflow to overcome. Then we ride out this ultra narrow wave of extremely high speed. haha
> 
> *
> 
> Gordesky1_1204Mhz_1404mv_1270Mhz_888mv_1.zip 99k .zip file
> *


I've mostly just been hanging onto that 1204/1000 BIOS lol.

But I just tried the BIOS you sent, 1204/1270, and yep, it brought the score right up. And no white dots! And a bit cooler too!


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @stephenn82 Thank man. True, True.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @componentgirl90 I'm proud of ya! You did it!
> 
> 
> 
> 
> 
> 
> 
> Now throw it under load with that nano 133W limit haha... Post results and your thoughts, then let's put the pedal to the metal and let it eat.
> 
> haha


I just switched it on again, and it accelerated to 3794rpm for no reason again (for about 3 seconds), just after Windows finished loading all its programs. About a minute later it spent another 20 seconds or so in the 3000s for no reason. I have a second PC, albeit an old one; I will try the card in that. I will also try the other PSU.


----------



## chris89

Pull the GPU and take a couple of hi-def pics... I would suspect a physical issue with the card: a VRM-contact or core-contact issue.

Mainly to look at heatsink-to-core contact and heatsink-to-VRM contact... you might need to dismantle it, if you feel comfortable?


----------



## THUMPer1

So I flashed the BIOS for 1173/1270, but I don't understand what to do with the core and mem clocks. Do I have to manually adjust them in AB?
I assume I don't have to touch the voltage. But I'm getting white boxes in the benchmark. I usually play BF1 and am just using Unigine for testing.


----------



## chris89

If you adjusted the memory like you're not supposed to, haha, then yes, white boxes will appear... Try a reset and see. If you still get white boxes after a reset, post screenshots, plenty of them, especially HWInfo v5.46 GPU data at idle and at load through all your testing. Stay at 1173MHz core / 1270MHz memory.

I guess we were on separate pages as far as what this BIOS was going to do. You probably expected stock clocks with more voltage to prop the RAM back up. These BIOSes, however, are for max performance and efficiency: save power, keep the card going for the long haul, generate cooler thermals, and at times surprisingly get better performance.

I could make a smokin' hot, I mean hot-as-lava, stock BIOS and see just how hot it can get, if you want to just crank the bars to the max position. I wouldn't recommend it, but if that's what you want I can make a BIOS that reflects as much. haha









To put these clocks in perspective: when de-limited, at a 1173MHz core clock you get the full 75,072,000,000 pixels every single second, and the full 206,448,000,000 texels every single second of gameplay. Limited by the stock BIOS, you only get a fraction of that at the same or even higher clocks. Make sense? Cranking the RAM way out into beyond-nuts power-consumption territory past 1600MHz can mean sucking down an extra 200 watts from memory alone. By saving that power we feed it to the workhorse of the GPU, the core, and keep the power and thermal savings on top.
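Those pixel/texel figures follow directly from clock times unit count. A sketch using the commonly listed Hawaii/Grenada XT (390X) unit counts of 64 ROPs and 176 TMUs (those counts are an assumption on my part, not stated in the post):

```python
CORE_MHZ = 1173
ROPS = 64    # render output units (assumed 390X spec)
TMUS = 176   # texture mapping units (assumed 390X spec)

pixel_rate = CORE_MHZ * 1_000_000 * ROPS  # peak pixels per second
texel_rate = CORE_MHZ * 1_000_000 * TMUS  # peak texels per second

print(pixel_rate)  # 75,072,000,000 pixels/s
print(texel_rate)  # 206,448,000,000 texels/s
```

Both numbers match the figures quoted, so they are peak theoretical fill rates at that core clock, not measured throughput.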


----------



## RaFDX

Hey all! I've been trying to catch up on this BIOS mod because I want to run my 390 a little differently. Currently it's at 1150MHz core and 1700MHz memory at +56mV. I have swapped the stock cooler for an H55 and GeLid Extreme.

I've gone back 6 pages but am not sure where to start. I have also never modded the 390's BIOS. Where should I start, and what should I do?


----------



## chris89

*@RaFDX* I suppose i'll just go ahead and say send me the bios haha







(Here attach as .ZIP) (GPUZ DUMP).


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Pull the GPU and take a couple hi def pics... I would suspect a physical issue with the card... VRM contact or Core contact issue.
> 
> Mainly to look at heatsink to core contact and heatsink to VRM contact... might need to dismantle it if you feel comfortable?


I have drawn up a rough checklist of things I want to run through. I will run through that and whilst doing that take some external photos. I will dismantle it and take photos if that doesn't invalidate the warranty.


----------



## chris89

*@componentgirl90*

I've had that issue before. It was a physical issue with the thermals of the GPU... not physical damage so to speak, just the thermal-pad installation and procedure, as well as the thermal-paste application on the core die. I had to spread the paste across the whole silicon die from edge to edge to create full contact and the lowest thermals. A GPU app running in the background could also cause it, especially if you have had the same Windows installation for longer than a year.


----------



## RaFDX

Not too sure how to download your link :-\

For now I was going to try the following to increase clock speeds while not pumping too much mV into the card:

http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x

https://www.tomshw.it/forum/threads/guida-modding-bios-hawaii-grenada-290-290x-390-390x.574562/

Thoughts?


----------



## chris89

Get an error or something? it happens


----------



## RaFDX

I don't see it?? Lol not a good start


----------



## chris89

*

GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file
*


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> *@componentgirl90*
> 
> I've had that issue before. It was physical issues on the thermals of the GPU... Not physical so to speak just thermal pad installation and procedure as well as the thermal paste application of the Core Diode. Had to spread it across the whole Silicon Diode from edge to edge to create full contact and ideal lowest thermals. If you have a GPU app running in the background, could also cause it. Especially if you have had the same windows installation longer than a year.


do you have any info on those thermal pads? To be more specific, the thermal pads between the backplate and the back of the board. What if I replace that thermal pad with one that is 3.2W/mK?


----------



## bluej511

Quote:


> Originally Posted by *christoph*
> 
> do you have info on those thermal pads, or to be more specific the thermal pads that's between the backplate and the back of the mainboard? what if I replace that thermal pad with a thermal pad that is 3.2w/k??


3.2W/mK pads are pretty weak; I wouldn't be surprised if that's what's already on there. You can get anywhere from 5 to 14W/mK, but they get very expensive. For a reference blower, though, it seems like good thermal pads help quite a bit. I'd use them only on the VRMs; the memory doesn't get anywhere near as hot as the VRMs do.


----------



## chris89

Haha, 3.2W/mK in a huge sheet covering the entire back of the PCB is spectacular... got the cash? How much and where? Post links of what you're thinking of...

512-bit beast RAM pushing toward half a terabyte per second, sure, oh yeah, it's cold... yep, the beastliest GDDR5 any GPU has ever run is ice cold... haha

Expect the RAM to run far hotter than the core at load; trust me, I've felt it. Sm-sm-sm-smokin' hot chips, like sizzle-your-fingertip hot at stock clocks. Yes, really.

I've tested everything from below 1W/mK up to 400W/mK graphite; the cheap graphite is only about 15W/mK on the axis that's actually in use for our application, but it's still amazing stuff.

Anyway, yes, 3.2 is solid. I use 5W/mK Laird Thermagon, the grey boron nitride stuff, and it's solid; make sure it's a tough putty. The rubbery pads are garbage, but they work okay for the RAM given direct heatsink contact with plenty of pressure. I also used 5W/mK Fujipoly on my VRM, as seen above in the pics, and the VRM runs 60°C at 1200MHz with that particular installation method.
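To put those W/mK ratings in perspective, steady-state conduction through a pad follows Fourier's law, Q = k * A * dT / t. A rough sketch; the pad area, thickness, and temperature drop below are made-up illustrative numbers, not measurements from any card in this thread:

```python
def pad_heat_watts(k_w_per_mk, area_m2, delta_t_c, thickness_m):
    """Heat conducted through a pad at steady state (Fourier's law)."""
    return k_w_per_mk * area_m2 * delta_t_c / thickness_m

# Hypothetical 20 mm x 10 mm VRM pad, 1.0 mm thick, 10 C drop across it
area = 0.020 * 0.010
for k in (3.2, 5.0, 11.0, 17.0):
    print(f"{k:4.1f} W/mK -> {pad_heat_watts(k, area, 10, 0.001):.1f} W")
```

Conduction scales linearly with k, which is why the ratings matter, but as the posts above note, mounting pressure and contact quality decide how much of that rating you actually get.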


----------



## bluej511

Quote:


> Originally Posted by *chris89*
> 
> Haha 3.2W/M K with a huge sheet to cover the entire back of the PCB is spectacular... got the cash? How much and where? Post links of what your thinking of...
> 
> 512-Bit Beast Ram with a half a TeraByte per second sure oh yeah its cold.. yep a Half a TeraByte of the beastly GDDR5 any GPU has ever run on any GPU ever is cold... haha
> 
> Expect the ram to far exceed the core at load trust me I felt it.... Sm-sm-sm-smokin hot chips... like sizzle your finger tip at stock clocks .. Yes Really
> 
> I've tested everything between below 1 watt per meter kelvin to 400 watts per meter kelvin "Graphite" it's only 15w/m k on the proper axis that is in use for our application on the cheap graphite though... Still amazing stuff...
> 
> Anyway yes 3.2 is solid. I use 5 watts per meter kelvin from Laird Thermagon grey stuff.. Boron Nitride and it's solid stuff... make sure it's like a tough putty... the rubbery pads are garbage but work okay for the ram if direct heatsink contact with plenty of pressure... Plus I used 5 watts per meter kelvin fujipoly on my VRM as seen above in the pics and VRM is 60C at 1200Mhz in that particular installation method.


Forget that 5W/mK garbage, 11W/mK all the way haha.


----------



## chris89

I hear ya bro, you just gotta test all the variants... I once decked out all my Nvidia SLI GPUs in 17W/mK Fujipoly, thinking nothing is better... It's more complicated than just plain watts per meter-kelvin; it's the composition, density, and mass that give real thermal stability. A tough material, very hard yet pliable, is ideal for the really big heat-generating components.






























I will say nothing will ever touch the capabilities of a Fujipoly 17W/mK 0.5mm-thick pad on an Nvidia GPU core where the core/lid contacts the heatpipes directly... It's the only way to keep those GPUs cool. A GTX 580 @ 1075MHz core clock, for instance, can only work on Fujipoly 17W/mK.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> *
> 
> GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file
> *


aaaahhhhh very awesome!!

My assumptions:

1. Back up current BIOS via GPU-Z
2. DL and unzip your BIOS
3. Load your BIOS via GPU-Z
4. then????


----------



## THUMPer1

Quote:


> Originally Posted by *chris89*
> 
> If you adjusted the memory like your not suppose to haha yes white boxes will appear... try reset and see... If reset still white boxes... Post Screenshots and many especially HWInfo v546 GPU DATA at idle and load through all your testing... Remain 1173Mhz - 1270Mhz memory.


I understand now what we are trying to do. I like the core clock; what's the next memory clock we can step up to for more performance? So I wasn't supposed to change the mem clock to 1270? I did though... hah, I'm confused.
Here are some screens.

An idle screen and 2 screens of the GPU at load.


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> aaaahhhhh very awesome!!
> 
> *My assumptions;
> 
> **Backup current bios via GPUZ
> 
> ***DL and unzip your bios
> 
> ****Load your bios via GPUZ
> 
> ***** then????


upload your .zip of your BIOS here...
@chris89 will guide your through the rest...


----------



## stephenn82

@chris89 Turns out the 888mv wasn't enough for a steady/stable memory speed of 1475... lol. Then again, the extra memory speed only raised the score by less than 60 points in Unigine benches and two points in Time Spy anyway. I am now going to fully test core stability with the lower GPU core mV. I will post results, going with the lowest-setting pack first! Wish me luck!


----------



## stephenn82

*UPDATE!*

The lowest core mV in the pack, 1293, netted almost identical results in Superposition; it was 2 points lower. This tells me that voltage is good enough, versus the 1333mv from last time.









Max temps/watts/amps for run at 1080p extreme


Results of run


----------



## stephenn82

@chris89

I actually started playing with the Hawaii BIOS Reader/editor and came up with this little build just to test out. I bumped the core clock to 1150, memory to 1325, and the fan profiles up just a touch (65% and 75% fan speed are almost indistinguishable with these MSI Torx 2 fans on the Frozr V). Quick sanity check before I try this out for SnGs, if you will.











*OH NO, I just noticed that core clock 1 in the upper table is still 1133, and down below the max clock is 1150. Do these need to match? I can change it. It's a learning process.


----------



## stephenn82

I will get back to that custom BIOS of mine in a bit.

Ran Superposition again; pretty damned demanding.


A little bump, but not much. Proves that some benchmarks don't rely much on VRAM speed. Will test this with 1500MHz RAM at 1000mv to check. Is that stock?


----------



## christoph

Quote:


> Originally Posted by *bluej511*
> 
> 3.2w/mK pads are pretty weak, wouldn't be surprised if thats already whats on. You can get anywhere from 5-14w/mK but they get very expensive. For a reference blower though it seems like it does help quite a bit getting some good thermal pads. Id use it only on the VRMs though, memory doesnt get anywhere near the heat the VRMs do.


that's the thing, they get very expensive and you may not end up with good performance anyway. The video card itself is not overheating at all; I'm just thinking of keeping it a little cooler for the long run. So I don't think it's worth putting a lot of money into it when overheating isn't even close to becoming a problem.

Quote:


> Originally Posted by *chris89*
> 
> Haha 3.2W/M K with a huge sheet to cover the entire back of the PCB is spectacular... got the cash? How much and where? Post links of what your thinking of...
> 
> 512-Bit Beast Ram with a half a TeraByte per second sure oh yeah its cold.. yep a Half a TeraByte of the beastly GDDR5 any GPU has ever run on any GPU ever is cold... haha
> 
> Expect the ram to far exceed the core at load trust me I felt it.... Sm-sm-sm-smokin hot chips... like sizzle your finger tip at stock clocks .. Yes Really
> 
> I've tested everything between below 1 watt per meter kelvin to 400 watts per meter kelvin "Graphite" it's only 15w/m k on the proper axis that is in use for our application on the cheap graphite though... Still amazing stuff...
> 
> Anyway yes 3.2 is solid. I use 5 watts per meter kelvin from Laird Thermagon grey stuff.. Boron Nitride and it's solid stuff... make sure it's like a tough putty... the rubbery pads are garbage but work okay for the ram if direct heatsink contact with plenty of pressure... Plus I used 5 watts per meter kelvin fujipoly on my VRM as seen above in the pics and VRM is 60C at 1200Mhz in that particular installation method.


5s? Hmmm, but my VRM does not get over 60; in fact it sits around 57°C with the stock thermal pads...


----------



## stephenn82

Huh, my VRM hits a max of 69°C after gaming a while. I thought the MSI cooler had the best VRM coverage? Oh well. I may buy some of that Gucci thermal sheet sometime in the future while I hold out for the 580 or RX Vega... or for prices to drop more on the 1070/1080 before I move forward.


----------



## gordesky1

Quote:


> Originally Posted by *stephenn82*
> 
> huh, my VRM hits max of 69c after gaming a while. Thought MSI was the best cooler to cover VRMs? Oh well. I may buy some of that Gucci thermal sheet sometime in future while I hold out for the 580 or RX Vega...or prices to drop MORE on teh 1070/1080 before I move forward.


69°C for the VRM is pretty good.

Wish I could say the same for mine, which gets into the 80s lol, but I blame case cooling at the moment.

Running at 1204 core probably isn't helping them either, but everything is running smoothly.


----------



## ziggystardust

Quote:


> Originally Posted by *bluej511*
> 
> Took me a while but the soundtrack for it is awesome, I'm about 200 points shy of guru3ds results and not sure why but another site my avg beats the rx 480 and im at stock speeds so who knows. May give it a go at 1100/1600 see what she can pull.


Guru3D results are a bit off for Furys and 390s, as far as I can compare them with other people's results. I'm also behind them by about 100 points with my 390X Nitro at stock speeds, so I'm sure your score is just fine.


----------



## chris89

*Cooler fan profile*
*

Gordesky1_1207Mhz_1407mv_1270Mhz_888mv_1.zip 99k .zip file


Try it? I can run these clocks on my reference modded blower 390x

Stephenn82_1207Mhz_1407mv_1758Mhz_1000mv.zip 99k .zip file


Stephenn82_1250Mhz_1425mv_1270Mhz_888mv.zip 99k .zip file


Stephenn82_1250Mhz_1425mv_1563Mhz_1000mv_Experimental.zip 99k .zip file


Stephenn82_1250Mhz_1425mv_1758Mhz_1000mv_Experimental.zip 99k .zip file
*


----------



## RaFDX

It's not letting me upload my BIOS


----------



## chris89

Use WinZip or WinRAR... right-click on the .rom, "Add to archive", select .ZIP, then attach the .ZIP you just created here.


----------



## THUMPer1

Were my HWInfo screenshots good enough?


----------



## chris89

I guess I'm not sure what you're wanting to get out of the BIOS, so you tell me.

*Flash back to stock 1st.* Run through Time Spy at stock speeds and at your defined overclock settings.

Then flash my BIOS again and run Time Spy. I'll make you a fresh BIOS, so run through the tests. Post stock results first, then let's mod.

Without touching anything, and with MSI reset, document and screenshot the score differences.

Along with HWInfo for both stock and modded.
*

Flash_1st_Thumper1_1133_1293mv_1270_888mv.zip 99k .zip file


Flash_2nd_Thumper1_1173_1363mv_1270_888mv.zip 99k .zip file


Flash_3rd_Thumper1_1188_1378mv_1270_888mv.zip 99k .zip file


Flash_4th_Thumper1_1188_1378mv_1563_1000mv.zip 99k .zip file
*


----------



## RaFDX

R9390Hawaii.zip 99k .zip file


I was using 7zip. sorryz


----------



## chris89

It's cool @RaFDX: *Run through Time Spy with the stock default BIOS...* *Post the results of the test & the HWInfo data.* Then flash my BIOS and do the same thing.

*

Flash_1st_RaFDX_1133_1293mv_1270_888mv.zip 99k .zip file


Flash_2nd_RaFDX_1173_1363mv_1270_888mv.zip 99k .zip file


Flash_3rd_RaFDX_1173_1363mv_1563_1000mv.zip 99k .zip file
*


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> *@componentgirl90*
> 
> I've had that issue before. It was physical issues on the thermals of the GPU... Not physical so to speak just thermal pad installation and procedure as well as the thermal paste application of the Core Diode. Had to spread it across the whole Silicon Diode from edge to edge to create full contact and ideal lowest thermals. If you have a GPU app running in the background, could also cause it. Especially if you have had the same windows installation longer than a year.


I have had the same Windows installation since Jan 2015... but it sometimes happens as soon as I press the power button. Today it hasn't happened at all yet.

I was going to switch the PSU for a 500W one, but I won't now, because I am concerned the power draw would be too much and I need to test the card under load.

This is really the job of the RMA people, but you (Chris) have probably done more for my card than they would have in 10 RMAs.


----------



## chris89

*@componentgirl90* This will at least give solid gaming performance at cool temperatures. I haven't tested this method, so let me know on temperatures. Only a 44% fan max and a 68°C max temperature. Idk what will happen... needs testing, plus post HWInfo.























If you want to... I'm willing to fix your card for you so you can forget about the RMA...? Ship it to me... PM me... Include a return USPS label as well so I can ship it back. I just need 24 hours and it'll be on its way back to you...

ComponentGirl90_390x_1000_1270_Low_max_temp_Low_Fan_max.zip 99k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> *@componentgirl90* This will atleast give solid gaming performance on cool temperatures. I haven't tested this method so lmk on temperatures. Only 44% fan max and 68C Max Temperature. Idk what would happen... Need to test plus post HWInfo.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you want to... I'm willing to fix your card for you so you can forget about RMA...? Ship it to me... PM... Include a return USPS label as well so I can ship it back. I just need 24 hours and it'll be on the way back to you...
> 
> ComponentGirl90_390x_1000_1270_Low_max_temp_Low_Fan_max.zip 99k .zip file


If I flash that bios, I take it I just have to skip the whole installation of drivers stuff. Just open ATIFlash, load bios and boom?

What country do you live in? If it isn't fixed by September (end of the warranty period), I could send it then.

My brother did some research and said that fans revving up like this can be caused by loose PSU cables. That is why I suspect a PSU-related issue.


----------



## chris89

USA here. You? Yes, make sure MSI Afterburner and all other apps like HWInfo are closed... open, flash, done.

Obviously, don't click anything while it's flashing.

It wouldn't be PSU related. The fan automatically ramps to full speed when something is too hot and "Max Temp" is reached, which on your stock bios is 98 degrees Celsius.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Use WinZip or WinRar... Right click on the .rom "ADD TO ARCHIVE" select .ZIP... Attach the .ZIP you just created. Here.


I like 7zip more. It does all the things. But follow these instructions to zip with any program
Quote:


> Originally Posted by *chris89*
> 
> *Cooler fan profile*
> *
> 
> Gordesky1_1207Mhz_1407mv_1270Mhz_888mv_1.zip 99k .zip file
> 
> 
> Try it? I can run these clocks on my reference modded blower 390x
> 
> Stephenn82_1207Mhz_1407mv_1758Mhz_1000mv.zip 99k .zip file
> 
> 
> Stephenn82_1250Mhz_1425mv_1270Mhz_888mv.zip 99k .zip file
> 
> 
> Stephenn82_1250Mhz_1425mv_1563Mhz_1000mv_Experimental.zip 99k .zip file
> 
> 
> Stephenn82_1250Mhz_1425mv_1758Mhz_1000mv_Experimental.zip 99k .zip file
> *


Will these run fine on mine? They seem super aggressive.

but if the cooling results are ok for you and we can move forward, I'll be your huckleberry

What about the screenshot I uploaded of my bios mod attempt?


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> USA here. You? Yes, make sure MSI and all other apps like Hwinfo are closed .. open.. flash .. done
> 
> Dont click anything obviously while its flashing.
> 
> It wouldn't be PSU related. The fan ramps up when something is too hot automatically to full speed when "Max Temp" is reached. Which yours by stock is 98 degrees celsius.


The card is not at 98 though; it's at like 40 or 50 when it's happening.

I am in the UK btw. RIP that idea, lol


----------



## chris89

You could still ship it to me so I can fix the card for good after the warranty is up. No more issues. I would go through every single thing on the card, using all my high-end thermal material and performance-enhancing methods to ensure continuous stability.

As for the "Max Temp", it also covers non-monitorable chips that can overheat if not cooled directly... here, let's take a look at the cooler and PCB.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> You could still ship it to me so I can fix the card for good after warranty is up. No more issues. I would go through every single thing on the card. Using all my high end thermal material and performance enhancing methods to ensure continuous stability.
> 
> As far as the "Max Temp" it is also the temperature of non-monitorable chips that can over heat if not cooled directly... here lets take a look at the cooler and PCB.


Ok I see now. I will let you know how it goes


----------



## chris89

The best way to fix that without removing the heatsink is to buy 10mm-thick thermal material and mush it in there so the VRMs contact the heatsink.

Along with the BIOS I sent, that should fix it... actually, try this BIOS; knowing that is the issue, we can knock the VRM down 40 degrees Celsius.

The fan speed curve is:

30C @ 40%
50C @ 75%
60C @ 75%
75C Max @ 88% (should throttle at 75C). It may be loud in game, but the memory VRM won't overheat, and this is a de-limited bios: no power limit.
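A minimal sketch of how a curve like that behaves between the listed points, assuming simple linear interpolation (the point list is from this post; the function and its name are purely illustrative, not the actual BIOS table format):

```python
# (temperature C, fan duty %) points quoted above.
POINTS = [(30, 40), (50, 75), (60, 75), (75, 88)]

def fan_duty(temp_c):
    """Return fan duty % for a given core temperature (linear interpolation)."""
    if temp_c <= POINTS[0][0]:
        return POINTS[0][1]          # floor: 40% below 30C
    if temp_c >= POINTS[-1][0]:
        return POINTS[-1][1]         # cap: 88% (BIOS should throttle at 75C)
    for (t0, d0), (t1, d1) in zip(POINTS, POINTS[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(40))  # halfway between 30C and 50C -> 57.5
```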

ComponentGirl90_Ideal_BIOS_For_XFX_DD_390X.zip 99k .zip file


----------



## bluej511

Quote:


> Originally Posted by *chris89*
> 
> The best way to fix that without removing the heatsink is buy 10mm thick thermal material and mush it in there so the VRM contact the heatsink.
> 
> Along with my BIOS I sent should fix it... try this BIOS actually knowing that is the issue we can knock VRM down 40 degrees celsius.
> 
> Fan speed is like this
> 
> 30C @ 40%
> 50C @ 75%
> 60C @ 75%
> 75C Max @ 88% (Should throttle at 75C) May be loud in game but memory VRM won't overheat and this is DeLimited Bios. No Power Limit.
> 
> ComponentGirl90_Ideal_BIOS_For_XFX_DD_390X.zip 99k .zip file


I wouldn't. The thicker the material, the less it works; a 10mm thermal pad would more than likely cause it to run hotter than anything.

It's actually been proven, and it's why the Intel TIM isn't the issue but the gap is: the thicker the thermal pad/thermal interface material you use, the less effective it is. I've tested this over the past couple of years.


----------



## chris89

Right, gotta figure something out. The memory VRM runs cool with hardly any copper... 65C at load, so it doesn't take much at all to cool them.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> The best way to fix that without removing the heatsink is buy 10mm thick thermal material and mush it in there so the VRM contact the heatsink.
> 
> Along with my BIOS I sent should fix it... try this BIOS actually knowing that is the issue we can knock VRM down 40 degrees celsius.
> 
> Fan speed is like this
> 
> 30C @ 40%
> 50C @ 75%
> 60C @ 75%
> 75C Max @ 88% (Should throttle at 75C) May be loud in game but memory VRM won't overheat and this is DeLimited Bios. No Power Limit.
> 
> ComponentGirl90_Ideal_BIOS_For_XFX_DD_390X.zip 99k .zip file


Ok, I will flash with this bios. Actually it's been a cooler day today and it hasn't revved yet. I will also open the top vent of the computer (which I normally keep closed because I am paranoid about rain getting in) and open the side panel, which usually helps drop temps a bit anyway.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Right gotta figure out something the memory vrm is cool on hardly any copper .. 65C load so it doesn't take much at all to cool them.


I flashed the bios with the last bios you recommended.

Here is the HWInfo screenshot after 1 hour or so of gaming and letting it cool.

It is a bit noisy at idle because the fans are going fast, but it's tolerable.

If it doesn't rev up like before over the next few days, then I would say your theory is almost certainly correct and I can just RMA this card and tell them where to look. They are obliged to fix or replace it.

The problem with fixing it myself under your guidance is that if some other, unfixable problem were to come up, the warranty would be invalid and I could not return the card for a replacement









hwinfonewbios.png 34k .png file


----------



## chris89

Cool. That 32C temp is those 3 chips just sitting there. Was that at load? What game did you play? I can perfect it a bit more... I'm not sure if it's reaching the 1000MHz core clock? Can you leave HWInfo open, just minimized? When you open it, go to Settings, set it to start with Windows and to minimize instead of close, so you can refer to it over the long haul... more accurate logging that way.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Cool. That 32C temp is those 3 chips just sitting there. Was that load? What game did you play? I can perfect it a bit more... I'm not sure if it's going to 1000Mhz core clock? Can you leave HWInfo open just minimize or when u open it go to settings set to start with windows and minimize instead of close... so u can refer to it over the long haul.. more accurate logging that way.


That was idle. It was after playing Battlefield 1 multiplayer for one hour. I think it is going to 1000MHz; it just wasn't under load at the time. I have left HWInfo running now


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Cool. That 32C temp is those 3 chips just sitting there. Was that load? What game did you play? I can perfect it a bit more... I'm not sure if it's going to 1000Mhz core clock? Can you leave HWInfo open just minimize or when u open it go to settings set to start with windows and minimize instead of close... so u can refer to it over the long haul.. more accurate logging that way.


I ran the computer idle, did the heaven benchmark and then left it idle for a few minutes. Here is the HWInfo:

idealbiosHWinfo30min.png 54k .png file

It didn't behave strangely in that time.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> *Cooler fan profile*
> *
> 
> Gordesky1_1207Mhz_1407mv_1270Mhz_888mv_1.zip 99k .zip file
> 
> 
> Try it? I can run these clocks on my reference modded blower 390x
> 
> Stephenn82_1207Mhz_1407mv_1758Mhz_1000mv.zip 99k .zip file
> 
> 
> Stephenn82_1250Mhz_1425mv_1270Mhz_888mv.zip 99k .zip file
> 
> 
> Stephenn82_1250Mhz_1425mv_1563Mhz_1000mv_Experimental.zip 99k .zip file
> 
> 
> Stephenn82_1250Mhz_1425mv_1758Mhz_1000mv_Experimental.zip 99k .zip file
> *


Before updating to these, should I go back to stock and then flash... or just flash these over what I have?


----------



## chris89

*@componentgirl90*
Looks good. Here, let's try to quiet it down a bit more and speed things up a bit.

ComponentGirl90_2_1173Mhz_Ideal_BIOS_For_XFX_DD_390X.zip 99k .zip file


*@stephenn82*
Flash over.


----------



## stephenn82

Thank you sir!


----------



## chris89

You're quite welcome, bud. The 1758MHz ram is for benchmarks only, runs with errors, and might not even work... use 1270MHz ram for long gaming sessions, maybe at 1250MHz core?

I might need to make a 1207MHz core on 1270MHz memory as well, or 1207MHz on 1563MHz ram at 1000mv... try them and let me know, and we will go from there.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> It's cool @RaFDX : *Run through Time Spy with stock default BIOS*.... *Post the Results of the Test & HWInfo data.* Then after flash my bios and do the same thing.
> 
> *
> 
> Flash_1st_RaFDX_1133_1293mv_1270_888mv.zip 99k .zip file
> 
> 
> Flash_2nd_RaFDX_1173_1363mv_1270_888mv.zip 99k .zip file
> 
> 
> Flash_3rd_RaFDX_1173_1363mv_1563_1000mv.zip 99k .zip file
> *


Geez, terribly sorry! Didn't see this post. Do you want me to run TimeSpy using OC or stock numbers?


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> *@componentgirl90*
> Looks good. Here's let try and quiet it down a bit more and speed things up a bit.
> 
> ComponentGirl90_2_1173Mhz_Ideal_BIOS_For_XFX_DD_390X.zip 99k .zip file
> 
> 
> *@stephenn82*
> Flash over.


I'm happy and would like to keep testing this bios for a couple more days before moving forward, as the problem doesn't always present itself immediately.

P.S. It will be nice to run at a 1173 core clock, but isn't that quite a substantial overclock for this card? The base clock for the XFX DD is 1050.


----------



## stephenn82

@chris89 loaded up the mild 1207MHz 1407mv and 1758MHz 1000mv ram; the card doesn't work. Going to reflash the previous one.


----------



## chris89

@RaFDX
Yeah, stock first. Have HWInfo boot with Windows and idle out, then run Time Spy on the stock default clocks/bios and post HWInfo and the results. Then start on bios #1.

@componentgirl90
Right on. Going from a 1000MHz to a 1173MHz core on the 390X takes you from 64 billion pixels & 176 billion texels every second

to 75.072 billion pixels & 206.448 billion texels every second.

That's about 11 billion more pixels per second & 30.448 billion more texels every second... not much, probably?
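That fillrate arithmetic can be sketched in a few lines (assuming the 390X's 64 ROPs and 176 TMUs, with fillrate scaling linearly with core clock; the helper name is made up):

```python
# Theoretical fillrate = unit count x core clock.
ROPS, TMUS = 64, 176  # R9 390X render back-ends and texture units

def fillrates(core_mhz):
    """Return (Gpixels/s, Gtexels/s) at a given core clock in MHz."""
    ghz = core_mhz / 1000.0
    return ROPS * ghz, TMUS * ghz

px0, tx0 = fillrates(1000)  # stock:      64.0 Gpix/s, 176.0 Gtex/s
px1, tx1 = fillrates(1173)  # overclocked: 75.072 Gpix/s, 206.448 Gtex/s
print(px1 - px0, tx1 - tx0)  # ~11 Gpix/s and ~30.4 Gtex/s gained
```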

@stephenn82
Try 1563MHz ram... sometimes 1758MHz doesn't work, though it does on reference, with errors. If 1563MHz doesn't work, then the card isn't happy at 1425mv... gonna need less max core voltage... Let me know how 1563MHz ram on a 1250MHz core goes.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> @RaFDX
> Yeah stock first. Have HWInfo boot with windows and idle out then run timespy on Stock default clocks/ bios and post HWInfo and results. Then start on bios #1.
> 
> @componentgirl90
> Right on. From 1000Mhz to 1173Mhz core on the 390x is from 64 billion pixel's & 176 billion texel's every second.
> 
> To 75 billion 72 million pixel's & 206 billion 448 million texel's every second.
> 
> That's 11 billion pixels more per second & 30.448 billion more texel's every second... Not much probably?
> 
> @stephenn82
> Try 1563mhz ram... sometimes 1758mhz doesn't work, though it does on reference with errors. If the 1563mhz doesn't work then the card isn't happy at 1425mv... Gonna need less max core voltage... LMK on 1563mhz ram on 1250mhz core.


Stock TimeSpy Score: 4195 (GPU 3997)
OC TimeSpy Score: 4586 (GPU 4416) OC is 1170MHz Core//1700MHz Memory +65mV

SO to flash your BIOS, do I have to perform these steps?

_The Setup:

1) Create a bootable MSDOS (Win98) USB KEY by following this guide HERE.
2) Download ATIFLASH.EXE and unzip it to your boot USB from HERE. BE SURE TO RENAME the ATIFLASH.exe program to "atiflash" or it will not work correctly!
3) Before attempting to flash ANY BIOS files to your card be sure to use GPU-Z to save a back-up of your current file in-case the flash does not work.
4) Create a backup folder on your machine. Store a copy of all the BIOS files that you have saved. Do not edit or change your "stock" BIOS files.
5) Save / rename the new BIOS file that you wish to flash as 0_NEW.ROM on to your MSDOS USB. Notice we append _NEW to the filename. Do that for any additional BIOS Files on your USB stick. Not overwriting them.

Booting To ATIFlash:

1) Reboot your computer that has the Graphics cards attached and press F10 or similar to bring up a boot choice menu. Select the MSDOS USB drive you just created. You will see the CMD prompt from Win98.

Completing The Flash:

1) TYPE the following into the CMD: "ATIFLASH.EXE -f -p 0 0_NEW.ROM" (gpu slot 0).
2) The first number in the command is the PCIE SLOT number of your first card. So if you have a card in your top PCIE SLOT the number will be 0. If you have a motherboard with 4 PCIE slots the numbers will be 0-3. So just make sure to know what slot your card(s) are in while flashing.
3) You can do this for other cards attached (Crossfire) also as follows:

ATIFLASH.EXE -f -p 1 1_NEW.ROM (gpu slot 1)
ATIFLASH.EXE -f -p 2 2_NEW.ROM (gpu slot 2)
And so on...

4) Wait for the confirmation message which will display various details about your card.
5) ATIFlash will tell you to reboot. Go ahead and flash any other cards using the correct name for their files. (Ex. 1_NEW.ROM)
6) After rebooting back into Windows your screen may flash a few times; do not worry, this is normal. The card is just completing its BIOS update.
7) After the screen is done flashing be sure to check CCC and make sure that none of your settings have reset as for some reason CCC likes to do that after you flash the BIOS.
8) You are done!_
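The per-slot command pattern in that guide can be sketched as a tiny helper (a hypothetical illustration; the command strings just mirror the guide's `<slot>_NEW.ROM` convention, and the function name is made up):

```python
# Build the atiflash command line for a given PCIe slot, following
# the "<slot>_NEW.ROM" naming convention from the guide above.
def flash_cmd(slot):
    return f"ATIFLASH.EXE -f -p {slot} {slot}_NEW.ROM"

# e.g. a two-card Crossfire setup in slots 0 and 1:
for s in range(2):
    print(flash_cmd(s))
# ATIFLASH.EXE -f -p 0 0_NEW.ROM
# ATIFLASH.EXE -f -p 1 1_NEW.ROM
```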


----------



## chris89

Need HWInfo for your stock & overclocked idle/ load temperatures.

*To perform the flash.*

1) Simply download ATIWinFlashv274 here.

2) Unzip it to its own folder.

3) Close MSI Afterburner, HWInfo, and everything else that could read or log the GPU stats.

4) Right Click ATIWinFlash.exe and open as Administrator.

5) Open .ROM

6) Begin Flash : Wait it takes 5 minutes : Don't click anything : Complete : Restart : Done

atiflash_274.zip 1214k .zip file


----------



## RaFDX

It's hard to get the program started, and when it does start and I load the rom, nothing happens. I let it sit there and reboot, but I still believe the stock bios is running. I am running HWMonitor now; normally I just run NZXT CAM.


----------



## chris89

Use HWInfo and leave it open... In Options, where it asks to start with system summary or sensors only, check sensors-only. Also set it to start with Windows, and check "minimize instead of close". Run it through Time Spy at stock settings and overclocked, or just overclocked, it doesn't matter... post the benchmark results and HWInfo.

Windows 10 is problematic for some people... it works fine for me, but 99% of everyone leaves User Account Control enabled, which is not more secure or anything... disable User Account Control.

Right click ATIWinFlash.exe and run as Administrator... for me this works fine and detects the GPU correctly... post a screenshot of your ATIWinFlash page before you begin the flash.


----------



## RaFDX

That's what i get


----------



## RaFDX

but after a restart I have this. So now I'm going to run Time Spy


----------



## chris89

Let me know if it throttles... I am testing the Hawaii "Max Temp", which I believe would throttle if reached... I never did test that, though. If set higher, it would not throttle and would hold the clocks. A slight adjustment to the fan profile would help it hang on.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> Lmk if it throttles... I am testing Hawaii "Max Temp" which I believe would throttle if reached... I never did test that though. If set higher, would not throttle and hold true. Slight adjustment to the fan profile would help it hang.


Here are all 4 TimeSpy runs. Now that I am more familiar with the ATI Boot thingie, I can do this at will. Thanks for your patience

To recap: I have an R9 390 with the H55/Kraken G10 mod and two Noctua 80mm fans (one replacing the Kraken fan, one over the cooler/forward VRAM). My overclock is 1170MHz core and 1700MHz memory (6800 effective) with +65mV. How I judged this OC to be stable/enough juice is that I saw no artifacts in the Heaven benchmark. I saw zero artifacts in any of the TimeSpy runs. Hopefully this is the data you wanted.

Rom1; TS Score 4388 (GPU 4198, Max Temp 64C) http://i.imgur.com/KPo6yTe.png

Rom2; TS Score 4473 (GPU 4298, Max Temp 71C) http://i.imgur.com/gHfQGV5.png

Rom3; TS Score 4556 (GPU 4385, Max Temp 73C) http://i.imgur.com/Kc2Ttjv.png

Base w/OC ; TS Score 4581 (GPU 4415, Max Temp 51C)


http://imgur.com/RksDbz5


----------



## chris89

First, I need HWInfo VRM 1 & 2 temperatures at load. Much more important than the core temperature.

These are watercooling-only BIOSes, for nearly as fast as you can get. If you can also clock the ram to 1700, I have no idea what the resulting score would be. The max GPU core clock is 1250MHz. If your cooling is really as good as you think it is, we can try that later.

Actually, on air I hit my limit at no more than 1425mv. The max is 1.450v (more like 1.449v), but I black screen at anything above 1425mv. Overheated ASIC.

PS - I can clock my reference modded blower 390X to 1250MHz core and 1758MHz memory on air.

Flash_1st_RaFDX_1207_1407mv_1563_1000mv.zip 99k .zip file


Flash_2nd_RaFDX_1234_1415mv_1563_1000mv.zip 99k .zip file


Flash_3rd_RaFDX_1250_1425mv_1563_1000mv.zip 99k .zip file


----------



## stephenn82

huh, my 1150MHz at 1293mv core and 1375MHz at 988mv ran ok... not really much of a change. Tells me I may need a touch more voltage? Max core 61C, max VRM 64C

I will play more in two days; off to work for 24 hours again


----------



## chris89

Louder but really fast. Ramp the fan to 100% if you want the highest possible score. Put the ram at the highest value as well.

Stephenn82_benchmarks_only_on_air_1207_1407mv_1563_1000mv.zip 99k .zip file


Stephenn82_gaming_only_on_air_1207Mhz_1407mv_1270Mhz_888mv.zip 99k .zip file


----------



## RaFDX

Are you just looking for the VRAM temp from hwmonitor?? I don't have hwinfo. VRAM temp I get from gpuz. One stays somewhat stable,fluctuates a few c, the other starts at 25 and bounces around. Max I saw hit is 69. I keep gpuz up normally cause I've read that it's been an issue where the VRAM doesn't get enough cooling, but I haven't seen the one go past 69.

I haven't tried to tweak the OC since I installed the aio mod. I saw what the max avg was and set my goal to 1160. I tried 1170 on a whim, hit, and stopped. 1700 was the max I could do before the mod, but since I read that you get more gains boosting the core clock, I stopped. Figured it was overkill for the games I played and was happy.

I installed the AIO mod because, after the release of WattMan, I wasn't able to maintain the OC. And at base speeds playing Overwatch at ultra, my temps were 84-88C. After the swap my max temp range is 58-60. The real test comes this summer, since my baby house has poor cooling.


----------



## bluej511

Quote:


> Originally Posted by *RaFDX*
> 
> Are you just looking for the VRAM temp from hwmonitor?? I don't have hwinfo. VRAM temp I get from gpuz. One stays somewhat stable,fluctuates a few c, the other starts at 25 and bounces around. Max I saw hit is 69. I keep gpuz up normally cause I've read that it's been an issue where the VRAM doesn't get enough cooling, but I haven't seen the one go past 69.
> 
> I haven't tried to tweak the OC since I installed the aio mod. I saw what the max avg was and set my goal to 1160. I tried 1170 on a whim, hit, and stopped. 1700 was the max I could do before the mod, but since I read that you get more gains boosting the core clock, I stopped. Figured it was overkill for the games I played and was happy.
> 
> I installed the aio mod cause after the release of wattman, I wasn't able to maintain the OC. And at base speeds playing overwatch at ultra, my temps were 84-88C. After the swap my max temp range is 58-60. The real test comes this summer since my baby house has poor cooling.


Those are VRM readings, not VRAM; they're totally different components, and there are no VRAM temp sensors. VRM is the voltage regulator module and VRAM is the video random access memory; VRAM doesn't get hot, VRMs get toasty.


----------



## RaFDX

You'd be correct, matey; not sure why I added an "A".

Ran Superposition. Here are my results


----------



## chris89

rafdx, stock bios i presume?


----------



## gapottberg

Just a side question for everyone. I've been playing a lot of AoS:E lately. It is one of the more demanding games I have, and one I primarily use for measuring OC changes and benchmark stress testing.

But as of late I have had that RTS itch in the lead-up to DoW3's release... so I've been in game a lot more. It generally plays butter smooth on my setup, with one exception: I get seemingly random frame hitches, maybe about one every 5 minutes or so. Nothing serious, just a single stutter and back to butter.

Upon investigation with MSI Afterburner's monitoring software, I notice the problem seems to be an occasional voltage dip that mirrors a core clock dip. This phenomenon doesn't seem to happen during even prolonged stress testing... only in AoS:E gameplay. So a Time Spy or Fire Strike stress test will not show this occur even once... but in an hour of an AoS:E skirmish match I might see it 5 to 10 times.

I run my 390x with +50% power limit, a -31mv undervolt, and 1000/1250 core/mem frequency. I will do some testing on whether bumping my voltage back up a click or two helps... but it is curious to me that it doesn't appear in other stressing situations.

Makes me wonder if it's strictly an AoS:E problem as opposed to an OC problem or a power supply delivery problem.


----------



## chris89

Upload your bios...
















GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file


atiflash_274.zip 1214k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> rafdx, stock bios i presume?


if his is stock, and mine is modded, ***? That's a better score than I have netted thus far lol

oh wait, answered it here
Quote:


> Originally Posted by *RaFDX*
> 
> Here are all 4 TimeSpy runs. Now that I am more familiar with the ATI Boot thingie, I can do this at will. Thanks for your patience
> 
> To recap: I have an R9 390 with the H55/Kraken G10 mod with 2 noctura 80mm fans (one replacing the kraken, one over the cooler/ forward vram). My overclock is 1170MHz Core and 1700MHz Memory (6800) with +65mV. How I judged this OC to be stable/ enough juice is I saw no artifacts in heaven benchmark. I saw zero artifacts in any of the TimeSpy runs. Hopefully this is the data you wanted.
> 
> Rom1; TS Score 4388 (GPU 4198, Max Temp 64C) http://i.imgur.com/KPo6yTe.png
> 
> Rom2; TS Score 4473 (GPU 4298, Max Temp 71C) http://i.imgur.com/gHfQGV5.png
> 
> Rom3; TS Score 4556 (GPU 4385, Max Temp 73C) http://i.imgur.com/Kc2Ttjv.png
> 
> Base w/OC ; TS Score 4581 (GPU 4415, Max Temp 51C)
> 
> 
> http://imgur.com/RksDbz5


----------



## chris89

We don't have any idea if he ever modded the bios or anything haha

By the way... set 4:2:2 color in AMD Settings ("Preferences" > "More Settings") + 8bpc color format (heck, set Windows to 16-bit color as well...) & the Windows High Performance plan + tessellation disabled, and run again on whatever BIOS you're running? 1207MHz?!?


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> We don't have any idea if he ever modded the bios or anything haha
> 
> By the way ... Set 4:2:2 color AMD Settings "Preferences" "More Settings" + 8bpc color format (Heck set windows to 16 bit color as well...) & Windows High Performance + tesselation disabled and run again on whatever BIOS your running? 1207Mhz?!?


It looks like he has a sweet hardware mod, an H55 cooler hooked to a water block to deal with all the heat thrown at the stock bios. If he put a custom bios on there, he may drop his temps down a lot more.

Quote:


> Originally Posted by *RaFDX*
> 
> Here are all 4 TimeSpy runs. Now that I am more familiar with the ATI Boot thingie, I can do this at will. Thanks for your patience
> 
> To recap: I have an R9 390 with the H55/Kraken G10 mod with 2 noctura 80mm fans (one replacing the kraken, one over the cooler/ forward vram). My overclock is 1170MHz Core and 1700MHz Memory (6800) with +65mV. How I judged this OC to be stable/ enough juice is I saw no artifacts in heaven benchmark. I saw zero artifacts in any of the TimeSpy runs. Hopefully this is the data you wanted.
> 
> Rom1; TS Score 4388 (GPU 4198, Max Temp 64C) http://i.imgur.com/KPo6yTe.png
> 
> Rom2; TS Score 4473 (GPU 4298, Max Temp 71C) http://i.imgur.com/gHfQGV5.png
> 
> Rom3; TS Score 4556 (GPU 4385, Max Temp 73C) http://i.imgur.com/Kc2Ttjv.png
> 
> Base w/OC ; TS Score 4581 (GPU 4415, Max Temp 51C)
> 
> 
> http://imgur.com/RksDbz5


Quote:


> Originally Posted by *chris89*
> 
> We don't have any idea if he ever modded the bios or anything haha
> 
> By the way ... Set 4:2:2 color "More Settings" & Windows High Performance + tesselation disabled and run again on whatever BIOS your running? 1207Mhz?!?


the 1207/1563 one hard locked my card; it went black. I had to fiddle with my onboard video to get my PC working and reflash the 1133/1270. I will check out why it failed tomorrow. At work again... this three-day shift cycle sucks

It's either way too much voltage, like you said, or something else. I will check it out before I go to Cali, hopefully soon.


----------



## chris89

Cool dude. You're on air, right? Gonna need a beast fan profile to keep the ASIC under control... there's a fail-safe in my BIOS, an 88C ASIC limit, so no matter what the card will still work.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Cool dude. your on air right? Gonna need a beast fan profile to keep ASIC under control.. it's a fail safe on my BIOS 88C ASIC so no matter what card will still work.


yeah, still on the stock MSI Twin Frozr V cooler. It works pretty well for a stocker. Cool and quiet even up to 65%, then it starts making some noise lol.

I will flash the 1207 bios with the 1270 ram speed to see if the card will even work... then figure out boosting the ram speed a little later.


----------



## stephenn82

Quick question:

The power slider in AB that boosts the power by a percentage... how does that work with the stock bios? How does it translate to a custom BIOS?

Can someone set anything in WattMan or with AB on top of what was built into the BIOS? What implications are created when doing so?

I haven't messed with anything; I don't want to fry my card by double-boosting power if it's already kicked up a notch in the bios.


----------



## chris89

Stock power limit is about 200 watts; my bios opens it up to literally 57,599 watts, which gives all the headroom for zero throttling and max performance.

Dragging the slider to +50% turns 200 into 300... no biggie, it's not permanent or anything, just a software override that barely helps. These cards need power for the big end clocks like my custom bios mods.

PS - You can fry the card on the stock BIOS... you can't on my BIOS, i.e. max ASIC shutdown at 88C, which is below the damage limit. Stock, I believe, is nearly 100C or greater... well into the damage zone. That's why we see so many of these cards that faulted out... Easy fix... a 400F hot air blower and flux on all the cap solder points and the mPGA main core diode... 10 minutes or less and it's good to go.

PS #2 - The black screen equals me setting the ASIC limit too low; it prevents damage, but I should leave it at 88C so it won't black screen on you at the big end. So that's the issue... let me know when you want a new bios to fix this.
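Since the slider is a percentage offset on the board power limit, the arithmetic behind it looks like this (a sketch only; the ~200 W base figure is the estimate from this post, and the helper name is made up):

```python
# The AB/WattMan power-limit slider applies a percent offset to the
# board's base power limit (estimated at ~200 W in the post above).
def effective_limit(base_watts, slider_pct):
    return base_watts * (1 + slider_pct / 100)

print(effective_limit(200, 50))   # +50% on a 200 W limit -> 300.0 W
print(effective_limit(200, -50))  # the slider goes negative too -> 100.0 W
```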

Stephenn82_1200Mhz_1388mv_1270Mhz_888mv_88C_ASIC.zip 99k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Stock power limit is about 200 watts, my bios opens it up to literally 57,599 watts
> 
> 
> 
> 
> 
> 
> 
> Which gives all the headroom for Zero Throttling and Max Performance.
> 
> By dragging slide to +50.... Turning 200 into 250... No biggie it's not permanent or anything. Just a software override that barely helps. These cards need power for the big end clocks like my custom bios mods.
> 
> PS - You can fry the card on Stock BIOS... You can't on my BIOS... ie Max ASIC Shutdown 88C which is below the damage limit. By stock I believe is nearly 100C or greater... Well into the damage zone. That's why we see so many of these that faulted out... Easy fix ... 400F hot air blower and Flux on all Cap solder points and the mPGA main Core Diode... 10 minutes or less and done good to go.
> 
> PS #2 - Black Screen equals me setting ASIC too low and prevents "Damage" which I should leave at 88C so it won't black screen on ya on the Big End. So that's the issue... LMK when you want a new bios to fix this.
> 
> Stephenn82_1200Mhz_1388mv_1270Mhz_888mv_88C_ASIC.zip 99k .zip file


Thanks for the BIOS and the info. Very helpful

I havent really looked around on my own to find it previously, but started to take an interest in the matter. Its easier to be coached along the way instead of finding it out in a book or reading all the time.

Before, when I was all stock, I went into the AB "unlocked settings" for my card, and I remember it being set to 94C. That may still be a little steep, but 88C should be enough. The thing is, when my GPU would black screen it was just after booting up fully; I would start to log in, and the black screen would happen. Not a lot of throttling going on at idle, right?

yeah, reflow the solder joints to make a better connection... happened with the PS3s and quite a few of the first few generations of the Xbox 360s. I have a PS4 to fix for my wife's friend/family. It may just be loose solder that isn't holding proper connections. That would be an easy fix.


----------



## chris89

Right on. At 1400mv+ and 1200MHz we are in that narrow window of possibility, thermally speaking, which is why a precisely tuned fan profile is required: the ASIC temperature on these cards rises fast, they're animals. So the ideal fan profile starts the higher fan RPM at a lower temperature than would seem necessary. The ASIC is floating up there without you knowing it, and its temperature is controlled by the fan; we can't know exactly how hot the ASIC is, but we can control it by manipulating the fan profile. I have thoroughly tested this to get 1250MHz possible on the reference 390X using the beefier reference 290X heatsink/blower, since it runs 25C cooler than the 390X assembly with its tiny, RX 480-sized core heatsink.

The reference blower is the best air cooler; nothing else comes anywhere near its thermal performance. The gust that comes off the back of my modded reference 290X & 390X can be felt from 10-15 feet away... I have it unrestricted, with the DVI port removed, so all of that crazy amount of heat is allowed to escape most effortlessly.


----------



## stephenn82

wait, the 290x fan is cooler? is it the same with the 390/290 cards?

Next question... will a 290x fan fit a 390? lol. I would assume yes, as it is almost the same card, but with some of the CU's disabled. I have 40/44. I CANT unlock the other 4, they're physically disabled.








I always get the cards that have the potential to unlock, but physically cant. The only card that I owned that could be unlocked was my GeForce 5700 waaaay back in 2005. I think it was a 5700LE turned into a 5700 or 5700 Ultra...saved that money!


----------



## chris89

Whatever you do on your next upgrade don't buy the little "weaker" model and tell yourself "I Could Overclock it and smoke the better Model". This is what I thought over the years. Come to find out, no way in heck can the "weaker" model ever catch the "Best Model" of the series.

That's why I went 380X, 290X, 390X, RX 480. They are faster, so go for the beastliest card and reference I might add on your next purchase. They are the very most capable cards out there. Both Physically & Thermally. The blowers are Top-Notch performers when modded.

By the way, yes, the 290X cooler fits the 390/390X as long as it's a reference-design PCB... Compare the PCB's... I use a 290X reference blower on the reference 390X... They are exactly the same cards. The 390X is just a finer-tuned version, able to use less power at idle and load.

For instance, back when I bought my first GTX 470, I was all Nvidia at the time. I thought I was probably as fast as the 480... Nope, the 480 was a way better card both thermally and physically... You could tell it was a finer card in the way it overclocked, its thermals and straight-up performance. I bought the 480 years later just to see what I had been missing out on. So yeah, the "best" model of the series is so much better in every way: a finer build with more care and emphasis on detail than the "little" dumbed-down weaker cards.


----------



## stephenn82

I would be stuck with an MSI AIB cooler then. I will look into that.

Yeah, I'm at the point where just buying budget doesn't cut it. The 390 is nice; it drives a lot of games at 1440p ultra pretty well.

I have spent 925 bucks on video cards since 2012. 5 years...900 plus freaking bucks. Know what I can buy for 700 that would probably easily last 5 years?

Know what I can buy for 400-450 used from a friend that would last the same?

I think I will go that route.

You make custom BIOS for the green team?


----------



## chris89

Yes sir, sure do make custom bios for the Green Team, haha. Tons of experience with the Green Team... way more than AMD, mind you









I just took the AMD route this time... Running on flippin Fermi Power in 2016 hahaha ... I was going to go GTX 760 4GB for efficiency and just up and bought the 380X 4GB and boy was I happy dude... 4K VSR I was like this is so awesome from 1.5GB GTX 580 to 4GB 380X that I had totally gone scientific on the BIOS mastering of Tonga. That way I pulled the weakling power limited 1040mhz base to 1,250Mhz core DeLimited... Didn't have reference 380X and still don't and I still never saw what the 380X was truly all about... I hope one day a 380X Reference pops up on eBay with a beautifully attractive price of 99.99 with way more than 25 in stock... Because Reference is the way to go... Like I knew about Nvidia... Only on Reference Nvidia could the cards see clocks no others could see on air... Not including expensive water cooling spenders...

Yep haha I've said enough in this post haha


----------



## stephenn82

VSR doesn't work on my card properly...







dang it AMD!

Far better hardware... doesn't support a simple thing. Oh well.

Soon enough, I may be going with a 1070 or 1080...or hell, a 1080 Ti.


----------



## THUMPer1

First image is my own overclocking setup: 1150/1600 +40mv.
Second one is the 1186/1563 that you cooked up. It's not stable; I had to add more voltage, +60 in MSI Afterburner. Just an FYI, I know I was not supposed to. But I'm a mad man


----------



## stephenn82

nice results, Thumper1!!
Quote:


> Originally Posted by *chris89*
> 
> Yes sir sure do make custom bios for the Green Team. haha Tons of experience with the Green Team... Way more than AMD mind you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just took the AMD route this time... Running on flippin Fermi Power in 2016 hahaha ... I was going to go GTX 760 4GB for efficiency and just up and bought the 380X 4GB and boy was I happy dude... 4K VSR I was like this is so awesome from 1.5GB GTX 580 to 4GB 380X that I had totally gone scientific on the BIOS mastering of Tonga. That way I pulled the weakling power limited 1040mhz base to 1,250Mhz core DeLimited... Didn't have reference 380X and still don't and I still never saw what the 380X was truly all about... I hope one day a 380X Reference pops up on eBay with a beautifully attractive price of 99.99 with way more than 25 in stock... Because Reference is the way to go... Like I knew about Nvidia... Only on Reference Nvidia could the cards see clocks no others could see on air... Not including expensive water cooling spenders...
> 
> Yep haha I've said enough in this post haha


The real kicker is, I almost bought the 380x, but my buddy talked me into getting the 390 over it. oh well.

To go with the 1080 my friend has, he has the FTW Hybrid model... so got that going for me. Or for him... since he is waiting for the 1080 Ti to bring more cards onto the market.


----------



## chris89

AMD bro you can't imagine the numbers on the next flagship.. wait it out bro.

RaFDX.. right on, nice score.. new bios? I'll add voltage.. the proper amount


----------



## RaFDX

Hey guys!

Had to drive to Jersey and back, home now. Yes, stock bios with the OC done in MSI Afterburner. I'd like to min/max on performance while driving down voltage. Looking at THUMPer1's numbers, maybe I'm not tweaked right/ over-volting??

*Here is my setup*



*GPU mod*



*Airflow Map* (I have 2 fans as exhaust for the radiator)


----------



## gapottberg

Quote:


> Originally Posted by *RaFDX*
> 
> Hey guys!
> 
> Had to drive to Jersey and back, home now. Yes, stock bios with the OC done in MSI Afterburner. I'd like to min/max on performance while driving down voltage. Looking at THUMPer1's numbers, maybe I'm not tweaked right/ over-volting??
> 
> *Here is my setup*
> 
> 
> 
> *GPU mod*
> 
> 
> 
> *Airflow Map* (I have 2 fans as exhaust for the radiator)


That is an interesting fan set up. My mind says 3+4 should be intakes with 9+10 being outtakes for best results...but if it works it works.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> AMD bro you can't imagine the numbers on the next flagship.. wait it out bro.
> 
> Rafdx.. right on nice score.. new bios? Ill add voltage .. proper amount


Yeah, I'm holding off for Vega. To be honest, I could hold out for the little Vega (Vega 21 or whatever their naming convention is).

I just get tired of AMD drivers on the older boards not working well. It took them over 7 driver updates to fix the multi-monitor bug. I had the 150mhz ram bug and I DIDN'T EVEN HAVE two monitors...

The lag issue seems to have gone away with your bios. Maybe I did something wrong in AB? Who knows! Maybe their bios just sucks?


----------



## chris89

Yes, cards are baller... Bios is... you know







Haha, gotta mod it first thing.. at least 98% of bugs can be worked out with the bios... then 1% are physical/thermal with the PCB, and the last 1% are driver software bugs.

Nice man. I'd re-think 9, 10. As we know, heat rises... rises to the top of the case and gets swept back in through the CPU cooler, then out through a single 120mm exhaust of smaller diameter than all the air feeding it. That causes air to become turbulent and bounce back into the case, and the whole process starts all over again.

Ideally you want the front of the case pulling cool air in over the GPU/northbridge/southbridge/memory/CPU, with all of the fans on the top and back as exhaust... That way the air is swept in, heated, and blown out very efficiently... going where it wants to anyway.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Yes, cards are baller... Bios is... you know
> 
> 
> 
> 
> 
> 
> 
> haha gotta mod it 1st thing.. all bugs can be worked out a bit with bios at least 98% of them... then 1% all physically thermally with the PCB and then 1% driver software bugs.
> 
> Nice man. I'd re-think 9, 10. As we know heat rises... Rises to the top of the case and swept back in through the CPU cooler and through a single 120mm exhaust of smaller diameter than all the air feeding it. Which would cause air to become turbulent and bounce back into the case and the whole process starts all over again.
> 
> Ideally you want the front of the case sucking in cool air in over the GPU/ Northbridge/ Southbridge/ Memory/ CPU to all of the fans on the top and back as exhaust... That way the air is swept in, heated and blown out very efficiently... Going where it wants to anyway.


I concur, you may even see a few degrees of temp drop all around by optimizing case flow. Try it out and let us know...it would take you 10-15 minutes of time and a beer to keep you company


----------



## RaFDX

Quote:


> Originally Posted by *gapottberg*
> 
> That is an interesting fan set up. My mind says 3+4 should be intakes with 9+10 being outtakes for best results...but if it works it works.


I read from other H55 modders that you want the radiator as exhaust, because otherwise you'd be blowing hot air back into the system. It really does get hot.

So I was thinking of getting rid of fan 8 altogether, because I believe it's so close to the CPU cooler fan.

Do you think flipping 9 & 10 to exhaust would be better? Or would blowing cooler air directly onto the CPU be better? I am def open to ideas regarding cooling.

Even with all the fans it's not loud. There are only two 120mm fans running faster than 40%, and they are the CPU's. The two Noctuas are running at full speed, but they are so quiet. The rest are Cougar Vortex PWM.


----------



## chris89

3+4 is a real pain... 300-400 watts of GPU heat dumping into the case.... Unless you had a longer hose and placed the radiator at the top as exhaust, to keep the GPU's heat from moving through your case... In the current 3+4 setup, it's just dumping that heat directly back into the other front 120mm intakes...

What would be really interesting is to run the hoses out the top of the case and create a radiator swivel that can be laid flat for moving... and raised 90 degrees, sucking cool ambient air in and blowing it right back out into the room... avoiding saturating the case with heat...

Otherwise that's a real tough one... maybe I'd put the GPU radiator down where all the PCI expansion slots are, to blow directly out the back of the case... supplemental intake airflow from the front intakes, directed straight out of the case... ya know?


----------



## RaFDX

Yeah, which is what I was worried about originally (heat rises) when I put the radiator down there. I don't think I have enough hose length to get it to the top, nor the clearance with the CPU cooler to put it as my lone exhaust. Maybe I should switch 3 & 4 to the top where fan 1 is and swap 9 & 10? I am delidding my Skylake with two friends in about two weeks, so I'd have the room to do everything then.

Keep the advice coming!


----------



## chris89

Nice. I think actually it would be better to put the radiator in the spot of #8 as exhaust... one fan mounted on the outside, one inside... As it's 90 degrees from the GPU core it's cooling, the heat will move best at a 90-degree angle up and out the back. CPU heat soak will be minimal and the whole case will cool down significantly... I would then actually transition the CPU cooler to exhaust straight up out the dual 120mm exhaust fans... Create a brace mount to keep the cooler from moving, however you feel would work best.

Then switch all the front fans to full Intake Inlet and all Cool air In and All Hot air Directly out...

Switch out the cooler haha that thing is flippin huge hahahaha ... Go Thermalright True Copper... those are sooo sweet man if you can find one... coolest of all coolers by 10-15C with dual Delta 500CFM intake exhaust at 5V operation ... 250CFM each... 20 CFM per Volt... 5 volts 200 CFM Quite Quiet.


----------



## gapottberg

Quote:


> Originally Posted by *RaFDX*
> 
> I read from other H55 modders that you want the radiator as exhaust cause then you'd be blowing hot air back into the system. It really does get hot
> 
> So i was thinking of getting rid of fan 8 all together cause I believe its so close to the CPU cooler fan.
> 
> Do you think flipping 9 & 10 out would be better? Or blowing cooler air directly on to the CPU be better? I am def open to ideas regarding cooling.
> 
> Even with all the fans it's not loud. There are only two 120MMs fans running faster than 40% and they are the CPU's. The two noctura's are running at full speed, but they are so quiet, The rest are Cougar Vortex PWM


I anticipated this response. You are correct in assuming that if you set up the radiator as an intake... it will raise the ambient air temp of the case. This is give and take: you decrease ambient air temp on your GPU by drawing cool air through it... and increase ambient for your other components.

In your scenario you are pulling warm air into your radiator from your case, so your GPU is being suboptimally cooled. It is give and take depending on how you do it.

Also... in your current setup, you actually sabotage the benefits of both options. Look at the position of the GPU radiator. Where is it pouring out all that "hot" less dense air? Right below your two front panel intake fans. Which means a good portion of that hot air is actually being recycled back into your case by your intakes... and back through your GPU radiator. Try flipping the fans. You may be surprised.


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> Yeah, which is what i was worried about originally (heat rises) when I put the radiator down there. IDT I have enough hose length to get it to the top nor the clearance with the CPU cooler to put it as my lone exhaust. Maybe I should switch 3 & 4 at the top where fan 1 is and swap 9 & 10? I am delidding my Skylake with two friends in about two weeks, so I'd have the room to do everything then.
> 
> Keep the advice coming!


Delidding... you said you were up near NJ recently. I would like to delid, but I'm not sure about buying this tool for a one-time deal. How likely is it that the next-generation Intel chips will be compatible? Like past Kaby Lake/Coffee Lake/Birthday Cake, whatever they have up their sleeves.

I am PMing a question.


----------



## RaFDX

Replied.

But yes, hordes of Kaby Lake guys have already done the delidding with crazy results.

Not to move away from our glorious graphics cards, but here are some links:

7700k guy


http://imgur.com/CC2Qu

 and

https://www.reddit.com/r/4v1umj/48_ghz_145_vcore_i7_6700k_cpuz_stress_test_results/

i'm going to have to revamp my fan setup. Seems there are improvements I can make, just need to route them correctly. Keep the ideas coming


----------



## stephenn82

Huh, even with a core speed of 1200 there's not a huge improvement over 1133, even against the 1133 core / 1375 memory run.

http://www.3dmark.com/compare/spy/1536736/spy/1537220/spy/1570372

It was hot as balls though, compared to the other runs. see?


superposition at 1200 core


1133 core


----------



## stephenn82

I took your 1133 bios, touched up the core clock to 1150, bumped vram voltage from 888 to 950, and clocked the memory at 1375. It runs pretty well, about as cool as your 1133 bios, so that's good. I need to figure out the default vram voltage to get 1500 running. I want to test actual gameplay with the 1270 ram speed vs default 1500 to see if I notice any difference. A few frames is not a big deal... but if it's like 10-20+, I may consider just touching up the frequency over 1300. Going to test the 1375 vram speed now.

Chris89, thanks for all the work you do to help people out...really amazing! Thanks a lot man.


----------



## chris89

You're welcome man. Yeah, those thermals are a pain haha. I always recommend reference for this reason, as they don't have these thermal issues if you upgrade the thermal pads.

PS - I wanna throw up some of my Superposition results... been on RX 480 for a month haven't even touched the molten 390x in a minute...

I'll mod something up see how she runs haha PS - Lacking results from T7500 PSU... big scores on Dedicated GPU only 1000W PSU.


----------



## stephenn82

True, true. I will probably do that in future.

So, strange thing... the 1150/1375 ran decent... I bumped the ram speed to 1425 and it ran even better, slightly hotter.

Max core was 67, VRM was 73. no sweat I thought.

My Valley score was almost 3000, it was 2922.

I decided to pull the volts down just a tad via Wattman. Drastically lowered my score.

I think with your custom power build in the bios, the software hosed up the actual programming.

lower thermals and power consumption a lot.


with a hit to performance.


----------



## kilyan82

1144/1500 sapphire r9 390 nitro


1144/1500


----------



## chris89

Here's my base score, first-time run on 16.9.2 WHQL, reference 390X with the 290X reference blower fan. Only on 1 Xeon at the moment; the other one cut out, and I'm in the process of fixing it.

It appears to be topping out around 3,200 to 3,400 points. Probably more if I have a score to aim for, probably something 1070-ish. I'm on PCIe 2.0 though at 2.6Ghz.

1133mhz core : 1250mhz memory 888mv

We could only imagine the max of my card...

1250mhz core : 1758mhz memory 1000mv



2nd Run through :

1207mhz core : 1250mhz memory 888mv



Only got up to a score of 2999: pretty weak cheese if you ask me... severely limited by PCIe 2.0.


----------



## stephenn82

Nice... those extra 4 compute units really shine at lower clocks lol. Man, if I'd waited just another month in Jan 2016, the R9 390X was about the same price I paid for the 390. Live and learn...

Nice scores man!


----------



## RaFDX

Can you explain the high scores with lower memory clocks? Is it cause of the higher clock speeds and low power or have you tweaked the ram timing?


----------



## chris89

Thanks. I don't know how to adjust timings but best left stock anyway. Timings seem to only help mining anyway.

Yeah, 1,250mhz memory, maybe 1,270mhz, at 888mv is all that's possible on this PSU. Once I went to 1563mhz memory at 1000mv, the max only went up to near 30fps, and it always shut down before completing. I'd need a better PSU or a dedicated 1kw unit, but didn't care enough to go through all that.

Yeah for sure, go with the best model ya know? haha I like getting them used or refurb like my 390x for 225. You need a Ryzen at 4Ghz even R7 1700 to let the 390x shine just a little bit.

The PCIe bus limits this card... If it were spec'd out for "AMD Infinity Bus" specs on the PCIe side, connecting up to the "Infinity Fabric", then no bottleneck would exist on this highly limited PCIe bus. Even 4.0 & 5.0 will limit the 390X. Get the system ram to the speed of the video ram, 320GB/s+ across copy/read/write on system memory & video memory; now that's where it's at.

The boards would be inline with PCIe specification yet for AMD GPU's would use the "Infinity PCIe Bus" that rides on no limitation with the system memory.

So yep, that's about it for now. haha
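As a rough sanity check on the PCIe side of this (my own back-of-envelope sketch, not from anyone's post above; figures are the usual per-lane effective rates from the PCIe specs):

```python
# Effective per-lane PCIe bandwidth by generation (GB/s, after encoding overhead).
PER_LANE_GBS = {
    "PCIe 2.0": 0.5,    # 5 GT/s with 8b/10b encoding
    "PCIe 3.0": 0.985,  # 8 GT/s with 128b/130b encoding
    "PCIe 4.0": 1.969,  # 16 GT/s with 128b/130b encoding
}

def x16_bandwidth_gbs(gen: str) -> float:
    """One-direction bandwidth of a full x16 slot for the given generation."""
    return PER_LANE_GBS[gen] * 16

for gen in PER_LANE_GBS:
    print(f"{gen} x16: {x16_bandwidth_gbs(gen):.1f} GB/s")

# Hawaii/Grenada local VRAM is ~384 GB/s (512-bit bus, 6 Gbps GDDR5),
# so even PCIe 4.0 x16 (~31.5 GB/s) is an order of magnitude slower.
```

That gap is normal, though: the card works out of local VRAM most of the time, so the slot mainly matters for uploads and streaming, which is why PCIe 2.0 hurts some benchmarks more than actual gameplay.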


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> Can you explain the high scores with lower memory clocks? Is it cause of the higher clock speeds and low power or have you tweaked the ram timing?


For me, or for chris89? We have the R9 390, with 40 of 44 compute units... he has the 390X in all of its glory... that is the difference.
Quote:


> Originally Posted by *chris89*
> 
> Thanks. I don't know how to adjust timings but best left stock anyway. Timings seem to only help mining anyway.
> 
> Yeah 1,250mhz memory maybe 1,270mhz memory on 888mv is only possible on this PSU. Once I went to 1563mhz memory at 1000mv, max only went up to near 30fps but always shut down before complete. Need a better PSU or dedicated 1kw, but didn't care enough to go through all that.
> 
> Yeah for sure, go with the best model ya know? haha I like getting them used or refurb like my 390x for 225. You need a Ryzen at 4Ghz even R7 1700 to let the 390x shine just a little bit.
> 
> The PCIe bus limits this card... If it were speced out for "AMD Infinity Bus" specs on the PCIe that connects up to the "Infinity Fabric" then no bottleneck would exist on this highly limited PCIe bus. Even 4.0 & 5.0 will limit the 390x. Get the system ram to the speed of the video ram at 320GB/s+ across Copy/ Read/ Write on System Memory & Video Memory now that's where it's at.
> 
> The boards would be inline with PCIe specification yet for AMD GPU's would use the "Infinity PCIe Bus" that rides on no limitation with the system memory.
> 
> So yep, that's about it for now. haha


Yeah, that is legit. I guess when this "Optane" memory thing is more widespread it would be faster...not quite 320GB/sec, closer to 32GB/sec...but better than what we currently have.


----------



## chris89

@RaFDX @stephenn82 @kilyan82
Great Scores Guys!!! Keep em coming bros

Haha, for sure. Maybe the GPU's would stay inline with current-gen PCIe specifications, yet be connected to Infinity Fabric: an AMD "PCIe-XFinity" of no numbered specification, limited only by the throughput of the system memory. The boards would need to be nuts though, because of the heat in the traces on the PCB. You'd need some sort of trace heatsink to cool the traces at that point...

I'm so happy I went with the 390X. It's also just a dumbed-down version of the fully unlocked variant of Hawaii that only someone real special gets to test solo on their rig. HAHA

Rather than 64/176... maybe the 390X was just 50% of the fully unlocked Hawaii silicon... it would have used way too much power I'm sure... It's all good though haha


----------



## lanofsong

Hey AMD R9 390/390X owners,

We are having our monthly Foldathon from Monday 17th - Wednesday 19th - 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

April 2017 Foldathon

BTW - make sure you sign up









To get started:

1. Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## gordesky1

Hmm, with today's weather it seems like the 1204-1270 bios is running way too hot, even with max fan..







Also getting a good bit of white dots in DiRT 3, mostly when the temp gets over 80.



Chris, can you make me a couple that are 1133, 1150 and 1173? With 1270 memory, and a couple with 1500 memory if doable?

Think the 1204 is a bit too hot for me with this weather lol...

This X9 case really holds a lot of heat... the case itself was really warm.. lol


----------



## stephenn82

Quote:


> Originally Posted by *lanofsong*
> 
> Hey AMD R9 390/390X owners,
> 
> We are having our monthly Foldathon from Monday 17th - Wednesday 19th - 12 noon EST.
> 
> Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> April 2017 Foldathon
> 
> BTW - make sure you sign up
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> 
> Enter your passkey
> 
> Enter Team OCN number - 37726
> 
> later
> 
> lanofsong


Oh snap. I leave then for SoCal... can't really remote in to start and stop it... unless you all have tips and tricks?


----------



## chris89

@stephenn82 You'd need an ultra-stable, low-temp bios, probably 1133 : 1250, for folding... It's only a 3-day run, which is fine.

Download app, sign up, log into app... Begin folding and your folding amount gets recorded online...









@gordesky1

Yet again your avatar fits that Core & VRM temperature perfectly!!! So funny man.

Yes sir, will do.


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> Yet again your avatar fits that Core & VRM temperature perfectly!!! So funny man.
> 
> Yes sir, will do.


hehe

thanks man


----------



## chris89

If anyone has the cash for $5 solder flux, a $10 hot air blower, and 69 bucks for this card (listed as a 290X in the title, but believed to be a 290)... this may turn out to be a great buy... Just pull the cooler, set the card flat on tin foil, and, with the blower on its low setting, aim at the center of the core die and rotate in a circle; I would give it 60 seconds. Add a little liquid flux around the edges of the chip before you begin, so when heated the flux seeps under the core and reaches all the solder balls.

http://www.ebay.com/itm/262928999465?_trksid=p2060353.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT

http://www.ebay.com/itm/131373589900?_trksid=p2060353.m2749.l2649&ssPageName=STRK%3AMEBIDX%3AIT

Never cover the tip or press it directly against anything too long, or you could burn out the blower... the low setting is the only setting you should ever use. Hold at a 1-3" distance from the core die for 60 seconds, moving in circles, as you watch the flux seep under the chip... it will reflow it.

http://www.ebay.com/itm/New-Heat-Gun-Hot-Air-Dual-Temperature-4-Nozzles-Power-Tool-1500-Watt-W-Heatgun-/112372446019?hash=item1a29eb7343:giEAAOSwN6JY8x7F

@gordesky1 These are all from scratch... I moved all the old bios to an "Old" folder. Started fresh, new perspective experimental untested bios.

I have not tested the behavior of "Max Temp" on Hawaii/Grenada, but I think it would throttle and not allow the GPU to rise past that temperature, as long as the values are in line.

Max temp of 75C here; post results... I'm interested, since I think it would throttle the clock at 75C... I did test an 84C max temp at 1250mhz core and it was stable and did not crash after running Superposition over and over again... I think it actually works...

Let's see...?

Gordesky1_75C_Max_Temp_1133_1333_1250_888.zip 99k .zip file


Gordesky1_75C_Max_Temp_1144_1333_1250_888.zip 99k .zip file


Gordesky1_75C_Max_Temp_1157_1333_1250_888.zip 99k .zip file


*If this "Max Temp of 75C" does not work on a DeLimited BIOS... then we can approximate these temperatures with math.

We saw the VRM hit 92C. We also saw input power of 350 watts... 350 watts divided by 92C = about 3.8 watts per degree (means nothing for now) haha

At least this can tell us what TDP to set if we want no more than 75C... Basically 92 divided by 75 = 1.2267, meaning we need to reduce TDP by that factor: 350 / 1.2267 ≈ 285 watts. New BIOS on the way... LMK on the above BIOS in the mean time.*
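The scaling above boils down to a one-liner (my own sketch; it assumes VRM temp is roughly linear in board power, which is only an approximation):

```python
# Linear scaling assumption: VRM temperature roughly proportional to board power.
observed_watts = 350.0   # input power seen in HWiNFO
observed_temp_c = 92.0   # VRM temp at that power
target_temp_c = 75.0     # temp we want to stay under

scale = observed_temp_c / target_temp_c   # 92 / 75 ≈ 1.2267
target_watts = observed_watts / scale     # 350 / 1.2267 ≈ 285 W

print(f"scale factor: {scale:.4f}")
print(f"suggested TDP limit: {round(target_watts)} W")
```

Same answer, 285 W, but the formula makes it easy to redo for a different target temp (e.g. 70C gives about 266 W, close to the 267 W guess later in the thread).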

Gordesky1_75C_Max_285W_1133_1333_1250_888.zip 99k .zip file


Gordesky1_75C_Max_285W_1144_1333_1250_888.zip 99k .zip file


Gordesky1_75C_Max_285W_1157_1333_1250_888.zip 99k .zip file


Gordesky1_75C_Max_285W_1166_1366_1250_888.zip 99k .zip file


Gordesky1_75C_Max_285W_1200_1400_1250_888.zip 99k .zip file


----------



## gordesky1

Tried the Gordesky1_75C_Max_Temp_1144_1333_1250_888.zip from the top files.

Here's the HWiNFO readout while playing DiRT for about 30 mins. No crashing.


----------



## chris89

That's nice man, but I don't think you could walk away and leave it sitting in game, haha, right? That's Hawaii/Grenada for ya. So for your GPU we must define the TDP... Try the 1144 285w TDP and post HWiNFO.









Reason being, judging by the 30-min run those temps are still on the rise, so it would likely crash eventually. Even 285w may be pushing it. Continuous walk-away stable is maybe 267w.


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> That's nice man but I don't think you could walk away while sitting in game haha Right? That's the Hawaii/ Grenada for ya. So for your GPU we must define the TDP... Try 1144 285w tdp and post Hwinfo.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Reason being judging by 30min, those temps are on the rise so would likely crash eventually. Even 285w may be pushing it. Continuous walk away stable maybe 267W.


Here's the Gordesky1_75C_Max_285W_1157_1333_1250_888.zip. It seems to run a bit cooler than the top ones. 54 mins in DiRT 3 and it seems to max out around 76C core and 75C VRM, steady.

Really need to buy those thermal pads lol..


----------



## chris89

Yeah, for sure man. So even though it ran cooler... I'm sure you may have noticed a slight decrease in performance & fps, regardless of the clock holding... Am I right?

As we cool down, we slow down... So keep thermals ultra low... Run "hot" DeLimited bios clocks & settings while running cooler and pull off lots of performance... This makes sense I'm sure haha

Is it the fan that's too loud? You really want to use a hot air blower like I linked for $10, remove the DVI ports, and mod the mounting bracket as well as your case. That way you could run a peak-performance reference blower... they are the best out there by far... they could certainly be perfected immensely, however nothing else has achieved what the reference blower has in thermal performance characteristics.

As for me, I would never go out and tell anyone the reference blower is way too hot... I'd say it has potential... not whine... and just straight up do something about it... Here's my RX 480... I had a spare bracket and modded it to work, so as to keep the original in tip-top shape... Only removed the shroud to remove the original bracket, modded the one lying around and modded the case and voila...


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> @stephenn82Need ultra stable bios low temps probably 1133 : 1250 for folding... Only a 3 days run which is fine.
> 
> Download app, sign up, log into app... Begin folding and your folding amount gets recorded online...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @gordesky1
> 
> 
> Yet again your avatar fits that Core & VRM temperature perfectly!!! So funny man.
> 
> Yes sir, will do.


I will get on that. It makes almost no heat and runs very well. The 1150 I made does the job in gaming, man, but the heat output is so much at 1425mhz ram speed. lol


----------



## chris89

haha I don't fold though just not into it anymore...


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> Yeah for sure man. So even know it ran cooler... I'm sure you may of noticed a slight decrease in performance & fps? Regardless of the clock holding... Am I right?
> 
> As we cool down, we slow down... So keep thermals ultra low... Run "Hot" DeLimited Bios clocks & settings while running cool-er and pull off lots of performance... This makes sense I'm sure haha
> 
> Is it the fan that's too loud? you really want to use a Hot Air Blower like I linked for $10 and remove the DVI ports and mod the mounting bracket as well as you case. That way you could run a peak performance Reference Blower... they are the best out there by far... they could certainly be perfected immensely, however no other has achieved what the Reference Blower has on thermal performance characteristics.
> 
> As for me I would never go out and tell anyone the reference blower is way too hot... I'd say it has potential... Not whine... and straight just do something about it... Here's my RX 480... I had a spare bracket and modded it to work... as to keep the original in tip top shape... Only removed the shroud to remove the original bracket.. modded the one lying around and modded the case and Wha-La ...


Nope, even maxed out the fan is quiet, so it doesn't bother me at all. Where can I buy the blower-style heatsink? I can't find any on eBay... Sound really won't be a problem for me: when I ran my 5870 reference blower I mostly kept it at 100% and it had a jet sound, but it cooled really nicely and blew right out the rear of the case instead of inside the case. The blower did die on me though, but I replaced it with my 4890's.

I used that card from release way up to 2014, such a good card, and it still handled stuff decently for me. It just sucked for overclocking, because even a 1mhz OC on the memory would cause a grey screen lol

Well, removing the DVI will be a problem for me since I use an old Dell monitor as a 2nd screen lol


----------



## stephenn82

Quote:


> Originally Posted by *kilyan82*
> 
> 1144/1500 sapphire r9 390 nitro
> 
> 
> 1144/1500


Today's run with 1173/1563: a TRUE 400GB/sec of memory bandwidth... boy howdy! Not too hot either. IT'S WHACKY FAST... TOO MANY FPS!! I didn't know what to do in BF1, but I was actually doing well. Like 17 kills and 2 deaths well!





Here is my BIOS, very very similar to yours.
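As an aside, the 400GB/sec figure checks out arithmetically: GDDR5 moves 4 bits per pin per command-clock cycle, so bandwidth = memory clock × 4 × (bus width ÷ 8). A quick sanity check in Python (the function name is just for illustration):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Theoretical GDDR5 bandwidth in GB/s.

    GDDR5 transfers 4 bits per pin per command-clock cycle,
    so the effective data rate is clock * 4.
    """
    bytes_per_second = mem_clock_mhz * 1e6 * 4 * (bus_width_bits / 8)
    return bytes_per_second / 1e9

# Hawaii's 512-bit bus at the 1563 MHz memory clock above:
print(round(gddr5_bandwidth_gbs(1563, 512), 1))  # 400.1
```

Stock 1500MHz works out to exactly 384GB/s, so the 1563MHz overclock is what pushes the card past the 400GB/s mark.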


----------



## stephenn82

ok, off to bed, good night people! 4:15 comes early...like, in 4 hours and 16 minutes...lol


----------



## kilyan82

Quote:


> Originally Posted by *stephenn82*
> 
> Todays run with 1173/1563. a TRUE 400GB/sec on the memory BW...boy howdy! Not too hot either. ITS WHACKY FAST...TOO MANY FPS!! I didnt know what to do in BF1, was actually doing well. like 17 kills and 2 deaths well!
> 
> 
> 
> 
> 
> Here is my BIOS, very very similar to yours.


It would be cool to check, but your pics of the BIOS are all small and blurry. Can you tell me how you made those pics so we can improve the readability?








I use the Windows capture tool (Snipping Tool), which is much better than copy-pasting into Paint and resizing the image


----------



## chris89

Nice scores man, and I know, right? Isn't it awesome how fast these cards are DeLimited? Makes the stock BIOS feel like a Daihatsu Midget haha








@kilyan82


----------



## kilyan82

that's weird but i can see them clearly now opening in new tab.... O_O


----------



## chris89

sweet bro


----------



## stephenn82

beast mode!

1440p
http://www.ozone3d.net/gpudb/score.php?which=403843

1080p
http://www.ozone3d.net/gpudb/score.php?which=403848

beats out the 290x by a couple hundred points...not bad, eh?


----------



## chris89

Niiice score... whats ur bios/ clocks?


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Niiice score... whats ur bios/ clocks?


the ones you last sent me, 1175/1563.


----------



## chris89

nice man... know how to fix this error?


----------



## kilyan82

tried your settings and I had a black screen at startup... stupid power-hungry card I have :\


----------



## chris89

Temperatures, yeah, they are a pain, I agree... it needs thermal pad upgrades for everything and a repaste. Whatever, it's fine, I don't wanna go too far, but I kinda want to copper-plate the assembly anyway


----------



## Rexer

I just saw the new Asus RX 580 ROG gaming card. Sorta disappointed. Almost the same specs as an RX 480 with an overclock (out of the box). So I imagine AMD's going to try to get everything out of Polaris before they give it up. Not much room for clocking. Then again, it's not Vega. Any news on Vega?


----------



## gapottberg

Quote:


> Originally Posted by *Rexer*
> 
> I just seen new Asus RX 580 ROG gaming gpu card. Sorta disappointed. Almost same specs as RX 480 with overclock (out of the box). So I imagine AMD's going to try to get everything out of Polaris before they give it up. Not much room for clocking. Then again, it's not the Vega. Any news on Vega?


From what I have seen, the only good thing for "enthusiasts" about the 580 is that it will push 480 prices down. By most if not all accounts and reviews it is nothing but a highly OCed 480, the proof being that top-end OCers can't seem to push more than a dozen MHz past factory settings. It goes from being a power-responsible card as the 480 to a power-hungry card like the 390 series as the 580, and offers minimal gains.

The best way to look at a 580 is to get it for someone who doesn't have the tech savvy to OC a 480. Otherwise a 480 with a manual OC is probably a far better deal, especially after prices dip a bit more.

I am anxious to see if any of the 580 makers add something "unique" to this variant to make me reconsider. For now I will be patiently waiting for the 480 8GB cards to dip below $200 US without a rebate. When that happens, it's gonna be awful hard to keep my trigger finger off the mouse button.


----------



## chris89

I like the 480, and if the 580 is cheaper, sure, why not? Let's see an RX 480 at $99, everyone will buy them out...

The RX 580 needs a way cooler VRM, like 65C fully loaded, and it would be a great card... That's the only issue with Polaris: it's hard to keep the VRM cool and well below 80C; even 65C would be insanity.


----------



## Rexer

Quote:


> Originally Posted by *gapottberg*
> 
> From what i have seen, the only good thing for "enthusiasts" about the 580 is it will push 480 prices down. It is by most if not all accounts and reviews nothing but a highly OCed 480. The proof being that top end OCers cant even seem to push more than a dozen Mhz bump from factory settings. It goes from being a power responsible card as the 480, to a power hungry card like the 390 series in the 580, and offers minimal gains.
> 
> The best way to look at a 580 is get it for someone who doesn't have the tech savey to OC a 480. Otherwise a 480 with manual OC is probably far better deal. Especially after prices dip a bit more.
> 
> I am anxious to see if any of the 580 makers add something "unique" to this variant to make me reconsider. For now i will be patiently waiting for the 480 8gb cards to dip below $200 US without a rebate. When that happens, it's gonna awful hard to keep my trigger finger off the mouse button.



I replaced a friend's HD7950 with a 480 Nitro+ and he raves about it. Course, there's a major difference in those, lol.
But hey, I'd agree, we have AMD's top dogs. At least for now. Bandwidth. You can't beat the 390(X)'s big 512-bit bus. It's how much info can be crammed through the port. If any of those 256-bit cards increased their bandwidth, they'd be around longer. Nvidia yanked our chain with the smaller-bus philosophy, saying it was more than enough room. They even misrepresented the 970's memory subsystem, which was effectively smaller than advertised. They could've increased the bandwidth, but the ploy and marketing strategy is to keep themselves in the game of selling us new GPU cards every year. Making and selling GPU cards is their business and how they make a profit.
For example, a retired Ford Motor Company exec said this of Ford's marketing philosophy back as late as the 90's: "We're in the business of selling cars. Not fixing, repairing or making used parts. We're a car manufacturing company selling cars and that's how we make our profits." So as long as there's a cheap way to stay in the ball game and sell volume, they're going to take that road. Thus AMD and Nvidia are in the business of selling GPU cards, not building them to last half a decade. No profit in doing that.
Lol. Stop and think a moment. If Nvidia had sold the 980 at 970 prices, they would've dominated the market but would've had fewer new card sales over the next 3 years. It doesn't pay to dominate the market and not have nearly the stellar sales they've had the last 3 1/2 yrs.


----------



## Rexer

Quote:


> Originally Posted by *chris89*
> 
> I like the 480 and if 580 is cheaper sure why not let's see RX 480 $99 everyone will buy them out...
> 
> RX 580 needs way cooler VRM like 65C loaded out and it would be a great card... Only issue with Polaris.. hard to keep VRM cool and well below 80C even 65C would be insanity.


VRM cooling. Never seen mine below 67c. Both my 390(x) machines could use a tonic to lower the blood pressure. (Lol.)


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @RaFDX
> Yeah stock first. Have HWInfo boot with windows and idle out then run timespy on Stock default clocks/ bios and post HWInfo and results. Then start on bios #1.
> 
> @componentgirl90
> Right on. From 1000Mhz to 1173Mhz core on the 390x is from 64 billion pixel's & 176 billion texel's every second.
> 
> To 75 billion 72 million pixel's & 206 billion 448 million texel's every second.
> 
> That's 11 billion pixels more per second & 30.448 billion more texel's every second... Not much probably?
> 
> @stephenn82
> Try 1563mhz ram... sometimes 1758mhz doesn't work, though it does on reference with errors. If the 1563mhz doesn't work then the card isn't happy at 1425mv... Gonna need less max core voltage... LMK on 1563mhz ram on 1250mhz core.


If someone were to open my card to put thermal paste on it or whatever it needs to stop it overheating, what would they do to fix it?


----------



## Rexer

Talk about VRM and heat, I'll tell ya what gets my goat: when I get my clocks cinched down perfect for each game I play, I get a game update that creates an unplayable episode. Like last month, Origin updated their features, which created a ton of heat. I crashed constantly in Battlefield 3 and 4, even on default settings. The heat was going for a joy ride, 85c+. I finally had some luck turning off all the auto-setting features in Origin, but when I started the games the next day, I couldn't get into Origin to play games at all. Apparently their servers didn't like my tampering with their default auto settings.
Everything in Steam works fine. I run Call of Duty AW & Ghosts multiplayer like the hero of the world. Clocks are high, temps are down. A mother couldn't love a son better. My base games are great; nothing gets past the 70c's. What better temps can I ask for in fast, resource-eating multiplayer games? It's like every mother wanting me to kiss their babies and all the young ladies throwing their panties in my face. Steam is excellent!... and Origin's servers make my head explode.
Uninstalled and reinstalled games using Revo Uninstaller, used DDU to do a clean install of the video driver, ran 'Scan Now' for Windows and CCleaner to get all the registry junk cleared, and topped that off with a defrag. Even turned off my anti-virus just in case. Nope. Origin just says "I'm gonna fry your head today." Even checked the Task Manager for any unwanted programs running, and ran my anti-virus and spyware/malware programs for anything that could possibly interfere with Origin. Checked system management > error logs. Nope. Everything's peachy. Just Origin.
Origin's gotta do something about that. I think they get ahead of themselves and don't thoroughly go through their new settings for faults. Humongous waste of time.


----------



## Rexer

Quote:


> Originally Posted by *componentgirl90*
> 
> If someone were to open my card to put thermal paste on it or whatever it needs to stop it overheating, what would they do to fix it?


It couldn't hurt. I did it. I found thermal paste smeared throughout the die block. Cleaned it up, used a smaller amount and it was 5c to 10c cooler.


----------



## chris89

@Rexer Haha temps? Send your BIOS, attach a .zip of the .rom here

@componentgirl90 I would use Arctic Silver Ceramique 2 on the core, using a finger to spread it evenly over the whole chip, nice and smooth... Then I would buy adhesive copper heatsinks and a hot glue gun... Push one onto each of the 3 memory VRMs and add a smidge of hot glue to 2 or 4 sides so it can't fall off... That would fix it... Those VRMs don't need much to stay well below 70C at all times... I once used a 0.25mm-thick copper piece I cut into a T-shape to cool them... that's all, and the VRM was like 60C under load.


----------



## Rexer

Quote:


> Originally Posted by *chris89*
> 
> @Rexer Haha temps? Send bios attach here .zip of .rom
> 
> @componentgirl90 I would use Arctic Silver Ceramique 2 on the Core use finger to spread evenly over whole chip nice and smooth... Then I would actually buy adhesive copper heatsinks and a hot glue device... Push it on the 3 Memory VRM and add a smidge of hot glue to 2 or 4 sides so it can't fall out... That would fix it... Those VRM don't need much to stay well below 70C at all times... I once used a 0.25mm thick copper piece I cut in a T-Shape to cool them... that's all and VRM was like 60C load.


Hey... now that you mention copper heatsinks, I actually used a copper plate in a 7950 a few years ago. I put it between the GPU chip and the big heatsink. I remember it being a bear to install because the thermal paste kept slurring and shifting it all over the place, but once I got it centered and together, my temps went down. I got the idea from some German-based waterblock company that said the GPU was sitting too low in the die block in some of the earlier 7950s, and that using a copper plate helped 'lift' and transfer heat away to the heatsink. If I recall, it dropped the temps 8c. There's something about copper that gives it better heat transfer properties. I've used aluminum heatsinks and plates for a number of electronic applications but found copper is just better at this.


----------



## gapottberg

Quote:


> Originally Posted by *Rexer*
> 
> Hey.. Now that you mention copper heatsinks, I actually used a copper plate in a 7950 a few years ago. I put it between the gpu chip and the big heatsink. I remember it being a bear to install because the thermo mud kept slurring and shifting it all over the place but once I got it centered and together, my temps went down. I got the idea from some German base waterblock company that said the gpu was sitting too low in the die block in some of the earlier 7950s and that using a copper plate help 'lift' and transfer heat away to the heatsink. If I recall, It dropped the temps 8c. There's something about copper that gives it better heat transfer properties. I've used aluminum heatsinks and plates for a number of electronic applications but found copper is just better at this.


Eh hem...


----------



## chris89

Only reference values, cool dudes! Yeah, for sure. Though nowadays VRM contacts use some sort of low-conductivity base metal with a thermal coating, like an electro-anodized layer that conducts better than the bare material itself... so don't sand off the coating! haha If you do blast it off... have it plated in copper! Now that's awesome.

*Element Conductivity : watts per meter kelvin (approximate)*

Aluminium 237
Gold 318
Copper, pure 400
Silver 429
Diamond ~2,200
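Copper's advantage can be put in rough numbers with Fourier's conduction law, Q = k·A·ΔT/t. Here's a small sketch with made-up but plausible dimensions for a VRM shim (handbook conductivities of roughly 237 W/m·K for aluminium vs 400 for copper; the function name and dimensions are mine):

```python
def conduction_watts(k_w_mk: float, area_m2: float,
                     thickness_m: float, delta_t_c: float) -> float:
    """Steady-state heat conducted through a flat shim (Fourier's law):
    Q = k * A * dT / t
    """
    return k_w_mk * area_m2 * delta_t_c / thickness_m

# Hypothetical case: a 10 mm x 10 mm contact patch, a 0.25 mm shim
# (like the T-shaped copper piece mentioned earlier), 5 C drop across it.
copper = conduction_watts(400, 0.01 * 0.01, 0.00025, 5)
alu = conduction_watts(237, 0.01 * 0.01, 0.00025, 5)
print(round(copper), round(alu))  # 800 474
```

Same geometry, same temperature drop: the copper shim moves roughly 70% more heat, which is why copper keeps winning in these mods despite costing more.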


----------



## Rexer

Thank You! I stand fortified! I'm gonna get my 'T' shirt and shorts copper plated! Then I'll conduct some real nasty weather.


----------



## gapottberg

And it's begun: Amazon has a Gigabyte RX 480 8GB model for $199.99 with free shipping. No rebate required. Damn, that was quick.









https://www.amazon.com/Gigabyte-Radeon-Graphics-GV-RX480G1-GAMING-4GD/dp/B01KCWZ1RW/ref=sr_1_1?s=pc&ie=UTF8&qid=1492582478&sr=1-1&keywords=rx+480+8gb&refinements=p_36%3A1253506011

Ooops...my eyes are playing tricks on me.


----------



## chris89

I'll have to contact VisionTek about their refurb 480s; those things are sick at $169.99 for the 8GB edition, bros...

They listed the RX 570 & 580 for sale...









https://www.visiontek.com/products/graphics-cards/graphics-upgrades/amd-radeon-rx-500-series.html


----------



## Streetdragon

If I still had my 290 I would switch over to a 480, but my 390 is still rocking ^^


----------



## spyshagg

It doesn't scale perfectly to other games, but if you have a 290, keep the 290.

http://www.tomshardware.com/reviews/battlefield-1-directx-12-benchmark,5017.html


----------



## gapottberg

The middle/end of this video (starting at 8:52) shows a neat little tutorial on undervolting with WattMan and using Global Chill to maximize power efficiency and cooling while minimizing noise levels. I do something similar with my 390X, but I have to use AB because, for whatever reason, WattMan doesn't offer me all the tools it does for the 400 and 500 series cards.

Obviously, once you have settings that work like a champ, you can make similar profiles directly in your BIOS and skip the 3rd party software altogether, as many are doing here. Some features/limits are also tweakable in the BIOS that are not available through current software manipulation, giving you even greater fine-grained control.

Global Chill seems like a fantastic feature in theory, and if it ever catches on and becomes mainstream, I think it is another AMD innovation moving the market forward. A mid-range AMD card with Global Chill and FreeSync could provide a fantastic gaming experience at a fantastic price, and the future of mobile laptop gaming looks brighter because of it too. Looking forward to seeing how things progress.


----------



## RaFDX

Question, can I flash the 390x bios to the 390??

Seeing that you can flash a 480 to be a 580, I'll assume the same can be done for the 390/390X


----------



## RaFDX

Since I'm using last year's OC values on a new cooling system, maybe I can hit new PRs. However, I think I would rather drop the heat being produced. When you guys say "XXXMHz XXXMHz at 888mv", is that "888mv" value the card's (under)voltage value? When I say "+65mV", what is my actual value? I know it for my Skylake, not for this thing


----------



## stephenn82

A good, decent read for everyone. My 390 is clocked at 1175/1563... It does very well and I'm happy... No reason to upgrade for us 290/390 users. Not until Vega, anyways.

http://www.game-debate.com/gpu/index.php?gid=3876&gid2=3079&compare=AMD%20Radeon%20RX%20580%208GB-vs-AMD%20Radeon%20R9%20390%20MSI%20Gaming%208GB%20Edition


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> Since I'm using last year's OC values on a new cooling system, maybe I can hit new PR's. However I think I would rather drop the heat being produced. When you guys say "XXXMHz XXXMHz at 888mv", that "888mv" value is the card's (under) Voltage value? When I say "+65mV", what is my actual value? I know it for my Skylake, not for this thing


Chris89 can better answer this. I am pretty sure my card runs the RAM at 950mv at 1563MHz. The VRM runs a nice 72c at this speed/power; I was at 64c at 1250mhz/888mv. I'm a happy camper. 1175 on the core at 1356mv; max temps when gaming were 65-67c. FurMark takes it to 73c in a bench run. No artifacting at all. Will slowly lower volts to check stability.


----------



## RaFDX

Talking to my friends today, I don't plan on upgrading from my R9 til 2019. The games I play now don't really use that much VRAM:

Eve Online
Overwatch
Heroes of the Storm


----------



## boot318

Quote:


> Originally Posted by *RaFDX*
> 
> Question, can I flash the 390x bios to the 390??
> 
> Seeing that you can flash a 480 to be a 580, I'll assume the same can be done for the 390/390X


A 480 and 580 are the exact same chip: same number of CUs and stream processors. The 390X and 390 are different. The 390 chips had some form of defect, so AMD deactivated some of the CUs/SPs (2816 vs 2560).
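The CU/SP numbers line up because every GCN compute unit contains 64 stream processors; a one-liner makes the gap between the two obvious (44 vs 40 active CUs, per the public Hawaii/Grenada specs):

```python
GCN_SPS_PER_CU = 64  # every GCN compute unit holds 64 stream processors

def stream_processors(compute_units: int) -> int:
    """Total stream processors for a GCN part with this many active CUs."""
    return compute_units * GCN_SPS_PER_CU

print(stream_processors(44))  # 2816 -- R9 390X / 290X
print(stream_processors(40))  # 2560 -- R9 390 / 290
```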


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> Talking to my friend's today I don't plan on upgrading from my R9 til 2019. The games I play now don't really use that much VRAM
> 
> Eve Online
> Overwatch
> Heroes of the Storm


Exactly. I wish there were better drivers for this card. I compensate with lower power and higher clocks. Much more than I could squeeze out with Afterburner, and much, much more responsive and cooler.

I just want that shiny new 1080...lol


----------



## RaFDX

Quote:


> Originally Posted by *stephenn82*
> 
> Chris89 can better answer this. I am pretty sure my card runs @950mv in ram at 1563 MHz. Vrm runs a nice 72c at this speed/power. I was 64c at 1250mhz 888mv. I'm a happy camper. 1175 on core of 1356mv, max temps when gaming was 65-67c. Furrymark takes it to 73c in a bench run. No artifacting at all. Will slowly lower volts to check stability.


Where are you reading your output from?

My max temp overwatch on ultra is 60c with my AIO, I want to go lower!!!!!!!!


----------



## gapottberg

Quote:


> Originally Posted by *RaFDX*
> 
> Since I'm using last year's OC values on a new cooling system, maybe I can hit new PR's. However I think I would rather drop the heat being produced. When you guys say "XXXMHz XXXMHz at 888mv", that "888mv" value is the card's (under) Voltage value? When I say "+65mV", what is my actual value? I know it for my Skylake, not for this thing


888mv is the actual setting he uses in his BIOS. When you use software like MSI AB you are applying an offset to whatever your card's stock voltages are.

You can view your stock voltage in WattMan by enabling the ability to adjust it with the slider button.

You could then potentially adjust the voltages individually in WattMan instead of using the offset in your other software, but WattMan and other 3rd party software don't play well together, and for many cards WattMan is the inferior utility. The 400 and 500 series cards seem to have much better support and many more tools available compared to older models, for no obvious reason.

If you are using the offset in MSI AB, all you have to do is look at your stock voltage in WattMan and add/subtract the offset value you use to know your new voltage. That value should be very similar to what you read with a real-time voltage tracker like HWiNFO or the MSI AB utility.
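To make the stock-plus-offset arithmetic concrete, here's a trivial sketch; the 1212mV stock VID is a made-up example, since the actual stock voltage varies card to card and per power state:

```python
def effective_voltage_mv(stock_mv: int, offset_mv: int) -> int:
    """Afterburner-style offset: the slider value is simply added to
    (or subtracted from) the card's stock VID for the active state."""
    return stock_mv + offset_mv

# Hypothetical card with a 1212 mV stock core VID:
print(effective_voltage_mv(1212, 65))   # 1277 -- the "+65mV" case
print(effective_voltage_mv(1212, -81))  # 1131 -- an undervolt example
```

So a "+65mV" setting only tells you the delta; you still need the stock reading (from WattMan or a monitor like HWiNFO) to know the absolute voltage.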


----------



## chris89

@RaFDX It would be kinda awesome to run below 50C load... You'd need to remain stock, or run just 1Ghz on a DeLimited power limit of 65288 with core voltage maxed at 1.25v, and 888mv memory at 1250mhz.

You could game at 48-49C...? 1000mhz core is 64Gpixel/160Gtexel on an R9 390... that's a ton of performance, the difference is minimal, and cool running is what feels good, right?

LMK, I can try and line you up a cooler BIOS.


----------



## gapottberg

Quote:


> Originally Posted by *chris89*
> 
> @RaFDX It would be kinda awesome to run at below 50C load... Would need to remain stock or just 1Ghz on DeLimited power limit 65288 core voltage max 1.25v. On 888mv memory at 1250mhz.
> 
> You could game at 48-49C...? 1000mhz core 64gpixel 160gtexel r9 390... thats a ton of performance the difference is minimal and cool running is what feels good right?
> 
> lmk I can try and line you up a cooler bios.


Chris... that is exactly what I am trying to run via software. I am working on getting it done via BIOS, but until I have more time to tinker I will use what I have now via AB.

It runs much cooler and quieter than stock by far, even with my software undervolt. A BIOS fix would be tits though. I will hit you up with a PM when I am ready, but that's exactly what I want to start with.


----------



## RaFDX

Quote:


> Originally Posted by *componentgirl90*
> 
> If someone were to open my card to put thermal paste on it or whatever it needs to stop it overheating, what would they do to fix it?


You just need a screwdriver. This is what I used to guide me through my first thermal reapply, before I went AIO. There are two cables that connect the radiator to the card, so be careful



http://imgur.com/kCxiI



__
https://www.reddit.com/r/40yvuv/just_fitted_a_kraken_g10_to_an_msi_r9_390/


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> @RaFDX It would be kinda awesome to run at below 50C load... Would need to remain stock or just 1Ghz on DeLimited power limit 65288 core voltage max 1.25v. On 888mv memory at 1250mhz.
> 
> You could game at 48-49C...? 1000mhz core 64gpixel 160gtexel r9 390... thats a ton of performance the difference is minimal and cool running is what feels good right?
> 
> lmk I can try and line you up a cooler bios.


So it looks like I'm running 200-300mV more than you guys. I'd like to get equal performance at lower mV


----------



## gapottberg

Quote:


> Originally Posted by *RaFDX*
> 
> so it looks like i'm running at 200-300mV more than you guys. I'd like to get equal performance at lower mV


There are multiple voltage controls on a card. Afterburner doesn't always allow you to modify them all. In this case it looks like you have GPU core voltage control... but I don't see memory control or auxiliary control. The 888mv that keeps being cited is chris's voltage for the memory when clocked around 1250mhz. I don't believe it is the core voltage on the GPU, which will greatly depend on your clock speed, power delivery, and the quality of the silicon you won in the GPU lottery.

Cards of differing quality will hit different clocks at different voltages. There are only rough guidelines for what voltage is needed at what clocks, and they may need to be fine-tuned on an individual basis.


----------



## RaFDX

awesome thanks for the clarification


----------



## chris89

I find that the 290X & 390X need mathematically about 10% more core voltage per clock... 176 TMUs on the 290X/390X vs 160 TMUs on the 290/390 is 10%.

So for the 290X/390X: 1080Mhz on 65288 (1.25v max, but it will only go to 1.25v under 4K or high stress), 1133Mhz at 1333mv, 1166Mhz at 1366mv, 1177Mhz at 1377mv, 1188Mhz at 1388mv, 1200Mhz at 1400mv, and 1250Mhz, if VRM 1 Core thermals allow, at between 1415mv-1449mv. Expect artifacts on the lower end of that range, and none on the higher end but with a likelihood of black-screens. You'll need more RPM sooner, at a lower core temperature, to keep the VRM in line at 1250Mhz. 1449mv runs too hot on almost all cards, even under water, but it is what is needed for real 24/7, artifact-free stability at 1250Mhz core. Hot VRM, hot card. 1425mv is the practical max on most cards for 1250Mhz and gives awesome results.

For the 290/390: maybe 1100Mhz on 65288; 1133Mhz let's call it 1266mv, 1166Mhz at 1333mv, 1177Mhz at 1355mv, 1188Mhz at 1366mv, 1200Mhz at 1388mv. Basically, these voltages are not based on "what works" but include the extra bit of voltage required for those rare later-on crashes or instability... You need more than "what works" for continuous stability, so you have voltage headroom when it is needed, which it will be over the long haul. I tested lower voltages; many lower values work, sure, but they are not 100% stable in the long term. That's why I defined the true upper end of real stable voltages here, rather than the minimum possible that just works.

The 290/390 can do the same clock as the 290X/390X at less voltage because of the 10% lower TMU count.

Gotta watch VRM 1 Core and have a fan profile that kicks in sooner, at a lower temperature and higher RPM, to hold cool temps.
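The pixel/texel figures quoted around the thread are just clock × unit count; a quick check using the publicly listed ROP/TMU counts for Hawaii/Grenada (the function name is mine, for illustration):

```python
def fill_rates(core_clock_mhz: float, rops: int, tmus: int):
    """Theoretical pixel and texture fill rates (Gpixel/s, Gtexel/s):
    each ROP writes one pixel and each TMU samples one texel per clock."""
    gpixels = core_clock_mhz * rops / 1000.0
    gtexels = core_clock_mhz * tmus / 1000.0
    return gpixels, gtexels

# 390X/290X: 64 ROPs, 176 TMUs; 390/290: 64 ROPs, 160 TMUs
print(fill_rates(1000, 64, 176))  # (64.0, 176.0)  -- stock 390X
print(fill_rates(1173, 64, 176))  # (75.072, 206.448)
print(fill_rates(1000, 64, 160))  # (64.0, 160.0)  -- 390
```

This also makes the 10% rule of thumb visible: with identical ROP counts, the only per-clock throughput difference between the X and non-X parts is the 176-vs-160 TMU count.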


----------



## Rexer

Quote:


> Originally Posted by *gapottberg*
> 
> And its begun, Amazon has an Gigabyte RX 480 8GB 4GB model fro $199.99 with free shipping. No rebate required. Dam that was quick.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.amazon.com/Gigabyte-Radeon-Graphics-GV-RX480G1-GAMING-4GD/dp/B01KCWZ1RW/ref=sr_1_1?s=pc&ie=UTF8&qid=1492582478&sr=1-1&keywords=rx+480+8gb&refinements=p_36%3A1253506011
> 
> Ooops...my eyes are playing tricks on me.


You're right. 4gb Nitro but for $20 more, you cna get a Nitro 8gb

https://www.newegg.com/Product/Product.aspx?Item=N82E16814202270

https://www.newegg.com/Product/Product.aspx?Item=N82E16814202275


----------



## Rexer

I gotta learn how to spell 'can'. phifft!.


----------



## chris89

https://www.visiontek.com/products/graphics-cards/graphics-upgrades/amd-radeon-rx-500-series.html


----------



## RaFDX

Not upgrading til 2019 at this rate.


----------



## gapottberg

Quote:


> Originally Posted by *RaFDX*
> 
> Not upgrading til 2019 at this rate.


Just watched a podcast the other day that said, if you really think about it... AMD hasn't made a "faster" card than the Fury in what... years now?

From the release of the 290 --> 580 they are all basically roughly equivalent in raw performance. The advantages have all been slight clock-rate boosts, power efficiency, richer feature sets, and smaller sizes. Not the true generational gains we have seen in the past.

It is essentially the Intel model since the 2600K --> 7700K


----------



## christoph

Quote:


> Originally Posted by *gapottberg*
> 
> Just watched a podcast the other day that said if you really think about it...AMD hasnt made a "faster" card than the fury in over what...5 years now?
> 
> From the release of 290-->580 they are all basicly roughly equivielent in raw performance. The advantages have all been slight clock rate boosts, power effeiciancy, more fearure rich, and smaller size related. Not the true generational gains we have seen in the past.
> 
> It is essentially the intel model since 2600k --> 7700k


yeah, but if you look on the bright side:

you don't need to upgrade, so your video card will last a little longer, and the people that need to upgrade can look into buying the 580 or whatever...

AMD invested in the HBM memory of the Fury, something that hardly anyone else wants to invest in, so they're losing money now, and they need to make little steps before making a huge one


----------



## gapottberg

Quote:


> Originally Posted by *christoph*
> 
> yeah but if you look in the bright side;
> 
> you don't need to upgrade, so you're video card will last a little longer, and the people that needs to upgrade can look into buying the 580 or whatever...
> 
> AMD invested in the HBM memory of the Fury, something that hardly anyone wants to invest on, so they're losing money now, so they need to make little steps before making a huge one


Don't get me wrong, I am not really complaining, just pointing out facts. The other advantage I forgot to mention was price. You are getting much better value: the day-1 price of a 290 compared to the much better 580 is astounding. We now have a very capable 1080p/60Hz ultra-everything card for around $200 US that stays cool, is small, and literally sips power compared to the 290 series, which first brought us that level of performance from AMD, but at a higher cost with fewer advantages.

While AMD hasn't been pushing the performance end of things lately, that doesn't mean they have not contributed greatly to the betterment of the market. GCN, HBM, APUs, Mantle/Vulkan, ReLive, WattMan, and the new Chill software which looks promising (see the video I linked earlier in the thread) are just a few things helping to move PC gaming technology forward by making it better or more accessible to everyday users.


----------



## bluej511

Anyone having any issues getting WattMan settings to stick on 17.4.3? As soon as I hit apply it reverts back, and AB just black-screens no matter the settings.


----------



## gapottberg

Quote:


> Originally Posted by *bluej511*
> 
> Anyone having any issues getting Wattman settings to stick on 17.4.3? As soon as I hit apply it reverts back, and AB just black screens no matter the settings.


In the MSI AB settings tab there is a checkbox related to this issue that you must check. It is a Wattman/ReLive issue that you can disable in AB; you just have to find it. I would give you more details but I'm at work. Will update this post when I get home.

[Edit] Ok, I believe the very last option under AMD COMPATIBILITY OPTIONS, called "erase autosaved startup settings", is the setting you need to enable to fix the Crimson/Wattman bug that keeps MSI AB from working properly. I had a heck of a time but managed to get it working, and I am fairly confident this was what did the trick. Let me know.

I really wish we had full functionality for Wattman, as I would love to just use that, but since we don't, and I know MSI AB pretty well now, I just used that; I somehow seem to be the only one who has gotten Crimson and AB to play nice. Maybe we can change that. Let me know and good luck.


----------



## RaFDX

I don't use Wattman. Don't believe in it.


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> Where are you reading your output from?
> 
> My max temp in Overwatch on ultra is 60C with my AIO, I want to go lower!!!!!!!!


From GPU-Z


----------



## christoph

Quote:


> Originally Posted by *gapottberg*
> 
> Don't get me wrong, I am not really complaining, just pointing out facts. The other advantage I forgot to mention was price: you are getting much better value, and the day-one price of a 290 compared to the much better 580 is astounding. We now have a very capable 1080p/60Hz ultra-everything card for around $200 US that stays cool, is small, and practically sips power compared to the 290 series, which first brought us that level of performance from AMD, but at a higher cost with fewer advantages.
> 
> While AMD hasn't been pushing the performance end of things lately, that doesn't mean they have not contributed greatly to the betterment of the market. GCN, HBM, APUs, Mantle/Vulkan, ReLive, Wattman, and the new Chill software which looks promising (see the video I linked earlier in the thread) are just a few things helping to move "pc gaming technology" forward by making it better or more accessible to everyday users.


Exactly, AMD has contributed a lot by pushing tech, and I mean a lot, just like you said.


----------



## bluej511

Quote:


> Originally Posted by *gapottberg*
> 
> In the MSI AB settings tab there is a checkbox related to this issue that you must check. It is a Wattman/ReLive issue that you can disable in AB; you just have to find it. I would give you more details but I'm at work. Will update this post when I get home.
> 
> [Edit] Ok, I believe the very last option under AMD COMPATIBILITY OPTIONS, called "erase autosaved startup settings", is the setting you need to enable to fix the Crimson/Wattman bug that keeps MSI AB from working properly. I had a heck of a time but managed to get it working, and I am fairly confident this was what did the trick. Let me know.
> 
> I really wish we had full functionality for Wattman, as I would love to just use that, but since we don't, and I know MSI AB pretty well now, I just used that; I somehow seem to be the only one who has gotten Crimson and AB to play nice. Maybe we can change that. Let me know and good luck.


Then will I be able to use AB to tune my card, or will I have to use Wattman? Only reason I ask is because when I change voltage in Wattman my PC crashes so hard with my new Ryzen that I need to clear CMOS to get it going again.


----------



## gapottberg

Quote:


> Originally Posted by *bluej511*
> 
> Then will I be able to use AB to tune my card, or will I have to use Wattman? Only reason I ask is because when I change voltage in Wattman my PC crashes so hard with my new Ryzen that I need to clear CMOS to get it going again.


Don't touch Wattman with a 10-foot pole. It is garbage in the condition that non-400/500 series owners seem to inherit. If it worked as intended I'd use it myself, but it's an insult that they even enable such a half-baked piece of trash; the "version" we 300 series and older owners get just doesn't work at all. Just use MSI AB if you can get it working.

The next step up from that would be to look into doing BIOS edits, or having someone like Chris89 do them for you. That is the best option, but it has its own issues.


----------



## bluej511

Quote:


> Originally Posted by *gapottberg*
> 
> Don't touch Wattman with a 10-foot pole. It is garbage in the condition that non-400/500 series owners seem to inherit. If it worked as intended I'd use it myself, but it's an insult that they even enable such a half-baked piece of trash; the "version" we 300 series and older owners get just doesn't work at all. Just use MSI AB if you can get it working.
> 
> The next step up from that would be to look into doing BIOS edits, or having someone like Chris89 do them for you. That is the best option, but it has its own issues.


Tried your fix and I have the same issue in AB as well; AB doesn't seem to black screen when pressing apply, though. Tried changing core clock to 1100, hit apply, and it immediately goes back to 1040. I'm on W10 Creators btw, if it makes any difference. Not sure what's going on; it worked fine with 17.3.1 on Wattman, now both programs just reset.


----------



## gapottberg

Quote:


> Originally Posted by *bluej511*
> 
> Tried your fix and I have the same issue in AB as well; AB doesn't seem to black screen when pressing apply, though. Tried changing core clock to 1100, hit apply, and it immediately goes back to 1040. I'm on W10 Creators btw, if it makes any difference. Not sure what's going on; it worked fine with 17.3.1 on Wattman, now both programs just reset.


You may want to try using a program like DDU to remove your drivers and then reinstall them. Make sure you go the safe-mode route when you do, to ensure a clean scrub of old files that may be causing you issues. I had many problems with AB doing the same until one time it didn't. I can't for the life of me figure out exactly what I did to make it work, but it did. Thought maybe it was enabling that setting. Idk now.


----------



## bluej511

Quote:


> Originally Posted by *gapottberg*
> 
> You may want to try using a program like DDU to remove your drivers and then reinstall them. Make sure you go the safe-mode route when you do, to ensure a clean scrub of old files that may be causing you issues. I had many problems with AB doing the same until one time it didn't. I can't for the life of me figure out exactly what I did to make it work, but it did. Thought maybe it was enabling that setting. Idk now.


Oh, def not using DDU, that's for sure haha. This is a fresh Windows install as well, so I doubt there's stuff left over. The AB install is on my HDD and my OS is on my SSD, not sure if that matters.

I don't play games with it OCed; it's just annoying that this started happening after the fresh install. I may completely delete AB and then install it again, but I've already tried that.


----------



## 12Cores

I was able to run Wattman and Afterburner (AB) together after a few months of tweaking. Basically, I use Wattman to set my clocks and Afterburner to add voltage; what sucks is that you will have to follow the steps below whenever you decide to game with an overclock.

Please note this only works with the 17.3.3 drivers in my testing.

Step 1:
Use the 17.3.3 drivers / AB 4.3.0 / RivaTuner v7.0.

Step 2:
Set your core clocks in Wattman and raise the power limit to whatever level you consider safe; I have mine at +50. Don't touch your memory.

Step 3:
Raise vcore in AB to your desired level, hit accept, then close the program. Whatever you do, don't save any OC profiles in Afterburner.

Step 4:
Go back into Wattman and make sure the clocks and power limit did not reset. Reload settings if needed.

Step 5:
Relaunch Afterburner and check your Wattman settings one more time. You should now be good to go.

Good Luck!


----------



## Rexer

Quote:


> Originally Posted by *bluej511*
> 
> Anyone having any issues getting Wattman settings to stick on 17.4.3? As soon as I hit apply it reverts back, and AB just black screens no matter the settings.


I heard a few months ago, guys going off about Wattman. Pretty scary. I think it had something to do with Afterburner not working with the update. Don't quote me, I've been avoiding later updates. I got mad last August because they didn't separate the card drivers and were still making one unified driver for all their cards. So I rolled back to the best known, which was 16.5, and stayed there till 16.11.2 (the hotfix for 16.11).
Well, somewhere along the way, that changed (like they read my mind). They've been keeping drivers specific to the cards, which is a good thing. I don't want changes for an RX 480 that my 390X can't use. Even though the 390X will never access it, it's added 'stuff' to lug around like a stillborn twin attached to its host. Heck, I got Office Professional 2016 to do that!


----------



## gapottberg

Well, my systematic journey to test the lower limits of my 390X continues. As some of you are aware, I am currently testing undervolted limits and performance.

As of last night I was able to drop my GPU core volts another couple of points with no issues in performance benching. This in turn made my temps even better under full load for significant periods of time.

My current settings are +50% power limit, clocks at 1000MHz GPU / 1250MHz memory, and an offset of -50mV on the GPU core (down from -31mV). All other voltages are currently stock, as MSI AB doesn't have the ability to drop memory voltage to chris89's suggested 888mV, and I'm not ready to mess with auxiliary voltage yet.

This setup, along with my custom fan profile, keeps my 390X from ever breaking 69C in even the most brutal stress testing... and in many, many games I enjoy 60+ fps minimums at 1080p or 1440p with no artifacting and temps in the mid 50s. More testing is required before I deem this 100% stable on my platform, but initial testing was promising.

This is really the best way to utilize the 290/390 series imo, and if you haven't tried undervolting and playing on a cool, quiet card at a frame-capped 60fps, you should. You won't miss the extra 2 fps that higher clocks net you... but you will love the lower heat and noise output.
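
As a rough sanity check on why a -50mV offset helps temps: dynamic power scales roughly with voltage squared at a fixed clock. A minimal sketch, assuming a hypothetical 1.200V stock core voltage (an illustration only, not a measured value for this card):

```python
# Ballpark estimate of core dynamic power saved by a -50mV undervolt,
# assuming a hypothetical 1.200V stock voltage and P ~ V^2 at fixed clocks.
stock_v = 1.200
offset_v = -0.050

ratio = ((stock_v + offset_v) / stock_v) ** 2
print(f"~{(1 - ratio) * 100:.1f}% lower core dynamic power")  # ~8.2%
```

Leakage and fan behavior shift the real-world number, but it shows why even a small offset is noticeable in heat output.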


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> Tried your fix and I have the same issue in AB as well; AB doesn't seem to black screen when pressing apply, though. Tried changing core clock to 1100, hit apply, and it immediately goes back to 1040. I'm on W10 Creators btw, if it makes any difference. Not sure what's going on; it worked fine with 17.3.1 on Wattman, now both programs just reset.


You used Wattman; it doesn't work with AB. It disables it. I just loaded a custom BIOS and have no need to mess with either. Better clocks, less power, less heat, why go back??


----------



## gapottberg

Quote:


> Originally Posted by *stephenn82*
> 
> I just loaded a custom BIOS and have no need to mess with either. Better clocks, less power, less heat, why go back??


Loading a custom BIOS is great and is the best fix by far... but unless you are confident and capable enough to make your own, it presents the issue of being reliant on someone else whenever you want to tinker with settings.

I personally don't want someone to just give me a BIOS that works better. I want to learn to make one myself. Once I have had the time and resources to educate myself fully, I will do just that, similarly to how I taught myself to overclock my CPU.

Until then I will keep tinkering with whatever software I can get to work.


----------



## stephenn82

Quote:


> Originally Posted by *gapottberg*
> 
> Loading a custom BIOS is great and is the best fix by far... but unless you are confident and capable enough to make your own, it presents the issue of being reliant on someone else whenever you want to tinker with settings.
> 
> I personally don't want someone to just give me a BIOS that works better. I want to learn to make one myself. Once I have had the time and resources to educate myself fully, I will do just that, similarly to how I taught myself to overclock my CPU.
> 
> Until then I will keep tinkering with whatever software I can get to work.


I got a solution for that too. Contact @chris89 to hook it up.

Any more excuses/worries out there?

And btw... he does both. Provides AND teaches. That's how I got mine rolling. He is an awesome dude.

Teach-a-man-to-fish mentality.


----------



## gapottberg

I am aware, I have already PMed him.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @Rexer Haha temps? Send the bios, attached here as a .zip of the .rom
> 
> @componentgirl90 I would use Arctic Silver Ceramique 2 on the core; use a finger to spread it evenly over the whole chip, nice and smooth... Then I would actually buy adhesive copper heatsinks and a hot glue gun... Push one onto each of the 3 memory VRMs and add a smidge of hot glue to 2 or 4 sides so it can't fall out... That would fix it... Those VRMs don't need much to stay well below 70C at all times... I once used a 0.25mm-thick copper piece I cut in a T-shape to cool them... that's all, and the VRM was like 60C under load.


Ok great ty.


----------



## componentgirl90

Quote:


> Originally Posted by *Rexer*
> 
> It couldn't hurt. I did it. I found thermal paste smeared throughout the die block. Cleaned it up, used a smaller amount, and it was 5C to 10C cooler.


Ok, I see. I guess that is why my card was revving so much. It just needs some more (or less) paste or something. The bios Chris did seemed to help a lot, but it's not perfect: if you leave the card running at 100% for too long, it still has the problem.


----------



## chris89

*@ Everyone* : Thank you, that means a lot, and you're all great people as well... It's the thought that counts... I'm very happy to help with this in any way I can... Any questions? No problem at all... I am not one to trip.








Quote:


> Originally Posted by *componentgirl90*
> 
> Ok great ty.


You're welcome. It's the memory VRM modules...

I forgot about this... Turn on Chill! hahaha... Create a profile... I wish they would simply add the Chill profile adjustment in the Global section...

It's amazing... it'll dial back clocks/temps/power by as much as maybe 90% when just sitting there... you can let it sit and cool... then set Min FPS & Max FPS... the min applies when you let it sit, so it'll cool down... then when you move the mouse it'll increase FPS until you stop moving again... Best thing that ever happened for 290/390/290X/390X owners... Chill! haha
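
The min/max FPS behavior described above can be sketched as a simple input-driven frame-rate target. This is a toy illustration, not AMD's actual algorithm; the 40/70 limits and the 2-second ramp are made-up numbers:

```python
# Toy model of Chill-style pacing: target FPS ramps from MAX down to MIN
# as input stays idle, and snaps back to MAX on fresh input.
MIN_FPS, MAX_FPS = 40, 70
RAMP_SECONDS = 2.0

def target_fps(idle_seconds):
    if idle_seconds <= 0:             # input just happened
        return MAX_FPS
    if idle_seconds >= RAMP_SECONDS:  # fully idle
        return MIN_FPS
    # linear ramp between the two while input stays idle
    return MAX_FPS - (MAX_FPS - MIN_FPS) * idle_seconds / RAMP_SECONDS

print(target_fps(0.0), target_fps(1.0), target_fps(5.0))  # 70 55.0 40
```

Lower target FPS means fewer frames rendered, which is where the clock/power/temperature savings come from.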

Take extra-special note of the temperature of the core when this occurs... If you can tell me exactly what temperature the actual core is at when this happens, I can fix it in the bios... Here's a new bios... I really don't want it to get this hot, but the card will shut down automatically at an extreme temperature anyway...

Here I set the max temp to 88C from the default 98C... same as the 88C in the previous bios... That fan is annoying, but I think this could work... I don't wanna set it too high and burn your card... so setting it lower is safer.

Min temperature is 32C @ 44% fan speed... then 50C @ 77%... then 55C @ 77%... temperature will rise, but slowly, and it'll probably take 10x longer to do it... I also reduced the memory clock and memory voltage to cut the temperature drastically...
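
A fan profile like that (32C @ 44%, 50C @ 77%, 55C @ 77%) is just a piecewise temperature-to-duty map. A minimal sketch, assuming the in-between values are linearly interpolated (the actual BIOS curve shape may differ):

```python
# Hypothetical temp (C) -> fan duty (%) curve using the points quoted above,
# clamped below the first point and above the last one.
CURVE = [(32, 44), (50, 77), (55, 77)]

def fan_duty(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    # linear interpolation between neighboring curve points
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(30), fan_duty(41), fan_duty(60))  # 44 60.5 77
```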

If it's one of those things where you sleep in the same room as the computer when it's turned on, and it does that in the middle of the night, waking you up like ***?!? hahahaha









Maybe we need a pretty wide scale for the temperature to ride on... like from a low temperature with extra-low fan speed up to not-so-high, giving it roughly 50C & 44-100% to work with... not ideal, but it's the best way to have it dial itself in... This is gonna need testing to dial in... It won't be a one-time fix, done, good to go... I need info and am willing to help you fix this annoying-as-heck issue haha

ComponentGirl90_New_BIOS_For_XFX_DD_390X.zip 99k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> *@ Everyone* : Thank you, that means a lot, and you're all great people as well... It's the thought that counts... I'm very happy to help with this in any way I can... Any questions? No problem at all... I am not one to trip.
> 
> 
> 
> 
> 
> 
> 
> 
> You're welcome. It's the memory VRM modules...
> 
> I forgot about this... Turn on Chill! hahaha... Create a profile... I wish they would simply add the Chill profile adjustment in the Global section...
> 
> It's amazing... it'll dial back clocks/temps/power by as much as maybe 90% when just sitting there... you can let it sit and cool... then set Min FPS & Max FPS... the min applies when you let it sit, so it'll cool down... then when you move the mouse it'll increase FPS until you stop moving again... Best thing that ever happened for 290/390/290X/390X owners... Chill! haha
> 
> Take extra-special note of the temperature of the core when this occurs... If you can tell me exactly what temperature the actual core is at when this happens, I can fix it in the bios... Here's a new bios... I really don't want it to get this hot, but the card will shut down automatically at an extreme temperature anyway...
> 
> Here I set the max temp to 88C from the default 98C... same as the 88C in the previous bios... That fan is annoying, but I think this could work... I don't wanna set it too high and burn your card... so setting it lower is safer.
> 
> Min temperature is 32C @ 44% fan speed... then 50C @ 77%... then 55C @ 77%... temperature will rise, but slowly, and it'll probably take 10x longer to do it... I also reduced the memory clock and memory voltage to cut the temperature drastically...
> 
> If it's one of those things where you sleep in the same room as the computer when it's turned on, and it does that in the middle of the night, waking you up like ***?!? hahahaha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe we need a pretty wide scale for the temperature to ride on... like from a low temperature with extra-low fan speed up to not-so-high, giving it roughly 50C & 44-100% to work with... not ideal, but it's the best way to have it dial itself in... This is gonna need testing to dial in... It won't be a one-time fix, done, good to go... I need info and am willing to help you fix this annoying-as-heck issue haha
> 
> ComponentGirl90_New_BIOS_For_XFX_DD_390X.zip 99k .zip file


Hi Chris, thanks for writing this new bios. I am sending the card for its second RMA tomorrow. I am hoping they fix it, tbh, as I just want to play games and forget about this issue. Maybe in the future I will flash it with a bios which overclocks it, but that will probably be when the warranty runs out (which isn't that long away). I don't want to replace this card yet, as the next card I want will be something on the level of an overclocked 1080 Ti, hopefully in two years' time or so when it's much cheaper. It would be nice to give this baby a spin on a full DirectX 12 game as well.


----------



## chris89

@componentgirl90 You're welcome... make sure they take active measures to directly cool the (3) memory VRMs... If they don't do that... the problem will persist, sadly...









Right on... I find 17.1.2 WHQL has better DX12 & Vulkan Draw Call Performance by as much as 25%... The latest 17.4.3 does best on Single Threaded DX11...


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @componentgirl90 You're welcome... make sure they take active measures to directly cool the (3) memory VRMs... If they don't do that... the problem will persist, sadly...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Right on... I find 17.1.2 WHQL has better DX12 & Vulkan Draw Call Performance by as much as 25%... The latest 17.4.3 does best on Single Threaded DX11...


Ok, I will make sure to do that. I've got my 270 in at the moment. That was having artifact problems as well, but it seems to be better since DDU and deleting everything, although I haven't had it in for long.

So does the 390X actually perform nearly as well as a 980 Ti in a full DirectX 12 game?


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Ok, I will make sure to do that. I've got my 270 in at the moment. That was having artifact problems as well, but it seems to be better since DDU and deleting everything, although I haven't had it in for long.
> 
> So does the 390X actually perform nearly as well as a 980 Ti in a full DirectX 12 game?


It sure does with the power limits de-limited, on PCIe 3.0... since no one using a 980 Ti is on PCIe 2.0... haha. You could see 10-20 extra fps on a PCIe 3.0 motherboard.

By the way, I can speed up the 270, as it could use a little bump up to 1,173MHz core... not much difference, but stable and worth it... Artifacts are usually video RAM overheating or too little core voltage.
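
For anyone curious about the numbers behind the PCIe 2.0 vs 3.0 point: the theoretical x16 link bandwidth roughly doubles, because 3.0 raises the transfer rate from 5 to 8 GT/s and swaps 8b/10b encoding for the much more efficient 128b/130b. A quick back-of-envelope (whether a given game actually gains fps from this is a separate question):

```python
# Theoretical x16 bandwidth in GB/s: GT/s per lane * 16 lanes * encoding
# efficiency / 8 bits per byte.
def x16_gbytes_per_s(gt_per_s, payload_bits, total_bits):
    return gt_per_s * 16 * (payload_bits / total_bits) / 8

pcie2 = x16_gbytes_per_s(5, 8, 10)     # 8b/10b encoding
pcie3 = x16_gbytes_per_s(8, 128, 130)  # 128b/130b encoding
print(pcie2, round(pcie3, 2))  # 8.0 15.75
```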


----------



## gordesky1

So far my 390 is still running strong with your bios. Can't wait to see what it does with the Ryzen build I'm putting together today when it comes in.

Doing benchmarks with this system again so I can compare, and games too.


----------



## chris89

Nice dude! I'm excited to see what PCIe 3.0 does for our cards... Superposition


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> It sure does with the power limits de-limited, on PCIe 3.0... since no one using a 980 Ti is on PCIe 2.0... haha. You could see 10-20 extra fps on a PCIe 3.0 motherboard.
> 
> By the way, I can speed up the 270, as it could use a little bump up to 1,173MHz core... not much difference, but stable and worth it... Artifacts are usually video RAM overheating or too little core voltage.


I see. Ok, ty. Btw I have an MSI R9 270 OC (not the 270X).

If you could stop the flashing textures/artifacts, that would be nice lol. I might be able to open this one up as well, since I bought the card in Jan/Feb 2015 and that's over two years ago, so I think the warranty has run out, but I'm not sure. I have applied thermal paste a few times on a CPU, but tbh it's the most fiddly job I have ever done in a computer; I would rather avoid opening the graphics card up to do something similar.

Btw, what made you think that the memory VRMs were overheating on the 390X I have?

270bios.zip 97k .zip file


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> I see. Ok, ty. Btw I have an MSI R9 270 OC (not the 270X).
> 
> If you could stop the flashing textures/artifacts, that would be nice lol. I might be able to open this one up as well, since I bought the card in Jan/Feb 2015 and that's over two years ago, so I think the warranty has run out, but I'm not sure. I have applied thermal paste a few times on a CPU, but tbh it's the most fiddly job I have ever done in a computer; I would rather avoid opening the graphics card up to do something similar.
> 
> Btw, what made you think that the memory VRMs were overheating on the 390X I have?
> 
> 270bios.zip 97k .zip file

















Memory VRMs need direct heatsink contact to transfer heat off the PCB (printed circuit board) and into the heatsink under load... If not, the VRMs will eventually lift off the PCB and the entire card will be irreparably damaged by heat.

Slight TDP reduction to R7 370 levels and an overclock to 1173MHz... Maybe it'll work? Plus the fan profile's temperature target reduced from 101C to 67C.

1173-MSI-270-ComponentGurly90-.zip 98k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Memory VRMs need direct heatsink contact to transfer heat off the PCB (printed circuit board) and into the heatsink under load... If not, the VRMs will eventually lift off the PCB and the entire card will be irreparably damaged by heat.
> 
> Slight TDP reduction to R7 370 levels and an overclock to 1173MHz... Maybe it'll work? Plus the fan profile's temperature target reduced from 101C to 67C.
> 
> 1173-MSI-270-ComponentGurly90-.zip 98k .zip file


Thanks again for this, Chris.

Do I need to uninstall drivers and run DDU before every bios flash?


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Thanks again for this, Chris.
> 
> Do I need to uninstall drivers and run DDU before every bios flash?


You're welcome, and no, you won't need to do all that extra stuff; just flash and restart. Let me know if it's working okay? =D


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> You're welcome, and no, you won't need to do all that extra stuff; just flash and restart. Let me know if it's working okay? =D


Hi Chris,

I'm afraid that bios didn't agree with my card. I ran the Heaven benchmark and all was going swimmingly until something went wrong: the screen went black and there were straight horizontal lines across it. I think 1173MHz might be pushing it, tbh; based on the overclocking reviews for this card, it's right on the edge according to the review I read.

I am going to flash the old bios back for now.

Cheers.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Hi Chris,
> 
> I'm afraid that bios didn't agree with my card. I ran the Heaven benchmark and all was going swimmingly until something went wrong: the screen went black and there were straight horizontal lines across it. I think 1173MHz might be pushing it, tbh; based on the overclocking reviews for this card, it's right on the edge according to the review I read.
> 
> I am going to flash the old bios back for now.
> 
> Cheers.


Hi, Computational Gurl 90 haha

Worth a shot... This one will work fine... 1500MHz RAM should work fine with a 1000MHz core clock... cooler running too...









Cheers mate









1000-1500-MSI-270-ComponentGurly90-.zip 98k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Hi, Computational Gurl 90 haha
> 
> Worth a shot... This one will work fine... 1500MHz RAM should work fine with a 1000MHz core clock... cooler running too...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers mate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1000-1500-MSI-270-ComponentGurly90-.zip 98k .zip file


Ok nice, will try that soon.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Ok nice, will try that soon.


I hope it works artifact-free with memory at 1500... Good luck, CompetentPCGirl90


----------



## RaFDX

Guys, I'm back!! Hope all is well. What have I missed?


----------



## stephenn82

Same here...SoCal is nice this time of year. Missed the good Mexican food/beer too.


----------



## gordesky1

Here's a 3DMark score comparing my 8370 to my Ryzen: http://www.3dmark.com/compare/fs/12246933/fs/12461879

Superposition scored 2687 with the 1157/1280, though. I think with the 1204 bios on my 8370 it was 2704. I really need to try that bios again, for benchmarks only, to see if PCIe 3.0 made any difference.

Is there a way to open saved Superposition benchmark files? I saved one from my 8370 at 1157/1280 but I can't seem to open it. If I remember right, I think it was 2570-something.

But man, in gaming it's a night-and-day difference between my 8370 at 4.8GHz and the 1700 at 3.6...

Tried 2 games that take a lot. In Forza Horizon 3, at the start of races on my 8370 I would get 20s and sometimes low 30s, and it would stabilize to around 40s, and that's with medium-to-high settings...

Now on Ryzen, on max settings:







The max fps I saw was 70s and up driving around; in races, even at the start, it's in the 50s and 60s, and 70+ through the track... street races are 50+, which before was high 30s and low 40s... Even streaming the game on Twitch doesn't take any performance away...

Tried Miscreated also, which really stresses hardware; on medium settings on the FX I got 30s, and it dipped into the 20s sometimes. Very high was 20s with dips to 18-19...
On Ryzen it's high 40s-50s on maxed settings, and the same while streaming.

On the Horizon forums they said no CPU is going to make that game any better... which I thought too. But my system runs it perfectly now. Guess Horizon likes the 16 threads lol...

GTA 5 is also so much smoother, and I can max that out now too...

I was going to hold off, but I'm really glad I upgraded.


----------



## chris89

Quote:


> Originally Posted by *gordesky1*
> 
> Here's a 3DMark score comparing my 8370 to my Ryzen: http://www.3dmark.com/compare/fs/12246933/fs/12461879
> 
> Superposition scored 2687 with the 1157/1280, though. I think with the 1204 bios on my 8370 it was 2704. I really need to try that bios again, for benchmarks only, to see if PCIe 3.0 made any difference.
> 
> Is there a way to open saved Superposition benchmark files? I saved one from my 8370 at 1157/1280 but I can't seem to open it. If I remember right, I think it was 2570-something.
> 
> But man, in gaming it's a night-and-day difference between my 8370 at 4.8GHz and the 1700 at 3.6...
> 
> Tried 2 games that take a lot. In Forza Horizon 3, at the start of races on my 8370 I would get 20s and sometimes low 30s, and it would stabilize to around 40s, and that's with medium-to-high settings...
> 
> Now on Ryzen, on max settings:
> 
> 
> 
> 
> 
> 
> 
> The max fps I saw was 70s and up driving around; in races, even at the start, it's in the 50s and 60s, and 70+ through the track... street races are 50+, which before was high 30s and low 40s... Even streaming the game on Twitch doesn't take any performance away...
> 
> Tried Miscreated also, which really stresses hardware; on medium settings on the FX I got 30s, and it dipped into the 20s sometimes. Very high was 20s with dips to 18-19...
> On Ryzen it's high 40s-50s on maxed settings, and the same while streaming.
> 
> On the Horizon forums they said no CPU is going to make that game any better... which I thought too. But my system runs it perfectly now. Guess Horizon likes the 16 threads lol...
> 
> GTA 5 is also so much smoother, and I can max that out now too...
> 
> I was going to hold off, but I'm really glad I upgraded.


Which bios are you using?

Nice, send your channel for streaming...

Yeah, let's work on a BIOS to surpass the FX 8370 results... Also make sure it's 8GT/s and PCIe 3.0 x16 in GPU-Z... post a screenshot... Also AIDA GPGPU scores.


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> Guys, I'm back!! Hope all is well. What have I missed?


Not much, other than @gordesky1 doubling his Hawaii/Grenada performance on Ryzen hahaha


----------



## chris89

*@gordesky1

Gordesky1_75C-Gaming-296-266w-1207-1166-1156-1146_1001_888.zip 99k .zip file


Gordesky1_75C_Max_Temp_1207_1388_1001_888.zip 99k .zip file
*


----------



## RaFDX

@chris89 @stephenn82 Any suggestions on how to raise my Superposition score (2820) while dropping my added mV (+65) / overall voltage on my OC'd (1170/1700) card??


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> *@gordesky1
> 
> Gordesky1_75C-Gaming-296-266w-1207-1166-1156-1146_1001_888.zip 99k .zip file
> 
> 
> Gordesky1_75C_Max_Temp_1207_1388_1001_888.zip 99k .zip file
> *


Here's the score in Sup, which gave me a much lower score than the 1157/1250...? I think it's because the vram is at 1000?

gaming

the non-gaming bios

And 300 points lower in 3DMark Fire Strike.

Oh, PCIe is at 3.0 in GPU-Z and 8GT/s in HWiNFO at load.


----------



## chris89

Quote:


> Originally Posted by *gordesky1*
> 
> Here's the score in Sup, which gave me a much lower score than the 1157/1250...? I think it's because the vram is at 1000?
> 
> gaming
> 
> the non-gaming bios
> 
> And 300 points lower in 3DMark Fire Strike.
> 
> Oh, PCIe is at 3.0 in GPU-Z and 8GT/s in HWiNFO at load.


*

Gordesky1_DeLimit_1190Mhz_1378mv_1290Mhz_898mv_75ASIC84C_Gam.zip 99k .zip file


Gordesky1_DeLimit_1190Mhz_1378mv_1290Mhz_898mv_96C_Benchmark.zip 99k .zip file
*


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> *@gordesky1
> 
> Gordesky1_75C-Gaming-296-266w-1207-1166-1156-1146_1001_888.zip 99k .zip file
> 
> 
> Gordesky1_75C_Max_Temp_1207_1388_1001_888.zip 99k .zip file
> *


heres the non gaming one
Quote:


> Originally Posted by *chris89*
> 
> *
> 
> Gordesky1_DeLimit_1190Mhz_1378mv_1290Mhz_898mv_75ASIC84C_Gam.zip 99k .zip file
> 
> 
> Gordesky1_DeLimit_1190Mhz_1378mv_1290Mhz_898mv_96C_Benchmark.zip 99k .zip file
> *


Just tried both of those, but the memory was stuck at 149 MHz even after a driver reinstall..

Had to go back to the 1157/1250 and the memory went back to normal.


----------



## gordesky1

benchmark bios http://www.3dmark.com/compare/fs/12484645/fs/12461879 1166/1289 vs 1157/1250 pretty good










Superposition went up a bit too; it does beat the 1157.

Good temps too.

Not as good as the 1204/1250 BIOS I had on it with the FX a couple weeks ago, which I think scored 2704? It did run hot for me at the time though.

But since I put a fan in front of it, GPU temps and CPU temps went down lol


----------



## chris89

Quote:


> Originally Posted by *gordesky1*
> 
> benchmark bios http://www.3dmark.com/compare/fs/12484645/fs/12461879 1166/1289 vs 1157/1250 pretty good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> sup went up too a bit does beat the 1157
> 
> Good temps too.
> 
> Not as good as the 1204/1250 bios couple weeks ago i had on it with the fx think that scored a 2704? Did run hot for me tho at the time.
> 
> But sense i put a fan in front of it gpu temps and cpu temps went down lol


Nice scores man









Right on so did you use the 64C bios or 96C bios? The 96C bios will score much higher for benchmark only.


----------



## gapottberg

Quote:


> Originally Posted by *chris89*
> 
> The 96C bios will score much higher for benchmark only.


This is what I hate about the artificial scores of most of these benchmarking tools. They literally take a minute difference in actual average FPS... and multiply it to the Nth degree to yield their silly score.

Any review site that doesn't also give the actual average FPS (as limited a metric as even that is) when using synthetic benchmarks as a review metric pisses me off entirely.


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> Nice scores man
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Right on so did you use the 64C bios or 96C bios? The 96C bios will score much higher for benchmark only.


It's the benchmark one, 96C. So it's not a good idea to use this one for normal use, right?


----------



## chris89

Quote:


> Originally Posted by *gordesky1*
> 
> Its the benchmark one 96c, So its not a good idea using this one for normal used right?


Is it running perfectly fine and cool? See if you can tell a difference in performance and thermals between the two?

I really think the 390 can do better haha... Gotta lower your memory module timings by strap and clock the system memory all the way out to get more FPS on this benchmark...

Same BIOS with only extra core voltage, 1386 mV vs 1336 mV, and 1210 MHz core vs 1166 MHz...

1210-1290-96C.zip 99k .zip file


----------



## Streetdragon

sooo i have a little problem: i have 2 monitors and 1 TV.
monitor 1 is DisplayPort (working)
monitor 2 is DVI
TV is HDMI (working)

if i pull the cable of the TV and disable->enable multiscreen in windows, monitor 2 works. but no TV.....

monitor 2 is not even listed in the AMD driver page. where is the problem with that? i really don't get it...

ok never mind. had to change it in windows settings to get it working..... only searched in the amd driver


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> Is it running perfectly fine and cool? See if you can tell a difference in performance and thermals between the two?
> 
> I really think the 390 can do better haha... Gotta lower your module timings by strap? system memory clock memory all the way out to get more fps here on this benchmark...
> 
> Same bios with only extra core voltage at 1386mv vs 1336mv and 1210mhz core vs 1166mhz...
> 
> 1210-1290-96C.zip 99k .zip file


Will give that one a try later too. Yep, temps have been fine all day, haven't passed the 70s yet. VRM is staying in the 70s too. That 240mm fan just sitting there must be helping lol...

It seems like this benchmark loves memory MHz more than core MHz lol..
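That memory-clock sensitivity tracks with the raw numbers: Hawaii/Grenada's 512-bit GDDR5 bus scales bandwidth linearly with memory clock. A small sketch of the standard GDDR5 bandwidth formula (4 transfers per clock), just to put the clocks in this thread in perspective:

```python
# Peak GDDR5 bandwidth: memory clock x 4 (quad data rate) x bus width in bytes.
def gddr5_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    return mem_clock_mhz * 1e6 * 4 * (bus_width_bits / 8) / 1e9

stock = gddr5_bandwidth_gb_s(1500, 512)  # R9 390 stock: 384.0 GB/s
oc    = gddr5_bandwidth_gb_s(1758, 512)  # ~450 GB/s at the 1758 MHz mentioned in this thread
```

Every 100 MHz of memory clock on the 512-bit bus is ~25.6 GB/s of peak bandwidth, which is why memory-heavy benchmarks respond to it so strongly.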


----------



## chris89

Quote:


> Originally Posted by *gordesky1*
> 
> Will give that one a try too later Yep temps been fine all day didn't pass over 70s yet. vrm is staying in the 70s too. that 240mm fan just siting there must be helping lol...
> 
> it seems like the more mhz on the memory is what this benchmark loves than core mhz lol..


I just wanna see the R9 390 beat my PCIe 2.0 390x on Ryzen Bro.. C'mon haha









Try Unigine benchmarks... that's where I saw no difference with a mega high memory clock .. 1563 MHz vs 1758 MHz... like 0.25 fps

@componentgirl90 Lovely card for a lovely girl. I almost bought this card too.. cheers matey


----------



## stephenn82

Sorry everyone for being gone so long. I've been catching up. Looks like the whole thread has come alive with everyone inventing their own BIOSes. Sweet! I haven't done too much lately; been in SoCal for almost 2 weeks and been prepping the house for the in-laws, who are with us this week. Busy busy! Oh, and I got a beta invite for a game that rhymes with steak. It's pretty cool... but I suck at it. It's been years since I played it; forgot all the mechanics and how to actually play it.


----------



## gapottberg

Sounds like a champion of a game.


----------



## stephenn82

Quote:


> Originally Posted by *gapottberg*
> 
> Sounds like a champion of a game.


Yeah it was nice. Heard it will be free to play. Wonder if it will be free as well? My assumption, no. Even rainbow six siege is 15 bucks. Free grinding or pay for quick access.


----------



## gapottberg

Yeah, I have been really tempted to buy "Steak Live" to satisfy my hunger for that particular flavor of mayhem. The new "Steak" looks equally tasty though.


----------



## gordesky1

Quote:


> Originally Posted by *chris89*
> 
> I just wanna see the R9 390 beat my PCIe 2.0 390x on Ryzen Bro.. C'mon haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Try Unigine benchmarks... that's where I saw no difference is mega high memory clock .. 1563mhz vs 1758mhz... like 0.25fps


Hehe yea, at the moment I'm just trying to watch my temps with the current BIOS, 1160/1270. Temps are 80 and a little up lol. Been getting hot in the house again lol


----------



## bluej511

So heres how my 1700x as of today compares to my 4690k with my r9 390 at 1040/1500.

http://www.3dmark.com/compare/fs/12527705/fs/11871887


----------



## RaFDX

Quote:


> Originally Posted by *bluej511*
> 
> So heres how my 1700x as of today compares to my 4690k with my r9 390 at 1040/1500.
> 
> http://www.3dmark.com/compare/fs/12527705/fs/11871887


5k more physics score than my 6700k @ 4.6GHz. nice nice


----------



## bluej511

Quote:


> Originally Posted by *RaFDX*
> 
> 5k more physics score than my 6700k @ 4.6GHz. nice nice


Yea does alright, its doing 3.8 at 1.2v on water and is around 50°C under realbench, so i could probably get it at 4ghz if i get lucky.


----------



## chris89

Gen3... too bad it's not PCIe 2.0 for comparison


----------



## RaFDX

Quote:


> Originally Posted by *bluej511*
> 
> Yea does alright, its doing 3.8 at 1.2v on water and is around 50°C under realbench, so i could probably get it at 4ghz if i get lucky.


I'm at 1.340V cooling the i7 with a Cryorig H5 Ultimate. I am going to delid in a few weeks (hopefully) and push for 4.8. I can hit 4.7, but at 1.376Vish

http://www.3dmark.com/fs/12410990


----------



## bluej511

Quote:


> Originally Posted by *RaFDX*
> 
> I'm at 1.340V cooling the i7 with a Cryorig H5 Ultimate. I am going to delid in a few weeks (hopefully) and push for 4.8. I can hit 4.7, but at 1.376Vish
> 
> http://www.3dmark.com/fs/12410990


http://www.3dmark.com/compare/fs/12410990/fs/12527705


----------



## RaFDX

I wanna up my scores; looks like I'll be overclocking some more. Haven't touched the MSI's OC since I installed the AIO. Maybe I can go faster? @chris89 @stephenn82 thoughts on more speed?


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> i wanna up my scores, looks like ill be overclocking some more. Havent touched the MSI (OC) since i installed the AIO, maybe i can go faster? @chris89 @stephenn82 thoughts on more speed?


 1173-1290-HAWAII-GPU.zip 99k .zip file


----------



## Dairam

.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> 1173-1290-HAWAII-GPU.zip 99k .zip file


So the program froze and I rebooted and now the bios is corrupted. How do I get it back??

Getting it back now


----------



## gordesky1

Quote:


> Originally Posted by *RaFDX*
> 
> So the program froze and I rebooted and now the bios is corrupted. How do I get it back??
> 
> Getting it back now


I fully recommend using ATIFlash from DOS instead of doing it in Windows. I never had an issue doing it that way; it's much safer. In Windows anything can go wrong, as you can see.


----------



## RaFDX

Quote:


> Originally Posted by *gordesky1*
> 
> I fully recommend using ati flash instead of doing it in windows. I never had a issue doing it that way much safer. in windows anything can go wrong as you can see.


I made the boot USB, however atiflash wouldn't run in dos, so I had to use the command prompt to run atiflash and used the following command once I was in the directory

*Atiflash -p -f 0 bios.rom*

Reinstalled Crimson 17.5 and I am up and running.

Never will I diss iGPU again. Lol
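A corrupted flash like that usually traces back to a truncated or mismatched ROM image. This is not part of ATIFlash, just a hypothetical pre-flight check one could run on the image first: a valid PCI expansion ROM starts with the 0x55 0xAA signature, and the 512 KB size ceiling here is an assumption, not a Hawaii spec:

```python
# Hypothetical sanity check on a VBIOS image before flashing.
def looks_like_vbios(rom_bytes: bytes, max_size: int = 512 * 1024) -> bool:
    has_signature = rom_bytes[:2] == b"\x55\xAA"  # PCI expansion ROM magic
    sane_size = 0 < len(rom_bytes) <= max_size
    return has_signature and sane_size
```

It won't catch a bad table edit, but it does catch the zero-byte or half-downloaded file case before it reaches the flash chip.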


----------



## componentgirl90

Alas, the retailer failed to detect any problem with my 390x :| and they have swiftly returned the card to me.

I will try and install drivers 15.20 because I have read a number of issues about 100% fan and drivers. I think I may have only installed recent drivers.

I remain determined to resolve this issue.

I will run through a large checklist of things before I start to take the card apart tbh.


----------



## RaFDX

Quote:


> Originally Posted by *componentgirl90*
> 
> Alas, the retailer failed to detect any problem with my 390x :| and they have swiftly returned the card to me.
> 
> I will try and install drivers 15.20 because I have read a number of issues about 100% fan and drivers. I think I may have only installed recent drivers.
> 
> I remain determined to resolve this issue.
> 
> I will run through a large checklist of things before I start to take the card apart tbh.


what's going on with your card?


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> I made the boot USB, however atiflash wouldn't run in dos, so I had to use the command prompt to run atiflash and used the following command once I was in the directory
> 
> *Atiflash -p -f 0 bios.rom*
> 
> Reinstalled Crimson 17.5 and I am up and running.
> 
> Never will I diss iGPU again. Lol


How's it running?


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> Hows it running?


No issues; haven't overclocked it again yet. I was at 1175/1700 before I tried your OC. I think I am going to up the memory clock (+25) and run the Heaven benchmark to see if there's anything funky


----------



## componentgirl90

Quote:


> Originally Posted by *RaFDX*
> 
> what's going on with your card?


Quote:


> Originally Posted by *RaFDX*
> 
> what's going on with your card?


Basically, it accelerates to 100% fan when idle and you can't reduce it. I have RMA'd it twice and they think it's a software issue or that I have poor airflow. I have installed different drivers, but I think they may have been recent drivers only. I am going to try installing 15.2 (which were the original drivers I got with the card).

I have collated ideas for solving the issue from different websites etc and will run through these before doing anything more drastic.


----------



## RaFDX

Dumb question; have you tried MSI Afterburner or NZXT Cam to control the fans??

Have you tried to hook up another fan to it via gelid pwm fan adapter??


----------



## componentgirl90

I have tried MSI Afterburner, but when it's going 100% it won't respond if you try to take control of the fans. Normally when I take control of the fans in MSI Afterburner I can slide it wherever I want.

"Have you tried to hook up another fan to it via gelid pwm fan adapter??" - I don't understand what you mean by this; I have never done that.


----------



## lanofsong

Hey there R9 390/390X owners,

Would you consider signing up with Team OCN for the 2017 Pentathlon (*May 5th through May 19th*)? There is still plenty of time left, and we really could use your help.

This event is truly a GLOBAL battle, with Team OCN going up against many teams from across the world, and while we put in a good showing at last year's event by finishing 6th, we could do with a lot more CPU/GPU compute power. All you need to do is sign up and crunch on any available hardware that you can spare.

The cool thing about this event is that it is spread over 5 disciplines of *varying lengths of time* (different projects), so there is a lot of *strategy/tactics* involved.

We look forward to having you and your hardware on our team. Again, this event lasts for two weeks and takes place May 5th through the 19th.


Download the software here.

https://boinc.berkeley.edu/download.php

Presently we really would like some help with the following project - This starts 8pm EST 5/8/17 :

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you crunch on, you will be asked if you want to join a team - type in overclock.net (enter) then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful









8th BOINC Pentathlon thread

To find your Cross Project ID# - sign into your account and it will be located under Computing and Credit


Please check out the GUIDE - How to add BOINC Projects page for more information about running different projects:

This really is an exciting and fun event. I look forward to it every year, and I hope that you will join us and participate.









BTW - There is an awesome BOINC Pentathlon badge for those who participate









lanofsong

OCN - FTW


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> I have tried MSI afterburner but when its going 100% it wont respond if you try and take control of the fans. Normally if you take control of the fans in MSI afterburner I can slide it whereever I want.
> 
> "Have you tried to hook up another fan to it via gelid pwm fan adapter??" - I don't understand what you mean by this I have never done that.


Try this? Reduced fan max plus cooler memory voltage/ clock









1000Mhz-1001Mhz.zip 99k .zip file
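For context on what a fan edit in a BIOS like this actually changes: a Hawaii-style fan table is a handful of (temperature, PWM%) points that the fan controller interpolates between. A rough sketch with invented points, not the values in the attached file:

```python
# Linear interpolation over a (temp_c, pwm_pct) fan table, clamped at the ends.
def fan_pwm(temp_c: float, table: list) -> float:
    if temp_c <= table[0][0]:
        return table[0][1]
    if temp_c >= table[-1][0]:
        return table[-1][1]
    for (t0, p0), (t1, p1) in zip(table, table[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

curve = [(40.0, 20.0), (65.0, 40.0), (90.0, 75.0)]  # example points only
```

Lowering the max-PWM point tames the noise; the trade-off is that the card sits closer to its temperature limit, which is why the temperature targets get adjusted alongside it.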


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Try this? Reduced fan max plus cooler memory voltage/ clock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1000Mhz-1001Mhz.zip 99k .zip file


I felt the last bios you gave me for the 390x helped a lot. The card was revving really quickly and I believed the new bios kept it cool most of the time. But it still had problems after long gaming sessions. The strange thing is the fans were still running fast the morning after when I switched it on. Surely the card had cooled down by then.

Anyway, I am going to just try a tonne of things before doing any more bios things tbh. The way I see it, if other people were having fans going to 100% because of driver issues, then the same could be true with me.

I'll keep posting







Just a bit tired today and not feeling like facing this issue lol


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> I felt the last bios you gave me for the 390x helped a lot. The card was revving really quickly and I believed the new bios kept it cool most of the time. But it still had problems after long gaming sessions. The strange thing is the fans were still running fast the morning after when I switched it on. Surely the card had cooled down by then.
> 
> Anyway, I am going to just try a tonne of things before doing any more bios things tbh. The way I see it, if other people were having fans going to 100% because of driver issues, then the same could be true with me.
> 
> I'll keep posting
> 
> 
> 
> 
> 
> 
> 
> Just a bit tired today and not feeling like facing this issue lol


It should work well as the fan percentage is set lower and temperature limits are reduced.

Right on otherwise, l8r.


----------



## Streetdragon

i think the new driver (17.5.1) is crap. running prey, crossfire crashes from time to time
running 3dmark in crossfire = redscreen -> system restart
running heaven in crossfire = works

last time crossfire... vega help me....

edit: single gpu (pulled power on the second + disabled with bios switch)
3dmark is working without problems hmmm...


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Try this? Reduced fan max plus cooler memory voltage/ clock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1000Mhz-1001Mhz.zip 99k .zip file


This is for the 270 or the 390x?


----------



## AverdanOriginal

Quote:


> Originally Posted by *Streetdragon*
> 
> i think the new driver is crap (17.5.1) running prey crossfire crashes from time to time
> running 3dmark in crassfire = redscreen -> system restart
> running heaven in crossfire = works
> 
> last time crossfire... vega help me....
> 
> edit. single gpu (plug power on secound + disabled with bios switch)
> 3dmark is working without problems hmmm...


I went back to 16.11 in order to still use MSI AB, since under 17.4 I couldn't increase the voltage, only the clock.
Do you know if you can use MSI AB with 17.5.1 again? Or at least overclock the 390s properly with Wattman?

Crossfire support seems to come and go: with one driver it works, then with the next one it doesn't. That's why I pretty much discarded the idea of going Crossfire.


----------



## RaFDX

I have the newest drivers (17.5.1) and am able to use MSI Afterburner with no issues. I don't have crossfire or use Wattman.


----------



## Roxborough

Picking up an MSI R390X 8gb tonight for £120 (bargain if you ask me). Going to put some MX-4 thermal paste on it and try a few overclocks. My aim is to beat a 980 at 1440p. Going to get my old 2600k back up to 4.7ghz and see what this thing can do!

EXCITE!!


----------



## bluej511

Quote:


> Originally Posted by *Roxborough*
> 
> Picking up an MSI R390X 8gb tonight for £120 (bargain if you ask me). Going to put some MX-4 thermal paste on it and try a few overclocks. My aim is to beat a 980 at 1440p. Going to get my old 2600k back up to 4.7ghz and see what this thing can do!
> 
> EXCITE!!


Eh, use something better than MX-4 if you've got it; on my GPUs I only use GC Extreme or Kryonaut.


----------



## Roxborough

Quote:


> Originally Posted by *bluej511*
> 
> Eh use something better then mx-4 if you got, on my gpus i only use gc extreme or kryonaut.


Unfortunately the only paste other than the MX-4 I have is the Zalman thermal paste from a VF3000 cooler I used back in 2010 on my GTX 470 rig. Haha! It does not seem like it would be as good as the MX-4.

I got the MX-4 for £7 a few months back, as opposed to £20 for the same amount. It is great for CPUs. I imagine it would be better than the stock paste on this R9 390X, which is 1.5 years old and used.

Found this. Although I can't justify spending anymore money on something I already have.

I will test the temps before and after and see if I have actually made it worse or not.


----------



## bluej511

Quote:


> Originally Posted by *Roxborough*
> 
> Unfortunately the only paste other than the MX-4 I have is the Zalman thermal paste from a VF3000 cooler I used back in 2010 on my GTX 470 rig. Haha! It doe snot seem like it would be as good as the MX-4.
> 
> I got the MX-4 for £7 a few months back as opposed to £20 for the same amount. It is great for CPU's. I imagine it would be better than the stock paste on this R9 390X. Which is 1.5 years old and used.


True, my ekwb cpu block came with gc extreme (they dont anymore which is a shame) and i had so much left over i was able to do my cpu like 4-5x and my gpu with it, under water my r9 390 reached 39°C playing siege uncapped (so pretty much balls out gpu usage)

I wonder if using liquid ultra would make any difference on the gpu haha.


----------



## Roxborough

Quote:


> Originally Posted by *bluej511*
> 
> True, my ekwb cpu block came with gc extreme (they dont anymore which is a shame) and i had so much left over i was able to do my cpu like 4-5x and my gpu with it, under water my r9 390 reached 39°C playing siege uncapped (so pretty much balls out gpu usage)
> 
> I wonder if using liquid ultra would make any difference on the gpu haha.


haha, yeah, but that stuff is a nightmare, one mistake and you can bust your GPU. Not only that, it is all about the application. I have always valued this post.. I am sure MX-4 will do just fine. I may be upgrading the cooler to a Kraken with the current H60 I have, then get a H80 for my CPU. Not sure yet though.


----------



## bluej511

Quote:


> Originally Posted by *Roxborough*
> 
> haha, yeah, but that stuff is a nightmare, one mistake and you can bust your GPU. Not only that, it is all about the application. I have always valued this post.. I am sure MX-4 will do just fine. I may be upgrading the cooler to a Kraken with the current H60 I have, then get a H80 for my CPU. Not sure yet though.


For a CPU I wouldn't get anything lower than a 240mm rad. A single does fine for CPUs, but you get way better temps with a 240.


----------



## Roxborough

Quote:


> Originally Posted by *bluej511*
> 
> For a cpu i wouldnt get anything lower then a 240mm rad. A single does fine for cpus but way better temps with a 240.


Yeah, it is a shame that I am on a low budget atm. My current rig, which is my old rig, is a downgrade from my i7 7700K & 1080 Ti. I sold that rig to my friend to get my old rig back. Only cost me £100 for my rig back in the end! And there is way more than £100 worth of parts. The GTX 680 4GB just does not cut it at 1440p.

I do want a 240mm rad, but they are very very expensive, and for what I am doing, there is little gain really. I'd rather go with a massive macho cooler and stick some scythe gentle typhoons on it.


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> I have the newest drivers (17.5.1) and am able to use MSI Afterburner with no issues. I don't have crossfire or use Wattman.


That's the key, fellas. Use DDU, entirely remove all drivers and registry entries, install the driver, and NEVER open Wattman... ever.

When you are done playing with software clocks and want to hit that card's potential, talk to chris89 to convert you to the dark side


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> For a cpu i wouldnt get anything lower then a 240mm rad. A single does fine for cpus but way better temps with a 240.


Just a touch better on a 280 setup...if you have the case real estate


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> That's the key fellas. Use DDU, entirely remove all drivers and registry entries, install driver, and NEVER open wattman...ever.
> 
> When you are done playing with software clocks and want to hit that cards potential, talk at chris89 to convert you to the dark side


What's funny is I have deleted it using the AMD uninstaller (which does the same as DDU now), cleaned the registry and reinstalled. I didn't open Wattman, but my Afterburner still resets itself, while Wattman doesn't on the latest driver.

It's a shame Afterburner hasn't been updated in forever to actually work with AMD. I'm sure it's a setting in AB, because if I change PowerPlay support or whatever I can OC in AB, but the settings don't stick and I'm stuck at 300 core and 150 memory.


----------



## stephenn82

AMD totally revamped the driver; it broke the link between AB and the driver/card. Sucks, I know. But a custom BIOS will do more for you than AB


----------



## bluej511

Quote:


> Originally Posted by *stephenn82*
> 
> Amd totally revamped thw driver, it broke the link between AB and the driver/card. Sucks, i know. But custom bios will do more for you than AB


Yea I know, the card is plenty for my ultrawide as is. Running it at 1200/1650 at all times won't give me much more performance since I cap my FPS at 74 (FreeSync ftw). I'm waiting for Vega anyway, so the R9 390 will either be a backup of a backup (have a 7850 in my closet as well) or be sold to someone looking to mine or upgrade.


----------



## gapottberg

Quote:


> Originally Posted by *bluej511*
> 
> Whats funny is i have deleted it using amd uninstaller (which does the same as DDU now), cleaned the registry and reinstalled. I didn't open wattman but my afterburner still resets itself but wattman doesnt on the latest driver.
> 
> It's a shame afterburner hasn't been updated in forever to actually work with AMD. I'm sure its a setting in ab because if i changed powerplay support or wtv i can OC in ab but the settings dont stick and im stuck at 300 core and 150memory.


This has been stated before, but there is a setting in MSI Afterburner you must check in order to get the new ReLive drivers to stop resetting everything every damn time you turn around. I can't remember the exact name of it, but it was near the bottom of the settings menu. Just mouse over them all with tooltips enabled and you will find it. It should help.


----------



## bluej511

Quote:


> Originally Posted by *gapottberg*
> 
> This has been stated before but there is a setting in MSI Afterburner you must check in order to get the new relive drivers to stop resetting everything every dam time you turn around. I cant remember the exact name of it, but it was near bottom of settings menu. Just mouse over them all with tool tips enabled and you will find it. It should help.


I already have that ticked haha. Does nothing, ab resets settings automatically and wont apply but wattman applies em just fine.


----------



## componentgirl90

Quote:


> Originally Posted by *Roxborough*
> 
> Picking up an MSI R390X 8gb tonight for £120 (bargain if you ask me). Going to put some MX-4 thermal paste on it and try a few overclocks. My aim is to beat a 980 at 1440p. Going to get my old 2600k back up to 4.7ghz and see what this thing can do!
> 
> EXCITE!!


Nice!! The 390X is a great card. It already matches the 980 at stock, from the reviews I read in the past (although there is a lot of variation from game to game). You are aiming to beat an overclocked 980? Best of luck with that, because maybe the 980 > 390X when overclocked, but I believe in you!


----------



## RaFDX

I use GELID GC-Extreme for my 390 and the delta in temps is honestly 5-7C across the board. No pun. GREAT stuff

I use thermal grizzly kryonaut for my CPU. ALSO great stuff


----------



## AverdanOriginal

Quote:


> Originally Posted by *bluej511*
> 
> I already have that ticked haha. Does nothing, ab resets settings automatically and wont apply but wattman applies em just fine.


I don't mind using Wattman instead of MSI AB, but it didn't work with 17.4.
Actually I just wanted to work with Wattman and discard MSI AB, in order to reduce the constantly running programs in the background, but then I would lose the nice OSD from RivaTuner.
But that's ok, as I have my second screen running simultaneously in order to check system stats.
Quote:


> Originally Posted by *stephenn82*
> 
> Amd totally revamped thw driver, it broke the link between AB and the driver/card. Sucks, i know. But custom bios will do more for you than AB


Now, I haven't read up on flashing the BIOS on my R9 390 yet. *What exactly makes it so much better than running with MSI AB or Wattman?* The clocks would be the same anyway, no? Could you please explain in a bit more detail? Then I might actually consider it if it pays off.
I only see the problem of not being able to update to the newest drivers, and if for some reason a game keeps crashing, I would need to flash my BIOS back to the original, right?

I have a couple of OCs saved in MSI AB (for hot days, for standard use, for extra power in demanding games) and it is easy to switch.


----------



## bluej511

Quote:


> Originally Posted by *AverdanOriginal*
> 
> I don't mind using Wattman instead of MSI AB, but it didn't work with 17.4.
> Actually I just wanted to work with Wattmann and discard MSI AB, in order to reduce constant running programs in the background, but then I would lose the nice OSD from Rivatuner.
> But that's ok, as I have my second screen running simultaniously in order to check system stats.
> Now I haven't read myself into flashing the BIOS on my R9 390 yet. *What exactly makes it so much better then running with MSI AB or Wattman?* the Clocks would be the same anyways or? Please could you explain in a bit more detail, then I might actually consider it if it pays off.
> I only see the problem of not being able to update to the newest drivers, and if for some reason a game keeps crashing, I would need to flash my bios again back to the original or?
> 
> I have a couple of OCs saved in MSI AB (for hot days, for standard use, for extra power in demanding games) and it is easy to switch.


I think you can actually download RTSS as a standalone program, can't you? And you can use it with HWiNFO64 and get even more information on screen than you thought possible.


----------



## AverdanOriginal

Quote:


> Originally Posted by *bluej511*
> 
> I think you can actually download rtss as a lone program can't you? And you can use it with hwinfo64 and get even more information on screen then you thought possible.


That would be great. But I think that used to be possible, but not anymore since MSI AB took it over.

Wait I just checked --> HWiNFO Forum

so that might actually be possible. Will try and see if it works.


----------



## bluej511

Quote:


> Originally Posted by *AverdanOriginal*
> 
> That would be great. But I think that used to be possible, but not anymore since MSI AB took it over.
> 
> Wait I just checked --> HWiNFO Forum
> 
> so that might actually be possible. Will try and see if it works.


He has a thread on here as well; worth asking. If anyone knows, it's Martin for sure. http://www.overclock.net/t/1235672/official-hwinfo-32-64-thread/0_20


----------



## Sinster

Don't bust my stones too bad; there are 1135 pages here, so I apologize for asking the same questions again.

I'm running two MSI 390Xs in CF (motherboard: MSI Gaming 7 X97), and of course we all know the second card will run hotter. I keep hitting the second card's thermal limit, which drops the GPU frequency and affects my gaming.

Things I've tried
Went from an AzzA Solondo 1000 to a Thor V.2 case and applied MX-4.

I just ordered a Kraken G10, a G40 AIO, and VRM heatsinks. I've found where others have successfully installed an AIO on the MSI 390X, but I have a question (I know, after all that rambling) that I haven't been able to find an answer to.

Will I require longer screws if I want to leave the backing plate and mid plate on the card with the G10? Can the G10 be installed on the stock backing plate?

My system specs aren't correct. I haven't updated them for a while.


----------



## stephenn82

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Now I haven't read myself into flashing the BIOS on my R9 390 yet. *What exactly makes it so much better then running with MSI AB or Wattman?* the Clocks would be the same anyways or? Please could you explain in a bit more detail, then I might actually consider it if it pays off.
> I only see the problem of not being able to update to the newest drivers, and if for some reason a game keeps crashing, I would need to flash my bios again back to the original or?
> 
> I have a couple of OCs saved in MSI AB (for hot days, for standard use, for extra power in demanding games) and it is easy to switch.


Let's say you've got your R9 390 overclocked on the stock BIOS to, say, 1125 on the core and 1625 on the memory. That was my go-to setting via MSI AB. What are your temps? Look at how much power is consumed in GPU-Z, and definitely check the temps as well.

Do you overclock and try to undervolt your CPU to limit heat and power usage?

The stock BIOS has a thermal wall. Sure, your card shows you your clocks, but it won't maintain those speeds consistently; it keeps running into the thermal limit.

Enter the custom BIOS. You can raise the maximum thermal limit, raise clocks, lower power usage, and get better performance with less heat and a lower power requirement.

Hope I explained that right, chris89 lol

My maximum score in Valley at 1140/1650 was about 2900-3000. Even after replacing the TIM, my temps were hitting the low 70s.

I hit 2980 regularly with my custom BIOS at 1175/1563. I get to push to 1175! I couldn't do that via AB no matter how much core voltage or power slider I added. My temps just touch 65°C in Superposition runs. My 1080p Extreme runs are about 200 points higher than "official" 390 results. I ain't paying for a benchmark tool.

But thats why i run custom bios.


----------



## RaFDX

I still want to tweak the BIOS to undervolt! Right now I'm at 1175 core and 1700 mem.


----------



## chris89

1700 MHz memory makes total power over 400 watts. Reducing core voltage will only shave off a few watts; it's the 512-bit RAM that consumes all the power. You must undervolt the RAM, or find a stable memory undervolt at 1563 MHz. There isn't much in between, though: it's either 888 mV or 1000 mV, 1290 MHz memory or up to 1758 MHz. That's about a 200-watt difference.
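As a rough sanity check on that ~200-watt claim, dynamic power in CMOS scales roughly with frequency times voltage squared (P ∝ f·V²). This is a first-order approximation, not a measurement of the card; only the relative ratio between the two memory operating points quoted above is meaningful:

```python
# Rough dynamic-power scaling: P ~ f * V^2 (first-order CMOS approximation).
# Operating points are the ones quoted in the post; the absolute wattage of
# the memory subsystem is NOT known here, so only the ratio is meaningful.

def relative_power(freq_mhz: float, volts: float) -> float:
    """Unitless dynamic-power figure proportional to f * V^2."""
    return freq_mhz * volts ** 2

low = relative_power(1290, 0.888)    # undervolted / underclocked memory
high = relative_power(1758, 1.000)   # full memory clock at stock voltage

ratio = low / high
print(f"undervolted memory draws ~{ratio:.0%} of the dynamic power")
```

By this approximation the undervolted point draws roughly 58% of the full-clock memory's dynamic power; whether that corresponds to a full 200 W depends on the memory subsystem's absolute draw, which this sketch can't establish.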


----------



## RaFDX

I'd like to try the 1175 core with the 1290 MHz memory undervolt. I wouldn't mind consuming 200 fewer watts. But then if I want to bench, I'd have to swap BIOSes. Hhhmmmmmm


----------



## chris89

Yeah, that's why we have a BIOS switch. Use Restart64.exe from CRU when benchmarking: flip the switch, run Restart64.exe, benchmark, flip back, run Restart64.exe again, and you're good to go. Kind of a pain, but not a huge deal.

RaFDX-GAMING-1188-1250.zip 99k .zip file


RaFDX-BENCHMARK-1188-1700.zip 99k .zip file


----------



## Rexer

Anyone got anything good to say about the 480/580 cards? I just blew up my last 390X (boo, hoo) and need a spare card until AMD's manufacturers sell out of the first Vega reference cards. Adding to that wait, High Bandwidth Memory (HBM) is on back order, so that's about 3 months down the road. I don't want Nvidia because I own two FreeSync monitors.
The 390(X), good God, what fun cards they are. They're the mystery kick-455 card of the rodeo. I have friends who own big Nvidia gods, and playing FPS games against them, they literally and routinely got stomped and shoved into the dirt. Haw, haw. In my opinion, as much stuff and performance as the big Nvidia cards have, they try to resolve too much too fast and lose something in their small 256-bit pipe. Whatever it is, it levels the playing field. When you get heat mixed in with that, it changes everything. Nvidia says they're not worried about Vega, but then why do they plan to move up their release dates for Volta?


----------



## Mister300

Hey Rex, my son has the XFX 480X and it's real close to my XFX 390X at 4K, 60 fps. I would recommend it for 200 USD. The 390X has a tad more crunch due to more ROPs and TMUs and more memory bandwidth.


----------



## Rexer

Nice! I just ordered a Sapphire 580 n+ (the 480 n+ sold out!). Thanks for the tip, Mister300. Yeah, I sort of pushed my 390X to the end. I feel like I crashed an airplane and survived to tell about it. It sat at the edge of 80°C with 100% fans most of the time (I kept seeing that white snow when it overheats). It went out with a splashy flash and a black screen. Heh, heh.
Titan 2 and Call of Duty AW should have warning labels on them. I go absolutely nuts playing multiplayer first-person shooters. I actually bought an air conditioner to get more cool air. One of my friends, Jerry, taunts me that it will be the end of me. He's got a 1080 running at stock settings and beats my scores, mostly when I leave the computer for the bathroom or kitchen. Our pings are pretty close most of the time.
Back when the HD 7950 was around, I'd get whipped by GTX 770s and 780s all the time. It really makes a person feel inferior to get beat up with low wins. Then when the 390X came out, I began to stomp away at 970s and 980s. So I knew the 390X was the all-new testosterone enhancer!
All the calculated measurements that hardware review sites use really aren't that accurate. I see things more clearly and smoothly in actual game experience. I easily play against 1080s. Either the Nvidia cards are overrated, or the players I play against are using inferior settings and hardware. Maybe the players just aren't savvy, or Jerry drinks too much when he plays.
By all means, the 390X is too much fun. I'm really gonna miss this card. Unless Vega proves to be better.


----------



## tuparisd

I have a Sapphire Tri-X R9 390X, and I've been thinking that next time I come into some cash I might CrossFire it with a used RX 480 RS 8GB or an R9 390 (regular, not 390X).

Does anyone think the performance would be a lot better (not that the Tri-X running stock really has issues running new games at high FPS and ultra settings)? I just want my otherwise (at least now, with Ryzen) last-gen PC to keep up with games for as long as possible.


----------



## RaFDX

Quote:


> Originally Posted by *tuparisd*
> 
> I have a Sapphire Tri-x R9 390X, and I've been thinking next time I come into some cash maybe xfire it with a used rx 480 rs 8GB or a R9 390(regular not 390x)
> 
> anyone think the performance would be a lot better(not that the Tri-x running stock really has issues running new games at high FPS and Ultra settings)? I just want my otherwise(at least now with ryzen) now last-gen PC keeps up with games for as long as possible.


I would do neither, TBH. The 390X can handle games easily. I would wait for the second wave of Vega cards to upgrade.

A lot of people are still running 4690Ks. That's just my opinion.


----------



## stephenn82

Agreed. Just wait. There is NO reason to go upgrading right now if you have a 290/390. The 480/580 match it at lower power usage, but don't go spending money just yet. I would wait until the tech refresh of Vega, as RaFDX said.


----------



## tuparisd

false
Quote:


> Originally Posted by *RaFDX*
> 
> I would do neither tbh. The 390x can handle games easily. I would wait for the second flight of Vega cards to upgrade.
> 
> A lot of people are still running 4690k's. That's just my opinion


Alright. I just have a feeling I won't be able to afford a new card when the next wave comes out, so I was hoping CrossFire could compete. But yes, for now it does all I need; I just wish the Tri-X had more HDMI ports and fewer DisplayPorts. I needed a DisplayPort-to-HDMI adapter cable for my TV lol


----------



## stephenn82

Just save about $20 a month if you can, and when the tech refresh of Vega hits, you can afford that **** like a big baller! At least that's my plan.. lol


----------



## RaFDX

Quote:


> Originally Posted by *tuparisd*
> 
> false
> Alright, I just have a feeling I won't be able to afford a new card when the next wave comes out, so I was hoping xfire could compete, but yes, for now it does all I need, just wish the tri-x had more hdmi ports and less displayports. I needed a displayport to hdmi adapter cable for my TV lol


Use adapters and, like Steph said, save a little every month?


----------



## RaFDX

Quote:


> Originally Posted by *tuparisd*
> 
> I have a Sapphire Tri-x R9 390X, and I've been thinking next time I come into some cash maybe xfire it with a used rx 480 rs 8GB or a R9 390(regular not 390x)
> 
> anyone think the performance would be a lot better(not that the Tri-x running stock really has issues running new games at high FPS and Ultra settings)? I just want my otherwise(at least now with ryzen) now last-gen PC keeps up with games for as long as possible.


I saw this review and found the computer he used interesting 

http://www.pcgamer.com/forza-horizon-3-hot-wheels-review/


----------



## Stige

Anyone have a stock BIOS for ASUS R9 390 Strix DC3 at hand perhaps?

I'm suddenly getting constant freezes/resets whenever anything GPU-related is involved...

It freezes/resets 3-4 times before the whole PC locks up and I have to restart with the power button.


----------



## lanofsong

Hey there R9 390/390X owners,

We are having our monthly Foldathon from Monday 22nd - Wednesday 24th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

May 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - requires a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## RaFDX

Quote:


> Originally Posted by *Stige*
> 
> Anyone have a stock BIOS for ASUS R9 390 Strix DC3 at hand perhaps?
> 
> I'm getting constant freezes/resets if anything GPU related is involved all of a sudden...
> 
> Freezes/resets 3-4 times before the whole PC freezes and I have to restart with power button.


https://www.techpowerup.com/vgabios/


----------



## Stige

Quote:


> Originally Posted by *Stige*
> 
> Anyone have a stock BIOS for ASUS R9 390 Strix DC3 at hand perhaps?
> 
> I'm getting constant freezes/resets if anything GPU related is involved all of a sudden...
> 
> Freezes/resets 3-4 times before the whole PC freezes and I have to restart with power button.


Well, I managed to fix this with the stock BIOS, for now at least.

My own BIOS had really tight memory timings, +100 mV, and maybe some other tweaks. It's probably something to do with the memory timings, since memory clocks can cause black screens too.


----------



## Rexer

Quote:


> Originally Posted by *RaFDX*
> 
> https://www.techpowerup.com/vgabios/


Wow. How did you ever come across this site? I'd always thought the BIOS was the best way to change the characteristics of a stock card, but I shied away from tweaking the GPU BIOS, thinking it was set-and-forget and I'd never find the stock settings if I needed them. After all the GPU cards that have passed through my hands, this would've changed everything.
Ever since I went on a hunt for an HD 6950 with a BIOS change for Apple machines, I thought, 'This could create a whole new life for downtrodden Apple users' who are stuck with their expensive narrow tunnel... and, after a monumental safari into eBay and deep into hazardous Chinese sites, I thought, no. Curse them for buying Apple. Heh, heh.
The money saved from buying an Apple office machine could buy you a great BBQ grill and patio set, along with that great game computer you made.
Anyway, thanks for answering Stige's post. This will come in handy.


----------



## Stige

Quote:


> Originally Posted by *Rexer*
> 
> Wow. How did you ever come across this site? I'd always had a thought the bios is the best way to change the characteristics of a stock card but I shyed away from tweaking the gpu bios thinking it was set and forget and I'd never find stock settings if I needed to. After all the gpu cards that passed through my hands, this would've changed everything.
> Ever since I went on a hunt for an HD6950 with a bios change for Apple machines I thought, 'This could create a whole new life for down trodden Apple users' who are stuck with their expensive narrow tunnel.. . and, after a monumental safari into ebay and deep into hazardous Chinese sites, I thought, no. Curse them for buying Apple. Heh, heh.
> The money saved from buying an Apple office machine could buy you a great BBQ grill and patio set. That's along with that great game computer you made.
> Anyway, Thanks for answering Stieg's post. This will come in handy.


You can also just save the stock BIOS yourself before modifying it. Well, you have to save it anyway to modify it.


----------



## stephenn82

I am tempted to load stock and 17.5.2 driver and give it a go, then reflash this 1173/1563 and see the gainz!


----------



## Cyberpunk2077

Hi guys, today I bought a used MSI R9 390 with a 5-month warranty. I installed the latest drivers and tested with Prey at ultra settings: good FPS, but the temperatures are too high for me, 92-93°C at 100% fan speed. Titanfall 2, Hitman, and Heroes of the Storm run near 80°C. Battlefield 1 says it stopped working, so I can't test it. Are these temps normal?


----------



## Streetdragon

It's a hot chip. You could try redoing the thermal paste and pads. Do you have good airflow in your case?


----------



## Cyberpunk2077

Quote:


> Originally Posted by *Streetdragon*
> 
> its a hot chip. you could try to redo the thermal paste and pads. do you have a good airflow in your case?


I have 2 front intake, 2 top exhaust, and 1 rear exhaust fan. I can't decide whether to use the warranty or apply new thermal paste.


----------



## chris89

$229 390X 8GB is intriguing https://www.visiontek.com/refurbished-radeon-r9-390x-pcie-8gb-gddr5.html

Quote:


> Originally Posted by *Cyberpunk2077*
> 
> İ have 2 front intake, 2 top exhaust, 1 rear exhaust fans.İ cant make a decision, use warranty or apply new thermal paste.


Send me your bios .rom buddy







...zip it.. zip it good for attachment here bro


----------



## Streetdragon

Sounds like good airflow. Like chris (godfather of BIOS) said, send him your BIOS file and he will tune it a bit for a cooler and maybe faster card.


----------



## RaFDX

Quote:


> Originally Posted by *Cyberpunk2077*
> 
> Hi guys, today ı bought a used msi r9 390, have a 5 month warranty.İnstalled latest drivers and tested with Prey ultra settings, good fps but temperatures too high for me, 92-93 celcius, fan speed %100. Titanfall 2,Hitman,Heroes of the storm near 80 celcius. Battlefield 1 says stopped working, i cant tested. İs this temps normal?


I would use Gelid Extreme or Thermal Grizzly Kryonaut for the paste.

I spent the $60 and swapped out the fans for an AIO solution. My temps don't break 60°C (Overwatch at highest settings, just because).

Those temps are without Chris's modded BIOS, which I would still like to try, Chris. Wink wink, hint hint ??


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> I would use Gelid Extreme or thermal grizzly kyronaut for the paste.
> 
> I spent the $60 and swapped out the fans for an aio solution. My temps don't break 60c (overwatch highest settings, just cause).
> 
> Those temps are without Chris's modded bios. Which I would still like to do Chris. Wink wink hint hint ??


Get GPU-Z and pull the BIOS from the card. Post it here for @chris89 to tinker with, and he will post some back here.


----------



## RaFDX

Quote:


> Originally Posted by *stephenn82*
> 
> Get GPU-z, pull the bios from the card. Post it here for @chris89 to tinker with, and he will send post you some back here


Oh, I have, but last time I dicked it up and bricked it. Currently I'm at 1175 MHz / 1700 MHz, but at +65 mV.


----------



## Cyberpunk2077

I work 12 hours a day, so...
Quote:


> Originally Posted by *chris89*
> 
> $229 390X 8GB is intriguing https://www.visiontek.com/refurbished-radeon-r9-390x-pcie-8gb-gddr5.html
> Send me your bios .rom buddy
> 
> 
> 
> 
> 
> 
> 
> ...zip it.. zip it good for attachment here bro


Here is my BIOS file: http://s9.dosya.tc/server2/6tb7mc/Cyberpunk2077_R9_390.rar.html
Good news: I have a liquid cooler with its radiator on the rear exhaust fan. I replaced the liquid cooler with a new CPU cooler and removed the radiator, and now my GPU temps in games like Prey and The Surge (ultra quality) are 66-76°C.


----------



## gapottberg

Ill just leave this here. Cheers.


----------



## chris89

Quote:


> Originally Posted by *Cyberpunk2077*
> 
> I'm work 12 hours a day so
> Here is my bios file : http://s9.dosya.tc/server2/6tb7mc/Cyberpunk2077_R9_390.rar.html
> Good news, i have a liquid cooler and radiotor on rear exhaust fan. Replaced liquid cooler with new cpu cooler and remove the radiator, now my gpu temps in games like a prey,the surge (ultra quality), 66-76 C


These are cooler than stock, yet faster...

72.5 Gpixel/s de-limited option
&
75 Gpixel/s de-limited option

Cyberpunk2077-1292mv-888mv-1133-1250.zip 99k .zip file


Cyberpunk2077-1348mv-888mv-1173-1250.zip 99k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *RaFDX*
> 
> Oh i have, but last time i dicked it up and bricked it. Currently i'm at 1175MHz//1700MHz but at +65mV


Huh, that's crazy.

What is stock voltage, btw? 1250 mV? That puts you at what, 1315 mV? I'll have to check mine, but I think I'm running 1175/1563 at 1293 mV core, 950 mV mem.


----------



## chris89

Quote:


> Originally Posted by *stephenn82*
> 
> huh, thats crazy
> 
> what is stock voltage, btw?? 1250mv? that puts you at what, 1315mv? I will have to check mine, but I think im running 1175/1563 at 1293mv core 950mv mem


Cool, I forgot I set you up at 1563 MHz memory at 950 mV.

Yeah, 1133 MHz on a 390 is roughly 1293 mV. 1133 MHz on a 390X is roughly 1333 mV, but I found 1288 mV works, though some have said it artifacts at high frame rates over 60 fps.

1173 MHz could work at no less than 1350 mV; likely 1363 mV at 1173 MHz on a 390.

Yes, stock is 1250 mV. Doing good, man, haha, you know your stuff haha

Fathom this: the Xbox Scorpio has a Polaris-based equivalent of the R9 390 at 1172 MHz, so 2.5x the FPS of the 390 at this clock, which is unreal.

The Xbox Scorpio is going to be like running 3-way R9 390s.


----------



## gapottberg

Because my last link didn't show properly when posting from mobile. Good info on where this community stands today.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> Cool I forgot I set you up at 1,563Mhz memory at 950mv.
> 
> Yeah for 1133Mhz for 390 is 1293mv roughly... 1133mhz on 390x is 1333mv roughly but I found 1288mv works but some have said it artifacts at high fps numbers over 60fps.
> 
> 1173Mhz could work at no less than 1350mv... Likely 390 @ 1363mv at 1173mhz.
> 
> Yes stock is 1250mv... Doing good man haha you know your stuff haha
> 
> Fathom this : Xbox Scorpio has a Polaris Based R9 390 at 1172Mhz... so 2.5X more fps than the 390 at this clock... which is unreal.
> 
> Xbox Scorpio is going to be like running 3-way R9 390s.


1175/1565 @ 950 mV would be dandy, IMO.

Coupled with the AIO, I'm able to push 1175/1700 @ +65 mV. That's why my Superposition scores are so high: FPS plus heat dissipation.

It's been cool in the NE since I've been back, and my max Overwatch temps have been 54-58°C.


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> 1175//1565 @ 950mV would be dandy, imo.
> 
> Coupled with the aio, I am able to push 1175//1700 @ +65mV. That's why my Superposition scores are so high; FPS + heat dissipation.
> 
> It's been cool in the NE since I've been back and my max OW temps have been 54-58c.


What's your score again? I hit 2,999 points at 1250 MHz core, 1250 MHz memory.

I'm PCIe 2.0 limited, so I could see 10-20 more FPS with PCIe 3.0 on, for instance, a Ryzen 7 1700... if I ever get one, that is...

Quote:


> Originally Posted by *gapottberg*
> 
> Because my last link didn't show properly when posting from mobile. Good info on where this community stands today.


I bet that 390 isn't running my overclocked BIOS. My overclocked BIOS is way faster than anything a software overclock could ever pull.


----------



## RaFDX

2820 here. I'm willing to try a BIOS again now that I know how to unbrick the GPU ;-)


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> 2820 here. I am willing to try a BIOS again now that i know how to unbrick the gpu ;-)


 1357mv-950mv-1173mhz-1563mhz.zip 99k .zip file


1357mv-888mv-1173mhz-1250mhz.zip 99k .zip file


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> 1357mv-950mv-1173mhz-1563mhz.zip 99k .zip file
> 
> 
> 1357mv-888mv-1173mhz-1250mhz.zip 99k .zip file


Here is my original score



Score using 1357mv-950-1173mhz-1563mhz bios


Attached are my GPU-Z logs. They are of two games of Overwatch at Epic setting and two superposition runs

1135restart.txt 864k .txt file


Within 42 points!







If I understand your file naming scheme, I'm using 407 mV less than my baseline OC?? What's next?


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> Here is my original score
> 
> 
> 
> Score using 1357mv-950-1173mhz-1563mhz bios
> 
> 
> Attached are my GPU-Z logs. They are of two games of Overwatch at Epic setting and two superposition runs
> 
> 1135restart.txt 864k .txt file
> 
> 
> Within 42 points!
> 
> 
> 
> 
> 
> 
> 
> If i understand your file naming scheme I am using 407mV less than my baseline OC?? What's next?


 1376mv-950mv-1188mhz-1563mhz.zip 99k .zip file


1383mv-950mv-1193mhz-1563mhz.zip 99k .zip file


1386mv-950mv-1196mhz-1563mhz.zip 99k .zip file


1400mv-950mv-1200mhz-1563mhz.zip 99k .zip file


These are in-between voltages. You could likely pass the benchmark at 1250 MHz at 1400 mV if you tried it? I'd be very interested in the result. 1400 mV will likely artifact a lot at 1250 MHz, but it gives the highest possible score. You're on water anyway, right?

The 390X is:

1133 MHz at 1333 mV
1166 MHz at 1366 mV
1188 MHz at 1388 mV
1200 MHz at 1400 mV
1250 MHz at something like 1400-1435 mV (artifacts, but nearly 3,000 points on PCIe 2.0)

PCIe 2.0 on my system seems to top out around 2,999 points. It's weird: it's like 2,997... 2,998... 2,999... like I need to hook up a beast PSU to score past 3,000 points. I'll try it one of these days...
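Worth noting: the first four 390X voltage/clock pairs in that list all share the same offset, roughly core mV = core MHz + 200. That's just a pattern in this poster's numbers, not a general rule; every chip's stable voltage differs, so treat it as a starting guess only. A sketch of the observation:

```python
# The four exact 390X voltage/clock pairs quoted above share a fixed offset:
# core millivolts = core MHz + 200. This merely restates the poster's table;
# your card's stable voltage at a given clock may be higher or lower.
pairs = {1133: 1333, 1166: 1366, 1188: 1388, 1200: 1400}

def suggested_mv(core_mhz: int) -> int:
    """Starting-point core voltage (mV) per the offset seen in the table."""
    return core_mhz + 200

for mhz, mv in pairs.items():
    assert suggested_mv(mhz) == mv
print("offset rule matches all quoted pairs")
```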


----------



## RaFDX

Getting such a high score from a PCIe 2.0 slot is very impressive. Cheers. Which mobo are you running?

Yes, I'm on water. Question(s):

I thought I was already at 1320 mV with my original OC?

Are the memory clocks locked at 1563 for a reason I'm unfamiliar with?

What if I stop myself at 1175 core but go up in memory to 1725 or 1750?


----------



## chris89

Memory won't help as much as core clock; raising the memory clock increases power consumption and heat more than it yields FPS.

Though getting 400 gigabytes per second of memory bandwidth with a 50 mV reduction in memory voltage is a lot of bandwidth for the power.

It's all about getting the core near 1250 MHz.

I'm sure your scores should be near mine on my PCIe 2.0 dual-Xeon Dell Precision T7500, with 24 threads of Xeon power and 48 gigabytes of hexa-channel RAM.
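That 400 GB/s figure follows directly from Hawaii's 512-bit bus: GDDR5 transfers 4 bits per pin per memory clock, so bandwidth = (bus width / 8) x clock x 4. A quick sketch:

```python
# GDDR5 moves 4 bits per pin per memory clock (quad data rate), so on
# Hawaii's 512-bit bus: bandwidth [GB/s] = (512 / 8) bytes * clock [GHz] * 4.
BUS_BITS = 512
GDDR5_TRANSFERS_PER_CLOCK = 4

def bandwidth_gb_s(mem_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s for a given memory clock in MHz."""
    return BUS_BITS / 8 * mem_clock_mhz / 1000 * GDDR5_TRANSFERS_PER_CLOCK

print(bandwidth_gb_s(1500))  # stock R9 390 memory clock: 384 GB/s
print(bandwidth_gb_s(1563))  # the ~400 GB/s mentioned above
```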


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> Memory won't help as much as core clock... memory clock will increase power consumption and heat more than fps yield...
> 
> Though for a 50mv reduction in memory voltage to yield 400 Gigabytes per second memory is a lot of bandwidth.
> 
> It's all about getting the core near 1,250Mhz...
> 
> I'm sure you scores should near my PCIe 2.0 Dual Xeon Dell Precision T7500 system with 24 threads of xeon power and 48 gigabytes of Hexa-Channel ram.


OK, I see what you're saying about the bandwidth. My card sees diminishing returns past 1700 MHz anyway. I don't think my card can hit 1200 MHz on the core; 1175 is the fastest I've gone so far. I'll load the 1188 MHz BIOS and try Superposition in the morning. Thoughts? Thanks for your help today, btw.


----------



## rdr09

@gapottberg, I only watched it once. The 480 is impressive. How much faster is the 580?


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> OK I see what you're saying with the bandwidth. My card receives diminishing returns past 1700MHz anyway. IDT my card can hit 1200MHz on the core. 1175 is the fastest i've been so far. I'll loaded the 1188MHz clock and try superposition in the morning. Thoughts? Thanks for your help today btw


1188 MHz is a breeze, and 1200 MHz as well, but above 1200 MHz is an issue for me, though the card can do 1250 MHz.

If it's watercooled it should be fine. I've seen people hit 1200 MHz on water at 65°C core; it's the core VRM that needs to be watched, is all.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> 1188Mhz is a breeze and 1200mhz as well but above 1200mhz is issue for me but it can do 1,250mhz.
> 
> If it's watercooled should be fine, I've seen people hit 1200mhz on water at 65C core... its the core vrm that should be watched is all.


Skipped the 1188 and went straight to the 1200. Here's what I got

*Base GPU Z*




*Superposition Score*


*GPU Z post Superposition with "Max" values shown*


I think 1200 is the max I want to go. Is that VRM1 temp of 93°C OK, given I was intentionally pushing the card to run hard? Not sure what to do next in terms of core, memory, voltage, and heat. This has been a lot of fun, thanks again!


----------



## bluej511

Quote:


> Originally Posted by *RaFDX*
> 
> Skipped the 1188 and went straight to the 1200. Here's what I got
> 
> *Base GPU Z*
> 
> 
> 
> 
> *Superposition Score*
> 
> 
> *GPU Z post Superposition with "Max" values shown*
> 
> 
> I think 1200 is the max I want to go. Is that VRM temp 1 of 93 ok since I was intentionally pushing the card to run hard? Not sure what to do next in terms of core, memory, voltage and heat? This has been a lot of fun, thanks again!


Quite hot at 93°C if you ask me. I'm on water and my VRMs are passively cooled (I'm using the Alphacool GPX waterblock, which doesn't run water over the VRMs, just giant aluminum fins), and I get around 50-60°C on both VRMs under load. My voltage peaks at 1.254 V as well; not sure what yours is set to. The cooler the VRM runs, the cleaner the power delivery will be, and you get longevity and increased performance.


----------



## RaFDX

I kept the original VRM heatsink post-AIO mod because I thought it would help with the cooling. I might be incorrect in that assumption. I've never seen/noticed the VRM that hot before.

Can you send me a picture of your card?


----------



## bluej511

Quote:


> Originally Posted by *RaFDX*
> 
> I kept the original VRM heatsink poat AIO mod cause I thought that would help with the cooling. I might be incorrect in that assumption. I've never seen/noticed the VRM that hot before.
> 
> Can you send me a picture of your card?


This is what my r9 390 runs.

http://www.aquatuning.fr/refroidissement-par-eau/gpu-blocs/gpu-refroidisseur-complet/19917/alphacool-nexxxos-gpx-ati-r9-390-m01-mit-backplate-schwarz?sPartner=googleshoppingfr&gclid=CIKJt6vdktQCFcoW0wodYDMPoA


----------



## chris89

*@RaFDX*

1316mv-933mv-1173mhz-1563mhz.zip 99k .zip file


----------



## RaFDX

The two fans I have now are blowing air onto the VRM; should I be pulling air off instead? Also, I have heatsinks on what I believe is VRM1, according to this picture. Not sure why they hit 93°C; normally it's a 20-25°C jump from the base temp.










This is what my card looks like with one of the noctua fans removed


----------



## Firionhope

What do you guys recommend as a benchmark? My MSI 390X was reaching 95°C at times, which I believe is when it starts downclocking itself. I reapplied the thermal paste (they did a terrible job, barely any paste) and it seems to have improved a lot, but I want to stress test it to see just how much. I might also try undervolting or something to reduce temps further, so it'd be good to have a go-to benchmark to test with if I do.


----------



## chris89

I can make it run cooler, but the thermal pads on your core VRM are totally gone, so you need to redo the VRM thermals.

My card's VRM hits the same temp as the core no matter what voltage/clock; well, it's cooler at low voltage, i.e. near stock, but around 80°C at 1250 MHz and 1425 mV.

That's at 60% fan. At 100% it's way cooler, but my card has no DVI cutout to help it cool down more. Stock cards are physically incapable of cool temps without a full waterblock.

Pull the heatsink assembly off and reapply all the thermal pads, taking extra special care with the line of core VRMs. Roll the pad material into a ball and place it on each VRM individually; once compressed, temps will be 70°C rather than 90°C.

You've also got to dial back the memory voltage and clock to cool the core VRM to 70°C.


----------



## RaFDX

Is that picture correct? VRM1 is on the right, VRM2 on the left?


----------



## chris89

We just call them the core VRM or the memory VRM: the bank with more VRM phases is the core; the spot with 3 phases is the memory.

The hotter VRM is always the core VRM.


----------



## RaFDX

I have heatsinks on all of those. Maybe I should reorient the fan and take some measurements; if that doesn't help, remove the heatsinks.

The 93°C is not an everyday thing; my max during gaming was 70°C.


----------



## componentgirl90

Hi guys,

I wanted to post to update on my issue with XFX 390x DD Edition.

Brief description of my problem: Fans will randomly go to 100% and it is not possible to control the fans.

After installing much older drivers, the problem has improved a lot but still remains. I am now 80% certain it is a software issue which will be resolved by reinstalling Windows. I believe it was caused by the installation of MSI afterburner (although I never overclocked my card). Many other AMD GPU products (most notably perhaps the RX 480) have problems with fans seemingly randomly going to 100%. Card owners with this problem seem to always have installed Afterburner.

I will post here once I have reinstalled windows with a full description of the issue and the resolution etc.

p.s. Many thanks to Chris who helped a lot with parts of this issue.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Hi guys,
> 
> I wanted to post to update on my issue with XFX 390x DD Edition.
> 
> Brief description of my problem: Fans will randomly go to 100% and it is not possible to control the fans.
> 
> After installing much older drivers, the problem has improved a lot but still remains. I am now 80% certain it is a software issue which will be resolved by reinstalling Windows. I believe it was caused by the installation of MSI afterburner (although I never overclocked my card). Many other AMD GPU products (most notably perhaps the RX 480) have problems with fans seemingly randomly going to 100%. Card owners with this problem seem to always have installed Afterburner.
> 
> Since what happens in a "Black Screen" is it goes black yet fan is basically at the lowest speed or even off it seems. GPU goes under a full-load scenario with no fan speed. Then reaches the pre-defined unchangeable full speed max temp value and fan goes full speed. Usually Max Temp in Fan Speed Section.
> 
> I will post here once I have reinstalled windows with a full description of the issue and the resolution etc.
> 
> p.s. Many thanks to Chris who helped a lot with parts of this issue.


Sure thing. My RX 480 and 390X don't do this, but my original 390X did do it: if it black-screened, the fan would eventually go full speed, accompanied by the smell of burning capacitors. That was from when I was testing the ideal memory voltage for a given clock. A black screen is the ASIC overheating and protecting itself from failure, yet it doesn't shut down, so damage occurs if it black-screens and you don't reset the computer immediately upon seeing it.

I find that setting higher-sooner fan speeds at lower temperature prevents ASIC overheat and issue does not occur.

I am certain it's a physical hardware issue as the memory VRM 110% must be cooled actively. Must require a heatsink on them to prevent this.

So basically a new bios would be in order. It isn't a software issue at all it's physical, though can be band-aid fixed with bios testing until it is resolved.

*Must know in what state the computer was in and for how long had it been turned on?

Not to mention if it was in use or after long-idle?*

The BIOS will band-aid fix it but cannot figure one bios is to fix-all. It requires continuous testing until the ideal settings are found.

I need to know what fan % is too loud and what fan % is just about right for you. What's the max fan % you want to see?

My 390X is a bit loud near 70%, but 61% is quite nice, so I have my profile set up for that acoustic level.

XFX_390X_DD_FAN_FIX.zip 99k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Sure thing. My RX 480 and 390X don't do this, but my original 390X did: if it black screened it would eventually go to full fan speed, accompanied by the smell of burning capacitors. That was from when I was testing the ideal memory voltage for a given clock. A black screen is the ASIC overheating and entering failure prevention without actually shutting down, so damage can occur if you hit a black screen and don't reset the computer immediately.
> 
> I find that setting fan speeds to ramp higher and sooner at lower temperatures prevents ASIC overheat, and the issue does not occur.
> 
> I am certain it's a physical hardware issue: the memory VRM absolutely must be cooled actively. It needs a heatsink to prevent this.
> 
> So basically a new BIOS would be in order. It isn't a software issue at all, it's physical, though it can be band-aid fixed with BIOS testing until it's properly resolved.
> 
> *I need to know what state the computer was in and how long it had been turned on.
> 
> Also, was it in use, or coming out of a long idle?*
> 
> A BIOS can band-aid fix it, but no single BIOS is a fix-all. It requires continuous testing until the ideal settings are found.
> 
> I need to know what fan % is too loud and what fan % is just about right for you. What's the max fan % you want to see?
> 
> My 390X is a bit loud near 70%, but 61% is quite nice, so I have my profile set up for that acoustic level.
> 
> XFX_390X_DD_FAN_FIX.zip 99k .zip file


Are you saying you think my card has a physical problem or yours (or both)?

I find this a very difficult problem as I am not an expert in any way.

Two people can have their fans at 100% but with different causes. Honestly, right now I think we both had different causes with the same symptom (100% fan).

When this issue occurs the card's main temp sensor reads normal, and the VRM temps are normal. I guess it's possible parts of the VRM could be overheating away from the sensor. Additionally, the problem has occurred both immediately after booting and a long time after switching on.

There seems to be a link with video playback as well. One time recently I was watching a stream while the issue was happening; I closed the stream and the fan speed instantly dropped to normal. This seems in keeping with the software issues people are experiencing with the RX 480.

BTW I'm OK with 100% fan, just not randomly when I'm trying to concentrate on work; noise doesn't bother me whilst gaming or I would have gone for a 970 or 980

https://community.amd.com/thread/203282


----------



## chris89

What I would do is pull the card and clean the PCIe x16 edge-connector pins on the card with isopropyl alcohol...

Then blow out the slot and use a flashlight to see if you see anything at all in there... Blow it all out.

Are you still on Original BIOS?

The BIOS I uploaded I feel could fix the issue...

Or, if all else fails, uninstall all of the drivers with DDU, then install only the driver .inf from Device Manager.

Right-click the display adapter and update manually... Have Disk, browse to the driver folder, then to Packages, then Display, click the *.inf* alone and install. That has fixed some issues, as something in the driver suite may be causing it as well.

It could also be the UVD states... I'll thoroughly go over your BIOS and compare my UVD states to yours. You say it happens particularly on video, so that's UVD related, or one of a few other BIOS-specific causes.

Your problem is physical and BIOS related. Physical in that XFX never cooled the VRM; they didn't do it, so it will always run hot from the day it left the factory.

We can band-aid fix it by modding the BIOS for sure, and I'm convinced it's down to the lack of memory VRM cooling plus the BIOS. Only those two things could it be, and nothing else except a dirty slot.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> What I would do is pull the card and clean the pins with isopropyl alcohol of the PCIe x16 part of the card...
> 
> Then blow out the slot and use a flashlight to see if you see anything at all in there... Blow it all out.
> 
> Are you still on Original BIOS?
> 
> The BIOS I uploaded I feel could fix the issue...
> 
> Or If all else fails. DDU Uninstall all of the drivers. Then Only Install the Driver .inf from DEVICE MANAGER.
> 
> Right Click Display Adapter and Update Manually... Have Disk browse to the Driver Folder... To Packages to Display and click the *.inf* alone and install... That has fixed some issues as maybe something in the driver suite is causing it as well.
> 
> It could also be the UVD states... I'll thoroughly go over your BIOS and compare my UVD states to yours. You say only on VIDEO particularly so that's UVD related or a few other BIOS SPECIFIC reasons.
> 
> Your Problem is physical and BIOS related. Physical by means of XFX never Cooled the VRM. They didn't do it so they will always run hot from the day it left the manufacturer.
> 
> We can BAND-AID fix it with modding the BIOS for sure and I'm 1,000,000 % it's related the LACK OF MEMORY VRM Cooling & BIOS.... Only those 2 things could it be and nothing else except dirty slot....


OK, I will run through the easiest things step by step before doing anything that requires a lot of disassembly. If the problem appears fixed at any stage, I will stop. What is the best board partner? I have heard EVGA is good.

BTW, I used compressed air on the slot at one point. Unfortunately I am not able to easily test my mobo's other slot, as the card is really difficult to fit into that slot with my case.


----------



## chris89

The best partner is not a partner at all. It's the Supplier. *AMD*.

The reference heatsink design AMD offers is actually superior to most board partners' coolers. I found this out the hard way.

Yeah, I highly recommend checking the slot with a flashlight; on many occasions I have found dust in memory and PCIe slots causing issues. Not to mention dirty PCIe x16 pins on the card itself: even a fingerprint can disrupt contact.

However, I am completely sure it's the memory modules' VRM; that's it right there. If not addressed physically, the BIOS must be modded to work around it. I would need the card in hand to dial it in, but I would tear it down and fix the memory VRM cooling first myself.

So in the meantime I'll compare your UVD (video engine) power/voltage states and see what the difference is.

Hope I helped a little, cutie.


----------



## PunkX 1

So people are pushing 1.4v through the R9 390?


----------



## RaFDX

Anyone else experiencing flickering when coming out of a full-screen game at 144Hz? &^%($#@ annoying


----------



## PunkX 1

Quote:


> Originally Posted by *RaFDX*
> 
> Anyone else experiencing flickering when coming out of a full-screen game at 144Hz? &^%($#@ annoying


Only when overclocked, right?


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> Anyone else experiencing flickering when coming out of a full-screen game at 144Hz? &^%($#@ annoying


https://www.newegg.com/Product/Product.aspx?Item=N82E16812400859

The HDMI port isn't all that great and can't handle much. This adapter, though, converts DP to HDMI 2.0 for something like 200Hz at 1080p, as well as 4K 60Hz.

Flickering occurs when the connection doesn't have the bandwidth, or when the resistance of the cable/adapter is too great.

The StarTech adapter is very short, giving the least resistance, for flicker-free operation at very high refresh rates and resolutions.

It could also be that your HDMI cable is aging or too long, i.e. more than 4 feet. For high refresh rates on this GPU, the shorter the better, so the wires inside the cable have the least resistance. As the wires get longer, resistance rises, which can cause flickering when signal levels droop just below optimal.


----------



## RaFDX

@PunkX 1 no I reverted back to stock bios/speeds to do VRM heat testing.

@chris89 The issue sometimes happens when I switch from a full-screen game to a game I run in windowed mode. I am currently using DP to DP. Which end would this adapter go on? Do I buy two, one for each end?


----------



## stephenn82

Quote:


> Originally Posted by *PunkX 1*
> 
> So people are pushing 1.4v through the R9 390?


Max voltage I ran was 1.393V, at 1175MHz. Max temp it ever hit was 71°C.


----------



## PunkX 1

Quote:


> Originally Posted by *stephenn82*
> 
> Max voltage I ran was 1.393V, at 1175MHz. Max temp it ever hit was 71°C.


What was the voltage you got under load?


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> @PunkX 1 no I reverted back to stock bios/speeds to do VRM heat testing.
> 
> @chris89 The issue sometime happens when I switched from a full screen game to a game I ran in windows mode. I am currently using DP to DP. Which end would this adapter go? Do I buy two, one for each end?


Yeah, get an HDMI 2.0-spec cable (or 2.1) plus that adapter to get the full resolutions/refresh rates... for a direct connection you'd need a high-spec DisplayPort-to-DisplayPort cable...


----------



## bluej511

Quote:


> Originally Posted by *chris89*
> 
> Yeah get an hdmi 2.0 spec cable or like 2.1 or something and that adapter to get full res/ refresh rates... need a special high spec displayport to display cable for direct connection...


It's a digital signal, so the cable mostly doesn't matter. Pretty much any high-speed cable works and shouldn't be an issue; a company labelling its cable "2.0/2.1" is largely a gimmick.

For anything above 60-70Hz, DisplayPort is the best to use, period, and pretty much all 1.2/1.3 cables have more than enough bandwidth for 1080p/144Hz. The flicker could be a cheap cable, a bad connection at the back (some cases press on cables and prevent a good full insertion), or bad DP/HDMI ports on the GPU.

When I first got my FreeSync display and first DP cable, FreeSync would not work; I'd get artifacts and flicker like crazy. Switched DP ports and problem solved. Occasional flicker when exiting a game may be down to your monitor and/or cable and/or card. I sometimes get flicker during boot or just randomly; it happens, and it's not often.
Quote:


> Originally Posted by *RaFDX*
> 
> Anyone else experiencing flickering when coming out of a full-screen game at 144Hz? &^%($#@ annoying


Read above, it's for you but I forgot to quote you haha.
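To put rough numbers on the bandwidth point above, here is a quick sketch. The ~15% blanking overhead and 24 bpp colour depth are assumptions; 17.28 Gbit/s is the standard effective payload rate of a DP 1.2 HBR2 link after 8b/10b encoding.

```python
# Rough check: does a DisplayPort 1.2 (HBR2) link carry 1080p at 144 Hz?

def required_gbps(h_pixels, v_pixels, refresh_hz, bpp=24, blanking=1.15):
    """Approximate video data rate in Gbit/s, including blanking overhead."""
    return h_pixels * v_pixels * refresh_hz * bpp * blanking / 1e9

DP12_EFFECTIVE_GBPS = 17.28  # 21.6 Gbit/s raw HBR2 minus 8b/10b encoding overhead

need = required_gbps(1920, 1080, 144)
print(f"1080p144 needs ~{need:.1f} Gbit/s; DP 1.2 offers {DP12_EFFECTIVE_GBPS}")
```

Even with generous overhead assumptions, a DP 1.2 link has roughly twice the bandwidth 1080p/144Hz needs, which supports the "cheap cable or bad connector, not the spec" diagnosis.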


----------



## RaFDX

I have a ViewSonic XG2401 connected via DisplayPort. I can try swapping out cables. I've read it might be a software thing caused by Crimson.


----------



## chris89

@RaFDX

1333mv-888mv-1188Mhz-1250Mhz.zip 99k .zip file


1340mv-888mv-1188Mhz-1250Mhz.zip 99k .zip file


1344mv-888mv-1188Mhz-1250Mhz.zip 99k .zip file


1348mv-888mv-1188Mhz-1250Mhz.zip 99k .zip file


----------



## RaFDX

@chris89 what are the profiles for?


----------



## chris89

Artifact and temperature testing for VRM1, to keep the VRM temperature low.


----------



## RaFDX

Thanks! I'll try them out tonight. It's tough to pinpoint because it's sporadic. Do you have a theory why the flickering is occurring? I am going to buy a new cable tonight to see if the cable is dying.

Also, at stock, the VRM max temp is 70°C. So I think I need to either reverse the fans to exhaust heat, or remove and remount the heatsinks on the VRMs (it might be the glue holding them on that's insulating heat)?


----------



## kbft

@chris89 I'd be very happy if you could mod my bios as well!








Couldn't upload it here directly, so here's the link: https://www.sendspace.com/file/2ysyw6


----------



## stephenn82

Quote:


> Originally Posted by *PunkX 1*
> 
> What was the voltage you got under load?


1393


----------



## stephenn82

Quote:


> Originally Posted by *bluej511*
> 
> It's a digital signal, so the cable mostly doesn't matter. Pretty much any high-speed cable works and shouldn't be an issue; a company labelling its cable "2.0/2.1" is largely a gimmick.
> 
> For anything above 60-70Hz, DisplayPort is the best to use, period, and pretty much all 1.2/1.3 cables have more than enough bandwidth for 1080p/144Hz. The flicker could be a cheap cable, a bad connection at the back (some cases press on cables and prevent a good full insertion), or bad DP/HDMI ports on the GPU.
> 
> When I first got my FreeSync display and first DP cable, FreeSync would not work; I'd get artifacts and flicker like crazy. Switched DP ports and problem solved. Occasional flicker when exiting a game may be down to your monitor and/or cable and/or card. I sometimes get flicker during boot or just randomly; it happens, and it's not often.
> Read above, it's for you but I forgot to quote you haha.


Concur.

Make sure it has a quality gold-plated connector and you're good to go. Monoprice has some good-quality cables for what you pay.

Some science for you:


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> Thanks! Ill try them out tonight. It's tough to pinpoint cause it's sporadic. Do you have a theory why the flicking is occuring? I am going to buy a new cable tonight to see if it is the cable dying.
> 
> Also, at stock, the vrm max temp is 70. So I think I need to either rotate fans to remove heat or remove the heatsinks on the vrms (it might be the glue holding the vrms on that's insulating heat)?


Yeah, the BIOS is set to keep the fans higher, with much cooler temperature targets. Plus, 888mV memory at 1250MHz means roughly 40% less VRM heat, so maybe 59°C under load. The BIOS also has the power limit removed for much higher 4K FPS, unless your core VRM solution can barely hold even the stock 200W limit. You'd need to take it apart and redo the thermal pads ASAP before the card goes bad very soon.
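For what it's worth, the "roughly 40% less" figure is consistent with a first-order CMOS estimate where dynamic power scales as frequency × voltage². A quick sketch, using the 1050mV/1500MHz stock figures mentioned elsewhere in the thread; treat the model as an approximation, since real VRM losses include static and switching terms too:

```python
# First-order estimate of dynamic power scaling: P ~ f * V^2.

def relative_power(f_mhz, v_volts, f_ref_mhz, v_ref_volts):
    """Power of (f, V) relative to a reference operating point (f_ref, V_ref)."""
    return (f_mhz / f_ref_mhz) * (v_volts / v_ref_volts) ** 2

# Modded 888 mV @ 1250 MHz vs stock-ish 1050 mV @ 1500 MHz
ratio = relative_power(1250, 0.888, 1500, 1.050)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic memory power")  # ~40% less
```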
Quote:


> Originally Posted by *kbft*
> 
> @chris89 I'd be very happy if you could mod my bios as well!
> 
> 
> 
> 
> 
> 
> 
> 
> couldnt upload it here directly so here's the link: https://www.sendspace.com/file/2ysyw6


 1296mv-895mv-1133Mhz-1250Mhz.zip 101k .zip file


----------



## RaFDX

@chris89 Thanks for taking the time to create the BIOSes. I only had time to test one tonight, the 1333mV / 888mV 1188/1250 one, and VRM1 hit 97°C while I was playing EVE Online. Tomorrow I am going to reverse the fan direction to exhaust and reload the stock BIOS, then run Superposition twice at stock and record values. My expectation is that VRM1 should be near core temp. Thoughts?


----------



## chris89

*Well, it's clear... you don't even have a core VRM heatsink at all, bro...*

You're welcome...

The flickering could be the *cable*, the *display*, or the *video card*, i.e. _no VRM contact_... _the card is going out, and fast._

Try another video card and see if the flickering is gone...

A 97°C VRM means you have no contact, or the heatsink is so tiny it can't handle anything beyond the stock power limit; one of the two...

You do not want to use the card with *no VRM contact*... so work on it before using it at all, or it's going to fail soon.


----------



## RaFDX

I am 95% sure I have the following https://www.amazon.com/dp/B01LXWK626/ref=cm_sw_r_cp_apa_1PmmzbVNDWVZ1 on the VRM1 location, i.e. what that previous picture identified as VRM1.

Looks like I am going to be taking it apart more than I originally expected. Either way, I'll take lots of pictures tomorrow. Thanks again.


----------



## stephenn82

Would anyone recommend this on the VRM and memory chips over the stock pads for the MSI 390?

https://www.amazon.com/Thermal-Grizzly-Minus-High-Performance/dp/B00ZJSC3JE/ref=sr_1_5?srs=12726538011&ie=UTF8&qid=1496371895&sr=8-5

or go with some Fujipoly SARCON XR-m


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> I am 95% sure I have the following https://www.amazon.com/dp/B01LXWK626/ref=cm_sw_r_cp_apa_1PmmzbVNDWVZ1 on the VRM 1 location. What that previous picture identified as VRM 1
> 
> Looks like I am going to taking it apart more that I originally expected. Either way I'll take lots of pictures tomorrow. Thanks again


Right on. I can see clearly that there is not one single heatsink covering any of your core VRM(s).


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> Right on. I can see clearly that there is not one single heatsink covering any of your core VRM(s).


Damn, @chris89, did you play baseball growing up? Good eye!


----------



## chris89

https://www.amazon.com/Gelid-Solutions-Vision-Enhancement-R290/dp/B00K73F60E/ref=sr_1_1?s=electronics&ie=UTF8&qid=1496373081&sr=1-1&keywords=gelid+290x


----------



## matthew87

Hoping someone here can shed some light on what's going on with my Sapphire 390. When I first got the card it had no issues running at 1140/1675MHz. I didn't run the card this overclocked day to day, but it was 100% stable and solid at these speeds.

But now if I so much as touch the memory speed, the screen goes blank and the PC hard locks. Even bumping to 1550 means needing a hard restart. I also noticed that in Sapphire's TriXX software, when I change the core frequency it just reverts itself after 10 seconds or so. Using MSI Afterburner I can still overclock the core, but again, touching memory results in instant hard locks. WattMan also hard locks if I touch the memory frequency.

I've even built a new Ryzen PC since this issue first appeared, and it has carried across: the same memory-overclock freezing now occurs on two different PCs with the same card. But I can't see how it's hardware...


----------



## chris89

Quote:


> Originally Posted by *matthew87*
> 
> Hoping someone here can shed some light on what's going on with my Sapphire 390. When I first got the card it had no issues running at 1140/1675mhz. I didn't run the card to overclocked on a day to day basis, but it was 100% stable and solid at these speeds.
> 
> But now if I so much as touch the memory speed the screen goes blank and PC hard locks. Even bumping to 1550 results in needing a hard restart. I also noticed in sapphires trix software when I change core frequency it just reverts itself after 10 seconds or so. Using msi afterburner i seem to still be able to overclock the core, but again memory results in instant hard locks. Watt man also hard locks if I touch memory frequency.
> 
> Even built a new Ryzen PC since this issue first appeared and it's come across. Same issue with memory freezing now occurring on two different PC's with the same card. But I can't see how it's hardware...


Memory speed isn't the answer to more performance on these cards. The thermals created by pushing 512-bit GDDR5 are extremely high: as you raise memory clocks, heat rises with them, and much of the extra electrical power needed to feed such high-speed RAM is simply lost as heat.

So basically, send me your BIOS *.rom* here in *.zip* format, as an *attachment*, and let's get it running solid, bud.

I'd be interested to see Ryzen benchmarks on the 390!









GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file


atiflash_274.zip 1214k .zip file


----------



## bluej511

Quote:


> Originally Posted by *chris89*
> 
> Memory speeds aren't the answer to more speed on these cards. The thermals created by increasing 512 bit GDDR5 is extremely high. As you raise memory clocks and speed, comes heat. Which most electrical power needed to feed such high speed ram is lost by thermal expansion of capacitors.
> 
> So basically send me your bios *.rom* here in *.zip* format... As *attachment* and let's get it running solid bud.
> 
> I'd be interested to see Ryzen benchmarks on the 390!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-Z_ASUS_ROG_1.12.0.zip 2079k .zip file
> 
> 
> atiflash_274.zip 1214k .zip file


I have some.

http://www.3dmark.com/compare/fs/12653022/fs/11871887#

Pretty much the same as my 4690k, however, the physics score is out of this world.


----------



## matthew87

Quote:


> Originally Posted by *chris89*
> 
> So basically send me your bios *.rom* here in *.zip* format... As *attachment* and let's get it running solid bud.
> 
> I'd be interested to see Ryzen benchmarks on the 390!


As am I, genuinely appreciate the help

Attached is the default ROM

Hawaii.zip 42k .zip file


Still, I find it damn odd that I could overclock the memory, regardless of benefit, without drama for over 12 months and then bam! Suddenly it's broken, and even a 50MHz overclock results in a hard lock.


----------



## RaFDX

Quote:


> Originally Posted by *chris89*
> 
> Right on. I can see clearly that there is not one single heatsink covering any of your core VRM(s).


I am 100% sure I have heatsinks on the 6 thingies to the left of VRM1. Hahahahahahahaha, loved the zoomed-in circles. I have ordered thermal pads and have spare heatsinks ready to go. I will also rotate the fans to exhaust to assist with heat dissipation.

Thanks again for the circles!


----------



## kbft

Quote:


> Originally Posted by *chris89*
> 
> 1296mv-895mv-1133Mhz-1250Mhz.zip 101k .zip file


Thanks a lot chris! This really does run faster and is super stable as well. Just what I was hoping for! However, the fans get kind of noisy for me; they jump up to 100% at 65°C GPU temp. Before, they were running at 50% at 75°C under full load, which is so much quieter. I did some fan % testing via TriXX with your BIOS and these are the results I got:

| Fan [%] | GPU [°C] | VRM [°C] |
|---------|----------|----------|
| 100 | 66 | 57 |
| 90 | 67 | 58 |
| 80 | 68 | 58 |
| 70 | 69 | 61 |
| 60 | 71 | 63 |
| 50 | 74 | 64 |

So apparently the temperature drop-off isn't too great at high fan speeds for me. Perhaps my airflow isn't good enough, I don't know.
Do you think it would make sense for me to have a fan curve from about 28% up to a max of about 80% at full load, with a rather smooth transition? Those fan speeds seem rather quiet to me. And would it be possible to implement that in your BIOS? Only if it's not too much effort, of course! Or is that a bad idea? =)
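The curve described here (28% floor, ~80% ceiling, smooth ramp) is just linear interpolation between two breakpoints. A sketch follows; the 45°C/78°C breakpoints are illustrative assumptions, not values from any of the posted BIOSes:

```python
# Linear fan curve: hold fan_min below t_min, fan_max above t_max,
# and interpolate linearly in between for a smooth transition.

def fan_percent(temp_c, t_min=45.0, t_max=78.0, fan_min=28.0, fan_max=80.0):
    if temp_c <= t_min:
        return fan_min
    if temp_c >= t_max:
        return fan_max
    frac = (temp_c - t_min) / (t_max - t_min)
    return fan_min + frac * (fan_max - fan_min)

for t in (40, 55, 65, 78):
    print(f"{t} C -> {fan_percent(t):.0f}% fan")
```

Whether the real BIOS fan table can hold this exact shape depends on how many points it exposes, so treat this as the target curve rather than a guaranteed setting.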


----------



## chris89

Quote:


> Originally Posted by *kbft*
> 
> Thanks a lot chris! This really does run faster and is super stable as well. Just what I was hoping for! However, the fans get kind of noisy for me; they jump up to 100% at 65°C GPU temp. Before, they were running at 50% at 75°C under full load, which is so much quieter. I did some fan % testing via TriXX with your BIOS and these are the results I got:
> 
> | Fan [%] | GPU [°C] | VRM [°C] |
> |---------|----------|----------|
> | 100 | 66 | 57 |
> | 90 | 67 | 58 |
> | 80 | 68 | 58 |
> | 70 | 69 | 61 |
> | 60 | 71 | 63 |
> | 50 | 74 | 64 |
> 
> So apparently the temperature drop-off isn't too great at high fan speeds for me. Perhaps my airflow isn't good enough, I don't know.
> Do you think it would make sense for me to have a fan curve from about 28% up to a max of about 80% at full load, with a rather smooth transition? Those fan speeds seem rather quiet to me. And would it be possible to implement that in your BIOS? Only if it's not too much effort, of course! Or is that a bad idea? =)


You're welcome, bud. This BIOS is quieter. Try it, bro.









1296mv-895mv-1133Mhz-1250Mhz-FanTune.zip 101k .zip file


----------



## Lutore

Hi!
Can someone help me? I'm ready to give up on this card soon...
Getting black screens and Windows hangs constantly. I thought it was my PSU, but I upgraded to an 850W unit and still get the same problems.
Bought the card from a friend and it ran great for 6 months until last Saturday. I have tried different clock/mem speeds and ROMs, but no luck. Sometimes it's just stuck at 300/150. It's a Sapphire Radeon R9 390 Nitro 8GB card.
Currently on Crimson 17.1.1. Can someone take a look at the original ROM I extracted before I started flashing?

Hawaii_original.zip 101k .zip file


----------



## RaFDX

Quote:


> Originally Posted by *Lutore*
> 
> Hi!
> Can someone help me? I'm ready to give up on this card soon...
> Getting black screens and windows hangs constantly. Thought it was my PSU but I upgraded to a 850W and still got the same problems.
> Bought the card from a friend and it has been running great for 6 months until last saturday. I have tried some different clock/mem speeds and ROMs but no luck. Sometimes its just stuck at 300/150.. It's a Sapphire Radeon R9 390 Nitro 8gb card.
> Currently on Crimson 17.1.1. Can someone take a look at the original rom I extracted before I started to flash?
> 
> Hawaii_original.zip 101k .zip file


I would run DDU and reinstall the latest Crimson drivers. Don't use WattMan; it hasn't really worked with our card. Hopefully you haven't tried this yet.


----------



## Lutore

Quote:


> Originally Posted by *RaFDX*
> 
> I would run DDU and reinstall the latest crimson drivers. Don't use Wattman, it hasn't really worked with our card. Hopefully you havent tried this yet


I have tried a lot of different drivers, always using DDU in safe mode. I also just reinstalled Windows a few days ago.
I can't see anything in WattMan; the clock speeds are all N/A. It used to work. Afterburner still finds the card, though. Trying to mod my BIOS at the moment, but no luck so far. Latest driver, and I did not activate WattMan.


----------



## stephenn82

Quote:


> Originally Posted by *stephenn82*
> 
> 1393


I was wrong, @PunkX 1: the max voltage under load while running Superposition, according to HWiNFO, was 1.244V. Better than what I was expecting.


----------



## RaFDX

Have you looked at the board to see if there is any physical damage?


----------



## PunkX 1

I have my XFX 390 under custom liquid cooling, so can someone help me edit the BIOS to get the most out of my GPU?


----------



## kbft

Quote:


> Originally Posted by *chris89*
> 
> Your welcome bud. This BIOS is quieter. Try it bro.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1296mv-895mv-1133Mhz-1250Mhz-FanTune.zip 101k .zip file


This has been great so far. Fast, stable and super quiet. Many many thanks chris!!


----------



## chris89

Quote:


> Originally Posted by *kbft*
> 
> This has been great so far. Fast, stable and super quiet. Many many thanks chris!!


You're welcome. If you feel you have temperature headroom, this BIOS adds another 2.5 billion pixels & 6.24 billion texels per second... could be 10-15 FPS higher.

I'm not entirely certain about this voltage-to-clock pairing, so let me know; the fan profile remains the same, however.

1337mv-895mv-1172Mhz-1250Mhz-FanTune.zip 101k .zip file


----------



## eightyD

Hello, new member here.
Picked up a Strix 390X last year for $160 at a garage sale. Didn't know about the 3/5-heatpipe ordeal. Repasted with MX-4 and have comfortable gaming temps and fan speeds now. I was attempting some overclocking using Unigine Valley @ 1440p, as this is my native res. [email protected] was fine, so I decided to extend the OC range in GPUTweakII; I went to 1.3V but got artifacting at 1150MHz. While backing it down I noticed the VRMs at 101°C, so I reverted to stock voltage. A friend told me it was good to 125°C, but I am wary of triple-digit temps on any component. I have not attempted any memory OC because I was told that creates more heat as well, and I have plenty of that already. My 2nd profile is [email protected] | 1400 memory and very quiet 60Hz gaming.
HWiNFO/Valley screenshot:
http://i.imgur.com/QUPVUZd.jpg


----------



## kbft

Quote:


> Originally Posted by *chris89*
> 
> Your welcome. If you feel you have headroom temperature wise. This bios increases another 2.5 Billion pixel's & 6.24 Billions texel's per second... Could be 10-15fps higher.
> 
> I'm not entirely certain on this voltage to clock. Let me know, however fan profile remains the same.
> 
> 1337mv-895mv-1172Mhz-1250Mhz-FanTune.zip 101k .zip file


Wow, you really deliver man.








Did some testing with it: had one single black-screen crash. Fan speed was jumpy, much like with the first BIOS you posted for me, in contrast to the second BIOS, which has very smooth fan speed transitions. It seemed to jump to 100% instantly whenever the GPU was in use and heating up past 60°C. Other than that, 70°C at full load with full-throttle fans. Stable except for that one crash.


----------



## Lutore

Quote:


> Originally Posted by *RaFDX*
> 
> Have you looked at the board to see if there any physical damage?


I've been tinkering all night with the card, with different modded BIOS versions and drivers. I finally got it running with a ROM that had a VDDC offset value of 24?! at normal clock speeds; this was with the 15.11.1 driver. After 3 hours I decided to update (DDU, then the 16.11.5 driver) and it all got ****ed again. Blue screens/black screens and hangs; can't get into Windows. Back to square one right now.
Just saw your post and checked the card, saw this:

78m05 KG, should it look like this?


Took off the backplate as I saw some gooey stuff;


Seems like it's from the rubber on the backplate; is that normal?

I'm going to see if a friend of mine can test the card in their rig. I tried different BIOS versions for my motherboard, but no luck.


----------



## chris89

Quote:


> Originally Posted by *kbft*
> 
> Wow, you really deliver man.
> 
> 
> 
> 
> 
> 
> 
> 
> Did some testing with it: had one single black screen crash. Fan speed was jumpy, pretty much like it was with the first bios you posted for me. Much in contrast to the second bios, which has very smooth fan speed transitioning. Seemed like it jumped to 100 % instantly whenever the gpu was in use and heatin up over 60°C. Other than that 70 °C on full load with full throttle fans. Stable except for that one crash.


I goofed there; maybe it needs more voltage.

1344mv-895mv-1172Mhz-1250Mhz-FanTune.zip 101k .zip file


----------



## matthew87

Hi Chris, do you have a recommended bios for sapphire 390 Nitro?

The card can run stable at 1140/1675 with an additional 62mV. Sadly, it seems WattMan does not allow me to exceed 1275mV for the core, so presently I need to add the extra voltage via third-party tools. WattMan is a pain in the backside; it just won't let me run enough voltage to push the card.

I did upload my bios earlier. Really appreciate your efforts.


----------



## TrueForm

Any custom BIOS to unlock the voltage on my Gigabyte 390?


----------



## chris89

Quote:


> Originally Posted by *matthew87*
> 
> Hi Chris, do you have a recommended bios for sapphire 390 Nitro?
> 
> Card can run stable at 1140/1675 with an additional 62mv. Sadly it seems wattman does not allow me to exceed 1275mv for core. Presently I need to add additional voltage via 3rd party tools. Pain in the backside wattman. It just won't allow me to run enough voltage to push the card.
> 
> I did upload my bios earlier. Really appreciate your efforts.


You're welcome. Send your BIOS here: _*.rom*, in *.zip*, dumped from GPU-Z, as an attachment._








Quote:


> Originally Posted by *TrueForm*
> 
> Any custom bios to unlock the voltage of my gigabyte 390?


You're welcome. Send your BIOS here: _*.rom*, in *.zip*, dumped from GPU-Z, as an attachment._


----------



## matthew87

Quote:


> Originally Posted by *chris89*
> 
> Your welcome. Send bios here. .... _*.rom ... in .zip ... From GPUz ... Attachment*_


 Hawaii.zip 42k .zip file


Attached


----------



## RaFDX

Potentially bad news, folks: we had a power outage and the UPS failed to protect. The computer turned on and both VRM temps were in the 90s. I turned off the computer and rechecked all connections; VRMs were still upper 80s/low 90s right after rebooting. Took the GPU apart, reapplied thermal paste, turned on the computer; artifacts and unable to boot to Windows.

Took the GPU apart again and there is no physical damage on the die or the board. No smell either. Going to rebuild it and try again in a little bit.

Thoughts?


----------



## chris89

Quote:


> Originally Posted by *matthew87*
> 
> Hawaii.zip 42k .zip file
> 
> 
> Attached


These BIOSes perform like running your current card at 1225MHz core, but at lower clocks, because the power limit is removed ("de-limited"). So, way more performance at lower clocks. Watch the core VRM temperature: de-limited means lots of power consumption at 1172MHz, but 1133MHz isn't too bad.

De-limited doesn't mean horribly high power consumption; it actually runs ideally, because with default voltage, a 1000MHz core, the limit set to 65288, and 895mV memory at 1250MHz instead of 1050mV at 1500MHz, we get not only a more efficient card but also higher performance than the stock BIOS. The power limit dictates your FPS at all resolutions, not so much the clock.

For instance, with a 256W limit on my RX 480, Superposition ran at 11-15 FPS (1080p Extreme). After setting the TDP/TDC limit to 999 watts/amps, it was doing 22-24 FPS. So from 11-15 FPS to 22-24 FPS at the same exact clock; that's around 60% more FPS without changing the 1407MHz core clock at all.

FPS isn't about the clock; it's about the power limit and the thermal ability to hold that clock stable with no power limit, just a temperature limit.

For example, a 200W limit raised by 50% would be 300W, and these BIOSes have no power limit whatsoever. That means big performance, if the VRMs and everything else run cool. Good luck. BTW, the fan curve is also dialed in to run cool; let me know about stability/temps/noise/performance characteristics.

To flash: run ATIWINFLASH.EXE as Administrator, but first close all apps like MSI Afterburner, HWiNFO, or any monitoring app at all. Open the .ROM, proceed to flash, and wait 5 minutes without touching anything, then restart and you're good to go. I suggest testing as-is and not worrying about memory clocks; with the power de-limited it's ideal to keep everything working as efficiently as possible.

atiflash_274.zip 1214k .zip file


1296mv-895mv-1133mhz-1250mhz.zip 42k .zip file


1344mv-895mv-1172mhz-1250mhz.zip 42k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> These bios are like running your current card at 1225mhz core but at lower clocks because of De-Limited power limit... So way more performance at lower clocks... Watch Core VRM Temperature... DeLimited means lots of power consumption at 1172mhz but 1133mhz isn't too bad...
> 
> This means from 200 watts to say 50% power is 250 watts... These bios have no power limit what so ever. Means huge performance if VRM and all runs cool and well... Good luck... BTW the fan is dialed in to also run cool lmk on stability/ temps/ noise/ performance characteristics.
> 
> To Flash ATIWINFLASH.EXE as Administrator but first close all apps like MSI/ HWInfo/ Any monitoring app at all ... Open .ROM proceed to flash and wait 5 minutes.. don't touch anything then restart and good to go... I suggest test as-is and don't worry about memory clocks.. it's ideal for delimited to keep it all working as efficiently as possible.
> 
> atiflash_274.zip 1214k .zip file
> 
> 
> 1296mv-895mv-1133mhz-1250mhz.zip 42k .zip file
> 
> 
> 1344mv-895mv-1172mhz-1250mhz.zip 42k .zip file


I can attest to this, 1133 is very good...1173 is hella fast, but quite a bit hotter. I think I will settle on 1133 or 1140 on core, and work something in between 1250 and 1550 on Memory. Lower power req/temp output is good. Chris89 is boss at getting these together. He explains it all in detail, and even talks you through modding your own bios. legit!

Speaking of that, Chris...what is a good fan curve for temp targets?

I have 25% at 30
45% at 45
and max of 78% at 62c
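In other words, treating those three points as a piecewise-linear curve (just a quick sketch to reason about the shape; the bios firmware may evaluate its curve differently):

```python
# Fan curve points from the post above: (temp C, fan duty %).
CURVE = [(30, 25), (45, 45), (62, 78)]

def fan_duty(temp_c: float) -> float:
    """Clamp at the endpoints, interpolate linearly in between."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(40))  # ~38.3% duty, between the 30C and 45C points
```

So a card idling around 40C would land at roughly 38% duty with this curve.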


----------



## Rexer

Quote:


> Originally Posted by *Streetdragon*
> 
> its a hot chip. you could try to redo the thermal paste and pads. do you have a good airflow in your case?


Re-pasting the big 390X heatsink initially gave me a good 12C cooler. About 6 months ago I stopped running overclocks in Battlefield games; overheating, 85C+. Never had the problem till I received an update from Origin. A few months ago, my 390X kacked. Yeah, the thrill is gone. So I picked up a 580 OC to tide me over till Vega manufacturers' cards hit the market. The temps on this card stay at 53/62C. Not sure if the Origin updates are associated with the 390X overheating; could be I stressed it out clocking it up. Sorta disappointing to see.


----------



## kbft

Quote:


> Originally Posted by *chris89*
> 
> Goofed here maybe needs voltage
> 
> 1344mv-895mv-1172Mhz-1250Mhz-FanTune.zip 101k .zip file


You really want to push my card eh?








This bios is indeed better than the 1172MHz one with lower voltage. It's fast, the fan behaviour is nice, and no crashes. But sadly there are artifacts at full load (at ~75°C).
Right now I feel like the second bios you posted for me hits the sweet spot (1296mv-895mv-1133Mhz-1250Mhz-FanTune). Runs great and keeps the fans silent.
Thanks so much for your help!


----------



## TrueForm

Quote:


> Originally Posted by *chris89*
> 
> You're welcome. Send your bios here .... _*.rom ... in .zip ... from GPU-Z ... attachment*_
> 
> 
> 
> 
> 
> 
> 
> 
> You're welcome. Send your bios here .... _*.rom ... in .zip ... from GPU-Z ... attachment*_


 trueformbios.zip 99k .zip file


----------



## PunkX 1

Quote:


> Originally Posted by *RaFDX*
> 
> Potentially bad news folks; Had a power outage, UPS failed to protect. Computer turned on and both the VRM temps were in the 90s. Turned off computer, rechecked all connections, VRMs still upper 80s/low 90s right after rebooting. Took apart gpu, reapplied thermal paste turned on computer; artifacts and unable to boot to windows.
> 
> Retook apart gpu and there is no physical damage on the die or the board. No smell either. Going to rebuild it and try again in a little bit.
> 
> Thoughts?


Was it running stock bios? If not, try flashing the card.


----------



## chris89

Quote:


> Originally Posted by *RaFDX*
> 
> Potentially bad news folks; Had a power outage, UPS failed to protect. Computer turned on and both the VRM temps were in the 90s. Turned off computer, rechecked all connections, VRMs still upper 80s/low 90s right after rebooting. Took apart gpu, reapplied thermal paste turned on computer; artifacts and unable to boot to windows.
> 
> Retook apart gpu and there is no physical damage on the die or the board. No smell either. Going to rebuild it and try again in a little bit.
> 
> Thoughts?


 65288-888mv-1000mhz-1250mhz.zip 99k .zip file


Quote:


> Originally Posted by *stephenn82*
> 
> I can attest to this, 1133 is very good...1173 is hella fast, but quite a bit hotter. I think I will settle on 1133 or 1140 on core, and work something in between 1250 and 1550 on Memory. Lower power req/temp output is good. Chris89 is boss at getting these together. He explains it all in detail, and even talks you through modding your own bios. legit!
> 
> Speaking of that, Chris...what is a good fan curve for temp targets?
> 
> I have 25% at 30
> 45% at 45
> and max of 78% at 62c


 1350mv-888mv-1172Mhz-1250Mhz.zip 99k .zip file

Quote:


> Originally Posted by *TrueForm*
> 
> trueformbios.zip 99k .zip file


 1350mv-888mv-1172Mhz-1250Mhz.zip 101k .zip file


----------



## matthew87

Hi Chris,

Your 1133MHz bios worked perfectly, but sadly offered no additional performance gain over my old 1140/1675 overclock. However, with the memory dialed up from 1250 to 1500MHz it was the best performance I've seen from the card. Is it possible for you to provide another 1133MHz bios with the memory voltage bumped back up and running at the reference 1500MHz speed?

The 1172MHz bios appears to need a touch more voltage, as there's slight occasional artifacting. The core VRM peaked at 71 degrees and memory at 68 degrees, so I don't believe temps were an issue. Core temp also topped out at 71°C.


----------



## stephenn82

Certain game titles show no performance hit with 1250MHz over 1500 or 1563 mem speed; The Division and BF1 were within 3-5 fps across those clocks. Benchmarks, on the other hand, were impacted. Try actual games with it and compare with that as well.


----------



## chris89

Quote:


> Originally Posted by *matthew87*
> 
> Hi Chris,
> 
> Your 133mhz bios worked perfectly, but sadly offered no additional performance gain over my old 1140/1675 overclock. However, with memory dialed up from 1250 to 1500mhz it was the best performance I've seen from the card. Is it possible for you to provide another 1135mhz bios with the memory voltage bumped back up and running at reference 1500mhz speed?
> 
> the 1172mhz bios appears to need a touch more voltage as there's slight occasional artifacting. The core vrm peaked at 71 degrees and memory at 68 degrees. So I don't believe temps were an issue. Core temp also topped out at 71 degrees c.


 1356mv-895mv-1172mhz-1250mhz.zip 42k .zip file


1356mv-950mv-1172mhz-1563mhz.zip 42k .zip file


1366mv-895mv-1172mhz-1250mhz.zip 42k .zip file


1366mv-950mv-1172mhz-1563mhz.zip 42k .zip file

Quote:


> Originally Posted by *stephenn82*
> 
> certain game titles show no performance hit with 1250mhz over 1500, or 1563 for mem speed. the division and bf1 were within 3-5 fps of the clocks. benchmarks on the otherhand were impacted. try actual games with it and compare with that as well.


True
















I forgot if 950mv was fine at 1563mhz memory clock? lmk









1356mv-888mv-1172mhz-1250mhz.zip 99k .zip file


1356mv-950mv-1172mhz-1563mhz.zip 99k .zip file


1366mv-888mv-1172mhz-1250mhz.zip 99k .zip file


1366mv-950mv-1172mhz-1563mhz.zip 99k .zip file

Quote:


> Originally Posted by *TrueForm*
> 
> trueformbios.zip 99k .zip file


 1356mv-888mv-1172Mhz-1250Mhz.zip 101k .zip file


1366mv-888mv-1172Mhz-1250Mhz.zip 101k .zip file

Quote:


> Originally Posted by *PunkX 1*
> 
> Was it running stock bios? If not, try flashing the card.


*@PunkX 1*

1356mv-888mv-1172mhz-1250mhz.zip 99k .zip file


1366mv-888mv-1172mhz-1250mhz.zip 99k .zip file


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> 1356mv-895mv-1172mhz-1250mhz.zip 42k .zip file
> 
> 
> 1356mv-950mv-1172mhz-1563mhz.zip 42k .zip file
> 
> 
> 1366mv-895mv-1172mhz-1250mhz.zip 42k .zip file
> 
> 
> 1366mv-950mv-1172mhz-1563mhz.zip 42k .zip file
> 
> True
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I forgot if 950mv was fine at 1563mhz memory clock? lmk
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1356mv-888mv-1172mhz-1250mhz.zip 99k .zip file
> 
> 
> 1356mv-950mv-1172mhz-1563mhz.zip 99k .zip file
> 
> 
> 1366mv-888mv-1172mhz-1250mhz.zip 99k .zip file
> 
> 
> 1366mv-950mv-1172mhz-1563mhz.zip 99k .zip file
> 
> 
> 1356mv-888mv-1172Mhz-1250Mhz.zip 101k .zip file
> 
> 
> 1366mv-888mv-1172Mhz-1250Mhz.zip 101k .zip file
> 
> *@PunkX 1*
> 
> 1356mv-888mv-1172mhz-1250mhz.zip 99k .zip file
> 
> 
> 1366mv-888mv-1172mhz-1250mhz.zip 99k .zip file


I think 988 is what chris recommended. I had it down to 975. Try 950. Every card is different


----------



## chris89

Quote:


> Originally Posted by *stephenn82*
> 
> I think 988 is what chris recommended. I had it down to 975. Try 950. Every card is different


I have the RX 480 in my rig at the moment because of the hdmi 60hz 4k issue unless i use the adapter on the 390x...

so rx 480 has 4k 60hz hdmi... otherwise I prefer the power of the 390x over the rx 480...

Well actually i prefer the rx 480 but the rx 480 is surprisingly hotter than the 390x at load when delimited and still is slower on Superposition....

24fps superposition roughly 1,407mhz core rx 480 ... vs 26-27fps PCIe 2.0 390x

Maybe you can test the bios I made for you for 950mv? Then ill try it on my 390x and pull out the rx 480?


----------



## narutonic

Yeah, the 390 with her 512-bit bus and bandwidth is faster than the 480 in 4K.

I'll receive one in less than one week (R9 390 Nitro) and just throw it in my custom loop.


----------



## spdaimon

I have a 4930K with an Asus R9 390 Strix running on a 750W Corsair HX PSU. I was planning on maybe adding a second card; seems like I'd need 850W for crossfire. I'm planning in the future to run a Ryzen 7 1700X, maybe with the second card, or I was wondering if it would be better to get an RX 580 or two instead, which I think would run on the 750W. I'd like to be able to run 1440p at 144fps on high. That's my goal anyway. Any advice?


----------



## narutonic

I think the crossfire setup can be handled by your PSU if you undervolt the GPUs. When undervolted (plus the clock reduced to 1000MHz), an R9 390 will consume around 200W, so ~400W for 2 GPUs; there's plenty of room there. Unless your CPU is overclocked, the total power consumption will be around 600W.
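As a rough sanity check on those numbers (every figure here is an estimate, not a measurement):

```python
# Back-of-the-envelope PSU budget for 2x undervolted R9 390 in crossfire.
gpu_w = 200    # one undervolted R9 390 at ~1000MHz (estimate)
cpu_w = 150    # overclocked CPU, rough guess
rest_w = 50    # motherboard, drives, fans (rough guess)
psu_w = 750    # the Corsair HX750 in question

total_w = 2 * gpu_w + cpu_w + rest_w
headroom_w = psu_w - total_w
print(total_w, headroom_w)  # 600 150
```

600W total leaves about 150W of headroom on the 750W unit, which is why the crossfire setup looks doable as long as you undervolt.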


----------



## THUMPer1

I'm back. Can I get a BEAST BIOS for my 390x?


----------



## spdaimon

Quote:


> Originally Posted by *narutonic*
> 
> I think that the cross can be handle by your PSU. If you undervolt the GPU. when undervolted (+ clock reduced at 1000MHz) an R9 390 will consume around 200w so 400~w for 2 GPU so there are plenty of room there. Unless that your cpu is overclocked the total power consumption will be aroud 600w


Would that affect its performance at all? yeah, I have the cpu at 4.1Ghz right now.


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> I have the RX 480 in my rig at the moment because of the hdmi 60hz 4k issue unless i use the adapter on the 390x...
> 
> so rx 480 has 4k 60hz hdmi... otherwise I prefer the power of the 390x over the rx 480...
> 
> Well actually i prefer the rx 480 but the rx 480 is surprisingly hotter than the 390x at load when delimited and still is slower on Superposition....
> 
> 24fps superposition roughly 1,407mhz core rx 480 ... vs 26-27fps PCIe 2.0 390x
> 
> Maybe you can test the bios I made for you for 950mv? Then ill try it on my 390x and pull out the rx 480?


Oh snap! I've been checking this at work and not that much at home. Let me get on that and will let you know ASAP.


----------



## chris89

Quote:


> Originally Posted by *THUMPer1*
> 
> I'm back. Can I get a BEAST BIOS for my 390x?


*

1296Mv-888Mv-1133Mhz-1250Mhz.zip 99k .zip file


1360Mv-888Mv-1172Mhz-1250Mhz.zip 99k .zip file


1387Mv-888Mv-1197Mhz-1250Mhz.zip 99k .zip file
*


----------



## stephenn82

@chris89 so far so good: 1563 runs fine at 950mv!!! Lower scores on Valley though, 2878 where it was 2970 before. But it runs cooler and no artifacting. 66/72C max temps! Need some Fujipoly.


----------



## narutonic

Quote:


> Originally Posted by *spdaimon*
> 
> Would that affect its performance at all? yeah, I have the cpu at 4.1Ghz right now.


By reducing the core frequency (and the core voltage and power limit) of the GPU, you slightly decrease its performance, but with a second card added it's not a big deal. There will be less heat and power consumption.
And 4.1 isn't a big overclock, so it *should* be okay (if your voltage is less than or equal to 1.2). Anyway, don't try FurMark or any bench of the same type (Kombustor); just stick with Valley/3DMark and games, and don't try to push your PSU with some PSU test.


----------



## stephenn82

Quote:


> Originally Posted by *stephenn82*
> 
> @chris89 so far so good, 1563 runs fine at 950mv!!! lower scores on valley though, 2878 and was 2970 before. but runs cooler and no artifacting. 66/72c max temps! need some fujipoly


My suspicion is that it affects actual gameplay. I noticed Rising Storm 2 was unplayable... I quit. I pulled up The Division and noticed it wasn't as peppy. Testing it now; I bumped mv to 988 in Wattman.


----------



## THUMPer1

What's the benefit of running the mem at 1250 and not 1500? Same bandwidth?


----------



## stephenn82

Quote:


> Originally Posted by *THUMPer1*
> 
> What the benefit of running the mem at 1250 and not 1500? Same bandwidth?


chris89 said it earlier: the PCI-E bus can't even keep up with the VRAM bandwidth.

Either way, there is minimal gain from running your memory at 320GB/sec (1250) vs 384GB/sec (1500), and with that 512-bit bus it hogs power and makes a lot of heat too. The 1250MHz only requires a measly 888mv (or less!) to get 320GB/sec of bandwidth; the 1500/1525 is 1000mv. That's a 33% power and cooling difference, for what, less than 8% performance difference? Test in benches and you will see a big difference; go in game, almost none. Maybe 2fps.
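Those 320GB/sec and 384GB/sec figures are just the standard GDDR5 arithmetic (GDDR5 transfers four times per base clock; this is pure math, not a measurement):

```python
# GDDR5 bandwidth = base clock (MHz) * 4 transfers/clock * bus width in bytes,
# then /1000 to go from MB/s to GB/s.
def gddr5_bw_gbps(mem_clock_mhz: float, bus_bits: int = 512) -> float:
    return mem_clock_mhz * 4 * (bus_bits / 8) / 1000

print(gddr5_bw_gbps(1250))  # 320.0
print(gddr5_bw_gbps(1500))  # 384.0
```

So the 20% memory overclock buys exactly 20% more peak bandwidth, and the in-game numbers above show how little of it actually gets used.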


----------



## THUMPer1

Quote:


> Originally Posted by *stephenn82*
> 
> chris89 said it earlier. The PCI-E cant even keep up with the VRAM bandwidth.
> 
> Either way, there is minimal gain from running your memory at 320GB/sec (1250) vs 384GB Sec (1525) with that 512 bit bus it hogs power. makes a lot of heat too. the 1250mhz only requires a measly 888mv (or less!) to get 320GB/sec BW. the 1500/1525 is 1000mv. thats 33% power and cooling difference. for what, less than 8% performance difference? You test in benches, you will see a big difference. go in game, almost none. maybe 2fps.


Got it. Well in Overwatch at 1440p there is no difference. Like you said, 2 FPS. But in BF1 it's like 10-15 fps. Especially after upping the RAM on my Ryzen system to 3400, I see huge gains in BF1.


----------



## stephenn82

Quote:


> Originally Posted by *THUMPer1*
> 
> Got it. Well in Overwatch at 1440p there is no difference. Like you said 2 FPS. But in BF1 its like 10-15 fps. Especially after upping the RAM on my Ryzen system to 3400, I see huge gains in BF1.


Wow. I only got 3 or 5 fps gain in BF1.


----------



## TrueForm

Chris, can you edit my 390 bios? Same voltages as the 1st one, but make the core 1150 and mem 1500. Or maybe a little bit more voltage for the mem also.


----------



## Tomwa

Hello, I'm some random nobody who's run into a performance barrier on their MSI 390X Gaming 8GB despite very low temps.

I decided I'd go check out if I could overclock my card which led me to here.

I read a few pages towards the end and used ATIFlash to try a variety of bios:

1296Mv-888Mv-1133Mhz-1250Mhz.rom
Results: Very stable, everything works, GPU temp peaks at 67C and averages about 65C under 100% load.

1296mv-895mv-1133Mhz-1250Mhz-FanTune.rom
Can't even install it, says I have a mismatch of some type.

1360Mv-888Mv-1172Mhz-1250Mhz.rom
Installs, runs, and can run the Unigine benchmark for hours, but for some reason Freesync ceases to function.

1387Mv-888Mv-1197Mhz-1250Mhz.rom
Temps are fine reaching a peak of 79C and averaging 76C but attempting to start the benchmark makes it go black and crash.

I have the original BIOS as exported via GPU-Z; if anyone has the time to waste on me, I'd appreciate it. If not, then I appreciate the pre-existing BIOSes, as they already got me a better clock than the default.


----------



## THUMPer1

Is there a way to just get a delimited BIOS and we can apply our own clocks? Or can we take the BIOS that was given to us and do custom clocks?


----------



## THUMPer1

Quote:


> Originally Posted by *Tomwa*
> 
> Hello, I'm some random nobody who's run into a performance barrier on their MSI 390X Gaming 8GB despite very low temps.
> 
> I decided I'd go check out if I could overclock my card which led me to here.
> 
> I read a few pages towards the end and used ATIFlash to try a variety of bios:
> 
> 1296Mv-888Mv-1133Mhz-1250Mhz.rom
> Results: Very stable, everything works, GPU temp peaks at 67C and averages about 65C under 100% load.
> 
> 1296mv-895mv-1133Mhz-1250Mhz-FanTune.rom
> Can't even install it, says I have a mismatch of some type.
> 
> 1360Mv-888Mv-1172Mhz-1250Mhz.rom
> Installs, runs, and can run the Unigine benchmark for hours, but for some reason Freesync ceases to function.
> 
> 1387Mv-888Mv-1197Mhz-1250Mhz.rom
> Temps are fine reaching a peak of 79C and averaging 76C but attempting to start the benchmark makes it go black and crash.
> 
> I have the original BIOS as exported via GPU-Z if anyone has the time to waste on me I'd appreciate it, if not then I appreciate the pre-existing bios-es as they already got a better clock than the default.


Most of the BIOSes I have tried have low voltage. Which is fine, but they don't really run stable at the voltage applied to them, so I have always had to add extra voltage in MSI Afterburner to get them stable.


----------



## Tomwa

Quote:


> Originally Posted by *THUMPer1*
> 
> Most of the BIOS I have tried have low voltage. Which is fine, but they don't really run stable at the voltage applied to them. So I have always had to add extra voltage in MSI afterburner to get them to be stable.


So I should just add voltage in small increments and test, correct? Which increment would you suggest? I don't want to turn my graphics card into a massive fireball.

Edit: Also, what is "max voltage", so to speak?


----------



## THUMPer1

Quote:


> Originally Posted by *Tomwa*
> 
> So I should just add voltage in small increments and test correct? Which increment would you suggest, I don't want to turn my graphics card into a massive fireball.
> 
> Edit: Also when is "Max voltage" so to speak?


With the 1172/1250 BIOS Chris gave me above, I set voltage to +40 in Afterburner and power limit to +50%, and it was stable for me. We have the same card, mind you. I'm not saying this is the right approach, but it's the one I took. YMMV.

I've had my voltage up to +100.
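If you want to be methodical about it, the ladder of offsets to test looks like this (the +100 ceiling matches the most I've pushed; the 10mv step is arbitrary, and every card is different):

```python
# Build a ladder of Afterburner voltage offsets (mV) to test one at a time.
# The +100 ceiling matches the most I've run; the 10mV step is arbitrary.
def offset_ladder(max_offset_mv: int = 100, step_mv: int = 10):
    return list(range(0, max_offset_mv + 1, step_mv))

print(offset_ladder())  # [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
```

Set an offset, run your benchmark until it artifacts or passes, then move to the next step; stop at the first stable one rather than jumping straight to +100.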


----------



## TrueForm

1140/1500 custom bios would be good with same voltages. I'm voltage locked on this card


----------



## stephenn82

When you guys get the bios from Chris, you're throwing more voltage at it, 50 or 100mv, to get stable? Ouch. What temps are you getting?


----------



## stephenn82

Quote:


> Originally Posted by *TrueForm*
> 
> 1140/1500 custom bios would be good with same voltages. I'm voltage locked on this card


I'm doing 1160/1550 at 1350mv and 960mv on core and mem, respectively. Stable as all get out; temps hit 67/73. Going to lower the power and test for stable performance at the lowest power usage.


----------



## THUMPer1

Quote:


> Originally Posted by *stephenn82*
> 
> When you guys get the bios from chris, you are throwing more voltage at it. 50 or 100mv to get stable? Ouch. What temps are you getting?


On the ones from Chris I added +40 to get it to stop artifacting/flashing squares.

Before any custom bios, I would push +100 at 1200 core for the lulz. I get around 70C core with Chris's BIOS and the extra voltage. If I don't add voltage I get blocks, but temps are around 66. It's not stable.


----------



## stephenn82

I think you may actually be getting more than you think there. The delimited nature of the bios uncaps the limits, like LLC (from my understanding; I may be wrong, @chris89 will save me from my stupid thoughts lol). If you add 40, it may be more. What model is your card? And have you replaced the stock thermal paste? The MSI paste looked pretty good, not squished bird doo doo like other pastes, but it still isn't as good as Kryonaut or Gelid GC Extreme. They work wonders on the stock cooler. I have run the same clocks you have, 1173/1563, and only required a max of 1375mv core/988mv VRAM. Max temps were 67/72C.

I also see that you have an X at the end of that card, so slightly more power is required. A reference-style cooler is far superior to, say, an MSI Twin Frozr, also from what I have read elsewhere. You have to do some slight modifications to it, but I hear its cooling potential is unreal.


----------



## THUMPer1

Quote:


> Originally Posted by *stephenn82*
> 
> I think you may be actually getting more than what you think on that. The delimited nature of the bios uncaps the limits. Like LLC (from my undertanding, I may be wrong. @chris89 will save me from my stupid thoughts lol) If you ad 40, it may be more. What model is your card? and have you replaced the stock thermal paste? The MSI paste looked pretty good, not squished bird doo doo like other pastes, but it still isnt as good as Kryonaut or Gelid GC Extreme. They work wonders on stock cooler. i have ran the same clocks you have, 1173/1563 and only required a max of 1375mv core/988mv VRAM. Max temps were at 67/72c


Maybe I don't need that much voltage. Monitoring it, though, it peaked around 1.21V in GPU-Z. It's an MSI 390X. Already replaced the paste a couple times.


----------



## stephenn82

huh, programmed my bios to use up to 1350mv and it pulls 1337, according to GPU-z.


----------



## matthew87

Can anyone suggest a fix for random black screens at idle and on the desktop?

I'm running Chris's 1356mv-950mv 1172/1563mhz BIOS; it's 100% stable in games, benchmarks and stress tests, but I get intermittent lockups and black screens at the desktop, browsing the web etc. It appears to happen when the vram frequency changes from 150mhz and up.

This issue seems to be common to Sapphire 390s, based off this thread on AMD's forums:

https://community.amd.com/thread/200787?start=120&tstart=0

Also, as these BIOSes are power delimited, is there any benefit in changing the power to +50% in Wattman, or does that do nothing now?


----------



## chris89

@THUMPer1

*

1356mv_1172Mhz_888mv_1250Mhz.zip 99k .zip file


1356mv_1172Mhz_1000mv_1563Mhz.zip 99k .zip file

*
@Matthew87

*

1356mv-988mv-1172mhz-1563mhz.zip 42k .zip file


1356mv-1000mv-1172mhz-1563mhz.zip 42k .zip file
*


----------



## stephenn82

quick question...anyone play Rising Storm 2: Vietnam in here??


----------



## christoph

Quote:


> Originally Posted by *stephenn82*
> 
> quick question...anyone play Rising Storm 2: Vietnam in here??


is it out already?


----------



## stephenn82

Quote:


> Originally Posted by *christoph*
> 
> is it out already?


The pre-release game on Steam is... bought it for 29.99. So far, not impressed.


----------



## THUMPer1

Quote:


> Originally Posted by *chris89*
> 
> @THUMPer1
> 
> *
> 
> 1356mv_1172Mhz_888mv_1250Mhz.zip 99k .zip file
> 
> 
> 1356mv_1172Mhz_1000mv_1563Mhz.zip 99k .zip file
> 
> *
> @Matthew87
> 
> *
> 
> 1356mv-988mv-1172mhz-1563mhz.zip 42k .zip file
> 
> 
> 1356mv-1000mv-1172mhz-1563mhz.zip 42k .zip file
> *


Thanks.
Both BIOS crash or hard lock. Performance is good though.


----------



## stephenn82

chris89 sent me a bios for 1172mhz that had vcore of 1393. you may need more power.


----------



## TrueForm

Quote:


> Originally Posted by *stephenn82*
> 
> Im doing 1160/1550 at 1350mv and 960mv on core and mem, respectively. Stable as all get out, temps hit 67/73. Going ro lower power and test for stable performance at lowest power usage.


1160 artifacts a lot and crashes.
Quote:


> Originally Posted by *chris89*
> 
> @THUMPer1
> 
> *
> 
> 1356mv_1172Mhz_888mv_1250Mhz.zip 99k .zip file
> 
> 
> 1356mv_1172Mhz_1000mv_1563Mhz.zip 99k .zip file
> 
> *
> @Matthew87
> 
> *
> 
> 1356mv-988mv-1172mhz-1563mhz.zip 42k .zip file
> 
> 
> 1356mv-1000mv-1172mhz-1563mhz.zip 42k .zip file
> *


Hey could you please send me a bios for the same voltages as the other one you gave me but set core to 1140 and mem to 1500? I've tested it and it runs stable. Thanks!


----------



## MK-Professor

Hoping someone can help with this. I have a Sapphire 390 Nitro from launch date, running happily at 1100/1700MHz, +56mV, power limit 50%, with temps around 63-67C while gaming.

Recently I've been experiencing black screens at a very frequent rate (avg 2 per day) with the above OC that was stable all this time; however, when I revert back to stock clocks/mV etc., the black screens immediately stop (OK, I did get one black screen in a period of 2 weeks while the GPU was at stock). The black screens occur while gaming (in every game).
I am using the 16.10.1 driver from October, and it worked flawlessly (like every other driver) at 1100/1700/+56/50% until recently...


----------



## Streetdragon

I would say your ram clock is too high...


----------



## chris89

1296mv-1000mv-1140Mhz-1500Mhz.zip 101k .zip file


----------



## matthew87

Quote:


> Originally Posted by *chris89*
> 
> @THUMPer1
> 
> *
> 
> 1356mv_1172Mhz_888mv_1250Mhz.zip 99k .zip file
> 
> 
> 1356mv_1172Mhz_1000mv_1563Mhz.zip 99k .zip file
> 
> *
> @Matthew87
> 
> *
> 
> 1356mv-988mv-1172mhz-1563mhz.zip 42k .zip file
> 
> 
> 1356mv-1000mv-1172mhz-1563mhz.zip 42k .zip file
> *


Hi Chis,

Thanks for all your efforts, you should really setup a PayPal account for donations









If you get the opportunity could you please provide me with two BIOSs configured with:

1366mv-988mv-1172mhz-1563mhz

1366mv-1000mv-1172mhz-1563mhz

I just need a fraction more voltage at the core to run 1172 stable, I get the odd tiny flicker and artefact when benchmarking. Dropping the core down to around 1150mhz sees it stabilize. 100% stable otherwise, yet to see it crash a benchmark or game. But I want perfection!

Again, thanks for all your time and help. It's greatly appreciated.


----------



## stephenn82

Just go with the 1000mv on the 1563 mem speed; at 988 I was getting slight artifacting, and the 1v cleared it up. A difference of 12mv will do it. Odd, huh?

At 1550MHz, 988mv is A-OK... on my card anyway.


----------



## stephenn82

Just for SnG's I loaded up my original bios, the one I posted here for Chris to do his magic with. What a power-hungry hot mess!

The fans wouldn't kick on until 60C... ouch! Running Valley real quick, it netted 2727 points with 67C on core, 71C on mem. That's about my regular temps at 1173/1563.

I went in and tuned the bios to what the card SHOULD be, with 1060MHz and 1525. Delimited the power, set the max temp to 85C from 95C, and tinkered with the fan curve... now it runs at 60C and 62C max through a Valley run. Score was 2732.

Not bad for a stocker, eh?


----------



## MK-Professor

Quote:


> Originally Posted by *Streetdragon*
> 
> i would say your ram clock is to high.....


But it was fine until recently; should I RMA my GPU? Because even at stock clocks I did get one black screen in a period of 2 weeks.


----------



## Streetdragon

Quote:


> Originally Posted by *MK-Professor*
> 
> but it was fine until recently, should I RMA my gpu? because even at stock clocks I did get one black screen in a period of 2 weeks.


Maybe something on your card degraded, I don't know. Maybe it would be enough to redo the TIM and thermal pads to lower the heat for now.

But I think someone with more experience can help you better here.


----------



## stephenn82

Quote:


> Originally Posted by *Streetdragon*
> 
> maybe something on your card degradet or so. dont know. Maybe it would be enough to redo the tim and termal-pads to lower the heat for now.
> 
> but i think somone with more exp can help you better here


This. Try the paste first. Maybe some expensive Fujipoly later down the road for the VRM and RAM.


----------



## MK-Professor

Quote:


> Originally Posted by *Streetdragon*
> 
> maybe something on your card degradet or so. dont know. Maybe it would be enough to redo the tim and termal-pads to lower the heat for now.
> 
> but i think somone with more exp can help you better here


I don't want to do that; I'd lose my warranty. Besides, the GPU at stock runs really cool while gaming, 54-61C, and that with the fan speed around 45%. Also, the max VRM temperature I have seen while gaming is 64C (VRM 1) and 66C (VRM 2).


----------



## Rexer

I blew up my 390X AND 390 screwing around, pushing the clocks as far as they would go in multiplayer games. Talk about heat, lol. Outside it was in the low 60s F; inside the house it's in the low 80s F.
So I went to get a cheap 8GB card till the gamer's edition of Vega comes out. To my surprise, the most popular brands are almost completely sold out. WTF? They just came out a month ago (April; it's mid-May). So I made a desperate buy for what was left and got a Sapphire RX 580 Nitro+ for $250.00. I barely got a 580 8GB at a low price; a few days later they were all sold out, and the price on what's left is really jacked up. Even some of the 4GB cards. It's phenomenal.
It turns out the crypto-currency miners had been buying them out, preferring them over the 4GB cards, and even the 4GB cards are going to them. AMD realized this and reportedly told manufacturers and retailers to reserve a stock for gamers. But this will undoubtedly drive the price of video cards up. Nvidia has announced it plans to release a Pascal version for crypto-miners soon, so there's some relief.
As for a cheap Vega in the coming days, I don't think that's going to happen. Not at the slow rate HBM memory is being manufactured, and with these miners gobbling up the high-RAM inventory. If you're going to buy Vega when the manufacturers release the game editions, I wouldn't want to wait long. Newer models are planned almost right after the first release.
Who knows? The Nvidia 1070 may go the same way.


----------



## krusic22

Can someone give me an OC BIOS for the XFX R9 390?
Link to stock BIOS: here.
Thanks.
#Edit: My card has Hynix RAM.


----------



## bluej511

So I seem to be able to use AB again to OC my card, and the settings stick. I did 1100/1600 so far and it worked fine.


----------



## thiussat

I have the Gigabyte (r9 390) G1 Gaming edition of this card. Are there still no waterblocks available for it? I am looking at photos of its PCB and it looks awfully close to some other models that do have waterblocks. Has anyone ever taken a chance and tried to see if one of these blocks would work?


----------



## lanofsong

Hello R9 390/390X owners,

We are having our monthly Foldathon from Monday 19th - Wednesday 21st - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

June 2017 Foldathon

To get started:

1. Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## stephenn82

Ooooooh, I may do it this month. What's a fantastic clock for [email protected]? I'm thinking 1133/1250 running continuously for those days. I've got a pair of ML140 Pros in the bottom of my case now to assist with cooling the GPU.


----------



## navjack27

Is this a known issue? I'm using my 390X again for giggles and the temps are out of control, and I swear those fans can go faster than 2500rpm. MSI Gaming 8G 390X. I already removed the heatsink and reapplied paste. It's just strange, because I could swear this card never ran that hot. It'll go pretty much into the 90s and stay there under load. It's like something isn't right. I checked the BIOS for voltage changes, and I remember back when this was my main card I never messed with them, only changed the timings manually with a hex editor. Do the newer drivers play strangely with this card or something?
It's in a computer with a fresh install of Windows 10 and an RX 480 next to it on the mobo, which oddly enough, after cleaning the 480 off and applying paste, is WAY cooler (Red Devil 480 8GB).


----------



## gordesky1

Quote:


> Originally Posted by *navjack27*
> 
> Is this a known issue? I'm using my 390X again for giggles and the temps are out of control, and I swear those fans can go faster than 2500rpm. MSI Gaming 8G 390X. I already removed the heatsink and reapplied paste. It's just strange, because I could swear this card never ran that hot. It'll go pretty much into the 90s and stay there under load. It's like something isn't right. I checked the BIOS for voltage changes, and I remember back when this was my main card I never messed with them, only changed the timings manually with a hex editor. Do the newer drivers play strangely with this card or something?
> It's in a computer with a fresh install of Windows 10 and an RX 480 next to it on the mobo, which oddly enough, after cleaning the 480 off and applying paste, is WAY cooler (Red Devil 480 8GB).


My MSI 390 non-X goes to 2500rpm too; that's pretty much the max they should go. Yeah, that's pretty hot. Is that at stock clocks also? Did you try it without the 480 in? Maybe that's blocking airflow for the 390.


----------



## navjack27

It's not being blocked, cuz I only just put the 480 in earlier.



Yeee, I dunno. I underclocked and undervolted it too.


----------



## navjack27

It even takes forever for the idle temp to come down... but see, if it were making poor contact with the heatsink... no, that isn't the issue... (thinking as I type, sorry). I noticed that when it's idle and I'm waiting for it to cool, the air coming off of it still feels like it's actively producing heat in some way... I'm going to take it apart AGAIN and REALLY look at this thing, and GLOB a bunch of paste on to make sure it's contacting.


----------



## Streetdragon

Even the VRMs are cooking; check those too.


----------



## stephenn82

Posting some results. Slightly pulled the voltage down to 1325mV on the core and left the VRAM at 1V, for 1173MHz/1575MHz. Still more than stock, but not pushing it further for no reason.

BF1 is just a stuttery mess if you try playing with settings turned up to look nice, so I had it all on low, minus effects and mesh quality.

results:


I think that pair of ML140 fans helped by pushing air up through the case and removing the heat. Temps dropped from 67/73C without the fans, at 1373mV...

A lot of these cards can hit 1160 on just the power slider, or +20mV... I think +123mV is a little much for similar clocks.


----------



## stephenn82

@chris89
I see a lot of people on Reddit running 1160/1170 on their 390s with +75mV on the core (puts it at 1325) and getting their RAM to 1600 as well on 1V.

They also move the power slider up to +50%.

What is the actual power of VDDC and VDDCI from doing that? Does a delimited 390 need 1373mV on the core? And if so, why so much more than these guys? Is it due to thermal loss from an inefficient core design?

Recently I have been running my system at 1173MHz/1325mV and 1575MHz/1V (stock VDDCI) and things seem OK. I think EA servers have some lag; a few people last night were complaining that there seemed to be an input delay. I was getting a fluctuating extra offset of 13-25ms at times. Normally I roll 7 or less with a ping of 6 to the server. It seemed like I was playing at a 140ms ping rate.


----------



## chris89

Quote:


> Originally Posted by *stephenn82*
> 
> @chris89
> I see a lot of people on Reddit running 1160/1170 on their 390s with +75mV on the core (puts it at 1325) and getting their RAM to 1600 as well on 1V.
> 
> They also move the power slider up to +50%.
> 
> What is the actual power of VDDC and VDDCI from doing that? Does a delimited 390 need 1373mV on the core? And if so, why so much more than these guys? Is it due to thermal loss from an inefficient core design?
> 
> Recently I have been running my system at 1173MHz/1325mV and 1575MHz/1V (stock VDDCI) and things seem OK. I think EA servers have some lag; a few people last night were complaining that there seemed to be an input delay. I was getting a fluctuating extra offset of 13-25ms at times. Normally I roll 7 or less with a ping of 6 to the server. It seemed like I was playing at a 140ms ping rate.


 1172mhz-1563mhz-1366mv-1000mv.zip 99k .zip file


75Gpixel & 187.5Gtexel & 400GB/s : 60C : 84C : No Power Limit

I'm sure the driver causes issues as well so try 17.1.2 WHQL.

http://support.amd.com/en-us/download/desktop/previous/detail?os=Windows%2010%20-%2064&rev=17.1.2
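As a side note for anyone wondering where figures like "75Gpixel & 187.5Gtexel & 400GB/s" come from, here's a quick sanity check of the math. The ROP/TMU counts and the 4x GDDR5 data-rate multiplier are assumptions based on the R9 390's published specs (Grenada Pro: 64 ROPs, 160 TMUs, 512-bit bus), not anything read from the BIOS:

```python
# Back-of-envelope check of the throughput figures above, assuming
# R9 390 (Grenada Pro) specs: 64 ROPs, 160 TMUs, 512-bit GDDR5 bus.
core_mhz = 1172
mem_mhz = 1563            # reported memory clock; GDDR5 moves data at 4x this

rops, tmus, bus_bits = 64, 160, 512

pixel_rate = core_mhz * 1e6 * rops / 1e9               # Gpixel/s
texel_rate = core_mhz * 1e6 * tmus / 1e9               # Gtexel/s
bandwidth = mem_mhz * 1e6 * 4 * (bus_bits // 8) / 1e9  # GB/s

print(f"{pixel_rate:.1f} Gpixel/s, {texel_rate:.1f} Gtexel/s, {bandwidth:.0f} GB/s")
```

So those numbers are just core clock times ROPs/TMUs, plus memory clock times effective data rate times bus width.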


----------



## stephenn82

Quote:


> Originally Posted by *chris89*
> 
> 1172mhz-1563mhz-1366mv-1000mv.zip 99k .zip file
> 
> 
> 75Gpixel & 187.5Gtexel & 400GB/s : 60C : 84C : No Power Limit
> 
> I'm sure the driver causes issues as well so try 17.1.2 WHQL.
> 
> http://support.amd.com/en-us/download/desktop/previous/detail?os=Windows%2010%20-%2064&rev=17.1.2


Thanks for the BIOS, chris... but I was looking more for some answers. I went to the BIOS mod thread and it's mostly reasoning and how-to hex edits. I got bored six pages in and watched some Gamers Nexus on YT about Intel being stupid with the new X299 platform...


----------



## componentgirl90

Hi all,

The issue with my xfx 390x persists :'(

I reinstalled Windows 10 and installed more than one older driver. The fan issue still remains, however. It will accelerate to 3600rpm or whatever. It must be a hardware issue. I will try it in another PC. If the fan revs in that too, then it's clearly something wrong with the card, and I will ask the retailer one last time to repair it. (I RMAed it twice and they found no fault.)

If it doesn't rev, then it could be a fault with a component in my computer.

I am eliminating all the simple possibilities before having to face what people on this forum say it is: a problem with the cooling of the memory/VRM.

P.S. I have a 500W PSU in my other PC; will that be able to run the 390X? It is a pretty old PC with an E7500 Core 2 Duo.


----------



## stephenn82

Oh boy, that last run got a little hot... going to have to bump the fan curve up a bit. It dropped 20 points off the last run. Also, when watching YouTube or anything not super demanding, there is a lot of stutter or tearing.


----------



## stephenn82

BIOS is programmed to pull 1366mV; it only registers in GPU-Z as taking 1265-1279mV.


----------



## chris89

@stephenn82

That's normal. So how's it running? Plus, a 75C VRM isn't hot at all; that's optimal.

About @componentgirl90's GPU... you're gonna need a new BIOS to fix it...

If you want to try a new BIOS, I can help.

*Close all applications out first.. just close all apps.

Extract the .zip into a folder.. find ATIWinFlash.exe.

Right-click ATIWinFlash.exe and open as Administrator.. open the .rom.. hit Flash, don't click anything for up to 5 min, and done...*

atiflash_274.zip 1214k .zip file


BIOS-XFX-390X.zip 99k .zip file


----------



## stephenn82

It didn't like playing videos on YouTube, Steam game ads, etc. when on lower settings... and it felt a little stuttery in some of the games and benches I ran. I know the stock card temps ran like 75 or higher; I just don't like it getting that toasty.









I did go in, delimit the stock BIOS, set max temp to 85, adjust the fan curve, and set clocks to 1160/1575 on a tad less voltage than you gave me for 1173, and it runs pretty darned smooth. The 7 clock levels are linear, not defined hard numbers like 200, 300, 400, etc. up to 1173, and it doesn't have stutter or frame tearing when watching YouTube. I may drop the VRAM down to 1550 or 1525 (about stock speeds) to keep it cooler, and lower it to 975 or 980mV for long-term use.

Now I feel confident in getting into my XFX 7870 and modding that BIOS. I won't get too crazy with tunes, but I could run that card at 1075/1410 over the stock 1000/1250 in my old system... I will have to tinker... you know, the kids need that epicly fast Minecraft.


----------



## chris89

Quote:


> Originally Posted by *stephenn82*
> 
> It didn't like playing videos on YouTube, Steam game ads, etc. when on lower settings... and it felt a little stuttery in some of the games and benches I ran. I know the stock card temps ran like 75 or higher; I just don't like it getting that toasty.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did go in, delimit the stock BIOS, set max temp to 85, adjust the fan curve, and set clocks to 1160/1575 on a tad less voltage than you gave me for 1173, and it runs pretty darned smooth. The 7 clock levels are linear, not defined hard numbers like 200, 300, 400, etc. up to 1173, and it doesn't have stutter or frame tearing when watching YouTube. I may drop the VRAM down to 1550 or 1525 (about stock speeds) to keep it cooler, and lower it to 975 or 980mV for long-term use.
> 
> Now I feel confident in getting into my XFX 7870 and modding that BIOS. I won't get too crazy with tunes, but I could run that card at 1075/1410 over the stock 1000/1250 in my old system... I will have to tinker... you know, the kids need that epicly fast Minecraft.


Nice. You can run the HD 7870 with the boot core clock set to 1173MHz on stock voltage; dialing in a nice fan curve should help...

VBE7.0.0.6.zip 546k .zip file


----------



## stephenn82

I don't think I will crank the little XFX to hell in a handbasket... even if it does have a double lifetime warranty lol


----------



## Carniflex

Quote:


> Originally Posted by *thiussat*
> 
> I have the Gigabyte (r9 390) G1 Gaming edition of this card. Are there still no waterblocks available for it? I am looking at photos of its PCB and it looks awfully close to some other models that do have waterblocks. Has anyone ever taken a chance and tried to see if one of these blocks would work?


Alphacool should have a hybrid block. It has a proper GPU block for the core and then a pretty beefy solid aluminium part for the rest of the card, making some contact with the core block as well. I have the 390X G1 and am using that block.


----------



## Geoclock

Hi. Just upgraded from a 24" to a 32" 2K display. I tested Battlefield 1 and the temp hit 92C (I had 70C with the 24"). Need for Speed Rivals played at 69C.
Is it normal to get higher temps with 2K-4K monitors?
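Higher temps at the bigger resolution are expected, by the way: the card has to render far more pixels per frame, so it sits much closer to full load. A quick sketch of the pixel math, assuming "2K" here means 2560x1440:

```python
# Rough pixel-count comparison between the old and new display.
# Assumes "2K" means 2560x1440; adjust if the panel is a different res.
res_1080p = 1920 * 1080
res_1440p = 2560 * 1440

ratio = res_1440p / res_1080p
print(f"{ratio:.2f}x the pixels per frame")   # ~1.78x more work per frame
```

Roughly 78% more pixels per frame means the GPU that coasted at partial load on the 24" panel now runs flat out, hence the jump in temperature.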


----------



## chris89

Quote:


> Originally Posted by *Geoclock*
> 
> Hi. Just upgraded from a 24" to a 32" 2K display. I tested Battlefield 1 and the temp hit 92C (I had 70C with the 24"). Need for Speed Rivals played at 69C.
> Is it normal to get higher temps with 2K-4K monitors?


Upload bios?


----------



## Geoclock

Any new BIOS for the MSI R9 390X?
I have MS-V30823-F2 now.
Thanks.


----------



## chris89

Quote:


> Originally Posted by *Geoclock*
> 
> Any new BIOS for MSI r9 390X?
> I got now MS-V30823-F2 .
> Thanks.


*Cool, dude. This one is set for 84C max temps, and I hope the fan is set right... ? It has a delimited power limit, so it's really fast... let me know on VRM temperatures.









thanks

GeoClock-390x.zip 99k .zip file
*


----------



## Geoclock

I have the fan at 50% all the time, plus the case fans are at max (adjustable) while gaming, but I never had 92C before, especially since it's under Arctic MX-4 paste.
What's the latest MSI R9 390X BIOS? Maybe an update will help.
Any suggestions?


----------



## chris89

Quote:


> Originally Posted by *Geoclock*
> 
> I have the fan at 50% all the time, plus the case fans are at max (adjustable) while gaming, but I never had 92C before, especially since it's under Arctic MX-4 paste.
> What's the latest MSI R9 390X BIOS? Maybe an update will help.
> Any suggestions?


*

atiflash_274.zip 1214k .zip file
*

*

GeoClock-390x.zip 99k .zip file
*


----------



## Geoclock

Will it work for the MSI R9 390X?
I know all the others got original firmwares from XFX, ASUS, PowerColor and so on.


----------



## stephenn82

Quote:


> Originally Posted by *Geoclock*
> 
> Will it work for the MSI R9 390X?
> I know all the others got original firmwares from XFX, ASUS, PowerColor and so on.


He wants you to dump yours so he can mod it from your stock BIOS, so it's guaranteed to work for your card.


----------



## Geoclock

I'd better wait for an MSI release.


----------



## chris89

***Make sure MSI Afterburner is closed and all apps are closed. If it freezes, hold power and restart; it's because of MSI Afterburner. Then flash and don't click anything; it takes 5 minutes.**

This is your card and the BIOS you mentioned, which I modded for a stable, powerful gaming experience. As fast as possible and as cool as possible.

https://www.techpowerup.com/vgabios/173154/msi-r9390x-8192-150521-1

GeoClock-390x.zip 99k .zip file
*


----------



## Geoclock

Thanks a lot. How much performance will it gain, and how much cooler will it run? 5-10%?
Why didn't I have high temps with the 24" monitor?
And is it flashable back, just in case?


----------



## matthew87

@Chris89

If you get the opportunity could you please provide me with two BIOSs configured with:

1366mv-988mv-1172mhz-1563mhz

1366mv-1000mv-1172mhz-1563mhz

I just need a fraction more voltage at the core to run 1172 stable; I get the odd tiny flicker and artefact when benchmarking. 100% stable otherwise, yet to see it crash a benchmark or game. But I want perfection!

Again, thanks for all your time and help. It's greatly appreciated.


----------



## stephenn82

Quote:


> Originally Posted by *Geoclock*
> 
> Thanks a lot. How much performance will it gain, and how much cooler will it run? 5-10%?
> Why didn't I have high temps with the 24" monitor?
> And is it flashable back, just in case?


Yes, you can flash stock back... get GPU-Z and save your BIOS. Plus, upload a copy of it here so it can be modded.

Do you have an iGPU or anything else to provide graphics in case you brick the card? That way you can flash the card's BIOS back to stock and recover everything... I had to do this once when the voltage wasn't high enough to keep it running; I was testing for the minimal stable volts for an overclock.


----------



## chris89

Quote:


> Originally Posted by *matthew87*
> 
> @Chris89
> 
> If you get the opportunity could you please provide me with two BIOSs configured with:
> 
> 1366mv-988mv-1172mhz-1563mhz
> 
> 1366mv-1000mv-1172mhz-1563mhz
> 
> I just need a fraction more voltage at the core to run 1172 stable, I get the odd tiny flicker and artefact when benchmarking. 100% stable otherwise, yet to see it crash a benchmark or game. But I want perfection!
> 
> Again, thanks for all your time and help. It's greatly appreciated.












*

1366mv-988mv-1172mhz-1563mhz-84c-24c-hyst.zip 42k .zip file


1366mv-1000mv-1172mhz-1563mhz-84c-24c-hyst.zip 42k .zip file


1388mv-1000mv-1188mhz-1563mhz-84c-24c-hyst.zip 42k .zip file
*


----------



## Cherryblue

Hi,

I have a Sapphire Nitro Backplate R9 390X and wish to get the most efficiency out of the card, but can't afford to look back through the 1,150 pages of this topic







...

I searched for efficiency optimization but couldn't find anything.

Would you mind giving me some help/knowledge on this side?







.

(My goals are the best temp/power consumption... and mining power haha, trying to mine with this.)

My current configuration:
- Corsair Gold SFX 600W
- APU A10-7060K
- ITX FM2+ gigabyte motherboard
- Fractal Node 500 (ITX Case)
- Corsair H75 WC AIO

Thanks in advance for any clue!


----------



## componentgirl90

https://www.evga.com/products/Specs/PSU.aspx?pn=81e4cda4-e770-4372-91a9-55ab4bb5b12b

Is this PSU ok for the 390x on the following system:

SSD
E7500 Core 2 Duo
4gb DDR2 Ram

The power calculator gives me 403W, but I'm not sure that value is correct; also the power supply is Bronze, so I'm not sure it will deliver a full 500W.

I want to risk it because I don't want to shell out for a new PSU, but I'm not sure it's a good idea.
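For a rough sanity check on that 403W figure, here's the kind of sum those calculators do. The per-component numbers below are assumptions (typical board-power/TDP figures for a 390X and an E7500 system), not measurements:

```python
# Rough system-draw estimate for the 500W PSU question above.
# Figures are assumed typical board-power numbers, not measurements:
# 390X board power ~275W, E7500 TDP 65W, rest of the system ~60W.
draws_w = {"R9 390X": 275, "E7500": 65, "board/RAM/SSD/fans": 60}

total = sum(draws_w.values())
headroom = 500 - total
print(f"~{total}W estimated draw, ~{headroom}W headroom on a 500W unit")
```

On the Bronze question: the 80 Plus rating is about efficiency at the wall, not capacity, so a decent 500W Bronze unit should still deliver 500W DC; the real concern is that ~100W of headroom is tight for a card with 275W-class spikes.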


----------



## bluej511

Quote:


> Originally Posted by *componentgirl90*
> 
> https://www.evga.com/products/Specs/PSU.aspx?pn=81e4cda4-e770-4372-91a9-55ab4bb5b12b
> 
> Is this PSU ok for the 390x on the following system:
> 
> SSD
> E7500 Core 2 Duo
> 4gb DDR2 Ram
> 
> Power calculator gives me 403W but not sure if this value is correct and also the power supply is bronze so not sure if it will deliver 500W.
> 
> I want to risk this because i dont want to shell out for a new PSU but not sure if is a good idea.


Should be fine. However, with a Core 2 Duo the CPU might peak at all times and so will the GPU, so it will be pretty close to full power at all times, unless you're using vsync and capping the FPS. May I ask why such a powerful GPU is going in with such a weak CPU and RAM?


----------



## PhantomLlama

First time posting on this thread...so sorry for jumping right in.

I've had my 390x for a couple years now and love it. The thing is...I've been running it with an i7-2700k and an Asus Sabertooth Z77 Mobo that whole time. My new parts are coming in Monday and are as follows:

I'm upgrading to:
i7-7700K Processor
Retaining CPU cooler
ROG Maximus IX Hero Z270 Motherboard
Retaining Sapphire Radeon R9 390x (for now)
16GB DDR4 Corsair 3000 RAM
256GB SSD for a boot drive (finally making the jump I should have done years ago!)
Retaining a couple of my other hard drives for storage
750W RMx Corsair Power Supply
Oh and throwing in some new Noctua fans.

All mounted in my Storm Trooper case.

I suppose I'm just curious if I will see a major boost in the Graphics Card's performance now that I will FINALLY have an updated CPU and Mobo.

Thanks guys! (I'll post pics once I'm done)


----------



## Seahawkshunt

Quote:


> Originally Posted by *PhantomLlama*
> 
> First time posting on this thread...so sorry for jumping right in.
> 
> I've had my 390x for a couple years now and love it. The thing is...I've been running it with an i7-2700k and an Asus Sabertooth Z77 Mobo that whole time. My new parts are coming in Monday and are as follows:
> 
> I'm upgrading to:
> i7-7700K Processor
> Retaining CPU cooler
> ROG Maximus IX Hero Z270 Motherboard
> Retaining Sapphire Radeon R9 390x (for now)
> 16GB DDR4 Corsair 3000 RAM
> 256GB SSD for a boot drive (finally making the jump I should have done years ago!)
> Retaining a couple of my other hard drives for storage
> 750W RMx Corsair Power Supply
> Oh and throwing in some new Noctua fans.
> 
> All mounted in my Storm Trooper case.
> 
> I suppose I'm just curious if I will see a major boost in the Graphics Card's performance now that I will FINALLY have an updated CPU and Mobo.
> 
> Thanks guys! (I'll post pics once I'm done)


Nice build. I just recently upgraded from a FX-8310 to a R5 1600 and can share some info.
(Not sure if CPU vendor will matter much with GPU score comparatively)
With the FX my max Graphics Score was 14840, CPU at 5.175ghz http://www.3dmark.com/fs/11099784
With the R5 my max Graphics Score was 15151, CPU at 4.19ghz http://www.3dmark.com/fs/13159284
*only difference was mem speed.
So within the margin of error as far as I can tell. I was also hoping for a boost in my 390 with a newer CPU but I am not seeing it. Still really happy with how well the 390x performs.


----------



## componentgirl90

Quote:


> Originally Posted by *bluej511*
> 
> Should be fine. However, with a Core 2 Duo the CPU might peak at all times and so will the GPU, so it will be pretty close to full power at all times, unless you're using vsync and capping the FPS. May I ask why such a powerful GPU is going in with such a weak CPU and RAM?


Hi bluej, thanks for the quick reply.

My 390X has had problems with the fans. I want to test the card in that system so I can get a better idea of what is wrong (eliminate possibilities), as long as the GPU can go to full load in Heaven and doesn't shut down from drawing too much power.


----------



## PhantomLlama

Quote:


> Originally Posted by *Seahawkshunt*
> 
> Nice build. I just recently upgraded from a FX-8310 to a R5 1600 and can share some info.
> (Not sure if CPU vendor will matter much with GPU score comparatively)
> With the FX my max Graphics Score was 14840, CPU at 5.175ghz http://www.3dmark.com/fs/11099784
> With the R5 my max Graphics Score was 15151, CPU at 4.19ghz http://www.3dmark.com/fs/13159284
> *only difference was mem speed.
> So within the margin of error as far as I can tell. I was also hoping for a boost in my 390 with a newer CPU but I am not seeing it. Still really happy with how well the 390x performs.


Thank you! Really looking forward to seeing how it all works, and thanks for the words. I mean, I know my system has been bottlenecked by the CPU and older board, so we will have to see what happens!

Bit of extra info: it's a Sapphire R9 390X Tri-X. Seems some people like Sapphire, others not so much... I figure if I have one more card fail on me (two over the years) I'll go with another brand.


----------



## stephenn82

Sapphire is pretty good, man! They're the best for AMD cards... just like EVGA for Nvidia.

You will notice a slight pick-me-up performance-wise, but not a ton. The smoothness and crisp response in games will be a welcome change!


----------



## PhantomLlama

Quote:


> Originally Posted by *stephenn82*
> 
> Sapphire is pretty good, man! They're the best for AMD cards... just like EVGA for Nvidia.
> 
> You will notice a slight pick-me-up performance-wise, but not a ton. The smoothness and crisp response in games will be a welcome change!


Well that's good to hear, on both accounts! Sapphire always struck me as being solid, but dead cards make a person wonder. I've had good experiences both times RMAing cards, so that's something. Hopefully I won't be needing to upgrade the GPU anytime soon, so that's a bridge for another day haha


----------



## stephenn82

Dead cards... either a bad PSU with a massive OC, or they just don't get in there and clean their computer parts that often. I get in there every three months and really give it a thorough cleaning. My 7870 from 2012 is still kicking hard in the kids' machine.


----------



## PhantomLlama

I certainly won't rule out PSU issues, especially since I used a 5.25 bay supplemental power supply. And I had tried OCing at one point. So I'm sure I didn't do it any favors. I've been going easy on the latest card. I've stopped OCing the whole way around.


----------



## componentgirl90

I was thinking of possibly turning my 390X into an Ethereum mining card for the rest of its life, as it is actually profitable enough to be worth it.

My electricity rate is about $0.17 per kWh (I live in the UK)...

How much could you reduce the power draw by? (Currently pulling 275W, I believe.)


----------



## stephenn82

Your card can turn what, 30MH/s? Let's say total system power consumption is 340 watts.
You will make 21 bucks a month (at the current market).
Not worth it. You would make waaaay more money just selling the card. Don't put it through that torture... it loves you









OK, for real though. Selling it for 250 or so is almost 11 months' profit up front.
Check this out. I'm sure you have.
https://www.cryptocompare.com/mining/calculator/eth?HashingPower=30&HashingUnit=MH%2Fs&PowerConsumption=340&CostPerkWh=0.17
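For reference, the math behind that calculator link is simple enough to sketch. The revenue-per-MH figure below is an assumption (roughly in line with mid-2017 ETH payouts), not a live quote; real revenue moves with network difficulty and coin price:

```python
# Sketch of the profitability math the calculator link above performs.
# revenue_per_mh_day is an ASSUMED figure (~mid-2017 ETH), not live data.
hashrate_mh = 30.0
system_watts = 340.0
cost_per_kwh = 0.17
revenue_per_mh_day = 0.07   # USD per MH/s per day (assumed)

revenue = hashrate_mh * revenue_per_mh_day            # $/day gross
power_cost = system_watts / 1000 * 24 * cost_per_kwh  # $/day electricity
profit = revenue - power_cost

print(f"revenue ${revenue:.2f}/day, power ${power_cost:.2f}/day, "
      f"profit ${profit * 30:.0f}/month")
```

At these assumed numbers the power bill eats about two-thirds of the gross revenue, which is why selling the card up front comes out so far ahead.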


----------



## Cherryblue

While I agree you have to be careful about your system health and all of that, it's not torture if your card is cool.

Reducing the consumption is also part of reducing the temp.

I'm trying to do the same thing and am searching for the perfect ratio between consumption and hash delivery.


----------



## xtuu

Could someone help me?
I have a PowerColor R9 390 Dual (16GB) and I have not found any modded BIOS for mining.
I have searched the forum but have not found anything.
Regards.


----------



## componentgirl90

Quote:


> Originally Posted by *stephenn82*
> 
> Your card can turn what, 30MH/s? Let's say total system power consumption is 340 watts.
> You will make 21 bucks a month (at the current market).
> Not worth it. You would make waaaay more money just selling the card. Don't put it through that torture... it loves you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OK, for real though. Selling it for 250 or so is almost 11 months' profit up front.
> Check this out. I'm sure you have.
> https://www.cryptocompare.com/mining/calculator/eth?HashingPower=30&HashingUnit=MH%2Fs&PowerConsumption=340&CostPerkWh=0.17


Haha, OK. But seriously, even a GTX 1070 would not make much more in my country.

I don't feel right selling it because I know my card has a problem which I have yet to resolve, although I will be trying again this month.


----------



## Seahawkshunt

AMD Crimson driver 17.7.2 does not work with MSI Afterburner. Afterburner has lost the ability to change anything except fan control for me. I have tried several clean installs to no avail. Anyone else seeing this problem? Anytime I have MSI Afterburner running, I am stuck at 300MHz on the core.


----------



## stephenn82

I've been having that issue with any Crimson driver past 16.11.5.


----------



## stephenn82

Quote:


> Originally Posted by *componentgirl90*
> 
> Haha, OK. But seriously, even a GTX 1070 would not make much more in my country.
> 
> I don't feel right selling it because I know my card has a problem which I have yet to resolve, although I will be trying again this month.


Reapply paste and try a custom BIOS.


----------



## Maikalwolf

Good day, folks. I need your advice. I own two R9 390s, both over a year and a half old. A month and a half ago one of them was having overheating problems, bad to the point of rebooting my PC even while browsing, within minutes of turning the PC on. So I did an RMA, waited 30 days for a replacement, and the replacement turned out to have black-screen issues within half an hour of being online. I had bought a second R9 during the wait for the RMA, and that one works great, no problems. I got it used for under $280 US. Considering the prices for that card on eBay last month, I got lucky!

Anyway, I just got a response from MSI's RMA department, and they are again telling me they have no R9 390s in stock and are offering me a Radeon RX 470 ARMOR 4G OC as a replacement. Now, I love my R9 390, and I know it's slightly better than the RX 470 from doing my research. But I wanted to hear from you folks: is it a worthwhile swap, or should I hold out for an R9 390 replacement? The second one I bought still has a little over a year to go on its warranty. The one I RMA'ed also has a year left on its warranty. So what do you think? Of course they are not telling me whether the RX 470 is new either; probably another refurb, but I sent an email back asking about that too.

So what do you think? No, I don't use them both at the same time; I'm saving for a better monitor before even trying that. And even then, I'm really not sure I will use them together.


----------



## Mister300

Yea, AB not working here either.


----------



## stephenn82

Quote:


> Originally Posted by *Maikalwolf*
> 
> Good day, folks. I need your advice. I own two R9 390s, both over a year and a half old. A month and a half ago one of them was having overheating problems, bad to the point of rebooting my PC even while browsing, within minutes of turning the PC on. So I did an RMA, waited 30 days for a replacement, and the replacement turned out to have black-screen issues within half an hour of being online. I had bought a second R9 during the wait for the RMA, and that one works great, no problems. I got it used for under $280 US. Considering the prices for that card on eBay last month, I got lucky!
> 
> Anyway, I just got a response from MSI's RMA department, and they are again telling me they have no R9 390s in stock and are offering me a Radeon RX 470 ARMOR 4G OC as a replacement. Now, I love my R9 390, and I know it's slightly better than the RX 470 from doing my research. But I wanted to hear from you folks: is it a worthwhile swap, or should I hold out for an R9 390 replacement? The second one I bought still has a little over a year to go on its warranty. The one I RMA'ed also has a year left on its warranty. So what do you think? Of course they are not telling me whether the RX 470 is new either; probably another refurb, but I sent an email back asking about that too.
> 
> So what do you think? No, I don't use them both at the same time; I'm saving for a better monitor before even trying that. And even then, I'm really not sure I will use them together.


Nope. Tell them it's not equal. No downgrade. Hook it up with a 480 minimum... let's talk 580!!


----------



## stephenn82

Quote:


> Originally Posted by *Mister300*
> 
> Yea,AB not working either.


I haven't gotten AB to work post-16.11.5. Clean install with DDU and NEVER opened WattMan. Still no dice. Just create a custom BIOS and never go back to WattMan, AB, or any other app.


----------



## Maikalwolf

Quote:


> Originally Posted by *stephenn82*
> 
> Nope. Tell them it's not equal. No downgrade. Hook it up with a 480 minimum... let's talk 580!!


That's what I thought. I was going to hold out for another R9, but thought a 580 or 1060/70 would be ideal even. So you are saying a 480 4GB or 8GB Gaming would be an even swap, or just the minimum? I also did not like the 470 having no backplate for stability!


----------



## Cherryblue

Quote:


> Originally Posted by *Seahawkshunt*
> 
> AMD Crimson driver 17.7.2 does not work with MSI Afterburner. Afterburner has lost the ability to change anything except fan control for me. I have tried several clean install to no avail. Anyone else seeing this problem? Anytime I have MSI Afterburner running I am stuck at 300mhz on the core.


Exactly the same here.

17.7.1 was good: DDU, then driver, then Afterburner, and everything was fine.

Can't do the same with 17.7.2; Afterburner can't set a lot of things... Here we go again... why, AMD, why..


----------



## stephenn82

Quote:


> Originally Posted by *Maikalwolf*
> 
> That's what I thought. I was going to hold out for another R9, but thought a 580 or 1060/70 would be ideal even. So you are saying a 480 4GB or 8GB Gaming would be an even swap, or just the minimum? I also did not like the 470 having no backplate for stability!


I would only go 8GB for RAM. Even swap or nothing!!

A 1060 with 6GB would be OK. It's the king of price-to-performance at 1080p. It's OK at 1440p. The 1070 is pretty good there. The 1080 is amazing at 1440p. My friend has a 1440p 144Hz G-Sync monitor and BF1 is epicly smooth on his computer.


----------



## Maikalwolf

Timing is everything; glad I saw this. The update reached me today. I will stay on 17.7.1 until something changes with Afterburner.


----------



## ziggystardust

Hey folks,

I noticed something weird recently. In Radeon Settings > Monitor > Technical Properties; Current HDCP Status seems disabled. What could be the reason for this? Any of you have a similar problem?

I have Nitro 390x and ViewSonic XG2401, connected via displayport and Win10 pro with latest updates. I'm also on 17.7.2.


----------



## stephenn82

Quote:


> Originally Posted by *ziggystardust*
> 
> Hey folks,
> 
> In Radeon Settings > Monitor > Technical Properties; Current HDCP Status seems disabled.


I don't even have that as an option.


----------



## ziggystardust

Quote:


> Originally Posted by *stephenn82*
> 
> I don't even have that as an option.


It's not an option, though. If you have Radeon Settings installed, you should have it in the Monitor/Screen tab; at the top right there is Technical Properties, and under that you can see Current HDCP Status. It's not a toggle, and I don't understand why it shows as disabled for me. Really weird.


----------



## Maikalwolf

Okay, MSI RMA upped the ante to RADEON™ RX 480 GAMING X 8G.


----------



## stephenn82

This is what mine looks like; I don't have any other settings.

http://images.anandtech.com/doci/9811/Display%20Settings.jpg


----------



## Cherryblue

Quote:


> Originally Posted by *stephenn82*
> 
> This is what mine looks like; I don't have any other settings.
> 
> http://images.anandtech.com/doci/9811/Display%20Settings.jpg


That screenshot shows the wrong card; Hawaii wasn't part of the HD 7xxx series.


----------



## stephenn82

This is still what my card looks like when opening settings. There is no separate tab under Display to find any of this. I will post a screenshot once I return home.


----------



## Seahawkshunt

Quote:


> Originally Posted by *stephenn82*
> 
> This is still what my card looks like when opening settings. There is no separate tab under Display to find any of this. I will post a screenshot once I return home.


For Crimson 17.4.4 and earlier, display settings can be found under Preferences.


Spoiler: Warning: Spoiler!








The newest Crimson drivers change all this; display settings can now be found under the Display tab.


----------



## stephenn82

Quote:


> Originally Posted by *Seahawkshunt*
> 
> For Crimson 17.4.4 and earlier display settings can be found under preferences.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> The newest Crimson drivers change all this; display settings can now be found under the Display tab.


How do I find this?

Radeon Settings > Monitor > Technical Properties; Current HDCP Status seems disabled

I have 17.7.2


----------



## stephenn82

@ziggystardust

Known Issues

Some protected content applications may experience an HDCP error code while playing BluRay content. *A workaround is to unplug and re-plug the connected display or turn the display off and then back on.*

Found here

https://community.amd.com/docs/DOC-1832

Hope it helps!


----------



## PhantomLlama

Maybe I'm a total noob here... but I just had a driver update a couple days ago... and now I cannot overclock my card, either through Trixx or Afterburner. Anyone have an idea of how to fix this? I mean, it's not like I need it to play BF1 or anything... but I could overclock before and now I can't. That kind of thing bothers my OCD haha.


----------



## ziggystardust

Quote:


> Originally Posted by *PhantomLlama*
> 
> Maybe I'm a total noob here... but I just had a driver update a couple days ago... and now I cannot overclock my card, either through Trixx or Afterburner. Anyone have an idea of how to fix this? I mean, it's not like I need it to play BF1 or anything... but I could overclock before and now I can't. That kind of thing bothers my OCD haha.


Yes, the latest drivers completely borked AB and Trixx for me too.


----------



## rdr09

Quote:


> Originally Posted by *PhantomLlama*
> 
> Maybe I'm a total noob here...but my just had an update to my drivers a couple days ago....and now I cannot overclock my card... Either through Trixx or Afterburner. Anyone have an idea of how to fix this? I mean... it's not like I need to to play BF1 or anything...but....I could now I can't. That kind of thing bothers my OCD haha.


Quote:


> Originally Posted by *ziggystardust*
> 
> Yes, the latest drivers completely borked AB and Trixx for me too.


try this . . .

http://forums.guru3d.com/showpost.php?p=5458447&postcount=629

Uninstall the old stuff first - Trixx and AB.


----------



## ziggystardust

Quote:


> Originally Posted by *rdr09*
> 
> try this . . .
> 
> http://forums.guru3d.com/showpost.php?p=5458447&postcount=629
> 
> uninstall the old stuff - trixx and ab


Great. Thanks. Gonna try once I get home.


----------



## PhantomLlama

Well then, I'm not crazy! Well... not this time at least. Just downloaded and am checking it out now. Thank you, sir!


----------



## PhantomLlama

Seems to be working for me!


----------



## HeartBreakerTV

Hey guys,

New here and was wondering if I would see any gain from OC'n my SAPPHIRE Radeon R9 390X DirectX 12 100381OCL 8GB 512-Bit GDDR5. I mostly play CS:GO, PUBG, TESO.

I figure if I won't really see any benefit, I'd rather leave it as is to make sure I don't burn the card out before its time. I've had it since August 2015 (2 years).

My system specs:

PSU: Corsair RM 650W
OS: Windows 10
Motherboard: Gigabyte z97x Gaming 5
CPU: i7 4790k
Memory: 8GB G.Skill
Hard Drive: SSD w/ Windows + 500GB Western Digital Black
Video Card: R9 390X

Thanks!


----------



## PhantomLlama

Quote:


> Originally Posted by *HeartBreakerTV*
> 
> Hey guys,
> 
> New here and was wondering if I would see any gain from OC'n my SAPPHIRE Radeon R9 390X DirectX 12 100381OCL 8GB 512-Bit GDDR5. I mostly play CS:GO, PUBG, TESO.
> 
> I figure if I won't really see any benefit, I'd leave it as is to make sure I don't burn the card out before it's meant to. Had it since August 2015 (2 years).
> 
> My system specs:
> 
> PSU: Corsair RM 650W
> OS: Windows 10
> Motherboard: Gigabyte z97x Gaming 5
> CPU: i7 4790k
> Memory: 8GB G.Skill
> Hard Drive: SSD w/ Windows + 500GB Western Digital Black
> Video Card: R9 390X
> 
> Thanks!


Depends how much you're going to overclock the thing. So far I've pushed mine to 1160MHz GPU and 1600MHz RAM, but I really don't use these speeds other than when I'm testing... yet... When I benchmark it in Unigine Heaven, I see a small increase...

Stock I get 61.9 FPS and a score of 1559.

At 1160MHz GPU and 1600MHz RAM I get 64.9 FPS and a score of 1635. Temps hit 66C during benching with the fan still on the default curve. Also, I haven't changed my voltages yet, though I do set the power limit to max.

So, ultimately, it's up to you. Is it worth it? Just ramp it up slowly. Some people will say gains are gains. I still say my bottleneck is my garbage monitors haha
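For anyone wanting to sanity-check numbers like these, here is a rough scaling sketch. The FPS figures are the Heaven results quoted above; the 1060 MHz "stock" core clock is an assumed value for a typical 390X, so substitute your own card's clocks.

```python
# Rough scaling check: how much of the core-clock bump shows up as FPS?
# FPS numbers are the Heaven results quoted above; the 1060 MHz "stock"
# core clock is an assumed value for a typical 390X, not a measured one.

def pct_gain(new, old):
    """Percent gain of `new` over `old`."""
    return (new - old) / old * 100.0

clock_gain = pct_gain(1160, 1060)   # core overclock, in percent
fps_gain = pct_gain(64.9, 61.9)     # Heaven average FPS, in percent

print(f"core clock: +{clock_gain:.1f}%")
print(f"avg FPS:    +{fps_gain:.1f}%")
print(f"FPS gained per % of clock: {fps_gain / clock_gain:.2f}")
```

With these assumed numbers, roughly half of the clock bump shows up as FPS, which is typical when something else (CPU, memory, or monitor) is also a limiter.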


----------



## stephenn82

1176/1563 is absolutely beast clocking for in-game runs. 1625 memory is king for benchies but not real-world.

Try this: 1160/1550 at +50mV, or just move the power slider up to max.

Like it? Then bump voltage to +50mV and run 1176/1565 or 1600. It will make heat... but be hella fast.

Like it even more? Let's talk custom BIOS.


----------



## Seahawkshunt

Quote:


> Originally Posted by *stephenn82*
> 
> Like it even more? Lets talk custom bios.


I have always run the official BIOS and have been able to overclock well - up to 1200/1725: http://www.3dmark.com/fs/13159284 I have run 1170/1625 daily with +30 on core and +50 on memory in MSI AB for the last year with no issue, except the GPU is now a heater when gaming. What kind of benefits would one get from a modified BIOS? Higher clocks or lower power consumption? I would love to dial down my power draw while gaming.

Quote:


> Originally Posted by *HeartBreakerTV*
> 
> Hey guys,
> 
> New here and was wondering if I would see any gain from OC'n my SAPPHIRE Radeon R9 390X DirectX 12 100381OCL 8GB 512-Bit GDDR5. I mostly play CS:GO, PUBG, TESO.
> 
> I figure if I won't really see any benefit, I'd leave it as is to make sure I don't burn the card out before it's meant to. Had it since August 2015 (2 years).
> 
> My system specs:
> 
> PSU: Corsair RM 650W
> OS: Windows 10
> Motherboard: Gigabyte z97x Gaming 5
> CPU: i7 4790k
> Memory: 8GB G.Skill
> Hard Drive: SSD w/ Windows + 500GB Western Digital Black
> Video Card: R9 390X
> 
> Thanks!


I doubt you will see any difference in CS:GO or TESO, but in PUBG you should see a modest increase - 5-10 FPS in the best-case scenario. For me it is worth it when I need that extra headroom to reach 60 FPS at higher graphics settings, or to stay above 60 FPS during intense graphical loads.


----------



## stephenn82

Quote:


> Originally Posted by *Seahawkshunt*
> 
> I have always run the official BIOS and have been able to overclock well - up to 1200/1725: http://www.3dmark.com/fs/13159284 I have run 1170/1625 daily with +30 on core and +50 on memory in MSI AB for the last year with no issue, except the GPU is now a heater when gaming. What kind of benefits would one get from a modified BIOS? Higher clocks or lower power consumption? I would love to dial down my power draw while gaming.


Exactly that. You bake your own fan curve into the BIOS and create your own power table. The stock power chart is flawed and can damage the card - the thermal limit is set at a point where chip meltdown can occur. I drop mine to a safe 80C from the 95C stock. A custom BIOS also lets you delimit the card (think of it as an infinite power slider), allowing it to draw the power it needs - just like sliding the CPU power limit to 4096W even though you will never use it.

You can even lower power draw and drop temps.
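For anyone curious what a fan curve actually is under the hood: it is just a set of (temperature, duty) points that the firmware interpolates between. Here is a minimal sketch - the points below are made-up illustrations, not values from any shipped BIOS.

```python
# Illustrative fan curve of the kind a custom BIOS bakes in: a handful of
# (temperature, fan duty) points with linear interpolation between them.
# The points below are made-up examples, not values from any real BIOS.

CURVE = [(40, 20), (60, 35), (75, 60), (80, 100)]  # (deg C, fan duty %)

def fan_duty(temp_c):
    """Fan duty in percent for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]        # below the curve: minimum duty
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]       # above the curve: full blast
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

for t in (35, 55, 70, 85):
    print(f"{t} C -> {fan_duty(t):.0f}% fan")
```

Tools like Afterburner and BIOS editors expose exactly this kind of point list; the trade-off is always noise below the knee versus headroom above it.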


----------



## PhantomLlama

Alright, sir. Let's talk custom BIOS if it really does help. Help a newbie learn this sorcery!

Just ran a stress test: +50mV, maxed power limit, GPU = 1175MHz, RAM = 1625MHz. It appears stable, but I'll have to run a longer test.

Unigine Heaven yielded 66.8 FPS and a score of 1684. I also adjusted the fan curve to be a bit more aggressive; it never got above 60C.

That said, with that overclock I'm seeing an average gain of 4.9 FPS and a score increase of 125. Now I'm not sure what that would equate to in actual gaming gains, but that's what we are looking at. Honestly, if you want to play it safe, run the card stock or just slightly overclocked (that way you can say you have it overclocked, right? lol).

I tried running the core at 1200MHz, but I got some bad artifacts and screen tearing.

But I am curious about a different BIOS and how it could affect things.


----------



## stephenn82

Quote:


> Originally Posted by *PhantomLlama*
> 
> Alright, sir. Let's talk custom BIOS if it really does help. Help a newbie learn this sorcery!
> 
> Just ran a stress test: +50mV, maxed power limit, GPU = 1175MHz, RAM = 1625MHz. It appears stable, but I'll have to run a longer test.
> 
> Unigine Heaven yielded 66.8 FPS and a score of 1684. I also adjusted the fan curve to be a bit more aggressive; it never got above 60C.
> 
> That said, with that overclock I'm seeing an average gain of 4.9 FPS and a score increase of 125. Now I'm not sure what that would equate to in actual gaming gains, but that's what we are looking at. Honestly, if you want to play it safe, run the card stock or just slightly overclocked (that way you can say you have it overclocked, right? lol).
> 
> I tried running the core at 1200MHz, but I got some bad artifacts and screen tearing.
> 
> But I am curious about a different BIOS and how it could affect things.


@chris89 you have been summoned to explain your magical sorcery in depth. I would link him to a post from somewhere around the 900-comment mark, but I cannot find it at the moment.

He will explain it best, along with the science behind it. He also gave me some insight over PM, and I had a hand at tuning my own BIOS for the performance I wanted. He helped me raise my Time Spy score from around 4100 up to 4470. I went from stutter/chop in BF1 to smooth as butter, always holding 80+ FPS to avoid tearing and keep a competitive edge in BF1... until the most recent Windows and game update. I'm thinking it's the game and NOT the card, and I uninstalled the game... for the 5th time. All other games run smooth as melted butter on a short stack of pancakes.

The fan curve is quiet and keeps the card cool. Max temps I saw with his setup were 65C on core and 71C on RAM. I tweaked the fan curve a slight bit and got 61C/65C. I also popped the cooler off and replaced the TIM with Gelid GC Extreme... that alone dropped temps from 76C to 67C on a stock MSI R9 390.


----------



## Dundundata

Quote:


> Originally Posted by *rdr09*
> 
> try this . . .
> 
> http://forums.guru3d.com/showpost.php?p=5458447&postcount=629
> 
> uninstall the old stuff - trixx and ab


----------



## Cherryblue

Guys, I'm wondering about changing thermal paste of my 390x, and here is my question:

Should I also apply thermal paste to the RAM chips? Why or why not? Does it help? And is the chip's top surface plastic or silicon?


----------



## rdr09

If you have plenty and you've got the cooler off - why not? When I put water blocks on my 290, I did. Just a dot:

Dot - thermal tape - Dot

I figured it might help transfer heat, since paste is better than tape. Actually, it was suggested to me.

Edit: make sure the paste does not conduct electricity. I used MX-2. I used CLU on the core back when I was brave.

Also, I used Fujipoly tape at 17W/mK. Not really necessary.


----------



## Cherryblue

Quote:


> Originally Posted by *rdr09*
> 
> If you have plenty and you got the cooler off - why not? When i put water blocks on my 290, i did. Just a Dot.
> 
> Dot - Thermal tape - Dot
> 
> I figured it might help transfer heat since it is better than tapes. Actually it was suggested to me.
> 
> Edit: make sure the paste does not conduct electricity. I used MX-2. I used CLU on the core back when i was brave.
> 
> Also, i used fujipoly for tape with 17w/mk. Not really necessary.


Well, I don't have any tape (well, none for this kind of thing), but the thermal paste I'm going to put on is Grizzly; it indeed doesn't conduct electricity.

I was wondering how it would work on the RAM chips, since there isn't any thermal paste on them on the Sapphire 390X to begin with - and the top isn't silicon but plastic, isn't it? Maybe paste doesn't work great on it, dunno...


----------



## bluej511

Quote:


> Originally Posted by *Cherryblue*
> 
> Well, I don't have any tape (well, none for this kind of thing), but the thermal paste I'm going to put on is Grizzly; it indeed doesn't conduct electricity.
> 
> I was wondering how it would work on the RAM chips, since there isn't any thermal paste on them on the Sapphire 390X to begin with - and the top isn't silicon but plastic, isn't it? Maybe paste doesn't work great on it, dunno...


Let me help you out here.

1. RAM chips do not run hot at all, especially GDDR5 with its low voltage; hell, even DDR4 system memory at the 1.45V where I have it does not need extra cooling - it's fine with just basic case airflow.

2. Thermal paste, unlike thermal pads, is not made to be used in thickness: the thinner the layer, the BETTER it works. If you put 0.5mm of thermal paste between the RAM chip and the cooler, you might as well not put any thermal paste at all.

3. My Sapphire R9 390 came with thermal pads and so did my waterblock, so that's what I used. If you don't have any, I suggest buying some. They are ridiculously cheap, especially if you get 3-5W/mK; for RAM, nothing higher is needed.

Now for the VRMs: you can put thermal paste between the thermal pad and the cooler to aid heat-transfer efficiency, but I wouldn't use straight thermal paste alone.
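The point about thin paste layers follows directly from the conduction formula R = t / (k * A). A quick back-of-envelope sketch - the paste conductivity and the chip contact area below are illustrative guesses, not measurements:

```python
# Why a thin paste layer beats a thick one: the conductive resistance of
# a flat layer is R = t / (k * A). The k value (~8 W/mK paste) and the
# 15 mm x 15 mm contact area are illustrative guesses, not measurements.

def layer_resistance(thickness_m, k_w_mk, area_m2):
    """Thermal resistance of a flat layer, in K/W."""
    return thickness_m / (k_w_mk * area_m2)

area = 0.015 * 0.015                           # 15 mm x 15 mm, in m^2
thin = layer_resistance(0.05e-3, 8.0, area)    # 0.05 mm film
thick = layer_resistance(0.5e-3, 8.0, area)    # 0.5 mm blob

print(f"0.05 mm film: {thin:.3f} K/W")
print(f"0.50 mm blob: {thick:.3f} K/W ({thick / thin:.0f}x the resistance)")
```

Resistance scales linearly with thickness, so a layer ten times thicker conducts ten times worse - which is why pads (designed to bridge a gap) and paste (designed to fill microscopic roughness) are not interchangeable.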


----------



## rdr09

Quote:


> Originally Posted by *Cherryblue*
> 
> Well, I don't have any tape (well, none for this kind of thing), but the thermal paste I'm going to put on is Grizzly; it indeed doesn't conduct electricity.
> 
> I was wondering how it would work on the RAM chips, since there isn't any thermal paste on them on the Sapphire 390X to begin with - and the top isn't silicon but plastic, isn't it? Maybe paste doesn't work great on it, dunno...


I mainly put paste on so it would be easier to clean. But I have not touched it for four years, lol. Temps are still the same - probably because I moved and my ambient is now around 19C. If I remember correctly, it was part of the instructions for the EK waterblock installation. Not sure.


----------



## Cherryblue

Thank you both. I ordered thermal tape yesterday in order to put some on the RAM chips and/or VRMs if needed.

I'm actually idling at around 50°C; on the other hand, I get 38°C on the R5 [email protected];1.25v in a Fractal ITX-type case, using a Corsair H75 watercooler on the processor.

I feel like my Hawaii card really is too hot doing nothing - so much so that I can't even use the "fans off" functionality.

I'll be back to tell you what I gained from the re-pad.

EDIT: Oh, I'm stupid - why describe my build that way when I can give you the userbenchmark runs here.


----------



## rdr09

Quote:


> Originally Posted by *Cherryblue*
> 
> Thank you both. I ordered thermal tape yesterday in order to put some on the RAM chips and/or VRMs if needed.
> 
> I'm actually idling at around 50°C; on the other hand, I get 38°C on the R5 [email protected];1.25v in a Fractal ITX-type case, using a Corsair H75 watercooler on the processor.
> 
> I feel like my Hawaii card really is too hot doing nothing - so much so that I can't even use the "fans off" functionality.
> 
> I'll be back to tell you what I gained from the re-pad.
> 
> EDIT: Oh, I'm stupid - why describe my build that way when I can give you the userbenchmark runs here.


You ordered the right thickness, right?

Add your pc components in your sig.

Nice! R5 1600 system.


----------



## stephenn82

Huh, a good deal on GPU blocks... but not sure if any are compatible with the 390/390X.

http://www.swiftech.com/amd-2.aspx


----------



## Raventlov

Good day folks,

just registered to the forum to ask a question, because I couldn't find any precise answer even though I looked everywhere.

I'm in the process of doing a re-paste job on both CPU and GPU. My GPU is a Sapphire Nitro 390 with the backplate.
For the thermal compound I've chosen Grizzly Kryonaut for both CPU and GPU.

The problem arises because I would like to apply new thermal pads on the GPU for the GDDR, Vcore, and Vmem, and I really don't know the thickness of those.

I know that Alphacool / Fujipoly would be the better choice, but I can't afford that, so, best bang for the buck, I was looking at the Arctic Cooling thermal pads.

(Really odd that I couldn't find many reviews of thermal pads; the only one with some data in it was from this guy on the hardwareluxx.de forum: https://www.hardwareluxx.de/community/f136/alphacool-eisschich-co-waermeleitpad-test-1093326.html)

So, coming to a conclusion: does anyone know what thickness the thermal pads should be?
And I'll gladly listen to any suggestion if you think the Arctic TPs are not the best bang for the buck.

Thanks in advance


----------



## rdr09

Quote:


> Originally Posted by *Raventlov*
> 
> Good day folks,
> 
> just registered to the forum to ask a question, because I couldn't find any precise answer even though I looked everywhere.
> 
> I'm in the process of doing a re-paste job on both CPU and GPU. My GPU is a Sapphire Nitro 390 with the backplate.
> For the thermal compound I've chosen Grizzly Kryonaut for both CPU and GPU.
> 
> The problem arises because I would like to apply new thermal pads on the GPU for the GDDR, Vcore, and Vmem, and I really don't know the thickness of those.
> 
> I know that Alphacool / Fujipoly would be the better choice, but I can't afford that, so, best bang for the buck, I was looking at the Arctic Cooling thermal pads.
> 
> (Really odd that I couldn't find many reviews of thermal pads; the only one with some data in it was from this guy on the hardwareluxx.de forum: https://www.hardwareluxx.de/community/f136/alphacool-eisschich-co-waermeleitpad-test-1093326.html)
> 
> So, coming to a conclusion: does anyone know what thickness the thermal pads should be?
> And I'll gladly listen to any suggestion if you think the Arctic TPs are not the best bang for the buck.
> 
> Thanks in advance


Verify whether this block is compatible with your Nitro. If it is, scroll down to the middle and you'll see the installation manual. There's a good chance the thermal pads used match the originals. Once you take the cooler off, try to save as much of the original pads as you can, label them, and set them aside. You might need them to verify thickness.

https://modmymods.com/alphacool-nexxxos-gpx-ati-r9-390-m01-incl-backplate-1011681.html

I recommend at least 11W/mK for the VRMs. Fujis have them and they're cheaper than the 14W/mK. The memory can take the cheaper ones, like 5-6W/mK. If you can't find the right thickness, get enough of the 0.5mm and just add them up in layers; they should still transfer heat just fine. I would go as far as testing whether the pads are being compressed - meaning putting the assembly together and disassembling again. Skip the paste for the test, but be very careful with the core.


----------



## Raventlov

Ok, from that manual it seems everything is covered with thermal pads of 1.5mm thickness. Honestly, it's hard to tell even looking at disassembly videos like this one.

Of course I could open up my GPU, measure the pads, and then order, but I would like to minimize downtime.


----------



## christoph

I used 1.5mm on my video card.


----------



## rdr09

Quote:


> Originally Posted by *Raventlov*
> 
> Ok, from that manual it seems everything is covered with thermal pads of 1.5mm thickness. Honestly, it's hard to tell even looking at disassembly videos like this one.
> 
> Of course I could open up my GPU, measure the pads, and then order, but I would like to minimize downtime.


Yes, 1.5mm, as christoph also mentioned. All the way at the bottom it tells you the exact length and width of each. For the VRMs and the two outliers, I recommend 11W/mK.


----------



## Raventlov

Hey @rdr09, have you seen the picture I posted in my first post? From that chart it seems the difference between Fujipoly 11W/mK and Arctic 6W/mK is just a mere 2 degrees, while against the 14W/mK it's 10 degrees!
The problem is that the Arctic 145x145x1.5mm costs me 20 Euros, while the Fujipoly / Alphacool 14W/mK 100x100x1.5mm (Sarcon XR-j) is 85 Euros!!!

I'll try to find a better deal for it! Otherwise I'll go with Arctic for everything...
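A back-of-envelope conduction check helps put those W/mK numbers in perspective. The sketch below assumes roughly 5 W flowing through a single 15 mm x 15 mm, 1.5 mm pad - both the load and the area are guesses for illustration, not values from the chart:

```python
# Temperature drop across a 1.5 mm pad for each candidate conductivity.
# The 5 W load per pad and the 15 mm x 15 mm pad area are assumptions
# for illustration only; real VRM loads and areas vary per card.

def delta_t(power_w, thickness_m, k_w_mk, area_m2):
    """Temperature difference in C across a flat layer carrying power_w."""
    return power_w * thickness_m / (k_w_mk * area_m2)

area = 0.015 * 0.015            # 15 mm x 15 mm, in m^2
for k in (6.0, 11.0, 14.0):     # W/mK of the candidate pads
    print(f"{k:>4.0f} W/mK: {delta_t(5.0, 1.5e-3, k, area):.1f} C across the pad")
```

The absolute numbers shift with the assumed power and area, but the shape is inherent to R = t/(kA): the gain from each extra W/mK shrinks as conductivity climbs, so the cheaper pads close most of the gap.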


----------



## rdr09

Quote:


> Originally Posted by *Raventlov*
> 
> Hey @rdr09, have you seen the picture I posted in my first post? From that chart it seems the difference between Fujipoly 11W/mK and Arctic 6W/mK is just a mere 2 degrees, while against the 14W/mK it's 10 degrees!
> The problem is that the Arctic 145x145x1.5mm costs me 20 Euros, while the Fujipoly / Alphacool 14W/mK 100x100x1.5mm (Sarcon XR-j) is 85 Euros!!!
> 
> I'll try to find a better deal for it! Otherwise I'll go with Arctic for everything...


The Arctic would work. Make sure you have enough to spare.


----------



## stephenn82

Let me know. I may do this as well. A sheet of 11W/mK is 45-60 on Amazon here.


----------



## Raventlov

I'll update you when I finish and have results to show... I'm going to have a little holiday soon, so I plan to do all this stuff when I come back. ETA 10 September.

BTW, on the back of the card (Sapphire 390 with backplate) there is a chunk of 3mm thermal pad right behind the Vcore; I'm wondering if beefing up this area would help at all.


----------



## Cherryblue

Quote:


> Originally Posted by *rdr09*
> 
> You ordered the right thickness, right?


Ordered the thinnest I could find quickly - 0.5mm, I think. I had the choice of 1.5mm too.

Why do you ask? Worst case, I can stack three layers to get 1.5mm.


----------



## rdr09

Quote:


> Originally Posted by *Cherryblue*
> 
> Ordered the thinnest I could find quickly - 0.5mm, I think. I had the choice of 1.5mm too.
> 
> Why do you ask? Worst case, I can stack three layers to get 1.5mm.


That would work.


----------



## stephenn82

What thickness to replace all of the pads on my 390? .5, 1.0, or 1.5??


----------



## rdr09

Quote:


> Originally Posted by *stephenn82*
> 
> What thickness to replace all of the pads on my 390? .5, 1.0, or 1.5??


Verify if this is compatible to your 390 . . .

http://www.alphacool.com/download/GPX-A%20390-M02.pdf


----------



## componentgirl90

Hi Guys,

So I tested my 390X on another computer with different components, and the same problem repeated.

So what would I need to do to cool those memory VRMs? A custom BIOS and some sort of paste?


----------



## stephenn82

Quote:


> Originally Posted by *rdr09*
> 
> Verify if this is compatible to your 390 . . .
> 
> http://www.alphacool.com/download/GPX-A%20390-M02.pdf


1.5mm it is then!


----------



## rdr09

Quote:


> Originally Posted by *stephenn82*
> 
> 1.5mm it is then!


You sure? I thought I saw 2mm. Might be safe to get some 0.5mm too.


----------



## stephenn82

Quote:


> Originally Posted by *rdr09*
> 
> You sure? I thought I saw 2mm. Might be safe to get some 0.5mm too.


Yeah, pretty sure:

Remove the protective films of the 15 mm x 15 mm x 1.5 mm thermal pads in red and place them on the memory chips. The orange 15 mm x 15 mm x 2 mm pads are placed on the voltage transformers.

From the third section of the manual.


----------



## rdr09

Quote:


> Originally Posted by *stephenn82*
> 
> Yeah, pretty sure:
> 
> Remove the protective films of the 15 mm x 15 mm x 1.5 mm thermal pads in red and place them on the memory chips. The orange 15 mm x 15 mm x 2 mm pads are placed on the voltage transformers.
> 
> From the third section of the manual.


----------



## stephenn82

Man... should I attempt a custom loop and get an EK block for the 390? Or just hold off and wait for Volta to come out and push the market down for Pascal cards? This is insane: 1070s are selling for more than the MSRP of 1080s... Vega eats a bunch of power and barely puts anything out - other than heat. Especially with the pricing fiasco going on lately from vendors and manufacturers... still waiting on all this price gouging to come down - like 20% down.


----------



## stephenn82

Dangit!

They sell it only in 60x50 sheets, and I would need a 60x60 sheet. That, or buy two sheets and have a lot left over.

Or cut them to 15x12mm. It MIGHT not hurt thermal transfer too much... then again, it might. It has to be better than the tiny thin wafer pads currently on my MSI board, even cut to 15x12, right?


----------



## PhantomLlama

Been watching the whole Vega bit... fiasco is truly a good word for it. I hate the idea of jumping the Radeon ship I've been on since 2004... but unless something huge happens, I may have to. I think it's ridiculous that graphics cards have become such a commodity!

The 390X does just fine IMHO, and I don't think I'd see much improvement with a 580. Vega would be a big increase, but not really worth the money to me at this point... now, if there were a magical driver update... lol


----------



## stephenn82

I don't think a 580 is remotely close to an upgrade over a 390X. Now I wish I had spent the extra 80 bucks to get the 390X over the 390.


----------



## stephenn82

I can shell out 600-plus bucks for an Nvidia card that uses almost no power... or snag another 390 or 390X, CrossFire it up, and put this AX860 to use. Hmmm. The power bill will hate me, but I can always turn off the PC when done lol. A pair of 390s should be good enough to last me into 2019, right?


----------



## PhantomLlama

Quote:


> Originally Posted by *stephenn82*
> 
> I don't think a 580 is remotely close to an upgrade over a 390X. Now I wish I had spent the extra 80 bucks to get the 390X over the 390.


There are a couple of areas where the 580 outperforms the 390X. Not many... not enough to warrant upgrading to the 580.

I'm just holding out hope that AMD fixes their screw up with Vega soon.


----------



## stephenn82

I keep getting strange black screens in BF1. I can see the green and blue markers for my team, swirling sand, and the HUD as expected, but the rest of the game is all black. It was doing it in single player too. Not sure what is up there; it's not overheating, running at 61C on the VRM and 58C on the GPU. I don't understand what is going on. I get a lot of input lag too - I followed Battle(Non)Sense to a T to eliminate lag, and it's still present. Oh well. Waiting for Vega and the GTX 10 series to drop prices to where they SHOULD be.


----------



## chris89

Vega 64 Anyone?

btw long time no see


----------



## stephenn82

Alright everyone, I am going to offer it here first to other members: I have an MSI R9 390 up for sale. I can't post in the selling corner, but if you're interested, PM me.


----------



## archilion

Has anyone found a solution for the 390/X crashing in Chrome/Firefox?


----------



## chris89

Quote:


> Originally Posted by *archilion*
> 
> Has anyone found a solution for the 390/X crashing in Chrome/Firefox?


Send your BIOS - .zip it and attach it via the paperclip. I also suggest DDU, then installing 17.8.2 through Device Manager alone; more FPS, especially.


----------



## archilion

Quote:


> Originally Posted by *chris89*
> 
> Send your BIOS - .zip it and attach it via the paperclip. I also suggest DDU, then installing 17.8.2 through Device Manager alone; more FPS, especially.


This problem has been around since 16.3, when AMD introduced the Power Efficiency toggle. When it's ON, the card is completely stable in those two apps. When it's switched OFF, poof... TDRs and stuff. It only happens in web browsers; it's OK in games.

R9390sapphireBIOS.zip 101k .zip file


----------



## chris89

Quote:


> Originally Posted by *archilion*
> 
> This problem has been around since 16.3, when AMD introduced the Power Efficiency toggle. When it's ON, the card is completely stable in those two apps. When it's switched OFF, poof... TDRs and stuff. It only happens in web browsers; it's OK in games.
> 
> R9390sapphireBIOS.zip 101k .zip file


 Archilion-r9-390-1080mhz-1250mhz-895mv-delimited.zip 101k .zip file


----------



## archilion

Quote:


> Originally Posted by *chris89*
> 
> Archilion-r9-390-1080mhz-1250mhz-895mv-delimited.zip 101k .zip file


Uhm... that's with an OC'd GPU clock and underclocked memory? Why? I want to stay at stock speeds.


----------



## Cherryblue

Quote:


> Originally Posted by *rdr09*
> 
> Verify whether this block is compatible with your Nitro. If it is, scroll down to the middle and you'll see the installation manual. There's a good chance the thermal pads used match the originals. Once you take the cooler off, try to save as much of the original pads as you can, label them, and set them aside. You might need them to verify thickness.
> 
> https://modmymods.com/alphacool-nexxxos-gpx-ati-r9-390-m01-incl-backplate-1011681.html
> 
> I recommend at least 11W/mK for the VRMs. Fujis have them and they're cheaper than the 14W/mK. The memory can take the cheaper ones, like 5-6W/mK. If you can't find the right thickness, get enough of the 0.5mm and just add them up in layers; they should still transfer heat just fine. I would go as far as testing whether the pads are being compressed - meaning putting the assembly together and disassembling again. Skip the paste for the test, but be very careful with the core.


Hi,

So in my case, I'm not going WC, I don't believe it to be fully compatible with nitro.

What I did was changing all my thermal pads & paste.

Mining with -20% consumption i'm currently at 66°C on core, 68°C on VRM Temp1 and 81°C on VRM Temp2.

VRMs seem too hot. What do you think?

I reapplied twice pads & paste cause I was not satisfied with end-result.

The first time, I put two layers of 0.5 mm Arctic Silver thermal pads on everything but the core (where I used Thermal Grizzly paste).

The second (and current) time, I put just one layer and verified that contact was actually made.

Could anyone explain why it's important to go from 0.5 mm to 1 mm or 1.5 mm depending on the card? If the pad touches both sides, I don't see why we should add more layers.
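On the thickness question: a pad only needs to be thick enough to close the gap, because every extra millimetre adds thermal resistance. A minimal 1-D conduction sketch (R = t / (k·A)) illustrates this — the contact area and conductivity figures below are hypothetical examples, not measurements from any card:

```python
# Illustrative 1-D conduction model for a thermal pad: R = t / (k * A).
# All numbers below are hypothetical, for illustration only.

def pad_resistance_c_per_w(thickness_mm: float, k_w_per_mk: float,
                           area_mm2: float) -> float:
    """Thermal resistance of a pad in degrees C per watt: R = t / (k * A)."""
    t_m = thickness_mm / 1000.0      # mm -> m
    a_m2 = area_mm2 / 1e6            # mm^2 -> m^2
    return t_m / (k_w_per_mk * a_m2)

# A hypothetical 10 mm x 10 mm VRM contact patch (100 mm^2), 11 W/mK pad:
r_one_layer = pad_resistance_c_per_w(0.5, 11.0, 100.0)   # single 0.5 mm layer
r_two_layers = pad_resistance_c_per_w(1.0, 11.0, 100.0)  # two stacked layers

# Doubling thickness doubles resistance, so extra layers only make sense
# when needed to close an air gap -- an air gap is far worse, since air
# conducts at roughly 0.026 W/mK, hundreds of times less than the pad.
print(round(r_one_layer, 3), round(r_two_layers, 3))
```

So the rule of thumb in this thread — use the thinnest pad that still makes firm contact — follows directly from the model.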


----------



## chris89

Quote:


> Originally Posted by *archilion*
> 
> Uhm..that's with oc'd gpu clock and underclocked mem? Why? I want to stay with stock speeds.


No power limit... try it and tell me what you think.

@stephenn82 It's cooler, uses less power, and runs faster. Let me know.

The memory underclock doesn't make any noticeable difference either, but the card runs way cooler and faster overall.


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> Archilion-r9-390-1080mhz-1250mhz-895mv-delimited.zip 101k .zip file


Does it work with the Sapphire Nitro 390X?


----------



## componentgirl90

Hey

I just played an FPS game and my 390X is running at 92 degrees Celsius on the core — is that normal? I don't remember it going that high before, although it may have and I never noticed. I seem to remember it being in the 80s, though that might have been the temperature just after minimising the program. The VRMs were at 74 and 73 during the game.

Edit: idling at 34 if no activity on the screen


----------



## PhantomLlama

Quote:


> Originally Posted by *componentgirl90*
> 
> Hey
> 
> I just played a FPS game and my 390x is running at 92 degrees celcius on the core, is that normal? I never remember it going that high before although it may have done and I never noticed. I seem to remember it being in the 80s although that might have been the temps from just after minimising the program. VRMs at 74 and 73 during game.


Seems a bit high to me.. even overclocked mine rarely goes over 75C. Your ambient temps seem alright.

Couple thoughts:

Is your card being starved for air by another card? That would account for the drastic increase in temperature...

What is the ambient temperature in your case when it's running under load? What kind of CPU cooler and airflow do you have? If the temperature is going up in the case a lot that will also account for the temperature of the card.

Third....has the stock cooler ever been taken off the card (did you buy it used)? If so, the thermal paste application may have been sub-par. However, I think it's more likely to be the first two scenarios here.

Make sure the card is getting good air flow, as well as make sure you have good airflow inside the case to keep the hot air out and cool air in.

Hope this helps!


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> Does it work with sapphire nitro 390X?


Send bios & yes sir


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> Send bios & yes sir


Here it is:

Hawaii-Cherryblue-Nitro-390X.zip 99k .zip file


You're doing me a great favor here — if the card really lowers its temps while keeping the same performance, you're a boss.







That's quite the experience you have here.


----------



## componentgirl90

Quote:


> Originally Posted by *PhantomLlama*
> 
> Seems a bit high to me.. even overclocked mine rarely goes over 75C. Your ambient temps seem alright.
> 
> Couple thoughts:
> 
> Is your card being starved for air by another card? That would account for the drastic increase in temperature...
> 
> What is the ambient temperature in your case when it's running under load? What kind of CPU cooler and airflow do you have? If the temperature is going up in the case a lot that will also account for the temperature of the card.
> 
> Third....has the stock cooler ever been taken off the card (did you buy it used)? If so, the thermal paste application may have been sub-par. However, I think it's more likely to be the first two scenarios here.
> 
> Make sure the card is getting good air flow, as well as make sure you have good airflow inside the case to keep the hot air out and cool air in.
> 
> Hope this helps!


I bought this card brand new in the UK in autumn 2015. It has always been on its own in a Corsair Spec 01 case, which has one rear exhaust and two 120 mm intakes. The CPU cooler is the stock Intel one; the card's cooler is the stock XFX DD. I have never physically modified the card.

I honestly have no idea what the ambient temperature in the case is. I'm guessing the card has just seen a lot of gaming — I was heavily into MMOs, about 4,000 hours over 2 years. Earlier in the year it showed some random throttling at idle, for which Chris89 recommended replacing the pads on the VRMs and loading a custom BIOS, which I intend to get done. I will probably write to XFX to ask for instructions on how to do that. I had the opportunity to test a 1070 in the same case and its temperatures were normal (70 degrees under load).


----------



## PhantomLlama

I mean, it could be going bad. I'd do what you're planning. Also make sure there is no dust accumulating.

Otherwise, I'll take it haha


----------



## componentgirl90

You'll take it, eh? Possibly, but only if you have the coin lol.

I will try to fix it. XFX offered to repair it, but I would have had to ship it to Hong Kong and pay loads of fees — no chance.

I might try one last RMA with the retailer, and if that fails I will have to have a crack at it myself, even though I have no idea what I'm doing... I have built 3 PCs before, but this seems more advanced.


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> Here it is:
> 
> Hawaii-Cherryblue-Nitro-390X.zip 99k .zip file
> 
> 
> You're doing me a great favor here, if the card really lowers its temp while keeping same performance, you're a boss
> 
> 
> 
> 
> 
> 
> 
> .
> That's quite the experience you have here.


here ya go









atiflash_277.zip 1189k .zip file


Right-click atiwinflash.exe > Properties > Compatibility > Run as administrator. Make sure MSI Afterburner, HWiNFO, and all other apps are closed. Begin the flash and don't click anything; it will take some time.

Also, I found that installing the 17.8.2 driver alone via Device Manager, without the full suite, speeds things up a lot: run DDU first (without restarting), then install the .INF via Device Manager.

1000mhz-1001mhz-undervolt-delimited-875mv-coolestquietests.zip 101k .zip file


1133mhz-1001mhz-undervolt-delimited-1288mv-875mv.zip 101k .zip file


1133mhz-1001mhz-undervolt-delimited-1333mv-875mv.zip 101k .zip file


1172mhz-1001mhz-undervolt-delimited-1366mv-875mv.zip 101k .zip file


----------



## Geoclock

Can I just reduce the power target to 50% on my MSI 390X in ASUS GPU Tweak II to keep it cool while gaming?
Silent mode is 90%, Gaming is 100%, and OC mode is 110%.
I reduced it to 50% and get a lot of freeze-ups while gaming, but the temperature stays under 70°C; at a 90-100% power target, temps reach 90°C.


----------



## chris89

Quote:


> Originally Posted by *Geoclock*
> 
> Can i reduce for MSI 390x just power target to 50% in ASUS gpu tweak 2 to keep it cool while gaming?
> Silent mode is 90%, Gaming 100% and OC mode 110%.
> I reduced it to 50% and while gaming i have lot of freeze ups but temperature is under 70c., at 90-100% power target temps are reaching 90c.


send bios


----------



## Geoclock

How do I send a zip or rar file? The attachment tool shows me an error: "AJAX response unable to be parsed as valid JSON object."


----------



## chris89

Zip it, then attach it with the paperclip icon.


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> here ya go
> 
> 
> 
> 
> 
> 
> 
> 
> 
> right click atiwinflash.exe properties compatibility run as administrator. make sure msi is closed and hwinfo and all apps. begin flash and don't click anything. it will take some time.
> 
> Plus installing the driver alone via Device Manager 17.8.2 alone without the suite speeds things up a lot I found... DDU first without restart then install the .INF via device manager.


Mate, if you ever come to France, I'll buy you a beer








I've only tried the coolest/quietest one so far — it's pretty good!
Although the power consumption limit doesn't seem to work via Afterburner anymore? Is it no longer available? *Also, I didn't DDU & reinstall — it worked right after flashing. So why reinstall?*

Do you have any experience with or thoughts on VRM thermals on Hawaii?

One of my VRM sensors consistently reads higher than my GPU core temp, which I don't think it should. Since I changed the thermal paste and pads, does that mean I made a bad job of the pads? In games or while mining it can reach 85°C (I didn't check every time), and that's clearly what's keeping my card hot.

What are your VRM temps, guys?


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> Mate, if you ever come to France, I'll buy you a beer
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Only tried the coolest/quiestest one for the moment, it's pretty good!
> Although power consumption limit doesn't seem to work using AB? Not available anymore? *Well I didn't DDU & reinstall, it worked right after flashing.. So why reinstall?*
> 
> Do you have any experience/idea about vrm thermal on hawai?
> 
> One of my vrm sensor is systematically higher than my gpu core temp, I don't think it should. Since I changed thermal paste and pads, that means I made a bad arrangement with pads? In game or mining I can go as high as 85°C (didn't check each time but well..) ; that's clearly the thing keeping my card hot.
> 
> What's yours guys? (vrm temp)


You're welcome. It has no power limit, for peak performance at a low clock speed. Try this one; otherwise I may hex-mod your BIOS with a VRM throttle temperature of 75°C.

1000mhz-1001mhz-coolest-fullspeedfanat67c.zip 101k .zip file


This is how the VRM pad material should look before reassembly and a GPU core repaste, or else the GPU VRM will run too hot. I see 55°C under load on mine at 1133 MHz core.


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> Your welcome, it has no power limit for peak performance at low clock speed. Try this one. Otherwise I may hex mod your bios with a VRM throttle temperature at 75C.
> 
> 1000mhz-1001mhz-coolest-fullspeedfanat67c.zip 101k .zip file
> 
> 
> This is how the vrm pad material should look before reassembly and gpu core repaste or else the gpu vrm will run too hot... I see 55C load on my at 1133mhz core.


Wait, but that's a reference card — on the Sapphire 390, where are the VRMs and whatever else needs pads?

I get a 10-15 degree difference between VRM1 and VRM2.


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> Your welcome, it has no power limit for peak performance at low clock speed. Try this one. Otherwise I may hex mod your bios with a VRM throttle temperature at 75C.
> This is how the vrm pad material should look before reassembly and gpu core repaste or else the gpu vrm will run too hot... I see 55C load on my at 1133mhz core.


Well, I might have badly re-padded the VRMs... the problem is I can't tell which one is affected.








I bought Arctic Silver 0.5 mm pads.

I tried once with two 0.5 mm pad layers, but the job wasn't done well.

I tried a second time with only one layer, done right: before closing everything up, I made sure the pads, VRMs, and cooler were all in contact.

Do you think I need a particular pad thickness?

Otherwise, can I just put thermal paste between the cooler and the VRMs, without pads? Since I have some Thermal Grizzly paste, at least it would be effective.


----------



## Cherryblue

Quote:


> Originally Posted by *christoph*
> 
> wait, but that is a reference video card, in the Sapphire 390 where are the VRM and whatever needs the pad?
> 
> I get a difference 10/15 degrees between the VRM1 and VRM2


It's easy to find the VRMs: they're the components covered by thermal pads.

Open your card, remove the cooler, and you'll see what is cooled by pads. The RAM chips (bigger than the other cooled chips) sit around the GPU core, and the other padded components are the VRMs.


----------



## christoph

Quote:


> Originally Posted by *Cherryblue*
> 
> It's easy to know where are the pads and vrms, since the vrms are padded with thermal pads.
> 
> You open your card, remove the cooler, and you'll see what is "cooled" with pads. There are ram chips (bigger than the other chips cooled appart from the gpu itself) around the gpu core, and the other things are VRMs.


I've already done that — I was asking because maybe some VRMs are not being cooled.


----------



## Raventlov

Quote:


> Originally Posted by *Raventlov*
> 
> I'll update you as i finish and have results to show... i'm goin' to have a little holiday soon, so i plan to do all this stuff when i come back. ETA 10 September
> 
> BTW on the back of the card (390 Sapphire with back plate) there is a chunk of 3mm thermal pad right behind the vCore, wondering if amplifying this area would benefit at all.


So I finished the job. The problem was that the thermal pads on the memory around the GPU are NOT 1.5 mm thick, and I had to reuse the old ones (with 1.5 mm there was no good contact between the heatsink and GPU).

Before:


After:


I don't know yet if I'm going to buy a sheet of 0.5 mm for the memory chips and check the paste job, but for now I think it's good.


----------



## chris89




----------



## Harry604

i have a asus 390 strix..

is there a bios worth flashing ?


----------



## Raventlov

Quote:


> Originally Posted by *chris89*


Impressive... is it water-cooled, undervolted, and overclocked, right? BIOS mod too?


----------



## componentgirl90

I am thinking about buying the Arctic Accelero Xtreme IV.

It seems to have good VRM cooling and comes with pre-applied thermal paste.

Would this be better than my XFX DD cooler? I seem to remember that the stock cooler on the XFX card doesn't have VRM cooling.

I would pay £45 for this product ($59).


----------



## chris89

Quote:


> Originally Posted by *Raventlov*
> 
> Impressive...is it water cooled , under volted and overclocked right? bios mod too?


It's the stock blower, yep.


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> i have a asus 390 strix..
> 
> is there a bios worth flashing ?


Yes — send your BIOS as a .zip, attached with the paperclip icon.


----------



## Raventlov

Quote:


> Originally Posted by *chris89*
> 
> yes send bios .zip paperclip attached


Can I do it too? What magic touch do you put into the modded BIOS?


----------



## stephenn82

He disables the power limit, drops the factory throttle temperature so it won't melt (the default is 94°C; 85°C is much, much safer), and builds in a custom fan curve. Chris89 adjusts the voltage curves as well, to let the card really sing the way it should have from the factory. Believe me, his BIOS running 1140/1563 was way better than my stock BIOS with Afterburner cranked to 1160/1625 and a +20% power slider. Much better performance and less heat!
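A custom fan curve like the one described is just a mapping from temperature to fan duty, interpolated between a handful of set points. A toy sketch of that idea — the curve points below are made up, not taken from any of the BIOSes in this thread:

```python
# Toy model of a GPU fan curve: linear interpolation between
# (temperature, fan %) set points, the same shape of mapping a BIOS
# fan table or Afterburner curve encodes. Points are illustrative only.

CURVE = [(40, 20), (60, 40), (75, 70), (85, 100)]  # (deg C, fan %)

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Interpolate fan duty for a temperature; clamp at both ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(30))    # below the curve: minimum duty
print(fan_percent(67.5))  # halfway between the 60 C and 75 C points
print(fan_percent(90))    # above the curve: full speed
```

Lowering the throttle point from 94°C to 85°C amounts to the same kind of table edit on the BIOS side: the card ramps the fan and drops clocks earlier, so it never sits at the hotter limit.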


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> yes send bios .zip paperclip attached


 harry604strix390.zip 103k .zip file


here is the bios big thanks


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> harry604strix390.zip 103k .zip file
> 
> 
> here is the bios big thanks


Sure thing man, here ya go









Harry604-r9-390-1094Mhz-1250mhz-undervolt-delimit-KOOL-DUDE.zip 103k .zip file


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> Sure thing man, here ya go
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Harry604-r9-390-1094Mhz-1250mhz-undervolt-delimit-KOOL-DUDE.zip 103k .zip file


Every time I try to install drivers, it black-screens, even after using DDU.

Any idea?


----------



## Raventlov

Hey @chris89 this is my bios... it's a 390 sapphire with backplate.

390SapphireBackplate.zip 100k .zip file


Thanks in advance... i'll run a benchmark right now to get a baseline and then post an update!


----------



## componentgirl90

Quote:


> Originally Posted by *stephenn82*
> 
> He disables the power limiting factor, drops the factory down clock temps so it wont melt (default is 94c...85c is much, much safer) and builds in a custom fan curve. Chris89 adjusts the voltage curves as well to let that card really sing how it should have from the factory. Believe me, his bios running 1140/1563 was waaaay better than my stock bios using AB cranking it up to 1160/1625 with 20% power slider. Much, much better performance and less heat!


Mine runs at 93 degrees Celsius. Am I in danger of melting or catching fire?


----------



## Raventlov

So I tried the BIOS made for @archilion, since we have the same card (checked with HawaiiBiosReader).

The results are good: in the Heaven benchmark the score is almost the same (about 15 points lower with @chris89's BIOS), but temps are better, especially on the VRMs, thanks to the drop in VDDCI (I think):

-5 degrees on VRM1, -12 on VRM2.


----------



## PhantomLlama

I could use a new bios for my sapphire tri-x 390x if it will give me some nice performance gains or temp changes!


----------



## chris89

Quote:


> Originally Posted by *PhantomLlama*
> 
> I could use a new bios for my sapphire tri-x 390x if it will give me some nice performance gains or temp changes!


Sure — attach it as a .zip here with the paperclip icon.


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> Everytime I try to install drivers it black screens even after using ddu
> 
> Any idea?


Unusual... no issues with the BIOS here. It should work fine from 1001 MHz up to 1250 MHz memory at 875 mV memory voltage — it saves a lot of power and heat, isn't any slower, and runs really smoothly.

Try this one. These are power-delimited.

1094mhz-1001mhz-875mv-memory-cool.zip 103k .zip file


----------



## chris89

Quote:


> Originally Posted by *Raventlov*
> 
> Hey @chris89 this is my bios... it's a 390 sapphire with backplate.
> 
> 390SapphireBackplate.zip 100k .zip file
> 
> 
> Thanks in advance... i'll run a benchmark right now to get a baseline and then post an update!


Sure man... this is the cool BIOS — I can make a fast one too, but this is for gaming: solid, low heat, very high speed, smooth as butter at maximum detail, zero throttling.

1094Mhz-1001Mhz-Memory-895mv-cool.zip 101k .zip file


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Mine runs at 93 degrees celcius. Am I in danger of melting or catching fire?


Too hot.


----------



## PhantomLlama

Quote:


> Originally Posted by *chris89*
> 
> sure attach via .zip here paperclip


Now here's a stupid question... how? lol. I'm a noob when it comes to BIOS stuff...


----------



## chris89

Quote:


> Originally Posted by *PhantomLlama*
> 
> Now here's a stupid question.... how? lol I'm a noob when it comes to you bios...


https://www.techpowerup.com/vgabios/173845/sapphire-r9390x-8192-150527

Sapphire-TriX-390X-1094Mhz-1001Mhz-DeLimited-895mv-Memory.zip 99k .zip file


Right-click ATIWINFLASH.exe > Properties > Compatibility > Run as administrator.

Make sure everything is closed, including HWiNFO, MSI Afterburner, GPU-Z, etc. — everything.

Open the .rom, hit Flash, don't click anything for 5 minutes, then restart. Done.

atiflash_277.zip 1189k .zip file


GPU-Z.2.2.0.zip 4413k .zip file


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> Unusual... no issues on bios.. it should work fine at 1001mhz memory to 1250mhz at 875mv memory voltage .. saves 50% power heat... like not slower at all .. runs really smoothly.
> 
> Try this one. These are power delimited.
> 
> 1094mhz-1001mhz-875mv-memory-cool.zip 103k .zip file


Thanks, I'll try it out when I get home.


----------



## PhantomLlama

Quote:


> Originally Posted by *chris89*
> 
> https://www.techpowerup.com/vgabios/173845/sapphire-r9390x-8192-150527
> 
> Sapphire-TriX-390X-1094Mhz-1001Mhz-DeLimited-895mv-Memory.zip 99k .zip file
> 
> 
> Right click ATIWINFLASH.exe Properties... Compatibility.. Open As Administrator
> 
> Make sure everything is closed including HWInfo, MSI, GPU, Etc, Everything...
> 
> Open .rom... Flash ... Don't click Anything for 5minutes then restart done
> 
> atiflash_277.zip 1189k .zip file
> 
> 
> GPU-Z.2.2.0.zip 4413k .zip file


Perfect! I flashed my BIOS to the new one. My system booted, haha.

So what did I just do to my system with your help? What should I see differently when running Heaven or one of my profiles? (1175 MHz GPU / 1625 MHz RAM is my max profile, which I really only use when I'm feeling bulletproof.)

I know, I know.... I'm clueless. Regardless, THANK YOU!


----------



## chris89

Quote:


> Originally Posted by *PhantomLlama*
> 
> Perfect! I flashed my Bios to that new one. My system booted haha
> 
> So what did I just do to my system with your help? I know, I know.... I'm clueless. Regardless, THANK YOU!


Open GPU-Z, press ALT + PRINT SCREEN, paste into Paint, then upload the picture here (around 1000x2000 resolution).

In GPU-Z, make sure it shows about 70 billion pixels per second.

Then run some games, and download and run HWiNFO64 so we can make sure the VRM isn't running hot and the core is running cool as well.









http://www.majorgeeks.com/files/details/hwinfo64.html

Do the same for HWINFO for the GPU data screen...

I always change this: /width/500/height/1000 to /width/1000/height/2000 for higher resolution.



----------



## PhantomLlama

Quote:


> Originally Posted by *chris89*
> 
> Open GPUZ and ... Do an ALT + PRINT SCREEN and Paste in Paint... Then upload picture here... 1000x2000 resolution...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GPUz make sure it's 70 billion pixels per second....
> 
> Then run some Games and Download and run HWINFO64 so we can make sure we know the VRM isn't running hot and Core is running cool as well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.majorgeeks.com/files/details/hwinfo64.html
> 
> Do the same for HWINFO for the GPU data screen...
> 
> I always change ths... /width/500/height/1000 to this... higher resolution /width/1000/height/2000
> 


Alright, if everything goes according to plan, I should have a screenshot of GPU-Z attached. Currently running a stock profile in MSI Afterburner (though I haven't opened Afterburner this session).

I haven't got any temps for you just yet, but while I was playing ARK and watching YouTube, all my screens went dead for a good 30 seconds. When they came back on, Firefox was a white window, and the game had crashed without an error message... hmmmmm

NewBIOSGPUZ.png 108k .png file


NewBIOSHWiNFO64.png 87k .png file


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> here ya go
> 
> 
> 
> 
> 
> 
> 
> 
> 
> atiflash_277.zip 1189k .zip file
> 
> 
> right click atiwinflash.exe properties compatibility run as administrator. make sure msi is closed and hwinfo and all apps. begin flash and don't click anything. it will take some time.
> 
> Plus installing the driver alone via Device Manager 17.8.2 alone without the suite speeds things up a lot I found... DDU first without restart then install the .INF via device manager.
> 
> 1000mhz-1001mhz-undervolt-delimited-875mv-coolestquietests.zip 101k .zip file
> 
> 
> 1133mhz-1001mhz-undervolt-delimited-1288mv-875mv.zip 101k .zip file
> 
> 
> 1133mhz-1001mhz-undervolt-delimited-1333mv-875mv.zip 101k .zip file
> 
> 
> 1172mhz-1001mhz-undervolt-delimited-1366mv-875mv.zip 101k .zip file


Hi again Chris








So after a few days I tested all four, and while it's really nice to have a much cooler card (I still have to rework the thermal pads on some of the VRMs, as discussed in the posts in between), it's not stable for gaming on my card.

In every game I play, I get a black-screen pause of about 5 seconds, then it's playable again... until I get the "final black screen" lol.








I tried all four BIOSes and got exactly the same effect with each. I suppose the memory voltage is a little too low?

You're in high demand, and I wouldn't dare ask you for all four BIOSes again — could you perhaps simply update the 1133 MHz / 1001 MHz / 1333 mV one with a few mV more on the memory, like 50 mV?

Thanks a lot mate


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> Hi again Chris
> 
> 
> 
> 
> 
> 
> 
> .
> 
> So after a few days, I tested these four, and while it's really nice to have a card way cooler (and I still have to rework the thermal pad on some of the vrms as exchanged in the posts in between), it's not stable to play on my card.
> 
> Every game I play, I get black screen pause of about 5 seconds, then it's playable again.... until I get the "final black screen" loul
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Tried the four bioses, and got the exact same effects on the four. I suppose it's about the memory voltage being a little too low?
> 
> You have a lot of demand and I wouldn't dare ask you for the four bioses again, could you pehaps simply update the one at 1133mhz - 1001mhz 1333mv with a few mv more on memory? like 50mv?
> 
> Thanks a lot mate


Even the first 1 GHz BIOS does it? Try the 1 GHz one and see...


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> Unusual... no issues on bios.. it should work fine at 1001mhz memory to 1250mhz at 875mv memory voltage .. saves 50% power heat... like not slower at all .. runs really smoothly.
> 
> Try this one. These are power delimited.
> 
> 1094mhz-1001mhz-875mv-memory-cool.zip 103k .zip file


OK, this one worked — thank you!

Should I try overclocking, or just leave it the way it is?


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> ok this one worked.. thank you...
> 
> should i try overclocking or just leave it way it is?


I'd leave it this way. Could you post your HWiNFO screenshot so we can see the maximum VRM temperature and core temperature?

Thanks.

ComponentGirl90 hasn't posted a screenshot either, so yours will help her see how to post one.


----------



## PhantomLlama

I just realized that the GPUZ one was an epic fail... here's another attempt. What do you think, Chris?

NewBIOSInfo.gif 26k .gif file


----------



## chris89

Quote:


> Originally Posted by *PhantomLlama*
> 
> I just realized that the GPUZ one was an epic fail... here's another attempt. What do you think, Chris?
> 
> NewBIOSInfo.gif 26k .gif file


It looks like you didn't flash the BIOS. Close MSI Afterburner and all other applications.

Follow the instructions: open the .ROM I sent you in ATIWinFlash, start the flash, and wait 5 minutes without clicking anything.

*http://www.rarlab.com/rar/winrar-x64-550.exe*

ATIWINFLASH_v277.zip 1189k .zip file


Sapphire-TriX-390X-1094Mhz-1001Mhz-DeLimited-895mv-Memory.zip 99k .zip file


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Mine runs at 93 degrees celcius. Am I in danger of melting or catching fire?


Close MSI Afterburner and all other applications.

Follow the instructions: open the .ROM I sent you in ATIWinFlash, start the flash, and wait 5 minutes without clicking anything.

*http://www.rarlab.com/rar/winrar-x64-550.exe*

XFX-DOUBLE-DISSIPATION-AMD-R9-390X-192W-75C-1094MHZ.zip 99k .zip file


ATIWINFLASH_v277.zip 1189k .zip file


----------



## PhantomLlama

Quote:


> Originally Posted by *chris89*
> 
> It looks like you didn't flash the BIOS. Close MSI Afterburner & All Other Applications.
> 
> Follow The Instructions. Open The .ROM I sent You... In ATIWinFlash & WAIT 5 MINUTES Don't click Anything while Flashing.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *http://www.rarlab.com/rar/winrar-x64-550.exe*
> 
> ATIWINFLASH_v277.zip 1189k .zip file
> 
> 
> Sapphire-TriX-390X-1094Mhz-1001Mhz-DeLimited-895mv-Memory.zip 99k .zip file


Let's try this again....


----------



## chris89

Quote:


> Originally Posted by *PhantomLlama*
> 
> Let's try this again....


----------



## PhantomLlama

Alright, this time it worked. It looks like the default memory has been dropped to 1001 MHz and the default core clock is 1094 MHz, with the top memory speed now limited to 1300 MHz.

Heaven and everything else seems stable (why wouldn't it be, since we're underclocking).

Temps were a cool ~60°C as usual. OK, I think I want to push this card to its max without destroying it, until I can upgrade. With the original BIOS, the best I could do was 1165 MHz on the core and 1625 MHz on the memory, but that would show artifacts on rare occasions. Is there anything we can do to push this? I've never had temp issues.

You're a wizard, man! If only I had the patience to get into this kind of thing and learn, haha!


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> I'd leave it this way and post your HWInfo Screenshot so we can see the Maximum VRM Temperature & Core Temperature?
> 
> Thanks
> 
> ComponentGirl90 hasn't posted a screenshot either so your screenshot will help her know how to post one


I have two 120 mm fans in place of the stock Strix cooler fans, and temps don't go over 57°C; the VRMs are at 62°C and 51°C.

Is there a BIOS for pure gaming performance?


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> i have 2 120mm fans instead of the stock strix cooler fans and temps dont go over 57c vrm is 62 and 51c
> 
> is there a bios for pure performace for gaming


*This BIOS is for gaming (the 1094 MHz BIOS, that is — it's perfect as is, no need to adjust anything; just be happy and game away). We could double the power consumption for maybe 5-10 fps? That's going from 200 to 400 watts. It's very efficient at these clocks (1094 MHz core).

It's inefficient above a 1094 MHz core clock. 1094 MHz is 70 GPixel/s — 70 billion pixels per second — which is amazing on stock (low) voltage.

With a 1094 MHz core clock on this 64-ROP GPU: 1094 x 64 = 70 billion 16 million pixels/s, with 192 billion 544 million texels/s on the 390X and 175 billion 40 million texels/s on the 390.

AMD Vega has 64 ROPs just like the 390/390X, but with 256 TMUs, unlike the 390/390X's 160 to 176 TMUs.

AMD Vega at 1563 MHz x 64 ROPs = 100 billion 32 million pixels per second (100 GPixel/s).
AMD Vega at 1563 MHz x 256 TMUs = 400 billion 128 million texels per second (400 GTexel/s).

AMD Vega should run lower memory clocks, since the HBM clock doesn't help FPS at all and runs crazy hot at its stock 945 MHz at 1.35 V HBM voltage (that's a lot of volts).

AMD Vega at 511 MHz HBM is 320 GB/s at 700 mV, versus 483 GB/s at 1350 mV. Run the temps and power down to below 200 watts and below 50°C at load.

Then clock Vega to 2000 MHz for 128 GPixel/s and 512 GTexel/s.

Just to prove that overclocking is wasteful on power and heat... It's so nice — ultra-smooth 60-70°C gaming, isn't it?*

1172mhz-1563mh-1366mv-core-1000mv-memory.zip 103k .zip file

Quote:


> Originally Posted by *PhantomLlama*
> 
> Alright, this time it worked. Looks like default memory has been dropped to 1001MHz, and default core clock is 1094. Looks like top memory speed is now limited to 1300MHz.
> 
> Heaven and everything seems stable (why wouldn't it if we are underclocking).
> 
> Temps were a cool 60 + - as usual. Okay, I think I want to push this card to its max without destroying it until I can upgrade. With the original bios, the best I could do was 1165MHz on the core and 1625 on the memory, but that would get artifacts on rare occasion. Is there anything we can do to push this? I've never had temp issues.
> 
> You're a wizard man! If only i had the patience to get into this kind of thing and learn haha!


 1172mhz-1563mhz-1366mv-1000mv.zip 99k .zip file


*Might be able to do 1188mhz to 1200mhz on 1366mv & maybe as high as I ran mine up to 1,758mhz memory (450GB/s)

Didn't make much difference though but u could try it.

Thanks Man

1094mhz is amazing and is done on the stock 65288 limit (1.250v max)... that's 70 billion pixels per second... on 200 watts with 256GB/s ram... only 200 watts total

1172mhz is just 75 billion vs 70 billion pixels per second... only 5 billion more... maybe 5-20fps, on 2x the power...

1172mhz needs 1250mv to 1366mv for stability; that's 9.3% more voltage, so at least 9.3% more power and heat, and in practice it uses more than that... it still has a cool fan profile in place to stay cool, so it could work...

1,563mhz memory = 400 gigabytes per second, so only 83GB/s shy of Vega... doesn't matter much, since memory bandwidth doesn't matter here...

It's all about finding the optimal thermal and power specifications in the bios parameters to yield peak performance on a modest, cool-running card.

1094Mhz is an impeccably awesome clock for cool-running performance. Plus 1001Mhz memory is ideal, since you can run it at 875mv vs 1000mv... that's 12.5% less memory voltage and correspondingly less memory power and heat...

I can run the memory at 1250mhz (320GB/s vs 256GB/s) at 875mv... However, it's ideal to run it at the least power consumption possible for continuous efficiency.*
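
For reference, the GB/s figures quoted here all follow from the usual GDDR5 formula: effective data rate is 4x the reported command clock, over the 390/390X's 512-bit bus. A quick sketch (the bus width is the published spec, assumed here, not read from the card):

```python
# GDDR5 peak bandwidth: command clock x 4 (effective data rate) x bus width in bytes.
def gddr5_bandwidth_gbps(clock_mhz, bus_bits=512):
    """Peak theoretical bandwidth in GB/s for a GDDR5 card."""
    return clock_mhz * 4 * (bus_bits // 8) / 1000.0

# These reproduce the 256 / 320 / 400 / 450 GB/s figures quoted in the thread.
for mhz in (1001, 1250, 1563, 1758):
    print(f"{mhz} MHz -> {gddr5_bandwidth_gbps(mhz):.1f} GB/s")
```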


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> Even the first 1ghz bios does it? try 1ghz and see...


That was my last measure before rolling back to stock, but with freezing and black-screening on League of Legends, which isn't even a demanding game, I sadly had to revert to the stock bios.


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> That was my last measure before rolling back to stock, but when freezing and black screening on league of legend which isn't even a demanding game, I had to revert to stock bios sadly.


HWInfo VRM Temps? At Load? Max Temperature?

These are delimited, so they run like a dream.

Coolest & Quietest & Like 60-70C and takes forever to get up there... Less than 200 watts likely

1000mhz-1001mhz-1250mv-895mv-delimited.zip 101k .zip file


Higher Speed / Hotter

1133mhz-1001mhz-1333mv-895mv-delimited.zip 101k .zip file


1133mhz-1250mhz-1333mv-895mv-delimited.zip 101k .zip file


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> HWInfo VRM Temps? At Load? Max Temperature?
> 
> These are delimited, so they run like a dream.
> 
> Coolest & Quietest & Like 60-70C and takes forever to get up there... Less than 200 watts likely
> 
> 1000mhz-1001mhz-1250mv-895mv-delimited.zip 101k .zip file
> 
> 
> Higher Speed / Hotter
> 
> 1133mhz-1001mhz-1333mv-895mv-delimited.zip 101k .zip file
> 
> 
> 1133mhz-1250mhz-1333mv-895mv-delimited.zip 101k .zip file


Thanks a lot Chris, I'll be testing them next week-end since I'm taking off this week for holidays.

Hope I'll come back with great news







.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Too hot.


Thank you


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Close MSI Afterburner & All Other Applications.
> 
> Follow The Instructions. Open The .ROM I sent You... In ATIWinFlash & WAIT 5 MINUTES Don't click Anything while Flashing.
> 
> *http://www.rarlab.com/rar/winrar-x64-550.exe*
> 
> XFX-DOUBLE-DISSIPATION-AMD-R9-390X-192W-75C-1094MHZ.zip 99k .zip file
> 
> 
> ATIWINFLASH_v277.zip 1189k .zip file


Ok I will do that soon, just a bit busy atm.

I ordered an Arctic Hyper IV, and obviously thermal paste, since the paste on the core has gone, as well as the VRM pads, or whatever the original problem was.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Ok I will do that soon, just a bit busy atm.
> 
> I ordered an arctic hyper IV as well as obviously the thermal paste as gone on the core as well as the VRM pads or whatever was the original problem.


I would just buy these heatsinks & thermal glue... glue the heatsinks onto the memory VRMs towards the front of the card... only a couple quid.

http://www.ebay.co.uk/itm/8pcs-15x15mm-heatsink-led-ic-cooler-gold-color-aluminium-/131823788966?hash=item1eb14f63a6:g:~oUAAOSwEeFVROgb

http://www.ebay.co.uk/itm/Halnziye-HY910-5g-tube-Silicone-Heatsink-Plaster-Thermal-Adhesive-Glue-/253115307120?epid=565823394&hash=item3aeed91470:g:UfEAAOSwDk5UDNfG

http://www.ebay.co.uk/itm/Halnziye-HY910-30g-tube-Silicone-Heatsink-Plaster-Thermal-Adhesive-Glue-/263184236605?hash=item3d4700c03d:g:4toAAOSw3YJZXLlm


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> I would just buy this copper heatsink & thermal glue... glue the heatsink on the memory vrm's towards the front of the card... only a couple quid.
> 
> http://www.ebay.co.uk/itm/8pcs-15x15mm-heatsink-led-ic-cooler-gold-color-aluminium-/131823788966?hash=item1eb14f63a6:g:~oUAAOSwEeFVROgb
> 
> http://www.ebay.co.uk/itm/Halnziye-HY910-5g-tube-Silicone-Heatsink-Plaster-Thermal-Adhesive-Glue-/253115307120?epid=565823394&hash=item3aeed91470:g:UfEAAOSwDk5UDNfG
> 
> http://www.ebay.co.uk/itm/Halnziye-HY910-30g-tube-Silicone-Heatsink-Plaster-Thermal-Adhesive-Glue-/263184236605?hash=item3d4700c03d:g:4toAAOSw3YJZXLlm


ok i will do that

I have cancelled the order for the cooler and I will just use the thermal paste I bought for a CPU on the core and keep the stock cooler as well as using the bios.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> ok i will do that
> 
> I have cancelled the order for the cooler and I will just use the thermal paste I bought for a CPU on the core and keep the stock cooler as well as using the bios.


I'd use this thermal paste. It'll last you forever. Amazing stuff. Cheap too.

http://www.ebay.co.uk/itm/Arctic-Silver-Ceramique-2-Tri-Linear-Ceramic-Thermal-Compound-25g-CMQ2-25G-/121177965993?epid=1812813792&hash=item1c36c505a9:g:MssAAOSwPTlTzOQP

Good idea. This thermal glue adhesive works amazingly. Carefully apply a thin, adequate coat over the VRMs (3), then set the heatsink on top.

Wait 10-20 minutes, until it doesn't budge. Don't try to move it too much; be careful.

Then as soon as you power it on, it'll cure real fast. The VRM will drop 50-60C... idling & load about 55C or so.

Check out my RX 460 with RX 560 BIOS; I added more VRM cooling. This thermal glue is so sweet. Now I can clock up to 1,375Mhz, maybe even 1,400Mhz.


----------



## chris89

Can someone please post AIDA64 GPGPU scores for the 390X on a Ryzen system at 4Ghz?

I really wanna see a video on Youtube with the 390X & Ryzen, especially with my BIOS.

Anyone up for it? Anyone down? I'm eager for tons of Youtube Channels with Ryzen 4Ghz 1700 minimum with Different GPU(s)...

GTX 980 vs 390X with my BIOS. Not to mention GPGPU Tests for each one.


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> Thanks a lot Chris, I'll be testing them next week-end since I'm taking off this week for holidays.
> 
> Hope I'll come back with great news
> 
> 
> 
> 
> 
> 
> 
> .


You're welcome. Any updates?


----------



## B'Fish

Hello people,

Just got a r9 390 PCS+ today 2nd hand, really happy with it so far.

ASIC quality 79.5% (which is bad on AMD cards, right? Lower is better??)

*Strange story so here it comes. never had a card that behaved like this one does!*
I have tried overclocking this little beast, upping the power limit to +50%. So far my in-game voltages are around 1.15-1.175v (1175mv in Wattman) and a core clock of 1160mhz; this is currently the max I can achieve. My max voltage in Wattman is 1231mv; if I apply this voltage to the stable clock speed of 1160mhz it throttles. It seems like I hit my power-limit wall (GPU-Z reports a max VDDC-in draw of 355 watts! and averages around 300) <- these readings are from 1175mv and 1160mhz core. I truly believe I can get more out of this beast by upping the power limit inside the bios... my only question is _how_. When I apply 1231mv or even 1250mv with MSI Afterburner at 1160mhz, it just blackscreens / dies on me.

Can anyone help me? Thanks in advance!

P.S. I had a 980 Ti before and this little fella responded better on stock volts than anything higher, which is usual on Maxwell, but Hawaii (Grenada) does scale pretty well with voltage, right??

PSU 750W evga G2 supernova


----------



## chris89

Quote:


> Originally Posted by *B'Fish*
> 
> Hello people,
> 
> Just got a r9 390 PCS+ today 2nd hand, really happy with it so far.
> 
> Asic quality 79.5% ( which is bad on amd cards right? Lower is better??
> 
> *Strange story so here it comes. never had a card that behaved like this one does!*
> I have tried overclocking this little beast upping the powerlimit to +50% so far my in-game voltages are around 1.15-1.175v (1175mv in wattman) And a core clock of 1160mhz. This is currently the max I can achieve. My max voltages in wattman are 1231mv, If i apply this voltage to the stables clockspeeds of 1160mhz it throttles. It seems like I hit my powerlimit wall (Gpu-z reports a max VDDC-in draw of 355watt! and averages around 300) -< these readings are from 1175mv and 1160mhz core clock speeds. I truly believe I can get more out of this beast by upping the powerlimit inside the bios... my only question is _How_ When i apply 1231mv or even 1250mv with msi afterburner and i use 1160mhz it just blackscreens / dies on me.
> 
> Can anyone help me? Thanks in advance!
> 
> P.S. I had a 980ti before and this little fella responded better on stock volts than anything higher which is usuall on maxwell, but hawaii(grenada) do scale pretty well with voltages right??
> 
> PSU 750W evga G2 supernova


Yes sir. I just started using my AMD Radeon R9 390X with the reference 290X blower heatsink. It's awesome after my BIOS mod.

Oh yeah, way higher fps without the 17.9.1 suite, installing the driver alone in Device Manager.

Send me your BIOS bro. These cards ROCK with my BIOS mods.

ATIWINFLASH_v277.zip 1189k .zip file


GPU-Z.2.2.0.zip 4413k .zip file


DDUv17.0.7.2.zip 2532k .zip file


----------



## B'Fish

Ty chris89

You can download my bios at this link https://ufile.io/0htk3

btw didnt know that this " Oh yeah way higher fps wthout the 17.9.1. Suite & Installing the driver Alone in the Device Manager. " was a thing







I cannot even overclock from other 3rd-party programs because of this Wattman stuff, I hope this fixes it!

Thx in advance.

P.S. cannot upload the bios as an attachment, it says *AJAX response unable to be parsed as valid JSON object.*


----------



## chris89

Quote:


> Originally Posted by *B'Fish*
> 
> Ty Chris98
> 
> You can download my bios at this link https://ufile.io/0htk3
> 
> btw didnt know that this " Oh yeah way higher fps wthout the 17.9.1. Suite & Installing the driver Alone in the Device Manager. " was a thing
> 
> 
> 
> 
> 
> 
> 
> I cannot even overclock from other 3th party programs because of this wattman stuff I hope this fixes it!
> 
> Thx in advance.
> 
> P.S. cannot upload the bios as an attachment, it says *AJAX response unable to be parsed as valid JSON object.*


.ZIP the .ROM to avoid the AJAX









You're welcome. I've been working on an underclock bios as well. Been able to run it at -25mv @ 1,094Mhz core. Only 68C core & 64C VRM, & it can't run any lower than 1.225v @ 1,094mhz.

Also it's running amazingly well. I found out my memory was overheating since I saw memory errors in HWInfo, so I used my boron nitride on all 16 memory modules (8gb) and now no errors.

Also, after adding the new material to the memory, my FPS has gone up considerably & power consumption has gone down as well.

I run the memory @ 1,001Mhz to 1,250Mhz. I tried 1,563Mhz & 1,700Mhz memory and sure, benchmarks went up, but it's hotter & not practical for gaming. I like the card running cool & powerful at this undervolt setting.

It's all about the delimited nature of my BIOS power limits. It's set to the maximum limit of 57,599 watts, which is what allows the card to soar in FPS.

Oh yeah, it's the VIDEO MEMORY COOLING which affects the Firestrike Combined test. Interesting huh? I went from 17fps to 23fps to upwards of nearly 30fps on that test now, consistently.


----------



## chris89

Has anybody been able to get the 390X to run @ 3200x1800 @60Hz on a 4K 60Hz UHDTV over HDMI?

I can run it at 3200x1800 at 60Hz on a 1080p 60Hz HDTV; however, my SAMSUNG 40" 4K 60Hz UHDTV defaults to 30Hz above 1920x1080. Cannot even do 2560x1440 @ 60Hz.

These new 17.9.1 drivers enable 3200x1800 by default, & installing via the .INF alone in Device Manager is ideal; it's quicker & easier.

I use RADEON PRO TOOLS for all my little tweaks, since it has way MORE tweaks than Radeon Settings Suite does.

Plus I was able to get my Firestrike GFX score over 15,500 points. With RADEON PRO TWEAKS.

Here's Batman Arkham Knight @ 3200x1800 @ 60Hz on my SAMSUNG 40" 1080p HDTV.


----------



## B'Fish

R9-390-stock.zip 99k .zip file


ah yes it worked! here is my bios







i only need this power limit to be gone, or at least higher, so i can test if i can get this puppy to 1200mhz









Really great firestrike scores btw, is this with tessellation tweaks??

and what hdmi cable / port do you use? Maybe it's at the limit of your HDMI cable/port (not totally sure if this is right, but maybe worth taking a look at)

couldn't install the driver with device manager btw since i have no CD-ROM drive in this case (also no space for one), so i just cleaned and reinstalled the latest AMD driver.


----------



## chris89

Quote:


> Originally Posted by *B'Fish*
> 
> R9-390-stock.zip 99k .zip file
> 
> 
> ah yes it worked! here is my bios
> 
> 
> 
> 
> 
> 
> 
> i only need this powerlimit to be gone or atleast higher, so i can test if i can get this puppy to 1200mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really great firestrike scores btw, is this with tessellation tweaks??
> 
> and what hdmi cable / port do you use? Maybe its on the limit of your HDMI cable/port (not totally sure if this is right though but maybe worth to take a lookt at)
> 
> couldn't install the driver with device manager btw since i have no CD-ROM player in this case ( also no space for one ) so i just cleaned and reinstalled the lastest AMD driver.


You can likely clock the core up to 1200mhz using the 1366mv bios; just use HWINFO to watch core/VRM temperatures.

1,094Mhz is plenty for 4k ultra at 1225mv, & 1,250mhz memory @ 875mv... loading out at like 68C core & 64C VRM.
Radeon-Pro-Tools: overclock without the AMD suite & gain quality or performance. Install the INF via Packages/ Display/ Driver/ etc, easy...

Radeon-Pro-Tools.zip 3482k .zip file

Sapphire Trixx 6.4 : Overclocking

TRIXX_installer_6.4.0.zip 2395k .zip file


ATIWINFLASH_v277.zip 1189k .zip file




1094mhz-1225mv-1250mhz-875mv.zip 101k .zip file


1094mhz-1225mv-1563mhz-1000mv.zip 101k .zip file


1172mhz-1366mv-1563mhz-1000mv.zip 101k .zip file


----------



## B'Fish

Thanks a lot, is 1366mv safe for everyday use? just wondering









started with the 1094mhz 1225mv 1563mhz 1000mv bios... flashing went flawlessly, but after reboot the clock speeds were back to default! gpu-z reads the defaults as 1094mhz 1563mhz. Now I do not normally care to set these clock speeds in windows, but MSI Afterburner doesn't even work properly with this card. Every time I try to adjust core clock speeds it puts my GPU into power-save mode (300mhz core, 150mhz mem) or something like that, so annoying. Sapphire Trixx, same story; the only thing that works for my core clock is ASUS GPU Tweak, but mehhh.. don't know what to do. I think the AMD drivers are the problem, but maybe I changed some registry stuff for MSI Afterburner while I had my 980 Ti to unlock extra options xD.

btw the volts also didn't go up to 1225mv; they also reset instantly to stock, and that's around 1.115v, bluhh.

p.s. no overclocking software is starting up with a boot so this must be driver related i think.

after some more testing i really think 1160mhz is just the wall of this GPU. the power draw has come up to an amazing average of 375 watts xD that's insane.


----------



## chris89

Quote:


> Originally Posted by *B'Fish*
> 
> Thanks alot, is 1366mv safe for everyday use? just wondering
> 
> 
> 
> 
> 
> 
> 
> 
> 
> started with the 1094mhz 1225mv 1563mhz 1000mv bios... flashing went flawlessly but after reboot clock speeds back to default! gpu-z reads the defaults as 1094mhz 1563mhz. Now i do not normally care to set this clock speeds in windows, but MSI afterburner doesnt even work properly with this card. Everytime i want to try to adjust core clock speeds it put my GPU into powersafe mode (300mhz core 150mhz mem) or something like that, so annoying. Sapphire trixx same story, only stuff that works is ASUS GPU tweak for my core clock but mehhh.. dont know what to do. i think this AMD drivers are the problem, but maybe i changed some registery stuff for msi afterburner while i had my 980TI to unlock extra options xD.
> 
> btw the volts also didnt go up to 1225mv, this also resetted instantly to stock and thats around 1.115v bluhh.
> 
> p.s. no overclocking software is starting up with a boot so this must be driver related i think.
> 
> after some more testing i really think 1160mhz is just the wall of this GPU. the powerdraw has come up to an amazing average of 375watts xD thats insane.


1366mv is safe if your card has been repasted & you've revisited the core, memory modules, & VRM 1/2; then sure. If it still has the original paste & pads, no. Just stick with 1094mhz & 1250mhz if you don't wanna pull the card apart.
















Use the 1094mhz 1250mhz bios, leave it, and try it; you will be surprised. My BIOSes are designed around "flash it & forget overclocking software altogether", as they perform way above expectations. Go ahead & test out the 1094mhz & 1250mhz one & monitor the VRM. Overclocking with software is a thing of the past. Sapphire Trixx 6.4 works fine for me after DDU, then a Device-Manager-only install of the INF, and using RADEON PRO TOOLS to adjust the details.

I suggest uninstalling RTSS & MSI Afterburner; that will erase the registry entries. Then delete the MSI Afterburner & RTSS folders. Then download the very latest version, but disable monitoring so you can use RTSS on its own for things like screen capture. I wouldn't ever tick Unified Overclocking. MSI Afterburner isn't as good as Sapphire Trixx 6.4.

You'll find 1094mhz & 1250mhz to be plenty of performance. Only 220 watts draw or so, quite low. I see almost no difference from 1001mhz memory to 1563mhz, up to 1700mhz.

LMK & Post HWINFO Screenshot & GPUZ main page.


----------



## B'Fish

Damnit, used that bios that u suggested, uninstalled all overclock programs, rebooted, and boom, still going to the stock values of 1010mhz core and 1500mhz mem. I really think this AMD driver is pushing this forward. I cannot do that installation method that you suggested, which is so bad. I also did a custom install of the AMD driver and disabled all gimmicks, still no success. Hate this







maybe i need to do a clean install because this w10 version is from release haha.



reinstalled the AMD drivers this time the complete package, clockspeeds are as the bios set them! lets do some testing


----------



## chris89

Quote:


> Originally Posted by *B'Fish*
> 
> Damnit used that bios that u suggested, deinstalled all OVERCLOCK programs. rebooted and boom still going to the stock values of 1010mhz core and 1500mhz MEM. I really think this AMD driver is pushing this forward. I cannot do that installation method that you suggested so bad. I also did a custom install of the AMD driver and disabled all gimmicks. still no succes. Hate this
> 
> 
> 
> 
> 
> 
> 
> maybe i need to do a clean install because this w10 version is from release haha.
> 
> 
> 
> reinstalled the AMD drivers this time the complete package, clockspeeds are as the bios set them! lets do some testing




Sweet dude, & also using Trixx... a simple "Reset" fixes that issue with the clocks. Let me know, and what games ya playing? Got any screenshots?


----------



## B'Fish

i only play project cars and BF1 at this moment, no screen shots can make a few if u want









Sadly this power limit is still a problem: 375 watts when overclocked. I could use that 1366mv bios, but that would mean a power draw of at least 450 watts; with an 8+6-pin power connector I think this is not possible. Even the average power draw of 375 watts is triggering some kind of overcurrent protection on my PSU. That is at +200mv (~1.25 volts on the core) and 1160mhz. I can run the Valley benchmark with no artifacts whatsoever at these speeds, but the PSU just says *no* and shuts my system off.

Bummer, wanted to reach 1200mhz, but a high ASIC score means high leakage for AMD cards: high ASIC equals higher power draw but less vcore needed on the chip itself. Crazy... even my 290X Lightning could do more vcore, 1180mhz, with less power draw (ASIC around 72%), and my best one yet was an R9 290 Tri-X (ASIC around 68%), but it wasn't able to cool itself as well, or else that would be the best one I had. Even my reference R9 290 could easily do 1160mhz with the stock cooler. Feels kinda bad that I lost this "lottery" haha

the 980 Ti hybrid I sold was just a boss at overclocking, damn.. 86% ASIC and got to 1500mhz core on stock voltage (1.17v), that's just insane. I sold it though because it had too much performance for my gaming usage (power draw of around 250-300 watts at those settings).

but I sold this r9 290 ref for 200 euros and bought this r9 390 PCS+ for 210, so a nice "upgrade" to a beefier cooler and more vram


----------



## chris89

Quote:


> Originally Posted by *B'Fish*
> 
> i only play project cars and BF1 at this moment, no screen shots can make a few if u want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly this powerlimit is still a problem 375watts when overclocked, could use that 1366mv bios but that would mean a powerdraw of atleast 450watts. With a 8+6pin powerconnector i think this is not possible. even the powerdraw of 375watts average is triggering some kind of overcurrent protection of my PSU. that is at +200mv (+-1.25 volts on the core ) 1160mhz . I can use valley benchmark with no artifacts whatsoever at these speeds but the PSU just says *no* and shuts my system off.
> 
> Bummer wanted to reach 1200mhz but the high asic score means high leakage for amd cards -> High asic equals higher powerdraw but less vcore on the chip itself , crazy ... even my 290x lightning could do more Vcore 1180mhz and achieve less powerdraw (asic round 72%) and my best one yet was a r9 290 Tri-x (asic around 68%) but it wasnt able to cool itself as much and or else this would be the best one i had. Even my reference r9 290 could do easily 1160mhz with the stock cooler. Feels kinda bad that i lost this "lottery" haha
> 
> the 980ti hybrid i sold was just a boss at overclocking damn.. 86% asic and got to 1500mhz core on stock voltage (1.17v) thats just insane. sold this one though because it had too much performance for my gaming usage (powerdraw of around 250-300watt at these settings).
> 
> but this r9 290 ref i sold it for 200 euros and bought this r9 390 pcs+ for 210 so nice "upgrade" for a beefier cooler and more vram


Sure throw up some screenshots, better yet a ReLive Capture









Nice, so how's the VRM temps? This bios now has a 32 degrees Celsius ASIC max temp to reduce power, probably *200 watts*.

Try it & don't overclock at all. Your PSU will keep shutting down if you overclock, so don't do it until you upgrade your PSU. This will run very fast.

1094mhz-1225mv-1250mhz-875mv-32C-ASIC.zip 101k .zip file


PS - Stock voltage is 1250mv, so that's -25mv.

1133mhz = 1333mv
1166mhz = 1366mv
1177mhz = 1377mv
1188mhz = 1388mv
1200mhz = 1400mv
1250mhz = 1425mv

1449mv is maximum voltage. Maybe 1300mhz on water.
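
A rough way to see what those voltage steps cost: CMOS dynamic power scales roughly with frequency x voltage squared, so each clock/voltage pair in the table above can be compared against the 1094mhz/1250mv baseline. A back-of-envelope sketch only (leakage, memory, and VRM losses are ignored):

```python
# Dynamic power scales roughly as f * V^2; compare each step to the
# 1094 MHz / 1250 mV baseline from the table above.
def relative_power(f_mhz, v_mv, base_f=1094, base_v=1250):
    """Estimated power relative to the baseline (dynamic term only)."""
    return (f_mhz / base_f) * (v_mv / base_v) ** 2

steps = [(1133, 1333), (1166, 1366), (1200, 1400), (1250, 1425)]
for f, v in steps:
    print(f"{f} MHz @ {v} mV -> {relative_power(f, v):.2f}x baseline power")
```

This is why a ~7% core bump at 1366mv costs roughly a quarter more power: the voltage term is squared.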


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> HWInfo VRM Temps? At Load? Max Temperature?
> 
> These are delimited, so they run like a dream.
> 
> Coolest & Quietest & Like 60-70C and takes forever to get up there... Less than 200 watts likely
> 
> 1000mhz-1001mhz-1250mv-895mv-delimited.zip 101k .zip file
> 
> 
> Higher Speed / Hotter
> 
> 1133mhz-1001mhz-1333mv-895mv-delimited.zip 101k .zip file
> 
> 
> 1133mhz-1250mhz-1333mv-895mv-delimited.zip 101k .zip file


So I opened up my graphics card again because I was not satisfied with the VRM temps (using the stock bios for the temp check).

Before opening up, I was at 60°C doing nothing on vrm temp1.

I put thermal paste and 2 layers of thermal pad 0.5mm instead of one layer and no thermal paste.

Now I'm at 65°C XD.

If I could say a few words forbidden here, you certainly know what I would say.







.



It's tiring me out.. I'd rather not have to reopen it again, but well...

*EDIT: seems better nonetheless! Didn't go higher than 66°C on LOL, and 75°C on very very heavily modded skyrim. Didn't hear the fans on stock bios, which means the card is cooler than before I believe.

I'll now try with your new modded bios Chris, in hope it's stable with upgraded memory voltage







.

Thanks again!*

EDIT 2:
So I'm currently using 1133mhz-1001mhz-1333mv-895mv-delimited and temps are awesome







. At idle, vrm doesn't go over 48°C, and so does core! (With fans OFF) Awesome!

Tried 3 games. Witcher 3, no problem; *on LOL I had a few artifacts when the core went from 700mhz to 1133mhz.* I'll reinstall Afterburner and check if it works like a charm with 25mv more on the vcore







.

EDIT 3:
Welp I played for real (not custom games) on LOL, and every game I got multiple artifacts or black screens for a few seconds, then I could continue playing. But well unplayable







.

Tried to change settings with afterburner for voltage and core & memory, but couldn't get it stable sadly... Don't understand







.. But it was so cool & quiet lol. Sadness.


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> So I opened up my graphic card again because I was not satisfied of vrm temps (using stock bios for temp check).
> 
> Before opening up, I was at 60°C doing nothing on vrm temp1.
> 
> I put thermal paste and 2 layers of thermal pad 0.5mm instead of one layer and no thermal paste.
> 
> Now I'm at 65°C XD.
> 
> If I could say a few words forbidden here, you certainly know what I would say.
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> 
> It's so tiring me out.. I'd like not to have to reopened it again but well...
> 
> *EDIT: seems better nonetheless! Didn't go higher than 66°C on LOL, and 75°C on very very heavily modded skyrim. Didn't hear the fans on stock bios, which means the card is cooler than before I believe.
> 
> I'll now try with your new modded bios Chris, in hope it's stable with upgraded memory voltage
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Thanks again!*
> 
> EDIT 2:
> So I'm currently using 1133mhz-1001mhz-1333mv-895mv-delimited and temps are awesome
> 
> 
> 
> 
> 
> 
> 
> . At idle, vrm doesn't go over 48°C, and so does core! (With fans OFF) Awesome!
> 
> Tried 3 games, Witcher 3 no problem, o*n LOL I had a few artefacts observed when the core went from 700mhz to 1133mhz.* I'll reinstall afterburner and check if it works like a charm with 25mv more on the vcore
> 
> 
> 
> 
> 
> 
> 
> .
> 
> EDIT 3:
> Welp I played for real (not custom games) on LOL, and every game I got multiple artifacts or black screens for a few seconds, then I could continue playing. But well unplayable
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Tried to change settings with afterburner for voltage and core & memory, but couldn't get it stable sadly... Don't understand
> 
> 
> 
> 
> 
> 
> 
> .. But it was so cool & quiet lol. Sadness.


Try this.. HWInfo Screenshot









1000mhz-1250mhz-New.zip 101k .zip file


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> Try this.. HWInfo Screenshot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1000mhz-1250mhz-New.zip 101k .zip file




Instant crash on lol, couldn't even play a custom game, blackscreen every 5 to 10 seconds, with screens off then on again.

Weird: rebooting computer after flashing, fan was locked at 60%.
When crashing in game, it locked at 34%. Now at 34%. Don't understand what you did, but it's even less stable.

Temp really wasn't a problem, I suppose it's more of a voltage problem, or maybe something not apparent to me that you modify in bios?

EDIT:
So yesterday I only tried 1133mhz-1001mhz-1333mv-895mv-delimited and it wasn't stable on LOL.
Today I tried your other bios, 1000mhz-1001mhz-1250mv-895mv-delimited, and it's not stable either; got a reaaaaal weird black screen after 5 minutes that left it that way lol:

(Screen is from after black screen, nothing readable on HW INFO!)

But compared to the one above, I could play 5 minutes. The only difference I see is the memory clock, but from the images I would say the voltage was the same.
You went from 875mv to 895mv; maybe my card needs more than a 20mv difference?







.


----------



## componentgirl90

I have an MSI R9 270 OC 2GB. It is running at 70 degrees with the case open, and after a bit of gaming it is getting artifacts. It would probably be better if it ran cooler. I am happy with the clock speeds tbh. I am guessing the bios I attached isn't ideal; it is the bios that came preinstalled. Would it be better to lower the power limit or whatever on this?

MSI-R9270OC.zip 97k .zip file


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> I have an MSI R9 270 OC 2gb. It is running at 70 degrees with the case open and after a bit of gaming it is getting artifacts. It would probably be better if it ran cooler. I am happy with the clock speeds tbh. I am guessing the bios I attached isn't ideal. It is the bios that came with it preinstalled. Would it be better to lower the power limit or whatever on this?
> 
> MSI-R9270OC.zip 97k .zip file










I think the artifacts were from too-hot VRM temps. So I reduced the voltage from 1.188v to 1.163v, went to 1000mhz core from 995mhz, & adjusted the fans to run much cooler.









MSI-R9.270.OC.zip 98k .zip file


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> 
> 
> Instant crash on lol, couldn't even play a custom game, blackscreen every 5 to 10 seconds, with screens off then on again.
> 
> Weird: rebooting computer after flashing, fan was locked at 60%.
> When crashing on game, it locked at 34%. Now at 34%. Don't undertsand what you did @[email protected] But even less stable.
> 
> Temp really wasn't a problem, I suppose it's more of a voltage problem, or maybe something not apparent to me that you modify in bios?
> 
> EDIT:
> So yesterday I only tried 1133mhz-1001mhz-1333mv-895mv-delimited and it wasn"t stable on LOL.
> Today I tried your other bios 1000mhz-1001mhz-1250mv-895mv-delimited and not stable either, got a reaaaaal weird black screen after 5 minutes that blew it that way lol :
> 
> (Screen is from after black screen, nothing readable on HW INFO!)
> 
> But compared to the one above, I could play 5 minutes. The only difference I see is the memory clock, but through the images I would tell voltage was the same.
> You went from 875mv to 895, maybe my card needs more than 20mv difference?
> 
> 
> 
> 
> 
> 
> 
> .


I've tested this voltage on these early revisions of Hawaii XT & 895mv works perfectly. So here I'll just try 950mv for 1250mhz memory, limit the core to 1Ghz, & delimit the power limit, while staying cooler on fan speed.

The black screen means too-hot ASIC temps, meaning over 100C. So I limited everything to force the card to run cooler while still performing great with the delimited power limit.

1GHZ.1.25GHZ.390X.NITRO.950MV.MEMORY.zip 101k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> 
> 
> 
> 
> 
> 
> 
> I think artifacts was too hot vrm temps. So I reduce voltage from 1.188v to 1.163v & went to 1000mhz core from 995mhz & adjusted the fans to run much cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI-R9.270.OC.zip 98k .zip file


Nice thanks.

I noticed that the fans were running at 20% all the time, so I enabled user-defined in MSI Afterburner and it runs a lot cooler now (now 55, was 70). Also reduced the power limit by 20%. I see the voltage at 1.150v on my card, and the clock was 955MHz. Where are these original numbers of 1.188v and 995MHz from?


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> I've tested this Voltage on these Early Revisions of Hawaii XT & 895mv works perfectly. So here I'll just try 950mv for 1250mhz memory. Limit core to 1Ghz & DeLimit the power limit. Which staying cooler on fan speed.
> 
> The Black Screen means Too Hot ASIC Temps... Meaning over 100C. So I limited everything to force the card to run cooler while still performing great with DeLimited Power Limit.
> 
> 1GHZ.1.25GHZ.390X.NITRO.950MV.MEMORY.zip 101k .zip file


Hi Chris, still a no go, sadly.

Hottest VRM was at 50°C while playing. It crashed when the core clock hit 1000mhz; it was fine up to 900mhz. At 1000 the screen froze and the sound repeated in a loop. Then I had to reboot.

Well, I believe I'm taking too much of your time, and I'd like to help too. I read the thread on BIOS editing; I guess I could simply begin by editing one of yours and trying different voltages.

- I'll search for the specific values in your BIOSes (core voltage, memory voltage)
- I'll change them with the hex method proposed in the Hawaii BIOS editing thread

Is there something else I must do before flashing? Something with the checksum, maybe?

Thanks again









*EDIT:* So I've edited one of your BIOSes, altered the core voltage, and adjusted the checksum using the HD7XXUEFIPatchTool beta. Anything else to do before I flash? (I'd rather not flash something incorrect.) There's no given value for voltage? I chose to bump 1333mv up to 1366mv; does that seem OK to you? Or any increment I want?


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Nice thanks.
> 
> I noticed that the fans were running at 20% all the time so I enabled user defined in MSI afterburner and it runs a lot cooler now (now 55, was 70). Also reduced the power limit by 20%. I see the voltage at 1.150v on my card and also the clock was 955MHz. What are these original numbers from of 1.188v and 995MHz from?


Quote:


> Originally Posted by *chris89*
> 
> Okay, yeah, I reduced the power limit in the BIOS from 125 to 100, so a -25% power limit. With another -20% that's 83 watts, which will force it to throttle to a lower state to maintain the 83 watts.
> 
> Here I show the GPU Clock States & Fan & I will show you what I change to keep temps nice & low... I didn't want it to be too noisy for ya.
> 
> 
> 
> So I changed the fan speed to reflect your MSI profile to keep it in the 50C range.
> 
> 
> 
> MSI.r9.270.oc.tune.1.zip 98k .zip file


----------



## chris89

PCIe 2.0 comparison, 390X

1200mhz core / 1700mhz memory

With tessellation

Without tessellation


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> So I opened up my graphics card again because I was not satisfied with the VRM temps (using the stock BIOS for the temp check).
> 
> Before opening up, I was at 60°C doing nothing on vrm temp1.
> 
> I put thermal paste and 2 layers of thermal pad 0.5mm instead of one layer and no thermal paste.
> 
> Now I'm at 65°C XD.
> 
> If I could say a few words forbidden here, you certainly know what I would say.
> 
> 
> 
> It's really tiring me out... I'd like not to have to open it up again, but well...
> 
> *EDIT: Seems better nonetheless! Didn't go higher than 66°C on LoL, and 75°C on very heavily modded Skyrim. Didn't hear the fans on the stock BIOS, which means the card is cooler than before, I believe.
> 
> I'll now try your new modded BIOS, Chris, in hopes it's stable with the upgraded memory voltage. Thanks again!*
> 
> EDIT 2:
> So I'm currently using 1133mhz-1001mhz-1333mv-895mv-delimited and temps are awesome. At idle, the VRM doesn't go over 48°C, and neither does the core (with fans OFF). Awesome!
> 
> Tried 3 games. Witcher 3 was no problem; *on LoL I had a few artifacts when the core went from 700mhz to 1133mhz.* I'll reinstall Afterburner and check if it works like a charm with 25mv more on the vcore.
> 
> EDIT 3:
> Welp, I played for real (not custom games) on LoL, and every game I got multiple artifacts or black screens for a few seconds, then could continue playing. Unplayable, though.
> 
> Tried changing the voltage and core & memory settings with Afterburner, but couldn't get it stable, sadly... I don't understand... But it was so cool & quiet, lol. Sadness.


I installed a different OS & everything is fine... I had so many issues with Windows updates... so far no updates. Perfect; I disabled the update service.


----------



## chris89

If the 390x could do 14nm FinFET or Vega clocks it would be unreal. Here I have it up to 1,410mhz core.. haha

Plus, the 390x at around 1335mhz core is almost 7.5 teraflops single precision... which is basically faster than the RX 480.

Ellesmere can hit about 6.5 teraflops single precision at just over 1,430mhz core.
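The arithmetic behind those figures is just shaders × 2 FLOPs (one fused multiply-add) × clock. A minimal sketch, using the public shader counts for Hawaii XT and Ellesmere and treating the clocks as the hypothetical overclocks above:

```python
def tflops_sp(shaders: int, core_mhz: float) -> float:
    """Peak single-precision throughput: each shader retires one FMA
    (2 FLOPs) per clock cycle."""
    return shaders * 2 * core_mhz * 1e6 / 1e12

# R9 390X (Hawaii XT, 2816 shaders) at a hypothetical 1335 MHz overclock
print(round(tflops_sp(2816, 1335), 2))  # 7.52 TFLOPS

# RX 480 (Ellesmere, 2304 shaders) at a hypothetical 1430 MHz overclock
print(round(tflops_sp(2304, 1430), 2))  # 6.59 TFLOPS
```

Note these are theoretical peaks; real-world throughput depends on memory bandwidth and shader occupancy.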


----------



## Dundundata

Decided to boot up the ole Firestrike and was pleasantly surprised to stomp my old score with only +50mV... with the 4790k not OC'd.


----------



## chris89

Quote:


> Originally Posted by *Dundundata*
> 
> Decided to boot up the ole Firestrike and was pleasantly surprised to stomp my old score with only +50mV.. with 4790k not OC.


Nice, your combined score is good. Want to send me your BIOS? I can send you an unlocked BIOS so you don't have to touch the voltage, power limit, or fan curve, & you can clock the core & memory as high as they'll go... I find max scores around 1200mhz core / 1700mhz memory.

At these speeds it's ideal to go to 100% fan speed at around 55C core temp to keep the ASIC temp in check.

A JSON error means you need to .zip the .rom before attaching.

Tessellation disabled here, 4:2:2 & 8bpc in Display, & Performance > Frame Rate Target 200fps.

12,986.16,236.gfx.79.31.63.60.1200.1700.8bpc.4.2.2.no.tess.3dmark.firestrike.jpg



12,194.14,795.gfx.72.02.58.12.390x.1200.1700.3dmark.firestrike.jpg


----------



## AlphaGaming17

Validation Link: https://www.techpowerup.com/gpuz/details/7hynk


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *componentgirl90*
> 
> Okay, yeah, I reduced the power limit in the BIOS from 125 to 100, so a -25% power limit. With another -20% that's 83 watts, which will force it to throttle to a lower state to maintain the 83 watts.
> 
> Here I show the GPU Clock States & Fan & I will show you what I change to keep temps nice & low... I didn't want it to be too noisy for ya.
> 
> 
> 
> So I changed the fan speed to reflect your MSI profile to keep it in the 50C range.
> 
> 
> 
> MSI.r9.270.oc.tune.1.zip 98k .zip file
> 
> 
> 
> 
> Thanks for this BIOS. Having it at 50 would be ideal; even at 55, after a long session at full load, it starts to artifact again.
> 
> Also, sorry about the confusion. I meant in my last post that I had reduced it by 20% in MSI Afterburner, but I was running the stock BIOS.
> 
> Do you think this card would benefit from thermal paste on the GPU and/or a heatsink on the VRMs?
> 
> Just for my own understanding of the power limit numbers, how did you arrive at 83 watts? Also, I think 125 to 100 is a 20% reduction in TDP; or is there some other reason it is 25%?
> 
> I also think GPU-Z reports 1.15v while the BIOS voltage may be different for some reason. I wonder if that is normal.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Thanks for this BIOS. Having it at 50 would be ideal; even at 55, after a long session at full load, it starts to artifact again.
> 
> Also, sorry about the confusion. I meant in my last post that I had reduced it by 20% in MSI Afterburner, but I was running the stock BIOS.
> 
> Do you think this card would benefit from thermal paste on the GPU and/or a heatsink on the VRMs?
> 
> Just for my own understanding of the power limit numbers, how did you arrive at 83 watts? Also, I think 125 to 100 is a 20% reduction in TDP; or is there some other reason it is 25%?
> 
> I also think GPU-Z reports 1.15v while the BIOS voltage may be different for some reason. I wonder if that is normal.


125 divided by 1.20 equals 104.17, rounded up.

125 divided by 100 equals 1.25 (25%).

100 divided by 1.20 (20%) equals 83.33.

It's likely too much of a pain in the rear end to add VRM cooling. Did you do anything with your XFX 390X?

So did you ever even flash the new BIOS? You're running the stock BIOS & still artifacting & still asking how to fix the artifacting?
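For what it's worth, there are two ways to read "reduce by 20%", and they give different watt figures: the division used in the arithmetic above, and a plain multiplicative cut. A sketch of both, using the 125 W / 100 W numbers from the posts:

```python
def divide_back(watts: float, pct: float) -> float:
    # The method used in the post above: divide by (1 + pct/100).
    return watts / (1 + pct / 100)

def scale_down(watts: float, pct: float) -> float:
    # The more common reading of "-pct% power limit": multiply by (1 - pct/100).
    return watts * (1 - pct / 100)

print(round(divide_back(125, 20), 2))  # 104.17 W
print(round(divide_back(100, 20), 2))  # 83.33 W, the figure quoted above
print(round(scale_down(100, 20), 2))   # 80.0 W under the multiplicative reading
```

The two readings differ by a few watts here, which is why "-20% of 100 W" can come out as either 83 W or 80 W depending on who is doing the math.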


----------



## sheratann

Need some help with these BIOSes for mining (cool and quiet), 1000mhz on both. I managed to get the Asus mining at 1075mv and the Gigabyte at 1100mv; can they go lower?

BIOSI.zip 196k .zip file


----------



## chris89

Quote:


> Originally Posted by *sheratann*
> 
> Need some help with these bioses for mining (cool and quiet) , 1000 mhz both, i managed to make the asus mine at 1075 mv and the gigabyte at 1100 mv ,can it go lower?
> 
> BIOSI.zip 196k .zip file


Your card is ASUS or Gigabyte? I'll use the ASUS BIOS. Is it R9 390 or 390X?


----------



## chris89

*@Sheratann*

Here's 1133mhz on stock voltage (65288, 1.25v) with a good quiet fan profile. I hope your VRM cooling is good. What does your HWiNFO look like after a while at load? This BIOS should be an immense performance increase while staying cool & stable. If you want cooler, I made 1GHz core / 1GHz memory versions too.

This BIOS will fluctuate between 1001mhz & 1250mhz memory for cooler, more efficient power use:
1133mhz core, 1250mhz memory, 1250mv core max (though typically less), and 875mv memory

ASUS.1133mhz.1250mhz.DeLimited.zip 100k .zip file

1094mhz core 1001mhz memory 1250mv core max, though typically less, and 875mv memory

ASUS.1094mhz.1001mhz.DeLimited.zip 100k .zip file

1000mhz core 1001mhz memory 1250mv core max, though typically less, and 875mv memory

ASUS.1000mhz.1001mhz.DeLimited.zip 100k .zip file


----------



## sheratann

gigabytestock.zip 98k .zip file
Both are R9 390s. I mine and I play with them. The Gigabyte one is mining at 1100mv; it wouldn't go lower. The Asus one is at 1075mv. VRM 1 is at 71c and VRM 2 at 77c while at 1075mv for core & memory. The Gigabyte one doesn't have VRM sensors. If you can make the same for the Gigabyte one, that would be awesome. Thanks!

EDIT: Nvm, I did it myself. Didn't know that the VDDCI states have something to do with memory.


----------



## chris89

I was doing some research and found out the AIDA GPGPU PCIe memory read/write runs over the PCIe slot... the smaller part of the PCIe slot.

It uses +5V & ground to power the GPU memory read/write from system memory, yet the voltage regulator supplying power to those pins overheats, regularly reaching only about 70% of the slot capacity; roughly 30% is being lost at that PCIe-slot VRM location. I could cool that specific VRM that supplies the 5 volts to the memory read/write pins, but I think it's limited by the PCIe specification anyway.

I think that if I solder a wire from the slot's +5v & ground directly to the PSU's +5v & ground, it could completely delimit the PCIe bus.

These slots at PCIe 2.0 are 8GB/s per x16 slot; PCIe 3.0 is rated 16GB/s per slot. Bypass the VRM entirely & delimit the PCIe bus memory read/write & I'm sure FPS would go through the roof.

I'll test this sometime soon.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> 125 divided by 1.20 equals 104.17 rounded up.
> 
> 125 divided by 100 equals 1.25 (25%)
> 
> 100 divided by 1.20 (20%) equals 83.33
> 
> Its likely too much of a pain in the rear end to add VRM cooling. Did you do anything with your XFX 390X?
> 
> So did you ever even flash the new bios? Your running stock bios & still artifacting & still asking how to fix artifacting?


I will be flashing the new BIOSes for the 270 and 390x.

After my first 270 post I saw the fans at 20%, and with the new Windows install being unstable, I was in two minds about risking a BIOS flash. In the end I decided to risk it, since the card still artifacts even with the fans above 20%.

I want to do the 270 before the 390x in case the new Windows install causes problems with the flash.

Thanks for clarifying how you got to 83 watts. After I flash, I will just leave the power limit as it is. :0)


----------



## PhantomLlama

I know this is slightly off topic... but I'm starting to itch to upgrade my card and move my 390x into another build. Ever since I built my first custom PC 12 years ago I have been an ATI/AMD fanboy (and Sapphire, for that matter). Now, looking at Vega's "flop" and seeing the success of the other camp... I'm really tempted to switch over. I know it's silly to put so much emphasis on which "side" you are on; a person should go with the card that they can A) afford, and that B) is best suited to their needs/wants.

So I suppose I'm torn between milking my sturdy, power-hungry, aging 390x while holding out hope for a new card from either side... or just jumping in and getting a new card ASAP.

Also, I may be bored at work today... so I may or may not be watching a lot of tech videos and reading a lot. haha


----------



## chris89

Quote:


> Originally Posted by *PhantomLlama*
> 
> I know this is slightly off topic...but I'm starting to itch to upgrade my card, and have my 390x for in another build. Ever since I built my first custom PC 12 years ago I have been an ATI/AMD fanboy (and Sapphire for that matter). Now, looking at Vega's "flop" and seeing the success of the other camp...I'm being really tempted to switch over. I know it's silly to put so much emphasis on which "side" you are on, and a person should go with the card that they can A)afford, and B) is best suited to their needs/wants.
> 
> So I suppose I'm torn on milking my sturdy, power hungry and aging 390x and holding out hope for a new card from either side.... or just jumping and getting a new card ASAP.
> 
> Also, i may be bored at work today...so I may or may not be watching a lot of Tech Videos and reading a lot. haha


I'd go GTX 980 Ti. It's a TITAN X Maxwell dialed back a bit... I'd rather have this GPU than a 390X or Vega because of prices... a 980 Ti Gigabyte Windforce, with 96 ROPs matching the TITAN X Pascal (the GTX 1080 Ti has 88), for $370... It waxes everything all the way up to 4k.

First thing I'd do to the 980 Ti is put new thermal pads on the VRM & memory & repaste it. Very powerful GPU core, even more powerful than the 390x's. Send me the BIOS when you get it and watch the 980 Ti's 96 ROPs wax the 390X.

*http://www.ebay.com/itm/NEW-Thermagon-thermal-gap-filler-pad-T-PLI-2200-A1-12mm-x-12mm-x-5mm-49-per-pack-/172855009184?hash=item283ef61fa0:g:w3wAAOSwAuZX1TRU*

Shaders : ROPs : TMUs

AMD R9 390X : 2816 : 64 : 176 : solid.. 8gb

GTX TITAN X Maxwell : 3072 : 96 : 192

GTX 980 Ti : 2816 : 96 : 176 : ... Clearly the 96 ROP count, the same 176 TMUs as the 390X, & higher clockability, like 1,500mhz I think.. maybe more?

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980_Ti/

https://www.techpowerup.com/gpudb/2724/geforce-gtx-980-ti

https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/

$369

*http://www.ebay.com/itm/Gigabyte-GTX-980-Ti-G1-Gaming-Graphics-Card-6GB-Nvidia-GPU-GV-N98TG1-GAMING-6GD/322752752352?epid=691132011&hash=item4b25902ae0%3Ag%3AwMEAAOSw7L1ZveLz
*


----------



## PhantomLlama

Call me foolish, but wouldn't I just go for the most advanced model, like the 1080ti? Considering it can do considerably better than the 980ti, which is also great.

I'm just stuck in that place where my beloved 390x is a great card but if I upgrade, do I wait or just get the current flagship.

Money isn't the issue lol (but I'm not going to buy one now and then another one in a few months)


----------



## componentgirl90

Quote:


> Originally Posted by *PhantomLlama*
> 
> Call me foolish, but wouldn't I just go for the most advanced model, like the 1080ti? Considering it can do considerably better than the 980ti, which is also great.
> 
> I'm just stuck in that place where my beloved 390x is a great card but if I upgrade, do I wait or just get the current flagship.
> 
> Money isn't the issue lol (but I'm not going to buy one now and then another one in a few months)


I don't see anything wrong with that.

The current flagship is the 1080ti.

As you are probably aware, current prices are inflated by supply/demand issues related to the mining of cryptocurrency, in particular Ethereum. This particularly affects most AMD cards and GTX 1060s and GTX 1070s.

What you may not be aware of is that Ethereum just became very difficult to mine in the last few days. There is a strong chance that there will be a drop in demand for cards and this will inevitably drive prices down. However, there is also a shortage in the supply of the memory for graphics cards due to increased production in mobile phones which also use that memory. This will be a force driving prices up. Some articles have stated that this should not be much. I would guess that the net effect will be that prices go down but you should do more research on that if you want to be sure.

That applies to the GTX 1070 and 1060. That does not directly apply to the GTX 1080 or 1080ti as they are not good mining cards for the ethereum cryptocurrency. However, their prices may drop if they are being kept artificially high to keep them above the GTX 1070.

One interesting thing to note is that, at least in my part of the world, the 1080ti's performance increase over the 1080 is proportional to its price increase (I think it was 35%). This doesn't always happen at the high end, so if you've got the coin you might want to go for it. But it is a hell of a lot of money!!

BTW I remember thinking that the 1080ti is a good 4k card and that the 1080 falls short of being a complete 4k card although in most titles you would be satisfied.


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> I'd go GTX 980 Ti. It's a TITAN X Maxwell, dialed back a bit... I'd rather have this GPU than 390X, & VEGA because of PRICES... 980 TI Gigabyte Windforce .. 96 ROP's equal to the GTX 1080 Ti & TITAN X Pascal for $370... It waxes everything all the way up to 4k.
> 
> First thing I'd do to the 980 Ti is use new thermal pads on VRM & Memory & RePaste. Very powerful gpu core, ever more powerful than 390x. Send me the BIOS when you get it and watch the 980 Ti's 96 ROP's wax the 390X.
> 
> *http://www.ebay.com/itm/NEW-Thermagon-thermal-gap-filler-pad-T-PLI-2200-A1-12mm-x-12mm-x-5mm-49-per-pack-/172855009184?hash=item283ef61fa0:g:w3wAAOSwAuZX1TRU*
> 
> AMD R9 390X : 2860 : 64 : 176 : solid.. 8gb
> 
> GTX TITAN X MAXWELL : 3072 : 192 : 96..
> 
> GTX 980 Ti : 2816 : 96 : 176 : ... Clearly 96 ROP Count & same as 390X 176 TMU & higher clockability like 1,500mhz I think.. maybe more?
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980_Ti/
> 
> https://www.techpowerup.com/gpudb/2724/geforce-gtx-980-ti
> 
> https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/
> 
> $369
> 
> *http://www.ebay.com/itm/Gigabyte-GTX-980-Ti-G1-Gaming-Graphics-Card-6GB-Nvidia-GPU-GV-N98TG1-GAMING-6GD/322752752352?epid=691132011&hash=item4b25902ae0%3Ag%3AwMEAAOSw7L1ZveLz
> *


I wanted a 980 when I got my 390x because of its overclocking. I guess the 980ti must overclock just as well.


----------



## akama

Heya, I'm just wondering how long my card will survive. I've cleaned the dust off and even tried undervolting the card, without the VRM temps going any lower while gaming/stress testing.

My VRM temps are *110-125c* on the VRMtemp1 sensor and around 96c on the VRMtemp2 sensor. (I get these same temps under stress test whether I run stock/undervolted/overclocked.)

I tried undervolting the card by -25mV without any success in lowering the VRM temps, so I'm just running the card at 1100MHz on the core and 1600MHz on the memory with +12mV, getting the same temps as above.

Using the Strix R9 380 4GB version.

Is it worth replacing the old thermal pads + heatsinks on the VRMs with new ones, and maybe putting some thermal pad on the backside of the card between the PCB and backplate?

Can replacing the thermal paste on the chip reduce VRM temps? The card/core temp is under 75c at all times; I'm using an aggressive fan profile and the case is well ventilated.


----------



## chris89

Quote:


> Originally Posted by *PhantomLlama*
> 
> Call me foolish, but wouldn't I just go for the most advanced model, like the 1080ti? Considering it can do considerably better than the 980ti, which is also great.
> 
> I'm just stuck in that place where my beloved 390x is a great card but if I upgrade, do I wait or just get the current flagship.
> 
> Money isn't the issue lol (but I'm not going to buy one now and then another one in a few months)


$700 vs $369 ... if you have more money than sense then yeah, go 1080ti, or Nvidia's actual flagship, the Titan X Pascal... it's just clocked lower but has way more capability than the 1080ti.

Plus the Titan X Pascal is
*3840 : 96 : 240* (shaders : ROPs : TMUs)

The 980 Ti is
*2816 : 96 : 176*

So yeah, Vega is faster than the Titan XP if you're willing to work on the card. Otherwise it's simply thermally limited; some VRMs run at 110c 24/7 on Vega. They did it on purpose to make a few extra bucks.

The 980 Ti is the best bang for the buck, until Vega comes down in price.

A 980 Ti can do 1,563mhz, & that's 150 gigapixels/s & 275 gigatexels/s & 400gb/s with the memory overclocked... this cannot be overlooked; the 980 Ti is the best value.
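Those throughput figures follow directly from ROP/TMU counts times core clock; memory bandwidth is bus width times effective data rate. A sketch for a 980 Ti (96 ROPs, 176 TMUs, 384-bit bus); the 1,563 MHz core is the overclock assumed above, and stock GDDR5 at 7 GT/s gives about 336 GB/s (the 400 GB/s figure implies a memory overclock):

```python
def gpixels(rops: int, core_mhz: float) -> float:
    # Pixel fillrate: one pixel per ROP per clock, in Gpixel/s
    return rops * core_mhz / 1000

def gtexels(tmus: int, core_mhz: float) -> float:
    # Texture fillrate: one texel per TMU per clock, in Gtexel/s
    return tmus * core_mhz / 1000

def mem_gbps(bus_bits: int, gt_per_s: float) -> float:
    # Memory bandwidth: bus width in bytes * effective transfer rate, in GB/s
    return bus_bits / 8 * gt_per_s

print(round(gpixels(96, 1563)))   # 150 Gpixel/s
print(round(gtexels(176, 1563)))  # 275 Gtexel/s
print(round(mem_gbps(384, 7.0)))  # 336 GB/s at stock 7 GT/s
```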


----------



## chris89

I'd go Ryzen first to show the difference between PCIe 2.0 & PCIe 3.0 before upgrading the GPU

AMD Ryzen R7 1700 : *$269* : http://www.ebay.com/itm/NEW-AMD-RYZEN-7-1700-8-Core-Processor-3-0-3-7GHz-AM4-with-Wraith-Spire-Cooler-/262896694740?epid=235133854&hash=item3d35dd35d4:g:mIEAAOSw4A5Yyeon

AMD Ryzen R3 1200 : *$99* : http://www.ebay.com/itm/New-AMD-Ryzen-3-1200-4-Core-3-1GHz-3-4GHz-Turbo-Socket-AM4-65W-YD1200BBAEBOX-/272789837796?hash=item3f838a97e4:g:SScAAOSwIwJZg4Cb

AM4 MSI B350M Motherboard : *$51* : http://www.ebay.com/itm/MSI-AMD-AM4-Ryzen-B350M-Gaming-Pro-DDR4-VR-Ready-HDMI-Motherboard-SHIPS-FREE/182777360823?rt=nc&_trkparms=aid%3D222007%26algo%3DSIM.MBE%26ao%3D2%26asc%3D47507%26meid%3D97a0ce8dac10421e8e8d763715883f5f%26pid%3D100005%26rk%3D4%26rkt%3D6%26sd%3D162678845919&_trksid=p2047675.c100005.m1851

AM4 Crossfire Gigabyte B350M Motherboard : *$69.99* : http://www.ebay.com/itm/GIGABYTE-GA-AB350M-HD3-AMD-Ryzen-AM4-2-Way-CrossFire-HDMI-M-2-MATX-MB-/162678845919?epid=706277132&hash=item25e06a31df:g:FNIAAOSwXedZwGNh

16GB DDR4 2X 8GB DDR4 : *$105* : http://www.ebay.com/itm/CORSAIR-Vengeance-LPX-16GB-2-x-8GB-288-Pin-DDR4-2133Mhz-fast-ship-/152718905005?hash=item238ec18ead:g:ScIAAOSw5RRZyvkx

8GB DDR4 2X 4GB DDR4 : *$64* : http://www.ebay.com/itm/8GB-2-4GB-PC4-2400-Desktop-Memory-RAM-Acer-Dell-HP-Lenovo-DDR4-19200-DIMM-NonECC-/142472536510?hash=item212c0661be:g:CEwAAOSwlY1ZHN78

Total : $444.98 : 16GB ram
Total : $403.98 : 8GB ram
Total : $232.99 : 8GB ram + Ryzen 3 1200 (Not much slower than 1700 in games with PCIe 3.0)

Done

The big difference: the CPU is really fast, but you also get 16GB/s over PCIe compared to 8GB/s on PCIe 2.0, which helps a lot at 4k, like 20-30fps higher at max resolution/details. Where before it would lag out and you'd think you need a faster GPU... it's just a motherboard limitation on PCIe 2.0. The 390X is a PCIe 3.0 GPU, so you need a PCIe 3.0 system to see its full potential.


----------



## chris89

Quote:


> Originally Posted by *akama*
> 
> Heya, I'm just wondering how long my card will survive? I've cleaned it from dust and tried even undervolting the card without the VRM temps going any lower while gaming/stresstesting.
> 
> My VRM temps are *110-125c* on VRMtemp1-sensor and around 96c on VRMtemp2-sensor. (Getting these same temps under stresstest even if I run stock/undervolted/overlocked)
> 
> Tried to undervolt the card by -25mV without no success to lower the VRM temps, so im just running the card at 1100MHz on the core and 1600MHz on the memory with +12mV, getting same temps as above.
> 
> Using the Strix R9 380 4GB version.
> 
> Is it worth to replace the old thermal pads + heatsinks with some newones to the VRMs and even put some thermal pad on the backside of the card between the PCB + backplate?
> 
> Can replacing the thermalpaste on the chip reduce VRM temps? because the card/core temp is under 75c at all times, im using a high fanprofile and the case is good ventilated.


Cool. Replace the VRM thermal pads & core paste, & send me the BIOS .rom (.zip it before attaching to avoid the JSON error).
https://www.techpowerup.com/reviews/ASUS/R9_380X_Strix/4.html

Pull the VRM heatsink, use new pads on each VRM, & repaste the GPU; it will bring everything down a lot.

http://www.ebay.com/itm/NEW-Thermagon-thermal-gap-filler-pad-T-PLI-2200-A1-12mm-x-12mm-x-5mm-49-per-pack-/172855009184?hash=item283ef61fa0:g:w3wAAOSwAuZX1TRU

http://www.ebay.com/itm/Arctic-Silver-Ceramique-2-Tri-Linear-Ceramic-Thermal-Compound-25-g-gram-syringe-/191536297259?epid=2255302025&hash=item2c9873f52b:g:fHoAAOSw-KFXcq6s


----------



## akama

Quote:


> Originally Posted by *chris89*
> 
> Cool, replace the vrm thermal pad & core paste & send me the bios .rom & .zip it for attach avoid json error
> 
> https://www.techpowerup.com/reviews/ASUS/R9_380X_Strix/4.html
> 
> pull the vrm heatsink & use new pads for each vrm & repaste gpu & will bring down everything a lot.


Will all that hassle really bring down the VRM temps?

Guess I need to try to find out if it works. I ordered some Arctic thermal pad (1mm thickness) and some Noctua NT-H1 paste; I guess that will do the job.

Why would you want the BIOS .rom?


----------



## Streetdragon

Quote:


> Originally Posted by *chris89*
> 
> $700 vs $369 ... if you have more money than intelligence then yeah 1080ti or the actual flagship of nvidia is titian x pascal... it's just clocked less but way faster capability than 1080ti.
> 
> Plus titan x pascal is
> *3840:240:96*
> 
> 980 Ti is
> *2816:176:96*
> 
> So yeah, the Vega is faster than Titan xp if you were willing to work on the card. Otherwise it's simply thermal limited, some vrm run 110c 24/7 on vega, they did it on purpose to make a few extra bucks.
> 
> 980 Ti is the best bang for the buck. Until Vega comes down in price.
> 
> 980 Ti can do 1,563mhz & thats 150 giga pixel's & 275 giga texel's & 400gb/s... this cannot be overlooked, the 980 ti is the best value.


http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-1080-Ti/3439vs3918

go for 1080TI if you wanna get the best....


----------



## chris89

It's price, bro. If I had the money, I'd buy the 980 Ti in a heartbeat. We're years behind on AMD's first 96-ROP GPU... Those 96 ROPs really help; they give you the headroom at high resolution. It's 60fps or near it at 4k on the 980 Ti...

What's the point of the 1080 Ti?

The 1080 Ti has 88 raster operators on the core die, versus 96 on the 980 Ti & TITAN XP.

The 1080 Ti has 224 texture mapping units on the core die & the 980 Ti has 176.

So the 1080 Ti has 5GB more memory & 48 more texture mapping units... for an extra 450 dollars... Is it worth it?

You pay $450 more for the 1080 Ti... Hardly anyone can afford a GPU over $300 anyway; they just rack up debt on their credit cards.

I think the 980 Ti is a clear winner in my eyes. Nvidia does really well when memory is taxed; where AMD will hitch, Nvidia seems to make use of dynamic shared video memory more efficiently & effectively.

I love AMD, but the elite are ruining the whole computer technology industry right now... Gaming is going downhill: no good games, prices far too high on PC hardware... Right?!?

@akama

I recommend the pad material I posted because I use it & it's proven to work. Here are the temps using Arctic Silver Ceramique 2 & Laird Thermagon boron nitride (15 W/mK) thermal pads.

It's basically $6 for a year's worth of thermal paste & $6 for plenty of pad material... good stuff... Let's hope your VRM temps go down with your choice of thermal solutions.


----------



## chris89

*@akama*

I managed to fix up your BIOS with hex editing & modding. It's quite decked out & should run much cooler. Test it & post screenshots.

It's the latest ASUS R9 380 Strix 4GB BIOS from TechPowerUp, modded:

1133mhz core & 1500mhz memory
128w TDP limit & 128a TDC limit
zero core voltage to VDDCI IMC offset
Try it & lmk

You might need to force-flash from Windows with ATIWinflash. Extract ATIWinflash, then right-click atiwinflash.exe > Properties > Compatibility > Run as administrator.
Rename the BIOS file to bios.rom & copy it into the ATIWinflash folder. Hold Shift, right-click, "Open command prompt here", then:
*atiwinflash -f -p 0 bios.rom*

ASUS.R9.380.4GB.1133.1500.Z.Offset.75C.128.P.Limit.zip 101k .zip file


----------



## christoph

I have a problem with my 390 Sapphire Nitro: idle temps.

Do you want to see what my full-load temps are?


----------



## akama

Quote:


> Originally Posted by *chris89*
> 
> *@akama*
> 
> I managed to fix up your bios with hex editing & modding. It's quite decked out & should be much cooler. Test it & post screenshots.
> 
> It's the lastest ASUS R9 380 Strix Bios 4GB from Techpowerup. Modded.
> 
> 1133mhz core & 1500mhz memory
> 128w tdp limits & 128a tdc limits
> zero core voltage to vddci imc offset
> Try it & lmk


Can a custom BIOS really help? Didn't know that; I thought it was all about the silicon lottery.

Will try to research a bit more before I reflash.


----------



## diggiddi

Quote:


> Originally Posted by *chris89*
> 
> I'd go Ryzen first to show the difference between PCIe 2.0 & PCIe 3.0 before upgrading the GPU
> 
> Big difference is CPU is real fast but also 16GB/s over PCIe compared to 8GB/s on PCIe 2.0 which helps a lot at 4k, *like 20-30fps higher at max resolution/ details.* Where before it would lag out and be like I need a faster GPU... It's just a motherboard limitation on PCIe 2.0... The 390X is PCIe 3.0 GPU, so need PCIe 3.0 system to see its full potential.


Where are you getting these numbers from?


----------



## chris89

Quote:


> Originally Posted by *diggiddi*
> 
> Where are you getting these numbers from?


You're questioning my computing performance guidance? DO I HAVE TO EXPLAIN MYSELF?


----------



## Seahawkshunt

Quote:


> Originally Posted by *chris89*
> 
> You're questioning my computing performance guidance? DO I HAVE TO EXPLAIN MYSELF?


PLEASE DO! Not because I don't believe you, but because I'm genuinely curious. I don't have a 4k monitor, but I do use VSR from 4k to 1080p in some games and some benchmarks. When I went from AM3+ to AM4 I saw little to no difference in GPU performance (within margin of error). So yes, please explain.


----------



## chris89

Quote:


> Originally Posted by *Seahawkshunt*
> 
> PLEASE DO! Not because I do not believe you but because I am genuinely curious. I do not have a 4k monitor but I do use VSR from 4k to 1080p in some games and in some benchmarks. When I went from AM3+ to AM4 I see little to no difference in GPU performance(within margin of error) So yes please explain?


I BET YOU NEVER EVER CHECKED AIDA GPGPU, HUH?


----------



## Seahawkshunt

Quote:


> Originally Posted by *chris89*
> 
> I BET YOU NEVER EVER CHECKED AIDA GPGPU, HUH?


Thanks for the explanation. No, I have not checked AIDA GPGPU. Do you see a big FPS difference when switching from PCIe 2.0 to PCIe 3.0?

EDIT; forgot to quote


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> 125 divided by 1.20 equals 104.17 rounded up.
> 
> 125 divided by 100 equals 1.25 (25%)
> 
> 100 divided by 1.20 (20%) equals 83.33
> 
> Its likely too much of a pain in the rear end to add VRM cooling. Did you do anything with your XFX 390X?
> 
> So did you ever even flash the new bios? Your running stock bios & still artifacting & still asking how to fix artifacting?


I flashed the 270 bios. It seems fine. It sat at 80% fan and 54 degrees when I tested it.

The reported voltage in GPU-Z is 1.163 V.


Hey Chris. I flashed your bios...

It's nice.







It got to 54 degrees Celsius earlier and the fans were at 80%.

What would happen if the power limit was set to 100 instead of 130; would it throttle?

Would it be worth putting fresh thermal paste on the core? I got it in Feb 2015, used it for a few months, and it's mostly been in storage since.


----------



## chris89

@componentgirl90 You're welcome. All that matters is whether you're happy with it. Is it fast enough? It should show 1000MHz core in HWiNFO & 1.163v as you said. Your screenshot shows 1.15v & 900MHz, so yes, it's throttling. The limit is set to 130w, so it should be able to run at 1000MHz core.
Quote:


> Originally Posted by *Seahawkshunt*
> 
> Thanks for the explanation. No I have not checked AIDA GPGPU. Do you see a big FPS difference in AIDA GPGPU when switching from PCIe 2.0 to PCIe 3.0..
> 
> EDIT; forgot to quote


Use AIDA GPGPU: Memory Read/Write should be ~16GB/s on PCIe 3.0 & ~8GB/s on PCIe 2.0. That's 2x the bandwidth from system memory to video memory, which helps at very high resolution.

Send me your BIOS: .zip the .rom from GPU-Z and attach it to avoid the forum's JSON error.
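For reference, those 8 vs 16 GB/s figures line up with the theoretical bandwidth of an x16 link; a quick sketch, with the generation table (raw rate plus line-code efficiency) as the only inputs:

```python
def pcie_x16_bandwidth_gbs(gen):
    """Theoretical usable bandwidth of a 16-lane PCIe link, in GB/s."""
    # Per-lane raw rate (GT/s) and line-code efficiency per generation:
    # Gen 1/2 use 8b/10b encoding, Gen 3 uses 128b/130b.
    specs = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    rate_gt, efficiency = specs[gen]
    return 16 * rate_gt * efficiency / 8  # 16 lanes, 8 bits per byte

print(pcie_x16_bandwidth_gbs(2))            # 8.0 GB/s, what AIDA shows on PCIe 2.0
print(round(pcie_x16_bandwidth_gbs(3), 2))  # 15.75 GB/s, the "16GB/s" figure
```

Real AIDA numbers come in a little under these because of packet overhead, but the 2x gap between generations is exactly what the benchmark exposes.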


----------



## diggiddi

Dude, that's a benchmark, not a game


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> @componentgirl90 Your welcome. All that matter is if your happy with it. Is it fast enough? It should be 1000mhz core in hwinfo & 1.163v as you said. Your screenshot shows 1.15v & 900mhz. Yes it's throttling. It's at 130w so it can run at 1000mhz core.
> Use AIDA GPGPU, Memory Read/ Write should be 16GB/s on PCIe 3.0 & 8GB/s on PCIe 2.0. 2x more bandwidth from system memory to video memory, helps at very high resolution.
> 
> Send me your BIOS. .zip the .rom from GPUz. Attach to avoid JSON.


I am happy with that for sure. Just wanted to get rid of the artifacts.

I don't know why it was not at 1.163; it did that on the stock bios too.

Here you can see it performing as it is supposed to in the last session:


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> I am happy with that for sure. Just wanted to get rid of the artifacts.
> 
> I dont know why it was not at 1.163, it did that on the stock bios too.
> 
> Here you can see it performing as it is supposed to in the last session:


Still seeing artifacts? Can you take a screenshot showing the artifacts & post it, or use ReLive to capture it?

Can you check HWInfo memory errors?


----------



## chris89

Quote:


> Originally Posted by *diggiddi*
> 
> Dude that's a benchmark not a game


If you can't tell a difference from PCIe 2.0 to 3.0... then it's likely not even yielding PCIe 3.0 speeds. Gotta test AIDA GPGPU to see if it's even working at PCIe 3.0. I've seen dudes with new PCIe 3.0 systems operating at PCIe 1.0 speeds, so I would check; it might need a BIOS update or a DDU & driver reinstall.


----------



## Seahawkshunt

Quote:


> Originally Posted by *diggiddi*
> 
> Dude that's a benchmark not a game


I know... I was attempting sarcasm; benchmark differences do not always equate to real-life gains, like 20-30 FPS gains from changing from PCIe 2.0 to PCIe 3.0 when using 4K monitors with the 390X. All my GPU benchmark scores stayed the same on HWBot when I changed from AM3+ to AM4, same with all my games. I was hoping after upgrading I would see a boost with my R9 390X, but no such luck. Maybe I am doing something wrong?


----------



## chris89

Quote:


> Originally Posted by *Seahawkshunt*
> 
> I know... I was attempting sarcasm, benchmark differences do not always equate to real life gains. Like 20-30 FPS gains from changing from PCI 2.0 to PCI 3.0 when using 4k monitors with the 390x. All my GPU benchmark scores stayed the same on HWBot when I changed from AM3+ to AM4, same with all my games. I was hoping after upgrading I would see a boost with my r9 390x but no such luck. Maybe I am doing something wrong?


Yeah, test AIDA GPGPU & send me your BIOS, and we will compare. It tells you how many gigaflops of total GPU horsepower you have, as well as showing whether PCIe 3.0 does or doesn't make a difference. The slot is working at 3.0 speed if it's reporting 13GB/s+ on the memory read/write tests.

It has to do with the throughput between system memory & video memory: there is a limitation there. I run into a bottleneck on PCIe 2.0 with certain detail settings; the GPU is powerful enough, it's just that the slot can only carry a light load on PCIe 2.0. 4K can saturate PCIe 2 & 3 in a heartbeat though...


----------



## Seahawkshunt

Quote:


> Originally Posted by *chris89*
> 
> Yeah test AIDA GPGPU & Send me your BIOS & we will compare with AIDA GPGPU..


o7 Chris, I will check it out this weekend (compare AIDA64 GPGPU on both of my rigs). I will also follow your directions on sending you my BIOS and post my AIDA64 results. Maybe with a BIOS mod from you my card can be the #1 R9 390X on Firestrike (valid). I am currently 2nd with any CPU/390X combo (on air). https://www.3dmark.com/fs/13653983

Thanks for your time and effort.


----------



## chris89

Quote:


> Originally Posted by *Seahawkshunt*
> 
> o7 Chris, I will check it out this weekend (compare Aida64 GPGPU on both of my rigs). Also I will follow your directions on sending you my BIOS and post my results for AIDA64. Maybe with a BIOS mod from you my card can be #1 R9 390x on Firestrike (valid) I am currently 2nd with any CPU/390x combo(on air). https://www.3dmark.com/fs/13653983
> 
> Thanks for your time and effort.


Just to let you know, your card is waxing my 390X on PCIe 3.0, and by vast margins too.. dude, crazy

On stock clocks, dude... with my bios mod your results are gonna soar sky high...

I'm having a hard time pulling a 15,000 graphics score on a dual-Xeon hex-core (24 threads), 48GB RAM, PCIe 2.0 system & a 390X @ over 1,200MHz core.

1200MHz core, 1700MHz memory here, power limit maxed, PCIe 2.0 all the way out

Try setting the Ryzen to 4.1GHz or 4.2GHz?

In AMD Settings, try 4:2:2 & 8bpc for higher FPS, set Performance in Gaming & a Frame Rate Target of 200fps, & also try Tessellation ON & OFF so you can be #1 on the charts.. I see the #2... invalid i7-6700k result.. 104fps? Yeah right, on average?


----------



## chris89

Check out my 390X haha.. I added a whole bunch of thermal pads to fix some issues I was having & it's doing as good as it can... clocking to 1400MHz for screenshot's sake haha

*AMD ought to put the RX 480's Samsung 8GB memory on a 14nm Hawaii XT core (64 ROPs : 176 TMUs) & release it for $249.99*

A 14nm 390X could handle 1,407MHz on 1.25v and pull ~90 gigapixels & ~248 gigatexels per second... it would be way more efficient.

*I'm showing AMD here that the 115C limit is way too high and is slowing it down massively... As we can see here, AMD, 84C yields the highest Single-Precision Julia score.*


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Still seeing artifacts? Can you take a screenshot & post it with the artifacts? or use ReLive to show it...?
> 
> Can you check HWInfo memory errors?


It hasn't artifacted at all yet, but it's only been a day or so. I will let you know if it still artifacts.









Also no memory errors after running heaven benchmark.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> It hasn't artifacted at all yet but its only been a day or so. I will let you know if it still artifacts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also no memory errors after running heaven benchmark.


Which games are ya playing? Is it gaming at 1000MHz or 900MHz? If 900MHz, I think it's throttling.. idk though haha

You need to leave GPU-Z open, switch to max values after gaming, and show the chart to see if it throttled.


----------



## componentgirl90

Battlefield series games, e.g. 1 and 4. Heaven Benchmark.

Ok, so I am not sure if it was 1000MHz the whole session yesterday.

But today it was 900MHz the whole time in Heaven Benchmark. Maybe it doesn't have enough power tbh... the TDP is stated in the specs as 175W for this card, but the bios says 125 on the old one and 130 on the new one. Also, maybe Afterburner is interfering with it.

I think I remember it behaving like this on the stock bios, but I can't remember.

Anyway, it's not too important so long as it doesn't artifact. I want to move on to the 390X, as that card is totally screwed and is actually a card that matters lol. Going to put the bios on that, but also put new thermal paste on the core (which I suck at btw, but it's better to do it than not in this case) and then put that heatsink on afterwards.
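The 900MHz-under-a-130W-limit behaviour being discussed here is classic power-limit throttling; a crude sketch of the idea (the proportional back-off is my own simplification, not AMD's actual PowerTune algorithm, and the wattage figures are illustrative):

```python
def effective_clock_mhz(requested_mhz, est_board_power_w, power_limit_w):
    """Crude model of power-limit throttling: if the estimated board
    power exceeds the BIOS limit, scale the core clock down just enough
    to fit back inside it (real PowerTune also steps voltage down)."""
    if est_board_power_w <= power_limit_w:
        return requested_mhz
    return requested_mhz * power_limit_w / est_board_power_w

# A card asking for ~144 W against a 130 W limit lands near 900 MHz:
print(round(effective_clock_mhz(1000, 144, 130)))  # 903
```

This is why raising the BIOS power limit (125W to 130W+ in the roms above) lets the card hold its full 1000MHz clock under load.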


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Battlefield series games e.g. 1 and 4. Heaven Benchmark.
> 
> Ok so I am not sure if was 1000Mhz the whole session yesterday.
> 
> But today was 900Mhz the whole time in Heaven Benchmark. Maybe it doesn't have enough power tbh....the TDP is stated in the specs as 175W for this card but the bios says 125 on the old one and 130 on the new one. \Also. maybe afterburner is interfering with it.
> 
> I think I remember it behaving like on the stock bios but cant remember.
> 
> Anyway its not too important so long as it doesn't artifact. I want to move on to the 390x as that card is totally screwed and is actually a card that matters lol. Going to put the bios on that but also put new thermal paste on the core (which i suck at btw but its better to do it than not in this case) and then put that Heatsink on afterwards.


Right on. I think it's because you might have added a -20% power limit in MSI Afterburner. It should run at 1000MHz.

This one is a little undervolted but with faster clocks & the same fan speed.. might help.. 1138mv @ 1094MHz rather than 1163mv @ 1000MHz.. it was stable at 1150mv from 900MHz, so this should work: 1094MHz at 1138mv.

Pitcairn.1094mhz.zip 97k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Right on. I think it's because you might of added -20% power limit in MSI Afterburner. It should run at 1000mhz.
> 
> This one is a little undervolted but faster clocks & same fan speed.. might help.. 1138mv @ 1094mhz rather than 1163mv 1000mhz.. it was 1150mv from 900mhz so this should work.. 1094mhz 1138mv.
> 
> Pitcairn.1094mhz.zip 97k .zip file


I loaded this one and it seems stable so far. Running at 1094MHz constantly.


----------



## chris89

Nice. Check out my 390X.. got it up to a 1,335MHz core clock, & these temps are on the reference blower.


----------



## componentgirl90

Are you serious? Holy...

I looked at posts and stuff and was thinking of settling for a 1160 overclock, or 1180 at most in future, and even then I wondered if it would artifact or whatever.

Nice.

What is your Heaven benchmark score?
Quote:


> Originally Posted by *chris89*
> 
> nice check out my 390x.. got it up to 1,335mhz core clock & temps are on reference blower


----------



## boot318

Quote:


> Originally Posted by *chris89*
> 
> nice check out my 390x.. got it up to 1,335mhz core clock & temps are on reference blower


HWiNFO has a bug where it can misread clock speeds. It didn't get up to 1335MHz. No Hawaii chip is getting there without extreme watercooling, luck, a golden chip, or LN2.

Did you stress test that?


----------



## chris89

Yes, it's like 7.4 teraflops. See where it says Single-Precision FLOPS? The readout is in gigaflops, so 7400 gigaflops is 7.4 teraflops. It can take load at these speeds & complete at maximum voltage, 1.449v, on a reference 290X blower on a 390X card.

Vega is like 13 teraflops, so nearly 2x faster in single precision & most other things, but in double precision the 390X is just as fast as Vega, if not faster.

That HWiNFO shot is after load... a hefty compute load.. it can only compute at these speeds; it can game at a maximum of 1,250MHz core clock.
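Those FLOPS figures follow directly from shader count times clock: each GCN stream processor retires one fused multiply-add (2 FP32 ops) per cycle. A sketch using the 390X's published 2816-shader spec:

```python
def fp32_tflops(shaders, core_mhz):
    # 2 FP32 ops (one fused multiply-add) per shader per clock
    return shaders * 2 * core_mhz * 1e6 / 1e12

# R9 390X (Hawaii/Grenada XT) has 2816 stream processors
print(round(fp32_tflops(2816, 1335), 2))  # 7.52, matching the ~7.4 TFLOPS reading
print(round(fp32_tflops(2816, 1050), 2))  # 5.91 for a stock-clocked 390X
```

So the AIDA Julia number is essentially a check that the card is actually sustaining its clock under compute load.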


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> nice check out my 390x.. got it up to 1,335mhz core clock & temps are on reference blower


Heaven froze on that 270 bios I think. I guess that means the card is being pushed a bit too far for that voltage or whatever. Maybe 1050 would work?


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Heaven froze on that 270 bios I think. I guess that means the card is being pushed a bit too far for that voltage or whatever. Maybe 1050 would work?


Cool dude, one voltage step higher than 1138mv is 1150mv:









150w & still 1094mhz core & 1500mhz memory from 1400mhz

pitcairn4.zip 97k .zip file


----------



## componentgirl90

Quote:


> Originally Posted by *chris89*
> 
> Cool dude 1 step higher voltage than 1138mv is 1150mv
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 150w & still 1094mhz core & 1500mhz memory from 1400mhz
> 
> pitcairn4.zip 97k .zip file


Artifacts. Gets to 59 degrees C.

1050MHz core, 1400MHz memory, 1.138v, 130W might be ideal tbh?

It performs really well on Pitcairn4.rom, but it artifacts. Any time it goes over like 55C seems to be a recipe for artifacting.


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> Artifacts. Gets to 59 degrees C.
> 
> 1050Mhz, 1400Mhz, 1138v, 130W might be ideal tbh?
> 
> Performs really well in Pitcairn4.rom but artifacts. Any time it is over like 55 seems to be a recipe for artifacting.


Let's see if voltage does the trick... also post a GPU-Z screenshot so we can see it took the 1500MHz.

Thanks

I won't give up on you if you don't give up on me, I'll stick with you until it's perfect, Honey.

1094MHz/1500MHz, 1163mv, 150w, same fan speed. PS: the last bios was 1.150v.

pitcairn5.zip 97k .zip file


----------



## componentgirl90

.....

pitcairn5 still artifacts.

1.138v = the best one you sent. But it froze.

I made a new bios with the 1094MHz dropped to 1050.

Ownbios.zip 97k .zip file


It's ok but noisy (fan 100%) and 59 degrees C. I want it cooler + less noisy. Not bothered about performance too much.

What next?


----------



## chris89

Quote:


> Originally Posted by *componentgirl90*
> 
> .....
> 
> pitcairn5 still artifacts.
> 
> 1138v = best one you sent. But froze.
> 
> I made new bios with 1094MHz to 1050.
> 
> Ownbios.zip 97k .zip file
> 
> 
> Is ok but noisy (fan 100%) and 59 degrees C. Want cooler + less noisy. Not bothered about performance 2 much.
> 
> What next?


Kk I Luv U









1000mhz 1125mv 1400mhz memory 256w quiet

pitcairn6.zip 97k .zip file


----------



## POLJDA

pci 2.0 100mhz vs 130mhz memory


----------



## chris89

Quote:


> Originally Posted by *POLJDA*
> 
> pci 2.0 100mhz vs 130mhz memory


Wanna send your BIOS? I can get it up to over 1,000 fps in single-precision Julia & 6.5 teraflops..?

Attach the .rom from GPU-Z as a .zip to avoid the JSON error

Thanks


----------



## POLJDA

Hawaiir9390noX.zip 99k .zip file

Quote:


> Originally Posted by *chris89*
> 
> Wanna send your BIOS? I can get it up to over 1,000 fps single precision Julia & 6.5 Tera Flops per second..?
> 
> Attach .rom from GPUz as .zip to avoid JSON error
> 
> Thanks


1200/1565 on AIDA


----------



## chris89

@Tame

Can you help me make the timings change for others? This is gonna help tremendously.. especially when the memory can clock to 1,758MHz (450GB/s) @ 1000mv.

Is this what I do? Simply search for these original timings in any rom & replace them with the new timings? I compared the differences.. there was one difference I was unsure of; does that change when the checksum is corrected?

I wasn't sure about this difference?



Original Timings
ABC07 - Start Offset
ACF9 - End Offset
77 71 33 20 00 00 00 00 8C C5 58 34 60 55 0F 0F 2C 94 B8 07 00 48 C5 00 22 FF 1C 08 5C 0F 14 20 5A 89 00 A0 00 00 01 20 12 0D 23 28 7B 22 2D 13 1C 19 02 00 77 71 33 20 00 00 00 00 CE CD 59 39 80 55 11 11 2E 15 89 08 00 48 C6 00 22 33 9D 08 6C 00 14 20 6A 89 00 A0 02 00 01 20 14 0F 26 2B 88 25 2F 15 A4 2C 02 00 77 71 33 20 00 00 00 00 CE 51 6A 3B 80 55 11 11 2F 96 D9 08 00 4A E6 00 22 33 9D 08 6C 00 14 20 6A 89 00 A0 02 00 01 20 15 0F 27 2D 8D 26 30 15 F0 49 02 00 77 71 33 20 00 00 00 00 CE 51 6A 3D 90 55 11 12 30 96 49 09 00 4A E6 00 22 33 9D 08 74 01 14 20 6A 89 00 A0 02 00 01 20 15 0F 29 2F 94 27 31 16 C4 7A 02 00 99 91 33 20 00 00 00 00 10 DE 7B 44 80 55 13 12 37 19 4B 0A 00 4C 06 01 22 55 9D 08 75 04 14 20 6A 89 00 A0 02 00 01 20 18 11 2D 34 A4 2A 38 16 98 AB 02 00 99 91 33 20 00 00 00 00 31 62 7C 48 90 55 13 13 39 9A DB 0A 00 4C 06 01 22 55 9D 08 7D 05 14 20 6A 89 00 A0 02 00 01 20 19 12 30 37 AD 2C 3A

2000mhz strap timings error free at high memory clocks & gained some core performance as a result of less memory errors... errors reduce fps
ABC07 - Start Offset
ACF9 - End Offset
BB B1 33 20 00 00 00 00 73 EE 8D 53 80 55 15 13 3E 9E 5D 0C 00 4E 26 01 22 88 9D 08 7E 05 14 20 6A 89 00 A0 02 00 01 20 1C 14 38 40 C5 30 3F 17 1C 19 02 00 BB B1 33 20 00 00 00 00 73 EE 8D 53 80 55 15 13 3E 9E 5D 0C 00 4E 26 01 22 88 9D 08 7E 05 14 20 6A 89 00 A0 02 00 01 20 1C 14 38 40 C5 30 3F 17 A4 2C 02 00 BB B1 33 20 00 00 00 00 73 EE 8D 53 80 55 15 13 3E 9E 5D 0C 00 4E 26 01 22 88 9D 08 7E 05 14 20 6A 89 00 A0 02 00 01 20 1C 14 38 40 C5 30 3F 17 F0 49 02 00 BB B1 33 20 00 00 00 00 73 EE 8D 53 80 55 15 13 3E 9E 5D 0C 00 4E 26 01 22 88 9D 08 7E 05 14 20 6A 89 00 A0 02 00 01 20 1C 14 38 40 C5 30 3F 17 C4 7A 02 00 BB B1 33 20 00 00 00 00 73 EE 8D 53 80 55 15 13 3E 9E 5D 0C 00 4E 26 01 22 88 9D 08 7E 05 14 20 6A 89 00 A0 02 00 01 20 1C 14 38 40 C5 30 3F 17 98 AB 02 00 BB B1 33 20 00 00 00 00 73 EE 8D 53 80 55 15 13 3E 9E 5D 0C 00 4E 26 01 22 88 9D 08 7E 05 14 20 6A 89 00 A0 02 00 01 20 1C 14 38 40 C5 30 3F

Sweet dude, here's my new timings vs the old, too-tight original timings.. so far so good.. they even yielded higher teraflops & FPS with the looser, more stable memory timings.

It also improved the memory subsystem's power efficiency by about 5%... less loss from input to output power...

From 28-33% loss to 23% loss now.. amazing, now it's near the core's 23%.


----------



## chris89

Can we go looser on the timings? It seems I don't see errors in AIDA GPGPU, but 3DMark Firestrike shows errors after 5-10 seconds of load... it's highly susceptible to errors at 1,563MHz @ 1000mv, even on the 2000MHz-strap timings.

I see like 500 errors after 10 seconds of load...

It's annoying having errors.

I think it needs Fujipoly 17 W/mK thermal pads, 1mm to 1.5mm; I'm thinking 1.5mm for compression's sake, to cool the memory modules.. since they heat up & error out at 1,563MHz...

I'm using Thermagon on my modules, memory VRM & all other chips... which reduced power loss from 29% to 23%... a 6% efficiency gain.

I wish we could fix this error issue...

I can't wait for a FinFET 14nm Hawaii/Grenada XT with Ellesmere's Samsung 2000MHz GDDR5 on a 256-bit bus, as it's less error-prone & runs much cooler than the 512-bit modules.

Unless AMD makes low-voltage 512-bit 8GB GDDR5 at like 1,563MHz stock, using as little power as the Samsung GDDR5 on Ellesmere.

It particularly needs AMD Infinity Fabric for the PCIe I/O controller, or some "HyperSpeed" I/O controller capable of 64GB/s read/write, with an error-correcting buffer in the I/O controller & a 256MB buffer on the I/O chip to increase the memory read/write speeds... that path in particular is a bottleneck.


----------



## chris89

It's difficult to even match this 15,000 graphics score on the stock timings, even at way higher clocks & power consumption... so the new timings are working better for sure.. I think looser timings would yield even more fps...


*This is the old stock timings: like 10,000,000 memory errors with 1200MHz core & 1700MHz memory, using like 370 watts, compared to 240 watts on the new timings*


----------



## chris89

@POLJDA

Until I find out how to loosen the timings further.. this is stock timings.. post results.

Post AIDA GPGPU & 3DMark Firestrike with Tessellation off, Performance mode, 4:2:2 in Display & 8bpc.. higher fps.

1173mhz.1333mv.1563mhz.zip 99k .zip file


To push it to the limit.. this one has a highly optimized fan profile and goes up to 1449mv core voltage, though it won't yield that until you turn the clock past 1200MHz.

fan goes 8% at 40C, 32% at 50C, 88% at 60C... it's hot but fast, for testing's sake

1205mhz.1449mv.1719mhz.zip 99k .zip file
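The fan profile quoted above is just three (temperature, duty) points; most fan controllers interpolate linearly between the points, which can be sketched as:

```python
def fan_pct(temp_c, curve=((40, 8), (50, 32), (60, 88))):
    """Piecewise-linear fan duty between (temp C, fan %) points,
    clamped below the first point and above the last."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])

print(fan_pct(55))  # 60.0, halfway between the 50C and 60C points
```

The steep 32%-to-88% segment between 50C and 60C is what makes this profile loud but effective at the top end.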


----------



## POLJDA

gpgpu1719.png 116k .png file


Capture.PNG 430k .PNG file

Quote:


> Originally Posted by *chris89*
> 
> @POLJDA
> 
> Until I find out how to loosen timings further.. this is stock timings.. post results
> 
> Post AIDA GPGPU & 3DMark Firestrike with Tesselation Off & Performance Mode & 4:2:2 in Display & 8bpc .. higher fps
> 
> 1173mhz.1333mv.1563mhz.zip 99k .zip file
> 
> 
> To push it to the limit.. has highly optimized fan profile here up to 1449mv core voltage though won't yield that until turning up the clock past 1200mhz
> 
> fan goes 8% at 40C, 32% at 50C, 88% at 60C... it's hot but fast for testing take
> 
> 1205mhz.1449mv.1719mhz.zip 99k .zip file


----------



## chris89

Quote:


> Originally Posted by *POLJDA*
> 
> Capture.PNG 430k .PNG file


Got an HWiNFO shot? VRM overheating could be why it won't break 6 teraflops..

I just hit 7.465 teraflops single precision... haha, right before instability.. it only yields 1.400v on the 1.449v bios, so I need to adjust the Load Line Slope to 0, or add +25 or +50mv, to fix this voltage droop.
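The droop described here (1.449v set, 1.400v under load) fits the usual load-line model, where the VRM output sags in proportion to load current. A sketch; the ~300 A core current is an assumption of mine for illustration, not a measurement:

```python
def loaded_voltage_mv(vset_mv, current_a, loadline_mohm):
    # Load-line (vdroop) model: V_out = V_set - I * R_LL
    return vset_mv - current_a * loadline_mohm

# ~49 mV of droop at an assumed ~300 A core current implies an
# effective load line of roughly 0.163 milliohm:
print(round(loaded_voltage_mv(1449, 300, 0.163)))  # 1400
```

A +25 or +50mv offset (or flattening the load-line slope) compensates by raising the no-load setpoint so the loaded voltage lands where you wanted it.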


----------



## POLJDA

Capture1.PNG 921k .PNG file

Quote:


> Originally Posted by *chris89*
> 
> Hwinfo? VRM overheating could be the issue not break 6 tera flops..
> 
> I just hit 7.465 tera flops single precision FLOPs... haha before instability.. only yield 1.400v on 1.449v bios so need to adjust Load Line Slope to 0 or +25 or +50mv to fix this voltage droop


----------



## POLJDA

Quote:


> Originally Posted by *chris89*
> 
> Hwinfo? VRM overheating could be the issue not break 6 tera flops..
> 
> I just hit 7.465 tera flops single precision FLOPs... haha before instability.. only yield 1.400v on 1.449v bios so need to adjust Load Line Slope to 0 or +25 or +50mv to fix this voltage droop


Can you download the new AIDA and test?


----------



## chris89




----------



## POLJDA

Quote:


> Originally Posted by *chris89*


hm.........


----------



## chris89

Quote:


> Originally Posted by *POLJDA*
> 
> hm.........


*Let's see what we can pull on LuxMark v3.0 Hotel... It's a fun, super-realistic ray-tracing app that uses compute, & you can clock the GPU like 20% higher stable... Let's see how it fares against the TITAN Xp.. haha

http://www.luxrender.net/release/luxmark/v3.0/luxmark-windows64-v3.0.zip
*


----------



## POLJDA

luxmark.PNG 27k .PNG file

Quote:


> Originally Posted by *chris89*
> 
> *Let's see what we can pull on LuxMark v3.0 Hotel ... It's a fun super-realistic, ray tracing app that uses Compute & you can clock the GPU like 20% higher stable ... Let's see how it fairs against TITAN XP .. haha
> 
> http://www.luxrender.net/release/luxmark/v3.0/luxmark-windows64-v3.0.zip
> *


----------



## chris89

Quote:


> Originally Posted by *POLJDA*
> 
> luxmark.PNG 27k .PNG file


Nice.. have Hotel Results?


----------



## POLJDA




----------



## chris89

Quote:


> Originally Posted by *POLJDA*


Nice results, great job!

I'd like to change the latter half to 1000/2000 resolution so we don't have to click on it...

width/500/height/1000 to width/1000/height/2000

Cool results... wanna see if the reference bios works on your card? Is your card 390X-unlockable?

http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool


----------



## chris89

1133mhz 1641mhz


----------



## POLJDA

Quote:


> Originally Posted by *chris89*
> 
> Cool results... wanna see if reference bios works on your card? Is your card 390x unlockable?
> 
> http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool


It's not unlockable.


----------



## componentgirl90

I put paste on the 390X as well as the 270.

The 390X had very dried-up, flaky paste, and the core had large gaps with no thermal paste on it at all; it was just exposed metal. There was some material on the cooler, but there may have been some points where the metal was touching the core bare.

The temperatures are 10-12 degrees Celsius lower in game now.


----------



## gapottberg

Well folks, I have been contemplating doing a full BIOS edit again on my 390X... but after updating to the latest MSI Afterburner I have my software settings running as smooth as butter again in all my games.

The slight undervolt and clock speed reduction has my card running silently, and with my latest custom fan speed settings I am amazed once again at how well these cards handle even demanding loads, temperature- and noise-wise, once you customize the settings over the stock ones.

I love this card. Good thing too, because I will be using it for a long time yet if stupid gfx card prices don't come back down soon.


----------



## componentgirl90

Quote:


> Originally Posted by *gapottberg*
> 
> Well folks, i have been contemplating doing a full BIOS edit again on my 390x...but after updating the latest MSI:AB utility i have my software settings running as smooth as butter again in all my games.
> 
> The slight undervolt and clock speed reduction has my card running silently, and with my latest custom fan speed settings I amazed once again at how well these cards handle even demanding loads when it comes to temps and noise when you customize settings over the stock ones.
> 
> I love this card. Good thing too, because I will be using it for a long time yet if stupid gfx card prices dont come back down soon.


Hopefully prices come down soon. Many cards are above MSRP mostly because of demand from mining, which is waning a bit. There are other factors like memory prices, but I'm not really sure how much of a difference those will make; only a small amount, possibly. So prices should drop back to MSRP. You'll probably be able to pick up a card or two at the normal price if you look hard and regularly.


----------



## granadier12

Hi there guys, I've been experiencing something weird with my Gigabyte R9 390. I owned an FX-9370 and upgraded to an R5 1600; now my temps at full load go from 85-95C, whereas before with the FX I was getting 72 at max. I already changed the thermal paste with the Noctua, but the temps keep getting higher and I don't know what to do. What do you guys suggest??


----------



## christoph

Quote:


> Originally Posted by *granadier12*
> 
> Hi there guys, I've been experiencing something weird with my r9390 gigabyte. I was owner of fx9370 and upgrade to r5 1600, now my temps on full load go from 85-95 c and before with the fx i was getting 72 at max, i already change the thermal paste with the noctua nh but the temps keeps getting higher, dont know what to do. What do u guys suggest??


I think it's because the video card is now fully loaded by that CPU; I mean, it was bottlenecked by the FX CPU. I would check airflow in the case, and the driver too: did you clean-install the driver? Is the OS a clean install as well?


----------



## granadier12

Everything is a clean install; I'm only about 4 days into running the Ryzen 5. I have good airflow in the case, but even without the side window it still runs up to 93C. It's driving me crazy!


----------



## christoph

Quote:


> Originally Posted by *granadier12*
> 
> everything is clean install, im just about 4 days since runing ryzen 5, i have a good airflow in the case but even without the side window still run up to 93 c, its driving me crazy!


Check in game how the video card is behaving, i.e. whether it's always at 100%.

Do you have the reference 390, with the blower fan?

If so, then those temps are normal.


----------



## gapottberg

My advice with the 390 is to try some custom settings using something like MSI Afterburner... and once you get it dialed in, maybe look to transfer the setup to a custom BIOS.

I currently have a lot of success with my 390X using an undervolt. I run it at 1000MHz on the GPU and drop the RAM to 1250... then undervolt by -50mv and use a custom fan curve.

With these settings I can hold 60fps in all of my games... granted they are all older titles... and it is nearly silent 100% of the time thanks to my fan settings. I can always overclock it back to something like stock voltage with 1100 GPU and 1333 RAM for more serious gaming.

The RAM speeds are generally overkill for these cards unless you are chasing synthetic bench scores, and they produce a ton of extra heat. I highly recommend doing some gaming with a 1333/1250 clock on the RAM and seeing if you miss any performance.

Also, if you are running a 60Hz monitor, make sure to frame-cap the card to 60fps in Crimson. In most cases there is zero need to make the card render frames your monitor can never show you, and capping it can save a ton of power, heat, and fan noise.

AMD Chill takes that idea a step further and in some games can dynamically lower your rendered fps when you don't need them. There are some good videos on YouTube showing how it works.
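As a ballpark for why the undervolt above runs so much cooler: dynamic power scales roughly with f·V². Comparing the 1000MHz/-50mv profile against an assumed 1.163v/1050MHz stock point (my assumption, not a measured board figure) suggests savings on the order of 13%:

```python
def dynamic_power_ratio(f_new, f_old, v_new, v_old):
    # First-order CMOS dynamic power: P is proportional to f * V^2
    return (f_new / f_old) * (v_new / v_old) ** 2

# 1000 MHz at 1.163 - 0.050 = 1.113 V, vs 1050 MHz at 1.163 V (assumed stock)
print(round(dynamic_power_ratio(1000, 1050, 1.113, 1.163), 3))  # 0.872
```

The squared voltage term is why a modest -50mv offset is worth noticeably more than the 50MHz clock drop alone, and why the card can then hold lower temps on a quieter fan curve.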


----------



## granadier12

Quote:


> Originally Posted by *gapottberg*
> 
> My advice with the 390 is to try some custom settings using something like MSI Afterburner...and once you get it dialed in maybe look to transfer the set up to a custom BIOS.
> 
> I currently have a lot of success with my 390x using an undervolt. I run it at 1000mhz on GPU and drop Ram to 1250...then undervolt to -50mv and use a custom fan curve.
> 
> With these settings i can hold 60fps in all of my games...granted they are all older titles...and it is neaely silent 100% if the time thanks to my fan settings. I can always Overclock it back to something like stock voltage with 1100gpu and 1333 for more serious gaming.
> 
> The ram speeds are generally overkill for these cards unless you are looking at synthetic bench scores and produce a ton of extra heat. I highly recomend doing some gaming with 1333/1250 clock on the ram and see if you miss any performance.
> 
> Also make sure if you are running a 60hz monitor to framecap the card to 60fps in Crimson. There is zero need to make the card render frames your monitor can never show you in most cases and it can save a ton of power and heat and fan noise by capping it.
> 
> AMD chill also takes that idea a step further and in some games can help dynamicly lower your rendered fps when you dont need them. There are some good videos on youtube showing how it works.


I've had a custom fan curve since I bought the card 2 years ago; the fans spin at 100% at 65C and shut off below 44C. I have the Gigabyte Gaming G1. The card is always at 100% utilization when gaming, so I don't know what to do. I have even underclocked the card.


----------



## christoph

Quote:


> Originally Posted by *granadier12*
> 
> I have already had a custom fan curve since I bought the card 2 years ago; the fans spin at 100% at 65 C and stop at 44 C. I have the Gigabyte Gaming G1, and the card is always at 100% utilization when gaming, so I don't know what to do. I have even underclocked the card.


Maybe recheck the thermal paste and use a better one, like Gelid GC-Extreme.


----------



## Rexer

Quote:


> Originally Posted by *christoph*
> 
> Maybe recheck the thermal paste and use a better one, like Gelid GC-Extreme.


Hey christoph, how you doing?
That Gelid GC-Extreme is pretty good stuff. I've been using Arctic Silver for years, but switching to the Extreme got me a few degrees C cooler. Good stuff. I got better temp results using very thin, even amounts of it.
In fact, it worked so well on my 390X that I took apart my RX 580 and applied some on that, too. The 580 is a cooler-running card than the 390X and it didn't make much of a difference there. But it worked wonders on the 390X, probably because it's such a hot-running card.
I'm planning on purchasing an aftermarket Vega 64 once the non-reference cards are out. I'm hearing it also runs hot, so I'm keeping a couple squirts of it around.


----------



## Rexer

Quote:


> Originally Posted by *gapottberg*
> 
> Well folks, i have been contemplating doing a full BIOS edit again on my 390x...but after updating the latest MSI:AB utility i have my software settings running as smooth as butter again in all my games.
> 
> The slight undervolt and clock speed reduction has my card running silently, and with my latest custom fan speed settings I amazed once again at how well these cards handle even demanding loads when it comes to temps and noise when you customize settings over the stock ones.
> 
> I love this card. Good thing too, because I will be using it for a long time yet if stupid gfx card prices dont come back down soon.


I'm waiting on the aftermarket Vega cards myself; I sorta choked on Vega prices. My favorite card, the Asus Strix 390, kacked last spring, so I resorted to using an XFX 390X. It's O.K., but somehow the XFX 390X doesn't like me ramping it to the moon like the Strix did. I purchased a 580 as a failsafe and was pretty happy with its performance. It's short on rendering detail, but it's pretty quick and smooth out of the box.
But I still liked the 390 best.
The understanding is that none of AMD's aftermarket partners like modifying Vega, and it's a hot-running card like the 390/390X. So while AMD's partners mull and fiddle-faddle with Vega, I'm gonna run my 390X till it kacks.
My concern with newer cards is the mining junkies. I've got a friend who's a Bitcoin miner and he buys GPU cards by the pack! Ugh! Really a depressing sight. I barely got an OC 580 8GB before they were all sold out; in less than a month after first release, all 580 8GB cards were gone. If there was a 580 8GB on some retailer's shelf, it was marked up $150 to $250 over the manufacturer's retail price. As of last month the 580 8GB reappeared on the market. I'm wondering if the miners will go after the aftermarket Vega cards and those will get inflated too. I'm not paying $150 over retail if I can help it. In the meantime, I'm gonna run the 390x till 2+2=5.


----------



## gapottberg

Yeah, my 390x has been super impressive. I have been experimenting with VSR again and am amazed how well it handles 1440p on my hardware. Even though it downscales to 1080p on my monitor, it runs incredibly smooth and there is a noticeable improvement to image quality, particularly at smoothing out edges better than some AA methods at 1080p. It looks as good as 4x SSAA without tanking framerates like SSAA does.

I do get a few games that don't play nice with VSR, but most do, and for them I think I will keep it enabled for the long run. DOOM offered particularly amazing performance and image quality with VSR at 1440p.

Makes me tempted to get a native 1440p monitor, but I can't find one with the features I want at a reasonable price.

However, my buddy saw a 1080p 27" FreeSync curved monitor on sale for $200! A Samsung model, even. Prices like that have me about ready to pull the trigger on an upgrade.


----------



## Rexer

When it comes to just straight gaming, I can't stop gushing about the 390 & 390X. I take my 390X into big multiplayer games, Battlefield 1, 3 & 4, Call of Duty Ghosts & Advanced Warfare, and I can't believe how fast and responsive it is. When I get my clocks set it's so smooth and effortless, I feel like I'm the only guy in town. Better than James Brown. I'm using a faster mouse these days and the 390X doesn't miss a beat. Doing war with 16 to 64 guys dukin' it out on the internet is taxing work for a video card, but the 390X hammers out every move and, too often, beats them down. Yeah, of all the GPU cards I own, the 390X is a supreme being.
I'm using a 144Hz monitor and an older i7 3770K with 16GB of DDR3 2400. I'm hoping this computer still flies for a few more years. I also play against clan friends who have big 1080s and can't believe it's as quick as they are. In fact, it's been dominant at times. Lol. I have to be careful not to run away with the score, because they'll quit and (I don't want to discourage them from playing) I won't have a team to play with. Sure, their 1080s are fast, but I make 'em pay for that $$$ Nvidia name.
The 390 & 390X are remarkable cards.


----------



## mus1mus

Has anyone tried undervolting and underclocking a 390/X to 1000/1500?


----------



## gapottberg

Quote:


> Originally Posted by *mus1mus*
> 
> Has anyone tried undervolting and underclocking a 390/X to 1000/1500?


I run mine at 1000/1250 with a -50mv to the GPU using MSI AB. Smooth as butter and quiet and cool for most of my needs.


----------



## mus1mus

Quote:


> Originally Posted by *gapottberg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Has anyone tried undervolting and underclocking a 390/X to 1000/1500?
> 
> 
> 
> I run mine at 1000/1250 with a -50mv to the GPU using MSI AB. Smooth as butter and quiet and cool for most of my needs.
Click to expand...

What is that in terms of voltage?

I ask because I have undervolted my 290 to 1.1V @ 1050/1450. Load voltage goes down to 0.980-1.03V. How low do these cards go?


----------



## gapottberg

Voltage under full load during the Fire Strike Extreme benchmark ranged from 1.109v to 1.078v. I believe the max offset available using MSI AB is -100mv, so you could potentially take another -50mv off if you were confident your clock speed could handle it. I have not pushed it lower, since I had already reached the temps and fan speeds I wanted.

Hope that helps.
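The offsets are straight subtraction from the observed voltages, so the reported range can be projected to a deeper undervolt with simple arithmetic (assuming the offset applies uniformly across the range):

```python
def undervolted_range(load_min_v, load_max_v, extra_offset_mv):
    """Shift an observed load-voltage range by a further offset in millivolts."""
    shift = extra_offset_mv / 1000.0
    return load_min_v + shift, load_max_v + shift

# Observed at -50 mV: 1.078-1.109 V under load. Going to AB's commonly
# reported -100 mV cap would be another -50 mV on top:
lo, hi = undervolted_range(1.078, 1.109, -50)
print(f"{lo:.3f} V to {hi:.3f} V")  # 1.028 V to 1.059 V
```

Whether the card is stable at that projected range is exactly what stress testing has to confirm.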


----------



## christoph

Quote:


> Originally Posted by *Rexer*
> 
> Hey christoph, how you doing?
> That Gelid GC-Extreme is pretty good stuff. I've been using Arctic Silver for years, but switching to the Extreme got me a few degrees C cooler. Good stuff. I got better temp results using very thin, even amounts of it.
> In fact, it worked so well on my 390X that I took apart my RX 580 and applied some on that, too. The 580 is a cooler-running card than the 390X and it didn't make much of a difference there. But it worked wonders on the 390X, probably because it's such a hot-running card.
> I'm planning on purchasing an aftermarket Vega 64 once the non-reference cards are out. I'm hearing it also runs hot, so I'm keeping a couple squirts of it around.


Yeah, Arctic Silver is a good thermal paste, but the Gelid is one of the best right now.


----------



## mus1mus

Quote:


> Originally Posted by *gapottberg*
> 
> Voltage under full load during Firestrike Extream Benchmark ranged from (1.109v - 1.078v). I believe the max offset available using MSI AB is -100mv so you could potentially take another -50mv off that if you were confident that your clockspeed could handle it. I have not pushed it lower due to reaching the temps and fan speeds I wanted.
> 
> Hope that helps.


Indeed. Very nice card you have there it seems.









Thanks a lot bud.

Real reason is, I'm looking at buying one for some 24/7 crunching.







Very helpful.


----------



## gapottberg

Just out of curiosity, does anyone have any games or workloads that utilize more than 4GB of VRAM, and if so, what settings are you using?

I have been running all my games at 1440p with high or better graphics settings, and none of them seem to come close to 4GB yet. Granted, most of my games are older now.

Tombraider 2013
Thief 2013
DeusEx Human Revolution
Doom 2016
AoS:E
KillingFloor2


----------



## jbravo14

Rise of the Tomb Raider
Ghost Recon Wildlands
The Division
Witcher 3


----------



## gapottberg

Quote:


> Originally Posted by *jbravo14*
> 
> Rise of the Tomb Raider
> Ghost Recon Wildlands
> The Division
> Witcher 3


Any of those over 4GB at 1080p?


----------



## jbravo14

Quote:


> Originally Posted by *gapottberg*
> 
> Any of those over 4GB at 1080p?


Yes 1080p


----------



## gapottberg

Underclocked, undervolted 390X: core @1000MHz / memory @1250MHz
FX-8320e OCed @ 4.5GHz,
CPU/NB 2600 / HT 2400
RAM 2133.C11.1T

1440p Ultra settings in Dirt Rally: 60fps with a few dips into the 50s during play (dropped to 57 once in the benchmark). Could lock it above that with subtle tweaks to GFX settings. The game is beautiful but very difficult. A true driving sim. On sale for 80% off through Nov 1st, 2017.

Sadly, it doesn't even use 3GB of the VRAM buffer.


----------



## chris89

Here's a comparison, stock 390X vs delimited 390X on 2000MHz memory timings:

1050MHz core, 1500MHz memory

*This is PCIe 2.0, and going from 5174MB/s to 6681MB/s is 29%, which means 29% more 4K fps, legit*

Someone post this on a PCIe 3.0 system... the biggest change is memory read/write, which is everything at 4K. Clocks are meaningless; it's all read/write limited here.

Going from 5174MB/s to 6752MB/s is 30.5%... that's all there is to be gained... gotta focus on improving the read/write toward PCIe 2.0's ~8000MB/s and PCIe 3.0's ~16000MB/s.
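Those percentages are just the ratio of the two transfer rates; a quick check with the MB/s figures above:

```python
def pct_gain(before, after):
    """Percent improvement of `after` over `before`."""
    return (after / before - 1.0) * 100.0

print(round(pct_gain(5174, 6681), 1))  # 29.1 -> the ~29% PCIe 2.0 gain
print(round(pct_gain(5174, 6752), 1))  # 30.5 -> the ~30.5% gain
```

Whether a bandwidth gain translates one-for-one into 4K fps depends on how memory-bound the workload actually is.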





1150MHz core, 1750MHz memory, on stock voltage


----------



## Worldwin

Is there any real world differences? Like in games?


----------



## chris89

Only at 4K, and yes! Huge difference... way smoother. Rise of the Tomb Raider at *4096x2160*, all ultra/very high, FXAA, triple buffering, DX12 exclusive fullscreen.

Hardly any difference from 1094MHz core to 1250MHz core at 4K... you just need that high memory read/write.

The 390X on Ryzen hits these scores with a low GPU clock... on PCIe 3.0:

*https://www.3dmark.com/fs/13653983*


----------



## gapottberg

I think we should do a study of how far you need to push it for 1440p. Obviously 4K needs near-max bandwidth/memory speed to keep from bottlenecking on PCIe 2.0, but I am curious where the diminishing returns start when running 1440p. The 390x is best suited for 1440p 60Hz imo, and finding the sweet spot for bandwidth/memory speed will allow us to clock them as efficiently as possible, keeping them cool, quiet, and more power efficient.


----------



## chris89

The 390x has Vega-class double precision performance though, so it's a monstrously powerful core.

920 gigaflops, nearly 1 teraflop, double precision.
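As a sanity check, the usual back-of-the-envelope for GCN throughput is shaders × 2 ops/clock × clock, with Hawaii's double precision running at 1/8 the single-precision rate (the 2816-shader count is the 390X's published spec); the result lands in the same ballpark as the figure above:

```python
def gcn_tflops(shaders, clock_mhz, rate=1.0):
    """FMA throughput estimate: shaders x 2 ops/clock x clock, scaled by the precision rate."""
    return shaders * 2 * clock_mhz * 1e6 * rate / 1e12

sp = gcn_tflops(2816, 1050)              # 390X single precision at stock 1050 MHz
dp = gcn_tflops(2816, 1050, rate=1 / 8)  # Hawaii consumer DP runs at 1/8 rate
print(f"SP {sp:.2f} TFLOPS, DP {dp:.2f} TFLOPS")  # SP 5.91, DP 0.74
```

Overclocking the core scales both numbers linearly, which is how the higher teraflop figures quoted in this thread come about.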

I just found it needs a minimum of 1173MHz core & 1563MHz memory for the increased memory read/write & 2000MHz memory timings without errors, or maybe a few... hot memory... 1000mv memory voltage & voltage protection at 255...

The fan profile lets it hit 75C or so, and it's not too loud either if your VRM is cooled properly... the VRM doesn't exceed 75C, basically. Only Boron Nitride material, Thermagon by Laird.

*Send me your BIOS*. I'm getting 40fps at 4K ultra 4096x2160, but only 30Hz over HDMI... need DisplayPort for even 60Hz 1440p... can't do 60Hz beyond 1080p on HDMI with the 390X, sad.

Here's the BIOS; it's the latest version possible, I compared. It's an HP 390X reference GPU.
1333mv core

6.8 Tera Flops here

1173mhz-1563mhz.zip 99k .zip file


1449mv core - needs exceptional air cooling or water... I use air... can clock to 1300MHz on this voltage for OpenCL or something light

I can go to 1600mv too, nearly 1400MHz core, & 7.5 teraflops I saw

1225mhz-1563mhz.zip 99k .zip file


----------



## gapottberg

Crap, I totally forgot the HDMI was limited to 60Hz at 1080p. That means I need to grab a DisplayPort cable if I do go with the new 27" FreeSync monitor I have my eye on this holiday. Thanks for reminding me, mate.

I plan on getting you my BIOS soon, but I want to work on it with you. Jump in Discord when you are on, and I will hit you up some evening when we are both free so you can walk me through what you are doing. I am keen to learn the finer points from a seasoned vet, as I am running out of material to read on the topic.


----------



## chris89

Sure thing but here man Im using mobile data so very limited access here are some screenshots


----------



## gapottberg

Quote:


> Originally Posted by *chris89*
> 
> Sure thing but here man Im using mobile data so very limited access here are some screenshots


Question: what do the 6528X numbers indicate in the voltage columns for both the GPU and memory rows 1-6?

My guess is they are some code for stepping voltage up from the previous entry or something like that. So voltage starts at 900 in row 0... steps up by some unit each time in rows 1-6... and finally uses the 1333 millivolts you tell it to in row 7, instead of whatever the stepping was.

Am I on track with my thinking?
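If the stepping guess is right, the intermediate DPM entries would simply interpolate between the two endpoints. A purely illustrative sketch (the real encoding of the 6528X codes is not confirmed here):

```python
def dpm_voltages(v_low_mv, v_high_mv, states=8):
    """Linearly interpolate per-state voltages from DPM0 to DPM7 (illustrative only)."""
    step = (v_high_mv - v_low_mv) / (states - 1)
    return [round(v_low_mv + i * step) for i in range(states)]

print(dpm_voltages(900, 1333))  # [900, 962, 1024, 1086, 1147, 1209, 1271, 1333]
```

In practice the driver also reads VID codes fused into the chip, so the real per-state voltages need not be evenly spaced like this.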


----------



## chris89

Here it shows it uses less voltage; 65288 at stock is 1.275v, so I set it to 1.333v for 1173MHz core.

If I set them lower, like 300-325-350-375-400-425-450-1173MHz, then it would use even less voltage below peak 3D... & it doesn't throttle below 1173MHz unless you set a frame rate cap.

It idles at 900mv and only goes up to 1.1v, so this is okay as is.


----------



## gapottberg

Ok, so I am on the right track. Also, I am curious: why 1173MHz? Is it intentional? I ask because the new Xbox One X has essentially the exact same clock speed on its cores. Coincidence?


----------



## chaosblade02

Is anyone else with an R9 390 getting desktop artifacts with the newer Crimson Drivers?


----------



## chaosblade02

MSI R9 390 Gaming 8G. I need +88mv to hit 1170 core stable, but I can get 1125 for a minor bump of +13mv, or 1100 core with no voltage increase. The memory OCs an extra +150 with no increase on the aux volts, but won't hit +200 stable no matter what. The core won't go any higher than the 1170 range, even @ +100mv. +88mv is pushing the limits of air cooling on my GPU already: some games already hit the 85-88C range @ +88mv after 30 minutes of play, whereas at +13mv I'm sitting in the 75-80C range. Power limit +20%. It doesn't like it if I set the power limit any higher; the GPU gets more unstable with the power limit above 20%.

I'm unsure if increasing aux volts has any impact on core stability, and if it's possible I could get a higher core clock by modifying aux volts or not. Increasing them by 20-30 didn't seem to have much of an impact on temps.


----------



## gapottberg

I found while stress testing that my MSI 390x needs +6mv in MSI AB when running at 1100MHz / +50% power limit to eliminate all artifacts.

I don't particularly like the thermals and fan speeds I get running it that way, but those are subjective. I have been undervolting and underclocking it to 1000/1250 / +50% PL with -50mv on the GPU.

Some recent benchmarking has me looking into the max settings and yields I can squeak out before my limits on temps and noise are breached.

So far I have the memory back up to 1500MHz with good fps gains and very little additional heat or noise. I am hoping I have time for a detailed inspection of various settings that I can share here later this month.


----------



## chris89

Sweet, dude. I pulled my 390x & am running an RX 550 LP 2GB; it's ultra efficient & fast. I have it clocked to 1,375MHz core from 1,203MHz, & from 1.075v to 1.15v... It's performing nicely. It's a beast once BIOS modded.

From 1,203MHz to 1,375MHz it went up 0.5 teraflops single precision, & gaming went up by 50%.

Changed TDP from 38 watts to 75 watts.


----------



## gapottberg

Hey Chris, can you explain better the changes in your 390x BIOS for delimiting a card? I can see the values you have changed, but am curious as to why you chose the numbers you did and how they compare to a software power limit boost of +50%.

In other words...

If the stock BIOS number is X, what would it be for X at a +50% power limit? (My guess is X * 1.5, but idk.)

And why did you choose the inputs you did for your BIOS edits? Can they go higher, or is that the max?
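For the power-limit half of that question: the slider is a percentage multiplier, so +50% on a stock value X is indeed X × 1.5. A quick sketch (the 275 W figure is the 390X's published typical board power; how that maps to the exact BIOS field is the open question here):

```python
def adjusted_power_limit(stock_watts, slider_pct):
    """Apply a +/- percent power-limit slider to a stock board-power value."""
    return stock_watts * (1 + slider_pct / 100.0)

print(adjusted_power_limit(275, 50))   # 412.5 -> X * 1.5, as guessed
print(adjusted_power_limit(275, -20))  # 220.0
```

Note the BIOS power tables usually hold several related fields (TDP, TDC, max power limit), so a single multiplier may not tell the whole story.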


----------



## chris89

Do you know about GPU memory errors? I get them, I think, at 1500MHz (I never tested), but 1563MHz gives a lot of errors at 1000mv... using 2000MHz timings there are fewer errors, from millions down to just hundreds of thousands, but still...

I think the modules overheat at 1000mv... need Fujipoly 17 W/mK pads. I'm using Laird Thermagon Boron Nitride, like 15 W/mK, but I guess that's not good enough... idk... this stuff holds the VRM no issue at upwards of 1400MHz, more like 1300MHz or so to 1320MHz.

No errors on 1250MHz memory at 875mv... idk, maybe the modules overheat?

I found out I can run the memory at 1563MHz at a bare minimum of 919mv, which only yields a couple of errors. Plus I noticed one or two of my modules had minimal thermal pad contact, so I redid it and it's better.



*Turn down RAM voltage to 875mv & 1272MHz memory & no errors*



*919mv memory, but with a crash, so I guess 920-925mv is ideal. It's faster too; without errors at 4K it's noticeably smoother, faster, & less laggy*


----------



## Cherryblue

Guys, I'm asking again,

Was anyone able to change the lowest state core voltage of their 390X?

I use Hawaii Bios Reader, and although I change the DPM0 value, it isn't taken into account when I look at core voltage through HWiNFO.

Any idea why?

Thanks in advance!


----------



## thiussat

So my 390 died a while back, and Gigabyte said that since they don't make 390s anymore, they'd send me a 480. I looked at the benchmarks and the 480 seemed to be even with the 390 in most titles.

However, it hasn't been my experience. I play MMO's and my 480 has given me a massive drop in FPS. I have to turn my graphics all the way to low and I STILL lag in cities. When I had my 390, I could turn my graphics to max and never lag, even in cities (60 FPS at all times).

***? The 480 is a piece of excrement and in no way is equal to the 390. I wish there was a way I could make Gigabyte give me a faster card.


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> Guys, I'm asking again,
> 
> Was anyone able to change the lowest state core voltage of their 390X?
> 
> I use Hawaii Bios Reader, and although I change the DPM0 value, it isn't taken into account when I look at core voltage through HWiNFO.
> 
> Any idea why?
> 
> Thanks in advance!


Core voltage 1250mv & memory at 895mv .... vs 1275mv & 1050mv

1094Mhz Core & 1250Mhz Memory

CherryBlue-TryThis.zip 101k .zip file


It can't really do more than 1GHz at 1.250v, so maybe 1094MHz at 1.250v bare minimum.
At 1.275v it can do 1,133MHz; that's 72.5 GPixel/s.
What's your issue? Send me your BIOS.









Quote:


> Originally Posted by *thiussat*
> 
> So my 390 died a while back and Gigabyte said that since they don't make 390's anymore, that they'd send me a 480. I looked at the benchmarks and the 480 seemed to be even with the 390 in most titles.
> 
> However, it hasn't been my experience. I play MMO's and my 480 has given me a massive drop in FPS. I have to turn my graphics all the way to low and I STILL lag in cities. When I had my 390, I could turn my graphics to max and never lag, even in cities (60 FPS at all times).
> 
> ***? The 480 is a piece of excrement and in no way is equal to the 390. I wish there was a way I could make Gigabyte give me a faster card.


Yeah, you need a better CPU, you need Ryzen. Even the little quad-core AMD Ryzen R3 1200 will outperform my dual Xeon X5650 system with 24 threads at 3.06GHz... That R3 1200 @ 4GHz would smoke my Xeons in games at 1080p, 1440p, and 4K. It has to do with Ryzen's Infinity Fabric & the latest-spec PCIe 3.0 slot...

I lag at 1080p, 1440p, and 4K... not even 60fps at 1080p on Rise of the Tomb Raider maxed out, more like 40-50 fps.

At 1440p it's like 40fps max, often in the 30s, & at 4K like 20 frickin' FPS on the 390X no matter how high I clock it... So it's my CPU & motherboard that limit me right now...

Here's an RX 480 with 580 BIOS comparison against some GPUs... 390X x2 is nearly Vega throughput on single precision...


----------



## chris89

I was able to hit 1240MHz core and 1563MHz memory, at 920mv memory & 1450mv core.

This is the fastest I saw in 3DMark results.



AND MY RESULT... no tessellation... I'll download the latest so I can submit.


----------



## robin69

This score was reached with PCIe Gen 2.0? Stunning, really stunning results.


----------



## chris89

Quote:


> Originally Posted by *robin69*
> 
> This score was reached with PCIe Gen 2.0? Stunning, really stunning results.


Thank you. Check out my Superposition 4K Optimized results.

I just redid the thermal pads on the memory & VRM(s) & repasted the GPU.

Running the video memory undervolted helps reduce errors, and yields higher fps as a result.


----------



## thiussat

Quote:


> Yeah need a better CPU, need Ryzen, Even the AMD RYZEN R3 1200 little Quad Core will outperform my Dual Xeon X5650 System with 24 threads at 3.06Ghz ... That R3 1200 @ 4Ghz would smoke my Xeon's in Games at 1080p, 1440p, and 4k... It has to do with the Ryzen Epic Infinity Fabric & PCIe v3.0 latest spec PCIe slot...


Yeah, that's irrelevant, since I am using the same CPU I was before.


----------



## gapottberg

Quote:


> Originally Posted by *thiussat*
> 
> So my 390 died a while back and Gigabyte said that since they don't make 390's anymore, that they'd send me a 480. I looked at the benchmarks and the 480 seemed to be even with the 390 in most titles.
> 
> However, it hasn't been my experience. I play MMO's and my 480 has given me a massive drop in FPS. I have to turn my graphics all the way to low and I STILL lag in cities. When I had my 390, I could turn my graphics to max and never lag, even in cities (60 FPS at all times).
> 
> ***? The 480 is a piece of excrement and in no way is equal to the 390. I wish there was a way I could make Gigabyte give me a faster card.


That seems really odd and has not been the experience of many other 480 users from what I have seen.

I am wondering exactly what model of 480 you have, and whether you have tried tweaking it at all with WattMan or similar OCing software? If not, and maybe even if so, you should look at AdoredTV's guide to undervolting the 480. He was able to show how some of them perform better than stock with some simple tweaks, including undervolting. Check it out on YouTube. I will post the link here later when I am not on my phone.

It may help; good luck!


----------



## chris89

Quote:


> Originally Posted by *gapottberg*
> 
> That seems really odd and has not been the experience of many other 480 users from what i have seen.
> 
> I am wondering exactly what model of 480 you have and if you have tried tweaking it at all with wattman or similar OCing software? If not and maybe even if so, you should look at AdoredTVs guide to undervolting the 480. He was able to show how some of them at stock settings perform better with some simple tweaks including undervolting. Check it out on youtube. I will post the link here later when i am not on my phone.
> 
> It may help, good luck!


I can get the 480 to perform like the 390X if I set TDP/TDC to 256 watts/amps, then use a 950mv core voltage mod so that it can run at 1,250MHz core at 950mv. Lets it run much cooler, about 60C load.

Right now I use the StarTech DisplayPort 1.2 to HDMI 2.0 adapter & I'm getting 21fps at 4096x2160 on Rise of the Tomb Raider using the 390X. It's difficult running the 390X at 30fps at 4K on most titles. It simply isn't quite capable of 4K 30fps on PCIe 2.0.

It's insane how much faster AMD Vega is than the 390X.

I honestly would say, though, that the 96-ROP GTX 980 Ti is the best choice at the moment. Can't beat 96 ROPs, am I right?

4096x2160 on 390X



4096x2160 on 390X all low


----------



## chris89

I noticed something remarkable with 4K 390x testing.

I clocked the core to 750MHz, which only needed 0.981v (981 millivolts) at 4096x2160 on Prey.

The FPS at *1172MHz*, using *300 watts*, is *only 29 fps*. At *750MHz* it's 20 fps, with an unusual-as-ever *100,000,000 memory errors*, whereas at *1000MHz+, no errors*.

Interesting how insanely efficient it is at *750MHz*: the fan is *INAUDIBLE* and it only hit 60C core.

So it's interesting that a clock *56.22% higher in frequency only yields 45% more FPS... so that's saying something.*

Though *POWER*-wise, from *300* watts down to *89* watts is a remarkable *337% LESS POWER! 337%! YES, 337% less power for 45% worse FPS.*


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> I can get the 480 to perform like the 390X, if I set TDP/TDC to 256 watts/ amps. Then use a 950mv core voltage mod so that it can run at 1,250mhz core at 950mv. Lets it run much cooler. About 60C load.
> 
> Right now I use the Startech Displayport 1.2 to HDMI 2.0 adapter & I'm getting 21fps at 4096x2160 on Rise Of The Tomb Raider using the 390X. Its difficult running the 390X at 30fps at 4k on most titles. It simply isn't quite capable of 4k 30fps on PCIe 2.0.
> 
> It's insane how much faster the AMD VEGA is than the 390X.
> 
> I honestly would say thought the 96 ROP GTX 980 Ti is the best choice at the moment. Can't beat 96 ROP, am I right?
> 
> 4096x2160 on 390X
> 
> 
> 
> 4096x2160 on 390X all low


I like the way you tinker with these; no one else really does much. Thanks for sharing the results.

Also, not sure if you noticed, but in the two screenshots the bottom one looks like the textures didn't load? nm, i didnt read


----------



## chris89

Quote:


> Originally Posted by *dagget3450*
> 
> i like the way you tinker with these no one else really does much, thanks for sharing the results.
> 
> Also, not sure if you noticed but the two screenshots the bottom one looks like textures didnt load? nm i didnt read


Yeah, it looks like 1.150v works at 1133MHz core clock too, so undervolting is possible.

Wanna send me your BIOS and try & beat some records? I have the ideal settings lined up. Need the 2000MHz timings strap applied.


----------



## gapottberg

Here is my set up and results for some TR 2013. Not sure if they will help anyone, but I love to tinker and share myself.









*FX-8320e*
CPU 4.5Ghz, 1.44v, No LCC

*ASrock 970 Fatal1ty*
CPU/NB 2.6Ghz, 1.4v
HT 2.4Ghz, 1.2v
NB 1.2v
DDR3 2133mhz, C11, 1T, Dual channel, 1.6v

*MSI R9-390X*
GPU 1000mhz (-81mv, +50% power limit)
GDDR5 1500mhz (stock voltage 1.000v)
Crimson 17.7
MSI:AB 4.4.0
Windows 10 64bit, FCU 1709

*Tomb Raider 2013*
3200x1800 (using VSR)
Ultra Preset



Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!


----------



## chris89

Quote:


> Originally Posted by *gapottberg*
> 
> Here is my set up and results for some TR 2013. Not sure if they will help anyone, but I love to tinker and share myself.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *FX-8320e*
> CPU 4.5Ghz, 1.44v, No LCC
> 
> *ASrock 970 Fatal1ty*
> CPU/NB 2.6Ghz, 1.4v
> HT 2.4Ghz, 1.2v
> NB 1.2v
> DDR3 2133mhz, C11, 1T, Dual channel, 1.6v
> 
> *MSI R9-390X*
> GPU 1000mhz (-81mv, +50% power limit)
> GDDR5 1500mhz (stock voltage 1.000v)
> Crimson 17.7
> MSI:AB 4.4.0
> Windows 10 64bit, FCU 1709
> 
> *Tomb Raider 2013*
> 3200x1800 (using VSR)
> Ultra Preset
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Your score is amazing. I haven't tested 3200x1800 VSR... Wanna go higher? We got it up to 3400x1913, I believe, with the VSR mod.

Check out my latest 3dmark scores I posted, not bad...

https://www.3dmark.com/3dm/23246865

https://www.3dmark.com/3dm/23246932


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> Core voltage 1250mv & memory at 895mv .... vs 1275mv & 1050mv
> 
> 1094Mhz Core & 1250Mhz Memory
> 
> CherryBlue-TryThis.zip 101k .zip file


Hi Chris,

This is where I am for now (stable and all for me):

Cherryblue-1080mhz-1200mhz-lo968mv-hi1187mv-aux875mv.zip 99k .zip file


I'd like to customize every frequency p-state. (DPM0 to DPM7).

DPM7 is nicely working (1080mhz at 1187mv), but I cannot seem to set the voltage for DPM0 (300mhz).

Here is what I have:


And what HWiNFO displays to me:


*See, I'm expecting 0.968v, but instead I get 1.0688v. I don't understand.*


----------



## narutonic

Quote:


> Originally Posted by *chris89*
> 
> I noticed something remarkable with 4K 390x testing.
> 
> I clocked the core to 750MHZ, which only needed 0.981v 981 millivolts at 4096x2160 on Prey.
> 
> The FPS at *1172Mhz* using *300 watts*, is *only 29 fps*. At *750MHZ*, it's 20 fps, with an unusual as ever *100,000,000 memory errors*, where as at *1000Mhz+, no errors*.
> 
> Interesting how insanely efficient it is at *750MHZ*, fan in *INAUDIBLE* and only hit 60C core.
> 
> So it's interesting at a CLOCK of *56.22% higher frequency, only yields 45% worse FPS... So that's saying something.*
> 
> Though *POWER* wise from *300* watts down to *89* watts is a remarkable *337% LESS POWER! 337%! YES 337% less power for 45% worse FPS.*


It's only -70% of power used; 337% clearly means nothing here.
I don't find this really surprising; at some point, depending on the game and other parameters, the gain from frequency can vary a lot.
You can do something similar with a high-end card: you'll lose performance but gain a lot in efficiency, and still have good performance. But not really useful, in fact.


----------



## chris89

I just took the 300 watts of VRM power input at 1172MHz (up to 1204MHz core clock), then divided it by the 89 watts at 750MHz using 65288... equals 3.37... that's 337%, right? Since 300 is 3.37 times higher than 89 watts...

The difference in 4K fps is about 5 frames per second... from 300 watts down to 89 watts... Interesting.









It's just insane: if I want 26-30 fps at 4K it costs me over 300 watts; if I want 22 fps then it's 89 watts... Just wildly inefficient.

Now I have it down to 1,094MHz core @ 1200mv, so a -75mv undervolt, & it's using 200 watts max... same FPS as at 1133MHz & 1204MHz.

So at 4K, I see no gains past a certain clock, except wasteful power & heat.

Plus 1200mv causes memory errors; if I want to fix the memory errors, I need to turn the voltage back up to the stock 1275mv.


----------



## narutonic

You cannot consume 337% less power. It's simply impossible; it would mean that you "consume" -711W.
In your case 100% = 300W.
89W represents ~30% of 300W.

Even so, it's still interesting to see how efficient the card can be at 4K. I'm guessing... how will it react in CrossFire?
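Put as arithmetic, the reduction percentage and the ratio describe the same change but are different numbers, and only the reduction can be read as "X% less":

```python
def power_change(before_w, after_w):
    """Return (percent reduction, ratio) for a drop in power draw."""
    reduction_pct = (before_w - after_w) / before_w * 100.0
    ratio = before_w / after_w
    return reduction_pct, ratio

red, ratio = power_change(300, 89)
print(f"{red:.0f}% less power, i.e. a {ratio:.2f}x drop")  # 70% less, a 3.37x drop
```

So the 300 W to 89 W change is a 70% reduction, or equivalently a 3.37× ratio; "337% less" mixes the two.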


----------



## chris89

Quote:


> Originally Posted by *narutonic*
> 
> You can not consume 337% less power. Its simply impossible. It would mean that you ""consume"" -711w.
> in your case 100% = 300w
> 89w represent 30%~ of 300w.
> 
> Even though still interesting to see how much efficient the card could be in 4K im guessing in crossfire how it will react?


I have a 290X in there at the moment, CrossFire 290X 4GB & 390X 8GB...

Do you know how to flash the 390X to 4GB, & my 290X to a 390X 4GB?

I have some issues, sort of, in some games... idk why? Maybe it's the timings?

Maybe the 290X uses different 1250MHz timings than the 390X?

Can you take a look?

AMD.ATI.290X.4GB.BIOS.zip 99k .zip file


AMD.ATI.390X.8GB.BIOS.zip 99k .zip file


----------



## Worldwin

In case anyone is wondering, for the MSI Gaming 390X you need a 1.5mm thermal pad for the VRMs if you are only replacing the thermal pad. I learned this the hard way using only 1mm pads.


----------



## chris89

Quote:


> Originally Posted by *Worldwin*
> 
> In case anyone is wondering, for the MSI Gaming 390X you need a 1.5 mm thermal pad for the VRMs if you are only replacing the thermal pad. I learned this the hard way using 1 mm pads.


Got any pictures? shows the compression & gap? thanks.


----------



## Worldwin

Quote:


> Originally Posted by *chris89*
> 
> Got any pictures? shows the compression & gap? thanks.


Too lazy to take a photo. All you need to know is that the thermal pad was not touching the VRMs. Went back to stock and behold, temps are the same as the core now. With the 1 mm pads it was about 12-20°C higher.


----------



## chris89

Quote:


> Originally Posted by *Worldwin*
> 
> Too lazy to take a photo. All you need to know is that the thermal pad was not touching the VRMs. Went back to stock and behold, temps are the same as the core now. With the 1 mm pads it was about 12-20°C higher.


Nice man. I wish Radeon Polaris would go down in price.
Quote:


> Originally Posted by *Cherryblue*
> 
> Hi Chris,
> 
> This is where I am for now (stable and all for me):
> 
> Cherryblue-1080mhz-1200mhz-lo968mv-hi1187mv-aux875mv.zip 99k .zip file
> 
> 
> I'd like to customize every frequency p-state. (DPM0 to DPM7).
> 
> DPM7 is nicely working (1080mhz at 1187mv), but I cannot seem to put the voltage for DPM0 (300mhz).
> 
> Here is what I have:
> 
> 
> And what H-INFO displays to me:
> 
> 
> *See, I'm expecting 0.968 V, but instead I get 1.0688 V. I don't understand.*


 1094mhz-1250mhz-lo968mv-hi1188mv-aux875mv.zip 101k .zip file


Sure thing. DDU. Then install 17.11.1 in Device Manager Only. Then Use HWInfo to post a Screenshot of HWInfo. It should idle correctly on fresh driver install.

Download CRU so you can click on Restart64.exe when the voltage is too high at idle...

RESTART.DISPLAY.DRIVER.x64.zip 33k .zip file


@ Moderator **** You limited my screenshot resolution, huh? What's the deal with that?
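For reference on the DPM question quoted above: the behaviour being chased can be pictured as a simple clock-to-voltage table, where the driver picks the lowest state that covers the demand but may clamp the voltage to a floor (e.g. for display/AUX needs). That clamping is one plausible reason a requested 968 mV idle shows up as ~1.07 V. This is an illustrative sketch, not the real PowerPlay format, and the table values and the floor are made up:

```python
# Hypothetical 8-state DPM table: (core MHz, requested mV) -- illustration only.
DPM = [(300, 968), (450, 975), (600, 1000), (750, 1050),
       (850, 1100), (950, 1150), (1000, 1175), (1080, 1187)]

def state_for(demand_mhz, vddc_floor_mv=1069):
    """Pick the lowest DPM state covering the demand; clamp to a voltage floor."""
    for clk, mv in DPM:
        if clk >= demand_mhz:
            return clk, max(mv, vddc_floor_mv)
    return DPM[-1]

# Idle asks for DPM0 (300 MHz / 968 mV), but the floor pushes it to ~1.069 V,
# while the top state DPM7 is unaffected because its voltage is already higher.
idle = state_for(300)
```

Under this model, editing the DPM0 voltage in the BIOS does nothing visible at idle unless whatever is enforcing the floor is lowered too.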


----------



## gapottberg

Already posted elsewhere... but I'll take you guys' opinion too.

I have a potential used Sapphire 290 Tri-X card for sale locally. How much would you all spend on this as an upgrade for an FX system with, say, an R9 270 or older? I know for a fact my 390X blows the 270 out of the water in terms of gaming on my FX platform.

I also know an RX 580 4GB goes new for about $250 and a 570 is a bit cheaper.

I am thinking maybe like $125? Thoughts?


----------



## chris89

Quote:


> Originally Posted by *gapottberg*
> 
> Already posted elsewhere... but I'll take you guys' opinion too.
> 
> I have a potential used Sapphire 290 Tri-X card for sale locally. How much would you all spend on this as an upgrade for an FX system with, say, an R9 270 or older? I know for a fact my 390X blows the 270 out of the water in terms of gaming on my FX platform.
> 
> I also know an RX 580 4GB goes new for about $250 and a 570 is a bit cheaper.
> 
> I am thinking maybe like $125? Thoughts?


*I'd do it, but take the cooler off & sell it to me hahahaha*... I want that cooler. It should be nice, cool & quiet, so that's a plus!

Send me the BIOS from the card & we can make it output about *6.5 TeraFLOPS* compared to just *5.5 TeraFLOPS*, so a whole extra TeraFLOP with my BIOS mod.
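The TFLOPS figures being thrown around follow from the standard GCN peak-throughput formula (2 FLOPs per shader per clock). A quick sketch shows the quoted numbers roughly line up with the cards' shader counts (shader counts are the published specs; the clock needed for "6.5" is my back-calculation, not a guaranteed overclock):

```python
def tflops(shaders, core_mhz):
    """Peak FP32 throughput for a GCN GPU: 2 FLOPs per shader per clock."""
    return 2 * shaders * core_mhz * 1e6 / 1e12

# R9 290 (2560 shaders): ~5.5 TFLOPS near 1075 MHz; 6.5 TFLOPS needs ~1270 MHz.
# R9 390X (2816 shaders): ~5.9 TFLOPS at the stock 1050 MHz,
#                         ~7.2 TFLOPS at the 1275 MHz mentioned later in the thread.
```

So "a whole extra TeraFLOP" on a 290 implies pushing the core roughly 200 MHz past the ~1075 MHz mark, clocks permitting.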


----------



## gapottberg

Yeah, I have a feeling he is gonna want closer to $200, and at that price point why buy a 4-year-old used card with no warranty and a questionable amount of life left in it... when I can get an RX 570 brand new that is about the same performance... or an RX 580 for a few bucks more that beats it like a drum? We shall see what he says.


----------



## chris89

VisionTek Radeon RX 580 8GB Dual Fan: $249; can clock to I think 1,563 MHz with a little extra voltage.
https://www.visiontek.com/refurbished-radeon-rx-580-overclocked-8gb-gddr5.html

VisionTek Radeon RX 480 8GB reference: $229; can clock to 1,407 MHz on stock voltage.
https://www.visiontek.com/refurbished-radeon-rx480-polaris-edition-8gb-gddr5.html

You're right. Go RX 480 or 580... the 580 can do 1,500 MHz. Get the RX 580 & send me the BIOS. I think it can run up against the 390X.

If AMD rejuvenates & resurrects Grenada XT on 14nm, I'm guessing 2x as much performance, plus higher clocks.

My 390X can clock to a 1,360 MHz core max, just for screenshot's sake. It can validate in GPU-Z up to 1,400 MHz, I think.


----------



## Cherryblue

Quote:


> Originally Posted by *chris89*
> 
> Nice man. I wish Radeon Polaris would go down n price
> 
> 1094mhz-1250mhz-lo968mv-hi1188mv-aux875mv.zip 101k .zip file
> 
> 
> Sure thing. DDU. Then install 17.11.1 in Device Manager Only. Then Use HWInfo to post a Screenshot of HWInfo. It should idle correctly on fresh driver install.
> 
> Download CRU so you can click on Restart64.exe when the voltage is too high at idle...
> 
> RESTART.DISPLAY.DRIVER.x64.zip 33k .zip file


Doesn't seem better, idle voltage is even higher than before







..



What I did:
--> ATiTools Flash the bios you gave me.
--> reboot
--> DDU
--> Install graphic driver using device manager only
--> reboot
--> used your executable to reset graphic driver.

*Here you have: Current - Min - Max*


(Idle state after a 5 minutes witcher 3 play)

The core voltage you see at minimum, I could never get to see it for real







. Right now, after another driver reboot, I sit at 1.1938V, way too high.

Rolling back to my last modded bios.


----------



## chris89

Quote:


> Originally Posted by *Cherryblue*
> 
> Doesn't seem better, idle voltage is even higher than before
> 
> 
> 
> 
> 
> 
> 
> ..
> 
> 
> 
> What I did:
> --> ATiTools Flash the bios you gave me.
> --> reboot
> --> DDU
> --> Install graphic driver using device manager only
> --> reboot
> --> used your executable to reset graphic driver.
> 
> *Here you have: Current - Min - Max*
> 
> 
> (Idle state after a 5 minutes witcher 3 play)
> 
> The core voltage you see at minimum, I could never get to see it for real
> 
> 
> 
> 
> 
> 
> 
> . Right now, after another driver reboot, I sit at 1.1938V, way too high.
> 
> Rolling back to my last modded bios.


1) DDU without restart
2) ATIWinFlash.exe (Run As Administrator : right click properties : compatibility : ok)
3) Restart when ATIWinFlash.exe completes
4) Then install the driver inf alone from the device manager
5) No need to restart again
6) Close HWInfo
7) Close other apps? What do you have open? Close everything
8) Open HWInfo ONLY (Only App Open After Device Manager Inf Install)

DDUv17.0.7.2.zip 1063k .zip file


ATIWINFLASH_v277.zip 1189k .zip file


hw64_561_3285.zip 4049k .zip file


----------



## Rexer

Lately, buying a video card's been like waiting for a mouse trap to catch a mouse. I just managed to get a Vega 64 at $499.00 on Newegg, retail price, last week, Chris. I couldn't get one cheaper. Plus, there's only one custom board-partner card out (after 4 months).
I waited a long time for this card, after I blew up my 390X. Last May I decided to buy a Sapphire 580 8GB Limited Edition. The 580s had come out less than a month earlier and I was just casually taking my time to purchase one. But when I got around to buying it, the Limited Editions had all sold out a day earlier. Well, I went into panic mode and searched for a lower-model 580 8GB. I ended up with a 580 Nitro+ 8GB, which was a lucky buy in the middle of May '17. Two days later they were all sold out. The next week, all 580 8GB brands were sold out.
Original retail for the Sapphire RX 580 8GB Nitro+ is $259.
Adjusted market price is anywhere between $300 and $450.
October was the first time I'd seen the price drop below $270. They're all sold out now. The miners just love 'em.
I had no idea the miners would go into a frenzy and buy the brains outta the market. I heard they were being sold in 10-packs. It was discouraging to say the least, because the price on 580 Polaris 8GBs never came down to the original retail prices. If you wanted to pay less, you really had to scour the internet. It sort of takes away its value.


----------



## gapottberg

I hear you man. I actually saw an 8GB 580 on sale a week ago on Newegg for $250!!! I almost snatched one. It didn't last 24 hrs at that price. Right now you can find the low-end 580 8GB cards around $279... with the nicer ones still pushing $300+. Such an FN shame.


----------



## Rexer

It's a real shame, because the 580 8GB is a great card out of the box. Not only fast, it has the coolest temps of any AMD/ATI card. I haven't gone over 63°C, and even in intense games it's been between 52°C and 58°C (namely in the Battlefield/Battlefront games). In Call of Duty: Advanced Warfare it didn't rise above 53°C. Priced into the 390's range, though, you're just better off buying a 390 or 390X.
Judging the two cards, the 390 has more pixels, and the resolution numbers, blah, blah, blah, are better. Especially when clocked, it's the monster of the hill.
The 580 is very fast and very smooth with high frame rates, but lacks the resolve (detail) the big 390s have.
If you can find one at retail or cheaper, I would certainly buy a 580 8GB just to keep around as a backup (wish I'd had one when my 390X kacked, lol).
Here's a marvel: some 480s can be flashed to 580 and, when they are, they run cooler. I've no clue about the details; I was hanging out in the 480/580 OC club when they were threading the info.


----------



## chris89

Quote:


> Originally Posted by *Rexer*
> 
> It's a real shame, because the 580 8GB is a great card out of the box. Not only fast, it has the coolest temps of any AMD/ATI card. I haven't gone over 63°C, and even in intense games it's been between 52°C and 58°C (namely in the Battlefield/Battlefront games). In Call of Duty: Advanced Warfare it didn't rise above 53°C. Priced into the 390's range, though, you're just better off buying a 390 or 390X.
> Judging the two cards, the 390 has more pixels, and the resolution numbers, blah, blah, blah, are better. Especially when clocked, it's the monster of the hill.
> The 580 is very fast and very smooth with high frame rates, but lacks the resolve (detail) the big 390s have.
> If you can find one at retail or cheaper, I would certainly buy a 580 8GB just to keep around as a backup (wish I'd had one when my 390X kacked, lol).
> Here's a marvel: some 480s can be flashed to 580 and, when they are, they run cooler. I've no clue about the details; I was hanging out in the 480/580 OC club when they were threading the info.


Yeah, the 480 is $229 right now on the VisionTek website, the 580 is $249, & the 390X is $229 but out of stock.

I'm gonna buy 2 more RX 480s; I love the 480. I love that the VisionTek 480 can flash to 580 without an issue and runs much better.

I prefer the newer 14nm Polaris technology over the old hog 390X, even though the 390X can pull near 8 TeraFLOPS overclocked... it's pulling 7.1 TeraFLOPS at 1,275 MHz, just not stably.

The RX 480, when the VRM is cooled, can consistently pull 7 TeraFLOPS if the VRM is running cooler than the core temperature.

My question about the 390X is: how do I change the voltage in hex for the 65288 value?

Once I do that & increase it, it can overclock in Windows better than setting a voltage manually.

Just for fun... imagine a 1,500 MHz 390X... a future 14nm 390X would be unreal.
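On the hex question: voltage fields in these ATOM PowerPlay tables are uint16 little-endian. A plain value is millivolts (1300 mV is stored as the bytes `14 05`), while values of 0xFF00 and above (65288 = 0xFF08) are, as far as I understand the community BIOS editors, treated as indices/EVV references into the card's voltage table rather than literal millivolts, which is why they behave differently from a fixed voltage. A sketch of that encoding (my reading of how the editors handle it, not an official spec):

```python
import struct

def encode_mv(value):
    """Pack a PowerPlay voltage field as little-endian uint16."""
    return struct.pack("<H", value)

def decode_mv(raw):
    """Unpack a voltage field; distinguish literal mV from indexed/EVV values."""
    (value,) = struct.unpack("<H", raw)
    if value >= 0xFF00:
        # 65288 == 0xFF08: an index/EVV reference, not literal millivolts.
        return ("index", value - 0xFF00)
    return ("millivolts", value)

# So to change "the voltage behind 65288" you either replace the field with a
# literal mV value, or edit the voltage-table entry that index 8 points at.
```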


----------



## oskullop

Hi all, is this a normal temp difference between VRM1 & VRM2? Mem and clock are at stock.


Spoiler: Warning: Spoiler!







GPU is a Sapphire NITRO R9 390X 8GB with backplate.

This screenshot is while running two 22" screens and just watching Twitch.


----------



## Rexer

Dang, Chris, 1500! Where are you going in a hurry? Lol. I was happy with 1190. Even without BIOS mods I'm moving so fast I can't even see when I step up the DPI on the mouse! I have to admit, I almost love being kicked out of multiplayer FPS servers as a cheat. Battlefield and Call of Duty players are the worst criers in gaming. For the love of that flawless, high-speed view!


----------



## gapottberg

Quote:


> Originally Posted by *oskullop*
> 
> Hi all is this normal temp difference between vrm1 & vrm2,mem and clock are on stock
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> gpu is Sapphire NITRO R9 390X 8GB Back Plate
> 
> this ss is while running two 22" screens and just twitch watching


I get a similar difference on my card. It doesn't seem to be a problem for me, as temps remain below 60°C, which is far from the card's max temp.

I am curious to do a full breakdown and upgrade my thermal paste someday, and see if maybe one of the VRMs needs better contact with the thermal pads or something. But for now everything runs below 70°C while running stress tests and games for over an hour.


----------



## Rexer

Quote:


> Originally Posted by *gapottberg*
> 
> I hear you man. I actually saw an 8gb 580 on sale a week ago on newegg for $250!!! I almost snatched one. It didnt last 24hrs at that price. Right now you can find the low end 580 8gb cards around $279...with the nicer ones still pushing $300+, such an FN shame.


That's probably the lowest we'll ever see for a 580 8GB OC. Miners will devour it in a heartbeat. No offense to the miners.


----------



## gapottberg

Yeah, I am almost kicking myself for not buying it when I saw it... but I don't really have the money lying around, and I may have a used one of these...

It is a Sapphire 290 Tri-X 4GB being sold locally on CL. Gonna see if I can snatch it up for under $150. Might be a deal and a worthy upgrade for my kid if I can.


----------



## chris89

Quote:


> Originally Posted by *gapottberg*
> 
> Yeah, i am almost kicking myself for not buying it when I saw it...but I dont really have the money laying around and I may have a used one of these...
> 
> 
> 
> It is a Saphire 290 Tri-X 4Gb being sold locally on CL. Gonna see if I can snatch it up for under $150. Might be a deal and a worthy upgrade for my kid if I can.


Yeah, pick it up and send me the BIOS... The cooler on that card is what makes it worth it.

By the way...

I hit a total score of 4,903 in Time Spy... anyone else here beating my 390X + dual X5650 system?

https://www.3dmark.com/spy/2755298


----------



## Seahawkshunt

Quote:


> Originally Posted by *chris89*
> 
> Yeah pick it up and send me the BIOS... The cooler on that card is what makes it worth it.
> 
> By the way...
> 
> I hit total score 4,903 time spy ... anyone else here beating my 390x x5650 x2 system?
> 
> https://www.3dmark.com/spy/2755298


https://www.3dmark.com/spy/2403407 valid


----------



## chris89

Quote:


> Originally Posted by *Seahawkshunt*
> 
> https://www.3dmark.com/spy/2403407 valid


Nice, which result do you use, and how do I skip the demo? Haha, I press the Start key but sometimes it doesn't work... a little annoying...

Nice system... I'm comparing. Also, what's your driver version?

17.11.2 shows the card as a generic VGA GPU.


----------



## Seahawkshunt

Quote:


> Originally Posted by *chris89*
> 
> Nice, which result do you use and how do I skip the Demo? haha I press Start key but sometimes it doesn't work.. a little annoying...
> 
> nice system.. im comparing.. Also what's your driver version?
> 
> 17.11.2 shows as generic vga gpu


Results? Not sure what you mean, but here is the direct comparison: https://www.3dmark.com/compare/spy/2403407/spy/2755298
I use driver 17.7.1 for benching. Skip the demo by going into settings (cog wheel, top right of the launcher) and turning it off; this needs to be done for each benchmark. This run was at 1200/1725; the older system scan on 3DMark was not working with driver 17.7.1. You should check out this driver: http://support.amd.com/en-us/download/desktop/previous?os=Windows%207%20-%2064 It has given me the best results out of all the drivers since 2015, benching only. I use the most recent driver for daily use and gaming.


----------



## chris89

I kept the same 17.11.2, but this time turned off SpeedStep & the CPU score plummeted haha... However, I did 1225 MHz core at 1425 mV & 1725 MHz memory and got a really high graphics score.

comparison

https://www.3dmark.com/compare/spy/2757125/spy/2403407


----------



## oskullop

Quote:


> Originally Posted by *gapottberg*
> 
> I get a similar difference on my card. It doesnt seem to be a problem for me as temps remain below 60'c which is far from max temp of the card.
> 
> I am curious to do a full breakdown and upgrade my thermal paste someday and see if maybe one of the VRMs needs better contact with thermal pads or something. But for now everything runs below 70'c while running stress tests and games for over an hour.


I did that already: removed the pads and put new ones in. Same temps, so contact is OK; probably one side of the VRAM is just used more, something like that.


Spoiler: Warning: Spoiler!











And the thing that makes me go crazy is the memory running at 1500 MHz when watching a movie or YT/Twitch; totally unnecessary. I remember the time (before Crimson came out) when I could just lock the memory to 500 MHz in the drivers, and 40°C was the max temp on the VRAM while doing the same thing.


----------



## chris89

4,975 in Time Spy... the GPU score is lower than I have seen, but almost 5,000 points... I'm aiming for 5,000 haha.

https://www.3dmark.com/compare/spy/2757125/spy/2757217/spy/2403407


----------



## chris89

I did some modding to my 390X to get that score, but I can't beat it... 4,975 is extremely difficult, if not pure luck... 33 fps is only possible if test 1 hits 37 fps at the beginning; if it hits 36, it's not close.

This copper mod just gained a tiny bit of efficiency & reduced the capacitor temperature at load.


----------



## dagget3450

Quote:


> Originally Posted by *chris89*
> 
> I did some modding to my 390x to get that score, but I can't beat it... 4,975 is extremely difficult, if not pure luck... 33fps is only possible if in the beginning of test 1, hits 37fps in the beginning, if it hits 36, not close.
> 
> This copper just gained a tiny bit of efficiency & reduced capacitor temperature at load.
> 
> 
> Spoiler: Warning: Spoiler!


I am curious whats your cpu clocks and memory clocks? - x5650 cpus can easily hit 4ghz without much effort.


----------



## chris89

Quote:


> Originally Posted by *dagget3450*
> 
> I am curious whats your cpu clocks and memory clocks? - x5650 cpus can easily hit 4ghz without much effort.


It's a Dell Precision T7500 with 2x Intel Xeon X5650s that can only overclock as much as Turbo Boost allows, about 3.06 GHz, & memory is 48GB at 1333 MHz in hexa-channel...

Since each Xeon runs triple-channel memory, with 2 CPUs there are 6 channels total. It's not that fast. I can't wait to replace it with a Ryzen 7 1700. I wonder if it can do 4.5 GHz with lapping & a Thermalright True Copper 120? Haha, I love that cooler, so beautiful. Push-pull 120 mm fans for a 4.5 GHz Ryzen would be wicked.


----------



## robin69

Sick stock-cooler modding @chris89. Thinking about repasting my card when I see pics like this.









But since I run one of your BIOS mods my card won't go over 75-77 degrees in gaming sessions; really appreciate your work.
The card runs at [email protected] and [email protected] voltage (started with your 1094-1000MHz BIOS) and only gets 7-8 memory errors in two hours.

It runs Witcher 3 at 1440p (DSR) at 60 FPS, but rarely gets dips into the 40s. I wanted to know if 1501 MHz memory is reasonable or just a waste of power.

Thanks in advance!

PS: Sorry for bad spelling etc., not a native speaker!


----------



## chris89

Quote:


> Originally Posted by *robin69*
> 
> Sick stock-cooler modding @chris89, Thinking about repasting my Card when i see pics like this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But since i run on of your BIOS-Mods my Card wont go over 75-77 degrees under gaming sessions, really appreciate your work.
> Card Runs at [email protected] and [email protected] Voltage (Started with your 1094-1000Mhz Bios) only get 7-8 Memory Errors in two hours.
> 
> Runs Witcher 3 at 1440p(DSR) 60FPS , but rarely get dips into the 40's, wanted to know if 1501Mhz Memory is reasonable or just power waste.
> 
> Thanks in regards !
> 
> PS:Sorry for bad spelling etc. not a native Speaker !


Nice. Can you send me your original BIOS .rom? Which card do you use, a 390 or a 390X?

1250 MHz at 875 mV is more efficient & saves power, with almost no noticeable difference at all.

Thx
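For context on whether 1250 vs 1500 MHz memory matters much: GDDR5 on Grenada's 512-bit bus moves a lot of data even at the lower clock. Peak bandwidth is bus width times effective data rate, where GDDR5 transfers 4 bits per pin per command-clock cycle (the clock these tools report). A quick check:

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    """Peak GDDR5 bandwidth: 4 transfers per pin per command-clock cycle."""
    effective_mts = mem_clock_mhz * 4              # 1500 MHz -> 6000 MT/s
    return bus_width_bits / 8 * effective_mts / 1000   # bytes/cycle -> GB/s

# 390/390X at 1500 MHz -> 384 GB/s; at 1250 MHz -> still 320 GB/s,
# which is why the drop is barely noticeable in games.
```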


----------



## chris89

Marked efficiency improvements with copper on the caps actually increased the frame rate quite a bit too... sometimes 50 fps at 4K, which is awesome performance... compared to far too much delay before.

Plus Rise of the Tomb Raider using the async compute patch at 4096x2160 @ 60 Hz.


----------



## Derek129

Any bios modding available for gigabyte g1 r9 390?


----------



## chris89

Sure send bios


----------



## chris89




----------



## chris89

https://www.youtube.com/edit?o=U&video_id=wklEB2vfOlM


----------



## Derek129

Even though its voltage locked and a non reference card?


----------



## chris89

Quote:


> Originally Posted by *Derek129*
> 
> Even though its voltage locked and a non reference card?


Dump the bios .rom & .zip it & attach via paperclip


----------



## GeRmAnOs36

I'll send you mine; I've got the R9 390 G1 from Gigabyte.


Spoiler: Warning: Spoiler!



http://www14.zippyshare.com/v/ggOsTIJi/file.html


----------



## chris89

Quote:


> Originally Posted by *GeRmAnOs36*
> 
> I send you mine , i got to r9 390 g1 from gigabyte.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www14.zippyshare.com/v/ggOsTIJi/file.html












1133.1250.1000.1333.zip 101k .zip file


1172.1250.1000.1367.zip 101k .zip file


----------



## chris89

Now I'm running 290X 4GB + 390X 8GB CrossFire...

The 4GB + 8GB combo isn't ideal, huh?



Here I'm showing how I replaced my 290X 4GB timings with all of my 390X 8GB timings... because there are issues with timing differences when crossfiring a 290X 4GB & a 390X 8GB. Sometimes fps is immense, other times terrible because of the timings... so they will be synced up.

I wonder if the 290X 4GB can hit a 1,500 MHz memory clock with the 390X's LOOSER timings? The 290X 4GB has very tight timings compared to the faster, more stable LOOSE 390X timings.

AMD.ATI.290X.4GB.WITH.390X.8GB.TIMINGS.zip 99k .zip file



Here are the results using the 390X timings on the 290X 4GB, & scaling was flawless... 1094 MHz core, 1250 MHz memory, 875 mV / 1275 mV.

https://www.3dmark.com/spy/2810894
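The timing swap described above boils down to locating a memory-strap timing blob in one ROM and overwriting the corresponding bytes in the other, then fixing the image checksum. A minimal sketch of that kind of byte-level patch; the offsets here are invented for illustration (real straps have to be located with a BIOS editor), and the 0x21 checksum location is what the common Polaris/Hawaii editors use, to the best of my knowledge:

```python
def patch_rom(rom, offset, new_bytes):
    """Overwrite a byte range in a BIOS image (e.g. one memory-timing strap)."""
    assert offset + len(new_bytes) <= len(rom)
    patched = bytearray(rom)
    patched[offset:offset + len(new_bytes)] = new_bytes
    return bytes(patched)

def fix_checksum(rom):
    """Make the image sum to 0 mod 256 by adjusting the byte at offset 0x21
    (the spot the usual ATI VBIOS editors patch -- assumption, not a spec)."""
    size = rom[2] * 512            # PCI option ROM: byte 2 = length in 512B blocks
    body = bytearray(rom)
    body[0x21] = 0
    body[0x21] = (-sum(body[:size])) & 0xFF
    return bytes(body)
```

Flashing a ROM patched this way without re-checking it in a BIOS editor is a good way to brick a card, so treat this purely as an illustration of the mechanism.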


----------



## GeRmAnOs36

Tested the 1133 version and all good, but the fan curve is not accurate for me. I found this little app to set my own fan curve profile.

Are there any benefits from bumping the memory clocks and voltage a little bit higher??


----------



## chris89

Quote:


> Originally Posted by *GeRmAnOs36*
> 
> Tested 1133 version and all good, but fan curve is not accurate for me. I find this little app to set my own fan curve profile.
> 
> 
> Is there any benefits from bumping little bit higher memory clocks and voltage??


Here I copied your fan settings from that app into the BIOS & increased the memory clock & decreased the core clock a little.



1133.1500.1000.1316.zip 101k .zip file
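A BIOS fan curve like the one copied here is just a set of (temperature, PWM %) points that the controller interpolates between. A sketch of that interpolation; the points below are examples, not the values from the attached BIOS:

```python
# Example fan curve: (temp in °C, fan %) points, linear between them.
CURVE = [(40, 20), (60, 35), (75, 60), (90, 100)]

def fan_percent(temp_c):
    """Linearly interpolate the fan duty for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]     # past the last point: pin to max
```

Moving any point left or up makes the fan ramp earlier/harder, which is all the "fan curve not accurate" complaint amounts to in BIOS terms.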


----------



## Harry604

Is there a new BIOS for the 390 Strix? I flashed an old one of yours.


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> is there a new bios for 390 strix ? i flashed an old one of yours












1133.1500.1000.1316.zip 103k .zip file


----------



## Harry604

thank you brotha


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1133.1500.1000.1316.zip 103k .zip file


Thank you

I flashed this BIOS, rebooted to safe mode, used DDU, then rebooted and installed the new 17.11.2 drivers, but once I get to the login screen and sign in, it black-screens every time.

Any ideas?


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> Thank you
> 
> I flashed this BIOS, rebooted to safe mode, used DDU, then rebooted and installed the new 17.11.2 drivers, but once I get to the login screen and sign in, it black-screens every time.
> 
> Any ideas?


You don't need to uninstall the driver to flash bios. I wanna help. That happens randomly. Need to reflash.

1094.1500.1000.65288.zip 103k .zip file


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> You don't need to uninstall the driver to flash bios. I wanna help. That happens randomly. Need to reflash.
> 
> 1094.1500.1000.65288.zip 103k .zip file


ok ill try this one thank you


----------



## Harry604

It might have been Afterburner causing the black screens... I have it disabled and no black screen now... What should I use to overclock now?


----------



## chris89

Quote:


> Originally Posted by *Harry604*
> 
> It might have been Afterburner causing the black screens... I have it disabled and no black screen now... What should I use to overclock now?


It was probably a profile being set or something. How much are you looking to get out of it?

We get errors past 1,500 MHz memory, & 1,172 MHz is about the max game-stable core clock. For benchmarks, possibly 1,225 MHz if you have a really high-end power supply.

1172.1500.1000.1367.zip 103k .zip file


1205.1500.1000.1425.zip 103k .zip file


----------



## Harry604

Quote:


> Originally Posted by *chris89*
> 
> It was probably a profile set or something. How much are you looking to get out of it?
> 
> We get error's past 1,500mhz memory & 1,172mhz is about max on the core game stable. Benchmarks, possibly 1,225mhz if you have a really high end power supply.


I'll just leave it at the first BIOS you gave me, 1133 and 1500 MHz.

Thank you sir.

I just wanted something I can use to monitor temps on the OSD.


----------



## robin69

If you want to monitor with an OSD, try the HWiNFO + RivaTuner combo.

This is the ultimate configuration for OSD monitoring.


----------



## Harry604

Quote:


> Originally Posted by *robin69*
> 
> If you want to Monitor with OSD, try the HWinfo + Rivatuner Combo.
> 
> This is the ultimate configuration for OSD Monitoring


thank you ill try it


----------



## robin69

Hi @chris89, I finally had time to upload my BIOS
https://www.zipshare.com/download/eyJhcmNoaXZlSWQiOiI1NDFhOWI3MC0wM2NmLTRhMmUtYmY3OS04ZWFmZTUyNWJkYTMiLCJlbWFpbCI6InJhaW5lcjE0OTZAZ21haWwuY29tIn0=


----------



## chris89

Quote:


> Originally Posted by *robin69*
> 
> Hi @chris89, I finally had time to upload my BIOS
> https://www.zipshare.com/download/eyJhcmNoaXZlSWQiOiI1NDFhOWI3MC0wM2NmLTRhMmUtYmY3OS04ZWFmZTUyNWJkYTMiLCJlbWFpbCI6InJhaW5lcjE0OTZAZ21haWwuY29tIn0=












1133.1563.1050.1333.zip 42k .zip file


----------



## GeRmAnOs36

Hey Chris, can you mod my BIOS for somewhere around 1150 on the core? 1172 is a little bit too high for my Gigabyte. 1133 on the core is fine, but I think 1150 would be optimal. I'll try to underclock to 1150 via WattMan and see how it works.


----------



## chris89

Try less voltage more memory clock

1172.1563.1000.1348.zip 101k .zip file


1150.1563.1000.1333.zip 101k .zip file


----------



## Rexer

Quote:


> Originally Posted by *chris89*
> 
> Try less voltage more memory clock
> 
> 1172.1563.1000.1348.zip 101k .zip file
> 
> 
> 1150.1563.1000.1333.zip 101k .zip file


I don't know where you come up with these mods. Very cool of you, Chris. If I recall, you gave me the BIOS for my card. Thanks.
I ran my Asus 390X at 1150 flat for a long time. It was very comfortable and happy. Lol. Seamless smoothness, vivid color and extremely fast detail. Just too much fun. Then I got the 'more is better' addiction and started cranking past 1250+. Well, RIP to my stupidity.
I've owned a 970 SC, a 580 OC and now a Vega 64. As much as I like the Vega, right now it has teething problems with drivers and Windows. I'm constantly groveling over new settings for a sweet spot; I miss what a carefree joy the Asus was at 1150. Good on you.


----------



## chris89

Quote:


> Originally Posted by *Rexer*
> 
> I don't know where you come up with these mods. Very cool on you, Chirs. If I recall, you gave me the bios for my card. Thanks.
> Ran my Asus 390x at 1150 flat for a long time. Was very comfortable and happy. Lol. Seamless smoothness, vivid color and extremely fast detail. Just too much fun. Then I got the 'more is better' addiction and started cranking past 1250+. Well, RIP to my stupidity.
> I've owned 970sc, 580oc and now a Vega 64. As much as I like the Vega, right now it has teething problems with drivers and Windows. I'm constantly groveling about new settings for a sweet spot, I missed what carefree joy the Asus was at 1150. Good on you.


Haha yeah man, I hear ya... want me to make you a new BIOS?


----------



## GeRmAnOs36

Thanks Chris for the new BIOSes. I report that the 1150 version is in fact 1133, but Crimson states 1150. On the other hand, WattMan, when set to dynamic, says 1135, ***??

And I got 99 GPU memory errors in Battlefield 1 (testing the new BIOS). Now I'll try the 1172 version and see how it goes.


----------



## Rexer

Quote:


> Originally Posted by *chris89*
> 
> haha yeah mean I hear ya... want me to make you new bios?


I'd love one. But my big 390X friend went to GPU heaven months ago. This Vega, meh. I get new problems and new cures, like a 290X. I can't even tamper with the clocks yet. It likes to show me an unknown crash, a fuzzy screen full of artifacts, or nice, still wall pictures, for all sorts of reasons.
I just got through digging little bits and pieces of the Nvidia 970 SC drivers out of the HKEY registry. No doubt about it, AMD and Nvidia aren't friends. (Lol, somehow I can't get it out of my mind that Nvidia really gets offended if you replace their GPU with an AMD.)
And here's the weird part: none of these crashes happened with the lesser-power cards, the 970 SSC or 580 OC.
So now it's got me thinking power. The Vega 64 power requirement suggests a 750 W PSU. That seems ludicrous, but OK. I throw a known-good (EVGA G2) 850-watter in, get myself involved in a huge multiplayer Battlefield game, and the next morning, bless my soul, a black screen on startup.
I try three restarts. Nope. So I turn the computer off, PSU off, throw out last night's beer bottles and go back to bed. A few hours later I turn on the computer for a check and it starts up. Perfect, so I restart it again and it's fine. Play a round of Call of Duty AW and it's as nice as mother makes it. It's scary. I'm afraid a capacitor on the board wants to die, but I've also seen startup crashes when Windows does an ingenious update.
The reason why I think a capacitor is failing at start: I've seen another computer with a big, power-hungry card struggle to boot. The board just couldn't muster up enough power to consistently start a big 290X. The problem doesn't happen with the lower-power 580 or the 970. Just the Vega 64.
So the Vega gets no toys till it can behave. Dang if I don't talk to my computers like kiddies. Makes me wanna kick 'em.


----------



## chris89

Quote:


> Originally Posted by *Rexer*
> 
> I'd love one. But my big, 390x friend went to gpu heaven months ago. This Vega, meh. I get new problems, new cures like a 290x. I can't even tamper with the clocks yet. It likes to let me see an unknown crash, a fuzzy screen full of artifacts or make nice, still, wall pictures for all sorts of reasons.
> I just got through digging out little bits and pieces of the Nvidia 970sc drivers in the HKey registry. No doubt about it, AMD and Nivida aren't friends. (Lol, somehow I can't get it out of my mind Nvidia really gets offended if you replace their gpu for an AMD).
> And here's the weird part. None of these crashes happened with the lesser power cards, 970ssc or 580oc.
> So now it's got me thinking power. The Vega 64 power requirement suggest a 750w psu. That seems ludicrous but o.k. I throw a known good (EVGA G-2) 850 watter in, get myself involved in a huge multi-player Battlefield game and the next morning, bless my soul, a black screen start up.
> I try three restarts. Nope. So I turn the computer off, psu off, throw out last night's beer bottles and go back to bed. A few hours later, I turn on the computer for a check and it starts up. ? Perfect, so I restart it again and it's fine. Play a round of Call of Duty AW and it's as nice as mother makes it. ? It's scary. I'm afraid a capacitor wants to die on the board but I also seen start up crashes when Windows does an ingenious update.
> The reason why I think a capacitor is failing to start, I seen another computer with a big, power hungry card struggle to boot. The board properties just couldn't muster up enough power to consistently start a big 290x. The problem doesn't happen with the lower power 580 or the 970. Just Vega 64.
> So the Vega gets no toys till it can behave. Dang, if I don't talk my computers like kiddies. Make me wanna kick 'em.


Vega isn't cooling all the power phase components, so some run hot & need cooling. Pull the card apart & let's take a look...


----------



## Rexer

You know Chris, I don't think it's the Vega. When the 390x died, I replaced it with an XFX 390 I borrowed. That's when I first noticed the black screen startup crashes (same as the Vega).
For a week I tolerated the crashing, then I popped the old 970sc in and the crashing went away. Nice. I purchased the 580 and it ran great, no crash problems at all. So I dubbed the XFX 390 as faulty. What convinced me more that the 390 was the problem: the 580 ran super cool, around 53c average. Very few times did I get over 62c. From May to November, not a single startup crash.
But when I got the Vega, the black screen crashing at startup returned.
I switched power connections from the psu and still, no difference. I think the problem's on the motherboard. Maybe one of the motherboard capacitors isn't holding enough power at startup. I haven't tried switching to the #2 PCIe 3 slot, so after I try that, I'm going to test all my wiring.

By the way, that XFX 390 was a terrible creature. I could barely move the clocks/power before it locked. I'm glad I didn't own it.


----------



## gavrilo77

How do I set up the BIOS for a Gigabyte R9 390X G1? I want to underclock from 1025/1500 to 1000/1250.

I know I need to change GPU clock 1 / MEM clock 1 to 1000/1250.

Can anyone help regarding the voltage? Should I change the values in all the tables, including the Limit table?

Thank you


----------



## chris89

Quote:


> Originally Posted by *gavrilo77*
> 
> How to set up bios for Gigabyte R9 390x G1. I want to underclock from 1025/1500 to 1000/1250
> 
> I know i need to change GPU clock 1/MEM clock1 1000/1250
> 
> Anyone can help regarding to voltage. Shall i change the values in all tables including Limit table?
> 
> Thank you


send bios?


----------



## gavrilo77

Quote:


> Originally Posted by *chris89*
> 
> send bios?


Here you are. Thanks!!!

Hawaii.zip 101k .zip file


----------



## chris89

Quote:


> Originally Posted by *gavrilo77*
> 
> Here you are. Thanks!!!
> 
> Hawaii.zip 101k .zip file


 1000.1250.875.65288.zip 101k .zip file


----------



## gavrilo77

Quote:


> Originally Posted by *chris89*
> 
> 1000.1250.875.65288.zip 101k .zip file


Not working. The memory clock under full load is constant at only 150MHz.


----------



## chris89

Quote:


> Originally Posted by *gavrilo77*
> 
> Not working. Memory clock under full load is const 150 only


 1000.1250.875.65288.zip 101k .zip file


----------



## chris89

Finally managed to break 5,000 points in Time Spy using SetFSB... It's the PCIe frequency that yields the higher CPU/Physics scores & frames per second.
Onboard SATA cuts out at that point, so I'm using 2x PCIe SATA adapters to avoid the BSOD; if you use onboard SATA, it will BSOD. You need PCIe SATA.


----------



## christoph

what?


----------



## gavrilo77

Quote:


> Originally Posted by *chris89*
> 
> 1000.1250.875.65288.zip 101k .zip file


Thank you!!!

I would appreciate it if someone has a BIOS file other than F2 for the Gigabyte WindForce R9290C 4 GD. The ones I found on the Gigabyte site are not working when I try to flash them. The issue I have with this GPU is a black screen on boot. When I insert one more card, it boots, but after some minutes I get a blue screen with the error "Thread stuck in device driver". I have tried different drivers as well, but the error is still there. So I would like to try a flash, if it is possible to do something.


----------



## chris89

Quote:


> Originally Posted by *christoph*
> 
> what?


Stock? haha nice I plan on upgrading soon to Ryzen 1700.


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> Stock? haha nice I plan on upgrading soon to Ryzen 1700.


yeah stock with ram at 2933


----------



## chris89

Quote:


> Originally Posted by *christoph*
> 
> yeah stock with ram at 2933


Nice... I'll take the system off your hands buddy?


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> Nice... I'll take the system off your hands buddy?


----------



## chris89

Quote:


> Originally Posted by *christoph*
> 
> what?


Nice score. I really, really hope I can hit 5,000 in CPU-Z so I can smoke my dual-X5650 rig with the Ryzen 1700...

Can you try & hit 5,000 in CPU-Z?


----------



## christoph

Quote:


> Originally Posted by *chris89*
> 
> Nice score. I really really really hope I can hit 5,000 CPUz so I can smoke my X5650 X2 Rig With the Ryzen 1700...???
> 
> Can you try & hit 5,000 CPUz?


Actually, that's what I was planning to do this week, but I'm a little busy. I've got a couple of hours between days, so I'll see what I can get.


----------



## chris89

Quote:


> Originally Posted by *christoph*
> 
> actually that what I was planning to do this week, but I'm a little busy, I got a couple of hours between days so I'll see what I can get


Run it fresh rather than over-and-over, unless you can overclock in Windows. For me, the first run is always the fastest; I think the VRMs overheat or something, and the score comes down on repeat runs.


----------



## chris89

Anyone wanna compare PEAK Overclocking of your R9 290/ 290X/ 390/ 390X with mine? AIDA GPGPU?


----------



## chris89

https://www.youtube.com/channel/UCfjHkc1HNf5lkzzfUaD6-Uw?view_as=subscriber


----------



## robin69

Hi Chris, I can do one later if you want.









The BIOS you sent me should score fine ^^ I just noticed some artifacts while gaming; voltage is already at 1333mV, so maybe drop to 1125MHz at 1325~1330mV?

Also noticed that your BIOS has only a 900MHz state before going to the peak clock. Is this important? Before, I had it around 1000MHz; is there any difference?


----------



## chris89

Quote:


> Originally Posted by *robin69*
> 
> Hy Chris, i can do one later If you want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Bios you sent me should score fine ^^ Just noticed some artifacts while gaming, voltage is already at 1333, so maybe drop to 1125Mhz -1325~1330mV ?
> 
> Also noticed that your BIOS has only a 900Mhz state before going to Peak clock, is this important ? before i had it around 1000Mhz, is there any difference ?


Can you monitor your VRM temperature & post it here? Artifacts usually mean an overheating VRM.

1094.1250.895.65288.zip 42k .zip file


----------



## robin69

Hi Chris! I can do that, but I have to study for an exam this week.
I'll try to upload it by Saturday (I don't have time for a proper gaming session until then).

Cheers and thanks for your help !


----------



## robin69

Hi Chris, tested the 1094/1250 BIOS for about 1.5h in Skyrim (1440p + FXAA, nearly everything on max).
I know it's not a good scenario because the game is locked to 60fps; I will test it with Witcher 3 tomorrow.

I didn't monitor the 1133/1563, but VRAM was hitting 90 degrees after an hour (guess I have to repaste before I try higher clocks).

Temps look very good (but average utilization was also only 77%).


Edit: I am a bit confused now. I just tried to up the core to 1110MHz with the 1094/1250 BIOS, then ran Superposition.
The result was the same artifacts as with 1133/1563, seconds after entering the benchmark. Can this be voltage related?

The artifacts look like a pattern made of points and diagonal lines. Any idea why? The card was still cold, 55-60 degrees max.


----------



## chris89

Quote:


> Originally Posted by *robin69*
> 
> Hi Chris, tested the 1094/1250 BIOS for about 1:30h in Skyrim (1440p+FXAA,nearly everything on max)
> I know it's not a good scenario because the game is locked to 60, i will test it with witcher 3 tomorrow
> 
> I didnt monitor the 1133/1563, but VRAM was hitting 90 degrees after an hour ( Guess i have to repaste before i try higher clocks)
> 
> Temps look very good ( but also average utilization was only 77% )
> 
> 
> Edit: I am a bit confused know, i know tried to up Core to 1110Mhz with the 1094/1250 Bios, the run Superpostion
> Result where the same artifacts as with 1133/1563 , seconds after i enter the Benchmark, can this be voltage related ?
> 
> The artifacts look like a pattern made out of points and diagonal lines, any Idea why ? Card was still cold, 55-60 degrees max.


This is about the max your power supply can handle before it gives out, and it's real close too... your 12v rail is sagging to around 10.9v, which is very bad.

1094.1001.895.65288.-25mv.Offset.zip 42k .zip file
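For reference, the ATX spec only allows roughly ±5% on the 12v rail, i.e. 11.4-12.6v, so a reading near 10.9v under load is well out of spec. Here's a quick Python sketch that checks readings against that tolerance (the example readings are made up; substitute your own values from HWiNFO's sensor page):

```python
# ATX spec allows roughly +/-5% on the main DC output rails.
TOLERANCE = 0.05

def in_spec(nominal_v, measured_v, tol=TOLERANCE):
    """True if a measured rail voltage is within ATX tolerance of nominal."""
    return nominal_v * (1 - tol) <= measured_v <= nominal_v * (1 + tol)

# Hypothetical readings (nominal -> measured), e.g. copied from HWiNFO.
readings = {12.0: 10.9, 5.0: 5.02, 3.3: 3.31}
for nominal, measured in readings.items():
    lo, hi = nominal * (1 - TOLERANCE), nominal * (1 + TOLERANCE)
    status = "OK" if in_spec(nominal, measured) else "OUT OF SPEC"
    print(f"{nominal:g}V rail: {measured:.2f}V (allowed {lo:.2f}-{hi:.2f}V) -> {status}")
```

A 12v rail reading around 10.9v fails that check by a wide margin, which fits the black-screen-under-load symptoms in this thread.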


----------



## robin69

Thanks for your help , totally forgot about my s***ty PSU








You even told me that this was my limiting factor earlier this year when I first started posting in this thread, embarrassing









I think I'm gonna upgrade it over Christmas. Planning on getting an Enermax MaxPro 700W or a Corsair CX 750W.

Does anyone have a recommendation for a PSU under 100 bucks?


----------



## chris89

Quote:


> Originally Posted by *robin69*
> 
> Thanks for your help , totally forgot about my s***ty PSU
> 
> 
> 
> 
> 
> 
> 
> 
> You even told me that this is my limiting factor earlier this year when i first starting posting in this Thread, embarissing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think I´m gonna upgrade it over Christmas. Planning on getting an Enermax MaxPro 700W or a Corsair CX 750W.
> 
> Maybe someone got a recommendation for a PSU for under 100 bucks ?


Rosewill Hive 1,000w $78.99 with 83A on 12V rail.
https://www.newegg.com/Product/Product.aspx?Item=N82E16817182315&cm_re=rosewill_hive_1000w-_-17-182-315-_-Product

You really want a very solid 12v rail. If you want fully modular, as I do, the Rosewill Photon-1050 is 1050 watts, fully modular, with around 87.5A on the 12v rail.

$120
https://www.ebay.com/itm/Rosewill-Photon-1050-PHOTON-Series-1050W-Full-Modular-Power-Supply-80-PLUS/192360769210?epid=1434390118&hash=item2cc99866ba:g:MzsAAOSwZQRYXFx3

https://www.newegg.com/Product/Product.aspx?Item=N82E16817182325&cm_re=rosewill_photon_1050-_-17-182-325-_-Product


----------



## robin69

Sadly I live in Europe (Austria), but it seems that Newegg does ship to my country.
The Hive 1000W would cost me 93€ incl. shipping, which would be a nice deal for a modular 1000W PSU.

Thanks again Chris !


----------



## chris89

Quote:


> Originally Posted by *robin69*
> 
> Sadly i live in Europe (Austria ).But it seems that newegg does ship to my country.
> The Hive 1000W would cost me 93€ incl shipping which would be a nice deal for a modular 1000W PSU.
> 
> Thanks again Chris !


Yeah that's good! I would go with the Hive 1000w or Photon 1050w.


----------



## VBoOmeRanGV

Anyone have experience overclocking the Gigabyte R9 390X G1? I have the original, not revision 1.
Just curious if it's worth my time. I currently run it with a Ryzen 5 1600X and 16GB of 3000MHz RAM.

Thanks!


----------



## ZealotKi11er

VBoOmeRanGV said:


> Anyone have experience overclocking the Gigabyte R9 390X G1? I have the original, not revision 1.
> Just curious if it's worth my time. I currently run it with a Ryzen 5 1600X and 16GB 3000mHz RAM.
> 
> Thanks!


Probably not. Try 50-100MHz on the core and 100-150MHz on the memory. The 390X is an OCed 290X, so most of the overclocking was done at the factory.


----------



## GeRmAnOs36

Gigabyte locked voltage control on the 390 and X versions, so only a custom BIOS can get you some overclocking headroom.
My 390 Gaming G1 works perfectly at 1133 on the core.


----------



## chris89

VBoOmeRanGV said:


> Anyone have experience overclocking the Gigabyte R9 390X G1? I have the original, not revision 1.
> Just curious if it's worth my time. I currently run it with a Ryzen 5 1600X and 16GB 3000mHz RAM.
> 
> Thanks!


Sure, upload your BIOS to www.mediafire.com or somewhere similar & post the link here


----------



## chris89

robin69 said:


> Sadly i live in Europe (Austria ).But it seems that newegg does ship to my country.
> The Hive 1000W would cost me 93€ incl shipping which would be a nice deal for a modular 1000W PSU.
> 
> Thanks again Chris !


How do you like the Rosewill Hive 1000w? Can you please post the HWiNFO sensor page so we can see how good the 12 volt readings are, pretty pretty please?

Post your image by creating a www.mediafire.com account & you can post a thumbnail here, since Overclock.net's attachment system is broken atm.


----------



## chris89

GeRmAnOs36 said:


> Gigabyte locked voltage control on 390 and X versions, so only custom bios can get you some overclock power.
> My 390 Gaming G1 works perfect with 1133 on core.


Right on bro, glad you're liking the 1133MHz BIOS. Right now I'm running my reference 390X, with the reference 290X blower & a larger, heavier heatsink, at 1188MHz / 1350mV. It's really powerful at this clock, but I'm CPU-bound on dual 3.5GHz Xeons with 24 threads. I need Ryzen's Infinity Fabric to get the max performance out of the 390X, on top of really fast system memory bandwidth, up to 60GB/s+. I'm hitting 40GB/s on system memory read/write, but copy is 26GB/s, so that's my bottleneck. I'd need 60GB/s+ read/write/copy to seriously boost the fps on a 4GHz Ryzen 1700 with DDR4-3600 or faster.
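If anyone wants a rough sanity check on their own copy-bandwidth number without AIDA, here's a little NumPy sketch that times a large array copy and counts the read + write traffic. The buffer size and repeat count are arbitrary, and the result won't match AIDA's hand-tuned assembly exactly; it's just a ballpark:

```python
import time
import numpy as np

def copy_bandwidth_gbs(size_mb=256, repeats=5):
    """Rough memory copy bandwidth estimate: time dst[:] = src over a
    buffer much larger than cache, counting read + write bytes moved."""
    n = size_mb * 1024 * 1024 // 8        # number of float64 elements
    src = np.random.rand(n)
    dst = np.empty_like(src)
    best = float("inf")
    for _ in range(repeats):              # keep the best (least noisy) run
        t0 = time.perf_counter()
        dst[:] = src                      # one read + one write per element
        best = min(best, time.perf_counter() - t0)
    return (2 * src.nbytes) / best / 1e9  # GB/s of combined read+write traffic

print(f"~{copy_bandwidth_gbs():.1f} GB/s copy")
```

Run it a few times with the machine otherwise idle; background load will drag the number down the same way it does in AIDA.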


----------



## VBoOmeRanGV

Thanks guys, I'll leave it where it's at for now. I'm still working out random bugs with my display. Even at stock CPU/GPU/RAM speeds I am having random shutdowns. Also, randomly on startup it acts like it doesn't recognize my monitor; I have to turn off the computer, switch off the PSU, wait a minute, turn it on again, and then it works fine for a while. Two days ago it worked fine for 30 minutes, then I started watching a YouTube video and the computer shut off after a short time.

Last night I took the whole GPU apart and replaced the thermal paste. I was not having overheating issues (HWMonitor reported temps as high as 75-80C, which I hear is OK under load; it sits at about 42C unloaded), and I was surprised to see that the factory put almost zero thermal paste on my card. It was literally metal to metal, with a tiny line of paste pushed in between the 4 copper heat pipes. The pipes did not have a flat base; each was just flattened against the chip, which let the paste sink into the small grooves between the pipes and left nothing at all for the pipe-to-chip contact. I put enough on there to work, but I did not overdo it. It looks to have knocked off 3-5 degrees of temp, but I'm still annoyed at this.

I almost forgot to say: when I took the GPU apart, there were thermal pads attached to various parts of the GPU in strips. Kind of sticky, like clay. When I took the GPU apart they were torn to pieces, and I had to puzzle the small parts back into their respective positions. I'd love to know if you can buy these thermal strips for future use so I don't have to re-use them.


----------



## GeRmAnOs36

Buy new thermal pads asap; the ones you have now don't do their job. Get them on eBay or any computer parts shop and replace the old crappy pads. Your problems are 80% likely caused by the card overheating.


----------



## mikeyy233

Anyone wanna help me with a 390X BIOS mod? Not using it for mining, just looking to squeeze out more performance. Maybe someone could mod my timings?


----------



## agentx007

Hello

I received my used 390X yesterday, and the first thing I did was check the pads and paste on it (it was mined on).
I replaced the GPU paste with Thermal Grizzly Hydronaut, and I replaced two pads on the memory that were damaged/dry.
I also added a pad for the VRM controller (just in case the card was put back together by a monkey the second time around, which it was... because one VRAM chip had been replaced).

At this point, I tested it in almost everything I wanted, here are highest temps reached so far :









Are those OK?
I mean, the GPU temp is OK (48C, duh). I only want to know if the two VRMs should be this far apart temperature-wise (VRM1 = 85C, VRM2 = 73C)?
Could the reason be that they measure Vcore temp (VRM1) and Vmem temp (VRM2)?
At idle, they stay at 38C on VRM1 and 42C on VRM2.

Oh, and my card is a PowerColor R9 390X Devil.

PS. How do I add hyperlinks to this post?
Do I need to manually type HTML code?


----------



## christoph

agentx007 said:


> Hello
> 
> I received my used 390X yesterday and the first thing I did was to check pads and paste on it (it was mined on).
> I exchanged GPU one with Thermal Grizzly Hydronaut, and I replaced two pads on memory that were damaged/dry.
> I also added a pad for VRM contoller (just in case card was put together by a monkey second time around, which it was... because one VRAM chip was replaced).
> 
> At this point, I tested it in almost everything I wanted, here are highest temps reached so far :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are those OK ?
> I mean, GPU temp is OK (48C duh ), I only want to know if two VRMs should be this far apart (VRM1 = 85C, VRM2 = 73C) temperature wise ?
> Can the reason be that they measure Vcore temp (VRM1) and Vmem temp (VRM2) ?
> At IDLE, they stay at 38C VRM1 and 42C VRM2.
> 
> Oh, and my card is : PowerColor R9 390X Devil
> 
> PS. How to add hyperlinks to this post ?
> Do I need to manually type HTML code ?



I don't really know that card, but the VRM temps look a little high, and the fan is not spinning fast at all.


Run the test again, but set the fan up to 60-75% speed.


----------



## agentx007

Thanks for the reply, but making the fans spin faster is easier said than done.
Fan speed is controlled by an onboard controller, and there are two fans in there (one for the AIO and one for the VRM).
Thing is: I can set "fan speed" to 100% in Afterburner... but it doesn't do anything to either of them :/
The good news is that I've only seen those 85C in Superposition 4K Optimized.
For example, in Time Spy the VRMs are over 10C cooler:


----------



## christoph

you can control the fan using the AMD radeon software


----------



## hirasaldhi

*need help...*

I got a new PC build about 5 months ago:
ASRock AB350 Pro4
AMD Ryzen 5 1600X
8GB RAM
PowerColor R9 390 PCS+
120GB Kingston SSD
running on Windows 10 Pro

But I always get a "thread stuck in device driver" BSOD.
I have tried a fresh install many times but keep getting the BSOD.
I have tried updating the BIOS, still the same.
All drivers are the latest and I still get the BSOD.

Tried scanning using cmd: sfc /scannow, no errors found.

Please, can anyone help me...???
(Sorry for my bad English, by the way)


----------



## christoph

hirasaldhi said:


> i get new set pc about 5 mount a go,
> asrock ab350 pro 4
> amd ryzen 5 1600x
> 8gb ram
> powercolor r9 390 pcs+
> 120 gb kingstone ssd
> running on windows 10 pro
> 
> but i always get thread stuck in device driver BSOD
> i have try many time fresh install but keep get the BSOD
> i have try update bios still same
> all driver the latest one and still get the BSOD
> 
> try scaning using cmd: sfc /scannow no error found
> 
> please any one can help me......???
> (sory for my bad english by the way)



yeah but what device? check with bluescreenview to see whats the problem


----------



## hirasaldhi

christoph said:


> yeah but what device? check with bluescreenview to see whats the problem


This is what I get:
atikmdag.sys and dxgkrnl.sys

Another piece of info: after a fresh install of Windows 10, it's very laggy, and sometimes the BIOS setup screen is laggy too.


----------



## christoph

hirasaldhi said:


> this is what i get...
> atikmdag.sys and dxgkrnl.sys
> 
> another info, after i do fresh install windows 10, its so lagy and some time on bios setting its get lagy too..


What monitor do you have? What year was it made?

Check in Radeon Settings, in the display tab, then Specs (upper right corner): what does it say for current link setting and HDCP status?


----------



## hirasaldhi

christoph said:


> what monitor do you have? what year was made?
> 
> check in Radeon settings in the display tab and Specs (right upper corner) what does it say in current link setting and DHCP status


I bought it a week ago.

Three times I got everything working well, 60 fps with no drops playing BF1, but after a shutdown the lag and BSOD come back...


----------



## christoph

hirasaldhi said:


> i buy it a week ago
> 
> i get 3 time all work good, 60 fps no drop playing BF1, but after shut down, lag and BSOD come again...



But you see, the current link setting is not available, and the monitor's refresh rate is 30Hz, not 60. Try checking the driver for that monitor, and look in the monitor's own settings to see if you have an option to change the refresh rate to 60. Also, some monitors have an option (the name differs between monitors) that enables communication with the video card.


----------



## agentx007

Just wanted to show that my R9 390X works on really odd boards as well.
Valid : https://www.3dmark.com/fs/15471792


----------



## Dundundata

*After 3 years of loyal service*

I went and bought a 1080. Let's just say an initial firestrike run beat the 390 graphics score by 10,000.

I just repasted the 390 and it's still running like a champ so I may try to find some use for it. 

Many good gaming memories. My first real graphics card and first time OCing.


----------



## chris89

@mikeyy233

http://www.mediafire.com/file/77e9x4099a36g04/1173.1563.1373.1000.rom


----------



## maizbset

@chris89

Hi! Sorry, but can you help me with my MSI R9 390? It behaves strangely: on the stock vBIOS I get a black screen (the system hangs) a few minutes after boot, at idle. Then I changed the voltage and clocks to Sapphire's stock BIOS values, and now it sometimes freezes at idle so I need to reboot, and the temperature averages about 85C in games, but no black screens. Could you please give me some advice on how to modify the BIOS to keep performance, stability, and temperature normal?
Sorry for bad English)


----------



## chris89

maizbset said:


> @chris89
> 
> Hi! Sorry, but can you help me with my MSI r9 390? It behaves strangely, on stock vbios I have a black screen(system stops) in a few minutes after boot in idle. Then I've changed the voltage and clocks to SAPPHIRE stock bios values, and it's sometimes freezing on idle so I need to reboot and the temperature average is about 85C in games, but no black screens. So would you please give me some advice on how to modify bios to keep performance, stability, and temperature in normal?
> Sorry for bad English)


Have you used MiniTool Partition Wizard to test your Windows HDD for bad sectors? I'd say it's a sector issue, because the BIOS is perfect. Unless it's a physical issue; VRM overheating is possible... We could try ramping up the fans at idle to cool it more. I don't know the MSI 390's design, but maybe they forgot to cool the memory VRM like XFX did?


----------



## maizbset

chris89 said:


> Have you used Mini Tool Partition Wizard to test your HDD for windows for bad sector's? I'd say it's a sector issue. Because the BIOS is perfect. Unless it's a physical issue, the VRM overheat is possible... We could try ramping up the fan's at idle to cool it more, maybe the VRM scores... idk what the MSI 390 design is but maybe they forgot to cool the memory VRM like XFX?


Thanks for the reply!
No, I didn't use MiniTool Partition Wizard. I don't think that's the problem, because I have two SSDs, with Windows on the first and Ubuntu on the second, and the problem occurred on both of them. I tested the stock BIOS with the fans ramped up, but no results. I noticed that when the black screen happens, the card is still heating up, so maybe it really is a VRAM overheating problem. Anyway, since I posted the previous message, I've modified the BIOS again and have had no issues for 2 days. I've read in some posts that the problem can be the GPU power controller being very sensitive to voltage, which is why it needs to be raised. And some people said they fixed it by setting the memory clock to ~1300MHz. I've tried all of that and it helps, but I'm still searching for the best solution. Probably this vBIOS is what I'm looking for. Can you tell me if it's normal?


----------



## betsujin

My Sapphire Tri-X R9 390X's fan control hardware died. The card still works and shows a picture, but under even a mild stress test the fans won't spin up and it overheats; WattMan shows 0 RPM. The fans spin during POST, you just can't get them to spin up at all afterward. I tried reflashing a stock BIOS, and since I have two of these cards I know it's a hardware issue with this one. Does anyone know if it's fixable, or should I give up? I've already replaced the fans. I'm fairly good at repairing electronics, but I'm guessing this would be extremely complicated?


----------



## betsujin

I guess I could just solder the fan wires to a female Molex connector and force them to full blast at all times... just thought of that. I only use the two R9s for mining anyway; my gaming card is a Vega 64.


----------



## chris89

betsujin said:


> I guess I could just solder the fan wires to a molex female connector and force them to full blast at all times... Just thought of that, I just use the two r9s for mining anyway, my gamer is a Vega 64


Attach the .rom (via Save BIOS in GPU-Z) here as an attachment. I can force the fans to 100% in the BIOS all the time, or adjust the fan profile to cool more aggressively.


----------



## Dundundata

maizbset said:


> @chris89
> 
> Hi! Sorry, but can you help me with my MSI r9 390? It behaves strangely, on stock vbios I have a black screen(system stops) in a few minutes after boot in idle. Then I've changed the voltage and clocks to SAPPHIRE stock bios values, and it's sometimes freezing on idle so I need to reboot and the temperature average is about 85C in games, but no black screens. So would you please give me some advice on how to modify bios to keep performance, stability, and temperature in normal?
> Sorry for bad English)


have you tried taking the cooler off and cleaning/repasting?


----------



## maizbset

Dundundata said:


> have you tried taking the cooler off and cleaning/repasting?


No, but I will do it in the near future


----------



## diggiddi

betsujin said:


> My Sapphire Tri-XXX r9 390x fan control hardware died, it still works and shows picture but even remote stress and the fans won't spin up and it overheats, wattman shows 0rpm even though the fans spin during POST you just can't even set them to spin up, I tried reflashing a stock bios and I have two of these so I know it's a hardware issue with this one. Does anyone know if it's fixeable or give up, I've replaced the fans and everything. I am fairly good at repairing electronics but I'm guessing this would be extremely complicated?


You can get an adapter cable to connect the GPU fan to the motherboard. I think it's a mini-PWM to PWM cable; I got mine on Amazon, but eBay should carry it too.
In that case, instead of plugging the fan into the card, you plug it into the adapter, which plugs into the motherboard.


----------



## chris89

Post your HWiNFO VRM temperature screenshot, both sensors, under load.


----------



## MarkJohnson

I have an Asus Strix R9 390 (STRIX-R9390-DC3OC-8GD5-GAMING) and I can't seem to undervolt it. I tried Afterburner, and the voltages are greyed out. WattMan seems to save the numbers, but when I stress test, my temps soar to a steady 94C as usual, still at 1.250 volts.

I can use Afterburner to set an 80C max and it stays cool, but it caps the clocks at 500MHz and my game is laggy.

Is there another OC tool to use, or an alternate BIOS, or anything?


----------



## PunkX 1

Here's my 3DMark score with my R9 390.

https://www.3dmark.com/3dm/27797783

Any good ?


----------



## chris89

MarkJohnson said:


> I have an Asus Strix R9 390 (STRIX-R9390-DC3OC-8GD5-GAMING) and I can't seem to under volt it. I tried afterburner and voltages are greyed out. Wattman seem to save the numbers, but when I stress test, my temps soar at 94C steady as usual and 1.250 volts still.
> 
> I can use afterburner to set 80c max and it seems cool, but it max the clocks at 500Mhz and my game is laggy.
> 
> Is there another OC tool to use, or an alternate BIOS or anything?


Attach BIOS file here



PunkX 1 said:


> Here's my 3D Mark score with MY R9 390.
> 
> https://www.3dmark.com/3dm/27797783
> 
> Any good ?


I'm doing 1,267mhz top end on my reference 390x

https://www.3dmark.com/compare/fs/14735110/fs/14734945/fs/16028772


----------



## PunkX 1

chris89 said:


> Attach BIOS file here
> 
> 
> 
> I'm doing 1,267mhz top end on my reference 390x
> 
> https://www.3dmark.com/compare/fs/14735110/fs/14734945/fs/16028772


How much voltage are you using?


----------



## chris89

PunkX 1 said:


> How much voltage are you using?


A lot, and I tested it a lot; I kept re-running until I found what worked... Try it on yours... +100mV will yield 1433mV, close to the voltage of my 1267MHz run... Maybe I needed 1475mV or something?


----------



## Bense

I just got an R9 390 from ebay. I just dumped the vbios, and the checksums match the card that's listed here:
https://www.techpowerup.com/vgabios/181401/xfx-r9390-8192-151008

It's an XFX card, and it appears to be the 'Double Dissipation' model. 

I have carefully installed the Heatkiller full-coverage block that I pulled off of my dead R9 290. Cooling isn't much of a concern for me since I'm on a custom water cooling loop. I am looking for a decent vBIOS to reflash this card with that will provide a small overclock bump, without having to use the MSI app.

Anyone have any suggestions?


----------



## chris89

Bense said:


> I just got an R9 390 from ebay. I just dumped the vbios, and the checksums match the card that's listed here:
> https://www.techpowerup.com/vgabios/181401/xfx-r9390-8192-151008
> 
> It's an XFX card, and it appears to be the 'Double Dissipation' model.
> 
> I have carefully installed the Heatkiller fullcoverage that I pulled off of my dead R9 290. Cooling isn't much of a concern for me since I'm on a custom water cooling loop. I am looking for a decent vbios that I can reflash this card with that will provide a small bump in overclock, that I won't have to use the MSI app for.
> 
> Anyone have any suggestions?


It's very highly suggested to dump/save the original from GPU-Z instead & zip-attach it here :thumb:


----------



## Bense

chris89 said:


> Its very highly suggested to dump save the original from GPUz instead & zip attach here :thumb:


What's the point? I provided a URL with a vbios dump that has the exact same md5 and sha1 checksums as the one that I dumped myself with GPU-Z.


----------



## PunkX 1

Bense said:


> I just got an R9 390 from ebay. I just dumped the vbios, and the checksums match the card that's listed here:
> https://www.techpowerup.com/vgabios/181401/xfx-r9390-8192-151008
> 
> It's an XFX card, and it appears to be the 'Double Dissipation' model.
> 
> I have carefully installed the Heatkiller fullcoverage that I pulled off of my dead R9 290. Cooling isn't much of a concern for me since I'm on a custom water cooling loop. I am looking for a decent vbios that I can reflash this card with that will provide a small bump in overclock, that I won't have to use the MSI app for.
> 
> Anyone have any suggestions?


We have the exact same GPU! How far have you been able to push yours? I've noticed that when the voltage stays constant around the 1.3-1.32V range under load, it black-screens and you have to restart the computer for it to recover.


----------



## chris89

Bense said:


> What's the point? I provided a URL with a vbios dump that has the exact same md5 and sha1 checksums as the one that I dumped myself with GPU-Z.


I modded the bios u linked anyway


PunkX 1 said:


> We have the exact same GPU! Ho far have you been able to push yours? I've noticed that as the voltage remains constant around the 1.3-1.32v range on load, it black screens and you have to restart the computer for it to recover.


Is the BIOS I made not working correctly?


----------



## PunkX 1

chris89 said:


> I modded the bios u linked anyway
> 
> Is the bios I made not working correctly?


Works perfectly fine, but any attempt at increasing the voltage to 1.38v in the BIOS results in the black screen. At that setting, my 3D load voltage is usually at 1.3V. I really wanted to see how my GPU compares against an overclocked R9 390X in Firestrike. Do you have any such benches?


----------



## PunkX 1

Here's what I get at my stable clocks. Interested to see how an R9 390X performs.


----------



## chris89

https://www.3dmark.com/compare/fs/14735110/fs/14734945#


----------



## PunkX 1

chris89 said:


> https://www.3dmark.com/compare/fs/14735110/fs/14734945#


So, that is your OC with a 390X without disabling tessellation? How much does your 390X get at stock?


----------



## chris89

PunkX 1 said:


> So, that is your OC with a 390X without disabling tesselation? How much does your 390X get at stock?


https://www.3dmark.com/fs/14305674


----------



## PunkX 1

chris89 said:


> https://www.3dmark.com/fs/14305674


That's an R9 390X at 1172/1563MHz?


----------



## Temuka

Maybe someone can help me please? 
https://www.overclock.net/forum/67-amd-ati/1708648-amd-r9-390-gray-screen-death.html


----------



## christoph

nevermind


----------



## Dundundata

The 390 lives on in new build! After upgrading my devil's canyon build to a 1080 the ole' 390 is revived. Got a chip off ebay and a freesync monitor and the gaming is smooth.

4690k
msi 390
16gb ddr3 1600
850evo

The one issue is I have a cheap Corsair 600W psu in use. I used a bunch of "old" parts to make this build. So no OCing atm.


----------



## robin69

@chris89

Sorry for the late reply, I had a lot going on in my private life, including school, getting my first job, etc. Sadly I forgot about this thread. I got a Seasonic 850W Focus Gold in the form of a Riotoro 850W PSU (100% identical), because I couldn't get the Rosewill due to VAT problems and it would have become way too expensive.

I get no coil whine and the card is stable at 1133MHz (your BIOS) without artifacts. I'll give you an update when I get time for testing later this week.


----------



## Metalcrack

Found a nice deal on a 4K monitor, and my ol' 390 and 8350 are getting a workout. Almost max out Shadow Warrior 2 with 45ish FPS. Not bad for a mid-tier system.


----------



## gapottberg

Hey boys, I am finally back in action again and took 3DMark Firestrike for a spin today just for ****s'n'giggles. I was pleasantly surprised to see a new personal best score, considering I have my overclocks backed off a bit from some of my previous attempts. I was finally able to break 10,000 points, which had been a goal of mine for some time now with this hardware.


http://www.3dmark.com/fs/17097032


----------



## oskullop

Hi all! What are safe VRM temps for a 390X? I just did the red mod with a Cooler Master AIO, heatsinks over the VRAM modules and on the VRMs, plus an 80mm fan blowing at them, but I got 80C on VRM2 and 69C on VRM1 when playing the new Tomb Raider at 1440p ultra with all filters off. Core temp is great, the highest I saw was 57C. Too high or normal?

Sent from my Redmi Note 5 using Tapatalk


----------



## gapottberg

Anyone thinking of upgrading to the new RX-590? I am thinking about it, as it seems AMD finally has a card not named Vega that truly outclasses this one, whereas I always felt the RX-580 was trading blows at best, more of a sidegrade than an upgrade. A true upgrade at under $300 might be too good to pass up, especially if we see any holiday sales that put it in the $250 range.


----------



## Metalbeard

gapottberg said:


> Anyone thinking of upgrading to the new RX-590? I am thinking about it, as it seems that AMD finally have a card not named Vega that truly outclasses this one. Where I always felt the RX-580 was more trading blows at best and more of a side grade than an upgrade. A true upgrade at under $300 might be too good to pass up...especially if we see any holiday sales that put it in range of say $250.


I definitely am. I'm still happy with my 390 but the 590 is tempting especially if I can find a good deal. I got my 390 openbox at Microcenter. It was too good of a deal to pass up.


----------



## gapottberg

Well damn... with the leak of the supposed RX 3080 specs and price, maybe waiting for Navi is the right call. $250 for better-than-1080 performance with a lower TDP than my 390X... yes please!!!


----------



## tolis626

Hey everyone! Been a while since I last posted here!

So, I'm still rocking an MSI R9 390x. It's been great so far (minus the huge power draw), but it's beginning to heavily show its age. However, running a 1080p 60Hz panel isn't even a stress for it, so I couldn't care less outside of benchmarking. Most demanding game I've played so far is Witcher 3, but even there it performed admirably.

That, though, is about to change because tomorrow I'm gonna receive my new monitor at long last. I got a Samsung LC27HG70QQ, so 1440p 144Hz Freesync, blah blah. Anyways, I mainly play Rainbow Six Siege right now and have no money left over for a GPU (not that I'd dare upgrade before Navi or whatever shows up), so I might as well try to get the most out of my current GPU.

Which brings me to my "problem". While the core is an average overclocker (1150MHz at +50mV, can go higher with both but heat becomes an issue), my RAM refuses to do anything above stock, really. I see a lot of people doing 1700MHz+ on their 390 series cards, but mine throws huge amounts of errors at anything over 1575MHz and will give errors regardless if I go above stock. Above 1625MHz it's downright unstable and will eventually crash. That doesn't change by messing with the aux voltage, as I've tried from 950mV to 1050mV (so - and + 50mV in Afterburner). Does anyone have any ideas about what that could be? I figured the extra memory speed might help, but the damn thing refuses to cooperate with me and it's driving me nuts. Could there be something wrong that I could change or did I just get a lemon when it comes to VRAM?

Bonus question : How much hotter can I expect it to run at 1440p? Right now with my 1150MHz overclock I get like an average temp of about 75C and may peak at like 80C. I have set an aggressive fan curve and I've repasted the GPU and changed the thermal pads on the VRMs to some Fujipoly high conductivity ones (both of these helped a lot).

Thanks in advance!


----------



## tolis626

Quick update on my post above. I got the screen. In short, I have no idea how I lived with 1080p 60Hz until now, not to mention my old screen's horrendous input lag. My reaction time on humanbenchmark.com dropped from just over 300ms to 174ms, which I'd call a massive difference; I went from the reaction time of a drunk 50 year old to quite a bit better than average. The screen's been awesome except for one thing I'll touch on below.

Now, I expected the poor 390x to start cooking itself at 1440p. But I've seen next to no difference so far in temps or stability. Just lower framerates. Some older games are ok, like BF4 with no MSAA or with 2x MSAA which gives me 110-150fps at 1440p (without MSAA), some others, like the Witcher 3... Not so much. Let's just say that I'll be sticking with 1080p Ultra for the Witcher, but the added refresh rate makes it that much better. 

My problem with this whole thing (apart from not being able to overclock further to push out that last bit of performance) is that I've been having problems with my DisplayPort connection. I will get a black screen when I start a game (not in game menus, when I get into the game itself), then it will come back on after a second showing the DisplayPort icon, then go black again, then back on with the DisplayPort icon, etc. You can see exactly what I'm talking about here: https://www.youtube.com/watch?v=SWIP00Gvr1g . Other times, the screen may stay black and show a "not optimal settings, use 2560x1440 144Hz" message, as if I've tried to run the screen out of spec, but I'm using exactly those settings.

At first I thought the display was defective. I spoke with Samsung support and they suggested I try another DP cable, so I bought one, and same thing. Seeing as I don't have another GPU on hand to try, I used the CPU's integrated graphics. Needless to say, launching games at 1440p on that poor thing was painful. Regardless, it never exhibited the issue I was getting before. It did run at 2560x1440 120Hz max, but I'd tried the 390x at 120Hz and still got the same results.

So I ran DDU, cleaned everything, restarted, reset all my overclocks, and fired up Battlefield 4 (I like it for testing, it shows instability faster than any other game I own, even though it's old and easy to run). While I didn't get the same issue, it was slow (70-ish FPS, down from 100+) and after a while it started showing some... artifacts, like I had an unstable memory overclock, with the screen getting distorted as if the display signal is bad. I decided to reboot, I reapplied my overclock and it seems to... work? Like, no idea what changed, but it ran fine after that. Didn't get the chance to test further, but it worked then. So maybe it's a GPU issue? If so, damn, I got no money at the moment (blew it all on the monitor). Plus, with the rumors about Navi as @gapottberg said above, I wouldn't want to upgrade at this point. I just want this thing to last me 3-6 months. Then it can rest in peace.

I'd appreciate any help guys. Thanks!

PS : Jinxed it. Just fired up Rainbow Six Siege and it did the same thing. Strange thing with Siege is that, it will do that in the start of a round, go black and come back again 3-4 or sometimes a bit more times, but then come back and work normally. I'm at a loss, really.


----------



## christoph

tolis626 said:


> Quick update on my post above. I got the screen. In short, I have no idea how I lived with 1080p 60Hz up until now. Not to mention my old screen's horrendous input lag (dropped from just over 300ms of reaction time in humanbenchmark.com to 174ms, which I'd call a massive difference, I went from the reaction time of a drunk 50 year old to quite a bit better than average). The screen's been awesome except for one thing I'll touch on below.
> 
> Now, I expected the poor 390x to start cooking itself at 1440p. But I've seen next to no difference so far in temps or stability. Just lower framerates. Some older games are ok, like BF4 with no MSAA or with 2x MSAA which gives me 110-150fps at 1440p (without MSAA), some others, like the Witcher 3... Not so much. Let's just say that I'll be sticking with 1080p Ultra for the Witcher, but the added refresh rate makes it that much better.
> 
> My problem with this whole thing (apart from not being able to overclock further to push out that last bit of performance) is that I've been having problems with my Displayport connection. I will get a black screen when I start a game (not in game menus, when I get into the game itself), then it will come back on after a second showing the displayport icon, then go black again, then back to on with the displayport icon etc etc. You can see exactly what I'm talking about here https://www.youtube.com/watch?v=SWIP00Gvr1g . Other times, the screen may stay black and show a "not optimal settings, use 2560x1440 144Hz" message, like I've tried to run the screen out of spec, but I'm using exactly these settings. At first I thought the display is defective, I spoke with Samsung support and they suggested I try another DP cable. And so I bought one and same thing. Seeing as I don't have another GPU on hand to try it, I used the CPU's integrated graphics. Needless to say, launching games at 1440p on that poor thing was painful. Regardless though, it never exhibited the same issue I was getting before. It did run at 2560x1440 120Hz max, but I'd tried the 390x with 120Hz and I still got the same results. So I ran DDU, cleaned everything, restarted, reset all my overclocks, fired up Battlefield 4 (I like it for testing, shows instability faster than any other game I own, even though it's old and easy to run) and, while I didn't get the same issue, it was slow (70-ish FPS down from 100+) and after a time it started showing some... Artifacts, like I had an unstable memory overclock, with the screen getting distorted, like the display signal is bad. I decided to reboot, I reapplied my overclock and it seems to... work? Like, no idea what changed, but it ran fine after that. Didn't get the chance to test further, but it worked then. So maybe it's a GPU issue? If so, damn, I got no money at the moment (blew it all on the monitor). 
Plus, with the rumors about Navi as @gapottberg said above, I wouldn't want to upgrade at this point. I just want this thing to last me 3-6 months. Then it can rest in peace.
> 
> I'd appreciate any help guys. Thanks!
> 
> PS : Jinxed it. Just fired up Rainbow Six Siege and it did the same thing. Strange thing with Siege is that, it will do that in the start of a round, go black and come back again 3-4 or sometimes a bit more times, but then come back and work normally. I'm at a loss, really.



use a better DP cable


----------



## tolis626

christoph said:


> use a better DP cable


Already bought a second one and it does the same thing. Wasn't too cheap either.

EDIT : http://www.logilink.com/Produkte_Lo...ayPort_Kabel_DP_Stecker_auf_DP_Stecker_2m.htm
This is the cable I bought. Looks fine, I guess.


----------



## christoph

tolis626 said:


> Already bought a second one and it does the same thing. Wasn't too cheap either.
> 
> EDIT : http://www.logilink.com/Produkte_Lo...ayPort_Kabel_DP_Stecker_auf_DP_Stecker_2m.htm
> This is the cable I bought. Looks fine, I guess.



Maybe I am wrong, but that DP cable does not look so good. Try another one; borrow one that is at least DP 1.2 and explicitly says it supports 1440p at 144Hz, or get a 1.4 if you can, because that is a 2 meter cable and the high refresh rate needs a lot of bandwidth. Only a true DP 1.2 (or 1.4) cable can do that.


https://www.amazon.com/ivanky-Displ...4742&sr=8-1-spons&keywords=dp+cable+1.3&psc=1

https://www.amazon.com/Infinnet-Dis...8&qid=1545014742&sr=8-5&keywords=dp+cable+1.3
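The bandwidth point can be sanity-checked with some rough arithmetic. This is only a sketch: the ~20% blanking overhead is an assumption (the exact figure depends on the CVT-RB timings the monitor uses), while the HBR2 link rate and 8b/10b coding come from the DP 1.2 spec.

```python
def required_gbps(width, height, refresh_hz, bpp=24, blanking=1.20):
    """Approximate video data rate in Gbit/s, including assumed blanking overhead."""
    return width * height * refresh_hz * bpp * blanking / 1e9

DP12_RAW_GBPS = 21.6                          # HBR2 over 4 lanes
DP12_EFFECTIVE_GBPS = DP12_RAW_GBPS * 8 / 10  # usable rate after 8b/10b coding

need = required_gbps(2560, 1440, 144)
print(f"need ~{need:.1f} Gbit/s, DP 1.2 carries ~{DP12_EFFECTIVE_GBPS:.2f} Gbit/s")
```

So 1440p at 144Hz sits only a couple of Gbit/s under what a full DP 1.2 link can carry, which is why an out-of-spec cable that can't sustain HBR2 falls apart at exactly this mode.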


----------



## tolis626

christoph said:


> Maybe I am wrong but that DP cable does not look so good, try one, borrow one that is 1.2 at least and that is says it supports 1440p at 144hz, if you can get one that is 1.4 complain cuz that is a 2 meter cable, the high refresh rate needs a lot of bandwidth so only a true 1.2 DP cable can do that or 1.4
> 
> 
> https://www.amazon.com/ivanky-Displ...4742&sr=8-1-spons&keywords=dp+cable+1.3&psc=1
> 
> https://www.amazon.com/Infinnet-Dis...8&qid=1545014742&sr=8-5&keywords=dp+cable+1.3


Yeah, I will probably return the cable. I hope the store doesn't give me any trouble, but meh. DP cables are tricky it seems.

I will probably try with another GPU too and contact Samsung again. Sigh..

I have to say though, when it works properly, this monitor is awesome.

Thanks @christoph !


----------



## christoph

tolis626 said:


> Yeah, I will probably return the cable. I hope the store doesn't give me any trouble, but meh. DP cables are tricky it seems.
> 
> I will probably try with another GPU too and contact Samsung again. Sigh..
> 
> I have to say though, when it works properly, this monitor is awesome.
> 
> Thanks @christoph !



Yeah, actually many of us with high refresh monitors have had this issue, many many of us, and most of the time it's the DP cable causing it.


----------



## tolis626

christoph said:


> yeah actually many of us with high refresh monitor had this issue, many many of us, so most of the times is the DP cable causing this


Yeah, as I said in my other thread (link below), I did a quick search and saw a video on YouTube showing a guy with a different monitor and completely different setup that has the exact same issue as I do and it turned out to be a DP cable. I'm probably going to order another cable and see what happens.
https://www.overclock.net/forum/44-...layport-problem-help-needed.html#post27763898


----------



## christoph

tolis626 said:


> Yeah, as I said in my other thread (link below), I did a quick search and saw a video on YouTube showing a guy with a different monitor and completely different setup that has the exact same issue as I do and it turned out to be a DP cable. I'm probably going to order another cable and see what happens.
> https://www.overclock.net/forum/44-...layport-problem-help-needed.html#post27763898




oh no no, don't get me wrong, I was just trying to say that this issue is very common


----------



## tolis626

christoph said:


> oh no no, don't get me wrong, I was just trying to say that this issue is very common


Yeah, but how do I solve it? There's the rub.


----------



## christoph

tolis626 said:


> Yeah, but how do I solve it? There's the rub.



Well, by just trying another cable.

Could you post a picture of the AMD Radeon settings, under Display/Specs (top right corner of the Display tab)? It should show the current link setting and the G.Pixel clock.


----------



## tolis626

christoph said:


> well by just trying with another cable
> 
> could you post a picture of the AMD Radeon settings, in Display/specs ( in display top right corner) , it should say the current link setting and the G.Pixel clock



Here it is!

Meanwhile, the display developed a dead/stuck pixel right in the middle, so I'm probably gonna return it anyway and be done with it. I hope the next one I receive doesn't have any problems. And I hope that Samsung doesn't give me any trouble. Oh well...

PS : Do you know what the override settings in the Radeon display settings do? Something about voltage swing and pre-emphasis.


----------



## christoph

tolis626 said:


> Here it is!
> 
> Meanwhile, the display developed a dead/stuck pixel right in the middle, so I'm probably gonna return it anyway and be done with it. I hope the next one I receive doesn't have any problems. And I hope that Samsung doesn't give me any trouble. Oh well...
> 
> PS : Do you know what the override settings in the Radeon display settings do? Something about voltage swing and pre-emphasis.



Hmm, the link and the G.Pixel clock are good, so it's probably the monitor itself...

The override setting is right there, on the right tab that says Override; adjust both by 1 (unit?) and test, and if that doesn't work, test with 2.


----------



## tolis626

christoph said:


> hmm, the link and the G.Pixel clock is good, then probably is the monitor itself...
> 
> to override the setting is right there, on the right tab that says override, adjust both by 1 (unit?) and test, if not then test with 2


Yeah, pixel clock etc are all according to the monitor's spec (has everything in the manual). I checked that.

I tried with 1, but honestly I'm kind of afraid to go higher. I'm sending this thing back anyway, so no point risking anything more. Thank you for all your help though man!


----------



## tolis626

Completely different topic here.

As I'm waiting to get my screen back, I decided to redo my GPU overclock to see where it all went wrong. So I have a question. Any of you guys that have high memory clocks (over 1600MHz), can you check if HWiNFO64 shows memory errors popping up? I get loads of them, but I don't know how to interpret them. Seems there's certain configurations that are more stable than others. Eg if I run +50mV core and +50mV aux, I'll get a lot of errors (clocks are 1150MHz core, 1650MHz memory). If I drop down to +40mV on the core, errors pop up WAY slower. If I drop further to +30mV, they start popping up fast again. Dropping the core clock to 1140MHz did nothing to calm it down.

Any ideas? The 390x is a strange overclocker.


----------



## christoph

tolis626 said:


> Completely different topic here.
> 
> As I'm waiting to get my screen back, I decided to redo my GPU overclock to see where it all went wrong. So I have a question. Any of you guys that have high memory clocks (over 1600MHz), can you check if HWiNFO64 shows memory errors popping up? I get loads of them, but I don't know how to interpret them. Seems there's certain configurations that are more stable than others. Eg if I run +50mV core and +50mV aux, I'll get a lot of errors (clocks are 1150MHz core, 1650MHz memory). If I drop down to +40mV on the core, errors pop up WAY slower. If I drop further to +30mV, they start popping up fast again. Dropping the core clock to 1140MHz did nothing to calm it down.
> 
> Any ideas? The 390x is a strange overclocker.



I think it has to do with windows itself and not the video card...

what are you doing while checking for those errors?


I had them before and almost RMA'd my video card, but they disappeared, I think, when I upgraded to the 1809 version.


----------



## tolis626

christoph said:


> I think it has to do with windows itself and not the video card...
> 
> what are you doing while checking for those errors?
> 
> 
> I had them before, almost RMA my video card, but they disappear I think when I upgraded to 1809 version


Oh, they pop up while gaming. I have to go to extremes to have it crash or error out doing other things. I have the error counter on the OSD from RivaTuner.

1809 is the version of what? 

PS : You were probably talking about Windows 10. Noted. I'm doing the update now. I had canceled it when I saw the issues people had with files disappearing, but I'm hoping Microsoft got their crap together. I'll see how that goes.


----------



## christoph

tolis626 said:


> Oh, they pop up while gaming. I have to go to extremes to have it crash or error out doing other things. I have the error counter on the OSD from RivaTuner.
> 
> 1809 is the version of what?
> 
> PS : You were probably talking about Windows 10. Noted. I'm doing the update now. I had canceled it when I saw the issues people had with files disappearing, but I'm hoping Microsoft got their crap together. I'll see how that goes.



Yeah, I had them before, same problem as you, and the errors just went away, don't know why. But I know that Windows 10 has a lot of issues, and monitor scaling and its fullscreen gaming optimizations are among them...

When you get your new monitor, reset the BIOS to defaults and type in your settings manually; then you can save the profile, but only once the new monitor is plugged in.


----------



## tolis626

christoph said:


> yeah, I had them before, same problem as you, and the errors just went away, don't know why, but I know that windows 10 has a lot of issues and one of them is the monitor scaling and their full screen gaming optimization...
> 
> when you get your new monitor reset the bios to defaults and type in your settings manually and then you can save the profile but once you have your new monitor plugged in


What BIOS settings? For the CPU? I mean, these have been the same for a while. 

Weird thing, I was able to get 4.8GHz on my CPU with 2400MHz RAM a few months ago. Then I tried updating the microcode to see if it improves things and poof, no more 4.8GHz for me, no matter what I do. Even reverting to the BIOS I was using before isn't working.

EDIT : I updated Windows to 1809 and installed the 18.12.3 driver from AMD. Same behaviour. Doesn't crash, just throws errors. It's been like that forever though, since the first moment HWiNFO64 introduced the error counter.


----------



## christoph

tolis626 said:


> What BIOS settings? For the CPU? I mean, these have been the same for a while.
> 
> Weird thing, I was able to get 4.8GHz on my CPU with 2400MHz RAM a few months ago. Then I tried to update the microcode to see if it imrpoves things and poof, no more 4.8GHz for me, no matter what I do. Even reverting to the BIOS I was using before isn't working.
> 
> EDIT : I updated Windows to 1809 and installed the 18.12.3 driver from AMD. Same behaviour. Doesn't crash, just throws errors. It's been like that forever though, since the first moment HWiNFO64 introduced the error counter.



did you clean install the AMD driver?

is your OC stable right now? try the Intelburntest


----------



## tolis626

christoph said:


> did you clean install the AMD driver?
> 
> is your OC stable right now? try the Intelburntest


Yes, I DDU'ed the damn thing.

My CPU overclock? It's like 90% stable. Something's not quite right and I'll get a BSOD every now and then. I'm trying to figure out what. I suspect it's the memory overclock.


----------



## gapottberg

Memory overclocking has caused me more grief than any other tinkering I know how to do. That being said, I can't help tinkering anyway. My advice, which has proven very helpful for me, is to get a copy of MemTest86 on a bootable USB, and every time you tinker with memory settings, unplug your OS drive and boot to the memtest. Run at least one full pass before booting to Windows. It will save you much grief if you push things too far; nothing is worse than corrupting your OS and having to do a full reinstallation.


----------



## christoph

tolis626 said:


> Yes, I DDU'ed the damn thing.
> 
> My CPU overclock? It's like 90% stable. Something's not quite right and I'll get a BSOD every now and then. I'm trying to figure out what. I suspect it's the memory overclock.


then you should start from there, or at least try gaming and check for errors WITHOUT OC at all


----------



## tolis626

gapottberg said:


> Memory Overclocking has caused me more grief than any other tinkering I know how to do. That being said I am unable to help tinkering. My advice that has proven very helpful for me is to get a copy of memtest 86 on a bootable USB and every time you tinker with memory settings unplug your os HDD and boot to the memtest. Run at least one full pass before booting to windows. Will save you much grief if you push it too far. Nothing worse than corrutping your os and having to do a full reinstallation.


Oh, I went through that whole ordeal with memory overclocking. If I knew what I was getting into, I wouldn't have started it at all, but I did and here I am. 

The memory itself is stable. With the settings I have, it was stable after like 8 hours of memtest86. So I don't think it's the RAM itself, but rather the IMC on my CPU. I'm increasing system agent and IO voltages slowly and I'll see how it goes. Oh well, if Ryzen 3 rumors turn out to be true, this rig's days are numbered anyway.



christoph said:


> then you should start from there, or at least try gaming and check for errors WITHOUT OC at all


Stock doesn't give errors, I've already tried it. But if I move the core from stock, it will give like 2-3 errors after hours of gaming. Touch the memory at all and errors multiply like rabbits during mating season. I think my card is just a dud. I can push the memory past 1700MHz for short benchmarking runs, but that's about it. As I said above about the Ryzen rumors, same goes for Navi rumors. The 390x should've been on its way out long ago, but I had audio gear siphoning all my money. Still, can't complain, it's served me well these past 4 years. Far longer than I expected it to last to be honest. Only big downside is that it's a damn power hog. Really helps with heating the house during Greece's mild winters, though, so it really pulled double duty as a space heater too.


----------



## tolis626

So, in case anyone sees this...

I sent the monitor in for RMA. There was no dead pixel, but rather dirt in the panel. They changed my panel instead of giving me a new monitor, which I'm kind of salty about, but we'll see, I don't have it in my hands yet. Bad thing is, the service guy told me that he could not reproduce the problem with the DP connection, so they said I should either try with another GPU or contact the manufacturer of my GPU. So I sent an email to MSI support and I'm waiting for the "screw you" response I'm expecting. The way this whole thing looks, I'll probably have to buy another GPU if I'm to enjoy this damn thing, provided it doesn't give me even more problems. Worst thing is that I don't wanna go NVidia and, apart from the rather expensive Radeon VII, there's nothing REALLY better than the 390x on the AMD side. I mean, +5-10% at best isn't reason enough for me to go 580 or 590. I was really hoping for Navi, but I doubt I'm gonna be able to wait 'til the summer for it. Unless money (or lack thereof) forces me to wait and not buy a VII. Sigh...

PS : Got a response from MSI. The first message gave me some general crap about reinstalling drivers and cleaning the ports on the card. After I explained that I've tried everything, the guy sent me a shady-looking BIOS flashing folder, complete with ATIWinflash and some other stuff that I don't know anything about. I'd appreciate it if anyone could give it a look.


----------



## tolis626

Sorry to bother everyone again, but I was messing around with Wattman and I saw that there is an option to increase the memory voltage? Is that actually working or is it just a placeholder than does nothing?


----------



## christoph

tolis626 said:


> Sorry to bother everyone again, but I was messing around with Wattman and I saw that there is an option to increase the memory voltage? Is that actually working or is it just a placeholder than does nothing?


I think it does not apply for the R9 video cards


----------



## BradleyW

tolis626 said:


> Sorry to bother everyone again, but I was messing around with Wattman and I saw that there is an option to increase the memory voltage? Is that actually working or is it just a placeholder than does nothing?


Try increasing it a tiny bit and check the voltage in GPU-Z to see if the memory voltage has indeed changed. If there's no change, it'll either be locked, or you're in fact changing the card's minimum floor voltage, like on Vega.


----------



## tolis626

christoph said:


> I think it does not apply for the R9 video cards


That's what I thought too. But if so, why are we allowed to change it? With that said, changing it from the stock 1109mV down to 1050mV or up to 1200mV didn't seem to do anything, so it may just be there doing nothing. I was under the assumption that memory voltage is locked in these cards.


BradleyW said:


> Try increasing it a tiny bit and check voltage in gpuz to see if the memory voltage has indeed changed. If there's no change, it'll either be locked or your in fact changing the cards minimum floor voltage like on the Vega.


There's the thing. There's no memory voltage sensor for the 390x. At first I thought it would change the auxiliary voltage, but that's not it (plus, the 1109mV stock value would be insane for that, it's exactly 1V stock). Also, as said above, changing it to 1050mV or 1200mV didn't seem to do anything for stability, so I guess it's just a placeholder. My memory still can't overclock worth s**t, so there's that. I was just curious.


----------



## christoph

tolis626 said:


> That's what I thought too. But if so, why are we allowed to change it? With that said, changing it from the stock 1109mV down to 1050mV or up to 1200mV didn't seem to do anything, so it may just be there doing nothing. I was under the assumption that memory voltage is locked in these cards.
> 
> There's the thing. There's no memory voltage sensor for the 390x. At first I thought it would change the auxiliary voltage, but that's not it (plus, the 1109mV stock value would be insane for that, it's exactly 1V stock). Also, as said above, changing it to 1050mV or 1200mV didn't seem to do anything for stability, so I guess it's just a placeholder. My memory still can't overclock worth s**t, so there's that. I was just curious.



yeah well, I can change the AUX voltage or VDDCI voltage in GPU-Z but can't OC the Vram much


----------



## mynm

tolis626 said:


> So, in case anyone sees this...
> 
> I sent the monitor in for RMA. There was no dead pixel, but rather dirt in the panel. They changed my panel instead of giving me a new monitor, which I'm kind of salty about, but we'll see, I don't have it in my hands yet. Bad thing is, the service guy told me that he could not reproduce the problem with the DP connection, so they said I should either try with another GPU or contact the manufacturer of my GPU. So I sent an email to MSI support and I'm waiting for the "screw you" response I'm expecting. The way this whole thing looks, I'll probably have to buy another GPU if I'm to enjoy this damn thing, provided it doesn't give me even more problems. Worst thing is that I don't wanna go NVidia and, apart from the rather expensive Radeon VII, there's nothing REALLY better than the 390x on the AMD side. I mean, +5-10% at best isn't reason enough for me to go 580 or 590. I was really hoping for Navi, but I doubt I'm gonna be able to wait 'til the summer for it. Unless money (or lack thereof) forces me to wait and not buy a VII. Sigh...
> 
> PS : Got a response from MSI. First message gave me some general crap about reinstalling drivers and cleaning ports on the card. After I explained that I've tried everything, the guy sent me a shady looking BIOS flashing folder, complete with ATIWinflash and some other stuff that I don't know anything about. I'd appreciate if anyone gave a look.


Hi, have you tried running without FreeSync? Maybe it is what is causing your monitor's black-screen problem. About the BIOS flashing folder: the BIOS itself is the TV308MH.201 file.



tolis626 said:


> Sorry to bother everyone again, but I was messing around with Wattman and I saw that there is an option to increase the memory voltage? Is that actually working or is it just a placeholder than does nothing?





christoph said:


> I think it does not apply for the R9 video cards





BradleyW said:


> Try increasing it a tiny bit and check voltage in gpuz to see if the memory voltage has indeed changed. If there's no change, it'll either be locked or your in fact changing the cards minimum floor voltage like on the Vega.


On the 380 and Polaris GPUs, the Wattman memory voltage is not the actual memory voltage; it is the core voltage floor for each memory-clock DPM state. Maybe it is the same on the 390. On Vega, @gupsterg has tested that it is the DPM5 voltage.

About the real memory voltage (MVDDC), I think I know a way to find out what voltage your GPU is using for it. I see info about it in two BIOS tables, ASIC_Init and VoltageObjectInfo. The ones in the TV308MH.201 BIOS file are:

ASIC_Init: 
91 00 01 01 00 08 02 01 02 00 52 47 52 02 02 65 02 07 52 0D 55 00 02 52 0D 52 23 2C 25 02 01 3D 25 02 06 45 17 00 51 02 4A 25 6F 22 06 49 3E 00 01 05 00 C2 00 00 00 E0 5C 25 70 22 F9 04 52 39 02 0D 02 01 03 52 43 66 04 02 8C 02 2E 00 02 0D 02 01 00 52 43 02 05 02 04 00 E8 03 52 43 02 0D 02 06 03 52 43 02 0D 02 02 03 52 43 52 05 02 01 02 00 0E E5 02 08 52 0A 02 05 02 02 00 *DC 05* 52 43 02 01 02 01 0E E5 02 08 52 0B 0D 65 D0 05 02 5B

VoltageObjectInfo:
70 00 03 01 01 03 0E 00 08 96 60 00 00 00 00 00 FF 00 01 07 0C 00 0A 00 00 00 00 00 00 00 04 07 0C 00 0E 00 00 00 00 00 00 00 02 00 24 00 00 04 00 00 00 80 10 00 00 00 00 00* 1E 05* 00 80 00 00 *46 05* 00 00 10 00 *DC 05* 00 80 10 00 *0E 06* 06 03 22 00 0C 96 A6 00 00 00 00 00 D4 00 F0 00 D5 00 F0 00 D6 00 F0 00 D7 00 F0 00 D3 00 40 00 FF 00

In the first table, DC 05 is 1500, so 1.5 V, meaning your GPU could be running the memory at 1.5 V. I have seen 390X BIOSes with 1.55 V for this value, so maybe that is why you are getting errors and other people are not.

In the second table there are other values related to the memory voltage: 1E 05 is 1.31 V, 46 05 is 1.35 V, DC 05 is 1.5 V and 0E 06 is 1.55 V. I think these are the voltages the memory can have, and that no other voltage can be applied.

So I think that if you change the DC 05 (1.5 V) value in the first table to 0E 06 (1.55 V), you could get 1.55 V on the memory.

As MVDDC can't be monitored by any software, there is no way to know whether it worked without a multimeter, short of testing and seeing whether you get fewer errors. Nobody has tested this yet, so if you want to try it, do so at your own risk.
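The byte-pair decoding described above can be sketched in a few lines of Python. Each starred pair is a 16-bit little-endian integer in millivolts; the helper name `pair_to_volts` is mine, not from any tool:

```python
# Decode the starred byte pairs from the BIOS tables above.
# Each pair is a 16-bit little-endian integer in millivolts,
# e.g. "DC 05" -> 0x05DC -> 1500 mV -> 1.5 V.
def pair_to_volts(pair_hex: str) -> float:
    raw = bytes.fromhex(pair_hex.replace(" ", ""))
    return int.from_bytes(raw, "little") / 1000.0

for pair in ("1E 05", "46 05", "DC 05", "0E 06"):
    print(f"{pair} -> {pair_to_volts(pair):.2f} V")
```

Running it prints 1.31 V, 1.35 V, 1.50 V and 1.55 V for the four pairs, matching the values quoted above.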


----------



## tolis626

christoph said:


> yeah well, I can change the AUX voltage or VDDCI voltage in GPU-Z but can't OC the Vram much


Well, it definitely helps on my card. It's just that it won't overclock the memory much even if I go balls to the wall with the voltages. Oh well, outside benchmarks, I haven't seen any big gains from memory overclocking.


mynm said:


> Hi, do you have tested to not use freesync, maybe your monitor going to black problem is caused by it. About the BIOS flashing folder the bios is the TV308MH.201 file.


Testing without Freesync was the first thing I tried. It makes no difference. Turns out it's directly related to my GPU voltage. If I go above +25mV in Afterburner (Above my stock 1.275V VID), it will start going black when it first transitions from 2D to 3D clocks. Sometimes it will come back, sometimes it won't. The higher I go, the worse the problem becomes. But if I limit the GPU to 1140MHz and +25mV it doesn't happen much, if at all. Weird, but hey, I'm missing like 10-20MHz from my previous 24/7 overclock.



mynm said:


> In 380, and polaris gpus, the wattman memory voltage is not the memory voltage and is the voltage floor for the core for per dpm memory clock. Maybe in the 390 is the same. In vega @gupsterg have tested that it is the dpm5 voltage.
> 
> About the memory voltage (mvddc), I think I know a way to know what voltage is using you gpu for it. I see info about it in two bios tables, ASIC_Init and VoltageObjectInfo tables. The ones I see into the TV308MH.201 bios file are:
> 
> ASIC_Init:
> 91 00 01 01 00 08 02 01 02 00 52 47 52 02 02 65 02 07 52 0D 55 00 02 52 0D 52 23 2C 25 02 01 3D 25 02 06 45 17 00 51 02 4A 25 6F 22 06 49 3E 00 01 05 00 C2 00 00 00 E0 5C 25 70 22 F9 04 52 39 02 0D 02 01 03 52 43 66 04 02 8C 02 2E 00 02 0D 02 01 00 52 43 02 05 02 04 00 E8 03 52 43 02 0D 02 06 03 52 43 02 0D 02 02 03 52 43 52 05 02 01 02 00 0E E5 02 08 52 0A 02 05 02 02 00 *DC 05* 52 43 02 01 02 01 0E E5 02 08 52 0B 0D 65 D0 05 02 5B
> 
> VoltageObjectInfo:
> 70 00 03 01 01 03 0E 00 08 96 60 00 00 00 00 00 FF 00 01 07 0C 00 0A 00 00 00 00 00 00 00 04 07 0C 00 0E 00 00 00 00 00 00 00 02 00 24 00 00 04 00 00 00 80 10 00 00 00 00 00* 1E 05* 00 80 00 00 *46 05* 00 00 10 00 *DC 05* 00 80 10 00 *0E 06* 06 03 22 00 0C 96 A6 00 00 00 00 00 D4 00 F0 00 D5 00 F0 00 D6 00 F0 00 D7 00 F0 00 D3 00 40 00 FF 00
> 
> In the first table DC 05 is 1500 so is 1.5v, so your gpu could be using 1.5v. I saw 390x bioses with 1.55v to this value so maybe that is why you are getting errors and other people not.
> 
> In the second table there are other voltage values related to the memory voltage. 1E 05 1.31v, 4605 1.35v, DC05 1.5v and 0E 06 1.55v. I think these voltages are the ones the memory can have. I think no other voltage can by aplied.
> 
> So I think that if you change the first table DC 05 (1.5v) value to 0E 06 (1.55v) you could have 1.55v for the memory.
> 
> As mvddc can't be monitored by any software there is no way to know if it is working without a multimeter, or testing and see if you get less erros. Anybody have tested this, so if you whant to do this do it in your own risk.


Well, I really don't understand much about that Wattman memory voltage to be honest. 

Doesn't seem to be doing much with the 390x though.

Regarding the different MVDDC values, that may actually be the culprit. Bad thing is, I have no idea about BIOS modding, especially "advanced" stuff like using Hex editors and such. I'd love to try going higher though. With that said, right now I can't afford this GPU dying on me, so I'd be pretty reluctant, but if I get another GPU, I could definitely see myself giving this a go for fun. The 390x has served me long and well, but it may have one last trick up its sleeve it seems.


----------



## mynm

tolis626 said:


> Testing without Freesync was the first thing I tried. It makes no difference. Turns out it's directly related to my GPU voltage. If I go above +25mV in Afterburner (Above my stock 1.275V VID), it will start going black when it first transitions from 2D to 3D clocks. Sometimes it will come back, sometimes it won't. The higher I go, the worse the problem becomes. But if I limit the GPU to 1140MHz and +25mV it doesn't happen much, if at all. Weird, but hey, I'm missing like 10-20MHz from my previous 24/7 overclock.
> 
> 
> Well, I really don't understand much about that Wattman memory voltage to be honest.
> 
> Doesn't seem to be doing much with the 390x though.
> 
> Regarding the different MVDDC values, that may actually be the culprit. Bad thing is, I have no idea about BIOS modding, especially "advanced" stuff like using Hex editors and such. I'd love to try going higher though. With that said, right now I can't afford this GPU dying on me, so I'd be pretty reluctant, but if I get another GPU, I could definitely see myself giving this a go for fun. The 390x has served me long and well, but it may have one last trick up its sleeve it seems.



OK, weird. I have no idea what could be causing the problem.




About the Wattman memory voltage: I don't have a 390 to test whether it works there, but, as *BradleyW* said, it could help solve your problem. Because it is a lower limit on the core voltage for a given memory clock, it can help with transitions from 2D to 3D.


I did some tests in the past with the Wattman memory voltage on my 380 (it is broken now, though not from testing this), and the weird thing about it is that with a memory timings mod it can be lowered without causing any problems, while without the mod lowering it can cause black screens. I have also tested lowering it on my new 590, but I get black screens.


I don't know how to do a proper timings mod; I don't have the technical skills. But this is the mod I did for my 380 with H5GC4H24AJR memory (some 390s use the same memory chips, so it could work for them), for the 1500 and 1625 straps, and it was working fine with low Wattman memory voltages:


77713320000000006BBD572F40550F0E2892F7060048C50022559D084C0D14205A8900A000000120100C20246F1E2912


And the values I get for it with this Vento041 tool are:



Spoiler



####SEQ_WR_CTL_D1####
DAT_DLY = 7
DQS_DLY = 7
DQS_XTR = 1
DAT_2Y_DLY = 0
ADR_2Y_DLY = 0
CMD_2Y_DLY = 0
OEN_DLY = 7
OEN_EXT = 3
OEN_SEL = 3
ODT_DLY = 0
ODT_EXT = 0
ADR_DLY = 1
CMD_DLY = 0
####SEQ_WR_CTL_2####
DAT_DLY_H_D0 = 0
DQS_DLY_H_D0 = 0
OEN_DLY_H_D0 = 0
DAT_DLY_H_D1 = 0
DQS_DLY_H_D1 = 0
OEN_DLY_H_D1 = 0
WCDR_EN = 0
####SEQ_PMG_TIMING####
TCKSRE = 3
TCKSRX = 3
TCKE_PULSE = 11
TCKE = 27
SEQ_IDLE = 7
TCKE_PULSE_MSB = 1
SEQ_IDLE_SS = 8
####SEQ_RAS_TIMING####
TRCDW = 11
TRCDWA = 11
TRCDR = 15
TRCDRA = 15
TRRD = 5
TRC = 47
####SEQ_CAS_TIMING####
TNOPW = 0
TNOPR = 0
TR2W = 20
TCCDL = 2
TR2R = 5
TW2R = 15
TCL = 14
####SEQ_MISC_TIMING####
TRP_WRA = 40
TRP_RDA = 18
TRP = 15
TRFC = 111
####SEQ_MISC_TIMING2####
PA2RDATA = 0
PA2WDATA = 0
TFAW = 8
TCRCRL = 2
TCRCWL = 5
TFAW32 = 6
TWDATATR = 0
####ARB_DRAM_TIMING####
ACTRD = 16
ACTWR = 12
RASMACTRD = 32
RASMACTWR = 36
####ARB_DRAM_TIMING2####
RAS2RAS = 111
RP = 30
WRPLUSRP = 41
BUS_TURN = 18
####MC_SEQ_MISC####
MC_SEQ_MISC1 = 0x20140D4C
MC_SEQ_MISC3 = 0xA000895A
MC_SEQ_MISC8 = 0x20010000




The stock timings for the 1500 strap are: 



7771332000000000CE516A3D9055111230964909004AE60022339D08740114206A8900A002003120150F292F94273116



And the values for it are:


Spoiler



####SEQ_WR_CTL_D1####
DAT_DLY = 7
DQS_DLY = 7
DQS_XTR = 1
DAT_2Y_DLY = 0
ADR_2Y_DLY = 0
CMD_2Y_DLY = 0
OEN_DLY = 7
OEN_EXT = 3
OEN_SEL = 3
ODT_DLY = 0
ODT_EXT = 0
ADR_DLY = 1
CMD_DLY = 0
####SEQ_WR_CTL_2####
DAT_DLY_H_D0 = 0
DQS_DLY_H_D0 = 0
OEN_DLY_H_D0 = 0
DAT_DLY_H_D1 = 0
DQS_DLY_H_D1 = 0
OEN_DLY_H_D1 = 0
WCDR_EN = 0
####SEQ_PMG_TIMING####
TCKSRE = 2
TCKSRX = 2
TCKE_PULSE = 3
TCKE = 19
SEQ_IDLE = 7
TCKE_PULSE_MSB = 1
SEQ_IDLE_SS = 8
####SEQ_RAS_TIMING####
TRCDW = 14
TRCDWA = 14
TRCDR = 20
TRCDRA = 20
TRRD = 6
TRC = 61
####SEQ_CAS_TIMING####
TNOPW = 0
TNOPR = 0
TR2W = 25
TCCDL = 2
TR2R = 5
TW2R = 17
TCL = 18
####SEQ_MISC_TIMING####
TRP_WRA = 48
TRP_RDA = 22
TRP = 19
TRFC = 148
####SEQ_MISC_TIMING2####
PA2RDATA = 0
PA2WDATA = 0
TFAW = 10
TCRCRL = 2
TCRCWL = 6
TFAW32 = 7
TWDATATR = 0
####ARB_DRAM_TIMING####
ACTRD = 21
ACTWR = 15
RASMACTRD = 41
RASMACTWR = 47
####ARB_DRAM_TIMING2####
RAS2RAS = 148
RP = 39
WRPLUSRP = 49
BUS_TURN = 22
####MC_SEQ_MISC####
MC_SEQ_MISC1 = 0x20140174
MC_SEQ_MISC3 = 0xA000896A
MC_SEQ_MISC8 = 0x20310002




Maybe somebody with timings-mod skills could test this and explain why these timings allow the Wattman memory voltage to be lowered.


As a memory timings mod can brick the GPU, it is not recommended unless you have a dual-BIOS GPU or advanced BIOS-flashing skills.



And since your 390 still performs well nowadays, don't put it at risk.
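For anyone comparing the two dumps, a quick side-by-side of a few headline timings (values copied from the Vento041 output above; the field selection and the `tightening` helper are just mine for illustration) shows how much tighter the modded strap is:

```python
# Compare a few timing fields between the modded and stock 1500 straps,
# using the values from the Vento041 dumps above.
modded = {"TRCDR": 15, "TRC": 47, "TCL": 14, "TRFC": 111, "TRP": 15}
stock  = {"TRCDR": 20, "TRC": 61, "TCL": 18, "TRFC": 148, "TRP": 19}

def tightening(stock_val: int, modded_val: int) -> float:
    """Percent reduction relative to the stock value."""
    return 100.0 * (stock_val - modded_val) / stock_val

for name in modded:
    print(f"{name}: {stock[name]} -> {modded[name]} "
          f"({tightening(stock[name], modded[name]):.0f}% lower)")
```

For example, TRFC goes from 148 to 111 cycles, a 25% reduction; whether that is what permits the lower Wattman voltage is still an open question.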


----------



## tolis626

mynm said:


> Ok, weird, I have no idea of what can be causing the problem.


Well, I did search around the web when it turned out that my screen was ok and the GPU was causing the problem. The most plausible theory I found was a guy on the AMD forums (I think) that was saying that the DP output of this card is fed by the 0.95V rail and when pushing the core voltage you may get some voltage drop on that rail and cause instability in the display output. That would actually make a lot of sense, considering that the core is the major power hog on these cards. AUX voltage doesn't seem to affect it, or at least not as much. Or it could be BS and I wouldn't know the difference, I dunno. What sucks is that I had to dial my OC down, even if slightly, now that the card needs every last drop of performance it can muster. I could realistically run 1180/1625MHz while on 1080p 60Hz, but now that I use 1440p 144Hz I'm stuck at 1140/1575MHz. Wouldn't make a huge difference, but meh. Oh well...

And, by the way, I got myself a third DP cable. This time, it's a KabelDirekt DP 1.4 cable, so 8K certified and whatever, 5 star reviews all around Amazon.de. Still, same issue. The only thing that changed (but I can't pin it down to the cable change) is that now, when it loses signal it will stay black until I either turn the screen off and on again, or if I switch inputs and go back to DP. That's a minor inconvenience, though, so...



mynm said:


> About the wattman memory voltage, if ti is working fot the 390 I don't have a 390 to test, maybe like *BradleyW *said, it could helpto solve your problem. Because as it is a lower limit to to the core voltage for a memory clock, it can help in transitions from 2D to 3D.
> 
> 
> I did some test in the past with the wattman memory voltage for my 380 (now is broken, but not for testing this), and the weird thing about it is that with memory timings mod it can be lowered and not cause any problem. But without it lowering it can cause black screens. Also I have tested to lower it to my new 590, but I get black screens.
> 
> 
> I don't know how to do a proper timing mod, I don't have these tecnical skills. But this is the mod I did for my 380 with H5GC4H24AJR memorys (some 390 have the same memoryes, so it coul work for them), for 1500 and 1625 strap, and was working ok with low wattman memory voltages. :
> 
> 
> 77713320000000006BBD572F40550F0E2892F7060048C50022559D084C0D14205A8900A000000120100C20246F1E2912
> 
> 
> And the values for it that I get with this Vento041 tool are:
> 
> 
> 
> Spoiler
> 
> 
> 
> ####SEQ_WR_CTL_D1####
> DAT_DLY = 7
> DQS_DLY = 7
> DQS_XTR = 1
> DAT_2Y_DLY = 0
> ADR_2Y_DLY = 0
> CMD_2Y_DLY = 0
> OEN_DLY = 7
> OEN_EXT = 3
> OEN_SEL = 3
> ODT_DLY = 0
> ODT_EXT = 0
> ADR_DLY = 1
> CMD_DLY = 0
> ####SEQ_WR_CTL_2####
> DAT_DLY_H_D0 = 0
> DQS_DLY_H_D0 = 0
> OEN_DLY_H_D0 = 0
> DAT_DLY_H_D1 = 0
> DQS_DLY_H_D1 = 0
> OEN_DLY_H_D1 = 0
> WCDR_EN = 0
> ####SEQ_PMG_TIMING####
> TCKSRE = 3
> TCKSRX = 3
> TCKE_PULSE = 11
> TCKE = 27
> SEQ_IDLE = 7
> TCKE_PULSE_MSB = 1
> SEQ_IDLE_SS = 8
> ####SEQ_RAS_TIMING####
> TRCDW = 11
> TRCDWA = 11
> TRCDR = 15
> TRCDRA = 15
> TRRD = 5
> TRC = 47
> ####SEQ_CAS_TIMING####
> TNOPW = 0
> TNOPR = 0
> TR2W = 20
> TCCDL = 2
> TR2R = 5
> TW2R = 15
> TCL = 14
> ####SEQ_MISC_TIMING####
> TRP_WRA = 40
> TRP_RDA = 18
> TRP = 15
> TRFC = 111
> ####SEQ_MISC_TIMING2####
> PA2RDATA = 0
> PA2WDATA = 0
> TFAW = 8
> TCRCRL = 2
> TCRCWL = 5
> TFAW32 = 6
> TWDATATR = 0
> ####ARB_DRAM_TIMING####
> ACTRD = 16
> ACTWR = 12
> RASMACTRD = 32
> RASMACTWR = 36
> ####ARB_DRAM_TIMING2####
> RAS2RAS = 111
> RP = 30
> WRPLUSRP = 41
> BUS_TURN = 18
> ####MC_SEQ_MISC####
> MC_SEQ_MISC1 = 0x20140D4C
> MC_SEQ_MISC3 = 0xA000895A
> MC_SEQ_MISC8 = 0x20010000
> 
> 
> 
> 
> The stock timings for the 1500 strap are:
> 
> 
> 
> 7771332000000000CE516A3D9055111230964909004AE60022339D08740114206A8900A002003120150F292F94273116
> 
> 
> 
> And the values for if are:
> 
> 
> Spoiler
> 
> 
> 
> ####SEQ_WR_CTL_D1####
> DAT_DLY = 7
> DQS_DLY = 7
> DQS_XTR = 1
> DAT_2Y_DLY = 0
> ADR_2Y_DLY = 0
> CMD_2Y_DLY = 0
> OEN_DLY = 7
> OEN_EXT = 3
> OEN_SEL = 3
> ODT_DLY = 0
> ODT_EXT = 0
> ADR_DLY = 1
> CMD_DLY = 0
> ####SEQ_WR_CTL_2####
> DAT_DLY_H_D0 = 0
> DQS_DLY_H_D0 = 0
> OEN_DLY_H_D0 = 0
> DAT_DLY_H_D1 = 0
> DQS_DLY_H_D1 = 0
> OEN_DLY_H_D1 = 0
> WCDR_EN = 0
> ####SEQ_PMG_TIMING####
> TCKSRE = 2
> TCKSRX = 2
> TCKE_PULSE = 3
> TCKE = 19
> SEQ_IDLE = 7
> TCKE_PULSE_MSB = 1
> SEQ_IDLE_SS = 8
> ####SEQ_RAS_TIMING####
> TRCDW = 14
> TRCDWA = 14
> TRCDR = 20
> TRCDRA = 20
> TRRD = 6
> TRC = 61
> ####SEQ_CAS_TIMING####
> TNOPW = 0
> TNOPR = 0
> TR2W = 25
> TCCDL = 2
> TR2R = 5
> TW2R = 17
> TCL = 18
> ####SEQ_MISC_TIMING####
> TRP_WRA = 48
> TRP_RDA = 22
> TRP = 19
> TRFC = 148
> ####SEQ_MISC_TIMING2####
> PA2RDATA = 0
> PA2WDATA = 0
> TFAW = 10
> TCRCRL = 2
> TCRCWL = 6
> TFAW32 = 7
> TWDATATR = 0
> ####ARB_DRAM_TIMING####
> ACTRD = 21
> ACTWR = 15
> RASMACTRD = 41
> RASMACTWR = 47
> ####ARB_DRAM_TIMING2####
> RAS2RAS = 148
> RP = 39
> WRPLUSRP = 49
> BUS_TURN = 22
> ####MC_SEQ_MISC####
> MC_SEQ_MISC1 = 0x20140174
> MC_SEQ_MISC3 = 0xA000896A
> MC_SEQ_MISC8 = 0x20310002
> 
> 
> 
> 
> Maybe somebody with timings mod skills could test this and say why these timings can help to lower the wattman memory voltage.
> 
> 
> As memory timings mod can cause the gpu to brick, is not recomende to do this if you don't have a dual bios gpu or advanced bios flashing skills.
> 
> 
> 
> As your 390 performance is excellent nowadays, don't put it in risk.


So let me get this straight. That "memory voltage" is actually how soon the GPU will ramp up depending on the voltage? So, if I'm overclocking, lower would actually be better? I'm a bit confused, sorry. 

Well, I wouldn't call the 390x's performance excellent in 2019. It's on par with cards like the 580 and 1060, but it can't overclock well and consumes, like, all the power. I've seen my card do north of 340W, easy. That was overclocked balls to the wall, but still, damn. So yeah, I would love to see some more squeezed out of my card until I can get a new one, but it not having dual BIOS and me not having experience with BIOS modding kind of kills it for me.

Thanks for your input though bro! Much appreciated!

PS : Decreasing the memory voltage in Wattman seems to help a bit with my issue. I tried decreasing it from the stock 1109mV to 1000mV and I don't get black screens quite as much. I can run my 1160/1600MHz overclock at +55mV (about 1.24V actual voltage) and I barely get any black screens now. I may get one when initially starting a game, but that's it, it won't go off and on constantly like it used to at otherwise the same settings. Or it may be something else that's helping, but so far it seems to be working, so I'm ok with it. Glad to be able to squeeze these last few MHz out of this poor card.


----------



## mynm

tolis626 said:


> Well, I did search around the web when it turned out that my screen was ok and the GPU was causing the problem. The most plausible theory I found was a guy on the AMD forums (I think) that was saying that the DP output of this card is fed by the 0.95V rail and when pushing the core voltage you may get some voltage drop on that rail and cause instability in the display output. That would actually make a lot of sense, considering that the core is the major power hog on these cards. AUX voltage doesn't seem to affect it, or at least not as much. Or it could be BS and I wouldn't know the difference, I dunno. What sucks is that I had to dial my OC down, even if slightly, now that the card needs every last drop of performance it can muster. I could realistically run 1180/1625MHz while on 1080p 60Hz, but now that I use 1440p 144Hz I'm stuck at 1140/1575MHz. Wouldn't make a huge difference, but meh. Oh well...
> 
> And, by the way, I got myself a third DP cable. This time, it's a KabelDirekt DP 1.4 cable, so 8K certified and whatever, 5 star reviews all around Amazon.de. Still, same issue. The only thing that changed (but I can't pin it down to the cable change) is that now, when it loses signal it will stay black until I either turn the screen off and on again, or if I switch inputs and go back to DP. That's a minor inconvenience, though, so...
> 
> 
> So let me get this straight. That "memory voltage" is actually how soon the GPU will ramp up depending on the voltage? So, if I'm overclocking, lower would actually be better? I'm a bit confused, sorry.
> 
> Well, I wouldn't call the 390x's performance excellent in 2019. It's on par with cards like the 580 and 1060 but can't overclock well and consume, like, all the power. I've seen my card do north of 340W, easy. That was overclocked balls to the wall, but still, damn. So yeah, I would love to see some more squeezed out of my card until I can get a new one, but it not having dual BIOS and me not having experience with BIOS modding kind of kills it for me.
> 
> Thanks for your input though bro! Much appreciated!
> 
> PS : Decreasing the memory voltage in Wattman seems to help a bit with my issue. I tried decreasing it from the stock 1109mV to 1000mV and I don't get black screens quite as much. I can run my 1160/1600MHz overclock at +55mV (about 1.24V actual voltage) and I barely get any black screens now. I may get one when initially starting a game, but that's it, it won't go off and on constantly like it used to at the same otherwise settings. Or it may be something else that's helping, but so far it seems to be working, so I'm ok with it. Glad to be able to squeeze these last few MHz out of this poor card.



You are welcome :thumb:. Nice to know that decreasing the Wattman memory voltage is helping with the problem. So what voltage do you see reported for the memory voltage now? In Hawaii GPUs, AFAIK, the core voltage is linked to the memory clock. For example, in the attached image from a 390 BIOS, the lowest voltage for the 1500MHz memory clock is the DPM3 voltage. So for these GPUs I think the Wattman memory voltage could be the DPM3 voltage.


Maybe your GPU is power throttling, and with a lower Wattman voltage the core voltage drops and that helps the 0.95 V rail, but I don't know; I don't know how the 0.95 V rail and the DP output interact.


About the 390's performance: I have a 590 and it is close to the 390, so a 2015 card with nearly the same performance is nice, and I will be using mine for years. I have a bad sample, so it needs about 1.244v for 1545MHz, which is around 212w power consumption. That is around the stock power consumption of the 390, so it is not low either.


----------



## tolis626

mynm said:


> You are welcome :thumb:. Nice to know that decreasing the wattman memory voltage is helping with the problem. So what is the voltage you see for memory voltage?. In hawaii gpus afaik, the core voltage is linked to the memory mhz. For example for these attached image of a 390 bios, the lower voltage for 1500mhz memory clock is the dpm3 voltage. So for these gpu wattman memory voltage I think it could be dpm3 voltage.
> 
> 
> Maybe your gpu is power throttling and lowering wattman voltage the voltage could be going lower and help wit the 0.95V, but I don't know, I don't how is working 0.95V and DP.
> 
> 
> About the 390 performance, I have a 590, it is near the 390. So a 2015 with near the same performance is nice. And I will be using it for years. I have a bad sample gpu, so it needs like 1.244v for 1545 mhz with is around a 212w power consumption, that is arond the stock power consumption of the 390, so it is not low aswell.


In Hawaii GPUs the core voltage is directly tied to memory overclocking as far as I know too, so I think you're correct. I could never really overclock the memory reliably without added voltage. And to get over 1650MHz bench stable (game stable is a no-no on my poorly overclocking card) I had to crank the voltage up quite a bit. My highest bench clocks have been 1225MHz core and 1750MHz memory IIRC, but that was at +125mV using an Afterburner script (and +50mV aux voltage). I tried to game on that just for giggles, but it crashed within 10 minutes. The temps weren't even THAT bad, it barely touched 80C with the fans at 100% and the side panel off, it hovered mostly around 77C on the core and 80C on the VRMs IIRC. Clocking the memory down to 1700MHz allowed me to play for a bit, but it crashed again after throwing millions of errors first. With memory at 1650MHz it was ok. I guess if I had my rig under water, I could use these frequencies no problem, probably at lower volts too. But boy would I have to bypass some power limits first... The MSI Gaming 390x is great, but not having dual BIOS kills me a little bit on the inside. I don't want my first attempt at BIOS modding to be without a backup BIOS.

Regarding my issue, I really have no idea what's going on. But hey, as long as it works with a bit more performance on tap, I'm happy!

Well, 212W for the 390x is peanuts. Typical board power for these cards is 230W, so adding 50% we're sitting at 345W. I've run out of power budget A LOT with this card, so yeah... With that said, if you don't push them, they can be quite efficient! I can get 1080MHz (so stock) with some stupid undervolt of like -50mV or -75mV. Like that, the card sips power like a British lady does tea. I think it barely topped 150W.
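The power-budget arithmetic above is simple enough to sketch; the 230W base figure is from this post, and the assumption (mine) is that the PowerTune slider applies its percentage on top of that base:

```python
# PowerTune-style power limit: the slider adds a percentage on top of
# the board's base power target (230 W is the figure quoted above).
def max_board_power(base_watts: float, limit_pct: float) -> float:
    return base_watts * (1.0 + limit_pct / 100.0)

print(max_board_power(230, 50))  # +50% slider -> 345.0 W, the figure above
```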


----------



## mynm

tolis626 said:


> In Hawaii GPUs the core voltage is directly tied to memory overclocking as far as I know too, so I think you're correct. I could never really overclock the memory reliably without added voltage. And to get over 1650MHz bench stable (game stable is no no on my poorly overclocking card) I had to crank the voltage up quite a bit. My highest bench clocks have been 1225MHz core and 1750MHz memory IIRC, but that was at +125mV using an Afterburner script (and +50mV aux voltage). I tried to game on that just for giggles, but it crashed withink 10 minutes. The temps weren't even THAT bad, it barely touched 80C with the fans at 100% and the side panel off, it hovered mostly around 77C on the core and 80C on the VRMs IIRC. Clocking the memory down to 1700MHz allowed me to play for a bit, but it crashed again after throwing millions of errors first. With memory at 1650Mhz it was ok. I guess if I had my rig under water, I could use these frequencies no problem, probably at lower volts too. But boy would I have to bypass some power limits first... The MSI Gaming 390x is great, but not having dual BIOS kills me a little bit on the inside. I don't want my first attemp at BIOS modding to be without a backup BIOS.
> 
> Regarding my issue, I really have no idea what's going on. But hey, as long as it works with a bit more performance on tap, I'm happy!
> 
> Well, 212W for the 390x is peanuts. Typical board power for these cards is 230W, so adding 50% we're sitting at 345W. I've ran out of power budget A LOT with this card, so yeah... With that said, if you don't push them, they can be quite efficient! I can get 1080MHz (so stock) with some stupid undervolt of like -50mV or -75mV. Like that, the card sips power like a British lady does tea. I think it barely topped 150W.



I will have to buy a dual-BIOS 390 some day  to test the BIOS mods for it, especially the MVDDC voltage one.



About the 212w: that was with 1.238v; with 1.244v it is 216w, but that's while gaming. I haven't tested it with benchmarks, but I suppose it can go to 230w or more as well, and the core sits at 80°C at 2050 RPM (this card doesn't have VRM temperature monitoring). 1545MHz is the stock boost clock at 1.212v, but it is not stable unless I add +30% power limit over the 178w stock TDP. I was expecting something better; it seems reviewers were using high-quality samples.


----------



## tolis626

mynm said:


> I have to buy one dual bios 390 some day  , to test the bios mods for it, specially the mvddc voltage one.
> 
> 
> 
> About the 212w it was with 1.238v, with 1.244v is 216w but while gaming. I don't have tested it with benchmarks, but I suppose it can go to 230w or more aswell, and the core is at 80º with 2050 rpm, it don't have vrm temp monitoring. 1545mhz is the stock boost clock with 1.212v, but it is not stable if you add + 30% power limit from 178w stock tdp. I was expecting some thing better. It seems reviewes are using high quallity samples.


Meh, don't waste your time. These cards were terrific back in the day, but now you get midrange performance at best, with the power consumption, thermals and noise of an overclocked top-of-the-line card. Fun for experiments if you can find one cheap, and preferably water cool it, but that's about it. I have to commend AMD for how well this card has aged, though. I remember its competition was really Kepler. I don't see anyone even mentioning 700 series cards from NVidia anymore... Seems I made the right choice back then.


----------



## mynm

tolis626 said:


> Meh, don't waste your time. These cards were terrific back in the day, but now you get midrange performance at best, with the power consumption, thermals and noise of an OC top of the line card. Funny for experiments if you can, like, maybe find one cheap and preferably water cool it, but that's about it. I have to commend AMD for how well this card has aged, though. I remember it's competition was really Kepler. I don't see anyone even mentioning 700 series cards from NVidia anymore... Seems I made the right choice back then.



Yes, maybe it is a waste of time, but I have already spent the time figuring out how to do the MVDDC mod, and I want to know if it works. I bought a 380 (before the 590 I have now) instead of a slightly more expensive 290, maybe a bad choice, and I got started with BIOS modding on my 380 by reading the Hawaii BIOS mod thread. So it is something I want to test if I find a cheap 390.


----------



## tolis626

mynm said:


> Yes, maybe is a waste of time, but I have already wasted the time finding out how to do the mvddc mod, and I want to know if it is working. I bought a 380 ,before the 590 I have now, instead of a 290 a little bit more expensive, maybe a bad choice, and I have started doing bios modding to my 380 reading the hawaii bios mod thread, so is some thing I want to test, if I found a cheap 390.


Well then, I hope you do get to buy one! These are really great cards to push. I mean, treat them correctly and they can take some punishment. They are just... Old at this point. But I guess it's fine for tinkering, especially if you won't really mind that much if the card dies. Cheers!


----------



## hawker-gb

Hello guys and girls

I have a Sapphire R9 390 Nitro without backplate.
Suddenly and randomly my monitor blacks out and the PC needs a hard reset.
I suspect the GPU and have tried the following:

-Used it in another PC with another screen, same effect
-Cleaned and replaced the thermal paste, same
-Changed cables, tried DP and HDMI, same effect
-There is no overheating, it just randomly blacks out

Can I try anything else, or is it dead?


----------



## christoph

hawker-gb said:


> Hello guys and girls
> 
> i have Sapphire R9 390 Nitro without backplate.
> Suddenly and randomly my monitor blacks out and pc needs hard reset.
> I suspect on GPU and did following:
> 
> -Use it in another pc and screen,same effect
> -clean and replace thermal paste,same
> -Change cables,try DP and HDMI,same effect
> -There is no overheating,just randomly blacks out
> 
> Can i try anything else or its dead?


It's not the video card, it's Windows; lots of people are experiencing black screens.


----------



## hawker-gb

christoph said:


> is not the video card, is windows, lots of people are experiencing black screens


Strange, because an R5 2400G (iGPU) works fine on the same Windows install, and it happens on two different PCs on Win10.

What is the solution if it is Windows?


----------



## christoph

hawker-gb said:


> Strange because R5 2400g works well on same Windows (igpu)
> And it happen on two different PCs on win10.
> 
> What is solution in case of Windows?



There's a lot you can try; search Google for Windows black screens and you'll find plenty of suggestions.


----------



## Mister300

*Doom Eternal Performance on a 2015 card*

My 5820K with a 5-year-old underclocked XFX 390X hits 70-100 fps on my LG 2K 144Hz VA panel at Ultra Nightmare settings; it runs like butter. The Vulkan API runs great. I see no need to upgrade the card yet.


----------



## dlee7283

I've seen used 390 prices drop below 50 USD, which means it might be crossfire time for some owners, with Doom Eternal and Resident Evil 3 coming out.

https://www.ebay.com/itm/SAPPHIRE-N...207395?hash=item2ae2884363:g:GrsAAOSw~ppeWEjp


----------



## speed_demon

How well does the 390 scale in crossfire? 

Because otherwise you are better off going with the Vega 56 or comparable price/perf Nvidia card.


----------



## dlee7283

speed_demon said:


> How well does the 390 scale in crossfire?
> 
> Because otherwise you are better off going with the Vega 56 or comparable price/perf Nvidia card.


It's around the 50-buck range right now with mining taking a nosedive ATM. It might be worth it if you've got a beefy PSU and you have games that perform well in Vulkan. 

Vega is optimal, but some might not want to fork out $200+. It's just a budget option for those who already have a 390.

Depending on the scenario, CFX outperforms a single GTX 1070 in many tests, which I think is good for a 2015 card. 

https://www.reddit.com/r/Amd/comments/4nmhjw/gtx_1070_vs_crossfire_r9_390_1440p_not_oc/

The major con is of course that you would need a 1000W PSU.


----------



## christoph

dlee7283 said:


> It's around the 50-buck range right now with mining taking a nosedive ATM. It might be worth it if you've got a beefy PSU and you have games that perform well in Vulkan.
> 
> Vega is optimal, but some might not want to fork out $200+. It's just a budget option for those who already have a 390.
> 
> Depending on the scenario, CFX outperforms a single GTX 1070 in many tests, which I think is good for a 2015 card.
> 
> https://www.reddit.com/r/Amd/comments/4nmhjw/gtx_1070_vs_crossfire_r9_390_1440p_not_oc/
> 
> The major con is of course that you would need a 1000W PSU.



But what about games? Does Crossfire just work no matter the game? I might consider buying another 390, because of all this pandemic stuff, instead of buying a 5700 XT.


----------



## dlee7283

christoph said:


> But what about games? Does Crossfire just work no matter the game? I might consider buying another 390, because of all this pandemic stuff, instead of buying a 5700 XT.


This article helped me a lot. Basically, there are different modes Crossfire can use to split the rendering work, even if the game doesn't officially support it.

https://gpuopen-librariesandsdks.github.io/doc/AMD-CrossFire-guide-for-Direct3D11-applications.pdf

Some games like Fallout 4 won't benefit from it at all. Some games will get a massive boost.


----------



## christoph

dlee7283 said:


> This article helped me a lot. Basically, there are different modes Crossfire can use to split the rendering work, even if the game doesn't officially support it.
> 
> https://gpuopen-librariesandsdks.github.io/doc/AMD-CrossFire-guide-for-Direct3D11-applications.pdf
> 
> Some games like Fallout 4 won't benefit from it at all. Some games will get a massive boost.


Thanks, I'll take a look at it.


----------



## speed_demon

I'm not sure I understand why the X is in front of my score. Probably because it's on a legacy bench? Just tinkering with the system to find its limits.


----------



## Gorszkof

Hey, I just became the owner of an R9 390 Gigabyte G1 Gaming card. It is bloody hot, and I am curious: is there any list of which coolers are compatible with this card? And do you know if changing the fans would cool it a bit?


----------



## speed_demon

@Gorszkof - Before you do anything, remove the existing thermal paste and apply some Arctic Silver 5 or Arctic MX-4 (or whatever you have lying around). These cards hit 90°C under load due to the old thermal paste drying out and no longer making good contact with the heatsink. 

I have the same card and just repasted mine earlier today. Haven't seen it past 65°C since.


----------



## Gorszkof

speed_demon said:


> @*Gorszkof* - Before you do anything, remove the existing thermal paste and apply some Arctic Silver 5 or Arctic MX-4 (Or whatever you have lying around). These cards hit 90c under load due to the old thermal paste drying out and no longer making good contact with the heatsink.
> 
> I have the same card and just repasted mine earlier today. Haven't seen it past 65c since.





Well, I cleaned it already and put on some new paste (some no-name stuff, since I am waiting for the MX-2), and it's probably better, because the previous owner said it got too hot and shut itself down (I am not surprised, because it was very dirty and the fans were not installed the right way). Right now I don't have that problem, but it gets to like 94°C and is really loud. Only in gaming, of course.


----------



## speed_demon

Same thing happened to me; you need to be very careful that the heatsink is making solid contact. The heatsink is finicky to install, as there is a large capacitor that has to sit inside a cutout below the cooling fins, which can make it seem like it's been installed right when it's just enough off that it doesn't have the mounting pressure it needs to cool properly.


----------



## Gorszkof

speed_demon said:


> Same thing happened to me; you need to be very careful that the heatsink is making solid contact. The heatsink is finicky to install, as there is a large capacitor that has to sit inside a cutout below the cooling fins, which can make it seem like it's been installed right when it's just enough off that it doesn't have the mounting pressure it needs to cool properly.





Good God, you were totally right. MX-2 on the GPU and a few new thermal pads, and it dropped around 10°C. Awesome. 
It won't stop me from going full crazy with water cooling, though.


----------



## dagget3450

speed_demon said:


> I'm not sure I understand why the X is in front of my score? Probably because it's on a legacy bench? Just tinkering with the system to find its limits.


I thought the X was for extreme settings/bench scores?



speed_demon said:


> @Gorszkof - Before you do anything, remove the existing thermal paste and apply some Arctic Silver 5 or Arctic MX-4 (Or whatever you have lying around). These cards hit 90c under load due to the old thermal paste drying out and no longer making good contact with the heatsink.
> 
> I have the same card and just repasted mine earlier today. Haven't seen it past 65c since.


I picked up 2 R9 390 Nitros off fleabay; one of them had this issue as well. Thanks for the heads up!


----------



## Ultra-m-a-n

Guys, I finally sold my r9 390... I bought it from the OP Agent Smith1984 himself.

I sold it locally for $80, to fund my $90 RX 480 Red Devil I bought from [H].


.....moment of silence.


----------



## speed_demon

Having some recurring problems with my R9 390 and wondering if anybody has any tips to try and fix it. Biggest problem is the random black screens and system crashes. Also under load it sounds like a shaken up & recently opened can of soda. Doesn't sound at all like the coil whine of other cards I've had. 

Any ideas?


----------



## christoph

speed_demon said:


> Having some recurring problems with my R9 390 and wondering if anybody has any tips to try and fix it. Biggest problem is the random black screens and system crashes. Also under load it sounds like a shaken up & recently opened can of soda. Doesn't sound at all like the coil whine of other cards I've had.
> 
> Any ideas?


 one of the chokes must be bad


----------



## speed_demon

No, I was very gentle when handling it. I would never hurt it, or even choke the card. You need to be careful with electronics.


----------



## Nerdyfred07

Hey guys. I too bought a Strix 390X for cheap. 

But bloody hell, I think there's something wrong!! Everyone knows that regardless of the card it should only consume 20-watts-ish at idle, but mine is confirmed at 100 watts! 

I got an energy meter, and with a GTX 1070 total consumption is 120 watts at idle. 
With the 390X it's 240 watts. When I game it's 700 watts!! I can bring it down to 450 watts gaming by power limiting at -50% in Radeon software. 

Like, what the hell is going on? 😮

I expected maybe 150 watts idle and 350 watts gaming, not +100 watts either way even with the power limited, lol.


----------



## speed_demon

It looks to me like the card really does draw 100 watts at idle. Found this, and it lists the MSI 390X at just under 100 watts minimum draw - https://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-10.html


----------



## Nerdyfred07

speed_demon said:


> It looks to me like the card really does draw 100 watts at idle. Found this, and it lists the MSI 390X at just under 100 watts minimum draw - https://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-10.html



You're not wrong! But does it depend on the sample/card!?


Looking at multiple other sites like Guru3D and KitGuru, the idle is more like 20 watts (or let's just say 50W max), but then mine is a whopping 150 watts at idle :O


Even here https://bjorn3d.com/2015/07/asus-strix-r9-390x-gaming-oc-8g-review-strix-r9390x-dc3oc-8gd5-gaming/5/ I thought at first that your number sounded right, but all their other figures are full-system power, so again it can't be 100W for the card alone. 


Thinking I should flash a different, lower-variant BIOS to see how that goes. There's just something seriously wrong with a max draw of 600W (at the wall)!


----------



## neurotix

Nerdyfred07 said:


> You're not wrong! But does it depend on the sample/card!?
> 
> 
> Looking at multiple other sites like Guru3D and KitGuru, the idle is more like 20 watts (or let's just say 50W max), but then mine is a whopping 150 watts at idle :O
> 
> 
> Even here https://bjorn3d.com/2015/07/asus-strix-r9-390x-gaming-oc-8g-review-strix-r9390x-dc3oc-8gd5-gaming/5/ I thought at first that your number sounded right, but all their other figures are full-system power, so again it can't be 100W for the card alone.
> 
> 
> Thinking I should flash a different, lower-variant BIOS to see how that goes. There's just something seriously wrong with a max draw of 600W (at the wall)!



Hey, I can help.

I had two R9 290 Tri-X, and two R9 290 Vapor-X as well. 

Windows 10 uses 3D acceleration more heavily, so maybe these cards don't clock down to 300/150MHz like they did in Windows 7. Check your power plan, maybe.

I would say that 100W would not be surprising; both Hawaii and Fiji are monster GPUs and use 300-400W under load. I remember I had the two 290 Tri-X and a 5GHz FX-8350 back in the day; I did see upwards of 900W power usage in games, and 300 of that was probably the CPU.

If it functions and doesn't overheat, *shrug*


----------



## yrd

*Booting - altered BIOS?*

Hey guys, I have an old R9 390 Sapphire Nitro 8GB with backplate (B450 Tomahawk w/ 2600X, running on Win10 1903) that I suspect was used for light mining. It was demonstrated to me running a game, however.

It seems to only like booting if the system is warm (black screen at POST, with a VGA error, unless I have left it at the failed POST stage for a few minutes and let the system warm up a bit; then rebooting POSTs fine). 

I'm going to assume some idiot has screwed around with the BIOS on this card. I know it has a switch over to UEFI, but that doesn't seem to make a difference. 

Is there anything I can do to see if the card is beyond repair, or to restore its original settings/voltages? Or a guide to learning more?

I'm not even sure what they are supposed to look like. GPU-Z screens:

http://gpuz.techpowerup.com/20/07/20/3tg.png
http://gpuz.techpowerup.com/20/07/20/5vs.png

Thanks,
Dray


----------



## chris89

If anyone wants their BIOS modded, message me and I'll mod it for them. But please make a video of it in action on YouTube afterward so I can see the results. I love watching the full OSD in MSI Afterburner, showing clocks, temps, fan speed, and VRM 1 & 2 temperatures, please.


----------



## Cleatboks

chris89 said:


> If anyone wants their BIOS modded message me and ill mod it for them but please make a video of it in action on youtube afterward so I can see it in action. I love watching the results with full osd on msi afterburner enabled showing the clock and temps and fan speed and vrm 1 & 2 temperature please.


I tried your modded BIOS, Chris, and it's a really great improvement!! I got about 5-10 more fps while gaming, and it also became snappier, but it seems my card is producing artifacts and the system freezes after a while with this setup (I really want to keep this setting, or go even higher if possible). Guess I have to go back to the default settings for now.

pic


----------



## chris89

Cleatboks said:


> I tried your modded BIOS, Chris, and it's a really great improvement!! I got about 5-10 more fps while gaming, and it also became snappier, but it seems my card is producing artifacts and the system freezes after a while with this setup (I really want to keep this setting, or go even higher if possible). Guess I have to go back to the default settings for now.
> 
> pic
> View attachment 2459687


Can you monitor VRM 1 & 2 in game as well? It probably needs more voltage if it's artifacting.


----------



## Cleatboks

chris89 said:


> Can you monitor VRM 1 & 2 in game as well? It probably needs more voltage if it's artifacting.


Gonna try testing this one then, Chris; I hope my card can handle the voltage increase. I'll screenshot it later here.


----------



## chris89

Cleatboks said:


> Gonna try testing this one then, Chris; I hope my card can handle the voltage increase. I'll screenshot it later here.


Thanks, bro. Yeah, I can't remember off the top of my head what the stock voltage is, but I think it's 1.275V; if someone else can verify this I'd appreciate it, unless you can, Cleatboks?

If 1.275V is the case, then an increase of +75mV takes it from 1.275V to 1.350V for a 1.133GHz core clock. I thought 1.333V was sufficient for you, but I guess not.

Let me know & post your complete GPU sensor data from HWiNFO for me too, please, taken under load. I need to see all of your values: current/min/max/average.


----------



## Cleatboks

chris89 said:


> Thanks, bro. Yeah, I can't remember off the top of my head what the stock voltage is, but I think it's 1.275V; if someone else can verify this I'd appreciate it, unless you can, Cleatboks?
> 
> If 1.275V is the case, then an increase of +75mV takes it from 1.275V to 1.350V for a 1.133GHz core clock. I thought 1.333V was sufficient for you, but I guess not.
> 
> Let me know & post your complete GPU sensor data from HWiNFO for me too, please, taken under load. I need to see all of your values: current/min/max/average.


I'm not sure about the card's stock voltage either; each card is different, I suppose? Still learning about this stuff; I'm just a run-of-the-mill gamer, Chris 😅. 
I tried your suggestion to increase by +75, even maxed out at +100, but it doesn't have any effect; the artifacts are still there. My card is maybe just bad at overclocking.

Here's the pic, Chris.

It's kinda hard to take a screenshot when starting the game; everything goes haywire (lots of dots, screen flickering, then suddenly the system freezes).
The seemingly stable core clock I tried is below 1.100GHz, or 1090 to be exact, but it barely makes any noticeable difference from the base clock, plus I had to increase the voltage by around +20-30,
which also increases the temp. Guess the default clock is already hardwired into this card. Anyway, thanks a lot for the help, man. Cheers!


----------



## chris89

Cleatboks said:


> I'm not sure about the card's stock voltage either; each card is different, I suppose? Still learning about this stuff; I'm just a run-of-the-mill gamer, Chris 😅.
> I tried your suggestion to increase by +75, even maxed out at +100, but it doesn't have any effect; the artifacts are still there. My card is maybe just bad at overclocking.
> 
> Here's the pic, Chris.
> View attachment 2459871
> 
> 
> It's kinda hard to take a screenshot when starting the game; everything goes haywire (lots of dots, screen flickering, then suddenly the system freezes).
> The seemingly stable core clock I tried is below 1.100GHz, or 1090 to be exact, but it barely makes any noticeable difference from the base clock, plus I had to increase the voltage by around +20-30,
> which also increases the temp. Guess the default clock is already hardwired into this card. Anyway, thanks a lot for the help, man. Cheers!


Try downloading HWiNFO & post your sensors page like this after a loaded gaming session on stock settings.

Free Download HWiNFO Software | Installer & Portable for Windows, DOS
www.hwinfo.com





We need to know the stock voltage.


----------



## Cleatboks

It's fine, Chris; I think the card just doesn't overclock well.


----------



## chris89

At least we could test a de-limited BIOS (limits raised) at stock clocks to see if it helps & see what happens?


----------



## Fuski

Hello everyone! 
I started getting random black screens with my Sapphire Nitro 390X (on the desktop, in games, etc.). I can't even get into Windows sometimes. Is my GPU dying?  
I heard that tweaking some 2D or 3D voltages in the BIOS can fix this, or is it already useless to try?

My stock BIOS is in the attachments if someone is interested in helping me fix this.


----------



## speed_demon

I had the black screens and screen flickering for a long time with my 390. The driver version is very, very important with that issue, and you need to find out which drivers work best for you. In my case the 19.10 driver and the 20.8.2 driver both work great with no black screens; with any driver revision other than those two I get frequent problems.


----------



## chris89

Fuski said:


> Hello everyone!
> I started getting random black screens with my Sapphire Nitro 390X (on the desktop, in games, etc.). I can't even get into Windows sometimes. Is my GPU dying?
> I heard that tweaking some 2D or 3D voltages in the BIOS can fix this, or is it already useless to try?
> 
> My stock BIOS is in the attachments if someone is interested in helping me fix this.


I had black screen issues on my 390X when the fan wasn't cooling the ASIC down enough on boot-up, on the desktop, and in game, so you have to tune the fan profile to cool the card down further & earlier. Test this BIOS & let me know what happens.


----------



## Fuski

chris89 said:


> I had black screen issues on my 390X when the fan wasn't cooling the ASIC down enough on boot-up, on the desktop, and in game, so you have to tune the fan profile to cool the card down further & earlier. Test this BIOS & let me know what happens.


I appreciate your help, Chris
Bios is not tested yet, but i have some things that i need to mention - i have an Arctic Cooling custom cooler installed and this can't be modded through bios. And - my gpu degraded fast these days so i cant even get into windows now with drivers installed. One more thing - i checked on another pc and same ****. My gpu is prolly dead now


----------



## chris89

Fuski said:


> I appreciate your help, Chris.
> The BIOS is not tested yet, but there are some things I need to mention: I have an Arctic Cooling custom cooler installed, and it can't be controlled through the BIOS. Also, my GPU has degraded fast these days, so I can't even get into Windows now with drivers installed. One more thing: I checked on another PC, and got the same ****. My GPU is probably dead now.


If the card posts video on startup, it's not dead. By the way, I recommend you flash the BIOS I made; it will help. My friend did the same thing to his GPU. The problem is you're not cooling the VRM with enough pressure. Do you have any pictures of the card right now? Take close-ups of the VRM heatsinks & RAM heatsinks. It's the VRM and the ASIC overheating that cause the black screen issue. The same applies to the 5700 XT; it's an overheating issue, and AMD/ATI are idiots about temperature limits.


----------



## Fuski

chris89 said:


> If the card posts video on startup, it's not dead. By the way, I recommend you flash the BIOS I made; it will help. My friend did the same thing to his GPU. The problem is you're not cooling the VRM with enough pressure. Do you have any pictures of the card right now? Take close-ups of the VRM heatsinks & RAM heatsinks. It's the VRM and the ASIC overheating that cause the black screen issue. The same applies to the 5700 XT; it's an overheating issue, and AMD/ATI are idiots about temperature limits.


Flashed the BIOS through DOS using atiflash 4.17: no difference. Like I said, BIOSes don't affect custom coolers. Arctic Cooling Accelero Xtreme IV; google it. There are no RAM/VRM heatsinks, because the big radiator acting as a backplate does that job. So I don't think it's all about temperatures.


----------



## Lex-Man

Hi guys, I've just joined the forum because I have a 390X, but I can't seem to play any new games on it. I've been trying to play Control and Outer Wilds. After about 10 minutes of play I start getting a lot of artefacts in a pattern, and then the game crashes. Older games like GTA 5 run fine, and I can run Blender with Cycles rendering on the card at 100%; apart from getting very noisy it seems fine. 

I've got a ASRock X99E-ITX/ac 
16GB RAM CORSAIR DDR 4 (2x8GB)
i7 5820k


----------



## chris89

You need to make sure the core never exceeds 84°C, & the 512-bit GDDR5 gets scorching hot under load. You could try undervolting the RAM bus to 875mV from 1000mV & downclocking the RAM from 1500MHz to 1250MHz. I did this to mine; it helps a lot. A black screen means the board temperature is too high; artifacting means the voltage is too low.


----------



## Lex-Man

chris89 said:


> You need to make sure the core never exceeds 84°C, & the 512-bit GDDR5 gets scorching hot under load. You could try undervolting the RAM bus to 875mV from 1000mV & downclocking the RAM from 1500MHz to 1250MHz. I did this to mine; it helps a lot. A black screen means the board temperature is too high; artifacting means the voltage is too low.


I pushed my base core clock down to 900MHz and the mem to 1250MHz. The game seems to run fine at that setting, although I haven't had a massive session yet. The temp was mid-70s. I can't seem to get the card to undervolt itself or to limit the temp setting either; in Afterburner that just causes the values to reset to default. 

It seems from what you said that the issue is the card isn't getting enough voltage. I have both PCIe power connectors coming from the same cable; I know from when I installed the card that this isn't recommended, but due to my case I don't have any other option.


----------



## chris89

Yeah, you can undervolt the memory from 1000mV to 875mV in HawaiiBiosEditor when you set it to 1250MHz on the memory. I wouldn't go below 1000MHz on the core clock while leaving the voltage at 65288.


----------



## aqvyx

Is there something like a memory timings mod for the R9 390? Increasing performance without straight-up overclocking?


----------



## gordesky1

So I'm using my R9 390 as my gaming card again. It had been a mining card since 2017, up until 2 months ago, and 4 days ago I popped it into my main rig and have my 1080 Ti mining with my 2 other cards, because yeah, mining has been great again.

I had to put the stock BIOS back on the 390, though, since it was unstable at times in gaming; that fixed the lock-ups. It's running great again, but I noticed that when running 4 monitors on it, sometimes one of the monitors goes a solid color or snowy, and to fix it you just disable it through the display options and enable it again.

I remember this used to happen with my old AMD 5870 when more than 1 monitor was hooked up, and the fix was to raise the 2D clocks.

Is there any way to run all clock states at full 3D clocks, always? On Nvidia you can lock the clocks, but I can't find such a feature on the AMD side. In the AMD software I have state 1 up to state 7 maxed at the same 3D max clocks, but you can't change the state 0 minimum clock.


----------



## chris89

Does anyone have the complete strings of the Elpida & Hynix memory timings?

I have this info here, but I need the strings for Elpida, I think.

290 290x 390 390x memory timings help video - YouTube

Strap end 400MHz (40 9C 00) , Range = 150-400MHz
Strap end 800MHz (80 38 01) , Range = 401-800MHz
Strap end 900MHz (90 5F 01) , Range = 801-900MHz
Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
Strap end 1125MHz (74 B7 01) , Range = 1001-1125MHz
Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
Strap end 1750MHZ (98 AB 02) , Range = 1626-1750MHz
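Side note on those strap endings, in case anyone wants to sanity-check or extend the list: as far as I can tell, each three-byte value is just the strap's upper frequency bound in 10 kHz units, stored little-endian. A quick sketch of that encoding (my reading of the table above, not an official tool):

```python
# Hawaii memory strap endings appear to encode the strap's upper
# frequency bound as a 24-bit little-endian integer in 10 kHz units,
# e.g. 400 MHz -> 40000 -> 0x009C40 -> bytes "40 9C 00".
def strap_bytes(mhz: int) -> str:
    value = mhz * 100                  # MHz -> 10 kHz units
    raw = value.to_bytes(3, "little")  # 3 bytes, little-endian
    return " ".join(f"{b:02X}" for b in raw)

for mhz in (400, 800, 1000, 1500, 1750):
    print(f"Strap end {mhz}MHz -> {strap_bytes(mhz)}")
```

Running it reproduces every entry in the list above, so finding the Elpida strings should then just be a matter of searching for these byte patterns in the BIOS dump.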


----------



## Flingflong

Lads, my Gigabyte G1 390's fans are dying a slow death, and I think I'm gonna go the route of zip-tying two Arctic P12 PWM fans to the heatsink. 

My question is, how should I go about connecting the fans? I have the MSI Z97S Krait Edition mobo; it has 3 non-PWM sys fan headers and 2 PWM CPU fan headers. Could I connect them to the GPU fan header? It's tiny though, to use a technical term, so I'd have to get an adapter of some sort, but I couldn't find out which one I'd need by googling. Or should I just connect them to one of the sys fan headers? From what I understand you can still control them without the PWM option, but I am pretty dumb so I can't be sure. I'm guessing I can't use the extra CPU fan header, but I couldn't really find any info on it.


----------



## The red spirit

Flingflong said:


> Lads, my Gigabyte G1 390's fans are dying a slow death, and I think I'm gonna go the route of zip-tying two Arctic P12 PWM fans to the heatsink.
> 
> My question is, how should I go about connecting the fans? I have the MSI Z97S Krait Edition mobo; it has 3 non-PWM sys fan headers and 2 PWM CPU fan headers. Could I connect them to the GPU fan header? It's tiny though, to use a technical term, so I'd have to get an adapter of some sort, but I couldn't find out which one I'd need by googling. Or should I just connect them to one of the sys fan headers? From what I understand you can still control them without the PWM option, but I am pretty dumb so I can't be sure. I'm guessing I can't use the extra CPU fan header, but I couldn't really find any info on it.


The GPU header would be ideal, as the GPU itself controls fan speed according to its temperature. If you can't find an adapter, then it's better to just run the fans at full speed, or at any other speed where you know the GPU won't overheat. Some motherboards let you control fans automatically with your own fan curve, just like the GPU header does.


----------



## chris89

PS - What I also found for clocks & voltages, as follows, on my 390X with my 1000W Kingwin power supply.

Giga = billion, so billions of pixels per second.

Highest core clock I got was 1240MHz @ 1440mV.

1094mhz @ 1250mv @ 70.0 Giga Pixel/s
1133mhz @ 1333mv @ 72.5 Giga Pixel/s
1157mhz @ 1357mv @ 74.0 Giga Pixel/s
1173mhz @ 1373mv @ 75.0 Giga Pixel/s
1188mhz @ 1388mv @ 76.0 Giga Pixel/s
1204mhz @ 1404mv @ 77.0 Giga Pixel/s
1219mhz @ 1419mv @ 78.0 Giga Pixel/s
1234mhz @ 1434mv @ 79.0 Giga Pixel/s
1250mhz @ 1450mv @ 80.0 Giga Pixel/s
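For anyone wondering where those Giga Pixel/s figures come from: they line up with core clock times Hawaii XT's 64 ROPs, to within rounding, so they're derived rather than measured. A quick sketch of that relationship (assuming the 390X's 64 ROPs):

```python
# Peak pixel fillrate = core clock x ROP count.
# The R9 390X (Hawaii XT) has 64 ROPs, so e.g. 1094 MHz x 64 ROPs
# = 70016 MPixel/s, i.e. ~70.0 GPixel/s as listed above.
def fillrate_gpixels(core_mhz: int, rops: int = 64) -> float:
    return core_mhz * rops / 1000.0

for mhz in (1094, 1133, 1250):
    print(f"{mhz}mhz -> {fillrate_gpixels(mhz):.1f} GPixel/s")
```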


----------

