# R9 390 vs. GTX 970 [The Final Conclusion]



## sinholueiro

I have to go with the 390. It just performs better, that's what matters in the end.


----------



## Slay

But what about OC? I've heard some bad things about the 390 and good things about 970.


----------



## Artikbot

It should be a Tahiti Pro, so overclocking shouldn't be much of an issue.

Other than it also bumping power consumption generously, of course.


----------



## Noufel

Quote:


> Originally Posted by *Artikbot*
> 
> It should be a Tahiti Pro, so overclocking shouldn't be much of an issue.
> 
> Other than it also bumping power consumption generously, of course.


The 390 is a Hawaii Pro, and for OC you'll be targeting 1150-1200 MHz on the core with a voltage bump.


----------



## rv8000

There really isn't a better card between the two, they both provide fantastic performance @ their price range, it simply depends on what you need and what games you play.


----------



## PontiacGTX

Quote:


> Originally Posted by *Slay*
> 
> But what about OC? I've heard some bad things about the 390 and good things about 970.


The 290/390 OC'd should be about the same as an OC'd 970 at 1080p, unless the game is Nvidia-biased. But at 1440p or 4K it might be better than a 970 (mainly due to the VRAM situation on the 970).

http://www.guru3d.com/articles_pages/powercolor_radeon_r9_390_pcs_8gb_review,26.html
http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_970_oc_mini_itx_review,28.html


----------



## Artikbot

Quote:


> Originally Posted by *Noufel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Artikbot*
> 
> It should be a Tahiti Pro, so overclocking shouldn't be much of an issue.
> 
> Other than it also bumping power consumption generously, of course.
> 
> 
> 
> The 390 is a Hawaii Pro, and for OC you'll be targeting 1150-1200 MHz on the core with a voltage bump.

Absolutely right, my apologies. My memory isn't as good as it used to be for core names, lol


----------



## InsidiousBoot

Really it all comes down to software at this point: which technologies do you prefer?

Nvidia has MFAA, for example, which is quite unique and works well. TXAA is also quite lovely in certain games (mostly GTA V).

AMD is catching up in a lot of areas as well; can't say much more on that, however.

So the 390 has 8 gigs of VRAM, but you'll be fine with 3.5-4 gigs.

People also seem to forget Nvidia's cards have memory texture compression, which allows for more data storage, so that should even things out.

You can OC both fairly high.

So it's up to you really.


----------



## maxlimits

What about the new deal Nvidia revealed about GTX cards coming with Phantom Pain game codes? Would it be better to purchase a 970 over a 390 if they are roughly the same price?


----------



## aDyerSituation

390>970 all the way


----------



## Mad Pistol

Going to have to go with the 390 on this: 8GB of VRAM and a 512-bit memory bus.

Combine that with similar performance between the 970 and the 390, and the 390 is the better, more future-proof card, IMO.
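For a sense of what that 512-bit bus buys you, peak memory bandwidth follows directly from bus width and effective memory clock. A quick sketch, using the reference-spec clocks (assumed figures; partner cards vary):

```python
def peak_bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s: (bits per transfer * transfers per second) / 8 bits per byte."""
    return bus_width_bits * effective_clock_mhz * 1e6 / 8 / 1e9

# Assumed reference-spec effective GDDR5 clocks:
r9_390 = peak_bandwidth_gbps(512, 6000)   # 512-bit bus, 6.0 GHz effective
gtx_970 = peak_bandwidth_gbps(256, 7000)  # 256-bit bus, 7.0 GHz effective

print(f"R9 390:  {r9_390:.0f} GB/s")   # 384 GB/s
print(f"GTX 970: {gtx_970:.0f} GB/s")  # 224 GB/s (and only the fast 3.5 GB partition sees that)
```

The raw bandwidth numbers heavily favor the 390, which is part of why it tends to scale better at 1440p and 4K.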


----------



## xd9denz

Quote:


> Originally Posted by *Mad Pistol*
> 
> Going to have to go with the 390 on this: 8GB of VRAM and a 512-bit memory bus.
> 
> Combine that with similar performance between the 970 and the 390, and the 390 is the better, more future-proof card, IMO.


the perfect explanation and same idea here


----------



## Crouch

Definitely the 390


----------



## aDyerSituation

So let's recap.

390>970 for pure gaming performance. Why?

Pros:

-Beats the 970 handily in most benchmarks
-Doesn't just have 4 full gigs of GDDR5, but 8!
-Will probably age better with more VRAM, which seems to be the trend lately with AMD cards.

Cons:

-Power Draw
-Overclocking might be more limited when compared to the 970, but both fully overclocked I'm sure the 390 still wins

And this is coming from someone who has a 970.


----------



## raghu78

Quote:


> Originally Posted by *edo101*
> 
> ^don't listen to him. Go with the green. Green is always the best way to go for *future stability and ownership.*
> 
> Pascal is coming too. Buy a 970 now and then buy Pascal when it comes out. Don't wait for AMD to release a card. They might not even last that long


Ask Kepler owners about that. They are getting shafted in performance in the latest games.


----------



## Slay

Quote:


> Originally Posted by *edo101*
> 
> Not entirely sure what you are implying. I am just trying to apply good sound OCN logic to help OP make a solid OCN decision


However hard you're trolling, you are right on that one: some OCN users will just say "go Nvidia because Nvidia." Nice to see that the 390 is winning so far.


----------



## iinversion

The R9 390 does have 8GB of VRAM, but you will never be able to fully make use of that before you max the core. I believe it is more for marketing than anything. Since this discussion also focuses on 1080p, you will never ever need 8GB of VRAM. Since you are talking about 1080p and single-card setups, you should not even be talking about the R9 390: the 390 is, for the most part, just a factory-OC'd R9 290 + an extra 4GB of VRAM (which is completely useless, esp. at 1080p), with slight changes to power management, for $90 more.

If you OC them both to the max, I have no doubt the 970 will win more often than not, esp. at 1080p, and 3.5GB of VRAM will not be a problem at 1080p.

If you consider that the R9 290 is around $80-$90 cheaper than an R9 390, then it definitely represents a better value than the GTX 970. However, if you are comparing a 1500MHz+ 970 to a 1200MHz R9 290/390 @ 1080p only, then the 970 will definitely win more often than not.


----------



## aDyerSituation

Quote:


> Originally Posted by *iinversion*
> 
> The R9 390 does have 8GB of VRAM but you will never be able to fully make use of that before you max the core. I believe it is more for marketing than anything. Since this discussion also focuses on 1080p, you will never ever need 8GB of VRAM. Since you are talking about 1080p and single card setups you should not even be talking about the R9 390 as the 390 is just a factory OC'd R9 290 + extra 4GB of VRAM(which is completely useless esp. at 1080p.) for the most part with slight changes to power management for $90 more.
> 
> *If you OC them both to the max I have no doubt the 970 will win more often than not esp. at 1080p and 3.5GB of VRAM will not be a problem at 1080p.*
> 
> If you consider the R9 290 at around $80-$90 cheaper than a R9 390 then it definitely represents a better value than the GTX 970. However, if you are comparing a 1500MHz+ 970 to a 1200MHz R9 290/390 @ 1080p only then 970 will definitely win more often than not.


Not true at all. It's super easy to hit the VRAM wall in GTA V and heavily modded games. And the 390 is way closer to a 980 than a 970, overclocked or not.

Also, 3.5GB vs 4GB makes or breaks Ultra textures in SoM.


----------



## Yorkston

Quote:


> Originally Posted by *iinversion*
> 
> stuff


This

After OC, the 290/390 will likely fall behind the aftermarket 970s at 1080p, although the 390 is a much better dual-GPU candidate and handles high resolutions better. For anyone considering a single card now, I would recommend an aftermarket 290X over the 390. You can get a 290X PCS+, Tri-X OC, or Lightning LE for the same price as a 390 currently, although they will probably begin to disappear as production has switched over to the 300 series.


----------



## diggiddi

Quote:


> Originally Posted by *Yorkston*
> 
> This
> 
> After OC the 290/390 will likely fall behind the aftermarket 970s at 1080p, although the 390 is a much better dual-gpu candidate and handles high resolutions better. For anyone considering a single card now, I would recommend an aftermarket 290x over the 390. You can get a 290x PCS+, Tri-X OC, or *Lightning LE* for the same price as a 390 currently, although they will probably begin to disappear as production has switched over to the 300 series.


How well do those OC? I see one in my future.


----------



## HeadlessKnight

Quote:


> Originally Posted by *iinversion*
> 
> The R9 390 does have 8GB of VRAM but you will never be able to fully make use of that before you max the core. I believe it is more for marketing than anything. Since this discussion also focuses on 1080p, you will never ever need 8GB of VRAM. Since you are talking about 1080p and single card setups you should not even be talking about the R9 390 as the 390 is just a factory OC'd R9 290 + extra 4GB of VRAM(which is completely useless esp. at 1080p.) for the most part with slight changes to power management for $90 more.
> 
> If you OC them both to the max I have no doubt the 970 will win more often than not esp. at 1080p and 3.5GB of VRAM will not be a problem at 1080p.
> 
> If you consider the R9 290 at around $80-$90 cheaper than a R9 390 then it definitely represents a better value than the GTX 970. However, if you are comparing a 1500MHz+ 970 to a 1200MHz R9 290/390 @ 1080p only then 970 will definitely win more often than not.


CoD AW, SoM, modded Skyrim, Far Cry 4, AC:U, and GTA V can all exceed 3.5 GB at 1080p and make the 970 stutter pretty badly due to its uneven memory configuration. 8 GB is not useless at 1080p as you are implying. The main reason I got rid of the 780 Ti from my main rig was the 3 GB, and guess what: the extra .5 GB the GTX 970 has over the 780 Ti is not much better. Give it a few months and it is going to suffer just like the 780 Ti; you won't be able to turn on Ultra textures in games.

While the 8 GB might not be fully utilized, the R9 390 is at least a much safer option than the 970, which has barely enough VRAM to run current games' textures at High-Ultra at 1080p with 80%+ VRAM utilization.


----------



## iinversion

Quote:


> Originally Posted by *aDyerSituation*
> 
> Not true at all. It's super easy to hit the vram wall in GTA V and heavily modded games. And the 390 is way closer to a 980 than a 970. Overclocked or not.
> 
> Also, 3.5gb vs 4gb makes or breaks ultra textures in SoM


OC vs. OC, the 970 wins pretty much every time against the R9 290. Is it worth the extra $100? In terms of bang/buck, no, probably not. The 970 at the same price is definitely a better deal than the R9 390.

Why would it change with a rebranded card just because the OC is higher from the factory and it has an extra 4GB of VRAM?
Quote:


> Originally Posted by *diggiddi*
> 
> How well do those OC? I see one in my future.


MSI LE edition cards are not binned. LE basically takes the Lightning cooler and Lightning PCB and slaps them on a non-binned GPU. I would expect average OC ability, but the cooler itself is pretty good.
Quote:


> Originally Posted by *HeadlessKnight*
> 
> CoD AW, SoM, Modded Skyrim, Far Cry 4, AC:U & GTA V all can get 3.5 GB+ at 1080p and make the 970 stutter pretty badly due to its uneven memory configuration. 8 GB is not useless at 1080p as you are implying. The main reason why I got rid of the 780 Ti from my main rig was the 3 GB and guess what that extra .5 GB the GTX 970 has over the 780 Ti is not much better, give it a few months ahead and it is going to suffer just like the 780 Ti, you won't be able to turn on Ultra textures in games.


So now people are suggesting 8GB of VRAM for 1080p just so they can max out AA all the way? I know the 3.5GB config can make the 970 stutter, but it is easily avoidable.

People act like the extra 512MB of VRAM from 3.5GB to 4GB is going to make or break a game.


----------



## aDyerSituation

Quote:


> Originally Posted by *iinversion*
> 
> OC vs. OC, the 970 wins pretty much every time against the R9 290. Is it worth the extra $100? In terms of bang/buck, no, probably not. The 970 at the same price is definitely a better deal than the R9 390.


The 390 is around 5% faster than a 290X. Maybe you should go read some reviews.

Also, you avoided the vram problem.

Oh and your "easily avoidable" argument doesn't make any sense when a 970 and a 390 cost around the same, and the 390 doesn't limit you graphically.


----------



## bigkahuna360

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iinversion*
> 
> The R9 390 does have 8GB of VRAM but you will never be able to fully make use of that before you max the core. I believe it is more for marketing than anything. Since this discussion also focuses on 1080p, you will never ever need 8GB of VRAM. Since you are talking about 1080p and single card setups you should not even be talking about the R9 390 as the 390 is just a factory OC'd R9 290 + extra 4GB of VRAM(which is completely useless esp. at 1080p.) for the most part with slight changes to power management for $90 more.
> 
> If you OC them both to the max I have no doubt the 970 will win more often than not esp. at 1080p and 3.5GB of VRAM will not be a problem at 1080p.
> 
> If you consider the R9 290 at around $80-$90 cheaper than a R9 390 then it definitely represents a better value than the GTX 970. However, if you are comparing a 1500MHz+ 970 to a 1200MHz R9 290/390 @ 1080p only then 970 will definitely win more often than not.
> 
> 
> 
> CoD AW, SoM, Modded Skyrim, Far Cry 4, AC:U & GTA V all can get 3.5 GB+ at 1080p and make the 970 stutter pretty badly due to its uneven memory configuration. 8 GB is not useless at 1080p as you are implying. The main reason why I got rid of the 780 Ti from my main rig was the 3 GB and guess what that extra .5 GB the GTX 970 has over the 780 Ti is not much better, give it a few months ahead and it is going to suffer just like the 780 Ti, you won't be able to turn on Ultra textures in games.
> 8 GB while might not be fully utilized but at least the R9 390 is a much safer option than the 970 which has barely enough VRAM to run current games textures at High-Ultra at 1080p with 80%+ VRAM utilization.

I dunno, you did just quote three trash games, one game with huge memory leaks, and one that is trash for optimization.

I will argue with you until the end of time that you can do just fine on 2GB VRAM and 3GB VRAM at 1440p, even more so on 1080p.


----------



## iinversion

Quote:


> Originally Posted by *aDyerSituation*
> 
> the 390 is faster by 5% than a 290x. Maybe you should go read some reviews.
> 
> Also, you avoided the vram problem.


Yes. Do you know why?

Because it has 1 GHz faster memory and a 50 MHz faster core at stock.

Increase the 290X memory by 1 GHz and the core by 50 MHz and the 290X will be faster. I don't know why this is so hard for some people to wrap their heads around.
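Taking the deltas in this post at face value, the uplift is easy to quantify. (The 290X baseline clocks here, roughly 1000 MHz core and 5.0 GHz effective memory, are my assumption, not from this thread.)

```python
core_290x_mhz = 1000   # assumed 290X stock core clock
mem_290x_ghz = 5.0     # assumed 290X stock effective memory clock

core_delta_mhz = 50    # deltas cited in the post
mem_delta_ghz = 1.0

core_gain = core_delta_mhz / core_290x_mhz  # relative core-clock increase
bw_gain = mem_delta_ghz / mem_290x_ghz      # bandwidth scales linearly with memory clock (same 512-bit bus)

print(f"core: +{core_gain:.0%}, memory bandwidth: +{bw_gain:.0%}")  # core: +5%, memory bandwidth: +20%
```

A 5% core bump and 20% more bandwidth is roughly consistent with the ~5% gap reviewers measured between the two cards.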


----------



## HeadlessKnight

Quote:


> Originally Posted by *iinversion*
> 
> So now people are suggesting 8GB of VRAM for 1080p just so they can max out AA all the way? I know the 3.5GB config can make the 970 stutter, but it is easily avoidable.
> 
> People act like the extra 512MB of VRAM from 3.5GB to 4GB is going to make or break a game.


SoM can get over 4 GB at 1080p with Ultra textures and no AA. And the 970 could easily average 60 fps with Ultra textures, provided it had enough VRAM. Most people buy $330+ cards to max out their games, textures included, and stuttering in almost every game because the card cannot max out the textures is a real issue, especially at a mainstream resolution like 1080p.
Quote:


> Originally Posted by *bigkahuna360*
> 
> I dunno, you did just quote three trash games, one game with huge memory leaks, and one that is trash for optimization.
> 
> I will argue with you until the end of time that you can do just fine on 2GB VRAM and 3GB VRAM at 1440p, even more so on 1080p.


Whether a game is trash or not is your opinion, and it adds zero value to the discussion. The fact remains that, of two cards in the same price range, one can handle higher texture settings than the other in that "trash" or "unoptimized" game. At least those "trash" or "unoptimized" games scale with hardware. Many people blame developers for making poorly optimized games, as if they were developers themselves and knew for a fact that the games are unoptimized. We have games that are "poorly optimized" (your claim) and we have hardware on the market: one GPU handles higher textures better thanks to its VRAM capacity, and one cannot handle them well due to its lack of VRAM. If the first card did it right at the same price, then maybe the second card is the overpriced trash, not the game.


----------



## aDyerSituation

Quote:


> Originally Posted by *iinversion*
> 
> Yes. Do you know why?
> 
> Because it has 1 GHz faster memory and a 50 MHz faster core at stock.
> 
> Increase the 290X memory by 1 GHz and the core by 50 MHz and the 290X will be faster. I don't know why this is so hard for some people to wrap their heads around.


Dude, your fanboy is showing so hard. You tried to compare a 290 to a 970.
And it's not just the clocks of the card; the drivers have also improved significantly. IDK if the 290/X has or will get these drivers yet, but maybe then you will have an argument.

Just admit AMD finally won a price bracket and move on.


----------



## diggiddi

Quote:


> Originally Posted by *aDyerSituation*
> 
> the 390 is faster by 5% than a 290x. Maybe you should go read some reviews.
> 
> Also, you avoided the vram problem.
> 
> Oh and your "easily avoidable" argument doesn't make any sense when a 970 and a 390 cost around the same, and the 390 doesn't limit you graphically.


The problem with those reviews is that they are using different drivers for the 290X and the 390, so they are not accurate from that standpoint, but I believe the 390 will clock slightly higher than the 290X.


----------



## iinversion

Quote:


> Originally Posted by *aDyerSituation*
> 
> Dude, your fanboy is showing so hard. You tried to compare a 290 to a 970.
> And it's not just the clocks of the card; the drivers have also improved significantly. IDK if the 290/X has or will get these drivers yet, but maybe then you will have an argument.
> 
> Just admit AMD finally won a price bracket and move on.


If you actually read my first post in this thread I mentioned that the R9 290 was a better value than the GTX 970. Of course I compared the 290 to the 970 when it is completely relevant considering the R9 390 is a REBRAND of the R9 290 for $90 more! Of course, it does have 4GB extra VRAM and "better" power management. But unless you are getting into multiple cards and using a 4K resolution I will never see the point in having 8GB of VRAM.

The R9 390 is a joke and an awful value compared to the R9 290 and the GTX 970 for 1080p gaming. The R9 3xx cards are not winning any price bracket.

Save $90 and OC the card yourself to get the same results with the R9 290, or just buy a GTX 970 and have .5GB less VRAM but a slightly faster GPU when you account for max OC.


----------



## aDyerSituation

Quote:


> Originally Posted by *iinversion*
> 
> If you actually read my first post in this thread I mentioned that the R9 290 was a better value than the GTX 970. Of course I compared the 290 to the 970 when it is completely relevant considering the R9 390 is a REBRAND of the R9 290 for $90 more! Of course, it does have 4GB extra VRAM and "better" power management. But unless you are getting into multiple cards and using a 4K resolution I will never see the point in having 8GB of VRAM.
> 
> The R9 390 is a joke and an awful value compared to the R9 290 and the GTX 970 for 1080p gaming. The R9 3xx cards are not winning any price bracket.
> 
> Save $90 and OC the card yourself and get the same results with the R9 290 or just buy a GTX 970 and have .5GB less VRAM but a slightly faster GPU.


You are delusional. $330 R9 390 vs. $340 GTX 970: that is the comparison. This thread has nothing to do with the 290. You are just trying to find an angle to come at AMD with. It's quite sad, really.
And how is it a bad value compared to the 970? Lmao, you have to be trolling.

For the same price you get:
-A faster card in almost every game
-A card that will let you play your games with ultra textures without stuttering
-A card that doesn't need to be overclocked to show its "true potential" against its rivals

At the cost of:
-power draw


----------



## iinversion

Quote:


> Originally Posted by *aDyerSituation*
> 
> You are delusional. $330 R9 390 vs. $340 GTX 970: that is the comparison. This thread has nothing to do with the 290. You are just trying to find an angle to come at AMD with. It's quite sad, really.
> And how is it a bad value compared to the 970? Lmao, you have to be trolling.
> 
> For the same price you get:
> -A faster card in almost every game
> -A card that will let you play your games with ultra textures without stuttering
> -A card that doesn't need to be overclocked to show its "true potential" against its rivals
> 
> At the cost of:
> -power draw


Why would anyone spend an extra $90 to get a 390 over a 290 for 1080p when they are the same freaking card apart from VRAM and stock clocks?

You are ridiculous.


----------



## aDyerSituation

Quote:


> Originally Posted by *iinversion*
> 
> Why would anyone spend an extra $90 to get a 390 over a 290 for 1080p when they are the same freaking card apart from VRAM and stock clocks?
> 
> You are ridiculous.


For one, they aren't the same card until the 290 gets the same drivers and support. Also, personally, I would pay the $90 so I could CrossFire and have the horsepower to use the extra VRAM. But yes, most users don't need it.

And for two, the argument has nothing to do with the 290 being a better value. It has to do with the 390 vs. the 970, and the 970 loses.

You're speaking to a 970 owner, and I would pay $90 to switch to a 390, FWIW.


----------



## bigkahuna360

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iinversion*
> 
> So now people are suggesting 8GB of VRAM for 1080p just so they can max out AA all the way? I know the 3.5GB config can make the 970 stutter, but it is easily avoidable.
> 
> People act like the extra 512MB of VRAM from 3.5GB to 4GB is going to make or break a game.
> 
> 
> 
> SoM can get over 4 GB at 1080p with Ultra textures without AA. And the 970 can easily get 60 fps average with Ultra Textures provided it has enough VRAM. Most people buy $330+ cards to max out their games including textures, and have stutter in almost every game because the card cannot max out the textures is an issue especially at a mainstream resolution like 1080p.
> Quote:
> 
> 
> 
> Originally Posted by *bigkahuna360*
> 
> I dunno, you did just quote three trash games, one game with huge memory leaks, and one that is trash for optimization.
> 
> I will argue with you until the end of time that you can do just fine on 2GB VRAM and 3GB VRAM at 1440p, even more so on 1080p.
> 
> 
> Whether a game is trash or not is your opinion, and it adds zero value to the discussion. The fact remains that, of two cards in the same price range, one can handle higher texture settings than the other in that "trash" or "unoptimized" game. At least those "trash" or "unoptimized" games scale with hardware. Many people blame developers for making poorly optimized games, as if they were developers themselves and knew for a fact that the games are unoptimized. We have games that are "poorly optimized" (your claim) and we have hardware on the market: one GPU handles higher textures better thanks to its VRAM capacity, and one cannot handle them well due to its lack of VRAM. If the first card did it right at the same price, then maybe the second card is the overpriced trash, not the game.

So we should all have just bought the overpriced cards that come with double the VRAM so devs don't have to worry about how much time they put into their game?

Sure, having extra VRAM is nice, but why should we pay an extra $100 for the same card so devs can be lazy?

EDIT: I guess I should have specified that I was talking about the 290X vs. the 390.


----------



## aDyerSituation

Quote:


> Originally Posted by *bigkahuna360*
> 
> So we should all have just bought the overpriced cards that come with double the VRAM so devs don't have to worry about how much time they put into their game?
> 
> Sure, having extra VRAM is nice, but why should we pay an extra $100 for the same card so devs can be lazy?


You and inversion should sharpen your pencil because you are missing the point!


----------



## raghu78

Quote:


> Originally Posted by *iinversion*
> 
> Why would anyone spend an extra $90 to get a 390 over a 290 for 1080p when they are the same freaking card apart from VRAM and stock clocks? You are ridiculous.


Again, the main improvement in the R9 390 and R9 390X is the memory controller, which can now run at 6 GHz and overclock to 6.8 GHz. The extra memory bandwidth is definitely helping the R9 390 edge out the GTX 970 OC, and even the R9 290X clock for clock.

http://www.pcper.com/reviews/Graphics-Cards/Sapphire-Nitro-Radeon-R9-390-8GB-Review/3DMark-Power-and-Conclusions

"Before knowing what pricing AMD had decided on for these cards I assumed, as did most others, that the R9 390 would perform nearly identically to the R9 290. But that's not true, as *the Sapphire Nitro R9 390 is anywhere from 10-15% faster than the XFX R9 290 DD and matches the performance of the ASUS R9 290X DirectCU II card at stock settings*. Clearly the added clock speed and *(more importantly) the increased memory clock speed have been able to juice up the Hawaii GPU, now called Grenada,* to better compete with the NVIDIA GeForce lineup."

The Sapphire R9 390 Nitro (1010 MHz) is able to match and even slightly edge out the R9 290X DC2 (1050 MHz). So even on a core clock-for-clock comparison the R9 390 is faster than the R9 290X, let alone the R9 290.

Here is a video review of a GTX 970 OC vs. an R9 390 (both stock and max overclocked). The R9 390 wins in both cases, and with the massive 8 GB of VRAM it is a better buy.

https://www.youtube.com/watch?v=k9cKZiJw6Pk
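To put the 6.0 -> 6.8 GHz memory overclock mentioned above in numbers: on a fixed 512-bit bus, bandwidth scales linearly with effective memory clock, so the uplift works out as:

```python
BUS_BITS = 512                   # Hawaii/Grenada memory bus width

def bandwidth_gbps(mem_ghz_effective):
    # bits per transfer * gigatransfers per second / 8 bits per byte = GB/s
    return BUS_BITS * mem_ghz_effective / 8

stock = bandwidth_gbps(6.0)      # stock 6.0 GHz effective
oc = bandwidth_gbps(6.8)         # overclocked 6.8 GHz effective
print(f"{stock:.0f} -> {oc:.0f} GB/s (+{oc / stock - 1:.0%})")  # 384 -> 435 GB/s (+13%)
```

A 13% bandwidth bump from memory OC alone is substantial for a bandwidth-hungry GPU like Hawaii/Grenada.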


----------



## bigkahuna360

Quote:


> Originally Posted by *aDyerSituation*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bigkahuna360*
> 
> So we should all have just bought the overpriced cards that come with double the VRAM so devs don't have to worry about how much time they put into their game?
> 
> Sure, having extra VRAM is nice, but why should we pay an extra $100 for the same card so devs can be lazy?
> 
> You and inversion should sharpen your pencil because you are missing the point!

You must not have read what I was talking about. I was arguing against his choice of games as examples, and the need for 8GB of VRAM at such a low resolution.


----------



## aDyerSituation

Quote:


> Originally Posted by *bigkahuna360*
> 
> You must not have read what I was talking about. I was arguing against his choice of games as examples, and the need for 8GB of VRAM at such a low resolution.


So basically, now that we have established the 970's VRAM problem IS a problem, it's now the game developers' fault that the card doesn't have the advertised VRAM.

Hmm

Regardless of what you believe is optimized or unoptimized, there are two cards from two companies in the same price bracket and one of them doesn't have this problem.


----------



## bigkahuna360

Quote:


> Originally Posted by *aDyerSituation*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bigkahuna360*
> 
> You must not have read what I was talking about. I was arguing the choices of games that he decided to use as examples and the need for 8GBs VRAM at such a low resolution.
> 
> 
> 
> So basically, now that we have established the 970's VRAM problem IS a problem, it's now the game developers' fault that the card doesn't have the advertised VRAM.
> 
> Hmm
Click to expand...

Way to twist my words when I was never even talking about the 970.

Calm down, don't get too ahead of yourself.


----------



## white owl

I love how people will let AMD sell them the same card twice if they slap on more RAM. Oh, I forgot, they made drivers for it. It's better now.
If they made those drivers for the 290, who would buy the 390? Don't say BETA either; I won't buy your card and beta test the driver too.
The 390 looks better at first, until you realize there is no room to OC the thing. People have a hard time hitting the thermal limit on the 970.
And the whole 3.5 GB VRAM thing... god almighty. My 780 SC worked just fine. People do 1440p with them!
SoM? That little bar has nothing to do with the RAM it will use, simply how much it will allocate. If you give Windows more RAM, it does the same thing.
Textures and AA? I can tank my 980 too. I could tank a Titan X with most AAA titles.

My point?

The 390 is on the ragged edge of what it can do without a fire or a crash. The 970 isn't.

Just let Hawaii die so AMD can make a better GPU.

BTW: I love AMD

The 290 is awesome now that the price is amazing. Can't wait for Zen.


----------



## raghu78

Quote:


> Originally Posted by *white owl*
> 
> I love how people will let AMD sell them the same card twice if they slap on more RAM. Oh, I forgot, they made drivers for it. It's better now. If they made those drivers for the 290, who would buy the 390? Don't say BETA either; I won't buy your card and beta test the driver too. The 390 looks better at first, until you realize there is no room to OC the thing. People have a hard time hitting the thermal limit on the 970. And the whole 3.5 GB VRAM thing... god almighty. My 780 SC worked just fine. People do 1440p with them! SoM? That little bar has nothing to do with the RAM it will use, simply how much it will allocate. If you give Windows more RAM, it does the same thing. Textures and AA? I can tank my 980 too. I could tank a Titan X with most AAA titles.
> 
> My point?
> 
> The 390 is on the ragged edge of what it can do without a fire or a crash. The 970 isn't.
> 
> Just let Hawaii die so AMD can make a better GPU.
> 
> BTW: I love AMD
> 
> The 290 is awesome now that the price is amazing. Can't wait for Zen.


Why are you spreading lies? Here are a few reviews where the R9 390 is stable at stock and also overclocks solidly. The custom coolers also keep noise low. All you need is a little bit of extra voltage: +100mV will get you to 1150-1200 MHz.

https://www.youtube.com/watch?v=k9cKZiJw6Pk
http://www.guru3d.com/articles_pages/powercolor_radeon_r9_390_pcs_8gb_review,26.html

Maybe you should check out the R9 390 / R9 390X owners' thread before posting such false comments. At stock voltage users are getting 1075-1100 MHz, and with a bit of extra voltage, 1150-1200 MHz.

http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club


----------



## hansen6

Any advice regarding which card to get for 1600p resolution? Can't seem to find many reviews for this resolution. I play quite a variety of games, both old and new.


----------



## white owl

Quote:


> Originally Posted by *raghu78*
> 
> Why are you spreading lies? Here are a few reviews where the R9 390 is stable at stock and also overclocks solidly. The custom coolers also keep noise low. All you need is a little bit of extra voltage: +100mV will get you to 1150-1200 MHz.
> 
> https://www.youtube.com/watch?v=k9cKZiJw6Pk
> http://www.guru3d.com/articles_pages/powercolor_radeon_r9_390_pcs_8gb_review,26.html
> 
> Maybe you should check out the R9 390 / R9 390X owners' thread before posting such false comments. At stock voltage users are getting 1075-1100 MHz, and with a bit of extra voltage, 1150-1200 MHz.
> 
> http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club


You're right! They can be overclocked; they have made improvements.
You are right on that front.
I admit, I did troll a bit to see who would post something good.

But could a 290/X have a driver made that would equal the 390/X's performance?
Is 8 GB just a way to get people to buy the card again? If so, it worked.


----------



## caenlen

at 1440p I take the 390 every time.


----------



## Slay

Quote:


> Originally Posted by *raghu78*
> 
> Again, the main improvement in the R9 390 and R9 390X is the memory controller, which can now run at 6 GHz and overclock to 6.8 GHz. The extra memory bandwidth is definitely helping the R9 390 edge out the GTX 970 OC, and even the R9 290X clock for clock.
> 
> http://www.pcper.com/reviews/Graphics-Cards/Sapphire-Nitro-Radeon-R9-390-8GB-Review/3DMark-Power-and-Conclusions
> 
> "Before knowing what pricing AMD had decided on for these cards I assumed, as did most others, that the R9 390 would perform nearly identically to the R9 290. But that's not true, as *the Sapphire Nitro R9 390 is anywhere from 10-15% faster than the XFX R9 290 DD and matches the performance of the ASUS R9 290X DirectCU II card at stock settings*. Clearly the added clock speed and *(more importantly) the increased memory clock speed have been able to juice up the Hawaii GPU, now called Grenada,* to better compete with the NVIDIA GeForce lineup."
> 
> The Sapphire R9 390 Nitro (1010 MHz) is able to match and even slightly edge out the R9 290X DC2 (1050 MHz). So even in a core clock-for-clock comparison the R9 390 is faster than the R9 290X, let alone the R9 290.
> 
> Here is a video review of the GTX 970 OC vs the R9 390 (both stock and max overclocked). The R9 390 wins in both cases, and with the massive 8 GB of VRAM it is the better buy.
> 
> https://www.youtube.com/watch?v=k9cKZiJw6Pk


And that is exactly why I'm asking only about the 390. All benchmarks that I've seen to date confirm this.


----------



## Mad Pistol

Quote:


> Originally Posted by *caenlen*
> 
> at 1440p I take the 390 every time.


Me too.

I'm not saying the GTX 970 is a bad card by any means. My wife has one, and I've tested it extensively. It's a great card.

However, with Nvidia not optimizing Kepler for future titles anymore, I can't really recommend a Maxwell GPU outside of the GTX 980 Ti. AMD has continued to optimize their drivers for existing cards to the point that they are far faster than they were at their release. The R9 390 is proof of this.

Also, the 390 and 390x are refreshed cards, similar to the GTX 770 vs. 680 (faster memory, faster core). Some reviewers have pointed out that the R9 390 and 390x overclock better than their 290 and 290x brethren. This is important to note and squashes the notion that the 390 and 390x are simply rebrands. AMD changed something, and it yielded higher performance compared to the previous generation.

Also, 8GB of VRAM as standard doesn't hurt either. It's completely unnecessary in most cases, but in games like Shadow of Mordor or GTA V, I don't think anyone is complaining about the extra VRAM on the newer cards.

TL;DR, I own a GTX 780 and a 970, and if I were buying a $300-350 video card right now, it would be the R9 390.








Quote:


> Originally Posted by *hansen6*
> 
> Any advice regarding which card to get for 1600p resolution? Can't seem to find many reviews for this resolution. I play quite a variety of games, both old and new.


Within the scope of what we're talking about, I'd get the R9 390.

If money is no object, the GTX 980 Ti is the one to get ($650ish)


----------



## iinversion

I have yet to see one benchmark where they tested a 290(X)/390(X) with the same core clock AND memory clock.

Every single review I've seen leaves the 3xx card memory 1GHz faster and only changes core clocks. This just skews the comparison.

Also lol 6.8GHz OC on memory is nothing Hawaii can't already do.


----------



## Yorkston

Quote:


> Originally Posted by *iinversion*
> 
> I have yet to see one benchmark where they tested a 290(X)/390(X) with the same core clock AND memory clock.
> 
> Every single review I've seen leaves the 3xx card memory 1GHz faster and only changes core clocks. This just skews the comparison.
> 
> Also lol 6.8GHz OC on memory is nothing Hawaii can't already do.


Does anyone here actually have a 390?

I have a 290 that does 1190/1600 at +100mV; I'll upgrade to the 15.7 drivers later tonight. Give me target clocks and benchmarks to run, let's just settle this idiocy instead of debating it for another 6 pages.

edit: also [email protected]


----------



## Slay

Quote:


> Originally Posted by *Yorkston*
> 
> Does anyone here actually have a 390?
> 
> I have a 290 that does 1190/1600 at +100mv, i'll upgrade to the 15.7 drivers later tonight. Give me target clocks and what benchmarks to run, lets just settle this idiocy instead of debating it for another 6 pages.
> 
> edit: also [email protected]


Match 390 stock if possible.


----------



## Yorkston

I'll run my tests at 1010/1500 (the stock config of the Sapphire model). Anyone with a 390 or 290X, feel free to jump in as well. Planning on:

3dmark (regular, i'm cheap and only have the free version)
heaven max 1080
valley extremeHD
ff14 bench 1080

Will probably do a round of 1440p later.


----------



## PontiacGTX

Spoiler: ...



Quote:


> Originally Posted by *edo101*
> 
> Ask yourself, do you really want a Refresh? Everyone knows refreshes suck. Its not new tech. Every GTX card is tenderly cared for in the assembly line. AMD just does refreshes


what does that have to do with a single-card setup?
Quote:


> Originally Posted by *edo101*
> 
> ^don't listen to him. Go with the green. Green is always the best way to go for future stability and ownership.
> 
> Pascal is coming too. Buy a 970 now and then buy Pascal when it comes out. Don't wait for AMD to release a card. They might not even last that long


what about those Nvidia drivers that were going to help with the crippled performance, and then only improved 3 games and had crashes?








Quote:


> Originally Posted by *edo101*
> 
> Nooooo thats not true. Nvidia represents value and loyalty.


which doesn't help any customer; it has been proven that Nvidia instead decides against their customers
Quote:


> Plus c'mon new tech.


are you talking about the 2008-era GDDR5 memory that is also bottlenecked on the 970?







Quote:


> You have to abandon old tech to move on to new tech.


then the way forward is HBM
Quote:


> Why do you think they don't do refreshes anymore.


because the competition didn't have something better, just something equal
Quote:


> They also release more Game Ready drivers to patch stuff after users complain.


what about the still-crippled performance in older games?
Quote:


> Originally Posted by *bigkahuna360*
> 
> I dunno, you did just quote three trash games, one game with huge memory leaks, and one that is trash for optimization.
> 
> I will argue with you until the end of time that you can do just fine on 2GB VRAM and 3GB VRAM at 1440p, even more so on 1080p.


2GB definitely isn't enough at 1080, let alone 1440


Quote:


> Originally Posted by *hansen6*
> 
> Any advice regarding which card to get for 1600p resolution? Can't seem to find many reviews for this resolution. I play quite a variety of games, both old and new.


get an R9 290/X or R9 390, or wait for the Fury Nano


----------



## Yorkston

Gigabyte R9 290 @ 1010/1500
3570k @ 4.5
RAM @ 1866

Firestrike:


Spoiler: Warning: Spoiler!



http://www.3dmark.com/3dm/7822025
Not sure why I keep getting the timing error



Heaven 1080:


Spoiler: Warning: Spoiler!







Heaven 1440:


Spoiler: Warning: Spoiler!







Valley ExtremeHD (1080):


Spoiler: Warning: Spoiler!







FF14 Heavensward Bench (1080 Maximum Preset):


Spoiler: Warning: Spoiler!


----------



## Slay

Quote:


> Originally Posted by *Yorkston*
> 
> Gigabyte R9 290 @ 1010/1500


Thanks for that, will add to OP later. 390 appears to be faster according to
http://www.bjorn3d.com/wp-content/uploads/2015/06/heaven.jpg?8c61ba
and
http://pclab.pl/zdjecia/artykuly/chaostheory/2015/06/radeon_390/charts/heaven4_fps.png,
http://pclab.pl/zdjecia/artykuly/chaostheory/2015/06/radeon_390/charts/valley1_fps.png

however, the second set of results was done on a Core i7-4770K @ 4.5 GHz, so there's that.


----------



## PontiacGTX

Quote:


> Originally Posted by *Slay*
> 
> Thanks for that, will add to OP later. 390 appears to be faster according to
> http://www.bjorn3d.com/wp-content/uploads/2015/06/heaven.jpg?8c61ba
> and
> http://pclab.pl/zdjecia/artykuly/chaostheory/2015/06/radeon_390/charts/heaven4_fps.png,
> http://pclab.pl/zdjecia/artykuly/chaostheory/2015/06/radeon_390/charts/valley1_fps.png
> 
> however the second results have been done on Core i7-4770K @ 4,5 GHz so there's that.


those synthetic tests are quite CPU-bound, so memory speed and CPU core count/architecture might change those scores


----------



## Slay

And that is why we should take these tests with a grain of salt. After all, Grenada is Hawaii 1:1; no improvements have been made.


----------



## Yorkston

From what I can find so far:

http://www.eteknix.com/sapphire-nitro-r9-390-8gb-graphics-card-review/3/
http://www.guru3d.com/articles_pages/powercolor_radeon_r9_390_pcs_8gb_review,23.html
http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390-nitro-8gb-review/8/
http://www.pcper.com/reviews/Graphics-Cards/Sapphire-Nitro-Radeon-R9-390-8GB-Review/3DMark-Power-and-Conclusions
http://www.overclockersclub.com/reviews/powercolor_pcs_r9_390_8gb/12.htm

Outside of Eteknix's Valley test failing spectacularly at minimum framerates, the 390 looks to average around 4-5% faster than the 290 at the same clocks. It would be nice to get an apples-to-apples comparison, but most people here with a 390 probably aren't rocking Ivy Bridge at this point.


----------



## iinversion

Quote:


> Originally Posted by *Yorkston*
> 
> From what I can find so far:
> 
> http://www.eteknix.com/sapphire-nitro-r9-390-8gb-graphics-card-review/3/
> http://www.guru3d.com/articles_pages/powercolor_radeon_r9_390_pcs_8gb_review,23.html
> http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390-nitro-8gb-review/8/
> http://www.pcper.com/reviews/Graphics-Cards/Sapphire-Nitro-Radeon-R9-390-8GB-Review/3DMark-Power-and-Conclusions
> http://www.overclockersclub.com/reviews/powercolor_pcs_r9_390_8gb/12.htm
> 
> Outside of Eteknix's Valley test failing spectacularly at minimum framerates, the 390 looks to average around 4-5% faster than the 290 at same clocks. It would be nice to get an apples to apples comparison but most people here with a 390 probably aren't rocking ivy bridge at this point.


Even if someone had a Haswell chip, they could clock it 200-300MHz lower than your chip and turn HT off if necessary to get the closest comparison.

So what we have so far is 5% faster in the best case, and probably not even that, due to CPU differences.

Compare these 3DMark scores to what you posted previously:

http://www.3dmark.com/fs/5434768 - a slightly higher-clocked R9 390 with a 4690K @ 4.5GHz. The physics score is practically the same, and the graphics score falls within 1-2%, inside the margin of error.

http://www.3dmark.com/fs/5410338 - here's another one. This time the clocks are identical but the CPU is stock. Ignoring the overall/physics scores, the graphics score is yet again within 1-2%, inside the margin of error.

Also note that both of the above 3DMark runs used the same drivers as your R9 290 run.
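For reference, the "within 1-2%" claim is just this arithmetic (a quick Python sketch; the scores below are made-up placeholders, not the linked runs):

```python
def pct_diff(a: float, b: float) -> float:
    """Percent difference of score b relative to baseline score a."""
    return (b - a) / a * 100.0

# Hypothetical Fire Strike graphics scores (placeholders, not the linked runs):
r9_290_gfx = 11800
r9_390_gfx = 11950

delta = pct_diff(r9_290_gfx, r9_390_gfx)
print(f"390 vs 290 graphics score: {delta:+.2f}%")
```

Reviewers generally treat anything inside roughly ±2% on 3DMark as run-to-run variance, so scores that close are effectively a tie.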


----------



## pereira2088

Hey guys, sorry to dig up this topic.

I'm gonna build a new rig - just waiting for the new Skylake (going for an i5 6500 + H170 motherboard + 8GB DDR4) - and I have doubts about the graphics card.

My price options are:

-MSI GTX970 Armor OC 4GB GD5 (GTX970-4GD5T-OC) - 360€
-Gigabyte GTX970 G1 GAMING 4GB GD5 - 400€
-MSI R9 390 Gaming 8GB GD5 - 400€
-Asus GTX970 STRIX OC 4GB GD5 - 400€
-Asus R9 390 DirectCU II 8GB GD5 - 400€

(there are other 970s and 390s (EVGA, for example), but all around the 400€ mark)

the goal will be to play at [email protected]

If at first I was leaning toward the GTX 970 Strix, now I'm considering the R9 390.

I have seen a number of reviews where, generally speaking, at this resolution the 390 performs slightly better in some games while the 970 does better in others.
However, in power consumption Nvidia clearly wins.
Some 970s go from 150 to 250W, and the 390s from 225 to 300W. It's not "that" big of a difference - but it is still the difference between getting a 550W PSU and a 650W one.

Then I get mixed info. For the G1, I see reviews (Guru3D) where on load it hits a 170W max TDP, yet on overclock.com it has a 280W max power limit. It's hard to be specific since some report GPU power and others full-system wattage. Still, the 390 draws 30% to 50% more than the Nvidia.

Finally, I'm reading the new "info" about the 390 and DX12: that it gets much better performance than the Nvidia cards. Of course, we still don't know anything (or much) about the Nvidia side for DX12.

So basically, at the same price, it's a "fight" between power consumption and slightly better performance and/or future-proofing (still not knowing the Nvidia side).

So, 970 or 390?

Noise is no problem; it's going into a Fractal R5 case.
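On the 550W-vs-650W question, here's a rough sizing sketch (Python; every wattage below is a ballpark assumption, not a measurement):

```python
# Rough PSU sizing: add up worst-case component draw, then keep
# ~40% headroom so the PSU runs in its efficient load range.

def recommended_psu(component_watts: dict, headroom: float = 0.4) -> float:
    """Suggested PSU rating in watts, given per-component draw estimates."""
    total = sum(component_watts.values())
    return total * (1 + headroom)

# Assumed figures for this build (TDPs / typical board power, not measured):
build = {
    "i5 6500": 65,
    "R9 390": 275,
    "motherboard+RAM": 50,
    "drives+fans": 30,
}

print(f"Estimated load: {sum(build.values())} W")
print(f"Suggested PSU:  {recommended_psu(build):.0f} W")
```

With those assumptions the 390 build lands just under 600W suggested, which is why a quality 550W unit is borderline and a 650W one is the comfortable pick.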


----------



## Piccolo55

Quote:


> Originally Posted by *pereira2088*
> 
> hey guys, sorry to dig up this topic.
> 
> i'm gonna build a new rig - just waiting for the new skylake (going for a i5 6500 + h170 motherboard + 8gb ddr4 ) - and i have doubts for the graphic card.
> 
> my price options are
> 
> -MSI GTX970 Armor OC 4GB GD5 (GTX970-4GD5T-OC) - 360€
> -Gigabyte GTX970 G1 GAMING 4GB GD5 - 400€
> -MSI R9 390 Gaming 8GB GD5 - 400€
> -Asus GTX970 STRIX OC 4GB GD5 - 400€
> -Asus R9 390 DirectCU II 8GB GD5 - 400€
> 
> (there are others 970 and 390 (evga, for ex) but all around the 400€ mark)
> 
> the goal will be to play at [email protected]
> 
> if at the beginning i was into the gtx 970 strix now i'm considering the r9 390.
> 
> i have seen a number of reviews, where generally speaking, at this resolution, the 390 performs slightly better on some games while the 970 goes better on others.
> however in power consumption nvidia clearly wins.
> some 970 go from 150 to 250w , and the 390 go from 225 to 300. it's not "that" big of a difference - but it is still a difference from getting a 550w psu or a 650w
> 
> then i get mixed info. the g1, i see reviews (guru3d) where on load it gets 170w max tdp, then on overclock.com it has a 280w max power limit. but it's hard to be specific since some give the gpu power and others show the full system wattage. still, the 390 takes 30% to 50% more wattage then the nvidia.
> 
> finally i'm reading the new "info" of the 390 on the dx12. that it gets way better performance then the nvidia cards. of course, we still dont know anything (or that much) on the nvidia side for dx12.
> 
> so basically, the same price, it's a "fight" between power consumption vs slightly better performance and/or more future proof (still not knowing nvidia side).
> 
> so, 970 or 390?
> 
> noise is no problem, it's going into a fractal r5 case.


Go with either the Strix 390, the G1 390, or the Nitro 390.


----------



## Geoclock

GTX 970 prices are $309 after MIR, while R9 390 prices are $340 and the 290X is also around $320.

AMD is pushing prices up since the DX12 results are better for them.

So is the $30-cheaper Gigabyte GTX 970 G1 still the worse choice?


----------



## Piccolo55

Quote:


> Originally Posted by *Geoclock*
> 
> GTX 970 prices are $309 after MIR when r9 390 prices are $340 and 290x also around $320.
> 
> AMD pushing prices since DX12 results are better for them.
> 
> SO is $30 cheaper Gigabyte GTX GTX G1 still worst choice?


Ah, but the R9 390 G1 is between $305-$310 as of writing on PCPartPicker.


----------



## iinversion

The 970 is faster than a 390.

The 390 is a rebrand of the 290, just with higher clocks and more VRAM. The 970 was ~7% faster than an R9 290, so that still stands against the 390. If you don't plan on OC'ing, maybe the 390 is the way to go, but the 3xx series is way overpriced for what it is.


----------



## Geoclock

Quote:


> Originally Posted by *Piccolo55*
> 
> Ah but the r9 390 g1 is between $305-$310 as of writing on pcpartpickit


The Gigabyte version is a bad OC-er and the voltage is locked; a better choice is the PowerColor 390, but the best are the MSI 390 or the Sapphire, with a price jump to $345.

So: the best GTX 970 for $309, or the best R9 390 for $345?

Is the R9 390 worth the extra $35?
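One way to frame the "$35 extra" question is raw price-per-performance (Python sketch; the prices are from this post, while the 5% performance delta is purely an assumed figure - swap in numbers from whichever reviews you trust):

```python
def price_per_perf(price: float, relative_perf: float) -> float:
    """Dollars per unit of relative performance (GTX 970 = 1.00 baseline)."""
    return price / relative_perf

gtx970 = price_per_perf(309, 1.00)
r9_390 = price_per_perf(345, 1.05)   # assumption: the 390 is ~5% faster

print(f"970: ${gtx970:.0f} per perf unit")
print(f"390: ${r9_390:.0f} per perf unit")
```

Under that assumption the 970 still wins on pure fps-per-dollar, so the case for the 390 rests on the 8GB and DX12 arguments rather than raw value.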


----------



## batman900

No OC go 390, plan to OC? 970.


----------



## TopicClocker

You can't beat the 8GB of VRAM on the R9 390. Both cards are about equally fast; neither is especially faster than the other, though at higher resolutions the 390 can be a little quicker. I'd say it comes down to the user's choice.

Here is an excellent review of the R9 390 compared against a couple of other cards, stock and overclocked.

OverclockersClub

The GTX 970 at 1609MHz and the 390 at 1116MHz.


----------



## PontiacGTX

Quote:


> Originally Posted by *Geoclock*
> 
> Radeon version is bad OC-er and voltage is locked, better choice is Powercolor 390, but best MSI 390 or Sapphire with price jump $345.
> 
> So best GTX970 for $309 or best R9 390 for $345 ?
> 
> Is r9 390 worth extra $35 ?


there were 290Xs for less than $300; maybe they're OOS since AMD stopped producing them. For 1080p get the 970; for 1440p or higher get a 290/290X/390/390X


----------



## Geoclock

Quote:


> Originally Posted by *iinversion*
> 
> 970 is faster than a 390.
> 
> 390 is a rebrand of the 290 just with higher clocks and more VRAM. 970 was ~7%-ish faster than a R9 290 so it still stands with the 390. If you don't plan on OC'ing maybe the 390 is the way to go but the 3xx series is way overpriced for what it is.


Agreed, the 390 series is WAY overpriced; the 8GB marketing is pushing prices up, and until consumers sort out the positives and negatives, retailers are fishing in DIRTY WATER.


----------



## Piccolo55

Quote:


> Originally Posted by *Geoclock*
> 
> Agreed, 390 series are WAY overpriced, 8 GB marketing is pushing prices up and until consumer clears up positives and negatives retailers are fishing in the DIRTY WATER.


Honestly, the 970 should be around $270 after the 390 launched at $329. But the 970 at $310 isn't a steal either, really. The 390 will last longer without needing an upgrade.


----------



## Nvidia Fanboy

980>390X>390>290X>970>290. This is generally accepted, and it makes it look like the 970 is far behind the 980. Keep in mind, though, that the 980 is realistically no more than 15-20% faster than the 290. OCN is squabbling over cards that are all within spitting distance of one another. You can't go wrong with either the 970 or the 390, assuming you're not paying too much more for the 390. I'd choose the 390 personally, but let's face the facts: all 6 of these cards are pretty darn similar.


----------



## PontiacGTX

Quote:


> Originally Posted by *Nvidia Fanboy*
> 
> 980=/>390X>390=/>290X>970>290. This is generally accepted .


FTFY


----------



## Geoclock

Is a used ASUS 390X DirectCU II worth buying at $60 less than retail?
How is the ASUS warranty for second owners?


----------



## Piccolo55

Quote:


> Originally Posted by *Geoclock*
> 
> Is used ASUS 390x DirectCII worth to buy $60 less than retailer?
> How is ASUS warranty for second owners?


I would never trust a used GPU. But an ASUS 390/X would be a good card.


----------



## Geoclock

Quote:


> Originally Posted by *Piccolo55*
> 
> Would never trust a used gpu. But a asus 390/x would be a good gpu


It's 2 months old, but I don't know the ASUS warranty for second owners.


----------



## Piccolo55

Quote:


> Originally Posted by *Geoclock*
> 
> It;s 2 month old but i don't know ASUS warranty for second owners.


I suggest look it up. They may not have one.


----------



## iinversion

Quote:


> Originally Posted by *Geoclock*
> 
> It;s 2 month old but i don't know ASUS warranty for second owners.


There is nothing wrong with second-hand parts. Almost everything I buy is second-hand, lol.

ASUS has a 3-year serial-based warranty on all their GPUs - so 3 years from the manufacture date. No receipt or proof of purchase required.

I would not get an ASUS DCII model, though. That cooler is junk, especially with the high power output of Hawaii cards.


----------



## Geoclock

Thanks man; that's probably the main reason he is selling it so fast.


----------



## Piccolo55

Quote:


> Originally Posted by *iinversion*
> 
> There is nothing wrong with second hand parts. Almost everything I buy is secondhand lol.
> 
> Asus has a 3 year serial based warranty on all their GPU's. So 3 years from the manufacture date. No receipt or proof of purchase required.
> 
> I would not get a Asus DCII model though. That cooler is junk, especially on the high power output from Hawaii cards.


Is the Strix the next-gen cooler on the 390, though?


----------



## iinversion

Quote:


> Originally Posted by *Piccolo55*
> 
> Is the strix the next gen cooler though on the 390?


Yes it is.

I have no experience with that cooler so I can't tell you whether it's any better or not. The DC2 model is definitely junk compared to other coolers, though.


----------



## Geoclock

Here is the Warranty terms:

"ALL ASUS WARRANTY TERMS AND AGREEMENTS ARE NON-TRANSFERABLE AND ONLY APPLY TO THE ORIGINAL UNIT AND ORIGINAL PURCHASER. "


----------



## iinversion

Quote:


> Originally Posted by *Geoclock*
> 
> Here is the Warranty terms:
> 
> "ALL ASUS WARRANTY TERMS AND AGREEMENTS ARE NON-TRANSFERABLE AND ONLY APPLY TO THE ORIGINAL UNIT AND ORIGINAL PURCHASER. "


Wrong.

ASUS has a serial-based warranty. I have first-hand experience: when you do an RMA through ASUS, all you need is the serial number.

Some companies that don't do serial based warranties:

Sapphire
XFX
PNY
PowerColor

Pretty much all the others are serial-based, I believe: MSI, ASUS, Gigabyte, EVGA, etc.


----------



## Geoclock

Quote:


> Originally Posted by *iinversion*
> 
> Wrong.
> 
> Asus has a serial based warranty. I have first hand experience and when you do an RMA through Asus all you need is the serial number.


Could be; anyway, I'll stay away from the DirectCU II version. Thanks.


----------



## iinversion

Quote:


> Originally Posted by *Geoclock*
> 
> Could be, anyway i'll stay away from Direct CII version. Thanks.


It definitely is. I edited my post with some more info on which companies are serial-based and which ones require a receipt.


----------



## BinaryDemon

If I had to choose today, I'd probably pick a pair of R9 390s, mostly due to 3.5GB vs 8GB of VRAM.


----------



## Redwoodz

Quote:


> Originally Posted by *Geoclock*
> 
> GTX 970 prices are $309 after MIR when r9 390 prices are $340 and 290x also around $320.
> 
> AMD pushing prices since DX12 results are better for them.
> 
> SO is $30 cheaper Gigabyte GTX GTX G1 still worst choice?


Quote:


> Originally Posted by *Geoclock*
> 
> Radeon version is bad OC-er and voltage is locked, better choice is Powercolor 390, but best MSI 390 or Sapphire with price jump $345.
> 
> So best GTX970 for $309 or best R9 390 for $345 ?
> 
> Is r9 390 worth extra $35 ?


Um, the Sapphire Nitro 390 is $329 as we speak. http://www.newegg.com/Product/Product.aspx?Item=N82E16814202148

What you are seeing here is retailer pricing. The 390 is slightly more than the 970 because people are willing to pay for it. Both cards are very close at 1080p, but the memory makes the 390 the clear winner. No one is going to buy a 970 to play at 1080p now and hope to add another card in the future for 1440p-and-up resolutions. The 390 will do everything the 970 does, and it gives the user the option of great performance at 1440p and 4K with another card. DX12 features and 8GB give the 390 a very good future; the 970, not so much.


----------



## Randomdude

I have not fully read the thread yet, but halfway through I stopped just so I could mention how incredibly shocked I am. People are being borderline legitimately stupid. At least thank you, OP - this thread will serve as future reference for me, a great way to know whether someone is credible or not (not one bit, that is). Thank you for helping me make that mental list.


----------



## Duality92

The 390, unless you're folding, of course. Since I'm folding, the 970 is much better for me.


----------



## NBrock

I love my Sapphire Tri-X (New Edition). The cooler on it is great. I have it running 24/7 for [email protected] at 1200 on the core and the stock 1350 on memory. I would recommend the Tri-X 290, 290X, 390 or 390X because that cooler is a beast (and very quiet).

So far this month I seem to be averaging 239,830 PPD, so I don't think they are that bad at folding for the price... obviously the 390 is going to run you more money than a 290.


----------



## Duality92

Quote:


> Originally Posted by *NBrock*
> 
> I love my Sapphire Tri X (New Edition). The cooler on it is great. I have it running 24/7 for [email protected] @ 1200 core and the stock 1350 for mem. I would recommend the Tri X 290 or 290x or 390 or 390x because that cooler is beast (also very quiet).
> 
> So far this month I seem to be averaging 239,830 PPD so I don't think they are that bad at folding for the price...obviously the 390 is going to run you more money than a 290.


Not bad. I'm just saying that for less power draw, I can yield 330k PPD with mine at stock clocks - 360k with an OC.


----------



## AverdanOriginal

Quote:


> Originally Posted by *iinversion*
> 
> Yes it is.
> 
> I have no experience with that cooler so can't tell you if it is any better or not. DC2 model is definitely junk compared to other coolers though.


I'd second that. If you go for ASUS on the R9 390, take the Strix version.

Apart from that, if you want a cooler and quieter R9 390, go for the Sapphire Nitro. If you want better-binned cards for better OCing, go for MSI (the non-LE version), followed by XFX. Do NOT go for Gigabyte R9 390s, because they hard-locked the voltage regulator...

If you guys want more info, visit the R9 390 / 390X Owners Club. http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club

Most people were able to reach OCs between 1100-1220 on the core (reaching 1180 on the core beats top overclocked 970s, going by official benchmarks). The only real issue with the 390s is heat, but that can be dealt with by either not overclocking or improving your case airflow.








I managed to undervolt my MSI R9 390 so that it runs at 1030/1680 MHz (core/memory) and draws fewer watts than a 970 while still being faster than a reference 970. It's very quiet and doesn't go over 62°C in Heaven benchmarks. I needed to do this since I had constant ambient temps of 30-35°C during the summer.
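As a side note on why that undervolt saves so much power: GPU dynamic power scales roughly with frequency times voltage squared. A quick sketch using the voltages from this post (the baseline clock and wattage are assumed figures for illustration only):

```python
# Dynamic power model: P ~ f * V^2, scaled from a known baseline.

def scaled_power(p_base: float, f_base: float, v_base: float,
                 f_new: float, v_new: float) -> float:
    """Estimate power at new clock/voltage from a baseline measurement."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

stock_watts = 275.0  # assumed baseline board power at 1000 MHz / 1.258 V
undervolted = scaled_power(stock_watts, 1000, 1.258, 1030, 1.172)
print(f"Estimated draw at 1030 MHz / 1.172 V: {undervolted:.0f} W")
```

Dropping from 1.258V to 1.172V cuts the voltage-squared term by roughly 13%, which more than offsets the small clock bump - hence the noticeably lower draw and temps.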


----------



## Geoclock

Quote:


> Originally Posted by *AverdanOriginal*
> 
> Most people where able to reach OC between 1100 - 1220 on the Core (starting to reach 1180 on the core beats top overclocked 970s considering official benchmarks). Only real issue with the 390s is heat, but that can be dealt with either not overclocking
> 
> 
> 
> 
> 
> 
> 
> or enhancing your case air flow.
> 
> 
> 
> 
> 
> 
> 
> 
> I managed to undervolt my MSI R9 390 so that it runs at 1030/1680 MHz (Core/Memory) and draws less Watt then a 970 and faster than a reference 970. very silent and goes on Heaven benchnmarks not over 62C°. Needed to this since I had constant ambient temps of 30-35 C° during summer time.


Can you tell us what the original voltage was and what the new one is?


----------



## mtcn77

This is from the recent TechReport review on the "Is 4GB necessary for 4K resolution?" question. The point has very clear repercussions, as Full HD also has a DSR/VSR 4K-upscaling option.


----------



## StrongForce

R9 390 all the way... isn't the 970 3.5GB as well, and not really a full 4GB?


----------



## Piccolo55

Quote:


> Originally Posted by *StrongForce*
> 
> r9 390 all the way... isn't the 970 3.5 gb also and not really full 4 ?


500MB of it is slower VRAM.


----------



## AverdanOriginal

Quote:


> Originally Posted by *Geoclock*
> 
> Can you tell us what was the original voltage and what is the new one?


Sorry for the late reply. I was away from my PC for 2 weeks... now I can hug it again.









Here's the GPU-Z with the overclock, before I needed to turn up the voltage:


If GPU-Z's VDDC shows the correct voltage, then the normal max voltage is 1.258V (mind you, on lighter overclocks like 1100/1600 I did not reach this even with max voltage).

With undervolting I was able to get it stable at 1.172V; here's the GPU-Z and other benchmarks:



Hope that helps.


----------



## neurotix

If anyone can beat that with a 970 that's not on LN2, I'll concede that the 970 is a better card.

Here's a 970 SLI at 1600mhz+ in Valley:



Here's my 290s at 1150mhz:



Note, he has a 4790k at 4.8ghz and I have a 4770k at 4.5ghz. If his 4790k was at 4.5ghz like mine his score would actually be *LOWER*.

It doesn't matter if they clock so high if the performance is 20 fps less.

I've also seen SLI 970 scores in Fire Strike, and my 290s do better at lower clocks. Keeping in mind the price of a used 290 nowadays compared to the 970, and that the 290's performance is vastly better, you'd have to be an idiot to get 970s.

The 980 or 980ti are the only Nvidia cards worth buying.

Case closed.


----------



## AverdanOriginal

Quote:


> Originally Posted by *mtcn77*
> 
> This is from the recent TechReport review of the "Is 4GB necessary for 4K resolution?" subject. The point bears very clear repercussions as FullHD also has a DSR/VSR 4K-upscaling option.


After reading some tests comparing (sadly only) the 290 to the 970, it seems the 970 uses more VRAM than the 290 in most games on average. I guess Nvidia does this to forestall the problem of usage rising above the 3.5GB barrier. So you might actually get smoother FPS and frametimes (frametimes matter here, since they actually measure the problem you hit past the 3.5GB barrier) with the Nvidia card at lower resolutions - though not necessarily. But once you do hit that 3.5GB barrier, you can notice micro-stutters with the 970 that you would not get with a 290 @ 4GB or a 390 @ 8GB.

Batman: Arkham Knight, Watch Dogs, GTA V and some other games already use more than 4GB of VRAM at 1920x1080 if your card allows it, while the 970 scales these games down so you can still play - though you might notice micro-stutters.

Witcher 3 is a game that tends to use more system RAM rather than video memory... hence you normally see not much more than 2-2.5GB of VRAM usage there.

So if games today already use more than 4GB of VRAM at 1080p, I am pretty sure the trend will not be toward lower VRAM usage in the future. Plus the trend is going to 4K (as it once went to Full HD), so in 2-3 years, to get the best PC gaming experience, you will probably not be able to pass on 4K.
BUT if you already have a 970 or 290, upgrading to a 390 or 390X does NOT make sense in my opinion.


----------



## AverdanOriginal

Quote:


> Originally Posted by *neurotix*
> 
> 
> 
> If anyone can beat that with a 970 that's not on LN2 I'll concede that the 970 is a better card.
> 
> Here's a 970 SLI at 1600mhz+ in Valley:
> 
> Here's my 290s at 1150mhz:
> 
> Note, we have the same CPU at 4.5ghz.
> 
> It doesn't matter if they clock so high if the performance is 20 fps less.
> 
> I've also seen SLI 970 scores in Fire Strike and my 290s do better, at lower clocks. Keeping in mind the prices of a 290 used nowadays, compared to the 970, and the 290's performance is vastly better, you'd have to be an idiot to get 970s.
> 
> The 980 or 980ti are the only Nvidia cards worth buying.
> 
> Case closed.


Baam... now that should clear up the topic.


----------



## iinversion

Quote:


> Originally Posted by *neurotix*
> 
> 
> 
> If anyone can beat that with a 970 that's not on LN2 I'll concede that the 970 is a better card.
> 
> Here's a 970 SLI at 1600mhz+ in Valley:
> 
> 
> 
> Here's my 290s at 1150mhz:
> 
> 
> 
> Note, he has a 4790k at 4.8ghz and I have a 4770k at 4.5ghz. If his 4790k was at 4.5ghz like mine his score would actually be *LOWER*.
> 
> It doesn't matter if they clock so high if the performance is 20 fps less.
> 
> I've also seen SLI 970 scores in Fire Strike and my 290s do better, at lower clocks. Keeping in mind the prices of a 290 used nowadays, compared to the 970, and the 290's performance is vastly better, you'd have to be an idiot to get 970s.
> 
> The 980 or 980ti are the only Nvidia cards worth buying.
> 
> Case closed.


What about a 780 Ti? Here's mine on air..


----------



## rdr09

Quote:


> Originally Posted by *iinversion*
> 
> What about a 780 Ti? Here's mine on air..


3GB should suffice in op's case. at $200, the 780 Ti will be a good buy.

edit: $200 might be too low. $250 will still be good for the Ti. i remember it costing like $700 at launch.


----------



## Randomdude

Thank you, neurotix. Nothing further to discuss about the 390 vs 970, it's clear as day. I'm going to sit here and enjoy watching the diehards slowly drown trying to twist this, only to inevitably fail.


----------



## iinversion

Quote:


> Originally Posted by *rdr09*
> 
> 3GB should suffice in op's case. at $200, the 780 Ti will be a good buy.
> 
> edit: $200 might be too low. $250 will still be good for the Ti. i remember it costing like $700 at launch.


$250 sounds right. If you get lucky you might find one closer to $200 though.

I re-ran the test cause the first time something popped up notifying me of an update and the minimum frame was low.



Either way.. about the same result..

Never had any stuttering at 1440p with the 3GB. Even in a super resource hog of an unoptimized game like ARK: Survival Evolved, no stuttering despite maxing VRAM. The drivers seem to manage memory pretty well: it'll go right up to the 3GB mark but stay under it without any stuttering, at least in my experience. I had a 980 as well and the game used all 4GB of that card's VRAM, so it's not like the game only ever wants 3GB.


----------



## rdr09

Quote:


> Originally Posted by *iinversion*
> 
> $250 sounds right. If you get lucky you might find one closer to $200 though.
> 
> I re-ran the test cause the first time something popped up notifying me of an update and the minimum frame was low.
> 
> 
> 
> Either way.. about the same result..
> 
> Never had any stuttering on 1440p with the 3GB. Even on a super resource hog and unoptimized of a game like ARK Evolved, no stuttering despite maxing VRAM. The drivers seem to manage memory pretty well.. it'll go right up to the 3GB mark but stay under it and never have any stuttering, at least in my experience. I had a 980 as well and the game used all 4GB of that VRAM, so not like it only uses 3GB.


kinda tough to recommend a 3.5 GB or less at anything over $200. op might end up upgrading sooner and costing him/her more.


----------



## iinversion

Quote:


> Originally Posted by *rdr09*
> 
> kinda tough to recommend a 3.5 GB or less at anything over $200. op might end up upgrading sooner and costing him/her more.


and that's fine. if op wants 4GB then more power to him/her. The fact is a lot of games just fill VRAM if it is available rather than actually require it for smooth gameplay. That's where I was going with that comment.

& Just wanted to make a point with that screenshot that the 780 Ti is actually faster than both the 970/R9 290/whatever rebrand.

People seem to continually rank it lower than both and that is obviously not the case.


----------



## rdr09

Quote:


> Originally Posted by *iinversion*
> 
> and that's fine. if op wants 4GB then more power to him/her. The fact is a lot of games just fill VRAM if it is available rather than actually require it for smooth gameplay. That's where I was going with that comment.
> 
> & Just wanted to make a point with that screenshot that the 780 Ti is actually faster than both the 970/R9 290/whatever rebrand.
> 
> People seem to continually rank it lower than both and that is obviously not the case.


sadly, the only true measure of vram running out is . . . fps drops.


----------



## HeadlessKnight

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *neurotix*
> 
> 
> 
> If anyone can beat that with a 970 that's not on LN2 I'll concede that the 970 is a better card.
> 
> Here's a 970 SLI at 1600mhz+ in Valley:
> 
> 
> 
> Here's my 290s at 1150mhz:
> 
> 
> 
> Note, he has a 4790k at 4.8ghz and I have a 4770k at 4.5ghz. If his 4790k was at 4.5ghz like mine his score would actually be *LOWER*.
> 
> It doesn't matter if they clock so high if the performance is 20 fps less.
> 
> I've also seen SLI 970 scores in Fire Strike and my 290s do better, at lower clocks. Keeping in mind the prices of a 290 used nowadays, compared to the 970, and the 290's performance is vastly better, you'd have to be an idiot to get 970s.
> 
> The 980 or 980ti are the only Nvidia cards worth buying.
> 
> Case closed.






Just a single synthetic benchmark that doesn't tell the whole story. From my experience with many Nvidia and AMD cards, 3DMark and Unigine stuff are useless to decide which card is better. It is always best to compare them on game to game basis. Or benchmark games the OP is interested in.
Also generally speaking Maxwell cards suck in Valley benchmark compared to Keplers. 780 Ti's are faster in Valley than 980s but slower in everything else.
Also, nice cherry-picked flawed result. Here are other random results from other users at lower clocks with much higher fps than the one you picked:

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/11550_50#post_23068245

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/12450_50#post_24325548

I am not defending Nvidia (especially after the 970's 3.5GB fiasco), but cherry-picking a single flawed result doesn't give the OP a proper idea of how these cards perform.


----------



## PontiacGTX

Quote:


> Originally Posted by *neurotix*
> 
> 
> 
> Spoiler: Benchmarks
> 
> 
> 
> 
> 
> If anyone can beat that with a 970 that's not on LN2 I'll concede that the 970 is a better card.
> 
> Here's a 970 SLI at 1600mhz+ in Valley:
> 
> 
> 
> Here's my 290s at 1150mhz:
> 
> 
> 
> Note, he has a 4790k at 4.8ghz and I have a 4770k at 4.5ghz. If his 4790k was at 4.5ghz like mine his score would actually be *LOWER*.
> 
> It doesn't matter if they clock so high if the performance is 20 fps less.
> 
> I've also seen SLI 970 scores in Fire Strike and my 290s do better, at lower clocks. Keeping in mind the prices of a 290 used nowadays, compared to the 970, and the 290's performance is vastly better, you'd have to be an idiot to get 970s.
> 
> 
> The 980 or 980ti are the only Nvidia cards worth buying.
> 
> Case closed.


Exactly


----------



## Rickles

To be fair,

You can't really grab two benchmark results and declare X is > Y.

There are so many variables in any given system/build that the notion of doing that is just plain silly.

Now that we've gotten our sillies out of the way.

Buy a 290 or 390 IMO, the 970 isn't anything spectacular, and that memory setup was handled in such a sketchy manner that this card should have been boycotted by all of us. I know I sent mine back as soon as I could.

There are pretty much only 4 video cards I'd recommend at the moment.

750ti, 290 (if less than $300), 390 (if you are going to use the VRAM), and the 980Ti.

Outside of those cards I'd say save more money and upgrade later on.

Good used cards to pick up would be a 780Ti close to $200, a 290 close to $160 (that hasn't been mined for months on end) or a 980 for $350.


----------



## caliking420

Quote:


> Originally Posted by *neurotix*
> 
> The 980 or 980ti are the only Nvidia cards worth buying.


this.

I actually don't even think a 980 is relevant anymore, considering the performance jump with the Ti.


----------



## Duality92

Quote:


> Originally Posted by *Rickles*
> 
> To be fair,
> 
> You can't really grab two benchmark results and declare X is > Y.
> 
> There are so many variables in any given system/build that the notion of doing that is just plain silly.
> 
> Now that we've gotten our sillies out of the way.
> 
> Buy a 290 or 390 IMO, the 970 isn't anything spectacular, and that memory setup was handled in such a sketchy manner that this card should have been boycotted by all of us. I know I sent mine back as soon as I could.
> 
> There are pretty much only 4 video cards I'd recommend at the moment.
> 
> 750ti, 290 (if less than $300), 390 (if you are going to use the VRAM), and the 980Ti.
> 
> Outside of those cards I'd say save more money and upgrade later on.
> 
> Good used cards to pick up would be a 780Ti close to $200, a 290 close to $160 (that hasn't been mined for months on end) or a 980 for $350.


7970's @ $120 and 7950's @ $100 are worth it too.

If I can get my hands on a R9 390, I will test it as much as you guys want on the same system. I'm going to try and get a sponsored one (already in process) and I will be able to give results with an i5 3470 and i5 4690k. (can even do c2d if you guys want)


----------



## TopicClocker

Quote:


> Originally Posted by *neurotix*
> 
> -snip-
> I've also seen SLI 970 scores in Fire Strike and my 290s do better, at lower clocks. Keeping in mind the prices of a 290 used nowadays, compared to the 970, and the 290's performance is vastly better, you'd have to be an idiot to get 970s.
> 
> The 980 or 980ti are the only Nvidia cards worth buying.
> 
> Case closed.


Valley is the worst possible way to compare the performance of GPUs, especially since Maxwell GPUs such as the 970 and 980 are notoriously poor at Valley compared to Kepler GPUs such as the 780 and 780 Ti.

Synthetic benchmarks alone are not the best way to compare the Gaming performance of GPUs.
Do they tell you how a GPU will perform in Games? No, they don't.

Comparing GPUs in Games is the best way to do it.

Grand Theft Auto V Multi-GPU Performance Review Part 2


Quote:


> *GeForce GTX 970 SLI is faster than Radeon R9 290 CrossFire, but not by a whole lot in GTA V.* GeForce *GTX 970 SLI is also faster than a single GeForce GTX TITAN X*, which is interesting. Radeon R9 295X2 and GTX 970 SLI are close in performance, *but R9 295X2 has some more major dips in performance, attributing to its more choppy nature.*



Quote:


> *GeForce GTX 970 SLI pulls out ahead of R9 290 CrossFire* without advanced graphics options enabled in GTA V. *Radeon R9 295X2 still has some major stuttering problems even without advanced graphics options.*


However, in GTA V the GTX 970 will struggle at 4K with everything enabled because of its VRAM configuration.

I don't expect 4GB cards to be able to run Games at 4K maxed out for another whole year, for 4K Gaming 6GB+ is the best bet.

But even the top single cards out right now are unable to do 4K 60fps maxed out in the most demanding titles.

Also, calling people idiots over a GPU purchase is immature to say the least.

Quote:


> Originally Posted by *HeadlessKnight*
> 
> 
> Just a single synthetic benchmark that doesn't tell the whole story. From my experience with many Nvidia and AMD cards, 3DMark and Unigine stuff are useless to decide which card is better. It is always best to compare them on game to game basis. Or benchmark games the OP is interested in.
> Also generally speaking Maxwell cards suck in Valley benchmark compared to Keplers. 780 Ti's are faster in Valley than 980s but slower in everything else.
> Also nice cherry picked flawed result, here are other random results from other users at lower clocks and much higher fps than the one you cherry picked.
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/11550_50#post_23068245
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/12450_50#post_24325548
> 
> I am not defending Nvidia (espeically after 970s 3.5GB fiasco) but cherry picking a single flawed result doesn't give the OP a proper idea about the performance of cards.


Well said!


----------



## semitope

the 970 will be no match for the 390 in DX12. End of story right there. 8GB of VRAM, and faster in DX12. It's a really bad idea to pick 3.5GB over 8GB at this point, when that 3.5GB has sub-200GB/s bandwidth. For the sake of not regretting it later, 390 all the way. You know you're safe going that route.
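To put rough numbers on that bandwidth point, here's a back-of-envelope sketch from the published specs (effective memory rate times bus width; treat it as an approximation, peak theoretical only):

```python
# Peak theoretical bandwidth = effective memory rate (GT/s) x bus width (bytes)
def bandwidth_gbs(rate_gtps, bus_bits):
    return rate_gtps * bus_bits / 8

print(bandwidth_gbs(7.0, 224))  # GTX 970 fast 3.5GB partition: 196.0 GB/s
print(bandwidth_gbs(7.0, 32))   # GTX 970 slow 0.5GB partition: 28.0 GB/s
print(bandwidth_gbs(6.0, 512))  # R9 390, 6 GT/s on a 512-bit bus: 384.0 GB/s
```

That's where the "sub 200GB/s" figure comes from: the 970's main partition peaks at 196 GB/s, and the last 0.5GB crawls along at 28 GB/s, versus 384 GB/s across the whole 8GB on the 390.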


----------



## buttface420

i was debating 970 vs 390 for my new gpu but after seeing benchmarks from 15.7 i see the 290x beating the 390 and even matching the 390x at clock for clock even in dx12 so i just bought a 290x used off ebay for $230.


----------



## TopicClocker

Quote:


> Originally Posted by *buttface420*
> 
> i was debating 970 vs 390 for my new gpu but after seeing benchmarks from 15.7 i see the 290x beating the 390 and even matching the 390x at clock for clock even in dx12 so i just bought a 290x used off ebay for $230.


That's a great buy!

I remember before the GTX 970 and 980 announcement I was watching the 290X prices like a Hawk as they had a crazy price drop around August and September IIRC.

The Sapphire Tri X and Vapor X cards are gorgeous!


----------



## rdr09

Quote:


> Originally Posted by *buttface420*
> 
> i was debating 970 vs 390 for my new gpu but after seeing benchmarks from 15.7 i see the 290x beating the 390 and even matching the 390x at clock for clock even in dx12 so i just bought a 290x used off ebay for $230.


the 390 is faster than both the 290 and 290X at stock. the only way my 290 will match a 390 is if i oc my 290 100MHz more or flash it with 390 bios. here is a 290 flashed with 390 bios . . .

http://www.3dmark.com/3dm/8214970

i have to oc my 290 with orig bios over 1300 core to match that. i can only oc mine to 1300.


----------



## TopicClocker

Quote:


> Originally Posted by *rdr09*
> 
> the 390 is faster than both the 290 and 290X at stock. the only way my 290 will match a 390 is if i oc my 290 100MHz more or flash it with 390 bios. here is a 290 flashed with 390 bios . . .
> 
> http://www.3dmark.com/3dm/8214970
> 
> i have to oc my 290 with orig bios over 1300 core to match that. i can only oc mine to 1300.


What the hell did AMD do with the 390/X bios? It couldn't purely be timings right?


----------



## buttface420

i may be wrong but from many benches i saw clock for clock they were the same


----------



## semitope

Never really bought into that "GPU isn't powerful enough to make use of the VRAM size" argument.


----------



## iinversion

Quote:


> Originally Posted by *TopicClocker*
> 
> What the hell did AMD do with the 390/X bios? It couldn't purely be timings right?


Looks like he's ignoring the fact that the 390 bios raises memory speed by 1GHz and he's only changing core clocks himself for comparison.

Clock for clock they are the same. Every review I've seen who matches hardware and core/memory clocks the result is the same.
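For what it's worth, the memory bump alone is easy to quantify (rough sketch, assuming the usual 5 GT/s stock vs 6 GT/s effective rates on the 512-bit bus both cards share):

```python
# 290 stock memory: 5 GT/s effective; 390 bios: 6 GT/s; both on a 512-bit bus
def bandwidth_gbs(rate_gtps, bus_bits=512):
    return rate_gtps * bus_bits / 8

r290, r390 = bandwidth_gbs(5.0), bandwidth_gbs(6.0)
print(r290, r390)                      # 320.0 vs 384.0 GB/s
print(round(100 * (r390 / r290 - 1)))  # percent more bandwidth from the bios alone
```

So the 390 bios buys about 20% more memory bandwidth before you touch the core clock, which is why a flashed 290 pulls ahead at the same core speed.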


----------



## buttface420

Quote:


> Originally Posted by *semitope*
> 
> Never really bought into that "GPU isn't powerful enough to make use of the VRAM size" argument.


i don't either. one gpu may not use all 8 gb yet, but two gpus crossfired could definitely use more than 4.


----------



## TopicClocker

Quote:


> Originally Posted by *semitope*
> 
> Never really bought into that "GPU isn't powerful enough to make use of the VRAM size" argument.


Because it's not true.

Quote:


> Originally Posted by *buttface420*
> 
> i don't either. one gpu may not use all 8 gb yet, but two gpus crossfired could definitely use more than 4.


Even one GPU could use more than 4GB, or the whole 8GB.

What people are forgetting is that there's more to VRAM usage than resolution and anti-aliasing.

One of the biggest consumers of VRAM, if not the biggest, is textures; there are also things like render targets to take into consideration.

Killzone Shadowfall.


Infamous: Second Son.


----------



## rdr09

Quote:


> Originally Posted by *TopicClocker*
> 
> What the hell did AMD do with the 390/X bios? It couldn't purely be timings right?


rebranded. rebranding in gpus is not just changing the name; it also means improving stuff, the vram increase being one example. even with my example, some will not believe it. i can easily run my 290s (any of them) at the same clocks as the ones flashed with a 390 bios and they will not keep up. i have to oc.

mind you, my 290s are above average compared to most 290s. i benched both of them at 1290 . . .

http://www.3dmark.com/3dm/4644282?

if i flash my 290s with 390 bios . . . that graphics score will be higher . . . and some will still doubt.









edit: BTW, i know who owns that 290. its vram can oc that high even with orig bios.


----------



## Yungbenny911

I see ya'll throwing benchmarks here and there saying 290's/390's are better than 970's? And 980's/980ti's are the only Nvidia cards worth buying? Haha

@neurotix You think your 290 is fast?







. Here's my 5 months old score.



http://www.3dmark.com/fs/4659651


----------



## ImJJames

Quote:


> Originally Posted by *Yungbenny911*
> 
> I see ya'll throwing benchmarks here and there saying 290's/390's are better than 970's? And 980's/980ti's are the only Nvidia cards worth buying? Haha
> 
> @neurotix You think your 290 is fast?
> 
> 
> 
> 
> 
> 
> 
> . Here's my 5 months old score.
> 
> 
> 
> http://www.3dmark.com/fs/4659651


Strong insecurities.


----------



## rdr09

and 4GB.


----------



## umeng2002

I've had the 970 since last January. I like it, but I'd get the 390 now. After the 3.5 GB thingy, I wouldn't give my money to nVidia again until we see some real DX12 tests.


----------



## Yungbenny911

As you guys can tell, that screenshot is at 4K res. I've played at Native 4K res since day one with zero issues, and i'm currently playing Assassin's creed unity with everything MAX (except ambient occlusion set to high) @ 1578Mhz...

If it makes you feel better about your AMD GPU, have fun spreading misinformation







. All these 390 vs 970 threads are pointless, because we all know they perform similarly... Happy gaming!


----------



## Stige

If it makes you feel better about your 3.5GB GPU then have fun spreading misinformation!


----------



## semitope

Quote:


> Originally Posted by *Yungbenny911*
> 
> As you guys can tell, that screenshot is at 4K res. I've played at Native 4K res since day one with zero issues, and i'm currently playing Assassin's creed unity with everything MAX (except ambient occlusion set to high) @ 1578Mhz...
> 
> If it makes you feel better about your AMD GPU, have fun spreading misinformation
> 
> 
> 
> 
> 
> 
> 
> . All these 390 vs 970 threads are pointless, because we all know they perform similarly... Happy gaming!


you realize that's sli right?


----------



## Yungbenny911

Quote:


> Originally Posted by *Stige*
> 
> If it makes you feel better about your 3.5GB GPU then have fun spreading misinformation!


And what exactly is the misinformation i'm spreading? That my 3.5gb 970's whoops your 290/390's arse when OC'ed?









All jokes aside, we all know that no GPU is a clear winner, and it's all dependent on the buyer's needs.


----------



## Stige

Don't have any of those, yet.

Leaning more towards the R9 390 or 390X at 1440p; the 3.5GB will probably be too much of a limiting factor at 1440p in the future.
The R9 390 is also cheaper than the 970 for the same or better performance.


----------



## Yungbenny911

Quote:


> Originally Posted by *Stige*
> 
> Don't have any of those, yet.
> 
> Leaning more towards the R9 390 or 390X at 1440p, the 3.5GB is too limiting factor at 1440p in the future propably.
> The R9 390 is also cheaper than 970 for same or better performance.


Oh, so you don't even have any of them, yet you accused me of spreading misinformation?










Well, that's too bad... lol


----------



## mtcn77

Quote:


> Originally Posted by *Yungbenny911*
> 
> And what exactly is the misinformation i'm spreading? That my 3.5gb 970's whoops your 290/390's arse when OC'ed?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All jokes aside, we all know that *no GPU* is a clear winner, and it's all dependent on the buyer's needs.


Except when we have a clear winner as it seems...


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> And what exactly is the misinformation i'm spreading? That my 3.5gb 970's whoops your 290/390's arse when OC'ed?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All jokes aside, we all know that no GPU is a clear winner, and it's all dependent on the buyer's needs.


i for one did not say the 970 is slow. just the 0.5 GB is. a nvidia engineer attested to this fact on youtube.


----------



## TopicClocker

A stock R9 390 is a little faster than a stock GTX 970, the R9 390 is a revised/refreshed R9 290.

It also has 8GB of VRAM compared to the 4GB/3.5GB VRAM on the GTX 970.

That 4-4.5GB more VRAM is a really great selling point of the R9 390, as both GPUs can be had for around the same price.

So far from this, the R9 390 has advantages in VRAM and is a little faster.

R9 390

Is faster
Has significantly more VRAM
Handles higher resolutions better

The GTX 970 has NVIDIA features such as PhysX and GameWorks, and it can also have great overclocking headroom, often reaching around the speed of a GTX 980.
It's also significantly more power efficient than the R9 390, consuming a lot less power with both GPUs under load; it's roughly 40% more power efficient.

GTX 970

Capable of using NVIDIA features in the Games that support it. (e.g PhysX, GameWorks)
Can have a large overclocking headroom
Significantly more power efficient
NVIDIA Shadow Play, able to record footage with minimal performance impact.
That 8GB of VRAM is not useless; it's just not fully beneficial, let alone mandatory, at 1080p to 1440p yet.

The R9 390 would be a great candidate for a 4K setup; you'll likely want two of them in CrossFire for it to be worthwhile and a great experience.
Thanks to its 8GB of VRAM you'll have minimal worries about memory at resolutions like that, though we can't be totally sure how well it will suffice in the future, just like any piece of hardware.

From my list it seems the R9 390 is mostly leaning towards more of the performance side, along with having more VRAM.

The GTX 970 is more power efficient, and has the capability of using NVIDIA features such as PhysX and GameWorks, which improve the visuals of the games that support them.

You could say the R9 390 is more future proof purely because of its VRAM; you're far less likely to be limited by VRAM with the R9 390 than with the GTX 970.
However, it's also a lot less power efficient than the GTX 970.

The extra VRAM on the R9 390 would mostly let you run higher texture quality settings in future games, and possibly more anti-aliasing; however, you'd also need the GPU power to drive that anti-aliasing, so a CF setup could benefit more from it than a single GPU.

It's really down to what the user wants.
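On the power-efficiency point, here's what the gap works out to in running costs (a sketch with assumed numbers: ~120W extra draw for the 390 under load, 20 hours of gaming a week, $0.12/kWh; plug in your own figures):

```python
# Assumed inputs, not measured values: adjust for your card, habits and tariff
extra_watts, hours_per_week, price_kwh = 120, 20, 0.12

kwh_per_year = extra_watts / 1000 * hours_per_week * 52
print(round(kwh_per_year, 1))              # extra energy per year, kWh
print(round(kwh_per_year * price_kwh, 2))  # extra cost per year, dollars
```

Under those assumptions it's on the order of $15 a year, which is why power draw tends to matter more for PSU sizing and case cooling than for the electricity bill.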


----------



## ad hoc

I have an Fx 6300 and an R9 270. I've been planning on buying a 390, but I've been reading some bad stuff about AMD's driver overhead when paired with weaker GPUs. Should I get a 390 or a 970?

Edit: I also have a very mediocre (if not worse) CX 600 PSU. And I don't have a lot of OC'ing headroom because of my board's crappy VRMs.


----------



## iinversion

Quote:


> Originally Posted by *ad hoc*
> 
> I have an Fx 6300 and an R9 270. I've been planning on buying a 390, but I've been reading some bad stuff about AMD's driver overhead when paired with weaker *GPUs*. Should I get a 390 or a 970?
> 
> Edit: I also have a very mediocre (if not worse) CX 600 PSU.


I think you meant CPUs** I would definitely get a 970 unless you upgrade your CPU to something Intel offers, as there really isn't an upgrade path on AMD's side of things.

The video in my signature shows how even a R9 380 can perform worse than a 750 Ti in DX11 if your CPU isn't powerful enough.

Draw your own conclusions.

The CX600 is very mediocre and only has 522W on the +12V rail. Not the best, but it shouldn't be an issue with either GPU tbh.
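A rough headroom check backs that up (ballpark assumptions: board TDPs from the spec sheets, ~100W for an FX-6300 under load, ~50W for drives/fans/board):

```python
rail_watts = 522           # CX600's rated +12V capacity
system_watts = 100 + 50    # CPU under load + rest of system (assumed figures)

# Reference board TDPs; partner cards and overclocks will pull more
loads = {gpu: system_watts + tdp for gpu, tdp in [("GTX 970", 145), ("R9 390", 275)]}
for gpu, load in loads.items():
    print(gpu, load, "W ->", round(100 * load / rail_watts), "% of the +12V rail")
```

Even the 390 build lands around 80% of the rail at stock, so either card fits, though an overclocked 390 on a budget unit leaves less margin than you'd like.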


----------



## ad hoc

Quote:


> Originally Posted by *iinversion*
> 
> I think you meant CPU's** I would definitely get a 970 unless you upgrade your CPU to something Intel offers as there really isn't an upgrade on AMD's side of things.
> 
> The video in my signature shows how even a R9 380 can perform worse than a 750 Ti in DX11 if your CPU isn't powerful enough.
> 
> Draw your own conclusions.
> 
> CX600 is very mediocre and only has 522W on the 12V+ rail. Not the best but shouldn't be an issue with either GPU tbh.


Ah, that's pretty much what I was expecting. It looks like I won't be supporting Team Red on the GPU side this time around. Oh well. Thanks for getting back to me so fast.


----------



## iRUSH

What I like about the 970 over the equivalent AMD card are the following:

I love that I get day one drivers even if in beta for big titles as well as beta drivers for the AAA beta's. Nvidia is on top of that. I also like the better CPU overhead built into their drivers. I play online competitive shooters with a 144hz 1080p display. I can achieve a higher minimum FPS with my 970 vs my 290x in every online FPS game I play from Insurgency, to Dirty Bomb to BF4.

What I love about the Red team and their 390 is that you get a lot of hardware for the money. That 8GB of VRAM, while probably pointless for most, is still 8GB and it's real. Xfire down the road will not only really take advantage of that extra VRAM but also scales better than SLI. If you're patient (and I'm not) and you're not a competitive high-refresh-rate FPS nut bag like me, the 390 is the better card.


----------



## ImJJames

Quote:


> Originally Posted by *iinversion*
> 
> I think you meant CPU's** I would definitely get a 970 unless you upgrade your CPU to something Intel offers as there really isn't an upgrade on AMD's side of things.
> 
> The video in my signature shows how even a R9 380 can perform worse than a 750 Ti in DX11 if your CPU isn't powerful enough.
> 
> Draw your own conclusions.
> 
> CX600 is very mediocre and only has 522W on the 12V+ rail. Not the best but shouldn't be an issue with either GPU tbh.


Ummmnnn no. I personally had an FX-6300 with a 7970, upgraded to a 4770K, and FPS were pretty similar unless the game was very CPU dependent, like GTA.

It would actually be wise to stick with an AMD GPU on a low tier CPU because of async compute in the future. That is, obviously, if you don't plan to upgrade your CPU.


----------



## iinversion

Quote:


> Originally Posted by *ImJJames*
> 
> Ummmnnn no. I personally had FX-6300 with 7970, upgraded to 4770K and FPS were pretty similar *unless the game was very CPU dependent* like GTA.
> 
> It would actually be wise to stick with a AMD GPU with a low tier CPU because of ASYNC COMPUTE in the future. That is obviously if you don't plan to upgrade your CPU.


ummm yes.

Even if the game is slightly CPU bound there will be a large difference with a poor CPU, especially in minimums. GTA isn't even that CPU dependent overall. There are just some spots where it is very CPU dependent just like a lot of games out there today. Just look at the video in my sig. R9 280 doing quite a bit better than a 760 in the beginning portion and then it drops to become worse than a 750 Ti in other areas.

And yeah, typical response. Wait for [fill in the blank] because then it will be better. How long have people in the AMD camp been saying this? Lol. It's always waiting. What about someone who wants better performance now?

By the time DX12 is mainstream none of the GPUs today are going to be relevant, whether it's AMD or Nvidia. Not to mention, AMD has no plans to further improve their DX11 performance. So even when DX12 is mainstream, if you want to play anything that uses DX11, the performance will still suck if your CPU isn't powerful enough to overcome the overhead.

For someone like the person in question, with a low IPC CPU like an FX 6300, I would absolutely recommend an Nvidia card. If he upgrades his CPU to something Intel released in the past 4 years, then I would say it doesn't really matter what he gets.


----------



## rdr09

Quote:


> Originally Posted by *TopicClocker*
> 
> A stock R9 390 is a little faster than a stock GTX 970, the R9 390 is a revised/refreshed R9 290.
> 
> It also has 8GB of VRAM compared to the 4GB/3.5GB VRAM on the GTX 970.
> 
> That 4-4.5GB more VRAM is a really great selling point of the R9 390, as both GPUs can be had for around the same price.
> 
> So far from this, the R9 390 has advantages in VRAM and is a little faster.
> 
> R9 390
> 
> Is faster
> Has significantly more VRAM
> Handles higher resolutions better
> 
> The GTX 970 has NVIDIA features such as PhysX and GameWorks, it also can have a great overclocking headroom to overclock to around the speed of a GTX 980.
> It's also significantly more power efficient than the R9 390, consuming alot less power when both GPUs are under-load, it's roughly about 40% more power efficient.
> 
> GTX 970
> 
> Capable of using NVIDIA features in the Games that support it. (e.g PhysX, GameWorks)
> Can have a large overclocking headroom
> Significantly more power efficient
> NVIDIA Shadow Play, able to record footage with minimal performance impact.
> That 8GB VRAM is not useless, it's just not totally beneficial or even mandatory at 1080p to 1440p yet.
> 
> The R9 390 would be a great candidate for a 4K setup, you'll likely want two of these in cross-fire for it to be worthwhile and a great experience.
> Due to it's 8GB of VRAM you'll have minimal worries about VRAM when running resolutions like this, however we're not totally sure how well this shall suffice in the future, just like any piece of hardware.
> 
> From my list it seems the R9 390 is mostly leaning towards more of the performance side, along with having more VRAM.
> 
> The GTX 970 is more power efficient, and has the capablity of using NVIDIA features such as PhysX and Gameworks which improve the visuals of the games which support it.
> 
> You could say the R9 390 is more future proof purely because of its VRAM; you're most unlikely to be limited by VRAM with the R9 390 than the GTX 970.
> However its also a lot less power efficient than the GTX 970.
> 
> The benefits of having more VRAM on the R9 390 would mostly allow you to run higher texture quality settings on future Games, and possibly anti-aliasing, however you'd also need the GPU power to run that anti-aliasing so a CF set-up could possibly benefit more from this than a single GPU.
> 
> It's really down to what the user wants.


the only thing that's a waste is 0.5 GB.

But, then again, Yung is using his with 4K.


----------



## diggiddi

Quote:


> Originally Posted by *rdr09*
> 
> i for one did not say the 970 is slow. just the 0.5 is. a nvidia engineer attested to this fact. in you tube.


Who this guy ?? lol I keed I keed





@Ad hoc grab a 390 and call it a day, you can always crossfire when you get a better PSU


----------



## Tivan

Quote:


> Originally Posted by *ad hoc*
> 
> Ah, that's pretty much what I was expecting. It looks like I won't be supporting Team Red on the GPU side this time around. Oh well. Thanks for getting back to me so fast.


Keep in mind that it's not an issue for DX9 games, if you're concerned about legacy performance. (Though the combination with an FX CPU might make it more of an issue in DX11(/10?) games than the usual benchmarks suggest.)


----------



## rdr09

Quote:


> Originally Posted by *ad hoc*
> 
> I have an Fx 6300 and an R9 270. I've been planning on buying a 390, but I've been reading some bad stuff about AMD's driver overhead when paired with weaker GPUs. Should I get a 390 or a 970?
> 
> Edit: I also have a very mediocre (if not worse) CX 600 PSU. And I don't have a lot of OC'ing headroom because of my board crappy VRM's.


See the search box up top? Type "i5 bottleneck".
Quote:


> Originally Posted by *diggiddi*
> 
> Who this guy ?? lol I keed I keed
> 
> 
> 
> 
> 
> @Ad hoc grab a 390 and call it a day, you can always crossfire when you get a better PSU


i did not mean to wake the dead from sleep.


----------



## ColdHardCash

Love my 390 upgrade from my 6950. Doesn't even get hot under load unlike what nvidia users say.


----------



## iinversion

Quote:


> Originally Posted by *ColdHardCash*
> 
> Love my 390 upgrade from my 6950. Doesn't even get hot under load unlike what nvidia users say.


It's more about the power consumption than the temps; with a decent aftermarket cooler, high power draw doesn't have to mean high temps, and the temp complaints were mostly about the reference models anyway.

It doesn't matter much either way. Power consumption isn't a deciding factor for most people when choosing a GPU.


----------



## ad hoc

Quote:


> Originally Posted by *Tivan*
> 
> Keep in mind that it's not an issue for DX9 games, if you're concerned about legacy performance (though the combination with an FX CPU might make it more of an issue in DX11(/10?) games than the usual benchmarks suggest).


Most dx9 games are old enough that I don't think I'd have trouble with either GPU anyway.


----------



## Yungbenny911

Quote:


> Originally Posted by *rdr09*
> 
> the only thing that's a waste is 0.5 GB.
> 
> But, then again, Yung is using his with 4K.


Yes Sir! IPS 32"







. that's not me in the photo btw...


Spoiler: Warning: Spoiler!















Did you forget that I bought two 980's and 970's when they first released? I have an apples-to-apples benchmark "showdown" in my sig @ native 4K res (if you need to refresh your memory).

You honestly think I'd advocate for a 970 if it performed the way y'all portray it?


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> Yes Sir! IPS 32"
> 
> 
> 
> 
> 
> 
> 
> . that's not me in the photo btw...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you forget that i bought two 980's and 970's when they first released? I have an apples to apples benchmark "showdown" in my sig @ native 4K res (if you need to refresh your memory).
> 
> You honestly think i'll advocate for a 970 IF it performed the way ya'll portray it to be?


how much vram does watching video eat?

prolly as much as FS.


----------



## PontiacGTX

Quote:


> Originally Posted by *TopicClocker*
> 
> 
> Capable of using NVIDIA features in the Games that support it. (e.g PhysX, GameWorks)
> Can have a large overclocking headroom
> Significantly more power efficient
> NVIDIA Shadow Play, able to record footage with minimal performance impact.
> .



...and AMD can use GameWorks...
OC vs OC, the 290X achieves similar performance.
AMD has VCE...


----------



## Yungbenny911

Quote:


> Originally Posted by *rdr09*
> 
> how much vram does watching video eat?
> 
> prolly as much as FS.


Yeah, i spent $4k+ on my setup for youtube videos; just some change i found laying in the couch









Games below are all @ 4K... all max settings... all Nvidia GameWorks features and yada yada turned on. If it still makes you feel better about your 8GB 390's, keep spreading misinformation about GTX 970's.

For now, I'll stick with Nvidia and wish the best for AMD (because competition is always needed to keep Nvidia from charging an arm and a leg). I'm not saying AMD GPU's are bad in any way; they're great, just not for me. Which goes back to what I said earlier: it's all about what the user wants.


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> Yeah, i spent $4k+ on my setup for youtube videos; just some change i found laying in the couch
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Games below are all @ 4K... All Max Settings... All Nvidia Gameworks features and yadi ya turned on. If it still makes you feel better about your 8Gb 390's, keep spreading misinformation about GTX 970's.
> 
> For now, I'll stick to Nvidia and wish the best for AMD (because competition is always needed to keep Nvidia from charging an arm and leg). I'm not saying AMD GPU's are bad in any way, they're great, but not just for me. Which goes back to what i said earlier, it's all about what the user wants.


Must be running on fumes.


----------



## HeadlessKnight

Quote:


> Originally Posted by *PontiacGTX*
> 
> 
> ...and AMD can use GameWorks...
> OC vs OC, the 290X achieves similar performance.
> AMD has VCE...


From my experience with an ASUS 290X, VCE sucks and it is not on the same level as ShadowPlay. It is buggy, has a higher performance hit, and it doesn't work with OSDs as well as ShadowPlay does.


----------



## Stige

390 vs 970 was clear for me: 390 all the way.

But then I found a deal on a 290X for only 329€ (the cheapest 390 was 360€, a 970 closer to 400€), so there was no question.

Ordered the 290X and I'm gonna get a waterblock for it later with the money I just saved.

Money well spent, I think; at the same clocks it seems to be just the same as the refreshed 390X.


----------



## buttface420

Quote:


> Originally Posted by *Stige*
> 
> 390 vs 970 was clear for me: 390 all the way.
> 
> But then I found a deal on a 290X for only 329€ (the cheapest 390 was 360€, a 970 closer to 400€), so there was no question.
> 
> Ordered the 290X and I'm gonna get a waterblock for it later with the money I just saved.
> 
> Money well spent, I think; at the same clocks it seems to be just the same as the refreshed 390X.


Same here. I was deciding between the 390 and 970, and the 390 was the winner. But then I saw a cheap deal on a 290X and got that instead. Very pleased with it!


----------



## PontiacGTX

Quote:


> Originally Posted by *HeadlessKnight*
> 
> From my experience with an ASUS 290X, VCE sucks and it is not on the same level as ShadowPlay. It is buggy, has a higher performance hit, and it doesn't work with OSDs as well as ShadowPlay does.


It can be used with recording software other than the Gaming Evolved client, and some of those give far better results.


----------



## ImJJames

Quote:


> Originally Posted by *rdr09*
> 
> Must be running on fumes.


This lol


----------



## Yungbenny911

Quote:


> Originally Posted by *rdr09*
> 
> Must be running on fumes.


Would you like to take a screenshot of your 8GB 390, or full 4GB 290's, @ 4K with AA and everything on?

If my 3.5GB 970 gets 40 FPS with everything maxed + AA, your 8GB GPUs that are "MUCH" better than mine should get 60+ FPS, right?









Don't kid yourself; even a 12GB Titan X would "run on fumes" at the settings I had those games at...


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> Would you like to take a screenshot of your 8gb 390, or FULL 4gb 290's @ 4K with AA and everything on?
> 
> If my 3.5Gb 970 gets 40 FPS with everything MAX + AA, your 8gb GPU's that are "MUCH" better than mine should get 60+ FPS right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't kid yourself, even a 12gb Titan X would "run on fumes" at the settings i had those games at...


i knew nvidia found a fix. they were talking about it here (see last two posts) . . .

http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2980

i like to stay above 60 so i lower my settings. in BF4, i set all medium and no AA with my vram going as high as 3750MB per card. my stuff is still floating somewhere in the atlantic.


----------



## Yungbenny911

Quote:


> Originally Posted by *rdr09*
> 
> i knew nvidia found a fix. they were talking about it here (see last two posts) . . .
> 
> http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2980
> 
> i like to stay above 60 so i lower my settings. in BF4, i set all medium and no AA with my vram going as high as 3750MB per card. my stuff is still floating somewhere in the atlantic.


The first guy in that link has a single 970 and wants to play at 4K with 8x MSAA at butter-smooth FPS? Okay... lol. Isn't it common sense that your GPU drops to an idle state when a game sits at a loading screen? They didn't "fix" anything; nothing was broken to begin with.

The only change I noticed with Windows 10 and DX12 is that it allows 3D applications to directly recognize my DDR4 as VRAM (if you check the Assassin's Creed screenshot, you'll see my allocated VRAM was at 11GB+). However, this has been possible for a long time now; it's just more efficient/direct with Windows 10, IMO.

Improve performance? Maybe. Fix? No...


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> First guy in that link has a single 970, and wants to play at 4K x8MSAA with butter smooth FPS? okay... lol. Isn't it common sense to know that your GPU goes to an idle state when a game goes to a "loading screen"? They didn't "fix" anything, nothing was broken to begin with.
> 
> The only change i noticed with windows 10 and DX12 is that it allows 3D applications to directly recognize my DDR4 as V-RAM (If you check the screenshot of assassin's creed, you'll see that my allocated V-RAM was at 11GB+), however; this has been done for a long time now, it's just more efficient/direct with windows 10 IMO.
> 
> Improve performance? Maybe. Fix? No...


teach them.


----------



## Yungbenny911

Quote:


> Originally Posted by *rdr09*
> 
> teach them.


You should also grab a pen and notepad, so you can take down some notes as i speak


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> You should also grab a pen and notepad, so you can take down some notes as i speak


i do. i knew that engineer was lying.


----------



## PontiacGTX

Quote:


> Originally Posted by *Yungbenny911*
> 
> The only change i noticed with windows 10 and DX12 is that it allows 3D applications to directly recognize my DDR4 as V-RAM
> .


which DX12 game are you playing?


----------



## TopicClocker

Quote:


> Originally Posted by *Yungbenny911*
> 
> Yes Sir! IPS 32"
> 
> 
> 
> 
> 
> 
> 
> . that's not me in the photo btw...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you forget that i bought two 980's and 970's when they first released? I have an apples to apples benchmark "showdown" in my sig @ native 4K res (if you need to refresh your memory).
> 
> You honestly think i'll advocate for a 970 IF it performed the way ya'll portray it to be?


How much does overclocking increase performance on multi-GPU systems compared to single-GPU systems?
I know it can be quite linear with a single GPU; a 10% core overclock can get you a performance increase of around 10%.
Quote:


> Originally Posted by *PontiacGTX*
> 
> which DX12 game are you playing?


They might be referring to WDDM 2.0.


----------



## bonami2

I'm running 5760x1080 with Crossfire 7950s.

3GB of VRAM is not enough; it stutters like mad and crashes in GTA V if I go over the limit.

The 320GB/s of bandwidth gives me 5+ FPS in benchmarks and games (memory overclocked from 1250 to 1500MHz).

The 970 has about the same bandwidth.

The R9 390 has 8GB of VRAM on a 512-bit bus.

If you're considering future-proofing, the R9 390 wins.

But they are both weak GPUs for high res compared to a 980 Ti or Fury X.

But we're talking about 1080p, so I would say either of them should be good; the cheaper one is the best.
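The bandwidth figures being thrown around in this thread can be sanity-checked from bus width and effective memory data rate. A quick sketch, using the public reference-spec numbers (board-partner cards and overclocks will land higher):

```python
# Theoretical memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s.
# The specs below are stock reference numbers, not overclocked figures.
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gbs(384, 5.0))  # HD 7950 at stock: 240.0 GB/s
print(bandwidth_gbs(256, 7.0))  # GTX 970: 224.0 GB/s
print(bandwidth_gbs(512, 6.0))  # R9 390: 384.0 GB/s
```

Note that the 390's "512" is its bus width in bits; its reference bandwidth works out to 384GB/s, and an overclocked 7950's memory pushes it well past its stock 240GB/s.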


----------



## TopicClocker

Quote:


> Originally Posted by *bonami2*
> 
> I'm running 5760x1080 with Crossfire 7950s.
> 
> 3GB of VRAM is not enough; it stutters like mad and crashes in GTA V if I go over the limit.
> 
> The 320GB/s of bandwidth gives me 5+ FPS in benchmarks and games (memory overclocked from 1250 to 1500MHz).
> 
> The 970 has about the same bandwidth.
> 
> The R9 390 has 8GB of VRAM on a 512-bit bus.
> 
> If you're considering future-proofing, the R9 390 wins.
> 
> But they are both weak GPUs for high res compared to a 980 Ti or Fury X.
> 
> But we're talking about 1080p, so I would say either of them should be good; the cheaper one is the best.


The Fury X is terribly priced; in the UK both the Fury X and 980 Ti are around £500, and no way would I buy a 4GB card for £500.

It's not a bad card by any means, but it's priced pretty much the same as the 980 Ti for a card that's slower and has less VRAM than the 980 Ti.

However, to its benefit, it does have an AIO water cooler.

HBM must be really expensive; it's quite unusual for AMD to price their hardware like that IMO, it's not really competitive.

Hopefully DX12 is where it'll take off, as GCN is wonderfully architected for it.


----------



## bonami2

Quote:


> Originally Posted by *TopicClocker*
> 
> The Fury X is terribly priced; in the UK both the Fury X and 980 Ti are around £500, and no way would I buy a 4GB card for £500.
> 
> It's not a bad card by any means, but it's priced pretty much the same as the 980 Ti for a card that's slower and has less VRAM than the 980 Ti.
> 
> However, to its benefit, it does have an AIO water cooler.
> 
> HBM must be really expensive; it's quite unusual for AMD to price their hardware like that IMO, it's not really competitive.
> 
> Hopefully DX12 is where it'll take off, as GCN is wonderfully architected for it.


Yes, but seeing that my 7950 Crossfire gains performance from overclocking the VRAM at 5760x1080:

I would say the Fury X has an edge but lacks VRAM.

The 980 Ti has almost enough VRAM but lacks bandwidth.

So only the 8GB R9 390X is interesting for a high-end setup. But it's old and uses power like mad.

I'm gonna wait and see what I can get in a year for [email protected] and maybe water cool it, and be able to push my 5760x1080 setup... And maybe I'm going to get a 4K display to be able to try it


----------



## PontiacGTX

Quote:


> Originally Posted by *bonami2*
> 
> Yes, but seeing that my 7950 Crossfire gains performance from overclocking the VRAM at 5760x1080:
> 
> I would say the Fury X has an edge but lacks VRAM.
> 
> The 980 Ti has almost enough VRAM but lacks bandwidth.
> 
> So only the 8GB R9 390X is interesting for a high-end setup. But it's old and uses power like mad.
> 
> I'm gonna wait and see what I can get in a year for [email protected] and maybe water cool it, and be able to push my 5760x1080 setup... And maybe I'm going to get a 4K display to be able to try it


Or wait for the 2016 cards, in case you already have good enough performance.


----------



## Yungbenny911

Quote:


> Originally Posted by *TopicClocker*
> 
> How well does overclocking increase performance on multi GPU systems compared to single GPU systems?
> I know it can be quite linear with a single GPU, like a 10% Core overclock can get you a performance increment of around 10%


Comparing a percentage increase in overclock to a percentage increase in performance depends on a lot of things: the architecture of the GPU, the resolution played at, the AA and settings of the game, etc. And 95% of the time it's never linear (single or multi GPU); the percentage increase in OC is almost always higher than the performance increase.

Here's *Hitman Absolution Single VS Multi.*


Spoiler: Game Settings Used







*5930k @ 3.6Ghz*

*x1 GTX 970 Stock - 1177Mhz (core)/ 1753Mhz (mem) -- 31 FPS AVG*


Spoiler: Warning: Spoiler!







*x1 GTX 970 OC - 1599Mhz (core)/ 1903Mhz (mem) -- 38 FPS AVG*


Spoiler: Warning: Spoiler!







*x2 GTX 970 Stock - 1177Mhz (core)/ 1753Mhz (mem) -- 58 FPS AVG*


Spoiler: Warning: Spoiler!







*x2 GTX 970 OC - 1599Mhz (core)/ 1903Mhz (mem) -- 70 FPS AVG*


Spoiler: Warning: Spoiler!







With a 35.8% increase in core clock and an 8.6% increase in memory clock, the single GPU got a 22.1% increase in average FPS, while the dual GPUs got 20.3%. Scaling was at 87% stock and 84% OC'd (it would be much better with an OC'd CPU, but I'm too lazy to restart my PC)







.
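Those percentages can be recomputed from the averages posted above. A quick sketch using the rounded FPS values shown (so the per-GPU gains come out a touch higher than the 22.1%/20.3% quoted, which were presumably computed from unrounded averages; the scaling figures match exactly):

```python
# Recompute the overclock and SLI-scaling percentages from the posted averages.
def pct_gain(before: float, after: float) -> float:
    """Percent increase from before to after."""
    return (after - before) / before * 100

print(round(pct_gain(1177, 1599), 1))  # core clock increase: 35.9%
print(round(pct_gain(1753, 1903), 1))  # memory clock increase: 8.6%
print(round(pct_gain(31, 38), 1))      # single-GPU FPS gain: 22.6%
print(round(pct_gain(58, 70), 1))      # dual-GPU FPS gain: 20.7%
print(round(pct_gain(31, 58)))         # SLI gain over one stock card: 87%
print(round(pct_gain(38, 70)))         # SLI gain over one OC'd card: 84%
```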
Quote:


> Originally Posted by *PontiacGTX*
> 
> which DX12 game are you playing?


I must have confused you with my wording: I meant Windows 10 and the DX12 Nvidia drivers, not the API itself. That driver was what enabled WDDM 2.0 on compatible Nvidia GPUs, hence why I stated in my post that *it's just more efficient/direct with windows 10*. I didn't say it's just more efficient/direct with DX12, and that should have made it clear I wasn't talking about the DX12 API; but I guess you're just trying to make me sound dumb by cherry-picking.


----------



## PontiacGTX

Quote:


> Originally Posted by *Yungbenny911*
> 
> I must have confused you with my diction


Indeed, the one who confused things was you.

Quote:


> i meant Windows 10 and the DX12 Nvidia Drivers, not the API itself. That driver was what enabled WDDM 2.0 on Nvidia compatible GPU's


you didn't say that...
Quote:


> Originally Posted by *Yungbenny911*
> 
> The only change i noticed with windows 10 and DX12 is that it allows 3D applications to directly recognize my DDR4 as V-RAM
> .


that is so old

http://www.cnet.com/forums/discussions/pc-video-memory-confusion-dedicated-shared-discrete-oh-my-291147/

Regardless of that, you can't use over 4GB of VRAM with 4GB cards in SLI (the VRAM doesn't pool), and if you go over, the same thing happens that the 970 does above 3.5GB.
Quote:


> Originally Posted by *Yungbenny911*
> 
> I showed them photos of Assassins creed unity @ 4k + AA with over 11Gb allocated video memory, and watchdogs @ 4K + SMAA with over 6gb of allocated video memory.


Allocating VRAM past the physical VRAM will cause slowdowns; it seems you keep missing that:
Quote:


> I logged system usage for the GTX 770 at QHD Ultra and found that the game was trying to allocate nearly 3GB of VRAM use, which on a 2GB card means there's going to be a lot of texture thrashing.


----------



## tridentie

Hi guys!

I am currently planning to buy a video card. Originally I wanted a 980 Ti, but with HBM2 coming rather soon I decided not to buy it.
I am now planning to buy an R9 390 8GB. Originally I wanted to buy a 970, but then decided to go for a Radeon at least once, since I want the card as a temporary solution until HBM2 (my current card is a 9800 GT 512MB







For 1024x768 it still works fine)
*So what do you think?* I play rarely, at 1080p, and usually older games (I prefer consoles). Power consumption does not bother me; I've got a 1200W PSU









My other parts are i7 5820, 32 GB DDR4, Asus Rampage V.

Additional question: how well will an R9 390 work with an R9 380 in CrossFire (maybe I'll buy one just for lulz), and what performance gains will I see?


----------



## bonami2

Quote:


> Originally Posted by *PontiacGTX*
> 
> or wait 2016 cards in case you have good enough performance


Yea im gonna wait for sure









Crossfire works with 100% scaling in some of my games. Seriously, 30 FPS windowed and 60 FPS fullscreen, haha.

Just some stutter from moving fast (the GPUs can't stay loaded equally, so usage and FPS drop), so it's not the best experience vs a single GPU. Still, if moving slowly it's an awesome, solid FPS increase.

Sorry for being off-topic.
Quote:


> Originally Posted by *tridentie*
> 
> Hi guys!
> 
> I am currently planning to buy a videocard. Originally I wanted 980 ti but due to coming of HBM2 rather soon I decided not to buy it.
> I am planning to buy r9 390 8GB. Originally I wanted to buy 970, but then decided to go for radeon at least once, because I wanted to buy videocard as temporary solution before HBM2 (my current is 9800 GT 512 MB
> 
> 
> 
> 
> 
> 
> 
> For 1024x768 it still works fine)
> *So what do you think?* I play rarely and in 1080p and usually in old games. (Prefer consoles). Power consumtion does not bother me - I've got 1200 W PSU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My other parts are i7 5820, 32 GB DDR4, Asus Rampage V.
> 
> Additional question is just how well R9 390 will work with R9 380 in crossfire (maybe I'll buy one just for lulz) and what performance gains I'll have?


It will perform two million times better than the 9800 GT, haha.

You can't CrossFire an R9 380 with an R9 390; only an R9 390 with a 390X, I think.


----------



## tridentie

Quote:


> Originally Posted by *bonami2*
> 
> It will perform two million times better than the 9800 GT, haha.


I used to use a 9600M GT on my laptop, actually








Quote:


> Originally Posted by *bonami2*
> 
> You can't CrossFire an R9 380 with an R9 390; only an R9 390 with a 390X, I think.


I see. Well then it will be alone


----------



## bonami2

Quote:


> Originally Posted by *tridentie*
> 
> I used to use a 9600M GT on my laptop, actually
> 
> 
> 
> 
> 
> 
> 
> 
> I see. Well then it will be alone


Anyways, Crossfire is not that fun.

It does increase FPS, but there are some bugs here and there, and stutter. I'm still happy with my Crossfire; price-wise it's awesome. But I would say for my next upgrade I'm probably going to go single GPU, and keep the extra GPU for benchmarks, or worst case, for when a game is too heavy.


----------



## TopicClocker

Quote:


> Originally Posted by *Yungbenny911*
> 
> Comparing a percentage increase in Overclock to a percentage increase in Performance is dependent on a lot of things: the architecture of the GPU, the resolution played at, the AA/settings of the game e.t.c, and 95% of the time, it's never linear (single or multi GPU), percentage increase in OC is most-likely always higher than performance increase.
> 
> -snip-
> 
> With a 35.8% increase in core clock, and 8.6% increase in mem clock, single GPU got a 22.1% increase in AVG FPS, while dual GPU's got a 20.3%.. Scaling was at 87% stock, and 84% OC (would be much better with an OC CPU, but i'm too lazy to restart my PC )
> 
> 
> 
> 
> 
> 
> 
> .
> I must have confused you with my diction, i meant Windows 10 and the DX12 Nvidia Drivers, not the API itself. That driver was what enabled WDDM 2.0 on Nvidia compatible GPU's, hence why i stated in my post that *it's just more efficient/direct with windows 10*. I didn't say it's just more efficient/direct with DX12, and that should have made you understand that i wasn't talking about DX12 API, but i guess you're just trying to make me sound dumb by cherry picking


Thanks for the detailed post and benchmark runs!









Yeah I know that the performance increase doesn't necessarily scale linearly all the time when overclocking the core and the memory clock.

In a couple of the benchmarks I've run they've come close to scaling linearly, but sometimes they're a couple of percent off.


----------



## HeadlessKnight

Nvidia has done some magic in the 780 Ti drivers: even SOM with Ultra textures (which supposedly requires a 6GB card) at 1080p no longer stutters and plays butter smooth. I tried Watch Dogs too and it doesn't stutter anymore, all with the 355.98 drivers. I have no doubt Nvidia got around the 3.5GB issue of the GTX 970.


----------



## Yungbenny911

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Nvidia has done some magic in the 780 Ti drivers: even SOM with Ultra textures (which supposedly requires a 6GB card) at 1080p no longer stutters and plays butter smooth. I tried Watch Dogs too and it doesn't stutter anymore, all with the 355.98 drivers. I have no doubt Nvidia got around the 3.5GB issue of the GTX 970.


I showed them photos of Assassins creed unity @ 4k + AA with over 11Gb allocated video memory, and watchdogs @ 4K + SMAA with over 6gb of allocated video memory. Some people recognized that performance, while others willingly chose to turn a blind eye and dwell in ignorance...

I personally don't know how they think i would stick to a GPU that'll render all my games unplayable. I mean, i might like Nvidia specific features, but i'm not that much of a fan lol
Quote:


> Originally Posted by *TopicClocker*
> 
> Thanks for the detailed post and benchmark runs!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah I know that the performance increase doesn't necessarily scale linearly all the time when overclocking the core and the memory clock.
> 
> In a couple of the benchmarks I've run they've come close to scaling linearly, but sometimes they're a couple of percent off.


You're welcome!


----------



## iinversion

Quote:


> Originally Posted by *OldBenny*
> 
> Good grief...... that isnt even how Vram works you dumb little kid. When a card runs out of video ram the frame rate can suddenly go from being a smooth 60fps to a choppy 10-15fps and then back up again. It is a very harsh and abrupt drop in frame rates and you will know it when it happens.
> 
> If you honestly think anyone is saying that a card is faster just because it has more vram then you need to start over from the beginning and actually figure out what vram does before commenting.


You realize he was being extremely sarcastic?


----------



## iinversion

Quote:


> Originally Posted by *Yungbenny911*
> 
> Lmao! He/She created an account specifically to post that, i'm flattered!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @OldBenny Idk what to say to you tbh...


Omg I just realized the name he picked was "OldBenny" LMAO

You're just a young whippersnapper now. Sarcasm must not have been around back in the day.


----------



## danielhowk

Quote:


> Originally Posted by *Yungbenny911*
> 
> I showed them photos of Assassins creed unity @ 4k + AA with over 11Gb allocated video memory, and watchdogs @ 4K + SMAA with over 6gb of allocated video memory. Some people recognized that performance, while others willingly chose to turn a blind eye and dwell in ignorance...
> 
> I personally don't know how they think i would stick to a GPU that'll render all my games unplayable. I mean, i might like Nvidia specific features, but i'm not that much of a fan lol
> You're welcome!


Is it worth getting the 970 or the AMD 390? Still confused after reading.
Sorry, I'm a newbie at this.


----------



## bonami2

Quote:


> Originally Posted by *danielhowk*
> 
> Is it worth getting the 970 or the AMD 390? Still confused after reading.
> Sorry, I'm a newbie at this.


970 if you only plan on a single GPU and lower resolutions.

R9 390 for high resolutions and multi-GPU setups.


----------



## TopicClocker

Quote:


> Originally Posted by *danielhowk*
> 
> Is it worth getting the 970 or the AMD 390? Still confused after reading.
> Sorry, I'm a newbie at this.


If you don't mind higher power consumption and want a faster, overall more future-proof GPU, the 390 may be the best choice due to its 8GB of VRAM.


----------



## badr0b0t

For those in the market for a mid-range GPU, maybe this can help you make a decision between a 970 and a 390:



This is the 970-390 challenge we did at Google+ :
https://plus.google.com/u/0/photos/107106593450541689751/albums/6199057026203931921?sqi=101456796741817518541&sqsi=14613362-f70d-4977-a7c7-2883b070b974


----------



## Yungbenny911

Quote:


> Originally Posted by *danielhowk*
> 
> Is it worth getting the 970 or the AMD 390? Still confused after reading.
> Sorry, I'm a newbie at this.


Which company do you think would provide a better gaming experience for you? IMO, when two GPU's are closely comparable to each other, the companies behind them come into question.

All that V-Ram bull they're spreading is all horse poop lol, you won't run into problems with a 970 (if you choose to go that route).


----------



## bonami2

Well, go say that to GTX 680 owners with 2GB.

7950 Crossfire at 5760x1080 has 3GB of VRAM, and I would need 4-5GB of VRAM to push a good framerate in GTA V. Sure, I have AA and stuff on with 1080p panels.

But for 1080p alone I do think 4GB is more than needed, haha.


----------



## TopicClocker

Quote:


> Originally Posted by *Yungbenny911*
> 
> Which company do you think would provide a better gaming experience for you? IMO, when two GPU's are closely comparable to each other, the companies behind them come into question.
> 
> All that V-Ram bull they're spreading is all horse poop lol, *you won't run into problems with a 970* (if you choose to go that route).


That's a bold claim.

Look what happened to the 2GB cards.


----------



## Yungbenny911

Quote:


> Originally Posted by *TopicClocker*
> 
> That's a bold claim.
> 
> Look what happened to the 2GB cards.


This same conversation, over, and over and over again...









Please read...


----------



## mtcn77

Spoiler: Lemme drop this and float away





*nothing to do here*


----------



## TopicClocker

Quote:


> Originally Posted by *Yungbenny911*
> 
> This same conversation, over, and over and over again...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please read...


Textures.

If you haven't got enough VRAM and you're raising the texture quality, the only thing you'll be playing is a slideshow.

I had a 2GB card, a 760 Hawk; in games like Titanfall and Stutter Dogs with Ultra Textures it was nothing but a slide show, I wouldn't be surprised if Shadow of Mordor would be the same too.

Of course, the solution to this would be to drop the texture quality settings.


----------



## bonami2

I like those noobs.

I had stutter with 7950 Crossfire, and usage dropped when going over 3GB of VRAM. I monitored it in game; dropping one setting stopped it.

GTA V, 5760x1080.

In no way would a 970 have been future proof.

Th


----------



## TopicClocker

Quote:


> Originally Posted by *bonami2*
> 
> I like those noob.
> 
> I had stutter with 7950 crossfire and usage drop when going over 3gb vram. Monitored in game dropping 1setting stopped them.
> 
> Gta v 5760x1089
> 
> In no ways a 970 would have being future proof
> 
> Th


But that's 3x 1080p in modern games on a 3GB card, which is driving nearly as many pixels as a 3840x2160 4K/UHD display. Then again, I don't do multi-monitor gaming, so I can't really say much about it and its hardware demand.

What kind of settings were you using if you don't mind me asking?


----------



## bonami2

Quote:


> Originally Posted by *TopicClocker*
> 
> But that's 3x 1080p in modern games on a 3GB card, which is driving nearly as many pixels as a 3840x2160 4K/UHD display. Then again, I don't do multi-monitor gaming, so I can't really say much about it and its hardware demand.
> 
> What kind of settings were you using if you don't mind me asking?


It's 75% as many pixels.

Mostly high settings at 60 FPS. If I push too much AA, the FPS starts stuttering between 20-40 like mad and the game crashes after a while.
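The 75% figure checks out exactly; a quick sketch of the pixel math:

```python
# Compare total pixel counts: 3x 1080p Eyefinity vs a single 4K/UHD panel.
eyefinity = 5760 * 1080  # 6,220,800 pixels
uhd = 3840 * 2160        # 8,294,400 pixels

print(eyefinity, uhd, eyefinity / uhd)  # ratio is exactly 0.75
```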


----------



## danielhowk

Quote:


> Originally Posted by *bonami2*
> 
> 970 if you only plan on a single GPU and lower resolutions.
> 
> R9 390 for high resolutions and multi-GPU setups.


Quote:


> Originally Posted by *badr0b0t*
> 
> For those in the market for a mid-range GPU, maybe this can help you make a decision between a 970 and a 390:
> 
> 
> 
> This is the 970-390 challenge we did at Google+ :
> https://plus.google.com/u/0/photos/107106593450541689751/albums/6199057026203931921?sqi=101456796741817518541&sqsi=14613362-f70d-4977-a7c7-2883b070b974


Quote:


> Originally Posted by *TopicClocker*
> 
> If you don't mind higher power consumption and want a faster, overall more future-proof GPU, the 390 may be the best choice due to its 8GB of VRAM.


So you guys would choose the R9 390 over the GTX 970, right?
Quote:


> Originally Posted by *Yungbenny911*
> 
> Which company do you think would provide a better gaming experience for you? IMO, when two GPU's are closely comparable to each other, the companies behind them come into question.
> 
> All that V-Ram bull they're spreading is all horse poop lol, you won't run into problems with a 970 (if you choose to go that route).


Quote:


> Originally Posted by *bonami2*
> 
> Well, go say that to GTX 680 owners with 2GB.
> 
> 7950 Crossfire at 5760x1080 has 3GB of VRAM, and I would need 4-5GB of VRAM to push a good framerate in GTA V. Sure, I have AA and stuff on with 1080p panels.
> 
> But for 1080p alone I do think 4GB is more than needed, haha.


Quote:


> Originally Posted by *TopicClocker*
> 
> That's a bold claim.
> 
> Look what happened to the 2GB cards.


----------



## bonami2

Quote:


> Originally Posted by *danielhowk*
> 
> So you guys would choose the R9 390 over the GTX 970, right?


Yes, with 8GB of VRAM.

If I had purchased a 670 back 2 years ago, I would not be here with my Eyefinity setup; I purchased a 7950 because it has more VRAM and bandwidth. Today, adding one in Crossfire made my purchase 300000% worth it.

A 970 is not worth the money currently. The 980 Ti is worth it for higher-end setups.


----------



## neurotix

Quote:


> Originally Posted by *Yungbenny911*
> 
> I see ya'll throwing benchmarks here and there saying 290's/390's are better than 970's? And 980's/980ti's are the only Nvidia cards worth buying? Haha
> 
> @neurotix You think your 290 is fast?
> 
> 
> 
> 
> 
> 
> 
> . Here's my 5 months old score.
> 
> snip
> 
> http://www.3dmark.com/fs/4659651


This is very late but, you're using a 5960x.

Disable HT so you're running 8 cores- which would still be faster than my 4770k because they are physical cores without HT- and you might have a point.

Your physics score is inflating your total score, if you did that run with a 4770k your score would be lower than my 290s.









Here is my best Fire Strike run on hwbot. The graphics score is 400 points higher than yours, and your cards are at 1658MHz while mine were at 1200MHz. That's a 400-point higher graphics score at 458MHz lower clocks, which proves my point and everything I was saying.

Also, I never even posted my Fire Strike score, I posted my Valley Extreme HD score, do you know how this works? If you want to show me up like a chump, then post something... *from the same benchmark*. Preferably something that's higher. But that probably won't happen. So, get to running Valley Extreme HD using the preset and post the results if you want me to take you seriously.









Further, 22 points on hwbot with 5960x and two 970s? I saw your profile. The things I could do with those chips. lol









I also just went through this with someone else. People have the hardware but don't understand how the benchmarks work. You can't compare total scores against people with different CPUs, even if you have the same graphics cards! Graphics score is the only thing that matters unless you have the same CPU, which is why in my comparison of Valley scores with 970 SLI, I made sure the person I compared with had a similar processor (he has a 4790k and I have a 4770k...)
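The totals-vs-graphics point can be sketched numerically. 3DMark-style overall scores combine the subscores with something like a weighted harmonic mean, so a stronger CPU raises the total even when the GPU result is identical. A minimal sketch; the 0.85/0.15 weights and the sample scores are illustrative assumptions, not Futuremark's published figures:

```python
def overall_score(graphics, physics, w_gpu=0.85, w_cpu=0.15):
    """Weighted harmonic mean of subscores, 3DMark-style.
    The 0.85/0.15 weights are illustrative placeholders."""
    return (w_gpu + w_cpu) / (w_gpu / graphics + w_cpu / physics)

# Identical GPU result, two different CPUs:
quad_core = overall_score(20000, physics=12000)  # 4770k-class physics score
octa_core = overall_score(20000, physics=20000)  # 5960X-class physics score
print(round(quad_core), round(octa_core))
```

With the same 20,000 graphics score, the stronger physics result lifts the total by roughly 1,800 points here, which is why only graphics subscores are comparable across different CPUs.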


----------



## danielhowk

Quote:


> Originally Posted by *bonami2*
> 
> Yes, with 8GB of VRAM.
> 
> If I had purchased a 670 two years ago I wouldn't be here with my Eyefinity setup; I bought a 7950 because it has more VRAM and bandwidth. Adding a second one in Crossfire today made the purchase absolutely worth it.
> 
> A 970 isn't worth the money currently. The 980 Ti is worth it for a higher-end setup.


Quote:


> Originally Posted by *neurotix*
> 
> This is very late but, you're using a 5960x.
> 
> Disable HT so you're running 8 cores- which would still be faster than my 4770k because they are physical cores without HT- and you might have a point.
> 
> Your physics score is inflating your total score, if you did that run with a 4770k your score would be lower than my 290s.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is my best Fire Strike run on hwbot. The graphics score is 400 points higher than yours, and your cards are at 1658MHz while mine were at 1200MHz. That's a 400-point higher graphics score at 458MHz lower clocks, which proves my point and everything I was saying.
> 
> Also, I never even posted my Fire Strike score, I posted my Valley Extreme HD score, do you know how this works? If you want to show me up like a chump, then post something... *from the same benchmark*. Preferably something that's higher. But that probably won't happen. So, get to running Valley Extreme HD using the preset and post the results if you want me to take you seriously.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Further, 22 points on hwbot with 5960x? I saw your profile. The things I could do with that chip. lol


For 1080p, is the 970 better than the R9 390?
And wouldn't the R9 390 eventually beat the 970 as graphics drivers update,
since it's newer and more "future proof"?


----------



## Yungbenny911

Quote:


> Originally Posted by *TopicClocker*
> 
> Textures.
> 
> If you haven't got enough VRAM and you're raising the texture quality, the only thing you'll be playing is a slideshow.
> 
> I had a 2GB card, a 760 Hawk; in games like Titanfall and Stutter Dogs with Ultra Textures it was nothing but a slide show, I wouldn't be surprised if Shadow of Mordor would be the same too.
> 
> Of-course, the solution to this would be to drop the texture quality settings.


So what you're saying is that going from Low to Ultra textures does not affect the processing load on a GPU, right? If your 760 had had 12GB of VRAM, you would have been able to play Stutter Dogs at Ultra (everything), with 8xMSAA @ 1080p or higher?


----------



## bonami2

Quote:


> Originally Posted by *Yungbenny911*
> 
> So what you're saying is that going from Low to Ultra textures does not affect the processing load on a GPU, right? If your 760 had had 12GB of VRAM, you would have been able to play Stutter Dogs at Ultra (everything), with 8xMSAA @ 1080p or higher?


Quad-SLI 760s with 4GB

vs. quad-Crossfire R9 390Xs with 8GB:

which would scale better and be more future proof?

A high-bandwidth GPU that can push high resolutions and has more than enough VRAM, or a crappy GPU with VRAM?

Sure, if he never plans to Crossfire or SLI in the future it won't change a lot, except when he goes to sell it.


----------



## Yungbenny911

Quote:


> Originally Posted by *neurotix*
> 
> This is very late but, you're using a 5960x.
> 
> Disable HT so you're running 8 cores- which would still be faster than my 4770k because they are physical cores without HT- and you might have a point.
> 
> Your physics score is inflating your total score, if you did that run with a 4770k your score would be lower than my 290s.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is my best Fire Strike run on hwbot. The graphics score is 400 points higher than yours, and your cards are at 1658MHz while mine were at 1200MHz. That's a 400-point higher graphics score at 458MHz lower clocks, which proves my point and everything I was saying.
> 
> Also, I never even posted my Fire Strike score, I posted my Valley Extreme HD score, do you know how this works? If you want to show me up like a chump, then post something... *from the same benchmark*. Preferably something that's higher. But that probably won't happen. So, get to running Valley Extreme HD using the preset and post the results if you want me to take you seriously.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Further, 22 points on hwbot with 5960x? I saw your profile. The things I could do with that chip. lol


1) I have a 5930k, not 5960X lol

2) Show me a VALID score with a link to it. There is NO WAY in the world that you'll beat my 970's at 1658Mhz with 1200Mhz 290's.

Ask @rdr09 I'm beating his score by 1000 points on the graphics, and he's clocked at 1290Mhz on his 290's. Unless your 290's are "Blessed by the Gods", or you're disabling Tess lol (which would be really disappointing)

Rdr09's score link == http://www.3dmark.com/3dm/4644282


----------



## bonami2

And don't even get me started on Ubugsoft; the most unoptimised games I've ever played are all from them.


----------



## bonami2

Quote:


> Originally Posted by *Yungbenny911*
> 
> 1) I have a 5930k, not 5960X lol
> 
> 2) Show me a VALID score with a link to it. There is NO WAY in the world that you'll beat my 970's at 1658Mhz with 1200Mhz 290's.
> 
> Ask @rdr09 I'm beating his score by 1000 points on graphics, and he's clocked at 1290MHz on his 290's. Unless your 290's are "Blessed by the Gods", or you're disabling Tess lol (which would be really disappointing)


Funny, the R9 390X has more bandwidth, so in multi-GPU at 4K or multi-monitor it should crush your score.

I get http://www.3dmark.com/fs/5599209

with my 7950s at 1100MHz core; that ain't far behind.


----------



## iinversion

Quote:


> Originally Posted by *bonami2*
> 
> Funny, the R9 390X has more bandwidth, so in multi-GPU at 4K or multi-monitor it should crush your score.
> 
> I get http://www.3dmark.com/fs/5599209
> 
> with my 7950s at 1100MHz core; that ain't far behind.


My 980 @ 1500 hit that GPU score lol.


----------



## Yungbenny911

The dude is beating @rdr09's score by over 2k points at a 90MHz lower clock speed, and doesn't seem to have a link to his scores lol.

@neurotix I'm still waiting...


----------



## bonami2

Quote:


> Originally Posted by *iinversion*
> 
> My 980 @ 1500 hit that GPU score lol.


That means you've reached 980 Ti performance. Great.









But you're still stuck with 300GB/s+.

I got about a 1000-point increase from overclocking the VRAM on those crappy 7950s from 244GB/s to 320GB/s...
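That bandwidth jump is easy to sanity-check: GDDR5 transfers 4 bits per pin per memory clock, so bandwidth scales linearly with the memory overclock. A quick sketch, assuming the stock 7950 figures of a 1250MHz memory clock on a 384-bit bus:

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    # GDDR5 is quad-pumped: 4 transfers per pin per memory clock.
    # MHz * 4 transfers * bytes per transfer, in decimal GB/s as vendors quote it.
    return mem_clock_mhz * 4 * (bus_width_bits // 8) / 1000

print(gddr5_bandwidth_gbs(1250, 384))  # stock 7950: 240 GB/s
print(gddr5_bandwidth_gbs(1667, 384))  # overclocked memory: ~320 GB/s
print(gddr5_bandwidth_gbs(1250, 512))  # a 290/390-class 512-bit bus at the same clock
```

The same arithmetic shows why the 290/390's 512-bit bus reaches 320GB/s without any memory overclock at all.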

That's the reason I'm waiting for HBM with 8GB+.

A 7950 can do 1300+MHz on the core on air, so add 2k to that score.









For a 4-year-old GPU, it's pretty good.


----------



## iinversion

Memory bandwidth really isn't an issue in any games currently. The Fury X pretty much proved that to us; there aren't really any tangible gains from having the massive HBM bandwidth.


----------



## bonami2

Quote:


> Originally Posted by *iinversion*
> 
> Memory bandwidth really isn't an issue in any games currently. The Fury X pretty much proved that to us. There isn't really any tangible gains from having the massive HBM bandwidth.


Well, the Unigine Heaven score increased too, and Unigine is pretty ugly and not that hard on the GPU. You must remember: faster RAM = lower latency = increased speed.

CPU performance is affected by RAM; I don't see why a GPU's wouldn't be.

But I'm pushing 5760x1080, so maybe it's harder on VRAM than 4K. Wider field of view, a lot more data to load, maybe. I don't know how that works, huh.


----------



## iinversion

Quote:


> Originally Posted by *bonami2*
> 
> Well unigine heaven increased score too. Considering unigine is pretty ugly and not that hard on gpu. You most remember faster ram = lower latency = increase speed
> 
> Cpu performance is affected by ram. i dont see why a gpu would not.
> 
> But im pushing 5760x1080 so maybe it harder than 4k on vram. Wider field of view lot more data to load maybe i dont know how that work uh


Yes, benchmarks definitely benefit from more bandwidth, but what I'm saying is that the gain is pretty small regardless, even in games. The gain from OC'ing the Fury X's bandwidth is even smaller since it is already high.

Of course, more bandwidth is always good, but I don't really think you can say a certain GPU is bad because it is lacking in bandwidth. It doesn't really matter all that much unless it is just super low.

A good example would be the 660 Ti which was definitely gimped by the memory bandwidth.


----------



## Yungbenny911

Quote:


> Originally Posted by *bonami2*
> 
> Funny the r9 390x as more bandwith so in multi gpu at 4k or multimonitor it should crush your score
> 
> I do http://www.3dmark.com/fs/5599209
> 
> with 1100core 7950 aint that far behind


Beat my score? that's possible with the right OC, but crush my score at 4K? lol, that's a little too far haha

http://www.3dmark.com/fs/4638111


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> 1) I have a 5930k, not 5960X lol
> 
> 2) Show me a VALID score with a link to it. There is NO WAY in the world that you'll beat my 970's at 1658Mhz with 1200Mhz 290's.
> 
> Ask @rdr09 I'm beating his score by 1000 points on the graphics, and he's clocked at 1290Mhz on his 290's. Unless your 290's are "Blessed by the Gods", or you're disabling Tess lol (which would be really disappointing)
> 
> Rdr09's score link == http://www.3dmark.com/3dm/4644282


that's an old run using Win7. others with similar setups at lower clocks than mine get over 27K in graphics. my stuff is sitting along the coast of Somalia, so i can't test.

in any case, they are all very similar in performance. but the 390s are indeed faster than the 290s. here is a 290 flashed with a 390 BIOS . . .

http://www.3dmark.com/fs/5780378

can you beat that with either of your 970s?


----------



## Yungbenny911

We can always cherry pick...

http://www.3dmark.com/3dm/4517506


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> We can always cherry pick...
> 
> http://www.3dmark.com/3dm/4517506


Lovely. 1600 vs 1200.

Told you. they are similar.


----------



## Yungbenny911

Quote:


> Originally Posted by *rdr09*
> 
> Lovely. 1600 vs 1200.
> 
> Told you. they are similar.


2014 windows 7 score, vs 2015 windows 10 score... Yeah, keep going...

I thought that was what you were complaining about earlier? That your score is old. So the 970 doesn't get a pass? Only when it applies to you?


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> 2014 windows 7 score, vs 2015 windows 10 score... Yeah, keep going...
> 
> I thought that was what you were complaining about earlier? That your score is old. So the 970 doesn't get a pass? Only when it applies to you?


calm down. they are just gpus. both will work for op's use, especially now that the 3.5GB issue has been resolved.









BTW, those clocks are reaching 980 levels: 1200 for the 390 and 1600 for the 970. Lovely.


----------



## neurotix

Quote:


> Originally Posted by *Yungbenny911*
> 
> 1) I have a 5930k, not 5960X lol
> 
> 2) Show me a VALID score with a link to it. There is NO WAY in the world that you'll beat my 970's at 1658Mhz with 1200Mhz 290's.
> 
> Ask @rdr09 I'm beating his score by 1000 points on the graphics, and he's clocked at 1290Mhz on his 290's. Unless your 290's are "Blessed by the Gods", or you're disabling Tess lol (which would be really disappointing)
> 
> Rdr09's score link == http://www.3dmark.com/3dm/4644282


1) The physics score would still make your total score higher, and incomparable. Unless you disable two cores and rerun the tests, or we only compare graphics scores, there's no point. And I've already shown a graphics score higher than yours at hundreds of MHz lower clocks.

2) hwbot, the biggest professional overclocking website/competition holder on the planet, does not require 3dmark scores to be validated EXCEPT in the case of top 20 world records. If hwbot accepts my scores, they are fine, and I have no reason to show you validation of any kind. My results are legit. Additionally, tweaks like disabling tess, as well as other OS tweaks, are allowed there. That's why my score is so high, because I actually fricken know what I'm doing. Also, is rdr09 on hwbot? I just looked and he isn't. Is he #32 in the US enthusiast league (formerly #14?) Well I am. I have my cred and my results to back me up. Where's yours?

3) Run Valley Extreme HD and post the results. Otherwise, I will simply block you and ignore anything further you say.

I've already shown, twice, that the 290 matches or outperforms the 970, without a gimped 0.5GB of RAM, at a lower price (used), at least in some popular benchmarks. If I get 10 fewer fps in games compared to a 970, so what; the price of the 290 is much better. I mean, I'm running Witcher 3 at 5760x1080, all Ultra, no AA, at 60 fps on my setup. That's good enough for me because I don't need more than 60 fps. My minimums might be worse, but that's okay, it's still playable to me. Further, pretty much nothing would convince me to use Nvidia because, from everything I've heard, Nvidia Surround sucks compared to Eyefinity. So I really have no choice but to use AMD.

That's really all I have to say in this thread otherwise. Anyone who can read and doesn't selectively ignore my statements or points will see I've made a convincing argument, at least in benchmarks.


----------



## neurotix

rdr09, your single Fire Strike you posted could be higher with tess off, even.









http://hwbot.org/submission/2677510_neurotix_3dmark___fire_strike_radeon_r9_290_12010_marks

That's the best I've done with one 290 in Fire Strike, on Win 7 (I refuse to use anything newer.)


----------



## Yungbenny911

haha, So you admit Tess was off in your runs?

If so, that's a shame...


----------



## mtcn77

Do comparisons with a 1.2 GHz 290x count?
Post #263
Footnote: converting the results to a 1.2GHz 290 versus 970 format would indicate the GTX [email protected] ties with the R9 [email protected] MHz at 4K with 0/2/4 SSAA via colour compression, while 2/4/8 MSAA requires the [email protected] MHz to level the performance gradient between them.
In summary, the 290 can make liberal use of EQAA, which is superior to normal forms of AA (better sample grading, no VRAM footprint), while CSAA is unfortunately off-limits in the GeForce driver.


----------



## rdr09

oh stop it. this stuff isn't helping the op. either gpu will do for the op, like i said. i think there was only one game that really ate VRAM, even at 1080p, and that was Dying Light.

the system can always fall back on system RAM if needed.


----------



## neurotix

No, it's allowed on hwbot.

The reason they allow it is because AMD has inferior hardware tessellation support. There's technical reasons for this, go read some anandtech articles or something. This is common knowledge.

AGAIN, hwbot is where people like 8Pack, k|ngp|n and our own FtW have come from, if you don't support hwbot and the overclocking and benching community, on both pro and amateur (me) levels, then I have no respect for you and your opinion does not matter to me. Since you refuse to do what I ask (because your score will suck), only posted results from one benchmark that didn't prove anything, you aren't helping your case any. Especially since I've shown FACTS and PROOF, across numerous benchmarks, that show that the 290 is as good or better than the 970 in a few tests.

I've been benching and submitting there for 4 years and have gained numerous points for the team. I've bought GPUs just to bench them and sell them. I've probably spent more time running benchmarks on my machine than actually gaming. I've spent countless nights running benches all night. I have a virtual database of every benchmark I've ever run on hwbot.

Go look at my profile and my points, Top 15 Global + World Record and Top 20 Hardware Points.

This isn't the first time someone has brought up the tess thing, either. If you don't understand and recognize the importance of hwbot to overclocking, there's not much else I can say. My cred and proof is there and it's accepted on hwbot.

Tip: if you want to debate someone, you don't respond with one sentence replies and you actually COUNTER your opponent by answering and/or refuting his statements. You also actually click on the links people post and read them (which I strongly doubt you did; I highly doubt you even read and comprehended my posts fully.) Either disprove me by posting superior benchmarks or admit that maybe the 290 is as good as the 970, and maybe you got ripped off (I could almost buy two used 290s for the price of one new 970.)

Keep rationalizing your purchase. It will be obsolete and unsupported in 2 years time with no new driver improvements, just like the 780ti is now. Every driver update, I get more performance out of my cards, and AMD regularly adds improvements for very old cards like 5870s (which are STILL supported.) I feel bad for anyone who still has a 680.

That aside, the 290 is still a great card. Even IF the 970 was getting say, 10 more fps in Valley and a few thousand higher points graphics score in Fire Strike, the 290 (especially used) would still be a good value, as it would be a certain percentage of the 970s performance for a lot less money. (Say 90% for $100 less or whatever, as an example.)

I can't recommend the 390 or 390X, new. If you look at the chart a little ways down here, you'll see that clearly the 390 and 390X are rebrands. They are GCN 1.1, Hawaii Pro and Hawaii XT, with the same amount of shaders and everything else. They are not new technology. The only differences are a different BIOS and a slightly improved tessellation unit. rdr09, the improved tessellation is probably the reason for the increase in performance you're noticing, if you run your benches with tess on, but that doesn't change the fact that realistically, they are still the same damn card. A used 290 at sub $200 on Ebay is probably still the best price/performance option available. In fact, I kind of feel sorry for anyone using anything LESS than a 290 in 2015, considering they're so cheap. Someone with a 7970, 280X, 270X, etc could very easily sell their first gen GCN card for $100 and then spend $100 more for a used 290.










The only real reason to get a 390 is the 8GB of RAM, of course, but only if you have a 4K monitor. That's it. Hell, even Witcher 3 on my setup on Ultra, at 5760x1080, only uses around 3GB. I've yet to find a game that has gone over 4GB in Eyefinity. Some of them are even laughable; Skyrim only uses 1GB, and that's with a bunch of high-res textures etc.


----------



## Yungbenny911

Quote:


> Originally Posted by *rdr09*
> 
> oh stop it. stuff not helping op. either gpu will do for op like i said. i think there was only one game that really ate vram much even in 1080 and that was dying light.
> 
> system can always access ram if needed.


My apologies lol, but I mean... he wants to post his tweaked score and flaunt it like it's the normal score 290's get on average. Are we going to be disabling Tess in games? So why post an invalid score?

I'm simply trying to let people know that they should not be misled into buying what they're not comfortable with, all because most of you love to spread myths about the world coming to an end due to 3.5GB on the 970.

Both GPUs are great at what they do, and would provide a great gaming experience at ALL resolutions deemed playable, so it all comes down to "Do I like GameWorks, or do I like TressFX", you know...


----------



## neurotix

Get this:

http://www.eteknix.com/8pack-smashes-3dmark-world-record-4-msi-r9-290x-lightnings-1300mhz/

8Pack, the best pro overclocker in the world, beat the 3DMark Fire Strike world record with four 290Xs on LN2 at 1475MHz. (The article says 1300MHz, but check the screenshot in his submission. If you even click on it.)

It was big news. This was a time when everyone was using the 780ti and he used 290X.

http://www.3dmark.com/fs/2524204

That's the 3dmark validation and *tessellation is off.*

Here's the hwbot submission: http://hwbot.org/submission/2599849_8_pack_3dmark___fire_strike_4x_radeon_r9_290x_36731_marks

Would you go up to this guy's face (he's a weightlifter btw) at an overclocking convention, and tell the best overclocker in the world that his scores are invalid or don't count because he had tess off? (Which is ALLOWED?)

Do you even know what hwbot IS?

Btw, I reported one of your submissions on your dusty hwbot profile for not including a CPU-Z memory tab in the screenshot. This is required. It was the dual GTX 770 submission. You might want to check that.









EDIT: I don't use tessellation in any of my games because I don't notice a difference in anything but frame rate with it on or off. I don't see the difference in image quality at all when gaming. It's basically a MacGuffin that does nothing; it just slows everything down. If more people didn't crank up tessellation, AA and post-processing, they'd see their games running much faster and smoother with minimal loss in quality. My games look great without this stuff and run much smoother.







Thus, I always recommend people turn these features OFF if they care more about high fps and smooth gameplay.


----------



## TopicClocker

Quote:


> Originally Posted by *Yungbenny911*
> 
> My apologies lol, but i mean.. He wants to post his tweaked score, and flaunt like it's the normal score 290's get on average. Are we going to be disabling TESS in games? So why post an invalid score?
> 
> I'm simply trying to let people know that they should not be misled into buying what they're not comfortable with, all because most of you love to spread myths about the world coming to an end due to 3.5gb on the 970.
> 
> Both GPU's are great at what they do, and would provide a great gaming experience on ALL resolutions deemed playable, so it all comes down to "Do i like gameworks, or do i like TressFX" you know..


Why don't you bench against neurotix?


----------



## HeadlessKnight

Quote:


> Originally Posted by *neurotix*
> 
> No, it's allowed on hwbot.
> 
> Keep rationalizing your purchase. It will be obsolete and unsupported in 2 years time with no new driver improvements, just like the 780ti is now. Every driver update, I get more performance out of my cards, and AMD regularly adds improvements for very old cards like 5870s (which are STILL supported.) I feel bad for anyone who still has a 680.


There are in fact driver improvements, and the most important one for the 780 Ti is the improved memory management. With recent drivers I played both Watch Dogs and Shadow of Mordor with Ultra textures, and neither stutters on 355.98, while they were a stuttering hell on old drivers.
That all happened after the raging storm on their forums. I doubt they would have done anything if people had stayed silent about it. And I hope they learned something.

Here is a GTX 970 (1434) Vs R9 390 (1150 MHz) benchmarks across many games.

https://www.youtube.com/watch?v=aEiknMpglao


----------



## danielhowk

Quote:


> Originally Posted by *rdr09*
> 
> calm down. they are just gpus. both will work for op's use, especially now that the 3.5GB issue has been resolved.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW, those clocks are reaching 980 levels: 1200 for the 390 and 1600 for the 970. Lovely.


Did they fix the 3.5GB issue? With a new model or a driver?


----------



## neurotix

Quote:


> Originally Posted by *TopicClocker*
> 
> Why don't you bench against neurotix?


I've been trying to get him to post Valley Extreme HD with his SLI 970s but he keeps ignoring me.







Put up or shut up









Again, nothing about my score is "invalid" if hwbot accepts it and if (arguably) the best overclocker in the world, with more first place world overclocking records than anyone in the world, does the same thing when using AMD hardware...and gets news coverage because of it.

Oh, I'm sure "Yungbenny" would get better scores than me if he disabled tess too. Except he can't!







Nvidia won't let him!


----------



## Yungbenny911

Quote:


> Originally Posted by *TopicClocker*
> 
> Why don't you bench against neurotix?


Duh, i want to so bad lol.









@neurotix Quit yapping. Which of these games do you have? And if you have others, list all the games you have that include a built-in benchmark. We're sticking to games only, because that's what people want to know: which GPU is better for gaming.

Bioshock: Infinite
Tomb raider Ultimate settings
Metro 2033 (no PhysX)
Metro LL (no PhysX)
Batman Arkham City DX11 (no PhysX)
Batman Arkham Knight (no PhysX/Gameworks)
Sleeping Dogs Extreme AA
Hitman Absolution
Mafia 2 (no PhysX)
Max Payne 3
Resident Evil 6 Benchmark

All the games listed above have built-in benchmarks, and I have them all installed on my system. I'd like others to participate also. Time for some 4K benches.


----------



## neurotix

Uh, no. I don't have any of those games and I'm not interested in benching them.

I want to see Valley Extreme HD.

And again, you need to actually read what people say and counter it or respond to it, e.g. the stuff about tess and 8 Pack.









You've been blocked.


----------



## iinversion

Quote:


> Originally Posted by *neurotix*
> 
> Uh, no. I don't have any of those games and I'm not interested in benching them.
> 
> I want to see Valley Extreme HD.
> 
> And again, you need to actually read what people say and counter it or respond to it, e.g. the stuff about tess and 8 Pack.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You've been blocked.


You keep asking for Valley because you know Maxwell gets poor results in Valley compared to like Kepler or AMD.

Why not Heaven? Why not anything else?


----------



## iRUSH

Quote:


> Originally Posted by *neurotix*
> 
> Uh, no. I don't have any of those games and I'm not interested in benching them.
> 
> *I want to see Valley Extreme HD*.
> 
> And again, you need to actually read what people say and counter it or respond to it, e.g. the stuff about tess and 8 Pack.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You've been blocked.


We don't "play" benchmarks, we play games.

So since NV is forced to run tess and AMD isn't per that benchmark, you're uninterested in an apples-to-apples comparison?

Perhaps I'm reading this argument wrong lol.

If so, I apologize


----------



## neurotix

And did anyone have any problem with claiming the 780ti was "better" than AMD or the 290X when it was the king of Valley? (It still outperforms the 980 even now?)

How about all the old 3dmark benches consistently performing better on ANYTHING Nvidia? (3dmark06, 3dmark Vantage, and anything older?)

Nvidia does way better on Aquamark3?

For years and years and years I've dealt with getting lower benchmark scores since I use AMD, in pretty much everything, so I really don't care. Sure, maybe all it means is that the 290 is better at Valley, but hey...









The point isn't that I know he's gonna do badly. The point is that I posted Valley runs, so he should *counter with Valley runs*, because that's what I posted. He should not counter with one single Fire Strike run with a total score artificially inflated by higher physics (5930k vs 4770k), then ignore every bench I post in response because "tess off", then fail to address the FACT that my scores are valid because hwbot accepts them, which I also countered with the fact that world-record runs by extreme overclockers on AMD GPUs are accepted with tess off, which he also ignored (because he can't think of any way to respond because I have a point? I dunno).

Look, even IF I did have and run all those games, the fact is that it doesn't matter because I already know the results. In games like Sleeping Dogs, Tomb Raider, possibly Bioshock Infinite, and maybe others, the AMD card will do better. In games like the Batman games, probably Metro, RE6 or anything else with Nvidia Gameworks (Batman has Gameworks right?) the Nvidia card will do better. At the end of it, both cards are going to be within 10 fps of each other, or otherwise be within 10% of each other. This just proves the point that the cards are basically equal.

But it's only after the fact that I argue that users like Yungbenny will change their tune and concede that the 290/390 are still good cards and the 970 certainly isn't significantly better than them. However, they will still refuse to admit that yeah, maybe the 290 is better, because it's getting the same benchmark scores at 458MHz lower clocks! Gasp! I guess it hurts your e-peen when, even though your cards run at 1658MHz and overclock like beasts, *gasp*, they're only *matching* the 290 at much lower clocks!









Look, I've spent enough time watching and replying to this thread already. I don't have time to debate someone who is not going to seriously answer my posts, and refuses to counter my points (or probably even read my posts fully- what can you expect from someone who names himself "Yungbenny"?) That is time that is better spent gaming, e.g. my sig, I have PS3 games to finish. Adieu.


----------



## mtcn77

Gosh, the lengths people will go to to quell the dissonance of purchase justification.
We know you are spectators in the field, guys, with no bone in this fight; do you literally have to beat yourselves up over it?
Stay classy, my friends.


----------



## bonami2

Ah man, all these fights.

And I'm here telling the truth.

Where are the ******* Nvidia 680 SLI owners playing GTA V in Surround or at 4K?

There isn't a single one.

8GB VRAM all the way.


----------



## neurotix

Quote:


> Originally Posted by *iRUSH*
> 
> We don't "play" benchmarks, we play games.
> 
> So since NV is forced to run tess and AMD isn't per that benchmark, you're uninterested is an apples to apples comparison?
> 
> Perhaps I'm reading this argument wrong lol.
> 
> If so, I apologize


Tess disabled for Fire Strike, the cards are neck and neck; with it on, they're still neck and neck. Even with tess enabled in Valley, my cards will outperform SLI 970s significantly and maybe even 980s, assuming similar CPUs are used (e.g. an Intel 8-thread chip). I'm just making the point that I'm sick of people telling me my benchmarks are invalid because tess is off, when it's allowed on hwbot and the best AMD overclockers in the world use the same tweak. He is not the first person to spout this nonsense. My scores count, or I wouldn't be ranked relatively high on hwbot for an amateur, non-sponsored, non-extreme-cooling, spare-time enthusiast.

He doesn't really have to run Valley because he probably couldn't even do it right- pick the right preset and disable some of his CPU cores so we get a direct comparison. It's like benching an FX vs an Intel or something. That's apples and oranges. This was the problem, the only thing he ever posted was one single Fire Strike run, and was comparing the total score to mine, when he has 4 more threads than I do and a higher physics score, which inflates the total score. You have to either run with the same amount of cores, or just compare graphics scores only. Besides, I posted scores showing my 290s at 1200mhz creaming 970 SLI at a 1600mhz in Valley (I got 20 fps more), they're quite a few pages back now. So I don't need him to run Valley.

As far as benching vs games, what do you think hwbot is? That's where people "play" benchmarks. That's besides the point anyway, as what I said about the 290 vs the 970 in games still stands. In some games the 290 is better and in some, the 970 will be better, and it's usually because of Gameworks.

If we tested these things at say, 5760x1080 or 4k (he has one, I have the other), in a non-Gameworks game, with something like 8xAA to really get the VRAM working, then we could really know for sure. Mind you, the 290 has a much wider memory bus and much more bandwidth, and the 970 has a slow 0.5Gb partition that chokes under such loads.
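The memory-bus point is easy to quantify. A quick back-of-the-envelope sketch using the public reference specs (512-bit at 5 Gbps effective for the R9 290, 256-bit at 7 Gbps effective for the GTX 970; these are spec-sheet figures, not measurements):

```python
# Peak memory bandwidth = (bus width in bytes) * (transfers per second).
def peak_bandwidth_gb_s(bus_width_bits, effective_gbps):
    """Theoretical peak in GB/s for a given bus width and effective memory clock."""
    return bus_width_bits / 8 * effective_gbps

print(peak_bandwidth_gb_s(512, 5.0))  # R9 290  -> 320.0 GB/s
print(peak_bandwidth_gb_s(256, 7.0))  # GTX 970 -> 224.0 GB/s
```

Note the 970 figure also assumes all 4GB behave identically; the last 0.5GB segment is far slower, so the effective number under heavy VRAM load is worse than this peak suggests.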


----------



## bonami2

Quote:


> Originally Posted by *neurotix*
> 
> With tess disabled for Fire Strike the cards are neck and neck; with it on, they're still neck and neck. Even with tess enabled in Valley, my cards will outperform SLI 970s significantly and maybe even 980s, assuming similar CPUs are used (e.g. an 8-thread Intel). I'm simply sick of people telling me my benchmarks are invalid because tess is off, when it's allowed on hwbot and the best AMD overclockers in the world use this tweak. He is not the first person to spout this nonsense. My scores count, or I wouldn't be ranked relatively high on hwbot for an amateur, non-sponsored, non-extreme-cooling, spare-time enthusiast.
> 
> He doesn't really have to run Valley because he probably couldn't even do it right: pick the right preset and disable some of his CPU cores so we get a direct comparison. It's like benching an FX vs. an Intel; that's apples and oranges. This was the problem: the only thing he ever posted was one single Fire Strike run, and he was comparing the total score to mine when he has 4 more threads than I do and a higher physics score, which inflates the total score. You have to either run with the same number of cores, or just compare graphics scores. Besides, I posted scores showing my 290s at 1200MHz creaming 970 SLI at 1600MHz in Valley (I got 20 fps more); they're quite a few pages back now. So I don't need him to run Valley.
> 
> As far as benching vs. games, what do you think hwbot is? That's where people "play" benchmarks. That's beside the point anyway, as what I said about the 290 vs. the 970 in games still stands: in some games the 290 is better and in some the 970 is better, and it's usually because of Gameworks.
> 
> If we tested these things at, say, 5760x1080 or 4K (he has one, I have the other), in a non-Gameworks game, with something like 8xAA to really get the VRAM working, then we would really know for sure. Mind you, the 290 has a much wider memory bus and much more bandwidth, and the 970 has a slow 0.5GB partition that chokes under such loads.


Exactly

Same as the 7950/7970 eating a 680 alive at those resolutions today.


----------



## iinversion

Quote:


> Originally Posted by *neurotix*
> 
> With tess disabled for Fire Strike the cards are neck and neck; with it on, they're still neck and neck. Even with tess enabled in Valley, my cards will outperform SLI 970s significantly and maybe even 980s, assuming similar CPUs are used (e.g. an 8-thread Intel). I'm simply sick of people telling me my benchmarks are invalid because tess is off, when it's allowed on hwbot and the best AMD overclockers in the world use this tweak. He is not the first person to spout this nonsense. My scores count, or I wouldn't be ranked relatively high on hwbot for an amateur, non-sponsored, non-extreme-cooling, spare-time enthusiast.
> 
> He doesn't really have to run Valley because he probably couldn't even do it right: pick the right preset and disable some of his CPU cores so we get a direct comparison. It's like benching an FX vs. an Intel; *that's apples and oranges.* This was the problem: the only thing he ever posted was one single Fire Strike run, and he was comparing the total score to mine when he has 4 more threads than I do and a higher physics score, which inflates the total score. You have to either run with the same number of cores, or just compare graphics scores. Besides, I posted scores showing my 290s at 1200MHz creaming 970 SLI at 1600MHz in Valley (I got 20 fps more); they're quite a few pages back now. So I don't need him to run Valley.
> 
> As far as benching vs. games, what do you think hwbot is? That's where people "play" benchmarks. That's beside the point anyway, as what I said about the 290 vs. the 970 in games still stands: in some games the 290 is better and in some the 970 is better, and it's usually because of Gameworks.
> 
> If we tested these things at, say, 5760x1080 or 4K (he has one, I have the other), in a non-Gameworks game, with something like 8xAA to really get the VRAM working, then we would really know for sure. Mind you, the 290 has a much wider memory bus and much more bandwidth, and the 970 has a slow 0.5GB partition that chokes under such loads.


Lol, so that's what counts as apples to oranges? Comparing a different CPU on a GPU-bound benchmark won't make any difference unless the CPU in question can't push the GPUs.

No one cares what is allowed on hwbot and what is not. The point is that if you are going to compare cards, all settings should be equal to get a fair comparison. Everyone knows a used 290 is great bang for the buck, but an apples-to-oranges comparison of benchmarks run at different settings is not the way to prove your point.


----------



## danielhowk

So far in DirectX 12, is the R9 390 better/faster than the 970?


----------



## iinversion

Quote:


> Originally Posted by *danielhowk*
> 
> So far in DirectX 12, is the R9 390 better/faster than the 970?


Yes, AMD is better in DX12, but we have very, very limited testing to go on so far.

If you are concerned about DX12, tbh it is probably better to wait for the next-gen cards.


----------



## danielhowk

Quote:


> Originally Posted by *iinversion*
> 
> Yes, AMD is better in DX12, but we have very, very limited testing to go on so far.
> 
> If you are concerned about DX12, tbh it is probably better to wait for the next-gen cards.


When will that be?


----------



## Ha-Nocri

Quote:


> Originally Posted by *danielhowk*
> 
> when will that be ?


At least a year. There will be quite a few DX12 games to play until then, and most of them will be AMD-friendly, it seems.


----------



## xboxshqip

I have to agree with aDyerSituation that the VRAM will surely come in handy for texture mods.

My 4-year-old GTX 560 just broke, so I will go Red this time. R9 390, here I come.


----------



## ryder

Can someone explain why Nvidia doesn't put as much emphasis on VRAM compared to AMD?

Why do most mid-to-high-end green cards only come with 4GB vs. AMD's 8GB?

The 390 vs. 970 is a prime example.


----------



## iRUSH

Quote:


> Originally Posted by *ryder*
> 
> Can someone explain why Nvidia doesn't put as much emphasis on VRAM compared to AMD?
> 
> Why do most mid-to-high-end green cards only come with 4GB vs. AMD's 8GB?
> 
> The 390 vs. 970 is a prime example.


AMD builds for the future; Nvidia/Intel build for now.

At least that's the way I personally see it.

The majority do not need 8 cores and 8GB of VRAM, period. Someday we will, but it'll be many years before that is the "standard".

Those who would argue against this, please consider the users outside our tiny, crazed PC hardware community.


----------



## ryder

Quote:


> Originally Posted by *iRUSH*
> 
> AMD builds for the future; Nvidia/Intel build for now.
> 
> At least that's the way I personally see it.
> 
> The majority do not need 8 cores and 8GB of VRAM, period. Someday we will, but it'll be many years before that is the "standard".
> 
> Those who would argue against this, please consider the users outside our tiny, crazed PC hardware community.


Makes sense.

From what I've read, the 970's processing power > the 390's. Is this only true at OC speeds, or does OC'ing have nothing to do with that GPU advantage?


----------



## bonami2

Quote:


> Originally Posted by *ryder*
> 
> Makes sense.
> 
> From what I've read, the 970's processing power > the 390's. Is this only true at OC speeds, or does OC'ing have nothing to do with that GPU advantage?


It does have more processing power; Nvidia is almost always better in that regard. Look at Folding@home: currently Nvidia is destroying AMD.

But they lack VRAM bandwidth and other things needed for high res, and that kills them.

My 7950 CrossFire at 5760x1080 saw fps increases just from hitting 320GB/s of bandwidth.

For 4K, I expect 1TB/s of bandwidth would be recommended.


----------



## bonami2

Quote:


> Originally Posted by *iRUSH*
> 
> AMD builds for the future; Nvidia/Intel build for now.
> 
> At least that's the way I personally see it.
> 
> The majority do not need 8 cores and 8GB of VRAM, period. Someday we will, but it'll be many years before that is the "standard".
> 
> Those who would argue against this, please consider the users outside our tiny, crazed PC hardware community.


GTA V and Star Wars Battlefront seem to show that we are CPU bottlenecked.

Both my 4790K and FX-8300 were at 100% while loading maps in both of those games, for at least 10 seconds.

That's 8 threads total.

I think we are really close to needing 6-8 cores.


----------



## semitope

Quote:


> Originally Posted by *bonami2*
> 
> It does have more processing power; Nvidia is almost always better in that regard. Look at Folding@home: currently Nvidia is destroying AMD.
> 
> But they lack VRAM bandwidth and other things needed for high res, and that kills them.
> 
> My 7950 CrossFire at 5760x1080 saw fps increases just from hitting 320GB/s of bandwidth.
> 
> For 4K, I expect 1TB/s of bandwidth would be recommended.


Quote:


> Originally Posted by *ryder*
> 
> Makes sense.
> 
> From what I've read, the 970's processing power > the 390's. Is this only true at OC speeds, or does OC'ing have nothing to do with that GPU advantage?


AMD's processing power is typically > Nvidia's. Maybe the Folding@home results come down to the number of GPUs from each manufacturer taking part.

How Nvidia keeps up in performance is probably down to their software scheduler and being able to do fancy things in drivers. Without driver updates they'd be left in the dust.


----------



## semitope

Quote:


> Originally Posted by *Yungbenny911*
> 
> This same conversation, over, and over and over again...
> 
> Please read...


https://www.youtube.com/watch?v=W0HzWW6mDno

There is a clear benefit to having more than 2GB.


----------



## iRUSH

Quote:


> Originally Posted by *bonami2*
> 
> GTA V and Star Wars Battlefront seem to show that we are CPU bottlenecked.
> 
> Both my 4790K and FX-8300 were at 100% while loading maps in both of those games, for at least 10 seconds.
> 
> That's 8 threads total.
> 
> I think we are really close to needing 6-8 cores.


Many games load all cores on their "loading" screen.


----------



## mtcn77

Quote:


> Originally Posted by *semitope*
> 
> https://www.youtube.com/watch?v=W0HzWW6mDno
> 
> There is a clear benefit to having more than 2GB.


I presume these "average frame rate" charts are perfect hype generators. A normal user would get a much better grasp of the subject from a minimum frame rate counter, but such continuous video monitoring rarely provides that scale, whereas website reviews make the point very clear. What's more, unless you're using a FreeSync/G-Sync monitor, you have literally zero incentive to improve your frame rate from 39 to 40. Mouse input does become more contiguous and, yes, that is favourable, but you cannot trace it visually since the monitor never shows the action in the latest frame. Frame rates other than the common factors of the monitor's refresh rate are effectively wasted unless you are using such a G-Sync/FreeSync scaler, so down to 30 fps they are quite similar on a 60Hz monitor.
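The "common factors" point can be made concrete: on a fixed-refresh panel without adaptive sync, each frame is displayed for a whole number of refresh intervals, so only frame rates that divide the refresh rate evenly give perfectly even pacing. A small sketch:

```python
# Frame rates that divide a fixed refresh rate evenly; anything in
# between forces uneven frame repeats (judder) without G-Sync/FreeSync.
def evenly_paced_rates(refresh_hz):
    return [refresh_hz // n for n in range(1, refresh_hz + 1)
            if refresh_hz % n == 0]

print(evenly_paced_rates(60))  # [60, 30, 20, 15, 12, 10, 6, 5, 4, 3, 2, 1]
```

So on a 60Hz panel, 40 fps paces no more evenly than 31 fps; both fall between the clean 30 and 60 steps.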


----------



## bonami2

Quote:


> Originally Posted by *semitope*
> 
> AMD's processing power is typically > Nvidia's. Maybe the Folding@home results come down to the number of GPUs from each manufacturer taking part.
> 
> How Nvidia keeps up in performance is probably down to their software scheduler and being able to do fancy things in drivers. Without driver updates they'd be left in the dust.


Yeah, Nvidia does have a lot of money for drivers.

But Nvidia is always stronger at 1080p and falls behind at higher res most of the time, it seems. Well, in the benchmarks I've seen.


----------



## semitope

Quote:


> Originally Posted by *bonami2*
> 
> Yeah, Nvidia does have a lot of money for drivers.
> 
> But Nvidia is always stronger at 1080p and falls behind at higher res most of the time, it seems. Well, in the benchmarks I've seen.


That might be a sign of hitting their hardware limit; you can only do so much with software. AMD's problem is feeding their GPUs at those lower resolutions, and that's solved with DX12.


----------



## Stige

Quote:


> Originally Posted by *bonami2*
> 
> Yeah, Nvidia does have a lot of money for drivers.
> 
> But Nvidia is always stronger at 1080p and falls behind at higher res most of the time, it seems. Well, in the benchmarks I've seen.


That is BS, fanboy.

1080p is the only place the 970 can compete with the 390; go above that and it won't anymore, not in the future anyway.
Nvidia will have serious problems in the future, especially with DX12.

AMD has always been the more future-proof solution compared to Nvidia.

Just compare some old cards: GTX 670 vs. HD 7950. The HD 7950 is still a decent card today thanks to real memory bandwidth and more than a lowly 2GB of memory.
A 2GB card won't do anyone any good these days.


----------



## bonami2

Quote:


> Originally Posted by *Stige*
> 
> That is BS, fanboy.
> 
> 1080p is the only place the 970 can compete with the 390; go above that and it won't anymore, not in the future anyway.
> Nvidia will have serious problems in the future, especially with DX12.
> 
> AMD has always been the more future-proof solution compared to Nvidia.
> 
> Just compare some old cards: GTX 670 vs. HD 7950. The HD 7950 is still a decent card today thanks to real memory bandwidth and more than a lowly 2GB of memory.
> A 2GB card won't do anyone any good these days.


I have 7950 CrossFire, so I do know that.


----------



## neurotix

Quote:


> Originally Posted by *semitope*
> 
> AMD's processing power is typically > Nvidia's. Maybe the Folding@home results come down to the number of GPUs from each manufacturer taking part.
> 
> How Nvidia keeps up in performance is probably down to their software scheduler and being able to do fancy things in drivers. Without driver updates they'd be left in the dust.


Quote:


> Originally Posted by *bonami2*
> 
> It does have more processing power; Nvidia is almost always better in that regard. Look at Folding@home: currently Nvidia is destroying AMD.
> 
> But they lack VRAM bandwidth and other things needed for high res, and that kills them.
> 
> My 7950 CrossFire at 5760x1080 saw fps increases just from hitting 320GB/s of bandwidth.
> 
> For 4K, I expect 1TB/s of bandwidth would be recommended.


I can help clear this up.

AMD's processing power > Nvidia's because AMD doesn't gimp double-precision compute on consumer cards.

Even the Titan X doesn't offer double-precision compute from Nvidia anymore, afaik, because it's not needed for gaming. The original Titan (2013) had fully enabled double-precision compute. Now, if you want that, you have to buy a Quadro card.

AMD, however, can do double precision on all its cards, afaik.

The reason Nvidia destroys AMD in Folding@home is that, for a very long time now, Nvidia has worked very closely with Stanford to run Folding@home on CUDA. I believe the first GROMACS cores to run on GPUs were made for Nvidia CUDA. Basically, there is much more driver support and optimization for Nvidia than there is for AMD; think of it like Gameworks, but for folding, as a very rough analogy. Also, from what I know, CUDA architectures just plain work better for what Folding@home does. And Nvidia usually has a lot more "brute force" power, in terms of speed, high overclocks, etc.

If we had 290s running at 1500MHz with almost ten years of vendor-specific optimizations solely for Folding@home, then we would also see 290s doing 400k+ PPD, just like a 970.
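For reference, the raw-throughput gap is easy to estimate from public spec sheets. The shader counts, reference clocks, and FP64 ratios below are spec-sheet assumptions (real workloads never hit these peaks):

```python
# Theoretical peak FP32: shaders * 2 FLOPs per clock (fused multiply-add) * clock.
def peak_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

fp32_290 = peak_tflops(2560, 0.947)   # R9 290 reference clock  -> ~4.85 TFLOPS
fp32_970 = peak_tflops(1664, 1.178)   # GTX 970 reference boost -> ~3.92 TFLOPS

# Consumer FP64 ratios: Hawaii runs double precision at 1/8 of FP32,
# Maxwell GM204 at 1/32 -- the "gimping" described above.
print(f"{fp32_290 / 8:.2f} {fp32_970 / 32:.2f}")  # 0.61 0.12 (TFLOPS FP64)
```

So the 290 leads modestly in peak single precision, but by roughly 5x in double precision, which is where the consumer-card segmentation shows.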


----------



## bonami2

Quote:


> Originally Posted by *neurotix*
> 
> I can help clear this up.
> 
> AMD's processing power > Nvidia's because AMD doesn't gimp double-precision compute on consumer cards.
> 
> Even the Titan X doesn't offer double-precision compute from Nvidia anymore, afaik, because it's not needed for gaming. The original Titan (2013) had fully enabled double-precision compute. Now, if you want that, you have to buy a Quadro card.
> 
> AMD, however, can do double precision on all its cards, afaik.
> 
> The reason Nvidia destroys AMD in Folding@home is that, for a very long time now, Nvidia has worked very closely with Stanford to run Folding@home on CUDA. I believe the first GROMACS cores to run on GPUs were made for Nvidia CUDA. Basically, there is much more driver support and optimization for Nvidia than there is for AMD; think of it like Gameworks, but for folding, as a very rough analogy. Also, from what I know, CUDA architectures just plain work better for what Folding@home does. And Nvidia usually has a lot more "brute force" power, in terms of speed, high overclocks, etc.
> 
> If we had 290s running at 1500MHz with almost ten years of vendor-specific optimizations solely for Folding@home, then we would also see 290s doing 400k+ PPD, just like a 970.


Yeah, it does make sense. They do have more raw power, but that raw power can't be used at low resolutions. My setup showed me that CrossFire scaling at 1080p is pure crap, while at 5760x1080 it's almost 100%.

AMD has always made GPUs with lots of cores and bandwidth, but their power shows in games with high-resolution content; it can't be fully used otherwise, it seems.

I mean, the 980 Ti beats the Fury X, but the Fury X beats the 980 Ti at 4K. And if you SLI and CrossFire them, the Fury X probably jumps ahead like madness? I ain't sure about that, since the 4GB may prevent them from working at 100%.


----------



## rickcooperjr

Quote:


> Originally Posted by *iinversion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aDyerSituation*
> 
> You are delusional. $330 r9 390 vs $340 gtx 970. This is the comparison. This thread has nothing to do with the 290. You are just trying to find an angle to come at AMD with. It's quite sad really.
> And how is it a bad value compared to the 970? Lmao you have to be trolling
> 
> For the same price you get:
> -A faster card in almost every game
> -A card that will let you play your games with ultra textures without stuttering
> -A card that doesn't need to be overclocked to show its "true potential" against its rivals
> 
> At the cost of:
> -power draw
> 
> 
> 
> Why would anyone spend an extra $90 to get a 390 over a 290 for 1080p when they are the same freaking card minus VRAM and stock clocks.
> 
> You are ridiculous.

I want to clear something up that you seem not to understand about the R9 290/290X vs. the R9 390/390X. The differences are significant enough that the 300 series cards are not just rebadges: they have proper VRM cooling, better power delivery, and better-quality VRAM chips with tighter timings and lower latencies than the R9 290/290X. These are very significant changes that go beyond a typical rebrand. In essence, the main PCB of the card has been rebuilt, with improved cooling including actual active cooling of the VRMs, unlike the 290/290X. Add the memory situation and the better power delivery, and the 390/390X hardly qualifies as a rebadge; we are talking very significant changes.


----------



## Stige

http://www.techspot.com/review/1075-best-graphics-cards-2015/

Seems pretty clear to me about 390 vs 970.

Unless you are some hippy or something who wants to save the planet and cares about their GPU's power consumption for some weird reason.
I for one don't. All I care about is price to performance and how future-proof the card is, and the 970 can't beat the 390 in either of those areas.


----------



## Slay

About overclocking: doesn't a 970 OC'd by 200MHz over stock get the same results as a 390 OC'd by 100MHz?


----------



## rickcooperjr

Quote:


> Originally Posted by *Slay*
> 
> About the Overclocking, doesn't a 970 OC by 200MHz from stock get the same results as a 390 OC by 100MHz?


Yes, that is my understanding. So when you factor in OCs they still end up even, or the 390 wins; a few times the 970 will win, but again it is a constant trading of blows, with the 390 being the better card for the money because it wins 80% of the time. If you run above 1080p, the 390 wins over 95% of the time, again giving the 390 a longer lifespan and better future-proofing; you get more for your money all around with the 390.


----------



## Madpacket

I agree the 390 is the clear choice overall; however, where the 970 wins is in small ITX systems that can't take full-length cards. There are two or three ITX 970s that work well in this space thanks to their power consumption. I do think the regular-sized 970s are overpriced, though, and should be cheaper than the 390s by about $40-50 US.


----------



## ImJJames

Quote:


> Originally Posted by *Slay*
> 
> About the Overclocking, doesn't a 970 OC by 200MHz from stock get the same results as a 390 OC by 100MHz?


I would say it's more like a 100MHz OC on Hawaii/Grenada = a 250MHz OC on Maxwell.

Also, AMD's overclock scaling is much more linear compared to Nvidia's.
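One way to sanity-check rules of thumb like this is to compare overclocks as a fraction of the stock clock rather than in raw MHz. The stock clocks below are approximate reference values, not figures from this thread, and fps rarely scales perfectly with clock:

```python
# Clock gain as a fraction of stock: a rough first-order proxy for fps
# gain, assuming perfectly linear scaling (real games scale less).
def relative_gain(stock_mhz, delta_mhz):
    return delta_mhz / stock_mhz

print(f"{relative_gain(1000, 100):.1%}")  # 390 +100 MHz -> 10.0%
print(f"{relative_gain(1250, 250):.1%}")  # 970 +250 MHz -> 20.0%
```

Even equal percentage gains don't guarantee equal fps gains across architectures, since per-clock throughput differs; that difference is what makes one card's scaling look "more linear" in practice.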


----------



## TopicClocker

Wow, this thread reached its conclusion months ago, and all that's occurring now is a fanboy war.


----------



## daunow

The only thing that has me scared of buying a GPU is the ******* DX12 drama ****.

I think I am going to go with the 970 though, for its resale value, so I can sell it later on and get a DX12-ready GPU.

I don't think the 390 is as great as people say; the wattage and heat turned me off, and (according to a lot of people) the bad drivers as well. However, in raw performance, yes, it's really good.

https://www.youtube.com/watch?v=4ckA_KTdaJg

I don't really know if I would enjoy playing at lower than 60fps @ 1440p, since I would just see too much screen tearing.


----------



## PontiacGTX

Quote:


> Originally Posted by *daunow*
> 
> The only thing that has me scared of buying a GPU is the ******* DX12 drama ****.
> 
> I think I am going to go with the 970 though, for its resale value, so I can sell it later on and get a DX12-ready GPU.
> 
> I don't think the 390 is as great as people say; the wattage and heat turned me off, and (according to a lot of people) the bad drivers as well. However, in raw performance, yes, it's really good.
> 
> https://www.youtube.com/watch?v=4ckA_KTdaJg
> 
> I don't really know if I would enjoy playing at lower than 60fps @ 1440p, since I would just see too much screen tearing.


Doesn't the 390 outperform the 970 at 1440p?


----------



## daunow

Quote:


> Originally Posted by *PontiacGTX*
> 
> Doesn't the 390 outperform the 970 at 1440p?


According to the video, yes.


----------



## Ha-Nocri

Sure it does. A stock 290 (947MHz) matches the 970 @ 1440p.


----------



## Stige

Well, the 290 also has 4GB of actual memory, not some broken 3.5GB crap.


----------



## P-39 Airacobra

Quote:


> Originally Posted by *aDyerSituation*
> 
> Not true at all. It's super easy to hit the vram wall in GTA V and heavily modded games. And the 390 is way closer to a 980 than a 970. Overclocked or not.
> 
> Also, 3.5gb vs 4gb makes or breaks ultra textures in SoM


What? GTA V does not use anywhere near that much VRAM at 1080p! I think you are reading the wrong memory counter in your Afterburner/EVGA Precision monitor. My old 2GB R9 270 ran GTA just fine. I could almost do Very High settings with it, but it did not have the power or VRAM. It did, however, run on High textures, exceeding the memory warning in the settings, and it still did 60FPS most of the time. GTA V is not the GPU killer everyone claims it to be; it's basically the old GTA 4 engine with slightly better textures and extra texture filters. People with old 8800s are running it. However, GTA V is a bit finicky like GTA 4 was, so your PC has to be clean. You can't have junk apps running in the background. And yes, I know many say they do not have junk apps, because people are too trusting nowadays.
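For a rough sense of scale: the fixed render-target cost at 1080p is tiny next to a 2-4GB card; it is textures and streaming that actually fill VRAM. A back-of-the-envelope sketch, where the 4-bytes-per-pixel color and depth sizes are simplifying assumptions:

```python
# Color + depth render targets at a given resolution and MSAA factor,
# assuming 4 bytes per pixel for each (8 total). Textures are ignored,
# so this is a floor on VRAM use, not an estimate of the total.
def render_targets_mb(width, height, msaa=1, bytes_per_px=8):
    return width * height * msaa * bytes_per_px / 2**20

print(round(render_targets_mb(1920, 1080)))          # 16 (MB at 1080p)
print(round(render_targets_mb(3840, 2160, msaa=4)))  # 253 (MB at 4K + 4xMSAA)
```

This is why VRAM pressure at 1080p is almost entirely a texture-quality question, while high resolutions plus heavy AA can blow up the buffers themselves.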


----------



## iinversion

Quote:


> Originally Posted by *P-39 Airacobra*
> 
> What? GTA V does not use anywhere near that much VRAM at 1080p! I think you are reading the wrong memory counter in your Afterburner/EVGA Precision monitor. My old 2GB R9 270 ran GTA just fine. I could almost do Very High settings with it, but it did not have the power or VRAM. It did, however, run on High textures, exceeding the memory warning in the settings, and it still did 60FPS most of the time. GTA V is not the GPU killer everyone claims it to be; it's basically the old GTA 4 engine with slightly better textures and extra texture filters. People with old 8800s are running it. However, GTA V is a bit finicky like GTA 4 was, so your PC has to be clean. You can't have junk apps running in the background. And yes, I know many say they do not have junk apps, because people are too trusting nowadays.


Pretty much. My friend IRL has a 660 Ti 2GB and runs GTA V @ 1080p with very high textures and everything else on high, and he does not have any issues with VRAM. It shows it is utilizing the full 2GB, but there's no stuttering or anything.


----------



## iRUSH

Quote:


> Originally Posted by *iinversion*
> 
> Pretty much. My friend IRL has a 660 Ti 2GB and runs GTA V @ 1080p with very high textures and everything else on high, and he does not have any issues with VRAM. It shows it is utilizing the full 2GB, but there's no stuttering or anything.


99.9% of the time it's all in the settings anyway. Most people can't tell the difference between "Ultra" and "High", or between 2x AA and 4x AA.

But fps going from 30 to 60 to 100+, on the other hand... that is very noticeable.

I can get by just fine on 2GB of VRAM at 1080p in any game I have ever played. That's the beauty of PC gaming in my eyes: we can adjust settings to achieve the desired performance, and rarely does that impact the graphics on the same scale.


----------



## GreGGo

My head hurts from reading this post. I am replacing the build in my sig with a 4690k, new mobo, and gpu. I mainly play Guild Wars 2 and like to max out graphics, as I enjoy the scenery.

390 or 970? Thanks.


----------



## rdr09

Quote:


> Originally Posted by *GreGGo*
> 
> My head hurts from reading this post. I am replacing the build in my sig with a 4690k, new mobo, and gpu. I mainly play Guild Wars 2 and like to max out graphics, as I enjoy the scenery.
> 
> 390 or 970? Thanks.


$300 tops for a 4GB card. The lower the better. Guild Wars, I think, works best with an Nvidia card.


----------



## GreGGo

Quote:


> Originally Posted by *rdr09*
> 
> $300 tops for a 4GB card. Lower the better. Guild Wars i think works best with an nvidia card.


390 is $309. 970 is $319.


----------



## rdr09

Quote:


> Originally Posted by *GreGGo*
> 
> 390 is $309. 970 is $319.


I say 390. You are not pairing either of those with your Phenom, right?


----------



## GreGGo

Quote:


> Originally Posted by *rdr09*
> 
> i say 390. You are not pairing any of those with your Phenom, right?


Nope. Replacing the Phenom with an i5-4690K.


----------



## rdr09

Quote:


> Originally Posted by *GreGGo*
> 
> Nope. Replacing the Phenom with an i5-4690K.


Lovely.


----------



## BinaryDemon

Quote:


> Originally Posted by *GreGGo*
> 
> My head hurts from reading this post. I am replacing the build in my sig with a 4690k, new mobo, and gpu. I mainly play Guild Wars 2 and like to max out graphics, as I enjoy the scenery.
> 
> 390 or 970? Thanks.


I would buy the R9 390.
Quote:


> Originally Posted by *GreGGo*
> 
> 390 is $309. 970 is $319.


You can buy a new GTX 970 for as low as $250 [LINK], if that sways anything.


----------



## mtcn77

Haswell gets a major uplift in Broadwell & Skylake.


----------



## iRUSH

Quote:


> Originally Posted by *BinaryDemon*
> 
> I would buy the R9 390.
> *You can buy a new GTX970 for as low as $250 [LINK] If that sways anything*.


With a game too... Not a bad price. Not the best cooling solution but it's just a 970.


----------



## GreGGo

Quote:


> Originally Posted by *BinaryDemon*
> 
> I would buy the R9 390.
> You can buy a new GTX970 for as low as $250 [LINK] If that sways anything.


$250 does make a difference. I mean, my 5870 has lasted this long and still holds up, so I'm sure the 970 will last just as long for me.


----------



## Stige

5870 is a Radeon, 970 is not.
Radeons last, Derpvidia doesn't.


----------



## rickcooperjr

Quote:


> Originally Posted by *GreGGo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BinaryDemon*
> 
> I would buy the R9 390.
> You can buy a new GTX970 for as low as $250 [LINK] If that sways anything.
> 
> 
> 
> $250 does make a difference. I mean, my 5870 has lasted this long and still holds up so I'm sure that the 970 will last just as long for me.

I don't know about that. Nvidia is well known to abandon previous generations the second they come out with a new series or architecture. A lot of Kepler owners are furious right now: they found that if they rolled their drivers back to older versions, their Kepler cards performed better than on newer drivers. In other words, Fermi and Kepler cards mysteriously lost performance after the GTX 900 series was released.

Kepler/Fermi cards lost a bit of performance on modern drivers in a lot of games and benchmarks (3DMark among them), and some older games perform worse on Kepler/Fermi with drivers released after the GTX 900 series launch. This borders on shenanigans; imagine what it will be like six months down the road when Nvidia releases their new lineup.

Please keep in mind that you need to run modern drivers to get a lot of game optimizations and bug fixes, so being forced onto older drivers is bad in a lot of cases and can leave you stuck with bugs and glitches.

So you are basically throwing your hands in the air and hoping they don't do this in the future as they have, quite a few times, in the past. Nvidia is out to make money, and if that means artificially making the newer cards look more impressive, so be it; they will do it. It is all about money to them. They don't really care about their customers' budgets; AMD does, hence why they optimize so heavily for the long haul.

The fact is that AMD Radeon cards age better, often gaining a substantial amount of performance over time. Look at the HD 7970: in the past few years it has gained 15% or more performance, and the R9 290X the same. I would imagine the R9 390 will do likewise, because it is a revised, slightly upgraded R9 290, so I expect it to gain around another 10% over the next few years, while the GTX 970 is maxed out in its current state.

This is how AMD works; their cards tend to get better with age. Check into this yourself, the numbers are pretty impressive: just since the GTX 970's release, the R9 290 has gained something like 18% performance and the HD 7970 another 10%, and the entire GCN lineup is gaining performance over time.

The thing with Nvidia is that they optimize and tweak for the first few months, then walk away and leave you with what you've got. AMD nurtures its hardware and makes it perform better and better; AMD still commonly optimizes for cards four or five generations back, with substantial gains. I don't see Nvidia doing this; they worry about the current generation and say screw the past ones, hence the whole Kepler/Fermi situation. The main point is that Nvidia killed off a few Kepler cards that were only 6-8 months old, literally infants in their cribs, when they released the GTX 900 series.

Case in point: AMD is still routinely optimizing for your Radeon HD 5870 in games, which you don't see from Nvidia. Your HD 5870 is still gaining performance over time, believe it or not, and that is very telling.

I am waiting for the Nvidia fanboys to chime in and say I am full of it, but do your own research and you will notice this yourself. Many have found the same thing: Nvidia gimped previous gens like Kepler/Fermi in the drivers after the 900 series launch to artificially make the current lineup look more impressive, and they abandoned Kepler/Fermi in games when a few of those cards were literally only 6-8 months old. To use a metaphor: it is like selling you a new car, then stopping parts production a month later, to force you to buy a new car when it breaks a year or so down the road.


----------



## GreGGo

Two good points. I wish the 390 were being offered for $250.


----------



## iRUSH

Quote:


> Originally Posted by *GreGGo*
> 
> Two good points. I wish the 390 were being offered for $250.


Try Jet dot com and use their first time buyers discount up to $50.


----------



## PontiacGTX

Quote:


> Originally Posted by *GreGGo*
> 
> Two good points. I wish the 390 were being offered for $250.


On Newegg weeks ago there were 390s from PowerColor at $225; maybe in December there will be a deal.


----------



## daunow

Quote:


> Originally Posted by *GreGGo*
> 
> Two good points. I wish the 390 were being offered for $250.


I've seen them go that low on newegg.. without rebates.

You can also check jet.com with 20now coupon.


----------



## lightsout

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GreGGo*
> 
> Two good points. I wish the 390 were being offered for $250.
> 
> 
> 
> on Newegg weeks ago there were 390s form Powercolor at 225usd, maybe on decmeber there will be a deal
Click to expand...

$225.

They were floating around $300 a few days ago; now they've gone up a tad. Looking at a Nitro myself.


----------



## GreGGo

Just ordered the Sapphire Nitro yesterday. Can't wait...


----------



## lightsout

Quote:


> Originally Posted by *GreGGo*
> 
> Just ordered the Sapphire Nitro yesterday. Can't wait...


Congrats. The price jumped $20 or so on Newegg in the last day or so. I thought $305 was the real price, but I guess not.


----------



## BurgerRipper

Owww my... Look at these offers, m8.
Give your wallet a good ol' rub: Newegg offers, best offers.

[MSI R9 390/URL]

[Gigabyte R9 390/URL]

[Asus Strix 970/URL]

[MSI GTX 970/URL]

And if you pay with Visa or American Express you get 25% off. AMAZING!!!

IDK WHICH TO BUY, ANY SUGGESTIONS?


----------



## ElevenEleven

Quote:


> Originally Posted by *BurgerRipper*
> 
> Owww my... Look at this offers m8
> 
> And if you pay with Visa or American you get a 25% off. AMAZING!!!


Hmm, I'm not seeing that. Just tried doing a test check-out with Visa Checkout. Do you mean $25 off $200? Not 25% off... I also get this error message: "ERROR The promo code VCOBF15 cannot be combined with combo deals or gift items attached to products."


----------



## PontiacGTX

Quote:


> Originally Posted by *BurgerRipper*
> 
> Owww my... Look at this offers m8
> Give your meat a good ol rob, new egg offers, best offers.
> 
> [MSI R9 390/URL]
> 
> [Gigabyte R9 390/URL]
> 
> [Asus Strix 970/URL]
> 
> [MSI GTX 970/URL]
> 
> And if you pay with Visa or American you get a 25% off. AMAZING!!!
> 
> IDK WHICH TO BUY, ANY SUGGESTIONS?


Depends: what resolution do you use? What's your CPU and PSU? What games would you play?
Quote:


> Originally Posted by *ElevenEleven*
> 
> Hmm, I'm not seeing that. Just tried doing a test check-out with Visa Checkout. Do you mean $25 off $200? Not 25% off... I also get this error message: "ERROR The promo code VCOBF15 cannot be combined with combo deals or gift items attached to products."


Try Jet; it had a 20NOW code for $50 off anything. There was a 390X at $335 if you use the free waived return.


----------



## ElevenEleven

Sorry for noobishness, but what is "jet"? Another online store?
edit: found it--jet.com.


----------



## lightsout

I jumped on an EVGA 970, the vanilla ACX 2.0+, the cheapest one with a 6-phase VRM instead of 4. I know the 390 is the better card when it comes to raw power, but I think I am ready to go back to Nvidia. It should be plenty for me power-wise.


----------



## givmedew

Just my 2 cents...

I have GTX 970 and I have (2) R9 290X...

The better buy was the 290X. I bought the first when it came out for $380 as a 290, but it unlocked and overclocked!!! The second I got for $180 in March 2014, when the mining market flipped upside down and people were dumping them by the dozen. But that was a unique circumstance...

When the GTX 970 first came out I bought the G1 Gaming, because at the time AMD had been messing up horribly on drivers (they supposedly have a habit of that), but only in CrossFire and day-one releases... At the time they had just released Omega and said it would be another 3+ months until the next update (not acceptable)... they have since abandoned that idea and release updates more often...

But day-one releases? NOT AMD!!!

That said, the G1 Gaming 970 was pure garbage! It must still be garbage, as I am not certain how they could have fixed the RAM issues with that card. Mine flakes out hardcore before it hits 4GB, and the AMD cards don't do that. I mean seriously, I go from great frame rate to no frame rate and I don't even hit 4GB... I know that not all 970s have this issue, but enough do that you should be comparing the 980 to the 390...

anyways

just my 2 cents...

I do like the special anti-aliasing that NVIDIA offers, but it seems there is a new AA method that both AMD and NVIDIA support, with a very low performance hit and even a 1x version... I was using it in Black Ops III and enjoyed it!...

One of my 290Xs OC'd to the extreme vs. my 970 OC'd as far as I can get it on stock firmware: the 290X wins hands down at 2560x1440.

That is my opinion, and it factors in the issues when the 970 reaches the end of its RAM, which happens well before 4GB.

THE 970 DOES NOT HAVE 4GB!!!

Also...

I know you said 1080p, but seriously? If we are talking 1080p, who cares which one you get... either is more than enough.

Let's talk about when you buy a new monitor. I'm saying AMD wins hands down! Also... things are pointing towards AMD for DX12... and if NVIDIA has to make major changes to be better at DX12, then that means the 970/980 die driver-wise when the new NVIDIA DX12 cards come out.
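To put a rough number on the "well before 4GB" complaint: NVIDIA's own post-controversy disclosure put the 970's primary 3.5GB partition at about 196 GB/s and the last 0.5GB at about 28 GB/s. A minimal back-of-the-envelope sketch, assuming traffic is spread uniformly across the allocation (an illustrative model, not a benchmark):

```python
# GTX 970 segmented memory, per NVIDIA's disclosed figures.
FAST_GB, FAST_BW = 3.5, 196.0   # primary partition: size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0    # secondary partition

def effective_bandwidth(alloc_gb: float) -> float:
    """Average bandwidth (GB/s) if alloc_gb of VRAM is streamed once,
    filling the fast partition before spilling into the slow one."""
    fast = min(alloc_gb, FAST_GB)
    slow = max(0.0, alloc_gb - FAST_GB)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return alloc_gb / seconds

print(effective_bandwidth(3.5))  # 196.0 -- all traffic stays in the fast partition
print(effective_bandwidth(4.0))  # 112.0 -- the last 0.5 GB drags the average down
```

Under this simple model, the slow 0.5GB segment takes as long to stream as the entire fast 3.5GB, which is why frame rates fall off a cliff as games push past ~3.5GB.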


----------



## PontiacGTX

390 deal
http://www.overclock.net/t/1581629/newegg-xfx-radeon-r9-390-dd-235usd-ar-visa-checkout


----------



## Stige

The 970 looks even worse now than it already did: http://wccftech.com/amd-radeon-software-performance-analysis-is-this-the-crimson-tide/


----------



## rickcooperjr

Quote:


> Originally Posted by *Stige*
> 
> 970 is even worse than it was already now: http://wccftech.com/amd-radeon-software-performance-analysis-is-this-the-crimson-tide/


The AMD beta drivers and Crimson look very nice. I have not checked much into it, but with just a few simple searches I am finding some very impressive results.


----------



## iRUSH

Quote:


> Originally Posted by *Stige*
> 
> 970 is even worse than it was already now: http://wccftech.com/amd-radeon-software-performance-analysis-is-this-the-crimson-tide/


This has my attention, thanks for posting. Personally I felt that Omega wasn't anything special aside from some power-saving features that were hit or miss. The last big leap for AMD drivers was early 2012 with their 12.1 performance drivers; the HD 7000 series GPUs moved up one tier with them. It made my HD 7850 act like a 7870, for example. Good times back then, and I hope to see similar improvements here as well.


----------



## Melan

My GTX 670 died from OC not so long ago, and since playing anything on Intel HD 4000 isn't fun, I went looking for semi-expensive new things to ~~burn~~ use.

I've come across a Gigabyte GTX 970 G1 (375 euros) and an MSI R9 390 (380 euros). While I've read this entire thread (all 31 pages), it still didn't clear up the confusion.

So, to sum up my ~~addiction~~ games:
BF3/4, CS:GO - I don't care much about visuals and try to always push 144 fps (lowish settings, etc.);
FF14, EVE-O, etc. - I care more about visuals here.
I also occasionally play Fallout, but I don't use mods or any other enhancements pretty much anywhere.

I play at 1080p/144 and am not planning to get a 1440p/144 monitor any time soon. From my rather short experience with a Dell U2515H, 1440p/60 worked fine on the 670.
As for the 3.5GB memory stuff: with my settings in BF I never hit 1.5GB in the entire lifespan of my 670, and barely scratched that mark in the games where I care about visuals.
I don't care at all about DX12 either, by the way. W10 is still too unstable on my hardware to make the switch, and there's nothing interesting for me to play that uses DX12.

Edit: I'm not going to OC my GPU.


----------



## iRUSH

Quote:


> Originally Posted by *Melan*
> 
> My GTX 670 has died from OC not so long ago and since playing anything on Intel HD 4000 isn't fun I went looking for semi-expensive new things to burn use.
> 
> I've came across Gigabyte GTX 970 G1 (375 euros) and MSI R9 390 (380 euros). While I've read this entire thread (whole 31 pages) it still didn't help clear up the confusion.
> 
> So, to sum up my addiction games:
> BF3/4, CSGO - I don't care much about visuals and trying to always push 144 fps (lowish settings etc);
> FF14, EVE-O, etc - I care more about visuals here.
> I also occasionally play fallout, but I don't use mods or any other enhancements pretty much anywhere.
> 
> I play at 1080p/144 and not planning to get any 1440p/144 monitor any time soon. From rather short experience with Dell U2515H 1440p/60 worked fine on 670.
> As for 3.5gb memory stuff, with my settings in BF I've never hit 1,5gb in an entire lifespan of my 670, and barely scratched that mark for games with visuals.
> Don't care at all for DX12 too by the way. W10 is still unstable on my hardware to do the switch and nothing interesting for me to play that uses DX12.
> 
> Edit: I'm not going to OC my GPU.


You're a perfect candidate for a 970 then. You game like me.


----------



## Stige

Quote:


> Originally Posted by *Melan*
> 
> My GTX 670 has died from OC not so long ago and since playing anything on Intel HD 4000 isn't fun I went looking for semi-expensive new things to burn use.
> 
> I've came across Gigabyte GTX 970 G1 (375 euros) and MSI R9 390 (380 euros). While I've read this entire thread (whole 31 pages) it still didn't help clear up the confusion.
> 
> So, to sum up my addiction games:
> BF3/4, CSGO - I don't care much about visuals and trying to always push 144 fps (lowish settings etc);
> FF14, EVE-O, etc - I care more about visuals here.
> I also occasionally play fallout, but I don't use mods or any other enhancements pretty much anywhere.
> 
> I play at 1080p/144 and not planning to get any 1440p/144 monitor any time soon. From rather short experience with Dell U2515H 1440p/60 worked fine on 670.
> As for 3.5gb memory stuff, with my settings in BF I've never hit 1,5gb in an entire lifespan of my 670, and barely scratched that mark for games with visuals.
> Don't care at all for DX12 too by the way. W10 is still unstable on my hardware to do the switch and nothing interesting for me to play that uses DX12.
> 
> Edit: I'm not going to OC my GPU.


So what you are saying is you want to buy the crap that is the 970, which you will have to replace a year from now because it will have run out of breath by then.
Or you could buy the much better and far more future-proof 390.

I'm pretty sure these new drivers leave nothing to consider anymore: http://wccftech.com/amd-radeon-software-performance-analysis-is-this-the-crimson-tide/

Why wouldn't you pay 5€ extra for a card that runs better and easily lasts longer?


----------



## Melan

Quote:


> Originally Posted by *Stige*
> 
> So what you are saying is you want to buy the crap that is the 970 that you will have to replace a year from now because it would have ran out of breath by then.


The same was said 3 years ago when I bought the 670. If I hadn't pushed the memory clock too far, I wouldn't bother with an upgrade.


----------



## iRUSH

Quote:


> Originally Posted by *Stige*
> 
> So what you are saying is you want to buy the crap that is the 970 that you will have to replace a year from now because it would have ran out of breath by then.
> Or you could buy the much better and way more future proof 390.
> 
> I'm pretty sure these new drivers leave nothing for consideration anymore http://wccftech.com/amd-radeon-software-performance-analysis-is-this-the-crimson-tide/
> 
> Why wouldn't you pay 5€ extra for a card that runs better and lasts longer easily?


Based on the way he/she games it's clear the 970 is the better choice.


----------



## Stige

Quote:


> Originally Posted by *iRUSH*
> 
> Based on the way he/she games it's clear the 970 is the better choice.


It's not a better choice in any way, lol.
Unless you are some tree-hugging hippy that cares about "saving the environment" and that crap with the lower power consumption, lol.

In every other regard, the 390 is better.


----------



## specopsFI

Quote:


> Originally Posted by *Stige*
> 
> It's not a better choice in any way lol
> Unless you are some tree hugging hippy that cares about "saving environment" and that crap with the lower power consumption lol
> 
> In every other regard, the 390 is better.


No, it's not. Stop saying things that anyone can prove false in numerous ways. If the 390 were better in every regard except power consumption, then why are there more benchmarks than this thread can hold demonstrating the 970 being *faster* than the 390?

The 390 might be better overall than 970 (not arguing on that with you), but it's definitely not "better in every regard".


----------



## Stige

It was already faster in 99% of games, and now with the new drivers there is no dispute anymore: the 390 is faster 99.9% of the time (the other 0.1% being the crappy FO4).


----------



## specopsFI

Quote:


> Originally Posted by *Stige*
> 
> It was already faster in 99% of the games, and now with the new drivers, there is no dispute anymore, the 390 is faster in 99.9% of the time (0.01% being the crappy FO4).


Your statistics are fictional, sorry.

Edit: I mean, take any review with several games in it and you will find at least one where 970 is faster. In fact, let's not even go that far. Since you named one game where the 970 is faster, then by your statistics you must have a list of 999 games where the 390 is faster. So start there: name 999 games where 390 is faster than 970.


----------



## iRUSH

The reason the 970 is the better choice for that individual is that he's not maxing out settings at his desired resolution @ 144 Hz. Based on the information given, the 970 will in fact provide better minimum fps.

I know this from experience. My 290X Lightning destroyed my 970 in benchmarks and whenever I needed the best graphics settings I could possibly get. But when I needed to maintain the 144 Hz minimum-fps standard, the 970 won every time.

Some of you guys need to look past your own personal settings and stop taking them as gospel. Not everyone has to max out their game.


----------



## specopsFI

Yep. For me, the 290 or the 390 are the better cards as of this moment. That's because I'm aiming for 60fps @1440p. That doesn't mean a thing to someone aiming for 144fps @1080p. The CPU overhead difference between AMD and Nvidia is real, there's no denying it. Same goes for tessellation performance.

Learn to separate your personal preferences from universal truths, people.


----------



## iRUSH

Quote:


> Originally Posted by *specopsFI*
> 
> Yep. For me, the 290 or the 390 are the better cards as of this moment. That's because I'm aiming for 60fps @1440p. That doesn't mean a thing to someone aiming for 144fps @1080p. The CPU overhead difference between AMD and Nvidia is real, there's no denying it. Same goes for tessellation performance.
> 
> Learn to separate your personal preferences from universal truths, people.


This man gets it!


----------



## Melan

Ok, I got the answer I was looking for. Thanks.


----------



## PontiacGTX

Quote:


> Originally Posted by *specopsFI*
> 
> Same goes for tessellation performance.


only if a biased dev uses excessive tess on purpose


----------



## specopsFI

Quote:


> Originally Posted by *PontiacGTX*
> 
> if your biased dev uses over tess on purpose


That does nothing to change the real and actual difference in tessellation *performance* between the two.

I'm not one who makes a habit out of asking why. If I can't change the reasons, then I must make do by adjusting my actions to the given facts. That is to say: if my favourite games would happen to be ones where tessellation performance makes a difference between Nvidia and AMD, then I should make my choice accordingly.

I suppose the games where AMD's larger CPU overhead prevents the 390 getting 144fps while the 970 gets there are all "poorly optimized"?

There are valid, logical reasons to choose 970 over 390. That doesn't make the 970 the right choice for everyone, far from it. For the majority, I consider the 390 to be the better card.


----------



## PontiacGTX

Quote:


> Originally Posted by *specopsFI*
> 
> That does nothing to change the real and actual difference in tessellation *performance* between the two
> if my favourite games would happen to be ones where tessellation performance makes a difference between Nvidia and AMD, then I should make my choice accordingly.


Reducing the excessive and unnecessary tess will do the job, instead of buying a card because a brand forces you to.


----------



## specopsFI

Quote:


> Originally Posted by *PontiacGTX*
> 
> Reducing the excessive and unnecessary tess will do the job instead buying a card because a brand force you do to


Yea, that sounds like the exact kind of argument where I step out of a conversation. When we are talking about things like "excessive" and "unnecessary", we are not talking about facts anymore. Those are judgement calls and judgement calls are dependent on personal perspective. Not that there is no such thing: there are a few cases where the line has been clearly crossed. It's just that there is a difference already at tessellation levels that do give some visible benefits. If anything, I'd say the 390 comes a bit short on tessellation performance (which AMD kind of acknowledged by improving geometry performance on Tonga and Fiji) whereas 970 has more of it than it can normally use.

Besides: no one is forcing anyone to buy anything: that's just another hyperbolic statement.

One thing we do agree on: the tessellation factor slider in the drivers is an actual advantage, something that AMD has and Nvidia doesn't.


----------



## ElevenEleven

I've been a bit out of touch with computer components for the past two years, instead focusing on photography and other things, so I'm out of date on what does better when. This recent discussion regarding GTX970 vs R9 390 has been helpful to shed some light. I'd like to ask now, which card would be preferable for an older (Sandy Bridge) processor, to replace an HD7970? It's not strictly necessary to replace that 7970, as it's been running cool and well for the past 3-some years, but it does power a 1440p monitor, and new games keep coming out. With current sale prices, I thought it might be worth to finally upgrade.

So with all this talk about CPU overhead, does it imply that I should get an nVidia card with the older Sandy CPU? (basically an i5 2500 equivalent). Or am I interpreting that statement incorrectly? I'm completely brand-agnostic in this regard (my own computer uses an nVidia card), but I do root for AMD a bit in the sense that I think it needs to remain competitive as a counterbalance to nVidia--monopolies are bad.


----------



## Stige

HD7970 is not even a bad card today, unlike something like GTX 670 or 680 from that time with their 2GB limited derp memory.

If your games run fine, I might not even upgrade the HD 7970 yet.

AMD > NVidia in any case.


----------



## iRUSH

Quote:


> Originally Posted by *ElevenEleven*
> 
> I've been a bit out of touch with computer components for the past two years, instead focusing on photography and other things, so I'm out of date on what does better when. This recent discussion regarding GTX970 vs R9 390 has been helpful to shed some light. I'd like to ask now, which card would be preferable for an older (Sandy Bridge) processor, to replace an HD7970? It's not strictly necessary to replace that 7970, as it's been running cool and well for the past 3-some years, but it does power a 1440p monitor, and new games keep coming out. With current sale prices, I thought it might be worth to finally upgrade.
> 
> So with all this talk about CPU overhead, does it imply that I should get an nVidia card with the older Sandy CPU? (basically an i5 2500 equivalent). Or am I interpreting that statement incorrectly? I'm completely brand-agnostic in this regard (my own computer uses an nVidia card), but I do root for AMD a bit in the sense that I think it needs to remain competitive as a counterbalance to nVidia--monopolies are bad.


I think at your resolution, assuming it's a 60 Hz monitor, the 390 should suit you nicely.

A 1440p 60 Hz setup is the point where you can recommend either one.

When a higher refresh rate is desired, and especially at lower resolutions such as 1080p, the 970 gets the nod.

Even on a 1440p 144 Hz monitor, a user preferring the higher refresh rate over the extra eye candy should look into the 970 too.

But after narrowing that down, I'd say the 390 has that market held down well, with the majority of users on 60 Hz panels. At that point there's a good chance the individual wants eye candy over a refresh rate they're uninterested in or have never experienced anyway, which means the likelihood of the user utilizing more than 4 GB of the 390's RAM increases.

After that you have to consider driver support vs. longevity. The latter sounds great, till you experience the former.

It's a great time to be a PC hardware enthusiast.


----------



## specopsFI

Quote:


> Originally Posted by *Stige*
> 
> HD7970 is not even a bad card today, unlike something like GTX 670 or 680 from that time with their 2GB limited derp memory.
> 
> If your games run fine, I might not even update the HD7970 yet.
> 
> AMD > NVidia in any case.


If you were not a fellow Finn, I wouldn't bother anymore, but since you are...

Once more: you are patently wrong when you say things like "AMD > NVidia in any case". For example: how is that true for someone putting together a PC to play Project Cars with? It's not. You don't play Project Cars, you might say. Fine, but you have no right to tell others not to. There is no one choice to rule them all.


----------



## Stige

Quote:


> Originally Posted by *specopsFI*
> 
> If you were not a fellow Finn, I wouldn't bother anymore, but since you are...
> 
> Once more: you are patently wrong when you say things like "AMD > NVidia in any case". For example: how is that true for someone putting together a PC to play Project Cars with? It's not. You don't play Project Cars, you might say. Fine, but you have no right to tell others not to. There is no one choice to rule them all.


Buying a card based on a single game is just dumb, really. No one plays just one game forever and ever.


----------



## yoyo711

Get the cheap one


----------



## specopsFI

Quote:


> Originally Posted by *Stige*
> 
> Buying a card based on a single game is just dumb really. No one would ever just play one game forever and forever.


So you really are the global authority in what is clever and what is dumb? Got it.

But you already named Fallout 4; I named Project Cars; then there are BF4, Dying Light, Witcher 3... Put those together, acknowledge that they are not the only ones where the 970 is strong, and see for yourself whether those "AMD is always better than Nvidia" comments hold water.


----------



## rickcooperjr

I'll chime in myself: AMD has the better history of support. Nvidia really only optimizes for the current gen, then puts everything on the back burner once a new gen comes out; look at Kepler/Fermi for an example. Nvidia fanboys were spitting flames over it, and a few actually jumped ship.

I will also say AMD cards just flat out age well; they're like a fine wine that keeps getting better. Look at the HD 7970's performance gains over the past few years: 20%+ in 2-3 years, after AMD has released two gens since. Look at the R9 290X/290's history of performance gains over time: massive. This makes AMD the long-haul performer in many instances; the 290X is now side by side with a GTX 980 when before it was competing with the GTX 970. AMD cards are moving up a tier or so against Nvidia's lineup over time; how is this bad?

I also want to point out that Nvidia has a bad habit of all but abandoning its customers a year or so after a card comes out, while AMD again is the long-haul performer. I am upset with the fiasco Nvidia pulled with the GTX 970, and with the way they gimped Kepler/Fermi. The proof is there: if you use older drivers on Kepler/Fermi, they perform better than they do on drivers released since the 900 series, in many games and testing software. This is a red flag for an issue that borders on shenanigans.

Finally, Nvidia keeps everything locked down so only they can use it, while AMD tends to go open source, meaning anyone can use and benefit from it, even Nvidia.


----------



## daunow

Quote:


> Originally Posted by *BurgerRipper*
> 
> Owww my... Look at this offers m8
> Give your meat a good ol rob, new egg offers, best offers.
> 
> [MSI R9 390/URL]
> 
> [Gigabyte R9 390/URL]
> 
> [Asus Strix 970/URL]
> 
> [MSI GTX 970/URL]
> 
> And if you pay with Visa or American you get a 25% off. AMAZING!!!
> 
> IDK WHICH TO BUY, ANY SUGGESTIONS?


There was an ACX 2.0 Gaming one in EVGA's B-stock.

I honestly recommend looking at their B-stock products if you already have the DVI cables and ****; GPUs/PSUs are so cheap there.
Quote:


> Originally Posted by *rickcooperjr*
> 
> I myself will chime in AMD has the better history of support


Let's be honest here, that's more than likely not true.
Quote:


> Originally Posted by *specopsFI*
> 
> So you really are the global authority in what is clever and what is dumb? Got it.
> 
> But when you already named Fallout 4, I named Project Cars, then there is BF4, Dying Light, Witcher 3... Put those together, acknowledge that they are not the only ones where 970 is strong and see for yourself if those "AMD is always better than Nvidia" comments hold water.




He is kinda right: buying a GPU for a single game is kinda stupid, but no one ever does that. They might buy it to play that one game on ultra, but that's not the only game they will ever play.

TBH, I regret not going 390. I bought an FTW 2.0 for $255 (1 week ago), and after seeing that the MSI had actually lower temps and noise according to some benchmarks, I felt kinda sad. However, I am happy with it, because it's easy to resell and upgrade for the next gen, which I will probably upgrade to anyway, due to DX12 not being well supported by either card.


----------



## givmedew

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *specopsFI*
> 
> Yep. For me, the 290 or the 390 are the better cards as of this moment. That's because I'm aiming for 60fps @1440p. That doesn't mean a thing to someone aiming for 144fps @1080p. The CPU overhead difference between AMD and Nvidia is real, there's no denying it. Same goes for tessellation performance.
> 
> Learn to separate your personal preferences from universal truths, people.
> 
> 
> 
> This man gets it!
Click to expand...

Quote:


> Originally Posted by *Melan*
> 
> Ok, I got the answer I was looking for. Thanks.


Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *specopsFI*
> 
> Same goes for tessellation performance.
> 
> 
> 
> if your biased dev uses over tess on purpose
Click to expand...

Quote:


> Originally Posted by *specopsFI*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PontiacGTX*
> 
> if your biased dev uses over tess on purpose
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That does nothing to change the real and actual difference in tessellation *performance* between the two.
> 
> I'm not one who makes a habit out of asking why. If I can't change the reasons, then I must make do by adjusting my actions to the given facts. That is to say: if my favourite games would happen to be ones where tessellation performance makes a difference between Nvidia and AMD, then I should make my choice accordingly.
> 
> I suppose the games where AMD's larger CPU overhead prevents the 390 getting 144fps while the 970 gets there are all "poorly optimized"?
> 
> There are valid, logical reasons to choose 970 over 390. That doesn't make the 970 the right choice for everyone, far from it. For the majority, I consider the 390 to be the better card.
Click to expand...

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *specopsFI*
> 
> That does nothing to change the real and actual difference in tessellation *performance* between the two
> if my favourite games would happen to be ones where tessellation performance makes a difference between Nvidia and AMD, then I should make my choice accordingly.
> 
> 
> 
> Reducing the excessive and unnecessary tess will do the job instead buying a card because a brand forces you do to it
Click to expand...



So,

if you're playing a game that says NVIDIA at the beginning, or even sometimes if it doesn't... there's a simple fix for overdone tessellation!

You just go into the settings, create a game profile, and FORCE the maximum tessellation down to 16x. I set it to 8x in Fallout just to be safe, because I couldn't tell a difference (there probably is one). I might have been over-cautious with it, but...
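As a rough sketch of why that cap saves so much work: with uniform tessellation of a triangle patch, the number of output triangles grows roughly with the square of the tessellation factor (an approximation; real tessellators also emit transition regions along edges).

```python
# Approximate per-patch geometry cost at different tessellation factors.
# The factor**2 growth is the standard rough model for a uniformly
# tessellated triangle domain, used here only for illustration.

def approx_triangles(factor: int) -> int:
    """Approximate triangles produced per patch at a given tess factor."""
    return factor * factor

for f in (8, 16, 64):
    print(f"factor {f:2d}x -> ~{approx_triangles(f)} triangles per patch")

# Capping 64x down to 16x cuts per-patch geometry by roughly 16x:
print(approx_triangles(64) // approx_triangles(16))  # 16
```

So when a title ships with 64x factors on surfaces where 16x already looks identical, the driver cap is recovering a large chunk of geometry throughput for essentially free.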

That said...

I own two 290Xs and one GTX 970 G1 Gaming. I've mentioned this before and I'm not going to back down about it.

The GTX 970 has a big issue when you get into the last few hundred megabytes of its memory. I don't know why, but it seems like (and I might be wrong) some games can tell how much memory you have and try to use as much of it as possible without running over. I think Black Ops III does this, because on several different settings the GTX 970 runs close to 4GB of memory use at 1920x1200, and with the same settings my R9 290X uses close to 4GB at 2560x1440. That seems weird to me! The thing is, on the 970, every time usage goes from 3500MB to 3900MB the game loses tons of frames; this doesn't happen on the 290X. Now I know you're looking at the 390 vs the 970, but seriously...
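For what it's worth, the slowdown matches the 970's published memory layout: 3.5 GB on a fast partition and 0.5 GB on a much slower one. A rough sketch of the effect, using the commonly cited bandwidth figures (assumptions, not my own measurements):

```python
# GTX 970 split-memory sketch. Commonly cited figures (assumed here):
# 3.5 GB at ~196 GB/s, last 0.5 GB at ~28 GB/s.

FAST_GB, FAST_BW = 3.5, 196.0   # capacity (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(used_gb: float) -> float:
    """Weighted-average bandwidth if `used_gb` of VRAM is streamed once."""
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, used_gb - FAST_GB)
    # Time to touch each region once; bandwidth = total bytes / total time.
    elapsed = fast / FAST_BW + slow / SLOW_BW
    return used_gb / elapsed

print(round(effective_bandwidth(3.5), 1))  # 196.0 -> full speed
print(round(effective_bandwidth(4.0), 1))  # 112.0 -> sharp drop past 3.5 GB
```

Under that (simplified, uniform-access) assumption, touching all 4 GB roughly halves the effective bandwidth, which lines up with the frame drops described above.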

DirectX 12 is coming... We know for a fact that DX12 works well with the 290X and 390; we do NOT know if it will end up working well for older/current Nvidia cards. It might just be something that doesn't suit their architecture. For example, Litecoin mining works better (last I knew) on AMD than Nvidia, not because one is better than the other but because they are built differently. DX12 has been in developers' hands long enough that you would think Nvidia would be doing everything they could to address those troubling benchmarks. Maybe they have something coming, but I think it's going to be a little one-sided (towards AMD) until Nvidia comes out with new cards...

So are you buying the wrong card for next year? Or are you planning on replacing it in a year? If you plan on replacing it, are you prepared for its value to plummet like a rock if the old cards never live up to the DX12 performance?

I'm not a fan of either company, period.

I do extensive research and decide based on what I know at the time and the price. Wattage never matters to me. I don't buy into the AMD vs Nvidia driver stuff, even though it is a fact that Nvidia is better about doing day-one drivers and pays out a lot more money to developers. That said, Nvidia is also good at abandoning driver development for previous-gen cards. But even AMD has JUST (yesterday) announced that any card that isn't GCN is now a legacy card... BUT GCN is likely to be supported for a very long time. This is similar to what happened with the cards that didn't do DX11: they went legacy eventually, but the 5800s were supported all the way up until just now, and that card is from 2009!

ALSO AMD dropped their prices I think...

right now

R9 390X & R9 390
Original MSRP : $429 & $329
Down To $379 & $279 - $359 & $259 After Rebates

Read more: http://wccftech.com/amd-r9-fury-x-nano-price-cuts/#ixzz3scqHJGbg

So the R9 390 is $259 after MIR

BUT

As far as I can tell... the 970 and the 390 are the same price on Newegg: both $260 after $20 or $30 rebates. HOWEVER! I WOULD NEVER EVER EVER buy the 970 after the experience I have had. I would buy the 980...

That said, we are also comparing 4GB to 8GB... you can definitely get an 8GB 390 for $260, like the XFX Radeon R9 390 DirectX 12 R9-390P-8256 8GB 512-Bit GDDR5 PCI Express 3.0 CrossFireX Support Double Dissipation XXX OC Video Card on Newegg.

So you could run into frame-rate issues on the 970 before it even hits 4GB at 1080p resolutions. Like I said, at 1920x1200, even after reducing several settings, I still had issues (I even reduced scaling, so it was technically less than 1920x1200)... and with the AMD card you'd have over 2x the truly usable memory...

Lastly, the only other thing I can think about is frame times. Nvidia seems to have steadier frame times if you can stay well below 3500MB of VRAM usage. When you use almost all the RAM, the frame times suffer because of that horribly handicapped last 500MB or so. I think some cards suffer worse than others, depending on whether they were made a 970 because the chip couldn't handle being a 980, or were made a 970 purely because of demand. I am not certain, but it is also possible that which parts are defective and therefore disabled causes different amounts of performance degradation. I assume one of those two things is the case, since some samples are worse than others. I know several people who bought 970s, and I seem to have the worst one, but all of them experience the same issue I do, and two people I know returned their 970s for 980s. They are both Nvidia-only users and won't touch AMD, so for them it wasn't a choice between the 970 and the 290X (at the time); it was a choice of 970 or 980, and everything pointed at the 970 being nearly as good as the 980 and the 980 therefore not worth the price difference.


----------



## specopsFI

I find the pro-AMD posters of the last few pages a bit amusing. This is a very thorough thread on the matter of 390 vs 970. So why is it that you still feel the need to post walls of text to make the point that the 390 is better than the 970, again and again? As far as I can tell, the only reason seems to be that some people point out that the 390 and the 970 are DIFFERENT. They have DIFFERENT pros and cons. The clear consensus is that the 390 has more pros. So why do so many of you have a problem with someone pointing out the FEWER pros of the 970? And how can it be so hard to acknowledge the very obvious fact that if a certain individual has a use case that leans on exactly those strong points of the 970, then the obvious choice in that individual case is the 970?

If nothing else then at least leave me out of your quotes. I did own a 970 and returned it to buy a 290 instead. And why the 290? Because frankly, it was and partially still is an even better deal than the 390, which in turn I consider to be a better card than the 970. FOR ME. Not to everyone. If someone is better served by Nvidia, then by all means.


----------



## Melan

It feels more like people trying to justify their own purchase rather than give advice on information presented.

I clearly stated I don't care about your DX12. I asked for advice based on stuff I do now. What happens in the future is not your problem.


----------



## daunow

Well, I think I have coil whine, or whatever the sound my card is emitting is called... sad.

It only makes this sound when it's being utilized heavily, e.g. running Fallout 4 at ultra.

https://u.teknik.io/24HZ3m.mp3

Sometimes you hear it go away, because whenever I alt+tab out of Fallout it stops making that noise.


----------



## BurgerRipper

Quote:


> Originally Posted by *specopsFI*
> 
> I find the pro-AMD posters of the last few pages a bit amusing. This is a very thorough thread on the matter of 390 vs 970. So why is it that you still feel the need to post walls of text on the matter to make the point on 390 being better than 970 again and again? As far as I can tell, the only reason seems to be that some people point out that the 390 and the 970 are DIFFERENT. They have DIFFERENT pros and cons. The clear consensus is that the 390 has more pros. So why is it that so many of you have a problem in someone pointing out the FEWER pros of the 970? And how can it be so hard to acknowledge the very obvious fact that if a certain individual has an usage case that leans on exactly those strong points of the 970, then the obvious choice in that individual case is the 970?
> 
> If nothing else then at least leave me out of your quotes. I did own a 970 and returned it to buy a 290 instead. And why the 290? Because frankly, it was and partially still is an even better deal than the 390, which in turn I consider to be a better card than the 970. FOR ME. Not to everyone. If someone is better served by Nvidia, then by all means.


I'd have done the same if the 200-series hardware were ready for DirectX 12, but it's obviously a better deal than the 970 and the 380X/380.

People, please help us find good Black Friday offers. All the good ones are gone and we couldn't get them.


----------



## Ha-Nocri

The 290 is as ready for DX12 as the 390 is. There is no difference in that regard...


----------



## doct0rthrill

390 is a 2xx. get Fury or wait for Pascal.


----------



## Stige

Quote:


> Originally Posted by *doct0rthrill*
> 
> 390 is a 2xx. get Fury or wait for Pascal.


No it's not. Feel free to google.

It isn't the same card under new name, not even close.


----------



## iRUSH

Quote:


> Originally Posted by *Stige*
> 
> No it's not. Feel free to google.
> 
> It isn't the same card under new name, not even close.


Huh? You're saying it's not the same card with more VRAM, binned chips and a new BIOS, but a completely new card altogether?

The 290/390 and the 290x/390x are not the same, at all?


----------



## specopsFI

Quote:


> Originally Posted by *Stige*
> 
> No it's not. Feel free to google.
> 
> It isn't the same card under new name, not even close.


Sigh









http://forums.guru3d.com/showpost.php?p=5109102&postcount=106


----------



## iinversion

Quote:


> Originally Posted by *Stige*
> 
> No it's not. Feel free to google.
> 
> It isn't the same card under new name, not even close.


Lol yes it is. It's 99% the same card if you ignore the VRAM differences. Clock for clock they perform identically, regardless of whatever "changes" AMD says they made.

HardOCP tested this when they were released.
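The clock-for-clock point is easy to sanity-check from the public spec sheets (both chips have 2560 shaders; the clocks below are the reference ones, so treat them as assumptions for any particular board):

```python
# Reference-spec sketch for R9 290 vs R9 390 (public spec-sheet figures):
# same 2560-shader Hawaii/Grenada chip, only clocks and VRAM differ.

SHADERS = 2560
R9_290_MHZ, R9_390_MHZ = 947, 1000

def tflops(shaders: int, mhz: int) -> float:
    # 2 FLOPs per shader per cycle (fused multiply-add)
    return shaders * 2 * mhz * 1e6 / 1e12

print(round(tflops(SHADERS, R9_290_MHZ), 2))  # 4.85
print(round(tflops(SHADERS, R9_390_MHZ), 2))  # 5.12
print(f"{(R9_390_MHZ / R9_290_MHZ - 1) * 100:.1f}% clock advantage")
```

So at reference clocks the 390's advantage is a ~5.6% clock bump plus the extra VRAM, which is consistent with the clock-for-clock testing mentioned above.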


----------



## outofmyheadyo

I'd much rather have the intuitive and well-polished Nvidia drivers than try to sort out the crap that AMD calls drivers. They couldn't make drivers 10 years ago and they still can't.


----------



## rickcooperjr

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Id much rather have the inntuitive and well polished nvidia drivers rather then try to sort out the crap that amd calls drivers. They couldnt make drivers 10 years ago and they still cant.


Well-polished Nvidia drivers? Did I seriously just read that? Nvidia has gotten a very bad rap in the past year or more with their faulty drivers and game-breaking driver issues, not to mention they have released drivers that flat out killed Nvidia GPUs. Seriously, man, that is a very uneducated response.

AMD's drivers have stepped up a lot in the past year or so. Yes, they have had a few buggy releases; Crimson has been buggy for some people, but for me Crimson works flawlessly. So PLZ do your research first: Nvidia has gotten a bit lazy on the driver front, and people will tell you this.


----------



## Tivan

Quote:


> Originally Posted by *lacrossewacker*
> 
> That only validates the concern people *had* with the AMD drivers holding back the GPUs.....
> 
> Not that I'm complaining


Having less-than-ideal drivers doesn't make them bad. They don't crash (for me at least), and they still have significantly less overhead than Intel graphics drivers...

Also, the DX9 drivers have been very competitive with Nvidia for single-GPU usage for the longest time (which is what I play, and they said they fixed the multi-GPU issues in DX9 with Crimson).

With HD 6000 and prior support being gone, I expect _great things_ with regard to DX11 driver improvements in the future, but yeah, they need work on that end to catch up to Nvidia.


----------



## tj3n123

I have read many articles and reviews of both cards (mainly aiming for the MSI ones); my conclusion (for the MSI brand) is:

MSI AMD R9 390 Gaming 8G:
Pros:
- Better performance at higher resolutions due to more VRAM
- A little cheaper
- New driver seems great (the vid below surely used an older driver; the new driver MIGHT make this card better at 1080p now, 1 - 4% higher fps vs. the older driver)
- Great DX12 support
- Surely better with 2x CrossFire
Cons:
- Power consumption is nearly 2x higher
- Very hot under load (from most reviews on Newegg)
- Not much room for OC because of the already high clock and temps
- Bigger, heavier

MSI NVIDIA GTX 970 Gaming 4G:
Pros:
- Really good at 1080p, might be better than the R9 390: https://www.youtube.com/watch?v=MIrv2JWlJ-s
- Cool technology from Nvidia
- More games partner with Nvidia than AMD
- Less power consumption
- Less heat
- More OC potential (being able to go up to 1500 is crazy)
Cons:
- OC is a gamble
- 3.5GB of fast RAM, causing stuttering in some games with a lot of AA or anything higher than 1080p
- Narrower memory bus: 256-bit vs. 512-bit on the R9
- A bit more expensive
- Lower scores in DX12 benchmarks

-> Conclusion: clearly the ATI GTX R9 970 is the winner here
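For the bus-width line in the cons above, the raw numbers work out like this (reference GDDR5 effective data rates assumed: 7.0 Gbps per pin on the 970, 6.0 Gbps on the 390):

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# Data rates are the reference-spec ones (assumptions for any given board).

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) * (effective Gbps per pin)."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 7.0))  # 224.0 -> GTX 970 (peak, ignoring the 3.5 GB split)
print(bandwidth_gbs(512, 6.0))  # 384.0 -> R9 390
```

The 390's wider bus more than offsets its slower memory clock, which is part of why it scales better at 1440p and above.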









Please feel free to correct me







Hope this can help some people


----------



## Stige

Get an R9 390 with a proper cooler and it will overclock more than fine.
And the only "cool tech" Nvidia has is CUDA, and that only matters if you are into video editing and stuff like that, really.

Also, wth is "more games compatible with Nvidia than AMD"?? That makes zero sense; these days all games work fine on all cards 90% of the time at release.


----------



## tj3n123

Quote:


> Originally Posted by *Stige*
> 
> Get a R9 390 with a proper cooler and it will overclock more than fine.
> And only "cool tech" Nvidia has is cuda cores, and only if you are into video editing and stuff like that really.
> 
> Also wth "more games compatible with nvidia than amd"?? That makes zero sense at all, all games work on all cards these days fine 90% of the time at release.


Wow, you seem like a serious AMD fanboy. My list is based on what's around the internet, aimed at normal tech people who have no idea which to get, and *MAINLY MSI*.

I think 99% of people won't try to fit a custom cooler on a GPU lol.

By tech I mean the software stuff is somewhat better, plus faster driver updates that don't usually have serious bugs like the overheating one I read about earlier (http://www.pcgamer.com/hardware-report-card-nvidia-vs-amd/#page-1).

More games compatible means more games that shake hands with Nvidia and thus *MIGHT* have better performance. That's not always true, but I'm saying more games show the Nvidia brand at startup, not AMD.

And all those points are pretty minimal and can be ignored; performance/TDP/temps are the most important, and the vid says it all. Most people use Jayz's vid to say that the R9 is better, but his GTX 970 doesn't have that good an OC and is a lot cheaper than the MSI one. The only truly great price/performance 970s are the MSI Gaming 4G and the Gigabyte G1 Gaming. At >1080p, or with 2 cards, or for future-proofing, the R9 is still the winner though.


----------



## BinaryDemon

Quote:


> Originally Posted by *Stige*
> 
> Also wth "more games compatible with nvidia than amd"?? That makes zero sense at all, all games work on all cards these days fine 90% of the time at release.


I wouldn't say it makes zero sense, given how popular Nvidia's GameWorks middleware is with developers. Games will often release initially with better tuning for Nvidia cards. AMD is forced to play catch-up and release drivers tweaked for that game; that is, provided the game is popular enough for AMD to bother with this level of optimization.


----------



## Stige

Who said anything about custom coolers? I said a proper cooler. Not like this crappy ASUS DC3 STRIX that I have, which has the VRM on fire every time I launch a game...
There are much better alternatives out there, like the Sapphire Vapor-X, which doesn't have any problems with temps at all.

And by Nvidia GameWorks you mean HairWorks and that other derp stuff that no card will run at reasonable framerates? Not the ones we are talking about in here, at least.


----------



## tj3n123

Lol, of course more expensive = better. That's why I'm comparing two cards both from *MSI*, which have almost the same cooling architecture and close prices to each other. Of course people can choose a higher-priced R9 with a better cooling solution; that's no problem at all.

I didn't say anything about GameWorks. I said the software overall is better: faster driver updates, fewer serious bugs, and other stuff like ShadowPlay performing great while Raptr is more limited. Both GameWorks and TressFX give bad performance, so there's no point talking about that. Both have pros and cons, and I researched the exact brand, not any other. I'm not blindly a fanboy arguing for nonsense; I just collect information from the web and put it here for other people still considering which to purchase.


----------



## rickcooperjr

Quote:


> Originally Posted by *tj3n123*
> 
> Lol ofc more expensive = better, thats why im comparing 2 card both from *MSI*, which have almost the same cooling architecture and close price to each other, ofc they can choose a higher price R9 which have better cooling solution, thats no prob at all
> 
> I dont say anything about the Gamework, i say the software overall is better, faster driver update, less serious bugs, other stuff like ShadowPlay perform great while Raptr is more limited. Both the GameWork and the TressFX give bad performance so its totally no point talking about that. Both have pros and cons and i have research for the exact brand, not any other, Im not a blindly fanboy and argue for nonsense stuff, i just simply collect information on the web and put it here for other ppl who still consider which to purchase


Nvidia GameWorks features like HairWorks drag even Nvidia Titan Xs into the ground and do even worse on AMD, because of the shady way Nvidia implemented and locked down the GameWorks software. Not even the developers can see or modify/optimize the code; they get what they get and that is that. AMD's TressFX, on the other hand, is open source and functions very well on both AMD and Nvidia hardware. So PLZ understand this: Nvidia GameWorks is often referred to as Nvidia Gimpworks or Nvidia Failworks because even the devs don't like it, but they use it because there's not much else out there. AMD is about to release an open-source alternative a lot like GameWorks for this reason, and game/engine devs are now letting out a lot of chatter about it; it is buzzing.


----------



## tj3n123

Quote:


> Originally Posted by *rickcooperjr*
> 
> Nvidia gameworks hairworks and such drag Nvidia Titan X's into the ground and does even worse on AMD it is because of the shady way Nvidia implimented stuff with the gameworks software and locked it down. So not even the developers can see or modify / optimize the code they get what they get and that is that but AMD TressFX is opensource and functions very well on AMD and Nvidia hardware. So PLZ understand this Nvidia gameworks is often reffered to as Nvidia gimpworks or Nvidia failworks because even the dev's don't like it but they use it because not much else out there but AMD is about to release a open source version alot like gameworks for this reason and game / engine DEV's are now letting out alot of chatter about it and it is buzzing.


Good point







I agree that GameWorks is stupid compared to TressFX, but it's just software and can be improved, so it's fine I suppose. The hardware of an already-made card can't be, though. AMD seems to have gotten new technology recently, and maybe their next card will completely outperform Nvidia... not sure though.


----------



## BinaryDemon

Quote:


> Originally Posted by *Stige*
> 
> And by Nvidia Gameworks you mean Hairworks and that other derp stuff that no card will run at reasonable framerates? Not ones we are talking about in here atleast.


GameWorks is more than just HairWorks.
Quote:


> Originally Posted by *rickcooperjr*
> 
> Nvidia gameworks hairworks and such drag Nvidia Titan X's into the ground and does even worse on AMD it is because of the shady way Nvidia implimented stuff with the gameworks software and locked it down. So not even the developers can see or modify / optimize the code they get what they get and that is that but AMD TressFX is opensource and functions very well on AMD and Nvidia hardware. So PLZ understand this Nvidia gameworks is often reffered to as Nvidia gimpworks or Nvidia failworks because even the dev's don't like it but they use it because not much else out there but AMD is about to release a open source version alot like gameworks for this reason and game / engine DEV's are now letting out alot of chatter about it and it is buzzing.


Developers hate that they don't get access to the source code, but GameWorks is still a tool that makes their job exponentially easier. It's not like Nvidia requires you to use GameWorks, developers are free to implement their own equivalent FX functions but that is significantly more coding.


----------



## rickcooperjr

Quote:


> Originally Posted by *tj3n123*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Nvidia gameworks hairworks and such drag Nvidia Titan X's into the ground and does even worse on AMD it is because of the shady way Nvidia implimented stuff with the gameworks software and locked it down. So not even the developers can see or modify / optimize the code they get what they get and that is that but AMD TressFX is opensource and functions very well on AMD and Nvidia hardware. So PLZ understand this Nvidia gameworks is often reffered to as Nvidia gimpworks or Nvidia failworks because even the dev's don't like it but they use it because not much else out there but AMD is about to release a open source version alot like gameworks for this reason and game / engine DEV's are now letting out alot of chatter about it and it is buzzing.
> 
> 
> 
> Good point
> 
> 
> 
> 
> 
> 
> 
> I agree that Gamework is stupid compare to TressFX though, but its just software and can be improved, so its fine i suppose, but the hardware for already made card then cant, AMD seems got new technology recently and maybe their next card is completely outcome NVidia...not sure though
Click to expand...

GameWorks can only be improved by Nvidia. Nvidia locks out the game/engine devs, gives them only minimal access to the code, and gives everyone else the finger. The problem with all software from Nvidia is that it is locked down so hard that only Nvidia gets to even see the content. GameWorks has been out for a while, yet it is just as biased as it was on release, if not more so. In other words it has actually gotten worse, because it even discriminates against older Nvidia hardware, anything but the current lineup (900 series), meaning GTX 700 series, Kepler, and Fermi owners get the shaft.

The point is that game/engine devs need to see the source code. They are given incentives from Nvidia to use GameWorks but are forced to do so blindly, and that is a big issue. That is why almost all GameWorks titles release with issues even on Nvidia cards, to the point of being almost game-breaking: only Nvidia can see the issue and fix it, while the game/engine devs can only chase their own tails with no real ability to fix it, because GameWorks itself is the issue. We all know how long it takes Nvidia to fix any GameWorks issues; it's a long time before they do a thing about any of them, and they let the game/engine devs take the flak, pushing it onto them like their crap don't stink.


----------



## jmcosta

Quote:


> Originally Posted by *rickcooperjr*
> 
> Gameworks can only be improved by Nvidia Nvidia locks out the game / engine dev's and only gives them minimum access to the code and gives everyone else the finger. The problem with all software from Nvidia is it is locked down so hard that only Nvidia get to even see the content also Nvidia gameworks has been out for a while yet it is just as biased if not more so than it was on release in otherwards it has actually gotten worse because it even discrimenates against older Nvidia hardware anything but current lineup (900 series) meaning GTX 700 series owners get the shaft or keplar / fermi.
> 
> The point is game / engine DEV's need to see the source code they are given incentives from Nvidia to use gameworks but are forced to do so blindly and that is a big issue that is why almost all gameworks titles release with issues for even Nvidia cards to point theyre almost game breaking because only Nvidia can see the issue and fix it and well the game / engine DEV's are only able to chase theyre own tail with no real capability of fixxing the issue because it is the gameworks itself that is the issue. We all know how long it takes Nvidia to fix any gameworks issues it is a longtime before they do a thing abotu any of them and they let the game / engine DEV's get the flak and nvidia just push's it onto them like theyre crap don't stink.


Being locked down or not, lots of GameWorks effects perform equally, or almost the same...















The 980 GTX is a bit stronger, so there's a small difference..
I know some of their effects perform terribly on AMD GPUs, but that's rare.

I don't like this GameWorks situation either, but if you don't like it, just disable it and it's like nothing happened... that's exactly what I do: if the effect doesn't fit the game's atmosphere or tanks the performance, I simply turn it off..

People make this a big deal lol.

Also, Nvidia has nothing to do with bad development in ports; most of them, with or without GameWorks, are crap by default.

Anyway, to stay on topic: between these two cards (970 vs 390) it's more a matter of personal preference. They are very similar; the 390 performs maybe 5% better, but its frame times aren't good in some games, and CPU overhead/drivers are another issue.


----------



## mcg75

Quote:


> Originally Posted by *BinaryDemon*
> 
> Developers hate that they don't get access to the source code, but GameWorks is still a tool that makes their job exponentially easier. It's not like Nvidia requires you to use GameWorks, developers are free to implement their own equivalent FX functions but that is significantly more coding.


The developer is able to license and see the source code.
Quote:


> "You can't take that source code and then give it to someone else that doesn't have a license. You have to keep it inside your company," he explained. "But [game developers] are allowed to make changes to it. In fact, what would be the purpose of the source code if they couldn't change it? They can change it, if they want they can optimize for someone else's hardware or put in changes that improve the performance on AMD or something like that, and that's totally fine; that's within the rules of the source license."
> 
> "We do give out source code under a license to the GameWorks modules. The only ones we don't are Nvidia specific, that only run on Nvidia. Most of them don't fall into that bucket. That's not what AMD is complaining about anyway, because those things are not possibly crippling AMD because they don't run on AMD, things like TXAA. For all the other modules that run everywhere, things like HBAO+ and HairWorks, we do offer that under a source license agreement. In addition to that, we have this separate thing that doesn't require any kind of licensing agreement; these samples we have for OpenGL and DirectX, and we also have samples for Android and all the stuff on our GameWorks website right now."


Source.

The problem is the developer is using gameworks to save themselves money.

I welcome AMD introducing their own set of tools to do the same as I've called for it in the past. It's the only fair thing to do.

However, I don't see anyone really using it unless AMD works with them to implement it like Nvidia does.

TressFX has been available for quite a while now, and we haven't seen it beyond Tomb Raider in any major titles.


----------



## NightAntilli

Quote:


> Originally Posted by *daunow*
> 
> You do realize they open source everything because they have to? right?


Why do you say that?

I might already know the answer, but I still want to see what your reason is.


----------



## daunow

Quote:


> Originally Posted by *NightAntilli*
> 
> Why do you say that?
> 
> I might already know the answer, but I still want to see what your reason is.


Since you already know my answer, might as well see your reasoning instead.


----------



## Stige

Quote:


> Originally Posted by *daunow*
> 
> You do realize they open source everything because they have to? right?


Or because they are not dicks like Derpvidia is?


----------



## rickcooperjr

Quote:


> Originally Posted by *Stige*
> 
> Quote:
> 
> 
> 
> Originally Posted by *daunow*
> 
> You do realize they open source everything because they have to? right?
> 
> 
> 
> Or because they are not dicks like Derpvidia is?
Click to expand...

Could not agree more. AMD has pretty much always gone open source with their stuff to try to better the entire community, not sandboxing/isolating things the way Apple and Nvidia do. Seriously, this kind of habit holds the entire community back, because it stifles innovation and locks people and companies out of core essential things.

Imagine if Nvidia tried to help the community as a whole by not sandboxing their stuff. It would help everyone and further improve general technology and software, which would greatly benefit games and game performance across the board. But no, they have to play things as they do and be greedy/power hungry. The thing is, Nvidia is almost a monopoly as it is. This is why Intel has partnered with AMD on the GPUOpen stuff and adopted AMD's FreeSync, and many game/engine devs have also said they will do so because of all the backlash from Nvidia GameWorks and that whole fiasco constantly repeating with every title release that uses it.

Intel and FreeSync: http://www.maximumpc.com/intel-pledges-support-for-freesync-where-does-that-leave-g-sync/


----------



## Stige

And just look at Nvidia's useless HairWorks etc. No one uses that crap because it doesn't work properly. MAYBE if it were open source, maybe...


----------



## phenom01

Quote:


> Originally Posted by *rickcooperjr*
> 
> could not agree more AMD has always went open source with theyre stuff pretty much always to try to better entire community and not sandbox / isolate things like APPLE and Nvidia does seriously this kind of habit holds the entire community back because it stifles innovation and isolates people and companies out of core essential things.


AMD makes Intel pay a fee for every x86 CPU they make. They don't choose to open source this stuff; it's simply that they can't compete anymore and need help. Apple and Nvidia don't "open source" most tech because they are massively profitable and know people will pay the money, and thus they don't have to.


----------



## rickcooperjr

Quote:


> Originally Posted by *phenom01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> could not agree more AMD has always went open source with theyre stuff pretty much always to try to better entire community and not sandbox / isolate things like APPLE and Nvidia does seriously this kind of habit holds the entire community back because it stifles innovation and isolates people and companies out of core essential things.
> 
> 
> 
> AMD makes Intel pay a fee for every x86 CPU made. They dont choose to open source this stuff its just simply they cant compete anymore and need help. Apple and nvidia dont "open source" most tech is because they are massively profitable and know people will pay the money thus dont have to.
Click to expand...

Well, that is because Intel forced them to; otherwise AMD would have to pay for things Intel has patents on. AMD holds the x86 patent and Intel holds others, so they both agreed to pay each other to use one another's patents.

The whole Intel and AMD royalty thing is a huge conversation that truly can't be fully dived into in such a small thread, and it has been done so many times already. In short, AMD holds patents that Intel pays to use, and AMD pays Intel to use some of their patents. Both are afraid of giving up this bit of power, because it is what holds the status quo.


----------



## phenom01

Quote:


> Originally Posted by *rickcooperjr*
> 
> well that is because Intel forced them to do so because otherwise AMD would have to pay for things Intel has patents on AMD holds the X86 patent and Intel holds another so they both agreed to pay each other to use one anothers patents.


They don't agree; they sue each other in billion-dollar lawsuits until one gets the upper hand and uses it as a market advantage. Intel and Nvidia have won at that, while AMD is constantly posting declining sales numbers and stagnant performance increases on their main product: their CPUs.

Intel and Nvidia stock are trading at around $33-35 at the moment; AMD is at $2.97. That says it all.


----------



## rickcooperjr

Quote:


> Originally Posted by *phenom01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> well that is because Intel forced them to do so because otherwise AMD would have to pay for things Intel has patents on AMD holds the X86 patent and Intel holds another so they both agreed to pay each other to use one anothers patents.
> 
> 
> 
> They dont agree they sue each other in billions of dollars in law suits til one gets the upper hand and uses it as a market advantage. Which Intel and Nvidia have won at as AMD is constantly posting down sales number and stagnant performance increases on their main product....their CPU's.
Click to expand...

Well, Nvidia technically lost the right to call their graphics chips a "GPU" in mobile devices; their patent was completely thrown out. It's still a bit of a blurry mess as to what all of it means, but it's very interesting.

https://www.techpowerup.com/218629/nvidia-stares-at-sales-ban-as-us-itc-rules-in-samsungs-favor-in-patent-dispute.html

Also a very good read: one of Nvidia's patents was completely thrown out on grounds of invalidity. http://wccftech.com/nvidia-invent-gpu-itc-court-rules-samsungs-favor/


----------



## Awsan

You will never get a direct answer!

1-Buy both
2-Test both
3-Use the chosen one and return the fallen one
4-???????
5-Profit?


----------



## BinaryDemon

Quote:


> Originally Posted by *phenom01*
> 
> AMD makes Intel pay a fee for every x86 CPU they make.


Are you sure that's accurate? I thought they have an equal footing cross-licensing agreement.


----------



## rickcooperjr

Quote:


> Originally Posted by *BinaryDemon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *phenom01*
> 
> AMD makes Intel pay a fee for every x86 CPU they make.
> 
> 
> 
> Are you sure that's accurate? I thought they have an equal footing cross-licensing agreement.
Click to expand...

So did I, but I wasn't sure.


----------



## daunow

Quote:


> Originally Posted by *Stige*
> 
> Or because they are not dicks like Derpvidia is?


ok


----------



## mcg75

Quote:


> Originally Posted by *NightAntilli*
> 
> Yeah... They say developers are allowed to optimize for AMD with the source code, but it doesn't work that way in practice. The source code itself may not be shared, as stated. Making the optimization in the game and being unable to show AMD the source code in actuality means that the developer alone must make the optimization, with limited knowledge of how the graphics hardware and drivers of the AMD cards work, since they may not have support from AMD. This in actuality makes it impossible to do it. Legally, what they say is true, but they are very aware that in practice it's not going to happen, and they deliberately keep the same practice.
> nVidia lies to people's faces with the truth. Disgusting.


You hit the nail square on the head when you said support.

I agree with you partially about Nvidia. They are shady and self-serving, 100%, no doubt. But tell me you aren't falling for AMD's line that they need the source code to optimize? Having it makes optimization easier, no doubt, but it can be done without it. We've seen too many gains for Gameworks games in AMD driver release notes for them not to be able to optimize without it.

Let's look at a couple games for examples. Watch Dogs was the one that started all this nonsense. What did Watch Dogs have? HBAO+ and TXAA. So one effect that would run on AMD cards and one that would not. That was somehow game breaking for AMD despite the fact that HBAO+ was documented as giving the same fps penalty for both brands.

Move on to GTA V. Nvidia soft shadows and TXAA make an appearance. We didn't get stories of game breaking issues because of Gameworks being present. The difference?

AMD worked with Rockstar during development. They even had a driver ready for the game's release. Have they ever had one ready for a Gameworks game at launch?

The only thing truly shady about Gameworks is the use of tessellation. Nvidia built their cards with tessellation in mind; AMD did not. Using more tessellation than needed has been documented many times, and it most certainly would put AMD at a disadvantage.
Quote:


> Originally Posted by *NightAntilli*
> 
> Exactly, and this happens in multiple ways. nVidia basically 'bribes' developers, paying them to implement GameWorks, pretty much like a sponsor. This gives them additional income and at the same time drives down development costs since less time is needed developing certain effects. The long term consequences is developers that have to rely on these effects and haven't implemented their own optimizations, meaning they become more dependent on GameWorks and any of their games that won't use it will either be extremely expensive to develop or be coded like crap. The other obvious drawback is that even if they see the source code and they optimize, they are only known with nVidia hardware, basically being limited to explore AMD hardware at all, even if the AMD hardware is superior. GameWorks on the long term is bad for both AMD and the gaming industry as a whole. It's actually even bad for nVidia themselves. They are focusing on their newest products, and people have been realizing that they practically dumped graphics cards any older than 1-2 years when it comes to GameWorks support.
> It's called GPUOpen, and of course it's open source for the benefit of the whole community, rather than their own pockets.


If it's truly bad for Nvidia then hopefully it turns around and bites them in the butt.

Yeah, it's GPUOpen and I'm sure TressFX is probably part of it. I still think without AMD pouring money into it, it's not going to be adopted. Developers will opt for their own solution before that unless there is an incentive. Hopefully I'm wrong but it's doubtful.

And GPUOpen is a benefit for AMD that will go into their pockets if it works. All of this open-standards stuff is simply positive PR for AMD trying to win their market share back. The number-two guy always makes sacrifices that the number-one guy won't; if I were in their position, I'd do the same. If they were the ones with 80% market share, do you think shareholders would let them give away a technology that could be making them money, handing their biggest rivals a chance to catch them? Again, very doubtful.

We have two companies that exist to make money with two different strategies to do it. The strategy they chose is based on where they sit in the market. One abuses their position at the top, the other tells us stories with half-truths so we jump to their side instead thinking they are the good guys. I'm sorry but there are no good guys when it comes to giant corporations.


----------



## NightAntilli

Quote:


> Originally Posted by *BinaryDemon*
> 
> Are you sure that's accurate? I thought they have an equal footing cross-licensing agreement.


Let's put this to rest;

_AMD began life as a second-source supplier for companies using Intel processors. Companies like IBM didn't want to rely solely on Intel for one of the primary components in their computers, so they licensed AMD to produce versions of processors like the 8088 and 80286. While these CPUs were manufactured by AMD (and, in some cases, AMD was actually able to clock the CPUs higher than their Intel counterparts), almost everything about their designs came from Intel.

Beginning with Intel's 80386 in 1985, Intel stopped giving AMD access to its designs. AMD had to forge its own way, soon producing 386 and later 486 CPUs that were essentially reverse-engineered versions of Intel's parts._
http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/

~~~~~~~~~~~~~~~~~~~~~~~~~~

_The K8's biggest benefit for servers, though, was its 64-bit extensions. The extensions enabled AMD's chips to run 64-bit operating systems that could address more than 4GB of memory at a time, but they didn't sacrifice compatibility or speed when running then-standard 32-bit operating systems and applications. These extensions would go on to become the industry standard, beating out Intel's alternate 64-bit Itanium architecture-Intel even licensed the AMD64 extensions for its own compatible x86-64 implementation. (Intel's initial approach could only run x86 code with an immense performance penalty.)_
http://arstechnica.com/business/2013/04/amd-on-ropes-from-the-top-of-the-mountain-to-the-deepest-valleys/

AMD could've played the jerk and refused to license it, letting Intel do the work to reverse-engineer it, just as AMD had to in the past. Obviously this is a small part of the whole history of Intel vs AMD. AMD is no saint, and they've made a LOT of mistakes that cost them and that they're still paying for.

But in any case, this is an architecture. Obviously, making the core of your architecture open source is suicide. There has to be some 'secret sauce', as some call it, and AMD's architecture is full of it. Lately, concurrent async compute has come to light as one of the biggest advantages of GCN. Their strength is their overall architecture. This is why they have been pushing Mantle and Vulkan: their architecture's strengths come to light a lot more easily with them. They are using their closed architecture to push open-source software.

Going back to GameWorks and nVidia vs AMD: when we talk about GameWorks, we're not really talking about architecture licensing. We're talking about software licensed to target one specific architecture and not be optimized for any other, arguably even to gimp other architectures. There is no 'secret sauce' here, simply licensed restriction: code that can be modified only as nVidia wishes. They can play to their architecture's strengths, but they're also limiting others from doing the same. Whether they deliberately gimp AMD cards, who can know?

The equivalent of this would be like a modem vendor finding out a great way to reduce latency. Rather than putting it in their modems and announce it as a feature, they create an InternetWorks program that has to be installed on the side of the ISP, and any ISP that wishes to take advantage of this, needs to pay a license to let the modem vendor implement it for them, and if any other modem is used, they will have increased latency instead.

AMD and nVidia are hardware vendors. They should be competing on hardware, not locking out each other's hardware with software. We can't know who has the better hardware if it's all hidden behind smoke and mirrors.

Ask yourself this. Why don't we have a game that has the logo of both companies on it? Why don't we have a game that has both TressFX and Hairworks?


----------



## BinaryDemon

Quote:


> Originally Posted by *NightAntilli*
> 
> Let's put this to rest;


I lived through most of that. I understand we've reached the point where basically neither company can produce a modern CPU without using the other's intellectual property.

I was specifically trying to find out where phenom01 got the idea that Intel currently pays AMD licensing fees or royalties on x86 CPUs.

As far as I can tell, neither pays the other anything. It seems they have a cross-licensing agreement which also includes an agreement not to sue each other.


----------



## mcg75

Quote:


> Originally Posted by *NightAntilli*
> 
> Ask yourself this. Why don't we have a game that has the logo of both companies on it? Why don't we have a game that has both TressFX and Hairworks?


Ok, I asked myself and the answer is Grand Theft Auto V. Both companies have their own version of shadows embedded in the game. PCSS for Nvidia and Contact Hardening Shadows for AMD.


----------



## NightAntilli

Quote:


> Originally Posted by *mcg75*
> 
> Ok, I asked myself and the answer is Grand Theft Auto V. Both companies have their own version of shadows embedded in the game. PCSS for Nvidia and Contact Hardening Shadows for AMD.


Hm. I was unaware of this. GTA is not really a game on my radar lol, but this is actually how it should be. If it stays like this, I have no problem with GameWorks. AMD's solution looks better, but it seems to be more buggy also... But then again, GTA V was not really proclaimed to be a GameWorks title as far as I know, but simply used GameWorks techniques. I could be wrong though. Generally, setting a game to default Ultra settings turns on GameWorks effects in GameWorks games. Does GTA V do the same? Thanks for the example anyway. Is this the first/only title to do this?


----------



## mcg75

Quote:


> Originally Posted by *NightAntilli*
> 
> Hm. I was unaware of this. GTA is not really a game on my radar lol, but this is actually how it should be. If it stays like this, I have no problem with GameWorks. AMD's solution looks better, but it seems to be more buggy also... But then again, GTA V was not really proclaimed to be a GameWorks title as far as I know, but simply used GameWorks techniques. I could be wrong though. Generally, setting a game to default Ultra settings turns on GameWorks effects in GameWorks games. Does GTA V do the same? Thanks for the example anyway. Is this the first/only title to do this?


I agree that this is the way it should be. And it is the only game I can remember that does.

I don't play the game, I'm not a fan, so I'm not sure how the settings work. But I'm pretty sure the features from one will not run on the other.

It's not a Gameworks official title. You'll notice that most or all of official Gameworks games end up being bundled with new cards.

But even though it's not official, it still has Gameworks features and therefore Gameworks code built in. Despite this, AMD had a release day driver ready because they put the effort into working with Rockstar to do it.


----------



## NightAntilli

That means Rockstar was able to communicate with AMD for it, which means nVidia did not limit the developer from communicating in this case. In official GameWorks titles, that limitation definitely applies.

As for the features running on each other's cards, they do. In other words, nVidia's PCSS runs on AMD hardware perfectly, and AMD's CHS runs on nVidia hardware, although with some visual bugs (shadows popping in & out).


----------



## mcg75

Quote:


> Originally Posted by *NightAntilli*
> 
> That means Rockstar was able to communicate with AMD for it, and that means nVidia did not limit the developer from communicating in this case. In GameWorks titles this is definitely the case.


You do realize that there is zero proof of that right?
Quote:


> Originally Posted by *NightAntilli*
> 
> As for the features running on each other's cards, they do. In other words, nVidia's PCSS runs on AMD hardware perfectly, and AMD's CHS runs on nVidia hardware, although with some visual bugs (shadows popping in & out).


So how did AMD optimize for PCSS to run perfectly if they cannot see the code to do so?

When it comes to Gameworks, neither AMD or Nvdia are telling us the whole truth. Believing everything either one says will do far more harm to PC gaming than Gameworks ever could.


----------



## tj3n123

Quote:


> Originally Posted by *Awsan*
> 
> You will never get a direct answer!
> 
> 1-Buy both
> 2-Test both
> 3-Use the chosen one and return the fallen one
> 4-???????
> 5-Profit?


If only I had enough coin for that, but I don't.


----------



## diggiddi

This review by Roboyto might help
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380#post_23458320


----------



## NightAntilli

Quote:


> Originally Posted by *mcg75*
> 
> You do realize that there is zero proof of that right?
> So how did AMD optimize for PCSS to run perfectly if they cannot see the code to do so?


Either AMD was allowed to assist in optimizing the source code for the effect in this game, or Rockstar knows AMD's architecture/drivers very well. It's well established that developers who license source code under the GameWorks program cannot show it to anyone else. It's not exactly rocket science. But since this isn't really a GameWorks title, things might have gone differently? Not sure.
Quote:


> Originally Posted by *mcg75*
> 
> When it comes to Gameworks, neither AMD or Nvdia are telling us the whole truth. Believing everything either one says will do far more harm to PC gaming than Gameworks ever could.


What AMD has been saying makes a lot more sense than what nVidia has been saying, and all the evidence of these 'coincidences' that gimp hardware doesn't exactly support nVidia's story; it actually favors AMD's. And even if AMD is lying, I don't see how believing AMD's side can harm PC gaming. Right now the GameWorks program is nothing more than a forced integration of a bunch of visual gimmicks that are nice to have now and then, and that hamper performance more than they need to. If the effects could be implemented freely, it would be a win for everyone, except maybe nVidia.
Quote:


> Originally Posted by *diggiddi*
> 
> This review by Roboyto might help
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380#post_23458320


It's a nice comparison. The gripe I have with it is that the CPUs are not the same, and the R9 290, which is more CPU-dependent than the GTX 970, gets the slower CPU. How significant is that? Unsure, but it might have been better balanced if the CPUs were swapped. The averages wouldn't be influenced much, I think, but the minimums would.


----------



## Mazda6i07

So which is better overclocked @ 1080p then? I feel like they're the same card performance wise....


----------



## Ha-Nocri

Quote:


> Originally Posted by *Mazda6i07*
> 
> So which is better overclocked @ 1080p then? I feel like they're the same card performance wise....


@1080p they are about the same; maybe the 390 is a bit faster. Whether you want more VRAM or lower power consumption is up to you.


----------



## Mazda6i07

Well, at the moment the PSU I'm using is overkill, so I'm not too concerned about power usage. I guess I'll end up buying whichever is cheapest on Newegg at the moment. I plan on getting either a second one at the end of next year or a new card altogether. We'll see.


----------



## Allseeing

What I like about the 970 is its power draw. It performs close to the 390 yet draws much less power, which means lower temps and bills. You try getting a 390 down to 29°C.


----------



## Mazda6i07

Yeah, my only concern is temps inside the case getting outrageous, so I'm hoping my D-15 can keep up on the CPU end. It's a real toss-up, pretty much down to preference these days it seems.


----------



## iRUSH

Quote:


> Originally Posted by *Allseeing*
> 
> What I like about the 970 is its voltage. It performs close to the 390 yet has a much lower voltage witch means lower temps and bills. You try getting a 390 down to 29c.


There's plenty of testing out there showing that the difference between two cards like this has little bearing on one's power bill.
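For anyone curious, the arithmetic is easy to sketch out. A rough back-of-the-envelope estimate (the ~100 W gap, gaming hours, and electricity rate below are illustrative assumptions, not measured figures):

```python
# Rough annual electricity-cost difference between two graphics cards.
# All inputs here are illustrative assumptions, not measurements.

def annual_cost_diff(watt_gap, hours_per_day, price_per_kwh):
    """Extra cost per year from drawing `watt_gap` more watts while gaming."""
    kwh_per_year = watt_gap / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. a ~100 W gap under load, 3 h of gaming a day, $0.12/kWh:
print(f"${annual_cost_diff(100, 3, 0.12):.2f} per year")  # $13.14 per year
```

Even doubling the hours or the rate keeps the difference in beer-money territory, which is why it barely shows up on a bill.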


----------



## Mazda6i07

Pretty much. That's why I'm just seeing which I can get cheaper and going that route, which seems to be the 390 at the moment. So I guess I'll go with that; I can always switch to Nvidia whenever they release their new cards.


----------



## Schmuckley

Not sure what's going on here, but I'd pick the 390 all day.
Is it $250 like I paid for my GTX 970?
One thing I've noticed about the 970 is that while it runs cool and all, it really heats up the reservoir water.
It could be my weird BIOS tweaks and whatnot, but I don't think so.


----------



## Mazda6i07

Just ordered a 390


----------



## iRUSH

Quote:


> Originally Posted by *Mazda6i07*
> 
> Just ordered a 390


Awesome, which one did you buy?


----------



## Mazda6i07

I just went with the MSI one to match my entire build. I could have gotten the Asus GPU cheaper since it's on sale in the email deals, but I don't really care, since I wanted it all to match. Once I have the card, I'll try to OC it to a solid stable clock and be good to go.


----------



## PontiacGTX

Quote:


> Originally Posted by *Mazda6i07*
> 
> I just went with the Msi one to match my entire build, could have gotten the Asus gpu cheaper since it's on sale in email deals, but dont really care since i wanted it all to match. So once i have the card, ill try to oc it to a solid stable clock and be good to go.


what was the price?


----------



## Mazda6i07

Still a bit high in my opinion, but the MSI was $340. It comes with a $10 mail-in rebate, so $330.
The Asus had a great email deal; it ended up being around $310. Good price for the performance, in my opinion, at $300.
Prices from Newegg.


----------



## rpnp7

Hey guys,
If you are building a Skylake gaming PC right now, as of January 12th 2016, and want to game at 1440p, which one of these would you personally go with? The biggest reason why would be appreciated too!

1) Gigabyte G1 Gaming R9 390 [$436 CAD]
2) MSI Radeon R9 390 Twinfrozr V [$480 CAD]
3) Gigabyte GTX 970 Xtreme [$525 CAD] -- the new one with high core/boost clocks and massive OC potential
4) Gigabyte G1 Gaming R9 390X [$555 CAD]
5) MSI Radeon R9 390X TwinFrozr V [$620 CAD]

I compared each card to each other using the following website:
http://www.game-debate.com/gpu/index.php?gid=3462&gid2=3078&compare=geforce-gtx-970-gigabyte-xtreme-4gb-edition-vs-radeon-r9-390x-msi-gaming-8gb-edition

The "Gigabyte Gtx 970 Xtreme [$525]" beat them all according to the website.

The build:
Intel i7 6700k
Asus ROG Maximus VIII Impact Mini-Itx
Corsair Carbide air 240
16gb kingston ram
corsair h100i gtx cpu cooler
EVGA 650W G2
GPU: ?

The things I do most often on my machine:
- AutoCAD
- Engineering work with massive images which need editing
- A lot of engineering work (mechanical engineering)
- Games: I'm a huge FPS & MMORPG fan. Examples: World of Warcraft, Black Desert, Tera, BF3, BF4, BF Hardline, Black Ops 3
- A lot of web surfing

I really like the power efficiency of the GTX 970, and of course it being an Nvidia card, etc. ... but every time I try to click "buy" on a GTX 970, my head's like: the 390 & 390X have 4GB more VRAM and are more future-proof, what are you doing spending the same amount on a less future-proof card that costs the same or more?

Thanks.


----------



## iRUSH

Quote:


> Originally Posted by *rpnp7*
> 
> Hey guys,
> If you are making a Skylake gaming PC right now as of January 12th 2016 & want to game at 1440p, which one of these would your personally go with? Biggest reason as to why would be appreciated to!
> 
> 1) Gigabyte G1 Gaming R9 390 [$436 CAD]
> 2) MSI Radeon R9 390 Twinfrozr V [$480 CAD]
> 3) Gigabyte Gtx 970 Xtreme [$525 CAD] -- new 1 that came out with massive core & boost & massive OC potential
> 4) Gigabyte G1 Gaming R9 390X [$555 CAD]
> 5) MSI Radeon R9 390X TwinFrozr V [$620 CAD]
> 
> I compared each card to each other using the following website:
> http://www.game-debate.com/gpu/index.php?gid=3462&gid2=3078&compare=geforce-gtx-970-gigabyte-xtreme-4gb-edition-vs-radeon-r9-390x-msi-gaming-8gb-edition
> 
> The "Gigabyte Gtx 970 Xtreme [$525]" beat them all according to the website.
> 
> The build:
> Intel i7 6700k
> Asus ROG Maximus VIII Impact Mini-Itx
> Corsair Carbide air 240
> 16gb kingston ram
> corsair h100i gtx cpu cooler
> EVGA 650W G2
> GPU: ?
> 
> The things i really do often on my machine are the following:
> -Autocad
> -Engineering work with massive images which need editing
> -A-lot of engineering work (mechanical engineering)
> -Games: I'm a huge FPS & MMORPG fan. examples) World of warcraft, black desert, tera, bf3, bf4, bf hardline, black ops 3
> -A-lot of web surfing
> 
> I really like the power efficiency of the Gtx 970 & of-course it being a Nvidia card, etc ... but every time i try to click "buy" on a Gtx 970 my heads like: but the 390 & 390x has 4gb more vram and it's more future proof, wth are you doing spending the same amount on a less future proof type card that costs the same/more.
> 
> Thanks.


A build like this at 1440p calls for a 980 Ti or a Fury, in my opinion.


----------



## TopicClocker

Quote:


> Originally Posted by *rpnp7*
> 
> Hey guys,
> If you are making a Skylake gaming PC right now as of January 12th 2016 & want to game at 1440p, which one of these would your personally go with? Biggest reason as to why would be appreciated to!
> 
> 1) Gigabyte G1 Gaming R9 390 [$436 CAD]
> 2) MSI Radeon R9 390 Twinfrozr V [$480 CAD]
> 3) Gigabyte Gtx 970 Xtreme [$525 CAD] -- new 1 that came out with massive core & boost & massive OC potential
> 4) Gigabyte G1 Gaming R9 390X [$555 CAD]
> 5) MSI Radeon R9 390X TwinFrozr V [$620 CAD]
> 
> I compared each card to each other using the following website:
> http://www.game-debate.com/gpu/index.php?gid=3462&gid2=3078&compare=geforce-gtx-970-gigabyte-xtreme-4gb-edition-vs-radeon-r9-390x-msi-gaming-8gb-edition
> 
> The "Gigabyte Gtx 970 Xtreme [$525]" beat them all according to the website.
> 
> The build:
> Intel i7 6700k
> Asus ROG Maximus VIII Impact Mini-Itx
> Corsair Carbide air 240
> 16gb kingston ram
> corsair h100i gtx cpu cooler
> EVGA 650W G2
> GPU: ?
> 
> The things i really do often on my machine are the following:
> -Autocad
> -Engineering work with massive images which need editing
> -A-lot of engineering work (mechanical engineering)
> -Games: I'm a huge FPS & MMORPG fan. examples) World of warcraft, black desert, tera, bf3, bf4, bf hardline, black ops 3
> -A-lot of web surfing
> 
> I really like the power efficiency of the Gtx 970 & of-course it being a Nvidia card, etc ... but every time i try to click "buy" on a Gtx 970 my heads like: but the 390 & 390x has 4gb more vram and it's more future proof, wth are you doing spending the same amount on a less future proof type card that costs the same/more.
> 
> Thanks.


If you're going heavily for future-proofing, then I would go for the R9 390/390X cards. The GTX 970 and R9 390 perform quite similarly at 1080p, but that 8GB of VRAM is a great addition.
The R9 390 and 390X also perform notably better at higher resolutions such as 1440p.

However, the GTX 970 is quite likely to perform better in CPU-bound situations in games like MMOs, due to Nvidia's lower CPU overhead in DX11 games.

So you're going for 1440p gaming? What kind of graphical settings and frame-rate target are you aiming for?

EDIT: 390th post!


----------



## rpnp7

Quote:


> Originally Posted by *TopicClocker*
> 
> If you're going heavily for future proofing then I would go for the R9 390/390X cards, the GTX 970 and R9 390 perform quite similar at 1080p, however that 8GB of VRAM is a great addition.
> The R9 390 and 390X also perform notably better at higher resolutions such as 1440p.
> 
> However The GTX 970 is quite likely to perform better in CPU bound situations in games like MMOs due to there being less CPU overhead in games under DX11.
> 
> So you're going for 1440p gaming? What kind of graphical settings and frame-rate target are you aiming for?
> 
> EDIT: 390th post!


I want everything turned up to ultra, BUT I only play MMORPGs and FPS games.
Mostly arena and PvE MMORPGs.
WoW, Tera, Black Desert. FPS like BF4, BF Hardline, Black Ops 3.

That new GTX 970 Xtreme, if you look in depth, easily surpasses a reference GTX 980 without being manually overclocked. It can hit a 1650 MHz boost and stay under 60°C.
It's newly released, so not many people have taken a look at what it has to offer.

The GTX 980 Ti Xtreme beats the Titan X.


----------



## PontiacGTX

Quote:


> Originally Posted by *rpnp7*
> 
> I want ultra everything turned up BUT i only play mmorpg games & fps games.
> Mostly arena mmorpg games and pve mmorpg.
> Wow, tera, black desert. Fps like: bf4, bf hard line, black ops 3.


If you'll mostly be playing WoW: WoD and other quite CPU-bound MMORPGs, get the 970.
For the BF series and Battlefront, the 390/290X/390X.

also there is a 290x for 450cad
http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150696&AID=10657534


----------



## rpnp7

Quote:


> Originally Posted by *PontiacGTX*
> 
> if you will play wow wod, and some quite cpu bound mmorpg most of the time then get the 970
> for BF series and battlefront the 390/290x/390x
> 
> also there is a 290x for 450cad
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150696&AID=10657534


I care about aesthetics a lot too. I want a RED LED LOGO.


----------



## TopicClocker

Quote:


> Originally Posted by *rpnp7*
> 
> Hey guys,
> If you are making a Skylake gaming PC right now as of January 12th 2016 & want to game at 1440p, which one of these would your personally go with? Biggest reason as to why would be appreciated to!
> 
> -Games: I'm a huge FPS & MMORPG fan. examples) World of warcraft, black desert, tera, bf3, bf4, bf hardline, black ops 3
> -A-lot of web surfing


Quote:


> Originally Posted by *rpnp7*
> 
> *I want ultra everything turned up BUT i only play mmorpg games & fps games.*
> Mostly arena mmorpg games and pve mmorpg.
> Wow, tera, black desert. Fps like: bf4, bf hard line, black ops 3.
> 
> *That new gtx 970 xtreme, if you look in depth it surpasses reference gtx 980 easily without being manually overclocked. it can hit 1650 boost & stay under 60 degree celsius.*
> it's newly released so not many people have taken a look of what it has to offer.
> 
> The gtx 980Ti Xtreme beats titan X.


I haven't played Tera in months, and I haven't seen too many benchmarks for it recently.
Battlefield 4 runs great on the GTX 970 and the R9 390/390X cards. I played Black Ops 3 during a free weekend at the highest settings and it ran really well, as in 60+ fps, but I didn't run it at 1440p.

1440p Ultra settings with those cards is going to hit the frame-rate pretty hard in the more demanding games, but for most MMOs it should be fine... Unless it's Guild Wars 2.

The GTX 970 Xtreme is just like any other GTX 970, and what overclock you can get really depends on what is called the silicon lottery and whether the card has good clocking memory chips or not.

Providing the GPU overclocks well, it can match a stock GTX 980 or exceed it by a small amount.
Somewhere within the range of a 1450-1500MHz core and 7600-8000MHz memory should do it.
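As a quick sanity check on what those memory clocks buy you: GDDR5 bandwidth is just the effective data rate times the bus width. A small sketch (the 256-bit bus is the GTX 970's actual spec; the clocks are the stock and overclock figures above):

```python
def mem_bandwidth_gbs(effective_mhz, bus_width_bits):
    """Memory bandwidth in GB/s: effective data rate (MHz) times bus width in bytes."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(mem_bandwidth_gbs(7000, 256))  # stock GTX 970 (7 GHz effective): 224.0 GB/s
print(mem_bandwidth_gbs(8000, 256))  # 8 GHz overclock: 256.0 GB/s
```

So an 8 GHz memory overclock is roughly a 14% bandwidth bump over stock, which is why good memory chips matter as much as the core clock.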

I played Black Desert a few months ago and it ran pretty well on my GTX 970 at 1080p. I think I might have had the graphics settings one notch down, though, because I remember one of the Ultra settings would hit the frame rate pretty hard.

I really should have included my graphics settings in this video, I'm not sure why I haven't, but you can see how it performs in the top left of the screen.


----------



## rpnp7

Quote:


> Originally Posted by *TopicClocker*
> 
> I haven't played Tera in months, and I haven't seen too many benchmarks for it as of recently.
> Battlefield 4 runs great on on the GTX 970 and the R9 390 and 390X cards, I played Black Ops 3 during a free weekend and ran it at the highest settings, it ran really well, as in 60+ fps but I didn't run it at 1440p.
> 
> 1440p Ultra settings with those cards is going to hit the frame-rate pretty hard in the more demanding games, but for most MMOs it should be fine... Unless it's Guild Wars 2.
> 
> The GTX 970 Xtreme is just like any other GTX 970, and what overclock you can get really depends on what is called the silicon lottery and whether the card has good clocking memory chips or not.
> 
> Providing the GPU overclocks well, it can match a stock GTX 980 or exceed it by a small amount.
> Somewhere within the range of a 1450-1500MHz core and 7600-8000MHz memory should do it.
> 
> I played Black Desert a few months ago and it ran pretty well on my GTX 970 at 1080p. I think I might have had the graphics settings one notch down though, because I remember one of the Ultra settings would hit the frame-rate pretty hard.
> 
> I really should have included my graphics settings in this video; I'm not sure why I didn't, but you can see how it performs in the top left of the screen.


The Gigabyte GTX 970 Xtreme is not like every other GTX 970, and there is no silicon lottery with it, because Gigabyte does "GPU Gauntlet Sorting" for these cards.
Meaning they only use chips that have already won the silicon lottery.


----------



## ITAngel

I own the Gigabyte GTX 970 Extreme Edition and I used to own the MSI R9 290X Lightning Edition. I feel that my GTX 970 is a beast when it comes to games; I get better low and high frames, and that was not even in OC mode, compared to the MSI which I kept OCed. Also, when the system is idle the fans are off on the GTX 970, which I love, and I'm now getting the Dark Rock Pro 3 to go along with it. I should have a pretty silent case for audio recording. Right now I want to grab another Gigabyte GTX 970 Extreme. Also keep in mind I game on a 1080p 60Hz LED LCD monitor.







So I can only speak based on this resolution.


----------



## iRUSH

Quote:


> Originally Posted by *ITAngel*
> 
> I own the Gigabyte GTX 970 Extreme Edition and I used to own the MSI R9 290X Lightning Edition. I feel that my GTX 970 is a beast when it comes to games; I get better low and high frames, and that was not even in OC mode, compared to the MSI which I kept OCed. Also, when the system is idle the fans are off on the GTX 970, which I love, and I'm now getting the Dark Rock Pro 3 to go along with it. I should have a pretty silent case for audio recording. Right now I want to grab another Gigabyte GTX 970 Extreme. Also keep in mind I game on a 1080p 60Hz LED LCD monitor.
> 
> 
> 
> 
> 
> 
> 
> So I can only speak based on this resolution.


That's a good looking build you have there


----------



## ITAngel

Quote:


> Originally Posted by *iRUSH*
> 
> That's a good looking build you have there


Thanks iRUSH, soon it will look better and sound better with the new CPU cooler. XD I only keep that system at 4.2GHz even though it can do 4.6GHz easily.







Which is why I don't mind letting go of the Noctua for some extra quietness.


----------



## TopicClocker

Quote:


> Originally Posted by *rpnp7*
> 
> The Gigabyte GTX 970 Xtreme is not like every other GTX 970, and there is no silicon lottery with it, because Gigabyte does "GPU Gauntlet Sorting" for these cards.
> Meaning they only use chips that have already won the silicon lottery.


There is always a silicon lottery, not every GPU will overclock the same.

The G1 Gaming also has GPU Gauntlet Sorting.

http://www.gigabyte.com/products/product-page.aspx?pid=5209#ov
Quote:


> Originally Posted by *ITAngel*
> 
> I own the Gigabyte GTX 970 Extreme Edition and I use to own the MSI R9 290X Lightning Edition. I feel that my GTX 970 is a beast when it comes to games, I get better low and high frames and that was not even in OC mode compare to the MSI which I kept OCed. Also when the system is not gaming and doing anything the fans are off on the GTX 970 which i love and now getting the Dark Rock PRO 3 to go along with it. I should so have a pretty silence case for audio recording. Right now I want to grab another Gigabyte GTX 970 Extreme, also keep in mind I game with a 1080p 60Hz LED LCD monitor.
> 
> 
> 
> 
> 
> 
> 
> So I can only speak base on this resolution.


That looks really nice!

I've contemplated getting a second GTX 970 for SLI, maybe a GTX 970 Xtreme if the price is right, but I'm trying to fight the temptation due to the new GPUs coming out this year.

The VRAM concerns me a little bit but I'm not planning on 4K Gaming, just 1080p and 1440p at the most, so I'm hoping the ram holds up well there in newer games.

If the new GPUs are coming out within 2 months then I might run SLI, if they're coming out in 2-3+ months then I'll probably wait.


----------



## ITAngel

Quote:


> Originally Posted by *TopicClocker*
> 
> There is always a silicon lottery, not every GPU will overclock the same.
> 
> The G1 Gaming also has GPU Gauntlet Sorting.
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=5209#ov
> That looks really nice!
> 
> I've contemplated getting a second GTX 970 for SLI, maybe a GTX 970 Xtreme if the price is right, but I'm trying to fight the temptation due to the new GPUs coming out this year.
> 
> The VRAM concerns me a little bit but I'm not planning on 4K Gaming, just 1080p and 1440p at the most, so I'm hoping the ram holds up well there in newer games.
> 
> If the new GPUs are coming out within 2 months then I might run SLI, if they're coming out in 2-3+ months then I'll probably wait.


Thanks TopicClocker! Working on it, which is a slow process. That's why I am waiting for this other CPU cooler to get here, so I can replace the one in there and see if I can SLI the system.







Other than that, I am trying to keep it black and silver as much as possible. lol


----------



## Stige

Quote:


> Originally Posted by *TopicClocker*
> 
> If you're going heavily for future proofing then I would go for the R9 390/390X cards, the GTX 970 and R9 390 perform quite similar at 1080p, however that 8GB of VRAM is a great addition.
> The R9 390 and 390X also perform notably better at higher resolutions such as 1440p.
> 
> However The GTX 970 is quite likely to perform better in CPU bound situations in games like MMOs due to there being less CPU overhead in games under DX11.
> 
> So you're going for 1440p gaming? What kind of graphical settings and frame-rate target are you aiming for?
> 
> EDIT: 390th post!


The R9 390 will outperform the 970 anywhere these days thanks to the two latest AMD drivers, overclocked or not.
The only reason ever to even consider the 970 would be if you are some environmentalist hippy who thinks power consumption matters, or you run some crappy 400W PSU.

R9 390 > GTX 970, any day, any time. And it will work about 100 times better a year from now than the GTX 970 will; games will start pushing that 3.5GB limit in the future even at 1080p, and they already do so easily at 1440p.


----------



## rpnp7

Quote:


> Originally Posted by *Stige*
> 
> The R9 390 will outperform the 970 anywhere these days thanks to the two latest AMD drivers, overclocked or not.
> The only reason ever to even consider the 970 would be if you are some environmentalist hippy who thinks power consumption matters, or you run some crappy 400W PSU.
> 
> R9 390 > GTX 970, any day, any time. And it will work about 100 times better a year from now than the GTX 970 will; games will start pushing that 3.5GB limit in the future even at 1080p, and they already do so easily at 1440p.


I would love to buy an R9 390 or R9 390X but can't find one with a RED LED logo.
Any advice?


----------



## rickcooperjr

Quote:


> Originally Posted by *Stige*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TopicClocker*
> 
> If you're going heavily for future proofing then I would go for the R9 390/390X cards, the GTX 970 and R9 390 perform quite similar at 1080p, however that 8GB of VRAM is a great addition.
> The R9 390 and 390X also perform notably better at higher resolutions such as 1440p.
> 
> However The GTX 970 is quite likely to perform better in CPU bound situations in games like MMOs due to there being less CPU overhead in games under DX11.
> 
> So you're going for 1440p gaming? What kind of graphical settings and frame-rate target are you aiming for?
> 
> EDIT: 390th post!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The R9 390 will outperform the 970 anywhere these days thanks to the two latest AMD drivers, overclocked or not.
> The only reason ever to even consider the 970 would be if you are some environmentalist hippy who thinks power consumption matters, or you run some crappy 400W PSU.
> 
> R9 390 > GTX 970, any day, any time. And it will work about 100 times better a year from now than the GTX 970 will; games will start pushing that 3.5GB limit in the future even at 1080p, and they already do so easily at 1440p.
Click to expand...

A lot of games already push the 3.5GB area at 1080p stock, and some push past 4GB. Look at modded Fallout 4 or Skyrim; they easily push into 6GB of VRAM usage and above. And trust me, if you haven't run Fallout 4 or Skyrim modded you are losing out big time, because some of the mods add so much to the game. Like True Storms and Wet on Fallout 4, OMG, those are phenomenal mods, and the sound fixes for True Storms are so immersive it is almost a new game with just those two mods.

I also want to point this out for the GTX 970 users: record your frame latencies and compare them with an R9 390 on the latest drivers. The 390 has much better frame times than a GTX 970, and if you run a game with mods and such, the issue stands out even more, because the frame times stay the same on the 390 but tank on the GTX 970.


----------



## Stige

Quote:


> Originally Posted by *rpnp7*
> 
> i would love to buy a r9 390 or r9 390x but can't find one with a RED LED LOGO.
> Any advice?


Buy one without? Or buy the Strix DC3, which has some sort of white LED, and put a red cover on it = red LED.


----------



## rpnp7

Quote:


> Originally Posted by *rickcooperjr*
> 
> A lot of games already push the 3.5GB area at 1080p stock, and some push past 4GB. Look at modded Fallout 4 or Skyrim; they easily push into 6GB of VRAM usage and above. And trust me, if you haven't run Fallout 4 or Skyrim modded you are losing out big time, because some of the mods add so much to the game. Like True Storms and Wet on Fallout 4, OMG, those are phenomenal mods, and the sound fixes for True Storms are so immersive it is almost a new game with just those two mods.


Can you recommend an R9 390 or R9 390X that has a RED LED logo?


----------



## ITAngel

Well, I am glad I don't game on those two titles, as this card or my 290X would have suffered from it anyway. Aside from that, I have seen my GPU pass the 3.5GB mark with no issues whatsoever, but that may be due to this new GPU over your standard one? The frames and rendering are noticeably nicer than on my old 290X.


----------



## rickcooperjr

Quote:


> Originally Posted by *rpnp7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> A lot of games already push the 3.5GB area at 1080p stock, and some push past 4GB. Look at modded Fallout 4 or Skyrim; they easily push into 6GB of VRAM usage and above. And trust me, if you haven't run Fallout 4 or Skyrim modded you are losing out big time, because some of the mods add so much to the game. Like True Storms and Wet on Fallout 4, OMG, those are phenomenal mods, and the sound fixes for True Storms are so immersive it is almost a new game with just those two mods.
> 
> 
> 
> Can you recommend an R9 390 or R9 390X that has a RED LED logo?
Click to expand...

Not really, no, but there are options without any LED on them; just use a red cathode and be done with it. Then you could run pretty much any GPU you want.


----------



## ITAngel

Quote:


> Originally Posted by *rpnp7*
> 
> Can you recommend an R9 390 or R9 390X that has a RED LED logo?


Maybe you can do a little custom work on the R9 390 LED/logo areas? I was looking at the MSI white lettering and it can be turned into different colors







with a little bit of paint.


----------



## rpnp7

Quote:


> Originally Posted by *rickcooperjr*
> 
> Not really, no, but there are options without any LED on them; just use a red cathode and be done with it. Then you could run pretty much any GPU you want.


A red cathode on the GPU's logo specifically? How so?


----------



## rickcooperjr

Quote:


> Originally Posted by *ITAngel*
> 
> Well I am glad I don't game on those two games as this card or my 290X would had suffer from it anyways. A side from that I have seen my GPU pass the 3.5GB mark and no issues what so ever but that may be due to this new GPU over your standard one? The frames and rendering are very noticeable over my old 290X are much nicer.


GTA V has issues with the GTX 970; Elite Dangerous and also Just Cause 3, I believe, have the issue as well.


----------



## rpnp7

Quote:


> Originally Posted by *ITAngel*
> 
> Maybe you can do a little custom work on the R9 390 LED/logo areas? I was looking at the MSI white lettering and it can be turned into different colors
> 
> 
> 
> 
> 
> 
> 
> with a little bit of paint.


How do you make that white LED a red LED?


----------



## rickcooperjr

Quote:


> Originally Posted by *rpnp7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Not really, no, but there are options without any LED on them; just use a red cathode and be done with it. Then you could run pretty much any GPU you want.
> 
> 
> 
> A red cathode on the GPU's logo specifically? How so?
Click to expand...

Well, with a red cathode you can simply position it and make the card glow, if you want it to look like the card already has the LED. Putting an LED directly on the card can be done too; most LEDs used on GPUs for this stuff are RGB, so you just need to do a few tweaks, and if they aren't, replace the current one with a red one and have a nice day.


----------



## ITAngel

Quote:


> Originally Posted by *rickcooperjr*
> 
> GTA V has issues with the GTX 970; Elite Dangerous and also Just Cause 3, I believe, have the issue as well.


I might have to try it. It's not my kind of game, but for the heck of science I am willing to test my card with it. I would much rather test Skyrim, as I like that game even though I have yet to play it. I used to mod the heck out of Oblivion and loved it back in the day.

Elite: Dangerous seems interesting to me; I have Star Citizen and it works fine there.


----------



## rickcooperjr

Quote:


> Originally Posted by *rpnp7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ITAngel*
> 
> Maybe you can do a little custom work on the R9 390 LED/logo areas? I was looking at the MSI white lettering and it can be turned into different colors
> 
> 
> 
> 
> 
> 
> 
> with a little bit of paint.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How do you make that white LED a red LED?
Click to expand...

Candy-colored red fingernail polish will do this. Candy colors are essentially transparent, so simply add thin coats until you get it red enough.

Because candy colors or paints are transparent, you can still see through a thin coat, so you can add one coat at a time to get it just red enough. Also, since it is fingernail polish, it is easy to remove: use a very minute amount of nail polish remover and rub only the painted area, and it will come off without harming the card. I tend to add a hard clear coat over the original decals and such before I do this, so if I goof there is no chance of harming the original finish or decals.


----------



## rickcooperjr

Quote:


> Originally Posted by *ITAngel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> GTA V has issues with the GTX 970; Elite Dangerous and also Just Cause 3, I believe, have the issue as well.
> 
> 
> 
> I might have to try it. It's not my kind of game, but for the heck of science I am willing to test my card with it. I would much rather test Skyrim, as I like that game even though I have yet to play it. I used to mod the heck out of Oblivion and loved it back in the day.
> 
> Elite: Dangerous seems interesting to me; I have Star Citizen and it works fine there.
Click to expand...

Check your frame times. A lot of games with a GTX 970 will say you're running 120 FPS or whatever, but if you check the frame times it is actually more like 60-75 FPS. This is a big issue with GTX 970s: it is a bit odd how the FPS counter reports roughly double what the frame times show is going on. This is why some games running at 45-60 FPS on GTX 970s seem laggy and stuttery/jerky; the actual true FPS is around 30. With modern AMD cards your frame times match your actual FPS more closely, while with the GTX 970 it is a bit weird and all over the place.

Please keep in mind that frame times are the true measure of FPS; this is what the FPS meters are supposed to go by, but in my testing with the GTX 970 it is just too odd and imprecise, and the numbers just don't match up like they do with AMD cards.
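For anyone who wants to check this themselves, here is a minimal sketch of the math (assuming you have exported per-frame render times in milliseconds, e.g. from a FRAPS-style frame-time log; the sample numbers below are made up for illustration). Frame time and FPS are just reciprocals, but averaging the frame times, and looking at the near-worst frames, shows what an on-screen FPS counter can hide:

```python
# Frame time (ms) and FPS are reciprocals: fps = 1000 / frametime_ms.
# Averaging frame times gives the real delivered rate, and the near-worst
# (99th percentile) frame is what you actually feel as stutter.

def fps_from_frametime(ms):
    return 1000.0 / ms

def analyze(frametimes_ms):
    frames = sorted(frametimes_ms)
    n = len(frames)
    avg = sum(frames) / n
    p99 = frames[min(n - 1, int(n * 0.99))]  # 99th-percentile frame time
    return {
        "avg_frametime_ms": avg,
        "effective_fps": fps_from_frametime(avg),
        "p99_frametime_ms": p99,
        "p99_fps": fps_from_frametime(p99),
    }

# Hypothetical capture: mostly smooth 8.3 ms frames with occasional 33.3 ms hitches.
sample = [8.3] * 95 + [33.3] * 5
print(analyze(sample))
```

Even though 95% of those hypothetical frames arrive at ~120 FPS, the hitches pull the effective rate down to roughly 105 FPS, and the 99th-percentile figure exposes the ~30 FPS moments that an averaged FPS counter smooths over.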


----------



## ITAngel

Quote:


> Originally Posted by *rickcooperjr*
> 
> Check your frame times. A lot of games with a GTX 970 will say you're running 120 FPS or whatever, but if you check the frame times it is actually more like 60-75 FPS. This is a big issue with GTX 970s: it is a bit odd how the FPS counter reports roughly double what the frame times show is going on. This is why some games running at 45-60 FPS on GTX 970s seem laggy and stuttery/jerky; the actual true FPS is around 30. With modern AMD cards your frame times match your actual FPS more closely, while with the GTX 970 it is a bit weird and all over the place.


Honestly, I have yet to experience anything you are saying with this card, but I did with the EVGA GTX 970 SSC. Hmm... I will run some tests this week to see.


----------



## rickcooperjr

Quote:


> Originally Posted by *ITAngel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Check your frame times. A lot of games with a GTX 970 will say you're running 120 FPS or whatever, but if you check the frame times it is actually more like 60-75 FPS. This is a big issue with GTX 970s: it is a bit odd how the FPS counter reports roughly double what the frame times show is going on. This is why some games running at 45-60 FPS on GTX 970s seem laggy and stuttery/jerky; the actual true FPS is around 30. With modern AMD cards your frame times match your actual FPS more closely, while with the GTX 970 it is a bit weird and all over the place.
> 
> 
> 
> Honestly, I have yet to experience anything you are saying with this card, but I did with the EVGA GTX 970 SSC. Hmm... I will run some tests this week to see.
Click to expand...

Compare your frame times to your recorded FPS and you will see a lot of discrepancies between them; they just won't match up. The FPS meter will say you're running 120 FPS while the frame times say 60-75 FPS or so. This is a major issue with the GTX 970s, and in fact the entire GTX 900 series lineup, but it takes a bit of hands-on work to see, so the average Joe has no clue about it. This is also why AMD currently does so well in VR: their frame times are more consistent, with much less variation, and varying frame times are what cause motion sickness, which is why AMD cards are preferred for VR in most cases.

They mention here how varying frame times cause motion sickness (VR sickness / simulator sickness): https://forums.oculus.com/viewtopic.php?t=170


----------



## ITAngel

Quote:


> Originally Posted by *rpnp7*
> 
> How do you make that white LED a red LED?


I would do it the way rickcooperjr mentioned; he has a much better idea of how to go about that. I myself don't do much customizing or painting, so I can't really guide you the best way; I just thought a little paint would help with that.


----------



## iRUSH

Quote:


> Originally Posted by *rickcooperjr*
> 
> Check your frame times. A lot of games with a GTX 970 will say you're running 120 FPS or whatever, but if you check the frame times it is actually more like 60-75 FPS. This is a big issue with GTX 970s: it is a bit odd how the FPS counter reports roughly double what the frame times show is going on. This is why some games running at 45-60 FPS on GTX 970s seem laggy and stuttery/jerky; the actual true FPS is around 30. With modern AMD cards your frame times match your actual FPS more closely, while with the GTX 970 it is a bit weird and all over the place.
> 
> Please keep in mind that frame times are the true measure of FPS; this is what the FPS meters are supposed to go by, but in my testing with the GTX 970 it is just too odd and imprecise, and the numbers just don't match up like they do with AMD cards.


Woah woah, I have an enormous amount of game time with a 290x and a 970 at 1080p 144 hz and have yet to experience this let alone read about it.

Now to be honest, I've been out of the loop since November, but I'm back in and will have to look into this more once my 970 arrives tomorrow.


----------



## rpnp7

Quote:


> Originally Posted by *iRUSH*
> 
> Woah woah, I have an enormous amount of game time with a 290x and a 970 at 1080p 144 hz and have yet to experience this let alone read about it.
> 
> Now to be honest, I've been out of the loop since November, but I'm back in and will have to look into this more once my 970 arrives tomorrow.


Just bought a GTX 970 now, iRUSH?


----------



## rickcooperjr

Quote:


> Originally Posted by *iRUSH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Check your frame times. A lot of games with a GTX 970 will say you're running 120 FPS or whatever, but if you check the frame times it is actually more like 60-75 FPS. This is a big issue with GTX 970s: it is a bit odd how the FPS counter reports roughly double what the frame times show is going on. This is why some games running at 45-60 FPS on GTX 970s seem laggy and stuttery/jerky; the actual true FPS is around 30. With modern AMD cards your frame times match your actual FPS more closely, while with the GTX 970 it is a bit weird and all over the place.
> 
> Please keep in mind that frame times are the true measure of FPS; this is what the FPS meters are supposed to go by, but in my testing with the GTX 970 it is just too odd and imprecise, and the numbers just don't match up like they do with AMD cards.
> 
> 
> 
> Woah woah, I have an enormous amount of game time with a 290x and a 970 at 1080p 144 hz and have yet to experience this let alone read about it.
> 
> Now to be honest, I've been out of the loop since November, but I'm back in and will have to look into this more once my 970 arrives tomorrow.
Click to expand...

The thing is, it is hard to see a difference above 60 FPS; some say it is hard above 45 FPS. I myself can tell the difference between 120 FPS and 60 FPS, and 45 vs. 60 FPS is a bit harder, but past 60 FPS things just feel smoother and more responsive, because at 60+ FPS the player response time drops drastically, to the point where you get a real advantage in competitive gaming, especially when running 120 FPS. Please measure your frame times; you will notice a huge variation between the reported FPS and what your recorded frame times show.

I used to do this a lot, but it is a lot of work to see first-hand, and I just got tired of arguing about it. Others would test it themselves and come back saying yes, you're right; it wasn't them agreeing that was the problem, it was that the argument beforehand was just not productive and often caused flame wars. The issue is that the GTX 900 series as a whole has weird variation in frame times; some say it is because of the on-the-fly color compression, others say it is due to the way their memory configuration works.

A good read on frame times and such: https://www.mvps.org/directx/articles/fps_versus_frame_time.htm

A good simple frame time converter: http://www.hardwarepal.com/frame-time-calculator-fps/. Put in the FPS the meter says you're running, compare the frame time it lists against the one you recorded during the same test run, and note the variation/discrepancies. You will see the GTX 900 series varies from the expected result much more often than modern AMD cards, meaning the AMD cards report closer to the actual FPS they're putting out, while the GTX 900 series is off by a substantial amount.
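The converter's math is just the reciprocal, so as a rough sketch you can do the comparison yourself (the counter reading and measured frame time below are hypothetical numbers for illustration):

```python
# FPS and frame time (ms) are reciprocals of each other.
def frametime_ms(fps_value):
    return 1000.0 / fps_value

def fps(frametime):
    return 1000.0 / frametime

reported_fps = 120         # hypothetical on-screen FPS counter reading
measured_frametime = 14.2  # hypothetical average frame time from your log, in ms

expected = frametime_ms(reported_fps)  # what 120 fps should look like: ~8.33 ms
delivered = fps(measured_frametime)    # what 14.2 ms frames really mean: ~70 fps
print(f"expected {expected:.2f} ms/frame, measured {measured_frametime} ms "
      f"-> only {delivered:.0f} fps actually delivered")
```

If the meter's claimed FPS and the frame-time-derived FPS diverge like that, the frame-time number is the one to trust.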

This is why AMD cards do so well in VR: their frame times and FPS are more solid, meaning less motion sickness and such. This is also why Leap Computing recommends the AMD 290X GPU for cloud gaming over the Titan X; the AMD cards have much less latency/delay and more solid frame times.

Watch from 3:20 in on the video. An R9 290 8GB is essentially an R9 390, so this is important and relevant: what Leap offered was the R9 290 8GB or the Titan X, and the R9 290 8GB ran circles around the Titan X. With recent drivers that R9 290 8GB has gained a lot more performance relative to the Titan X, so make your own conclusion here.


----------



## iRUSH

Quote:


> Originally Posted by *rickcooperjr*
> 
> The thing is, it is hard to see a difference above 60 FPS; some say it is hard above 45 FPS. I myself can tell the difference between 120 FPS and 60 FPS, and 45 vs. 60 FPS is a bit harder, but past 60 FPS things just feel smoother and more responsive, because at 60+ FPS the player response time drops drastically, to the point where you get a real advantage in competitive gaming, especially when running 120 FPS. Please measure your frame times; you will notice a huge variation between the reported FPS and what your recorded frame times show.
> 
> I used to do this a lot, but it is a lot of work to see first-hand, and I just got tired of arguing about it. Others would test it themselves and come back saying yes, you're right; it wasn't them agreeing that was the problem, it was that the argument beforehand was just not productive and often caused flame wars. The issue is that the GTX 900 series as a whole has weird variation in frame times; some say it is because of the on-the-fly color compression, others say it is due to the way their memory configuration works.
> 
> A good read on frame times and such: https://www.mvps.org/directx/articles/fps_versus_frame_time.htm
> 
> A good simple frame time converter: http://www.hardwarepal.com/frame-time-calculator-fps/. Put in the FPS the meter says you're running, compare the frame time it lists against the one you recorded during the same test run, and note the variation/discrepancies. You will see the GTX 900 series varies from the expected result much more often than modern AMD cards, meaning the AMD cards report closer to the actual FPS they're putting out, while the GTX 900 series is off by a substantial amount.
> 
> This is why AMD cards do so well in VR: their frame times and FPS are more solid, meaning less motion sickness and such. This is also why Leap Computing recommends the AMD 290X GPU for cloud gaming over the Titan X; the AMD cards have much less latency/delay and more solid frame times.


Thank you for the links. I'll read up on this.

I'm very sensitive to fps variance and can tell what fps I'm at within 10 fps till I get to 130ish. I can tell the difference between 120 and 144 too. After that it all feels and looks the same.

I cannot personally side between either choice for anyone other than me. The 970 suits me more because I only play online FPS titles, and for whatever reason, call it driver magic if you want, I am able to get a higher minimum FPS with the 970, always.

But I do not run AA, since it causes input lag, and I also do not require massive eye candy in general. Just my monitor's native resolution, maintaining my desired refresh rate, and staying above the server's tick rate.

Perhaps some of the reasons above are why I do not have the issues the 970 is allegedly plagued with?


----------



## mcg75

Quote:


> Originally Posted by *rickcooperjr*
> 
> Check your frame times. A lot of games with a GTX 970 will say you're running 120 FPS or whatever, but if you check the frame times it is actually more like 60-75 FPS. This is a big issue with GTX 970s: it is a bit odd how the FPS counter reports roughly double what the frame times show is going on. This is why some games running at 45-60 FPS on GTX 970s seem laggy and stuttery/jerky; the actual true FPS is around 30. With modern AMD cards your frame times match your actual FPS more closely, while with the GTX 970 it is a bit weird and all over the place.
> 
> Please keep in mind that frame times are the true measure of FPS; this is what the FPS meters are supposed to go by, but in my testing with the GTX 970 it is just too odd and imprecise, and the numbers just don't match up like they do with AMD cards.


Where is the published data confirming all of this?

https://www.youtube.com/watch?v=4JHuyLoYU4c

Digital Foundry has 970 vs. 390 frame time testing done on YouTube, and it does not match anything you are claiming here.

At this point, I'd buy a 390 over a 970 but certainly not because of frame times.


----------



## rickcooperjr

Quote:


> Originally Posted by *mcg75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Check your frame times. A lot of games with a GTX 970 will say you're running 120 FPS or whatever, but if you check the frame times it is actually more like 60-75 FPS. This is a big issue with GTX 970s: it is a bit odd how the FPS counter reports roughly double what the frame times show is going on. This is why some games running at 45-60 FPS on GTX 970s seem laggy and stuttery/jerky; the actual true FPS is around 30. With modern AMD cards your frame times match your actual FPS more closely, while with the GTX 970 it is a bit weird and all over the place.
> 
> Please keep in mind that frame times are the true measure of FPS; this is what the FPS meters are supposed to go by, but in my testing with the GTX 970 it is just too odd and imprecise, and the numbers just don't match up like they do with AMD cards.
> 
> 
> 
> Where is the published data confirming all of this?
> 
> https://www.youtube.com/watch?v=4JHuyLoYU4c
> 
> Digital Foundry has 970 vs 390 frame time testing done on youtube and it does not match anything you are claiming here.
> 
> At this point, I'd buy a 390 over a 970 but certainly not because of frame times.
Click to expand...

Man, that video really shows how the image quality is better on AMD. Seriously, it is often very blurry on the GTX 970 and crystal clear on the AMD R9 390, the color is crisper on the AMD side, and the textures look better on the 390 most of the time.


----------



## Smanci

Quote:


> Originally Posted by *rickcooperjr*
> 
> Man, that video really shows how the image quality is better on AMD. Seriously, it is often very blurry on the GTX 970 and crystal clear on the AMD R9 390, and the color is crisper on the AMD side also.


Didn't notice the awful stuttering with AMD, huh?







Stop spoiling the thread with FUD.


----------



## rickcooperjr

Quote:


> Originally Posted by *Smanci*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Man, that video really shows how the image quality is better on AMD. Seriously, it is often very blurry on the GTX 970 and crystal clear on the AMD R9 390, and the color is crisper on the AMD side also.
> 
> 
> 
> Didn't notice the awful stuttering with AMD, huh?
> 
> 
> 
> 
> 
> 
> 
> Stop spoiling the thread with FUD.
Click to expand...

That is not FUD; image quality has become a big topic in the Nvidia vs. AMD conversation at the moment. The point is that Nvidia gets better FPS with a reduction in image quality, while AMD keeps the image quality with slightly reduced performance. There are a few good examples; pay attention and the image quality on AMD stands out a lot in areas.


----------



## mcg75

Quote:


> Originally Posted by *rickcooperjr*
> 
> Man, that video really shows how the image quality is better on AMD. Seriously, it is often very blurry on the GTX 970 and crystal clear on the AMD R9 390, the color is crisper on the AMD side, and the textures look better on the 390 most of the time.


That's not what was asked.

Again, where is any published data backing up your claims of a 970 getting a lot less fps than reality based on frame times.

It's important to have factual data to back up claims like this. Let's see it.


----------



## rickcooperjr

Quote:


> Originally Posted by *mcg75*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> Man, that video really shows how the image quality is better on AMD. Seriously, it is often very blurry on the GTX 970 and crystal clear on the AMD R9 390, the color is crisper on the AMD side, and the textures look better on the 390 most of the time.
> 
> 
> 
> That's not what was asked.
> 
> Again, where is any published data backing up your claims of a 970 getting a lot less fps than reality based on frame times.
> 
> It's important to have factual data to back up claims like this. Let's see it.
Click to expand...

Well, I will have to look that up again. It was in another thread on here, a long list of links and testing to do with the 3.5 GB VRAM issue, so it could take a while to dig it all up. They did a ton of testing and there was a lot of info to sift through; a crazy amount of testing by different people confirmed it, and because it was a hardware issue with the GTX 900 series, drivers couldn't fix the core problem. Again, it was a few months back, but a lot of testing was done on it.

I remember they found it carried over into other cards in the GTX 900 series lineup, which pointed to a core hardware issue with Maxwell.

I will try to find all of it again. It was a few months back, and a huge amount of testing was done, and when I say huge I mean huge: not just one person but a bunch of them confirmed it, a few of them very high up in the OCN community, and they did unbiased hands-on testing over a few weeks.

I know there were five threads or so, each with 3-5 pages of testing and results confirming this, all by different people with similar results. I then did some testing of my own and found it to be true as well; I no longer have my GTX 970. I have a rig in the other room with 2x GTX 980 Tis, an i7 5960X @ 4.3 GHz, and 32 GB of DDR4 3200 MHz, but that one is my video editing and game streaming machine, I don't use it for anything else, and it is usually in use. Long story short, I got rid of my 3x R9 290Xs and am waiting on Zen and the new AMD GPU lineup to release, to see what their results are and go from there.

I want to point out that I am no fanboy; I have both Intel and AMD setups, and Nvidia and AMD setups. I buy what I like at the time for the task at hand. I will admit I don't make it widely public that I run such configs; I believe AMD needs more publicity at the moment, and I try to support them where I can unless they're completely unable to do the task at hand.


----------



## mcg75

Quote:


> Originally Posted by *rickcooperjr*
> 
> Well, I will have to look that up again. It was in another thread on here, a long list of links and testing to do with the 3.5 GB VRAM issue, so it could take a while to dig it all up. They did a ton of testing and there was a lot of info to sift through; a crazy amount of testing by different people confirmed it, and because it was a hardware issue with the GTX 900 series, drivers couldn't fix the core problem. Again, it was a few months back, but a lot of testing was done on it.
> 
> I remember they found it carried over into other cards in the GTX 900 series lineup, which pointed to a core hardware issue with Maxwell.
> 
> I will try to find all of it again. It was a few months back, and a huge amount of testing was done, and when I say huge I mean huge: not just one person but a bunch of them confirmed it, a few of them very high up in the OCN community, and they did unbiased hands-on testing over a few weeks.


If it has to do with the 3.5 GB VRAM issue, then it's not actually a frame time issue.

When you fill any card's frame buffer, you're going to have issues with fps and frame times.

If you have a game that's not filling the VRAM up and frame rates still fluctuate, then it's a frame time issue.
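This distinction (VRAM exhaustion vs. genuine frame-pacing trouble) is easy to check yourself if you log per-frame times with a capture tool such as FRAPS or PresentMon. A minimal sketch; the frame-time numbers below are made up for illustration:

```python
# Distinguish "low average fps" from "bad frame pacing" using logged
# frame times in milliseconds per frame. Smooth 60 fps = ~16.7 ms/frame.
def frame_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_ms = sum(frame_times_ms) / n
    avg_fps = 1000.0 / avg_ms
    # Simple stutter metric: fraction of frames taking >2x the average time.
    spikes = sum(1 for t in frame_times_ms if t > 2 * avg_ms)
    return avg_fps, spikes / n

# Two hypothetical logs with the SAME average fps:
smooth  = [16.7] * 100               # perfectly paced
stutter = [12.0] * 95 + [106.0] * 5  # same average, but five big hitches

fps_a, spike_a = frame_stats(smooth)
fps_b, spike_b = frame_stats(stutter)
print(f"smooth : {fps_a:.1f} fps, {spike_a:.0%} spike frames")
print(f"stutter: {fps_b:.1f} fps, {spike_b:.0%} spike frames")
```

Both logs report the same average FPS, but only the second one would feel stuttery in game, which is exactly why an FPS counter alone can't settle this argument.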


----------



## Awsan

Why is this still a thing??????


----------



## TopicClocker

Quote:


> Originally Posted by *Stige*
> 
> *R9 390 will outperform the 970 anywhere these days thanks to the two latest AMD drivers. Overclocked or not.*


Do you have any benchmarks proving this?


----------



## Stige

Quote:


> Originally Posted by *TopicClocker*
> 
> Do you have any benchmarks proving this?


Don't know if there are any, but after the Crimson drivers the FPS went up across all games, and the latest 16.1 hotfix also improved performance across the board, I think.

We'll probably have to wait for a review. Try Google; I don't know of any 16.1 reviews at least.


----------



## ITAngel

I did notice a pretty good improvement on the 290X LE that I had, and I am sure the new drivers improved things across the board on most cards. I got a pretty good FPS gain, and I meant to run a test and record the results, but I was too busy to do so. However, the same can be said since I switched over to my new GTX 970 Xtreme.


----------



## rpnp7

Quote:


> Originally Posted by *ITAngel*
> 
> I did notice a pretty good improvement on the 290X LE that I had, and I am sure the new drivers improved things across the board on most cards. I got a pretty good FPS gain, and I meant to run a test and record the results, but I was too busy to do so. However, the same can be said since I switched over to my new GTX 970 Xtreme.


Someone's in love with their gtx 970 xtreme ^
Wanna run us some benchmarks for gtx 970 xtreme vs msi r9 390 & msi r9 390x?


----------



## Stige

We can run some comparisons later: pick a popular game, save in a specific spot, share the save, and people take a screenshot loading that save with an FPS display. Easy enough, really? Witcher 3, etc.

I can take part next week; I won't be home this weekend, and I should get my motherboard back next week too, so I don't have to run with a stock CPU...

EDIT: This is my stock FPS in Valley, should be easy to compare with and doesn't matter what CPU you have.


Can't remember if it was with 16.1; I'll run at stock again tomorrow before I leave to verify.


----------



## iRUSH

A 290 non-X @ 1200 will outbench my old 970 @ 1600 in all benchmarks.

Gaming at the settings I play at was a different story.


----------



## ITAngel

Quote:


> Originally Posted by *rpnp7*
> 
> Someone's in love with their gtx 970 xtreme ^
> Wanna run us some benchmarks for gtx 970 xtreme vs msi r9 390 & msi r9 390x?


I can do that maybe later on tonight. Have to hit the gym after work hahaha.


----------



## rpnp7

Quote:


> Originally Posted by *ITAngel*
> 
> I can do that maybe later on tonight. Have to hit the gym after work hahaha.


Can't wait.


----------



## TopicClocker

Quote:


> Originally Posted by *Stige*
> 
> *R9 390 will outperform the 970 anywhere these days thanks to the two latest AMD drivers. Overclocked or not.*


Quote:


> Originally Posted by *Stige*
> 
> Don't know if there are any, but after the Crimson drivers the FPS went up across all games and the latest hotfix 16.1 also improved some performance across all I think.
> 
> Will have to wait for a review propably? Google, I don't know of any 16.1 reviews atleast.


Ok, so you're just making things up now? Good to know.


----------



## ITAngel

Quote:


> Originally Posted by *Stige*
> 
> We can run some comparisons later: pick a popular game, save in a specific spot, share the save, and people take a screenshot loading that save with an FPS display. Easy enough, really? Witcher 3, etc.
> 
> I can take part next week, I won't be home this weekend and should get my motherboard back next week too so I don't have to run with a stock CPU...
> 
> EDIT: This is my stock FPS in Valley, should be easy to compare with and doesn't matter what CPU you have.
> 
> 
> Can't remember if it was with 16.1; I'll run at stock again tomorrow before I leave to verify.


DEFAULT GAMING MODE (1190/1342)


DEFAULT OC MODE (1215/1367)


I have not tried doing a manual OC yet, as I've been busy, but I did use the default settings of the Gigabyte OC Guru II utility.

Here is a review based on this actual card: http://www.overclockers.com/gigabyte-gtx-970-extreme-video-card-review/. Yes, this was the card in that review.


----------



## rickcooperjr

Quote:


> Originally Posted by *ITAngel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Stige*
> 
> We can run some comparisons later, save a game in some popular game in a spot, share it for people and people take screenshot from loading that save with FPS display? Easy enough really? Witcher 3 etc.
> 
> I can take part next week, I won't be home this weekend and should get my motherboard back next week too so I don't have to run with a stock CPU...
> 
> EDIT: This is my stock FPS in Valley, should be easy to compare with and doesn't matter what CPU you have.
> 
> 
> Can't remember if it was with 16.1, I'll run tomorrow at stock again before I leave to verify.
> 
> 
> 
> DEFAULT GAMING MODE (1190/1342)
> 
> 
> DEFAULT OC MODE (1215/1367)
> 
> 
> I have not tried doing a manual OC yet as I been busy but I did the default settings of the Gigabyte OC GURU II Utility.
> 
> Here is a review base on this actual card. http://www.overclockers.com/gigabyte-gtx-970-extreme-video-card-review/. Yes this was the card on that review.
Click to expand...

Didn't you get a score in the 2600s? I swear it was in the 2700s, not the 2600s, with your R9 290X LE when we OC'd it. Just keep that in mind.

I remember your averages being around 75+ FPS and your maxes around 150+ FPS on the R9 290X LE we OC'd.

I think we pushed 1250 MHz core and around 1550 MHz RAM, or was it 1225 MHz core and 1500 MHz RAM? I can't remember exactly, but I know I walked you through it that night for 4+ hours, helping you and teaching you how to do it, same with your CPU overclocking on AMD and Intel CPUs over the past few months, LOL.

I remember we used mine and my cousin's TeamSpeak and TeamViewer to get you dialed in. I had just gotten rid of my 3x R9 290X Matrixes and used my R9 280 with a massive OC to help teach you how to do things. I also set up your custom fan profile and voltages on your Lightning, and helped you properly wipe your graphics drivers to get them installed correctly, because you had overlapping drivers causing issues.

I can't remember which rig that was on: your old i7 3930K 5 GHz rig, your AMD FX-8350 rig, your i5 Haswell rig, or your current i7 Haswell rig. All I can say is you change rigs like my daughter changes diapers, LOL, it is hard to keep up with.


----------



## ITAngel

Oh, that was much lower, bro. I think I managed to hit 68.6 or so; anything higher was without the 8x sampler or something like that added to the test. I think I ranged around the mid 60s with that card OCed. Let me find the screenshot and it will tell me which CPU platform I was on.









Also keep in mind I only used the preset options and have not manually pushed this card yet. The review of the card was done by a friend who was pushing it, so whatever he got is what this card should be able to manage, maybe?

Update: Below are the results done on the AMD setup with the FX-8350 using the 290X LE card. The card itself was overclocked and I don't recall if the system was or not.









MSI R9 290X Lightning Edition OC | Valley Test


MSI R9 290X Lightning Edition OC | Heaven Test


Keep in mind I have not yet pushed it to what my friend did.


His comment was:
_"GPU-Z

Next is our always gratuitous screenshot of GPUz to validate the specifications… and it does! The GTX 970 works on the GM204 core which sports a total of 1664 shaders with a pixel fill rate of 66.6 GPixel/s and a texture fillrate of 123.8 GTexel/s. ROPs and TMUs come in at 56 and 104 respectively. The memory totals 4 GB of Elpida based GDDR5 on the 256 bit bus. It comes in at a clock speed of 1774 MHz (7100 MHz GDDR5) which translates to 227.1 GB/s bandwidth. Stock clocks are 1190 MHz core that boosts to at least 1342 MHz (1443 MHz actual sustained)."_
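As a sanity check, the bandwidth figure in that quote follows directly from the memory clock and bus width; a quick sketch using the rounded numbers from the quote:

```python
# GDDR5 bandwidth = effective data rate (transfers/s) * bus width (bytes).
effective_clock_hz = 7100e6   # 7100 MHz effective GDDR5, from the quote
bus_width_bits = 256          # 256-bit bus, from the quote

bandwidth_gbs = effective_clock_hz * (bus_width_bits / 8) / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # ~227.2 GB/s, matching the quoted 227.1
```

The tiny difference from the quoted 227.1 GB/s is just rounding of the memory clock.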


----------



## rickcooperjr

Quote:


> Originally Posted by *ITAngel*
> 
> Oh, that was much lower, bro. I think I managed to hit 68.6 or so; anything higher was without the 8x sampler or something like that added to the test. I think I ranged around the mid 60s with that card OCed. Let me find the screenshot and it will tell me which CPU platform I was on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also keep in mind I only used the preset options and have not manually pushed this card yet. The review of the card was done by a friend who was pushing it, so whatever he got is what this card should be able to manage, maybe?
> 
> Update: Below are the results done on the AMD setup with the FX-8350 using the 290X LE card. The card itself was overclocked and I don't recall if the system was or not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI R9 290X Lightning Edition OC | Valley Test
> 
> 
> MSI R9 290X Lightning Edition OC | Heaven Test


Okay, well, I thought you had a higher score than that, but maybe my memory is slipping, given that was a few months ago and I have worked on a lot of rigs since then, LOL.

I think I got your Valley score mixed up with your Heaven score, LOL. I do remember we maxed everything on the Heaven run, and I mean everything, and we came to the conclusion you were CPU bottlenecked or something.


----------



## ITAngel

Yes, you are correct. I think we found out we couldn't push it past 1450/1550 +75, if I recall, without it crashing or needing more voltage, which would then have needed more cooling. That was the only sweet spot for that card.









If I recall, you are thinking of this bad test. This one had the wrong screen resolution on it, small compared to the 1080p size of my 25" screen. lol


----------



## daunow

Quote:


> Originally Posted by *Stige*
> 
> R9 390 will outperform the 970 anywhere these days thanks to the two latest AMD drivers. Overclocked or not.
> Only reason ever to even consider the 970 would be if you are some environmentalist hippy that thinks power consumption matters or you run some 400W crappy PSU.
> 
> R9 390 > GTX 970 any day, any time. And it will work about 100 times better a year from now than the GTX 970 will, games will start pushing that 3.5GB limit in the future even at 1080p, they already do at 1440p easily.


You don't sound like a fanboy at all, lmao. Then again, most of the thread is about fanboyism.


----------



## tj3n123

Well, there were two reasons I chose the GTX 970 over the R9 390. First, I can push the core up another 200 MHz and the temp still stays under 80°C, while an R9 390 pushed to 1150 already wants to burn itself up unless it has a high-priced cooler. Second, if you actually think about it, games are already at unplayable FPS before you should be concerned about passing 3.5 GB of VRAM, and that will stay true for a while. And a single 980 Ti is always better than going SLI or CrossFire with 970s/390s.


----------



## AliNT77

R9 290, 1100/1675, +25 mV, +5% power limit (1250 timings for memory), and Crimson 16.1:


and it scores 13050 in FS










I can also run my card @ 947/1625 with -156 mV and -32% power limit with no power throttling, and it scores 11400 in FS and 61 FPS in Valley.


----------



## iRUSH

Quote:


> Originally Posted by *AliNT77*
> 
> r9 290 , 1100/1675 +25mv +5% power limit (1250 Timings for memory) and crimson 16.1 :
> 
> 
> and it scores 13050 in FS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also i can run my card @947/1625 with -156mV and -32% power limit with no power throttling and it scores 11400 in FS and 61fps in Valley


Those minimum frame rates absolutely suck.

But let me guess, you guys don't count those? Lol

I'm sorry, am I the only one here that thinks minimum frame rate is the most important thing?


----------



## AliNT77

Quote:


> Originally Posted by *iRUSH*
> 
> Those minimum frame rates absolutely suck.
> 
> But let me guess, you guys don't count those? Lol
> 
> I'm sorry, am I the only one here that thinks minimum frame rate is the most important thing?


The minimum framerate in Valley is just a bug, or maybe it counts the stutter that happens when moving to a new scene. It's the same with both Nvidia and AMD cards.

I can assure you that I have never seen my framerate dip under 47-48.

You can run it yourself and see what I'm trying to say.
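This is why many reviewers report percentile lows instead of the absolute minimum: a single scene-change hitch dominates the reported "min fps" but barely moves a 1% low. A sketch with made-up numbers:

```python
def one_percent_low_fps(frame_times_ms):
    """Average fps over the slowest 1% of frames (a common '1% low' metric)."""
    worst = sorted(frame_times_ms, reverse=True)
    k = max(1, len(worst) // 100)
    avg_worst_ms = sum(worst[:k]) / k
    return 1000.0 / avg_worst_ms

# 10000 steady ~60 fps frames plus a single 200 ms scene-load hitch.
frames = [16.7] * 10000 + [200.0]

abs_min_fps = 1000.0 / max(frames)      # what a naive "min fps" reports
low_1pct = one_percent_low_fps(frames)
print(f"absolute min: {abs_min_fps:.1f} fps")  # 5.0 fps, from one hitch
print(f"1% low     : {low_1pct:.1f} fps")      # ~54 fps
```

One loading stutter tanks the absolute minimum to 5 FPS even though the run was effectively a locked 60, which matches what AliNT77 is describing about Valley's scene transitions.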


----------



## neurotix

R9 290 Vapor-X @ 1200/1500mhz, 4.5ghz 4770k



Crossfire R9 290 Vapor-X @ 1150/1500mhz, 4.5ghz 4770k

I have not seen SLI 970s that can outperform my CrossFire setup in Valley, and it's at this point that everyone says "but Maxwell performs badly in Valley"; well, I've also done 19k in Fire Strike at much lower clocks than it would take SLI 970s. In addition, I have yet to see a 980 Ti beat me in Valley OR Fire Strike when using a similar CPU (8-thread Intel); the 5960X doesn't count. The 980 Tis are lucky to break 100 FPS when paired with an 8-thread Intel.

This post shows SLI 970s at 1400mhz getting 20fps less than my 1150mhz 290s. His 4790k is running 300mhz higher too.

Enjoy.


----------



## TopicClocker

Quote:


> Originally Posted by *neurotix*
> 
> 
> 
> R9 290 Vapor-X @ 1200/1500mhz, 4.5ghz 4770k
> 
> 
> 
> Crossfire R9 290 Vapor-X @ 1150/1500mhz, 4.5ghz 4770k
> 
> *
> I have not seen SLI 970s that can outperform my Crossfire in Valley, and it's at this point that everyone says "but Maxwell performs bad in Valley", well I've also done 19k in Fire Strike at much lower clocks than it would take SLI 970s.* In addition, I have yet to see a 980ti beat me in Valley OR Fire Strike when using a similar CPU (Intel 8-thread). The 5960x doesn't count. The 980tis are lucky to break 100fps when paired with a 8-thread Intel.
> 
> This post shows SLI 970s at 1400mhz getting 20fps less than my 1150mhz 290s. His 4790k is running 300mhz higher too.
> 
> Enjoy.


It's actually true though, Maxwell doesn't perform well in Valley, and I don't think it's a good idea to compare Valley scores across different architectures, or at least not against Maxwell, for this very reason.
Kepler GK110 has shown up Maxwell GM204 pretty badly in Valley ever since the Maxwell GPUs came out, even though the Maxwell GPUs are comparable or faster in gaming performance.

It would be great to see some gaming performance comparisons against these cards in SLI and Crossfire, as the synthetic benchmarks like Valley and Firestrike don't really paint the whole picture like gaming performance does.


----------



## mcg75

Quote:


> Originally Posted by *TopicClocker*
> 
> It's actually true though, Maxwell doesn't perform well in Valley, I don't think it's a good idea to compare Valley scores across different architectures or at-least Maxwell because of this very reason.
> Kepler GK110 shows up Maxwell GM204 pretty bad in Valley, ever since the Maxwell GPUs came out, especially when the Maxwell GPUs are comparable or faster in gaming performance.


This is 100% true.

Back when the 780 Ti and 290x were king, we didn't even bother using Valley because the results were so lopsided for Nvidia. Valley meant nothing because in gaming, the two were always neck and neck.

The fact that there are tons of GTX 780 SLI getting 150+ fps in Valley on the Valley bench thread confirms it.

My 980 Ti gets 103.9 FPS in Valley with its everyday overclock of 1453 MHz, with no voltage adjustment or driver tweaks. Yet I'd be behind some 780 Tis in the Valley thread.

Valley is just one of those benches that is better suited to comparing the same card against the same card.


----------



## ITAngel

Yes, I don't find chasing numbers all that useful; I run a few tests to check a tweak on the same card or platform. Between the 290X LE I had and the 970 Xtreme I have now there are real gaming differences, but I find Heaven and Valley interesting for comparing the two. Still, tests like these are where it comes in handy more, in my opinion.




This here makes more sense to me.


----------



## rdr09

Quote:


> Originally Posted by *ITAngel*
> 
> Yes, I don't find chasing numbers all that useful; I run a few tests to check a tweak on the same card or platform. Between the 290X LE I had and the 970 Xtreme I have now there are real gaming differences, but I find Heaven and Valley interesting for comparing the two. Still, tests like these are where it comes in handy more, in my opinion.
> 
> 
> 
> 
> This here makes more sense to me.


The 970 is fast, no question about that, so long as you don't hit the slow 0.5 GB of VRAM. At 1080p, I doubt you will. Enjoy the card.


----------



## Mazda6i07

Just got my MSI 390 in the other day. I haven't had any time to overclock it, and I don't know if I even will, but I don't have any real complaints about performance other than slight coil whine. The card performs well, and I doubt you'd be able to tell the difference between a 970 and the 390. Just my two cents.


----------



## neurotix

The reality is that in games, the 970 and 390 are extremely similar.

In Gameworks games or Nvidia games (Batman series, Witcher, other examples) the 970 will win. In AMD games (Tomb Raider, Thief, Sleeping Dogs) the 390 will win. But the margin between the two won't be THAT different, and they should really be within 10 fps of each other regardless of the game and the advantage.

The 390 pulls ahead because of 1) 8GB VRAM and 2) no slow last 0.5 GB of VRAM.

Also, as I already showed with my benchmarks (it holds true for games too) the 390 is actually quite a bit faster if the cores are clocked the same (that is, both cards run at 1100mhz). At the same clock speed the 390 is significantly faster. It takes 1500mhz+ for the 970 to keep up with my lower clocked 290s. So, the advantage "_but it overclocks crazy high_" is kind of null if that doesn't make it pull ahead significantly.

I would not recommend the 970 to a new buyer, I would recommend the 390 or even a sub-$200 used 290. If someone were rich, I would recommend the 980 and especially the 980ti over the 390X and (piece of garbage) Fury and Fury X cards, however. At this point, though, 14nm is right around the corner and I would definitely not recommend investing in a $500 tier card right now; wait a few months.
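On the 970's segmented memory specifically: Nvidia disclosed that the first 3.5 GB runs at full speed while the last 0.5 GB sits on a much slower path (commonly cited as roughly 196 GB/s vs. 28 GB/s). A crude, illustrative model of why spilling into the slow segment hurts more than its size suggests; the traffic fractions here are made up:

```python
# Harmonic-mean model: if some fraction of memory traffic lands in the
# slow 0.5 GB segment, effective bandwidth drops sharply.
FAST_GBS, SLOW_GBS = 196.0, 28.0  # commonly cited GTX 970 segment speeds

def effective_bandwidth(slow_fraction):
    # Time to move 1 GB of traffic split across the two segments.
    time = (1 - slow_fraction) / FAST_GBS + slow_fraction / SLOW_GBS
    return 1.0 / time

for frac in (0.0, 0.05, 0.125):
    print(f"{frac:>5.0%} slow-segment traffic -> "
          f"{effective_bandwidth(frac):6.1f} GB/s effective")
```

Under this toy model, routing just 12.5% of traffic to the slow segment roughly halves effective bandwidth (196 to 112 GB/s), which is consistent with the stutter people report once allocation passes 3.5 GB. Real behavior depends on the driver's allocation heuristics, so treat this strictly as a sketch.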


----------



## ITAngel

I have yet to see any sign of slowness from the 0.5 GB of VRAM on my Xtreme card, and I am not saying that just because I own a GTX 970 Xtreme. I saw some issues on the EVGA GTX 970 SSC before, so I got an MSI R9 290X LE and it was fine. But with this Gigabyte GTX 970 Xtreme I don't see any of those issues as of yet; I wish I had a good tool that would show it so I could record it and prove it. None whatsoever. Plus it is very quiet; the only thing I hear is the case's top fans, or maybe the Noctua, but that is about it.







I will be replacing my Noctua NH-D15 hopefully tomorrow or the next day with a Dark Rock Pro 3, and I'm going to see if I hear the graphics card during testing. I must admit that based on specs and performance per dollar, the 390 seems a lot better; I even guided someone local to grab an R9 390 8 GB card over the EVGA GTX 970, so I am not in any way an R9 390 hater. I find that some games will favor one card over the other; it all depends on what people are playing. For me, Guild Wars 2 has favored the GTX 970 more than my old 290X.







It doesn't even run the fans on Rocksmith 2014.


----------



## neurotix

I think the slow VRAM thing depends on the game, the resolution and the settings.

The surefire way to find out is probably at 1440p or above in a demanding "Next Gen"/GPU intensive game like Witcher 3. Crank up the AA and all the post-processing settings and you should probably pass the 3.5GB mark. I run at 5760x1080p with everything on Ultra and it's a smooth 60 fps until I turn on all the post-processing features and high levels of AA, then my FPS tanks (likely exceeding the 4GB on my cards and using swap memory at that point). Another good game to try might be Dragon Age Inquisition. I ran it at pretty high settings in Eyefinity just fine with two cards, but turning up some of the settings easily made me exceed the RAM buffer.

Guild Wars 2 is pretty old at this point, I believe, and it's also an MMO so it's more of a CPU test.
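A back-of-the-envelope sketch of the resolution point above: just the raw render targets scale linearly with pixel count, and textures plus AA add much more on top. The target count below is made up purely for illustration:

```python
# Rough framebuffer cost: width * height * 4 bytes (RGBA8) per render
# target. Modern deferred renderers keep several such targets alive.
def render_target_mb(width, height, num_targets=6):
    return width * height * 4 * num_targets / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "5760x1080": (5760, 1080)}.items():
    print(f"{name:>9}: {render_target_mb(w, h):6.1f} MB in render targets")
```

The render targets alone are small next to 3.5 GB; it's the texture pool and AA buffers scaling alongside them that push a card past the mark at 1440p and above.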


----------



## ITAngel

Quote:


> Originally Posted by *neurotix*
> 
> I think the slow VRAM thing depends on the game, the resolution and the settings.
> 
> The surefire way to find out is probably at 1440p or above in a demanding "Next Gen"/GPU intensive game like Witcher 3. Crank up the AA and all the post-processing settings and you should probably pass the 3.5GB mark. I run at 5760x1080p with everything on Ultra and it's a smooth 60 fps until I turn on all the post-processing features and high levels of AA, then my FPS tanks (likely exceeding the 4GB on my cards and using swap memory at that point). Another good game to try might be Dragon Age Inquisition. I ran it at pretty high settings in Eyefinity just fine with two cards, but turning up some of the settings easily made me exceed the RAM buffer.
> 
> Guild Wars 2 is pretty old at this point, I believe, and it's also an MMO so it's more of a CPU test.


Yeah, you are correct that GW2 is more CPU dependent, as most MMOs are. You are also correct that I run 1080p @ 60 Hz, so I don't see it, unlike you or others running much higher resolutions with max settings. With that in mind, I can see it happening, as most games will push much higher memory use and 4 GB will be limiting, unless you run Windows 10 with dual GTX 970s to make them act as a single card under DX12. Unless I upgrade my monitors I guess I will never find out, and by that time I will have the new-gen GPUs like Pascal, or whatever it's called. lol


----------



## Stige

Quote:


> Originally Posted by *ITAngel*
> 
> Yes I don't find chasing numbers all that great I do a few test to see any tweak or testing on the same card or same platform. But between the 290X LE I had and the 970 Extreme I have now there are real gaming differences between them. But Heaven or Valley I find them interested to try compare the two. This is were test like these here come in handy better in my opinion.
> 
> 
> 
> 
> This here makes more sense to me.


This video is very old.
People need to make a proper comparison between the two cards with up-to-date drivers; the last two AMD drivers have increased performance pretty nicely on the 390.

Anyone up for this with a GTX 970? I'm home now, so we can start testing. Share a save game in Witcher 3 or something, load it, and see FPS on both cards, stock and overclocked.
I would gladly benchmark these so-called "Nvidia / AMD games". I doubt the logo in the game intro makes any difference; it's just marketing.

This is my new Valley score after putting my proper motherboard back in:


----------



## Luciferxy

AFAIK, GM204 doesn't fare well in Valley; Kepler, on the other hand, does. With NVCP settings at HQ, I only got around a 2200~2300 score with my 970 at default stock core and mem clocks, 1040~1278/3550. To make it worse, it started throttling to 1265 when the temperature reached 69°C.

edit: didn't read TopicClocker & mcg75 before me ...








So there's that, GM204 kinda sucks in Valley.


----------



## ambientblue

Quote:


> Originally Posted by *PontiacGTX*
> 
> if you will play wow wod, and some quite cpu bound mmorpg most of the time then get the 970
> for BF series and battlefront the 390/290x/390x
> 
> also there is a 290x for 450cad
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150696&AID=10657534


It's crazy how expensive the dollar has made everything. I got my 290X from the States a year ago; it worked out to maybe $340 CAD including tax and customs.


----------



## rickcooperjr

Quote:


> Originally Posted by *ambientblue*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PontiacGTX*
> 
> if you will play wow wod, and some quite cpu bound mmorpg most of the time then get the 970
> for BF series and battlefront the 390/290x/390x
> 
> also there is a 290x for 450cad
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150696&AID=10657534
> 
> 
> 
> It's crazy how expensive the dollar has made everything. I got my 290x from the states a year ago, worked out to be maybe $340 CAD plus tax aka customs
Click to expand...

Here you can buy MSI R9 290X Lightnings and such for around $200-$250. Just ask ITAngel; not too long ago he had the option to get a second one, only a week old, for like $250.

I bought a few R9 290Xs for around $175-$200 not that long ago off eBay, lightly used; the sellers had upgraded to Titan Xs and were just trying to recoup what they could. In short, if you look around you can find killer deals on R9 290Xs/290s. You can almost get two 290Xs for the price of a single R9 390X, and two 290Xs will eat a single 390X alive.


----------



## ITAngel

Quote:


> Originally Posted by *rickcooperjr*
> 
> Here you can buy MSI R9 290X Lightnings and such for around $200-$250. Just ask ITAngel; not too long ago he had the option to get a second one, only a week old, for like $250.
> 
> I bought a few R9 290Xs for around $175-$200 not that long ago off eBay, lightly used; the sellers had upgraded to Titan Xs and were just trying to recoup what they could. In short, if you look around you can find killer deals on R9 290Xs/290s. You can almost get two 290Xs for the price of a single R9 390X, and two 290Xs will eat a single 390X alive.


Yeah, it's true. I had the chance to get 3x 290X LEs for about $240-260 each, which was not a bad deal for week-old cards.


----------



## rickcooperjr

Quote:


> Originally Posted by *ITAngel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rickcooperjr*
> 
> here you can buy MSI R9 290X lightnings and such for around $200-$250 just ask ITAngel he had option not to long ago to get a 2nd one only 1 week old for like $250.
> 
> I bought a few R9 290x's for around $175-$200 not that long ago off ebay lightly used they upgraded to titan X's and were just trying to recoup what they could in short if you look around you can find killer deals on R9 290X's / 290's just got to look can almost get 2x 290x's for the price of a single R9 390X and 2x 290X's will eat a single 390X alive.
> 
> 
> 
> Yea is true, I had the chance to have 3x 290X LE for for about $240-260 each which it was not a bad deal a week old too cards.
Click to expand...

If I remember correctly, the warranty and everything was linked to the serial number and had not been registered, so essentially you would have had a fresh warranty on top of it. I believe he also offered you proof of the original purchase, so you would have had warranty coverage regardless.


----------



## ITAngel

Quote:


> Originally Posted by *rickcooperjr*
> 
> If I remember correctly, the warranty and everything was linked to the serial number and had not been registered, so essentially you would have had a fresh warranty on top of it. I believe he also offered you proof of the original purchase, so you would have had warranty coverage regardless.


Yes, you are correct. I wish my motherboard issue hadn't gotten in the way of that, but oh well, I am still dealing with it even now. Another cheap GTX 970 came my way and I had to turn it down because of it.


----------



## neurotix

Great posts guys.

I pretty much always recommend people look for used cards now.

I also recommend that anyone with a 7970 or 280X, or anyone considering buying a new R9 380, look at used 290s instead. In 2016 if you are going to have an AMD graphics card and game at a high resolution, I think nothing under the 290/X will do.

The stigma against used cards is largely unfounded. I've had numerous AMD graphics cards pass through my hands from benching them for HWBOT. None of them were DOA, none of them failed to work great, and I resold them and never got any complaints from the buyers. I got a used R7 265 for $100 when it still cost $160 new; it was 3 days old and still had the original box. Just fantastic. I also had Terrere's 7870 XT. Additionally, I had a 7770 Vapor-X. That one was shipped incredibly poorly but still worked: it was basically wrapped in a bunch of paper towels and stuffed in a manila envelope, and it survived the trip through the mail and still worked great (a good overclocker, too). Needless to say, I shipped it out MUCH better when I sold it, encased in foam and wrapped in bubble wrap.









For me, though, I'm selling my Sapphire R9 290 Vapor-X cards; one is leaving tomorrow. I'll be running an R9 270X for the next year while I wait for big-die Polaris. I figure I need to sell them BEFORE it comes out to get as much money as I can, and I'm satisfied with what I got for them. I have an assload of consoles and handhelds and I mostly play those, not my PC anymore sadly, so I have plenty to keep me busy until Polaris.


----------



## ITAngel

I ended up getting today the PowerColor Devil 13 Dual Core Radeon R9 290X and will be selling my Gigabyte GTX 970 Xtreme Edition OC card and holding off for Polaris as well. I should have my card in on Saturday so I can't wait.


----------



## cainy1991

Quote:


> Originally Posted by *neurotix*
> 
> Great posts guys.
> 
> I pretty much always recommend people look for used cards now.
> 
> I also recommend that anyone with a 7970 or 280X, or anyone considering buying a new R9 380, look at used 290s instead. In 2016 if you are going to have an AMD graphics card and game at a high resolution, I think nothing under the 290/X will do.
> 
> The stigma against used cards is pretty untrue. I've had numerous AMD graphics cards pass through my hands, for benching them for HWBOT. None of them were DOA, none of them failed to work great, and I resold them and never got any complaints from the buyer. I got a used R7 265 for $100 when it still cost $160 new, it was 3 days old and still had the original box. Just fantastic. I also had Terrere's 7870 XT. Additionally, I had a 7770 Vapor-X. That one was shipped incredibly poorly but still worked. It was basically wrapped in a bunch of paper towel and stuffed in a manilla envelope, and it survived this trip through the mail and still worked great (good overclocker, too). Needless to say, I shipped it out MUCH better when I sold it, encased in foam and wrapped in bubble wrap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For me, though, I'm selling my Sapphire R9 290 Vapor-X. One is leaving tomorrow. I'll be running a R9 270X for the next year while I wait for big die polaris. I figure I need to sell them BEFORE it comes out to get as much money as I can, and I'm satisfied with what I got for them. I have a assload of consoles and handhelds and I mostly play those, and not my PC anymore sadly, so I have plenty to keep me busy until Polaris.


I'm always paranoid buying anything second hand UNLESS it's near new and comes with proof of purchase... or is really cheap.

Otherwise I just stick with new, though I have been burned before.


----------



## ITAngel

Yeah, you have to be careful for sure, but I use Technics to protect me, so I am okay. The card I just picked up is a month old, and I got it for $400.00. It didn't come with the mouse, but that's okay; I have my own MMO gaming mouse.


----------



## ambientblue

Quote:


> Originally Posted by *rickcooperjr*
> 
> here you can buy MSI R9 290X lightnings and such for around $200-$250 just ask ITAngel he had option not to long ago to get a 2nd one only 1 week old for like $250.
> 
> I bought a few R9 290x's for around $175-$200 not that long ago off ebay lightly used they upgraded to titan X's and were just trying to recoup what they could in short if you look around you can find killer deals on R9 290X's / 290's just got to look can almost get 2x 290x's for the price of a single R9 390X and 2x 290X's will eat a single 390X alive.


I meant new cards; not sure if you're talking about used ones, but those are good prices anyhow.


----------



## PontiacGTX

Quote:


> Originally Posted by *ITAngel*
> 
> I ended up getting today the PowerColor Devil 13 Dual Core Radeon R9 290X and will be selling my Gigabyte GTX 970 Xtreme Edition OC card and holding off for Polaris as well. I should have my card in on Saturday so I can't wait.


Your airflow around the PCIe area should be fine since it is air cooled.

Also, some people have reduced core temps by applying new TIM to the cores, since PowerColor sometimes uses too much.


----------



## ITAngel

Quote:


> Originally Posted by *ambientblue*
> 
> I meant new, not sure if you're talking used cards but good prices anyhow


Those were used prices for like-new cards; some of the GPUs were only about a week to a month old.
Quote:


> Originally Posted by *PontiacGTX*
> 
> your airflow on pcie should be good since it is air cooled
> 
> And some people reduced the core temp by applying new tim to the cores since powercolor sometimes uses too much tim


Oh I see, good to know my friend. I will check the temps once I have it and see whether I have to do that or not.


----------



## kjrayo18

I actually can't decide whether to get the MSI 970 for $330 or the R9 390 for $340. I really like the backplate on the R9. Thing is, the only game I play is CS:GO lol


----------



## iRUSH

Quote:


> Originally Posted by *kjrayo18*
> 
> I actually can't decide wether to get the Msi 970 for $330 or the r9 390 for $340. I really like the back plate on the r9. Thing is only game I play is csgo lol


Is this for a different PC? Your sig rig suggests you have a 970 already.

You don't need much to keep csgo at the desired 200 fps mark.


----------



## kjrayo18

Quote:


> Originally Posted by *iRUSH*
> 
> Is this for a different PC? Your sig rig suggests you have a 970 already.
> 
> You don't need much to keep csgo at the desired 200 fps mark.


Same PC, but I gave my brother my 970 for Christmas, so I'm in the market again.


----------



## mcg75

Finally have some results worth reporting to the thread from a reputable source.

Techpowerup just tested a 390 Nitro. They use up-to-date drivers and retest every card (unlike most sites), including a stock 390 and a stock 970.

http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html

390 is 4% faster than 970 at 1080p.

390 is 9% faster than 970 at 1440p.

390 is 13% faster than 970 at 4K.


----------



## daunow

Quote:


> Originally Posted by *mcg75*
> 
> Finally have some results worth reporting to the thread from a reputable source.
> 
> Techpowerup just tested a 390 Nitro. And they use up to date drivers and retest every card unlike most sites including a stock 390 and a stock 970.
> 
> http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html
> 
> 390 is 4% faster than 970 at 1080p.
> 
> 390 is 9% faster than 970 at 1440p.
> 
> 390 is 13% faster than 970 at 4K.


Pretty good. I doubt anyone would play at 4K with either card, though; however, 1440p sounds good.


----------



## diggiddi

Quote:


> Originally Posted by *daunow*
> 
> Pretty good, doubt anyone would play 4k with neither card tough, however 1440p sounds good.


Why not? Especially when you can use VSR or just double up.


----------



## Stige

Quote:


> Originally Posted by *mcg75*
> 
> Finally have some results worth reporting to the thread from a reputable source.
> 
> Techpowerup just tested a 390 Nitro. And they use up to date drivers and retest every card unlike most sites including a stock 390 and a stock 970.
> 
> http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html
> 
> 390 is 4% faster than 970 at 1080p.
> 
> 390 is 9% faster than 970 at 1440p.
> 
> 390 is 13% faster than 970 at 4K.


Finally.

Now just wait for DX12 and the 970 is as good as dead. Or any game released a year from now that the 970 won't run because of its crippled memory.


----------



## TopicClocker

Quote:


> Originally Posted by *diggiddi*
> 
> Why not? especially when you can use VSR or just double up


If you're aiming to max out games, the performance hit will be pretty crazy; sub-30-40 fps in the majority of games. However, you can always use a custom mix of graphical settings.
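A rough way to see why that hit is so big is plain pixel arithmetic; here's a small illustrative Python sketch (fill-rate cost scales roughly with pixel count, though real frame rates don't scale perfectly linearly):

```python
# Illustrative only: pixel counts explain most of the 4K/VSR cost.
# Real performance doesn't scale perfectly linearly with pixels.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")
# 4K pushes exactly 4x the pixels of 1080p
```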

Dark Souls II: Scholar of the First Sin runs at a locked 60fps at 4K for me and looks fantastic, and if I recall correctly Street Fighter V also did when I played it during the last beta test.

Quote:


> Originally Posted by *Stige*
> 
> Finally.
> 
> Now to wait for DX12 and 970 is as good as dead. Or any game released a year from now and that 970 won't run it cause of crap memory.


Only a true fanboy would wish for the demise of a competitor's product. Don't like the GTX 970? No one cares.

People just want to know what's best for them. Both are great products, though the R9 390 is more future-proof; people like you fanboying like a madman does no favors for anybody.

Your statements are filled with fabrications and FUD, and what you said doesn't even make sense if you understand how hardware works.
If a game starts running out of GPU memory, what works exceptionally well in the majority of cases is reducing the texture quality; games don't simply stop running.









That's the stupidest thing I've read in a long time.

Please, learn more about hardware before you embarrass yourself again.


----------



## rickcooperjr

Quote:


> Originally Posted by *mcg75*
> 
> Finally have some results worth reporting to the thread from a reputable source.
> 
> Techpowerup just tested a 390 Nitro. And they use up to date drivers and retest every card unlike most sites including a stock 390 and a stock 970.
> 
> http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html
> 
> 390 is 4% faster than 970 at 1080p.
> 
> 390 is 9% faster than 970 at 1440p.
> 
> 390 is 13% faster than 970 at 4K.


I have to say, very nice and exactly what I was expecting. Now the question is what Nvidia will do when they release their next lineup of GPUs: will they do the usual and abandon the previous generation? Just a bit of curiosity here, as that has been Nvidia's track record for many generations now.

I will say, if Nvidia does their usual, the R9 390 becomes an even better option, both now and long term, for literally the same price as the GTX 970.

I also want to say: if it is true that AMD's current architecture has a big advantage in DX12, that will be a huge thorn in the GTX 970's side on top of the current situation. With the 390 already outperforming the 970 by 4%+, imagine if the R9 390 gains another 10%+ under DX12; that would put it around 14%+ ahead on average, which is a substantial margin for literally the same price.

I also want to point out that even Nvidia claims DX12 brings 20%+ performance increases on their current hardware. Imagine what it does if what has been said about the AMD side is true; we could be talking 30%+ gains on the AMD side. Not being biased here, just stating things that are very relevant to the conversation but haven't really been brought into it.
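For what it's worth, stacked percentage gains multiply rather than add, so the numbers compound slightly higher than a simple sum. A quick illustrative Python sketch (the 4% and 10% figures are just this thread's examples, not measurements):

```python
# Hypothetical arithmetic only: combining relative performance gains.
def combined_gain(*gains):
    """Combine fractional gains (0.04 == +4%) multiplicatively."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

# +4% at stock plus a hypothetical +10% DX12 advantage:
print(f"{combined_gain(0.04, 0.10):.1%}")  # 14.4%, close to the 14% estimated above
```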


----------



## daunow

Quote:


> Originally Posted by *diggiddi*
> 
> Why not? especially when you can use VSR or just double up


They are going to run pretty badly. (But like TopicClocker said, you can run some older games at 4K.)

I should honestly try that feature that upscales the game for better anti-aliasing, now that I think about it...
Quote:


> Originally Posted by *Stige*
> 
> Finally.
> 
> Now to wait for DX12 and 970 is as good as dead. Or any game released a year from now and that 970 won't run it cause of crap memory.


Man, are you insecure about your purchase or something? Jeez.


----------



## Xizel14

I haven't been following the thread, but how is the 'average' overclock headroom of both cards?


----------



## rickcooperjr

Quote:


> Originally Posted by *Xizel14*
> 
> I haven't been following the thread, but how is the 'average' overclock headroom of both cards?


It takes a lot fewer MHz to get more performance on the 390 vs the 970. In short, a 100 MHz or so OC on the R9 390 is equal to around a 300 MHz OC on the GTX 970; you get more MHz out of the 970, but less performance per MHz than you do with the R9 390.


----------



## neurotix

A custom 970 should be good for 1500 MHz on air, possibly 1600 MHz if you're lucky.

All custom 290s should do 1100 MHz on air. 1150 MHz is common, 1200 MHz+ is less common, and 1300 MHz is almost unheard of on air. The 290 doesn't overclock as well as the 7970/280X.

An 1150 MHz 290 ≈ a 1500 MHz 970.
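Taking that claimed equivalence together with approximate stock boost clocks, the relative headroom each card needs can be sketched in Python. The stock clocks are rough reference values and linear scaling with clock is an idealization, so treat this as back-of-envelope only:

```python
# Back-of-envelope: percent clock headroom over assumed stock boost clocks.
# Stock values are approximate reference clocks, not guaranteed for any AIB card.
STOCK_MHZ = {"R9 290": 947, "GTX 970": 1178}

def oc_headroom_pct(card, oc_mhz):
    """Percent clock increase over the assumed stock boost clock."""
    stock = STOCK_MHZ[card]
    return (oc_mhz - stock) / stock * 100

print(f"290 @ 1150 MHz: +{oc_headroom_pct('R9 290', 1150):.0f}% clock")   # ~+21%
print(f"970 @ 1500 MHz: +{oc_headroom_pct('GTX 970', 1500):.0f}% clock")  # ~+27%
```

In other words, if the two cards really do land at parity there, the 970 needs proportionally more clock headroom to get it, which matches the "less performance per MHz" point made above.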


----------



## iRUSH

Quote:


> Originally Posted by *rickcooperjr*
> 
> it takes alot less mhz to get more performance on the 390 vs the 970 in short a 100mhz or so OC on the R9 390 is = to around a 300mhz OC on the GTX 970 in short you get more mhz out of the 970 but you get less performance per mhz than you do with the R9 390.


This is very correct.


----------



## Stige

Quote:


> Originally Posted by *neurotix*
> 
> A custom 970 should be good for 1500mhz on air, possibly 1600mhz if you're lucky.
> 
> All custom 290s should do 1100mhz on air. 1150mhz is common. 1200mhz+ is less common and 1300mhz is almost unheard of on air. The 290 doesn't overclock as well as the 7970/280X.
> 
> An 1150mhz 290 = 1500mhz 970.


I would say any card can do 1100; 1150 is already good, and 1200 is pretty uncommon. I don't see many in the lists, at least.


----------



## neurotix

1200 is pretty uncommon, yeah. I was being somewhat conservative. Most 290s can probably do 1150mhz.


----------



## FlyingSolo

My GTX 970 does 1500 MHz on air. It's a placeholder until the new cards come out; then it's going straight into my arcade build. But if I were to buy a card now between the two, I would for sure take the R9 390 over the GTX 970, since in my case I might need that 1GB more. The arcade build will get one of my 1440p monitors.


----------



## TopicClocker

Quote:


> Originally Posted by *FlyingSolo*
> 
> My GTX 970 does 1500mhz on air. Its a place holder until the new cards come out. Then that card is going straight to my arcade build. But if i was to buy a card now between them two. I will for sure take the R9 390 over a GTX 970. Since in my case i might need that 1GB more. The arcade build will have one of my 1440p monitor.


What kind of games will you be running on the Arcade build if you don't mind me asking?


----------



## FlyingSolo

Quote:


> Originally Posted by *TopicClocker*
> 
> What kind of games will you be running on the Arcade build if you don't mind me asking?


Not much really. Just old arcade games, as well as the new Street Fighter 5 and other new fighting games that come out, like Tekken and Killer Instinct, and some racing games like Project CARS. I will be keeping the card as long as I can, until it needs an update. Hopefully it should last me a long time.


----------



## Luciferxy

Project CARS hates AMD cards, dude.


----------



## Intel CPU

Hi guys,

just sharing the hell I went through with a GTX 970 I bought. I experienced serious coil whine, switched to an R9 390 in the end, and never looked back!!!

I've done a bit of research myself and would like to share that too.

In at least 5 website reviews using AMD's LATEST DRIVERS, the R9 390 BEATS the GTX 970 in terms of raw power. Shadow of Mordor (and probably more upcoming heavyweight titles) also uses more than 3.5GB of VRAM and causes micro-stuttering with Nvidia's controversial 3.5GB + 0.5GB memory layout, and that's where the R9 390's 8GB of VRAM comes in.

1. http://www.pcper.com/reviews/Graphics-Cards/Sapphire-Nitro-Radeon-R9-390-8GB-Review - "For the Sapphire Nitro R9 390 8GB card, users will find that it is competitive with the performance of the GeForce GTX 970 4GB for a nearly identical price. It's not new, it's not flashy and it's not rewriting the world of GPUs, but it is at least slightly more than expected."

2. http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390-nitro-8gb-review/ - KitGuru "Must Have" award

3. http://www.eteknix.com/sapphire-nitro-r9-390-8gb-graphics-card-review/ - eTeknix Innovation Award

4. http://www.tomshardware.com/reviews/sapphire-nitro-r9-390-8g-d5,4245.html - Verdict: "Sapphire's Radeon R9 390 Nitro offers a tremendous value for the money. It operates very quietly and has performance that outpaces the competition. The Tri-X cooler keeps the GPU at a reasonable temperature and having 8GB of memory means you'll have room for the high resolution textures sure to be found in future games."

5. http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/28.html - Editor's Choice

Also, google "GTX 970 coil whine." There are plenty of pissed-off GTX 970 owners out there. A lot of owners are running into coil whine issues with the GTX 970 (some seem badly affected by the high-pitched whine during benchmark loading, running, etc.) that are virtually nonexistent on R9 390s. EVGA, MSI, ASUS, Gigabyte, Palit, and Galaxy all have GTX 970 boards that may give you a potential coil-whine nightmare, especially for those who leave their side panels open and/or game without headsets. Some have done four RMAs.

I bought a brand new GTX 970 and got so pissed off after 2 exchanges (both still had coil whine; the last one was especially bad when paired with a Corsair HX850i Platinum). In the end, I went the R9 390 route and never looked back. A quick test at 1600x900 in Unigine Valley netted me 88 fps for the R9 390 over 78.9 fps for the GTX 970; that's a 9.1 fps advantage. A test in Unigine Heaven at 1600x900 netted me 77.0 fps for the R9 390 over 72.8 fps for the GTX 970; that's 4.2 fps, not too shabby. I'm running an Intel i7-6700K on an Asus Z170M-Plus, Windows 7, a 480GB SanDisk Extreme Pro, and a Cooler Master V1000 (Gold). (Many Corsairs fail; I saw 2 units in RMA at the service centre, one RM series and one HX series.) My Cooler Master Silent Pro 1000 lasted me 6 years and is still going strong.
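Converting those fps deltas into relative gains (same numbers as quoted above, purely illustrative):

```python
# Turn the absolute fps deltas quoted above into percentage gains.
def pct_gain(new_fps, old_fps):
    return (new_fps - old_fps) / old_fps * 100

print(f"Valley 1600x900: +{pct_gain(88.0, 78.9):.1f}%")  # ~11.5%
print(f"Heaven 1600x900: +{pct_gain(77.0, 72.8):.1f}%")  # ~5.8%
```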

No coil-whine nightmare on the R9 390. With the GTX 970, it's a lottery draw; 50-50, it seems. If you're lucky, you get a good batch. If you're not, you get your life turned upside down by coil whine. The Sapphire Nitro R9 390 runs silent and cool. I would take higher power consumption plus raw brute power over hellish coil whine any day. Note that I switch between the green and red camps regularly.

I went green, but went red again.

I'm not sure which camp I will be in in 5 years time though.

Thanks.



----------



## coffeeplus

How about other potential problems with AMD/R9 - do you have any, especially if using latest drivers? I've seen a lot of negative comments, and compiled a list. Can you please have a look and comment on the following thread? http://www.overclock.net/t/1590013/what-is-the-current-state-of-amd-cards-and-drivers-from-a-stability-perspective


----------



## Intel CPU

Quote:


> Originally Posted by *coffeeplus*
> 
> How about other potential problems with AMD/R9 - do you have any, especially if using latest drivers? I've seen a lot of negative comments, and compiled a list. Can you please have a look and comment on the following thread? http://www.overclock.net/t/1590013/what-is-the-current-state-of-amd-cards-and-drivers-from-a-stability-perspective


My Sapphire Nitro R9 390 runs with no problems at all in Windows 10 with the latest Crimson drivers. In fact, it gains a few extra fps and becomes an MSI R9 390X equivalent after I overclocked it. Runs cool, quiet as hell, and wickedly fast. Having R9 390X (and GTX 980) level performance at a much lower price point than the GTX 980 is simply a joy.

As for the massive coil-whine problems with the GTX 970 and GTX 980: no thanks, man. I've RMA'd two Palits, and even though they claim they have switched component vendors, I'm not going to risk wasting more time and effort RMAing GTX 970s.

My Sapphire has ZERO coil whine compared to the GTX 970.

To be fair, I've seen a significant number of negative comments about the GTX 970 as well. Crashes, massive coil whine (google and YouTube it), and the 3.5GB + 0.5GB scam (seriously, ~30 GB/s of bandwidth on that last 0.5GB); are they kidding me? And even if there were no performance issues with this layout (and there ARE; a website has documented microstuttering on the GTX 970 when it goes above 3.5GB in Shadow of Mordor, Attila, etc.), it is ethically wrong to slap a 4GB sticker on a 3.5GB + 0.5GB card.

Also, with Nvidia's expensive G-Sync, I feel I'm being locked into a proprietary format that will hurt my wallet in the long run.

So, I've turned from green to red.

No thanks Nvidia. Screw you.

- Writing as an ex super Nvidia "FANBOY".


----------



## daunow

Quote:


> Originally Posted by *Intel CPU*
> 
> My sapphire Nitro R390 runs with no problems at all in Windows 10 with the latest Crimson drivers. In fact, it gains a few extra fps and becomes an MSI R390X equivalent after I overclocked it. Runs cool, quiet as hell and wickedly fast. Having R390X (and GTX 980 level performance) at a much lower price point than the GTX 980 is simply a joy.


k


----------



## Intel CPU

Quote:


> Originally Posted by *daunow*
> 
> k


Going with Sapphire can't be wrong. They are a tier-1 manufacturer for AMD.


----------



## Bluescreendeath

Quote:


> Originally Posted by *iinversion*
> 
> The R9 390 does have 8GB of VRAM but you will never be able to fully make use of that before you max the core. I believe it is more for marketing than anything. Since this discussion also focuses on 1080p, you will never ever need 8GB of VRAM. Since you are talking about 1080p and single card setups you should not even be talking about the R9 390 as the 390 is just a factory OC'd R9 290 + extra 4GB of VRAM(which is completely useless esp. at 1080p.) for the most part with slight changes to power management for $90 more.
> 
> If you OC them both to the max I have no doubt the 970 will win more often than not esp. at 1080p and 3.5GB of VRAM will not be a problem at 1080p.
> 
> If you consider the R9 290 at around $80-$90 cheaper than a R9 390 then it definitely represents a better value than the GTX 970. However, if you are comparing a 1500MHz+ 970 to a 1200MHz R9 290/390 @ 1080p only then 970 will definitely win more often than not.


What are the price points you're looking at?

The GTX970s I've seen go for around $300, and a bit under $300 with promos. (not Jet.com)

The R9 290s are around that price as well, and R9 390s are at $400.


----------



## Stige

390s can't be $400 if you have 970s for $300; that makes no sense at all.

There is a ~20€ price difference between the cheapest 970 and 390 here in Finland; it can't be $100 anywhere.


----------



## daunow

Quote:


> Originally Posted by *Bluescreendeath*
> 
> What are the price points you're looking at?
> 
> The GTX970s I've seen go for around $300, and a bit under $300 with promos. (not Jet.com)
> 
> The R9 290s are around that price as well, and R9 390s are at $400.


I am sorry, but the fanboy is right; they don't go for $400. Even on Jet you can find them for almost the same price as the 970s. However, I've never seen them go lower; they are always $10-$20 more from what I've seen.


----------



## Bluescreendeath

Quote:


> Originally Posted by *daunow*
> 
> I am sorry but the fanboy is right, they don't go for $400, even on Jet you can find them for almost the same price as the 970's, however never seen them go lower, and they are always $10-$20 bucks more from what I've seen.


My bad - you are correct. I was looking at the R9 390X version.


----------



## Stige

Quote:


> Originally Posted by *daunow*
> 
> I am sorry but the fanboy is right, they don't go for $400, even on Jet you can find them for almost the same price as the 970's, however never seen them go lower, and they are always $10-$20 bucks more from what I've seen.


I only buy whatever gives the most performance for the money, and that would be the 390; the 970 doesn't even come close these days.

If it were an Nvidia card or an AMD CPU that did, I would buy it. But they are not.


----------



## daunow

Quote:


> Originally Posted by *Stige*
> 
> the 970 doesn't even come close


k

You know what I find funny is that I should be on your side, since I actually don't like my 970 that much.


----------



## Bluescreendeath

Quote:


> Originally Posted by *daunow*
> 
> k
> 
> You know what I find funny is that I should be on your side, since I actually don't like my 970 that much.


What's wrong with your GTX970? I'm actually considering buying it, getting the R9 3x0, or waiting till Pascal or AMD's HBM stuff in mid 2016.


----------



## daunow

Quote:


> Originally Posted by *Bluescreendeath*
> 
> What's wrong with your GTX970? I'm actually considering buying it, getting the R9 3x0, or waiting till Pascal or AMD's HBM stuff in mid 2016.


Coil whine, and the fan is loud as hell under full load. The funny thing is that I RMA'd my first one and the second one still had coil whine. I think I am cursed, or maybe it has to do with some component in my rig other than the PSU (since I even replaced that). At the same time, I've seen many people not experience this hell I am going through.

If the 970 is this bad, I don't want to imagine how bad the 390 is (or maybe it isn't; I've actually heard a lot of positive things about it). Either way, I am eventually going to sell this 970 to get a Pascal card.


----------



## Bluescreendeath

Quote:


> Originally Posted by *daunow*
> 
> Coil Whine/Fan is loud as hell under full load, funny thing is that I RMA'd my first one and the second one still had coil whine, I think I am curse or maybe it has to do with some components on my rig other than the PSU (since I even replace that). At the same time I've seen many people not experience this hell I am experiencing.
> 
> If the 970 is bad I just don't want to imagine how bad the 390 is ( or maybe it isn't, actually heard a lot of positive things about it ), however I am eventually gonna sell this 970 to get a pascal card.


I see, that sucks. What is the company? (Asus, MSI, EVGA, etc)


----------



## daunow

Quote:


> Originally Posted by *Bluescreendeath*
> 
> I see, that sucks. What is the company? (Asus, MSI, EVGA, etc)


EVGA; they're cool though. I told them I had it again, and they asked if I wanted to RMA the card again, but I just want to play some games. I felt like I hadn't used the card much, and god bless, I did have a ton of fun in The Division beta.


----------



## iRUSH

Quote:


> Originally Posted by *daunow*
> 
> EVGA, they cool tough, I told them I had it again and asked if i wanted to RMA the card again, but I just want to play some games.. feelt like I hadn't use the card much.. and god bless I did had a ton of fun on The Division Beta.


I've had nearly every 970 made. EVGA was the worst with regard to coil whine; every one, including both of their newer SSCs, whined.

That ACX cooler looks good, but cools average at best.

Every one aside from EVGA was silent.


----------



## Stige

Sapphire has made the best coolers for Radeons for a while now; I would think there is a similar thing with Nvidia cards as well. If one manufacturer produces a magnificent cooler, why buy anything else?


----------



## Intel CPU

Quote:


> Originally Posted by *daunow*
> 
> Coil Whine/Fan is loud as hell under full load, funny thing is that I RMA'd my first one and the second one still had coil whine, I think I am curse or maybe it has to do with some components on my rig other than the PSU (since I even replace that). At the same time I've seen many people not experience this hell I am experiencing.
> 
> If the 970 is bad I just don't want to imagine how bad the 390 is ( or maybe it isn't, actually heard a lot of positive things about it ), however I am eventually gonna sell this 970 to get a pascal card.


I was from Team Red. I envied Team Green's efficient architecture and bought a Palit GTX 970 card, and got horrible coil whine. I RMA'd it two times, and they told me these were the 'latest batches' since Palit has changed component vendors. I don't buy what they say and didn't risk a third time. I went through a very frustrating period RMAing, changing PSUs, frame limiting, overvolting, v-syncing, you name it, trying to cure the coil whine. The curse wouldn't go away until I got enlightened from above.

I swapped for a Sapphire Nitro R9 390 and never looked back. That beast has ZERO coil whine. It runs cooler than Antarctica and you can't hear a thing; it's eerily quiet. The 390s of today are a far cry from the hot, loud reference R9 290 of yesteryear.

People buy R9 390s now for the silence and the cool running, and the 8GB doesn't hurt. My friend is trying to sell his GTX 970 now, but he says no one wants to buy it. He experiences unpleasant stutters playing Shadow of Mordor in a multi-monitor setup, and he fears the worst if upcoming games keep exceeding 3.5GB.

Now he's planning to just ditch the GTX 970 and get a Sapphire R9 390X 8GB just to play Shadow of Mordor smoothly.

Team Red AMD really nailed it with Shadow of Mordor this time.

I'm glad I stayed with Team Red this time...


----------



## TopicClocker

Quote:


> Originally Posted by *Intel CPU*
> 
> I was from Team Red. I envied Team Green's efficient architecture and bought a palit gtx 970 card and got horrible coil whine. RMA two times and they told me these are the 'latest batches' as Palit has changed component vendors. I don't buy what they say and did not risk a 3rd time. I went through a very frustrating time RMaing, changing PSUs, frame limiting, overvolting, v-syncing, you name it to get a cure for coil whine. The curse wouldn't go away until I got enlightened from above.
> 
> I swapped for a sapphire nitro R390 and never looked back. That Beast has ZERO coil whine. Runs cooler than antartica and you cant hear anything. Its eerily quiet. The 390s of today are a far cry from the hellish heaty loud R290 reference of yesterdays.
> 
> People buy R390s now for the silence and coolness, plus an extra 8GB doesn't hurt. My friend is trying to sell his GTX 970 now but he says no one wants to buy his card. He experiences unpleasant stutters when playing shadow of mordor in a multi monitor setup and he fears the worst if upcoming games keep exceeding the 3.5gb.
> 
> Now hes planning to just ditch the GTX 970 and get a sapphire R390x 8gb just to play shadow of mordor smoothly.
> 
> Team Red AMD really nailed it with Shadow of Mordor this time.
> 
> I'm glad I stayed with Team Red this time...


Shadow of Mordor recommends 6GB of VRAM with ultra textures, it's a terrible idea to use a GTX 970 in that game and expect decent performance in a multi-monitor setup.


----------



## neurotix

Intel CPU, enjoy your Sapphire card.

I'm just about the biggest Sapphire fanboy you'll ever find on these forums.

I've owned two Tri-X 290s (when they were brand new and one cost me $650- mining craze), two Vapor-X 290s (best cards I ever owned), a Vapor-X 7970, a Vapor-X 270X and now my 380X Nitro which is my placeholder card until Polaris.

I've had like 15 different GPUs from them and all have been great. The only one that failed on me was my first (non-garbage) card, a 6870 Dual-X. The only reason it failed was that it was in another machine for my brother to game on, and he played some game without turning the fan profile on in Trixx. It overheated and fried. I still have it as a shelf piece for people to look at.

Pretty much every GPU I've benched for HWBOT has been Sapphire.

Generally any Vapor-X card on the market in the last 5 years has had pretty much the BEST air cooler for any AMD card, period. They even heatsink the VRMs now and have fans blowing directly on them. Look at the disassembly of the 290 Vapor-X and you can see.

I really miss my 290s but I'm not gaming on the computer much now so I can hold off for Polaris. I really hope that Sapphire releases good coolers for them. Vapor-X would be great but what I'd really like to see is a comeback of the Toxic series of cards. Those were amazing.

Anyway, nothing to add other than that. Sapphire makes the best AMD GPUs imo.


----------



## JackWarren

When I was shopping for a graphics card around Christmas, I did a lot of research into which performed better. The answer was the R9 390, about 100 dollars cheaper in AUD when I bought from Newegg.

There was something the reviews never mentioned, though: AMD's bloody drivers. I've either had to roll back drivers or download community fixes for games no more than 6 or so months old.

After about 2 or 3 weeks of general use and fixing, everything is fine, and after a bit of OC I beat almost every 970 benchmark. But during the time I spent fixing little things in games, I sometimes wished I had gotten the GTX 970.

That is why AMD is not profitable and not in the position it was years ago.


----------



## Bluescreendeath

Quote:


> Originally Posted by *neurotix*
> 
> Intel CPU, enjoy your Sapphire card.
> 
> I'm just about the biggest Sapphire fanboy you'll ever find on these forums.
> 
> I've owned two Tri-X 290s (when they were brand new and one cost me $650- mining craze), two Vapor-X 290s (best cards I ever owned), a Vapor-X 7970, a Vapor-X 270X and now my 380X Nitro which is my placeholder card until Polaris.
> 
> I've had like 15 different GPUs from them and all have been great. The only one that failed on me was my first (not garbage) card, a 6870 Dual-X. The only reason it failed was that it was in another machine for my brother to game on and he played some game and didn't turn the fan profile on in Trixx. It overheated and fried. I still have it as a shelf piece for people to look at here.
> 
> Pretty much every GPU I've benched for HWBOT has been Sapphire.
> 
> Generally any Vapor-X card on the market in the last 5 years has had pretty much the BEST air cooler for any AMD card, period. They even heatsink the VRMs now and have fans blowing directly on them. Look at the disassembly of the 290 Vapor-X and you can see.
> 
> I really miss my 290s but I'm not gaming on the computer much now so I can hold off for Polaris. I really hope that Sapphire releases good coolers for them. Vapor-X would be great but what I'd really like to see is a comeback of the Toxic series of cards. Those were amazing.
> 
> Anyway, nothing to add other than that. Sapphire makes the best AMD GPUs imo.


I like my Sapphire cards too, but why do they only have a 2-year warranty?


----------



## rickcooperjr

Here is a very good video that impressed me. It's pretty humbling, because they actually used an unbiased testing method and really put the effort in to do so. Keep in mind the R9 390 is cheaper than the GTX 970. They tested stock and overclocked, and about 90% of the time the OC on the R9 390 gave more performance per MHz. Also keep in mind that the drivers are still maturing for the R9 390, and for AMD's entire lineup in general.

His explanation at the end is very noteworthy and eye-opening.





Here are a few other good videos:


----------



## daunow

Kinda wish I knew how to OC my 970... damn, that FPS gain in The Division.


----------



## rickcooperjr

Quote:


> Originally Posted by *daunow*
> 
> Kinda wish I knew how to OC my 970... damm.. that FPS gain on The division


Yeah, but notice that in a lot of games the OC on the 970 doesn't really increase performance, yet the mild OC on the R9 390 makes a lot of difference. And did you notice the 390 smacks the crap out of the 970 in it?


----------



## Enterprise24

Someone wants to trade his PowerColor R9 390 PCS+ for my Zotac GTX 780 Ti AMP!
I have a universal block for the GPU, and I run the 780 Ti OC'd to 1301MHz/1900MHz for 24/7 use.
I hope the 390 can also OC to 1250-1300MHz under water.
I game at 1440p, and the 3GB of VRAM is starting to cause issues in GTA V. I can't enable extended distance scaling, high-resolution shadows, or MSAA at all because the game stutters (even though my card has enough horsepower).
My 780 Ti @ 1301MHz/1900MHz scores around 14200 in Fire Strike graphics (no tweaks, just the OC).
I've seen people with a 390 @ around 1300MHz score 16800 in the same test (I don't know whether tessellation was enabled).

Trading the 780 Ti for the 390 should be an upgrade for me, right?
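For what it's worth, rough math on those two Fire Strike numbers (using only the scores quoted above, and ignoring any tessellation setting differences):

```python
# Quick comparison of the Fire Strike graphics scores quoted above.
score_780ti = 14200  # 780 Ti @ 1301/1900
score_390 = 16800    # reported 390 @ ~1300MHz

gain = (score_390 - score_780ti) / score_780ti * 100
print(f"390 is ~{gain:.1f}% ahead")  # ~18.3% ahead in this one benchmark
```

So on paper it's roughly an 18% synthetic lead, before even counting the 8GB vs 3GB VRAM difference at 1440p.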


----------



## Stige

Yes, it will be, if only because 3GB is not enough these days, not at 1440p anyway.


----------



## Themisseble

Quote:


> Originally Posted by *Enterprise24*
> 
> Some people want to trade his Powercolor R9-390 PCS+ with my Zotac GTX 780 Ti AMP!
> I have universal block for GPU and I OC 780 Ti to 1301Mhz/1900Mhz for 24/7 use.
> I hope that 390 can OC to 1250Mhz-1300Mhz under water also.
> I game at 1440p and 3GB VRAM start to have issue with GTA V. I can't enable extended distance scaling or high res shadows or MSAA at all because game will stutter (actually my card have enough horse power).
> My 780 Ti @ 1301Mhz/1900Mhz score around 14200 firestrike graphics score (without any tweaks just OC alone).
> I see people with 390 @ around 1300Mhz score 16800 in same test (don't know tessellation is enable or not).
> 
> 780 Ti trade with 390 should be upgrade for me right ?


Wait for Polaris.


----------



## rickcooperjr

Quote:


> Originally Posted by *Stige*
> 
> Yes it will be. Just alone on the fact that 3GB is not enough these days, not at 1440p anyway.


I believe I saw testing not too long ago showing GTA V used 4GB+ at 1080p on cards with more than 4GB; at 1440p it used up to around 6GB, and at 4K it was around 7-7.5GB of VRAM usage. I will try to find that again.


----------



## BinaryDemon

Quote:


> Originally Posted by *daunow*
> 
> Kinda wish I knew how to OC my 970... damm.. that FPS gain on The division


A simple overclock is easy enough. Use MSI Afterburner (it works with any GPU, not just MSI cards) and start testing by increasing your Core Clock and Memory Clock speeds. You don't even need to mess with Core Voltage or Power Limit unless you're trying to squeeze every last MHz out of it. I would probably start somewhere around +75MHz Core Clock and +200MHz Memory Clock, then slowly step up from there, testing for stability with games and benchmarks.

Later, if you want to get fancy, you can look at flashing a custom BIOS that lets you disable boost clocks, increase the voltage beyond the stock limitation, etc.
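If it helps to see the procedure written out, here's a rough sketch of that stepping loop in Python; the starting offsets are the ones from above, and the step sizes and step count are just example values, not tuned numbers:

```python
# Illustrative sketch of incremental overclock testing: start at
# +75MHz core / +200MHz memory and walk up in small increments,
# testing stability at each step before moving on.

def oc_schedule(core_start=75, mem_start=200, core_step=15, mem_step=25, steps=5):
    """Return a list of (core_offset, mem_offset) pairs to test in order."""
    return [(core_start + i * core_step, mem_start + i * mem_step)
            for i in range(steps)]

for core, mem in oc_schedule():
    # At each step you would apply the offsets in Afterburner, then run a
    # game or benchmark and watch for artifacts or crashes. Stop stepping
    # up (and back off one step) as soon as it becomes unstable.
    print(f"test +{core}MHz core / +{mem}MHz mem")
```

The point is just that it's a loop: small bump, stress test, repeat; the last stable step is your 24/7 overclock.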


----------



## Themisseble

Quote:


> Originally Posted by *daunow*
> 
> Kinda wish I knew how to OC my 970... damm.. that FPS gain on The division


Well, an OC'd GTX 970 is still behind the R9 390 in The Division beta.


----------



## daunow

Quote:


> Originally Posted by *Themisseble*
> 
> Well GTX 970 OC is still behind R9 390 in the division beta.


Does not look like that to me, maybe I am blind.
Quote:


> Originally Posted by *BinaryDemon*
> 
> A simple overclock is easy enough. use MSI Afterburner (it works with any GPU, not just MSI cards) and start testing with increasing your Core Clock and Memory Clock speeds. You dont need to even mess with Core Voltage or Power Limit unless you are trying to squeeze every last mhz out of it. I would probably start somewhere like +75mhz Core Clock, +200mhz Memory Clock and slowly step up from there testing for stability using games and benchmarks.
> 
> Later if you want to get fancy you can look at flashing a custom bios that would let you disable boost clocks, increase the voltage beyond stock limitation, ect.


Yeah, I've tried this, but I can never tell if I'm overclocked or not, and when I did do it I even started losing FPS in the Valley benchmark.
Weird; just recently I overclocked my CPU from 3.5GHz to 4.2GHz, but loading up CPU-Z it still tells me it's 3.5GHz,
so I guess it didn't work.

I've tried to overclock, but I get confused when I can't tell that I've done it; there is nothing that tells me "hey, your CPU/GPU is overclocked."


----------



## Themisseble

Quote:


> Originally Posted by *daunow*
> 
> Does not look like that to me, maybe I am blind.
> Yeah I've tried this but I can never tell if I am overclocked or not.. and when I did do it, I even started losing FPS on the valley benchmark program...
> weird, just recently I overclocked my CPU 3.5ghz to 4.2ghz but loading up CPU-Z it still tells me it's 3.5ghz.
> so I guess it didn't work.
> 
> I've tried to overclock but, when I can't tell that I've done it is when i get confuse there is nothing that tells me oh your cpu/gpu is overclock.


No, that benchmark is not great. But in The Division the R9 390 is just too fast for the GTX 970.
Look, even a stock R9 290 is way ahead of the GTX 970.


----------



## daunow

Quote:


> Originally Posted by *Themisseble*
> 
> No that benchmark is not great. But in The divison R9 390 is just to fast for GTX 970.
> Look even r9 290 stock is way ahead of GTX 970.


I'll be honest with you, man, I don't believe those scores after playing it first-hand.

In fact, the FPS they are showing is the FPS I got during drops at that house near an extraction point with a boss in it.

Most of the time I was at 60 in the Dark Zone.

Maybe it's because that's a reference 970 and mine is an FTW, but even then I doubt it.


----------



## BinaryDemon

Quote:


> Originally Posted by *daunow*
> 
> I've tried to overclock but, when I can't tell that I've done it is when i get confuse there is nothing that tells me oh your cpu/gpu is overclock.


Afterburner can show you realtime temps and clock speeds using an on-screen overlay, and CPU-Z / GPU-Z are pretty good for checking clock speeds. Just make sure you check under load; modern CPUs and GPUs downclock at idle, which is why CPU-Z can still show the stock speed while you're sitting on the desktop.
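If you'd rather script the check, here's a toy sketch of the idea. The observed reading is hypothetical; 1178MHz is just the reference GTX 970 boost clock, so substitute your own card's stock value:

```python
# Toy check: did the overclock actually apply? Compare a clock reading
# (e.g. from a GPU-Z or Afterburner log taken while a benchmark was
# running) against the stock boost clock.

STOCK_BOOST_MHZ = 1178   # reference GTX 970 boost clock
observed_mhz = 1329      # hypothetical reading logged under load

if observed_mhz > STOCK_BOOST_MHZ:
    print(f"overclock applied: +{observed_mhz - STOCK_BOOST_MHZ}MHz over stock")
else:
    print("running at or below stock; make sure you logged under load, not at idle")
```

The key detail is the "under load" part: a reading taken at idle will always look like the OC "didn't work."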


----------



## daunow

Quote:


> Originally Posted by *BinaryDemon*
> 
> AfterBurner can show you realtime Temps/Speeds using an onscreen overlay. Or CPU-Z / GPU-Z are pretty good for checking clockspeeds.


Aight, thanks. Might start doing some overclocking again and see if I can get it working.


----------



## Klocek001

pclab just published a test of 970 vs 390 on high-end vs mid-range CPUs
http://pclab.pl/art60000-21.html


----------



## Klocek001

Quote:


> Originally Posted by *Klocek001*
> 
> pclab just published a test of 970 vs 390 on high-end vs mid-range CPUs
> http://pclab.pl/art60000-21.html


While I gotta say that testing with all GameWorks effects on really handicaps the 390 here, and GW should be off, there's a lot to think about regarding AMD driver overhead. I mean, a Haswell i5 is not a chip slow enough to bottleneck the 390; most of OCN would agree. Well, think again.

Sorry I quoted myself; I thought I was editing. It's 6:30 AM here.


----------



## rdr09

Quote:


> Originally Posted by *Klocek001*
> 
> while I gotta say testing with all gameworks on really handicaps 390 here and GW should be off, there's a lot to think about regarding amd overhead. I mean i5 Haswell is not a slow chip to bottleneck the 390, most of OCN would agree. Well, think again.
> 
> sorry I quoted myself,thought I was editing. It's 6:30 AM here.


Here is what I got when I played C3 MP with a single stock 290 and an i7 at 4.5GHz with HT off...

(very old run)

Their GPU must be throttling.


----------



## NightAntilli

Quote:


> Originally Posted by *Klocek001*
> 
> while I gotta say testing with all gameworks on really handicaps 390 here and GW should be off, there's a lot to think about regarding amd overhead. I mean i5 Haswell is not a slow chip to bottleneck the 390, most of OCN would agree. Well, think again.
> 
> sorry I quoted myself,thought I was editing. It's 6:30 AM here.


Which driver did they use? These seem to be older drivers, although I might be wrong.


----------



## mtcn77

Quote:


> Originally Posted by *Klocek001*
> 
> pclab just published a test of 970 vs 390 on high-end vs mid-range CPUs
> http://pclab.pl/art60000-21.html


Vulkan has arrived, probably bearing better performance than SteamOS, too.
[Source]


----------



## daunow

Man, I haven't heard about SteamOS in a while. I guess it was that bad.


----------

